WO2020168252A1 - Shared privacy protected databases for person of interest - Google Patents


Info

Publication number
WO2020168252A1
Authority
WO
WIPO (PCT)
Prior art keywords
poi
image
user
database
databases
Prior art date
Application number
PCT/US2020/018382
Other languages
French (fr)
Inventor
Patrick Doherty
John Wall
John J. Dames
Michael S. Biviano
Original Assignee
Keee, Llc
Priority date
Filing date
Publication date
Application filed by Keee, Llc filed Critical Keee, Llc
Publication of WO2020168252A1 publication Critical patent/WO2020168252A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • G06V40/173Classification, e.g. identification face re-identification, e.g. recognising unknown faces across different face tracks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/30Scenes; Scene-specific elements in albums, collections or shared content, e.g. social network photos or video

Definitions

  • the present disclosure relates generally to detecting and tracking persons of interest over a network, while maintaining privacy between discrete persons of interest databases.
  • Businesses and high profile individuals often have a need to monitor persons of interest within their vicinity. For example, a business (e.g., a casino) may wish to monitor high-value guests (e.g., key customers), while high profile individuals (e.g., celebrities, government officials, and the like) may wish to monitor threatening individuals (e.g., obsessive fans, enemies, terrorists, and the like).
  • Security or other personnel may monitor such persons of interest at specific establishments or events. For example, security personnel may be positioned throughout a venue or positioned behind a security monitor to watch for suspicious behavior or key guests. Such monitoring, however, is time consuming, tedious, inefficient, and subject to human error. Further, often persons of interests may pose threats to a large number of at-risk individuals, but current methods for sharing person of interest information between at-risk individuals and their associates do not maintain privacy between the at-risk individuals and their respective security networks.
  • the technology disclosed herein is generally related to a system and method for tracking persons of interest.
  • a method of identifying a person of interest (POI) using two or more POI databases includes receiving a first image including a person; analyzing the first image for identifiable features corresponding to the person;
  • comparing the identifiable features in the first image to corresponding features in one or more stored POI images in the two or more POI databases; detecting similar features in a second image of a POI that match the identifiable features in the first image within a threshold value, wherein the second image of the POI is from a first POI database of the two or more POI databases, wherein the first POI database is owned by a first user; determining the person is the POI; and extracting information related to the second image of the POI from the first POI database.
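The claimed matching flow can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the feature vectors, the agreement-based `similarity` score, and the names `find_poi` and `similarity` are all assumptions introduced for clarity.

```python
def similarity(a, b):
    """Toy similarity score: fraction of feature positions that agree."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / max(len(a), len(b))

def find_poi(captured_features, poi_databases, threshold=0.8):
    """Scan each user's database; return (owner, record) on the first match."""
    for owner, database in poi_databases.items():
        for record in database:
            if similarity(captured_features, record["features"]) >= threshold:
                return owner, record          # POI detected: extract its info
    return None                               # no database held a match

# Two discrete POI databases owned by different users.
databases = {
    "user_a": [{"name": "POI-1", "features": [1, 0, 2, 9, 7]}],
    "user_b": [{"name": "POI-2", "features": [3, 4, 2, 9, 7]}],
}
match = find_poi([3, 4, 2, 9, 7], databases)
```

Here the captured features clear the threshold only against user_b's database, so the extracted record identifies both the POI and which database (and owner) it came from.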
  • a method for tracking a person of interest (POI) and sharing information related to the POI across a plurality of POI databases includes receiving an image; comparing the received image to a plurality of images of POIs in the plurality of POI databases; determining the received image matches a POI image of the plurality of images of POIs in a first POI database of the plurality of POI databases; determining an access level to the first POI database; and transferring an amount of information related to the POI image based on the access level to the first POI database.
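The access-level gating in this claim might look like the sketch below. The level names and the field tiers assigned to each level are illustrative assumptions; the claim only requires that the amount of transferred information depend on the access level.

```python
# Hypothetical field tiers per access level (not from the disclosure).
ACCESS_FIELDS = {
    "owner":  {"name", "photo", "threat_level", "last_location", "activity"},
    "shared": {"name", "photo", "threat_level"},
    "none":   set(),
}

def transfer_info(poi_profile, access_level):
    """Return only the POI profile fields permitted at this access level."""
    allowed = ACCESS_FIELDS.get(access_level, set())
    return {k: v for k, v in poi_profile.items() if k in allowed}

profile = {"name": "POI-1", "photo": "img.jpg",
           "threat_level": "high", "last_location": "lobby"}
shared_view = transfer_info(profile, "shared")
```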
  • a system for tracking persons of interest and sharing information related to the persons of interest over a network includes a user device; a camera; a POI database, wherein the user device, camera, and POI database are in communication over the network; and a processor.
  • the processor is configured with instructions to receive an image from at least one of the camera or the user device; scan the POI database for a matching POI image to determine that the matching POI image matches the image; detect the matching POI image in the POI database, wherein the POI database comprises information related to the matching POI image; and determine whether permission is needed to transfer the information.
  • Fig.1 is a block diagram illustrating an example of a POI tracking system.
  • Fig. 2 is a simplified block diagram of a computing device representative of one or more components of the system of Fig. 1.
  • Fig. 3 is a flow chart illustrating a method for locally filtering collected image data.
  • Fig. 4 is a flow chart illustrating a method for processing image data and selectively sharing information using one or more databases associated with the POI tracking system of Fig. 1.
  • Fig. 5 is a flow chart illustrating a method for processing image data using various databases of the POI tracking system of Fig. 1.
  • Fig. 6A is an example of a graphical user interface displaying persons of interest from a POI database of the POI tracking system of Fig. 1.
  • Fig. 6B is an example of a graphical user interface displaying a profile for a person of interest from a POI database of the POI tracking system of Fig. 1.
  • Fig. 6C is an example of a graphical user interface of the POI tracking system of Fig. 1 that provides access to various lists categorizing persons of interest of different threat levels.
  • Fig. 6D is an example of a graphical user interface displaying alerts related to persons of interest from a POI database of the POI tracking system of Fig. 1.
  • Fig. 6E is an example of a graphical user interface displaying a map of persons of interest from a POI database of the POI tracking system of Fig. 1.
  • a system and method for recognizing and tracking persons of interest is disclosed.
  • the system and methods of the present disclosure facilitate POI tracking by enabling data sharing among various interconnected POI databases.
  • Data sharing between databases of the present system is both secure and scalable.
  • the system provides multi-directional information flow among users and the system, but in a way that may be controllable, optionally limited, and secured.
  • the system includes one or more user devices, POI databases, and sensors in communication over a network. Using these devices, the system may detect a person and send information related to the person to the system for processing. For example, a camera may capture one or more images, the captured images are then analyzed by a computer or other processing element, and when the images are determined to include a person (e.g., via facial recognition), the image may then be analyzed with respect to one or more POI databases. For example, the system may compare captured information to stored POI information contained in various POI databases (e.g., discrete databases corresponding to different users) to determine whether the person detected by the sensor is a POI in at least one POI database. When a POI is detected, an alert is sent to a user and optionally one or more designated users via a user device.
  • a POI database includes information related to persons of interest.
  • a person of interest may be an individual that is considered generally threatening or dangerous, threatening or dangerous to a specific user, or the like.
  • a person of interest may be an individual that is considered important, high-profile, or the like.
  • a POI database of the present disclosure may include one or more POI profiles, which may include information specific to a POI.
  • a POI profile may include a picture of the POI and identifying information such as, name, age, residency, physical features (e.g., gender, height, weight, hair color, eye color, etc.), and the like.
  • a POI profile may also include information related to the threat level or high-profile level of the POI, the last detected location of the POI, activity of the POI, and other data as may be of-interest or useful in tracking the POI and expected actions, as will be discussed in more detail below.
  • the system can identify different POI databases (e.g., based on a user’s affiliation with or access level to a database) and determine a level of data sharing among users and their respective databases.
  • the level of data sharing may be based on the user’s affiliation with the database and/or on owner preferences associated with the database.
  • the level of data sharing may be, for example, complete, limited, or none.
  • a user may be the owner of a particular database.
  • the system may recognize the database as “owner’s database” and allow full access to all information contained therein.
  • a database may belong to another user.
  • the system may recognize the database as a “shared database” and allow limited access to information contained therein. If the user has not agreed to share his or her database, then the system may recognize the database as a “private database” or “non-shared database” and restrict access to information contained therein. In this example, the user wanting access may request access from the owner of the private database. The owner of the private database may grant or deny access.
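The owner/shared/private classification described above can be sketched as a simple decision, assuming a hypothetical record shape with an `owner` field and a `shared` flag:

```python
def classify_database(db, requesting_user):
    """Label a database from the requesting user's point of view."""
    if db["owner"] == requesting_user:
        return "owner's database"      # full access to all information
    if db["shared"]:
        return "shared database"       # limited access
    return "private database"          # access must be requested from the owner

alice_db = {"owner": "alice", "shared": False}
bob_db = {"owner": "bob", "shared": True}
```

Under this sketch, alice sees her own database with full access, sees bob's as a shared database, and any third user sees alice's non-shared database as private.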
  • one or more cameras are used to detect POIs.
  • the cameras may be general cameras used by the system or cameras specific to a user.
  • one or more cameras may be placed at a public location (e.g., a street, town hall, shopping center, grocery store, etc.).
  • Such cameras may capture images that may be used by any user within the system to determine whether a POI is present.
  • For example, detection by a public camera may reveal that a POI is in the same town or city as the user.
  • a user may place one or more of his or her own cameras at strategic locations (e.g., a performance venue, a home, an office, etc.).
  • For example, when Celebrity Star has a performance, she may place cameras at each entrance of the performance venue to scan for any POIs (e.g., obsessive fans).
  • the one or more cameras capture images that will be analyzed by the system for POI detection.
  • the system of the present disclosure has both local processing and cloud processing capabilities.
  • local processing may occur within one or more cameras used with the system.
  • local processing may occur at a central processing unit that is in communication with the one or more cameras over a local area network.
  • Local processing may filter out irrelevant images (e.g., images that do not include a person/face) or redundant images (e.g., images of a person that were recently previously taken and sent to the cloud) before sending images to the cloud for more extensive analysis. In this manner, the system may be able to reduce the volume of data transmitted to the cloud, allowing more efficient data transmission and subsequent data processing.
  • the system may exclude local processing, such that all image data is transferred to the cloud for processing. It should be noted that the data transferred to the cloud may be transferred as a single stream or may be transferred in various packets or bursts similar to streaming methods.
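The local filtering described above (dropping images with no face and images redundant with a recently forwarded one) can be sketched as follows. The `detect_face` callable is a stand-in for whatever lightweight detector a deployment actually uses; all names here are illustrative assumptions.

```python
def local_filter(images, detect_face, recently_sent):
    """Return the subset of captured images worth sending to the cloud."""
    to_upload = []
    for image in images:
        if not detect_face(image):
            continue                   # irrelevant: no person/face detected
        if image in recently_sent:
            continue                   # redundant: already sent recently
        recently_sent.add(image)
        to_upload.append(image)
    return to_upload

frames = ["face_john", "empty_hall", "face_john", "face_jane"]
sent = set()
uploads = local_filter(frames, lambda img: img.startswith("face"), sent)
```

Only the two distinct face-positive frames survive the filter, which is the data-volume reduction the passage describes.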
  • Image processing occurring in the cloud may include generally any type of object identification and comparison techniques, such as, but not limited to, biometric recognition, facial recognition, other personal identification features, e.g., license plate recognition, or the like.
  • processing elements may analyze captured images for faces (e.g., where local processing does not perform this task), and in instances where a face has been identified, optionally executing more sensitive facial recognition algorithms to extract facial features, and then comparing the facial images and/or facial features to stored or database images and/or facial features in one or more POI databases, and the like.
  • biometric analysis may include scanning the image for one or more distinguishing biological traits that uniquely identify a POI in the POI database, e.g., a unique identifier may be hand geometry, earlobe geometry, and the like.
  • license plate recognition may include detecting a license plate or the identifying characteristics of the license plate (e.g., alphanumeric code, state, year of registration, etc.) in an image and determining the license plate number.
  • POI license plate information may be stored in a POI database to further identify a POI. License plate recognition may be beneficial, for example, to identify POIs that enter a parking structure at an event.
  • to determine whether an image includes a person of interest (e.g., whether the person in the image matches a POI in a POI database), the system of the present disclosure can scan one or more POI databases for POIs (e.g., POI profile information) that match the person in the image.
  • the system may scan multiple POI databases in parallel or in a particular order (e.g., depending upon the user’s affiliation with the database).
  • the system determines whether the person in the image matches a POI within a particular match threshold. For example, a match score over 80% may indicate a high likelihood that the person in the image is a POI, while a match score of 100% indicates that the person is the POI.
  • Comparing a received image to POI images within a POI database may be executed using several different image comparison techniques.
  • the system may compare pixels, detected facial attributes, colors, shading, contrast, highlights, and the like.
  • the system may detect facial attributes in each image.
  • the system may mark facial attributes with one or more points.
  • the system may mark each corner of a mouth with a point, the tip of a nose with a point, and each eye with a point.
  • the system may compare the distances between points in each image to assess whether the images are of the same face.
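The landmark-distance comparison above can be sketched as follows. The landmark names, coordinates, and tolerance are illustrative assumptions; a real system would extract the points from the images themselves.

```python
import math

def pairwise_distances(points):
    """Distances between every pair of labeled landmark points."""
    names = sorted(points)
    return [math.dist(points[a], points[b])
            for i, a in enumerate(names) for b in names[i + 1:]]

def same_face(landmarks_a, landmarks_b, tolerance=2.0):
    """Faces match if every corresponding distance agrees within tolerance."""
    da = pairwise_distances(landmarks_a)
    db = pairwise_distances(landmarks_b)
    return all(abs(x - y) <= tolerance for x, y in zip(da, db))

face1 = {"left_eye": (30, 40), "right_eye": (70, 40), "nose_tip": (50, 60),
         "mouth_left": (38, 80), "mouth_right": (62, 80)}
# The same face shifted in the frame: all pairwise distances are preserved.
face2 = {k: (x + 1, y) for k, (x, y) in face1.items()}
```

Comparing distances between points, rather than the raw coordinates, makes the sketch insensitive to where the face sits in the frame.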
  • mathematical representations of POI images may be stored. The system may convert the received image into a mathematical representation of the image and assess a mathematical correlation between the mathematical representation of the received image and the stored mathematical representations of POI images to determine a match.
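One common way to score the correlation between such mathematical representations is cosine similarity, sketched below. The short vectors stand in for whatever representation a real system would extract; this is an assumption, not the disclosed method.

```python
import math

def cosine_similarity(a, b):
    """Correlation between two vector representations, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Stored mathematical representations of POI images (illustrative values).
stored = {"POI-1": [0.9, 0.1, 0.4], "POI-2": [0.1, 0.8, 0.3]}

# Representation computed from the received image.
received = [0.85, 0.15, 0.42]

best = max(stored, key=lambda name: cosine_similarity(received, stored[name]))
```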
  • the system also can send notifications and alerts in real-time to various users and, in some embodiments, to outside parties.
  • the system may send notifications to applicable users when new POI profiles are created, existing POI profiles are updated, POI activity is detected, a facial recognition match is detected, a team communication is sent, a threat response is executed, and the like.
  • Outside parties may include a law enforcement authority, hospital, or the like.
  • a “user” is any individual using the system.
  • a user may be a high profile individual (e.g., a celebrity, government official, etc.), members of a security team (e.g., body guards, private security detail, secret service, etc.), a business (e.g., a casino, a high-end store, a film production company, etc.), or the like.
  • an “owner” is any user that has a database specific to that user.
  • persons of interest in a database owned by a specific user are of interest to that specific user (e.g., POIs in a POI database owned by Celebrity Star may be people who are threatening or dangerous to Celebrity Star).
  • a user may have an administrative role, a delegate role, or an affiliate role associated with a POI database.
  • a user with an administrative role may have authority to add or edit POIs (e.g., edit the POI’s threat level, activity, etc.) within the POI database, allow various permissions to other users (e.g., provide access, grant administrative authority, etc.), confirm threats, communicate with and send alerts to affiliated users (e.g., a security team), and the like.
  • a user with an affiliate role may have authority to access the POI database, confirm threats, communicate with and send alerts to affiliated users and to the owner/administrator, and the like.
  • Fig. 1 is a block diagram illustrating an example of a POI tracking system 100.
  • the system 100 includes one or more user devices 106a-n, one or more sensors 108a-n, and one or more databases 110a-n.
  • the system 100 may also include one or more servers 102 and a central processing unit (CPU) 112, which may be in communication with the user device(s) 106a-n, sensor(s) 108a-n, and database(s) 110a-n.
  • Each of the various components of the POI tracking system 100 may be in communication directly or indirectly with one another, such as through a network 104 and/or a local area network (LAN) 114. In this manner, each of the components can transmit and receive data from other components in the system.
  • the CPU 112 may be in communication with the user device(s) 106a-n and sensor(s) 108a-n over the LAN 114.
  • the server 102 may act as a go-between for some of the components in the system 100.
  • the network 104 may be substantially any type or combination of types of communication system for transmitting data either through wired or wireless mechanism (e.g., cloud, WiFi, Ethernet, Bluetooth, cellular data, or the like).
  • certain components in the POI tracking system 100 may communicate via a first mode (e.g., Bluetooth) and others may communicate via a second mode (e.g., WiFi).
  • certain components may have multiple transmission mechanisms and be configured to communicate data in two or more manners.
  • the configuration of the network 104 and communication mechanisms for each of the components may be varied as desired.
  • the server 102 includes one or more computing devices that process and execute information.
  • the server 102 may include its own processing elements, memory components, and the like, and/or may be in communication with one or more external components (e.g., separate memory storage) (an example of computing elements that may be included in the server 102 is disclosed below with respect to Fig. 2).
  • the server 102 may also include one or more server computers that are interconnected together via the network 104 or separate communication protocol.
  • the server 102 may host and execute a number of the processes executed by the system 100.
  • the server 102 has or offers a number of configurable application programming interfaces (APIs) which can be accessed and used from an application on a user device 106 to send data to and receive data from the server 102.
  • All applications may be required to authenticate sessions or connections via a license key or other code.
  • the user device(s) 106a-n may be any of various types of computing devices, e.g., smart phones, tablet computers, desktop computers, laptop computers, set top boxes, gaming devices, wearable devices, or the like.
  • the user device(s) 106a-n provides output to and receives input from a user.
  • the user device(s) 106a-n may receive POI database updates from a user and output POI information and notifications or alerts to a user.
  • the type and number of user devices 106a-n may vary as desired.
  • the sensor(s) 108a-n may be any type of instrument or technology that can capture and, optionally, detect and recognize a person.
  • the sensor(s) 108a-n may be one or more cameras.
  • the camera may have onboard facial recognition technology, such that the camera is able to recognize a particular person when the person’s face is at least partially aligned with the camera view.
  • the cameras may merely record image data and transmit the image data to an external processing element (e.g., a server 102 or CPU 112) for processing.
  • the database(s) 110a-n may be internal to the system or may be external databases.
  • the database 110 may be an internal database controlled by a security company or the like that monitors threats.
  • the security company may access an external database that is associated with a policing authority or enforcement agency and that includes information on dangerous persons and/or criminals.
  • the system may include a combination of internal databases and external databases, where the external databases can supplement data for the POI internal databases.
  • the system 100 may include multiple databases 110n associated with multiple users, where these POI databases 110n may share select information between users.
  • the databases 110n may be in parallel communication with the server 102 over the network 104, such that the server 102 may access multiple databases 110n simultaneously for data processing and POI analysis.
  • the server 102 may access partial or limited information or complete information in a particular database 110 depending upon the user and the database 110.
  • the server 102 may access all information in a database 110 owned by the user requesting the information.
  • A simplified block structure for a computing device that may be used with the system 100 or integrated into one or more of the system 100 components is shown in Fig. 2.
  • For example, components of the system 100 such as the server 102, user devices 106a-n, and databases 110a-n may include one or more of the components shown in Fig. 2 and use one or more of these components to execute one or more of the operations disclosed in methods 200, 250, and 300.
  • the computing device 150 may include one or more processing elements 152, an input/output interface 154, a network interface 156, one or more memory components 158, a display 160, and one or more external devices 162.
  • Each of the various components may be in communication with one another through one or more busses, wireless means, or the like.
  • the processing element 152 is any type of electronic device capable of processing, receiving, and/or transmitting instructions.
  • the processing element 152 may be a central processing unit, microprocessor, processor, or microcontroller.
  • select components of the computer 150 may be controlled by a first processor and other components may be controlled by a second processor, where the first and second processors may or may not be in communication with each other.
  • the memory components 158 are used by the computer 150 to store instructions for the processing element 152, as well as store data, such as user information, POI information, threat or status assessments, and the like.
  • the memory components 158 may be, for example, magneto-optical storage, read-only memory, random access memory, erasable programmable memory, flash memory, or a combination of one or more types of memory components.
  • the display 160 provides visual feedback to a user and, optionally, can act as an input element to enable a user to control, manipulate, and calibrate various components of the computing device 150.
  • the display 160 may be a liquid crystal display, plasma display, organic light-emitting diode display, and/or cathode ray tube display.
  • the display 160 may include one or more touch or input sensors, such as capacitive touch sensors, resistive grid, or the like.
  • the I/O interface 154 allows a user to enter data into the computer 150, as well as provides an input/output for the computer 150 to communicate with other devices (e.g., server 102, sensor(s) 108a-n, other computers, speakers, etc.).
  • the I/O interface 154 can include one or more input buttons, touch pads, and so on.
  • the network interface 156 provides communication to and from the computer 150 to other devices.
  • the network interface 156 allows the server 102 to communicate with other devices in the system 100.
  • the network interface 156 includes one or more communication protocols, such as, but not limited to WiFi, Ethernet, Bluetooth, and so on.
  • the network interface 156 may also include one or more hardwired components, such as a Universal Serial Bus (USB) cable, or the like.
  • the configuration of the network interface 156 depends on the types of communication desired and may be modified to communicate via WiFi, Bluetooth, and so on.
  • the external devices 162 are one or more devices that can be used to provide various inputs to the computing device 150, e.g., mouse, microphone, keyboard, trackpad, or the like.
  • the external devices 162 may be local or remote and may vary as desired.
  • Fig. 3 is a flow chart illustrating a method for locally filtering collected image data, which helps to increase data transfer efficiency between the LAN and cloud, reducing costs.
  • the method 200 begins with operation 202 and a local processing element receives an image from a camera.
  • the local processing element may be an internal component of the camera or it may be an external processing element, such as, for example, a locally positioned central processing unit 112 in communication with the camera over a local area network 114.
  • An image may be received by the local processing element at various discrete time intervals, over a continuous time interval, or at random time intervals.
  • the camera may be coupled to an internal or external motion sensor and may be configured to record and send images when motion is detected.
  • the method 200 proceeds to operation 204 and the local processing element scans the image for a face. For example, certain pixels may correlate with certain facial features, indicating an image of a face is likely. As one example, a face may be represented by a particular arrangement of pixels and pixel characteristics (e.g., hue, brightness, location on the image, etc.). Other image features may be analyzed to determine whether a face is present, such as, for example, color, shading, contrast, highlights, and the like.
  • After operation 204, the method 200 proceeds to operation 206 and the local processing element determines whether the scanning operation 204 detected a face in the image. If a face is not detected, the method 200 proceeds to operation 208 and the image is discarded. For example, in instances where the received image pixels match less than a predetermined threshold (e.g., 50%, 60%, etc.) of the facial pixel arrangement, the local processing element determines that the image does not include a face and the image will be discarded.
  • the scanning and analysis operations 204, 206 may be less sensitive than further analysis that may be done in the cloud.
  • the algorithms analyzing the image may be selected to simply determine whether the images are more likely than not to contain a face, rather than a full facial recognition assessment.
  • the analysis may be more robust and include a full facial recognition analysis.
  • the method 200 optionally proceeds to operation 210 and a cache or local storage is searched for a recent matching image.
  • the local processing element may store copies of images sent to the cloud (e.g., images where a face was determined to be present or face positive images).
  • the local processing element may compare the received image to stored face positive images to determine whether there is a match.
  • the local processing element may conduct facial recognition or image comparison techniques to determine whether the faces in each image match. For example, the local processing element may compare pixels, color, shading, contrast, highlights, and the like.
  • a match may be determined if the received image matches a stored image within a particular matching threshold. For example, if the received image matches the stored image by 50% or greater, 60% or greater, 70% or greater, or the like, then the local processing element determines that there is a match. If a match is detected, the local processing element may determine whether the matching stored image is recent. For example, the local processing element may place a timestamp on the image when the image is received or when the image is placed in storage. The timestamp may be stored as metadata associated with the image. The definition of “recent” may vary. For example, “recent” images may include prior images received or stored within seconds, minutes, or hours of the received image.
  • a received image may match a recent stored image where different cameras detect the same person within a predetermined timeframe. For example, a camera positioned at the front door captures a first image corresponding to a first person (e.g., John Doe) and transmits the image to the local processing element (either on-board, directly connected, or via the LAN). The local processing element then detects a face in the image, scans a local storage for a match (e.g., facial comparison or image comparison), fails to find an image matching characteristics of the image of John Doe, stores a copy of the image, and transmits the image to the cloud for additional processing.
  • a second camera positioned in the lobby captures a second image of John Doe and sends the second image to the local processing element.
  • the local processing element detects a face in the image, searches storage for a match (e.g., image portion match or facial match), and determines that the first stored image of John Doe matches the current or second image of John Doe within a particular matching threshold.
  • the local processing element further determines that the first stored image of John Doe is a recent stored image since the first stored image was taken at the front door only a few seconds or minutes before the second image of John Doe was taken in the lobby, or was captured during the same “event.”
  • the method 200 proceeds to operation 208 and the received image is discarded.
  • the second image of John Doe is discarded since the first matching image was recently sent to the cloud.
  • the system allows more efficient data processing since data transmission to the cloud is reduced.
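The recency-based discard in the John Doe example can be sketched with a timestamped cache of faces already sent to the cloud. The 10-minute window, the `face_id` key, and the exact-match lookup are illustrative assumptions.

```python
import time

RECENT_SECONDS = 600   # assumed recency window: 10 minutes

def should_upload(face_id, cache, now, window=RECENT_SECONDS):
    """Upload unless this face was already sent to the cloud within the window."""
    last_sent = cache.get(face_id)
    if last_sent is not None and now - last_sent < window:
        return False                  # redundant: recently sent, discard
    cache[face_id] = now              # record this upload's timestamp
    return True

cache = {}
t0 = time.time()
first = should_upload("john_doe", cache, t0)          # front-door camera
second = should_upload("john_doe", cache, t0 + 30)    # lobby camera, 30 s later
third = should_upload("john_doe", cache, t0 + 3600)   # an hour later
```

The front-door sighting is uploaded, the lobby sighting 30 seconds later is discarded as redundant, and a sighting an hour later is uploaded again because circumstances may have changed.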
  • the additional image of John Doe provides additional location information throughout the event or system, so the exact position of John Doe can be tracked more efficiently.
  • By discarding images captured close in time to previously stored images, the system eliminates additional images that likely reveal similar situational data. It may be desirable to keep images captured further apart in time, as the circumstances/situational environment may change over time (e.g., the location of the POI may change). For example, it may be desirable to know the location of a POI before, during, and/or after a performance. In this example, it may be desirable to keep any images that are taken at least 30 minutes apart.
  • redundant image filtering may be adjustable by a user. For example, a user may want to know the whereabouts of a POI at all times once it is known that the POI is on the premises. The user may set user preferences via an application on a user device to keep all images so that the user can closely monitor a POI as the POI’s location changes from camera to camera. In some instances, the user may turn off redundant image filtering for certain POIs or all POIs. For example, a user may want to track a highly dangerous POI more closely than a non-threatening POI by saving more images as they are taken.
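The time-window filtering and user-adjustable override described above can be sketched as follows. This is a minimal illustration only; the function name, the `StoredCapture` structure, and the 30-minute default are assumptions for the example, not details disclosed by the system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class StoredCapture:
    poi_key: str           # hypothetical identifier for the matched face
    captured_at: datetime  # when the prior matching image was taken

def should_forward(new_time: datetime,
                   last_match: Optional[StoredCapture],
                   min_gap: timedelta = timedelta(minutes=30),
                   filtering_enabled: bool = True) -> bool:
    """Return True if the new image should be stored and sent to the cloud."""
    if not filtering_enabled:   # user preference: keep all images
        return True
    if last_match is None:      # no recent matching image in local storage
        return True
    # discard near-duplicates; keep images taken far enough apart
    return new_time - last_match.captured_at >= min_gap
```

A user preference toggle (as described above for tracking a highly dangerous POI) maps onto the `filtering_enabled` flag, while the performance example maps onto a 30-minute `min_gap`.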
  • the method 200 proceeds to operation 214 and the received image is sent to the cloud for additional, generally more extensive, processing.
  • method 200 may proceed directly from operation 206 to operation 214 if method 200 did not proceed to operation 210 (e.g., no redundant image filtering was performed), and the image may be sent to the cloud after it is determined that there is a face in the image.
  • the local processing element reduces the volume of data sent to the cloud.
  • Fig. 4 is a flow chart illustrating a method for processing image data and selectively sharing information using one or more databases associated with the POI tracking system of Fig. 1.
  • the method 250 begins with operation 252 and a cloud processing element receives an image.
  • the cloud processing element may receive the image from a local processing element via the network (e.g., WiFi), such as the local processing element discussed with reference to Fig. 3 and method 200.
  • local processing may be omitted and all images (e.g., images with and without faces, redundant images, etc.) may be sent directly from a camera or user device to the cloud.
  • the system may associate the received image with a particular user.
  • the received image may be received from a particular user’s camera.
  • the user may have a user profile within the system that includes information on cameras owned by the user.
  • the received image may include metadata with camera identifying information or the system may receive identifying information directly from the camera.
  • the system can compare the camera identifying information to camera information stored in user profiles to find the user (owner) associated with the camera and received image.
  • the system may associate the received image with the system generally and with no user in particular.
  • the image may come from a general camera associated with the system that is not owned by any particular user.
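The lookup of a camera's owner via user-profile records, as described above, can be sketched like so. The profile layout and camera IDs are hypothetical; a `None` result corresponds to an image associated with the system generally rather than a particular user.

```python
from typing import Optional

# hypothetical user-profile records mapping camera IDs to owners
USER_PROFILES = {
    "celebrity_star": {"cameras": {"cam-front-door", "cam-lobby"}},
    "famous_actor":   {"cameras": {"cam-stage"}},
}

def owner_for_camera(camera_id: str) -> Optional[str]:
    """Return the user whose profile lists this camera, else None
    (the image is then treated as a general system image)."""
    for user, profile in USER_PROFILES.items():
        if camera_id in profile["cameras"]:
            return user
    return None
```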
  • the method 250 proceeds to operation 254, and the received image is compared to images stored in one or more associated databases.
  • Multiple databases may be associated with the system. For example, various users may each have their own POI database within the system. The system may scan each database and compare the received image to POI images stored in the various POI databases. The databases may be scanned in parallel or sequentially. In many embodiments, the image comparison in this step may be more extensive than the partial image comparison that occurred during local processing (discussed with reference to Fig. 3 and method 200).
  • various image comparison techniques may be applied. For example, the system may use conventional facial recognition techniques and then compare facial features between the images. For example, facial recognition may generate geometric or mathematical descriptions of the image and use that data to assess similar structural components or the system may compare shapes, color, pixels, and the like. The system may apply a mathematical formula and conduct a statistical comparison of the images.
  • the method 250 proceeds to operation 256 and the cloud processing element determines whether the received image matches one or more images stored in the one or more associated databases. If no match is detected at operation 256, the method 250 proceeds to operation 258 and the image is discarded. If a match is detected at operation 256, the method 250 proceeds to operation 260 and the cloud processing element determines whether the match is within a threshold value.
  • the threshold value may be a percent similarity between images or any other value to measure a correlation between the images. As one example, the threshold value may be 85% similarity. In this example, if the received image matches the stored image with at least 85% similarity, then the match is within the threshold value.
  • the system may send the received image or both matching images to one or more applicable users (e.g., the owner of the camera that captured the received image, security personnel, and the like) to confirm the accuracy of the match.
  • the system may send the received image or both matching images to one or more applicable users to confirm the accuracy of the match without determining whether the match is within a threshold value.
  • the thresholds may vary depending on the desires of the user, sensitivity of the system, or profile characteristics of the POI profile (e.g., larger threat POI matches may require a lower threshold for matches) and may range from 51% to 99% and the discussion of any particular level is meant as illustrative only.
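The threshold logic of operation 260, including the idea that larger-threat POIs may warrant a lower matching threshold within the 51%–99% range, might be sketched as follows. The 85% base and the per-level adjustment are illustrative assumptions only.

```python
def threshold_for(poi_threat_level: int, base: float = 0.85) -> float:
    """Illustrative per-POI threshold: higher-threat POIs get a lower
    matching threshold, clamped to the 51%-99% range discussed above."""
    adjusted = base - 0.02 * poi_threat_level  # assumed adjustment rule
    return min(0.99, max(0.51, adjusted))

def is_match(similarity: float, poi_threat_level: int) -> bool:
    """Operation 260: is the measured similarity within the threshold?"""
    return similarity >= threshold_for(poi_threat_level)
```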
  • the method 250 proceeds to operation 258 and the image is discarded. If the match is within the threshold value, then the method 250 proceeds to operation 262 and the cloud processing element determines whether the matching stored image is from the owner’s database.
  • the system may know the user associated with the image (i.e., the owner) and therefore the associated user profile.
  • the user profile may be associated with a POI database (e.g., the owner’s database).
  • the method 250 proceeds to operation 264 and all or substantially all information related to the matching POI image is transferred to one or more applicable users.
  • Information related to the person of interest may include, for example, identifying information, last seen location, time POI was last seen, threat level, category of threat/danger, relationship to user, activity details, and the like. Identifying information may include, for example, name, age, residency, physical features (e.g., gender, height, weight, hair color, eye color, etc.), and the like.
  • the threat level may indicate the severity of danger presented by the POI. For example, threat level may be represented by numbers, colors, symbols, or the like.
  • a POI may have a danger/threat rating from 1 (low) to 10 (high).
  • different colors may represent different threat levels. For example, gray may indicate low-level threat, blue may indicate mid-level threat, yellow may indicate mid- to high- level threat, and red may indicate high-level threat.
  • gray may indicate low propensity for violence and suggest the person merely be observed; blue may indicate unknown propensity for violence, unstable, and/or obsessive; yellow may indicate propensity for violence but unable to travel; red may indicate high propensity for violence and ability to travel.
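One way to relate the numeric 1 (low) to 10 (high) rating to the color scheme above is a simple banding; the exact cutoffs below are assumptions for illustration, not part of the disclosure.

```python
# color meanings as described above
THREAT_COLORS = {
    "gray":   "low propensity for violence; observe only",
    "blue":   "unknown propensity for violence; unstable and/or obsessive",
    "yellow": "propensity for violence but unable to travel",
    "red":    "high propensity for violence and ability to travel",
}

def color_for_score(score: int) -> str:
    """Hypothetical banding of the 1-10 danger rating onto colors."""
    if score <= 3:
        return "gray"
    if score <= 5:
        return "blue"
    if score <= 7:
        return "yellow"
    return "red"
```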
  • the category of threat/danger may include, for example, obsession, delusion, anger, violence, and the like. The relationship of the POI to the user may correlate with the threat category.
  • the POI may be an obsessive fan, an enemy, in a delusional relationship with the user, and the like.
  • Activity details may relate to the POI’s interactions with the user or the user’s affiliates (e.g., family, friends, security team, etc.).
  • activity details may indicate the POI has previously been violent with or verbally abusive to the user or the user’s affiliates, exhibited lewd or inappropriate conduct, or the like.
  • Such information may be transferred to the owner (e.g., Celebrity Star) and/or to any authorized users (e.g., Celebrity Star’s security team). This information may also include personal and identifying information related to the owner.
  • the one or more applicable users can determine a course of action.
  • the one or more applicable users may determine a course of action based upon the POI’s threat level and prior conduct. For example, if Celebrity Star’s security team receives a notification that an angry POI (e.g., threat level red) is in the lobby, the security team may send a security guard to remove the POI from the premises. As another example, if the notification indicates that the POI is violent and has been physical with security guards in the past (e.g., threat level red), then the security team may send several security guards to remove the POI from the premises.
  • the system may determine whether to send certain information to an outside party. For example, if a POI is detected, the system may scan the POI profile for any red flags. A red flag may be an arrest record, a restraining order, or some other indicator of a criminal record. If, for example, a restraining order is on file, the system may send an alert to a police station. Information sent to outside parties (e.g., a police station) may include identifying information related to the POI and the POI’s detected location. In this example, the information may also include a copy of the restraining order.
  • the method 250 proceeds to operation 266 and the cloud processing element determines whether permission is needed to access the information associated with the matching stored image. Whether permission is needed to access information in another user’s database may depend upon the requesting user’s (e.g., the user associated with the received image) affiliations and on the other user’s preferences for his or her database. For example, the requesting user may have an established relationship with other users allowing shared access between the users’ databases. Such databases are considered shared databases. With a shared database, permission has already been granted by the owner of the database for the user to access certain information within the shared database. As such, no additional permission is necessary for the user to access information in the shared database.
  • a user may set user preferences in his or her profile allowing his or her database to be shared with any user in the system (e.g., a globally shared database). In this manner, the requesting user’s identity is irrelevant to gaining access to the shared database. In this case, the requesting user may access certain information within the globally shared database without additional permission.
  • a user may set user preferences in his or her profile to restrict access to his or her database.
  • the database may be considered a private or non-shared database, and access may be restricted to all other users. A private database requires permission for a requesting user to gain access to any information stored therein.
  • a user may share his or her database with other users, but restrict access to particular users.
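The access determination of operation 266, across globally shared, selectively shared, and private databases as described above, can be sketched as follows. The preference field names and return values are assumptions for illustration.

```python
def access_level(db_owner_prefs: dict, requester: str) -> str:
    """Illustrative resolution of operation 266: does the requesting
    user already have access, or is additional permission needed?"""
    if db_owner_prefs.get("globally_shared", False):
        return "shared"          # any user may access, identity irrelevant
    if requester in db_owner_prefs.get("shared_with", set()):
        return "shared"          # permission previously granted
    return "needs_permission"    # private/non-shared: request required
```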
  • the method 250 proceeds to operation 268 and limited information related to the matching stored POI image is transferred to one or more applicable users.
  • Limited information may include pertinent information related to the POI (e.g., identifying information and threat level), while excluding any information that may be identifying of the user (e.g., user name, location and time of POI detection, activity of POI, and the like). For example, the location and time of POI detection may be traced back to a particular user.
  • the time and location could reveal Celebrity Star’s identity.
  • the activity note in the POI profile may reveal identifying information about the user. For example, a POI streaking on stage is a specific event that could be traced to a particular performer (e.g., if the event was covered by the news). Any identifying information may be scrubbed from the data transmitted to the requesting user. Such limited information may be transferred to the owner (e.g., Celebrity Star) and/or to any affiliated users (e.g., Celebrity Star’s security team).
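The scrubbing of owner-identifying fields before transferring limited information (operation 268) can be sketched as below. The field names are hypothetical; only the exclusion logic reflects the description above.

```python
# fields that could identify the database owner and are removed before
# sharing; names are illustrative, not part of the disclosed system
USER_IDENTIFYING_FIELDS = {"user_name", "detection_location",
                           "detection_time", "activity_notes"}

def limited_record(poi_record: dict) -> dict:
    """Return a copy of a POI record with owner-identifying fields
    scrubbed, keeping POI identity and threat level."""
    return {k: v for k, v in poi_record.items()
            if k not in USER_IDENTIFYING_FIELDS}
```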
  • the method 250 proceeds to operation 270 and permission is requested from the applicable user.
  • the system may automatically send a request to the user upon determining that the user’s private database has a matching POI image.
  • the system may send a message back to the requesting user indicating a matching POI image has been detected in a private database, and allowing the requesting user to determine whether to send an access request via the system.
  • the system may send the access request to the applicable user as a message. For example, the message may populate in a message tab in the user’s profile.
  • the method 250 proceeds to operation 272 and the cloud processing element determines whether permission was granted to access data associated with the matching stored POI image (e.g., in the private database). For example, a selection (e.g., by a button, toggle, or the like) by the user receiving the request may indicate a grant or denial of permission to access the user’s database. If permission was not granted, the method 250 ends and no information is transferred. If permission was granted, the method 250 proceeds to operation 274 and limited information related to the matching stored image is transferred to one or more applicable users.
  • Limited information may include pertinent information related to the POI (e.g., identifying information and threat level), while excluding any information that may be identifying of the user (e.g., user name, location and time of POI detection, activity of POI, and the like). Any identifying information may be scrubbed from the data transmitted to the requesting user. Such limited information may be transferred to the owner (e.g., Celebrity Star) and/or to any affiliated users (e.g., Celebrity Star’s security team).
  • the owner of the POI database containing the matching POI image may also receive certain information related to the POI detection event. This information may be limited to avoid sharing any identifying information of the user who detected the POI. For example, the information may only reveal the identity of the POI and that a new POI detection event was created for that POI. In some examples, the information may exclude the location of the POI detection event, as this may be traced back to the user who detected the POI.
  • the system may scan all POI databases in the system, and upon detecting a match with a POI image in a particular user’s database, may alert that particular user that a POI in the user’s POI database has been located and provide the user with any circumstantial details as to the POI’s location, time of detection, and the like.
  • the system provides complete information related to the POI to the user.
  • Fig. 5 is a flow chart illustrating a method for processing image data using various databases sequentially.
  • the method 300 begins with operation 302 and the system 100 determines the identity of the owner of a received image.
  • An image may be received that is associated with a particular user.
  • the image may be received from a camera associated with the user or via the user’s profile.
  • the user’s profile may include identifying information for the user, such as, for example, the user’s name, email address, username, and the like.
  • the method 300 proceeds to operation 304 and the system 100 determines the owner’s database.
  • the user profile for the owner of the received image may be associated with a POI database particular to the user. This POI database is considered the owner’s database.
  • the method 300 proceeds to operation 306 and the received image is compared to POI images in the owner’s database.
  • Conventional facial recognition and image comparison techniques may be used. For example, facial features, pixels, color, contrast, shapes, and the like may be compared between images.
  • the method 300 proceeds to operation 308 and the system 100 determines whether the received image matches a POI image in the owner’s database within a threshold value.
  • the system may determine one or more of facial features, pixels, color, contrast, shapes, and the like match between the two images.
  • the system may assess the percent match between the images and determine whether the percent match is equal to or exceeds a matching threshold value.
  • a matching threshold value may represent a value at which there is a high probability that the images are the same person.
  • the matching threshold value may be at least 70%, at least 80%, at least 90%, or the like. In other words, if the images match by at least 70%, at least 80%, or at least 90%, then there is a high probability that the images are the same person and that there is a match.
  • the method 300 proceeds to operation 310 and the system 100 sends a notification with all associated POI information to one or more applicable users.
  • the one or more applicable users includes the owner and may include other users affiliated with the owner or granted permissions by the owner (e.g., a security detail).
  • the associated POI information may include any information related to the POI, such as, for example, identifying information, last seen location, threat level, category of threat/danger, activity details, and the like.
  • the associated POI information may also include information related to the owner, such as, for example, the owner’s name, restraining orders, complaints, comments, and the like.
  • the owner may have full access to information in the owner’s database, while affiliated users may have full or limited access, depending upon the owner’s permissions. For example, the owner may control the type of access of affiliated users via the owner’s user profile.
  • the method 300 proceeds to operation 312 and the system 100 determines whether there is one or more shared databases.
  • the user may be affiliated with one or more shared databases through the user’s profile.
  • Celebrity Star and Famous Actor may have a similar fan base and may have previously agreed to share their respective POI databases with one another.
  • Celebrity Star may adjust user preferences in her profile to grant such access.
  • Celebrity Star may further control the type and amount of information that Famous Actor may have access to within Celebrity Star’s POI database.
  • Celebrity Star may grant full access or limited access to her POI database.
  • default access to a shared database provides limited access.
  • the system may detect a globally shared database (e.g., one that is not affiliated with the user, but is accessible to all users).
  • the method proceeds to operation 314 and the received image is compared to images stored in the one or more shared databases.
  • Any conventional methods for facial recognition and matching or image matching may be used. For example, color, shading, contrast, pixels, shapes, and the like may be compared between images.
  • the method 300 proceeds to operation 316 and the system 100 determines whether the received image matches an image in the shared database within a threshold value.
  • the matching threshold value may be a percent, ratio, or other numerical indicator of a high probability of matching images. If the images match within the threshold value (e.g., above a certain percent), then the system determines that the images match.
  • the method 300 proceeds to operation 318 and the system 100 sends a notification with limited associated POI information to one or more applicable users.
  • the information may be sent automatically upon determination that a POI image in a shared database matches the received image.
  • the limited POI information may include identifying information related to the POI (e.g., name, age, picture, residence, etc.) and the POI’s threat level (e.g., category such as violent, obsessive, delusion, or the like; color such as red, yellow, blue, gray, or the like; or other indicator of threat level).
  • the limited information may exclude any identifying information of the user (e.g., user name).
  • the one or more applicable users receiving the information may be the owner of the received image and any affiliated users (e.g., security team).
  • the information may be sent to the user as an alert and presented to the user in the user’s profile under an alert tab, as discussed in more detail below.
  • the user’s POI database is automatically updated with the information, such that the POI from the shared database becomes a POI in the user’s POI database.
  • the method 300 proceeds to operation 320 and the system 100 determines whether there is a non-shared database. Alternatively, the method 300 may proceed directly to operation 320 from operation 312 if the system 100 determines at operation 312 that no shared database exists. The system assesses user preferences for each database to determine whether it is a non-shared database. For example, when the system detects a user preference that the database remain private, the system determines a database is non-shared. If the system 100 determines at operation 320 that no non-shared database exists, then the method 300 proceeds to operation 330 and the image is discarded.
  • the method 300 proceeds to operation 322 and the system 100 sends a request to the owner of the non-shared database to approve access to the non-shared database.
  • the request may be in the form of a message sent to an inbox in the owner’s user profile, text message, email, or the like.
  • the method 300 proceeds to operation 324 and the system 100 determines whether approval to access the non-shared database has been granted.
  • the request for approval message may include a button, toggle, link, other input, or the like for the owner to easily accept or reject the request.
  • the owner may respond to the message to grant or deny access. If no response is received, then the system may determine that access is denied. If access is denied, then the method 300 proceeds to operation 330 and the image is discarded.
  • the method 300 proceeds to operation 326 and the received image is compared to images stored in the non-shared database. Any conventional methods for facial recognition and matching or image matching may be used. For example, color, shading, contrast, pixels, shapes, and the like may be compared between images.
  • the method 300 proceeds to operation 328 and the system 100 determines whether the received image matches an image in the non-shared database within a threshold value.
  • the matching threshold value may be a percent, ratio, or other numerical indicator of a high probability of matching images. If the images match within the threshold value (e.g., above a certain percent), then the system determines that the images match.
  • the method 300 proceeds to operation 318 and the system 100 sends a notification with limited associated POI information to one or more applicable users.
  • the information may be sent automatically upon determination that a POI image in a non-shared database matches the received image.
  • the limited POI information may include identifying information related to the POI (e.g., name, age, picture, residence, etc.) and the POI’s threat level (e.g., category such as violent, obsessive, delusion, or the like; color such as red, yellow, blue, gray, or the like; or other indicator of threat level).
  • the limited information may exclude any identifying information of the user (e.g., user name).
  • the one or more applicable users receiving the information may be the owner of the received image and any affiliated users (e.g., security team).
  • the information may be sent to the user as an alert and presented to the user in the user’s profile under an alert tab, as discussed in more detail below.
  • the user’s POI database is automatically updated with the information, such that the POI from the non-shared database becomes a POI in the user’s POI database.
  • the method 300 proceeds to operation 330 and the image is discarded.
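The sequential flow of method 300 above (owner’s database first, then shared databases, then non-shared databases requiring permission, else discard) can be summarized in a sketch. The callback signatures and return values are illustrative assumptions.

```python
def sequential_scan(image, owner_db, shared_dbs, nonshared_dbs,
                    matches, request_access):
    """Sketch of method 300. `matches(image, db)` and
    `request_access(db)` are assumed callbacks standing in for the
    image comparison and permission-request operations."""
    if matches(image, owner_db):            # operations 306-310
        return ("full_info", owner_db)
    for db in shared_dbs:                   # operations 312-318
        if matches(image, db):
            return ("limited_info", db)
    for db in nonshared_dbs:                # operations 320-328
        if request_access(db) and matches(image, db):
            return ("limited_info", db)
    return ("discard", None)                # operation 330
```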
  • the system may scan for a matching POI image in a non-shared database prior to requesting permission to access the non-shared database. If a matching POI image is detected, then the system may request permission to share limited information related to the matching POI image to the requesting user. In some embodiments, the system may notify the requesting user that a match has been detected prior to requesting permission to access the non-shared database or prior to receiving permission to access the non-shared database. For example, the system may send the requesting user the image received by the system (e.g., from the camera) and inform the user that the person in the image has been marked as a POI in another user’s database. Without permission from the owner of the non-shared database to access the non-shared database, however, no additional information related to the POI (e.g., name, threat level, etc.) or related to the owner is provided.
  • Figs. 6A-E illustrate various user interfaces to access a POI database and track and monitor POI activity via a user profile.
  • Fig. 6A is an example of a graphical user interface 400 displaying persons of interest from a POI database of the POI tracking system of Fig. 1.
  • the POI database may be displayed in an application on a user device 106.
  • a user may obtain access to the user’s POI database by various secure means. For example, a user may create an account within the system and gain access to the user’s POI database by using a username and password that is specific to the user.
  • the user may have administrative authority over the user’s account. In other words, the user may have full access to all information stored therein and authority to edit the account, grant or deny access to others, grant administrative authority to others, and the like.
  • the user may share access to the user’s account, profile and/or POI database with other affiliated users (e.g., a security team).
  • the user may set user preferences for the level of access of other affiliated users. For example, the user may allow access to some or all information associated with the POIs in the POI database.
  • affiliated users may have their own accounts (e.g., with usernames and passwords).
  • the user interface 400 may provide access to various categories 402 within the user’s profile. Such categories may include, for example, “Activity”, “Lists”, “All”, and “Alerts.” As shown in this example, the category “All” has been selected. In this example, “All” shows a list of all POIs within the user’s POI database. As shown, each POI has a picture 404, a threat level 408, a threat category 405, and some identifying information 406 (e.g., name). Other icons may also be displayed that allow easy access to various functions within the system. For example, as shown, a people icon 416 is displayed on the lower portion of the user interface 400.
  • a user may select this icon to view a quick list of all POIs within the POI database.
  • This list may be simplified from the version displayed in Fig. 6A. For example, the list may only include POI names in alphabetical order.
  • a user may select the people icon 416 to view a list of all affiliated users.
  • a places icon 418 is also displayed on the lower portion of the user interface 400.
  • a user may select this icon to view POI locations 434 on a map 432 (e.g., see Fig. 6E).
  • POI locations may indicate the location where the POI was last seen.
  • a camera associated with the system 100 may have detected the POI in a specific location.
  • the camera may send an image of the POI, along with metadata indicating the location and time the image was taken, to the cloud.
  • the cloud processes the image according to method 250, and, upon determining that the received image matches a POI in the user’s database, may store the location and time information in the respective POI profile in the POI database.
  • the location and time information may be updated each time the POI is detected.
  • the most recent location information may be populated on the map 432 displayed on the user interface 400, as shown in Fig. 6E.
  • a user may select the POI location 434 on the map to view the POI’s profile, as discussed in more detail below.
  • the POI’s profile may include the timing information that correlates to the location information, such that a user can know when the POI was detected in that specific location 434.
  • a messages icon 420 is also displayed on the lower portion of the user interface 400. A user may select this icon to access any received messages or send any messages.
  • the user may receive messages from the system, from other users, and/or from affiliated users (e.g., the user’s security team). For example, the system may send messages requesting shared access to the user’s POI database.
  • other users may directly message the user to request shared access to the user’s POI database.
  • the other users’ identities and the user’s identity may remain hidden.
  • affiliated users may send the user a message.
  • a security team may desire to send the user a message to let the user know how they plan to handle a particular threat, whether the threat has been removed, or precautions the user should take to avoid the threat.
  • a more icon 422 is also displayed on the lower portion of the user interface 400.
  • a user may select this icon to access additional functions within the system. For example, the user may access and edit the user’s profile, set user preferences (e.g., change the database to shared or private), alter displays, and the like.
  • a camera icon 414 is displayed on an upper portion of the user interface 400.
  • This camera icon may allow quick access to an onboard camera of the user device 106.
  • a user may spot a POI or someone who appears to be a threat.
  • the user may use the onboard camera to take a picture of that individual.
  • the image may automatically upload into the system.
  • the image may be processed locally, such as by method 200, and sent to the cloud for additional processing, such as by method 250.
  • the user may take images via a separate camera function on the user device 106 and upload these images into the system.
  • the data may be streamed in various manners, including sending the data in packets or bursts, rather than single streams or may otherwise be transmitted as needed or based on the network bandwidth capabilities and the like.
  • a user may know, via the POI database or personal knowledge, that an individual is a POI.
  • a user may select the camera icon 414 to take a picture of the individual.
  • the image may display on the user interface 400, allowing the user to send the image to affiliated users.
  • the system may use positioning information built into the user device 106 (e.g., a Global Positioning System) to associate the image with a location.
  • the system may send location and time information as metadata with the image.
  • an affiliated user may receive the image, location, and time information and take appropriate action. It is also contemplated that an affiliated user may take the picture and update the system or send the image to the primary user.
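The bundling of positioning and time information as metadata sent with a user-captured image, as described above, can be sketched as follows. The JSON layout and field names are assumptions for illustration.

```python
import json
from datetime import datetime, timezone

def capture_metadata(lat: float, lon: float) -> str:
    """Bundle location and time metadata to accompany an image taken
    via the camera icon 414 (field names are illustrative)."""
    return json.dumps({
        "latitude": lat,
        "longitude": lon,
        "captured_at": datetime.now(timezone.utc).isoformat(),
    })
```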
  • An add icon 412 is also displayed on the upper portion of the user interface 400.
  • a user may select this icon to add information into the POI database.
  • a user may add another POI to the user’s POI database or may add additional information to an existing POI.
  • a user may update personal identifiers, locations, affiliations, threats, threat levels, pictures and videos (e.g., security footage), arrest records, identification (e.g., passports, driver’s licenses, etc.), and/or attach any file or document to the POI profile.
  • a search icon 410 is also displayed on the upper portion of the user interface 400.
  • a user may select this icon to search the POI database for specific information. For example, a user may search a specific POI name, a threat category, a threat level, or the like.
  • the system searches through the database for information that matches the search criteria and populates such information on the user interface 400.
  • Fig. 6B is an example of a graphical user interface displaying a profile for a person of interest from a POI database of the POI tracking system of Fig. 1.
  • a user may select a POI listed in the POI database (e.g., select the picture 404 displayed in Fig. 6A) to open the POI’s profile 426, as shown in Fig. 6B.
  • the POI’s profile 426 includes information specific to the POI.
  • the POI’s profile 426 may include identifying information 406 (e.g., picture 404, name, date of birth, height, weight, age, eye color, hair color, etc.), threat level 408 (e.g., red, yellow, blue, gray, etc.), threat category 405 (e.g., obsession, violent, delusional, ashamed, etc.), location (e.g., where the POI was last seen), activity (e.g., past conduct, most recent conduct, or current conduct), and the like.
  • the POI’s profile 426 may include a detailed description of the POI’s activity. For example, there may be a separate tab “Activity” for additional details related to the POI’s behavior.
  • Fig. 6C is an example of a graphical user interface of the POI tracking system of Fig. 1 that provides access to various lists categorizing persons of interest of different threat levels.
  • four different categories 424 of threat level are displayed on the user interface 400.
  • the threat categories are red, yellow, blue, and gray.
  • a user may select a category 424 to view all POIs within that category. For example, if a user selects the red category, a user can view all POIs with a red threat level. In this manner, a user can focus on POIs that are highly threatening.
  • Other categories are contemplated, such as, for example, geographical location (e.g., based on current, recent or historic location), threat category, and the like.
  • Fig. 6D is an example of a graphical user interface displaying alerts related to persons of interest from a POI database of the POI tracking system of Fig. 1.
  • the user interface 400 includes a separate Alerts tab 426. The most recent alert may be displayed on top.
  • An alert may indicate that a POI has been detected.
  • the alert may include various information related to the POI, such as for example, identifying POI information (e.g., picture and name), threat level, location, and the like.
  • the alert may also indicate the POI’s proximity to the user. For example, the system may determine the user’s location based on positioning technology in the user’s device 106.
  • the system may know the location of the POI based on a known location of the camera that captured the image of the POI.
  • the system may calculate an estimated distance between the user’s location and POI’s location and relay this information to the user via the alert.
  • the alert may also include instructions to the user. For example, the alert may suggest that the user find security or a secure location.
  • the alert may also include a status 428. For example, the alert may be active, indicating that the POI has been detected and has not yet been approached or removed from the premises.
  • the alert may be resolved, indicating that the POI has been approached, removed from the premises, and/or is no longer a threat.
  • the alert may include a note indicating any actions taken related to the POI (e.g., removed from premises).
  • a business may want to track high-value or important individuals (e.g., celebrities, government officials, key customers, and the like) that enter the premises to ensure they are treated well.
  • POIs may have varying levels of importance.
  • the President of the United States may be considered a top-level POI. If the President is detected on premises, an alert may be sent to an applicable user to provide the President with immediate attention and service.
  • an A-list celebrity may be considered a top-level POI
  • a B-list celebrity may be considered a mid-level POI.
  • color-coding (e.g., gold for top-level, silver for mid-level, and bronze for low-level) or any similar indicator of varying levels (e.g., numbers) may be used to distinguish POI importance.
  • the technology described herein may be implemented as logical operations and/or modules in one or more systems.
  • the logical operations may be implemented as a sequence of processor implemented steps directed by software programs executing in one or more computer systems and as interconnected machine or circuit modules within one or more computer systems, or as a combination of both.
  • the descriptions of various component modules may be provided in terms of operations executed or effected by the modules.
  • the resulting implementation is a matter of choice, dependent on the performance requirements of the underlying system implementing the described technology.
  • the logical operations making up the embodiments of the technology described herein are referred to variously as operations, steps, objects, or modules.
  • logical operations may be performed in any order, unless explicitly claimed otherwise or a specific order is inherently necessitated by the claim language.
  • articles of manufacture are provided as computer program products that cause the instantiation of operations on a computer system to implement the procedural operations.
  • One implementation of a computer program product provides a non-transitory computer program storage medium readable by a computer system and encoding a computer program. It should further be understood that the described technology may be employed in special purpose devices independent of a personal computer.
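The proximity estimate relayed with an alert in the list above (user location from the device's positioning system, POI location from the known position of the camera that captured the image) can be sketched as follows. The great-circle formula, field names, and coordinates are illustrative assumptions, not details from the disclosure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two (lat, lon) points.
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def build_alert(poi_name, threat_level, camera_pos, user_pos):
    # Combine the known camera location and the user's device location
    # into the distance estimate relayed with the alert.
    distance = haversine_m(*camera_pos, *user_pos)
    return {
        "poi": poi_name,
        "threat_level": threat_level,
        "distance_m": round(distance),
        "status": "active",
    }
```

A resolved alert would simply flip `status` and append a note of the action taken, as described above.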

Abstract

The present disclosure relates generally to a system and method for tracking persons of interest over a network, while maintaining privacy between discrete persons of interest databases. The method includes receiving a first image including a person; analyzing the first image for identifiable features corresponding to the person; comparing the identifiable features in the first image to corresponding features in one or more stored POI images in the two or more POI databases; detecting similar features in a second image of a POI that match the identifiable features in the first image within a threshold value, wherein the second image of the POI is from a first POI database of the two or more POI databases, wherein the first POI database is owned by a first user; determining the person is the POI; and extracting information related to the second image of the POI from the first POI database.

Description

SHARED PRIVACY PROTECTED DATABASES FOR PERSON OF INTEREST
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority to U.S. Provisional Application No.
62/806,196 entitled “SHARED PRIVACY PROTECTED DATABASES FOR PERSONS OF INTEREST” and filed on February 15, 2019, which is hereby incorporated by reference herein in its entirety.
FIELD
[0002] The present disclosure relates generally to detecting and tracking persons of interest over a network, while maintaining privacy between discrete persons of interest databases.
BACKGROUND
[0003] Businesses and high profile individuals often have a need to monitor persons of interest within their vicinity. For example, a business (e.g., a casino) may want to monitor high-value guests (e.g., key customers) to ensure that they are provided with excellent service and have a good experience at the establishment. As another example, high profile individuals (e.g., celebrities, government officials, and the like) may want to monitor threatening individuals (e.g., obsessed fans, enemies, terrorists, and the like) to thwart potentially hostile situations.
[0004] Security or other personnel may monitor such persons of interest at specific establishments or events. For example, security personnel may be positioned throughout a venue or positioned behind a security monitor to watch for suspicious behavior or key guests. Such monitoring, however, is time consuming, tedious, inefficient, and subject to human error. Further, often persons of interests may pose threats to a large number of at-risk individuals, but current methods for sharing person of interest information between at-risk individuals and their associates do not maintain privacy between the at-risk individuals and their respective security networks.
[0005] The information included in this Background section of the specification, including any references cited herein and any description or discussion thereof, is included for technical reference purposes only and is not to be regarded as subject matter by which the scope of the invention as defined in the claims is to be bound.
SUMMARY
[0006] The technology disclosed herein is generally related to a system and method for tracking persons of interest.
[0007] In one embodiment, a method of identifying a person of interest (POI) using two or more POI databases is disclosed. The method includes receiving a first image including a person; analyzing the first image for identifiable features corresponding to the person;
comparing the identifiable features in the first image to corresponding features in one or more stored POI images in the two or more POI databases; detecting similar features in a second image of a POI that match the identifiable features in the first image within a threshold value, wherein the second image of the POI is from a first POI database of the two or more POI databases, wherein the first POI database is owned by a first user; determining the person is the POI; and extracting information related to the second image of the POI from the first POI database.
[0008] In another embodiment, a method for tracking a person of interest (POI) and sharing information related to the POI across a plurality of POI databases is disclosed. The method includes receiving an image; comparing the received image to a plurality of images of POIs in the plurality of POI databases; determining the received image matches a POI image of the plurality of images of POIs in a first POI database of the plurality of POI databases; determining an access level to the first POI database; and transferring an amount of information related to the POI image based on the access level to the first POI database.
[0009] In yet another embodiment, a system for tracking persons of interest and sharing information related to the persons of interest over a network is disclosed. The system includes a user device; a camera; a POI database, wherein the user device, camera, and POI database are in communication over the network; and a processor. The processor is configured with instructions to receive an image from at least one of the camera or the user device; scan the POI database for a matching POI image to determine that the matching POI image matches the image; detect the matching POI image in the POI database, wherein the POI database comprises information related to the matching POI image; and determine whether permission is needed to transfer the information.
[0010] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Specification. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. A more extensive presentation of features, details, utilities, and advantages of the present invention as defined in the claims is provided in the following written description of various embodiments and implementations and illustrated in the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] Fig.1 is a block diagram illustrating an example of a POI tracking system.
[0012] Fig. 2 is a simplified block diagram of a computing device representative of one or more components of the system of Fig. 1.
[0013] Fig. 3 is a flow chart illustrating a method for locally filtering collected image data.
[0014] Fig. 4 is a flow chart illustrating a method for processing image data and selectively sharing information using one or more databases associated with the POI tracking system of Fig. 1.
[0015] Fig. 5 is a flow chart illustrating a method for processing image data using various databases of the POI tracking system of Fig. 1.
[0016] Fig. 6A is an example of a graphical user interface displaying persons of interest from a POI database of the POI tracking system of Fig. 1.
[0017] Fig. 6B is an example of a graphical user interface displaying a profile for a person of interest from a POI database of the POI tracking system of Fig. 1.
[0018] Fig. 6C is an example of a graphical user interface of the POI tracking system of Fig. 1 that provides access to various lists categorizing persons of interest of different threat levels.
[0019] Fig. 6D is an example of a graphical user interface displaying alerts related to persons of interest from a POI database of the POI tracking system of Fig. 1.
[0020] Fig. 6E is an example of a graphical user interface displaying a map of persons of interest from a POI database of the POI tracking system of Fig. 1.
SPECIFICATION
[0021] In some embodiments herein, a system and method for recognizing and tracking persons of interest (POI) is disclosed. The system and methods of the present disclosure facilitate POI tracking by enabling data sharing among various interconnected POI databases. Data sharing between databases of the present system is both secure and scalable. In several embodiments, the system provides multi-directional information flow among users and the system, but in a way that may be controllable, optionally limited, and secured.
[0022] In several embodiments, the system includes one or more user devices, POI databases, and sensors in communication over a network. Using these devices, the system may detect a person and send information related to the person to the system for processing. For example, a camera may capture one or more images, the captured images are then analyzed by a computer or other processing element, and when the images are determined to include a person (e.g., via facial recognition), the image may then be analyzed with respect to one or more POI databases. For example, the system may compare captured information to stored POI information contained in various POI databases (e.g., discrete databases corresponding to different users) to determine whether the person detected by the sensor is a POI in at least one POI database. When a POI is detected, an alert is sent to a user and optionally one or more designated users via a user device.
[0023] In several embodiments, a POI database includes information related to persons of interest. A person of interest may be an individual that is considered generally threatening or dangerous, threatening or dangerous to a specific user, or the like. Alternatively, a person of interest may be an individual that is considered important, high-profile, or the like. A POI database of the present disclosure may include one or more POI profiles, which may include information specific to a POI. For example, a POI profile may include a picture of the POI and identifying information such as, name, age, residency, physical features (e.g., gender, height, weight, hair color, eye color, etc.), and the like. A POI profile may also include information related to the threat level or high-profile level of the POI, the last detected location of the POI, activity of the POI, and other data as may be of-interest or useful in tracking the POI and expected actions, as will be discussed in more detail below.
[0024] In some embodiments, the system can identify different POI databases (e.g., based on a user’s affiliation with or access level to a database) and determine a level of data sharing among users and their respective databases. The level of data sharing may be based on the user’s affiliation with the database and/or on owner preferences associated with the database. The level of data sharing may be, for example, complete, limited, or none. For example, a user may be the owner of a particular database. The system may recognize the database as the “owner’s database” and allow full access to all information contained therein. As another example, a database may belong to another user. If the user has previously agreed to share his or her database with other users in the system or with the particular user wanting access, then the system may recognize the database as a “shared database” and allow limited access to information contained therein. If the user has not agreed to share his or her database, then the system may recognize the database as a “private database” or “non-shared database” and restrict access to information contained therein. In this example, the user wanting access may request access from the owner of the private database. The owner of the private database may grant or deny access.
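The three sharing levels described above (complete for the owner, limited for shared databases, none for private databases pending approval) could be modeled as a small lookup. The dictionary layout and field names here are hypothetical, for illustration only.

```python
ACCESS_LEVELS = ("complete", "limited", "none")

def access_level(database, requesting_user):
    # Owner gets complete access; shared databases (or databases whose
    # owner approved this user's request) grant limited access; private
    # databases grant none until the owner approves a request.
    if database["owner"] == requesting_user:
        return "complete"
    if database.get("shared", False) or requesting_user in database.get("approved", set()):
        return "limited"
    return "none"
```

The server could consult this check before deciding how much POI profile information to transfer to a requesting user.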
[0025] In several embodiments, one or more cameras are used to detect POIs. The cameras may be general cameras used by the system or cameras specific to a user. For example, one or more cameras may be placed at a public location (e.g., a street, town hall, shopping center, grocery store, etc.). Such cameras may capture images that may be used by any user within the system to determine whether a POI is present. For example, such information may reveal that a POI is in the same town or city as the user. As another example, a user may place one or more of his or her own cameras at strategic locations (e.g., a performance venue, a home, an office, etc.). For example, when Celebrity Star has a performance, she may place cameras at each entrance of the performance venue to scan for any POIs (e.g., obsessed fans). The one or more cameras capture images that will be analyzed by the system for POI detection.
[0026] In several embodiments, the system of the present disclosure has both local processing and cloud processing capabilities. As one example, local processing may occur within one or more cameras used with the system. As another example, local processing may occur at a central processing unit that is in communication with the one or more cameras over a local area network. Local processing may filter out irrelevant images (e.g., images that do not include a person/face) or redundant images (e.g., images of a person that were recently previously taken and sent to the cloud) before sending images to the cloud for more extensive analysis. In this manner, the system may be able to reduce the volume of data transmitted to the cloud, allowing more efficient data transmission and subsequent data processing.
However, in other embodiments, the system may exclude local processing, such that all image data is transferred to the cloud for processing. It should be noted that the data transferred to the cloud may be transferred as a single stream or may be transferred in various packets or bursts similar to streaming methods.
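A minimal sketch of the local filtering step described above, assuming a caller-supplied face detector and a fixed deduplication window (both assumptions; the disclosure leaves the detection mechanism and timing open):

```python
import time

class LocalFilter:
    """Filter camera frames locally before uploading to the cloud.

    `detect_face` is any callable returning an identifier for a detected
    face (or None) -- a stand-in for whatever on-camera or LAN-side
    recognition a given deployment uses.
    """
    def __init__(self, detect_face, dedupe_window_s=60.0):
        self.detect_face = detect_face
        self.dedupe_window_s = dedupe_window_s
        self._recently_sent = {}  # face id -> time of last upload

    def should_upload(self, frame, now=None):
        now = time.time() if now is None else now
        face_id = self.detect_face(frame)
        if face_id is None:
            return False  # irrelevant image: no person/face detected
        last = self._recently_sent.get(face_id)
        if last is not None and now - last < self.dedupe_window_s:
            return False  # redundant image: same face recently uploaded
        self._recently_sent[face_id] = now
        return True
```

Only frames that pass both checks would be transmitted to the cloud, reducing the volume of data sent for the more extensive analysis.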
[0027] Image processing occurring in the cloud may include generally any type of object identification and comparison techniques, such as, but not limited to, biometric recognition, facial recognition, other personal identification features, e.g., license plate recognition, or the like. For example, processing elements may analyze captured images for faces (e.g., where local processing does not perform this task), and in instances where a face has been identified, optionally executing more sensitive facial recognition algorithms to extract facial features, and then comparing the facial images and/or facial features to stored or database images and/or facial features in one or more POI databases, and the like. As another example, biometric analysis may include scanning the image for one or more distinguishing biological traits that uniquely identify a POI in the POI database, e.g., a unique identifier may be hand geometry, earlobe geometry, and the like. As yet another example, license plate recognition may include detecting a license plate or the original characteristics of the license plate (e.g., alphanumeric code, state, year of registration, etc.), in an image and determining the license plate number. In this example, POI license plate information may be stored in a POI database to further identify a POI. License plate recognition may be beneficial, for example, to identify POIs that enter a parking structure at an event.
[0028] In many embodiments, the system of the present disclosure can determine whether an image includes a person of interest (e.g., whether the person in the image matches a POI in a POI database), scan one or more POI databases for POIs (e.g., POI profile information) that match the person in the image. The system may scan multiple POI databases in parallel or in a particular order (e.g., depending upon the user’s affiliation with the database). During the scanning, the system determines whether the person in the image matches a POI within a particular match threshold. For example, a match threshold over 80% may indicate a high likelihood that the person in the image is a POI. As another example, a match threshold of 100% indicates that the person is a POI.
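The multi-database scan with a match threshold might look like the following sketch. The `similarity` callable, the database layout, and the 0.8 default threshold are assumptions for illustration, not details fixed by the disclosure.

```python
def scan_databases(similarity, probe, databases, threshold=0.8):
    # `similarity` is any callable returning a score in [0, 1] between two
    # images; each database maps a POI name to a stored image.  Databases
    # are checked in the order given (e.g., the user's own database first),
    # though a real system might scan them in parallel.
    for db_name, db in databases:
        for poi, stored in db.items():
            score = similarity(probe, stored)
            if score >= threshold:
                return {"database": db_name, "poi": poi, "score": score}
    return None  # no POI matched within the threshold
```

Returning the first match above the threshold mirrors the "high likelihood" cutoff described above; a stricter deployment could instead collect all matches and pick the best score.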
[0029] Comparing a received image to POI images within a POI database may be executed using several different image comparison techniques. For example, the system may compare pixels, detected facial attributes, colors, shading, contrast, highlights, and the like. As one example, the system may detect facial attributes in each image. In this example, the system may mark facial attributes with one or more points. For example, the system may mark each corner of a mouth with a point, the tip of a nose with a point, and each eye with a point. The system may compare the distances between points in each image to assess whether the images are of the same face. In another example, mathematical representations of POI images may be stored. The system may convert the received image into a mathematical representation of the image and assess a mathematical correlation between the mathematical representation of the received image and the stored mathematical representations of POI images to determine a match.
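The landmark-distance comparison described above can be illustrated with a scale-invariant sketch. Normalizing by the largest pairwise distance, and the 0.05 tolerance, are assumptions added for illustration, not values given in the disclosure.

```python
import math

def landmark_distances(points):
    # Pairwise distances between facial landmark points (mouth corners,
    # nose tip, eyes), normalized by the largest distance so that the
    # comparison is invariant to the scale of the face in the image.
    dists = [math.dist(points[i], points[j])
             for i in range(len(points)) for j in range(i + 1, len(points))]
    largest = max(dists)
    return [d / largest for d in dists]

def faces_match(points_a, points_b, tolerance=0.05):
    # Two faces match if every normalized pairwise distance agrees
    # within the tolerance.
    da, db = landmark_distances(points_a), landmark_distances(points_b)
    return len(da) == len(db) and all(abs(x - y) <= tolerance for x, y in zip(da, db))
```

The alternative described above, comparing stored mathematical representations, would replace the distance lists with, e.g., embedding vectors and a correlation score.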
[0030] The system also can send notifications and alerts in real-time to various users and, in some embodiments, to outside parties. For example, the system may send notifications to applicable users when new POI profiles are created, existing POI profiles are updated, POI activity is detected, a facial recognition match is detected, a team communication is sent, a threat response is executed, and the like. Outside parties may include a law enforcement authority, hospital, or the like.
[0031] As used herein, a “user” is any individual using the system. For example, a user may be a high profile individual (e.g., a celebrity, government official, etc.), a member of a security team (e.g., body guards, private security detail, secret service, etc.), a business (e.g., a casino, a high-end store, a film production company, etc.), or the like. As used herein, an “owner” is any user that has a database specific to that user. For example, persons of interest in a database owned by a specific user (i.e., the owner) are of interest to that specific user (e.g., POIs in a POI database owned by Celebrity Star may be people who are threatening or dangerous to Celebrity Star).
[0032] A user may have an administrative role, a delegate role, or an affiliate role associated with a POI database. For example, a user with an administrative role may have authority to add or edit POIs (e.g., edit the POI’s threat level, activity, etc.) within the POI database, allow various permissions to other users (e.g., provide access, grant administrative authority, etc.), confirm threats, communicate with and send alerts to affiliated users (e.g., a security team), and the like. As another example, a user with an affiliate role may have authority to access the POI database, confirm threats, communicate with and send alerts to affiliated users and to the owner/administrator, and the like.
[0033] Turning now to the figures, a system of the present disclosure will be discussed in more detail. Fig. 1 is a block diagram illustrating an example of a POI tracking system 100. The system 100 includes one or more user devices 106a-n, one or more sensors 108a-n, and one or more databases 110a-n. The system 100 may also include one or more servers 102 and a central processing unit (CPU) 112, which may be in communication with the user device(s) 106a-n, sensor(s) 108a-n, and database(s) 110a-n. Each of the various components of the POI tracking system 100 may be in communication directly or indirectly with one another, such as through a network 104 and/or a local area network (LAN) 114. In this manner, each of the components can transmit and receive data from other components in the system. For example, the CPU 112 may be in communication with the user device(s) 106a-n and sensor(s) 108a-n over the LAN 114. In many instances, the server 102 may act as a go-between for some of the components in the system 100.
[0034] The network 104 may be substantially any type or combination of types of communication system for transmitting data either through wired or wireless mechanism (e.g., cloud, WiFi, Ethernet, Bluetooth, cellular data, or the like). In some embodiments, certain components in the POI tracking system 100 may communicate via a first mode (e.g., Bluetooth) and others may communicate via a second mode (e.g., WiFi). Additionally, certain components may have multiple transmission mechanisms and be configured to communicate data in two or more manners. The configuration of the network 104 and communication mechanisms for each of the components may be varied as desired.
[0035] The server 102 includes one or more computing devices that process and execute information. The server 102 may include its own processing elements, memory components, and the like, and/or may be in communication with one or more external components (e.g., separate memory storage) (an example of computing elements that may be included in the server 102 is disclosed below with respect to Fig. 2). The server 102 may also include one or more server computers that are interconnected together via the network 104 or separate communication protocol. The server 102 may host and execute a number of the processes executed by the system 100.
[0036] The server 102 has or offers a number of configurable application programming interfaces (APIs) which can be accessed and used from an application on a user device 106 to send and receive data to and from the server 102. To prevent unauthorized access, all applications may be required to authenticate sessions or connections via a license key or other code. An advantage of this architecture is that each user obtains a streamlined, uncluttered view of activity relevant to the user.
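The license-key session authentication described above could be sketched as follows, assuming keys are registered server-side as SHA-256 hashes; the hashing scheme is an assumption, since the disclosure does not specify one.

```python
import hmac
import hashlib

def authenticate(license_key, registered_key_hashes):
    # Compare a hash of the presented license key against the server's
    # registered key hashes, using a constant-time comparison so the
    # check does not leak timing information.
    digest = hashlib.sha256(license_key.encode()).hexdigest()
    return any(hmac.compare_digest(digest, h) for h in registered_key_hashes)
```

An API handler would run this check before servicing any send/receive request from an application on a user device 106.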
[0037] The user device(s) 106a-n may be any of various types of computing devices, e.g., smart phones, tablet computers, desktop computers, laptop computers, set top boxes, gaming devices, wearable devices, or the like. The user device(s) 106a-n provides output to and receives input from a user. For example, the user device(s) 106a-n may receive POI database updates from a user and output POI information and notifications or alerts to a user. The type and number of user devices 106a-n may vary as desired.
[0038] The sensor(s) 108a-n may be any type of instrument or technology used that can capture and optionally detect and recognize a person. For example, the sensor(s) 108a-n may be one or more cameras. The camera may have onboard facial recognition technology, such that the camera is able to recognize a particular person when the person’s face is at least partially aligned with the camera view. Alternatively, the cameras may merely record image data and transmit the image data to an external processing element (e.g., a server 102 or CPU 112) for processing.
[0039] The database(s) 110a-n may be internal to the system or may be external databases. For example, the database 110 may be an internal database controlled by a security company or the like that monitors threats. As another example, the security company may access an external database that is associated with a policing authority or enforcement agency and that includes information on dangerous persons and/or criminals. In many instances, the system may include a combination of internal databases and external databases, where the external databases can supplement data for the internal POI databases.
[0040] The system 100 may include multiple databases 110n associated with multiple users, where these POI databases 110n may share select information between users. The databases 110n may be in parallel communication with the server 102 over the network 104, such that the server 102 may access multiple databases 110n simultaneously for data processing and POI analysis. The server 102 may access partial or limited information or complete information in a particular database 110 depending upon the user and the database 110. For example, the server 102 may access all information in a database 110 owned by the user requesting the information.
[0041] A simplified block structure for a computing device that may be used with the system 100 or integrated into one or more of the system 100 components is shown in Fig. 2. For example, the server 102, user device(s) 106a-n, sensor(s) 108a-n, and/or database(s) 110a-n may include one or more of the components shown in Fig. 2 and use one or more of these components to execute one or more of the operations disclosed in methods 200, 250, and 300. With reference to Fig. 2, the computing device 150 may include one or more processing elements 152, an input/output interface 154, a network interface 156, one or more memory components 158, a display 160, and one or more external devices 162. Each of the various components may be in communication with one another through one or more busses, wireless means, or the like.
[0042] The processing element 152 is any type of electronic device capable of processing, receiving, and/or transmitting instructions. For example, the processing element 152 may be a central processing unit, microprocessor, processor, or microcontroller. Additionally, it should be noted that select components of the computer 150 may be controlled by a first processor and other components may be controlled by a second processor, where the first and second processors may or may not be in communication with each other.
[0043] The memory components 158 are used by the computer 150 to store instructions for the processing element 152, as well as store data, such as user information, POI information, threat or status assessments, and the like. The memory components 158 may be, for example, magneto-optical storage, read-only memory, random access memory, erasable programmable memory, flash memory, or a combination of one or more types of memory components.
[0044] The display 160 provides visual feedback to a user and, optionally, can act as an input element to enable a user to control, manipulate, and calibrate various components of the computing device 150. The display 160 may be a liquid crystal display, plasma display, organic light-emitting diode display, and/or cathode ray tube display. In embodiments where the display 160 is used as an input, the display 160 may include one or more touch or input sensors, such as capacitive touch sensors, resistive grid, or the like.
[0045] The I/O interface 154 allows a user to enter data into the computer 150, as well as provides an input/output for the computer 150 to communicate with other devices (e.g., server 102, sensor(s) 108a-n, other computers, speakers, etc.). The I/O interface 154 can include one or more input buttons, touch pads, and so on.
[0046] The network interface 156 provides communication to and from the computer 150 to other devices. For example, the network interface 156 allows the server 102 to
communicate with the sensor(s) 108a-n through the network 104. The network interface 156 includes one or more communication protocols, such as, but not limited to WiFi, Ethernet, Bluetooth, and so on. The network interface 156 may also include one or more hardwired components, such as a Universal Serial Bus (USB) cable, or the like. The configuration of the network interface 156 depends on the types of communication desired and may be modified to communicate via WiFi, Bluetooth, and so on.
[0047] The external devices 162 are one or more devices that can be used to provide various inputs to the computing device 150, e.g., mouse, microphone, keyboard, trackpad, or the like. The external devices 162 may be local or remote and may vary as desired.
[0048] Fig. 3 is a flow chart illustrating a method for locally filtering collected image data, which helps to increase data transfer efficiency between the LAN and cloud, reducing costs. The method 200 begins with operation 202 and a local processing element receives an image from a camera. The local processing element may be an internal component of the camera or it may be an external processing element, such as, for example, a locally positioned central processing unit 112 in communication with the camera over a local area network 114. An image may be received by the local processing element at various discrete time intervals, over a continuous time interval, or at random time intervals. For example, the camera may be coupled to an internal or external motion sensor and may be configured to record and send images when motion is detected.
[0049] After operation 202, the method 200 proceeds to operation 204 and the local processing element scans the image for a face. For example, certain pixels may correlate with certain facial features, indicating an image of a face is likely. As one example, a face may be represented by a particular arrangement of pixels and pixel characteristics (e.g., hue, brightness, location on the image, etc.). Other image features may be analyzed to determine whether a face is present, such as, for example, color, shading, contrast, highlights, and the like.

[0050] After operation 204, the method 200 proceeds to operation 206 and the local processing element determines whether the scanning operation 204 detected a face in the image. If a face is not detected, the method 200 proceeds to operation 208 and the image is discarded. For example, in instances where the received image pixels match less than a predetermined threshold (e.g., 50%, 60%, etc.) of the facial pixel arrangement, the local processing element determines that the image does not include a face and the image will be discarded.
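By way of illustration, the coarse face check of operations 204-206 may be sketched as a simple feature-overlap comparison. The feature representation and the 0.5 threshold below are illustrative assumptions, not the claimed pixel-arrangement analysis:

```python
def likely_contains_face(image_features, face_template_features, threshold=0.5):
    """Coarse pre-filter: does the image share enough features with a face template?

    A stand-in for the pixel-arrangement comparison described above; a real
    system would use a lightweight face detector. The 0.5 default mirrors the
    illustrative 50% threshold in the specification.
    """
    template = set(face_template_features)
    if not template:
        return False
    overlap = len(set(image_features) & template) / len(template)
    return overlap >= threshold
```

Under this sketch, an image matching two of four template features meets a 50% threshold and is kept for further processing, while an image matching only one of four is discarded.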
[0051] It should be noted that the scanning and analysis operations 204, 206 may be less sensitive than further analysis that may be done in the cloud. In this manner, the algorithms analyzing the image may be selected to simply determine whether the images are more likely than not to contain a face, rather than a full facial recognition assessment. However, in other embodiments, the analysis may be more robust and include a full facial recognition analysis.
[0052] After operation 206, the method 200 optionally proceeds to operation 210 and a cache or local storage is searched for a recent matching image. The local processing element may store copies of images sent to the cloud (e.g., images where a face was determined to be present or face positive images). The local processing element may compare the received image to stored face positive images to determine whether there is a match. The local processing element may conduct facial recognition or image comparison techniques to determine whether the faces in each image match. For example, the local processing element may compare pixels, color, shading, contrast, highlights, and the like.
[0053] If method 200 proceeded to operation 210, the method 200 proceeds to operation 212, and the local processing element determines whether there was a recent stored matching image. As one example, a match may be determined if the received image matches a stored image within a particular matching threshold. For example, if the received image matches the stored image by 50% or greater, 60% or greater, 70% or greater, or the like, then the local processing element determines that there is a match. If a match is detected, the local processing element may determine whether the matching stored image is recent. For example, the local processing element may place a timestamp on the image when the image is received or when the image is placed in storage. The timestamp may be stored as metadata associated with the image. The definition of “recent” may vary. For example, “recent” images may include prior images received or stored within seconds, minutes, or hours of the received image.
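The recency test described above can be sketched using the timestamp metadata. The five-minute window is an illustrative default only; as noted, "recent" may span seconds, minutes, or hours:

```python
from datetime import datetime, timedelta

def is_recent_match(stored_timestamp, received_timestamp, window=timedelta(minutes=5)):
    """Treat a stored face-positive image as "recent" when the newly received
    image's timestamp falls within the configured window of the stored one."""
    return timedelta(0) <= received_timestamp - stored_timestamp <= window
```

For example, a second image captured two minutes after the stored image is redundant under a five-minute window, while one captured an hour later is not.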
[0054] In one example, a received image may match a recent stored image where different cameras detect the same person within a predetermined timeframe. For example, a camera positioned at the front door captures a first image corresponding to a first person (e.g., John Doe) and transmits the image to the local processing element (either on-board, directly connected, or via the LAN). The local processing element then detects a face in the image, scans a local storage for a match (e.g., facial comparison or image comparison), fails to find an image matching characteristics of the image of John Doe, stores a copy of the image, and transmits the image to the cloud for additional processing. As John Doe passes from the front door to the lobby, a second camera positioned in the lobby captures a second image of John Doe and sends the second image to the local processing element. The local processing element detects a face in the image, searches storage for a match (e.g., image portion match or facial match), and determines that the first stored image of John Doe matches the current or second image of John Doe within a particular matching threshold. The local processing element further determines that the first stored image of John Doe is a recent stored image since the first stored image was taken at the front door only a few seconds or minutes before the second image of John Doe was taken in the lobby, or was captured during the same “event.”
[0055] If there is a recent matching image, the method 200 proceeds to operation 208 and the received image is discarded. In the above example, the second image of John Doe is discarded since the first matching image was recently sent to the cloud. By discarding redundant images, the system allows more efficient data processing since data transmission to the cloud is reduced. In some cases, however, it may be desirable to keep redundant images and send all relevant images (e.g., all those determined to include a face, other biometric identifier, and/or a license plate) to the cloud. For example, the additional image of John Doe provides additional location information throughout the event or system, so the exact position of John Doe can be tracked more efficiently.
[0056] By discarding images captured close in time to previously stored images, the system eliminates additional images that likely reveal similar situational data. It may be desirable to keep images captured further apart in time, as the circumstances/situational environment may change over time (e.g., the location of the POI may change). For example, it may be desirable to know the location of a POI before, during, and/or after a performance. In this example, it may be desirable to keep any images that are taken at least 30 minutes apart.
[0057] In some instances, redundant image filtering may be adjustable by a user. For example, a user may want to know the whereabouts of a POI at all times once it is known that the POI is on the premises. The user may set user preferences via an application on a user device to keep all images so that the user can closely monitor a POI as the POI’s location changes from camera to camera. In some instances, the user may turn off redundant image filtering for certain POIs or all POIs. For example, a user may want to track a highly dangerous POI more closely than a non-threatening POI by saving more images as they are taken.
[0058] With reference again to Fig. 3, if there is no recent matching stored image (e.g., no redundant image has been sent to the cloud), the method 200 proceeds to operation 214 and the received image is sent to the cloud for additional, generally more extensive, processing. Alternatively, method 200 may proceed directly from operation 206 to operation 214 if method 200 did not proceed to operation 210 (e.g., no redundant image filtering was performed), and the image may be sent to the cloud after it is determined that there is a face in the image. By filtering out irrelevant (e.g., no face) and, optionally, redundant (e.g., the same or similar and previously sent) images, the local processing element reduces the volume of data sent to the cloud.
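The local filtering of method 200 as a whole may be sketched as a single decision function. The `face_detector` and `recent_cache` callables below are hypothetical stand-ins for operations 204 and 210; their signatures are assumptions for illustration:

```python
def local_filter(image, face_detector, recent_cache, redundancy_filter_enabled=True):
    """Decide whether to discard an image or send it to the cloud (method 200 sketch).

    face_detector(image) -> bool stands in for operations 204-206;
    recent_cache(image) -> bool stands in for operations 210-212.
    """
    if not face_detector(image):
        return "discard"            # operation 208: no face present
    if redundancy_filter_enabled and recent_cache(image):
        return "discard"            # operation 208: redundant recent match
    return "send_to_cloud"          # operation 214
```

Note that disabling `redundancy_filter_enabled` models the user preference, described above, of keeping all face-positive images so a POI can be tracked from camera to camera.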
[0059] Fig. 4 is a flow chart illustrating a method for processing image data and selectively sharing information using one or more databases associated with the POI tracking system of Fig. 1. The method 250 begins with operation 252 and a cloud processing element receives an image. The cloud processing element may receive the image from a local processing element via the network (e.g., WiFi), such as the local processing element discussed with reference to Fig. 3 and method 200. In an alternate embodiment, local processing may be omitted and all images (e.g., images with and without faces, redundant images, etc.) may be sent directly from a camera or user device to the cloud.
[0060] The system may associate the received image with a particular user. For example, the received image may be received from a particular user’s camera. The user may have a user profile within the system that includes information on cameras owned by the user. The received image may include metadata with camera identifying information or the system may receive identifying information directly from the camera. The system can compare the camera identifying information to camera information stored in user profiles to find the user (owner) associated with the camera and received image. Alternatively, the system may associate the received image with the system generally and with no user in particular. For example, the image may come from a general camera associated with the system that is not owned by any particular user.
[0061] After operation 252, the method 250 proceeds to operation 254, and the received image is compared to images stored in one or more associated databases. Multiple databases may be associated with the system. For example, various users may each have their own POI database within the system. The system may scan each database and compare the received image to POI images stored in the various POI databases. The databases may be scanned in parallel or sequentially. In many embodiments, the image comparison in this step may be more extensive than the partial image comparison that occurred during local processing (discussed with reference to Fig. 3 and method 200). In operation 254, various image comparison techniques may be applied. For example, the system may use conventional facial recognition techniques and then compare facial features between the images. For example, facial recognition may generate geometric or mathematical descriptions of the image and use that data to assess similar structural components or the system may compare shapes, color, pixels, and the like. The system may apply a mathematical formula and conduct a statistical comparison of the images.
[0062] After operation 254, the method 250 proceeds to operation 256 and the cloud processing element determines whether the received image matches one or more images stored in the one or more associated databases. If no match is detected at operation 256, the method 250 proceeds to operation 258 and the image is discarded. If a match is detected at operation 256, the method 250 proceeds to operation 260 and the cloud processing element determines whether the match is within a threshold value. The threshold value may be a percent similarity between images or any other value to measure a correlation between the images. As one example, the threshold value may be 85% similarity. In this example, if the received image matches the stored image with at least 85% similarity, then the match is within the threshold value. In some embodiments, the system may send the received image or both matching images to one or more applicable users (e.g., the owner of the camera that captured the received image, security personnel, and the like) to confirm the accuracy of the match. In an alternate embodiment, the system may send the received image or both matching images to one or more applicable users to confirm the accuracy of the match without determining whether the match is within a threshold value. It should be noted that the thresholds may vary depending on the desires of the user, sensitivity of the system, or profile characteristics of the POI profile (e.g., larger threat POI matches may require a lower threshold for matches) and may range from 51% to 99% and the discussion of any particular level is meant as illustrative only.
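The threshold determination of operation 260 may be sketched as follows. The 0.85 base threshold mirrors the illustrative 85% example above; the 0.10 reduction for high-threat POIs is an assumption illustrating the note that larger-threat matches may warrant a lower threshold:

```python
def match_within_threshold(similarity, poi_threat_level="low", base_threshold=0.85):
    """Operation 260 sketch: accept a match when the measured similarity
    meets the applicable threshold, which may be relaxed for high-threat POIs."""
    threshold = base_threshold - 0.10 if poi_threat_level == "high" else base_threshold
    return similarity >= threshold
```

Under these assumptions, an 80% similarity fails for an ordinary POI but passes for a high-threat POI.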
[0063] If the match is not within the threshold value, the method 250 proceeds to operation 258 and the image is discarded. If the match is within the threshold value, then the method 250 proceeds to operation 262 and the cloud processing element determines whether the matching stored image is from the owner’s database. As discussed above, the system may know the user associated with the image (i.e., the owner) and therefore the associated user profile. The user profile may be associated with a POI database (e.g., the owner’s database).
[0064] If the matching stored POI image is from the owner’s database, the method 250 proceeds to operation 264 and all or substantially all information related to the matching POI image is transferred to one or more applicable users. Information related to the person of interest may include, for example, identifying information, last seen location, time POI was last seen, threat level, category of threat/danger, relationship to user, activity details, and the like. Identifying information may include, for example, name, age, residency, physical features (e.g., gender, height, weight, hair color, eye color, etc.), and the like. The threat level may indicate the severity of danger presented by the POI. For example, threat level may be represented by numbers, colors, symbols, or the like. For example, a POI may have a danger/threat rating from 1 (low) to 10 (high). As another example, different colors may represent different threat levels. For example, gray may indicate low-level threat, blue may indicate mid-level threat, yellow may indicate mid- to high- level threat, and red may indicate high-level threat. For example, gray may indicate low propensity for violence and suggest the person merely be observed; blue may indicate unknown propensity for violence, unstable, and/or obsessive; yellow may indicate propensity for violence but unable to travel; red may indicate high propensity for violence and ability to travel. The category of threat/danger may include, for example, obsession, delusion, anger, violence, and the like. The relationship of the POI to the user may correlate with the threat category. For example, the POI may be an obsessed fan, an enemy, in a delusional relationship with the user, and the like. Activity details may relate to the POI’s interactions with the user or the user’s affiliates (e.g., family, friends, security team, etc.). 
For example, activity details may indicate the POI has previously been violent with or verbally abusive to the user or the user’s affiliates, exhibited lewd or inappropriate conduct, or the like.
[0065] Such information may be transferred to the owner (e.g., Celebrity Star) and/or to any authorized users (e.g., Celebrity Star’s security team). This information may also include personal and identifying information related to the owner. Once the information is transferred to the one or more applicable users, the one or more applicable users can determine a course of action. As one example, the one or more applicable users may determine a course of action based upon the POI’s threat level and prior conduct. For example, if Celebrity Star’s security team receives a notification that an angry POI (e.g., threat level red) is in the lobby, the security team may send a security guard to remove the POI from the premises. As another example, if the notification indicates that the POI is violent and has been physical with security guards in the past (e.g., threat level red), then the security team may send several security guards to remove the POI from the premises.
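An illustrative mapping of the color-coded threat levels described above to a suggested response may look as follows. The guard counts and the exact propensity strings are assumptions for illustration, not part of the disclosed system:

```python
# Hypothetical mapping of threat colors to a suggested response.
THREAT_RESPONSE = {
    "gray":   {"propensity": "low; observe only",            "guards": 0},
    "blue":   {"propensity": "unknown; unstable/obsessive",  "guards": 1},
    "yellow": {"propensity": "violent but unable to travel", "guards": 1},
    "red":    {"propensity": "violent and able to travel",   "guards": 2},
}

def suggested_guards(threat_color, previously_physical=False):
    """Return an illustrative number of guards to dispatch for a detected POI.

    Per the example above, a POI who has been physical with guards before
    warrants additional personnel.
    """
    guards = THREAT_RESPONSE[threat_color]["guards"]
    return guards + 1 if previously_physical else guards
```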
[0066] In an alternate embodiment, the system may determine whether to send certain information to an outside party. For example, if a POI is detected, the system may scan the POI profile for any red flags. A red flag may be an arrest record, a restraining order, or some other indicator of a criminal record. If, for example, a restraining order is on file, the system may send an alert to a police station. Information sent to outside parties (e.g., a police station) may include identifying information related to the POI and the POI’s detected location. In this example, the information may also include a copy of the restraining order.
[0067] If the matching stored image is not from the owner’s database, the method 250 proceeds to operation 266 and the cloud processing element determines whether permission is needed to access the information associated with the matching stored image. Whether permission is needed to access information in another user’s database may depend upon the requesting user’s (e.g., the user associated with the received image) affiliations and on the other user’s preferences for his or her database. For example, the requesting user may have an established relationship with other users allowing shared access between the users’ databases. Such databases are considered shared databases. With a shared database, permission has already been granted by the owner of the database for the user to access certain information within the shared database. As such, no additional permission is necessary for the user to access information in the shared database.
[0068] As another example, a user may set user preferences in his or her profile allowing his or her database to be shared with any user in the system (e.g., a globally shared database). In this manner, the requesting user’s identity is irrelevant to gaining access to the shared database. In this case, the requesting user may access certain information within the globally shared database without additional permission. As yet another example, a user may set user preferences in his or her profile to restrict access to his or her database. In one example, the database may be considered a private or non-shared database, and access may be restricted to all other users. A private database requires permission for a requesting user to gain access to any information stored therein. In another example, a user may share his or her database with other users, but restrict access to particular users.
[0069] If permission is not required (e.g., the database is a shared database or a globally shared database), then the method 250 proceeds to operation 268 and limited information related to the matching stored POI image is transferred to one or more applicable users. Limited information may include pertinent information related to the POI (e.g., identifying information and threat level), while excluding any information that may be identifying of the user (e.g., user name, location and time of POI detection, activity of POI, and the like). For example, the location and time of POI detection may be traced back to a particular user. As an example, if the POI was last sighted at the Apple Arena in NYC on February 11th at 7PM, which is the same place and time that Celebrity Star had a performance, then the time and location could reveal Celebrity Star’s identity. As another example, the activity note in the POI profile may reveal identifying information about the user. For example, a POI streaking on stage is a specific event that could be traced to a particular performer (e.g., if the event was covered by the news). Any identifying information may be scrubbed from the data transmitted to the requesting user. Such limited information may be transferred to the owner (e.g., Celebrity Star) and/or to any affiliated users (e.g., Celebrity Star’s security team).

[0070] If permission is required (e.g., a private database), then the method 250 proceeds to operation 270 and permission is requested from the applicable user. In one embodiment, the system may automatically send a request to the user upon determining that the user’s private database has a matching POI image. In an alternate embodiment, the system may send a message back to the requesting user indicating a matching POI image has been detected in a private database, and allowing the requesting user to determine whether to send an access request via the system.
In either embodiment, the system may send the access request to the applicable user as a message. For example, the message may populate in a message tab in the user’s profile.
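The permission determination of operation 266 may be sketched as follows. The `database` dictionary, its `"sharing"` values, and the `"shared_with"` set are hypothetical representations of the user preferences described above:

```python
def permission_required(database, requesting_user):
    """Operation 266 sketch: does the requester need the owner's permission?

    `database` is a hypothetical dict with a "sharing" key ("global", "shared",
    or "private") and, for shared databases, a "shared_with" set of user IDs.
    """
    sharing = database.get("sharing", "private")
    if sharing == "global":
        return False    # globally shared: any user may access without permission
    if sharing == "shared":
        # shared with particular users: permission already granted to them
        return requesting_user not in database.get("shared_with", set())
    return True         # private: permission must always be requested
```

For example, under these assumptions Famous Actor would not need permission for Celebrity Star's shared database, while an unaffiliated user would.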
[0071] After operation 270 the method 250 proceeds to operation 272 and the cloud processing element determines whether permission was granted to access data associated with the matching stored POI image (e.g., in the private database). For example, a selection (e.g., by a button, toggle, or the like) by the user receiving the request may indicate a grant or denial of permission to access the user’s database. If permission was not granted, the method 250 ends and no information is transferred. If permission was granted, the method 250 proceeds to operation 274 and limited information related to the matching stored image is transferred to one or more applicable users. Limited information may include pertinent information related to the POI (e.g., identifying information and threat level), while excluding any information that may be identifying of the user (e.g., user name, location and time of POI detection, activity of POI, and the like). Any identifying information may be scrubbed from the data transmitted to the requesting user. Such limited information may be transferred to the owner (e.g., Celebrity Star) and/or to any affiliated users (e.g., Celebrity Star’s security team).
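The scrubbing of identifying information described above may be sketched as a field filter. The field names below are illustrative assumptions, not taken from the specification:

```python
# Hypothetical fields treated as identifying the database owner.
OWNER_IDENTIFYING_FIELDS = {"owner_name", "detection_location", "detection_time", "poi_activity"}

def scrub_limited_info(poi_record):
    """Produce the "limited information" shared with a requesting user:
    POI identity and threat level survive; owner-identifying fields are removed."""
    return {k: v for k, v in poi_record.items() if k not in OWNER_IDENTIFYING_FIELDS}
```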
[0072] If the matching POI image is from another user’s POI database, then the owner of the POI database containing the matching POI image may also receive certain information related to the POI detection event. This information may be limited to avoid sharing any identifying information of the user who detected the POI. For example, the information may only reveal the identity of the POI and that a new POI detection event was created for that POI. In some examples, the information may exclude the location of the POI detection event, as this may be traced back to the user who detected the POI.

[0073] In the embodiment where the image is received from a general system camera and is not associated with any particular user, the system may scan all POI databases in the system, and upon detecting a match with a POI image in a particular user’s database, may alert that particular user that a POI in the user’s POI database has been located and provide the user with any circumstantial details as to the POI’s location, time of detection, and the like. In this example, the system provides complete information related to the POI to the user.
[0074] Fig. 5 is a flow chart illustrating a method for processing image data using various databases sequentially. The method 300 begins with operation 302 and the system 100 determines the identity of the owner of a received image. An image may be received that is associated with a particular user. For example, the image may be received from a camera associated with the user or via the user’s profile. The user’s profile may include identifying information for the user, such as, for example, the user’s name, email address, username, and the like.
[0075] Once the owner of the received image is identified, the method 300 proceeds to operation 304 and the system 100 determines the owner’s database. The user profile for the owner of the received image may be associated with a POI database particular to the user. This POI database is considered the owner’s database.
[0076] After operation 304, the method 300 proceeds to operation 306 and the received image is compared to POI images in the owner’s database. Conventional facial recognition and image comparison techniques may be used. For example, facial features, pixels, color, contrast, shapes, and the like may be compared between images.
[0077] After operation 306, the method 300 proceeds to operation 308 and the system 100 determines whether the received image matches a POI image in the owner’s database within a threshold value. For example, the system may determine one or more of facial features, pixels, color, contrast, shapes, and the like match between the two images. The system may assess the percent match between the images and determine whether the percent match is equal to or exceeds a matching threshold value. A matching threshold value may represent a value at which there is a high probability that the images are the same person. For example, the matching threshold value may be at least 70%, at least 80%, at least 90%, or the like. In other words, if the images match by at least 70%, at least 80%, or at least 90%, then there is a high probability that the images are the same person and that there is a match.
[0078] If the received image matches an image in the owner’s database within a threshold value, then the method 300 proceeds to operation 310 and the system 100 sends a notification with all associated POI information to one or more applicable users. The one or more applicable users includes the owner and may include other users affiliated with the owner or granted permissions by the owner (e.g., a security detail). The associated POI information may include any information related to the POI, such as, for example, identifying
information, location and time information, behavioral history, threat level, relationship to the owner, criminal record, and the like. The associated POI information may also include information related to the owner, such as, for example, the owner’s name, restraining orders, complaints, comments, and the like. The owner may have full access to information in the owner’s database, while affiliated users may have full or limited access, depending upon the owner’s permissions. For example, the owner may control the type of access of affiliated users via the owner’s user profile.
[0079] If the received image does not match an image in the owner’s database within a threshold value, then the method 300 proceeds to operation 312 and the system 100 determines whether there are one or more shared databases. In one example, the user may be affiliated with one or more shared databases through the user’s profile. For example, Celebrity Star and Famous Actor may have a similar fan base and may have previously agreed to share their respective POI databases with one another. Celebrity Star may adjust user preferences in her profile to grant such access. Celebrity Star may further control the type and amount of information that Famous Actor may have access to within Celebrity Star’s POI database. Celebrity Star may grant full access or limited access to her POI database. In several embodiments, default access to a shared database provides limited access. In another example, the system may detect a globally shared database (e.g., one that is not affiliated with the user, but is accessible to all users).
[0080] If the system 100 determines that one or more shared databases exist, then the method proceeds to operation 314 and the received image is compared to images stored in the one or more shared databases. Any conventional methods for facial recognition and matching or image matching may be used. For example, color, shading, contrast, pixels, shapes, and the like may be compared between images.
[0081] After operation 314, the method 300 proceeds to operation 316 and the system 100 determines whether the received image matches an image in the shared database within a threshold value. For example, the matching threshold value may be a percent, ratio, or other numerical indicator of a high probability of matching images. If the images match within the threshold value (e.g., above a certain percent), then the system determines that the images match.
[0082] If the received image matches an image in the shared database within a threshold value, then the method 300 proceeds to operation 318 and the system 100 sends a notification with limited associated POI information to one or more applicable users. The information may be sent automatically upon determination that a POI image in a shared database matches the received image. The limited POI information may include identifying information related to the POI (e.g., name, age, picture, residence, etc.) and the POI’s threat level (e.g., category such as violent, obsessive, delusion, or the like; color such as red, yellow, blue, gray, or the like; or other indicator of threat level). The limited information may exclude any identifying information of the user (e.g., user name). The one or more applicable users receiving the information may be the owner of the received image and any affiliated users (e.g., security team). The information may be sent to the user as an alert and presented to the user in the user’s profile under an alert tab, as discussed in more detail below. In some embodiments, the user’s POI database is automatically updated with the information, such that the POI from the shared database becomes a POI in the user’s POI database.
[0083] If the received image does not match an image in the shared database within a threshold value, then the method 300 proceeds to operation 320 and the system 100 determines whether there is a non-shared database. Alternatively, the method 300 may proceed directly to operation 320 from operation 312 if the system 100 determines at operation 312 that no shared database exists. The system assesses user preferences for each database to determine whether it is a non-shared database. For example, when the system detects a user preference that the database remain private, the system determines that the database is non-shared.

[0084] If the system 100 determines at operation 320 that no non-shared database exists, then the method 300 proceeds to operation 330 and the image is discarded. If the system 100 determines at operation 320 that a non-shared database exists, then the method 300 proceeds to operation 322 and the system 100 sends a request to the owner of the non-shared database to approve access to the non-shared database. The request may be in the form of a message sent to an inbox in the owner’s user profile, text message, email, or the like.
[0085] After operation 322, the method 300 proceeds to operation 324 and the system 100 determines whether approval to access the non-shared database has been granted. For example, the request for approval message, as mentioned above, may include a button, toggle, link, other input, or the like for the owner to easily accept or reject the request.
Alternatively, the owner may respond to the message to grant or deny access. If no response is received, then the system may determine that access is denied. If access is denied, then the method 300 proceeds to operation 330 and the image is discarded.
[0086] If approval is granted, then the method 300 proceeds to operation 326 and the received image is compared to images stored in the non-shared database. Any conventional methods for facial recognition and matching or image matching may be used. For example, color, shading, contrast, pixels, shapes, and the like may be compared between images.
[0087] After operation 326, the method 300 proceeds to operation 328 and the system 100 determines whether the received image matches an image in the non-shared database within a threshold value. For example, the matching threshold value may be a percent, ratio, or other numerical indicator of a high probability of matching images. If the images match within the threshold value (e.g., above a certain percent), then the system determines that the images match.
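The percent-based matching decision of operation 328 can be sketched as a simple threshold comparison. The similarity score itself would come from any conventional facial recognition or image matching method, which is not specified here; the 80% default mirrors the threshold recited in claim 4.

```python
# Illustrative threshold check for operation 328. The similarity score is
# assumed to come from an unspecified facial-recognition comparison and to
# lie in [0.0, 1.0]; the 0.80 default follows the "at least 80%" threshold
# mentioned in the claims.

def is_match(similarity: float, threshold: float = 0.80) -> bool:
    """Treat two images as depicting the same POI when the similarity
    score meets or exceeds the matching threshold."""
    return similarity >= threshold

assert is_match(0.93) is True    # high similarity: the images match
assert is_match(0.41) is False   # low similarity: no match
```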
[0088] If the received image matches an image in the non-shared database within a threshold value, then the method 300 proceeds to operation 318 and the system 100 sends a notification with limited associated POI information to one or more applicable users. The information may be sent automatically upon determination that a POI image in a non-shared database matches the received image. The limited POI information may include identifying information related to the POI (e.g., name, age, picture, residence, etc.) and the POI’s threat level (e.g., category such as violent, obsessive, delusional, or the like; color such as red, yellow, blue, gray, or the like; or other indicator of threat level). The limited information may exclude any identifying information of the user (e.g., user name). The one or more applicable users receiving the information may be the owner of the received image and any affiliated users (e.g., security team). The information may be sent to the user as an alert and presented to the user in the user’s profile under an alert tab, as discussed in more detail below. In some embodiments, the user’s POI database is automatically updated with the information, such that the POI from the non-shared database becomes a POI in the user’s POI database.
[0089] If the received image does not match an image in the non-shared database within a threshold value, then the method 300 proceeds to operation 330 and the image is discarded.
[0090] In an alternate embodiment, the system may scan for a matching POI image in a non-shared database prior to requesting permission to access the non-shared database. If a matching POI image is detected, then the system may request permission to share limited information related to the matching POI image to the requesting user. In some embodiments, the system may notify the requesting user that a match has been detected prior to requesting permission to access the non-shared database or prior to receiving permission to access the non-shared database. For example, the system may send the requesting user the image received by the system (e.g., from the camera) and inform the user that the person in the image has been marked as a POI in another user’s database. Without permission from the owner of the non-shared database to access the non-shared database, however, no additional information related to the POI (e.g., name, threat level, etc.) or related to the owner is provided.
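The alternate flow in [0090] — scan first, release details only after the owner approves — can be sketched as a small gating function. Names and data structures below are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of the scan-then-gate flow of [0090]: a non-shared database
# is scanned before permission is requested. On a match without approval,
# only the bare fact of a match is released; POI details (name, threat
# level, etc.) and owner details are withheld until approval is granted.

def query_non_shared(db: dict, image_id: str, owner_approves: bool):
    if image_id not in db:
        return None                       # no match: nothing to request
    if not owner_approves:
        return {"match": True}            # match only, no POI details
    return {"match": True, **db[image_id]}  # limited info after approval

db = {"img42": {"name": "J. Doe", "threat_level": "yellow"}}
assert query_non_shared(db, "img99", owner_approves=False) is None
assert query_non_shared(db, "img42", owner_approves=False) == {"match": True}
assert query_non_shared(db, "img42", owner_approves=True)["threat_level"] == "yellow"
```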
[0091] Figs. 6A-E illustrate various user interfaces to access a POI database and track and monitor POI activity via a user profile. Fig. 6A is an example of a graphical user interface 400 displaying persons of interest from a POI database of the POI tracking system of Fig. 1. As one example, the POI database may be displayed in an application on a user device 106.
A user may obtain access to the user’s POI database by various secure means. For example, a user may create an account within the system and gain access to the user’s POI database by using a username and password that is specific to the user. The user may have administrative authority over the user’s account. In other words, the user may have full access to all information stored therein and authority to edit the account, grant or deny access to others, grant administrative authority to others, and the like. The user may share access to the user’s account, profile and/or POI database with other affiliated users (e.g., a security team). The user may set user preferences for the level of access of other affiliated users. For example, the user may allow access to some or all information associated with the POIs in the POI database. Affiliated users may have their own accounts (e.g., with usernames and passwords) to access the user’s POI database.
[0092] As shown in Fig. 6A, the user interface 400 may provide access to various categories 402 within the user’s profile. Such categories may include, for example, “Activity”, “Lists”, “All”, and “Alerts.” As shown in this example, the category “All” has been selected. In this example, “All” shows a list of all POIs within the user’s POI database. As shown, each POI has a picture 404, a threat level 408, a threat category 405, and some identifying information 406 (e.g., name). Other icons may also be displayed that allow easy access to various functions within the system. For example, as shown, a people icon 416 is displayed on the lower portion of the user interface 400. A user may select this icon to view a quick list of all POIs within the POI database. This list may be simplified from the version displayed in Fig. 6A. For example, the list may only include POI names in alphabetical order. Alternatively, a user may select the people icon 416 to view a list of all affiliated users.
[0093] A places icon 418 is also displayed on the lower portion of the user interface 400.
A user may select this icon to view POI locations 434 on a map 432 (e.g., see Fig. 6E). POI locations may indicate the location where the POI was last seen. For example, a camera associated with the system 100 may have detected the POI in a specific location. The camera may send an image of the POI, along with metadata indicating the location and time the image was taken, to the cloud. The cloud processes the image according to method 250, and, upon determining that the received image matches a POI in the user’s database, may store the location and time information in the respective POI profile in the POI database. The location and time information may be updated each time the POI is detected. The most recent location information may be populated on the map 432 displayed on the user interface 400, as shown in Fig. 6E. A user may select the POI location 434 on the map to view the POI’s profile, as discussed in more detail below. The POI’s profile may include the timing information that correlates to the location information, such that a user can know when the POI was detected in that specific location 434.

[0094] Returning to Fig. 6A, a messages icon 420 is also displayed on the lower portion of the user interface 400. A user may select this icon to access any received messages or send any messages. The user may receive messages from the system, from other users, and/or from affiliated users (e.g., the user’s security team). For example, the system may send messages requesting shared access to the user’s POI database. As another example, other users may directly message the user to request shared access to the user’s POI database. In this example, the other users’ identities and the user’s identity may remain hidden. In yet another example, affiliated users may send the user a message.
For example, a security team may desire to send the user a message to let the user know how they plan to handle a particular threat, whether the threat has been removed, or precautions the user should take to avoid the threat.
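The per-POI location and time storage described in [0093] amounts to overwriting "last seen" fields each time a sighting is confirmed, so the map always shows the most recent location. The field names below are assumptions for illustration.

```python
# Sketch of the "last seen" update of [0093]: each confirmed sighting
# overwrites the location and time fields used to place the POI on the
# map. Field names are hypothetical.

def record_sighting(poi_profile: dict, location: tuple, timestamp: str) -> None:
    """Store the most recent sighting of a POI; called whenever the cloud
    confirms a camera image matches this POI (per method 250)."""
    poi_profile["last_seen_location"] = location   # (latitude, longitude)
    poi_profile["last_seen_time"] = timestamp      # when the image was taken

profile = {"name": "J. Doe"}
record_sighting(profile, (40.7128, -74.0060), "2020-02-14T09:30:00Z")
assert profile["last_seen_location"] == (40.7128, -74.0060)
assert profile["last_seen_time"] == "2020-02-14T09:30:00Z"
```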
[0095] A more icon 422 is also displayed on the lower portion of the user interface 400. A user may select this icon to access additional functions within the system. For example, the user may access and edit the user’s profile, set user preferences (e.g., change the database to shared or private), alter displays, and the like.
[0096] A camera icon 414 is displayed on an upper portion of the user interface 400. This camera icon may allow quick access to an onboard camera of the user device 106. For example, a user may spot a POI or someone who appears to be a threat. The user may use the onboard camera to take a picture of that individual. In one example, if the user uses the onboard camera via the camera icon 414, the image may automatically upload into the system. The image may be processed locally, such as by method 200, and sent to the cloud for additional processing, such as by method 250. Alternatively, the user may take images via a separate camera function on the user device 106 and upload these images into the system. In instances using the cloud, it should be noted that the data may be transmitted in various manners, including in packets or bursts rather than single streams, or may otherwise be transmitted as needed, based on network bandwidth capabilities, and the like.
[0097] In another embodiment, a user may know, via the POI database or personal knowledge, that an individual is a POI. A user may select the camera icon 414 to take a picture of the individual. The image may display on the user interface 400, allowing the user to send the image to affiliated users. The system may use positioning information built into the user device 106 (e.g., a Global Positioning System) to associate the image with a location. The system may send location and time information as metadata with the image. In this manner, an affiliated user may receive the image, location, and time information and take appropriate action. It is also contemplated that an affiliated user may take the picture and update the system or send the image to the primary user.
[0098] An add icon 412 is also displayed on the upper portion of the user interface 400. A user may select this icon to add information into the POI database. For example, a user may add another POI to the user’s POI database or may add additional information to an existing POI. For example, a user may update personal identifiers, locations, affiliations, threats, threat levels, pictures and videos (e.g., security footage), arrest records, identification (e.g., passports, driver’s licenses, etc.), and/or attach any file or document to the POI profile.
[0099] A search icon 410 is also displayed on the upper portion of the user interface 400.
A user may select this icon to search the POI database for specific information. For example, a user may search a specific POI name, a threat category, a threat level, or the like. The system searches through the database for information that matches the search criteria and populates such information on the user interface 400.
[00100] Fig. 6B is an example of a graphical user interface displaying a profile for a person of interest from a POI database of the POI tracking system of Fig. 1. For example, a user may select a POI listed in the POI database (e.g., select the picture 404 displayed in Fig. 6A) to open the POI’s profile 426, as shown in Fig. 6B. The POI’s profile 426 includes information specific to the POI. For example, the POI’s profile 426 may include identifying information 406 (e.g., picture 404, name, date of birth, height, weight, age, eye color, hair color, etc.), threat level 408 (e.g., red, yellow, blue, gray, etc.), threat category 405 (e.g., obsession, violent, delusional, hatred, etc.), location (e.g., where the POI was last seen), activity (e.g., past conduct, most recent conduct, or current conduct), and the like. The POI’s profile 426 may include a detailed description of the POI’s activity. For example, there may be a separate tab “Activity” for additional details related to the POI’s behavior.
[00101] Fig. 6C is an example of a graphical user interface of the POI tracking system of Fig. 1 that provides access to various lists categorizing persons of interest of different threat levels. In the example depicted, four different categories 424 of threat level are displayed on the user interface 400. In this example, the threat categories are red, yellow, blue, and gray. A user may select a category 424 to view all POIs within that category. For example, if a user selects the red category, a user can view all POIs with a red threat level. In this manner, a user can focus on POIs that are highly threatening. Other categories are contemplated, such as, for example, geographical location (e.g., based on current, recent or historic location), threat category, and the like.
[00102] Fig. 6D is an example of a graphical user interface displaying alerts related to persons of interest from a POI database of the POI tracking system of Fig. 1. As shown, the user interface 400 includes a separate Alerts tab 426. The most recent alert may be displayed on top. An alert may indicate that a POI has been detected. The alert may include various information related to the POI, such as for example, identifying POI information (e.g., picture and name), threat level, location, and the like. The alert may also indicate the POI’s proximity to the user. For example, the system may determine the user’s location based on positioning technology in the user’s device 106. The system may know the location of the POI based on a known location of the camera that captured the image of the POI. The system may calculate an estimated distance between the user’s location and POI’s location and relay this information to the user via the alert. The alert may also include instructions to the user. For example, the alert may suggest that the user find security or a secure location. The alert may also include a status 428. For example, the alert may be active, indicating that the POI has been detected and has not yet been approached or removed from the premises.
As another example, the alert may be resolved, indicating that the POI has been approached, removed from the premises, and/or is no longer a threat. The alert may include a note indicating any actions taken related to the POI (e.g., removed from premises).
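The estimated distance between the user's device position and the known camera position, described in [00102], can be computed with any standard geodesic formula. The patent does not specify one; the haversine formula below is one common choice, shown purely as an illustration.

```python
# Illustrative proximity estimate for the alert of [00102]: great-circle
# distance between the user's device location and the camera location that
# captured the POI. The haversine formula is an assumption; the patent does
# not name a specific method.

import math

def distance_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Haversine distance between two (lat, lon) points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# A user position vs. a camera sighting roughly 1 km due north.
d = distance_km(40.7128, -74.0060, 40.7218, -74.0060)
assert 0.9 < d < 1.1
```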
[00103] While several embodiments are discussed relative to a threatening POI, it is also contemplated herein that such embodiments are also applicable to high-profile or important POIs. For example, a business may want to track high-value or important individuals (e.g., celebrities, government officials, key customers, and the like) that enter the premises to ensure they are treated well. POIs may have varying levels of importance. For example, the President of the United States may be considered a top-level POI. If the President is detected on premises, an alert may be sent to an applicable user to provide the President with immediate attention and service. As another example, an A-list celebrity may be considered a top-level POI, while a B-list celebrity may be considered a mid-level POI. Again, color-coding (e.g., gold for top-level, silver for mid-level, and bronze for low-level) or any similar indicator of varying levels (e.g., numbers) may be used to indicate the level of importance.
Conclusion
[00104] The technology described herein may be implemented as logical operations and/or modules in one or more systems. The logical operations may be implemented as a sequence of processor implemented steps directed by software programs executing in one or more computer systems and as interconnected machine or circuit modules within one or more computer systems, or as a combination of both. Likewise, the descriptions of various component modules may be provided in terms of operations executed or effected by the modules. The resulting implementation is a matter of choice, dependent on the performance requirements of the underlying system implementing the described technology. Accordingly, the logical operations making up the embodiments of the technology described herein are referred to variously as operations, steps, objects, or modules. Furthermore, it should be understood that logical operations may be performed in any order, unless explicitly claimed otherwise or a specific order is inherently necessitated by the claim language.
[00105] In some implementations, articles of manufacture are provided as computer program products that cause the instantiation of operations on a computer system to implement the procedural operations. One implementation of a computer program product provides a non-transitory computer program storage medium readable by a computer system and encoding a computer program. It should further be understood that the described technology may be employed in special purpose devices independent of a personal computer.
[00106] The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments of the invention as defined in the claims.
Although various embodiments of the claimed invention have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of the claimed invention. Other embodiments are therefore contemplated. It is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative only of particular embodiments and not limiting. Changes in detail or structure may be made without departing from the basic elements of the invention as defined in the following claims.

Claims

What is claimed is:
1. A method of identifying a person of interest (POI) using two or more POI databases, comprising:
receiving, by a processing element, a first image including a person;
analyzing, by the processing element, the first image for identifiable features corresponding to the person;
comparing, by the processing element, the identifiable features in the first image to corresponding features in one or more stored POI images in the two or more POI databases;
detecting, by the processing element, similar features in a second image of a POI that match the identifiable features in the first image within a threshold value, wherein the second image of the POI is from a first POI database of the two or more POI databases, wherein the first POI database is owned by a first user;
determining, by the processing element, the person is the POI; and
extracting, by the processing element, information related to the second image of the POI from the first POI database.
2. The method of claim 1, wherein the identifiable features and the corresponding features are an arrangement of pixels.
3. The method of claim 1, wherein the identifiable features and the corresponding features are facial attributes.
4. The method of claim 1, wherein the threshold value is at least 80%.
5. The method of claim 1, wherein comparing the identifiable features in the first image to corresponding features in the one or more stored POI images in the two or more POI databases comprises scanning the two or more POI databases in parallel.
6. The method of any of claims 1 to 5, wherein the first image is associated with the first user and all of the information related to the second image of the POI is extracted from the first POI database.
7. The method of any of claims 1 to 5, wherein the second image is associated with a second user and limited information related to the second image of the POI is extracted from the first POI database.
8. The method of claim 7, wherein the limited information includes information related to the POI and excludes information related to the first user.
9. A method for tracking a person of interest (POI) and sharing information related to the POI across a plurality of POI databases, comprising:
receiving, by a processor, an image;
comparing, by the processor, the received image to a plurality of images of POIs in the plurality of POI databases;
determining, by the processor, the received image matches a POI image of the plurality of images of POIs in a first POI database of the plurality of POI databases;
determining, by the processor, an access level to the first POI database; and
transferring, by the processor, an amount of information related to the POI image based on the access level to the first POI database.
10. The method of claim 9, wherein the access level is owner access and all information related to the POI image is transferred.
11. The method of claim 9, wherein the access level is shared access and limited information related to the POI image is transferred.
12. The method of claim 9, wherein the access level is non-shared access and no information related to the POI image is transferred until permission to access the first POI database is received.
13. The method of claim 12, further comprising requesting, by the processor, access to the first POI database.
14. The method of claim 9 or claim 11, further comprising updating, by the processor, a second POI database of the plurality of POI databases with the POI image and the information related to the POI image when the access level is non-owner access.
15. The method of any of claims 9 to 13, wherein the received image is an image with a face.
16. A system for tracking persons of interest and sharing information related to the persons of interest over a network, comprising:
a user device;
a camera;
a POI database, wherein the user device, camera, and POI database are in communication over the network; and
a processor configured with instructions to
receive an image from at least one of the camera or the user device;
scan the POI database for a matching POI image to determine that the matching POI image matches the image;
detect the matching POI image in the POI database, wherein the POI database comprises information related to the matching POI image; and
determine whether permission is needed to transfer the information.
17. The system of claim 16, wherein the processor is further configured to send a notification to the user device comprising at least a portion of the information when permission is not needed.
18. The system of claim 17, wherein the notification comprises all of the information when a user of the user device owns the POI database.
19. The system of claim 16, wherein the processor is further configured to send a request for permission to access the POI database when permission is needed.
20. The system of claim 19, wherein the processor is further configured to send a notification to the user device comprising a portion of the information when permission is granted.
PCT/US2020/018382 2019-02-15 2020-02-14 Shared privacy protected databases for person of interest WO2020168252A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962806196P 2019-02-15 2019-02-15
US62/806,196 2019-02-15

Publications (1)

Publication Number Publication Date
WO2020168252A1 true WO2020168252A1 (en) 2020-08-20

Family

ID=69784606

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/018382 WO2020168252A1 (en) 2019-02-15 2020-02-14 Shared privacy protected databases for person of interest

Country Status (1)

Country Link
WO (1) WO2020168252A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130259327A1 (en) * 2008-12-12 2013-10-03 At&T Intellectual Property I, L.P. System and Method for Matching Faces
US9330301B1 (en) * 2012-11-21 2016-05-03 Ozog Media, LLC System, method, and computer program product for performing processing based on object recognition
US20180107660A1 (en) * 2014-06-27 2018-04-19 Amazon Technologies, Inc. System, method and apparatus for organizing photographs stored on a mobile computing device



Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20710727; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 20710727; Country of ref document: EP; Kind code of ref document: A1)