WO2022099884A1 - 一种基于人脸识别的人员与案件关联分析方法和装置 - Google Patents

一种基于人脸识别的人员与案件关联分析方法和装置

Info

Publication number
WO2022099884A1
WO2022099884A1 PCT/CN2020/139838
Authority
WO
WIPO (PCT)
Prior art keywords
person
case
information
interest
concerned
Prior art date
Application number
PCT/CN2020/139838
Other languages
English (en)
French (fr)
Inventor
林加明
陈积银
陈生坚
李仁杰
江文涛
张翔
Original Assignee
罗普特科技集团股份有限公司
罗普特(厦门)系统集成有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 罗普特科技集团股份有限公司, 罗普特(厦门)系统集成有限公司 filed Critical 罗普特科技集团股份有限公司
Publication of WO2022099884A1 publication Critical patent/WO2022099884A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services

Definitions

  • the present disclosure relates to the field of face recognition, in particular to a method and device for analyzing the association between persons and cases based on face recognition.
  • the purpose of the embodiments of the present application is to propose a method and device for analyzing the association between persons and cases based on face recognition to solve the technical problems mentioned in the above background technology section.
  • an embodiment of the present application provides a method for analyzing the association between persons and cases based on face recognition, including the following steps:
  • S1: Obtain information about the person of interest, determine from that information whether the person of interest is an identified person, and obtain the person of interest's historical travel trajectory information, where persons of interest include persons related to historical cases;
  • S2: Traverse the historical travel trajectory information and obtain, for each trajectory point, the snapshot time, the longitude and latitude of the snapshot point, and the captured travel mode; obtain all case information within a certain time range before and after the snapshot time; compute the distance between the snapshot point coordinates and the incident location coordinates of each case in the case information; compute the absolute value of the time difference between the snapshot time and each case's incident time; and compute a theoretical distance as the product of the average speed of the captured travel mode and the absolute time difference; and
  • S3: If the distance is greater than the theoretical distance, there is no association between the person of interest's trajectory and the case. If the distance is less than or equal to the theoretical distance and the person of interest is an identified person, the person of interest's ID number or photo is matched against the ID number or photo of the persons involved in the case to determine the association between the person of interest's trajectory and the case; if the person of interest is an unidentified person, the person of interest's photo is compared with the photos of the persons involved in the case to determine that association.
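  • As an aid to reading S2 and S3, the screening criterion can be written compactly as follows; the symbols d, S_t, t_snap and t_case are introduced here purely for illustration and do not appear in the original text:

```latex
S_t = \bar{v}_{\mathrm{mode}} \cdot \lvert t_{\mathrm{snap}} - t_{\mathrm{case}} \rvert ,
\qquad
\text{the trajectory point can relate to the case only if } \; d \le S_t
```

  • Here \bar{v}_{\mathrm{mode}} is the average speed of the captured travel mode and d is the snapshot-point-to-incident-location distance computed in step S2.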
  • The information about the person of interest includes an ID (certificate) number or a photo.
  • The person of interest is determined to be an identified person through the ID number or through the photo, and is determined to be an unidentified person through the photo alone.
  • Because the information available for a person of interest in real case handling varies and is not always complete, it is difficult to know up front whether the person is identified or unidentified; determining the identity first therefore improves the efficiency of the association analysis.
  • Step S1 specifically includes the following steps:
  • S11: If the information about the person of interest is an ID number, the person of interest is an identified person, and the person's historical travel trajectory information is queried from the historical travel trajectory database according to the identity;
  • S12: If the information is a photo, the photo is compared with the real population database by the face recognition engine to obtain a first comparison result; if the highest-similarity result is greater than or equal to a first threshold, the person of interest is determined to be an identified person whose identity is that of the highest-similarity result, and the historical travel trajectory information is obtained from the historical travel trajectory database according to that identity; and
  • S13: If the highest-similarity result in the first comparison result is below the first threshold, the person of interest is determined to be an unidentified person; the photo is then compared with the historical snapshot gallery to obtain a second comparison result, and the gallery entries whose similarity is greater than or equal to a second threshold, combined with the corresponding snapshot point information, form the person of interest's historical travel trajectory information.
  • Depending on whether the person of interest is identified or unidentified, the historical travel trajectory can thus be obtained in different ways, which improves both recognition efficiency and recognition accuracy.
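  • A minimal Python sketch of one possible reading of S11–S13 follows. The face recognition engine and the two galleries are modelled as callables and dictionaries supplied by the caller; their names and the threshold values are assumptions made for illustration, not part of the original disclosure.

```python
from typing import Callable, List, Optional, Tuple

# A 1:N search is modelled as a callable returning (similarity, person record)
# pairs already sorted in descending order of similarity (Top1, Top2, ..., TopN).
SearchFn = Callable[[bytes], List[Tuple[float, dict]]]

def get_trajectories_for_person_of_interest(
    id_number: Optional[str],
    photo: Optional[bytes],
    trajectory_db: dict,                 # identity -> list of trajectory records (S11/S12)
    search_real_population: SearchFn,    # 1:N search against the real population database
    search_snapshot_gallery: SearchFn,   # 1:N search against the historical snapshot gallery
    first_threshold: float = 0.85,       # illustrative value only
    second_threshold: float = 0.80,      # illustrative value only
) -> Tuple[bool, List[dict]]:
    """Return (is_identified, historical travel trajectory records) per S11-S13."""
    # S11: an ID number means the person is already identified.
    if id_number is not None:
        return True, trajectory_db.get(id_number, [])

    # S12: compare the photo with the real population database (first comparison result).
    results = search_real_population(photo)
    if results and results[0][0] >= first_threshold:
        top1_identity = results[0][1]["id_number"]
        return True, trajectory_db.get(top1_identity, [])

    # S13: unidentified person - build the trajectory from the historical snapshot
    # gallery, keeping entries at or above the second threshold.
    gallery_hits = search_snapshot_gallery(photo)
    trajectory = [record for similarity, record in gallery_hits if similarity >= second_threshold]
    return False, trajectory
```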
  • The historical travel trajectory database is established by the following steps:
  • S111: Obtain a face snapshot, and compare the face snapshot with the real population database using the face recognition engine to obtain a third comparison result;
  • S112: If the highest-similarity result in the third comparison result is greater than or equal to a third threshold, the person in the face snapshot is determined to be an identified person whose identity is that of the highest-similarity result, and the snapshot information of the face snapshot, combined with the identified person's information, is added to the historical travel trajectory database.
  • The historical snapshot gallery is a gallery formed by clustering the acquired face snapshot images, and the real population database is a database established from the actual resident population.
  • In step S2, the distance d between the longitude and latitude of the snapshot point and the longitude and latitude of the incident location is calculated by the Haversine formula: haversin(d/R) = haversin(φ₂ − φ₁) + cos(φ₁)·cos(φ₂)·haversin(Δλ), where haversin(θ) = sin²(θ/2) = (1 − cos(θ))/2, R is the radius of the earth (an average value of 6371 km can be used), φ₁ and φ₂ are the latitudes of the snapshot point and the incident location, and Δλ is the difference between the longitudes of the snapshot point and the incident location.
  • The distance between the snapshot point and the incident location is one of the criteria for judging the relevance between the person of interest and the case, and computing it in this way improves the accuracy of the case association.
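  • A minimal implementation of the Haversine calculation described above might look like the following; the 6371 km mean Earth radius comes from the text, while the function name and argument order are illustrative.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float,
                 earth_radius_km: float = 6371.0) -> float:
    """Great-circle distance between the snapshot point and the incident location."""
    phi1, phi2 = radians(lat1), radians(lat2)
    d_phi = radians(lat2 - lat1)        # latitude difference
    d_lambda = radians(lon2 - lon1)     # longitude difference (delta lambda)
    # haversin(theta) = sin^2(theta / 2)
    h = sin(d_phi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(d_lambda / 2) ** 2
    return 2 * earth_radius_km * asin(sqrt(h))

# Example with made-up coordinates: distance between a snapshot point and an incident location.
# print(round(haversine_km(24.4798, 118.0894, 24.4901, 118.1105), 3))
```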
  • In step S3, if the person of interest is an identified person, the person of interest's ID number is matched against the ID numbers of the persons involved in the case; if they match, the association between the person of interest's trajectory and the case is strong. If they do not match, the person of interest's photo is compared with the photos of the persons involved to obtain a fourth comparison result; if the fourth comparison result is greater than or equal to a fourth threshold the association is strong, otherwise it is weak. Determining the association with the case in this way when the person of interest is identified improves recognition accuracy.
  • If the person of interest is an unidentified person, the person of interest's photo is compared with the photos of the persons involved in the case to obtain a fifth comparison result; if the fifth comparison result is greater than or equal to a fifth threshold the association between the trajectory and the case is strong, otherwise it is weak. Determining the association with the case in this way when the person of interest is unidentified improves recognition accuracy.
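  • A sketch of the strong/weak decision described in the two paragraphs above is given below. The photo comparison is modelled as a similarity score already returned by the face recognition engine; the threshold values and the function name are illustrative assumptions.

```python
from typing import Optional, Set

def classify_association(
    is_identified: bool,
    poi_id_number: Optional[str],
    involved_id_numbers: Set[str],
    photo_similarity: float,          # best similarity between the POI photo and the involved persons' photos
    fourth_threshold: float = 0.85,   # illustrative value only
    fifth_threshold: float = 0.85,    # illustrative value only
) -> str:
    """Return 'strong' or 'weak' for a trajectory point that already passed the distance test."""
    if is_identified:
        # ID number match means a strong association; otherwise fall back to the photo comparison.
        if poi_id_number is not None and poi_id_number in involved_id_numbers:
            return "strong"
        return "strong" if photo_similarity >= fourth_threshold else "weak"
    # Unidentified person: only the photo comparison (fifth comparison result) is available.
    return "strong" if photo_similarity >= fifth_threshold else "weak"
```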
  • the embodiment of the present application also proposes a person-case association analysis device based on face recognition, including:
  • a historical travel trajectory information acquisition module, configured to obtain information about the person of interest, determine from that information whether the person of interest is an identified person, and obtain the person of interest's historical travel trajectory information, where persons of interest include persons related to historical cases;
  • a distance calculation module, configured to traverse the historical travel trajectory information; obtain, for each trajectory point, the snapshot time, the longitude and latitude of the snapshot point, and the captured travel mode; obtain all case information within a certain time range before and after the snapshot time; compute the distance between the snapshot point coordinates and the incident location coordinates of each case in the case information; compute the absolute value of the time difference between the snapshot time and each case's incident time; and compute a theoretical distance as the product of the average speed of the captured travel mode and the absolute time difference; and
  • an association judgment module, configured so that if the distance is greater than the theoretical distance there is no association between the person of interest's trajectory and the case; if the distance is less than or equal to the theoretical distance and the person of interest is an identified person, the person of interest's ID number or photo is matched against the ID number or photo of the persons involved in the case to determine the association between the trajectory and the case; and if the person of interest is an unidentified person, the person of interest's photo is compared with the photos of the persons involved in the case to determine that association.
  • Embodiments of the present application further provide an electronic device, including one or more processors and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
  • Embodiments of the present application further provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the method described in any implementation of the first aspect is implemented.
  • The present disclosure proposes a method and device for analyzing the association between persons and cases based on face recognition. Information about a person of interest is obtained, whether the person of interest is an identified person is determined from that information, and the person's historical travel trajectory information is obtained, where persons of interest include persons related to historical cases. The historical travel trajectory information is traversed to obtain, for each trajectory point, the snapshot time, the longitude and latitude of the snapshot point, and the captured travel mode; all case information within a certain time range before and after the snapshot time is obtained; the distance between the snapshot point coordinates and each case's incident location coordinates is computed, together with the absolute value of the time difference between the snapshot time and each case's incident time; and a theoretical distance is computed as the product of the average speed of the captured travel mode and the absolute time difference.
  • If the distance is greater than the theoretical distance, there is no association between the person of interest's trajectory and the case; if the distance is less than or equal to the theoretical distance and the person of interest is an identified person, the person's ID number or photo is matched against the ID number or photo of the persons involved in the case to determine the association; and if the person of interest is unidentified, the person's photo is compared with the photos of the persons involved in the case to determine the association.
  • Using face recognition technology together with big-data analysis and retrieval, the association between persons and cases is analyzed quickly and accurately from face trajectories and case information, which effectively improves case-handling efficiency and reduces the misjudgment rate.
  • FIG. 1 is an exemplary device architecture diagram to which an embodiment of the present application may be applied;
  • FIG. 2 is a schematic flowchart of a method for analyzing the association between persons and cases based on face recognition according to an embodiment of the present disclosure
  • FIG. 3 is a schematic flowchart of step S1 of the method for analyzing the association between persons and cases based on face recognition according to an embodiment of the present disclosure;
  • FIG. 4 is a schematic flowchart of the establishment of a historical travel trajectory database of the method for analyzing the association between persons and cases based on face recognition according to an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of an apparatus for analyzing the association between persons and cases based on face recognition according to an embodiment of the present disclosure
  • FIG. 6 is a schematic structural diagram of a computer device suitable for implementing the electronic device according to the embodiment of the present application.
  • FIG. 1 shows an exemplary apparatus architecture 100 to which the face recognition-based person-case association analysis method or the face-recognition-based person-case association analysis apparatus according to the embodiments of the present application can be applied.
  • the apparatus architecture 100 may include terminal devices 101 , 102 , and 103 , a network 104 and a server 105 .
  • the network 104 is a medium used to provide a communication link between the terminal devices 101 , 102 , 103 and the server 105 .
  • the network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
  • the user can use the terminal devices 101, 102, 103 to interact with the server 105 through the network 104 to receive or send messages and the like.
  • Various applications may be installed on the terminal devices 101 , 102 and 103 , such as data processing applications, file processing applications, and the like.
  • the terminal devices 101, 102, and 103 may be hardware or software.
  • the terminal devices 101, 102, and 103 can be various electronic devices, including but not limited to smart phones, tablet computers, laptop computers, desktop computers, and the like.
  • When the terminal devices 101, 102, and 103 are software, they can be installed in the electronic devices listed above and can be implemented as multiple pieces of software or software modules (for example, software or software modules for providing distributed services) or as a single piece of software or software module; no specific limitation is imposed here.
  • the server 105 may be a server that provides various services, such as a background data processing server that processes files or data uploaded by the terminal devices 101 , 102 , and 103 .
  • the background data processing server can process the acquired files or data to generate processing results.
  • the method for analyzing the association between persons and cases based on face recognition may be executed by the server 105 or by the terminal devices 101 , 102 , and 103 .
  • Correspondingly, the face recognition-based person-case association analysis apparatus may be installed in the server 105 or in the terminal devices 101, 102, and 103.
  • terminal devices, networks and servers in FIG. 1 are merely illustrative. There can be any number of terminal devices, networks and servers according to implementation needs.
  • When the processed data does not need to be obtained remotely, the above apparatus architecture may omit the network and consist of only a server or a terminal device.
  • FIG. 2 shows a method for analyzing the association between persons and cases based on face recognition disclosed in an embodiment of the present application, which includes the following steps:
  • S1: Obtain information about the person of interest, determine from that information whether the person of interest is an identified person, and obtain the person of interest's historical travel trajectory information, where persons of interest include persons related to historical cases;
  • S2: Traverse the historical travel trajectory information and obtain, for each trajectory point, the snapshot time, the longitude and latitude of the snapshot point, and the captured travel mode; obtain all case information within a certain time range before and after the snapshot time; compute the distance between the snapshot point coordinates and the incident location coordinates of each case in the case information; compute the absolute value of the time difference between the snapshot time and each case's incident time; and compute a theoretical distance as the product of the average speed of the captured travel mode and the absolute time difference; and
  • S3: If the distance is greater than the theoretical distance, there is no association between the person of interest's trajectory and the case. If the distance is less than or equal to the theoretical distance and the person of interest is an identified person, the person of interest's ID number or photo is matched against the ID number or photo of the persons involved in the case to determine the association between the person of interest's trajectory and the case; if the person of interest is an unidentified person, the person of interest's photo is compared with the photos of the persons involved in the case to determine that association.
  • In a preferred embodiment, the person of interest may be a person already determined to be related to a certain case whose relation to other cases is unknown, a person flagged in other law-enforcement case handling, or a person who otherwise requires close attention.
  • The analysis of the association between persons of interest and cases aims to use face recognition and big-data technology to quickly and accurately uncover potential associations between persons who need attention during law enforcement and case handling and certain cases: within a set time range (for example, the last three months), the person of interest's historical travel trajectories and the cases that occurred in that range are obtained, and the association is analyzed comprehensively by combining spatial conditions (for example, within 3 kilometres of a trajectory point), the degree of match between the person of interest's identity and the identities of the persons involved, and the comparison between the person of interest's photo and the photos of the persons involved.
  • In a specific embodiment, the information about the person of interest includes an ID number or a photo; the person can be determined to be an identified person through the ID number or photo, or an unidentified person through the photo.
  • Because the information provided for persons of interest differs and is not always complete in real case handling (some provide only a photo, others provide detailed information such as an ID number), it is difficult to know in advance whether a person is identified or unidentified; determining the identity first therefore improves the efficiency of the association analysis.
  • Depending on whether the person of interest is an identified or unidentified person, the relevance to a case can then be judged in different ways.
  • As shown in FIG. 3, step S1 specifically includes steps S11 to S13 described above.
  • The steps for determining historical travel trajectories based on the person of interest's photo are as follows: the face snapshots captured and uploaded in real time by the face capture devices are fed into the face recognition engine with a recognition-similarity setting and searched 1:N against the real population database; the comparison results are sorted in descending order of similarity and recorded as Top1, Top2, ..., TopN. If the highest-similarity result Top1 exceeds the comparison threshold set for confirming the same person, the person in the face snapshot is determined to be an identified person whose identity is that of the Top1 record.
  • The snapshot information (longitude and latitude of the snapshot point, point name, snapshot picture information, and so on) and the recognized Top1 person information (recognition similarity, person ID number, ID photo, and so on) are recorded as a person trajectory record and entered into the person historical travel trajectory table, which forms the historical travel trajectory database. Depending on whether the person of interest is identified or unidentified, the historical travel trajectory can be obtained in different ways, which improves recognition efficiency and accuracy. Moreover, once the person of interest is known to be an identified person, more information can be obtained from the identity, including but not limited to historical travel trajectory information.
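  • The record-building step described above can be sketched as follows; the field names of the trajectory record and the threshold value are illustrative assumptions, the essential points being the descending sort of the 1:N comparison results and the Top1 threshold check.

```python
from typing import List, Optional, Tuple

def build_trajectory_record(
    snapshot: dict,                                 # {"time", "lat", "lon", "point_name", "image_id"}
    comparison_results: List[Tuple[float, dict]],   # (similarity, person record) pairs from the 1:N search
    same_person_threshold: float = 0.9,             # illustrative "confirm same person" threshold
) -> Optional[dict]:
    """Return one row for the person historical travel trajectory table, or None."""
    if not comparison_results:
        return None
    # Sort in descending order of similarity: Top1, Top2, ..., TopN.
    ranked = sorted(comparison_results, key=lambda r: r[0], reverse=True)
    top1_similarity, top1_person = ranked[0]
    if top1_similarity < same_person_threshold:
        return None                                 # not confirmed as the same person
    # Snapshot information plus the recognized Top1 person information form one record.
    return {
        "snapshot_time": snapshot["time"],
        "snapshot_lat": snapshot["lat"],
        "snapshot_lon": snapshot["lon"],
        "point_name": snapshot["point_name"],
        "snapshot_image_id": snapshot["image_id"],
        "similarity": top1_similarity,
        "person_id_number": top1_person["id_number"],
        "person_id_photo": top1_person["id_photo"],
    }
```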
  • As shown in FIG. 4, the historical travel trajectory database is established through the following steps:
  • S111: Obtain a face snapshot, and compare the face snapshot with the real population database using the face recognition engine to obtain a third comparison result;
  • S112: If the highest-similarity result in the third comparison result is greater than or equal to a third threshold, the person in the face snapshot is determined to be an identified person whose identity is that of the highest-similarity result, and the snapshot information of the face snapshot, combined with the identified person's information, is added to the historical travel trajectory database.
  • The historical travel trajectory database is built for identified persons and contains both identity information and historical travel trajectory information composed of snapshot points. Building this database facilitates the subsequent comparison with the time and location of each case and improves the accuracy and efficiency of the association judgment.
  • The historical snapshot gallery is a gallery formed by clustering the acquired face snapshot images.
  • The real population database is a database established from the actual resident population.
  • The historical snapshot gallery groups only snapshots whose mutual similarity exceeds the set threshold, that is, snapshots considered to belong to the same person.
  • When the person of interest is an unidentified person, the historical travel trajectory information can be obtained from the historical snapshot gallery through the following steps: the unidentified person's photo is fed into the face recognition engine with a recognition similarity S and a snapshot time range T_start to T_end, and a 1:N search is performed against the historical snapshot gallery; the comparison results are sorted in descending order of similarity and recorded as Top1, Top2, ..., TopN.
  • Top1 to TopN constitute the unidentified person's travel trajectory within the time range T_start to T_end and, combined with the snapshot point information (longitude and latitude of the snapshot point, point name, and so on), form the trajectory information.
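  • A sketch of this retrieval step is shown below; the gallery is modelled as an in-memory list of snapshot records and the face comparison as a callable returning a similarity score, both of which are assumptions made for illustration.

```python
from datetime import datetime
from typing import Callable, List

def trajectory_for_unidentified(
    photo: bytes,
    snapshot_gallery: List[dict],                    # records with "image", "time", "lat", "lon", "point_name"
    similarity_fn: Callable[[bytes, bytes], float],  # face engine comparison score in [0, 1]
    similarity_s: float,                             # recognition similarity S
    t_start: datetime,
    t_end: datetime,
) -> List[dict]:
    """1:N search of the photo against the historical snapshot gallery within T_start..T_end."""
    hits = []
    for record in snapshot_gallery:
        if not (t_start <= record["time"] <= t_end):
            continue                                 # outside the requested snapshot time range
        score = similarity_fn(photo, record["image"])
        if score >= similarity_s:
            hits.append((score, record))
    # Descending order of similarity: Top1 ... TopN form the travel trajectory.
    hits.sort(key=lambda h: h[0], reverse=True)
    return [record for _, record in hits]
```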
  • The real population database contains records of persons with known identities, established from the "three standards and one actual" (san biao yi shi) population registration data or from other third-party sources.
  • During law enforcement and case handling, the case information (case name, incident time, longitude and latitude of the incident location, and so on) is entered into the case information table, and the information of the persons involved in the case (person ID number, ID photo, gender, age, and so on) is entered into the case information record table.
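  • The two tables mentioned here could be represented by record types such as the following; the field names are inferred from the fields listed in the text and are not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CaseRecord:
    """One row of the case information table."""
    case_name: str
    incident_time: datetime
    incident_lat: float      # latitude of the incident location
    incident_lon: float      # longitude of the incident location

@dataclass
class InvolvedPersonRecord:
    """One row of the case information record table (persons involved in the case)."""
    case_name: str
    id_number: str
    id_photo: bytes
    gender: str
    age: int
```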
  • In step S2, the distance d between the longitude and latitude of the snapshot point and the longitude and latitude of the incident location is calculated by the Haversine formula: haversin(d/R) = haversin(φ₂ − φ₁) + cos(φ₁)·cos(φ₂)·haversin(Δλ), where haversin(θ) = sin²(θ/2) = (1 − cos(θ))/2, R is the radius of the earth (an average value of 6371 km can be used), φ₁ and φ₂ are the latitudes of the snapshot point and the incident location, and Δλ is the difference between the longitudes of the snapshot point and the incident location.
  • The case information record table contains the incident location and incident time of each case. In step S2, each trajectory point in the person of interest's historical travel trajectory information is checked against each case: the snapshot time of the trajectory point is used to screen out the cases whose incident time falls within a certain period before or after the snapshot time, and the association between the person of interest and each such case is then determined more precisely from the distance between the snapshot point coordinates and the incident location.
  • In a preferred embodiment, the period before and after the snapshot time can be limited to 30 minutes, and the travel mode captured at the snapshot point, its corresponding average speed V, and the absolute value T of the time difference between the snapshot time and the incident time are used to obtain the theoretical distance S_t. In other optional embodiments, the time window around the snapshot time can be adjusted according to actual needs so that the most accurate range is obtained. Computing the association in this way improves its accuracy and efficiency.
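  • A sketch of the time-window screening and the theoretical-distance test described above follows; the 30 minute window comes from the text, while the average-speed table and function names are illustrative assumptions, and distance_km would be the Haversine distance from the earlier sketch.

```python
from datetime import datetime, timedelta

# Illustrative average speeds (km/h) per captured travel mode; the text does not specify values.
AVERAGE_SPEED_KMH = {"walking": 5.0, "cycling": 15.0, "driving": 40.0}

def case_in_time_window(snapshot_time: datetime, incident_time: datetime,
                        window: timedelta = timedelta(minutes=30)) -> bool:
    """Keep only cases whose incident time lies within the window around the snapshot time."""
    return abs(snapshot_time - incident_time) <= window

def passes_distance_test(distance_km: float, travel_mode: str,
                         snapshot_time: datetime, incident_time: datetime) -> bool:
    """Theoretical distance S_t = average speed of the captured travel mode x |time difference|."""
    dt_hours = abs((snapshot_time - incident_time).total_seconds()) / 3600.0
    theoretical_km = AVERAGE_SPEED_KMH.get(travel_mode, 5.0) * dt_hours
    return distance_km <= theoretical_km
```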
  • In step S3, if the person of interest is an identified person, the person of interest's ID number is matched against the ID numbers of the persons involved in the case; if they match, the association between the person of interest's trajectory and the case is strong. If they do not match, the person of interest's photo is compared with the photos of the persons involved to obtain a fourth comparison result; if the fourth comparison result is greater than or equal to a fourth threshold the association is strong, otherwise it is weak. Determining the association with the case in this way when the person of interest is identified improves recognition accuracy.
  • In step S3, if the person of interest is an unidentified person, the person of interest's photo is compared with the photos of the persons involved in the case to obtain a fifth comparison result; if the fifth comparison result is greater than or equal to a fifth threshold, the association between the person of interest's trajectory and the case is strong, otherwise it is weak.
  • Determining the association with the case in this way when the person of interest is unidentified improves recognition accuracy.
  • Referring further to FIG. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of a face recognition-based person-case association analysis apparatus, which corresponds to the method embodiment shown in FIG. 2.
  • the device can be specifically applied to various electronic devices.
  • The embodiment of the present application also proposes a face recognition-based person-case association analysis apparatus, as shown in FIG. 5, including:
  • a historical travel trajectory information acquisition module 1, configured to obtain information about the person of interest, determine from that information whether the person of interest is an identified person, and obtain the person of interest's historical travel trajectory information, where persons of interest include persons related to historical cases;
  • a distance calculation module 2, configured to traverse the historical travel trajectory information; obtain, for each trajectory point, the snapshot time, the longitude and latitude of the snapshot point, and the captured travel mode; obtain all case information within a certain time range before and after the snapshot time; compute the distance between the snapshot point coordinates and the incident location coordinates of each case in the case information; compute the absolute value of the time difference between the snapshot time and each case's incident time; and compute a theoretical distance as the product of the average speed of the captured travel mode and the absolute time difference; and
  • an association judgment module 3, configured so that if the distance is greater than the theoretical distance there is no association between the person of interest's trajectory and the case; if the distance is less than or equal to the theoretical distance and the person of interest is an identified person, the person of interest's ID number or photo is matched against the ID number or photo of the persons involved in the case to determine the association between the trajectory and the case; and if the person of interest is an unidentified person, the person of interest's photo is compared with the photos of the persons involved in the case to determine that association.
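  • One way the three modules could be composed is sketched below; the class and method names are illustrative, and the actual work of each module (face engine lookups, the distance test, the strong/weak judgment) is assumed to be injected from outside, as in the earlier sketches.

```python
from typing import Callable, List

class PersonCaseAssociationDevice:
    """Minimal wiring of the three modules described in this embodiment."""

    def __init__(self,
                 acquire_trajectories: Callable[[dict], List[dict]],          # module 1
                 screen_cases: Callable[[dict, List[dict]], List[dict]],      # module 2 (distance test)
                 judge_association: Callable[[dict, dict], str]):             # module 3 (strong/weak)
        self.acquire_trajectories = acquire_trajectories
        self.screen_cases = screen_cases
        self.judge_association = judge_association

    def analyze(self, person_of_interest: dict, cases: List[dict]) -> List[dict]:
        results = []
        for track_point in self.acquire_trajectories(person_of_interest):
            # Module 2: keep only the cases that pass the time-window and distance screening.
            for candidate_case in self.screen_cases(track_point, cases):
                # Module 3: classify the surviving trajectory/case pairs.
                results.append({
                    "track_point": track_point,
                    "case": candidate_case,
                    "association": self.judge_association(track_point, candidate_case),
                })
        return results
```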
  • The present disclosure proposes a method and device for analyzing the association between persons and cases based on face recognition. Information about a person of interest is obtained, whether the person of interest is an identified person is determined from that information, and the person's historical travel trajectory information is obtained, where persons of interest include persons related to historical cases. The historical travel trajectory information is traversed to obtain, for each trajectory point, the snapshot time, the longitude and latitude of the snapshot point, and the captured travel mode; all case information within a certain time range before and after the snapshot time is obtained; the distance between the snapshot point coordinates and each case's incident location coordinates is computed, together with the absolute value of the time difference between the snapshot time and each case's incident time; and a theoretical distance is computed as the product of the average speed of the captured travel mode and the absolute time difference.
  • If the distance is greater than the theoretical distance, there is no association between the person of interest's trajectory and the case; if the distance is less than or equal to the theoretical distance and the person of interest is an identified person, the person's ID number or photo is matched against the ID number or photo of the persons involved in the case to determine the association; and if the person of interest is unidentified, the person's photo is compared with the photos of the persons involved in the case to determine the association.
  • Using face recognition technology together with big-data analysis and retrieval, the association between persons and cases is analyzed quickly and accurately from face trajectories and case information, which effectively improves case-handling efficiency and reduces the misjudgment rate.
  • Referring to FIG. 6, it shows a schematic structural diagram of a computer apparatus 600 suitable for implementing the electronic device (for example, the server or terminal device shown in FIG. 1) of the embodiments of the present application.
  • the electronic device shown in FIG. 6 is only an example, and should not impose any limitations on the functions and scope of use of the embodiments of the present application.
  • The computer apparatus 600 includes a central processing unit (CPU) 601 and a graphics processing unit (GPU) 602, which can execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 603 or a program loaded from a storage section 609 into a random access memory (RAM) 604.
  • The RAM 604 also stores various programs and data required for the operation of the apparatus 600.
  • the CPU 601, GPU 602, ROM 603, and RAM 604 are connected to each other through a bus 605.
  • An input/output (I/O) interface 606 is also connected to bus 605 .
  • The following components are connected to the I/O interface 606: an input section 607 including a keyboard, a mouse, and the like; an output section 608 including a liquid crystal display (LCD), a speaker, and the like; a storage section 609 including a hard disk and the like; and a communication section 610 including a network interface card such as a LAN card or a modem. The communication section 610 performs communication processing via a network such as the Internet.
  • A drive 611 may also be connected to the I/O interface 606 as needed.
  • A removable medium 612, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 611 as needed, so that a computer program read from it can be installed into the storage section 609 as needed.
  • In particular, according to embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
  • In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 610, and/or installed from the removable medium 612. When the computer program is executed by the central processing unit (CPU) 601 and the graphics processing unit (GPU) 602, the above-described functions defined in the method of the present application are performed.
  • It should be noted that the computer-readable medium described in the present application may be a computer-readable signal medium or a computer-readable medium, or any combination of the two.
  • The computer-readable medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable media may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • In the present application, a computer-readable medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code therein. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • A computer-readable signal medium may also be any computer-readable medium other than the computer-readable medium described above, which can send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any suitable medium including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for performing the operations of the present application may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
  • In this regard, each block in the flowchart or block diagrams may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by dedicated hardware-based systems that perform the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • the modules involved in the embodiments of the present application may be implemented in a software manner, and may also be implemented in a hardware manner.
  • the described modules may also be provided in a processor.
  • As another aspect, the present application also provides a computer-readable medium.
  • The computer-readable medium may be included in the electronic device described in the above embodiments, or it may exist alone without being assembled into that electronic device.
  • The above computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: obtain information about a person of interest, determine from that information whether the person of interest is an identified person, and obtain the person of interest's historical travel trajectory information, where persons of interest include persons related to historical cases; traverse the historical travel trajectory information and obtain, for each trajectory point, the snapshot time, the longitude and latitude of the snapshot point, and the captured travel mode; obtain all case information within a certain time range before and after the snapshot time; compute the distance between the snapshot point coordinates and each case's incident location coordinates, together with the absolute value of the time difference between the snapshot time and each case's incident time; compute a theoretical distance as the product of the average speed of the captured travel mode and the absolute time difference; and, if the distance is greater than the theoretical distance, conclude that there is no association between the person of interest's trajectory and the case; if the distance is less than or equal to the theoretical distance and the person of interest is an identified person, match the person of interest's ID number or photo against the ID number or photo of the persons involved in the case to determine the association; and if the person of interest is an unidentified person, compare the person of interest's photo with the photos of the persons involved in the case to determine the association.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Tourism & Hospitality (AREA)
  • Library & Information Science (AREA)
  • Evolutionary Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Remote Sensing (AREA)
  • Collating Specific Patterns (AREA)

Abstract

A method and device for analyzing the association between persons and cases based on face recognition. The method comprises: obtaining information about a person of interest, determining from the information whether the person of interest is an identified person, and obtaining the person of interest's historical travel trajectory information, wherein persons of interest include persons related to historical cases (S1); traversing the historical travel trajectory information and determining the association between the person of interest's trajectory and a case according to the snapshot time, the longitude and latitude of the snapshot point, and the captured travel mode corresponding to each trajectory in the historical travel trajectory information; and, if the person of interest is an unidentified person, comparing the person of interest's photo with the photos of the persons involved in the case to determine the association between the person of interest's trajectory and the case. Using face recognition technology and big-data analysis and retrieval technology, the association between persons and cases is analyzed quickly and accurately from face trajectories and case information, which effectively improves case-handling efficiency and reduces the misjudgment rate.

Description

一种基于人脸识别的人员与案件关联分析方法和装置
相关申请
本申请要求保护在2020年11月16日提交的申请号为202011276290.4的中国专利申请的优先权,该申请的全部内容以引用的方式结合到本文中。
技术领域
本公开涉及人脸识别领域,具体涉及一种基于人脸识别的人员与案件关联分析方法和装置。
背景技术
随着近几年人脸识别技术、大数据技术在安防领域的深入应用以及智能监控设备的快速更新迭代,利用这些技术来丰富执法办案人员的办案手段,提升办案效率,构建智慧警务、智能办案的需求也越来越迫切。
传统的警务办案手段效率低下,某些案件案情复杂,涉及大量的线索和涉案关注人员需要办案人员人工筛查监控设备、走访相关嫌疑人出行轨迹和过滤无关人员,才能确定人员与案件的关联关系,并容易造成错判、漏判。
传统的警务办案时,一个案件发生时,执法人员往往需要采集涉案人员信息和发案地信息,人工排查发案地周边监控点位和涉案人员出现过的监控点位,还原涉案人员的轨迹,寻找破案线索。当案情复杂或涉案人员较多时,此做法费时费力,容易耽误破案的最佳时机。并且此种办案方式无法高效地分析涉案人员是否与其他案件有关联。
有鉴于此,建立一种基于人脸识别的人员与案件关联分析方法和装置是非常具有意义的。
公开内容
针对上述提到现有技术中人员与案件关联关系的判断效率低、准确性不高、时间长等问题。本申请的实施例的目的在于提出了一种基于人脸识别的人员与案件关联分析方法和装 置来解决以上背景技术部分提到的技术问题。
第一方面,本申请的实施例提供了一种基于人脸识别的人员与案件关联分析方法,包括以下步骤:
S1:获取关注人员的信息,通过信息判断关注人员是否是已识别身份的人员,并获取关注人员的历史出行轨迹信息,其中,关注人员包括与历史案件相关的人员;
S2:遍历历史出行轨迹信息,获取历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在抓拍时间的前后一定时间范围内的所有案件信息,将抓拍点位经纬度分别与案件信息中的每个案件的发案地点经纬度计算距离,并计算抓拍时间与每个案件的发案时间的时间差绝对值,根据抓拍出行方式的平均速度与时间差绝对值的乘积计算理论距离;以及
S3:若距离大于理论距离,则关注人员的轨迹与案件之间无关联关系,若距离小于或等于理论距离,并且若关注人员是已识别身份的人员,则将关注人员的证件号或照片与涉案人员的证件号或照片进行匹配确定关注人员的轨迹与案件之间的关联关系,若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对确定关注人员的轨迹与案件之间的关联关系。
在一些实施例中,关注人员的信息包括证件号或照片,关注人员通过证件号或照片确定为已识别身份的人员,通过照片确定为未识别身份的人员。因为在现实办案过程中关注人员所给出信息不同也不一定完整,所以难以确定是已识别身份的人员或未识别身份的人员,因此可以通过确定身份来提高关联关系分析的效率。
在一些实施例中,步骤S1具体包括以下步骤:
S11:若关注人员的信息为证件号,则关注人员为已识别身份的人员,根据关注人员的身份信息在历史出行轨迹库中查询获得关注人员的历史出行轨迹信息;
S12:若关注人员的信息为照片,则通过人脸识别引擎将照片与实有人口底库进行比对得到第一对比结果,若第一比对结果中最高相似度结果大于或等于第一阈值,则确定关注人员为已识别身份的人员,关注人员的身份为最高相似度结果所对应的人员身份,根据关注人员的身份信息在历史出行轨迹库中获得关注人员的历史出行轨迹信息;以及
S13:若第一比对结果中最高相似度结果小于第一阈值,则确定关注人员为未识别身份的人员,通过人脸识别引擎将照片与历史抓拍图库进行比对得到第二对比结果,将第二比对 结果中大于或等于第二阈值所对应的历史抓拍图库结合对应的抓拍点位信息形成关注人员的历史出行轨迹信息。
根据关注人员是已识别身份人员还是未识别身份人员可以通过不同的方式获得关注人员的历史出行轨迹,并且提高了识别效率,加强了识别准确率。
在一些实施例中,历史出行轨迹库通过以下步骤建立:
S111:获取人脸抓拍图,并通过人脸识别引擎将人脸抓拍图与实有人口底库进行比对得到第三对比结果;
S112:若第三比对结果中最高相似度结果大于或等于第三阈值,则确定人脸抓拍图所对应的人员为已识别身份的人员,人脸抓拍图所对应的人员的身份为最高相似度结果所对应的人员身份,将人脸抓拍图的抓拍信息结合已识别身份的人员的信息形成历史出行轨迹库。
构建历史出行轨迹库有利于后续与案件的发案时间和地点进行比对,提高关联关系判断的准确性和效率。
在一些实施例中,历史抓拍图库为根据获取的人脸抓拍图聚类形成的图库,实有人口底库为根据实际现有人口建立的数据库。
在一些实施例中,步骤S2中通过Haversine公式计算抓拍点位经纬度与发案地点经纬度之间的距离:
haversin(d/R) = haversin(φ₂−φ₁) + cos(φ₁)·cos(φ₂)·haversin(Δλ)
其中，haversin(θ) = sin²(θ/2) = (1−cos(θ))/2，R为地球半径，可取平均值6371km，φ₁、φ₂表示抓拍点位和发案地点的纬度，Δλ表示抓拍点位和发案地点的经度的差值，d为抓拍点位与发案地点之间的距离。
抓拍点位和发案地点之间的距离可以作为关注人员与案件的关联性的判断依据之一。通过该方式的计算可以提高案件关联的准确性。
在一些实施例中,步骤S3中若关注人员是已识别身份的人员,则将关注人员的证件号与涉案人员的证件号进行匹配,若匹配上则关注人员的轨迹与案件之间为强关联关系;若不匹配,则根据关注人员的照片与涉案人员的照片进行比对,得到第四比对结果,若第四比对结果大于或等于第四阈值,则关注人员的轨迹与案件之间为强关联关系,否则为弱关联关系。根据关注人员是已识别身份的人员采用此方式进行与案件的关联关系确定,可以提高识别的准确性。
在一些实施例中,若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对,得到第五比对结果,若第五比对结果大于或等于第五阈值,则关注人员的轨迹与案件之间为强关联关系,否则为弱关联关系。根据关注人员是未识别身份的人员采用此方式进行与案件的关联关系确定,可以提高识别的准确性。
第二方面,本申请的实施例还提出了一种基于人脸识别的人员与案件关联分析装置,包括:
历史出行轨迹信息获取模块,被配置为获取关注人员的信息,通过信息判断关注人员是否是已识别身份的人员,并获取关注人员的历史出行轨迹信息,其中,关注人员包括与历史案件相关的人员;
距离计算模块,被配置为遍历历史出行轨迹信息,获取历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在抓拍时间的前后一定时间范围内的所有案件信息,将抓拍点位经纬度分别与案件信息中的每个案件的发案地点经纬度计算距离,并计算抓拍时间与每个案件的发案时间的时间差绝对值,根据抓拍出行方式的平均速度与时间差绝对值的乘积计算理论距离;以及
关系判断模块,被配置为若距离大于理论距离,则关注人员的轨迹与案件之间无关联关系,若距离小于或等于理论距离,并且若关注人员是已识别身份的人员,则将关注人员的证件号或照片与涉案人员的证件号或照片进行匹配确定关注人员的轨迹与案件之间的关联关系,若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对确定关注人员的轨迹与案件之间的关联关系。
第三方面,本申请实施例提供了一种电子设备,包括:一个或多个处理器;存储装置,用于存储一个或多个程序,当一个或多个程序被一个或多个处理器执行,使得一个或多个处理器实现如第一方面中任一实现方式描述的方法。
第四方面,本申请实施例提供了一种计算机可读存储介质,其上存储有计算机程序,该计算机程序被处理器执行时实现如第一方面中任一实现方式描述的方法。
本公开提出了一种基于人脸识别的人员与案件关联关系分析方法和装置,获取关注人员的信息,通过信息判断关注人员是否是已识别身份的人员,并获取关注人员的历史出行轨迹信息,其中,关注人员包括与历史案件相关的人员;遍历历史出行轨迹信息,获取历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在抓拍时间的 前后一定时间范围内的所有案件信息,将抓拍点位经纬度分别与案件信息中的每个案件的发案地点经纬度计算距离,并计算抓拍时间与每个案件的发案时间的时间差绝对值,根据抓拍出行方式的平均速度与时间差绝对值的乘积计算理论距离;若距离大于理论距离,则关注人员的轨迹与案件之间无关联关系,若距离小于或等于理论距离,并且若关注人员是已识别身份的人员,则将关注人员的证件号或照片与涉案人员的证件号或照片进行匹配确定关注人员的轨迹与案件之间的关联关系,若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对确定关注人员的轨迹与案件之间的关联关系。利用人脸识别技术和大数据分析检索技术,根据人脸轨迹和案件信息,快速准确地分析出人员与案件的关联关系,有效提高办案效率,减少误判率。
附图说明
为了更清楚地说明本公开实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简要介绍,显而易见地,下面描述中的附图仅仅是本公开的一些实施例,对于本领域的普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请的一个实施例可以应用于其中的示例性装置架构图;
图2为本公开的实施例的基于人脸识别的人员与案件关联分析方法的流程示意图;
图3为本公开的实施例的基于人脸识别的人员与案件关联分析方法的步骤S1的流程示意图;
图4为本公开的实施例的基于人脸识别的人员与案件关联分析方法的历史出行轨迹库建立的流程示意图;
图5为本公开的实施例的基于人脸识别的人员与案件关联分析装置的示意图;
图6是适于用来实现本申请实施例的电子设备的计算机装置的结构示意图。
具体实施方式
为了使本公开的目的、技术方案和优点更加清楚,下面将结合附图对本公开作进一步地详细描述,显然,所描述的实施例仅仅是本公开一部分实施例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其它 实施例,都属于本公开保护的范围。
图1示出了可以应用本申请实施例的基于人脸识别的人员与案件关联分析方法或基于人脸识别的人员与案件关联分析装置的示例性装置架构100。
如图1所示,装置架构100可以包括终端设备101、102、103,网络104和服务器105。网络104用以在终端设备101、102、103和服务器105之间提供通信链路的介质。网络104可以包括各种连接类型,例如有线、无线通信链路或者光纤电缆等等。
用户可以使用终端设备101、102、103通过网络104与服务器105交互,以接收或发送消息等。终端设备101、102、103上可以安装有各种应用,例如数据处理类应用、文件处理类应用等。
终端设备101、102、103可以是硬件,也可以是软件。当终端设备101、102、103为硬件时,可以是各种电子设备,包括但不限于智能手机、平板电脑、膝上型便携计算机和台式计算机等等。当终端设备101、102、103为软件时,可以安装在上述所列举的电子设备中。其可以实现成多个软件或软件模块(例如用来提供分布式服务的软件或软件模块),也可以实现成单个软件或软件模块。在此不做具体限定。
服务器105可以是提供各种服务的服务器,例如对终端设备101、102、103上传的文件或数据进行处理的后台数据处理服务器。后台数据处理服务器可以对获取的文件或数据进行处理,生成处理结果。
需要说明的是,本申请实施例所提供的基于人脸识别的人员与案件关联分析方法可以由服务器105执行,也可以由终端设备101、102、103执行,相应地,基于人脸识别的人员与案件关联分析装置可以设置于服务器105中,也可以设置于终端设备101、102、103中。
应该理解,图1中的终端设备、网络和服务器的数目仅仅是示意性的。根据实现需要,可以具有任意数目的终端设备、网络和服务器。在所处理的数据不需要从远程获取的情况下,上述装置架构可以不包括网络,而只需服务器或终端设备。
图2示出了本申请的实施例公开了一种基于人脸识别的人员与案件关联分析方法,包括以下步骤:
S1:获取关注人员的信息,通过信息判断关注人员是否是已识别身份的人员,并获取关注人员的历史出行轨迹信息,其中,关注人员包括与历史案件相关的人员;
S2:遍历历史出行轨迹信息,获取历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在抓拍时间的前后一定时间范围内的所有案件信息,将抓拍点位经纬度分别与案件信息中的每个案件的发案地点经纬度计算距离,并计算抓拍时间与每个案件的发案时间的时间差绝对值,根据抓拍出行方式的平均速度与时间差绝对值的乘积计算理论距离;以及
S3:若距离大于理论距离,则关注人员的轨迹与案件之间无关联关系,若距离小于或等于理论距离,并且若关注人员是已识别身份的人员,则将关注人员的证件号或照片与涉案人员的证件号或照片进行匹配确定关注人员的轨迹与案件之间的关联关系,若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对确定关注人员的轨迹与案件之间的关联关系。
在优选的实施例中,关注人员可以为已经确定跟某个案件相关的人员,但是不知道与其他案件是否存在关联性,也可以是其他执法办案中存在其他问题的人员,或者是需要重点关注的人员。关注人员与案件关联关系分析旨在利用人脸识别和大数据技术,快速准确分析出执法办案过程中需关注的人员与某些案件潜在的关联关系。即在设定的时间范围内(如最近三个月),获取该时间范围内关注人员的历史出行轨迹和该时间范围内发生的案件,结合空间条件(如某轨迹点周围3公里)和关注人员身份与涉案人员身份匹配程度、关注人员照片与涉案人员照片比对信息等,综合分析出关注人员与案件的关联关系。
在具体的实施例中,关注人员的信息包括证件号或照片,关注人员可以通过证件号或照片确定为已识别身份的人员,也可以通过照片确定为未识别身份的人员。因为在现实办案过程中关注人员所给出信息不同也不一定完整,有的仅提供照片,有的提供证件号等详细信息,因此难以确定是已识别身份的人员或未识别身份的人员,由此可以先通过确定身份来提高关联关系分析的效率。另外也是根据关注人员是已识别身份的人员还是未识别身份的人员进一步可以分别通过不同的方式来判断与案件的关联性。
在具体的实施例中,如图3所示,步骤S1具体包括以下步骤:
S11:若关注人员的信息为证件号,则关注人员为已识别身份的人员,根据关注人员的身份信息在历史出行轨迹库中查询获得关注人员的历史出行轨迹信息;
S12:若关注人员的信息为照片,则通过人脸识别引擎将照片与实有人口底库进行比对得到第一对比结果,若第一比对结果中最高相似度结果大于或等于第一阈值,则确定关注人 员为已识别身份的人员,关注人员的身份为最高相似度结果所对应的人员身份,根据关注人员的身份信息在历史出行轨迹库中获得关注人员的历史出行轨迹信息;以及
S13:若第一比对结果中最高相似度结果小于第一阈值,则确定关注人员为未识别身份的人员,通过人脸识别引擎将照片与历史抓拍图库进行比对得到第二对比结果,将第二比对结果中大于或等于第二阈值所对应的历史抓拍图库结合对应的抓拍点位信息形成关注人员的历史出行轨迹信息。
在具体的实施例中,根据关注人员的照片确定历史出行轨迹的步骤包括:将人脸抓拍设备实时抓拍上传的人脸抓拍图录入人脸识别引擎并设定识别相似度,与实有人口底图库做1:N检索,比对结果按比对相似度倒序排序,记为Top1、Top2…TopN。如果最高相似度结果Top1的值大于设定的确认同一人比对阈值,则此人脸抓拍图中的人员确定为已识别身份人员,其身份即为Top1所对应的人员身份。记录抓拍信息(抓拍点位经纬度、点位名称、抓拍图片信息等)、识别Top1人员信息(识别相似度、人员证件号、证件照等信息),形成一条人员轨迹记录录入人员历史出行轨迹记录表内形成历史出行轨迹库。根据关注人员是已识别身份人员还是未识别身份人员可以通过不同的方式获得关注人员的历史出行轨迹,并且提高了识别效率,加强了识别准确率。并且在获知关注人员是已识别身份的人员,可以根据其身份获得更多信息,包括但不限于历史出行轨迹信息。
在具体的实施例中,如图4所示,历史出行轨迹库通过以下步骤建立:
S111:获取人脸抓拍图,并通过人脸识别引擎将人脸抓拍图与实有人口底库进行比对得到第三对比结果;
S112:若第三比对结果中最高相似度结果大于或等于第三阈值,则确定人脸抓拍图所对应的人员为已识别身份的人员,人脸抓拍图所对应的人员的身份为最高相似度结果所对应的人员身份,将人脸抓拍图的抓拍信息结合已识别身份的人员的信息形成历史出行轨迹库。
历史出行轨迹库是针对已识别身份的人员建立的数据库,里面既包含身份信息,又包含由抓拍点位等组成的历史出行轨迹信息。构建历史出行轨迹库有利于后续与案件的发案时间和地点进行比对,提高关联关系判断的准确性和效率。
在具体的实施例中,历史抓拍图库为根据获取的人脸抓拍图聚类形成的图库,实有人口底库为根据实际现有人口建立的数据库。历史抓拍图库中仅仅是由相似度超过设定阈值,认为是同一人所形成的历史抓拍图库,关注人员为未识别身份的人员具体可以根据历史抓拍 图库通过以下步骤获取历史出行轨迹信息:将未识别身份人员照片录入人脸识别引擎并设定识别相似度S和抓拍时间范围T start~T end,与历史抓拍图片库做1:N检索,比对结果按比对相似度倒序排序,记为Top1、Top2…TopN。Top 1~N即为该未识别身份人员T start~T end时间范围内的出行轨迹。结合抓拍点位信息(抓拍点位经纬度、点位名称等)形成轨迹信息。实有人口底库包含根据三标一实或其他第三方建立的已知身份的人员的数据库。执法办案人员执法办案过程中将案件信息(案件名称、发案时间、发案地点经纬度等)录入案件信息表,并将此案件相关的涉案人员信息(人员证件号、人员证件照、人员性别年龄等)录入案件信息记录表中。
在具体的实施例中,步骤S2中通过Haversine公式计算抓拍点位经纬度与发案地点经纬度之间的距离:
haversin(d/R) = haversin(φ₂−φ₁) + cos(φ₁)·cos(φ₂)·haversin(Δλ)
其中，haversin(θ) = sin²(θ/2) = (1−cos(θ))/2，R为地球半径，可取平均值6371km，φ₁、φ₂表示抓拍点位和发案地点的纬度，Δλ表示抓拍点位和发案地点的经度的差值，d为抓拍点位与发案地点之间的距离。
在案件信息记录表中包含每个案件的发案地点和发案时间。在步骤S2中通过关注人员的历史出行轨迹信息中的每条轨迹判断与每个案件是否具有关联性,通过每条轨迹中的抓拍时间来筛选出在发案时间在抓拍时间的前后一定时间所发生的相关案件,再根据抓拍点位的经纬度和发案地点之间的距离进一步精确地确定关注人员与案件的关联性。在优选的实施例中,抓拍时间的前后一定时间可以限定为前后30分钟内。并且关注在抓拍点位抓拍到的出行方式,结合出行方式所对应的平均速度V以及抓拍时间和发案时间的时间差绝对值T得到理论距离S t。在其他可选的实施例中,抓拍时间的前后一定时间可以根据实际需求进行调整,因此也能获取到最佳的准确度最高的时间范围。通过该方式的计算可以提高案件关联的准确性和效率。
在具体的实施例中,步骤S3中若关注人员是已识别身份的人员,则将关注人员的证件号与涉案人员的证件号进行匹配,若匹配上则关注人员的轨迹与案件之间为强关联关系;若不匹配,则根据关注人员的照片与涉案人员的照片进行比对,得到第四比对结果,若第四比对结果大于或等于第四阈值,则关注人员的轨迹与案件之间为强关联关系,否则为弱关联关系。根据关注人员是已识别身份的人员采用此方式进行与案件的关联关系确定,可以提高识 别的准确性。
在具体的实施例中,步骤S3中若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对,得到第五比对结果,若第五比对结果大于或等于第五阈值,则关注人员的轨迹与案件之间为强关联关系,否则为弱关联关系。根据关注人员是未识别身份的人员采用此方式进行与案件的关联关系确定,可以提高识别的准确性。
进一步参考图5,作为对上述各图所示方法的实现,本申请提供了一种基于人脸识别的人员与案件关联分析装置的一个实施例,该装置实施例与图2所示的方法实施例相对应,该装置具体可以应用于各种电子设备中。
本申请的实施例还提出了一种基于人脸识别的人员与案件关联分析装置,如图5所示,包括:
历史出行轨迹信息获取模块1,被配置为获取关注人员的信息,通过信息判断关注人员是否是已识别身份的人员,并获取关注人员的历史出行轨迹信息,其中,关注人员包括与历史案件相关的人员;
距离计算模块2,被配置为遍历历史出行轨迹信息,获取历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在抓拍时间的前后一定时间范围内的所有案件信息,将抓拍点位经纬度分别与案件信息中的每个案件的发案地点经纬度计算距离,并计算抓拍时间与每个案件的发案时间的时间差绝对值,根据抓拍出行方式的平均速度与时间差绝对值的乘积计算理论距离;以及
关系判断模块3,被配置为若距离大于理论距离,则关注人员的轨迹与案件之间无关联关系,若距离小于或等于理论距离,并且若关注人员是已识别身份的人员,则将关注人员的证件号或照片与涉案人员的证件号或照片进行匹配确定关注人员的轨迹与案件之间的关联关系,若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对确定关注人员的轨迹与案件之间的关联关系。
本公开提出了一种基于人脸识别的人员与案件关联关系分析方法和装置,获取关注人员的信息,通过信息判断关注人员是否是已识别身份的人员,并获取关注人员的历史出行轨迹信息,其中,关注人员包括与历史案件相关的人员;遍历历史出行轨迹信息,获取历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在抓拍时间的前后一定时间范围内的所有案件信息,将抓拍点位经纬度分别与案件信息中的每个案件的 发案地点经纬度计算距离,并计算抓拍时间与每个案件的发案时间的时间差绝对值,根据抓拍出行方式的平均速度与时间差绝对值的乘积计算理论距离;若距离大于理论距离,则关注人员的轨迹与案件之间无关联关系,若距离小于或等于理论距离,并且若关注人员是已识别身份的人员,则将关注人员的证件号或照片与涉案人员的证件号或照片进行匹配确定关注人员的轨迹与案件之间的关联关系,若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对确定关注人员的轨迹与案件之间的关联关系。利用人脸识别技术和大数据分析检索技术,根据人脸轨迹和案件信息,快速准确地分析出人员与案件的关联关系,有效提高办案效率,减少误判率。
下面参考图6,其示出了适于用来实现本申请实施例的电子设备(例如图1所示的服务器或终端设备)的计算机装置600的结构示意图。图6示出的电子设备仅仅是一个示例,不应对本申请实施例的功能和使用范围带来任何限制。
如图6所示,计算机装置600包括中央处理单元(CPU)601和图形处理器(GPU)602,其可以根据存储在只读存储器(ROM)603中的程序或者从存储部分609加载到随机访问存储器(RAM)606中的程序而执行各种适当的动作和处理。在RAM 604中,还存储有装置600操作所需的各种程序和数据。CPU 601、GPU602、ROM 603以及RAM 604通过总线605彼此相连。输入/输出(I/O)接口606也连接至总线605。
以下部件连接至I/O接口606:包括键盘、鼠标等的输入部分607;包括诸如、液晶显示器(LCD)等以及扬声器等的输出部分608;包括硬盘等的存储部分609;以及包括诸如LAN卡、调制解调器等的网络接口卡的通信部分610。通信部分610经由诸如因特网的网络执行通信处理。驱动器611也可以根据需要连接至I/O接口606。可拆卸介质612,诸如磁盘、光盘、磁光盘、半导体存储器等等,根据需要安装在驱动器611上,以便于从其上读出的计算机程序根据需要被安装入存储部分609。
特别地,根据本公开的实施例,上文参考流程图描述的过程可以被实现为计算机软件程序。例如,本公开的实施例包括一种计算机程序产品,其包括承载在计算机可读介质上的计算机程序,该计算机程序包含用于执行流程图所示的方法的程序代码。在这样的实施例中,该计算机程序可以通过通信部分610从网络上被下载和安装,和/或从可拆卸介质612被安装。在该计算机程序被中央处理单元(CPU)601和图形处理器(GPU)602执行时,执行本申请的方法中限定的上述功能。
需要说明的是,本申请所述的计算机可读介质可以是计算机可读信号介质或者计算机可读介质或者是上述两者的任意组合。计算机可读介质例如可以是——但不限于——电、磁、光、电磁、红外线、或半导体的装置、装置或器件,或者任意以上的组合。计算机可读介质的更具体的例子可以包括但不限于:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机访问存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。在本申请中,计算机可读介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行装置、装置或者器件使用或者与其结合使用。而在本申请中,计算机可读的信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读的信号介质还可以是计算机可读介质以外的任何计算机可读介质,该计算机可读介质可以发送、传播或者传输用于由指令执行装置、装置或者器件使用或者与其结合使用的程序。计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于:无线、电线、光缆、RF等等,或者上述的任意合适的组合。
可以以一种或多种程序设计语言或其组合来编写用于执行本申请的操作的计算机程序代码,所述程序设计语言包括面向对象的程序设计语言—诸如Java、Smalltalk、C++,还包括常规的过程式程序设计语言—诸如“C”语言或类似的程序设计语言。程序代码可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络——包括局域网(LAN)或广域网(WAN)—连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。
附图中的流程图和框图,图示了按照本申请各种实施例的装置、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段、或代码的一部分,该模块、程序段、或代码的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。也应当注意,在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个接连地表示 的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或操作的专用的基于硬件的装置来实现,或者可以用专用硬件与计算机指令的组合来实现。
描述于本申请实施例中所涉及到的模块可以通过软件的方式实现,也可以通过硬件的方式来实现。所描述的模块也可以设置在处理器中。
作为另一方面,本申请还提供了一种计算机可读介质,该计算机可读介质可以是上述实施例中描述的电子设备中所包含的;也可以是单独存在,而未装配入该电子设备中。上述计算机可读介质承载有一个或者多个程序,当上述一个或者多个程序被该电子设备执行时,使得该电子设备:获取关注人员的信息,通过信息判断关注人员是否是已识别身份的人员,并获取关注人员的历史出行轨迹信息,其中,关注人员包括与历史案件相关的人员;遍历历史出行轨迹信息,获取历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在抓拍时间的前后一定时间范围内的所有案件信息,将抓拍点位经纬度分别与案件信息中的每个案件的发案地点经纬度计算距离,并计算抓拍时间与每个案件的发案时间的时间差绝对值,根据抓拍出行方式的平均速度与时间差绝对值的乘积计算理论距离;若距离大于理论距离,则关注人员的轨迹与案件之间无关联关系,若距离小于或等于理论距离,并且若关注人员是已识别身份的人员,则将关注人员的证件号或照片与涉案人员的证件号或照片进行匹配确定关注人员的轨迹与案件之间的关联关系,若关注人员为未识别身份的人员,则将关注人员的照片与案件的涉案人员照片进行比对确定关注人员的轨迹与案件之间的关联关系。
以上描述仅为本申请的较佳实施例以及对所运用技术原理的说明。本领域技术人员应当理解,本申请中所涉及的公开范围,并不限于上述技术特征的特定组合而成的技术方案,同时也应涵盖在不脱离上述公开构思的情况下,由上述技术特征或其等同特征进行任意组合而形成的其它技术方案。例如上述特征与本申请中公开的(但不限于)具有类似功能的技术特征进行互相替换而形成的技术方案。

Claims (11)

  1. 一种基于人脸识别的人员与案件关联分析方法,其特征在于,包括以下步骤:
    S1:获取关注人员的信息,通过所述信息判断所述关注人员是否是已识别身份的人员,并获取所述关注人员的历史出行轨迹信息,其中,所述关注人员包括与历史案件相关的人员;
    S2:遍历所述历史出行轨迹信息,获取所述历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在所述抓拍时间的前后一定时间范围内的所有案件信息,将所述抓拍点位经纬度分别与所述案件信息中的每个案件的发案地点经纬度计算距离,并计算所述抓拍时间与所述每个案件的发案时间的时间差绝对值,根据所述抓拍出行方式的平均速度与所述时间差绝对值的乘积计算理论距离;以及
    S3:若所述距离与所述理论距离的差值绝对值大于预设范围,则所述关注人员的轨迹与所述案件之间无关联关系,若所述距离与所述理论距离的差值绝对值小于或等于所述预设范围,并且若所述关注人员是已识别身份的人员,则将所述关注人员的证件号或照片与所述涉案人员的证件号或照片进行匹配确定所述关注人员的轨迹与所述案件之间的关联关系,若所述关注人员为未识别身份的人员,则将所述关注人员的照片与所述案件的涉案人员照片进行比对确定所述关注人员的轨迹与所述案件之间的关联关系。
  2. 根据权利要求1所述的基于人脸识别的人员与案件关联分析方法,其特征在于,所述关注人员的信息包括证件号或照片,所述关注人员通过所述证件号或所述照片确定为已识别身份的人员,通过所述照片确定为未识别身份的人员。
  3. 根据权利要求2所述的基于人脸识别的人员与案件关联分析方法,其特征在于,所述步骤S1具体包括以下步骤:
    S11:若所述关注人员的信息为证件号,则所述关注人员为已识别身份的人员,根据所述关注人员的身份信息在历史出行轨迹库中查询获得所述关注人员的历史出行轨迹信息;
    S12:若所述关注人员的信息为照片,则通过人脸识别引擎将所述照片与实有人口底库进行比对得到第一对比结果,若所述第一比对结果中最高相似度结果大于或等于第一阈值,则确定所述关注人员为已识别身份的人员,所述关注人员的身份为所述最高相似度结果所对应的人员身份,根据所述关注人员的身份信息在所述历史出行轨迹库中获得所述关注人 员的历史出行轨迹信息;以及
    S13:若所述第一比对结果中最高相似度结果小于第一阈值,则确定所述关注人员为未识别身份的人员,通过人脸识别引擎将所述照片与历史抓拍图库进行比对得到第二对比结果,将所述第二比对结果中大于或等于第二阈值所对应的所述历史抓拍图库结合对应的抓拍点位信息形成所述关注人员的历史出行轨迹信息。
  4. 根据权利要求3所述的基于人脸识别的人员与案件关联分析方法,其特征在于,所述历史出行轨迹库通过以下步骤建立:
    S111:获取人脸抓拍图,并通过人脸识别引擎将所述人脸抓拍图与所述实有人口底库进行比对得到第三对比结果;
    S112:若所述第三比对结果中最高相似度结果大于或等于第三阈值,则确定所述人脸抓拍图所对应的人员为已识别身份的人员,所述人脸抓拍图所对应的人员的身份为所述最高相似度结果所对应的人员身份,将所述人脸抓拍图的抓拍信息结合所述已识别身份的人员的信息形成所述历史出行轨迹库。
  5. 根据权利要求3所述的基于人脸识别的人员与案件关联分析方法,其特征在于,所述历史抓拍图库为根据获取的所述人脸抓拍图聚类形成的图库,所述实有人口底库为根据实际现有人口建立的数据库。
  6. 根据权利要求1所述的基于人脸识别的人员与案件关联分析方法,其特征在于,所述步骤S2中通过Haversine公式计算所述抓拍点位经纬度与发案地点经纬度之间的距离:
    haversin(d/R) = haversin(φ₂−φ₁) + cos(φ₁)·cos(φ₂)·haversin(Δλ)
    其中，haversin(θ) = sin²(θ/2) = (1−cos(θ))/2，R为地球半径，可取平均值6371km，φ₁、φ₂表示所述抓拍点位和所述发案地点的纬度，Δλ表示所述抓拍点位和所述发案地点的经度的差值。
  7. 根据权利要求1所述的基于人脸识别的人员与案件关联分析方法,其特征在于,所述步骤S3中若所述关注人员是已识别身份的人员,则将所述关注人员的证件号与所述涉案人员的证件号进行匹配,若匹配上则所述关注人员的轨迹与所述案件之间为强关联关系;若不匹配,则根据所述关注人员的照片与所述涉案人员的照片进行比对,得到第四比对结果,若所述第四比对结果大于或等于第四阈值,则所述关注人员的轨迹与所述案件之间为强关 联关系,否则为弱关联关系。
  8. 根据权利要求1所述的基于人脸识别的人员与案件关联分析方法,其特征在于,若所述关注人员为未识别身份的人员,则将所述关注人员的照片与所述案件的涉案人员照片进行比对,得到第五比对结果,若所述第五比对结果大于或等于第五阈值,则所述关注人员的轨迹与所述案件之间为强关联关系,否则为弱关联关系。
  9. 一种基于人脸识别的人员与案件关联分析装置,其特征在于,包括:
    历史出行轨迹信息获取模块,被配置为获取关注人员的信息,通过所述信息判断所述关注人员是否是已识别身份的人员,并获取所述关注人员的历史出行轨迹信息,其中,所述关注人员包括与历史案件相关的人员;
    距离计算模块,被配置为遍历所述历史出行轨迹信息,获取所述历史出行轨迹信息中每个轨迹对应的抓拍时间、抓拍点经纬度及抓拍出行方式,获取在所述抓拍时间的前后一定时间范围内的所有案件信息,将所述抓拍点位经纬度分别与所述案件信息中的每个案件的发案地点经纬度计算距离,并计算所述抓拍时间与所述每个案件的发案时间的时间差绝对值,根据所述抓拍出行方式的平均速度与所述时间差绝对值的乘积计算理论距离;以及
    关系判断模块，被配置为若所述距离大于所述理论距离，则所述关注人员的轨迹与所述案件之间无关联关系，若所述距离小于或等于所述理论距离，并且若所述关注人员是已识别身份的人员，则将所述关注人员的证件号或照片与所述涉案人员的证件号或照片进行匹配确定所述关注人员的轨迹与所述案件之间的关联关系，若所述关注人员为未识别身份的人员，则将所述关注人员的照片与所述案件的涉案人员照片进行比对确定所述关注人员的轨迹与所述案件之间的关联关系。
  10. 一种电子设备,包括:
    一个或多个处理器;
    存储装置,用于存储一个或多个程序,
    当所述一个或多个程序被所述一个或多个处理器执行,使得所述一个或多个处理器实现如权利要求1-8中任一所述的方法。
  11. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现如权利要求1-8中任一所述的方法。
PCT/CN2020/139838 2020-11-16 2020-12-28 一种基于人脸识别的人员与案件关联分析方法和装置 WO2022099884A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011276290.4A CN112347296B (zh) 2020-11-16 2020-11-16 一种基于人脸识别的人员与案件关联分析方法和装置
CN202011276290.4 2020-11-16

Publications (1)

Publication Number Publication Date
WO2022099884A1 true WO2022099884A1 (zh) 2022-05-19

Family

ID=74363850

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/139838 WO2022099884A1 (zh) 2020-11-16 2020-12-28 一种基于人脸识别的人员与案件关联分析方法和装置

Country Status (2)

Country Link
CN (1) CN112347296B (zh)
WO (1) WO2022099884A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115809256A (zh) * 2023-02-22 2023-03-17 中关村科学城城市大脑股份有限公司 治安管理综合信息系统和可视化展示方法
CN117319559A (zh) * 2023-11-24 2023-12-29 杭州度言软件有限公司 一种基于智能语音机器人的催收方法与系统

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113468948B (zh) * 2021-04-26 2023-11-10 深圳市安软科技股份有限公司 基于视图数据的治安防控方法、模块、设备及存储介质
CN114491148B (zh) * 2022-04-14 2022-07-12 武汉中科通达高新技术股份有限公司 目标人员搜索方法、装置、计算机设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110276272A (zh) * 2019-05-30 2019-09-24 罗普特科技集团股份有限公司 确认标签人员的同行人员关系的方法、装置、存储介质
CN110334120A (zh) * 2019-06-28 2019-10-15 深圳市商汤科技有限公司 档案应用方法及装置、存储介质
CN110705476A (zh) * 2019-09-30 2020-01-17 深圳市商汤科技有限公司 数据分析方法、装置、电子设备和计算机存储介质
CN111598753A (zh) * 2020-01-15 2020-08-28 北京明略软件系统有限公司 一种嫌疑人推荐方法、装置、电子设备和存储介质

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106339428B (zh) * 2016-08-16 2019-08-23 东方网力科技股份有限公司 基于视频大数据的嫌疑人身份识别方法和装置
CN108595606A (zh) * 2018-04-20 2018-09-28 广东亿迅科技有限公司 基于运营商数据的公安案件时空分析方法及装置
EP3570226A1 (en) * 2018-05-16 2019-11-20 Ernst & Young GmbH Wirtschaftsprüfungsgesellschaft Method and system of obtaining audit evidence
CN109190498A (zh) * 2018-08-09 2019-01-11 安徽四创电子股份有限公司 一种基于人脸识别的案件智能化串并的方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110276272A (zh) * 2019-05-30 2019-09-24 罗普特科技集团股份有限公司 确认标签人员的同行人员关系的方法、装置、存储介质
CN110334120A (zh) * 2019-06-28 2019-10-15 深圳市商汤科技有限公司 档案应用方法及装置、存储介质
CN110705476A (zh) * 2019-09-30 2020-01-17 深圳市商汤科技有限公司 数据分析方法、装置、电子设备和计算机存储介质
CN111598753A (zh) * 2020-01-15 2020-08-28 北京明略软件系统有限公司 一种嫌疑人推荐方法、装置、电子设备和存储介质

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115809256A (zh) * 2023-02-22 2023-03-17 中关村科学城城市大脑股份有限公司 治安管理综合信息系统和可视化展示方法
CN117319559A (zh) * 2023-11-24 2023-12-29 杭州度言软件有限公司 一种基于智能语音机器人的催收方法与系统
CN117319559B (zh) * 2023-11-24 2024-02-02 杭州度言软件有限公司 一种基于智能语音机器人的催收方法与系统

Also Published As

Publication number Publication date
CN112347296B (zh) 2022-06-17
CN112347296A (zh) 2021-02-09

Similar Documents

Publication Publication Date Title
WO2022099884A1 (zh) 一种基于人脸识别的人员与案件关联分析方法和装置
Reed et al. Identifying https-protected netflix videos in real-time
WO2019085064A1 (zh) 医疗理赔拒付方法、装置、终端设备及存储介质
US10135830B2 (en) Utilizing transport layer security (TLS) fingerprints to determine agents and operating systems
US10880672B2 (en) Evidence management system and method
CN111241883B (zh) 防止远程被测人员作弊的方法和装置
CN110941978B (zh) 一种未识别身份人员的人脸聚类方法、装置及存储介质
CN111405475B (zh) 一种多维感知数据碰撞融合分析方法和装置
EP3707612B1 (en) Duplicative data detection
CN112232178A (zh) 基于人像聚档的区域落脚点判定方法、系统、设备及介质
WO2023178930A1 (zh) 图像识别方法、训练方法、装置、系统及存储介质
KR102017746B1 (ko) 유사도 산출 방법 및 그 장치
WO2019218452A1 (zh) 热词分析方法、计算机可读存储介质、终端设备及装置
CN113239792A (zh) 一种大数据分析处理系统和方法
CN111914649A (zh) 人脸识别的方法及装置、电子设备、存储介质
CN114519879A (zh) 人体数据归档方法、装置、设备及存储介质
CN109271859A (zh) 串并案方法和装置、电子设备、计算机存储介质
WO2019187107A1 (ja) 情報処理装置、制御方法、及びプログラム
Das et al. Realizing digital forensics as a big data challenge
CN108427930B (zh) 基于数理统计建立身份识别信息关联关系的方法及系统
CN113014591B (zh) 假冒公众号的检测方法和装置、电子设备、及介质
CN112989083B (zh) 人员身份分析方法、装置、设备及存储介质
RU2778208C1 (ru) Система интеллектуального мониторинга поведения пользователя при взаимодействии с контентом
CN111046307B (zh) 用于输出信息的方法和装置
CN117216308B (zh) 基于大模型的搜索方法、系统、设备及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20961437

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20961437

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 161023)