CN107016322B - Method and device for analyzing followed person - Google Patents


Info

Publication number
CN107016322B
Authority
CN
China
Prior art keywords
information
snapshot
face picture
suspect
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610058847.4A
Other languages
Chinese (zh)
Other versions
CN107016322A (en)
Inventor
黄军 (Huang Jun)
李得洋 (Li Deyang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Priority to CN201610058847.4A priority Critical patent/CN107016322B/en
Publication of CN107016322A publication Critical patent/CN107016322A/en
Application granted granted Critical
Publication of CN107016322B publication Critical patent/CN107016322B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/30 - Scenes; Scene-specific elements in albums, collections or shared content, e.g. social network photos or video
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 - Classification, e.g. identification
    • G06V40/173 - Classification, e.g. identification; face re-identification, e.g. recognising unknown faces across different face tracks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)
  • Alarm Systems (AREA)

Abstract

The application provides a method and a device for analyzing following persons. The method comprises: searching all snapshot pictures, comparing them with an established suspect picture, and storing the face pictures whose face similarity exceeds a first threshold together with the corresponding snapshot information; screening, from the stored snapshot points, the persons whose time following the first suspect does not exceed a following time threshold; grouping the screened following persons by face similarity, selecting one picture from each group as a retrieval picture, and searching the pictures of all snapshot points for face pictures whose similarity exceeds the first threshold, together with the corresponding snapshot point information; and comparing each group's snapshot point information with the snapshot point information screened by its retrieval picture to find the missing snapshot points, then judging from the co-occurrence ratio whether the following person is an accomplice of the suspect. A corresponding apparatus is also provided. The invention reduces the manual investigation workload of criminal investigators and reduces manual misjudgments and missed judgments.

Description

Method and device for analyzing followed person
Technical Field
The invention relates to the field of big data retrieval, and in particular to a method and a device for analyzing the motion tracks of a suspect and of the persons following that suspect.
Background
Crimes are often committed by groups, and the criminal investigation department may know only a few of the suspects in a group, so it is difficult to round up the whole group at once. For example, when a crime group commits a crime, its members usually act separately and in coordination; once they find that several suspects have been caught, the remaining members inevitably become alert, and it is then difficult to capture every member of the group.
Given this situation, criminal investigators use video surveillance to manually examine the persons who follow known suspects, in order to screen out the other group members. In this manner, however, the investigators must process a large amount of data, which easily leads to manual misjudgments and missed judgments, and still cannot guarantee that every member of the criminal group is caught.
Disclosure of Invention
In view of the above, the present application provides a method and an apparatus for analyzing following persons, which can reduce the manual investigation workload of criminal investigators and reduce manual misjudgments and missed judgments.
Specifically, the method is realized through the following technical scheme:
a method for analyzing following persons is characterized in that a face picture captured by each capturing point within a set range and set time is compared with an established suspect picture, and the face picture with the face similarity larger than a first threshold and corresponding capturing point information are stored as first suspect information;
screening out persons with the time of following the first suspect not exceeding a following time threshold from snapshot points in the first suspect information, and storing the screened following persons and the corresponding snapshot point information as following person information;
grouping trailing personnel in the trailing personnel information according to the face similarity, calculating the quality scores of all face pictures in each group, taking the face picture with the highest quality score in each group as a trailing personnel retrieval picture, searching face pictures snapped by all snap points within a set range and within a set time, and screening out the face pictures with the face similarity larger than a first threshold value and corresponding snap point information;
comparing the snapshot point information of the trailing personnel in each group with the snapshot point information screened by the corresponding trailing personnel retrieval picture to find out the missing snapshot point information;
after finding out the missing snapshot point information, checking whether the missing snapshot point captures a first suspect or not, and updating the first suspect information;
and counting the proportion, among all snapshot points in the set range, of the snapshot points that captured both the first suspect and the following person, and judging that the following person is an accomplice of the suspect when the proportion is greater than a set threshold.
Preferably, the first suspect information and the snapshot point information in the trailing person information are combined on the map, and the motion tracks of the first suspect and the trailing person are drawn according to the snapshot time.
Preferably, the following persons in the following person information are grouped according to the face similarity, and the specific steps are,
and selecting one face picture in the trailing personnel information, comparing the face picture with the remaining pictures in the trailing information table, and if the face similarity is greater than a second threshold value, placing the face pictures in the same group.
Preferably, two snapshot points cooperate to capture the face picture. Specifically, after the first snapshot point captures a face picture, it is judged whether that picture meets the standard face picture condition; if not, the snapshot information of the second snapshot point is checked and the same judgment is made on its picture. If the second point's picture meets the condition, that picture and the corresponding snapshot point information are stored; if it does not, the picture that is closer to the standard face picture condition is stored together with the corresponding snapshot point information.
Preferably, the angle formed by the lens of the first snapshot point and the lens of the second snapshot point is set to be 30-60 degrees.
Preferably, the face pictures are captured by a plurality of cameras simultaneously, whose shooting angles together cover all angles of the captured person, and the face picture that meets the standard face picture condition and is closest to the standard face picture is screened out.
A following person analysis apparatus, the apparatus comprising:
the first suspect information generating unit is used for comparing the face picture captured by each camera in the set range and the set time with the established suspect picture, and storing the face picture with the face similarity larger than a first threshold value and the corresponding capture point information as first suspect information;
the following person information generation unit is used for screening out persons who do not exceed a following time threshold value during the following of the first suspect from the snapshot points of the first suspect information generation unit, and storing the screened following persons and the corresponding snapshot point information as following person information;
the system comprises a trailing personnel missing snapshot point information analysis unit, a trailing personnel missing snapshot point information analysis unit and a snapshot point information analysis unit, wherein the trailing personnel missing snapshot point information analysis unit is used for grouping trailing personnel in trailing personnel information according to the human face similarity, calculating the quality score of each face picture in each group, searching all snapshot pictures in a set range and set time by taking a face picture with the highest quality in each group as a trailing personnel retrieval picture, and screening out the face pictures with the human face similarity larger than a first threshold value and corresponding snapshot point information; comparing the snapshot point information of the trailing personnel in each group with the snapshot point information screened by the corresponding trailing personnel retrieval picture to find out the missing snapshot point information;
the first suspect information updating unit is used for checking whether the missed snapshot points take a snapshot of the first suspect or not after finding out the missed snapshot point information, and updating the information in the first suspect generating unit;
and the suspect conspire judging unit is used for counting the proportion of the snapshot points of the first suspect and the trailing person which are simultaneously captured in the set range to the total snapshot point, and judging that the trailing person is a suspect conspire when the proportion is greater than a set threshold value.
Preferably, the motion trajectory generation unit is configured to combine the first suspect information and the snapshot point information in the trailing person information on the map, and draw the motion trajectories of the first suspect and the trailing person according to the snapshot time.
Preferably, the following persons in the following person information are grouped according to the face similarity, and the specific steps are,
and selecting one face picture in the trailing personnel information, comparing the face picture with the remaining pictures in the trailing information table, and if the face similarity is greater than a second threshold value, placing the face picture in the same group.
Preferably, two snapshot points cooperate to capture the face picture. Specifically, after the first snapshot point captures a face picture, it is judged whether that picture meets the standard face picture condition; if not, the snapshot information of the second snapshot point is checked and the same judgment is made on its picture. If the second point's picture meets the condition, that picture and the corresponding snapshot point information are stored; if it does not, the picture that is closer to the standard face picture condition is stored together with the corresponding snapshot point information.
Compared with the prior art, the method and the device have the advantages that the information of the trailing person is found through the first suspect information, the corresponding trailing face picture is searched in the set range by utilizing the face similarity, the missing snapshot points are found, and whether the trailing person is a suspect partnering or not is judged by analyzing the number of the same snapshot points where the trailing person and the first suspect pass through.
Drawings
FIG. 1 is a flow chart of an implementation of the present application;
FIG. 2 is a diagram of a first suspect and trail person trajectory in an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of a scenario with two or more snapshot points in an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon" or "when" or "in response to determining", depending on the context.
In order to solve the technical problems mentioned in the background art, the invention provides a method for analyzing followed persons. The following describes an implementation of the present invention in an embodiment with reference to fig. 1, fig. 2, and fig. 3.
First, the criminal investigators search the monitoring data of all snapshot points within the set time and the set range using the established picture of the criminal suspect. Note that the "set range" and "set time" here depend mainly on the investigation area and time period defined by the investigators for the case, for example the area around the crime scene within a few days of the crime. The snapshot points here generally refer to snapshot cameras on the road, for example bayonet (checkpoint) cameras. Among the face pictures captured by all snapshot points in the set time period, those whose face similarity to the established suspect picture exceeds the first threshold are recorded together with the corresponding snapshot point information, and are generally saved as the first suspect information, as in Table 1.
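The screening step just described can be sketched as follows. This is a minimal illustration, not the patent's implementation: `face_similarity`, the scalar "embeddings", and the row layout are all assumptions standing in for a real face-recognition engine.

```python
def face_similarity(a, b):
    # Toy stand-in for a real face-recognition engine: each "picture" is
    # reduced to a scalar embedding, and similarity is 1 minus their distance.
    return 1.0 - abs(a - b)

def build_first_suspect_info(snapshots, suspect_embedding, first_threshold=0.80):
    """snapshots: iterable of (snapshot_point, snapshot_time, embedding) tuples.
    Returns Table-1-style rows for every snapshot whose similarity to the
    established suspect picture exceeds the first threshold, in time order."""
    rows = [
        {"point": p, "time": t, "similarity": face_similarity(e, suspect_embedding)}
        for p, t, e in snapshots
    ]
    return sorted((r for r in rows if r["similarity"] > first_threshold),
                  key=lambda r: r["time"])
```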
As shown in fig. 3, in the present embodiment there are snapshot cameras 1 to 12 in the set range, the search time is 2016.1.17 00:00 to 2016.1.17 12:00, and the first threshold is assumed to be 80%; Table 1 is then obtained as follows:
| Snapshot point | Snapshot time | Similarity |
|---|---|---|
| 1 | 2016.1.17 02:15 | 85% |
| 2 | 2016.1.17 02:30 | 87% |
| 9 | 2016.1.17 03:00 | 89% |
| 10 | 2016.1.17 03:15 | 87% |
| 11 | 2016.1.17 05:45 | 90% |
| 12 | 2016.1.17 06:15 | 85% |
| 7 | 2016.1.17 06:30 | 83% |
| 8 | 2016.1.17 07:00 | 87% |

TABLE 1
For ease of analyzing the first suspect's behavior, the snapshot points can be marked on a map in order of snapshot time, forming the first suspect's motion track L1: 1-2-9-10-11-12-7-8.
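Forming a track from Table-1-style rows amounts to sorting by snapshot time and joining the point numbers; the dictionary row format below is an illustrative assumption.

```python
def motion_track(info_rows):
    # Sort the suspect's (or a follower's) rows by snapshot time and join the
    # snapshot points into a track string such as "1-2-9-10-11-12-7-8".
    ordered = sorted(info_rows, key=lambda r: r["time"])
    return "-".join(str(r["point"]) for r in ordered)
```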
Next, persons whose time following the first suspect does not exceed the following time threshold are screened out at the snapshot points contained in the first suspect information, and the screened face pictures and the corresponding snapshot point information are saved in Table 2 as the following person information.
Assume in this embodiment that the trailing time threshold is 30 minutes. Thus table 2 is exemplified as follows:
| Picture | Snapshot point | Snapshot time |
|---|---|---|
| Picture 1 | 1 | 2016.1.17 02:20 |
| Picture 2 | 2 | 2016.1.17 02:35 |
| Picture 3 | 2 | 2016.1.17 02:37 |
| Picture 4 | 9 | 2016.1.17 03:14 |
| Picture 5 | 9 | 2016.1.17 03:15 |
| Picture 6 | 10 | 2016.1.17 03:23 |
| Picture 7 | 10 | 2016.1.17 03:25 |
| Picture 8 | 11 | 2016.1.17 06:00 |
| Picture 9 | 12 | 2016.1.17 06:20 |
| Picture 10 | 7 | 2016.1.17 06:40 |
| Picture 11 | 7 | 2016.1.17 06:45 |
| Picture 12 | 8 | 2016.1.17 07:05 |
| Picture 13 | 8 | 2016.1.17 07:10 |

TABLE 2
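The screening into Table 2 can be sketched as below. It is a simplified reading of the step: a snapshot counts as "following" if it is taken at a point the first suspect passed, within the threshold after the suspect's own snapshot there. The row format and function names are assumptions.

```python
from datetime import datetime, timedelta

def screen_followers(first_suspect_rows, snapshots,
                     follow_threshold=timedelta(minutes=30)):
    """Keep every snapshot taken at a point where the first suspect appeared,
    within the following-time threshold after the suspect's own snapshot."""
    suspect_time = {r["point"]: r["time"] for r in first_suspect_rows}
    followers = []
    for snap in snapshots:
        t0 = suspect_time.get(snap["point"])
        if t0 is not None and timedelta(0) < snap["time"] - t0 <= follow_threshold:
            followers.append(snap)
    return followers
```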
The following persons in Table 2 are then grouped by face similarity. For example, the first following face picture in Table 2 is compared with the other pictures in the table, and the pictures whose similarity exceeds the second threshold are placed in the same group. Proceeding in the same way, all following persons in Table 2 are compared one by one and grouped. The number of times each person followed can be counted from the grouping. The grouped following person information is saved in Table 3, and the face similarity of each picture is recorded within its group.
Assuming that the second threshold is 85%, table 3 is grouped as follows:
| Picture | Snapshot point | Snapshot time | Face similarity | Group |
|---|---|---|---|---|
| Picture 1 | 1 | 2016.1.17 02:20 | | 1 |
| Picture 2 | 2 | 2016.1.17 02:35 | 89% | 1 |
| Picture 11 | 7 | 2016.1.17 06:45 | 90% | 1 |
| Picture 12 | 8 | 2016.1.17 07:05 | 88% | 1 |
| Picture 3 | 2 | 2016.1.17 02:37 | | 2 |
| Picture 5 | 9 | 2016.1.17 03:15 | 87% | 2 |
| Picture 7 | 10 | 2016.1.17 03:25 | 90% | 2 |
| Picture 8 | 11 | 2016.1.17 06:00 | 91% | 2 |
| … | … | … | … | 3 |

TABLE 3
To view the movement track of each following person conveniently, the snapshot points of each group in Table 3 can be plotted on the map in order of snapshot time, giving each following person's motion track L2.
For example, group 1: 1-2-7-8; group 2: 2-9-10-11.
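The grouping procedure described above is a greedy, seed-based clustering. The sketch below follows that description literally; the `similarity` callback is an assumed stand-in for the face-comparison engine.

```python
def group_followers(pictures, similarity, second_threshold=0.85):
    """Greedy grouping as described in the text: take the first ungrouped
    picture as a seed, pull every remaining picture whose similarity to the
    seed exceeds the second threshold into the seed's group, and repeat."""
    remaining = list(pictures)
    groups = []
    while remaining:
        seed = remaining.pop(0)
        group, rest = [seed], []
        for pic in remaining:
            (group if similarity(seed, pic) > second_threshold else rest).append(pic)
        remaining = rest
        groups.append(group)
    return groups
```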
Of course, since the data in the following person information of Table 2 is screened only by time interval, there will be some misjudgments. For example, some passers-by may simply travel the same route as the first suspect, and such misjudgments greatly affect both the manpower invested in solving a case and the efficiency of solving it. This misjudged data therefore needs to be removed.
Here, to improve the accuracy of the search results, the applicant calculates the face picture quality score in each group; the main parameters considered are face size, image sharpness, and image brightness. Of course, other parameters may also be used, such as image uniformity or facial bilateral symmetry. The picture with the highest quality score in each group is selected as that group's retrieval picture, all face pictures captured by the snapshot points within the set range and time period are searched, and the face pictures whose similarity to the retrieval picture exceeds the first threshold, together with the corresponding snapshot point information, are recorded into Table 4, again grouped by search result.
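Selecting the retrieval picture reduces to scoring each picture and taking the maximum per group. The weights below are illustrative assumptions; the patent names the parameters but not how they are combined.

```python
def quality_score(pic):
    # Illustrative weighting of the three parameters named in the text
    # (face size, sharpness, brightness); the weights are assumptions.
    return 0.4 * pic["face_size"] + 0.4 * pic["sharpness"] + 0.2 * pic["brightness"]

def pick_retrieval_picture(group):
    # The highest-scoring picture in a group becomes that group's retrieval picture.
    return max(group, key=quality_score)
```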
| Picture | Snapshot point | Snapshot time | Face similarity | Group |
|---|---|---|---|---|
| Picture 11 | 7 | 2016.1.17 06:45 | 90% | 1 |
| Picture 1 | 1 | 2016.1.17 02:20 | | 1 |
| Picture 2 | 2 | 2016.1.17 02:35 | 89% | 1 |
| Picture 14 | 3 | 2016.1.17 03:30 | 88% | 1 |
| Picture 15 | 4 | 2016.1.17 04:10 | 90% | 1 |
| Picture 16 | 5 | 2016.1.17 05:20 | 86% | 1 |
| Picture 17 | 6 | 2016.1.17 05:35 | 91% | 1 |
| Picture 12 | 8 | 2016.1.17 07:05 | 88% | 1 |
| Picture 8 | 11 | 2016.1.17 06:00 | 91% | 2 |
| Picture 3 | 2 | 2016.1.17 02:53 | | 2 |
| Picture 5 | 9 | 2016.1.17 03:20 | 87% | 2 |
| Picture 7 | 10 | 2016.1.17 03:55 | 90% | 2 |
| Picture 18 | 3 | 2016.1.17 04:30 | 85% | 2 |
| Picture 19 | 6 | 2016.1.17 05:45 | 90% | 2 |
| … | … | … | … | 3 |

TABLE 4
Similarly, to facilitate analysis of the following persons' behavior, each group of search results in Table 4 can be combined with the map and plotted by snapshot time to draw motion track L3.
For example, group 1: 1-2-3-4-5-6-7-8; group 2: 2-9-10-3-6-11.
The missed snapshot points can also be marked on the map, so that criminal investigators can analyze why they were missed.
Since the search based on each following person's retrieval picture is performed over the whole set range and time period, its coverage is larger than that of the following person data selected directly through Table 1. The snapshot points of each group in Table 4 are compared with those of the corresponding group in Table 2 to find the snapshot points missing from Table 2. In the present embodiment, the missing snapshot points of group 1 are 3, 4, 5 and 6, and those of group 2 are 3 and 6. It is then checked whether the missed snapshot points also captured the first suspect; if so, those points and their snapshot picture information are added to Table 1, the first suspect's motion track L1 is updated, and the suspect's movement at the missed points can be examined. In this example, the first suspect is found at snapshot points 3 and 6 of group 1, so Table 1 is updated and the updated motion track of the first suspect is L1': 1-2-9-10-3-6-11-12-7-8.
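Finding the missing snapshot points is a set difference between the retrieval search results (Table 4) and the directly screened follower data (Table 2); the function name is illustrative.

```python
def missing_points(retrieval_points, follower_points):
    """Snapshot points found by the wider retrieval search (Table 4) but
    absent from the directly screened follower data (Table 2), sorted."""
    return sorted(set(retrieval_points) - set(follower_points))
```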
Through the above steps, the motion tracks of the first suspect and of the following persons can be analyzed as comprehensively as possible. However, as noted above, some following persons may be misjudged and need to be removed, and the invention solves this in a preferred manner: the ratio of the number of snapshot points passed in common by the first suspect and a following person to the total number of snapshot points is calculated, and if the ratio exceeds a set threshold the following person is considered an accomplice. Assuming the threshold is set to 40%, the following person of group 1 in this embodiment is judged to be an accomplice. This narrows the set of following persons to monitor and helps improve the investigators' case-handling efficiency. Of course, the accomplices could also be screened in other ways, such as combining this ratio with a count of following occurrences, which is not repeated here.
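The accomplice test can be sketched as below. The text leaves the denominator ambiguous; the sketch assumes it is the total number of snapshot points in the set range (12 in the embodiment), which is one plausible reading.

```python
def is_accomplice(suspect_points, follower_points, total_points,
                  ratio_threshold=0.40):
    # Snapshot points passed by both the first suspect and the follower,
    # divided by the total number of snapshot points in the set range.
    # The denominator choice is an assumption, not stated in the patent.
    shared = set(suspect_points) & set(follower_points)
    return len(shared) / total_points > ratio_threshold
```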
In the above scheme, if the first suspect deliberately turns sideways at a snapshot point, a usable frontal face picture cannot be obtained, and face-similarity comparison therefore cannot yield an accurate result. To solve this problem, the invention provides two preferred embodiments, described with reference to fig. 3. In the first, two snapshot points cooperate to capture the face picture. Specifically, after the first snapshot point A captures a face picture, it is judged whether that picture meets the standard face picture condition; if not, the snapshot information of the second snapshot point B is checked and the same judgment is made on its picture. If B's picture meets the condition, that picture and the corresponding snapshot point information are stored; if not, the picture of the two that is closer to the standard face picture condition is stored together with the corresponding snapshot point information. The standard face picture condition is that the distance between the eyes is greater than 30 pixels, the horizontal (yaw) rotation angle of the face is between -15 and 15 degrees, and the pitch rotation angle is between -10 and 10 degrees. Compared with capturing only one picture, this mode is more accurate while consuming little extra performance. To ensure the accuracy and sharpness of the face snapshots, the angle formed by the lenses of the first and second cameras is ideally within 30-60 degrees.
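The standard face picture condition and the two-point fallback are mechanical checks; a direct sketch follows. `distance_from_standard` is an assumed scoring helper (smaller means closer to the standard condition), not defined by the patent.

```python
def meets_standard_face(eye_distance_px, yaw_deg, pitch_deg):
    # Standard face picture condition as stated in the text: eye distance
    # greater than 30 pixels, horizontal (yaw) rotation within [-15, 15]
    # degrees, pitch rotation within [-10, 10] degrees.
    return eye_distance_px > 30 and -15 <= yaw_deg <= 15 and -10 <= pitch_deg <= 10

def pick_between_points(pic_a, pic_b, distance_from_standard):
    # Cooperative two-point capture: prefer a picture meeting the condition;
    # otherwise keep whichever picture is closer to it.
    if meets_standard_face(*pic_a):
        return pic_a
    if meets_standard_face(*pic_b):
        return pic_b
    return min((pic_a, pic_b), key=distance_from_standard)
```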
In the other preferred embodiment, the face picture is captured by a plurality of snapshot points simultaneously (for example, the four snapshot points A, B, C and D in fig. 3), whose shooting angles together cover all angles of the captured person, and the face picture that meets the standard face picture condition and is closest to the standard face picture is screened out.
Corresponding to the foregoing embodiments of the following person analysis method, the present application also provides embodiments of a following person analysis apparatus.
The embodiments of the following person analysis apparatus of the present application can be applied to backend devices such as platforms. The apparatus embodiments may be implemented by software, by hardware, or by a combination of the two. Taking a software implementation as an example, as a logical device, the apparatus is formed by the processor of the backend device reading the corresponding computer program instructions from nonvolatile memory into memory and running them. Depending on its actual functions, the backend device in which the apparatus is located may also include other hardware, which is not described here again.
The apparatus specifically comprises a first suspect information generating unit, which is used for comparing the face picture captured by each camera within the set range and set time with the established suspect picture, and storing the face pictures whose face similarity exceeds a first threshold, together with the corresponding snapshot point information, as the first suspect information;
the following person information generation unit is used for screening out persons who do not exceed a following time threshold value during the following of the first suspect from the snapshot points of the first suspect information generation unit, and storing the screened following persons and the corresponding snapshot point information as following person information;
the follow-up personnel missing snapshot point information analysis unit is used for grouping follow-up personnel in the follow-up personnel information according to the face similarity, calculating the quality score of each face picture in each group, taking the face picture with the highest quality score in each group as a follow-up personnel retrieval picture, searching all snapshot pictures of the snapshot points within a set range and set time, and screening out the face pictures with the face similarity larger than a first threshold value and corresponding snapshot point information; comparing the snapshot point information of the trailing personnel in each group with the snapshot point information screened by the corresponding trailing personnel retrieval picture to find out the missing snapshot point information;
the first suspect information updating unit is used for checking whether the missed snapshot points take a snapshot of the first suspect or not after finding out the missed snapshot point information, and updating the information in the first suspect generating unit;
and the suspect conspire judging unit is used for counting the proportion of the snapshot points of the first suspect and the trailing person which are simultaneously captured in the set range to the total snapshot point, and judging that the trailing person is a suspect conspire when the proportion is greater than a set threshold value.
And the motion track generation unit is used for combining the first suspect information and the snapshot point information in the trailing person information on the map and drawing the motion tracks of the first suspect and the trailing person according to the snapshot time.
Wherein the method comprises grouping the following persons in the following person information according to face similarity,
and selecting one face picture in the trailing personnel information, comparing the face picture with the remaining pictures in the trailing information table, and if the face similarity is greater than a second threshold value, placing the face picture in the same group.
Preferably, two snapshot points cooperate to capture the face picture. Specifically, after the first snapshot point captures a face picture, it is judged whether that picture meets the standard face picture condition; if not, the snapshot information of the second snapshot point is checked and the same judgment is made on its picture. If the second point's picture meets the condition, that picture and the corresponding snapshot point information are stored; if it does not, the picture that is closer to the standard face picture condition is stored together with the corresponding snapshot point information.
Using the established suspect picture, the method searches for the first suspect information within the set range through face recognition, finds trailing persons from the snapshot points corresponding to the first suspect, finds missing snapshot points by analyzing the trailing person information, and thereby completes the motion tracks of the first suspect and the trailing persons. Whether a trailing person is an accomplice of the suspect is judged by analyzing the coincidence proportion of the snapshot points of the first suspect and the trailing person. The method is highly automated and effectively reduces both misjudgment and missed detection of trailing persons.
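The accomplice judgment reduces to an overlap-ratio test; a sketch under the assumption that snapshot points are identified by simple labels (the labels and threshold below are illustrative):

```python
def is_accomplice(suspect_points, trailer_points, total_points, threshold):
    """Ratio test: count snapshot points that captured both the suspect and
    the trailing person, divide by the total number of snapshot points in
    the set range, and compare against the set threshold."""
    overlap = len(set(suspect_points) & set(trailer_points))
    return overlap / total_points > threshold

# two shared points out of five in the set range, threshold 0.3 -> accomplice
flag = is_accomplice({"P1", "P2", "P3"}, {"P2", "P3", "P4"}, 5, 0.3)
```

The design choice of dividing by the total snapshot points, rather than by either person's own count, penalizes pairs that merely share a couple of busy locations.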
The implementation processes of the functions and actions of the units in the above device are described in detail in the implementation processes of the corresponding steps in the above method, and are not repeated here.
Since the device embodiments substantially correspond to the method embodiments, reference may be made to the relevant parts of the description of the method embodiments. The device embodiments described above are merely illustrative: the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present application. A person of ordinary skill in the art can understand and implement the embodiments without inventive effort.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (10)

1. A trailing person analysis method, characterized in that the method comprises:
comparing the face picture captured by each capturing point within a set range and set time with the established suspect picture, and storing the face picture with the face similarity larger than a first threshold value and corresponding capturing point information as first suspect information;
screening out, from the snapshot points in the first suspect information, persons whose time of trailing the first suspect does not exceed a trailing time threshold, and storing the screened trailing persons and the corresponding snapshot point information as trailing person information;
grouping trailing personnel in the trailing personnel information according to the face similarity, calculating the quality scores of all face pictures in each group, taking the face picture with the highest quality score in each group as a trailing personnel retrieval picture, searching pictures snapped by all snap points within a set range and within a set time, and screening out the face pictures with the face similarity larger than a first threshold value and corresponding snap point information;
comparing the snapshot point information of the trailing personnel in each group with the snapshot point information screened by the corresponding trailing personnel retrieval picture to find out the missing snapshot point information;
after finding out the missing snapshot point information, checking whether the missing snapshot point captures a first suspect or not, and updating the first suspect information;
and counting the proportion of snapshot points within the set range that captured the first suspect and the trailing person at the same time relative to the total number of snapshot points, and judging that the trailing person is an accomplice of the suspect when the proportion is greater than a set threshold value.
2. The method for analyzing the trail personnel according to claim 1, wherein the first suspect information and the snapshot point information in the trail personnel information are combined on a map, and the motion tracks of the first suspect and the trail personnel are drawn according to the snapshot time.
3. The trailing person analysis method according to claim 1, wherein the trailing persons in the trailing person information are grouped by face similarity through the following steps:
selecting one face picture from the trailing person information, comparing it with the remaining pictures in the trailing information table, and placing the pictures in the same group if their face similarity is greater than a second threshold value.
4. The trailing person analysis method according to claim 1, wherein two snapshot points are used to capture the face picture cooperatively, specifically: after the first snapshot point captures a face picture, judging whether the captured face picture meets the standard face picture condition; if it does not, checking the snapshot information of the second snapshot point and judging whether the face picture captured by the second snapshot point meets the standard face picture condition; if it does, storing that face picture and the corresponding snapshot point information; and if it does not, selecting and storing the face picture closer to the standard face picture condition and the corresponding snapshot point information.
5. The trailing person analysis method according to claim 4, wherein the first and second snapshot points are arranged with a lens angle in the range of 30°-60°.
6. The trailing person analysis method according to claim 1, wherein a plurality of cameras are used to capture face pictures simultaneously, and the face picture which meets the standard face picture condition and is closest to the standard face picture is selected, the capturing angles of the plurality of cameras covering all angles of the captured person.
7. A trailing person analysis device, comprising,
the first suspect information generating unit is used for comparing the face picture captured by each camera in the set range and the set time with the established suspect picture, and storing the face picture with the face similarity larger than a first threshold value and the corresponding capture point information as first suspect information;
the trailing person information generation unit is used for screening out, from the snapshot points of the first suspect information generating unit, persons whose time of trailing the first suspect does not exceed a trailing time threshold, and for storing the screened trailing persons and the corresponding snapshot point information as trailing person information;
the follow-up personnel missing snapshot point information analysis unit is used for grouping follow-up personnel in the follow-up personnel information according to the face similarity, calculating the quality score of each face picture in each group, taking the face picture with the highest quality score in each group as a follow-up personnel retrieval picture, searching all snapshot pictures of the snapshot points within a set range and set time, and screening out the face pictures with the face similarity larger than a first threshold value and corresponding snapshot point information; comparing the snapshot point information of the trailing personnel in each group with the snapshot point information screened by the corresponding trailing personnel retrieval picture to find out the missing snapshot point information;
the first suspect information updating unit is used for checking, after the missing snapshot point information is found, whether the missing snapshot points captured the first suspect, and for updating the information in the first suspect information generating unit;
and the suspect accomplice judging unit is used for counting the proportion of snapshot points within the set range that captured the first suspect and the trailing person at the same time relative to the total number of snapshot points, and for judging that the trailing person is an accomplice of the suspect when the proportion is greater than a set threshold value.
8. The trailing person analysis device according to claim 7, further comprising a motion track generation unit configured to combine the snapshot point information in the first suspect information and in the trailing person information on a map, and to draw the motion tracks of the first suspect and the trailing person according to the snapshot times.
9. The trailing person analysis device according to claim 7, wherein the trailing persons in the trailing person information are grouped according to face similarity through the following steps:
selecting one face picture from the trailing person information, comparing it with the remaining pictures in the trailing information table, and placing the pictures in the same group if their face similarity is greater than a second threshold value.
10. The trailing person analysis device according to claim 7, wherein two snapshot points cooperate to capture the face picture, specifically: after the first snapshot point captures a face picture, judging whether the captured face picture meets the standard face picture condition; if it does not, checking the snapshot information of the second snapshot point and judging whether the face picture captured by the second snapshot point meets the standard face picture condition; if it does, storing that face picture and the corresponding snapshot point information; and if it does not, selecting and storing the face picture closer to the standard face picture condition and the corresponding snapshot point information.
CN201610058847.4A 2016-01-28 2016-01-28 Method and device for analyzing followed person Active CN107016322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610058847.4A CN107016322B (en) 2016-01-28 2016-01-28 Method and device for analyzing followed person

Publications (2)

Publication Number Publication Date
CN107016322A CN107016322A (en) 2017-08-04
CN107016322B true CN107016322B (en) 2020-01-14

Family

ID=59438993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610058847.4A Active CN107016322B (en) 2016-01-28 2016-01-28 Method and device for analyzing followed person

Country Status (1)

Country Link
CN (1) CN107016322B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110019963B (en) * 2017-12-11 2021-08-10 罗普特科技集团股份有限公司 Method for searching suspect relation personnel
CN108229335A (en) * 2017-12-12 2018-06-29 深圳市商汤科技有限公司 It is associated with face identification method and device, electronic equipment, storage medium, program
CN110210276A (en) * 2018-05-15 2019-09-06 腾讯科技(深圳)有限公司 A kind of motion track acquisition methods and its equipment, storage medium, terminal
CN109117714B (en) * 2018-06-27 2021-02-26 北京旷视科技有限公司 Method, device and system for identifying fellow persons and computer storage medium
CN109165637B (en) * 2018-10-08 2020-10-20 武汉爱迪科技股份有限公司 Identity recognition method and system based on dynamic video analysis
CN109858329A (en) * 2018-12-15 2019-06-07 深圳壹账通智能科技有限公司 Anti- trailing method, apparatus, equipment and storage medium based on recognition of face
CN109784199B (en) * 2018-12-21 2020-11-24 深圳云天励飞技术有限公司 Peer-to-peer analysis method and related product
CN111382627B (en) * 2018-12-28 2024-03-26 成都云天励飞技术有限公司 Method for judging peer and related products
CN109784274B (en) * 2018-12-29 2021-09-14 杭州励飞软件技术有限公司 Method for identifying trailing and related product
CN110348347A (en) * 2019-06-28 2019-10-18 深圳市商汤科技有限公司 A kind of information processing method and device, storage medium
CN110569720B (en) * 2019-07-31 2022-06-07 安徽四创电子股份有限公司 Audio and video intelligent identification processing method based on audio and video processing system
CN111104915B (en) * 2019-12-23 2023-05-16 云粒智慧科技有限公司 Method, device, equipment and medium for peer analysis
CN111639689B (en) * 2020-05-20 2023-07-25 杭州海康威视系统技术有限公司 Face data processing method and device and computer readable storage medium
CN111563479B (en) * 2020-05-26 2023-11-03 深圳市商汤科技有限公司 Concurrent person weight removing method, partner analyzing method and device and electronic equipment
CN117690172A (en) * 2020-10-19 2024-03-12 杭州海康威视数字技术股份有限公司 Associated target identification method and device, electronic equipment and storage medium
CN113452903B (en) * 2021-06-17 2023-07-11 浙江大华技术股份有限公司 Snapshot equipment, snap method and main control chip

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102158595A (en) * 2011-02-16 2011-08-17 中兴通讯股份有限公司 Method and device for realizing burglary prevention of mobile terminal by face recognition
CN103456175A (en) * 2013-09-25 2013-12-18 武汉烽火众智数字技术有限责任公司 Accompanying vehicle real-time detection method based on vehicle registration plate recognition and meshing monitoring
CN103871248A (en) * 2014-03-18 2014-06-18 浙江宇视科技有限公司 Method and device for analyzing vehicles tailing after suspected vehicle based on track collision
US8837805B1 (en) * 2014-03-11 2014-09-16 Rafael Aviyants System and method for verification of a banknote
CN104616494A (en) * 2014-12-23 2015-05-13 浙江宇视科技有限公司 Method and device for recording and determining target object based on base station and block port

Also Published As

Publication number Publication date
CN107016322A (en) 2017-08-04

Similar Documents

Publication Publication Date Title
CN107016322B (en) Method and device for analyzing followed person
CN107292240B (en) Person finding method and system based on face and body recognition
US10956753B2 (en) Image processing system and image processing method
CN107944427B (en) Dynamic face recognition method and computer readable storage medium
US10262209B2 (en) Method for analyzing video data
CN106295618A (en) A kind of personal identification method and device based on video image
CN109447186A (en) Clustering method and Related product
CN107657232B (en) Pedestrian intelligent identification method and system
CN108805210B (en) Bullet hole identification method based on deep learning
WO2022156234A1 (en) Target re-identification method and apparatus, and computer-readable storage medium
CN109800329B (en) Monitoring method and device
CN111291682A (en) Method and device for determining target object, storage medium and electronic device
CN106571040B (en) Suspicious vehicle confirmation method and equipment
CN111723656B (en) Smog detection method and device based on YOLO v3 and self-optimization
US20210089784A1 (en) System and Method for Processing Video Data from Archive
CN108764153A (en) Face identification method, device, system and storage medium
CN109783663B (en) Archiving method and device
CN111079519A (en) Multi-posture human body detection method, computer storage medium and electronic device
CN114140663A (en) Multi-scale attention and learning network-based pest identification method and system
KR20200059643A (en) ATM security system based on image analyses and the method thereof
CN114764895A (en) Abnormal behavior detection device and method
CN111708907A (en) Target person query method, device, equipment and storage medium
CN115439796A (en) Specific area personnel tracking and identifying method, system, electronic equipment and storage medium
CN114118271A (en) Image determination method, image determination device, storage medium and electronic device
CN109815369B (en) Filing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant