CN114445646A - Personnel association degree analysis method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114445646A
CN114445646A
Authority
CN
China
Prior art keywords
code
human
event
portrait
degree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111678119.0A
Other languages
Chinese (zh)
Inventor
张纯纯
谢友平
郭栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Yuntian Lifei Technology Co ltd
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Chengdu Yuntian Lifei Technology Co ltd
Shenzhen Intellifusion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Yuntian Lifei Technology Co ltd, Shenzhen Intellifusion Technologies Co Ltd filed Critical Chengdu Yuntian Lifei Technology Co ltd
Priority to CN202111678119.0A priority Critical patent/CN114445646A/en
Publication of CN114445646A publication Critical patent/CN114445646A/en
Priority to PCT/CN2022/143531 priority patent/WO2023125840A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/587Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location

Abstract

An embodiment of the invention provides a personnel association degree analysis method that acquires the portrait events and code events to be processed; obtains a person-code parallel-track index based on the portrait events and code events; obtains a person-code fitting degree from the parallel-track index; and obtains the association degree analysis result between the portrait events and code events from the fitting degree. The invention also discloses a personnel association degree analysis device, an electronic device, and a computer-readable storage medium. Because the portrait events and code events are analyzed jointly, a relatively accurate association degree analysis result can be determined without a large number of portrait events and code events, so analysis resource consumption is lower, analysis time is shorter, and analysis efficiency is improved.

Description

Personnel association degree analysis method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of event analysis technologies, and in particular, to a method and an apparatus for analyzing a degree of association of people, an electronic device, and a storage medium.
Background
Currently, in the field of event analysis, the people associated with a target event are determined by performing association degree analysis on the portrait events or code events corresponding to that target event. In existing methods, a technician manually performs association analysis on portrait events, or on code events, to obtain association degree analysis results for a plurality of target persons. When portrait events and code events are analyzed independently, more events are required to reach an accurate result, and analyzing more events consumes more computing resources and more computing time, so the efficiency of association analysis is low.
Disclosure of Invention
An embodiment of the invention provides a personnel association degree analysis method that obtains a person-code fitting degree from a person-code parallel-track index, and obtains association degree analysis results for a plurality of target persons from that fitting degree. An accurate association degree analysis result can be determined without a large number of portrait events and code events, so analysis resource consumption is lower, analysis time is shorter, and analysis efficiency is improved.
In a first aspect, an embodiment of the present invention provides a method for analyzing a person relevance, including the following steps:
acquiring various portrait events and various code events to be processed;
acquiring a person code parallel-track index based on each portrait event and each code event;
acquiring a human code fitting degree based on the human code parallel orbit index;
and obtaining an association degree analysis result between each portrait event and each code event based on the person-code fitting degree.
Optionally, before obtaining the human code fitness based on the human code merging track index, the method further includes:
acquiring a person code association degree, a person code time dispersion degree and a person code behavior similarity degree based on each portrait event and each code event;
the obtaining of the human code fitting degree based on the human code combining index comprises the following steps:
and obtaining the human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree and the human code behavior similarity.
Optionally, the step of obtaining a person code joint track index based on each portrait event and each code event includes:
acquiring the times of the person code parallel track based on the time information of each portrait event and the time information of each code event;
determining a preset smoothing coefficient based on each portrait event, each code event and a preset analysis purpose;
calculating the merging times of the preprocessed human codes by using the merging times of the human codes and the preset smooth coefficient;
and calculating the person-code parallel-track index based on the preprocessed parallel-track count and the preset smoothing coefficient.
Optionally, obtaining a person code association degree based on each portrait event and each code event includes:
based on the human code track combining times, obtaining single track combining times of each human image event and each code event, the number of codes which are combined with each human image event and the number of human images which are combined with each code event;
obtaining a track combining frequency based on the single track combining times and the code quantity;
obtaining a parallel-orbit frequency index based on the total number of the portraits of each portrait event and the number of the portraits;
and obtaining the association degree of the people codes based on the parallel track frequency and the parallel track frequency index.
Optionally, obtaining a person code time dispersion based on each portrait event and each code event includes:
acquiring the person-code parallel-track days based on the parallel-track times of each portrait event and each code event;
acquiring the total number of days of the people code combining track based on the number of days of the people code combining track;
and calculating the time dispersion of the people codes by utilizing a preset merging track day threshold value, the people code merging track days and the total people code merging track days.
Optionally, the step of obtaining the similarity of the person code behavior based on each portrait event and each code event includes:
constructing a plurality of first person activity vectors corresponding to a plurality of target persons by utilizing each portrait event, wherein one target person corresponds to one first person activity vector;
constructing a plurality of second personnel activity vectors corresponding to a plurality of target personnel by utilizing each code event, wherein one target personnel corresponds to one second personnel activity vector;
and calculating cosine similarity of each first person activity vector and each second person activity vector to obtain the human code behavior similarity.
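The cosine-similarity step above can be sketched as follows. This is a minimal stand-alone implementation; how the activity vectors are populated (e.g., per-day or per-location co-occurrence counts) is an assumption, since the text only says one vector is built per target person.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length activity vectors,
    used here as the person-code behavior similarity score."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0.0 or norm_v == 0.0:
        return 0.0  # one side has no activity: define similarity as 0
    return dot / (norm_u * norm_v)
```

A first-person activity vector identical to the second-person vector yields similarity 1.0; orthogonal activity patterns yield 0.0.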
Optionally, before the step of obtaining the human code fitness based on the human code joint trajectory index, the human code association degree, the human code time dispersion and the human code behavior similarity, the method further includes:
acquiring a first weight of the person code association degree, a second weight of the person code time dispersion degree and a third weight of the person code behavior similarity degree;
the step of obtaining the human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree and the human code behavior similarity degree comprises the following steps:
obtaining a middle score based on the person code association degree, the person code time dispersion degree, the person code behavior similarity degree, the first weight, the second weight and the third weight;
and calculating the human code fitting degree by utilizing the human code merging index and the intermediate score.
In a second aspect, an embodiment of the present invention provides an apparatus for analyzing a person relevance degree, where the apparatus includes:
the acquisition module is used for acquiring each portrait event and each code event to be processed;
a first obtaining module, configured to obtain a human code merging index based on each of the portrait events and each of the code events;
the second obtaining module is used for obtaining the human code fitting degree based on the human code combining index;
and the third obtaining module is used for obtaining the association degree analysis result between each portrait event and each code event based on the human code fitting degree.
In a third aspect, an embodiment of the present invention provides an electronic device, including: the system comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, wherein the processor executes the computer program to realize the steps in the analysis method of the personnel relevance provided by the embodiment of the invention.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps in the method for analyzing the degree of association of people provided by the embodiment of the present invention.
In the embodiment of the invention, each portrait event and each code event to be processed are acquired; a person-code parallel-track index is obtained based on each portrait event and each code event; a person-code fitting degree is obtained based on the parallel-track index; and the association degree analysis result between each portrait event and each code event is obtained based on the fitting degree. The portrait events and code events are analyzed jointly to obtain the parallel-track index, and the final association degree analysis result is derived from it; an accurate result can be determined without a large number of portrait events and code events, so analysis resource consumption is low, analysis time is short, and analysis efficiency is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flowchart of a method for analyzing a person relevance according to an embodiment of the present invention;
FIG. 2 is a flowchart of another method for analyzing a degree of people relevance according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an apparatus for analyzing a degree of association of people according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of another apparatus for analyzing a degree of association between persons according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a first obtaining module according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a fourth obtaining module according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of another fourth obtaining module according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of another fourth obtaining module according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of another apparatus for analyzing a degree of association between persons according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a method for analyzing a person relevance provided in an embodiment of the present invention, and as shown in fig. 1, the method for analyzing a person relevance includes the following steps:
101. Acquire each portrait event and each code event to be processed.
In the embodiment of the present invention, the portrait events and code events to be processed are those corresponding to the target persons, where a target person is a person on whom association degree analysis is to be performed; the analysis covers a plurality of target persons, and each target person corresponds to a portrait event and a code event. A target person's portrait event is the set of information such as that person's face images, body images, and face videos; a target person's code event is that person's code information, which may include an identity card code, a two-dimensional code, an identity code, and the like. Generally, one piece of code information represents the identity of one target person and serves as an identification of that identity.
Generally, the electronic device receives portrait events of a plurality of target persons and code events of the plurality of target persons, which are input by a user, and then continues to receive an analysis instruction sent by the user, and the electronic device performs the steps of the application by using the received portrait events of the plurality of target persons and the code events of the plurality of target persons based on the received analysis instruction.
102. Acquire the person-code parallel-track index based on each portrait event and each code event.
A person-code parallel track means that a portrait in a portrait event and code information in a code event appear together within a certain space S (user-configurable, e.g., 100 m or 500 m) and a certain time window T (user-configurable, e.g., 10 s or 100 s).
The person-code parallel-track count is the number of times a portrait in a portrait event and code information in a code event appear together within the time window T and the space S.
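The parallel-track definition above can be sketched as a simple pairwise check. This is illustrative only: positions are one-dimensional here, and the `Event` type and the window defaults are assumptions; a real system would compare geographic distances.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float  # timestamp in seconds
    x: float  # position in metres (1-D for illustration)

def parallel_track_count(portrait_events, code_events,
                         t_window: float = 100.0,
                         s_window: float = 500.0) -> int:
    """Count (portrait event, code event) pairs that appear together
    within the time window T and the space window S."""
    count = 0
    for p in portrait_events:
        for c in code_events:
            if abs(p.t - c.t) <= t_window and abs(p.x - c.x) <= s_window:
                count += 1
    return count
```

For example, a portrait captured at t = 0 co-tracks with a code record at t = 50 s, 100 m away, but not with one at t = 500 s.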
The person-code parallel-track index is derived from the parallel-track count: the more often a portrait and code information appear together, the higher their parallel-track index. Once the count reaches a certain level, its influence on the person-code fitting degree flattens, and the parallel-track index flattens accordingly.
The person-code association degree evaluates the importance of given code information to a given portrait.
The person-code time dispersion is an index reflecting the continuity of parallel-tracking: between a continuous parallel-track pattern and a discontinuous one, the portrait and code information with the continuous pattern are considered to have the higher time dispersion.
The person-code behavior similarity is the degree to which the portrait's and the code information's counts of simultaneous appearance, over the same times and spaces, resemble each other.
103. Obtain the person-code fitting degree based on the parallel-track index.
The person-code fitting degree is the degree of match between a portrait and code information, mined from the portrait events and code events through several analysis algorithms and integration rules, and expressed as a concrete fitting score.
Further, before obtaining the human code fitting degree based on the human code merging track index, the method further includes: acquiring a person code association degree, a person code time dispersion degree and a person code behavior similarity degree based on each portrait event and each code event; the obtaining of the human code fitting degree based on the human code combining index comprises the following steps: and obtaining the human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree and the human code behavior similarity.
After the person-code parallel-track index, association degree, time dispersion, and behavior similarity are obtained, the person-code fitting degree is calculated using a preset floor(a, b) function, where floor(a, b) truncates a to b decimal places, e.g., floor(3.1415926, 3) = 3.141.
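A minimal sketch of the floor(a, b) truncation described above (the function name `floor_b` is illustrative; note that binary floating point can produce off-by-one-ulp surprises near truncation boundaries, which `decimal.Decimal` would avoid):

```python
import math

def floor_b(a: float, b: int) -> float:
    """Truncate a to b decimal places, e.g. floor(3.1415926, 3) = 3.141."""
    scale = 10 ** b
    return math.floor(a * scale) / scale
```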
Specifically, before the step of obtaining the human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree, and the human code behavior similarity, the method further includes: acquiring a first weight of the person code association degree, a second weight of the person code time dispersion degree and a third weight of the person code behavior similarity degree; correspondingly, the step of obtaining the human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree and the human code behavior similarity degree comprises the following steps: obtaining a middle score based on the person code association degree, the person code time dispersion degree, the person code behavior similarity degree, the first weight, the second weight and the third weight; and calculating the human code fitting degree by utilizing the human code merging index and the intermediate score.
It should be noted that the first weight, the second weight, and the third weight may be set by the user as required and are not specifically limited by the invention. In general, their specific values may differ depending on the data saturation of the portrait events and code events and on the preset analysis purpose reflecting user requirements (including the required granularity and accuracy of the analysis result, which vary from case to case).
Specifically, the person-code fitting degree is calculated according to formula one:

C = S1·W1 + S2·W2 + S3·W3

P = floor(C + A, ε)

where S1 is the person-code association degree and W1 the first weight, S2 is the person-code time dispersion and W2 the second weight, S3 is the person-code behavior similarity and W3 the third weight, C is the intermediate score, P is the person-code fitting degree, A is the person-code parallel-track index, and ε is the number of reserved decimal places, typically 6. The floor function is as described above.
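Formula one can be transcribed directly. All parameter names below are illustrative, and the weight values in the test are arbitrary examples:

```python
import math

def floor_b(a: float, b: int) -> float:
    # floor(a, b): keep b decimal places of a
    scale = 10 ** b
    return math.floor(a * scale) / scale

def fitting_degree(s1: float, s2: float, s3: float,
                   w1: float, w2: float, w3: float,
                   a: float, epsilon: int = 6) -> float:
    """Formula one: C = S1*W1 + S2*W2 + S3*W3, then P = floor(C + A, eps)."""
    c = s1 * w1 + s2 * w2 + s3 * w3  # intermediate score C
    return floor_b(c + a, epsilon)   # person-code fitting degree P
```

With all three scores at 1.0, weights 0.5/0.25/0.25, and parallel-track index 0.5, the fitting degree is 1.5.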
Setting different weight values under different conditions ensures that the resulting person-code fitting degree better matches the actual situation, improving its accuracy.
104. Obtain the association degree analysis result between each portrait event and each code event based on the person-code fitting degree.
The association degree analysis results of the target persons are determined from their respective person-code fitting degrees: the fitting degrees are used both to decide whether target persons are associated and to quantify their degree of association.
Further, obtaining an association analysis result between each portrait event and each code event based on the human code fitting degree, including: and carrying out normalization processing on the human code fitting degree to obtain a result score, and obtaining an association degree analysis result between each portrait event and each code event based on the result score.
It should be noted that the actual values of the person-code fitting degrees may be dispersed over a large numerical range; in this case all fitting degrees are normalized so that the corresponding result scores fall in the range [0, 1].
Specifically, the person-code fitting degrees are normalized using formula two:

[Formula two is shown only as an image in the original document.]

where Pi is the person-code fitting degree of the i-th person, Qi is the result score of the i-th person, Amax is the maximum person-code parallel-track index, Pmax is the maximum person-code fitting degree, and τ is the number of reserved decimal places (τ may be 6); the floor function is as described above, and the i-th person is any one of the target persons.
Normalizing the person-code fitting degrees makes the relationships among the result scores clearer, avoids the difficulty of determining target persons' association degrees when the fitting degrees are too dispersed, and thereby reduces the difficulty of the association analysis.
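Formula two itself appears only as an image in the source, so the sketch below is an ASSUMED stand-in: a simple max-scaling that matches the stated goal of mapping result scores into [0, 1] and truncating to τ decimal places. The actual patent formula (which also involves Amax) may differ.

```python
import math

def floor_b(a: float, b: int) -> float:
    scale = 10 ** b
    return math.floor(a * scale) / scale

def normalize_fitting_degrees(fits, tau: int = 6):
    """ASSUMED stand-in for formula two: scale each fitting degree by
    the maximum so result scores land in [0, 1], truncated to tau places."""
    p_max = max(fits)
    if p_max == 0:
        return [0.0 for _ in fits]
    return [floor_b(p / p_max, tau) for p in fits]
```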
In the embodiment of the invention, portrait events of a plurality of target persons and code events of the plurality of target persons are obtained; acquiring a person code parallel orbit index, a person code association degree, a person code time dispersion degree and a person code behavior similarity degree based on the portrait event and the code event; acquiring human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree and the human code behavior similarity degree; and obtaining the correlation degree analysis results of the target persons based on the human code fitting degree.
In existing methods, technicians manually perform association analysis on portrait events, or on code events, to obtain association degree analysis results for a plurality of target persons. When portrait events and code events are analyzed independently, more events are needed to obtain an accurate result, and analyzing more events consumes more computing resources and computing time, so association analysis is inefficient. In the invention, the portrait events and code events are analyzed jointly to obtain the final fitting degree, from which the association degree analysis results of the target persons are derived; an accurate result can be determined without a large number of portrait events and code events, so analysis resource consumption is lower, analysis time is shorter, and analysis efficiency is improved.
Optionally, referring to fig. 2, fig. 2 is a flowchart of another method for analyzing a person relevance provided in an embodiment of the present invention, and as shown in fig. 2, the method specifically includes the following steps:
201. Acquire the person-code parallel-track count based on the time information of each portrait event and the time information of each code event.
202. Determine a preset smoothing coefficient based on each portrait event, each code event, and the preset analysis purpose.
203. Calculate the preprocessed person-code parallel-track count using the raw parallel-track count and the preset smoothing coefficient.
204. Calculate the person-code parallel-track index based on the preprocessed parallel-track count and the preset smoothing coefficient.
In the embodiment of the invention, the person-code parallel-track counts and parallel-track days are tallied from the portrait events and code events. The parallel-track count may include the number of times a given portrait and given code information appear together, the number of pieces of code information parallel-tracked with a given portrait, the number of portraits parallel-tracked with given code information, and so on; the parallel-track days may include the number of days on which a given portrait and given code information appear together.
It should be noted that, in the present application, a preset smoothing coefficient is determined based on the data saturation of each portrait event, the data saturation of each code event, and a preset analysis purpose (the preset analysis purpose is the same as that described above), and the preset smoothing coefficient is a value between 0 and 1.
Specifically, the preprocessed person-code parallel-track count is calculated according to formula three:

[Formula three is shown only as an image in the original document.]
where β is the smoothing coefficient, T1 is the preprocessed person-code parallel-track count, and T0 is the raw person-code parallel-track count. After the preprocessed parallel-track count is obtained, the person-code parallel-track index is obtained using formula five:
Z=floor(sigmoid(T1,β),α)
where α is the number of reserved decimal places, usually 6, the floor function is as described above, and Z is the person-code parallel-track index. In some embodiments, the pseudocode for computing the parallel-track index is as follows:
parallel_track_index = floor(sigmoid(preprocessed_parallel_track_count, smoothing_coefficient), reserved_decimal_places)
Preprocessing the parallel-track counts, so that different raw counts map to different preprocessed counts, makes the final parallel-track index better suit the needs of the association analysis; using the raw counts directly, without preprocessing, would lower the accuracy of the resulting index.
Specifically, obtaining the person-code association degree based on each portrait event and each code event includes: based on the person-code parallel-track counts, obtaining the single parallel-track count of each portrait event with each code event, the number of codes parallel-tracked with each portrait event, and the number of portraits parallel-tracked with each code event; obtaining the parallel-track frequency from the single parallel-track count and the number of codes; obtaining the parallel-track frequency index from the total number of portraits across the portrait events and the number of portraits; and obtaining the person-code association degree from the parallel-track frequency and the parallel-track frequency index. The total number of portraits is counted from the portrait events.
The single track-merging frequency of each portrait event and each code event (the single track-merging frequency refers to the frequency of merging one portrait event with one code event, and may also be understood as the frequency of merging one portrait with one code information), the number of codes merged with each portrait event (the number of code events in a track-merging relationship with one portrait event, and may be understood as the number of code information in a track-merging relationship with one portrait) and the number of portraits merged with each code event (the number of portrait events in a track-merging relationship with one code event, and may be understood as the number of portraits merged with one code information) may be determined from the number of times of merging the portraits.
Obtaining the track-merging frequency based on the single merging count and the number of codes means: for a certain portrait E and a certain code event F, the single merging count of portrait E with code event F is divided by the number of codes of all code events in a merging relation with portrait E; the quotient is the track-merging frequency.
Obtaining the merging-frequency index based on the total portrait count and the number of portraits means: for a certain code event G, determine the number of portraits in a merging relation with code event G, add 1 to that number to obtain an intermediate parameter, divide the total portrait count over all portrait events of the target persons by the intermediate parameter, and take the quotient as the input of an lg (base-10 logarithm) function; the output of the lg function is the merging-frequency index. Finally, the product of the merging frequency and the merging-frequency index gives the final person-code association degree.
Determining the person-code association degree in this way allows it to effectively reflect the degree of association between the portrait and the code, improving its accuracy.
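As a minimal sketch of the computation just described (all function and variable names are assumptions, not taken from the patent), the association degree of one portrait/code pair is the merging frequency multiplied by the lg-based frequency index:

```python
import math

def person_code_association(single_merge_count, codes_merged_with_portrait,
                            portraits_merged_with_code, total_portraits):
    """Association degree of one portrait/code pair, per the scheme above.

    single_merge_count: times this portrait merged tracks with this code
    codes_merged_with_portrait: number of codes in a merging relation with the portrait
    portraits_merged_with_code: number of portraits in a merging relation with the code
    total_portraits: total portrait count over all portrait events of the target persons
    """
    merge_frequency = single_merge_count / codes_merged_with_portrait
    # lg is the base-10 logarithm; the +1 keeps the intermediate parameter nonzero
    frequency_index = math.log10(total_portraits / (portraits_merged_with_code + 1))
    return merge_frequency * frequency_index
```

For example, a pair with 4 single merges against 2 merged codes, 9 portraits merged with the code, and 1000 total portraits yields a frequency of 2.0 and an index of lg(100) = 2, hence an association degree of 4.0.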
Specifically, obtaining the person-code time dispersion based on each portrait event and each code event includes: obtaining the person-code track-merging days based on the total portrait count of each portrait event and the person-code track-merging counts; obtaining the total person-code merging days based on the per-pair merging days; and calculating the person-code time dispersion using a preset merging-days threshold, the merging days, and the total merging days.
Based on this scheme, after the person-code track-merging days are obtained, the person-code time dispersion is computed next: the total person-code merging days are obtained from the per-pair merging days, and the time dispersion is calculated using the preset merging-days threshold, the merging days, and the total merging days.
The total person-code merging days are the sum of the per-pair merging days, and the preset merging-days threshold may be a value set by the user according to requirements, for example 2 days. When the merging days are less than 2, the person-code time dispersion is determined to be 0; when the merging days are greater than or equal to 2, the ratio of the merging days to the total merging days is taken as the person-code time dispersion.
In some embodiments, the person-code time dispersion can also be computed using the following pseudo code:

    person_code_time_dispersion = if (person_code_merge_days < 2) {
        0.0
    } else {
        sigmoid(consecutive_merge_day_runs.map(day => math.pow(2, math.pow(day - 1, 0.8))).sum
                / math.pow(actual_merge_days, 0.5), 18.0)
    }
As can be seen, the pseudo code also involves some other parameters; a user may adjust them according to requirements, and the present invention does not specifically limit the parameter values in the pseudo code.
For different person-code merging-day counts, the corresponding time dispersion is obtained in different ways. This avoids extra analysis of invalid data with too few merging days, improving analysis efficiency; at the same time, setting the dispersion of such few-day invalid data to 0 prevents it from contributing inaccurate, nonzero influence to the result, so the accuracy of the person-code time dispersion is improved.
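A runnable Python version of the pseudo code above might look as follows; the two-argument sigmoid is assumed to be 1/(1 + e^(-x/k)), and all variable names are illustrative assumptions:

```python
import math

def sigmoid(x, k):
    # assumed form of the two-parameter sigmoid in the pseudo code
    return 1.0 / (1.0 + math.exp(-x / k))

def person_code_time_dispersion(consecutive_merge_day_runs, actual_merge_days,
                                merge_day_threshold=2):
    """Time dispersion of one portrait/code pair.

    consecutive_merge_day_runs: lengths of the stretches of consecutive merging days
    actual_merge_days: actual number of person-code merging days
    """
    if actual_merge_days < merge_day_threshold:
        return 0.0  # too few merging days: treat as invalid data
    # longer consecutive runs are weighted super-linearly, per the pseudo code
    weighted = sum(math.pow(2, math.pow(day - 1, 0.8))
                   for day in consecutive_merge_day_runs)
    return sigmoid(weighted / math.pow(actual_merge_days, 0.5), 18.0)
```

With this assumed sigmoid the result stays in (0, 1) for any valid input, which makes it directly usable as one term of a weighted score.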
Specifically, the step of obtaining the similarity of the person code behavior based on each portrait event and each code event includes: constructing a plurality of first person activity vectors corresponding to a plurality of target persons by utilizing each portrait event, wherein one target person corresponds to one first person activity vector; constructing a plurality of second personnel activity vectors corresponding to a plurality of target personnel by utilizing each code event, wherein one target personnel corresponds to one second personnel activity vector; and calculating cosine similarity of each first person activity vector and each second person activity vector to obtain the human code behavior similarity.
As described above, the person-code behavior similarity refers to how similar the occurrence counts of the portrait and the code are within the same time periods and the same spatial areas.
Specifically, with time period and area range as dimensions, the portrait-event counts and code-event counts of all target persons are tallied to construct the first and second person-activity vectors, respectively. Typically the time period is divided by hour, for example 24 periods per day, and the area range is divided by latitude and longitude (using the geohash algorithm with a precision of 7 characters).
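The vector construction can be sketched as follows; a plain latitude/longitude grid is used here as a stand-in for the 7-character geohash cells, and all names, the grid step, and the event layout are illustrative assumptions:

```python
from collections import Counter

def cell_key(lat, lon, step=0.0014):
    # stand-in for a 7-character geohash cell (roughly 150 m); step is illustrative
    return (round(lat / step), round(lon / step))

def build_activity_vector(events, dims):
    """events: (hour, lat, lon) tuples; dims: ordered (hour, cell) dimensions.

    Counts how many events fall into each (time period, area) bin, and lays the
    counts out along a fixed dimension order shared by both vectors.
    """
    counts = Counter((hour, cell_key(lat, lon)) for hour, lat, lon in events)
    return [counts.get(d, 0) for d in dims]
```

The dimension list would be built once, over the union of all portrait and code events, so that each first and second person-activity vector shares the same coordinate order.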
Then the person-code behavior similarity is calculated according to formula four, where formula four is:
S3 = (Σi Ai·Bi) / (sqrt(Σi Ai²) · sqrt(Σi Bi²))

where S3 is the person-code behavior similarity, A is the first person-activity vector, B is the second person-activity vector, and Ai and Bi are the values of the i-th dimension of A and B respectively.
Determining the person-code behavior similarity by cosine similarity means it accurately reflects the similarity between the portrait and the code, so the final person-code fitting degree has high accuracy.
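Formula four is the standard cosine similarity between the two activity vectors; a direct implementation might read (names are illustrative):

```python
import math

def person_code_behavior_similarity(vec_a, vec_b):
    """Cosine similarity of a first and a second person-activity vector."""
    dot = sum(a * b for a, b in zip(vec_a, vec_b))
    norm_a = math.sqrt(sum(a * a for a in vec_a))
    norm_b = math.sqrt(sum(b * b for b in vec_b))
    if norm_a == 0.0 or norm_b == 0.0:
        return 0.0  # no activity in one modality: treat as no similarity
    return dot / (norm_a * norm_b)
```

Since the activity vectors hold nonnegative counts, the result lies in [0, 1], with 1 meaning identical spatio-temporal activity patterns.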
The invention can be applied to scenarios of relevant departments such as case-restoration analysis and risk early warning: based on the person-code fitting degree, the identities of persons related to an event are found by expanding leads from the spatio-temporal accompanying behavior of person and code, or from their spatio-temporal co-occurrence relation; the identity, trajectory, and other characteristic data of individuals are then used to unravel the case step by step and lock onto the related persons.
It should be noted that the method for analyzing the degree of association of the person provided by the embodiment of the present invention can be applied to devices such as a smart phone, a computer, and a server that can perform data query.
Optionally, referring to fig. 3, fig. 3 is a schematic structural diagram of an analysis apparatus for a person relevance degree according to an embodiment of the present invention, and as shown in fig. 3, the apparatus includes:
an obtaining module 301, configured to obtain each portrait event and each code event to be processed;
a first obtaining module 302, configured to obtain a human code joint index based on each of the human image events and each of the code events;
a second obtaining module 303, configured to obtain a human code fitting degree based on the human code merging index;
a third obtaining module 304, configured to obtain a correlation analysis result between each portrait event and each code event based on the human code fitness.
Optionally, as shown in fig. 4, the apparatus further includes:
a fourth obtaining module 305, configured to obtain a person code association degree, a person code time dispersion degree, and a person code behavior similarity degree based on each portrait event and each code event;
correspondingly, the second obtaining module 303 is configured to obtain a human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree, and the human code behavior similarity.
Optionally, as shown in fig. 5, the first obtaining module 302 includes:
a first obtaining unit 3021 configured to obtain a number of times of face code combining based on time information of each of the face events and time information of each of the code events;
a second obtaining unit 3022 configured to determine a preset smoothing coefficient based on each portrait event, each code event, and a preset analysis purpose;
a first calculating unit 3023, configured to calculate the merging times of the preprocessed human codes by using the merging times of the human codes and the preset smoothing coefficient;
a second calculating unit 3024, configured to calculate the human code joint track index based on the preprocessed human code joint track frequency, the preset smoothing coefficient, and the smoothing coefficient.
Optionally, as shown in fig. 6, the fourth obtaining module 305 includes:
a third obtaining unit 3051, configured to obtain, based on the person-code track-merging counts, the single merging count of each portrait event with each code event, the number of codes merged with each portrait event, and the number of portraits merged with each code event;
a fourth obtaining unit 3052, configured to obtain a combining frequency based on the single combining frequency and the number of codes;
a fifth obtaining unit 3053, configured to obtain a merging frequency index based on the total number of the portraits of each of the portraits events and the number of the portraits;
a sixth obtaining unit 3054, configured to obtain the people code association degree based on the parallel frequency and the parallel frequency index.
Optionally, as shown in fig. 7, the fourth obtaining module 305 further includes:
a seventh obtaining unit 3055, configured to obtain the person-code track-merging days based on the total portrait count of each portrait event and the person-code track-merging counts;
an eighth obtaining unit 3056, configured to obtain the total person-code merging days based on the per-pair merging days;
a ninth obtaining unit 3057, configured to calculate the time dispersion of the human codes by using a preset merging days threshold, the human code merging days, and the total human code merging days.
Optionally, as shown in fig. 8, the fourth obtaining module 305 further includes:
a first constructing unit 3058, configured to construct, by using each of the portrait events, a plurality of first human activity vectors corresponding to a plurality of the target people, where one target person corresponds to one first human activity vector;
a second constructing unit 3059, configured to construct, by using each code event, a plurality of second person activity vectors corresponding to a plurality of target persons, where one target person corresponds to one second person activity vector;
a third calculation unit 30510, configured to perform cosine similarity calculation on each first person activity vector and each second person activity vector to obtain the human code behavior similarity.
Optionally, as shown in fig. 9, the apparatus further includes:
a weight obtaining module 306, configured to obtain a first weight of the person code association degree, a second weight of the person code time dispersion degree, and a third weight of the person code behavior similarity degree;
correspondingly, the second obtaining module 303 is configured to obtain an intermediate score based on the personal code association degree, the personal code time dispersion degree, the personal code behavior similarity degree, the first weight, the second weight, and the third weight; and calculating the human code fitting degree by utilizing the human code merging index and the intermediate score.
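The text does not fix the exact combination of the merging index with the intermediate score; a minimal sketch assuming a weighted sum followed by a product (both the names and the product form are assumptions):

```python
def person_code_fitting_degree(merge_index, association, time_dispersion,
                               behavior_similarity, w1, w2, w3):
    """Combine the four quantities into a fitting degree.

    The intermediate score is the weighted sum of the three sub-scores; how it
    is then combined with the merging index is left open by the text, so a
    simple product is assumed here.
    """
    intermediate = (w1 * association + w2 * time_dispersion
                    + w3 * behavior_similarity)
    return merge_index * intermediate
```

The weights w1, w2, w3 correspond to the first, second, and third weights obtained by the weight obtaining module 306.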
The personnel association degree analysis apparatus provided by the embodiment of the present invention may be applied to devices capable of data analysis, such as smart phones, computers, and servers.
The analysis device for the personnel association degree provided by the embodiment of the invention can realize each process realized by the analysis method for the personnel association degree in the method embodiment, and can achieve the same beneficial effect. To avoid repetition, further description is omitted here.
Referring to fig. 10, fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, as shown in fig. 10, including: a memory 1002, a processor 1001 and a computer program for a method for analyzing a person relevance, which is stored on the memory 1002 and can be run on the processor 1001, wherein:
the processor 1001 is used for calling the computer program stored in the memory 1002, and executes the following steps:
acquiring various portrait events and various code events to be processed;
acquiring a person code parallel-track index based on each portrait event and each code event;
obtaining a human code fitting degree based on the human code combining index;
and obtaining the correlation degree analysis result between each portrait event and each code event based on the human code fitting degree.
Optionally, before the processor 1001 performs the operation of obtaining the person-code fitting degree based on the person-code merging index, the method further includes:
acquiring a person code association degree, a person code time dispersion degree and a person code behavior similarity degree based on each portrait event and each code event;
accordingly, the processor 1001 performs the operation of obtaining the human code fitting degree based on the human code merging index, including:
and obtaining the human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree and the human code behavior similarity.
Optionally, the processor 1001 obtains a person code merging index based on each of the person image events and each of the code events, and includes:
acquiring the times of the person code parallel track based on the time information of each portrait event and the time information of each code event;
determining a preset smoothing coefficient based on each portrait event, each code event and a preset analysis purpose;
calculating the merging times of the preprocessed human codes by using the merging times of the human codes and the preset smooth coefficient;
and calculating the human code track merging index based on the preprocessed human code track merging times, the preset smooth coefficient and the smooth coefficient.
Optionally, the processor 1001 obtains the person-code association degree based on each portrait event and each code event, and includes:
obtaining, based on the person-code track-merging counts, the single track-merging count of each portrait event with each code event, the number of codes merged with each portrait event, and the number of portraits merged with each code event;
obtaining a track combining frequency based on the single track combining times and the code quantity;
obtaining a parallel-orbit frequency index based on the total number of the portraits of each portrait event and the number of the portraits;
and obtaining the person-code association degree based on the merging frequency and the merging-frequency index.
Optionally, the processor 1001 obtains a person code time dispersion based on each of the portrait events and each of the code events, and includes:
acquiring the person-code track-merging days based on the total portrait count of each portrait event and the person-code track-merging counts;
acquiring the total number of days of the people code combining track based on the number of days of the people code combining track;
and calculating the time dispersion of the people codes by utilizing a preset merging track day threshold value, the people code merging track days and the total people code merging track days.
Optionally, the processor 1001 performs a step of obtaining a human code behavior similarity based on each portrait event and each code event, including:
constructing a plurality of first person activity vectors corresponding to a plurality of target persons by utilizing each portrait event, wherein one target person corresponds to one first person activity vector;
constructing a plurality of second personnel activity vectors corresponding to a plurality of target personnel by utilizing each code event, wherein one target personnel corresponds to one second personnel activity vector;
and calculating cosine similarity of each first person activity vector and each second person activity vector to obtain the human code behavior similarity.
Optionally, before the processor 1001 executes the human code merging index, the human code association degree, the human code time dispersion degree, and the human code behavior similarity to obtain the human code fitting degree, the method further includes:
acquiring a first weight of the person code association degree, a second weight of the person code time dispersion degree and a third weight of the person code behavior similarity degree;
correspondingly, the processor 1001 obtains the human code fitness based on the human code merging index, the human code association degree, the human code time dispersion degree, and the human code behavior similarity, and includes:
obtaining a middle score based on the person code association degree, the person code time dispersion degree, the person code behavior similarity degree, the first weight, the second weight and the third weight;
and calculating the human code fitting degree by utilizing the human code merging index and the intermediate score.
The electronic device provided by the embodiment of the invention can be applied to devices such as a smart phone, a computer, and a server which can analyze the association degree of the person.
The electronic equipment provided by the embodiment of the invention can realize each process realized by the analysis method of the personnel association degree in the method embodiment, and can achieve the same beneficial effect. To avoid repetition, further description is omitted here.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the method for analyzing the person relevance provided in the embodiment of the present invention or the method for analyzing the person relevance of the application terminal, and can achieve the same technical effect, and in order to avoid repetition, the detailed description is omitted here.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure describes only preferred embodiments of the present invention and is of course not intended thereby to limit the scope of the claims of the invention.

Claims (10)

1. A method for analyzing the degree of association of people is characterized by comprising the following steps:
acquiring various portrait events and various code events to be processed;
acquiring a person code parallel-track index based on each portrait event and each code event;
obtaining a human code fitting degree based on the human code combining index;
and obtaining the correlation degree analysis result between each portrait event and each code event based on the human code fitting degree.
2. The method of claim 1, wherein before obtaining the human code fitness based on the human code merging index, the method further comprises:
acquiring a person code association degree, a person code time dispersion degree and a person code behavior similarity degree based on each portrait event and each code event;
the obtaining of the human code fitting degree based on the human code combining index comprises the following steps:
and obtaining the human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree and the human code behavior similarity.
3. The method of claim 2, wherein said step of obtaining a face code merge index based on each of said face events and each of said code events comprises:
acquiring the times of the person code parallel track based on the time information of each portrait event and the time information of each code event;
determining a preset smoothing coefficient based on each portrait event, each code event and a preset analysis purpose;
calculating the merging times of the preprocessed human codes by using the merging times of the human codes and the preset smooth coefficient;
and calculating the human code track merging index based on the preprocessed human code track merging times, the preset smooth coefficient and the smooth coefficient.
4. The method of claim 3, wherein obtaining a person-code relevancy based on each of the portrait events and each of the code events comprises:
obtaining, based on the person-code track-merging counts, the single track-merging count of each portrait event with each code event, the number of codes merged with each portrait event, and the number of portraits merged with each code event;
obtaining a track combining frequency based on the single track combining times and the code quantity;
obtaining a parallel-orbit frequency index based on the total number of the portraits of each portrait event and the number of the portraits;
and obtaining the association degree of the people codes based on the parallel track frequency and the parallel track frequency index.
5. The method of claim 3, wherein obtaining a person-code time dispersion based on each of the portrait events and each of the code events comprises:
acquiring the person-code track-merging days based on the total portrait count of each portrait event and the person-code track-merging counts;
acquiring the total number of days of the people code combining track based on the number of days of the people code combining track;
and calculating the time dispersion of the people codes by utilizing a preset merging track day threshold value, the people code merging track days and the total people code merging track days.
6. The method of claim 2, wherein the step of obtaining a person-code behavior similarity based on each of the portrait events and each of the code events comprises:
constructing a plurality of first person activity vectors corresponding to a plurality of target persons by utilizing each portrait event, wherein one target person corresponds to one first person activity vector;
constructing a plurality of second personnel activity vectors corresponding to a plurality of target personnel by utilizing each code event, wherein one target personnel corresponds to one second personnel activity vector;
and calculating cosine similarity of each first person activity vector and each second person activity vector to obtain the human code behavior similarity.
7. The method of any one of claims 2-6, wherein before the step of obtaining the human code fitness based on the human code joint score index, the human code association degree, the human code time dispersion degree, and the human code behavior similarity, the method further comprises:
acquiring a first weight of the person code association degree, a second weight of the person code time dispersion degree and a third weight of the person code behavior similarity degree;
the step of obtaining the human code fitting degree based on the human code merging index, the human code association degree, the human code time dispersion degree and the human code behavior similarity degree comprises the following steps:
obtaining a middle score based on the person code association degree, the person code time dispersion degree, the person code behavior similarity degree, the first weight, the second weight and the third weight;
and calculating the human code fitting degree by utilizing the human code merging index and the intermediate score.
8. An apparatus for analyzing a degree of association between persons, the apparatus comprising:
the acquisition module is used for acquiring each portrait event and each code event to be processed;
a first obtaining module, configured to obtain a human code merging index based on each of the portrait events and each of the code events;
the second obtaining module is used for obtaining the human code fitting degree based on the human code combining index;
and the third obtaining module is used for obtaining the association degree analysis result between each portrait event and each code event based on the human code fitting degree.
9. An electronic device, comprising: memory, processor and computer program stored on the memory and executable on the processor, the processor implementing the steps in the method for analyzing a person relevance as claimed in any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, which, when being executed by a processor, carries out the steps in the method of analyzing a person relevance as claimed in any one of claims 1 to 7.
CN202111678119.0A 2021-12-31 2021-12-31 Personnel association degree analysis method and device, electronic equipment and storage medium Pending CN114445646A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111678119.0A CN114445646A (en) 2021-12-31 2021-12-31 Personnel association degree analysis method and device, electronic equipment and storage medium
PCT/CN2022/143531 WO2023125840A1 (en) 2021-12-31 2022-12-29 Personnel association degree analysis method, apparatus, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111678119.0A CN114445646A (en) 2021-12-31 2021-12-31 Personnel association degree analysis method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114445646A true CN114445646A (en) 2022-05-06

Family

ID=81365831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111678119.0A Pending CN114445646A (en) 2021-12-31 2021-12-31 Personnel association degree analysis method and device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN114445646A (en)
WO (1) WO2023125840A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023125840A1 (en) * 2021-12-31 2023-07-06 深圳云天励飞技术股份有限公司 Personnel association degree analysis method, apparatus, electronic device and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5435249B2 (en) * 2011-03-23 2014-03-05 日本電気株式会社 Event analysis apparatus, event analysis method, and program
CN110751042B (en) * 2019-09-19 2023-02-14 任子行网络技术股份有限公司 Time partition-based portrait and IMSI information association method and system
CN110781336B (en) * 2019-09-30 2022-08-09 任子行网络技术股份有限公司 Method and system for fusing portrait data and mobile phone feature data based on global filing
CN110874362A (en) * 2019-10-29 2020-03-10 青岛海信网络科技股份有限公司 Data association analysis method and device
CN114445646A (en) * 2021-12-31 2022-05-06 深圳云天励飞技术股份有限公司 Personnel association degree analysis method and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
WO2023125840A1 (en) 2023-07-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination