CN111324772B - Personnel relationship determination method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111324772B
CN111324772B
Authority
CN
China
Prior art keywords
dimension
person
target person
index
persons
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910670269.3A
Other languages
Chinese (zh)
Other versions
CN111324772A (en)
Inventor
陈登杭
彭左
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Hangzhou Hikvision System Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision System Technology Co Ltd filed Critical Hangzhou Hikvision System Technology Co Ltd
Priority to CN201910670269.3A priority Critical patent/CN111324772B/en
Publication of CN111324772A publication Critical patent/CN111324772A/en
Application granted granted Critical
Publication of CN111324772B publication Critical patent/CN111324772B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/783Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/7837Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using objects detected or recognised in the video content
    • G06F16/784Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using objects detected or recognised in the video content the detected or recognised objects being people
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Library & Information Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiment of the invention provides a person relationship determination method and device, an electronic device, and a storage medium. The scheme is as follows: obtain a surveillance video containing a target person, obtain video structured data from the surveillance video, and determine a relationship index between the target person and other persons according to the person identity data and person interaction data contained in the video structured data. With this scheme, the relationships among the persons in the surveillance video can be analyzed quickly and efficiently from the video structured data, and the probability of misjudgment caused by subjective human factors is reduced.

Description

Personnel relationship determination method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of intelligent monitoring device technologies, and in particular, to a method and an apparatus for determining a person relationship, an electronic device, and a storage medium.
Background
To facilitate the monitoring of people, property, and the like, intelligent monitoring devices are used increasingly widely in production and daily life.
At present, intelligent monitoring devices are deployed in smart-campus solutions, so that key targets can be effectively monitored and persons on campus can be recognized and captured in real time. When a person is suspected to be missing, all surveillance videos captured by intelligent monitoring devices near that person are browsed manually to determine the place where the person last appeared and the persons who last had contact with the person, thereby assisting the search. In real life, a disappearance often follows an interaction between the suspected missing person and the people around them, in the course of which some person develops a negative emotion. Therefore, when analyzing the surveillance video of a suspected missing person, it is important to analyze the relationships between that person and the other persons in the video.
When analyzing the relationships among different persons in surveillance video, all the videos are usually browsed manually and useful information is extracted from them; the relationships among different persons are then judged manually based on the extracted information, and the suspected missing person is tracked according to the confirmed relationships. Throughout this process, the large data volume of surveillance video makes manual analysis slow, and misjudgments may occur for subjective reasons on the part of the analyst.
Disclosure of Invention
The embodiment of the invention aims to provide a person relationship determination method and device, an electronic device, and a storage medium, so that surveillance video can be analyzed quickly and efficiently and the probability of misjudgment is reduced. The specific technical scheme is as follows:
the embodiment of the invention provides a method for determining a personnel relationship, which comprises the following steps:
acquiring a monitoring video containing a target person;
acquiring video structured data in the monitoring video;
and determining a relationship index between the target person and other persons according to the person identity data and person interaction data contained in the video structured data, wherein the relationship index represents the strength of the relationship between the target person and the other persons.
Optionally, the step of determining the relationship index between the target person and the other person according to the person identity data and the person interaction data included in the video structured data includes:
determining an inherent relationship index between the target person and other persons according to the person identity data contained in the video structured data;
determining an interaction relation index between the target person and the other persons according to person interaction data contained in the video structured data;
determining a relationship index between the target person and the other person based on the inherent relationship index and the interaction relationship index.
Optionally, the person identity data includes identity data in at least one dimension;
the step of determining the inherent relationship index between the target person and other persons according to the person identity data contained in the video structured data includes:
for the person identity data in each dimension, judging whether the person identity data of the target person in that dimension is the same as that of the other person;
if they are the same, determining the inherent relationship index of that dimension to be a first preset value;
and if they are different, determining the inherent relationship index of that dimension to be a second preset value.
Optionally, the person identity data comprises identity data in one or more of an age dimension, a native-place dimension, and a dormitory dimension.
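The same/different rule above can be sketched as follows (the dimension names and the preset values 1.0/0.0 are illustrative assumptions, not values fixed by the patent):

```python
# Assumed preset values; the patent only requires two distinct preset values.
FIRST_PRESET = 1.0   # identity data identical in this dimension
SECOND_PRESET = 0.0  # identity data differs in this dimension

def inherent_indices(target: dict, other: dict,
                     dims=("age", "native_place", "dormitory")) -> dict:
    """Per-dimension inherent relationship indices between two persons."""
    return {
        dim: FIRST_PRESET if target.get(dim) == other.get(dim) else SECOND_PRESET
        for dim in dims
    }

target = {"age": 15, "native_place": "Hangzhou", "dormitory": "B2"}
other = {"age": 15, "native_place": "Ningbo", "dormitory": "B2"}
# → {"age": 1.0, "native_place": 0.0, "dormitory": 1.0}
indices = inherent_indices(target, other)
```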
Optionally, the person interaction data includes interaction data in one or more of the following dimensions: effective contact time, traveling together (peer), conversation duration, conversation count, conversation emotion, fighting count, and fighting duration;
the step of determining the interaction relationship index between the target person and the other persons according to the person interaction data contained in the video structured data includes:
if the person interaction data includes interaction data of the effective contact time dimension, determining the interaction relationship index between the target person and the other person in the effective contact time dimension according to the contact time between the other person and the target person;
if the person interaction data includes interaction data of the peer dimension, determining the interaction relationship index between the target person and the other person in the peer dimension according to the number of times the target person and the other person traveled together;
if the person interaction data includes interaction data of the conversation duration dimension, determining the interaction relationship index between the target person and the other person in the conversation duration dimension according to the number of conversations between the target person and the other person and the duration of each conversation;
if the person interaction data includes interaction data of the conversation count dimension, determining the interaction relationship index between the target person and the other person in the conversation count dimension according to the number of conversations between the target person and the other person;
if the person interaction data includes interaction data of the conversation emotion dimension, determining the interaction relationship index between the target person and the other person in the conversation emotion dimension according to the expressions of the target person and the other person during conversations and the number of times those expressions appear;
if the person interaction data includes interaction data of the fighting count dimension, determining the interaction relationship index between the target person and the other person in the fighting count dimension according to the number of fights between the target person and the other person;
and if the person interaction data includes interaction data of the fighting duration dimension, determining the interaction relationship index between the target person and the other person in the fighting duration dimension according to the number of fights between the target person and the other person and the duration of each fight.
Optionally, the step of determining, according to the contact time between the other person and the target person, the interaction relationship index in the effective contact time dimension includes:
for each other person, judging whether the contact time between that person and the target person exceeds a preset duration;
if so, determining the interaction relationship index d_r1 between the target person and the other person in the effective contact time dimension using the following formula (given as image BDA0002141474990000031 in the original):
where a_1 is the adjustment coefficient of the effective contact time dimension, 0 < a_1 < 1; n_1 is the number of effective contacts between the target person and the other person; i indexes the i-th effective contact; p_1 is the target person; p_2 is the other person; and t^i_{p1,p2} (image BDA0002141474990000035 in the original) is the effective contact time of the i-th effective contact.
Optionally, the step of determining, according to the number of times the target person and the other person traveled together, the interaction relationship index in the peer dimension includes:
for each other person, determining the number of times that person traveled together with the target person;
determining the interaction relationship index d_r2 between the target person and the other person in the peer dimension using the following formula (given as image BDA0002141474990000032 in the original):
where a_2 is the adjustment coefficient of the peer dimension, 0 < a_2 < 1, and n_2 is the number of times the target person and the other person traveled together.
Optionally, the step of determining the interaction relationship index in the conversation duration dimension according to the number of conversations between the target person and the other person and the duration of each conversation includes:
for each other person, determining the number of conversations between the target person and that person and the duration of each conversation;
determining the interaction relationship index d_r3 between the target person and the other person in the conversation duration dimension using the following formula (given as image BDA0002141474990000033 in the original):
where a_3 is the adjustment coefficient of the conversation duration dimension, 0 < a_3 < 1; n_3 is the number of conversations between the target person and the other person; i indexes the i-th conversation; and t_i is the duration of the i-th conversation.
Optionally, the step of determining, according to the number of conversations between the target person and the other person, the interaction relationship index in the conversation count dimension includes:
for each other person, determining the number of conversations between the target person and that person;
determining the interaction relationship index d_r4 between the target person and the other person in the conversation count dimension using the following formula (given as image BDA0002141474990000034 in the original):
where a_4 is the adjustment coefficient of the conversation count dimension, 0 < a_4 < 1, and n_3 is the number of conversations between the target person and the other person.
Optionally, the step of determining the interaction relationship index in the conversation emotion dimension according to the expressions of the target person and the other person during conversations and the number of times those expressions appear includes:
for each other person, determining the expressions that the target person shows during conversations with that person and the number of times those expressions appear;
determining the interaction relationship index d_r5 between the target person and the other person in the conversation emotion dimension using the following formula (given as image BDA0002141474990000041 in the original):
where n_5 is the number of times the target person's expressions appear, i indexes the i-th appearance of an expression, and C_i is the score corresponding to the i-th expression of the target person.
Optionally, the step of determining, according to the number of fights between the target person and the other person, the interaction relationship index in the fighting count dimension includes:
for each other person, determining the number of fights between the target person and that person;
determining the interaction relationship index d_r6 between the target person and the other person in the fighting count dimension using the following formula (given as image BDA0002141474990000042 in the original):
where a_6 is the adjustment coefficient of the fighting count dimension, 0 < a_6 < 1, and n_6 is the number of fights between the target person and the other person.
Optionally, the step of determining the interaction relationship index in the fighting duration dimension according to the number of fights between the target person and the other person and the duration of each fight includes:
for each other person, determining the number of fights between the target person and that person and the duration of each fight;
determining the interaction relationship index d_r7 between the target person and the other person in the fighting duration dimension using the following formula (given as image BDA0002141474990000043 in the original):
where a_7 is the adjustment coefficient of the fighting duration dimension, 0 < a_7 < 1; n_6 is the number of fights between the target person and the other person; i indexes the i-th fight; and t_ti is the duration of the i-th fight.
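The patent gives each interaction-index formula only as an image. Purely as an illustration, the sketch below assumes a saturating form d = 1 − a^x (with x the accumulated quantity of the dimension), which is consistent with the stated constraints (adjustment coefficient 0 < a < 1, index growing with more interaction) but is an assumption, not the patent's actual formula:

```python
def interaction_index(amount: float, a: float) -> float:
    """Assumed saturating form d = 1 - a**amount: the index grows
    toward 1 as the accumulated interaction amount grows, since 0 < a < 1."""
    assert 0.0 < a < 1.0, "adjustment coefficient must lie in (0, 1)"
    return 1.0 - a ** amount

# Fighting-duration dimension (d_r7): accumulate the durations of all fights.
fight_durations = [30.0, 45.0]  # hypothetical durations of two fights, seconds
d_r7 = interaction_index(sum(fight_durations), a=0.9)

# More or longer interaction pushes the index closer to 1.
assert interaction_index(10, 0.9) < interaction_index(20, 0.9) < 1.0
```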
Optionally, the step of determining the relationship index between the target person and the other person based on the inherent relationship index and the interaction relationship index includes:
determining a first weight of the inherent relation index of each dimension and a second weight of the interaction relation index of each dimension;
and determining the relationship index between the target person and the other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension and the interaction relationship index of each dimension.
Optionally, the step of determining the relationship index between the target person and the other person according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension, and the interaction relationship index of each dimension includes:
determining the relationship index d between the target person and the other person according to the following formula:
d = Σ_{i=1}^{n} w_fi·d_fi + Σ_{j=1}^{m} w_rj·d_rj
where n is the total number of dimensions contained in the person identity data, i indexes the i-th of the n dimensions, w_fi is the first weight corresponding to the inherent relationship index of the i-th dimension, d_fi is the inherent relationship index of the i-th dimension, m is the total number of dimensions contained in the person interaction data, j indexes the j-th of the m dimensions, w_rj is the second weight corresponding to the interaction relationship index of the j-th dimension, and d_rj is the interaction relationship index of the j-th dimension.
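The weighted sum above can be sketched directly (the weights and index values below are illustrative, not values from the patent):

```python
def relationship_index(w_f, d_f, w_r, d_r) -> float:
    """d = sum_i w_fi * d_fi + sum_j w_rj * d_rj: weighted sum of the
    per-dimension inherent and interaction relationship indices."""
    return sum(w * d for w, d in zip(w_f, d_f)) + \
           sum(w * d for w, d in zip(w_r, d_r))

# Hypothetical weights/indices for two identity and two interaction dimensions:
w_f, d_f = [0.2, 0.1], [1.0, 0.0]
w_r, d_r = [0.4, 0.3], [0.8, 0.5]
d = relationship_index(w_f, d_f, w_r, d_r)  # 0.2 + 0.0 + 0.32 + 0.15 = 0.67
```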
Optionally, after determining the relationship index between the target person and the other person according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension, and the interaction relationship index of each dimension, the method further includes:
re-determining the relationship index between the target person and the other person according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension, the interaction relationship index of each dimension, and an attenuation index.
Optionally, the step of re-determining the relationship index between the target person and the other person according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension, the interaction relationship index of each dimension, and the attenuation index includes:
re-determining the relationship index d′ between the target person and the other person using the following formulas:
d′_rj = d_rj·a^t
d′ = Σ_{i=1}^{n} w_fi·d_fi + Σ_{j=1}^{m} w_rj·d′_rj
where n is the total number of dimensions contained in the person identity data, i indexes the i-th of the n dimensions, w_fi is the first weight of the inherent relationship index of the i-th dimension, d_fi is the inherent relationship index of the i-th dimension, m is the total number of dimensions contained in the person interaction data, j indexes the j-th of the m dimensions, w_rj is the second weight of the interaction relationship index of the j-th dimension, d_rj is the interaction relationship index of the j-th dimension before attenuation, t is the elapsed time, a is the attenuation coefficient, 0 < a < 1, and d′_rj is the interaction relationship index of the j-th dimension after attenuation.
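A sketch of the attenuation step: each interaction index is scaled by a^t before the weighted sum is recomputed, while the inherent indices are left unchanged (the values of a, t, and the weights below are illustrative assumptions):

```python
def decayed_relationship_index(w_f, d_f, w_r, d_r, a: float, t: float) -> float:
    """Apply d'_rj = d_rj * a**t to every interaction index, then recompute
    the weighted sum d'. Inherent (identity) indices do not decay."""
    assert 0.0 < a < 1.0, "attenuation coefficient must lie in (0, 1)"
    d_r_decayed = [d * a ** t for d in d_r]
    return sum(w * d for w, d in zip(w_f, d_f)) + \
           sum(w * d for w, d in zip(w_r, d_r_decayed))

# With no elapsed time (t = 0) the index is unchanged; it shrinks as t grows.
w_f, d_f, w_r, d_r = [0.2], [1.0], [0.8], [0.5]
d_now = decayed_relationship_index(w_f, d_f, w_r, d_r, a=0.9, t=0)    # 0.6
d_later = decayed_relationship_index(w_f, d_f, w_r, d_r, a=0.9, t=5)  # < 0.6
```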
Optionally, after re-determining the relationship index between the target person and the other person according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension, the interaction relationship index of each dimension, and the attenuation index, the method further includes:
sorting the other persons in descending order of the re-determined relationship indexes between the target person and each of them, to obtain a person relationship list for the target person.
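The ranking step can be sketched as follows (person names and scores are hypothetical):

```python
def person_relationship_list(indices: dict) -> list:
    """Sort the other persons by relationship index, largest first,
    yielding the target person's person relationship list."""
    return sorted(indices, key=indices.get, reverse=True)

# Hypothetical re-determined relationship indexes for three other persons:
indices = {"Alice": 0.42, "Bob": 0.87, "Carol": 0.65}
ranked = person_relationship_list(indices)  # ["Bob", "Carol", "Alice"]
```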
The embodiment of the invention also provides a personnel relationship determining device, which comprises:
the video acquisition module is used for acquiring a monitoring video containing target personnel;
the data acquisition module is used for acquiring video structured data in the monitoring video;
and the relation index determining module is used for determining a relation index between the target person and other persons according to the person identity data and the person interaction data contained in the video structured data, wherein the relation index represents the strength of the person relation between the target person and the other persons.
Optionally, the relationship index determining module includes:
the inherent relationship index determining submodule is used for determining the inherent relationship index between the target person and other persons according to the person identity data contained in the video structured data;
the interactive relation index determining submodule is used for determining an interactive relation index between the target person and the other persons according to the person interactive data contained in the video structured data;
and the relation index determining submodule is used for determining the relation index between the target person and the other persons based on the inherent relation index and the interaction relation index.
Optionally, the person identity data includes identity data in at least one dimension;
the inherent relationship index determining submodule is specifically configured to, for the person identity data in each dimension, determine whether the person identity data of the target person in the dimension is the same as the person identity data of other persons; if the dimension is the same as the preset value, determining that the inherent relation index of the dimension is a first preset value; and if not, determining the inherent relation index of the dimension as a second preset value.
Optionally, the person identity data comprises identity data in one or more of an age dimension, a native-place dimension, and a dormitory dimension.
Optionally, the person interaction data includes interaction data in one or more of the following dimensions: effective contact time, traveling together (peer), conversation duration, conversation count, conversation emotion, fighting count, and fighting duration;
the interaction relation index determining submodule comprises:
a first determining unit, configured to, if the person interaction data includes interaction data of the effective contact time dimension, determine the interaction relationship index between the target person and the other person in the effective contact time dimension according to the contact time between the other person and the target person;
a second determining unit, configured to, if the person interaction data includes interaction data of the peer dimension, determine the interaction relationship index between the target person and the other person in the peer dimension according to the number of times the target person and the other person traveled together;
a third determining unit, configured to, if the person interaction data includes interaction data of the conversation duration dimension, determine the interaction relationship index between the target person and the other person in the conversation duration dimension according to the number of conversations between the target person and the other person and the duration of each conversation;
a fourth determining unit, configured to, if the person interaction data includes interaction data of the conversation count dimension, determine the interaction relationship index between the target person and the other person in the conversation count dimension according to the number of conversations between the target person and the other person;
a fifth determining unit, configured to, if the person interaction data includes interaction data of the conversation emotion dimension, determine the interaction relationship index between the target person and the other person in the conversation emotion dimension according to the expressions of the target person and the other person during conversations and the number of times those expressions appear;
a sixth determining unit, configured to, if the person interaction data includes interaction data of the fighting count dimension, determine the interaction relationship index between the target person and the other person in the fighting count dimension according to the number of fights between the target person and the other person;
and a seventh determining unit, configured to, if the person interaction data includes interaction data of the fighting duration dimension, determine the interaction relationship index between the target person and the other person in the fighting duration dimension according to the number of fights between the target person and the other person and the duration of each fight.
Optionally, the first determining unit is specifically configured to, for each other person, judge whether the contact time between that person and the target person exceeds a preset duration; and if so, determine the interaction relationship index d_r1 between the target person and the other person in the effective contact time dimension using the following formula (given as image BDA0002141474990000061 in the original):
where a_1 is the adjustment coefficient of the effective contact time dimension, 0 < a_1 < 1; n_1 is the number of effective contacts between the target person and the other person; i indexes the i-th effective contact; p_1 is the target person; p_2 is the other person; and t^i_{p1,p2} (image BDA0002141474990000077 in the original) is the effective contact time of the i-th effective contact.
Optionally, the second determining unit is specifically configured to, for each other person, determine the number of times that person traveled together with the target person, and determine the interaction relationship index d_r2 between the target person and the other person in the peer dimension using the following formula (given as image BDA0002141474990000071 in the original):
where a_2 is the adjustment coefficient of the peer dimension, 0 < a_2 < 1, and n_2 is the number of times the target person and the other person traveled together.
Optionally, the third determining unit is specifically configured to, for each other person, determine the number of conversations between the target person and that person and the duration of each conversation, and determine the interaction relationship index d_r3 between the target person and the other person in the conversation duration dimension using the following formula (given as image BDA0002141474990000072 in the original):
where a_3 is the adjustment coefficient of the conversation duration dimension, 0 < a_3 < 1; n_3 is the number of conversations between the target person and the other person; i indexes the i-th conversation; and t_i is the duration of the i-th conversation.
Optionally, the fourth determining unit is specifically configured to, for each other person, determine the number of conversations between the target person and that person, and determine the interaction relationship index d_r4 between the target person and the other person in the conversation count dimension using the following formula (given as image BDA0002141474990000073 in the original):
where a_4 is the adjustment coefficient of the conversation count dimension, 0 < a_4 < 1, and n_3 is the number of conversations between the target person and the other person.
Optionally, the fifth determining unit is specifically configured to, for each other person, determine the expressions that the target person shows during conversations with that person and the number of times those expressions appear, and determine the interaction relationship index d_r5 between the target person and the other person in the conversation emotion dimension using the following formula (given as image BDA0002141474990000074 in the original):
where n_5 is the number of times the target person's expressions appear, i indexes the i-th appearance of an expression, and C_i is the score corresponding to the i-th expression of the target person.
Optionally, the sixth determining unit is specifically configured to, for each other person, determine the number of fights between the target person and that person, and determine the interaction relationship index d_r6 between the target person and the other person in the fighting count dimension using the following formula (given as image BDA0002141474990000075 in the original):
where a_6 is the adjustment coefficient of the fighting count dimension, 0 < a_6 < 1, and n_6 is the number of fights between the target person and the other person.
Optionally, the seventh determining unit is specifically configured to: for each other person, determine the number of fights between the target person and that person, and the duration of each of those fights; and determine the interaction relationship index d_r7 between the target person and that person in the fighting duration dimension using the following formula:

d_r7 = a_7 · Σ_{i=1}^{n_6} t_{ti}

where a_7 is the adjustment coefficient for the fighting duration dimension, 0 < a_7 < 1, n_6 is the number of fights between the target person and the other person, i denotes the i-th fight between them, and t_{ti} is the duration of the i-th fight between the target person and the other person.
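The two fight-related indexes above can be sketched together, since both derive from the same list of per-fight durations; the function name and default coefficients are illustrative assumptions:

```python
def fight_indexes(fight_durations, a6=0.5, a7=0.5):
    """Return (d_r6, d_r7) for one target/other pair.

    fight_durations: list of per-fight durations t_ti; its length is n_6.
    d_r6 = a6 * n_6            (fighting frequency dimension)
    d_r7 = a7 * sum(t_ti)      (fighting duration dimension)
    """
    assert 0 < a6 < 1 and 0 < a7 < 1
    n6 = len(fight_durations)
    return a6 * n6, a7 * sum(fight_durations)
```

For two fights of 2 and 4 minutes with a6 = 0.5 and a7 = 0.25, this yields d_r6 = 1.0 and d_r7 = 1.5.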
Optionally, the relationship index determining submodule includes:
the weight determining unit is used for determining a first weight of the inherent relation index of each dimension and a second weight of the interaction relation index of each dimension;
and the index determining unit is used for determining the relationship index between the target person and the other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension and the interaction relationship index of each dimension.
Optionally, the index determining unit is specifically configured to determine the relationship index d between the target person and the other person according to the following formula:

d = Σ_{i=1}^{n} w_fi · d_fi + Σ_{j=1}^{m} w_rj · d_rj

where n is the total number of dimensions of identity data contained in the person identity data, i denotes the i-th of the n dimensions, w_fi is the first weight corresponding to the inherent relationship index of the i-th dimension, d_fi is the inherent relationship index of the i-th dimension, m is the total number of dimensions of interaction data contained in the person interaction data, j denotes the j-th of the m dimensions, w_rj is the second weight corresponding to the interaction relationship index of the j-th dimension, and d_rj is the interaction relationship index of the j-th dimension.
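The weighted combination above is a plain dot product over the two index vectors; a minimal sketch, with the function name and sample weights as assumptions:

```python
def relationship_index(w_f, d_f, w_r, d_r):
    """d = sum_i w_fi * d_fi + sum_j w_rj * d_rj

    w_f/d_f: first weights and inherent relationship indexes (n dimensions);
    w_r/d_r: second weights and interaction relationship indexes (m dimensions).
    """
    assert len(w_f) == len(d_f) and len(w_r) == len(d_r)
    return (sum(w * d for w, d in zip(w_f, d_f))
            + sum(w * d for w, d in zip(w_r, d_r)))
```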
Optionally, the apparatus further includes:
and the relation index re-determining module is used for re-determining the relation index between the target person and the other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relation index of each dimension, the interaction relation index of each dimension and the attenuation index.
Optionally, the relationship index re-determining module is specifically configured to re-determine the relationship index d' between the target person and the other person using the following formulas:

d'_rj = d_rj · a^t

d' = Σ_{i=1}^{n} w_fi · d_fi + Σ_{j=1}^{m} w_rj · d'_rj

where n is the total number of dimensions of identity data contained in the person identity data, i denotes the i-th of the n dimensions, w_fi is the first weight of the inherent relationship index of the i-th dimension, d_fi is the inherent relationship index of the i-th dimension, m is the total number of dimensions of interaction data contained in the person interaction data, j denotes the j-th of the m dimensions, w_rj is the second weight of the interaction relationship index of the j-th dimension, d_rj is the interaction relationship index of the j-th dimension before decay, t is the elapsed time, a is the decay coefficient, 0 < a < 1, and d'_rj is the interaction relationship index of the j-th dimension after decay.
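The decay step only rescales the interaction indexes by a^t before re-applying the same weighted sum; a sketch under the same illustrative assumptions as before:

```python
def decayed_relationship_index(w_f, d_f, w_r, d_r, a=0.9, t=0):
    """Re-compute d' with each interaction index decayed: d'_rj = d_rj * a**t.

    Inherent indexes (w_f, d_f) are unchanged; only the interaction
    indexes d_rj decay as time t elapses, with decay coefficient 0 < a < 1.
    """
    assert 0 < a < 1
    decayed = [d * (a ** t) for d in d_r]
    return (sum(w * d for w, d in zip(w_f, d_f))
            + sum(w * d for w, d in zip(w_r, decayed)))
```

With t = 0 the result equals the original index d; larger t shrinks only the interaction contribution.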
Optionally, the apparatus further includes:
and the person relationship list generating module is configured to sort the other persons in descending order of their relationship indexes with the target person, so as to obtain a person relationship list of the target person.
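The list generation above is a descending sort by index; a minimal sketch, with the function name and person identifiers as assumptions:

```python
def person_relationship_list(indexes):
    """Sort other persons by relationship index with the target, descending.

    indexes: dict mapping a person identifier to its relationship index d.
    Returns a list of (person_id, index) pairs, largest index first.
    """
    return sorted(indexes.items(), key=lambda kv: kv[1], reverse=True)
```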
The embodiment of the invention also provides electronic equipment, which comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
a memory for storing a computer program;
and a processor for implementing any of the above-described steps of the method for determining a person relationship when executing a program stored in the memory.
The embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the steps of the method for determining a person relationship are implemented.
Embodiments of the present invention further provide a computer program product containing instructions, which when run on a computer, cause the computer to execute any one of the above-mentioned methods for determining a person relationship.
According to the personnel relationship determining method, the personnel relationship determining device, the electronic equipment and the storage medium, provided by the embodiment of the invention, the monitoring video containing the target personnel can be obtained, the video structured data in the monitoring video is obtained, and the relationship index between the target personnel and other personnel is determined according to the personnel identity data and the personnel interaction data contained in the video structured data. By the scheme provided by the embodiment of the invention, the personnel relationship among the personnel in the monitored video can be quickly and efficiently analyzed according to the video structured data in the monitored video, and the misjudgment probability caused by artificial subjective factors is reduced.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a first flowchart of a method for determining a person relationship according to an embodiment of the present invention;
fig. 2 is a second flowchart of a method for determining a personal relationship according to an embodiment of the present invention;
fig. 3 is a third flowchart of a method for determining a person relationship according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a person relationship determining apparatus according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to solve the problems of slow manual analysis progress and erroneous judgment in the process of personnel relationship analysis, the embodiment of the invention provides a personnel relationship determination method, which comprises the steps of obtaining a monitoring video containing target personnel, obtaining video structured data in the monitoring video, and determining a relationship index between the target personnel and other personnel according to personnel identity data and personnel interaction data contained in the video structured data. By the scheme provided by the embodiment of the invention, the personnel relationship among the personnel in the monitoring video can be quickly and efficiently analyzed according to the video structured data in the monitoring video, and the misjudgment probability caused by artificial subjective factors is reduced.
The following examples are given to illustrate the present invention in detail.
Referring to fig. 1, fig. 1 is a first flowchart of a method for determining a person relationship according to an embodiment of the present invention, where the method includes the following steps.
And step S101, acquiring a monitoring video containing the target personnel.
In this step, the target person may be monitored in real time according to the preset video acquisition time interval, the motion trajectory of the target person is determined, and the monitoring video of each video monitoring point corresponding to the motion trajectory of the target person is acquired.
In the embodiment of the present invention, the preset video acquisition time interval may be set according to the environment where the target person is located. Taking the above smart campus as an example, in one example, if the period beginning at 8:00 is the students' class time, the acquisition interval for target person student A can be set accordingly for that period; in another example, a different interval can be set for a period with a different schedule.
In the embodiment of the invention, each video monitoring point is provided with corresponding intelligent monitoring equipment, and the event of each video monitoring point can be recorded through the intelligent monitoring equipment. The intelligent monitoring equipment can store identity information of each target person in the monitoring area, such as various information of names, face images, main social relations, ages, family addresses, native places and the like of the persons. Of course, the identity information of each target person may also be stored in other devices, for example, the identity information of each target person may be stored in a cloud storage space, and the intelligent monitoring device may call the identity information in the cloud storage space. In addition, the target person may be any one person in the surveillance video, for example, in the smart campus, the target person may be a teacher, a student, a campus worker, a parent of the student, and the like in the campus.
And S102, acquiring video structured data in the monitoring video.
In this step, the monitoring video may be structured based on a face recognition technology, an expression recognition technology, a behavior analysis technology, and the like, so as to obtain video structured data such as the person identity information, the expression information, the behavior information, and the like in the video. For example, behavior analysis technology is adopted to analyze behaviors of people in the monitoring video, conversation behaviors, peer behaviors or fighting behaviors among different people in the monitoring video are determined, and data of different behavior dimensions are obtained.
And S103, determining a relationship index between the target person and other persons according to the person identity data and the person interaction data contained in the video structured data.
In the step, for the target person, the relationship index between the target person and other persons in the monitoring video is calculated according to the person identity data and the person interaction data contained in the video structural data. Wherein the relationship index represents the strength of the personnel relationship between the target personnel and other personnel. The other persons are persons other than the target person in the monitoring video.
By the scheme provided by the embodiment of the invention, the personnel relationship among the personnel in the monitoring video can be quickly and efficiently analyzed according to the video structured data in the monitoring video, and the misjudgment probability caused by artificial subjective factors is reduced.
Based on the embodiment shown in fig. 1, the embodiment of the application further provides a method for determining a person relationship. Referring to the second flowchart of the person relationship determination method shown in fig. 2, the method may include the following steps.
Step S201, a monitoring video containing the target personnel is obtained.
Step S202, acquiring video structured data in the monitoring video.
Step S201 to step S202 are the same as step S101 to step S102.
And step S203, determining the inherent relationship index between the target person and other persons according to the person identity data contained in the video structured data.
In this step, the person identification data of the target person is compared with the person identification data of other persons according to the video structured data. And determining the inherent relation index between the target person and other persons according to the comparison result. Wherein the inherent relation index represents the strength of the inherent relation between the target person and other persons.
And S204, determining an interaction relation index between the target person and other persons according to the person interaction data contained in the video structured data.
In this step, each interaction behavior between the target person and other persons is analyzed according to the person interaction data included in the video structured data, and an interaction relationship index between the target person and other persons is calculated, wherein the interaction relationship index is the strength of the interaction relationship between the target person and other persons.
And S205, determining a relationship index between the target person and other persons based on the inherent relationship index and the interaction relationship index.
In this step, the relationship index between the target person and the other person is calculated according to the inherent relationship index and the interaction relationship index between the target person and the other person.
In an optional embodiment, regarding the above step S203, the person identity data includes identity data in at least one dimension. In one example, the person identity data can include identity data in one or more of an age dimension, a native place dimension, and a living place dimension. Of course, the person identity data may also include identity data in other dimensions, for example an occupation dimension, which is not limited in this embodiment.
In one embodiment, according to the identity data in one or more of the age dimension, the native place dimension, and the living place dimension contained in the person identity data, it is determined, for each such dimension, whether the person identity data of the target person is the same as that of the other person. If they are the same in that dimension, the inherent relationship index in that dimension is a first preset value; if not, the inherent relationship index in that dimension is a second preset value.
For the above-mentioned determination of whether the personnel identity data of the target personnel in each dimension is the same as the personnel identity data of other personnel, specific limitations may be performed according to the identity data of different dimensions, and embodiments of the present invention are not specifically limited.
In one example, it may be determined whether the person identity data of the target person in the dimension is completely consistent with the person identity data of other persons; if the identity data of the target person is completely consistent with the identity data of the other persons, the identity data of the target person is the same as the identity data of the other persons; and if the identity data of the target person is not identical with the identity data of the other persons, the identity data of the target person is different from the identity data of the other persons.
For example, the person identity data includes identity data in the living place dimension, and the living place of the target person is: district D1, city C, province B, country A; the living place of other person 1 is: district D1, city C, province B, country A; and the living place of other person 2 is: district D2, city C, province B, country A. The target person and other person 1 have the same identity data in the living place dimension, so the inherent relationship index between them is the first preset value. The target person and other person 2 have different identity data in the living place dimension, so the inherent relationship index between them is the second preset value.
In another example, whether the deviation between the personnel identity data of the target personnel and the personnel identity data of other personnel in the dimension is within a preset deviation range can be judged; if the deviation is within the preset deviation range, the personnel identity data of the target personnel is the same as the personnel identity data of other personnel; and if the deviation is not within the preset deviation range, the personnel identity data of the target personnel is different from the personnel identity data of other personnel.
For example, the person identity data comprises identity data in an age dimension, with the target person having an age of 18 years, the other persons 1 having an age of 20 years, and the other persons 2 having an age of 25 years. If the preset age deviation range is 3 years, and the age difference of the target person and the other person 1 in the age dimension is within the preset age deviation range, the identity data of the target person and the other person 1 in the age dimension are the same, and the inherent relationship index between the target person and the other person 1 is a first preset value. The age difference of the target person and the other persons 2 in the age dimension exceeds the preset age range, the identity data of the target person and the other persons 2 in the age dimension are different, and the inherent relationship index between the target person and the other persons 2 is a second preset value.
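The first/second preset value scheme above — exact match for categorical data, match within a preset deviation range for numeric data such as age — can be sketched as follows; the function name and preset values are illustrative assumptions:

```python
def intrinsic_index(value_a, value_b, deviation=0, first=1, second=0):
    """Inherent relationship index for one identity-data dimension.

    Numeric data (e.g., age) match when |a - b| <= deviation; other data
    (e.g., a district code) must be exactly equal. Returns the first
    preset value on a match, the second preset value otherwise.
    """
    if isinstance(value_a, (int, float)) and isinstance(value_b, (int, float)):
        same = abs(value_a - value_b) <= deviation
    else:
        same = value_a == value_b
    return first if same else second
```

With a 3-year deviation range, ages 18 and 20 yield the first preset value, while 18 and 25 yield the second, matching the example above.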
For the first preset value and the second preset value, in one example, the first preset value is 1, and the second preset value is 0. If the person identity data of the target person in the dimension 1 is the same as the person identity data of the other persons 1 in the dimension 1, it may be determined that the inherent relationship index of the target person and the other persons 1 in the dimension 1 is 1. If the person identity data of the target person in the dimension 1 is different from the person identity data of the other persons 2 in the dimension 1, it can be determined that the inherent relationship index of the target person and the other persons 2 in the dimension 1 is 0.
In the embodiment of the present invention, specific sizes of the first preset value and the second preset value may be set as preset values according to needs, and are not specifically limited in the embodiment of the present invention.
For example, the person identity data comprises identity data in an age dimension, with the target person having an age of 20 years, the other persons 1 having an age of 20 years, and the other persons 2 having an age of 18 years. The inherent relation index of the target person and other persons 1 is a first preset value, and the inherent relation index of the target person and other persons 2 is a second preset value. In one example, the first preset value may be 10, and the second preset value may be 5. In another example, the first preset value may be 5, and the second preset value may be 1.
In the embodiment of the present invention, the method for determining the inherent relationship may be different according to the result of comparing the identity data of the target person with the identity data of other persons in each dimension, and is not particularly limited in the embodiment of the present invention.
For example, the person identification data is identification data of an age dimension, and the inherent relationship index of the target person and other persons in the dimension can be determined by calculating the age difference between the target person and other persons. If the difference between the ages of the target person and the ages of the other persons is less than 1 year, the inherent relationship index between the target person and the other persons can be a first preset value. If the age difference between the target person and the other persons is greater than or equal to 1 year but less than 3 years, the index of the inherent relationship between the target person and the other persons may be a second preset value. If the age difference between the target person and the other persons is greater than or equal to 3 years, the inherent relationship index between the target person and the other persons may be a third preset value.
For another example, the person identity data is identity data in the native place dimension, and it can be determined one by one whether the country, province, city, district, and so on contained in the native place data of the target person and the other person are the same. For each item that is the same, 1 is added to the inherent relationship index of this dimension; for each item that differs, the index is left unchanged. The default value of the inherent relationship index is 0. In one example, the native place of the target person is: district D1, city C, province B, country A; the native place of other person 1 is: district D2, city C, province B, country A; the native place of other person 2 is: district H, city F, province E, country A; and the native place of other person 3 is: district L, city K, province J, country I. The target person and other person 1 have exactly the same country, province, and city in the native place dimension, so their inherent relationship index in this dimension is 3. The target person and other person 2 are the same only in country, so their inherent relationship index in this dimension is 1. The target person and other person 3 differ even in country, so their inherent relationship index in this dimension is 0.
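The native-place scoring above — one point per matching administrative level — can be sketched as follows; representing a place as a (country, province, city, district) tuple is an assumption for illustration:

```python
def native_place_index(place_a, place_b):
    """Inherent relationship index in the native place dimension.

    Places are tuples like ("A", "B", "C", "D1") for
    (country, province, city, district). Each level that matches adds 1;
    the default index is 0.
    """
    return sum(1 for a, b in zip(place_a, place_b) if a == b)
```

For the example above, the target person scores 3 against other person 1, 1 against other person 2, and 0 against other person 3.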
In the embodiment of the present invention, the personal identity data includes, but is not limited to, identity data of an age dimension, a native place dimension, and a living place dimension. For example, the personal identity data may also include identity data of the person in dimensions of interests, major social relationships, and the like.
According to the method, through the analysis of the inherent relationship indexes of the target person and other persons in different dimensions, the calculation data of the inherent relationship indexes are enriched, and the accuracy of the inherent relationship indexes between the target person and other persons is improved.
In an optional embodiment, the human interaction data includes interaction data in one or more dimensions of an effective contact time dimension, a peer dimension, a conversation duration dimension, a conversation frequency dimension, a conversation emotion dimension, a fighting frequency dimension, and a fighting duration dimension with respect to the step S204.
In the step S204, the method for determining the interaction relation index may include the following steps.
In case one, if the person interaction data includes interaction data in an effective contact time dimension, determining an interaction relationship index between the target person and other persons in the effective contact time dimension according to contact time between the other persons and the target person.
For each other person, if the duration of the simultaneous presence of the target person and the other person in the same video monitoring point is longer than a preset duration, the target person and the other person are considered to have effective contact behavior; and if the duration that the target person and other persons appear in the same video monitoring point at the same time is less than or equal to the preset duration, determining that no effective contact behavior exists between the target person and the other persons.
And if the target person and other persons have effective contact behaviors, the person interaction data between the target person and the other persons comprises interaction data of an effective contact time dimension. And determining an interactive relationship index between the target person and other persons in the effective contact time dimension according to the effective contact time and the effective contact times contained in the interactive data between the target person and other persons.
In the embodiment of the present invention, the effective contact behavior includes, but is not limited to, a peer behavior, a conversation behavior, and a fighting behavior. Such as handshaking, hugging, etc., between the target person and other persons.
In an alternative embodiment, for each other person, it is determined whether the contact time between that person and the target person is greater than a preset duration. If so, the interaction relationship index d_r1 between the target person and that person in the effective contact time dimension is determined using the following formula:

d_r1 = a_1 · Σ_{i=1}^{n_1} t_{p1p2}^i

where a_1 is the adjustment coefficient for the effective contact time dimension, 0 < a_1 < 1, n_1 is the number of effective contacts between the target person and the other person, i denotes the i-th effective contact, p_1 is the target person, p_2 is the other person, and t_{p1p2}^i is the effective contact time of the target person and the other person in the i-th effective contact.
In the embodiment of the present invention, the number of times of effective contact between the target person and the other person may be determined according to the effective contact time between the target person and the other person.
In one example, for a certain video surveillance point, the contact time between target person p_1 and other person p_2 is 10 minutes. If the preset duration is 3 minutes, the contact time between p_1 and p_2 is longer than the preset duration, so this contact duration is an effective contact duration, denoted t_{p1p2}.

From the effective contact duration t_{p1p2} of target person p_1 and other person p_2, the number of effective contacts between them can be determined. For example, for the surveillance video of that video surveillance point, the video is divided according to a preset sampling interval for counting effective contacts. If the video is divided with 1 minute as the interval, then within the effective contact duration t_{p1p2}, 10 consecutive periods can be divided, and each period can represent 1 effective contact between p_1 and p_2. Thus, the number of effective contacts between target person p_1 and other person p_2 can be 10.
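The time-slicing example above can be sketched as follows; durations in minutes, and the function name, default thresholds, and a_1 value are illustrative assumptions:

```python
import math

def effective_contact(contact_minutes, preset_minutes=3, slice_minutes=1, a1=0.5):
    """Return (n_1, d_r1) for one contact at one video surveillance point.

    A contact longer than the preset duration counts as effective; slicing
    it into fixed sampling intervals gives the number of effective contacts
    n_1, and d_r1 = a1 * total effective contact time.
    """
    if contact_minutes <= preset_minutes:
        return 0, 0.0  # no effective contact behavior
    n1 = math.floor(contact_minutes / slice_minutes)
    return n1, a1 * contact_minutes
```

For the 10-minute contact above with a 3-minute threshold and 1-minute slices, this gives n_1 = 10 and d_r1 = 5.0 (with the assumed a1 = 0.5).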
In the embodiment of the present invention, when determining the effective contact time, places where unrelated persons are likely to appear together for some time — such as a school gate, a mall entrance, or an elevator — may be treated according to the specific circumstances of the environment. In one example, the contact time between the target person and other persons in such a place is not recorded as effective contact time. In another example, it may be recorded as effective contact time; for instance, when a special event occurs at the location, such as an elevator failure that prevents people from leaving, the contact time between the target person and other persons during the failure can be recorded as effective contact time.
According to the method, by analyzing the interaction relationship index between the target person and other persons in the effective contact time dimension, the data used to calculate the interaction relationship index is enriched, and the accuracy of the interaction relationship index between the target person and other persons is improved.
In the second case, if the person interaction data includes interaction data in the peer dimension, the interaction relationship index between the target person and other persons in the peer dimension is determined according to the number of times the target person and the other persons travel together.

For each other person, if the time interval between the target person and that person appearing at the same video monitoring point is smaller than a preset peer time interval, the target person and that person are considered to exhibit peer behavior; if the time interval is greater than or equal to the preset peer time interval, it is determined that no peer behavior exists between them.

If peer behavior exists between the target person and the other person, the person interaction data between them includes interaction data in the peer dimension, and the interaction relationship index between them in the peer dimension is determined according to the number of peer occurrences contained in that interaction data.
In the embodiment of the present invention, regarding the determination of peer behavior: in one example, the target person and the other person walk forward together, so the time interval between their appearances at the same video monitoring point is almost 0; in this case peer behavior can be considered to exist between them. In another example, the target person and the other person appear at the same video monitoring point one after the other; if the difference between their appearance times is smaller than the preset peer time interval, peer behavior can also be considered to exist between them.
In an alternative embodiment, for each other person, the number of times the other person is in line with the target person is determined.
Determining an interaction relation index d between the target person and the other person in the dimension of the same line by using the following formula r2
[Formula for d_r2 given as image: Figure BDA0002141474990000141]
where a_2 is the adjustment coefficient of the peer dimension, 0 < a_2 < 1, and n_2 is the number of times the target person travels together with the other person.
In the embodiment of the present invention, the number of peer occurrences between the target person and another person can be determined from the duration for which they travel together.
In one example, for a certain video monitoring point, the target person p_1 appears at 12:02 and the other person p_2 appears at 12:01, and both leave the video monitoring point at 12:11. If the preset peer time interval is 2 minutes, peer behavior exists between the target person p_1 and the other person p_2, and their peer duration is 9 minutes.
The number of peer occurrences between the target person p_1 and the other person p_2 can be determined from their peer duration. For example, the surveillance video of the video monitoring point is divided according to a preset peer-count sampling time interval. If the video is divided with 1 minute as the time interval, the 9 minutes during which the target person p_1 and the other person p_2 travel together can be divided into 9 consecutive time periods, and each time period can represent 1 peer occurrence between them. Thus, the number of peer occurrences between the target person p_1 and the other person p_2 may be 9.
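The peer-count sampling above, together with a hypothetical peer-dimension index, can be sketched as follows. The saturating form 1 - a2**n2 is an assumption: the patent states only that 0 < a_2 < 1 and that the index depends on the peer count, and shows the actual d_r2 formula as an image.

```python
from math import floor

def peer_count(overlap_seconds: float, sample_interval_seconds: float) -> int:
    """Number of peer occurrences: the peer duration divided into
    consecutive sampling intervals, one occurrence per full interval."""
    return floor(overlap_seconds / sample_interval_seconds)

def peer_index(n_peer: int, a2: float = 0.9) -> float:
    """Hypothetical peer-dimension index d_r2: 1 - a2**n2 is an assumed
    saturating form, not the patent's exact (image-only) formula."""
    assert 0.0 < a2 < 1.0
    return 1.0 - a2 ** n_peer

# Example from the text: 9 minutes of peer time, 1-minute sampling interval.
n = peer_count(9 * 60, 60)   # -> 9 peer occurrences
d_r2 = peer_index(n)
```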
According to the above method, analyzing the interaction relation index between the target person and other persons in the peer dimension enriches the data used to calculate the interaction relation index and improves the accuracy of the interaction relation index between the target person and other persons.
Thirdly, if the person interaction data includes interaction data of the conversation duration dimension, the interaction relation index between the target person and another person in the conversation duration dimension is determined according to the number of conversations between them and the duration of each conversation.
For each other person, whether conversation behavior exists between the target person and that other person can be determined from the mouth movements, head orientation, limb movements, and the like of the two persons.
If conversation behavior exists between the target person and the other person, the person interaction data between them includes interaction data of the conversation duration dimension. The interaction relation index between the target person and the other person in the conversation duration dimension is then determined according to the conversation durations and the number of conversations contained in the interaction data between them.
In an alternative embodiment, for each other person, the number of conversations between the target person and that other person, and the conversation duration of each conversation, are determined.
The interaction relation index d_r3 between the target person and the other person in the conversation duration dimension is determined using the following formula:
[Formula for d_r3 given as image: Figure BDA0002141474990000151]
where a_3 is the adjustment coefficient of the conversation duration dimension, 0 < a_3 < 1, n_3 is the number of conversations between the target person and the other person, i denotes the i-th conversation between them, and t_i is the conversation duration of the i-th conversation.
According to the above method, analyzing the interaction relation index between the target person and other persons in the conversation duration dimension enriches the data used to calculate the interaction relation index and improves the accuracy of the interaction relation index between the target person and other persons.
Fourthly, if the person interaction data includes interaction data of the conversation count dimension, the interaction relation index between the target person and another person in the conversation count dimension is determined according to the number of conversations between them.
For each other person, whether conversation behavior exists between the target person and that other person can be determined from the mouth movements, head orientation, limb movements, and the like of the two persons.
If conversation behavior exists between the target person and the other person, the person interaction data between them includes interaction data of the conversation count dimension. The interaction relation index between the target person and the other person in the conversation count dimension is then determined according to the number of conversations contained in the interaction data between them.
In an alternative embodiment, for each other person, the number of conversations between the target person and that other person is determined.
The interaction relation index d_r4 between the target person and the other person in the conversation count dimension is determined using the following formula:
[Formula for d_r4 given as image: Figure BDA0002141474990000152]
where a_4 is the adjustment coefficient of the conversation count dimension, 0 < a_4 < 1, and n_3 is the number of conversations between the target person and the other person.
In the embodiment of the present invention, the number of conversations between the target person and another person may be determined from the conversation duration between them.
In one example, for a certain video monitoring point, conversation behavior exists between the target person p_1 and the other person p_2, and the conversation lasts 10 minutes. The number of conversations between them can then be determined from this 10-minute conversation duration. For example, the surveillance video of the video monitoring point is divided according to a preset conversation-count sampling time interval. If the video is divided with 1 minute as the time interval, the 10 minutes of conversation between the target person p_1 and the other person p_2 can be divided into 10 consecutive time periods, and each time period can represent 1 conversation between them. Thus, the number of conversations between the target person p_1 and the other person p_2 may be 10.
According to the above method, analyzing the interaction relation index between the target person and other persons in the conversation count dimension enriches the data used to calculate the interaction relation index and improves the accuracy of the interaction relation index between the target person and other persons.
Fifthly, if the person interaction data includes interaction data of the conversation emotion dimension, the interaction relation index between the target person and another person in the conversation emotion dimension is determined according to the expressions of the target person and the other person during the conversation and the number of times those expressions appear.
For each other person, whether conversation behavior exists between the target person and that other person can be determined from the mouth movements, head orientation, limb movements, and the like of the two persons.
If conversation behavior exists between the target person and the other person, the person interaction data between them includes interaction data of the conversation emotion dimension. The interaction relation index between the target person and the other person in the conversation emotion dimension is then determined according to the expressions of the two persons during the conversation and the number of times those expressions appear.
In an alternative embodiment, for each other person, the expressions of the target person during conversations with that other person, and the number of times the target person's expressions appear, are determined.
The interaction relation index d_r5 between the target person and the other person in the conversation emotion dimension is determined using the following formula:
[Formula for d_r5 given as image: Figure BDA0002141474990000161]
where n_5 is the number of times the target person's expressions appear, i denotes the i-th expression of the target person, and C_i is the score corresponding to the i-th expression of the target person.
In the embodiment of the present invention, the number of times expressions appear may be determined from the conversation duration between the target person and the other person.
In one example, for a certain video monitoring point, the conversation between the target person p_1 and the other person p_2 lasts 10 minutes. The surveillance video is divided according to a preset expression-occurrence sampling time interval. If the video is divided with 30 seconds as the time interval, the 10 minutes of conversation between the target person p_1 and the other person p_2 can be divided into 20 consecutive time periods. The facial expression of the target person p_1 at the end of each time period can be taken as one expression occurrence of p_1 in that period, so the number of expressions of the target person p_1 during the conversation with the other person p_2 is 20.
In the embodiment of the present invention, the preset expression score is a score corresponding to each preset expression. For example, more excited expressions such as laughing or anger may be assigned a score of 1, and expressions of normal emotion such as smiling or a neutral face may be assigned a score of 0.5.
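The expression-score table above can be sketched as a lookup with a hypothetical emotion index. The patent defines n_5 and the per-expression scores C_i but shows the d_r5 formula only as an image, so averaging the scores is an assumed form, and the expression labels are illustrative:

```python
# Preset expression scores from the text: excited expressions (laughing,
# anger) score 1; normal-emotion expressions (smiling, neutral) score 0.5.
EXPRESSION_SCORES = {"laugh": 1.0, "anger": 1.0, "smile": 0.5, "neutral": 0.5}

def emotion_index(expressions: list[str]) -> float:
    """Hypothetical conversation-emotion index: the mean of the scores C_i
    over the n5 observed expressions (assumed form, not the patent's)."""
    if not expressions:
        return 0.0
    scores = [EXPRESSION_SCORES[e] for e in expressions]
    return sum(scores) / len(scores)
```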
According to the above method, analyzing the interaction relation index between the target person and other persons in the conversation emotion dimension enriches the data used to calculate the interaction relation index and improves the accuracy of the interaction relation index between the target person and other persons.
Sixthly, if the person interaction data includes interaction data of the fight count dimension, the interaction relation index between the target person and another person in the fight count dimension is determined according to the number of fights between them.
For each other person, whether fighting behavior exists between the target person and that other person is determined from the limb movements between them, such as arm movements and leg movements.
If fighting behavior exists between the target person and the other person, the person interaction data between them includes interaction data of the fight count dimension. The interaction relation index between the target person and the other person in the fight count dimension is then determined according to the number of fights contained in the interaction data between them.
In an alternative embodiment, for each other person, the number of fights between the target person and that other person is determined.
The interaction relation index d_r6 between the target person and the other person in the fight count dimension is determined using the following formula:
[Formula for d_r6 given as image: Figure BDA0002141474990000171]
where a_6 is the adjustment coefficient of the fight count dimension, 0 < a_6 < 1, and n_6 is the number of fights between the target person and the other person.
According to the above method, analyzing the interaction relation index between the target person and other persons in the fight count dimension enriches the data used to calculate the interaction relation index and improves the accuracy of the interaction relation index between the target person and other persons.
Seventhly, if the person interaction data includes interaction data of the fight duration dimension, the interaction relation index between the target person and another person in the fight duration dimension is determined according to the number of fights between them and the duration of each fight.
For each other person, whether fighting behavior exists between the target person and that other person is determined from the limb movements between them, such as arm movements and leg movements.
If fighting behavior exists between the target person and the other person, the person interaction data between them includes interaction data of the fight duration dimension. The interaction relation index between the target person and the other person in the fight duration dimension is then determined according to the number of fights and the fight durations contained in the interaction data between them.
In an alternative embodiment, for each other person, the number of fights between the target person and that other person, and the fight duration corresponding to each fight, are determined.
The interaction relation index d_r7 between the target person and the other person in the fight duration dimension is determined using the following formula:
[Formula for d_r7 given as image: Figure BDA0002141474990000172]
where a_7 is the adjustment coefficient of the fight duration dimension, 0 < a_7 < 1, n_6 is the number of fights between the target person and the other person, i denotes the i-th fight between them, and t_ti is the fight duration of the i-th fight.
In the embodiment of the invention, the number of fights between the target person and another person can be determined from the fight duration between them.
In one example, for a certain video monitoring point, the fight between the target person p_1 and the other person p_2 lasts 10 minutes. The surveillance video is divided according to a preset fight-count sampling time interval. If the video is divided with 2 minutes as the time interval, the 10 minutes of fighting between the target person p_1 and the other person p_2 at the video monitoring point can be divided into 5 consecutive time periods, and each time period represents 1 fight between them. Thus, for this video monitoring point, the number of fights between the target person p_1 and the other person p_2 may be 5, with a fight duration of 2 minutes per fight.
According to the above method, analyzing the interaction relation index between the target person and other persons in the fight duration dimension enriches the data used to calculate the interaction relation index and improves the accuracy of the interaction relation index between the target person and other persons.
In the embodiment of the present invention, the determination of the effective contact duration, effective contact count, peer duration, peer count, conversation duration, conversation count, fight duration, fight count, and the like in the process of determining the interaction relation index of each dimension has been explained using a single video monitoring point. In practice, however, the interaction relation indexes between the target person and other persons are determined according to the movement tracks of the target person and the other persons. Therefore, in an actual process, the determination of these quantities is often the result of comprehensively analyzing the surveillance videos corresponding to a plurality of video monitoring points. For each individual video monitoring point, their determination can proceed as described above.
The effective contact time dimension is taken as an example. When two or more video monitoring points are analyzed, the effective contact duration may be the contact duration between the target person and another person across those video monitoring points. For example, suppose the preset duration is 1 minute, and the target person and other person 1 run hand in hand from video monitoring point 1 into video monitoring point 2 and stay at video monitoring point 2. If the contact duration between the target person and other person 1 at video monitoring point 1 is 40 seconds, and the contact duration at video monitoring point 2 is 5 minutes, then the effective contact duration between them is the sum of the two, i.e., 5 minutes and 40 seconds. The effective contact count and effective contact duration between the target person and other person 1 are determined according to the method described above, and the interaction relation index between the target person and other person 1 in the effective contact time dimension is then calculated according to the above formula for the effective contact time dimension index d_r1. By analogy, for each other dimension contained in the person interaction data, the interaction relation index between the target person and other persons in that dimension can be determined by comprehensively analyzing the surveillance videos of the plurality of video monitoring points. The specific implementation steps may refer to the above methods for determining the interaction relation index in each dimension and are not described in detail here.
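A minimal sketch of aggregating contact durations across monitoring points, as in the example above. Treating the preset duration as a minimum threshold below which contact does not count as "effective" is an assumed interpretation, and all names are illustrative:

```python
def effective_contact_duration(contact_seconds_per_point: list[float],
                               preset_seconds: float = 60.0) -> float:
    """Sum the contact durations observed at each video monitoring point,
    as in the example: 40 s at point 1 plus 5 min at point 2 = 5 min 40 s.
    Contact counts as effective only when the accumulated duration reaches
    the preset duration (assumed interpretation)."""
    total = sum(contact_seconds_per_point)
    return total if total >= preset_seconds else 0.0

total = effective_contact_duration([40.0, 5 * 60.0])  # -> 340.0 seconds
```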
In the embodiment of the present invention, the preset time interval for effective-contact-count sampling, the preset time interval for peer-count sampling, the preset time interval for conversation-count sampling, and the preset time interval for fight-count sampling may be set in different ways.
In one embodiment, the surveillance video may be divided according to a preset video division time interval to obtain a plurality of surveillance video segments, where the preset time intervals for effective-contact-count, peer-count, conversation-count, and fight-count sampling are all the same as the preset video division time interval. For example, if the duration of the surveillance video is 20 minutes and the preset video division time interval is 30 seconds, 40 surveillance video segments are obtained. For each surveillance video segment, if effective contact behavior, peer behavior, conversation behavior, or fighting behavior exists in the segment, 1 effective contact, peer occurrence, conversation, or fight is counted correspondingly.
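The segment-based counting described here can be sketched as follows; the interval representation and names are illustrative assumptions:

```python
def count_occurrences(behavior_intervals, video_seconds, segment_seconds):
    """Divide the video timeline into fixed segments and count one occurrence
    for every segment in which the behavior is present at any moment.
    behavior_intervals: list of (start_s, end_s) when the behavior is observed."""
    n_segments = int(video_seconds // segment_seconds)
    count = 0
    for k in range(n_segments):
        seg_start, seg_end = k * segment_seconds, (k + 1) * segment_seconds
        # A segment counts if the behavior overlaps it at all.
        if any(s < seg_end and e > seg_start for s, e in behavior_intervals):
            count += 1
    return count

# 20-minute video, 30-second segments -> 40 segments, as in the example;
# a behavior lasting the first 90 seconds spans 3 segments.
n = count_occurrences([(0.0, 90.0)], 20 * 60, 30)
```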
In another embodiment, the preset time intervals are set separately for the surveillance video according to the different behaviors between the target person and other persons. For example, to set the preset conversation-count sampling time interval, a surveillance video clip in which the target person converses with another person is extracted from the surveillance video, and the interval is set according to the duration of that clip. For example, if the clip lasts 10 minutes, the conversation-count sampling time interval may be set to 1 minute; if the clip lasts 2 minutes, it may be set to 10 seconds.
In the embodiment of the present invention, the person interaction data is the data corresponding to the interaction behaviors, in each dimension, of the target person and other persons in the surveillance video. Therefore, in addition to the contact behavior, peer behavior, conversation behavior, and fighting behavior described above, other behaviors may exist between the target person and other persons, and these other behaviors likewise have interaction data of corresponding dimensions. For example, on a smart campus, other behavior patterns such as playing and working together may exist between the target person and other persons, and these behavior patterns correspond to interaction data of their own dimensions.
With respect to the step S205, an optional embodiment may include the following steps.
In step S2051, a first weight of the inherent relationship index of each dimension and a second weight of the interaction relationship index of each dimension are determined.
In this step, for each dimension of identity data contained in the person identity data, a weight of the inherent relation index of that dimension may be set as the first weight according to the weight of that dimension of identity data. For example, if the weight of the age-dimension identity data is less than that of the native-place-dimension identity data, which in turn is less than that of the residence-dimension identity data, then the first weight of the inherent relation index of the age dimension may be set to 1, that of the native-place dimension to 2, and that of the residence dimension to 3.
For each dimension of interaction data contained in the person interaction data, a weight of the interaction relation index of that dimension may be set as the second weight according to the weight of that dimension of interaction data. For example, if the weight of the fight-count-dimension interaction data is greater than that of the peer-dimension interaction data, the second weight of the interaction relation index of the fight count dimension may be set to 2 and that of the peer dimension to 1.
In step S2052, the relationship index between the target person and another person is determined according to the first weight of each dimension, the second weight of each dimension, the inherent relation index of each dimension, and the interaction relation index of each dimension.
In an alternative embodiment, the relationship index d between the target person and the other persons may be determined according to the following formula:
d = ( Σ_{i=1}^{n} w_fi · d_fi + Σ_{j=1}^{m} w_rj · d_rj ) / ( Σ_{i=1}^{n} w_fi + Σ_{j=1}^{m} w_rj )
where n is the total number of identity-data dimensions contained in the person identity data, i denotes the i-th of the n dimensions, w_fi is the first weight corresponding to the inherent relation index of the i-th dimension, d_fi is the inherent relation index of the i-th dimension, m is the total number of interaction-data dimensions contained in the person interaction data, j denotes the j-th of the m dimensions, w_rj is the second weight corresponding to the interaction relation index of the j-th dimension, and d_rj is the interaction relation index of the j-th dimension.
That is, a weighted average is calculated from the first weight and inherent relation index corresponding to each identity-data dimension in the person identity data, and the second weight and interaction relation index corresponding to each interaction-data dimension in the person interaction data; this weighted average serves as the relationship index between the target person and the other person.
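The weighted average can be sketched directly; the names and example index values are illustrative:

```python
def relationship_index(inherent, interaction):
    """Weighted average of per-dimension indexes, as described above.
    inherent, interaction: lists of (weight, index) pairs for the identity
    dimensions and the interaction dimensions respectively."""
    pairs = list(inherent) + list(interaction)
    total_weight = sum(w for w, _ in pairs)
    return sum(w * d for w, d in pairs) / total_weight

# Weights from the examples above: age=1, native place=2, residence=3 for
# inherent dimensions; fight count=2, peer=1 for interaction dimensions.
d = relationship_index([(1, 0.2), (2, 0.5), (3, 0.8)],
                       [(2, 0.9), (1, 0.6)])
# d equals (1*0.2 + 2*0.5 + 3*0.8 + 2*0.9 + 1*0.6) / 9
```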
In the embodiment of the present invention, the relationship index may also be calculated in other ways. In one example, the sum of the inherent relation indexes and the interaction relation indexes is calculated as the relationship index between the target person and the other person. In another example, the sum of the average of the inherent relation indexes over their dimensions and the average of the interaction relation indexes over their dimensions is calculated as the relationship index between the target person and the other person.
In this way, the relationship index calculation makes full use of the video structured data in the surveillance video, improving the accuracy of the relationship indexes calculated between different persons in the video.
After the relationship index between the target person and the other persons is determined according to the above step S205, in an alternative embodiment, the relationship index is determined again according to the first weight of each dimension, the second weight of each dimension, the inherent relation index of each dimension, the interaction relation index of each dimension, and an attenuation index.
In the embodiment of the present invention, the inherent relation index between the target person and another person does not change over time: it expresses an inherent relationship between them, such as a parent-child relationship, which normally does not change with time. The interaction relation index, however, is determined from the person interaction data corresponding to the interaction behaviors between the target person and the other person, and it decays as time passes. As the interaction relation indexes decay, the relationship index between the target person and the other person changes, so the relationship index needs to be determined again to reduce the influence of time on it.
In an alternative embodiment, the relationship index d' between the target person and the other persons may be determined again according to the following formula:
d′_rj = d_rj · a^t
d′ = ( Σ_{i=1}^{n} w_fi · d_fi + Σ_{j=1}^{m} w_rj · d′_rj ) / ( Σ_{i=1}^{n} w_fi + Σ_{j=1}^{m} w_rj )
where n is the total number of identity-data dimensions contained in the person identity data, i denotes the i-th of the n dimensions, w_fi is the first weight of the inherent relation index of the i-th dimension, d_fi is the inherent relation index of the i-th dimension, m is the total number of interaction-data dimensions contained in the person interaction data, j denotes the j-th of the m dimensions, w_rj is the second weight of the interaction relation index of the j-th dimension, d_rj is the interaction relation index of the j-th dimension before attenuation, t is the elapsed time, a is the attenuation coefficient with 0 < a < 1, and d′_rj is the interaction relation index of the j-th dimension after attenuation.
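Combining the decay rule d′_rj = d_rj · a^t with the earlier weighted average gives a sketch of the re-determination; assuming the same weighted-average combination as before, with illustrative names:

```python
def decayed_relationship_index(inherent, interaction, a, t):
    """Re-determine the relationship index after elapsed time t: each
    interaction index d_rj is attenuated to d_rj * a**t (0 < a < 1), while
    inherent indexes are unchanged; the results are then combined as a
    weighted average as in the original calculation."""
    assert 0.0 < a < 1.0
    decayed = [(w, d * a ** t) for w, d in interaction]
    pairs = list(inherent) + decayed
    return sum(w * d for w, d in pairs) / sum(w for w, _ in pairs)
```

At t = 0 the result matches the original relationship index; as t grows, only the interaction contribution shrinks toward 0.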
In the embodiment of the present invention, in the calculation of the relationship index between the target person and another person, neither the adjustment coefficient corresponding to each dimension's interaction relation index nor the attenuation coefficient used when re-determining the relationship index needs to be a fixed value. Taking the adjustment coefficient of the effective contact time dimension as an example: since the effective contact duration between the target person and each other person may differ, the adjustment coefficient may be determined based on that duration. For example, if the effective contact duration between the target person and other person 1 is 10 minutes and that between the target person and other person 2 is 20 minutes, the adjustment coefficient of the effective contact time dimension may be set to 0.9 for other person 1 and 0.5 for other person 2.
According to the above method, after the relationship indexes between different persons are determined, the attenuation of the interaction relation indexes over time is fully taken into account, improving the accuracy of the relationship indexes and reducing the probability of misjudgment.
After the relationship index between the target person and the other persons is determined according to step S103, in an optional embodiment, the other persons are sorted according to their relationship indexes with the target person to obtain a person relationship list of the target person.
In one embodiment, the other persons are sorted in descending order of their relationship index with the target person to obtain the person relationship list of the target person.
In another embodiment, the relationship index between the target person and each other person is compared with a preset relationship index; the other persons whose relationship index is greater than the preset relationship index are screened out and arranged in descending order of relationship index to obtain the person relationship list of the target person.
In the person relationship list of the target person, the relationship indexes between the target person and the other persons may be normalized so that no relationship index is greater than 1.
Referring to table 1, a person relationship list of the target person is shown, where table 1 is a person relationship list provided in an embodiment of the present invention.
TABLE 1
Other persons      Relationship index
Other person 3     0.75
Other person 1     0.63
Other person 4     0.47
Other person 2     0.11
In an embodiment of the present invention, the person relationship list of the target person contains at least the names or numbers of the other persons and the relationship index between the target person and each other person, where the names or numbers are those stored in the intelligent monitoring device. In addition, the list may also include some inherent relationships of the target person, such as parent-child relationships.
In an optional embodiment, based on the above person relationship list of the target person, the other persons are sorted in descending order of the re-determined relationship indexes between the target person and the other persons to obtain an updated person relationship list of the target person.
In one embodiment, the other persons are sorted in descending order of the re-determined relationship index to generate a new person relationship list of the target person, and both the list from before the relationship indexes were re-determined and the newly generated list are stored.
In another embodiment, the other persons are sorted in descending order of the re-determined relationship index, and the person relationship list of the target person from before the re-determination is updated in place.
Therefore, the person relationship list can simply and intuitively reflect which other persons are related to the target person and how strong the relationship between the target person and each of them is.
In summary, by adopting the scheme provided by the embodiment of the invention, the personnel relationship among the personnel in the monitoring video can be quickly and efficiently analyzed according to the video structured data in the monitoring video, and the misjudgment probability caused by artificial subjective factors is reduced.
Based on the embodiment shown in fig. 2, the embodiment of the present invention further provides a complete method for determining a person relationship. Referring to fig. 3, fig. 3 is a flowchart of a third method for determining a person relationship according to an embodiment of the present invention. The method comprises the following steps.
Step S301, acquiring a surveillance video containing a target person and video structured data in the surveillance video.
In the tracking process of the suspected missing person, the target person can be understood as the suspected missing person.
Step S301 corresponds to steps S201 and S202 described above.
And step S302, calculating an interactive relation index of the target person and other persons in an effective contact time dimension.
And step S303, calculating the interaction relation index of the target person and other persons in the same-row dimension.
In step S304, it is determined whether there is a conversation between the target person and another person. If yes, go to step S305. If not, go to step S308.
Step S305, calculating the interactive relationship index of the target person and other persons in the conversation time dimension.
And step S306, calculating the interactive relationship index of the target person and other persons in the conversation frequency dimension.
And step S307, calculating the interactive relation index of the target person and other persons in the conversation emotion dimension.
And step S308, determining whether a fighting behavior exists between the target person and other persons. If yes, go to step S309. If not, go to step S311.
And step S309, calculating the interaction relationship index of the target person and other persons in the fighting times dimension.
And step S310, calculating the interaction relationship index of the target person and other persons in the fighting duration dimension.
For the calculation of the interaction relation index of each dimension in steps S302 to S310, reference may be made to the description above regarding step S204.
In step S311, the inherent relationship index of the target person and other persons in the dimension of age, native place and living place is calculated.
For the calculation of the inherent relationship index of each dimension in step S311, reference may be made to the description above regarding step S203.
Step S312, calculating the relationship indexes between the target person and other persons according to the plurality of interaction relationship indexes and the plurality of inherent relationship indexes obtained through calculation.
The calculation of the relationship index between the target person and the other persons in step S312 can refer to the description of step S205.
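The branching of steps S302 to S310 can be condensed into a small sketch. The saturating form 1 − a^x, the shared coefficient a = 0.5, and the field names ("contact_times", "conversations", "fights") are illustrative assumptions rather than the patent's exact formulas:

```python
def interaction_indices(data, a=0.5):
    """Sketch of the branches in Fig. 3 (steps S302-S310): conversation
    dimensions are computed only when a conversation was detected (S304),
    fight dimensions only when a fight was detected (S308)."""
    idx = {
        "effective_contact": 1 - a ** sum(data["contact_times"]),  # S302
        "same_row": 1 - a ** data["peer_count"],                   # S303
    }
    talks = data.get("conversations")  # list of conversation durations
    if talks:                                                      # S304
        idx["talk_duration"] = 1 - a ** sum(talks)                 # S305
        idx["talk_count"] = 1 - a ** len(talks)                    # S306
    fights = data.get("fights")        # list of fight durations
    if fights:                                                     # S308
        idx["fight_count"] = 1 - a ** len(fights)                  # S309
        idx["fight_duration"] = 1 - a ** sum(fights)               # S310
    return idx
```

The conversation emotion dimension (S307) and the inherent dimensions (S311) are omitted for brevity; they would add entries to the same dictionary before the weighted combination of step S312.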
Based on the same inventive concept, according to the method for determining a person relationship provided by the embodiment of the present invention, the embodiment of the present invention further provides a device for determining a person relationship.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an apparatus for determining a human relationship according to an embodiment of the present invention, where the apparatus may include the following modules.
The video acquiring module 401 is configured to acquire a monitoring video including a target person.
A data obtaining module 402, configured to obtain video structured data in the surveillance video.
A relationship index determining module 403, configured to determine a relationship index between the target person and another person according to the person identity data and the person interaction data included in the video structured data, where the relationship index indicates the strength of the person relationship between the target person and the another person.
Optionally, the relationship index determining module 403 may include:
and the inherent relationship index determining submodule is used for determining the inherent relationship index between the target person and other persons according to the person identity data contained in the video structured data.
And the interactive relation index determining submodule is used for determining the interactive relation index between the target person and other persons according to the person interactive data contained in the video structured data.
And the relation index determining submodule is used for determining the relation index between the target person and other persons based on the inherent relation index and the interaction relation index.
Optionally, the person identity data may comprise identity data in at least one dimension.
The inherent relationship index determining submodule may be specifically configured to judge, for each dimension, whether the person identity data of the target person in the dimension is the same as the person identity data of the other person; if so, determine the inherent relationship index of the dimension to be a first preset value; if not, determine the inherent relationship index of the dimension to be a second preset value.
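A minimal sketch of this per-dimension comparison, assuming illustrative preset values of 1.0 (same) and 0.0 (different) and the three identity dimensions named in the embodiments:

```python
def inherent_indices(target, other, first_preset=1.0, second_preset=0.0):
    """Per-dimension inherent relationship index: the first preset value
    when the identity data of a dimension match, the second otherwise.
    The preset values and dimension keys are illustrative assumptions."""
    dims = ("age", "native_place", "living_place")
    return {d: (first_preset if target.get(d) == other.get(d) else second_preset)
            for d in dims}
```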
Optionally, the person identity data may include identity data in one or more of an age dimension, a native dimension, and a living quarters dimension.
Optionally, the person interaction data may include interaction data in one or more of an effective contact time dimension, a same-row dimension, a conversation duration dimension, a conversation times dimension, a conversation emotion dimension, a fighting times dimension, and a fighting duration dimension.
The interactive relationship index determining submodule may include:
the first determining unit is used for determining an interactive relation index between the target person and other persons in the effective contact time dimension according to the contact time of other persons and the target person if the person interactive data comprises interactive data of the effective contact time dimension.
And the second determining unit is used for determining the interaction relation index of the target person and other persons in the same-row dimension according to the number of times that the target person and other persons are in the same row if the person interaction data comprises interaction data of the same-row dimension.
And the third determining unit is used for determining the interactive relationship index of the target person and other persons in the conversation duration dimension according to the conversation times of the target person and other persons and the conversation duration of the target person and other persons if the person interactive data comprises interactive data of the conversation duration dimension.
And the fourth determining unit is used for determining the interactive relationship index of the target person and other persons in the dimension of the conversation times according to the conversation times of the target person and other persons if the person interactive data comprises interactive data of the dimension of the conversation times.
And the fifth determining unit is used for determining the interaction relationship index of the target person and other persons in the conversation emotion dimension according to the expressions of the target person and other persons in the conversation process and the number of times the expressions of the target person and other persons occur, if the person interaction data comprises interaction data of the conversation emotion dimension.
And the sixth determining unit is used for determining the interaction relationship index between the target person and other persons in the fighting times dimension according to the number of fights between the target person and other persons, if the person interaction data comprises interaction data of the fighting times dimension.
And the seventh determining unit is used for determining the interaction relationship index of the target person and other persons in the fighting duration dimension according to the number of fights between the target person and other persons and the fight durations of the target person and other persons, if the person interaction data comprises interaction data of the fighting duration dimension.
Optionally, the first determining unit may be specifically configured to determine, for each other person, whether the contact time between that person and the target person is longer than a preset duration, and if so, to determine the interaction relationship index d_r1 between the target person and that person in the effective contact time dimension by using the following formula:

d_r1 = 1 − a_1^( Σ_{i=1}^{n_1} t_{p1p2,i} )

wherein a_1 is the adjustment coefficient of the effective contact time dimension, 0 < a_1 < 1; n_1 is the number of effective contacts between the target person p_1 and the other person p_2; i denotes the i-th effective contact; and t_{p1p2,i} is the effective contact time of the target person and the other person in the i-th effective contact.
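A sketch of the effective-contact computation, assuming the saturating form 1 − a1^(Σ t_i); the coefficient a1 = 0.5 and the minimum duration of 2.0 are illustrative, since the embodiment only requires 0 < a1 < 1 and a preset duration threshold:

```python
def effective_contact_index(contact_times, a1=0.5, min_duration=2.0):
    """d_r1 sketch: keep contacts longer than the preset duration,
    then saturate with 1 - a1 ** (total effective contact time)."""
    effective = [t for t in contact_times if t > min_duration]
    if not effective:
        return 0.0  # no effective contact, no index in this dimension
    return 1 - a1 ** sum(effective)
```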
Optionally, the second determining unit may be specifically configured to determine, for each other person, the number of times that person and the target person have been in the same row, and to determine the interaction relationship index d_r2 between the target person and that person in the same-row dimension by using the following formula:

d_r2 = 1 − a_2^{n_2}

wherein a_2 is the adjustment coefficient of the same-row dimension, 0 < a_2 < 1, and n_2 is the number of times the target person has been in the same row as the other person.
Optionally, the third determining unit may be specifically configured to determine, for each other person, the number of conversations between the target person and that person and the duration of each conversation, and to determine the interaction relationship index d_r3 between the target person and that person in the conversation duration dimension by using the following formula:

d_r3 = 1 − a_3^( Σ_{i=1}^{n_3} t_i )

wherein a_3 is the adjustment coefficient of the conversation duration dimension, 0 < a_3 < 1; n_3 is the number of conversations between the target person and the other person; i denotes the i-th conversation; and t_i is the duration of the i-th conversation.
Optionally, the fourth determining unit may be specifically configured to determine, for each other person, the number of conversations between the target person and that person, and to determine the interaction relationship index d_r4 between the target person and that person in the conversation times dimension by using the following formula:

d_r4 = 1 − a_4^{n_3}

wherein a_4 is the adjustment coefficient of the conversation times dimension, 0 < a_4 < 1, and n_3 is the number of conversations between the target person and the other person.
Optionally, the fifth determining unit may be specifically configured to determine, for each other person, the expressions of the target person during conversations with that person and the number of times those expressions occur, and to determine the interaction relationship index d_r5 between the target person and that person in the conversation emotion dimension by using the following formula:

d_r5 = (1/n_5) Σ_{i=1}^{n_5} C_i

wherein n_5 is the number of times expressions of the target person occur, i denotes the i-th expression of the target person, and C_i is the score corresponding to the i-th expression of the target person.
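The conversation emotion index can be sketched as a mean of per-expression scores C_i; treating the index as an average (so it stays bounded) is an assumption, and the score values in the example are illustrative, as they would come from an expression recognition step:

```python
def conversation_emotion_index(expression_scores):
    """d_r5 sketch: the mean of the scores C_i assigned to the n5
    expressions observed during conversations."""
    if not expression_scores:
        return 0.0
    return sum(expression_scores) / len(expression_scores)
```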
Optionally, the sixth determining unit may be specifically configured to determine, for each other person, the number of fights between the target person and that person, and to determine the interaction relationship index d_r6 between the target person and that person in the fighting times dimension by using the following formula:

d_r6 = 1 − a_6^{n_6}

wherein a_6 is the adjustment coefficient of the fighting times dimension, 0 < a_6 < 1, and n_6 is the number of fights between the target person and the other person.
Optionally, the seventh determining unit may be specifically configured to determine, for each other person, the number of fights between the target person and that person and the duration of each fight, and to determine the interaction relationship index d_r7 between the target person and that person in the fighting duration dimension by using the following formula:

d_r7 = 1 − a_7^( Σ_{i=1}^{n_6} t_{ti} )

wherein a_7 is the adjustment coefficient of the fighting duration dimension, 0 < a_7 < 1; n_6 is the number of fights between the target person and the other person; i denotes the i-th fight; and t_{ti} is the duration of the i-th fight between the target person and the other person.
Optionally, the relationship index determining sub-module may include:
and the weight determining unit is used for determining a first weight of the inherent relation index of each dimension and a second weight of the interaction relation index of each dimension.
And the index determining unit is used for determining the relation index between the target person and other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relation index of each dimension and the interaction relation index of each dimension.
Optionally, the index determining unit may be specifically configured to determine the relationship index d between the target person and the other person according to the following formula:

d = Σ_{i=1}^{n} w_fi · d_fi + Σ_{j=1}^{m} w_rj · d_rj

wherein n is the total number of dimensions of identity data contained in the person identity data, i denotes the i-th of the n dimensions, w_fi is the first weight corresponding to the inherent relationship index of the i-th dimension, d_fi is the inherent relationship index of the i-th dimension, m is the total number of dimensions of interaction data contained in the person interaction data, j denotes the j-th of the m dimensions, w_rj is the second weight corresponding to the interaction relationship index of the j-th dimension, and d_rj is the interaction relationship index of the j-th dimension.
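The weighted combination of inherent and interaction indices can be sketched directly; the dimension names and weight values below are illustrative:

```python
def relationship_index(d_f, w_f, d_r, w_r):
    """d = sum_i w_fi * d_fi + sum_j w_rj * d_rj: weighted combination
    of inherent indices d_f and interaction indices d_r over their
    respective dimensions."""
    return (sum(w_f[i] * d_f[i] for i in d_f)
            + sum(w_r[j] * d_r[j] for j in d_r))
```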
Optionally, the person relationship determining apparatus may further include:
and the relationship index re-determining module is used for re-determining the relationship index between the target person and other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension, the interaction relationship index of each dimension and the attenuation index.
Optionally, the relationship index re-determining module may be specifically configured to re-determine the relationship index d′ between the target person and the other person by using the following formulas:

d′_rj = d_rj · a^t

d′ = Σ_{i=1}^{n} w_fi · d_fi + Σ_{j=1}^{m} w_rj · d′_rj

wherein n is the total number of dimensions of identity data contained in the person identity data, i denotes the i-th of the n dimensions, w_fi is the first weight of the inherent relationship index of the i-th dimension, d_fi is the inherent relationship index of the i-th dimension, m is the total number of dimensions of interaction data contained in the person interaction data, j denotes the j-th of the m dimensions, w_rj is the second weight of the interaction relationship index of the j-th dimension, d_rj is the interaction relationship index of the j-th dimension before attenuation, t is the elapsed time, a is the attenuation coefficient, 0 < a < 1, and d′_rj is the interaction relationship index of the j-th dimension after attenuation.
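The re-determination with decay can be sketched as follows: only the interaction indices are attenuated as d′_rj = d_rj · a^t, the inherent indices are left undecayed, and both are then recombined with their weights (all concrete values below are illustrative):

```python
def redetermined_index(d_f, w_f, d_r, w_r, a, t):
    """Re-determination with decay: attenuate each interaction index
    by a**t (0 < a < 1, t = elapsed time), keep inherent indices
    unchanged, and recombine with the same weights as before."""
    d_r_decayed = {j: v * a ** t for j, v in d_r.items()}
    return (sum(w_f[i] * d_f[i] for i in d_f)
            + sum(w_r[j] * d_r_decayed[j] for j in d_r_decayed))
```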
Optionally, the person relationship determining apparatus may further include:
and the personnel relation list generating module is used for sequencing other personnel according to the sequence from large to small of the relationship indexes between the re-determined target personnel and other personnel to obtain a personnel relation list of the target personnel.
Based on the same inventive concept, according to the method for determining the personnel relationship provided by the embodiment of the invention, the embodiment of the invention also provides electronic equipment. Referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, where the electronic device includes a processor 501, a communication interface 502, a memory 503, and a communication bus 504, where the processor 501, the communication interface 502, and the memory 503 complete mutual communication through the communication bus 504;
a memory 503 for storing a computer program;
the processor 501 is configured to implement the steps of any one of the above-described person relationship determination methods when executing the program stored in the memory 503.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this is not intended to represent only one bus or type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk memory. Alternatively, the memory may be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Based on the same inventive concept, according to the method for determining a person relationship provided in the embodiments of the present invention, an embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of any one of the methods for determining a person relationship.
Based on the same inventive concept, according to the person relationship determining method provided in the above embodiment of the present invention, an embodiment of the present invention further provides a computer program product containing instructions, which when run on a computer, causes the computer to execute the steps of any one of the person relationship determining methods in the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It should be noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for embodiments such as the apparatus, the electronic device, the computer-readable storage medium, and the computer program product, since they are substantially similar to the method embodiments, the description is simple, and for relevant points, reference may be made to part of the description of the method embodiments.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A person relationship determination method, comprising:
acquiring a monitoring video containing a target person;
acquiring video structured data in the monitoring video;
for the personnel identity data of each dimension in the video structured data, judging whether the personnel identity data of the target personnel in the dimension is the same as the personnel identity data of other personnel;
if so, determining the inherent relation index of the dimension to be a first preset value;
if not, determining the inherent relation index of the dimension as a second preset value;
determining an interaction relation index between the target person and the other persons according to person interaction data contained in the video structured data;
determining a relationship index between the target person and the other persons based on the inherent relationship index and the interaction relationship index, wherein the relationship index represents the strength of the person relationship between the target person and the other persons.
2. The method of claim 1, wherein the people interaction data comprises interaction data in one or more of an active contact time dimension, a peer dimension, a conversation duration dimension, a conversation times dimension, a conversation emotion dimension, a fighting times dimension, and a fighting duration dimension;
the step of determining an interaction relationship index between the target person and the other persons according to the person interaction data included in the video structured data includes:
if the person interaction data comprises interaction data of the effective contact time dimension, determining an interaction relation index between the target person and the other persons in the effective contact time dimension according to the contact time of the other persons and the target person;
if the personnel interaction data comprise interaction data of the same-row dimension, determining an interaction relation index between the target personnel and the other personnel in the same-row dimension according to the times of the target personnel and the other personnel sharing the same row;
if the person interaction data comprises interaction data of the conversation duration dimension, determining an interaction relationship index between the target person and the other persons in the conversation duration dimension according to the conversation times of the target person and the other persons and the conversation durations of the target person and the other persons;
if the person interaction data comprises interaction data of the conversation frequency dimension, determining an interaction relation index between the target person and the other persons in the conversation frequency dimension according to the conversation frequency of the target person and the other persons;
if the person interaction data comprises interaction data of the conversation emotion dimension, determining an interaction relation index between the target person and the other persons in the conversation emotion dimension according to expressions appearing in the conversation process of the target person and the other persons and the number of times of appearance of the expressions of the target person and the other persons;
if the person interaction data comprise interaction data of the fighting times dimension, determining an interaction relation index between the target person and the other persons in the fighting times dimension according to the number of fights between the target person and the other persons;
and if the person interaction data comprise interaction data of the fighting duration dimension, determining an interaction relation index of the target person and the other persons in the fighting duration dimension according to the number of fights between the target person and the other persons and the fight durations of the target person and the other persons.
3. The method of claim 1 or 2, wherein the step of determining the relationship index between the target person and the other person based on the inherent relationship index and the interaction relationship index comprises:
determining a first weight of the inherent relation index of each dimension and a second weight of the interaction relation index of each dimension;
and determining the relationship index between the target person and the other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension and the interaction relationship index of each dimension.
4. The method of claim 3, wherein after determining the relationship index between the target person and the other person according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension, and the interaction relationship index of each dimension, further comprising:
and re-determining the relation index between the target person and the other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relation index of each dimension, the interaction relation index of each dimension and the attenuation index.
5. A person relationship determination apparatus, comprising:
the video acquisition module is used for acquiring a monitoring video containing target personnel;
the data acquisition module is used for acquiring video structured data in the monitoring video;
the relation index determining module is used for judging whether the personnel identity data of the target personnel in each dimension in the video structured data is the same as the personnel identity data of other personnel in the dimension; if the dimension is the same as the preset value, determining that the inherent relation index of the dimension is a first preset value; if not, determining the inherent relation index of the dimension as a second preset value; determining an interaction relation index between the target person and the other persons according to person interaction data contained in the video structured data; determining a relationship index between the target person and the other persons based on the inherent relationship index and the interaction relationship index, wherein the relationship index represents the strength of the person relationship between the target person and the other persons.
6. The apparatus of claim 5, wherein the people interaction data comprises interaction data in one or more of an active contact time dimension, a peer dimension, a conversation duration dimension, a conversation times dimension, a conversation emotion dimension, a fighting times dimension, and a fighting duration dimension;
the interactive relationship index determining submodule comprises:
a first determining unit, configured to determine, if the person interaction data includes interaction data in the effective contact time dimension, an interaction relationship index between the target person and the other person in the effective contact time dimension according to contact time between the other person and the target person;
a second determining unit, configured to determine, if the person interaction data includes interaction data of the peer dimension, an interaction relationship index between the target person and the other person in the peer dimension according to the number of times that the target person and the other person are in the peer;
a third determining unit, configured to determine, if the person interaction data includes interaction data in the conversation duration dimension, an interaction relationship index in the conversation duration dimension between the target person and the other person according to the number of conversations between the target person and the other person and the conversation duration between the target person and the other person;
a fourth determining unit, configured to determine, according to the number of conversations between the target person and the other person, an interactive relationship index between the target person and the other person in the number of conversations dimension if the person interaction data includes interaction data in the number of conversations dimension;
a fifth determining unit, configured to determine, if the person interaction data includes interaction data of the conversation emotion dimension, an interaction relationship index between the target person and the other persons in the conversation emotion dimension according to the expressions of the target person and the other persons during conversations and the number of times those expressions appear;
a sixth determining unit, configured to determine, if the person interaction data includes interaction data of the fighting times dimension, an interaction relationship index between the target person and the other persons in the fighting times dimension according to the number of fights between the target person and the other persons;
and a seventh determining unit, configured to determine, if the person interaction data includes interaction data of the fighting duration dimension, an interaction relationship index between the target person and the other persons in the fighting duration dimension according to the number of fights between the target person and the other persons and the fighting duration between the target person and the other persons.
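The per-dimension determining units of claim 6 can be sketched as simple normalised scores. A minimal sketch under stated assumptions: the claims do not specify any formulas, so the linear normalisation, the saturating caps, and the scale constants below are all hypothetical.

```python
def effective_contact_index(contact_seconds, scale=3600.0):
    # First determining unit: index grows with contact duration,
    # saturating at 1.0; the one-hour scale is an assumption.
    return min(contact_seconds / scale, 1.0)

def peer_index(times_together, scale=10.0):
    # Second determining unit: index from the co-occurrence count;
    # the scale of 10 co-occurrences is an assumption.
    return min(times_together / scale, 1.0)

def conversation_duration_index(num_conversations, total_seconds, scale=600.0):
    # Third determining unit: uses both conversation count and total
    # duration, here as average length per conversation, normalised.
    if num_conversations == 0:
        return 0.0
    return min((total_seconds / num_conversations) / scale, 1.0)
```

The remaining units (conversation times, conversation emotion, fighting times, fighting duration) would follow the same shape with their own inputs and scales.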
7. The apparatus of claim 5 or 6, wherein the relationship index determining submodule comprises:
the weight determining unit is used for determining a first weight of the inherent relation index of each dimension and a second weight of the interaction relation index of each dimension;
and the index determining unit is used for determining the relationship index between the target person and the other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension and the interaction relationship index of each dimension.
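The combination step of claim 7 can be sketched as a per-dimension weighted sum. The claims leave the combination formula open, so the additive form below is one plausible reading, not the patented computation.

```python
def relationship_index(inherent, interaction, w1, w2):
    """Relationship index as a weighted sum over dimensions.

    inherent / interaction map dimension name -> index value;
    w1 / w2 map dimension name -> first / second weight."""
    score = sum(w1[d] * inherent[d] for d in inherent)
    score += sum(w2[d] * interaction[d] for d in interaction)
    return score
```

If the weights across all dimensions sum to 1, the result stays in the same range as the per-dimension indices.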
8. The apparatus of claim 7, further comprising:
and the relationship index re-determining module is used for re-determining the relationship index between the target person and the other persons according to the first weight of each dimension, the second weight of each dimension, the inherent relationship index of each dimension, the interaction relationship index of each dimension and an attenuation index.
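A hedged sketch of the re-determination in claim 8: here the attenuation index is modelled as a per-dimension multiplier in (0, 1] applied to the interaction terms, so that interaction evidence fades over time while identity-based terms do not. This interpretation is an assumption; the claim does not state the attenuation formula.

```python
def redetermine_index(inherent, interaction, w1, w2, decay):
    """Recompute the relationship index with an attenuation index.

    decay maps dimension -> factor in (0, 1]; dimensions missing
    from decay are left unattenuated (factor 1.0)."""
    score = sum(w1[d] * inherent[d] for d in inherent)
    score += sum(w2[d] * interaction[d] * decay.get(d, 1.0)
                 for d in interaction)
    return score
```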
9. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1 to 4 when executing a program stored in the memory.
10. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 1 to 4.
CN201910670269.3A 2019-07-24 2019-07-24 Personnel relationship determination method and device, electronic equipment and storage medium Active CN111324772B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910670269.3A CN111324772B (en) 2019-07-24 2019-07-24 Personnel relationship determination method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN111324772A CN111324772A (en) 2020-06-23
CN111324772B true CN111324772B (en) 2023-04-07

Family

ID=71171092

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910670269.3A Active CN111324772B (en) 2019-07-24 2019-07-24 Personnel relationship determination method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111324772B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112804491A (en) * 2020-12-31 2021-05-14 重庆惠统智慧科技有限公司 Campus security supervision method, system, server and storage medium
CN114036203B (en) * 2021-10-15 2024-09-24 浙江大华技术股份有限公司 Data analysis method and device, electronic equipment and storage medium
CN115424341A (en) * 2022-08-30 2022-12-02 长沙海信智能系统研究院有限公司 Fighting behavior identification method and device and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106203458A (en) * 2015-04-29 2016-12-07 杭州海康威视数字技术股份有限公司 Crowd's video analysis method and system
CN106610997A (en) * 2015-10-23 2017-05-03 杭州海康威视数字技术股份有限公司 Method, device and system for processing person information
CN106776781A (en) * 2016-11-11 2017-05-31 深圳云天励飞技术有限公司 A kind of human relation network analysis method and device
CN107480246A (en) * 2017-08-10 2017-12-15 北京中航安通科技有限公司 A kind of recognition methods of associate people and device
CN110019963A (en) * 2017-12-11 2019-07-16 罗普特(厦门)科技集团有限公司 The searching method of suspect relationship personnel
CN110020025A (en) * 2017-09-28 2019-07-16 阿里巴巴集团控股有限公司 A kind of data processing method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8831276B2 (en) * 2009-01-13 2014-09-09 Yahoo! Inc. Media object metadata engine configured to determine relationships between persons


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Hema Swetha Koppula, Rudhir Gupta, Ashutosh Saxena. Learning Human Activities and Object Affordances from RGB-D Videos. arXiv:1210.1207, 2013, pp. 1-18. *
Gao Xiang. Person Behavior Analysis and Social Relationship Recognition Based on Video Deep Learning. China Master's Theses Full-text Database, 2018, pp. 13-15. *


Similar Documents

Publication Publication Date Title
CN111324772B (en) Personnel relationship determination method and device, electronic equipment and storage medium
Mehl The electronically activated recorder (EAR) a method for the naturalistic observation of daily social behavior
Campana et al. Cooperation in criminal organizations: Kinship and violence as credible commitments
Lauw et al. Homophily in the digital world: A LiveJournal case study
Raaijmakers et al. Exploring the relationship between subjectively experienced severity of imprisonment and recidivism: A neglected element in testing deterrence theory
US10734103B2 (en) Stress management system and stress management method
Kim et al. Predicting continuous conflict perceptionwith bayesian gaussian processes
Cavari The short-term effect of going public
Findlay et al. Using interaction networks to map communities on Twitter
Shahram et al. Understanding the life histories of pregnant-involved young aboriginal women with substance use experiences in three Canadian cities
US20170169463A1 (en) Method, apparatus, and computer-readable medium for determining effectiveness of a targeting model
US20160034426A1 (en) Creating Cohesive Documents From Social Media Messages
Miles The bully pulpit and media coverage: Power without persuasion
Münch et al. Walking through Twitter: Sampling a language-based follow network of influential Twitter accounts
Shaw et al. Behavioral consistency in the digital age
US11854369B2 (en) Multi-computer processing system for compliance monitoring and control
Lev-On et al. “Objection, your honor”: use of social media by civilians to challenge the criminal justice system
KR102149160B1 (en) System and method for improving sns dysfunction based on deep learning
US20200073877A1 (en) Video cookies
Weinberger If intelligence is a cause, it is a within-subjects cause
Hanley et al. A Golden Age: Conspiracy Theories' Relationship with Misinformation Outlets, News Media, and the Wider Internet
Venugopal et al. How subsistence communities reconfigure livelihood systems in response to climate change: a coupled-systems perspective
JP2019040605A (en) Feeling interactive method based on humor creation and robot system
Graves et al. Sifting signal from noise: A new perspective on the meaning of tweets about the “big game”
Peng et al. The dark side of entertainment? How viral entertaining media build an attention base for the far-right politics of The Epoch Times

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant