CN110795971A - User behavior identification method, device, equipment and computer storage medium - Google Patents

User behavior identification method, device, equipment and computer storage medium

Info

Publication number
CN110795971A
CN110795971A (application CN201810871759.5A)
Authority
CN
China
Prior art keywords
information
behavior
target object
behavior information
standard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810871759.5A
Other languages
Chinese (zh)
Other versions
CN110795971B (en)
Inventor
谢利民
黄成武
孙健峰
刘晓辉
廖雄成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Shenzhen Intellifusion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Intellifusion Technologies Co Ltd filed Critical Shenzhen Intellifusion Technologies Co Ltd
Priority to CN201810871759.5A priority Critical patent/CN110795971B/en
Publication of CN110795971A publication Critical patent/CN110795971A/en
Application granted granted Critical
Publication of CN110795971B publication Critical patent/CN110795971B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Alarm Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a user behavior identification method, apparatus, device and computer storage medium. The method specifically comprises the following steps: acquiring facial feature information of a first target object; acquiring, according to the facial feature information, identity information of the first target object matched with the facial feature information; acquiring behavior information of the first target object; and, when the behavior information is determined to be pre-stored non-standard behavior information, sending alarm information, the alarm information comprising the identity information of the first target object. In this way the behavior information of the first target object can be monitored in real time, non-standard behavior can be discovered promptly, the alarm information can be received in time, and measures can be taken immediately, thereby avoiding harm to the physical and mental health of the affected party. Meanwhile, the embodiment of the invention also serves to standardize and constrain user behavior.

Description

User behavior identification method, device, equipment and computer storage medium
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a method, an apparatus, a device, and a computer storage medium for user behavior identification.
Background
In many places there are certain norms for people's behavior, and uncivilized behavior (such as uncivilized words or actions) is not allowed, because uncivilized behavior causes great distress to others and may even seriously harm their physical and mental health. For example, non-standard words (e.g., threatening or abusive words) or non-standard physical actions (e.g., hitting, shoving or slapping) directed at a young child in a kindergarten can cause significant harm to the child's physical and mental health. Although the consequences are severe, such adverse events still occur from time to time. A solution that can eliminate the occurrence of such adverse events as much as possible is therefore urgently needed.
Disclosure of Invention
The embodiments of the present invention provide a user behavior identification method, apparatus, device and computer storage medium, mainly aiming to solve the problem in the prior art that uncivilized behavior causes great distress to others and may even seriously harm their physical and mental health.
In a first aspect, an embodiment of the present invention provides a user behavior identification method, which specifically includes:
acquiring facial feature information of a first target object;
acquiring identity information of a first target object matched with the facial feature information according to the facial feature information;
acquiring behavior information of the first target object;
and when the behavior information is determined to be the pre-stored non-standard behavior information, sending alarm information, wherein the alarm information comprises the identity information of the first target object.
In a second aspect, the present invention further provides a user behavior recognition apparatus, including:
a first acquisition unit configured to acquire facial feature information of a first target object;
a matching unit configured to acquire, from the facial feature information, identity information of a first target object that matches the facial feature information;
a second obtaining unit, configured to obtain behavior information of the first target object;
and the determining unit is used for sending alarm information when the behavior information is determined to be pre-stored non-standard behavior information, wherein the alarm information comprises the identity information of the first target object.
In a third aspect, the present invention provides a user behavior recognition apparatus, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements part or all of the steps of any one of the user behavior recognition methods described in the above method embodiments when executing the computer program.
In a fourth aspect, the present invention further provides a computer-readable storage medium, which stores a computer program, wherein the computer program, when executed by a processor, implements part or all of the steps of any one of the user behavior recognition methods described in the above method embodiments.
Beneficial effects: compared with the prior art, the embodiment of the invention mainly acquires facial feature information of a first target object; acquires, according to the facial feature information, identity information of the first target object matched with the facial feature information; acquires behavior information of the first target object; and, when the behavior information is determined to be pre-stored non-standard behavior information, sends alarm information comprising the identity information of the first target object. The behavior information of the first target object can therefore be monitored in real time, non-standard behavior can be discovered promptly, the alarm information can be received in time, and measures can be taken immediately, avoiding harm to the physical and mental health of the affected party. Meanwhile, the embodiment of the invention also serves to standardize and constrain user behavior.
These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a user behavior identification method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of another user behavior identification method according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of another user behavior identification method according to an embodiment of the present invention;
fig. 4 is a schematic flow chart of another user behavior identification method according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a user behavior recognition apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a user behavior recognition device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. It is obvious that the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments that can be derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Embodiments of the present application are described below with reference to the drawings. For convenience of description, a kindergarten scenario is used for illustration in the embodiments of the present invention, with a kindergarten teacher as the first target object, but the present invention is not limited thereto.
Example one
According to an aspect of the present invention, a user behavior identification method is provided, which specifically includes the following steps, as shown in fig. 1:
s1, acquiring facial feature information of the first target object;
it should be noted that the facial image information of the first target object may be acquired by an image acquisition device and then preprocessed, the preprocessing mainly including image smoothing, transformation, enhancement, restoration, filtering and the like. Facial feature information is then extracted from the facial image information, namely geometric features such as geometric descriptions of the relationships between local structures of the eyes, nose, mouth, chin and so on.
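By way of illustration only, the following Python sketch shows one possible realization of this acquisition and preprocessing step using OpenCV; the `face_embedder` callable is a hypothetical placeholder for the feature extractor and is not something specified in the patent.

```python
# Minimal sketch (not the patented implementation): preprocess a camera frame and
# extract facial feature information. `face_embedder` is a hypothetical placeholder
# for whatever geometric/feature extractor is actually used.
import cv2
import numpy as np

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_facial_features(frame_bgr: np.ndarray, face_embedder) -> np.ndarray | None:
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)   # transformation
    gray = cv2.GaussianBlur(gray, (3, 3), 0)              # smoothing / filtering
    gray = cv2.equalizeHist(gray)                         # enhancement
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                                 # take the first detected face
    crop = cv2.resize(gray[y:y + h, x:x + w], (112, 112))
    return face_embedder(crop)                            # geometric/feature vector
```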
S2, acquiring identity information of a first target object matched with the facial feature information according to the facial feature information;
specifically, the acquired facial feature information of the first target object may be matched against a preset standard facial feature information base. This base may be built by acquiring standard facial feature information of the first target object in advance and storing it in association with the identity information of the first target object. Matching may be performed by calculating the similarity between the acquired facial feature information and the preset standard facial feature information. If the similarity is greater than or equal to a preset similarity, the acquired facial feature information is successfully matched with the preset standard facial feature information, indicating that the standard facial feature information corresponding to the first target object and the identity information of the first target object are stored in the base, and the identity information of the first target object may be obtained from the successful match. If the similarity between the acquired facial feature information of the first target object and the preset standard facial feature information is smaller than the preset similarity, the match fails, indicating that the standard facial feature information and identity information of the first target object are not stored in the base. In that case the facial feature information may be added to the preset standard facial feature information base and the identity information of the corresponding target object updated manually, so that matching identity information can be determined the next time this facial feature information is acquired; alternatively, the acquired facial feature information may be discarded.
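A minimal sketch of the similarity-based matching described above, assuming feature vectors compared by cosine similarity; the gallery layout and the 0.8 threshold are illustrative assumptions rather than values from the patent.

```python
# Minimal sketch: match acquired features against a preset standard facial feature base.
import numpy as np

def match_identity(features: np.ndarray,
                   gallery: dict[str, np.ndarray],
                   preset_similarity: float = 0.8) -> str | None:
    """Return the identity whose stored standard features best match, or None."""
    best_id, best_sim = None, -1.0
    for identity, standard in gallery.items():
        sim = float(np.dot(features, standard) /
                    (np.linalg.norm(features) * np.linalg.norm(standard) + 1e-9))
        if sim > best_sim:
            best_id, best_sim = identity, sim
    return best_id if best_sim >= preset_similarity else None  # None: enroll manually or discard
```

A `None` result corresponds to the failed-match branch above, where the features may be enrolled into the base or discarded.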
S3, acquiring the behavior information of the first target object;
it should be noted that the behavior information may include voice behavior information and/or action information. The voice behavior information may be acquired by a sound pickup device and may include the text content of the speech of the first target object. The action information may be acquired by an image acquisition device, the image information being analyzed and processed using image recognition technology to obtain the action information of the first target object, which may include information such as the body movements and/or movement speed of the first target object. The pre-stored non-standard behavior information may include pre-stored non-standard words, such as uncivilized, threatening or insulting words, and pre-stored non-standard actions, such as lewd, uncivilized or violent actions.
And S4, when the behavior information is determined to be the pre-stored non-standard behavior information, sending alarm information, wherein the alarm information comprises the identity information of the first target object.
It should be noted that determining that the behavior information is pre-stored non-standard behavior information may specifically involve matching the acquired behavior information of the first target object with the pre-stored non-standard behavior information. For example, if the acquired behavior information of the first target object is action information, the body movement corresponding to the action information is compared with the non-standard body movements corresponding to the pre-stored non-standard behavior information, which may include, for example, lewd or violent movements. If the similarity between the body movement corresponding to the action information and a non-standard body movement corresponding to the pre-stored non-standard behavior information is greater than the preset similarity, it may be determined that the acquired behavior information of the first target object is successfully matched with the pre-stored non-standard behavior information, and the behavior of the first target object is determined to be pre-stored non-standard behavior.
It should be noted that, when the acquired behavior information of the first target object is determined to be pre-stored non-standard behavior information, alarm information may be sent. The alarm information includes the identity information of the first target object, so that a party receiving the alarm information can determine who the first target object is and can standardize and constrain the behavior of the first target object in time.
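The following sketch ties steps S3 and S4 together under the assumptions already stated; `action_similarity`, the non-standard behavior library and the `send` callable are hypothetical placeholders standing in for the pre-stored library and the receiving terminal.

```python
# Minimal sketch of step S4: match observed behavior against the pre-stored
# non-standard behavior library and send an alarm carrying the identity information.
from dataclasses import dataclass

@dataclass
class Alarm:
    identity: str          # identity information of the first target object
    behavior: str          # which non-standard behavior was matched

def check_and_alert(identity: str, observed_action, nonstandard_library: dict,
                    action_similarity, send, preset_similarity: float = 0.8) -> None:
    for name, template in nonstandard_library.items():
        if action_similarity(observed_action, template) >= preset_similarity:
            send(Alarm(identity=identity, behavior=name))   # alarm includes identity info
            return
    # no match: the behavior meets the norm and this observation is ignored
```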
Furthermore, the type of the behavior information of the first target object and the reaction of the victim may be identified through a convolutional neural network model, and the content of the alarm information and the target device terminal that receives it may be determined according to the type of the behavior information and the reaction of the victim, further strengthening the constraint on the behavior of the first target object.
It should be noted that, if the matching between the acquired first target object behavior information and the pre-stored non-standard behavior information fails, which indicates that the acquired behavior information of the first target object meets the behavior standard, the behavior information of the first target object acquired this time may be ignored.
Further, the step S4 of determining that the behavior information is pre-stored non-normative behavior information may specifically include:
and acquiring the behavior information of the first target object, and matching the behavior parameters corresponding to the behavior information with the non-standard behavior parameters corresponding to the pre-stored non-standard behavior information. If the behavior parameters corresponding to the behavior information and the non-standard behavior parameters corresponding to the pre-stored non-standard behavior information satisfy a preset rule, it is determined that the behavior information is successfully matched with the pre-stored non-standard behavior information; otherwise the matching fails.
It should be noted that the behavior information includes voice behavior information and/or action information. Matching the acquired behavior information of the first target object with the pre-stored non-standard behavior information specifically means matching the behavior parameters corresponding to the behavior information with the non-standard behavior parameters corresponding to the pre-stored non-standard behavior information: the behavior parameters corresponding to the voice behavior information are matched with the pre-stored non-standard behavior parameters corresponding to the non-standard voice behavior information, and/or the behavior parameters corresponding to the action information are matched with the pre-stored non-standard behavior parameters corresponding to the non-standard action information. The behavior parameters corresponding to the voice behavior information include information such as text content, and the behavior parameters corresponding to the action information include information such as body movements and/or movement speed. If the behavior parameters corresponding to the voice behavior information and the pre-stored non-standard behavior parameters corresponding to the non-standard voice behavior information, and/or the behavior parameters corresponding to the action information and the pre-stored non-standard behavior parameters corresponding to the non-standard action information, satisfy a preset rule, it is determined that the behavior information is successfully matched with the pre-stored non-standard behavior information; otherwise the matching fails.
By matching the behavior parameters corresponding to the behavior information with the non-standard behavior parameters corresponding to the pre-stored non-standard behavior information to determine whether the behavior information is pre-stored non-standard behavior information, the behavior information can be matched against the non-standard behavior information more accurately, so that the scheme identifies user behavior more accurately and reasonably and avoids over-identification.
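A minimal sketch of one possible preset rule, assuming that a match in either modality (speech text or action parameters) counts as non-standard; the patent leaves the concrete rule open, so the rule and helper names here are assumptions.

```python
# Minimal sketch of the preset-rule matching over behavior parameters.
def matches_nonstandard(speech_text: str | None, action_params: dict | None,
                        banned_phrases: set[str], nonstandard_action_check) -> bool:
    speech_hit = bool(speech_text) and any(p in speech_text for p in banned_phrases)
    action_hit = bool(action_params) and nonstandard_action_check(action_params)
    return speech_hit or action_hit   # assumed rule: either modality matching triggers a hit
```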
Beneficial effects: compared with the prior art, the embodiment of the invention mainly acquires facial feature information of a first target object; acquires, according to the facial feature information, identity information of the first target object matched with the facial feature information; acquires behavior information of the first target object; and, when the behavior information is determined to be pre-stored non-standard behavior information, sends alarm information comprising the identity information of the first target object. The behavior information of the first target object can therefore be monitored in real time, non-standard behavior can be discovered promptly, the alarm information can be received in time, and measures can be taken immediately, avoiding harm to the physical and mental health of the affected party. Meanwhile, the embodiment of the invention also serves to standardize and constrain user behavior.
Example two
On the basis of the above embodiment, the present invention further provides another user behavior identification method, which specifically includes the following steps, as shown in fig. 2:
s1, acquiring facial feature information of the first target object;
s2, acquiring identity information of a first target object matched with the facial feature information according to the facial feature information;
s3, acquiring the behavior information of the first target object;
the specific implementation of steps S1-S3 has been described in detail in the above embodiments, and will not be described herein.
And S4, when the behavior information is determined to be the pre-stored non-standard behavior information, sending alarm information, wherein the alarm information comprises the identity information of the first target object.
Preferably, when the behavior information is language behavior information, the determining that the language behavior information is pre-stored non-canonical behavior information in step S4 may specifically include:
s41, converting the voice content of the language behavior information into corresponding character content according to the language behavior information;
it should be noted that the voice behavior information of the first target object may be acquired through a sound pickup device, and the voice behavior information is analyzed and processed by using a voice recognition technology, so as to recognize the text content corresponding to the acquired voice behavior information.
S42, matching the text content with the non-standard text content corresponding to the pre-stored non-standard behavior information;
and S43, if the matching degree of the text content and the non-standard text content corresponding to the pre-stored non-standard behavior information is greater than or equal to a preset matching degree, determining that the language behavior information is non-standard behavior information.
The text content is matched with the non-standard text content corresponding to the pre-stored non-standard behavior information. If the matching degree between the text content and the non-standard text content corresponding to the pre-stored non-standard behavior information is greater than or equal to a preset matching degree, it is determined that the voice behavior information is successfully matched with the pre-stored non-standard behavior information, indicating that the text content contains non-standard text content, such as uncivilized phrases, and the voice behavior information is determined to be non-standard behavior information. A specific implementation may check the text content word by word against the non-standard text content corresponding to the pre-stored non-standard behavior information; if the check hit rate is greater than a preset rate, the matching degree is determined to be greater than or equal to the preset matching degree, where the check hit rate indicates the percentage of words in the text content that match the non-standard text content.
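A minimal sketch of the word-by-word check, assuming whitespace tokenisation and an illustrative hit-rate threshold; neither the tokenisation nor the 0.1 value comes from the patent.

```python
# Minimal sketch: fraction of words in the recognised text that match banned content.
def is_nonstandard_speech(text: str, banned_words: set[str],
                          preset_rate: float = 0.1) -> bool:
    words = text.split()
    if not words:
        return False
    hits = sum(1 for w in words if w in banned_words)
    return hits / len(words) >= preset_rate   # check hit rate vs. preset rate
```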
By converting the voice behavior information of the first target object into corresponding text content and matching the text content with the non-standard text content corresponding to the pre-stored non-standard behavior information, the voice behavior information can be determined to be non-standard behavior information. This avoids harm to the physical and mental health of the victim caused by the first target object using non-standard language, helps prevent adverse events caused by the use of non-standard language, and at the same time constrains and standardizes the behavior of the first target object.
Preferably, as shown in fig. 3, when the behavior information is action behavior information, the determining that the action behavior information is pre-stored non-normative behavior information in step S4 may specifically include:
s41', according to the action behavior information, analyzing the limb action and/or the movement speed corresponding to the action behavior information;
the image information of the first target object is acquired by an image acquisition device, and the body movements contained in the image information are determined by image recognition technology, that is, technology in which a computer processes, analyzes and understands images in order to identify targets and objects of various different modes.
The movement speed of the first target object corresponding to the action information is also analyzed from the action information. In a specific embodiment, when the first target object undergoes a displacement change, the displacement change value of the first target object, i.e. its displacement distance, is obtained by a ranging sensor, the time taken to produce this displacement is calculated from the ranging time points, and the movement speed of the first target object is calculated from the displacement distance and the time.
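A minimal sketch of this speed estimate, assuming two ranging-sensor samples (distance, timestamp); the sample format is an assumption, the calculation is simply displacement divided by elapsed time.

```python
# Minimal sketch: movement speed from two ranging-sensor samples.
def movement_speed(d0: float, t0: float, d1: float, t1: float) -> float:
    """d0, d1: distances (m) reported by the ranging sensor at times t0, t1 (s)."""
    dt = t1 - t0
    if dt <= 0:
        raise ValueError("samples must be in chronological order")
    return abs(d1 - d0) / dt   # metres per second
```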
S42', matching the limb movement and/or the movement speed with the non-standard limb movement and/or the non-standard movement speed corresponding to the pre-stored non-standard behavior information;
s43', if the similarity between one of the limb actions and/or the movement speeds and the non-canonical limb actions and/or movement speeds corresponding to the pre-stored non-canonical behavior information is greater than or equal to a preset similarity, determining that the action behavior information is non-canonical behavior information;
it is to be noted that non-standard body movements corresponding to the non-standard behavior information are established in advance. The body movement determined by image recognition is compared with the pre-established non-standard body movements and the similarity between them is calculated. If the similarity between the body movement and a non-standard body movement corresponding to the pre-established non-standard behavior information is greater than or equal to the preset body-movement similarity, it is determined that the action information is successfully matched with the pre-stored non-standard behavior information, i.e. the body movement is a non-standard body movement, and the action behavior information is determined to be non-standard behavior information.
A movement speed of the first target object that is too high may also affect other parties in certain specific scenarios, for example during the business hours of an enterprise. Therefore, the movement speed of the first target object corresponding to its action behavior information is also identified and checked against non-standard movement speeds. Non-standard movement speeds corresponding to non-standard behavior information are established in advance, and the movement speed of the first target object is matched against them; if the movement speed of the first target object is greater than or equal to at least one non-standard movement speed in the non-standard movement speed interval corresponding to the pre-stored non-standard behavior information, the action behavior information may be determined to be non-standard behavior information.
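A minimal sketch combining the body-movement similarity check and the speed check described above; `pose_similarity`, the 0.8 similarity and the 3.0 m/s speed threshold are illustrative assumptions.

```python
# Minimal sketch: action behavior is non-standard if the pose matches a pre-stored
# non-standard movement or the speed falls in the non-standard interval.
def is_nonstandard_action(pose, speed_mps: float,
                          nonstandard_poses: list,
                          pose_similarity,
                          preset_pose_similarity: float = 0.8,
                          nonstandard_speed_threshold: float = 3.0) -> bool:
    pose_hit = any(pose_similarity(pose, ref) >= preset_pose_similarity
                   for ref in nonstandard_poses)
    speed_hit = speed_mps >= nonstandard_speed_threshold   # in the non-standard speed interval
    return pose_hit or speed_hit
```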
By matching the body movements and/or movement speed corresponding to the action behavior information of the first target object with the non-standard body movements and/or non-standard movement speeds corresponding to the pre-stored non-standard behavior information, the action behavior information can be determined to be non-standard behavior information. This avoids harm to the physical and mental health of the victim caused by the first target object using non-standard actions, helps prevent adverse events caused by such actions, and at the same time constrains and standardizes the behavior of the first target object.
Preferably, when it is determined in step S4 that the behavior information is pre-stored non-normative behavior information, the sending of the warning information specifically includes:
s44, determining the non-standard grade of the behavior information, wherein the alarm information comprises multi-grade alarm information, and each grade of alarm information corresponds to a corresponding non-standard grade;
it should be noted that the non-standard level of the behavior information may be determined according to the type of the behavior information and the degree of injury predicted to be possibly caused to the victim, and a non-standard level list may be set in advance, and the non-standard level list may include different types of behavior information and corresponding non-standard levels, and meanwhile, corresponding alarm information is set for each level of non-standard level. The content of the alarm information and the received target equipment terminal can be determined according to the type of the behavior information and the degree of injury possibly caused to the victim. For example, when the behavior information of the first target object is the use of the indefinite term, the corresponding non-specification level is set to a low level, correspondingly setting corresponding low-level alarm information as voice prompt information of 'please notice your civilization phrase', correspondingly setting an accepted low-level target equipment terminal, wherein the alarm information further comprises identity information of the first target object, when the behavior information of the first target object is an action using uncivilized action, such as lewd activity, predicts that a relatively high degree of injury may be inflicted on the victim, thereby determining that its corresponding non-canonical level is high, correspondingly setting corresponding advanced alarm information as alarm information for reminding related personnel to go to take emergency measures in time, and correspondingly setting the accepted advanced target equipment terminal, wherein the alarm information also comprises the identity information of the first target object.
And S45, sending corresponding level alarm information according to the determined non-standard level of the behavior information.
The alarm information of the corresponding level is sent according to the determined non-standard level of the behavior information, and the target device terminal that receives the alarm information is determined according to the type of the behavior information and the degree of harm possibly caused to the victim. If the behavior information may cause serious harm to the victim, the alarm information corresponding to its non-standard level can be sent to a high-level target device terminal, which helps constrain and standardize the behavior of the first target object.
Beneficial effects: compared with the prior art, the embodiment of the invention mainly acquires facial feature information of a first target object; acquires, according to the facial feature information, identity information of the first target object matched with the facial feature information; acquires behavior information of the first target object; and, when the behavior information is determined to be pre-stored non-standard behavior information, determines the non-standard level of the behavior information and sends alarm information of the corresponding level, the alarm information comprising the identity information of the first target object. The embodiment of the invention can therefore monitor the behavior information of the first target object in real time, discover non-standard behavior promptly, and determine the target device terminal that receives the alarm information according to the type of the behavior information and the degree of harm possibly caused to the victim, so that measures can be taken immediately, the constraint and standardization of user behavior is strengthened, and harm to the physical and mental health of the affected party is avoided.
Example three
On the basis of the above embodiment, the present invention further provides another user behavior identification method, which specifically includes the following steps, as shown in fig. 4:
s1, acquiring facial feature information of the first target object;
s2, acquiring identity information of a first target object matched with the facial feature information according to the facial feature information;
s3, acquiring the behavior information of the first target object;
s41', determining the non-standard grade of the behavior information, wherein the alarm information comprises multi-grade alarm information, and each grade of alarm information corresponds to a corresponding non-standard grade;
s42', according to the determined non-standard level of the behavior information, sending corresponding level alarm information;
the detailed implementation of steps S1-S42 "has been described in detail in the above embodiments, and will not be described herein.
S5, recognizing a portrait image of a second target object, wherein the second target object and the first target object are co-workers;
before the feature information of the first target object is obtained, the user behavior identification scheme further includes: acquiring portrait images of the first target object to form a portrait image set of the first target object; determining, from video images, the face images appearing within a preset time length starting from the time point corresponding to each portrait image of the first target object, to obtain P face images, where P is an integer greater than 1; classifying the P face images to obtain Q objects and the number of face images corresponding to each object, where Q is a positive integer smaller than P; and taking the K objects among the Q objects whose number of face images is greater than a first preset threshold as the second target objects, where K is a positive integer smaller than Q. That is, the second target object and the first target object belong to the same group: the second target object is the object affected by the behavior information of the first target object, i.e. the first target object is the performer of the non-standard behavior and the second target object is the recipient of the non-standard behavior of the first target object. For example, if the first target object is a kindergarten teacher, the second target objects are the kindergarten children in that teacher's class.
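A minimal sketch of this selection step; `assign_object_id` stands in for whatever face classification or clustering is used and is a hypothetical placeholder, as is the threshold parameter name.

```python
# Minimal sketch: classify the P face images into Q objects and keep the K objects
# whose face-image count exceeds the first preset threshold as second target objects.
from collections import Counter

def select_second_targets(face_images: list,
                          assign_object_id,
                          first_preset_threshold: int) -> list[str]:
    counts = Counter(assign_object_id(img) for img in face_images)   # Q objects with counts
    return [obj for obj, n in counts.items() if n > first_preset_threshold]  # K objects
```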
S6, extracting expression features and behavior features from the portrait image of the second target object;
it should be noted that the portrait image of the second target object is processed using face recognition technology to extract the facial feature information of the portrait image, namely geometric features such as geometric descriptions of the relationships between local structures of the eyes, nose, mouth, chin and so on, and expression features such as joy, anger or sadness are extracted from the portrait image of the second target object by describing these geometric features. The portrait image is also analyzed using image recognition technology to extract behavior features from the portrait image of the second target object, such as trembling and other body movements.
S7, when it is determined that the expression features and behavior features of the second target object are both abnormal, determining that the behavior information of the first target object is a first-level non-standard level, and sending first-level alarm information corresponding to the first-level non-standard level includes: starting an alarm device, and pushing the behavior information of the first target object to a manager of the first target object and a guardian of a second target object; and the alarm information corresponding to the first-level nonstandard level is first-level alarm information.
It should be noted that, specifically, the expression features of the second target object may be matched with pre-stored abnormal expression features and the behavior features with pre-stored abnormal behavior features. If the similarity between the expression features of the second target object and the pre-stored abnormal expression features is greater than or equal to a first preset similarity, and the similarity between the behavior features and the pre-stored abnormal behavior features is greater than or equal to a second preset similarity, both the expression features and the behavior features of the second target object are determined to be in an abnormal state, the behavior information of the first target object is determined to be at the first-level non-standard level, and sending the first-level alarm information corresponding to the first-level non-standard level includes: starting an alarm device, and pushing the behavior information of the first target object to the manager of the first target object and the guardian of the second target object, so that they can discover the non-standard behavior of the first target object in time and take corresponding emergency measures promptly, thereby avoiding greater harm to the physical and mental health of the second target object and preventing adverse events.
When either the expression features or the behavior features of the second target object are determined to be abnormal, the behavior information of the first target object is determined to be at the second-level non-standard level, and sending the second-level alarm information corresponding to the second-level non-standard level includes pushing the behavior information of the first target object to the manager of the first target object and the guardian of the second target object; the alarm information corresponding to the second-level non-standard level is the second-level alarm information.
It should be noted that, specifically, the expression features of the second target object may be matched with pre-stored abnormal expression features and the behavior features with pre-stored abnormal behavior features. If the similarity between the expression features of the second target object and the pre-stored abnormal expression features is greater than or equal to the first preset similarity, or the similarity between the behavior features and the pre-stored abnormal behavior features is greater than or equal to the second preset similarity, the expression features or the behavior features of the second target object are determined to be in an abnormal state, the behavior information of the first target object is determined to be at the second-level non-standard level, and sending the second-level alarm information corresponding to the second-level non-standard level includes pushing the behavior information of the first target object to the manager of the first target object and the guardian of the second target object, so that they can discover the non-standard behavior of the first target object in time and take corresponding emergency measures promptly, thereby avoiding greater harm to the physical and mental health of the second target object and preventing adverse events.
When both the expression features and the behavior features of the second target object are determined to be in a normal state, the behavior information of the first target object is determined to be at the third-level non-standard level, and sending the third-level alarm information corresponding to the third-level non-standard level includes pushing the behavior information of the first target object to the manager of the first target object; the alarm information corresponding to the third-level non-standard level is the third-level alarm information.
It should be noted that, specifically, the expression features of the second target object may be matched with pre-stored abnormal expression features and the behavior features with pre-stored abnormal behavior features. If the similarity between the expression features of the second target object and the pre-stored abnormal expression features is smaller than the first preset similarity and the similarity between the behavior features and the pre-stored abnormal behavior features is smaller than the second preset similarity, both the expression features and the behavior features of the second target object are determined to be in a normal state, the behavior information of the first target object is determined to be at the third-level non-standard level, and sending the third-level alarm information corresponding to the third-level non-standard level includes pushing the behavior information of the first target object to the manager of the first target object, so that the manager can discover the non-standard behavior of the first target object in time and take corresponding measures promptly, thereby avoiding greater harm to the physical and mental health of the second target object and preventing adverse events in advance.
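A minimal sketch of the three-level decision described in this embodiment; the similarity inputs are assumed to be precomputed, and the threshold names simply mirror the description.

```python
# Minimal sketch: map abnormality of the second target object's expression and
# behavior features to the non-standard level of the first target object's behavior.
def nonstandard_level(expr_sim: float, behav_sim: float,
                      first_preset_similarity: float,
                      second_preset_similarity: float) -> int:
    expr_abnormal = expr_sim >= first_preset_similarity
    behav_abnormal = behav_sim >= second_preset_similarity
    if expr_abnormal and behav_abnormal:
        return 1   # first level: sound alarm, notify manager and guardian
    if expr_abnormal or behav_abnormal:
        return 2   # second level: notify manager and guardian
    return 3       # third level: notify manager only
```

The returned level would then select the corresponding alarm content and target device terminal, as in the level list sketched earlier.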
Beneficial effects: the scheme of the invention mainly acquires facial feature information of a first target object; acquires, according to the facial feature information, identity information of the first target object matched with the facial feature information; acquires behavior information of the first target object; and, when the behavior information is determined to be pre-stored non-standard behavior information, determines the non-standard level of the behavior information and sends alarm information of the corresponding level. At the same time, the non-standard level of the behavior information of the first target object is determined according to the feedback behavior of the recipient of that behavior, i.e. the second target object, that is, according to the degree of harm suffered by the victim as a result of the behavior of the first target object, and the alarm information corresponding to that non-standard level is sent. This improves the accuracy and rationality of user behavior identification and strengthens the constraint and standardization of user behavior, thereby avoiding harm to the physical and mental health of the affected party.
Example four
An embodiment of the present invention further provides a user behavior recognition apparatus 500, whose schematic structural diagram is shown in fig. 5. The apparatus specifically includes the following units (a minimal sketch of their composition follows the list):
a first acquisition unit 510 for acquiring facial feature information of a first target object;
a matching unit 520, configured to obtain, according to the facial feature information, identity information of a first target object that matches the facial feature information;
a second obtaining unit 530, configured to obtain behavior information of the first target object;
a determining unit 540, configured to send alarm information when the behavior information is determined to be pre-stored non-standard behavior information, where the alarm information includes identity information of the first target object.
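For illustration only, the following sketch is an assumption about how the four units might compose, not the patented implementation; the injected callables stand in for units 510 to 540 described above.

```python
# Minimal sketch: composition of the four units of apparatus 500.
class UserBehaviorRecognizer:
    def __init__(self, acquire_face, match_identity, acquire_behavior,
                 is_nonstandard, send_alarm):
        self.acquire_face = acquire_face          # first acquisition unit 510
        self.match_identity = match_identity      # matching unit 520
        self.acquire_behavior = acquire_behavior  # second obtaining unit 530
        self.is_nonstandard = is_nonstandard      # part of determining unit 540
        self.send_alarm = send_alarm              # part of determining unit 540

    def step(self, frame) -> None:
        features = self.acquire_face(frame)
        if features is None:
            return
        identity = self.match_identity(features)
        behavior = self.acquire_behavior(frame)
        if identity is not None and self.is_nonstandard(behavior):
            self.send_alarm({"identity": identity, "behavior": behavior})
```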
Beneficial effects: compared with the prior art, in the embodiment of the present invention the first acquisition unit 510 mainly acquires the facial feature information of the first target object; the matching unit 520 acquires, according to the facial feature information, the identity information of the first target object matched with the facial feature information; the second obtaining unit 530 acquires the behavior information of the first target object; and when the determining unit 540 determines that the behavior information is pre-stored non-standard behavior information, it sends alarm information comprising the identity information of the first target object. The embodiment of the present invention can therefore monitor the behavior information of the first target object in real time, discover non-standard behavior promptly, and deliver the alarm information in time so that measures can be taken immediately, preventing harm to the physical and mental health of the affected party. Meanwhile, the embodiment of the invention also serves to standardize and constrain user behavior.
The user behavior recognition apparatus 500 according to an embodiment of the present invention corresponds to the user behavior recognition method of the above embodiments, and the apparatus 500 further includes a plurality of units for implementing the functions corresponding to the respective steps of the user behavior recognition method. Since the steps of the user behavior recognition method have been described in detail in the above embodiments, they are not repeated here for the apparatus 500.
As shown in fig. 6, the embodiment of the present invention further provides a user behavior recognition device 6, where the user behavior recognition device 6 includes a memory 61, a processor 62, and a computer program 63 stored in the memory 61 and executable on the processor 62, and the processor 62 implements the steps of the user behavior recognition method when executing the computer program 63.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored, where the computer program is executed by a processor to implement part or all of the steps of any one of the user behavior identification methods described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The above embodiments of the present invention are described in detail, and the principle and the implementation of the present invention are explained by applying specific embodiments, and the above description of the embodiments is only used to help understanding the method of the present invention and the core idea thereof; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A user behavior recognition method is characterized by comprising the following steps:
acquiring facial feature information of a first target object;
acquiring identity information of a first target object matched with the facial feature information according to the facial feature information;
acquiring behavior information of the first target object;
and when the behavior information is determined to be the pre-stored non-standard behavior information, sending alarm information, wherein the alarm information comprises the identity information of the first target object.
2. The method according to claim 1, wherein determining that the behavior information is pre-stored non-canonical behavior information comprises:
and acquiring the behavior information of the first target object, matching the behavior parameters corresponding to the behavior information with the non-standard behavior parameters corresponding to the pre-stored non-standard behavior information, if the behavior parameters corresponding to the behavior information and the non-standard behavior parameters corresponding to the pre-stored non-standard behavior information meet a preset rule, determining that the matching between the behavior information and the pre-stored non-standard behavior information is successful, otherwise, failing to match.
3. The method according to any one of claims 1 or 2, wherein the behavior information of the first target object includes language behavior information and/or action behavior information.
4. The method according to claim 3, wherein when the behavior information is language behavior information, determining that the language behavior information is pre-stored non-canonical behavior information includes:
converting the voice content of the language behavior information into corresponding text content according to the language behavior information;
matching the text content with the non-standard text content corresponding to the pre-stored non-standard behavior information;
and if the matching degree of the text content and the non-standard text content corresponding to the pre-stored non-standard behavior information is greater than or equal to the preset matching degree, determining that the language behavior information is the non-standard behavior information.
5. The method according to claim 3, wherein when the behavior information is action behavior information, determining that the action behavior information is pre-stored non-canonical behavior information includes:
analyzing the limb action and/or the movement speed corresponding to the action behavior information according to the action behavior information;
matching the limb actions and/or the movement speed with the non-standard limb actions and/or the non-standard movement speed corresponding to the pre-stored non-standard behavior information;
and if the similarity between one of the limb actions and/or the movement speed and the non-standard limb actions and/or the movement speed corresponding to the pre-stored non-standard behavior information is greater than or equal to the preset similarity, determining that the action behavior information is the non-standard behavior information.
6. The user behavior recognition method according to claim 1, wherein: when the behavior information is determined to be the pre-stored non-standard behavior information, the sending of the alarm information comprises:
determining the non-standard grade of the behavior information, wherein the alarm information comprises multi-level alarm information, and each level of alarm information corresponds to a corresponding non-standard grade;
and sending corresponding level alarm information according to the determined non-standard level of the behavior information.
7. The user behavior recognition method of claim 6, further comprising:
identifying a portrait image of a second target object, wherein the second target object and the first target object are co-workers;
extracting expression characteristics and behavior characteristics from the portrait image of the second target object;
determining that the expression characteristics and the behavior characteristics of the second target object are abnormal states, determining that the behavior information of the first target object is a first-level non-standard level, and sending first-level alarm information corresponding to the first-level non-standard level comprises: starting an alarm device, and pushing the behavior information of the first target object to a manager of the first target object and a guardian of a second target object;
when the expression characteristics or the behavior characteristics of the second target object are determined to be abnormal, determining that the behavior information of the first target object is in a secondary nonstandard level, and sending secondary alarm information corresponding to the secondary nonstandard level, wherein the secondary alarm information comprises the behavior information of the first target object pushed to a manager of the first target object and a guardian of the second target object;
and when the expression characteristics and the behavior characteristics of the second target object are determined to be normal states, determining that the behavior information of the first target object is in a three-level nonstandard level, and sending three-level alarm information corresponding to the three-level nonstandard level comprises pushing the behavior information of the first target object to a manager of the first target object.
8. A user behavior recognition apparatus, comprising:
a first acquisition unit configured to acquire facial feature information of a first target object;
a matching unit configured to acquire, from the facial feature information, identity information of a first target object that matches the facial feature information;
a second obtaining unit, configured to obtain behavior information of the first target object;
and the determining unit is used for sending alarm information when the behavior information is determined to be pre-stored non-standard behavior information, wherein the alarm information comprises the identity information of the first target object.
9. A user behavior recognition device, the device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of a user behavior recognition method according to any of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of a method for user behavior recognition according to any one of claims 1 to 7.
CN201810871759.5A 2018-08-02 2018-08-02 User behavior identification method, device, equipment and computer storage medium Active CN110795971B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810871759.5A CN110795971B (en) 2018-08-02 2018-08-02 User behavior identification method, device, equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810871759.5A CN110795971B (en) 2018-08-02 2018-08-02 User behavior identification method, device, equipment and computer storage medium

Publications (2)

Publication Number Publication Date
CN110795971A (en) 2020-02-14
CN110795971B CN110795971B (en) 2023-02-17

Family

ID=69425091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810871759.5A Active CN110795971B (en) 2018-08-02 2018-08-02 User behavior identification method, device, equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN110795971B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651308A (en) * 2020-12-14 2021-04-13 北京市商汤科技开发有限公司 Object identification tracking method and device, electronic equipment and storage medium
CN113762184A (en) * 2021-09-13 2021-12-07 北京市商汤科技开发有限公司 Image processing method, image processing device, electronic equipment and computer storage medium
CN114152283A (en) * 2021-11-24 2022-03-08 山东蓝创网络技术股份有限公司 Family old-care nursing bed service supervision system based on stereoscopic dot matrix technology

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103246869A (en) * 2013-04-19 2013-08-14 福建亿榕信息技术有限公司 Crime monitoring method based on face recognition technology and behavior and sound recognition
CN105373774A (en) * 2015-10-10 2016-03-02 安徽清新互联信息科技有限公司 Method for detecting physical punishment behaviors of kindergarten teachers on children
CN105632049A (en) * 2014-11-06 2016-06-01 北京三星通信技术研究有限公司 Pre-warning method and device based on wearable device
CN106096831A (en) * 2016-06-08 2016-11-09 山西万立科技有限公司 Expressway Civilization services overall evaluation system
CN106791708A (en) * 2017-02-07 2017-05-31 深圳云天励飞技术有限公司 A kind of method for processing video frequency and device
CN107481737A (en) * 2017-08-28 2017-12-15 广东小天才科技有限公司 The method, apparatus and terminal device of a kind of voice monitoring
CN107832799A (en) * 2017-11-20 2018-03-23 北京奇虎科技有限公司 Object identifying method and device, computing device based on camera scene
CN108154115A (en) * 2017-12-22 2018-06-12 北京奇虎科技有限公司 Object identifying method and device, computing device based on camera scene
CN108345865A (en) * 2018-03-07 2018-07-31 广州图普网络科技有限公司 A kind of monitoring method, device and the user terminal of involved party's abnormal behaviour


Also Published As

Publication number Publication date
CN110795971B (en) 2023-02-17

Similar Documents

Publication Publication Date Title
CN109583278B (en) Face recognition alarm method, device and system and computer equipment
CN110795971B (en) User behavior identification method, device, equipment and computer storage medium
CN110795963A (en) Monitoring method, device and equipment based on face recognition
CN111241883B (en) Method and device for preventing cheating of remote tested personnel
CN115828112A (en) Fault event response method and device, electronic equipment and storage medium
US20230410221A1 (en) Information processing apparatus, control method, and program
CN116563829A (en) Driver emotion recognition method and device, electronic equipment and storage medium
CN114187561A (en) Abnormal behavior identification method and device, terminal equipment and storage medium
WO2023284185A1 (en) Updating method for similarity threshold in face recognition and electronic device
CN116313103A (en) Training method of pain identification model, pain identification method, device and medium
KR101747712B1 (en) interview auto recognizetion real-time management method by smart phone
TWI691923B (en) Fraud detection system for financial transaction and method thereof
KR102648004B1 (en) Apparatus and Method for Detecting Violence, Smart Violence Monitoring System having the same
CN115171335A (en) Image and voice fused indoor safety protection method and device for elderly people living alone
US11706391B1 (en) First responder monitoring system with distress detection
CN112507972B (en) Performance assessment system based on blockchain
CN115641701A (en) Event reminding method, device, equipment and storage medium
CN111708988B (en) Infringement video identification method and device, electronic equipment and storage medium
CN111339829B (en) User identity authentication method, device, computer equipment and storage medium
CN113743293A (en) Fall behavior detection method and device, electronic equipment and storage medium
US11886950B2 (en) System and method for assessing and verifying the validity of a transaction
EP3828792A1 (en) Frictionless and autonomous control processing
CN113520393B (en) Detection method and device for conflict event, wearable device and storage medium
CN111291597A (en) Image-based crowd situation analysis method, device, equipment and system
US20220207878A1 (en) Information acquisition support apparatus, information acquisition support method, and recording medium storing information acquisition support program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant