CN109086729B - Communication behavior detection method, device, equipment and storage medium

Info

Publication number
CN109086729B
CN109086729B (application CN201810921697.4A)
Authority
CN
China
Prior art keywords
key point
key
information
key points
palm
Prior art date
Legal status: Active
Application number
CN201810921697.4A
Other languages
Chinese (zh)
Other versions
CN109086729A (en)
Inventor
邵泉铭
肖俊文
王亚夫
胡建兵
Current Assignee
Chengdu Yunstare Technology Co ltd
Original Assignee
Chengdu Yunstare Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Yunstare Technology Co ltd filed Critical Chengdu Yunstare Technology Co ltd
Priority to CN201810921697.4A priority Critical patent/CN109086729B/en
Publication of CN109086729A publication Critical patent/CN109086729A/en
Application granted granted Critical
Publication of CN109086729B publication Critical patent/CN109086729B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Telephone Function (AREA)

Abstract

The invention relates to a communication behavior detection method, device, equipment and storage medium. The communication behavior detection method comprises the following steps: acquiring behavior picture information of a monitored person; inputting the picture information into a posture estimation module to generate key point information, wherein the key points comprise at least one of wrist key points, palm key points, elbow key points and eye key points; the wrist key points are key points corresponding to the wrist part, the palm key points are key points corresponding to the palm part, the elbow key points are key points corresponding to the elbow part, and the eye key points are key points corresponding to the eye part; and judging whether the behavior picture information contains a communication behavior according to the key point information.

Description

Communication behavior detection method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of behavior detection, and in particular to a communication behavior detection method, device, equipment and storage medium.
Background
With the development of communication technology, mobile communication devices have become an integral part of people's lives. People often use mobile communication devices for communication. However, the act of communicating using a mobile communication device is prohibited in many instances. For example: the use of mobile communication devices for communication is prohibited in gas stations, airplanes, and the like.
Where communication using a mobile communication device is prohibited, in order to prevent people from communicating with such devices, a monitoring device generally acquires a picture of a person, recognizes and judges the gestures of the person in the picture to decide whether a behavior of communicating with a mobile communication device exists, and, if so, the behavior is stopped. Although this approach avoids affecting other people in the surrounding area, when the person is far away from the monitoring device, the definition of the picture acquired by the monitoring device decreases, and the accuracy of recognizing and judging the picture is low.
Disclosure of Invention
In view of this, the present invention provides a communication behavior detection method, apparatus, device and storage medium to overcome the problem in the prior art that, when a person is far away from the monitoring device, the definition of the picture acquired by the monitoring device decreases and the accuracy of recognizing and judging the picture is low.
According to a first aspect of embodiments of the present application, there is provided a communication behavior detection method, including:
acquiring behavior picture information of monitored personnel;
inputting the picture information into a posture estimation module to generate key point information; wherein the key points comprise at least one of wrist key points, palm key points, elbow key points and eye key points;
the wrist key points are key points corresponding to the wrist parts; the palm key points are key points corresponding to the palm parts; the elbow key points are the key points corresponding to the elbow parts; the eye key points are key points corresponding to the eye parts;
and judging whether the behavior picture information contains communication behaviors or not according to the key point information.
Optionally, the posture estimation module adopts the skeleton-model-based posture recognition technology of OpenPose.
Optionally, the generating the key point information includes:
generating palm key point position information, face key point position information, left eye key point position information and right eye key point position information;
the determining whether the behavior picture information contains a communication behavior according to the key point information includes:
calculating the distance between the palm key point and the face key point;
judging whether the distance between the palm key point and the face key point is smaller than a preset distance or not;
if so, judging whether, in the horizontal direction, the palm key point is located between a straight line through the left eye key point extending straight ahead of the monitored person's face and a straight line through the right eye key point extending straight ahead of the monitored person's face;
if not, determining that the behavior picture information contains a communication behavior.
Optionally, the method for generating the position information of the palm key point includes:
generating position information of the elbow key points and position information of the wrist key points;
and determining the position information of the palm key point according to the position information of the elbow key point and the position information of the wrist key point.
Optionally, determining the position information of the palm key point according to the position information of the elbow key point and the position information of the wrist key point includes:
determining straight lines where the elbow key points and the wrist key points are located according to the position information of the elbow key points and the position information of the wrist key points;
when a first distance corresponding to a target point on the straight line is a first preset number of times a second distance, determining that the position information of the target point is the position information of the palm key point;
the target point is any point except a line segment formed by the elbow key point and the wrist key point on the straight line; the first distance is the distance between the target point and the elbow key point, and the second distance is the distance between the target point and the wrist key point.
Optionally, the value range of the first preset number is 4-6.
Optionally, determining the position information of the palm key point according to the position information of the elbow key point and the position information of the wrist key point includes:
determining a line segment taking the elbow key point and the wrist key point as end points according to the position information of the elbow key point and the position information of the wrist key point;
determining an auxiliary point on the line segment, close to the wrist key point, at one second-preset-number-th of the length of the line segment from the wrist key point;
determining a point symmetric to the auxiliary point with respect to the wrist key point;
wherein the position information of the symmetric point is the position information of the palm key point.
Optionally, the value range of the second preset number is 3-5.
Optionally, the generating the key point information includes:
generating elbow key point position information and corresponding frame number information for different frames within a continuous period;
generating wrist key point position information and corresponding frame number information for different frames within a continuous period;
generating, for different frames within a continuous period, information on whether the palm key point indicates telephone use, together with corresponding frame number information;
the determining whether the behavior picture information contains a communication behavior according to the key point information includes:
judging whether a hand-raising behavior exists according to the position information of the elbow key points and the position information of the wrist key points;
if so, judging whether the duration of the hand-raising behavior exceeds a first preset time according to the elbow key point frame number information and the wrist key point frame number information;
if so, judging whether the duration of telephone use within the duration of the hand-raising behavior exceeds a second preset time according to the information on whether the palm key point indicates telephone use and the frame number information;
if so, determining that the behavior picture information contains a communication behavior.
According to a second aspect of embodiments of the present application, there is provided a communication behavior detection apparatus including:
an acquisition module, configured to acquire behavior picture information of a monitored person;
a generating module, configured to input the picture information into a posture estimation module to generate key point information; wherein the key points comprise at least one of wrist key points, palm key points, elbow key points and eye key points;
the wrist key points are key points corresponding to the wrist parts; the palm key points are key points corresponding to the palm parts; the elbow key points are the key points corresponding to the elbow parts; the eye key points are key points corresponding to the eye parts;
and a judging module, configured to judge whether the behavior picture information contains a communication behavior according to the key point information.
Optionally, the posture estimation module adopts the skeleton-model-based posture recognition technology of OpenPose.
Optionally, the generating module is specifically configured to: inputting the picture information into a posture estimation module to generate palm key point position information, face key point position information, left eye key point position information and right eye key point position information;
the judgment module is specifically configured to:
calculating the distance between the palm key point and the face key point;
judging whether the distance between the palm key point and the face key point is smaller than a preset distance or not;
if so, judging whether, in the horizontal direction, the palm key point is located between a straight line through the left eye key point extending straight ahead of the monitored person's face and a straight line through the right eye key point extending straight ahead of the monitored person's face;
if not, determining that the behavior picture information contains a communication behavior.
Optionally, the generating module is specifically configured to:
generating elbow key point position information and corresponding frame number information for different frames within a continuous period;
generating wrist key point position information and corresponding frame number information for different frames within a continuous period;
generating, for different frames within a continuous period, information on whether the palm key point indicates telephone use, together with corresponding frame number information;
the judgment module is specifically configured to:
judge whether the behavior picture information contains a communication behavior according to the key point information, including:
judging whether a hand-raising behavior exists according to the position information of the elbow key points and the position information of the wrist key points;
if so, judging whether the duration of the hand-raising behavior exceeds a first preset time according to the elbow key point frame number information and the wrist key point frame number information;
if so, judging whether the duration of telephone use within the duration of the hand-raising behavior exceeds a second preset time according to the information on whether the palm key point indicates telephone use and the frame number information;
if so, determining that the behavior picture information contains a communication behavior.
According to a third aspect of embodiments of the present application, there is provided a communication behavior detection apparatus including:
a processor, and a memory coupled to the processor;
the memory is configured to store a computer program at least for performing the communication behavior detection method according to the first aspect of the present application;
the processor is used for calling and executing the computer program in the memory.
According to a fourth aspect of the embodiments of the present application, there is provided a storage medium storing a computer program, which when executed by a processor, implements the steps of the communication behavior detection method according to the first aspect of the present application.
By adopting the above technical solution, the behavior picture information can be acquired; the picture information is then input into the posture estimation module to generate the key point position information; and whether the behavior picture information contains a communication behavior is judged according to the key point information. The key points include at least one of the points where the wrist, the elbow and the eyes are located. Because the body parts where these key points lie occupy larger areas in the picture than the fine details of the hand used for judging gestures, determining communication behavior from the key points places a lower definition requirement on the picture than gesture-based determination does. Therefore, even when the person is far away from the monitoring device and the definition of the acquired picture decreases, the judgment remains accurate.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic flowchart of a communication behavior detection method according to an embodiment of the present application.
Fig. 2 is a flowchart illustrating a communication behavior detection method according to another embodiment of the present application.
Fig. 3 is a flowchart illustrating a communication behavior detection method according to another embodiment of the present application.
Fig. 4 is a schematic structural diagram of a communication behavior detection apparatus according to another embodiment of the present application.
Fig. 5 is a schematic structural diagram of a communication behavior detection device according to another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be described in detail below. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be made by those skilled in the art without any inventive work based on the embodiments of the present invention, are within the scope of the present invention.
Fig. 1 is a flowchart illustrating a communication behavior detection method according to an embodiment of the present invention.
As shown in fig. 1, the method of the present embodiment includes:
s101, acquiring picture information of behaviors of monitored personnel;
the behavior picture information is obtained by shooting through the monitoring equipment.
Specifically, the monitoring device may be various types of cameras, monitor persons in situations where communication using the mobile communication device is prohibited, and take picture information of the behaviors of the persons in the situations in real time. The monitored person is a person in a situation where the communication using the mobile communication device is prohibited.
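For illustration, acquiring the behavior picture information from such a camera can be sketched with OpenCV as follows; this is not code from the patent, and the stream URL and frame-sampling interval are placeholder assumptions:

```python
import cv2

def capture_behavior_pictures(source="rtsp://camera.local/stream", interval=5):
    """Yield (frame number, frame) pairs sampled from a monitoring stream."""
    cap = cv2.VideoCapture(source)
    frame_idx = 0
    try:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            if frame_idx % interval == 0:
                # behavior picture information together with its frame number
                yield frame_idx, frame
            frame_idx += 1
    finally:
        cap.release()
```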
S102, inputting the picture information into a posture estimation module to generate key point information; wherein the key points comprise at least one of wrist key points, palm key points, elbow key points and eye key points;
the wrist key points are key points corresponding to the wrist parts; the palm key points are key points corresponding to the palm parts; the elbow key points are the key points corresponding to the elbow parts; the eye key points are key points corresponding to the eye parts;
specifically, the input posture estimation module adopts an openpos skeleton model-based posture identification technology. The gesture recognition technology of OpenPose based on a skeleton model is a technology for processing human body gestures as a whole. The technology can be used for tracking and detecting the feature points of multiple persons in real time, and can simultaneously locate 18 key feature points on the human body.
And S103, judging whether the behavior picture information contains communication behaviors or not according to the key point information.
By adopting the above technical solution, the behavior picture information can be acquired through the monitoring device; the picture information is then input into the posture estimation module to generate the key point position information; and whether the behavior is a communication behavior is judged according to the key point information. The key points include at least one of the points where the wrist, the elbow and the eyes are located. Because the body parts where these key points lie occupy larger areas in the picture than the fine details of the hand used for judging gestures, determining communication behavior from the key points places a lower definition requirement on the picture than gesture-based determination does. Therefore, even when the person is far away from the monitoring device and the definition of the acquired picture decreases, the judgment remains accurate.
In practical applications, there are the following two methods for detecting communication behaviors by the above technical solutions.
Referring to fig. 2, the steps of one method are as follows:
s201, generating palm key point position information, face key point position information, left eye key point position information and right eye key point position information;
specifically, S201 is a further explanation of S102. Inputting the picture information into a posture estimation module, and generating palm key point position information, face key point position information, left eye key point position information and right eye key point position information.
Further, the method for generating the position information of the palm key point comprises the following steps: generating position information of the elbow key points and position information of the wrist key points; and determining the position information of the palm key point according to the position information of the elbow key point and the position information of the wrist key point.
Generating the position information of the elbow key points and the position information of the wrist key points is as follows: and inputting the picture information into a posture estimation module to generate the position information of the elbow key points and the position information of the wrist key points.
The determining the position information of the palm key point according to the position information of the elbow key point and the position information of the wrist key point is as follows: determining the straight line on which the elbow key point and the wrist key point are located according to the position information of the elbow key point and the position information of the wrist key point; when the first distance corresponding to a target point on the straight line is a first preset number of times the second distance, determining that the position information of the target point is the position information of the palm key point. The target point is any point on the straight line outside the line segment formed by the elbow key point and the wrist key point; the first distance is the distance between the target point and the elbow key point, and the second distance is the distance between the target point and the wrist key point. The value range of the first preset number is 4-6; specifically, the first preset number may be 5.
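This extrapolation reduces to a few lines of vector arithmetic. The following is a minimal sketch (not code from the patent; the function name and the representation of key points as (x, y) arrays are assumptions). With forearm length L and first preset number k, the condition d(palm, elbow) = k · d(palm, wrist) places the palm key point L/(k-1) beyond the wrist:

```python
import numpy as np

def palm_from_elbow_wrist(elbow, wrist, k=5):
    """Palm key point on the elbow-wrist line, beyond the wrist, such that
    distance(palm, elbow) = k * distance(palm, wrist)."""
    elbow = np.asarray(elbow, dtype=float)
    wrist = np.asarray(wrist, dtype=float)
    # With elbow-wrist distance L, the palm lands L/(k-1) past the wrist,
    # i.e. palm = wrist + (wrist - elbow)/(k - 1).
    return wrist + (wrist - elbow) / (k - 1)
```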
Alternatively, the position information of the palm key point can be determined according to the position information of the elbow key point and the position information of the wrist key point as follows:
determining a line segment taking the elbow key point and the wrist key point as end points according to the position information of the elbow key point and the position information of the wrist key point;
determining an auxiliary point on the line segment, close to the wrist key point, at one second-preset-number-th of the length of the line segment from the wrist key point;
determining a point symmetric to the auxiliary point with respect to the wrist key point;
the position information of the symmetric point is the position information of the palm key point.
Wherein the value range of the second preset number is 3-5. Specifically, the second preset number may be 4.
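Read this way, the construction can be sketched as follows (again an illustration, not patent code; placing the auxiliary point at 1/n of the segment length from the wrist is our reading of the translated text). Note that with n = 4 this yields the same palm point as the first method with k = 5:

```python
import numpy as np

def palm_by_reflection(elbow, wrist, n=4):
    """Palm key point as the mirror image of an auxiliary point about the wrist."""
    elbow = np.asarray(elbow, dtype=float)
    wrist = np.asarray(wrist, dtype=float)
    aux = wrist + (elbow - wrist) / n  # auxiliary point, 1/n of the segment from the wrist
    return 2.0 * wrist - aux           # reflect the auxiliary point about the wrist
```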
S202, calculating the distance between a palm key point and a face key point;
specifically, a three-dimensional model may be established, and the distance between the palm key point and the face key point may be determined according to the information of the palm key point and the information of the face key point.
S203, judging whether the distance between the palm key point and the face key point is smaller than a preset distance or not;
the value range of the preset distance is 0-60 pixel distance.
S204, if so, judging whether, in the horizontal direction, the palm key point is located between a straight line through the left eye key point extending straight ahead of the monitored person's face and a straight line through the right eye key point extending straight ahead of the monitored person's face;
S205, if not, determining that the behavior of the monitored person is a calling behavior.
If the monitored person is using other functions of the communication device, for example playing games or reading electronic documents, the communication device needs to be placed in front of the eyes, that is, near the perpendicular bisector of the line connecting the left eye key point and the right eye key point. Meanwhile, since the communication device is generally held by the palm, the position of the palm key point can be regarded approximately as the position of the communication device. When the palm holding the communication device is close to the face but not in front of the eyes, the person is considered to be communicating with the communication device.
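A minimal sketch of this first judging method follows. It is illustrative only: reducing the "straight ahead of the face" lines to the eyes' horizontal coordinates assumes a roughly frontal 2D view, and the function and parameter names are ours (the 60-pixel default follows the preset distance range above):

```python
import numpy as np

def is_calling(palm, face, left_eye, right_eye, max_dist=60):
    """True if the palm is near the face but not in front of the eyes."""
    palm = np.asarray(palm, dtype=float)
    face = np.asarray(face, dtype=float)
    if np.linalg.norm(palm - face) >= max_dist:
        return False                        # palm not close enough to the face
    lo, hi = sorted((left_eye[0], right_eye[0]))
    between_eyes = lo <= palm[0] <= hi      # horizontally in front of the eyes
    return not between_eyes                 # beside the face -> calling behavior
```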
Referring to fig. 3, another method includes the following steps:
S301, generating elbow key point position information and corresponding frame number information for different frames within a continuous period;
S302, generating wrist key point position information and corresponding frame number information for different frames within a continuous period;
S303, generating, for different frames within a continuous period, information on whether the palm key point indicates telephone use, together with corresponding frame number information;
Specifically, S301, S302 and S303 are further explanations of S102. The picture information is input into the posture estimation module to generate, for different frames within a continuous period, the elbow key point position information and frame number information, the wrist key point position information and frame number information, and the information on whether the palm key point indicates telephone use together with the frame number information.
S304, judging whether a hand-raising behavior exists according to the position information of the elbow key points and the position information of the wrist key points;
Specifically, whether a hand-raising behavior exists can be judged from the position information of the elbow key point and the position information of the wrist key point output by the posture estimation module, which adopts the skeleton-model-based posture recognition technology of OpenPose.
S305, if so, judging whether the duration of the hand-raising behavior exceeds a first preset time according to the elbow key point frame number information and the wrist key point frame number information;
Specifically, the first preset time ranges from 3 seconds to 5 seconds; further, the first preset time may be 5 seconds.
S306, if so, judging whether the duration of telephone use within the duration of the hand-raising behavior exceeds a second preset time according to the information on whether the palm key point indicates telephone use and the frame number information;
Specifically, the second preset time ranges from 3 seconds to 5 seconds; further, the second preset time may be 3 seconds.
S307, if so, determining that the behavior of the monitored person is a calling behavior.
The basis of the above judging mode is: when the monitored person holds a handset and keeps the hand raised for more than a certain period of time, it is highly likely that the person is communicating using the communication device. Therefore, S304 determines whether a hand-raising behavior exists, S305 judges the duration of the hand-raising behavior, and S306 judges whether a handset is being held; step by step, it can thus be determined whether the monitored person meets these conditions and is communicating.
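A sketch of this second, temporal judging method is given below. It is illustrative: the per-frame flags, frame rate and function names are assumptions, the thresholds follow the 3-5 second ranges above, and a crude wrist-above-elbow test stands in for the hand-raising judgment:

```python
def hand_raised(wrist, elbow):
    """Stand-in for the hand-raising judgment: in image coordinates y grows
    downward, so a raised hand puts the wrist above the elbow."""
    return wrist[1] < elbow[1]

def detect_calling(frames, fps=25, t_raise=5.0, t_phone=3.0):
    """frames: iterable of (frame_no, hand_raised: bool, phone_at_palm: bool).
    Report a calling behavior once the hand has stayed raised longer than
    t_raise seconds and the phone has been at the palm longer than t_phone
    seconds within that stretch; return the triggering frame number."""
    raised_frames = phone_frames = 0
    for frame_no, raised, phone_at_palm in frames:
        if not raised:
            raised_frames = phone_frames = 0   # streak broken, reset both counts
            continue
        raised_frames += 1
        phone_frames += int(phone_at_palm)
        if raised_frames > t_raise * fps and phone_frames > t_phone * fps:
            return frame_no                    # calling behavior detected
    return None
```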
Fig. 4 is a schematic structural diagram of a communication behavior detection apparatus according to another embodiment of the present application. Referring to fig. 4, the communication behavior detection apparatus provided by the present application includes:
an acquiring module 401, configured to acquire behavior picture information of a monitored person;
the behavior picture information is obtained by shooting through the monitoring equipment.
a generating module 402, configured to input the picture information into the posture estimation module to generate the key point information; wherein the key points comprise at least one of wrist key points, palm key points, elbow key points and eye key points;
a determining module 403, configured to determine whether the behavior picture information contains a communication behavior according to the key point information.
By adopting the above technical solution, the behavior picture information can be acquired through the monitoring device; the picture information is then input into the posture estimation module to generate the key point position information; and whether the behavior is a communication behavior is judged according to the key point information. The key points may include the points, among the 18 key feature points located by the posture estimation module, where the wrist, the elbow and the eyes are located. Because these body parts occupy larger areas in the picture than the fine details of the hand used for judging gestures, determining communication behavior from the key points places a lower definition requirement on the picture than gesture-based determination does. Compared with the scheme in the background art, recognizing and judging the behavior through this scheme can therefore improve the judgment accuracy even when the person is far away from the monitoring device.
Optionally, the posture estimation module adopts the skeleton-model-based posture recognition technology of OpenPose.
Optionally, the generating module 402 is specifically configured to:
input the picture information into the posture estimation module, and generate palm key point position information, face key point position information, left eye key point position information and right eye key point position information;
the determining module 403 is specifically configured to:
calculating the distance between the palm key point and the face key point;
judging whether the distance between the palm key point and the face key point is smaller than a preset distance or not;
if so, judging whether, in the horizontal direction, the palm key point is located between a straight line through the left eye key point extending straight ahead of the monitored person's face and a straight line through the right eye key point extending straight ahead of the monitored person's face;
if not, determining that the behavior picture information contains a communication behavior.
Optionally, the generating module 402 is specifically configured to:
generating elbow key point position information and corresponding frame number information for different frames within a continuous period;
generating wrist key point position information and corresponding frame number information for different frames within a continuous period;
generating, for different frames within a continuous period, information on whether the palm key point indicates telephone use, together with corresponding frame number information;
the determining module 403 is specifically configured to:
determine whether the behavior picture information contains a communication behavior according to the key point information, including:
judging whether a hand-raising behavior exists according to the position information of the elbow key points and the position information of the wrist key points;
if so, judging whether the duration of the hand-raising behavior exceeds a first preset time according to the elbow key point frame number information and the wrist key point frame number information;
if so, judging whether the duration of telephone use within the duration of the hand-raising behavior exceeds a second preset time according to the information on whether the palm key point indicates telephone use and the frame number information;
if so, determining that the behavior picture information contains a communication behavior.
Fig. 5 is a schematic structural diagram of a communication behavior detection device according to another embodiment of the present application. Referring to fig. 5, the communication behavior detection apparatus provided by the present application includes: a processor 502, and a memory 501 connected to the processor 502;
the memory 501 is used for storing a computer program, which is at least used for executing the communication behavior detection method provided by the present application;
the processor 502 is used to invoke and execute computer programs in the memory 501.
By adopting the above technical solution, the behavior picture information can be acquired through the monitoring device; the picture information is then input into the posture estimation module to generate the key point position information; and whether the behavior is a communication behavior is judged according to the key point information. The key points may include at least one of the points where the wrist, the elbow and the eyes are located. Because these body parts occupy larger areas in the picture than the fine details of the hand used for judging gestures, determining communication behavior from the key points places a lower definition requirement on the picture than gesture-based determination does. Compared with the scheme in the background art, recognizing and judging the behavior through this scheme can therefore improve the judgment accuracy even when the person is far away from the monitoring device.
The application also provides a storage medium, wherein the storage medium stores a computer program, and when the computer program is executed by a processor, the steps in the communication behavior detection method provided by the application are realized.
It is understood that the same or similar parts in the above embodiments may be mutually referred to, and the same or similar parts in other embodiments may be referred to for the content which is not described in detail in some embodiments.
It should be noted that the terms "first," "second," and the like in the description of the present invention are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Further, in the description of the present invention, the meaning of "a plurality" means at least two unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (14)

1. A communication behavior detection method, comprising:
acquiring behavior picture information of monitored personnel;
inputting the picture information into a posture estimation module to generate key point information; wherein the key points comprise at least one of wrist key points, palm key points, elbow key points, face key points, and eye key points;
the wrist key points are key points corresponding to the wrist parts; the palm key points are key points corresponding to the palm parts; the elbow key points are the key points corresponding to the elbow parts; the eye key points are key points corresponding to the eye parts;
judging whether the behavior picture information contains communication behaviors or not according to the key point information;
which comprises the following steps: calculating the distance between the palm key point and the face key point;
judging whether the distance between the palm key point and the face key point is smaller than a preset distance or not;
if so, judging whether, in the horizontal direction, the palm key point is located between a straight line through the left eye key point extending straight ahead of the monitored person's face and a straight line through the right eye key point extending straight ahead of the monitored person's face;
if not, determining that the behavior picture information contains a communication behavior.
2. The method of claim 1, wherein the posture estimation module adopts the skeleton-model-based posture recognition technology of OpenPose.
3. The method of claim 1, wherein generating palm key point location information comprises:
generating position information of the elbow key points and position information of the wrist key points;
and determining the position information of the palm key point according to the position information of the elbow key point and the position information of the wrist key point.
4. The method of claim 3, wherein determining the palm key location information from the elbow key location information and wrist key location information comprises:
determining straight lines where the elbow key points and the wrist key points are located according to the position information of the elbow key points and the position information of the wrist key points;
when a first distance corresponding to a target point on the straight line is a first preset number of times a second distance, determining that the position information of the target point is the position information of the palm key point;
the target point is any point except a line segment formed by the elbow key point and the wrist key point on the straight line; the first distance is the distance between the target point and the elbow key point, and the second distance is the distance between the target point and the wrist key point.
5. The method of claim 4, wherein the first predetermined number is in a range of 4 to 6.
6. The method of claim 3, wherein determining the palm key location information from the elbow key location information and wrist key location information comprises:
determining a line segment taking the elbow key point and the wrist key point as end points according to the position information of the elbow key point and the position information of the wrist key point;
determining an auxiliary point on the line segment, close to the wrist key point, at one second-preset-number-th of the length of the line segment from the wrist key point;
determining a point symmetric to the auxiliary point with respect to the wrist key point;
wherein the position information of the symmetric point is the position information of the palm key point.
7. The method of claim 6, wherein the second predetermined number is in a range of 3 to 5.
8. The method of claim 1, wherein the generating the keypoint information comprises:
generating elbow key point position information and corresponding frame number information for different frames within a continuous period;
generating wrist key point position information and corresponding frame number information for different frames within a continuous period;
generating, for different frames within a continuous period, information on whether the palm key point indicates telephone use, together with corresponding frame number information;
the determining whether the behavior picture information contains a communication behavior according to the key point information includes:
judging whether a hand-raising behavior exists according to the position information of the elbow key points and the position information of the wrist key points;
if so, judging whether the duration of the hand-raising behavior exceeds a first preset time according to the elbow key point frame number information and the wrist key point frame number information;
if so, judging whether the duration of telephone use within the duration of the hand-raising behavior exceeds a second preset time according to the information on whether the palm key point indicates telephone use and the frame number information;
if so, determining that the behavior picture information contains a communication behavior.
9. A communication behavior detection apparatus, comprising:
the acquisition module acquires behavior picture information of monitored personnel;
the generating module inputs the picture information into the attitude estimation module to generate key point information; wherein the key points comprise at least one of wrist key points, palm key points, elbow key points, face key points and eye key points;
the wrist key points are key points corresponding to the wrist parts; the palm key points are key points corresponding to the palm parts; the elbow key points are the key points corresponding to the elbow parts; the eye key points are key points corresponding to the eye parts;
the judging module is used for judging whether the behavior picture information contains communication behaviors or not according to the key point information;
which comprises the following steps: calculating the distance between the palm key point and the face key point;
judging whether the distance between the palm key point and the face key point is smaller than a preset distance or not;
if so, judging whether, in the horizontal direction, the palm key point is located between a straight line through the left eye key point extending straight ahead of the monitored person's face and a straight line through the right eye key point extending straight ahead of the monitored person's face;
if not, determining that the behavior picture information contains a communication behavior.
10. The apparatus of claim 9, wherein the posture estimation module adopts the skeleton-model-based posture recognition technology of OpenPose.
11. The apparatus of claim 9, wherein the generating module is specifically configured to: inputting the picture information into a posture estimation module to generate palm key point position information, face key point position information, left eye key point position information and right eye key point position information;
the judgment module is specifically configured to:
calculating the distance between the palm key point and the face key point;
judging whether the distance between the palm key point and the face key point is smaller than a preset distance or not;
if so, judging whether, in the horizontal direction, the palm key point is located between a straight line through the left eye key point extending straight ahead of the monitored person's face and a straight line through the right eye key point extending straight ahead of the monitored person's face;
if not, determining that the behavior picture information contains a communication behavior.
12. The apparatus of claim 9,
the generation module is specifically configured to:
generating elbow key point position information and corresponding frame number information for different frames within a continuous period; generating wrist key point position information and corresponding frame number information for different frames within a continuous period;
generating, for different frames within a continuous period, information on whether the palm key point indicates telephone use, together with corresponding frame number information; the judgment module is specifically configured to:
the determining whether the behavior picture information contains a communication behavior according to the key point information includes:
judging whether a hand-raising behavior exists according to the position information of the elbow key points and the position information of the wrist key points;
if so, judging whether the duration of the hand-raising behavior exceeds a first preset time according to the elbow key point frame number information and the wrist key point frame number information;
if so, judging whether the duration of telephone use within the duration of the hand-raising behavior exceeds a second preset time according to the information on whether the palm key point indicates telephone use and the frame number information;
if so, determining that the behavior picture information contains a communication behavior.
13. A communication behavior detection device characterized by comprising:
a processor, and a memory coupled to the processor;
the memory is configured to store a computer program for performing at least the communication behavior detection method of any of claims 1-8;
the processor is used for calling and executing the computer program in the memory.
14. A storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the steps of the communication behavior detection method according to any one of claims 1 to 8.
CN201810921697.4A 2018-08-13 2018-08-13 Communication behavior detection method, device, equipment and storage medium Active CN109086729B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810921697.4A CN109086729B (en) 2018-08-13 2018-08-13 Communication behavior detection method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810921697.4A CN109086729B (en) 2018-08-13 2018-08-13 Communication behavior detection method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109086729A CN109086729A (en) 2018-12-25
CN109086729B true CN109086729B (en) 2022-03-01

Family

ID=64834660

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810921697.4A Active CN109086729B (en) 2018-08-13 2018-08-13 Communication behavior detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109086729B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110070001A (en) * 2019-03-28 2019-07-30 上海拍拍贷金融信息服务有限公司 Behavioral value method and device, computer readable storage medium
CN110084123A (en) * 2019-03-28 2019-08-02 上海拍拍贷金融信息服务有限公司 Human body behavioral value method and system, computer readable storage medium
CN110110631B (en) * 2019-04-25 2021-06-29 深兰科技(上海)有限公司 Method and equipment for recognizing and making call
CN110490125B (en) * 2019-08-15 2023-04-18 成都睿晓科技有限公司 Oil filling area service quality detection system based on gesture automatic detection
CN111461020B (en) * 2020-04-01 2024-01-19 浙江大华技术股份有限公司 Recognition method, equipment and related storage medium for unsafe mobile phone behavior
CN112906646A (en) * 2021-03-23 2021-06-04 中国联合网络通信集团有限公司 Human body posture detection method and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10395018B2 (en) * 2010-11-29 2019-08-27 Biocatch Ltd. System, method, and device of detecting identity of a user and authenticating a user
CN105469073A (en) * 2015-12-16 2016-04-06 安徽创世科技有限公司 Kinect-based call making and answering monitoring method of driver
CN105912991B (en) * 2016-04-05 2019-06-25 湖南大学 Activity recognition based on 3D point cloud and crucial bone node
CN106066996B (en) * 2016-05-27 2019-07-30 上海理工大学 The local feature representation method of human action and its application in Activity recognition
CN107301384A (en) * 2017-06-09 2017-10-27 湖北天业云商网络科技有限公司 A kind of driver takes phone behavioral value method and system
CN108256431B (en) * 2017-12-20 2020-09-25 中车工业研究院有限公司 Hand position identification method and device
CN108289180B (en) * 2018-01-30 2020-08-21 广州市百果园信息技术有限公司 Method, medium, and terminal device for processing video according to body movement

Also Published As

Publication number Publication date
CN109086729A (en) 2018-12-25

Similar Documents

Publication Publication Date Title
CN109086729B (en) Communication behavior detection method, device, equipment and storage medium
CN110210571B (en) Image recognition method and device, computer equipment and computer readable storage medium
JP7154678B2 (en) Target position acquisition method, device, computer equipment and computer program
CN111325726A (en) Model training method, image processing method, device, equipment and storage medium
KR102316278B1 (en) Electronic device and method for controlling fingerprint information
CN108182396B (en) Method and device for automatically identifying photographing behavior
US9746927B2 (en) User interface system and method of operation thereof
EP3076320A1 (en) Individual identification device, and identification threshold setting method
CN109684980B (en) Automatic scoring method and device
CN109191802B (en) Method, device, system and storage medium for eyesight protection prompt
CN108875533B (en) Face recognition method, device, system and computer storage medium
CN110570460B (en) Target tracking method, device, computer equipment and computer readable storage medium
CN112506340B (en) Equipment control method, device, electronic equipment and storage medium
KR20180013277A (en) Electronic apparatus for displaying graphic object and computer readable recording medium
US20140232748A1 (en) Device, method and computer readable recording medium for operating the same
CN109509257A (en) Indoor floor rank components pattern forming method, terminal and storage medium
CN112181141B (en) AR positioning method and device, electronic equipment and storage medium
KR102337209B1 (en) Method for notifying environmental context information, electronic apparatus and storage medium
KR20180116843A (en) Method for detecting motion and electronic device implementing the same
CN108628442A (en) A kind of information cuing method, device and electronic equipment
US10481772B2 (en) Widget displaying method and apparatus for use in flexible display device, and storage medium
US20200018926A1 (en) Information processing apparatus, information processing method, and program
CN108920085B (en) Information processing method and device for wearable device
CN108898000A (en) A kind of method and terminal solving lock screen
CN108537149A (en) Image processing method, device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 803 and 804, 8 / F, building 7, No. 599, shijicheng South Road, Chengdu hi tech Zone, China (Sichuan) pilot Free Trade Zone, Chengdu 610000, Sichuan Province

Applicant after: Chengdu yunstare Technology Co.,Ltd.

Address before: 610000 Chengdu, Sichuan, China (Sichuan) free trade pilot area, Chengdu high tech Zone, 10 Tianhua two road 219, 6 level 601.

Applicant before: Chengdu Huina Intelligent Technology Co.,Ltd.

Address after: 610000 Chengdu, Sichuan, China (Sichuan) free trade pilot area, Chengdu high tech Zone, 10 Tianhua two road 219, 6 level 601.

Applicant after: Chengdu Huina Intelligent Technology Co.,Ltd.

Address before: 610000 Chengdu, Sichuan, China (Sichuan) free trade pilot area, Chengdu high tech Zone, 10 Tianhua two road 219, 6 level 601.

Applicant before: CHENGDU DINGDING TECHNOLOGY Co.,Ltd.

GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Communication behavior detection method, device, equipment and storage medium

Effective date of registration: 20220616

Granted publication date: 20220301

Pledgee: Bank of Chengdu science and technology branch of Limited by Share Ltd.

Pledgor: Chengdu yunstare Technology Co.,Ltd.

Registration number: Y2022980007938