CN112016363A - Personnel monitoring method and device, computer device and readable storage medium - Google Patents

Personnel monitoring method and device, computer device and readable storage medium

Info

Publication number
CN112016363A
CN112016363A
Authority
CN
China
Prior art keywords
personnel
person
action
behavior
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910465988.1A
Other languages
Chinese (zh)
Inventor
李国瑜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Futaihua Industry Shenzhen Co Ltd
Hon Hai Precision Industry Co Ltd
Original Assignee
Futaihua Industry Shenzhen Co Ltd
Hon Hai Precision Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Futaihua Industry Shenzhen Co Ltd, Hon Hai Precision Industry Co Ltd filed Critical Futaihua Industry Shenzhen Co Ltd
Priority to CN201910465988.1A priority Critical patent/CN112016363A/en
Publication of CN112016363A publication Critical patent/CN112016363A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Alarm Systems (AREA)

Abstract

The invention provides a personnel monitoring method and device, a computer device, and a computer-readable storage medium. The method comprises the following steps: acquiring video images of persons in a preset area in real time from a camera device; extracting face information of a person in the video images, and determining the person's information according to the face information; extracting a behavior feature map of the person in the video images, and determining the person's action according to the behavior feature map; comparing the person's action with the action categories in an action library to judge whether the person's behavior is abnormal; if the behavior is abnormal, outputting the person's information; and if the behavior is normal, continuing to monitor. By this method, the activity range of persons in the preset area can be accurately tracked, and whether their behavior is normal can be judged.

Description

Personnel monitoring method and device, computer device and readable storage medium
Technical Field
The invention relates to the technical field of computers, in particular to a personnel monitoring method, a personnel monitoring device, a computer device and a computer readable storage medium.
Background
With the continuous improvement of economic conditions, the manufacturing industry has developed rapidly, market competition has become global, the demand for labor keeps growing, and labor costs rise day by day. Under these conditions, the management of labor costs faces significant challenges. Under the current management mode of the manufacturing industry, it is difficult for decision-makers to grasp the working states and whereabouts of all personnel. Although existing monitoring systems can reveal some of these problems, they cannot be used effectively: the workload of screening and reviewing monitoring information is large; specific idle personnel are hard to identify; idle time and activity ranges are hard to define; the mechanisms for using and transmitting information are incomplete; assessment of employees' work performance lacks reliable data support; the feedback mechanism is opaque; and employee motivation is insufficient.
Disclosure of Invention
In view of the above, there is a need for a personnel monitoring method, a personnel monitoring device, a computer device, and a computer-readable storage medium that can monitor the working states and behavior traces of personnel.
A first aspect of the present application provides a people monitoring method, the method comprising:
acquiring a video image of a person in a preset area in real time from a camera device;
extracting face information of people in the video image, and determining the person information according to the face information;
extracting a behavior feature map of a person in the video image, and determining the action of the person according to the behavior feature map;
comparing the action of the personnel with the action categories in the action library to judge whether the personnel action is abnormal;
if the personnel behavior is abnormal, outputting the personnel information;
and if the personnel act normally, continuing monitoring.
Preferably, the method further comprises:
judging whether the person is within a normal activity range according to the person's information;
and if the person is not within the normal activity range, issuing an abnormality alarm.
Preferably, the step of determining whether the person is in a normal range of motion comprises:
acquiring image frames with the personnel, which are acquired by a plurality of camera devices, according to a time sequence;
marking the position of a camera device for acquiring the image frame of the person on a map of a preset area;
outputting the movement track of the personnel according to the marked position;
and judging whether the movement track of the personnel is in a normal movement range or not according to the personnel information.
Preferably, the method further comprises:
counting the duration time of normal behaviors and the duration time of abnormal behaviors of any person within a preset time, and calculating the working saturation of the person;
and counting the duration of normal behaviors and the duration of abnormal behaviors of all the personnel in any post within a preset time, and calculating the working saturation of the post.
Preferably, the method further comprises:
and allocating the personnel at different posts according to the working saturation of the posts.
Preferably, the step of extracting the face information of the person in the video image and determining the person information according to the face information includes:
extracting an image frame with the head portrait of the human face in a video image;
extracting the face feature information of the personnel by a face recognition method;
and searching the personnel information matched with the face feature information in a preset database.
Preferably, the step of extracting a behavior feature map of a person in the video image, and determining the action of the person according to the behavior feature map includes:
extracting image frames with personnel information in the video images;
identifying a behavior feature map of the person in the image frame by using a human behavior identification algorithm;
and determining the action of the person represented by the behavior characteristic diagram of the person according to the behavior characteristic diagram of the person.
Preferably, the step of determining the action of the person represented by the behavior feature map of the person comprises:
identifying key points of human bones in the behavior feature map of the person;
connecting the key points, and converting the connecting line into a vector distance;
and determining the action of the person represented by the behavior feature diagram of the person according to the vector distance.
Preferably, the content of the behavior action library includes at least one of: a behavior action in the normal state, the duration of a behavior action in the normal state, the working range in the normal state, a behavior action in the abnormal state, the duration of a behavior action in the abnormal state, and the working range in the abnormal state.
A second aspect of the present application provides a people monitoring apparatus, the apparatus comprising:
an acquisition module: used for acquiring video images of persons in a preset area in real time from a camera device;
a first extraction module: used for extracting face information of a person in the video images and determining the person's information according to the face information;
a second extraction module: used for extracting a behavior feature map of the person in the video images and determining the person's action according to the behavior feature map;
a judging module: used for comparing the person's action with the action categories in the action library and judging whether the person's behavior is abnormal;
a first execution module: used for outputting the person's information when the behavior is abnormal;
and a second execution module: used for continuing to monitor when the behavior is normal.
A third aspect of the application provides a computer device comprising a processor which, when executing a computer program stored in a memory, implements the personnel monitoring method described above.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a person monitoring method as described above.
The personnel monitoring of the invention makes the monitoring mechanism more intelligent and accurate, reduces the workload of screening and reviewing monitoring information, accurately tracks the activity range of persons in the preset area, and judges whether their behavior is normal.
Drawings
Fig. 1 is a schematic diagram of an application environment architecture of a personnel monitoring method according to an embodiment of the present invention.
Fig. 2 is a flowchart of a person monitoring method according to a second embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a person monitoring device according to a third embodiment of the present invention.
Fig. 4 is a schematic diagram of a computer device according to a fourth embodiment of the present invention.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a detailed description of the present invention will be given below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention; the described embodiments are merely some of the embodiments of the present invention, rather than all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Example one
Fig. 1 is a schematic diagram of an application environment architecture of a personnel monitoring method according to an embodiment of the present invention.
The personnel monitoring method is applied to a computer device 1, which establishes a communication connection with a plurality of camera devices 2 through a network; in other embodiments, the computer device 1 may also establish a communication connection with at least one user terminal 3 through a network. The network may be wired or wireless, for example radio, Wireless Fidelity (Wi-Fi), cellular, satellite, or broadcast.
The computer device 1 may be an electronic device installed with personnel monitoring method software, such as a personal computer, a server, and the like, wherein the server may be a single server, a server cluster, a cloud server, or the like.
The camera device 2 may be any device with a shooting function, including but not limited to a 360° panoramic camera, a surveillance camera, and the like.
The user terminal 3 may be any of various intelligent electronic devices with a display screen, including but not limited to smart phones, tablets, laptop computers, and desktop computers.
Example two
Fig. 2 is a flowchart illustrating a personnel monitoring method according to a second embodiment of the present invention. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
And step S1, acquiring the video image of the personnel in the preset area in real time from the camera device.
In an embodiment of the present invention, the preset area may be a factory, a construction site, a park, a school, or other public places.
In another embodiment of the present invention, the camera device may be a 360° panoramic network camera, which may be installed at the entrances of the preset area, along its main roads, and around its perimeter as actually needed, so as to achieve optimal coverage of the whole preset area.
And step S2, extracting the face information of the person in the video image, and determining the person information according to the face information.
In an embodiment of the present invention, the step of extracting face information of a person in the video image, and determining the person information according to the face information may include:
extracting an image frame with the head portrait of the human face in a video image;
extracting the face feature information of the personnel by a face recognition method;
and searching the personnel information matched with the face feature information in a preset database.
The person information includes, but is not limited to, name, employee number, department, and work place.
The preset database comprises face feature information of all the persons in the preset area and person information corresponding to the face feature information.
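The database-matching step above can be sketched as follows. This is only an illustrative sketch, assuming the facial feature information is a fixed-length feature vector and that the preset database maps such vectors to personnel records; the record fields, feature values, and similarity threshold are hypothetical, not taken from the patent.

```python
import math

# Hypothetical preset database: facial feature vectors mapped to
# personnel records. All values are invented for illustration.
PRESET_DATABASE = {
    "E1001": {"name": "Zhang San", "department": "Manufacturing Team 1",
              "workplace": "Second floor, manufacturing building",
              "features": [0.12, 0.80, 0.35, 0.41]},
    "E1002": {"name": "Li Si", "department": "Manufacturing Team 2",
              "workplace": "First floor, manufacturing building",
              "features": [0.90, 0.10, 0.55, 0.20]},
}

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def match_person(face_features, database=PRESET_DATABASE, threshold=0.9):
    """Return the personnel record whose stored feature vector is most
    similar to the extracted one, or None if nothing passes the threshold."""
    best_id, best_score = None, threshold
    for emp_id, record in database.items():
        score = cosine_similarity(face_features, record["features"])
        if score > best_score:
            best_id, best_score = emp_id, score
    return database[best_id] if best_id is not None else None
```

In practice the feature vectors would come from a face recognition model; here the lookup itself is the point.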
In another embodiment of the present invention, the method further comprises: judging whether the person is within a normal activity range according to the person's information;
and if the person is not within the normal activity range, issuing an abnormality alarm.
The abnormality alarm may be sent to preset personnel by email, SMS, telephone, or instant message.
The step of determining whether the person is in a normal range of motion may comprise:
acquiring image frames with the personnel, which are acquired by a plurality of camera devices, according to a time sequence;
marking the position of a camera device for acquiring the image frame of the person on a map of a preset area;
outputting the movement track of the personnel according to the marked position;
and judging whether the movement track of the personnel is in a normal movement range or not according to the personnel information.
For example, face information is extracted from the video, the person's facial features are extracted by a face recognition method and compared with the records in the preset person information database, and the matched person information is output: the person is Zhang San, the department is Manufacturing Team 1, and the work place is the second floor of the manufacturing building. The image frames containing Zhang San's facial features captured by the plurality of camera devices are then acquired in time order; the positions of the camera devices that captured those frames are marked on the map of the preset area; and the person's movement track is output according to the marked positions, which show, for example, that the person is now on the first floor of the building. Whether this movement track lies within the person's normal activity range is then judged, and if the person has left that range, an abnormality alarm is sent by SMS, email, or other means.
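The track-building and range-checking steps can be sketched as follows; a minimal illustration assuming each camera device has a known position on the map of the preset area (camera identifiers and location names are hypothetical):

```python
from datetime import datetime

# Hypothetical positions of the camera devices on the preset-area map.
CAMERA_POSITIONS = {
    "cam_gate": "Entrance",
    "cam_f1": "First floor",
    "cam_f2": "Second floor",
    "cam_yard": "Rear yard",
}

def movement_track(frames):
    """frames: (timestamp, camera_id) pairs for the image frames containing
    the person; returns the marked positions visited, in time order."""
    return [CAMERA_POSITIONS[cam] for _, cam in sorted(frames)]

def within_normal_range(track, normal_range):
    """True when every position on the track lies in the normal activity range."""
    return all(pos in normal_range for pos in track)
```

If `within_normal_range` returns False, the method would trigger the abnormality alarm described above.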
And step S3, extracting the behavior feature diagram of the person in the video image, and determining the action of the person according to the behavior feature diagram.
In an embodiment of the present invention, the step of extracting a behavior feature map of a person in the video image, and determining the action of the person according to the behavior feature map includes:
extracting image frames with personnel information in the video images;
identifying a behavior feature map of the person in the image frame by using a human behavior identification algorithm;
and determining the action of the person represented by the behavior characteristic diagram of the person according to the behavior characteristic diagram of the person.
The human behavior recognition algorithm includes, but is not limited to, a human behavior recognition algorithm based on machine vision, and a human behavior recognition algorithm based on deep learning.
In yet another embodiment of the present invention, a programming language may be used to read image frames in the video image and output the image frames.
In an embodiment of the present invention, the step of determining the person actions represented by the behavior feature map of the person includes:
identifying key points of human bones in the behavior feature map of the person;
connecting the key points, and converting the connecting line into a vector distance;
and determining the action of the person represented by the behavior feature diagram of the person according to the vector distance.
For example, a Python program reads the video images captured by a 360° panoramic network camera and outputs the image frames containing person information. A deep-learning-based human behavior recognition algorithm then identifies the human behavior feature map in each frame and locates the key points of the human skeleton, including the head, shoulders, palms, and soles. The key points are connected, the lengths of the connecting lines are calculated, and the action represented by the feature map is determined by comparing these distances with the corresponding distances of a person standing, sitting, walking, and so on.
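Under the assumption that each action has a reference template of key-point distances, the keypoint-to-action step described above might look like the following sketch (the templates, key-point pairs, and coordinates are invented for illustration; a real system would derive them from the recognition model and training data):

```python
import math

# Hypothetical reference templates: for each action, the expected
# normalised distances between selected skeleton key-point pairs
# (here head-to-sole and shoulder-to-palm).
ACTION_TEMPLATES = {
    "standing": [1.00, 0.45],
    "sitting":  [0.62, 0.40],
    "walking":  [0.95, 0.55],
}

def keypoint_distances(keypoints, pairs):
    """keypoints: {name: (x, y)}; pairs: list of key-point name pairs.
    Returns the Euclidean length of each connecting line."""
    out = []
    for a, b in pairs:
        (xa, ya), (xb, yb) = keypoints[a], keypoints[b]
        out.append(math.hypot(xa - xb, ya - yb))
    return out

def classify_action(distances, templates=ACTION_TEMPLATES):
    """Pick the action whose template is closest (squared error) to the
    measured distances."""
    return min(templates,
               key=lambda act: sum((d - t) ** 2
                                   for d, t in zip(distances, templates[act])))
```

An upright pose with a slightly bent arm then classifies as "standing" because its distances sit nearest that template.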
And step S4, comparing the action of the personnel with the action types in the action library, and judging whether the personnel action is abnormal.
In an embodiment of the present invention, the content of the behavior action library may include at least one of a behavior action in a normal state, a behavior action duration in a normal state, a working range in a normal state, a behavior action in an abnormal state, a behavior action duration in an abnormal state, and a working range in an abnormal state.
For example, the abnormal-state behavior actions may include making a phone call, watching a mobile phone, smoking, making an alarm, and so on; the abnormal-state durations may specify that a phone call longer than 5 minutes or phone-watching longer than 3 minutes is abnormal; and the abnormal-state working range may specify areas where a certain category of person should not appear during a certain period. When a person's action is determined to be making a phone call and the recorded call time is 8 minutes, this time is compared with the abnormal behavior data; since a call exceeding 5 minutes is abnormal, the person's behavior is judged to be abnormal.
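A minimal sketch of the comparison with the behavior action library, assuming the library stores abnormal actions together with duration limits (the action names and limits follow the examples in the text; the data structure itself is hypothetical):

```python
# Hypothetical behavior action library: abnormal actions with the duration
# (in minutes) beyond which the behavior counts as abnormal. Actions mapped
# to None are normal and carry no limit.
ACTION_LIBRARY = {
    "making a call":    {"abnormal_after_min": 5},
    "watching a phone": {"abnormal_after_min": 3},
    "smoking":          {"abnormal_after_min": 0},
    "working":          None,
}

def is_behavior_abnormal(action, duration_min, library=ACTION_LIBRARY):
    """Compare the recognised action and its duration with the library;
    unknown or normal actions are treated as normal."""
    rule = library.get(action)
    if rule is None:
        return False
    return duration_min > rule["abnormal_after_min"]
```

With the patent's example figures, an 8-minute call exceeds the 5-minute limit and is flagged as abnormal.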
And step S5, if the personnel behavior is abnormal, outputting the personnel information.
For example, the obtained action category of the person is compared with the action categories in the action library and the person's behavior is judged to be abnormal; the person's information is retrieved to find the name, employee number, department, and work place, and the relevant personnel are notified of the abnormally behaving person by email, SMS, or telephone.
And step S6, if the personnel behavior is normal, continuing monitoring.
In an embodiment of the present invention, the obtained action category of the person is compared with the action category in the action library, and if the person is judged to be normal, the person is continuously monitored within the preset area range.
In another embodiment of the present invention, the method may further comprise: counting the duration of normal behavior and the duration of abnormal behavior of any person within a preset time, and calculating the person's work saturation, where the work saturation is the ratio of the duration of normal behavior to the total counted duration (normal plus abnormal) within the preset time.
The durations of normal and abnormal behavior of all persons on any post within a preset time are likewise counted, and the post's work saturation is calculated as the ratio of the total duration of normal behavior of all persons on the post to their total counted duration (normal plus abnormal) within the preset time.
For example, in one working day, employee Zhang San's normal behavior time is counted as 420 minutes and his abnormal behavior time as 80 minutes, so Zhang San's work saturation is 420 / (420 + 80) = 84%.
In one working day, the normal behavior time of the 5 employees on a machine-tool machining post totals 39 hours and their abnormal behavior time totals 1 hour, so the post's work saturation is 39 / 40 = 97.5%.
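Consistent with these figures, the work saturation computes as the normal duration divided by the total counted duration; a small sketch:

```python
def work_saturation(normal_minutes, abnormal_minutes):
    """Work saturation = normal duration / (normal + abnormal duration)."""
    total = normal_minutes + abnormal_minutes
    return normal_minutes / total if total else 0.0

def post_saturation(records):
    """records: (normal, abnormal) duration pairs, one per person on the post;
    the post's saturation pools all persons' durations."""
    return work_saturation(sum(n for n, _ in records),
                           sum(a for _, a in records))
```

Zhang San's 420 normal and 80 abnormal minutes give 0.84, matching the 84% in the text.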
In other embodiments of the present invention, the steps may further include: and allocating the personnel at different posts according to the working saturation of the posts.
For example, a factory has five production lines, each with an electric welding post. If the calculated work saturations of the welding posts on the different lines are 98%, 92%, 90%, 85%, and 75% respectively, employees can be reallocated from the posts with low work saturation to the posts with high work saturation.
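The reallocation rule can be sketched as follows, as a hypothetical heuristic that suggests moving staff from the least-saturated post to the most-saturated one (the gap threshold is an assumption, not taken from the patent):

```python
def reallocate(post_saturations, gap=0.10):
    """post_saturations: {post: saturation in [0, 1]}. Suggest a
    (source, destination) staff move from the least-saturated post to the
    most-saturated one when the spread exceeds the given gap; otherwise
    return None (no reallocation needed)."""
    busiest = max(post_saturations, key=post_saturations.get)
    idlest = min(post_saturations, key=post_saturations.get)
    if post_saturations[busiest] - post_saturations[idlest] > gap:
        return (idlest, busiest)
    return None
```

On the five-line example above, the 75% post would feed staff to the 98% post.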
Fig. 2 above describes the personnel monitoring method of the present invention in detail; the functional modules of the software device implementing the method and the hardware architecture implementing it are described below with reference to Figs. 3 and 4.
It is to be understood that the embodiments are illustrative only and that the scope of the claims is not limited to this configuration.
EXAMPLE III
FIG. 3 is a block diagram of a preferred embodiment of the people monitoring device of the present invention.
In some embodiments, the people monitoring device 10 operates in a computer device. The computer device is connected with a plurality of user terminals through a network. The people monitoring device 10 may comprise a plurality of functional modules consisting of program code segments. The program code of the various program segments in the people monitoring device 10 may be stored in a memory of a computer device and executed by the at least one processor to implement the people monitoring function.
In this embodiment, the people monitoring device 10 may be divided into a plurality of functional modules according to the functions performed by the people monitoring device. Referring to fig. 3, the functional modules may include: the device comprises an acquisition module 101, a first extraction module 102, a second extraction module 103, a judgment module 104, a first execution module 105 and a second execution module 106. The module referred to herein is a series of computer program segments capable of being executed by at least one processor and capable of performing a fixed function and is stored in memory. In the present embodiment, the functions of the modules will be described in detail in the following embodiments.
The acquisition module 101 is used for acquiring video images of persons in a preset area in real time from the camera device.
In an embodiment of the present invention, the preset area may be a factory, a construction site, a park, a school, or other public places.
In another embodiment of the present invention, the image capturing device may be a 360 ° panoramic network camera, and the 360 ° panoramic network camera may be installed at an entrance of a preset area, a main trunk of the preset area, and around the preset area according to actual needs, so as to obtain the optimal range of the whole preset area.
The first extraction module 102 is used for extracting face information of a person in the video images and determining the person's information according to the face information.
In an embodiment of the present invention, the step of extracting face information of a person in the video image, and determining the person information according to the face information may include:
extracting an image frame with the head portrait of the human face in a video image;
extracting the face feature information of the personnel by a face recognition method;
and searching the personnel information matched with the face feature information in a preset database.
The person information includes, but is not limited to, name, employee number, department, and work place.
The preset database comprises face feature information of all the persons in the preset area and person information corresponding to the face feature information.
In another embodiment of the present invention, the device further judges whether the person is within a normal activity range according to the person's information;
and if the person is not within the normal activity range, issues an abnormality alarm.
The abnormality alarm may be sent to preset personnel by email, SMS, telephone, or instant message.
The step of determining whether the person is in a normal range of motion may comprise:
acquiring image frames with the personnel, which are acquired by a plurality of camera devices, according to a time sequence;
marking the position of a camera device for acquiring the image frame of the person on a map of a preset area;
outputting the movement track of the personnel according to the marked position;
and judging whether the movement track of the personnel is in a normal movement range or not according to the personnel information.
For example, face information is extracted from the video, the person's facial features are extracted by a face recognition method and compared with the records in the preset person information database, and the matched person information is output: the person is Zhang San, the department is Manufacturing Team 1, and the work place is the second floor of the manufacturing building. The image frames containing Zhang San's facial features captured by the plurality of camera devices are then acquired in time order; the positions of the camera devices that captured those frames are marked on the map of the preset area; and the person's movement track is output according to the marked positions, which show, for example, that the person is now on the first floor of the building. Whether this movement track lies within the person's normal activity range is then judged, and if the person has left that range, an abnormality alarm is sent by SMS, email, or other means.
The second extraction module 103 is used for extracting a behavior feature map of the person in the video images and determining the person's action according to the behavior feature map.
In an embodiment of the present invention, the step of extracting a behavior feature map of a person in the video image, and determining the action of the person according to the behavior feature map includes:
extracting image frames with personnel information in the video images;
identifying a behavior feature map of the person in the image frame by using a human behavior identification algorithm;
and determining the action of the person represented by the behavior characteristic diagram of the person according to the behavior characteristic diagram of the person.
The human behavior recognition algorithm includes, but is not limited to, a human behavior recognition algorithm based on machine vision, and a human behavior recognition algorithm based on deep learning.
In yet another embodiment of the present invention, a programming language may be used to read image frames in the video image and output the image frames.
In an embodiment of the present invention, the step of determining the person actions represented by the behavior feature map of the person includes:
identifying key points of human bones in the behavior feature map of the person;
connecting the key points, and converting the connecting line into a vector distance;
and determining the action of the person represented by the behavior feature diagram of the person according to the vector distance.
For example, a video image captured by a 360-degree panoramic network camera is read by a Python program to obtain the image frames containing person information, and those image frames are output. A deep-learning-based human behavior recognition algorithm is then used to identify the human behavior feature map in each frame and the key points of the human skeleton in that map; the key points include the head, shoulders, palms, and soles. The key points are connected, the distances of the connecting lines are calculated, and the action represented by the feature map is determined by comparing those distances with the corresponding distances for reference actions such as standing, sitting, and walking.
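The keypoint-distance idea in this example can be sketched as follows; the keypoint names, reference distances, and nearest-reference matching rule are illustrative assumptions rather than the disclosed algorithm.

```python
import math

# Hypothetical sketch of the keypoint-distance idea: connect skeleton key
# points, measure the connecting distances, and pick the reference pose
# whose distances match best. Keypoint names and reference values are
# illustrative, not the output of a real pose-estimation model.

def limb_distances(keypoints):
    """keypoints: dict name -> (x, y). Returns head-to-sole and hip-to-sole distances."""
    def dist(a, b):
        (ax, ay), (bx, by) = keypoints[a], keypoints[b]
        return math.hypot(ax - bx, ay - by)
    return {"head_sole": dist("head", "sole"), "hip_sole": dist("hip", "sole")}

REFERENCE_POSES = {  # assumed reference distances (meters) for each action
    "standing": {"head_sole": 1.7, "hip_sole": 0.9},
    "sitting":  {"head_sole": 1.2, "hip_sole": 0.4},
}

def classify_action(keypoints):
    # choose the reference pose with the smallest squared-distance mismatch
    d = limb_distances(keypoints)
    return min(REFERENCE_POSES, key=lambda pose: sum(
        (d[k] - v) ** 2 for k, v in REFERENCE_POSES[pose].items()))

pose = {"head": (0.0, 1.7), "hip": (0.0, 0.9), "sole": (0.0, 0.0)}
print(classify_action(pose))  # prints "standing"
```

A production system would take the keypoints from a pose-estimation network rather than hand-written coordinates.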
The judging module 104: used for comparing the action of the person with the action categories in the action library and judging whether the action of the person is abnormal.
In an embodiment of the present invention, the content of the behavior action library may include at least one of a behavior action in a normal state, a behavior action duration in a normal state, a working range in a normal state, a behavior action in an abnormal state, a behavior action duration in an abnormal state, and a working range in an abnormal state.
For example, the behavior actions of the abnormal state may include making a phone call, looking at a mobile phone, smoking, sounding an alarm, and the like; the durations of abnormal-state behavior actions may include a call lasting more than 5 minutes or mobile-phone watching lasting more than 3 minutes; and the working range of the abnormal state may include an area where a certain kind of person should not appear during a certain period of time. When the action of a certain person is determined to be making a call and the recorded call time of the person is 8 minutes, this is compared with the data in the abnormal behavior library; since a call exceeding 5 minutes is abnormal, the behavior of the person is judged to be abnormal.
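A minimal sketch of this duration comparison, assuming a per-action threshold table like the one in the example (the action names and threshold values are illustrative):

```python
# Hypothetical sketch: compare an observed action and its duration against
# per-action thresholds taken from the behavior action library. The values
# mirror the example above (call > 5 min, phone-watching > 3 min).

ABNORMAL_DURATION_MINUTES = {
    "making_a_call": 5,   # abnormal once a call exceeds 5 minutes
    "watching_phone": 3,  # abnormal once phone-watching exceeds 3 minutes
    "smoking": 0,         # always abnormal, so any duration exceeds it
}

def is_abnormal(action, duration_minutes):
    """An action is abnormal once it exceeds its library threshold."""
    threshold = ABNORMAL_DURATION_MINUTES.get(action)
    if threshold is None:
        return False          # action not listed in the abnormal library
    return duration_minutes > threshold

print(is_abnormal("making_a_call", 8))   # prints True  (8 min > 5 min)
print(is_abnormal("making_a_call", 4))   # prints False
```

The 8-minute call from the example is flagged, while a 4-minute call passes.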
The first execution module 105: used for outputting the person information when the behavior of the person is abnormal.
For example, the obtained action category of the person is compared with the action categories in the action library, and the action is judged to be abnormal; the person information is then retrieved to find the person's name, employee number, department, and work place, and the relevant personnel are notified of the person with abnormal behavior by mail, short message, or telephone.
The second execution module 106: used for continuing monitoring when the behavior of the person is normal.
In an embodiment of the present invention, the obtained action category of the person is compared with the action categories in the action library; if the behavior is judged to be normal, the person continues to be monitored within the preset area.
In another embodiment of the present invention, the steps may further include: counting the duration of normal behavior and the duration of abnormal behavior of any person within a preset time, and calculating the work saturation of the person, where, consistent with the worked examples that follow, the work saturation of the person is the ratio of the person's normal-behavior duration to the total duration (normal plus abnormal) within the preset time.
Similarly, the durations of normal behavior and abnormal behavior of all persons at any post within a preset time are counted, and the work saturation of the post is calculated as the ratio of the total normal-behavior duration of all persons at the post to their total duration (normal plus abnormal) within the preset time.
For example, in one working day, the counted normal-behavior time of employee Zhang San is 420 minutes and the abnormal-behavior time is 80 minutes, so the work saturation of Zhang San is 420/(420+80) = 84%.
In one working day, the sum of the normal-behavior time of the 5 employees at a machine-tool machining post is 39 hours and the sum of the abnormal-behavior time is 1 hour, so the work saturation of the post is 39/40 = 97.5%.
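Following the two worked examples, the saturation statistic can be sketched as normal time over total monitored time; the function names and the per-employee tuples are illustrative.

```python
from typing import Iterable, Tuple

# Hypothetical sketch of the saturation statistic. Per the worked examples
# (420/(420+80) = 84%, 39/(39+1) = 97.5%), saturation is computed as
# normal time divided by total monitored time.

def work_saturation(normal: float, abnormal: float) -> float:
    """Saturation of one person: normal time over total time."""
    total = normal + abnormal
    return normal / total if total else 0.0

def post_saturation(durations: Iterable[Tuple[float, float]]) -> float:
    """durations: (normal, abnormal) pairs, one per employee on the post."""
    normal = sum(n for n, _ in durations)
    abnormal = sum(a for _, a in durations)
    return work_saturation(normal, abnormal)

print(work_saturation(420, 80))                  # prints 0.84   (Zhang San)
print(post_saturation([(8, 0)] * 4 + [(7, 1)]))  # 39 h normal, 1 h abnormal: 0.975
```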
In other embodiments of the present invention, the steps may further include: allocating personnel among different posts according to the work saturation of the posts.
For example, a factory has five production lines, each of which has an electric welding post, and the calculated work saturations of the welding posts on the different lines are 98%, 92%, 90%, 85%, and 75%, respectively; employees at the posts with low work saturation can then be reallocated to the posts with high work saturation.
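One possible sketch of the allocation step, assuming a simple policy that pairs the least-saturated posts with the most-saturated ones; the line names and the one-for-one pairing rule are assumptions, not the disclosed method.

```python
# Hypothetical sketch of the allocation step: rank posts by saturation and
# pair each of the least-loaded posts (donors) with one of the busiest
# posts (receivers). Line names and pairing policy are illustrative.

def reallocation_suggestions(saturation_by_post):
    """Return (donor_post, receiver_post) pairs, low saturation feeding high."""
    ranked = sorted(saturation_by_post.items(), key=lambda kv: kv[1])
    half = len(ranked) // 2
    donors = ranked[:half]           # lowest saturation first
    receivers = ranked[::-1][:half]  # highest saturation first
    return [(d, r) for (d, _), (r, _) in zip(donors, receivers)]

welding = {"line1": 0.98, "line2": 0.92, "line3": 0.90, "line4": 0.85, "line5": 0.75}
print(reallocation_suggestions(welding))  # prints [('line5', 'line1'), ('line4', 'line2')]
```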
Embodiment Four
FIG. 4 is a diagram of a computer device according to a preferred embodiment of the present invention.
The computer device 1 comprises a memory 20, a processor 30, and a computer program 40, such as a person monitoring program, stored in the memory 20 and executable on the processor 30. When executing the computer program 40, the processor 30 implements the steps of the above-described embodiment of the person monitoring method, such as steps S1-S6 shown in FIG. 2. Alternatively, when executing the computer program 40, the processor 30 implements the functions of the modules/units in the aforementioned embodiment of the person monitoring device, such as units 101-106 in FIG. 3.
Illustratively, the computer program 40 may be partitioned into one or more modules/units that are stored in the memory 20 and executed by the processor 30 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments describing the execution process of the computer program 40 in the computer device 1. For example, the computer program 40 may be divided into the obtaining module 101, the first extracting module 102, the second extracting module 103, the judging module 104, the first executing module 105, and the second executing module 106 in FIG. 3; the functions of the respective modules are described in the third embodiment.
The computer device 1 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device. It will be appreciated by those skilled in the art that the schematic diagram is merely an example of the computer device 1 and does not constitute a limitation of the computer device 1, which may comprise more or fewer components than those shown, combine some components, or have different components; for example, the computer device 1 may further comprise input and output devices, a network access device, a bus, and the like.
The processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor 30 may be any conventional processor; the processor 30 is the control center of the computer device 1, with various interfaces and lines connecting the various parts of the overall computer device 1.
The memory 20 may be used for storing the computer program 40 and/or the modules/units, and the processor 30 implements various functions of the computer device 1 by running or executing the computer program and/or modules/units stored in the memory 20 and calling data stored in the memory 20. The memory 20 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function (such as a sound playing function or an image playing function); the data storage area may store data created according to the use of the computer device 1 (such as audio data or a phonebook). In addition, the memory 20 may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash memory card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The modules/units integrated in the computer device 1, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, can implement the steps of the above-described method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, legislation and patent practice provide that computer-readable media do not include electrical carrier signals and telecommunications signals.
In the embodiments provided in the present invention, it should be understood that the disclosed computer apparatus and method can be implemented in other ways. For example, the above-described embodiments of the computer apparatus are merely illustrative, and for example, the division of the units is only one logical function division, and there may be other divisions when the actual implementation is performed.
In addition, the functional units in the embodiments of the present invention may be integrated into the same processing unit, or each unit may exist alone physically, or two or more units may be integrated into the same unit. The integrated unit can be implemented in the form of hardware, or in the form of hardware plus software functional modules.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from its spirit or essential attributes. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description; all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. The units or computer devices recited in the device claims may also be implemented by the same unit or computer device, either in software or in hardware. The terms first, second, etc. are used to denote names and do not denote any particular order.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from their spirit and scope.

Claims (12)

1. A people monitoring method, the method comprising:
acquiring a video image of a person in a preset area in real time from a camera device;
extracting face information of people in the video image, and determining the person information according to the face information;
extracting a behavior feature map of a person in the video image, and determining the action of the person according to the behavior feature map;
comparing the action of the personnel with the action categories in the action library to judge whether the personnel action is abnormal;
if the personnel behavior is abnormal, outputting the personnel information;
and if the personnel act normally, continuing monitoring.
2. The people monitoring method of claim 1, further comprising:
judging whether the person is within a normal activity range according to the person information;
and if the person is not within the normal activity range, issuing an abnormality alarm.
3. The people monitoring method of claim 2, wherein the step of judging whether the person is within a normal activity range comprises:
acquiring image frames with the personnel, which are acquired by a plurality of camera devices, according to a time sequence;
marking the position of a camera device for acquiring the image frame of the person on a map of a preset area;
outputting the movement track of the personnel according to the marked position;
and judging whether the movement track of the person is within the normal activity range according to the person information.
4. The people monitoring method of claim 1, further comprising:
counting the duration time of normal behaviors and the duration time of abnormal behaviors of any person within a preset time, and calculating the working saturation of the person;
and counting the duration of normal behaviors and the duration of abnormal behaviors of all the personnel in any post within a preset time, and calculating the working saturation of the post.
5. The people monitoring method of claim 4, wherein the method further comprises:
and allocating the personnel at different posts according to the working saturation of the posts.
6. The person monitoring method according to claim 1, wherein the step of extracting face information of a person in the video image and determining the person information according to the face information comprises:
extracting image frames containing a face image from the video image;
extracting the face feature information of the personnel by a face recognition method;
and searching the personnel information matched with the face feature information in a preset database.
7. The people monitoring method according to claim 1, wherein the step of extracting a behavior feature map of the people in the video image and determining the actions of the people according to the behavior feature map comprises:
extracting image frames with personnel information in the video images;
identifying a behavior feature map of the person in the image frame by using a human behavior identification algorithm;
and determining the action of the person represented by the behavior characteristic diagram of the person according to the behavior characteristic diagram of the person.
8. The people monitoring method of claim 7, wherein the step of determining the actions of the person represented by the behavioral profile of the person comprises:
identifying key points of human bones in the behavior feature map of the person;
connecting the key points, and converting the connecting line into a vector distance;
and determining the action of the person represented by the behavior feature diagram of the person according to the vector distance.
9. The people monitoring method according to claim 1, wherein the content of the action library includes at least one of a behavior action in a normal state, a behavior action duration in a normal state, a working range in a normal state, a behavior action in an abnormal state, a behavior action duration in an abnormal state, and a working range in an abnormal state.
10. A people monitoring device, characterized in that the device comprises:
an acquisition module: used for acquiring a video image of persons in a preset area in real time from a camera device;
a first extraction module: used for extracting face information of persons in the video image and determining the person information according to the face information;
a second extraction module: used for extracting a behavior feature map of a person in the video image and determining the action of the person according to the behavior feature map;
a judging module: used for comparing the action of the person with the action categories in the action library and judging whether the action of the person is abnormal;
a first execution module: used for outputting the person information when the behavior of the person is abnormal;
a second execution module: used for continuing monitoring when the behavior of the person is normal.
11. A computer device, characterized in that: the computer device comprises a processor, and the processor is used for implementing the person monitoring method according to any one of claims 1-9 when executing a computer program stored in a memory.
12. A computer-readable storage medium having a computer program stored thereon, characterized in that: the computer program, when executed by a processor, implements the person monitoring method according to any one of claims 1-9.
CN201910465988.1A 2019-05-30 2019-05-30 Personnel monitoring method and device, computer device and readable storage medium Pending CN112016363A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910465988.1A CN112016363A (en) 2019-05-30 2019-05-30 Personnel monitoring method and device, computer device and readable storage medium

Publications (1)

Publication Number Publication Date
CN112016363A true CN112016363A (en) 2020-12-01

Family

ID=73501988

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910465988.1A Pending CN112016363A (en) 2019-05-30 2019-05-30 Personnel monitoring method and device, computer device and readable storage medium

Country Status (1)

Country Link
CN (1) CN112016363A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113435380A (en) * 2021-07-06 2021-09-24 北京市商汤科技开发有限公司 Method and device for detecting people and sentry matching, computer equipment and storage medium
CN114529864A (en) * 2021-12-30 2022-05-24 东莞先知大数据有限公司 Method and device for detecting shoreside smuggling behavior and storage medium
CN114666546A (en) * 2022-03-24 2022-06-24 中国铁塔股份有限公司江苏省分公司 Monitoring method and device for communication iron tower and communication iron tower
CN115099724A (en) * 2022-08-24 2022-09-23 中达安股份有限公司 Monitoring and early warning method, device and equipment for construction scene and storage medium
CN116363575A (en) * 2023-02-15 2023-06-30 南京诚勤教育科技有限公司 Classroom monitoring management system based on wisdom campus
CN116434296A (en) * 2023-03-02 2023-07-14 深圳市华方信息产业有限公司 Real-time face recognition monitoring behavior method, device, equipment and medium
CN117151959A (en) * 2023-10-16 2023-12-01 广东紫慧旭光科技有限公司 Real-time video analysis method, system and storage medium for city management

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101577812A (en) * 2009-03-06 2009-11-11 北京中星微电子有限公司 Method and system for post monitoring
US20110050875A1 (en) * 2009-08-26 2011-03-03 Kazumi Nagata Method and apparatus for detecting behavior in a monitoring system
CN104618685A (en) * 2014-12-29 2015-05-13 国家电网公司 Intelligent image analysis method for power supply business hall video monitoring
CN105138947A (en) * 2014-05-30 2015-12-09 由田新技股份有限公司 Guard reminding method, reminding device and reminding system
CN107545224A (en) * 2016-06-29 2018-01-05 珠海优特电力科技股份有限公司 The method and device of transformer station personnel Activity recognition
CN108664608A (en) * 2018-05-11 2018-10-16 中国联合网络通信集团有限公司 Recognition methods, device and the computer readable storage medium of a suspect
CN109145804A (en) * 2018-08-15 2019-01-04 深圳市烽焌信息科技有限公司 Behavior monitoring method and robot
CN109190560A (en) * 2018-08-31 2019-01-11 辽宁奇辉电子系统工程有限公司 It a kind of railway signal building based on face recognition technology relieves and anti-tired sleeps system
CN109413369A (en) * 2017-08-17 2019-03-01 孟思宏 A kind of monitor video intellectual analysis early warning platform
CN109657626A (en) * 2018-12-23 2019-04-19 广东腾晟信息科技有限公司 A kind of analysis method by procedure identification human body behavior

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113435380A (en) * 2021-07-06 2021-09-24 北京市商汤科技开发有限公司 Method and device for detecting people and sentry matching, computer equipment and storage medium
CN114529864A (en) * 2021-12-30 2022-05-24 东莞先知大数据有限公司 Method and device for detecting shoreside smuggling behavior and storage medium
CN114666546A (en) * 2022-03-24 2022-06-24 中国铁塔股份有限公司江苏省分公司 Monitoring method and device for communication iron tower and communication iron tower
CN114666546B (en) * 2022-03-24 2023-06-23 中国铁塔股份有限公司江苏省分公司 Monitoring method and device for communication iron tower and communication iron tower
CN115099724A (en) * 2022-08-24 2022-09-23 中达安股份有限公司 Monitoring and early warning method, device and equipment for construction scene and storage medium
CN116363575A (en) * 2023-02-15 2023-06-30 南京诚勤教育科技有限公司 Classroom monitoring management system based on wisdom campus
CN116363575B (en) * 2023-02-15 2023-11-03 南京诚勤教育科技有限公司 Classroom monitoring management system based on wisdom campus
CN116434296A (en) * 2023-03-02 2023-07-14 深圳市华方信息产业有限公司 Real-time face recognition monitoring behavior method, device, equipment and medium
CN116434296B (en) * 2023-03-02 2024-08-20 深圳市华方信息产业有限公司 Real-time face recognition monitoring behavior method, device, equipment and medium
CN117151959A (en) * 2023-10-16 2023-12-01 广东紫慧旭光科技有限公司 Real-time video analysis method, system and storage medium for city management
CN117151959B (en) * 2023-10-16 2024-04-16 广东紫慧旭光科技有限公司 Real-time video analysis method, system and storage medium for city management

Similar Documents

Publication Publication Date Title
CN112016363A (en) Personnel monitoring method and device, computer device and readable storage medium
CN109409238B (en) Obstacle detection method and device and terminal equipment
CN108846911A (en) A kind of Work attendance method and device
CN108681871B (en) Information prompting method, terminal equipment and computer readable storage medium
CN113128437A (en) Identity recognition method and device, electronic equipment and storage medium
CN112949157B (en) Fire data analysis method, device, computer device and storage medium
CN112489236B (en) Attendance data processing method and device, server and storage medium
CN113542692A (en) Face recognition system and method based on monitoring video
CN109767369A (en) A kind of work attendance statistics method, system and terminal device
CN111382986B (en) Student management method and device, computer device and computer readable storage medium
CN110674834A (en) Geo-fence identification method, device, equipment and computer-readable storage medium
CN114022841A (en) Personnel monitoring and identifying method and device, electronic equipment and readable storage medium
CN111178816A (en) Dormitory monitoring management method and device, electronic equipment and storage medium
Shinde et al. Design and development of geofencing based attendance system for mobile application
CN114338915A (en) Caller ID risk identification method, caller ID risk identification device, caller ID risk identification equipment and storage medium
CN109801394B (en) Staff attendance checking method and device, electronic equipment and readable storage medium
CN112183380A (en) Passenger flow volume analysis method and system based on face recognition and electronic equipment
CN111288998A (en) Map drawing method and device, storage medium and electronic device
CN111372197B (en) Early warning method and related device
CN115906905A (en) Application method and device of two-dimensional code
CN112487175A (en) Exhibitor flow control method, exhibitor flow control device, server and computer-readable storage medium
CN110321495B (en) Method, device, computer equipment and storage medium for pushing active message
TW202102019A (en) Monitoring method, device, computer device and readable storage medium
CN113012006A (en) Intelligent investigation and research method, system, computer equipment and storage medium
CN115659302B (en) Method and device for determining missing detection personnel, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201201