CN114782874A - Anti-epidemic protection article wearing behavior standard detection method based on human body posture - Google Patents

Anti-epidemic protection article wearing behavior standard detection method based on human body posture

Info

Publication number
CN114782874A
CN114782874A
Authority
CN
China
Prior art keywords
wearing
protective
behavior
articles
category
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210525955.3A
Other languages
Chinese (zh)
Inventor
章东平 (Zhang Dongping)
雷羽文 (Lei Yuwen)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Haoqing Technology Co ltd
China University of Metrology
Original Assignee
Hangzhou Haoqing Technology Co ltd
China University of Metrology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Haoqing Technology Co ltd, China University of Metrology filed Critical Hangzhou Haoqing Technology Co ltd
Priority to CN202210525955.3A
Publication of CN114782874A
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/23: Clustering techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the technical fields of deep learning, behavior recognition and image recognition, and discloses a method, based on human body posture, for detecting whether anti-epidemic protective articles are put on according to the standard procedure. The method comprises: 1. performing behavior recognition on the video stream with a protective-clothing wearing-behavior recognition algorithm; 2. detecting the protective-article category in the current frame with a protective-article target detection algorithm; 3. maintaining a behavior-occurrence time list and a dynamic target detection algorithm, detecting whether the wearing process is standard, and feeding any step that does not meet the wearing standard back to the background, which issues a voice alarm. The method uses deep learning, behavior recognition and image recognition to learn from human key-point features, obtains a step classification of the protective-clothing wearing behavior, and, combined with a target detection algorithm, detects whether the protective-clothing wearing process is standard. It can quickly check whether protective articles are put on correctly, reduces the infection risk of medical personnel and related workers, and provides strong support for epidemic prevention work.

Description

Method for detecting wearing behavior specification of epidemic-resistant protective product based on human body posture
Technical Field
The invention belongs to the technical fields of deep learning, behavior recognition and image recognition, and in particular relates to a method for detecting, based on human body posture, whether anti-epidemic protective articles are worn according to the standard procedure.
Background
During a virus pandemic, medical personnel and related staff must put on protective articles correctly and in the standard order when carrying out epidemic prevention work, in order to effectively prevent the spread of the epidemic. At present, whether protective clothing is worn correctly is mainly checked by manual supervision, which cannot verify in real time that the actions during donning, and their order, meet the standard requirements.
However, with the continuous progress of deep learning, machine vision, image processing and related technologies, the characteristics of protective-clothing wearing behavior can be learned with these key technologies, and the normativity of the behavior can be detected in real time in combination with other computational methods, finally solving the above problems.
Disclosure of Invention
The invention aims to provide a method for detecting, based on human body posture, whether anti-epidemic protective articles are put on according to the standard procedure, so as to solve the above technical problems.
In order to solve the above technical problems, the specific technical scheme of the method is as follows:
A method for detecting the wearing-behavior standard of anti-epidemic protective articles based on human body posture comprises the following steps:
step 1: determining a correct wearing standard flow of the epidemic prevention article, and collecting videos of wearing the article by related personnel;
and 2, step: editing the video into a plurality of standard behavior video categories, and cleaning and labeling video category data to form a training, verifying and testing data set of the video worn by the protective clothing;
and step 3: constructing an anti-epidemic protective clothing wearing behavior step identification algorithm based on human body postures;
and 4, step 4: performing target detection on the epidemic prevention article by using a MobileNet algorithm;
and 5: the method comprises the following steps that a high-definition monitoring camera collects N frame data in real time and inputs the data into a protective clothing wearing behavior step recognition algorithm, and whether a video belongs to a standard behavior step category is recognized;
and 6: inputting a real-time video stream collected by a monitoring camera in a wearing area of the epidemic prevention article, detecting a current frame in real time by using a MobileNet target detection algorithm, setting a target dynamic detection algorithm for the type detection of the article, and detecting whether the wearing flow of the article is standard or not.
Further, the standard procedure for correctly putting on the anti-epidemic protective articles in step 1 is: behavior step 1: hand hygiene; behavior step 2: put on a disposable cap; behavior step 3: put on a medical protective mask; behavior step 4: put on protective clothing; behavior step 5: put on goggles; behavior step 6: put on gloves; behavior step 7: put on boot covers; behavior step 8: put on an isolation gown; behavior step 9: put on shoe covers.
Further, in step 2 the videos are clipped into the following video categories:
standard hand hygiene; standard donning of a disposable cap; standard donning of a medical protective mask; standard donning of protective clothing; standard donning of goggles; standard donning of gloves; standard donning of boot covers; standard donning of an isolation gown; standard donning of shoe covers; non-standard hand hygiene; non-standard donning of a disposable cap; non-standard donning of a medical protective mask; non-standard donning of protective clothing; non-standard donning of goggles; non-standard donning of gloves; non-standard donning of boot covers; non-standard donning of an isolation gown; and non-standard donning of shoe covers. The video category data are cleaned and labeled to form training, validation and test data sets of protective-clothing wearing videos.
Further, in step 3 the protective-clothing wearing video data set obtained in step 2 is sparsely sampled, T frames are obtained from each video, and the algorithm is trained on this frame data to obtain a protective-clothing wearing-behavior step recognition algorithm that meets the recognition-rate requirement. The specific steps of the recognition algorithm are as follows:
Step 3.1: prepare the training and test data: collect videos from the monitoring of the clean protective-article donning area, and divide the video data set into a training set, a validation set and a test set in a 6:2:2 ratio; uniformly divide each video of the data set of step 2 into T segments, and randomly extract one frame from each segment to obtain the network input data I = [I_1, …, I_T];
Step 3.2: designing a behavior recognition network structure:
step 3.2.1: performing sparse sampling to obtain T-frame pictures, wherein the resolution of the pictures is 224 multiplied by 224, performing adjacent 3-frame differential calculation by using a sliding window with the step length of 3 to obtain a T-2 frame motion differential gray-scale image, and inputting the T-2 frame motion differential gray-scale image after stacking to perform 3 multiplied by 3 convolution to extract video motion characteristics;
step 3.2.2: inputting a T frame picture into a central point network, and extracting 17 key points of human body posture, wherein the 17 key points comprise: the central point network backbone network uses a hourglass network, each frame of picture obtains 17 key point heat maps, the two-dimensional posture size is expressed as 17 x 56, and the key point heat map J of each frame of picture is extracted;
The formula for J is:

J_k(i, j) = c_k · exp(−((i − x_k)² + (j − y_k)²) / (2σ²))

where k indexes the key points, (i, j) are pixel coordinates, (x_k, y_k) are the coordinates of key point k, c_k is its confidence, and σ is the spread of the Gaussian response;
Step 3.2.3: stack the motion features with the key-point heat maps, input them into a 3D convolutional neural network to extract posture features, and obtain the behavior-step category through a softmax classification layer;
Step 3.3: model training and testing: feed the training data into the behavior recognition network, extract the joint heat maps, crop the heat maps to 56×56 with a 10-crop scheme, and fine-tune the model parameters to obtain a protective-clothing wearing-behavior step recognition algorithm that meets the recognition-rate requirement.
Further, in step 4 a protective-article target detection data set is constructed from the video frames obtained by sparse sampling in step 3, with the article categories consistent with those appearing in the videos of step 2: category 1: rinse-free hand sanitizer; category 2: disposable cap; category 3: medical protective mask; category 4: protective clothing; category 5: goggles; category 6: gloves; category 7: boot covers; category 8: isolation gown; category 9: shoe covers. The MobileNet algorithm is trained on this data set to obtain a protective-article target detection algorithm that meets the recognition-rate requirement.
Further, in step 5 the step recognition algorithm identifies whether the video belongs to a standard behavior-step category. If it belongs to a non-standard behavior-step category, a voice alarm "behavior step XX is non-standard" is output directly; if it belongs to a standard behavior-step category, a time-ordered list of behavior occurrences is obtained, specifically:
Step 5.1: input the N video frames uniformly sampled from I_{t−N−1} to I_t into the protective-clothing wearing-behavior recognition algorithm of step 3; the softmax layer outputs the behavior-category confidence of this clip. A confidence threshold γ is set; normally the behavior-step category with the highest confidence is output, and if all softmax values are below the threshold, no behavior is output. The occurrence time of the step behavior is recorded as t_i^j, where i denotes the behavior category and j = 1, 2, …, J_i counts the times behavior i has been consecutively detected.
Step 5.2: store the behavior occurrence times t_i^j detected by the algorithm, and, according to the standard procedure of the protective-clothing wearing behavior, set the required time order of the t_i^j. If all behavior categories are present and the behavior occurrence times satisfy the required order, proceed to the next step; if either requirement is not met, the wearing is judged non-standard and a "wearing procedure non-standard" voice alarm is fed back to the background.
Further, if the result of the dynamic target detection algorithm meets the required dynamic-presence order of the protective articles, and the behavior-occurrence time list meets the required order of occurrence times, the wearing procedure is reported as standard; otherwise a "wearing procedure non-standard" voice alarm is fed back directly to the background. The specific steps are: input the real-time video stream into the protective-article target detection algorithm of step 4 to detect the protective articles, and, according to the order in which the article categories should appear, set the required dynamic-presence order of the category targets as: category 1, category 2, category 3, category 4, category 5, category 6, category 7, category 8, category 9;
the dynamic target detection algorithm first checks whether category 1 is present; if not, it performs protective-article target detection on the next frame until category 1 appears. Once category 1 has been detected, it checks whether category 2 appears; if category 2 appears, it goes on to check for category 3, and if not, it continues target detection on subsequent frames until category 2 appears. By analogy, once category n has been detected, only category n+1 needs to be looked for: if category n+1 is present, detection moves on to the next category; if not, the algorithm keeps checking the input video frames for category n+1, and proceeds to the next category only after category n+1 has been detected;
when all protective-article categories have appeared in the required dynamic-presence order, the anti-epidemic protective articles have been put on according to the standard procedure. If the target detection result meets the required dynamic-presence order, and step 5.2 meets the required order of behavior occurrence times, the wearing is judged standard; if either requirement is not met, the wearing behavior is judged non-standard and a "wearing behavior non-standard" voice alarm is fed back to the background.
The method for detecting the wearing-behavior standard of anti-epidemic protective articles based on human body posture has the following advantages: it uses deep learning, behavior recognition and image recognition to learn from human key-point features, obtains a step classification of the protective-clothing wearing behavior, and, combined with a target detection algorithm, detects whether the protective-clothing wearing process is standard. The method can quickly check whether protective articles are put on correctly, greatly reduces the infection risk of medical personnel and related workers, and provides strong support for epidemic prevention work.
Drawings
FIG. 1 is a flow chart of the detection of protective-article wearing-behavior standards based on human body posture according to the invention;
FIG. 2 is a schematic diagram of the structure of the protective-clothing wearing-behavior recognition algorithm of the invention;
FIG. 3 is a block diagram of the 3D convolutional neural network of the behavior recognition algorithm of the invention;
FIG. 4 is a flow chart of the dynamic target detection algorithm of the invention.
Detailed Description
In order to better understand the purpose, structure and function of the invention, the method for detecting the wearing-behavior standard of anti-epidemic protective articles based on human body posture is described in detail below with reference to the accompanying drawings.
As shown in figure 1, the method comprises the following steps:
Step 1: the correct standard procedure for putting on the anti-epidemic protective articles is: behavior step 1: hand hygiene; behavior step 2: put on a disposable cap; behavior step 3: put on a medical protective mask; behavior step 4: put on protective clothing; behavior step 5: put on goggles; behavior step 6: put on gloves; behavior step 7: put on boot covers; behavior step 8: put on an isolation gown; behavior step 9: put on shoe covers. Collect videos of the relevant personnel putting on the protective articles.
Step 2: the videos are clipped into the 18 video categories listed above, one standard and one non-standard category for each of the nine wearing behaviors, and the video category data are cleaned and labeled to form training, validation and test data sets of protective-clothing wearing videos.
Step 3: construct the human-posture-based recognition algorithm for the steps of the anti-epidemic protective-clothing wearing behavior. The protective-clothing wearing video data set obtained in step 2 is sparsely sampled, T frames are obtained from each video, and the algorithm is trained on this frame data to obtain a protective-clothing wearing-behavior step recognition algorithm that meets the recognition-rate requirement. As shown in fig. 2, the specific steps of the recognition algorithm are as follows:
Step 3.1: prepare the training and test data: collect videos from the monitoring of the clean protective-article donning area, and divide the video data set into a training set, a validation set and a test set in a 6:2:2 ratio. Uniformly divide each video of the data set of step 2 into T segments, and randomly extract one frame from each segment to obtain the network input data I = [I_1, …, I_T].
Step 3.2: design the behavior recognition network structure:
Step 3.2.1: sparse sampling yields T frames at a resolution of 224×224; compute adjacent 3-frame differences with a sliding window of step length 3 to obtain T−2 motion-difference grayscale maps, stack them, and feed them through a 3×3 convolution to extract the video motion features;
Step 3.2.2: input the T frames into a center-point network (CenterNet) and extract the 17 human-posture key points (nose, left eye, right eye, left ear, right ear, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left hip, right hip, left knee, right knee, left ankle and right ankle). The backbone of the center-point network is an hourglass network; 17 key-point heat maps are obtained for each frame, the two-dimensional posture is represented as 17×56×56, and the key-point heat map J of each frame is extracted;
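A minimal sketch of the adjacent-frame differencing in step 3.2.1. Note an ambiguity in the patent: a window step length of 3 would not yield T−2 maps, so this sketch assumes the 3-frame window actually slides by one frame; summing the two absolute adjacent differences per window is likewise an illustrative choice.

```python
import numpy as np

def motion_difference_maps(frames):
    """frames: (T, H, W) grayscale array.  For each 3-frame window
    (sliding by one frame), sum the absolute differences of adjacent
    frames, yielding T-2 motion-difference grayscale maps."""
    f = np.asarray(frames, dtype=np.float32)
    T = f.shape[0]
    return np.stack([np.abs(f[t + 1] - f[t]) + np.abs(f[t + 2] - f[t + 1])
                     for t in range(T - 2)])
```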
The formula for J is:

J_k(i, j) = c_k · exp(−((i − x_k)² + (j − y_k)²) / (2σ²))

where k indexes the key points, (i, j) are pixel coordinates, (x_k, y_k) are the coordinates of key point k, c_k is its confidence, and σ is the spread of the Gaussian response.
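Assuming the Gaussian heat-map form above, one key point can be rendered as follows. The 56×56 default matches the cropping in step 3.3; σ is a free parameter the patent does not fix, and the function names are illustrative.

```python
import numpy as np

def keypoint_heatmap(xk, yk, ck, size=56, sigma=2.0):
    """Render J_k(i, j) = c_k * exp(-((i - x_k)^2 + (j - y_k)^2) / (2 sigma^2))."""
    i = np.arange(size, dtype=np.float32)[:, None]   # pixel rows
    j = np.arange(size, dtype=np.float32)[None, :]   # pixel columns
    return ck * np.exp(-((i - xk) ** 2 + (j - yk) ** 2) / (2.0 * sigma ** 2))

def pose_heatmaps(keypoints, size=56, sigma=2.0):
    """keypoints: 17 (x, y, confidence) triples -> (17, size, size) tensor."""
    return np.stack([keypoint_heatmap(x, y, c, size, sigma) for x, y, c in keypoints])
```

The peak of each map sits at the key-point coordinates and carries the key point's confidence; stacking the 17 maps gives the 17×56×56 posture representation.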
Step 3.2.3: stack the motion features with the key-point heat maps, input them into a 3D convolutional neural network to extract posture features, and obtain the behavior-step category through a softmax classification layer.
Step 3.3: model training and testing: feed the training data into the behavior recognition network, extract the joint heat maps, crop the heat maps to 56×56 with a 10-crop scheme, and fine-tune the model parameters to obtain a protective-clothing wearing-behavior step recognition algorithm that meets the recognition-rate requirement.
Step 4: perform target detection on the anti-epidemic protective articles with a MobileNet algorithm. A protective-article target detection data set is constructed from the video frames obtained by sparse sampling in step 3, with the article categories consistent with those appearing in the videos of step 2: category 1: rinse-free hand sanitizer; category 2: disposable cap; category 3: medical protective mask; category 4: protective clothing; category 5: goggles; category 6: gloves; category 7: boot covers; category 8: isolation gown; category 9: shoe covers. The MobileNet algorithm is trained on this data set to obtain a protective-article target detection algorithm that meets the recognition-rate requirement.
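The nine detection categories of step 4 can be kept as a simple lookup table shared by the detector and the dynamic checking logic below; the integer ids and English labels are an illustrative encoding of the list above.

```python
# Protective-article target-detection categories (ids 1..9, per the list above).
PROTECTIVE_CATEGORIES = {
    1: "rinse-free hand sanitizer",
    2: "disposable cap",
    3: "medical protective mask",
    4: "protective clothing",
    5: "goggles",
    6: "gloves",
    7: "boot covers",
    8: "isolation gown",
    9: "shoe covers",
}
```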
Step 5: the high-definition monitoring camera collects N frames of data in real time and inputs them into the protective-clothing wearing-behavior step recognition algorithm to recognize whether the video belongs to a standard behavior-step category. If it belongs to a non-standard behavior-step category, a voice alarm "behavior step XX is non-standard" is output directly; if it belongs to a standard behavior-step category, a time-ordered list of behavior occurrences is obtained, as shown in fig. 3, specifically as follows:
Step 5.1: input the N video frames uniformly sampled from I_{t−N−1} to I_t into the protective-clothing wearing-behavior recognition algorithm of step 3; the softmax layer outputs the behavior-category confidence of this clip. A confidence threshold γ is set; normally the behavior-step category with the highest confidence is output, and if all softmax values are below the threshold, no behavior is output. The occurrence time of the step behavior is recorded as t_i^j, where i denotes the behavior category and j = 1, 2, …, J_i counts the times behavior i has been consecutively detected.
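The confidence-thresholding rule of step 5.1 (output the arg-max behavior step only when its softmax score reaches γ) can be sketched as follows; the function name and the default γ are illustrative, not fixed by the patent.

```python
import numpy as np

def classify_step(softmax_probs, gamma=0.5):
    """Return (category_index, confidence) for the most confident behavior
    step, or None when every softmax value is below the threshold gamma."""
    probs = np.asarray(softmax_probs, dtype=np.float64)
    k = int(probs.argmax())
    return (k, float(probs[k])) if probs[k] >= gamma else None
```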
Step 5.2: store the behavior occurrence times t_i^j detected by the algorithm, and, according to the standard procedure of the protective-clothing wearing behavior, set the required time order of the t_i^j. If all behavior categories are present and the behavior occurrence times satisfy the required order, proceed to the next step; if either requirement is not met, the wearing is judged non-standard and a "wearing procedure non-standard" voice alarm is fed back to the background.
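Step 5.2 performs two checks: every behavior category must have occurred, and the occurrence times must follow the standard order. A sketch follows; comparing the last detection time of each step is an assumption, since the patent does not specify which t_i^j is compared.

```python
def wearing_flow_ok(times, num_steps=9):
    """times: dict mapping behavior-step index (1..num_steps) to the list of
    detection timestamps t_i^j.  Returns True when every step was detected
    and the (last) occurrence times strictly increase in step order."""
    if any(not times.get(s) for s in range(1, num_steps + 1)):
        return False  # some behavior category never occurred
    last = [max(times[s]) for s in range(1, num_steps + 1)]
    return all(a < b for a, b in zip(last, last[1:]))
```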
Step 6: input the real-time video stream collected by the monitoring camera in the protective-article donning area, detect the current frame in real time with the MobileNet target detection algorithm, apply the dynamic target detection algorithm to the detected article categories, and check whether the article wearing procedure is standard. If the result of the dynamic target detection algorithm meets the required dynamic-presence order of the protective articles, and the behavior-occurrence time list meets the required order of occurrence times, the wearing procedure is reported as standard; otherwise a "wearing procedure non-standard" voice alarm is fed back directly to the background. The specific steps are as follows:
and 4, inputting a real-time video stream to detect the protective articles through the protective article target detection algorithm in the step 4. According to the appearance sequence of the categories of the protective articles, setting the dynamic existence sequence requirement of the categories of the protective articles as follows: protective article category 1-protective article category 2-protective article category 3-protective article category 4-protective article category 5-protective article category 6-protective article category 7-protective article category 8-protective article category 9.
As shown in fig. 4, the dynamic target detection algorithm first checks whether category 1 is present; if not, it performs protective-article target detection on the next frame until category 1 appears. Once category 1 has been detected, it checks whether category 2 appears; if category 2 appears, it goes on to check for category 3, and if not, it continues target detection on subsequent frames until category 2 appears. By analogy, once category n has been detected, only category n+1 needs to be looked for: if category n+1 is present, detection moves on to the next category; if not, the algorithm keeps checking the input video frames for category n+1, and proceeds to the next category only after category n+1 has been detected.
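The per-frame state machine of fig. 4 reduces to a pointer that waits on category n until it is seen and then advances to n+1. A sketch, with each frame's detections abstracted as a set of category ids (the function name and the one-advance-per-frame policy are illustrative assumptions):

```python
def dynamic_target_check(frame_detections, num_categories=9):
    """frame_detections: iterable of per-frame sets of detected protective-
    article category ids.  Returns True once categories 1..num_categories
    have each appeared, in order, over the frame sequence."""
    expected = 1
    for detected in frame_detections:
        if expected > num_categories:
            break
        if expected in detected:
            expected += 1  # category seen; start waiting for the next one
    return expected > num_categories
```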
When all protective-article categories have appeared in the required dynamic-presence order, the anti-epidemic protective articles have been put on according to the standard procedure. If the target detection result meets the required dynamic-presence order, and step 5.2 meets the required order of behavior occurrence times, the wearing is judged standard; if either requirement is not met, the wearing behavior is judged non-standard and a "wearing behavior non-standard" voice alarm is fed back to the background.
It is to be understood that the present invention has been described with reference to certain embodiments, and that various changes in the features and embodiments, or equivalent substitutions may be made therein by those skilled in the art without departing from the spirit and scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (7)

1. A method for detecting the wearing behavior specification of an anti-epidemic protection article based on human body posture is characterized by comprising the following steps:
step 1: determining a standard wearing process of the epidemic prevention articles, and collecting videos of wearing the articles by related personnel;
step 2: editing the video into a plurality of standard behavior video categories, and cleaning and labeling video category data to form a training, verifying and testing data set of the video worn by the protective clothing;
and 3, step 3: constructing an anti-epidemic protective clothing wearing behavior step identification algorithm based on human body postures;
and 4, step 4: carrying out target detection on the epidemic prevention articles by using a MobileNet algorithm;
and 5: the method comprises the following steps that a high-definition monitoring camera collects N frame data in real time and inputs the N frame data into a protective clothing wearing behavior step recognition algorithm, and whether a video belongs to a standard behavior step category is recognized;
step 6: inputting a real-time video stream collected by a monitoring camera in a wearing area of the epidemic prevention protection article, detecting a current frame in real time by using a MobileNet target detection algorithm, setting a target dynamic detection algorithm for the protection article class detection, and detecting whether the wearing flow of the protection article is standard or not.
2. The method for detecting the wearing behavior specification of the anti-epidemic protective equipment based on the human body posture according to claim 1, wherein the correct wearing specification process of the anti-epidemic protective equipment in step 1 is as follows: behavior step 1: hand hygiene; behavior step 2: wearing a disposable cap; behavior step 3: wearing a medical protective mask; behavior step 4: wearing protective clothing; behavior step 5: wearing goggles; behavior step 6: wearing gloves; behavior step 7: wearing boot covers; behavior step 8: wearing an isolation gown; behavior step 9: wearing shoe covers.
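The nine-step procedure of claim 2 can be encoded as an ordered list for downstream order checking. A minimal sketch; the English labels are illustrative paraphrases of the claim text, not part of the claim itself:

```python
# Ordered labels for the nine behavior steps of claim 2.
# (English labels are illustrative paraphrases, not claim language.)
WEARING_STEPS = [
    "hand hygiene",
    "wear disposable cap",
    "wear medical protective mask",
    "wear protective clothing",
    "wear goggles",
    "wear gloves",
    "wear boot covers",
    "wear isolation gown",
    "wear shoe covers",
]

def step_label(step_no):
    """Map a 1-based behavior step number to its label."""
    return WEARING_STEPS[step_no - 1]
```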
3. The method for detecting the wearing behavior specification of the epidemic prevention articles based on the human body posture according to claim 1, wherein in step 2 the videos are clipped into the following video categories:
standard hand hygiene behavior, standard disposable cap wearing behavior, standard medical protective mask wearing behavior, standard protective clothing wearing behavior, standard goggles wearing behavior, standard glove wearing behavior, standard boot cover wearing behavior, standard isolation gown wearing behavior, standard shoe cover wearing behavior, non-standard hand hygiene behavior, non-standard disposable cap wearing behavior, non-standard medical protective mask wearing behavior, non-standard protective clothing wearing behavior, non-standard goggles wearing behavior, non-standard glove wearing behavior, non-standard boot cover wearing behavior, non-standard isolation gown wearing behavior and non-standard shoe cover wearing behavior; the video category data are cleaned and labeled to form the training, verification and test data sets of the protective clothing wearing videos.
4. The method for detecting the wearing behavior specification of the anti-epidemic protective articles based on the human body posture according to claim 1, wherein step 3 performs sparse sampling on the protective clothing wearing video data set obtained in step 2, T frames of pictures are obtained from each video, the algorithm is trained with the picture data, and a protective clothing wearing behavior step recognition algorithm meeting the recognition rate requirement is obtained, with the following specific steps:
step 3.1: preparing training data and test data in an early stage: collecting videos through monitoring of the protective article wearing and cleaning area, and dividing the video data set into a training set, a verification set and a test set in a 6:2:2 ratio; uniformly dividing each video of the data set in step 2 into T segments, and randomly extracting one frame of picture from each segment to obtain the network input data I = [I_1, …, I_T];
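The T-segment random sampling of step 3.1 can be sketched as follows (a TSN-style sampler; the frame count and T value are illustrative, and the 6:2:2 split would be applied per video before sampling):

```python
import random

def sparse_sample(num_frames, T=8, rng=random):
    """TSN-style sparse sampling: split the frame indices of one video
    into T roughly equal segments and draw one random index from each
    segment, yielding the network input I = [I_1, ..., I_T]."""
    bounds = [round(i * num_frames / T) for i in range(T + 1)]
    # Guard against degenerate (empty) segments for very short videos.
    return [rng.randrange(bounds[i], max(bounds[i] + 1, bounds[i + 1]))
            for i in range(T)]
```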
Step 3.2: designing a behavior recognition network structure:
step 3.2.1: performing sparse sampling to obtain T frames of pictures with a resolution of 224 × 224; performing adjacent 3-frame difference calculation with a sliding window covering 3 frames to obtain T−2 motion-difference gray-scale maps; stacking the T−2 frames and inputting them into a 3 × 3 convolution to extract video motion features;
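The motion maps of step 3.2.1 can be sketched with a common three-frame differencing scheme. The exact difference operator is not specified in the claim; the pixel-wise minimum of the two successive absolute differences used below is one standard choice:

```python
import numpy as np

def three_frame_diff(frames):
    """frames: (T, H, W) uint8 grayscale stack. Each window of three
    adjacent frames yields one motion-difference map, T-2 maps total."""
    f = frames.astype(np.int16)              # avoid uint8 wrap-around
    d = np.abs(f[1:] - f[:-1])               # (T-1, H, W) successive diffs
    return np.minimum(d[:-1], d[1:]).astype(np.uint8)   # (T-2, H, W)
```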
step 3.2.2: inputting the T frames of pictures into a center-point network (CenterNet) to extract 17 human body posture key points; the center-point network uses an Hourglass network as its backbone; 17 key-point heat maps are obtained for each frame of picture, the two-dimensional posture being represented with a size of 17 × 56 × 56, and the key-point heat map J of each frame of picture is extracted;
the formula for J is:

J_kij = c_k · exp(−((i − x_k)² + (j − y_k)²) / (2σ²))

wherein k indexes the key points, (i, j) denotes the pixel coordinates, (x_k, y_k) denotes the coordinates of the k-th key point, c_k denotes its confidence, and σ is the Gaussian spread of the heat map;
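A key-point heat map of this form is a confidence-scaled Gaussian centered on the key point. A minimal sketch; the value of σ is an assumption, as the claim does not give it:

```python
import numpy as np

def keypoint_heatmap(x_k, y_k, c_k, size=56, sigma=2.0):
    """Heat map for one key point: a Gaussian centered at (x_k, y_k),
    scaled by the key point's confidence c_k. sigma is assumed."""
    i, j = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
    return c_k * np.exp(-((i - x_k) ** 2 + (j - y_k) ** 2)
                        / (2 * sigma ** 2))
```

Stacking one such map per key point per frame gives the 17 × 56 × 56 pose representation.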
step 3.2.3: stacking the motion features and the key-point heat maps, inputting them into a 3D convolutional neural network to extract posture features, and obtaining the behavior step category through a softmax classification layer;
step 3.3: model training and testing: sending the training data into the behavior recognition network, extracting the joint heat maps, cropping the heat maps to 56 × 56 with a 10-crop scheme, and fine-tuning the model parameters to obtain a protective clothing wearing behavior step recognition algorithm meeting the recognition rate requirement.
5. The method for detecting the wearing behavior specification of the epidemic prevention articles based on the human body posture according to claim 1, wherein step 4 constructs a protective article target detection data set from the video frames obtained by sparse sampling in step 3, the protective article categories being consistent with the protective article categories appearing in the videos of step 2, divided into: protective article category one: no-rinse hand sanitizer; protective article category two: disposable cap; protective article category three: medical protective mask; protective article category four: protective clothing; protective article category five: goggles; protective article category six: gloves; protective article category seven: boot covers; protective article category eight: isolation gown; protective article category nine: shoe covers; and the MobileNet algorithm is trained with the data set to obtain a protective article target detection algorithm meeting the recognition rate requirement.
6. The method for detecting the wearing behavior specification of the anti-epidemic protective equipment based on the human body posture according to claim 1, wherein step 5 identifies, through the step recognition algorithm, whether the video belongs to a standard behavior step category; if it belongs to a non-standard behavior step category, a voice alarm of "behavior step XX non-standard" is output directly; if it belongs to a standard behavior step category, a behavior occurrence time sequence list is obtained, specifically as follows:
step 5.1: through the protective clothing wearing behavior recognition algorithm of step 3, inputting the N frames of video stream uniformly sampled from I_(t−N−1) to I_t; the softmax layer outputs the behavior category confidence of the clip; a confidence threshold γ is set, and under normal circumstances the behavior step category with the maximum confidence is output; if all the output softmax values are smaller than the threshold, "no action" is output; the moment at which the step behavior occurs is recorded as

t_i^j

wherein i denotes the behavior category and j denotes the j-th consecutive detection of behavior i, j = 1, 2, …, J_i;
step 5.2: storing the occurrence moments t_i^j of the behaviors detected by the algorithm and, according to the standard flow of the protective clothing wearing behaviors, requiring the occurrence moments to follow the step order, i.e. the moments recorded for behavior category i must precede those recorded for behavior category i+1; if all the behavior category requirements are met and the behavior occurrence time sequence requirement is met, the next step is carried out; if any one of the requirements is not met, the wearing is judged to be non-standard, and a voice alarm of "non-standard wearing flow" is fed back to the background.
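The time-ordering check of step 5.2 can be sketched as follows. A simplified version using each behavior category's first detected moment; the dict-based interface is an assumption:

```python
def check_behavior_times(times, num_steps=9):
    """times: dict mapping behavior category i (1..num_steps) to the
    moment at which behavior i was first detected. The flow is compliant
    only when every category was detected and the moments strictly
    increase in step order."""
    if any(i not in times for i in range(1, num_steps + 1)):
        return False                     # some behavior step never seen
    seq = [times[i] for i in range(1, num_steps + 1)]
    return all(a < b for a, b in zip(seq, seq[1:]))
```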
7. The method for detecting the wearing behavior specification of the anti-epidemic protective equipment based on the human body posture according to claim 6, wherein step 6 outputs that the wearing flow is standard if the result of the target dynamic detection algorithm meets the dynamic existence requirement of the protective article targets and the behavior occurrence time list meets the sequence requirement of the behavior occurrence times; if a requirement is not met, a voice alarm of "non-standard wearing flow" is directly fed back to the background, with the following specific steps:
inputting a real-time video stream and detecting the protective articles through the protective article target detection algorithm of step 4; according to the appearance sequence of the protective article categories, the dynamic existence sequence requirement of the protective article category targets is set as: protective article category one - category two - category three - category four - category five - category six - category seven - category eight - category nine;
the target dynamic detection algorithm first detects whether a target of protective article category one exists; if not, protective article target detection is carried out on the next frame of image until protective article category one appears; after protective article category one has appeared, the algorithm detects whether protective article category two appears; if protective article category two appears, it then detects whether protective article category three appears, and if not, protective article target detection is carried out on the next frame of image until protective article category two appears; by analogy, after protective article category n has been detected, only protective article category n+1 needs to be detected: if protective article category n+1 exists, the next detection is carried out; if not, the target dynamic detection algorithm keeps detecting whether protective article category n+1 exists in the input video frames, and target detection of the next protective article category is carried out only after protective article category n+1 has been detected;
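The per-frame category state machine described above can be sketched as follows. Per-frame detections are modeled as sets of category ids 1–9; the actual detector output format is an assumption:

```python
def wearing_flow_is_standard(frame_detections, num_categories=9):
    """frame_detections: iterable of per-frame sets of detected
    protective article category ids (1..num_categories). Advance the
    expected category only once it appears in a frame; return True as
    soon as all categories have appeared in order."""
    expect = 1
    for detected in frame_detections:
        # A single frame may satisfy several consecutive categories.
        while expect <= num_categories and expect in detected:
            expect += 1
        if expect > num_categories:
            return True
    return False
```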
when all the protective article categories appear according to the dynamic existence sequence of the protective article wearing targets, the epidemic prevention protective article wearing flow is standard; if the target detection algorithm result meets the target dynamic existence sequence requirement and step 5.2 meets the behavior occurrence time sequence requirement, the wearing is judged to be standard; if either requirement is not met, the wearing behavior is judged to be non-standard, and a voice alarm of "non-standard wearing behavior" is fed back to the background.
CN202210525955.3A 2022-05-16 2022-05-16 Anti-epidemic protection article wearing behavior standard detection method based on human body posture Pending CN114782874A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210525955.3A CN114782874A (en) 2022-05-16 2022-05-16 Anti-epidemic protection article wearing behavior standard detection method based on human body posture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210525955.3A CN114782874A (en) 2022-05-16 2022-05-16 Anti-epidemic protection article wearing behavior standard detection method based on human body posture

Publications (1)

Publication Number Publication Date
CN114782874A true CN114782874A (en) 2022-07-22

Family

ID=82437904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210525955.3A Pending CN114782874A (en) 2022-05-16 2022-05-16 Anti-epidemic protection article wearing behavior standard detection method based on human body posture

Country Status (1)

Country Link
CN (1) CN114782874A (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114937232A (en) * 2022-07-25 2022-08-23 浙江大学 Wearing detection method, system and equipment for medical waste treatment personnel protective appliance
CN114937232B (en) * 2022-07-25 2022-10-21 浙江大学 Wearing detection method, system and equipment for medical waste treatment personnel protective appliance
CN115375287A (en) * 2022-10-21 2022-11-22 安徽博诺思信息科技有限公司 Power grid operation and maintenance operation violation monitoring and early warning management system
CN115375287B (en) * 2022-10-21 2022-12-20 安徽博诺思信息科技有限公司 Power grid operation and maintenance operation violation monitoring and early warning management system
CN116189311A (en) * 2023-04-27 2023-05-30 成都愚创科技有限公司 Protective clothing wears standardized flow monitoring system
CN116453100A (en) * 2023-06-16 2023-07-18 国家超级计算天津中心 Method, device, equipment and medium for detecting wearing and taking-off normalization of protective equipment
CN116844117A (en) * 2023-09-01 2023-10-03 福建智康云医疗科技有限公司 Medical protective clothing wearing monitoring system based on AI video analysis
CN116844117B (en) * 2023-09-01 2023-11-14 福建智康云医疗科技有限公司 Medical protective clothing wearing monitoring system based on AI video analysis

Similar Documents

Publication Publication Date Title
CN114782874A (en) Anti-epidemic protection article wearing behavior standard detection method based on human body posture
CN110287825B (en) Tumble action detection method based on key skeleton point trajectory analysis
CN104036236B (en) A kind of face gender identification method based on multiparameter exponential weighting
CN114582030B (en) Behavior recognition method based on service robot
CN110706255A (en) Fall detection method based on self-adaptive following
CN112966628A (en) Visual angle self-adaptive multi-target tumble detection method based on graph convolution neural network
CN112488034A (en) Video processing method based on lightweight face mask detection model
CN112185514A (en) Rehabilitation training effect evaluation system based on action recognition
CN114511931A (en) Action recognition method, device and equipment based on video image and storage medium
CN114894337B (en) Temperature measurement method and device for outdoor face recognition
CN111783702A (en) Efficient pedestrian tumble detection method based on image enhancement algorithm and human body key point positioning
Ali et al. Real-time face mask detection in deep learning using convolution neural network
US20220130148A1 (en) System and Method for Identifying Outfit on a Person
CN113313186B (en) Method and system for identifying irregular wearing work clothes
CN116189311B (en) Protective clothing wears standardized flow monitoring system
CN111912531A (en) Dynamic human body trunk temperature analysis method based on intelligent thermal imaging video analysis
Mohd et al. An optimized low computational algorithm for human fall detection from depth images based on Support Vector Machine classification
US20230368408A1 (en) Posture Detection Apparatus, Posture Detection Method, and Sleeping Posture Determination Method
CN115588229A (en) Internet-based care service management system and method
CN113100755A (en) Limb rehabilitation training and evaluating system based on visual tracking control
Hossen Social distance monitoring using a low-cost 3d sensor
Iksan et al. Face mask detection services of Covid19 monitoring system to maintain a safe environment using deep learning method
TWI838897B (en) Intelligent calculation engine system
Paulose et al. Recurrent neural network for human action recognition using star skeletonization
Raza et al. Human Fall Detection from Sequences of Skeleton Features using Vision Transformer.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination