CN113313186A - Method and system for identifying non-standard wearing work clothes - Google Patents


Info

Publication number
CN113313186A
CN113313186A (application CN202110643917.3A; granted as CN113313186B)
Authority
CN
China
Legal status: Granted (assumed; Google has not performed a legal analysis)
Application number
CN202110643917.3A
Other languages
Chinese (zh)
Other versions
CN113313186B (en)
Inventors
卫潮冰
杨玺
冯健榆
黄茂光
陈建科
Current Assignee (listing may be inaccurate; Google has not performed a legal analysis)
Guangdong Power Grid Co Ltd
Jiangmen Power Supply Bureau of Guangdong Power Grid Co Ltd
Original Assignee
Guangdong Power Grid Co Ltd
Jiangmen Power Supply Bureau of Guangdong Power Grid Co Ltd
Application filed by Guangdong Power Grid Co Ltd and Jiangmen Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority to CN202110643917.3A
Publication of CN113313186A
Application granted; publication of CN113313186B
Legal status: Active

Classifications

    • G06F18/214: Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F18/241: Pattern recognition; classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06N3/04: Neural networks; architecture, e.g. interconnection topology
    • G06N3/08: Neural networks; learning methods
    • G06V10/25: Image preprocessing; determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/28: Image preprocessing; quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V20/41: Video scenes; higher-level, semantic clustering, classification or understanding, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V2201/07: Indexing scheme for image or video recognition; target detection

Abstract

The application discloses a method and a system for identifying non-standard wearing of work clothes. A purpose-built training sample set is used to train a YOLOv4 network, yielding a target detection model. The model then performs target detection on decoded frame images from a real-time surveillance video stream. Frames classified as suspected non-standard wearing are examined further: the region of interest (ROI) is extracted, and the aspect ratio of the ROI is used to judge whether the position of the sleeves or trouser legs reflects a normal action. If it does, the entropy of the gray-level co-occurrence matrix of the grayscale feature image is computed to indicate whether the sleeves or trouser legs show obvious wrinkles; when obvious wrinkles appear, the wearing is judged non-standard. Identifying whether work clothes are worn properly through these multiple judgment rules improves both the accuracy and the efficiency of recognizing non-standard wearing.

Description

Method and system for identifying non-standard wearing work clothes
Technical Field
The application relates to the technical field of intelligent power monitoring, and in particular to a method and a system for identifying non-standard wearing of work clothes.
Background
Personal protection is the most basic requirement of safe production. Work sites in power, construction, and similar industries contain energized or otherwise dangerous equipment, and their complex environments make safety accidents likely. During daily operations, workers are therefore required to wear protective equipment such as standard work clothes. However, because workers' safety awareness is often insufficient, non-standard wearing easily occurs, and supervisors must monitor the site continuously and intervene in time.
Statistics show that relying on manual supervision of workers' dress leads to heavy labor for supervisors, low identification efficiency, and a low level of intelligence.
In recent years, with the maturation of computer vision and Internet-of-Things technologies, and especially the rapid development of neural networks, deep learning has been applied in many production environments. The concept of deep learning originated from the study of artificial neural networks by mathematicians and computer scientists. An artificial neural network is an algorithmic model that simulates the behavioral characteristics of biological neural networks and performs distributed, parallel information processing, processing information by adjusting the interconnections among a large number of internal nodes. A deep neural network extracts large numbers of image features through convolution operations and combines low-level features into more abstract high-level features; these high-level features can represent attribute categories or latent characteristics of objects, revealing the distribution patterns or feature representations of the image data.
Applying deep learning to monitoring workers' non-standard dress can improve identification efficiency and reduce supervisors' labor. However, current deep learning approaches to this task still suffer from inaccurate identification, and their identification efficiency also needs improvement.
Disclosure of Invention
The application provides a method and a system for identifying non-standard wearing of work clothes, to solve the technical problems of inaccurate identification and low identification efficiency in the prior art.
In view of this, the first aspect of the present application provides a method for identifying non-standard wearing of work clothes, comprising the following steps:
S1, collecting images of work clothes worn by personnel at a power construction site to construct a work-clothes-wearing image sample set;
S2, classifying each image in the sample set according to a dressing classification rule and assigning a category label, where the category labels are non-standard wearing, suspected non-standard wearing, and standard wearing;
S3, manually labeling the target area in each image with a rectangular box according to the category label, thereby obtaining the box coordinates and the corresponding category label to produce a training sample set, where the target area comprises the sleeves and trouser legs of the work clothes;
S4, training a YOLOv4 network on the training sample set to obtain a target detection model;
S5, decoding a real-time surveillance video stream to obtain a plurality of decoded frame images;
S6, performing target recognition on the decoded frame images with the target detection model, thereby obtaining for each image a target bounding box and its box information, where the box information comprises the category label, the box coordinates, and a confidence score for whether the box contains the target area;
S7, comparing the confidence score of each decoded frame image with a preset confidence threshold, thereby selecting the images whose scores exceed the threshold;
S8, filtering the images selected in step S7 by category label to obtain the images labeled suspected non-standard wearing;
S9, extracting the region of interest (ROI) from the target area of each suspected non-standard image selected in step S8 to obtain a feature image of the ROI, where the feature image comprises a sleeve image and a trouser-leg image of the work clothes;
S10, calculating the aspect ratio of the ROI from the ROI and the target box coordinates;
S11, judging whether the aspect ratio of the ROI exceeds a preset aspect-ratio threshold; if so, executing step S13, and if not, executing step S12;
S12, converting the feature image of the ROI to grayscale, calculating the entropy of the gray-level co-occurrence matrix of the grayscale image, and judging whether the entropy exceeds a preset entropy threshold; if so, executing step S13, otherwise returning to step S5;
S13, changing the category label of the decoded frame image corresponding to the ROI to non-standard wearing, and executing step S14;
and S14, outputting the decoded frame images whose category label is non-standard wearing.
Preferably, between step S1 and step S2 the method further comprises:
S102, applying augmentation processing to each image in the work-clothes-wearing image sample set, thereby obtaining an augmented sample set.
Preferably, step S2 specifically comprises:
S201, judging whether the worker in the work-clothes-wearing image is wearing work clothes; if not, setting the category label of the image to non-standard wearing, and if so, executing step S202;
S202, judging, based on a work-clothes wearing rule, whether the work clothes worn by the worker are standard; if so, setting the category label of the image to standard wearing, and if not, setting it to suspected non-standard wearing.
Preferably, between step S5 and step S6 the method further comprises:
filtering the plurality of decoded frame images, thereby obtaining a plurality of filtered decoded frame images.
Preferably, step S8 further comprises: selecting from the images of step S7, by category label, those labeled non-standard wearing, and proceeding directly to step S14.
In a second aspect, the present invention further provides a system for identifying non-standard wearing of work clothes, comprising:
a sample set construction module, configured to collect images of work clothes worn by personnel at a power construction site, thereby constructing a work-clothes-wearing image sample set;
a category label module, configured to classify each image in the sample set according to a dressing classification rule and assign a category label, where the category labels are non-standard wearing, suspected non-standard wearing, and standard wearing;
a labeling module, configured to manually label the target area in each image with a rectangular box according to the category label, thereby obtaining the box coordinates and the corresponding category label to produce a training sample set, where the target area comprises the sleeves and trouser legs of the work clothes;
a training module, configured to train a YOLOv4 network on the training sample set to obtain a target detection model;
a decoding module, configured to decode a real-time surveillance video stream to obtain a plurality of decoded frame images;
a target detection module, configured to perform target recognition on the decoded frame images with the target detection model, thereby obtaining for each image a target bounding box and its box information, where the box information comprises the category label, the box coordinates, and a confidence score for whether the box contains the target area;
a first screening module, configured to compare the confidence score of each decoded frame image with a preset confidence threshold, thereby selecting the images whose scores exceed the threshold;
a second screening module, configured to select, by category label, the images labeled suspected non-standard wearing from those selected by the first screening module;
an ROI extraction module, configured to extract the region of interest (ROI) from the target area of each selected suspected non-standard image, thereby obtaining a feature image of the ROI, where the feature image comprises a sleeve image and a trouser-leg image of the work clothes;
a calculation module, configured to calculate the aspect ratio of the ROI from the ROI and the target box coordinates;
a judgment module, configured to judge whether the aspect ratio of the ROI exceeds a preset aspect-ratio threshold;
a texture detection module, configured to convert the feature image of the ROI to grayscale, calculate the entropy of the gray-level co-occurrence matrix of the grayscale image, and judge whether the entropy exceeds a preset entropy threshold;
a label changing module, configured to change the category label of the decoded frame image corresponding to the ROI to non-standard wearing;
and an image output module, configured to output the decoded frame images whose category label is non-standard wearing.
Preferably, the system further comprises:
an augmentation module, configured to apply augmentation processing to each image in the work-clothes-wearing image sample set, thereby obtaining an augmented sample set.
Preferably, the category label module specifically comprises a first judgment submodule and a second judgment submodule;
the first judgment submodule is configured to judge whether the worker in the work-clothes-wearing image is wearing work clothes, to set the category label of the image to non-standard wearing if not, and to trigger a working signal for the second judgment submodule if so;
the second judgment submodule is configured to receive the working signal from the first judgment submodule, to judge, based on a work-clothes wearing rule, whether the work clothes worn by the worker are standard, to set the category label of the image to standard wearing if so, and to set it to suspected non-standard wearing if not.
Preferably, the system further comprises:
a filtering module, configured to filter the plurality of decoded frame images, thereby obtaining a plurality of filtered decoded frame images.
Preferably, the second screening module is further configured to select, by category label, the images labeled non-standard wearing from those selected by the first screening module, and to transmit them to the image output module for output.
According to the above technical solutions, the invention has the following advantages:
The invention provides a method and a system for identifying non-standard wearing of work clothes. A purpose-built training sample set is used to train a YOLOv4 network, yielding a target detection model that can output three categories: non-standard wearing, suspected non-standard wearing, and standard wearing. Target detection is then performed on the decoded frame images of a real-time surveillance video stream, and the images output as suspected non-standard wearing are examined further: the ROI is extracted, and the aspect ratio of the ROI is used to judge whether the position of the sleeves or trouser legs reflects a normal action; if not, the wearing is judged non-standard. If the action is normal, the entropy of the gray-level co-occurrence matrix of the grayscale feature image is computed to indicate whether the sleeves or trouser legs show obvious wrinkles; when obvious wrinkles appear, the wearing is judged non-standard. Identifying whether work clothes are worn properly through these multiple judgment rules improves the accuracy of identifying non-standard wearing, reduces the probability of misidentification, and also improves identification efficiency.
Drawings
Fig. 1 is a flowchart of a method for identifying non-standard wearing of work clothes according to an embodiment of the present application;
Fig. 2 is a schematic structural diagram of a system for identifying non-standard wearing of work clothes according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of the category label module in a system for identifying non-standard wearing of work clothes according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the prior art, applying deep learning to monitoring workers' non-standard dress can improve identification efficiency and reduce the labor intensity of supervisors, but identification accuracy remains a problem.
Therefore, the invention provides a method for identifying non-standard wearing of work clothes. Referring to Fig. 1, the method comprises the following steps:
S1, collecting images of work clothes worn by personnel at a power construction site to construct a work-clothes-wearing image sample set;
It should be noted that the work-clothes-wearing images are collected by extracting, frame by frame, the images of personnel contained in historical surveillance video of the power construction site, yielding a large number of work-clothes-wearing images; these images may include personnel who are not wearing work clothes.
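As an illustrative sketch (not part of the patent), frame-by-frame capture is often thinned with a sampling stride so that near-duplicate frames are skipped. The helper below only computes which frame indices to keep; the video-decoding call itself (e.g. via a library such as OpenCV) is assumed to happen elsewhere:

```python
def sample_frame_indices(total_frames, stride=25):
    """Return the indices of frames to keep when sampling every
    `stride`-th frame from a video with `total_frames` decoded frames."""
    if stride < 1:
        raise ValueError("stride must be >= 1")
    return list(range(0, total_frames, stride))

# e.g. a 4-second clip at 25 fps, sampled once per second:
indices = sample_frame_indices(100, stride=25)
print(indices)  # [0, 25, 50, 75]
```

A stride of one recovers the frame-by-frame extraction the patent describes; larger strides trade sample volume for diversity.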
S2, classifying each image in the sample set according to a dressing classification rule and assigning a category label, where the category labels are non-standard wearing, suspected non-standard wearing, and standard wearing;
The classification may be performed manually or by a trained deep learning algorithm.
S3, manually labeling the target area in each image with a rectangular box according to the category label, thereby obtaining the box coordinates and the corresponding category label to produce a training sample set, where the target area comprises the sleeves and trouser legs of the work clothes;
It should be noted that the key criteria for judging whether work clothes are worn properly are mainly the sleeves and trouser legs; for example, rolled-up sleeves or rolled-up trouser legs can be judged as non-standard wearing. The sleeves and trouser legs of the work clothes are therefore marked with rectangular boxes in preparation for the subsequent judgments.
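As a hedged illustration (the patent does not specify an annotation format), rectangular-box labels for YOLO-family trainers are commonly stored as a class id plus a box center and size normalized to the image dimensions. A conversion from pixel corner coordinates might look like:

```python
def rect_to_yolo(class_id, x1, y1, x2, y2, img_w, img_h):
    """Convert a pixel-coordinate box (x1, y1)-(x2, y2) into a
    YOLO-style annotation line: class id plus center/size in [0, 1]."""
    cx = (x1 + x2) / 2.0 / img_w
    cy = (y1 + y2) / 2.0 / img_h
    w = (x2 - x1) / img_w
    h = (y2 - y1) / img_h
    return f"{class_id} {cx:.6f} {cy:.6f} {w:.6f} {h:.6f}"

# Hypothetical sleeve box (class 1) in a 1920x1080 frame:
print(rect_to_yolo(1, 480, 270, 960, 810, 1920, 1080))
# 1 0.375000 0.500000 0.250000 0.500000
```

The class ids and box values here are invented for the example; only the normalized layout reflects the usual YOLO convention.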
S4, training a YOLOv4 network on the training sample set to obtain a target detection model;
S5, decoding the real-time surveillance video stream to obtain a plurality of decoded frame images;
S6, performing target recognition on the decoded frame images with the target detection model, thereby obtaining for each image a target bounding box and its box information, where the box information comprises the category label, the box coordinates, and a confidence score for whether the box contains the target area;
S7, comparing the confidence score of each decoded frame image with a preset confidence threshold, and selecting the images whose scores exceed the threshold;
It can be understood that the YOLOv4 network outputs a confidence score for each detection. Keeping only the decoded frame images whose scores exceed the preset confidence threshold, and discarding those below it, improves the detection precision of the target.
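A minimal sketch of this screening step (the threshold value and the detection record layout are illustrative, not taken from the patent, which only requires a per-box confidence score):

```python
def filter_by_confidence(detections, threshold=0.5):
    """Keep only detections whose confidence score exceeds the threshold.

    Each detection is a dict with at least a 'score' key; this layout
    is hypothetical."""
    return [d for d in detections if d["score"] > threshold]

detections = [
    {"label": "suspected", "box": (10, 20, 90, 200), "score": 0.87},
    {"label": "standard", "box": (15, 25, 80, 190), "score": 0.32},
]
kept = filter_by_confidence(detections, threshold=0.5)
print([d["label"] for d in kept])  # ['suspected']
```

In practice the threshold is tuned so that low-confidence boxes, which are likelier to be false positives, never reach the downstream aspect-ratio and texture checks.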
S8, filtering the images selected in step S7 by category label to obtain the images labeled suspected non-standard wearing;
It should be noted that when the category label of a decoded frame image is suspected non-standard wearing, whether the wearing is actually non-standard needs further confirmation; when the category label is standard wearing or non-standard wearing, no further confirmation is needed.
S9, extracting the region of interest (ROI) from the target area of each suspected non-standard image selected in step S8 to obtain a feature image of the ROI, where the feature image comprises a sleeve image and a trouser-leg image of the work clothes;
It should be noted that the ROI (region of interest) is the region to be processed, outlined on the decoded frame image in the form of a rectangle, circle, ellipse, irregular polygon, or the like.
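For a rectangular ROI, the extraction reduces to an array crop. The sketch below (NumPy, illustrative only; frame size and box coordinates are invented) slices the box out of a frame stored as a height x width x channels array:

```python
import numpy as np

def crop_roi(frame, x1, y1, x2, y2):
    """Slice a rectangular ROI out of an image array (rows index y)."""
    return frame[y1:y2, x1:x2]

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy black frame
roi = crop_roi(frame, 480, 270, 960, 810)
print(roi.shape)  # (540, 480, 3)
```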
S10, calculating the aspect ratio of the ROI from the ROI and the target box coordinates;
In this embodiment, the ROI is extracted as a rectangular region; its length and width can be calculated from the target box coordinates, from which the aspect ratio is obtained.
S11, judging whether the aspect ratio of the ROI exceeds a preset aspect-ratio threshold; if so, executing step S13, and if not, executing step S12;
It should be noted that the preset aspect-ratio threshold can be set by analyzing a large amount of test data.
When the aspect ratio of the ROI exceeds the preset threshold, the sleeve or trouser leg ends near or above the elbow or knee joint, i.e. a large amount of skin is exposed, and non-standard wearing of the work clothes can be determined directly.
When the aspect ratio of the ROI is below the preset threshold, only a small amount of skin may be exposed, and it is necessary to judge further whether the corresponding sleeve or trouser-leg state is a normal action or an abnormal one.
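The aspect-ratio rule above can be sketched as follows. The threshold value 2.5 and the box coordinates are purely illustrative; the patent leaves the threshold to be tuned on test data:

```python
def aspect_ratio(x1, y1, x2, y2):
    """Length-to-width ratio of a rectangular ROI (longer side / shorter side)."""
    w, h = abs(x2 - x1), abs(y2 - y1)
    return max(w, h) / min(w, h)

def is_non_standard_by_ratio(box, threshold=2.5):
    """Per the patent's rule, a ratio above the preset threshold indicates a
    large amount of exposed skin -> judge non-standard wearing directly.
    The threshold value here is an assumption for illustration."""
    return aspect_ratio(*box) > threshold

print(aspect_ratio(100, 50, 180, 450))                # 5.0
print(is_non_standard_by_ratio((100, 50, 180, 170)))  # False: needs texture check
```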
S12, converting the feature image of the ROI to grayscale, calculating the entropy of the gray-level co-occurrence matrix of the grayscale image, and judging whether the entropy exceeds a preset entropy threshold; if so, executing step S13, otherwise returning to step S5;
It should be noted that the gray-level co-occurrence matrix is defined as follows: element (i, j) records the probability that a pixel with gray level i is paired, at a fixed offset (distance d; direction 0 degrees, 45 degrees, 90 degrees, etc.), with a pixel of gray level j. All of these estimated probabilities can be written in matrix form, called the gray-level co-occurrence matrix. Because the matrix contains a large amount of data, it is generally not used directly as a texture-discriminating feature; instead, statistics constructed from it serve as texture classification features. Among them, entropy measures the randomness of the information content and represents the complexity of the image: the larger the entropy, the more complex the image texture.
In this embodiment, as a worker climbs or bends over during normal work, the sleeves and trouser legs naturally slip, exposing part of the skin of the wrist or ankle; this is a normal action. When sleeves or trouser legs are deliberately rolled up, however, they show obvious wrinkles relative to normal motion. The complexity of the grayscale feature image is therefore characterized by the entropy of its gray-level co-occurrence matrix: the higher the entropy, the more obvious the wrinkles on the sleeves and trouser legs. When the entropy exceeds the preset entropy threshold, the corresponding sleeve or trouser leg reflects an abnormal action, and non-standard wearing can be determined.
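The co-occurrence matrix and its entropy can be sketched in a few lines of NumPy. The number of gray levels (8) and the single offset (1, 0) are illustrative choices; the patent names a distance d and several directions without fixing values:

```python
import numpy as np

def glcm(gray, levels=8, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one offset (dx, dy),
    normalized to a probability distribution. `gray` holds integer
    levels in [0, levels)."""
    h, w = gray.shape
    m = np.zeros((levels, levels), dtype=np.float64)
    for y in range(h - dy):
        for x in range(w - dx):
            m[gray[y, x], gray[y + dy, x + dx]] += 1
    return m / m.sum()

def glcm_entropy(p):
    """Entropy of a normalized GLCM: -sum(p * log2(p)) over nonzero cells."""
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

flat = np.zeros((16, 16), dtype=np.int64)                   # uniform patch
noisy = np.random.default_rng(0).integers(0, 8, (16, 16))   # wrinkle-like texture
# The uniform patch gives entropy 0.0; the textured patch gives a larger value.
print(glcm_entropy(glcm(flat)), glcm_entropy(glcm(noisy)))
```

This mirrors the judgment in step S12: a smooth sleeve surface yields a low entropy, while heavy wrinkling spreads probability mass across many (i, j) cells and drives the entropy up.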
S13, changing the class label in the unframed image corresponding to the ROI area into a non-standard wearing state, and executing the step S14;
and S14, outputting a class label as a deframing image corresponding to the wearing irregularity.
It should be noted that, in a general example, after the deframed image is output, a timely alarm needs to be performed to notify a monitor to arrive at the scene for supervision.
The embodiment provides a method for identifying irregular wearing working clothes, a specially made training sample set is trained based on a YOLOV4 network structure, a target detection model which can output three categories of irregular wearing, suspected irregular wearing and wearing specifications can be obtained through training, then target detection is carried out on a plurality of unframed images in a real-time monitoring video stream based on the target detection model, the unframed images which are output to be suspected to be irregular wearing are further identified, in the further identification process, an ROI (region of interest) area in the unframed images which are suspected to be irregular wearing is extracted, whether sleeves or trouser legs in the unframed images which are suspected to be irregular wearing are in normal action or not is judged according to the length-width proportion of the ROI area, and if the sleeves or trouser legs are not in normal action, the clothes are judged to be irregular wearing; if the movement is normal, representing whether obvious wrinkles appear on sleeves or trouser legs in the unframed image by calculating the entropy value of the gray level co-occurrence matrix of the gray level characteristic image, and judging that the wearing is not standard when the obvious wrinkles appear. Whether the working clothes are worn normally or not is identified through the plurality of judgment rules, so that the accuracy of identifying the working clothes which are not worn normally is improved, the probability of error identification is reduced, and meanwhile, the identification efficiency is also improved.
A specific embodiment of the method for identifying irregularly worn work clothes provided by this embodiment is described in detail below.
Further, before step S2, the method further includes:
S102, performing augmentation processing on each work clothes wearing image in the work clothes wearing image sample set, thereby obtaining an augmented work clothes wearing image sample set.
It should be noted that the augmentation specifically includes rotating, mirroring, and adjusting the brightness and contrast of each work clothes wearing image in the sample set, so that the number of samples is multiplied and the diversity and representativeness of the sample set are improved.
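The augmentation operations just listed (rotation, mirroring, brightness/contrast adjustment) might be sketched as follows, assuming images are NumPy `uint8` arrays; the gain and offset values are illustrative choices, not taken from the patent:

```python
import numpy as np

def augment(image, alpha=1.2, beta=10):
    """Generate augmented variants of one work-clothes wearing image.
    alpha (contrast gain) and beta (brightness offset) are illustrative."""
    variants = []
    variants.append(np.rot90(image))   # 90-degree rotation
    variants.append(np.fliplr(image))  # horizontal mirroring
    # brightness/contrast adjustment: out = alpha * in + beta, clipped to [0, 255]
    adjusted = np.clip(alpha * image.astype(np.float32) + beta, 0, 255)
    variants.append(adjusted.astype(np.uint8))
    return variants

sample = np.zeros((4, 6, 3), dtype=np.uint8)
augmented = augment(sample)
print(len(augmented))  # 3 variants per source image
```

Each source image thus yields several extra samples, multiplying the size of the training set as described.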
Further, step S2 specifically includes:
S201, judging whether the worker in the work clothes wearing image is wearing work clothes; if not, determining that the category label of the corresponding work clothes wearing image is wearing irregularity; if so, executing S202;
It should be noted that when the worker is not wearing work clothes, no further judgment is needed, so the category label is wearing irregularity.
S202, judging, based on the work clothes wearing rules, whether the work clothes worn by the worker in the work clothes wearing image are worn properly; if so, determining that the category label of the corresponding work clothes wearing image is wearing norm; if not, determining that the category label is suspected wearing irregularity.
It should be noted that when the worker is wearing work clothes, whether the wearing meets the standard must be judged against the work clothes wearing rules; this judgment may be made manually. When the wearing is proper, the category label of the corresponding image is wearing norm; when the wearing is judged improper against the rules, further identification is still required, so the category label of the corresponding image is suspected wearing irregularity.
Further, after step S5 and before step S6, the method further includes:
filtering the plurality of deframed images to obtain a plurality of filtered deframed images.
Further, step S8 further includes: screening out, from the deframed images screened in step S7, those whose category label is wearing irregularity, and proceeding to step S14.
It should be noted that when the category label is wearing irregularity, the corresponding deframed image needs to be output in time and an alarm raised promptly.
The above is a detailed description of an embodiment of the method for identifying the irregular wearing working clothes provided by the present invention, and the following is a detailed description of an embodiment of the system for identifying the irregular wearing working clothes provided by the present invention.
For ease of understanding, referring to fig. 2, the present invention provides an irregular wearing work clothes identification system, comprising:
the sample set construction module 100 is used for collecting work clothes wearing images of workers at the electric power construction site to construct a work clothes wearing image sample set;
it should be noted that, as a way of collecting work clothes wearing images of workers at the power construction site, the frames containing workers in historical monitoring videos of the site are captured frame by frame to obtain a large number of work clothes wearing images; these images may include workers who are not wearing work clothes.
The category label module 200 is configured to classify each working clothes wearing image in the working clothes wearing image sample set according to the dressing classification rule, and determine a category label, where the category label includes a wearing irregularity, a suspected wearing irregularity, and a wearing norm;
It should be noted that the classification may be performed manually or by a trained deep learning algorithm.
The labeling module 300 is configured to manually label a target area in each work clothes wearing image with a rectangular frame according to the category label, so as to obtain coordinates of the rectangular frame and a corresponding category label, and make a training sample set, where the target area includes sleeves and trouser legs of the work clothes;
it should be noted that, because the key criterion for judging whether work clothes are worn properly lies mainly in the sleeves and trouser legs (for example, rolled-up sleeves or rolled-up trouser legs can be judged as irregular wearing), the sleeves and trouser legs of the work clothes are marked with rectangular frames in preparation for subsequent judgment.
A training module 400, configured to train a YOLOV4 network structure through a training sample set to obtain a target detection model;
a decoding module 500, configured to perform decoding processing on the real-time monitoring video stream, so as to obtain a plurality of deframed images;
the target detection module 600 is configured to perform target identification on the multiple deframed images based on a target detection model, so as to obtain a corresponding target rectangular frame and rectangular frame information thereof in each deframed image, where the rectangular frame information includes a category label, a target rectangular frame coordinate, and a confidence score of whether the target rectangular frame is a target region;
a first screening module 700, configured to compare the confidence score of each deframed image with a preset confidence score threshold, so as to screen a deframed image larger than the preset confidence score threshold from a plurality of deframed images;
it can be understood that the YOLOv4 network structure outputs a confidence score for each deframed image, and screening out the deframed images above the preset confidence score threshold improves the accuracy of target detection.
The second screening module 800 is configured to screen, from the deframed images screened by the first screening module 700, the deframed images suspected of irregular wearing according to the category labels;
it should be noted that when the category label of a deframed image is suspected wearing irregularity, further confirmation of whether the wearing is irregular is needed; when the category label is wearing norm or wearing irregularity, no further confirmation is needed.
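The two screening stages above (confidence screening, then category-label screening) can be sketched as follows; the field names and the 0.5 threshold are illustrative assumptions, not values from the patent:

```python
def screen_frames(detections, score_threshold=0.5):
    """Two-stage screening sketch: keep detections above the confidence
    threshold (first screening), then keep only those whose category label
    is 'suspected' wearing irregularity (second screening)."""
    above = [d for d in detections if d["score"] > score_threshold]
    return [d for d in above if d["label"] == "suspected"]

dets = [
    {"label": "suspected", "score": 0.9},  # kept: confident and suspected
    {"label": "normal", "score": 0.8},     # dropped by label screening
    {"label": "suspected", "score": 0.3},  # dropped by confidence screening
]
print(screen_frames(dets))  # [{'label': 'suspected', 'score': 0.9}]
```

Only the surviving detections proceed to ROI extraction and the finer texture checks.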
An ROI extraction module 900, configured to extract an ROI region from the corresponding target region in the screened deframed images suspected of irregular wearing, so as to obtain a feature image of the ROI region, where the feature image includes a sleeve image of the work clothes and a trouser leg image of the work clothes;
it should be noted that ROI stands for region of interest. In the extraction process, the region to be processed is outlined on the deframed image with a box, circle, ellipse, irregular polygon, or the like; this outlined region is called the region of interest.
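A minimal sketch of rectangular ROI extraction on a deframed image, assuming the frame is a NumPy array and the box is given as corner coordinates (the patent does not fix a coordinate convention, and also allows circular, elliptical, and polygonal outlines):

```python
import numpy as np

def extract_roi(frame, box):
    """Crop a rectangular ROI (x1, y1, x2, y2) from a deframed image.
    Only the rectangular case is shown here."""
    x1, y1, x2, y2 = box
    return frame[y1:y2, x1:x2]  # rows are y, columns are x

frame = np.arange(100).reshape(10, 10)
roi = extract_roi(frame, (2, 3, 6, 8))
print(roi.shape)  # (5, 4): 5 rows, 4 columns
```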
The calculation module 110 is configured to calculate a length-width ratio of the ROI area according to the ROI area and the coordinates of the target rectangular frame;
in this embodiment, the extracted ROI region is rectangular, and its length and width can be calculated from the coordinates of the target rectangular frame, thereby obtaining the length-width ratio.
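The length-width ratio computation might look like the following, assuming corner coordinates for the target rectangular frame; the patent only states that the ratio is derived from the frame coordinates, so the corner form is an assumption:

```python
def aspect_ratio(box):
    """Length-width ratio of a rectangular ROI given corner coordinates
    (x1, y1, x2, y2); returns the longer side divided by the shorter one."""
    x1, y1, x2, y2 = box
    width = x2 - x1
    height = y2 - y1
    return max(width, height) / min(width, height)

print(aspect_ratio((0, 0, 40, 10)))  # 4.0
```

The ratio is then compared against the preset length-width ratio threshold in the judgment step.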
A determining module 120, configured to determine whether a length-width ratio of the ROI exceeds a preset length-width ratio threshold;
it should be noted that the preset length-width ratio threshold can be set by analyzing a large amount of test data.
When the length-width ratio of the ROI region exceeds the preset length-width ratio threshold, the sleeve or trouser leg is near or above the elbow or knee joint, that is, a large area of skin is exposed, and irregular wearing of the work clothes can be determined;
when the length-width ratio of the ROI region is below the preset length-width ratio threshold, only a small amount of skin may be exposed, and it is necessary to further judge whether the corresponding sleeve or trouser leg reflects a normal or abnormal action.
The texture detection module 130 is configured to perform gray level processing on the feature image corresponding to the ROI region to obtain a gray level feature image, calculate an entropy value of a gray level co-occurrence matrix of the gray level feature image, and determine whether the entropy value of the gray level co-occurrence matrix is greater than a preset entropy value;
it should be noted that the gray-level co-occurrence matrix records, for each pixel with gray level i in the image, the probability that the pixel at a fixed offset (distance d; direction 0 degrees, 45 degrees, 90 degrees, etc.) has gray level j; all these estimates, expressed in matrix form, constitute the gray-level co-occurrence matrix. Because the matrix itself carries a large amount of data, it is generally not used directly as a texture-distinguishing feature; instead, statistics constructed from it serve as texture classification features. Among these, the entropy reflects the randomness of the information content and thus represents the complexity of the image: the larger the entropy, the more complex the image texture.
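A self-contained sketch of the gray-level co-occurrence matrix entropy described above, implemented with NumPy; the 8-level quantization and the single (distance 1, direction 0 degrees) offset are illustrative choices:

```python
import numpy as np

def glcm_entropy(gray, levels=8, dx=1, dy=0):
    """Entropy of the gray-level co-occurrence matrix at offset (dx, dy)
    for a uint8 gray image, after quantizing to `levels` gray levels."""
    q = (gray.astype(np.int64) * levels) // 256   # quantize gray levels
    glcm = np.zeros((levels, levels), dtype=np.float64)
    h, w = q.shape
    src = q[:h - dy, :w - dx]  # pixel with gray level i
    dst = q[dy:, dx:]          # pixel at the fixed offset, gray level j
    np.add.at(glcm, (src.ravel(), dst.ravel()), 1)  # count co-occurrences
    p = glcm / glcm.sum()      # co-occurrence probabilities
    nz = p[p > 0]
    return float(-(nz * np.log2(nz)).sum())  # entropy in bits

flat = np.full((16, 16), 128, dtype=np.uint8)           # uniform patch
rng = np.random.default_rng(0)
textured = rng.integers(0, 256, (16, 16), dtype=np.uint8)  # noisy patch
print(glcm_entropy(flat) < glcm_entropy(textured))  # True
```

A uniform patch yields entropy 0, while a wrinkled (complex) texture yields a higher entropy, matching the criterion the patent uses.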
In this embodiment, when a worker climbs or bends over during normal work, the sleeves and trouser legs naturally slip, exposing part of the skin at the wrist or ankle; this is a normal action. When the sleeves or trouser legs are deliberately rolled up, however, they show obvious wrinkles relative to normal actions. The complexity of the gray feature image is therefore characterized by the entropy of its gray-level co-occurrence matrix: the higher the entropy, the more obvious the wrinkles on the sleeves and trouser legs. When the entropy exceeds the preset entropy value, the corresponding sleeve or trouser leg reflects an abnormal action, and irregular wearing can be determined.
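The two-stage judgment described above (aspect-ratio check first, then the wrinkle-entropy check) can be sketched as follows; both threshold values are illustrative placeholders, since the patent only says they are set from analysis of a large amount of test data:

```python
def is_irregular(ratio, entropy, ratio_thresh=3.0, entropy_thresh=4.0):
    """Decision rule sketch: an aspect ratio over the threshold means the
    sleeve/trouser leg is near or past the elbow/knee (irregular); otherwise
    a high GLCM entropy (obvious wrinkles) also marks wearing as irregular."""
    if ratio > ratio_thresh:         # large area of skin exposed
        return True
    return entropy > entropy_thresh  # wrinkle check on the gray feature image

print(is_irregular(4.0, 1.0))  # True (ratio alone triggers)
print(is_irregular(2.0, 5.5))  # True (wrinkle entropy triggers)
print(is_irregular(2.0, 1.0))  # False (normal action)
```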
The label changing module 140 is configured to change the category label of the deframed image corresponding to the ROI region to wearing irregularity;
the image output module 150 is configured to output the deframed image whose category label corresponds to wearing irregularity.
It should be noted that, in a typical example, after the deframed image is output, an alarm should be raised promptly to notify a monitor to arrive at the scene for supervision.
This embodiment correspondingly provides a system for identifying irregularly worn work clothes. A purpose-built training sample set is used to train a YOLOv4 network structure, yielding a target detection model that outputs three categories: wearing irregularity, suspected wearing irregularity, and wearing norm. Target detection is then performed on the deframed images of a real-time monitoring video stream using this model, and the deframed images output as suspected wearing irregularity are examined further: the ROI region in a suspected deframed image is extracted, and the length-width ratio of the ROI is used to judge whether the sleeve or trouser leg reflects a normal action; if not, the wearing is judged irregular. If the action is normal, the entropy of the gray-level co-occurrence matrix of the gray feature image indicates whether obvious wrinkles appear on the sleeve or trouser leg, and wearing is judged irregular when obvious wrinkles appear. Identifying whether work clothes are worn properly through these multiple judgment rules improves the accuracy of identifying irregularly worn work clothes, reduces the probability of misidentification, and also improves identification efficiency.
Further, the system also includes:
the augmentation module is used for performing augmentation processing on each work clothes wearing image in the work clothes wearing image sample set to obtain an augmented work clothes wearing image sample set.
It should be noted that the augmentation specifically includes rotating, mirroring, and adjusting the brightness and contrast of each work clothes wearing image in the sample set, so that the number of samples is multiplied and the diversity and representativeness of the sample set are improved.
Further, as shown in fig. 3, the category label module 200 specifically includes: a first judgment sub-module 201 and a second judgment sub-module 202;
the first judgment sub-module 201 is used for judging whether the worker in the work clothes wearing image is wearing work clothes; if not, determining that the category label of the corresponding work clothes wearing image is wearing irregularity; if so, triggering a working signal for the second judgment sub-module 202;
the second judgment sub-module 202 is configured to receive the working signal triggered by the first judgment sub-module 201, and to judge, based on the work clothes wearing rules, whether the work clothes worn by the worker in the work clothes wearing image are worn properly; if so, the category label of the corresponding work clothes wearing image is determined to be wearing norm; if not, the category label is determined to be suspected wearing irregularity.
Further, the system also includes:
the filtering module is used for filtering the plurality of deframed images to obtain a plurality of filtered deframed images.
Further, the second screening module 800 is further configured to screen out, from the deframed images screened by the first screening module, those whose category label is wearing irregularity, and to transmit these deframed images to the image output module so that the corresponding deframed images are output.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. An identification method for irregular wearing work clothes is characterized by comprising the following steps:
s1, collecting working clothes wearing images of personnel in the electric power construction site to construct a working clothes wearing image sample set;
s2, classifying each working clothes wearing image in the working clothes wearing image sample set according to the dressing classification rule, and determining a class label, wherein the class label comprises a wearing irregularity, a suspected wearing irregularity and a wearing specification;
s3, manually labeling a target area in each working clothes wearing image by using a rectangular frame according to the category label so as to obtain the coordinates of the rectangular frame and the corresponding category label to manufacture a training sample set, wherein the target area comprises sleeves and trouser legs of the working clothes;
s4, training a YOLOV4 network structure through the training sample set to obtain a target detection model;
S5, decoding the real-time monitoring video stream to obtain a plurality of deframed images;
s6, performing target identification on the multiple de-framing images based on the target detection model, so as to obtain a corresponding target rectangular frame and rectangular frame information thereof in each de-framing image, wherein the rectangular frame information comprises a category label, a target rectangular frame coordinate and a confidence score of whether the target rectangular frame is the target area;
s7, comparing the confidence score of each deframer image with a preset confidence score threshold value, so as to screen deframer images larger than the preset confidence score threshold value from the plurality of deframer images;
S8, screening, from the deframed images screened in step S7, the deframed images suspected of irregular wearing according to the category labels;
S9, extracting an ROI region from the corresponding target region in the suspected irregular-wearing deframed images screened in step S8 to obtain a feature image of the ROI region, wherein the feature image comprises a sleeve image of the work clothes and a trouser leg image of the work clothes;
s10, calculating the length-width ratio of the ROI according to the ROI and the coordinates of the target rectangular frame;
s11, judging whether the length-width ratio of the ROI exceeds a preset length-width ratio threshold value, if so, executing a step S13, and if not, executing a step S12;
s12, carrying out gray level processing on the characteristic image corresponding to the ROI area to obtain a gray level characteristic image, calculating an entropy value of a gray level co-occurrence matrix of the gray level characteristic image, judging whether the entropy value of the gray level co-occurrence matrix is larger than a preset entropy value or not, and if so, executing a step S13; if the above determination is no, go to step S5;
S13, changing the category label of the deframed image corresponding to the ROI region to wearing irregularity, and executing step S14;
S14, outputting the deframed image whose category label corresponds to wearing irregularity.
2. The method for identifying the irregular wearing work clothes according to claim 1, wherein after the step S1, the step S2 further comprises:
s102, carrying out augmentation technology processing on each work clothes wearing image in the work clothes wearing image sample set, and accordingly obtaining a work clothes wearing image augmentation sample set.
3. The method for identifying the irregular wearing work clothes according to claim 1, wherein the step S2 specifically comprises:
S201, judging whether the worker in the work clothes wearing image is wearing work clothes; if not, determining that the category label of the corresponding work clothes wearing image is wearing irregularity; if so, executing step S202;
S202, judging, based on a work clothes wearing rule, whether the work clothes worn by the worker in the work clothes wearing image are worn properly; if so, determining that the category label of the corresponding work clothes wearing image is wearing norm; if not, determining that the category label is suspected wearing irregularity.
4. The method for identifying irregularly worn work clothes according to claim 1, wherein after step S5 and before step S6, the method further comprises:
filtering the plurality of deframed images to obtain a plurality of filtered deframed images.
5. The method for identifying irregularly worn work clothes according to claim 1, wherein step S8 further comprises: screening out, from the deframed images screened in step S7, the deframed images whose category label is wearing irregularity, and proceeding to step S14.
6. An irregular wearing work garment identification system, comprising:
the system comprises a sample set construction module, a data acquisition module and a data processing module, wherein the sample set construction module is used for acquiring working clothes wearing images of personnel in the electric power construction site so as to construct a working clothes wearing image sample set;
the category label module is used for classifying each working clothes wearing image in the working clothes wearing image sample set according to the dressing classification rule and determining a category label, wherein the category label comprises a wearing irregularity, a suspected wearing irregularity and a wearing specification;
the labeling module is used for manually labeling a target area in each working clothes wearing image by using a rectangular frame according to the category label so as to obtain the coordinate of the rectangular frame and the corresponding category label to manufacture a training sample set, wherein the target area comprises sleeves and trouser legs of the working clothes;
the training module is used for training a Yolov4 network structure through the training sample set to obtain a target detection model;
the decoding module is used for decoding the real-time monitoring video stream so as to obtain a plurality of unframed images;
the target detection module is used for carrying out target identification on the multiple de-framing images based on the target detection model so as to obtain a corresponding target rectangular frame and rectangular frame information thereof in each de-framing image, wherein the rectangular frame information comprises a category label, a target rectangular frame coordinate and a confidence score of whether the target rectangular frame is a target area;
the first screening module is used for comparing the confidence score of each deframed image with a preset confidence score threshold value so as to screen deframed images which are larger than the preset confidence score threshold value from the plurality of deframed images;
the second screening module is used for screening, from the deframed images screened by the first screening module, the deframed images suspected of irregular wearing according to the category labels;
the ROI extraction module is used for extracting an ROI area from the corresponding target area in the screened suspected irregular wearing unframed image so as to obtain a characteristic image of the ROI area, wherein the characteristic image comprises a sleeve image of the working clothes and a trouser leg image of the working clothes;
the calculation module is used for calculating the length-width ratio of the ROI according to the ROI area and the target rectangular frame coordinate;
the judging module is used for judging whether the length-width ratio of the ROI exceeds a preset length-width ratio threshold value or not;
the texture detection module is used for carrying out gray level processing on the characteristic image corresponding to the ROI area so as to obtain a gray level characteristic image, calculating an entropy value of a gray level co-occurrence matrix of the gray level characteristic image, and judging whether the entropy value of the gray level co-occurrence matrix is larger than a preset entropy value or not;
the label changing module is used for changing the category label in the unframed image corresponding to the ROI area into a non-standard wearing label;
and the image output module is used for outputting the deframing image of which the class label corresponds to the wearing non-specification.
7. The irregular wearing workwear identification system of claim 6 further comprising:
and the augmentation module is used for carrying out augmentation technology processing on each work clothes wearing image in the work clothes wearing image sample set so as to obtain the work clothes wearing image augmentation sample set.
8. The irregular wearing work clothes recognition system of claim 6, wherein the category label module specifically comprises: a first judgment submodule and a second judgment submodule;
the first judgment sub-module is used for judging whether the worker in the work clothes wearing image is wearing work clothes; if not, determining that the category label of the corresponding work clothes wearing image is wearing irregularity; if so, triggering a working signal for the second judgment sub-module;
the second judgment sub-module is used for receiving the working signal triggered by the first judgment sub-module, and for judging, based on a work clothes wearing rule, whether the work clothes worn by the worker in the work clothes wearing image are worn properly; if so, determining that the category label of the corresponding work clothes wearing image is wearing norm; if not, determining that the category label is suspected wearing irregularity.
9. The irregular wearing workwear identification system of claim 6 further comprising:
and the filtering module is used for carrying out filtering processing on the plurality of the de-framing images so as to obtain a plurality of filtered de-framing images.
10. The system for identifying irregularly worn work clothes according to claim 6, wherein the second screening module is further configured to screen out, from the deframed images screened by the first screening module, the deframed images whose category label is wearing irregularity according to the category labels, and to transmit these deframed images to the image output module to output the corresponding deframed images.
CN202110643917.3A 2021-06-09 2021-06-09 Method and system for identifying irregular wearing work clothes Active CN113313186B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110643917.3A CN113313186B (en) 2021-06-09 2021-06-09 Method and system for identifying irregular wearing work clothes


Publications (2)

Publication Number Publication Date
CN113313186A true CN113313186A (en) 2021-08-27
CN113313186B CN113313186B (en) 2023-01-24

Family

ID=77378345

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110643917.3A Active CN113313186B (en) 2021-06-09 2021-06-09 Method and system for identifying irregular wearing work clothes

Country Status (1)

Country Link
CN (1) CN113313186B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101364263A (en) * 2008-09-28 2009-02-11 腾讯科技(深圳)有限公司 Method and system for detecting skin texture to image
CN109635697A (en) * 2018-12-04 2019-04-16 国网浙江省电力有限公司电力科学研究院 Electric operating personnel safety dressing detection method based on YOLOv3 target detection
CN109635758A (en) * 2018-12-18 2019-04-16 武汉市蓝领英才科技有限公司 Wisdom building site detection method is dressed based on the high altitude operation personnel safety band of video
CN109934287A (en) * 2019-03-12 2019-06-25 上海宝尊电子商务有限公司 A kind of clothing texture method for identifying and classifying based on LBP and GLCM
CN111242185A (en) * 2020-01-03 2020-06-05 凌云光技术集团有限责任公司 Defect rapid preliminary screening method and system based on deep learning
CN112183471A (en) * 2020-10-28 2021-01-05 西安交通大学 Automatic detection method and system for standard wearing of epidemic prevention mask of field personnel


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHENGXIA LIU ET AL.: "Novel Measurement for Multidirectional Fabric Wrinkling", Fibers and Polymers *
WEI CHAOBING: "CNN-based intelligent safety monitoring and recognition algorithm", Electronic Design Engineering *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419491A (en) * 2021-12-28 2022-04-29 云从科技集团股份有限公司 Video identification method and device and computer storage medium
CN116912756A (en) * 2023-04-14 2023-10-20 广东墨点鹰智能科技有限公司 Edge protection safety reminder identification method and system
CN116912756B (en) * 2023-04-14 2024-04-09 广东墨点鹰智能科技有限公司 Edge protection safety reminder identification method and system

Also Published As

Publication number Publication date
CN113313186B (en) 2023-01-24

Similar Documents

Publication Publication Date Title
CN113313186B (en) Method and system for identifying irregular wearing of work clothes
CN108537154A (en) Bird's nest recognition method for power transmission lines based on HOG features and machine learning
CN110414400B (en) Automatic detection method and system for wearing of safety helmet on construction site
CN111209848A (en) Real-time fall detection method based on deep learning
CN112396658A (en) Indoor personnel positioning method and positioning system based on video
CN114937232B (en) Wearing detection method, system and equipment for medical waste treatment personnel protective appliance
CN115249331B (en) Mine ecological safety identification method based on convolutional neural network model
CN112036327A (en) SSD-based lightweight safety helmet detection method
CN112188164A (en) AI vision-based violation real-time monitoring system and method
CN111401310B (en) Kitchen sanitation safety supervision and management method based on artificial intelligence
CN115223249A (en) Quick analysis and identification method for unsafe behaviors of underground personnel based on machine vision
CN111860187A (en) High-precision mask-wearing identification method and system
CN114613101A (en) Intelligent hotel security system based on big data
CN113384267A (en) Real-time fall detection method, system, terminal device and storage medium
CN113506416A (en) Engineering abnormity early warning method and system based on intelligent visual analysis
CN113536842A (en) Electric power operator safety dressing identification method and device
CN111582183A (en) Mask identification method and system in public place
CN115909212A (en) Real-time early warning method for typical violation behaviors of power operation
CN110751125A (en) Wearing detection method and device
CN113762115B (en) Distribution network operator behavior detection method based on key point detection
CN113469150B (en) Method and system for identifying risk behaviors
CN115240277A (en) Security check behavior monitoring method and device, electronic equipment and storage medium
CN115049875A (en) Detection method for wearing insulating gloves in transformer substation based on deep learning
CN114359831A (en) Intelligent identification system and method for worker sideways falls based on risk-sign reasoning
CN112528855A (en) Electric power operation dressing standard identification method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant