CN113837138B - Dressing monitoring method, dressing monitoring system, dressing monitoring medium and electronic terminal - Google Patents


Info

Publication number
CN113837138B
CN113837138B (application CN202111165053.5A)
Authority
CN
China
Prior art keywords
human body
target
dressing
similarity
local
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111165053.5A
Other languages
Chinese (zh)
Other versions
CN113837138A (en)
Inventor
杨胜元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Unisinsight Technology Co Ltd
Original Assignee
Chongqing Unisinsight Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Unisinsight Technology Co Ltd
Priority to CN202111165053.5A
Publication of CN113837138A
Application granted
Publication of CN113837138B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Alarm Systems (AREA)

Abstract

The invention provides a dressing monitoring method, a dressing monitoring system, a dressing monitoring medium and an electronic terminal. The dressing monitoring method comprises the following steps: collecting material pictures and inputting them into a pre-trained feature extraction model for feature extraction to obtain a clothing base, the clothing base comprising human body material characteristics and corresponding garment types; acquiring a dressing monitoring task; invoking the relevant human body material characteristics in the clothing base according to the monitored clothing type in the dressing monitoring task; acquiring a picture to be identified; inputting the picture to be identified into the feature extraction model for feature extraction to obtain target feature association information, the target feature association information comprising human body overall characteristics, human body local characteristics and the local confidences corresponding to the human body local characteristics; and performing dressing monitoring according to the relevant human body material characteristics and the target feature association information. The dressing monitoring method effectively improves dressing monitoring accuracy, reduces dressing monitoring cost, offers a higher degree of automation and lower demands on operator expertise, and is more convenient to implement.

Description

Dressing monitoring method, dressing monitoring system, dressing monitoring medium and electronic terminal
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a dressing monitoring method, a dressing monitoring system, a dressing monitoring medium, and an electronic terminal.
Background
With the development of computer technology, image-based monitoring has received increasing attention. In some application scenarios, the clothing of on-site personnel needs to be monitored to determine whether it conforms to a required uniform, such as work uniforms at postal savings outlets or reflective clothing at construction sites. At present, the wearing of people appearing in a video is generally monitored by having staff manually watch the surveillance footage and judge whether a target person's dressing meets the relevant regulations; however, such manual monitoring tends to yield low dressing monitoring accuracy, high cost and a low degree of automation.
Disclosure of Invention
The invention provides a dressing monitoring method, a dressing monitoring system, a dressing monitoring medium and an electronic terminal, which are used to solve the problems of low dressing monitoring accuracy, high cost and a low degree of automation in the prior art.
The dressing monitoring method provided by the invention comprises the following steps:
collecting material pictures, inputting the material pictures into a pre-trained feature extraction model for feature extraction, and obtaining a clothing base, the clothing base comprising: human body material characteristics and corresponding garment types;
acquiring a dressing monitoring task;
invoking the relevant human body material characteristics in the clothing base according to the monitored clothing type in the dressing monitoring task;
acquiring a picture to be identified;
inputting the picture to be identified into the feature extraction model for feature extraction, and obtaining target feature association information, the target feature association information comprising: human body overall characteristics, human body local characteristics and the local confidences corresponding to the human body local characteristics;
and performing dressing monitoring according to the relevant human body material characteristics and the target feature association information.
Optionally, the step of obtaining the clothing base includes:
performing clothing class marking on the material pictures;
inputting the marked material pictures into the feature extraction model for feature extraction, and obtaining a plurality of human body material characteristics, the human body material characteristics comprising: overall material characteristics, local material characteristics, and the local confidences corresponding to the local material characteristics;
the local material characteristics comprise head region material characteristics and shoulder and waist region material characteristics, and the local confidences corresponding to the head region material characteristics and the shoulder and waist region material characteristics are obtained;
judging whether the local confidences corresponding to the head region material characteristics and the shoulder and waist region material characteristics exceed a preset confidence threshold;
and if the local confidence of the head region material characteristics and/or the local confidence of the shoulder and waist region material characteristics does not exceed the confidence threshold, filtering out the corresponding human body target to complete the acquisition of the clothing base.
Optionally, the step of obtaining the feature extraction model includes:
obtaining a training set, the training set comprising: a plurality of training samples, real sample features corresponding to the training samples, and real confidences corresponding to the real sample features;
inputting the training samples into a neural network for feature extraction, and obtaining predicted sample features and the prediction confidences of the predicted sample features, the predicted sample features comprising: predicted sample overall features and predicted sample local features;
and training the neural network according to the real sample characteristics, the real confidence coefficient, the predicted sample characteristics and the predicted confidence coefficient to obtain the characteristic extraction model.
Optionally, the step of performing dressing monitoring according to the related human body material characteristics and the target characteristic association information includes:
obtaining the comparison mode in the dressing monitoring task, the comparison modes comprising: white-list comparison and black-list comparison;
inputting the picture to be identified into the feature extraction model to extract human body targets, and obtaining the corresponding human body targets;
judging whether each human body target is in a preset target area, and performing primary filtering on human body targets that are not in the target area;
judging whether the local confidence of the head region feature and the local confidence of the shoulder and waist region feature among the human body local characteristics of the human body target exceed a preset confidence threshold; if the local confidence of the head region feature and/or of the shoulder and waist region feature does not exceed the confidence threshold, judging that the corresponding human body target lacks an upper body, and performing secondary filtering on that human body target;
after the filtering is finished, performing a primary comparison of the human body overall characteristics with the human body overall material characteristics in the relevant human body material characteristics to obtain a first similarity;
and performing dressing monitoring according to the comparison mode and the first similarity.
Optionally, the step of performing dressing monitoring according to the comparison mode and the first similarity includes:
if the comparison mode is white-list comparison, judging that the corresponding human body target has a dressing abnormality when the first similarity does not exceed a preset similarity threshold, and incrementing the dressing abnormality times of the human body target by one;
when the first similarity exceeds the similarity threshold, performing a secondary comparison of the shoulder and waist region characteristics in the human body local characteristics with the shoulder and waist region material characteristics in the relevant human body material characteristics to obtain a second similarity;
judging whether the second similarity exceeds the similarity threshold; if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target has a dressing abnormality, and incrementing the dressing abnormality times of the human body target by one;
and performing dressing monitoring according to the dressing abnormality times.
Optionally, the step of performing dressing monitoring further includes:
if the comparison mode is black-list comparison, judging that the corresponding human body target has a dressing abnormality when the first similarity exceeds a preset similarity threshold, and incrementing the dressing abnormality times of the human body target by one;
when the first similarity does not exceed the similarity threshold, performing a tertiary comparison of the shoulder and waist region characteristics in the human body local characteristics with the shoulder and waist region material characteristics in the relevant human body material characteristics to obtain a third similarity;
judging whether the third similarity exceeds the similarity threshold; if so, judging that the corresponding human body target has a dressing abnormality, and incrementing the dressing abnormality times of the human body target by one;
and performing dressing monitoring according to the dressing abnormality times.
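The white-list and black-list comparison cascades above can be sketched as follows. The similarity metric and threshold value are assumptions: the patent specifies neither, so cosine similarity and the dictionary layout used here are purely illustrative.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (an illustrative metric;
    the patent does not name one)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def is_dressing_abnormal(mode, target, material, sim_thresh=0.6):
    """Two-stage cascade from the white-list / black-list steps above.
    `target` and `material` map 'overall' and 'shoulder_waist' to feature
    vectors; the key names are illustrative, not the patent's."""
    first = cosine_similarity(target["overall"], material["overall"])
    if mode == "whitelist":
        if first <= sim_thresh:
            return True                      # primary comparison failed: not the uniform
        second = cosine_similarity(target["shoulder_waist"], material["shoulder_waist"])
        return second <= sim_thresh          # secondary shoulder-waist comparison failed
    if mode == "blacklist":
        if first > sim_thresh:
            return True                      # primary comparison matched a listed garment
        third = cosine_similarity(target["shoulder_waist"], material["shoulder_waist"])
        return third > sim_thresh            # tertiary shoulder-waist comparison matched
    raise ValueError(f"unknown comparison mode: {mode}")
```

Note the symmetry: the white list flags a target when similarity is too low, the black list when it is too high, with the shoulder-and-waist region acting as a second opinion in both modes.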
Optionally, the step of performing dressing monitoring according to the dressing abnormality number includes:
judging whether the existence time of the human body target exceeds a preset action duration threshold;
if the existence time of the human body target exceeds the action duration threshold, resetting the dressing abnormality times, re-recording the existence time of the human body target, and re-judging the dressing abnormality;
if the existence time of the human body target does not exceed the action duration threshold, judging whether the dressing abnormality times of the human body target is greater than or equal to a preset abnormality times threshold;
if the dressing abnormality times of the human body target is greater than or equal to the abnormality times threshold, issuing a dressing abnormality warning to complete the dressing monitoring;
the step of obtaining the abnormality times threshold comprises: acquiring the abnormality times threshold according to the existence time of the human body target, a preset image recognition speed, and the hit rate of the dressing abnormality behaviour, where the mathematical expression for the abnormality times threshold is:
C = t × V × β
where C is the abnormality times threshold, t is the time for which the human body target has currently existed, V is the image recognition speed, and β is the hit rate of the dressing abnormality behaviour.
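The threshold formula above can be computed directly. A minimal sketch; rounding the product down to a whole count of detections is an assumption, since the patent gives only the product C = t × V × β.

```python
def abnormality_times_threshold(t, v, beta):
    """C = t * V * beta, where t is the current existence time of the human
    body target (seconds), V the image recognition speed (pictures analysed
    per second), and beta the hit rate of the dressing abnormality behaviour.
    Truncating to an integer count is an assumption, not the patent's text."""
    return int(t * v * beta)
```

For example, a target present for 10 seconds at 4 pictures per second with a hit rate of 0.5 gives a threshold of 20 abnormal detections before a warning is raised.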
The invention also provides a dressing monitoring system, comprising:
the preprocessing module is used for collecting material pictures, inputting the material pictures into a pre-trained feature extraction model for feature extraction, and obtaining a clothing base, wherein the clothing base comprises: human body material characteristics and corresponding garment types;
the resource scheduling module is used for acquiring a dressing monitoring task, and invoking the relevant human body material characteristics in the clothing base according to the monitored clothing type in the dressing monitoring task;
the dressing monitoring module is used for acquiring a picture to be identified; inputting the picture to be identified into the feature extraction model for feature extraction, and obtaining target feature association information, the target feature association information comprising: human body overall characteristics, human body local characteristics and the local confidences corresponding to the human body local characteristics; and performing dressing monitoring according to the relevant human body material characteristics and the target feature association information. The preprocessing module, the resource scheduling module and the dressing monitoring module are connected.
The invention also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor implements a method as described in any of the above.
The invention also provides an electronic terminal, comprising: a processor and a memory;
the memory is configured to store a computer program, and the processor is configured to execute the computer program stored in the memory, so as to cause the terminal to perform the method according to any one of the above.
The invention has the beneficial effects that: in the dressing monitoring method, system, medium and electronic terminal, material pictures are collected and input into a pre-trained feature extraction model for feature extraction to obtain a clothing base, the clothing base comprising human body material characteristics and corresponding garment types; a dressing monitoring task is acquired; the relevant human body material characteristics in the clothing base are invoked according to the monitored clothing type in the dressing monitoring task; a picture to be identified is acquired and input into the feature extraction model for feature extraction to obtain target feature association information, comprising human body overall characteristics, human body local characteristics and the local confidences corresponding to the human body local characteristics; and dressing monitoring is performed according to the relevant human body material characteristics and the target feature association information. The construction of the clothing base and the invocation of base resources are thus well realized; the human body targets in the pictures to be identified are identified on the basis of the base resources, and the human body overall characteristics, the human body local characteristics and the corresponding local confidences are effectively combined, which improves the accuracy of dressing monitoring, reduces the cost of dressing monitoring, raises the degree of automation, requires no construction of complex models such as a dedicated dressing monitoring model or dressing detection model, and is more convenient to implement.
Drawings
FIG. 1 is a flow chart of a method for monitoring a dressing in an embodiment of the invention.
Fig. 2 is a schematic flow chart of acquiring a clothing base in the dressing monitoring method according to the embodiment of the invention.
Fig. 3 is a schematic flow chart of acquiring a feature extraction model in the dressing monitoring method according to an embodiment of the invention.
Fig. 4 is a schematic flow chart of dressing monitoring according to related information of relevant human body material characteristics and target characteristics in a dressing monitoring method according to an embodiment of the invention.
Fig. 5 is a schematic structural diagram of a dressing monitoring system in an embodiment of the invention.
Detailed Description
The following describes the embodiments of the present invention through specific examples, and those skilled in the art can easily understand other advantages and effects of the present invention from the content disclosed in this specification. The present invention may also be implemented or applied through other different specific embodiments, and the details of this specification may be modified or changed in various ways based on different viewpoints and applications without departing from the spirit of the present invention. It should be noted that the following embodiments and the features in the embodiments may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments merely illustrate the basic concept of the present invention in a schematic way, so they show only the components related to the present invention and are not drawn according to the number, shape and size of the components in actual implementation; in actual implementation the form, quantity and proportion of each component may change arbitrarily, and the component layout may also be more complicated.
The inventors found that, with the development of computer technology, image-based monitoring is receiving increasing attention. In some application scenarios the clothing of on-site personnel needs to be monitored to determine whether it conforms to a required uniform, such as work uniforms at postal savings outlets or reflective clothing at construction sites. At present, the wearing of people appearing in a video is generally monitored by manually watching the surveillance footage to judge whether a target person's dressing meets the relevant regulations; however, such manual monitoring tends to yield low accuracy, high cost and a low degree of automation. The inventors therefore propose a dressing monitoring method, system, medium and electronic terminal: material pictures are collected and input into a pre-trained feature extraction model for feature extraction to obtain a clothing base, the clothing base comprising human body material characteristics and corresponding garment types; a dressing monitoring task is acquired; the relevant human body material characteristics in the clothing base are invoked according to the monitored clothing type in the dressing monitoring task; a picture to be identified is acquired and input into the feature extraction model for feature extraction to obtain target feature association information, comprising human body overall characteristics, human body local characteristics and the local confidences corresponding to the human body local characteristics; and dressing monitoring is performed according to the relevant human body material characteristics and the target feature association information. The construction of the clothing base and the invocation of base resources are thus well realized; the human body targets in the pictures to be identified are identified on the basis of the base resources, and the human body overall characteristics, the human body local characteristics and the corresponding local confidences are effectively combined, which improves the accuracy of dressing monitoring, reduces its cost, raises the degree of automation, avoids building complex models such as a dedicated dressing monitoring or dressing detection model, and reduces the professional complexity of dressing monitoring. A better effect than an independently trained dressing detection model can be obtained with fewer materials; the method is convenient to implement, adapts to new scenes quickly, and achieves a good dressing monitoring effect.
As shown in fig. 1, the dressing monitoring method in this embodiment includes:
S101: collecting material pictures, inputting the material pictures into a pre-trained feature extraction model for feature extraction, and obtaining a clothing base, the clothing base comprising material pictures, human body material characteristics and corresponding garment types. There are multiple material pictures; for example, a plurality of human body pictures containing the uniform clothing, taken at different angles and in different postures, are collected as material pictures, the garment types are marked on the material pictures, the marked material pictures are input into the feature extraction model to distinguish human body targets, a plurality of material pictures each containing a single human body target are obtained, feature extraction is performed on these single-target material pictures to obtain human body material characteristics, and the material pictures, the human body material characteristics and the corresponding garment types are stored to construct the clothing base.
S102: acquiring a dressing monitoring task. The dressing monitoring task includes: the monitored clothing type, the position and size of the target area, the similarity threshold, the action duration threshold and the hit rate of the dressing abnormality behaviour, where the monitored clothing type is, for example, a postal savings work uniform or reflective clothing, and the target area is a preset image identification area.
S103: invoking the relevant human body material characteristics in the clothing base according to the monitored clothing type in the dressing monitoring task; that is, according to the monitored clothing type in the dressing monitoring task, the human body material characteristics of the same garment type are called from the clothing base as the relevant human body material characteristics and stored in memory as a target base, and dressing abnormality recognition and dressing monitoring are then performed on the basis of the target base.
S104: acquiring a picture to be identified. The picture to be identified can be obtained from a surveillance video, for example: acquiring a surveillance video and capturing frames from it to obtain a plurality of pictures to be identified.
S105: inputting the picture to be identified into the feature extraction model for feature extraction, and obtaining target feature association information, the target feature association information comprising: human body overall characteristics, human body local characteristics and the local confidences corresponding to the human body local characteristics. Inputting the picture to be identified into the feature extraction model allows the features of each human body target in the picture to be obtained, including the human body overall characteristics and the human body local characteristics; by also obtaining the local confidence corresponding to each human body local characteristic, the local confidences can be combined with the overall and local characteristics, which improves the accuracy of the subsequent dressing monitoring of the human body target. The human body local characteristics include: head region features, shoulder and waist region features, crotch region features and leg region features, the shoulder and waist region features being the features of the region from the shoulders to the waist of a human body target.
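The target feature association information of S105 can be modelled as a small structure. The field and region names below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class TargetFeatureInfo:
    """Target feature association information for one human body target:
    an overall feature vector plus per-region local features, each paired
    with its local confidence."""
    overall: List[float]                          # human body overall characteristics
    local: Dict[str, Tuple[List[float], float]]   # region -> (local feature, local confidence)

    def upper_body_visible(self, conf_thresh: float) -> bool:
        # the head and shoulder-waist confidences gate the later filtering steps
        return (self.local["head"][1] > conf_thresh
                and self.local["shoulder_waist"][1] > conf_thresh)
```

The regions would be "head", "shoulder_waist", "crotch" and "leg", matching the four local characteristics named in S105.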
S106: performing dressing monitoring according to the relevant human body material characteristics and the target feature association information. By invoking the relevant human body material characteristics in the clothing base and, on that basis, combining the human body overall characteristics, the human body local characteristics and the local confidences corresponding to the human body local characteristics in the target feature association information, the accuracy of dressing abnormality recognition and dressing monitoring can be improved, the difficulty of dressing monitoring is reduced, real-time monitoring of dressing abnormality is realized, the cost of dressing monitoring is reduced, and the degree of automation is higher.
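Steps S101 to S106 can be sketched end to end as below. The extractor and match function are caller-supplied stand-ins: the patent relies on a pre-trained feature extraction model whose internals it does not specify, so everything here is illustrative.

```python
def build_clothing_base(material_pictures, garment_type, extract_features):
    # S101: extract human body material characteristics from labelled materials
    return [{"type": garment_type, "features": extract_features(p)}
            for p in material_pictures]

def run_dressing_monitoring(pictures, base, monitored_type, extract_features, matches):
    # S103: invoke the material characteristics for the monitored garment type
    relevant = [entry["features"] for entry in base if entry["type"] == monitored_type]
    abnormal = []
    for picture in pictures:                  # S104: pictures to be identified
        target = extract_features(picture)    # S105: target feature association info
        # S106: flag the target when it matches none of the relevant materials
        # (white-list behaviour; the black-list variant inverts the test)
        if not any(matches(target, m) for m in relevant):
            abnormal.append(picture)
    return abnormal
```

With a toy extractor that returns the picture itself and equality as the match test, a base built from a "uniform" material flags only the non-matching pictures.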
As shown in fig. 2, in some embodiments, the step of obtaining the clothing base comprises:
S201: performing garment type marking on the material pictures;
S202: inputting the marked material pictures into the feature extraction model for feature extraction, and obtaining a plurality of human body material characteristics, the human body material characteristics comprising overall material characteristics and local material characteristics, each local material characteristic having a corresponding local confidence. The local material characteristics include head region material characteristics and shoulder and waist region material characteristics, for which the corresponding local confidences are obtained; the local material characteristics further include crotch region material characteristics and leg region material characteristics;
S203: judging whether the local confidences corresponding to the head region material characteristics and the shoulder and waist region material characteristics exceed a preset confidence threshold;
S204: if the local confidence of the head region material characteristics and/or the local confidence of the shoulder and waist region material characteristics does not exceed the confidence threshold, filtering out the corresponding human body target to complete the acquisition of the clothing base. When the local confidence of the head region material characteristics and/or of the shoulder and waist region material characteristics does not exceed the preset confidence threshold, the corresponding human body target is judged to lack an upper body and is filtered out, and the filtered human body material characteristics and the corresponding local confidences are used as the data of the clothing base to construct the clothing base. Filtering human body targets according to the local confidences of the head region material characteristics and of the shoulder and waist region material characteristics improves the accuracy of the subsequent dressing abnormality recognition and dressing monitoring.
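The confidence-based filtering of S203 and S204 amounts to a simple predicate over each candidate target. The data layout and the threshold value here are assumptions; the patent leaves both to configuration:

```python
def filter_base_targets(targets, conf_thresh):
    """S203-S204: keep only human body targets whose head and shoulder-waist
    local confidences both exceed the threshold; the rest are judged to lack
    a visible upper body and are dropped from the clothing base. Each target
    maps a region name to (material feature, local confidence); the region
    names are illustrative."""
    return [t for t in targets
            if t["head"][1] > conf_thresh and t["shoulder_waist"][1] > conf_thresh]
```

A target photographed from the waist down, for example, would carry a low shoulder-waist confidence and be excluded before it could pollute the base.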
As shown in fig. 3, in order to improve the accuracy of dressing monitoring, the inventors propose adding training of the local confidences of the local features to the training process of the feature extraction model, so as to improve both the accuracy of dressing monitoring and the training efficiency of the feature extraction model. The step of obtaining the feature extraction model comprises the following steps:
S301: obtaining a training set, the training set comprising: a plurality of training samples, real sample features corresponding to the training samples, and real confidences corresponding to the real sample features. The real sample features include real overall sample features and real local sample features, the real local sample features comprising: real head region sample features, real shoulder and waist region sample features, real crotch region sample features and real leg region sample features;
S302: inputting the training samples into a neural network for feature extraction, and obtaining predicted sample features and the prediction confidences of the predicted sample features, the predicted sample features comprising: predicted sample overall features and predicted sample local features. The predicted sample local features include: predicted head region sample features, predicted shoulder and waist region sample features, predicted crotch region sample features and predicted leg region sample features;
s303: and training the neural network according to the real sample characteristics, the real confidence coefficient, the predicted sample characteristics and the predicted confidence coefficient to obtain the characteristic extraction model. Through increasing the training to the local confidence coefficient of local feature in the training process of the feature extraction model, human body target filtration can be conveniently carried out according to the local confidence coefficient in the subsequent dressing monitoring process, so that the dressing monitoring accuracy is improved, and errors are avoided.
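One way to realize S303 is to minimize a joint objective with one term for the feature prediction and one for the confidence prediction. The patent does not specify the loss form, so the squared-error formulation below is an assumption for illustration only.

```python
import numpy as np

def joint_loss(pred_feat, true_feat, pred_conf, true_conf, alpha=1.0):
    """Assumed joint training objective: feature regression error plus a
    weighted local-confidence regression error, so the network learns to
    predict both the features and their confidences (S303)."""
    feat_term = float(np.mean((np.asarray(pred_feat, float)
                               - np.asarray(true_feat, float)) ** 2))
    conf_term = float(np.mean((np.asarray(pred_conf, float)
                               - np.asarray(true_conf, float)) ** 2))
    return feat_term + alpha * conf_term
```

The weight `alpha` (an assumed hyperparameter) balances feature accuracy against confidence accuracy during training.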
As shown in fig. 4, in some embodiments, the step of performing dressing monitoring according to the related human body material features and the target feature association information comprises:
S401: obtaining the comparison mode in the dressing monitoring task, the comparison mode comprising: whitelist comparison and blacklist comparison. Whitelist comparison means: the lower the similarity between the features in the target feature association information and the related human body material features, the greater the possibility that the corresponding human body target is dressed abnormally. Blacklist comparison means: the higher that similarity, the greater the possibility that the corresponding human body target is dressed abnormally. Because the application scenarios of this scheme are diverse (it can be applied to the finance, construction and transportation industries, among others), different scenarios may require different comparison modes. For example, unified uniform monitoring at postal savings outlets generally requires whitelist comparison: the lower the similarity between the features in the acquired target feature association information and the related human body material features, the greater the possibility of a dressing anomaly. Bare-shoulder detection at a construction site generally requires blacklist comparison: the higher that similarity, the greater the possibility of a dressing anomaly. The scheme can therefore adapt well to different application scenarios. When a user enters a dressing monitoring task, the corresponding comparison mode can be selected, enabling targeted monitoring for different application scenarios.
S402: inputting the picture to be identified into the feature extraction model for human body target extraction to obtain the corresponding human body targets; the picture to be identified is input into the feature extraction model, which extracts and distinguishes human body targets so that each individual human body target is determined.
S403: judging whether each human body target is within a preset target area, and performing a first filtering of human body targets not within the target area; the target area may be set in advance, and its size and position may be set according to the actual situation, which will not be described herein. The first filtering of human body targets outside the target area screens the human body targets, facilitating accurate dressing monitoring of the remaining targets.
S404: judging whether the local confidence of the upper-body features among the human body local features of the human body target exceeds a preset confidence threshold; if the local confidence corresponding to the upper-body features does not exceed the confidence threshold, judging that the corresponding human body target does not contain an upper body and performing a second filtering on it. Since current dressing-monitoring requirements usually concern the upper body, removing human body targets without an upper body through this second filtering helps improve the accuracy of subsequent dressing monitoring and is convenient to implement.
S405: after filtering is finished, performing a first comparison between the human body overall features and the human body overall material features among the related human body material features to obtain a first similarity; obtaining the first similarity between the human body overall features and the human body overall material features facilitates determining whether the human body target is dressed abnormally.
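The disclosure does not state which similarity measure is used for the comparison in S405. Cosine similarity between feature vectors is a common choice for this kind of comparison and is sketched below as an assumed measure; the 0-to-100 scaling is likewise an assumption chosen to match threshold values such as 40 used later in the embodiments.

```python
import numpy as np

def feature_similarity(feat_a, feat_b):
    """Assumed similarity measure: cosine similarity between two feature
    vectors, scaled to a 0-100 range for comparison with the similarity
    thresholds mentioned in the embodiments."""
    a = np.asarray(feat_a, dtype=float)
    b = np.asarray(feat_b, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return 100.0 * cos
```

Identical vectors score 100; orthogonal vectors score 0, so a threshold such as 40 separates loosely matching from non-matching dressing.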
S406: performing dressing monitoring according to the comparison mode and the first similarity.
In some embodiments, the step of performing dressing monitoring according to the comparison mode and the first similarity comprises:
if the comparison mode is whitelist comparison, judging that the corresponding human body target is dressed abnormally when the first similarity does not exceed a preset similarity threshold, and incrementing the number of dressing anomalies of the human body target by one;
when the first similarity exceeds the similarity threshold, performing a second comparison between the shoulder-and-waist-region features among the human body local features and the shoulder-and-waist-region material features among the related human body material features to obtain a second similarity; this second comparison further improves the accuracy of dressing-anomaly monitoring.
Judging whether the second similarity exceeds the similarity threshold; if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target is dressed abnormally, and incrementing the number of dressing anomalies of the human body target by one;
and performing dressing monitoring according to the number of dressing anomalies, that is, alarming on or reporting the dressing anomaly according to the accumulated count. Using the number of dressing anomalies accumulated over a period of time to alarm or report smooths the anomaly judgment in probabilistic form, improving the accuracy of dressing-anomaly judgment and dressing monitoring and reducing the probability of false alarms.
If the comparison mode is blacklist comparison, judging that the corresponding human body target is dressed abnormally when the first similarity exceeds a preset similarity threshold, and incrementing the number of dressing anomalies of the human body target by one;
when the first similarity does not exceed the similarity threshold, performing a third comparison between the shoulder-and-waist-region features among the human body local features and the shoulder-and-waist-region material features among the related human body material features to obtain a third similarity; this third comparison effectively improves the accuracy of dressing monitoring.
Judging whether the third similarity exceeds the similarity threshold; if the third similarity exceeds the similarity threshold, judging that the corresponding human body target is dressed abnormally, and incrementing the number of dressing anomalies of the human body target by one;
and performing dressing monitoring according to the number of dressing anomalies.
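The whitelist/blacklist branching described above can be summarized in a single decision function. This is a sketch of the described logic only; the function and parameter names are illustrative, and "does not exceed" is read as "less than or equal to" throughout.

```python
def dressing_abnormal(mode, first_sim, shoulder_waist_sim, sim_threshold):
    """Two-stage comparison described in the steps above.

    whitelist: low similarity to the permitted uniform suggests an anomaly;
    blacklist: high similarity to the prohibited dressing suggests an anomaly.
    Returns True when the target should be judged as dressed abnormally.
    """
    if mode == "whitelist":
        if first_sim <= sim_threshold:              # first comparison fails
            return True
        return shoulder_waist_sim <= sim_threshold  # second comparison
    if mode == "blacklist":
        if first_sim > sim_threshold:               # first comparison hits
            return True
        return shoulder_waist_sim > sim_threshold   # third comparison
    raise ValueError(f"unknown comparison mode: {mode}")
```

In both modes the overall-feature comparison acts as a fast first pass, and the shoulder-and-waist comparison re-checks only the borderline outcomes.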
In some embodiments, feature comparison may also be performed according to a pre-configured feature comparison type, the feature comparison types comprising: local feature comparison, full-feature comparison and comprehensive comparison. Local feature comparison, which includes shoulder-and-waist-region feature comparison, compares the local features of the human body target to judge whether it is dressed abnormally. Full-feature comparison compares the human body overall features of the human body target to judge whether it is dressed abnormally. Comprehensive comparison combines the human body overall features with the human body local features to judge whether the human body target is dressed abnormally. For example: a first similarity between the human body overall features and the human body overall material features is obtained; if the preset comparison mode is whitelist comparison, then when the first similarity exceeds a preset similarity threshold, a second comparison is performed between the shoulder-and-waist-region features among the human body local features and the shoulder-and-waist-region material features among the related human body material features to obtain a second similarity; if the second similarity does not exceed the similarity threshold, the corresponding human body target is judged to be dressed abnormally, and if it exceeds the similarity threshold, judgment proceeds to the next human body target.
In some embodiments, the step of performing dressing monitoring according to the number of dressing anomalies comprises:
judging whether the presence time of the human body target exceeds a preset behavior duration threshold; the behavior duration threshold may be set according to the actual situation, which will not be described herein.
If the presence time of the human body target exceeds the behavior duration threshold, resetting the number of dressing anomalies, re-recording the presence time of the human body target and re-judging dressing anomalies;
if the presence time of the human body target does not exceed the behavior duration threshold, judging whether the number of dressing anomalies of the human body target is greater than or equal to a preset anomaly count threshold;
and if the number of dressing anomalies of the human body target is greater than or equal to the anomaly count threshold, issuing a dressing-anomaly warning or reporting the dressing anomaly to complete dressing monitoring. Comparing the number of dressing anomalies within a fixed time period against the preset anomaly count threshold improves the accuracy of dressing monitoring and dressing-anomaly reporting from the standpoint of judgment probability, avoiding false reports.
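The counting-and-reset flow in the steps above can be sketched per target as follows. The state dictionary and all names are illustrative assumptions, not part of the disclosure.

```python
def update_dressing_monitor(state, abnormal, now, duration_threshold, count_threshold):
    """Per-target anomaly accounting as described above (names assumed).

    If the target's presence time exceeds the behavior duration threshold,
    the anomaly count is reset and the presence time is re-recorded;
    otherwise an alarm is due once the accumulated count reaches the
    anomaly count threshold. Returns True when an alarm should be issued.
    """
    if now - state["first_seen"] > duration_threshold:
        state["count"] = 0          # reset the number of dressing anomalies
        state["first_seen"] = now   # re-record the presence time
    if abnormal:
        state["count"] += 1         # one more abnormal judgment in the window
    return state["count"] >= count_threshold
```

Calling this once per judged frame accumulates anomalies within the duration window and clears stale counts, which is what smooths the judgment and suppresses false alarms.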
In some embodiments, the step of obtaining the anomaly count threshold comprises: obtaining the anomaly count threshold according to the presence time of the human body target, a preset image recognition speed and the dressing-anomaly behavior hit rate, the mathematical expression of the anomaly count threshold being:
C=t×V×β
wherein C is the anomaly count threshold, t is the current presence time of the human body target, V is the image recognition speed, and β is the dressing-anomaly behavior hit rate, i.e. the hit rate for non-standard dressing behavior. Obtaining the anomaly count threshold in this way combines the image recognition speed with the dressing-anomaly behavior hit rate, yielding an accurate threshold and improving dressing-monitoring accuracy. The image recognition speed is the speed at which dressing-anomaly judgment is performed on pictures to be identified, for example 8 frames per second; the dressing-anomaly hit rate may be set according to the actual situation, or a corresponding hit rate may be obtained from historical dressing-anomaly judgment results, which will not be described herein.
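The expression C = t × V × β translates directly to a one-line function. The example values in the usage note (10 seconds of presence, 8 frames per second, 80% hit rate) are illustrative, not taken from the disclosure.

```python
def anomaly_count_threshold(presence_time_s, recognition_speed_fps, hit_rate):
    """C = t * V * beta: the number of abnormal-dressing judgments expected
    within the presence window at the given recognition speed and hit rate."""
    return presence_time_s * recognition_speed_fps * hit_rate
```

For example, a target present for 10 seconds, judged at 8 frames per second with an 80% hit rate, gives a threshold of 64 abnormal judgments.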
In application scenarios that comply with national laws and regulations and where monitoring is permitted, the dressing monitoring method in this embodiment can be used for dressing monitoring, for example uniform detection at postal savings outlets, reflective clothing detection at construction sites and bare-shoulder detection in canteens.
Embodiment one: uniform detection and monitoring at postal savings outlets
When uniform detection needs to be performed on personnel at postal savings outlets, material pictures of the detected personnel's uniforms may be collected by a camera; the material pictures are multi-angle, multi-pose pictures with simple backgrounds. The material pictures are labeled with garment type, the labeled material pictures are uploaded to the feature extraction model, and feature extraction is performed to obtain human body material features, the human body material features comprising: overall material features and local material features, the local material features having corresponding local confidences; the material pictures, the human body material features and the local confidences corresponding to the local material features are stored to complete construction of the clothing base;
acquiring a dressing monitoring task, the dressing monitoring task comprising: the monitored garment type, the position and size of the target area, a similarity threshold (e.g. 40), a behavior duration threshold (e.g. 30 seconds) and a dressing-anomaly behavior hit rate (e.g. 80%); acquiring related human body material features of the same garment type from the clothing base according to the monitored garment type, storing the related human body material features in memory as a target base, and performing dressing-anomaly identification and dressing monitoring on the basis of the target base;
acquiring the preset comparison mode, where the comparison mode for postal savings outlet detection is usually whitelist comparison;
acquiring pictures to be identified, where a plurality of pictures to be identified may be obtained by capturing frames from a surveillance video or extracting image frames;
inputting the picture to be identified into the feature extraction model for human body target detection and extraction, and determining the corresponding human body targets;
judging whether each human body target is within the preset target area, and performing a first filtering of human body targets not within the target area;
acquiring the human body overall features and human body local features of the human body target and the local confidences corresponding to the human body local features; the human body local features include: head-region features, shoulder-and-waist-region features, crotch-region features and leg-region features;
judging whether the local confidence of the head-region features and the local confidence of the shoulder-and-waist-region features among the human body local features exceed a preset confidence threshold; if the local confidence of the head-region features and/or that of the shoulder-and-waist-region features does not exceed the confidence threshold, judging that the corresponding human body target does not contain an upper body and performing a second filtering on it; the confidence threshold may be set according to the actual situation, which will not be described herein.
After filtering is finished, acquiring the pre-configured feature comparison type; the feature comparison type for postal savings outlet detection is full-feature comparison, so the human body overall features are compared with the human body overall material features among the related human body material features to obtain an overall similarity;
judging whether the overall similarity exceeds the preset similarity threshold; when the overall similarity does not exceed the similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the number of dressing anomalies of the human body target by one;
when the overall similarity exceeds the similarity threshold, proceeding to judge the next human body target;
and performing dressing monitoring according to the number of dressing anomalies.
The step of performing dressing monitoring according to the number of dressing anomalies comprises:
judging whether the presence time of the human body target exceeds the preset behavior duration threshold;
if the presence time of the human body target exceeds the behavior duration threshold, resetting the number of dressing anomalies, re-recording the presence time of the human body target and re-judging dressing anomalies;
if the presence time of the human body target does not exceed the behavior duration threshold, judging whether the number of dressing anomalies of the human body target is greater than or equal to a preset anomaly count threshold;
if the number of dressing anomalies of the human body target is greater than or equal to the anomaly count threshold, issuing a dressing-anomaly warning or reporting non-standard-uniform alarm information for the postal savings outlet, completing dressing monitoring.
The step of obtaining the anomaly count threshold comprises: obtaining the anomaly count threshold according to the presence time of the human body target, the preset image recognition speed and the dressing-anomaly behavior hit rate, the mathematical expression of the anomaly count threshold being:
C=t×V×β
wherein C is the anomaly count threshold, t is the current presence time of the human body target, V is the image recognition speed, and β is the dressing-anomaly behavior hit rate.
Embodiment two: construction site reflective clothing detection and monitoring
When performing construction site reflective clothing detection, material pictures of the detected personnel's uniforms are collected by a camera; the material pictures are multi-angle, multi-pose pictures with simple backgrounds. The material pictures are labeled with garment type, the labeled material pictures are uploaded to the feature extraction model, and feature extraction is performed to obtain human body material features, the human body material features comprising: overall material features and local material features, the local material features having corresponding local confidences; the material pictures, the human body material features and the local confidences corresponding to the local material features are stored to complete construction of the clothing base;
Acquiring a dressing monitoring task, the dressing monitoring task comprising: the monitored garment type, the position and size of the target area, a similarity threshold (e.g. 45), a behavior duration threshold (e.g. 20 seconds) and a dressing-anomaly behavior hit rate (e.g. 85%); acquiring related human body material features of the same garment type from the clothing base according to the monitored garment type, storing the related human body material features in memory as a target base, and performing dressing-anomaly identification and dressing monitoring on the basis of the target base;
acquiring the preset comparison mode, where the comparison mode for reflective clothing detection is usually whitelist comparison;
acquiring a picture to be identified;
inputting the picture to be identified into the feature extraction model for human body target detection and extraction, and determining the corresponding human body targets;
judging whether each human body target is within the preset target area, and performing a first filtering of human body targets not within the target area;
acquiring the human body overall features and human body local features of the human body target and the local confidences corresponding to the human body local features; the human body local features include: head-region features, shoulder-and-waist-region features, crotch-region features and leg-region features;
judging whether the local confidence of the head-region features and the local confidence of the shoulder-and-waist-region features among the human body local features exceed a preset confidence threshold; if the local confidence of the head-region features and/or that of the shoulder-and-waist-region features does not exceed the confidence threshold, judging that the corresponding human body target does not contain an upper body and performing a second filtering on it;
after filtering is finished, acquiring the pre-configured feature comparison type; the feature comparison type for reflective clothing detection is shoulder-and-waist-region feature comparison, so the shoulder-and-waist-region features are compared with the shoulder-and-waist-region material features among the related human body material features to obtain a shoulder-and-waist feature similarity;
judging whether the shoulder-and-waist feature similarity exceeds the preset similarity threshold; when it does not exceed the similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the number of dressing anomalies of the human body target by one;
when the shoulder-and-waist feature similarity exceeds the similarity threshold, proceeding to judge the next human body target;
and performing dressing monitoring according to the number of dressing anomalies.
The step of performing dressing monitoring according to the number of dressing anomalies comprises:
judging whether the presence time of the human body target exceeds the preset behavior duration threshold;
if the presence time of the human body target exceeds the behavior duration threshold, resetting the number of dressing anomalies, re-recording the presence time of the human body target and re-judging dressing anomalies;
if the presence time of the human body target does not exceed the behavior duration threshold, judging whether the number of dressing anomalies of the human body target is greater than or equal to a preset anomaly count threshold;
if the number of dressing anomalies of the human body target is greater than or equal to the anomaly count threshold, issuing a dressing-anomaly warning or reporting non-standard reflective clothing alarm information, completing dressing monitoring.
The step of obtaining the anomaly count threshold comprises: obtaining the anomaly count threshold according to the presence time of the human body target, the preset image recognition speed and the dressing-anomaly behavior hit rate.
Embodiment three: canteen bare-shoulder detection and monitoring
First, bare-shoulder material pictures are acquired; these are multi-angle, multi-pose pictures with simple backgrounds. The material pictures are labeled with garment type, the labeled material pictures are uploaded to the feature extraction model, and feature extraction is performed to obtain human body material features, the human body material features comprising: overall material features and local material features, the local material features having corresponding local confidences; the material pictures, the human body material features and the local confidences corresponding to the local material features are stored to complete construction of the clothing base;
Then, a dressing monitoring task is acquired, the dressing monitoring task comprising: the monitored garment type, the position and size of the target area, a similarity threshold (e.g. 20), a behavior duration threshold (e.g. 35 seconds) and a dressing-anomaly behavior hit rate (e.g. 75%); related human body material features of the same garment type are acquired from the clothing base according to the monitored garment type and stored in memory as a target base, and dressing-anomaly identification and dressing monitoring are performed on the basis of the target base;
acquiring the preset comparison mode, where the comparison mode for bare-shoulder detection is usually blacklist comparison;
acquiring a picture to be identified;
inputting the picture to be identified into the feature extraction model for human body target detection and extraction, and determining the corresponding human body targets;
judging whether each human body target is within the preset target area, and performing a first filtering of human body targets not within the target area;
acquiring the human body overall features and human body local features of the human body target and the local confidences corresponding to the human body local features; the human body local features include: head-region features, shoulder-and-waist-region features, crotch-region features and leg-region features;
judging whether the local confidence of the head-region features and the local confidence of the shoulder-and-waist-region features among the human body local features exceed a preset confidence threshold; if the local confidence of the head-region features and/or that of the shoulder-and-waist-region features does not exceed the confidence threshold, judging that the corresponding human body target does not contain an upper body and performing a second filtering on it;
after filtering is finished, acquiring the pre-configured feature comparison type; the feature comparison type for bare-shoulder detection is a comprehensive comparison combining full-feature comparison and shoulder-and-waist-region feature comparison;
performing a first comparison between the human body overall features and the human body overall material features among the related human body material features to obtain an overall similarity;
when the overall similarity exceeds the preset similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the number of dressing anomalies of the human body target by one;
when the overall similarity does not exceed the similarity threshold, performing a second comparison between the shoulder-and-waist-region features among the human body local features and the shoulder-and-waist-region material features among the related human body material features to obtain a shoulder-and-waist similarity;
judging whether the shoulder-and-waist similarity exceeds the similarity threshold; if the shoulder-and-waist similarity exceeds the similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the number of dressing anomalies of the human body target by one;
and finally, performing dressing monitoring according to the number of dressing anomalies.
The step of performing dressing monitoring according to the number of dressing anomalies comprises:
judging whether the presence time of the human body target exceeds the preset behavior duration threshold;
if the presence time of the human body target exceeds the behavior duration threshold, resetting the number of dressing anomalies, re-recording the presence time of the human body target and re-judging dressing anomalies;
if the presence time of the human body target does not exceed the behavior duration threshold, judging whether the number of dressing anomalies of the human body target is greater than or equal to a preset anomaly count threshold;
and if the number of dressing anomalies of the human body target is greater than or equal to the anomaly count threshold, issuing a dressing-anomaly warning or reporting bare-shoulder alarm information, completing dressing monitoring. The step of obtaining the anomaly count threshold comprises: obtaining the anomaly count threshold according to the presence time of the human body target, the preset image recognition speed and the dressing-anomaly behavior hit rate.
As shown in fig. 5, this embodiment further provides a dressing monitoring system, including:
a preprocessing module, configured to collect material pictures and input the material pictures into a pre-trained feature extraction model for feature extraction to obtain a clothing base, the clothing base comprising: human body material features and corresponding garment types;
a resource scheduling module, configured to acquire a dressing monitoring task and invoke related human body material features in the clothing base according to the monitored garment type in the dressing monitoring task;
a dressing monitoring module, configured to acquire pictures to be identified; input a picture to be identified into the feature extraction model for feature extraction to obtain target feature association information, the target feature association information comprising: human body overall features, human body local features and local confidences corresponding to the human body local features; and perform dressing monitoring according to the related human body material features and the target feature association information. The preprocessing module, the resource scheduling module and the dressing monitoring module are connected. By collecting material pictures and inputting them into a pre-trained feature extraction model for feature extraction, the system in this embodiment obtains a clothing base comprising human body material features and corresponding garment types; acquires a dressing monitoring task; invokes related human body material features in the clothing base according to the monitored garment type in the dressing monitoring task; acquires a picture to be identified; inputs the picture to be identified into the feature extraction model for feature extraction to obtain target feature association information comprising human body overall features, human body local features and local confidences corresponding to the human body local features; and performs dressing monitoring according to the related human body material features and the target feature association information. The system thus realizes both construction of the clothing base and invocation of base resources, identifies human body targets in the picture to be identified on the basis of the base resources, and effectively combines the human body overall features, the human body local features and the corresponding local confidences. This improves dressing-monitoring accuracy, reduces dressing-monitoring cost, offers a high degree of automation, requires no construction of complex models such as dedicated dressing-monitoring or dressing-detection models, is convenient to implement, demands little specialized expertise, and adapts well to different scenarios.
In some embodiments, the step of obtaining a garment base library comprises:
performing clothing class marking on the material pictures;
inputting the marked material pictures into the feature extraction model for feature extraction to obtain a plurality of human body material characteristics, wherein the human body material characteristics comprise: overall material characteristics, local material characteristics and local confidences corresponding to the local material characteristics;
the local material characteristics comprise head region material characteristics and shoulder and waist region material characteristics, and the local confidences corresponding to the head region material characteristics and the shoulder and waist region material characteristics are obtained;
judging whether the local confidence coefficient corresponding to the head region material characteristics and the shoulder and waist region material characteristics exceeds a preset confidence coefficient threshold value;
and if the local confidence coefficient of the head region material characteristics and/or the local confidence coefficient of the shoulder and waist region material characteristics does not exceed the confidence coefficient threshold value, filtering the corresponding human body target to complete the acquisition of the clothing base.
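The base-construction steps above — extract features from labelled material pictures, then filter out human body targets whose regional confidence is too low — might look like the following sketch. The `extract` callback format and the 0.6 threshold value are assumptions for illustration:

```python
CONF_THRESHOLD = 0.6  # preset confidence threshold (illustrative value)

def build_clothing_base(labelled_materials, extract):
    """Build the clothing base, filtering any target whose head or
    shoulder-and-waist confidence does not exceed the threshold."""
    base = []
    for garment_type, picture in labelled_materials:
        feats = extract(picture)  # overall/local features + local confidences
        conf = feats["confidence"]
        # Filter the target if either region's confidence fails the threshold.
        if conf["head"] <= CONF_THRESHOLD or conf["shoulder_waist"] <= CONF_THRESHOLD:
            continue
        base.append({"garment_type": garment_type, "features": feats})
    return base
```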
In some embodiments, the step of obtaining the feature extraction model includes:
obtaining a training set, wherein the training set comprises: a plurality of training samples, real sample characteristics corresponding to the training samples, and real confidences corresponding to the real sample characteristics;
Inputting the training sample into a neural network for feature extraction, and obtaining predicted sample features and prediction confidence of the predicted sample features, wherein the predicted sample features comprise: predicting sample overall characteristics and predicting sample local characteristics;
and training the neural network according to the real sample characteristics, the real confidences, the predicted sample characteristics and the prediction confidences to obtain the feature extraction model.
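A minimal sketch of the training objective this step implies, assuming a simple mean-squared-error form for both the feature terms and the confidence term (the patent does not specify the loss function, and the weighting `w_conf` is a hypothetical choice):

```python
def mse(pred, real):
    """Mean squared error between two equal-length feature vectors."""
    return sum((p - r) ** 2 for p, r in zip(pred, real)) / len(pred)

def joint_loss(pred_feats, real_feats, pred_conf, real_conf, w_conf=0.5):
    """Assumed combined objective: regression loss on the overall and local
    sample features plus a weighted confidence regression term."""
    feat_loss = (mse(pred_feats["overall"], real_feats["overall"])
                 + mse(pred_feats["local"], real_feats["local"]))
    conf_loss = (pred_conf - real_conf) ** 2
    return feat_loss + w_conf * conf_loss
```

In a real training loop this scalar would be minimised by backpropagation over the neural network's parameters; the sketch only shows the shape of the objective.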
In some embodiments, the step of performing dressing monitoring according to the relevant human body material characteristics and the target characteristic association information includes:
obtaining a comparison mode in the dressing monitoring task, wherein the comparison mode comprises: whitelist comparison and blacklist comparison;
inputting the picture to be identified into the feature extraction model to extract a human body target, and obtaining a corresponding human body target;
judging whether the human body target is in a preset target area, and performing primary filtering on human body targets that are not in the target area;
judging whether the local confidence of the head region feature and the local confidence of the shoulder region feature among the human body local characteristics of the human body target exceed a preset confidence threshold; if the local confidence of the head region feature and/or that of the shoulder region feature does not exceed the confidence threshold, judging that the corresponding human body target has no visible upper body, and performing secondary filtering on that human body target;
after the filtering is finished, performing a primary comparison between the human body overall characteristics and the human body overall material characteristics among the relevant human body material characteristics to obtain a first similarity;
and performing dressing monitoring according to the comparison mode and the first similarity.
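The two filtering passes followed by the primary comparison could be sketched as below. Cosine similarity is assumed as the comparison measure (the patent does not name one), and the target-record layout shown is illustrative:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two feature vectors; 0.0 if either is all zeros."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def primary_compare(targets, material_overall, conf_thr):
    """Primary filter (target area), secondary filter (upper-body confidence),
    then the primary comparison of overall features against the base material."""
    results = []
    for t in targets:
        if not t["in_target_area"]:           # primary filter: outside monitored area
            continue
        conf = t["confidence"]
        if conf["head"] <= conf_thr or conf["shoulder_waist"] <= conf_thr:
            continue                          # secondary filter: no visible upper body
        first_sim = cosine_similarity(t["overall"], material_overall)
        results.append((t["id"], first_sim))
    return results
```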
In some embodiments, the step of performing dressing monitoring according to the comparison mode and the first similarity includes:
if the comparison mode is whitelist comparison, judging that the corresponding human body target has a dressing abnormality when the first similarity does not exceed a preset similarity threshold, and incrementing the dressing abnormality times of the human body target by one;
when the first similarity exceeds the similarity threshold, performing a secondary comparison between the shoulder and waist region characteristics among the human body local characteristics and the shoulder and waist region material characteristics among the relevant human body material characteristics to obtain a second similarity;
judging whether the second similarity exceeds the similarity threshold; if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target has a dressing abnormality, and incrementing the dressing abnormality times of the human body target by one;
and performing dressing monitoring according to the dressing abnormality times.
In some embodiments, the step of performing dressing monitoring according to the comparison mode and the first similarity further comprises:
if the comparison mode is blacklist comparison, judging that the corresponding human body target has a dressing abnormality when the first similarity exceeds a preset similarity threshold, and incrementing the dressing abnormality times of the human body target by one;
when the first similarity does not exceed the similarity threshold, performing a tertiary comparison between the shoulder and waist region characteristics among the human body local characteristics and the shoulder and waist region material characteristics among the relevant human body material characteristics to obtain a third similarity;
judging whether the third similarity exceeds the similarity threshold; if so, judging that the corresponding human body target has a dressing abnormality, and incrementing the dressing abnormality times of the human body target by one;
and performing dressing monitoring according to the dressing abnormality times.
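The whitelist and blacklist decision rules from the two passages above can be condensed into one sketch. The threshold value and the function shape are illustrative, not prescribed by the patent:

```python
SIM_THRESHOLD = 0.8  # preset similarity threshold (illustrative value)

def is_anomalous(mode, first_sim, region_sim, thr=SIM_THRESHOLD):
    """Decide whether one observation counts as a dressing abnormality.

    Whitelist: a low overall similarity is anomalous outright; otherwise the
    shoulder-and-waist (secondary) comparison must also pass the threshold.
    Blacklist: mirrored — a high overall similarity is anomalous outright;
    otherwise a high region similarity (tertiary comparison) still flags one.
    """
    if mode == "whitelist":
        if first_sim <= thr:
            return True              # overall features do not match required clothing
        return region_sim <= thr     # secondary comparison fails
    elif mode == "blacklist":
        if first_sim > thr:
            return True              # overall features match forbidden clothing
        return region_sim > thr      # tertiary comparison still matches
    raise ValueError("mode must be 'whitelist' or 'blacklist'")
```

Each `True` result would increment the target's dressing abnormality count, feeding the count-threshold logic described next.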
In some embodiments, the step of performing dressing monitoring according to the number of dressing anomalies includes:
judging whether the existence time of the human body target exceeds a preset action duration threshold;
if the existence time of the human body target exceeds the action duration threshold, resetting the dressing abnormality times, re-recording the existence time of the human body target and re-judging dressing abnormality;
if the existence time of the human body target does not exceed the action duration threshold, judging whether the dressing abnormality times of the human body target is greater than or equal to a preset abnormality times threshold;
if the dressing abnormality times of the human body target is greater than or equal to the abnormality times threshold, issuing a dressing abnormality warning to complete the dressing monitoring;
the step of obtaining the abnormality times threshold comprises: obtaining the abnormality times threshold according to the existence time of the human body target, a preset image recognition speed and the hit rate of the abnormal dressing behaviour, wherein the mathematical expression for the abnormality times threshold is:
C=t×V×β
wherein C is the abnormality times threshold, t is the current existence time of the human body target, V is the image recognition speed, and β is the hit rate of the abnormal dressing behaviour.
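The formula C = t × V × β and the surrounding reset/alert logic can be illustrated as follows; the function names and the tuple return shape are assumptions made for the sketch:

```python
def anomaly_count_threshold(t, v, beta):
    """C = t x V x beta: existence time (s) x image recognition speed
    (frames/s) x hit rate of the abnormal dressing behaviour."""
    return t * v * beta

def should_alert(anomaly_count, t, duration_thr, v, beta):
    """Reset when the target has existed longer than the action duration
    threshold; otherwise alert once the accumulated count reaches C."""
    if t > duration_thr:
        return False, 0                      # reset count, restart timing
    c = anomaly_count_threshold(t, v, beta)
    return anomaly_count >= c, anomaly_count
```

For example, a target tracked for 10 s at 5 recognitions per second with a hit rate of 0.5 yields C = 25, so the 25th accumulated abnormality would trigger the warning.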
The present embodiment also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements any of the methods of the present embodiments.
The embodiment also provides an electronic terminal, including: a processor and a memory;
the memory is configured to store a computer program, and the processor is configured to execute the computer program stored in the memory, so that the terminal executes any one of the methods in the present embodiment.
As will be appreciated by those of ordinary skill in the art, all or part of the steps for implementing the above method embodiments may be completed by hardware related to a computer program. The aforementioned computer program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks or optical disks.
The electronic terminal provided in this embodiment includes a processor, a memory, a transceiver, and a communication interface, where the memory and the communication interface are connected to the processor and the transceiver and complete communication with each other, the memory is used to store a computer program, the communication interface is used to perform communication, and the processor and the transceiver are used to run the computer program, so that the electronic terminal performs each step of the above method.
In this embodiment, the memory may include a random access memory (Random Access Memory, abbreviated as RAM), and may further include a non-volatile memory (non-volatile memory), such as at least one magnetic disk memory.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU for short), a network processor (Network Processor, NP for short), etc.; but also digital signal processors (Digital Signal Processing, DSP for short), application specific integrated circuits (Application Specific Integrated Circuit, ASIC for short), field-programmable gate arrays (Field-Programmable Gate Array, FPGA for short) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
The above embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or vary the above embodiments without departing from the spirit and scope of the invention. Accordingly, all equivalent modifications and variations completed by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the claims of the present invention.

Claims (7)

1. A method of dressing monitoring, comprising:
collecting material pictures, inputting the material pictures into a pre-trained feature extraction model for feature extraction, and obtaining a clothing base, wherein the clothing base comprises: human body material characteristics and corresponding garment types;
acquiring a dressing monitoring task;
invoking the relevant human body material characteristics in the clothing base according to the monitored garment type in the dressing monitoring task;
acquiring a picture to be identified;
inputting the picture to be identified into a feature extraction model for feature extraction, and obtaining target feature association information, wherein the target feature association information comprises: human body overall characteristics, human body local characteristics and local confidence degrees corresponding to the human body local characteristics;
Performing dressing monitoring according to the related human body material characteristics and the target characteristic related information;
wherein, according to the relevant human body material characteristics and the target characteristic associated information, performing dressing monitoring comprises:
obtaining a comparison mode in the dressing monitoring task, wherein the comparison mode comprises: whitelist comparison and blacklist comparison;
inputting the picture to be identified into the feature extraction model to extract a human body target, and obtaining a corresponding human body target;
judging whether the human body target is in a preset target area, and performing primary filtering on human body targets that are not in the target area;
judging whether the local confidence of the head region feature and the local confidence of the shoulder region feature among the human body local characteristics of the human body target exceed a preset confidence threshold; if the local confidence of the head region feature and/or that of the shoulder region feature does not exceed the confidence threshold, judging that the corresponding human body target has no visible upper body, and performing secondary filtering on that human body target;
after the filtering is finished, performing a primary comparison between the human body overall characteristics and the human body overall material characteristics among the relevant human body material characteristics to obtain a first similarity;
Performing dressing monitoring according to the comparison mode and the first similarity;
wherein, according to the comparison mode and the first similarity, performing dressing monitoring includes:
if the comparison mode is whitelist comparison, judging that the corresponding human body target has a dressing abnormality when the first similarity does not exceed a preset similarity threshold, and incrementing the dressing abnormality times of the human body target by one;
when the first similarity exceeds the similarity threshold, performing a secondary comparison between the shoulder and waist region characteristics among the human body local characteristics and the shoulder and waist region material characteristics among the relevant human body material characteristics to obtain a second similarity;
judging whether the second similarity exceeds the similarity threshold; if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target has a dressing abnormality, and incrementing the dressing abnormality times of the human body target by one;
performing dressing monitoring according to the dressing abnormality times;
wherein, according to the comparison mode and the first similarity, the step of performing dressing monitoring further includes:
if the comparison mode is blacklist comparison, judging that the corresponding human body target has a dressing abnormality when the first similarity exceeds a preset similarity threshold, and incrementing the dressing abnormality times of the human body target by one;
when the first similarity does not exceed the similarity threshold, performing a tertiary comparison between the shoulder and waist region characteristics among the human body local characteristics and the shoulder and waist region material characteristics among the relevant human body material characteristics to obtain a third similarity;
judging whether the third similarity exceeds the similarity threshold; if so, judging that the corresponding human body target has a dressing abnormality, and incrementing the dressing abnormality times of the human body target by one;
and performing dressing monitoring according to the dressing abnormality times.
2. The method of apparel monitoring as in claim 1 wherein the step of obtaining a library of apparel bases comprises:
performing clothing class marking on the material pictures;
inputting the marked material pictures into the feature extraction model to perform feature extraction, and obtaining a plurality of human body material features, wherein the human body material features comprise: integral material characteristics, local material characteristics and local confidence corresponding to the local material characteristics;
the local material characteristics comprise head region material characteristics and shoulder and waist region material characteristics, and the local confidences corresponding to the head region material characteristics and the shoulder and waist region material characteristics are obtained;
Judging whether the local confidence coefficient corresponding to the head region material characteristics and the shoulder and waist region material characteristics exceeds a preset confidence coefficient threshold value;
and if the local confidence coefficient of the head region material characteristics and/or the local confidence coefficient of the shoulder and waist region material characteristics does not exceed the confidence coefficient threshold value, filtering the corresponding human body target to complete the acquisition of the clothing base.
3. The dressing monitoring method as claimed in claim 1, wherein the step of obtaining the feature extraction model comprises:
obtaining a training set, wherein the training set comprises: a plurality of training samples, real sample characteristics corresponding to the training samples, and real confidences corresponding to the real sample characteristics;
inputting the training sample into a neural network for feature extraction, and obtaining predicted sample features and prediction confidence of the predicted sample features, wherein the predicted sample features comprise: predicting sample overall characteristics and predicting sample local characteristics;
and training the neural network according to the real sample characteristics, the real confidences, the predicted sample characteristics and the prediction confidences to obtain the feature extraction model.
4. The dressing monitoring method as claimed in claim 1, wherein the step of carrying out dressing monitoring according to the number of dressing anomalies comprises:
judging whether the existence time of the human body target exceeds a preset action duration threshold;
if the existence time of the human body target exceeds the action duration threshold, resetting the dressing abnormality times, re-recording the existence time of the human body target and re-judging dressing abnormality;
if the existence time of the human body target does not exceed the action duration threshold, judging whether the dressing abnormality times of the human body target is greater than or equal to a preset abnormality times threshold;
if the dressing abnormality times of the human body target is greater than or equal to the abnormality times threshold, issuing a dressing abnormality warning to complete the dressing monitoring;
the step of obtaining the abnormality times threshold comprises: obtaining the abnormality times threshold according to the existence time of the human body target, a preset image recognition speed and the hit rate of the abnormal dressing behaviour, wherein the mathematical expression for the abnormality times threshold is:
C=t×V×β
wherein C is the abnormality times threshold, t is the current existence time of the human body target, V is the image recognition speed, and β is the hit rate of the abnormal dressing behaviour.
5. A wear monitoring system, comprising:
The preprocessing module is used for collecting material pictures, inputting the material pictures into a pre-trained feature extraction model for feature extraction, and obtaining a clothing base, wherein the clothing base comprises: human body material characteristics and corresponding garment types;
the resource scheduling module is used for acquiring a dressing monitoring task, and invoking the relevant human body material characteristics in the clothing base according to the monitored garment type in the dressing monitoring task;
the dressing monitoring module is used for acquiring pictures to be identified; inputting the picture to be identified into a feature extraction model for feature extraction, and obtaining target feature association information, wherein the target feature association information comprises: human body overall characteristics, human body local characteristics and local confidence degrees corresponding to the human body local characteristics; performing dressing monitoring according to the related human body material characteristics and the target characteristic related information; the preprocessing module, the resource scheduling module and the dressing monitoring module are connected;
wherein, according to the relevant human body material characteristics and the target characteristic associated information, performing dressing monitoring comprises:
obtaining a comparison mode in the dressing monitoring task, wherein the comparison mode comprises: whitelist comparison and blacklist comparison;
Inputting the picture to be identified into the feature extraction model to extract a human body target, and obtaining a corresponding human body target;
judging whether the human body target is in a preset target area, and performing primary filtering on human body targets that are not in the target area;
judging whether the local confidence of the head region feature and the local confidence of the shoulder region feature among the human body local characteristics of the human body target exceed a preset confidence threshold; if the local confidence of the head region feature and/or that of the shoulder region feature does not exceed the confidence threshold, judging that the corresponding human body target has no visible upper body, and performing secondary filtering on that human body target;
after the filtering is finished, performing a primary comparison between the human body overall characteristics and the human body overall material characteristics among the relevant human body material characteristics to obtain a first similarity;
performing dressing monitoring according to the comparison mode and the first similarity;
wherein, according to the comparison mode and the first similarity, performing dressing monitoring includes:
if the comparison mode is whitelist comparison, judging that the corresponding human body target has a dressing abnormality when the first similarity does not exceed a preset similarity threshold, and incrementing the dressing abnormality times of the human body target by one;
when the first similarity exceeds the similarity threshold, performing a secondary comparison between the shoulder and waist region characteristics among the human body local characteristics and the shoulder and waist region material characteristics among the relevant human body material characteristics to obtain a second similarity;
judging whether the second similarity exceeds the similarity threshold; if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target has a dressing abnormality, and incrementing the dressing abnormality times of the human body target by one;
performing dressing monitoring according to the dressing abnormality times;
wherein, according to the comparison mode and the first similarity, the step of performing dressing monitoring further includes:
if the comparison mode is blacklist comparison, judging that the corresponding human body target has a dressing abnormality when the first similarity exceeds a preset similarity threshold, and incrementing the dressing abnormality times of the human body target by one;
when the first similarity does not exceed the similarity threshold, performing a tertiary comparison between the shoulder and waist region characteristics among the human body local characteristics and the shoulder and waist region material characteristics among the relevant human body material characteristics to obtain a third similarity;
judging whether the third similarity exceeds the similarity threshold; if so, judging that the corresponding human body target has a dressing abnormality, and incrementing the dressing abnormality times of the human body target by one;
And performing dressing monitoring according to the dressing abnormality times.
6. A computer-readable storage medium having stored thereon a computer program, characterized by: the computer program implementing the method according to any of claims 1 to 4 when executed by a processor.
7. An electronic terminal, comprising: a processor and a memory;
the memory is configured to store a computer program, and the processor is configured to execute the computer program stored in the memory, to cause the terminal to perform the method according to any one of claims 1 to 4.
CN202111165053.5A 2021-09-30 2021-09-30 Dressing monitoring method, dressing monitoring system, dressing monitoring medium and electronic terminal Active CN113837138B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111165053.5A CN113837138B (en) 2021-09-30 2021-09-30 Dressing monitoring method, dressing monitoring system, dressing monitoring medium and electronic terminal


Publications (2)

Publication Number Publication Date
CN113837138A CN113837138A (en) 2021-12-24
CN113837138B true CN113837138B (en) 2023-08-29

Family

ID=78967983

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111165053.5A Active CN113837138B (en) 2021-09-30 2021-09-30 Dressing monitoring method, dressing monitoring system, dressing monitoring medium and electronic terminal

Country Status (1)

Country Link
CN (1) CN113837138B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113989858B (en) * 2021-12-28 2022-04-08 上海安维尔信息科技股份有限公司 Work clothes identification method and system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106778609A (en) * 2016-12-15 2017-05-31 国网浙江省电力公司杭州供电公司 A kind of electric power construction field personnel uniform wears recognition methods
CN107818489A (en) * 2017-09-08 2018-03-20 中山大学 A kind of more people's costume retrieval methods based on dressing parsing and human testing
CN109101922A (en) * 2018-08-10 2018-12-28 广东电网有限责任公司 Operating personnel device, assay, device and electronic equipment
CN110188724A (en) * 2019-06-05 2019-08-30 中冶赛迪重庆信息技术有限公司 The method and system of safety cap positioning and color identification based on deep learning
CN110287804A (en) * 2019-05-30 2019-09-27 广东电网有限责任公司 A kind of electric operating personnel's dressing recognition methods based on mobile video monitor
CN110569722A (en) * 2019-08-01 2019-12-13 江苏濠汉信息技术有限公司 Visual analysis-based constructor dressing standard detection method and device
CN110781976A (en) * 2019-10-31 2020-02-11 重庆紫光华山智安科技有限公司 Extension method of training image, training method and related device
CN111401314A (en) * 2020-04-10 2020-07-10 上海东普信息科技有限公司 Dressing information detection method, device, equipment and storage medium
CN111444767A (en) * 2020-02-25 2020-07-24 华中科技大学 Pedestrian detection and tracking method based on laser radar
CN111723844A (en) * 2020-05-19 2020-09-29 上海明略人工智能(集团)有限公司 Method and system for determining clustering center and method and device for determining picture type
CN112364734A (en) * 2020-10-30 2021-02-12 福州大学 Abnormal dressing detection method based on yolov4 and CenterNet

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2659698C (en) * 2008-03-21 2020-06-16 Dressbot Inc. System and method for collaborative shopping, business and entertainment
US10671840B2 (en) * 2017-05-04 2020-06-02 Intel Corporation Method and apparatus for person recognition using continuous self-learning


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research and Implementation of a Method for Identifying Personnel Types on Construction Sites; Sun Yi; China Master's Theses Full-text Database (Engineering Science and Technology II), No. 7; C042-600 *

Also Published As

Publication number Publication date
CN113837138A (en) 2021-12-24

Similar Documents

Publication Publication Date Title
CN109765539B (en) Indoor user behavior monitoring method and device, electrical equipment and home monitoring system
CN108256404B (en) Pedestrian detection method and device
CN108319926A (en) A kind of the safety cap wearing detecting system and detection method of building-site
CN109657564A (en) A kind of personnel detection method, device, storage medium and terminal device on duty
CN106845352B (en) Pedestrian detection method and device
CN111210399B (en) Imaging quality evaluation method, device and equipment
CN111161206A (en) Image capturing method, monitoring camera and monitoring system
CN109740573B (en) Video analysis method, device, equipment and server
CN110879995A (en) Target object detection method and device, storage medium and electronic device
CN110837582A (en) Data association method and device, electronic equipment and computer-readable storage medium
CN111191507A (en) Safety early warning analysis method and system for smart community
CN113139403A (en) Violation behavior identification method and device, computer equipment and storage medium
CN113792691B (en) Video identification method, system, equipment and medium
CN112017323A (en) Patrol alarm method and device, readable storage medium and terminal equipment
CN113052107A (en) Method for detecting wearing condition of safety helmet, computer equipment and storage medium
CN113837138B (en) Dressing monitoring method, dressing monitoring system, dressing monitoring medium and electronic terminal
CN111860187A (en) High-precision worn mask identification method and system
CN110505438B (en) Queuing data acquisition method and camera
CN114494965A (en) Method and system for detecting wandering pets based on vision
CN113470013A (en) Method and device for detecting moved article
CN113938828A (en) Method and device for generating electronic fence of equipment
CN113903066A (en) Track generation method, system and device and electronic equipment
CN116778673A (en) Water area safety monitoring method, system, terminal and storage medium
CN110969209B (en) Stranger identification method and device, electronic equipment and storage medium
CN115953815A (en) Monitoring method and device for infrastructure site

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant