CN113837138A - Dressing monitoring method, system, medium and electronic terminal

Info

Publication number
CN113837138A
Authority
CN
China
Prior art keywords
human body
dressing
target
local
material characteristics
Prior art date
Legal status
Granted
Application number
CN202111165053.5A
Other languages
Chinese (zh)
Other versions
CN113837138B (en)
Inventor
杨胜元
Current Assignee
Chongqing Unisinsight Technology Co Ltd
Original Assignee
Chongqing Unisinsight Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Unisinsight Technology Co Ltd
Priority to CN202111165053.5A
Publication of CN113837138A
Application granted
Publication of CN113837138B
Legal status: Active

Classifications

    • G06F18/214 - Pattern recognition: generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F18/22 - Pattern recognition: matching criteria, e.g. proximity measures
    • G06N3/04 - Neural networks: architecture, e.g. interconnection topology
    • G06N3/08 - Neural networks: learning methods
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Alarm Systems (AREA)

Abstract

The invention provides a dressing monitoring method, system, medium and electronic terminal. The method comprises: collecting material pictures and inputting them into a pre-trained feature extraction model for feature extraction to obtain a garment base library, the garment base library comprising human body material features and the corresponding garment types; acquiring a dressing monitoring task; calling the related human body material features from the garment base library according to the monitored garment type in the dressing monitoring task; acquiring a picture to be recognized; inputting the picture to be recognized into the feature extraction model for feature extraction to obtain target feature association information, the target feature association information comprising the overall human body feature, the local human body features and the local confidence corresponding to each local human body feature; and performing dressing monitoring according to the related human body material features and the target feature association information. The method effectively improves dressing monitoring accuracy, reduces dressing monitoring cost, offers a higher degree of automation, requires less specialist expertise, and is more convenient to implement.

Description

Dressing monitoring method, system, medium and electronic terminal
Technical Field
The invention relates to the field of computer technology, and in particular to a dressing monitoring method, system, medium and electronic terminal.
Background
With the development of computer technology, image-based monitoring has received increasing attention. In some application scenarios, the dressing of on-site personnel needs to be monitored to judge whether it is uniform; for example, uniforms must be worn at postal savings outlets, and reflective clothing must be worn on construction sites. At present, personnel watch surveillance video and check the clothing of the people appearing in it to judge whether a target person's dressing meets the relevant regulations. This manual approach, however, easily leads to low dressing monitoring accuracy, high cost and a low degree of automation.
Disclosure of Invention
The invention provides a dressing monitoring method, system, medium and electronic terminal, aiming to solve the problems of low accuracy, high cost and low degree of automation in existing dressing monitoring.
The dressing monitoring method provided by the invention comprises the following steps:
collecting material pictures and inputting them into a pre-trained feature extraction model for feature extraction to obtain a garment base library, wherein the garment base library comprises: human body material features and the corresponding garment types;
acquiring a dressing monitoring task;
calling the related human body material features from the garment base library according to the monitored garment type in the dressing monitoring task;
acquiring a picture to be recognized;
inputting the picture to be recognized into the feature extraction model for feature extraction to obtain target feature association information, wherein the target feature association information comprises: the overall human body feature, the local human body features, and the local confidence corresponding to each local human body feature;
and performing dressing monitoring according to the related human body material features and the target feature association information.
Optionally, the step of obtaining the garment base library comprises:
labeling the material pictures with their garment categories;
inputting the labeled material pictures into the feature extraction model for feature extraction to obtain a plurality of human body material features, wherein the human body material features comprise: overall material features, local material features, and the local confidence corresponding to each local material feature;
the local material features comprise head region material features and shoulder-waist region material features, and the local confidences corresponding to the head region material features and the shoulder-waist region material features are obtained;
judging whether the local confidences corresponding to the head region material features and the shoulder-waist region material features exceed a preset confidence threshold;
and if the local confidence of the head region material features and/or the local confidence of the shoulder-waist region material features does not exceed the confidence threshold, filtering out the corresponding human body target, thereby completing the acquisition of the garment base library.
Optionally, the step of obtaining the feature extraction model comprises:
obtaining a training set, wherein the training set comprises: training samples, real sample features corresponding to the training samples, and real confidences corresponding to the real sample features;
inputting the training samples into a neural network for feature extraction to obtain predicted sample features and the prediction confidence of each predicted sample feature, wherein the predicted sample features comprise: predicted overall sample features and predicted local sample features;
and training the neural network according to the real sample features, the real confidences, the predicted sample features and the prediction confidences to obtain the feature extraction model.
Optionally, the step of performing dressing monitoring according to the related human body material features and the target feature association information comprises:
obtaining the comparison mode in the dressing monitoring task, wherein the comparison mode comprises: whitelist comparison and blacklist comparison;
inputting the picture to be recognized into the feature extraction model for human body target extraction to obtain the corresponding human body targets;
judging whether each human body target is within a preset target area, and performing a first filtering to remove the human body targets that are not within the target area;
judging whether the local confidence of the head region feature and the local confidence of the shoulder-waist region feature among the local human body features of each human body target exceed a preset confidence threshold; if the local confidence of the head region feature and/or the local confidence of the shoulder-waist region feature does not exceed the confidence threshold, judging that the corresponding human body target has no visible upper body and performing a second filtering to remove that human body target;
after filtering is completed, performing a first comparison between the overall human body feature and the overall human body material features among the related human body material features to obtain a first similarity;
and performing dressing monitoring according to the comparison mode and the first similarity.
Optionally, the step of performing dressing monitoring according to the comparison mode and the first similarity comprises:
if the comparison mode is whitelist comparison: when the first similarity does not exceed a preset similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the dressing abnormality count of that human body target by one;
when the first similarity exceeds the similarity threshold, performing a second comparison between the shoulder-waist region feature among the local human body features and the shoulder-waist region material features among the related human body material features to obtain a second similarity;
judging whether the second similarity exceeds the similarity threshold; if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the dressing abnormality count of that human body target by one;
and performing dressing monitoring according to the dressing abnormality count.
Optionally, the step of performing dressing monitoring according to the comparison mode and the first similarity further comprises:
if the comparison mode is blacklist comparison: when the first similarity exceeds a preset similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the dressing abnormality count of that human body target by one;
when the first similarity does not exceed the similarity threshold, performing a third comparison between the shoulder-waist region feature among the local human body features and the shoulder-waist region material features among the related human body material features to obtain a third similarity;
judging whether the third similarity exceeds the similarity threshold; if the third similarity exceeds the similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the dressing abnormality count of that human body target by one;
and performing dressing monitoring according to the dressing abnormality count.
Optionally, the step of performing dressing monitoring according to the dressing abnormality count comprises:
judging whether the existence time of the human body target exceeds a preset behavior duration threshold;
if the existence time of the human body target exceeds the behavior duration threshold, resetting the dressing abnormality count, recording the existence time of the human body target anew, and judging dressing abnormality again;
if the existence time of the human body target does not exceed the behavior duration threshold, judging whether the dressing abnormality count of the human body target is greater than or equal to a preset abnormality count threshold;
and if the dressing abnormality count of the human body target is greater than or equal to the abnormality count threshold, issuing a dressing abnormality warning to complete the dressing monitoring;
the step of obtaining the abnormality count threshold comprises: obtaining the abnormality count threshold according to the existence time of the human body target, a preset image recognition speed and a dressing-abnormality hit rate, wherein the mathematical expression of the abnormality count threshold is:
C=t×V×β
where C is the abnormality count threshold, t is the current existence time of the human body target, V is the image recognition speed, and β is the dressing-abnormality hit rate.
The invention also provides a dressing monitoring system, comprising:
a preprocessing module, configured to collect material pictures and input them into a pre-trained feature extraction model for feature extraction to obtain a garment base library, wherein the garment base library comprises: human body material features and the corresponding garment types;
a resource scheduling module, configured to acquire a dressing monitoring task and call the related human body material features from the garment base library according to the monitored garment type in the dressing monitoring task;
and a dressing monitoring module, configured to acquire a picture to be recognized; input the picture to be recognized into the feature extraction model for feature extraction to obtain target feature association information, wherein the target feature association information comprises: the overall human body feature, the local human body features and the local confidence corresponding to each local human body feature; and perform dressing monitoring according to the related human body material features and the target feature association information; the preprocessing module, the resource scheduling module and the dressing monitoring module being connected to one another.
The invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method as defined in any one of the above.
The present invention also provides an electronic terminal, comprising: a processor and a memory;
the memory is adapted to store a computer program and the processor is adapted to execute the computer program stored by the memory to cause the terminal to perform the method as defined in any one of the above.
The invention has the following beneficial effects. In the dressing monitoring method, system, medium and electronic terminal, material pictures are collected and input into a pre-trained feature extraction model for feature extraction to obtain a garment base library comprising human body material features and the corresponding garment types; a dressing monitoring task is acquired; the related human body material features are called from the garment base library according to the monitored garment type in the dressing monitoring task; a picture to be recognized is acquired and input into the feature extraction model for feature extraction to obtain target feature association information comprising the overall human body feature, the local human body features and the local confidence corresponding to each local human body feature; and dressing monitoring is performed according to the related human body material features and the target feature association information. The construction of the garment base library and the calling of the base library resources are thereby well realized, the human body targets in the picture to be recognized are recognized on the basis of the base library resources, and the overall human body feature, the local human body features and the local confidences are effectively combined, which improves dressing monitoring accuracy, reduces dressing monitoring cost and gives a high degree of automation; no complex model such as a dedicated dressing monitoring model or dressing detection model needs to be built, so the method is convenient to implement.
Drawings
Fig. 1 is a schematic flow chart of a dressing monitoring method according to an embodiment of the present invention.
Fig. 2 is a schematic flow chart of acquiring the garment base library in the dressing monitoring method according to an embodiment of the invention.
Fig. 3 is a schematic flow chart of obtaining the feature extraction model in the dressing monitoring method according to an embodiment of the present invention.
Fig. 4 is a schematic flow chart of performing dressing monitoring according to the related human body material features and the target feature association information in the dressing monitoring method according to an embodiment of the invention.
Fig. 5 is a schematic structural diagram of a dressing monitoring system in an embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below by way of specific examples, and other advantages and effects of the present invention will be readily understood by those skilled in the art from the disclosure of this specification. The invention may also be practiced or carried out through other, different embodiments, and the details in this specification may be modified in various respects without departing from the spirit and scope of the present invention. It should be noted that, in the absence of conflict, the features of the following embodiments and examples may be combined with one another.
It should be noted that the drawings provided in the following embodiments only illustrate the basic idea of the present invention in a schematic way; the drawings show only the components related to the present invention and are not drawn according to the number, shape and size of the components in actual implementation, in which the type, quantity and proportion of the components may vary and the layout may be more complicated.
The inventor has found that, with the development of computer technology, image-based monitoring has received increasing attention. In some application scenarios the dressing of on-site personnel needs to be monitored to judge whether it is uniform; for example, uniforms must be worn at postal savings outlets, and reflective clothing must be worn on construction sites. At present, personnel watch surveillance video and check the clothing of the people appearing in it to judge whether a target person's dressing meets the relevant regulations; this manual approach, however, easily leads to low dressing monitoring accuracy, high cost and a low degree of automation. The inventor therefore proposes a dressing monitoring method, system, medium and electronic terminal in which material pictures are collected and input into a pre-trained feature extraction model for feature extraction to obtain a garment base library comprising human body material features and the corresponding garment types; a dressing monitoring task is acquired; the related human body material features are called from the garment base library according to the monitored garment type in the dressing monitoring task; a picture to be recognized is acquired and input into the feature extraction model for feature extraction to obtain target feature association information comprising the overall human body feature, the local human body features and the local confidence corresponding to each local human body feature; and dressing monitoring is performed according to the related human body material features and the target feature association information. The construction of the garment base library and the calling of the base library resources are thereby well realized, the human body targets in the picture to be recognized are recognized on the basis of the base library resources, and the overall human body feature, the local human body features and the local confidences are effectively combined, which improves dressing monitoring accuracy, reduces dressing monitoring cost and gives a high degree of automation. No complex model such as a dedicated dressing monitoring model or dressing detection model needs to be built, which reduces the professional complexity of dressing monitoring; a better effect than a separately trained dressing detection model can be obtained with fewer materials, implementation is convenient, adaptation to new scenes is fast, and the dressing monitoring effect is better.
As shown in fig. 1, the dressing monitoring method in this embodiment includes:
S101: collecting material pictures and inputting them into a pre-trained feature extraction model for feature extraction to obtain a garment base library, wherein the garment base library comprises: the material pictures, the human body material features and the corresponding garment types. There are multiple material pictures; for example, a plurality of human body pictures containing the uniform, taken at different angles and in different postures, are collected as material pictures, the material pictures are labeled with their garment categories, the labeled material pictures are input into the feature extraction model to distinguish human body targets and obtain a plurality of material pictures each containing a single human body target, feature extraction is performed on the material pictures containing a single human body target to obtain the human body material features, and the material pictures, the human body material features and the corresponding garment types are stored as the garment base library, completing the construction of the garment base library.
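By way of illustration only, the following Python sketch shows one possible realization of the base-library construction in S101 and of the target base library lookup described in S103 below. The extractor object stands in for the pre-trained feature extraction model, and all class, field and function names are assumptions made for this example, not part of the patented implementation.

    # Illustrative sketch: building the garment base library (S101) and
    # calling the related human body material features by garment type (S103).
    from dataclasses import dataclass
    from typing import Dict, List
    import numpy as np


    @dataclass
    class HumanMaterialFeature:
        clothing_type: str                # labeled garment category of the material picture
        overall: np.ndarray               # overall (whole-body) material feature vector
        local: Dict[str, np.ndarray]      # e.g. "head", "shoulder_waist", "crotch", "leg"
        confidence: Dict[str, float]      # local confidence per body region


    def build_garment_base(material_images, labels, extractor) -> List[HumanMaterialFeature]:
        """Run the pre-trained extractor over labeled material pictures and store
        one record per detected single human target."""
        base = []
        for image, clothing_type in zip(material_images, labels):
            for target in extractor.extract(image):   # assumed: one dict per human target
                base.append(HumanMaterialFeature(
                    clothing_type=clothing_type,
                    overall=target["overall"],
                    local=target["local"],
                    confidence=target["confidence"],
                ))
        return base


    def related_material_features(base, monitored_type: str) -> List[HumanMaterialFeature]:
        """Call the related human body material features for a monitoring task (target base library)."""
        return [record for record in base if record.clothing_type == monitored_type]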
S102: acquiring a dressing monitoring task. The dressing monitoring task comprises: the monitored garment type (for example postal savings work uniforms or reflective clothing), the position and size of the target area, the similarity threshold, the behavior duration threshold, and the dressing-abnormality hit rate, where the target area is a preset image recognition area.
S103: calling the related human body material features from the garment base library according to the monitored garment type in the dressing monitoring task. That is, the human body material features whose garment type matches the monitored garment type are called from the garment base library as the related human body material features and are stored in memory as a target base library; subsequent dressing abnormality recognition and dressing monitoring are performed on the basis of this target base library.
S104: acquiring a picture to be recognized. The picture to be recognized can be obtained from a surveillance video, for example by acquiring the surveillance video and capturing frames from it to obtain a plurality of pictures to be recognized.
S105: inputting the picture to be recognized into the feature extraction model for feature extraction to obtain target feature association information, wherein the target feature association information comprises: the overall human body feature, the local human body features and the local confidence corresponding to each local human body feature. By inputting the picture to be recognized into the feature extraction model, the features of each human body target in the picture, including the overall human body feature and the local human body features, can be obtained together with the local confidence corresponding to each local human body feature; combining the local confidences with the overall and local human body features helps improve the accuracy of the subsequent dressing monitoring of the human body target. The local human body features include: a head region feature, a shoulder-waist region feature, a crotch region feature and a leg region feature, where the shoulder-waist region feature is the feature of the region from the shoulders to the waist of the human body target.
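The following hedged sketch illustrates how S104 and S105 might look in practice: frames are sampled from a surveillance video with OpenCV and passed to the feature extraction model, and each human target yields its target feature association information. The extractor interface, the returned field names and the sampling interval are assumptions for the example only.

    # Illustrative sketch: pictures to be recognized from a surveillance video (S104)
    # and target feature association information per human target (S105).
    import cv2


    def iter_frames(video_source, every_n_frames: int = 5):
        """Yield every n-th frame of the surveillance video as a picture to be recognized."""
        cap = cv2.VideoCapture(video_source)
        index = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if index % every_n_frames == 0:
                yield frame
            index += 1
        cap.release()


    def extract_target_info(frame, extractor):
        """Return, for each human target in the frame, its target feature association information."""
        return [
            {
                "box": target["box"],                # bounding box of the human target
                "overall": target["overall"],        # overall human body feature
                "local": target["local"],            # head / shoulder_waist / crotch / leg features
                "confidence": target["confidence"],  # local confidence per region
            }
            for target in extractor.extract(frame)
        ]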
S106: performing dressing monitoring according to the related human body material features and the target feature association information. By calling the related human body material features from the garment base library and combining, on that basis, the overall human body feature, the local human body features and the local confidences in the target feature association information, the accuracy of dressing abnormality recognition and dressing monitoring is improved, the difficulty of dressing monitoring is reduced, real-time monitoring of dressing abnormalities is realized, the dressing monitoring cost is reduced, and the degree of automation is higher.
As shown in fig. 2, in some embodiments, the step of obtaining the garment base library comprises:
S201: labeling the material pictures with their garment categories;
S202: inputting the labeled material pictures into the feature extraction model for feature extraction to obtain a plurality of human body material features, wherein the human body material features comprise: overall material features and local material features, each local material feature having a corresponding local confidence; the local material features comprise head region material features and shoulder-waist region material features, whose corresponding local confidences are obtained; the local material features further comprise: crotch region material features and leg region material features;
S203: judging whether the local confidences corresponding to the head region material features and the shoulder-waist region material features exceed a preset confidence threshold;
S204: if the local confidence of the head region material features and/or the local confidence of the shoulder-waist region material features does not exceed the confidence threshold, filtering out the corresponding human body target to complete the acquisition of the garment base library. When the local confidence of the head region material features and/or the local confidence of the shoulder-waist region material features does not exceed the preset confidence threshold, it is judged that the corresponding human body target has no visible upper body and that human body target is filtered out; the filtered human body material features and the corresponding local confidences are then used as the data of the garment base library, completing its construction. Filtering human body targets according to the local confidences of the head region and shoulder-waist region material features improves the accuracy of subsequent dressing abnormality recognition and dressing monitoring.
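A minimal sketch of the confidence filter of S203/S204, which is reused as the second filtering of step S404 below: a target whose head or shoulder-waist confidence does not exceed the threshold is treated as having no visible upper body and is dropped. Field names and the default threshold are illustrative assumptions.

    def filter_by_upper_body(targets, conf_threshold: float = 0.5):
        """Keep only human targets whose head and shoulder-waist confidences exceed the threshold."""
        kept = []
        for target in targets:
            conf = target["confidence"]
            if conf.get("head", 0.0) > conf_threshold and conf.get("shoulder_waist", 0.0) > conf_threshold:
                kept.append(target)
            # otherwise the target is judged to have no visible upper body and is filtered out
        return kept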
As shown in fig. 3, in order to improve the accuracy of dressing monitoring, the inventor proposes to add training of the local confidences of the local features to the training process of the feature extraction model, so as to improve both the accuracy of dressing monitoring and the training efficiency of the feature extraction model. The step of obtaining the feature extraction model comprises:
S301: obtaining a training set, wherein the training set comprises: training samples, real sample features corresponding to the training samples, and real confidences corresponding to the real sample features; the real sample features comprise real overall sample features and real local sample features, and the real local sample features comprise: real head region sample features, real shoulder-waist region sample features, real crotch region sample features and real leg region sample features;
S302: inputting the training samples into a neural network for feature extraction to obtain predicted sample features and the prediction confidence of each predicted sample feature, wherein the predicted sample features comprise: predicted overall sample features and predicted local sample features; the predicted local sample features comprise: predicted head region sample features, predicted shoulder-waist region sample features, predicted crotch region sample features and predicted leg region sample features;
S303: training the neural network according to the real sample features, the real confidences, the predicted sample features and the prediction confidences to obtain the feature extraction model. By adding training of the local confidences of the local features to the training process of the feature extraction model, human body targets can conveniently be filtered according to the local confidences in the subsequent dressing monitoring process, which improves dressing monitoring accuracy and avoids errors.
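The patent does not specify a network architecture or loss function; as a hedged illustration of the training idea in S301 to S303, the PyTorch sketch below uses a small multi-head model that predicts an overall feature, per-region local features and a confidence for each region, and supervises the features and the confidences jointly. The use of an MLP over precomputed descriptors, the dimensions and the particular losses are assumptions for the example.

    # Illustrative multi-task training sketch: feature heads plus local-confidence heads.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    REGIONS = ["head", "shoulder_waist", "crotch", "leg"]


    class FeatureExtractorNet(nn.Module):
        def __init__(self, in_dim: int = 2048, feat_dim: int = 256):
            super().__init__()
            self.backbone = nn.Sequential(nn.Linear(in_dim, 1024), nn.ReLU())
            self.overall_head = nn.Linear(1024, feat_dim)
            self.local_heads = nn.ModuleDict({r: nn.Linear(1024, feat_dim) for r in REGIONS})
            self.conf_heads = nn.ModuleDict({r: nn.Linear(1024, 1) for r in REGIONS})

        def forward(self, x):
            h = self.backbone(x)
            overall = self.overall_head(h)
            local = {r: self.local_heads[r](h) for r in REGIONS}
            conf = {r: torch.sigmoid(self.conf_heads[r](h)).squeeze(-1) for r in REGIONS}
            return overall, local, conf


    def training_step(model, optimizer, x, real_overall, real_local, real_conf):
        """One optimisation step combining feature losses and local-confidence losses."""
        pred_overall, pred_local, pred_conf = model(x)
        loss = F.mse_loss(pred_overall, real_overall)
        for r in REGIONS:
            loss = loss + F.mse_loss(pred_local[r], real_local[r])
            loss = loss + F.binary_cross_entropy(pred_conf[r], real_conf[r])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()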
As shown in fig. 4, in some embodiments, the step of performing dressing monitoring according to the related human body material features and the target feature association information comprises:
S401: obtaining the comparison mode in the dressing monitoring task, wherein the comparison mode comprises: whitelist comparison and blacklist comparison. In whitelist comparison, the lower the similarity between the features in the target feature association information and the related human body material features, the higher the probability that the dressing of the corresponding human body target is abnormal. In blacklist comparison, the higher the similarity between the features in the target feature association information and the related human body material features, the higher the probability that the dressing of the corresponding human body target is abnormal. Because the application scenarios of this scheme are diverse and can cover industries such as finance, construction and transportation, different application scenarios may require different comparison modes. For example, uniform monitoring at postal savings outlets usually requires whitelist comparison, i.e. the lower the similarity between the features in the target feature association information and the related human body material features, the higher the probability that the corresponding human body target is dressed abnormally; bare-chest detection on a construction site usually requires blacklist comparison, i.e. the higher the similarity between the features in the target feature association information and the related human body material features, the higher the probability that the corresponding human body target is dressed abnormally. This allows better adaptation to different application scenarios: when entering the dressing monitoring task, the user can select the corresponding comparison mode and thereby obtain targeted monitoring for different application scenarios.
S402: inputting the picture to be recognized into the feature extraction model for human body target extraction to obtain the corresponding human body targets; that is, the picture to be recognized is input into the feature extraction model for human body target extraction and distinction, and each single human body target is determined.
S403: judging whether each human body target is within a preset target area, and performing a first filtering to remove the human body targets that are not within the target area. The target area can be set in advance, and its size and position can be set according to the actual situation, which is not described again here. Filtering out the human body targets that are not within the target area screens the human body targets and facilitates accurate dressing monitoring of them.
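A small sketch of the first filtering of S403, under the assumption that each human target carries an (x, y, w, h) bounding box and that a target counts as inside the target area when its box centre lies in the area; the patent leaves the exact containment rule open.

    def in_target_area(box, area) -> bool:
        """box and area are (x, y, w, h) rectangles in pixel coordinates; test the box centre."""
        cx = box[0] + box[2] / 2.0
        cy = box[1] + box[3] / 2.0
        ax, ay, aw, ah = area
        return ax <= cx <= ax + aw and ay <= cy <= ay + ah


    def first_filter(targets, area):
        """First filtering: drop human targets that are not within the preset target area."""
        return [t for t in targets if in_target_area(t["box"], area)]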
S404: judging whether the local confidences of the upper-body features (the head region feature and the shoulder-waist region feature) among the local human body features of each human body target exceed a preset confidence threshold; if a local confidence does not exceed the confidence threshold, judging that the corresponding human body target has no visible upper body and performing a second filtering to remove that human body target. Since dressing monitoring requirements currently concern mainly the upper half of the human body, removing the human body targets without a visible upper body through this second filtering helps improve the accuracy of subsequent dressing monitoring and makes implementation more convenient.
S405: after filtering is completed, performing a first comparison between the overall human body feature and the overall human body material features among the related human body material features to obtain a first similarity. Obtaining the first similarity between the overall human body feature and the overall human body material features makes it convenient to determine whether a dressing abnormality has occurred for the human body target.
S406: performing dressing monitoring according to the comparison mode and the first similarity.
In some embodiments, the step of performing dressing monitoring according to the comparison mode and the first similarity comprises:
if the comparison mode is whitelist comparison: when the first similarity does not exceed a preset similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the dressing abnormality count of that human body target by one;
when the first similarity exceeds the similarity threshold, performing a second comparison between the shoulder-waist region feature among the local human body features and the shoulder-waist region material features among the related human body material features to obtain a second similarity; this second comparison further improves the accuracy of dressing abnormality monitoring;
judging whether the second similarity exceeds the similarity threshold; if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the dressing abnormality count of that human body target by one;
and performing dressing monitoring according to the dressing abnormality count, i.e. issuing a dressing abnormality warning or report based on the accumulated count. Using the dressing abnormality count accumulated over a period of time to warn of or report abnormal dressing smooths the individual abnormality judgments in a probabilistic way, improves the precision of dressing abnormality judgment and dressing monitoring, and reduces the probability of false alarms.
If the comparison mode is blacklist comparison: when the first similarity exceeds a preset similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the dressing abnormality count of that human body target by one;
when the first similarity does not exceed the similarity threshold, performing a third comparison between the shoulder-waist region feature among the local human body features and the shoulder-waist region material features among the related human body material features to obtain a third similarity; this third comparison effectively improves dressing monitoring accuracy;
judging whether the third similarity exceeds the similarity threshold; if the third similarity exceeds the similarity threshold, judging that the corresponding human body target is dressed abnormally and incrementing the dressing abnormality count of that human body target by one;
and performing dressing monitoring according to the dressing abnormality count.
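The two-stage comparison for both comparison modes can be summarised in code as follows. Cosine similarity scaled to a 0 to 100 range and taking the best match over the target base library are assumptions for this sketch (the patent only requires a similarity compared against the threshold); the base records are assumed to follow the HumanMaterialFeature structure from the earlier base-library sketch.

    # Illustrative sketch of the whitelist/blacklist two-stage comparison.
    import numpy as np


    def similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Cosine similarity scaled to 0-100 (an assumed similarity measure)."""
        return 100.0 * float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


    def best_similarity(feature, base_features) -> float:
        return max(similarity(feature, f) for f in base_features)


    def is_dressing_abnormal(target, base, mode: str, sim_threshold: float) -> bool:
        """base holds the related human body material features called from the garment base library."""
        first = best_similarity(target["overall"], [m.overall for m in base])
        if mode == "whitelist":
            if first <= sim_threshold:
                return True                           # overall appearance does not match the required dressing
            second = best_similarity(target["local"]["shoulder_waist"],
                                     [m.local["shoulder_waist"] for m in base])
            return second <= sim_threshold            # upper body does not match either
        else:                                         # blacklist comparison
            if first > sim_threshold:
                return True                           # overall appearance matches a forbidden dressing
            third = best_similarity(target["local"]["shoulder_waist"],
                                    [m.local["shoulder_waist"] for m in base])
            return third > sim_threshold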
In some embodiments, the feature comparison can also be performed according to a pre-configured feature comparison type, wherein the feature comparison type comprises: local feature comparison, full-feature comparison and comprehensive comparison. Local feature comparison comprises shoulder-waist region feature comparison, i.e. the local human body features of the human body target are compared to judge whether its dressing is abnormal. Full-feature comparison compares the overall human body feature of the human body target to judge whether its dressing is abnormal. Comprehensive comparison combines the overall human body feature and the local human body features to judge whether a dressing abnormality has occurred; for example, if the preset comparison mode is whitelist comparison, the first similarity between the overall human body feature and the overall human body material features is obtained first, and when the first similarity exceeds the preset similarity threshold, a second comparison is performed between the shoulder-waist region feature among the local human body features and the shoulder-waist region material features among the related human body material features to obtain a second similarity; if the second similarity does not exceed the similarity threshold, the corresponding human body target is judged to be dressed abnormally, and if the second similarity exceeds the similarity threshold, the next human body target is judged.
In some embodiments, the step of performing dressing monitoring according to the dressing abnormality count comprises:
judging whether the existence time of the human body target exceeds a preset behavior duration threshold; the behavior duration threshold can be set according to the actual situation and is not described again here;
if the existence time of the human body target exceeds the behavior duration threshold, resetting the dressing abnormality count, recording the existence time of the human body target anew, and judging dressing abnormality again;
if the existence time of the human body target does not exceed the behavior duration threshold, judging whether the dressing abnormality count of the human body target is greater than or equal to a preset abnormality count threshold;
and if the dressing abnormality count of the human body target is greater than or equal to the abnormality count threshold, issuing a dressing abnormality warning or reporting the dressing abnormality, completing the dressing monitoring. Comparing the dressing abnormality count within a fixed time period with the preset abnormality count threshold improves the accuracy of dressing monitoring and dressing abnormality reporting from the perspective of judgment success probability and avoids false alarms, so the accuracy is high.
In some embodiments, the step of obtaining the abnormality count threshold comprises: obtaining the abnormality count threshold according to the existence time of the human body target, a preset image recognition speed and the dressing-abnormality hit rate, wherein the mathematical expression of the abnormality count threshold is:
C=t×V×β
where C is the abnormality count threshold, t is the current existence time of the human body target, V is the image recognition speed, and β is the dressing-abnormality hit rate. The dressing-abnormality hit rate is the hit rate for non-compliant dressing behavior. Obtaining the abnormality count threshold in this way, combining the image recognition speed and the dressing-abnormality hit rate, yields an accurate abnormality count threshold and improves dressing monitoring accuracy. The image recognition speed is the speed at which pictures to be recognized are judged for dressing abnormalities, for example 8 frames per second; the dressing-abnormality hit rate can be set according to the actual situation or obtained from historical dressing abnormality judgment results, which is not described again here.
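The counting logic around the formula C=t×V×β might be organised as below. The formula is taken from the description; the per-target tracking structure, the reset behaviour and the lower bound of one abnormality before alarming are illustrative assumptions.

    # Illustrative sketch: per-target dressing-abnormality counting and alarm decision.
    import time


    class TargetMonitor:
        def __init__(self, recognition_speed_fps: float, hit_rate: float, duration_threshold_s: float):
            self.V = recognition_speed_fps        # image recognition speed (frames per second)
            self.beta = hit_rate                  # dressing-abnormality hit rate
            self.duration_threshold = duration_threshold_s
            self.first_seen = time.time()
            self.abnormal_count = 0

        def record(self, abnormal: bool) -> bool:
            """Accumulate one judgement; return True when a dressing abnormality warning should be issued."""
            t = time.time() - self.first_seen     # current existence time of the human body target
            if t > self.duration_threshold:       # existence time exceeds the behavior duration threshold
                self.abnormal_count = 0           # reset the count and record the existence time anew
                self.first_seen = time.time()
                t = 0.0
            if abnormal:
                self.abnormal_count += 1
            C = t * self.V * self.beta            # abnormality count threshold C = t x V x beta
            return self.abnormal_count >= max(C, 1.0)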
In application scenarios where monitoring is permitted in compliance with national laws and regulations, the dressing monitoring method in this embodiment can be used for dressing monitoring, such as uniform detection at postal savings outlets, reflective clothing detection on construction sites, and bare-chest detection in canteens.
Embodiment one: postal savings uniform detection and monitoring
When uniform dressing of postal savings staff needs to be checked, material pictures of the uniform of the monitored personnel can be collected through a camera; these material pictures are taken at multiple angles and in multiple postures against a simple background. The material pictures are labeled with their garment category, the labeled material pictures are uploaded to the feature extraction model, and feature extraction is performed to obtain the human body material features, wherein the human body material features comprise: overall material features and local material features, each local material feature having a corresponding local confidence; the material pictures, the human body material features and the local confidences corresponding to the local material features are stored, completing the construction of the garment base library;
a dressing monitoring task is acquired, wherein the dressing monitoring task comprises: the monitored garment type, the position and size of the target area, the similarity threshold (for example 40), the behavior duration threshold (for example 30 seconds) and the dressing-abnormality hit rate (for example 80%); the related human body material features with the same garment type are acquired from the garment base library according to the monitored garment type and stored in memory as a target base library, and dressing abnormality recognition and dressing monitoring are performed on the basis of this target base library;
a preset comparison mode is acquired; the comparison mode for postal savings uniform detection is generally whitelist comparison;
a picture to be recognized is acquired; a plurality of pictures to be recognized can be obtained by capturing or extracting image frames from a surveillance video;
the picture to be recognized is input into the feature extraction model for human body target distinction and extraction, and the corresponding human body targets are determined;
whether each human body target is within a preset target area is judged, and a first filtering removes the human body targets that are not within the target area;
the overall human body feature, the local human body features and the local confidence corresponding to each local human body feature are acquired for each human body target; the local human body features comprise: a head region feature, a shoulder-waist region feature, a crotch region feature and a leg region feature;
whether the local confidence of the head region feature and the local confidence of the shoulder-waist region feature among the local human body features of each human body target exceed a preset confidence threshold is judged; if the local confidence of the head region feature and/or the local confidence of the shoulder-waist region feature does not exceed the confidence threshold, the corresponding human body target is judged to have no visible upper body and is removed by a second filtering; the size of the confidence threshold can be set according to the actual situation and is not described again here;
after filtering, a preset feature comparison type is acquired; the feature comparison type for postal savings uniform detection is full-feature comparison, so the overall human body feature is compared with the overall human body material features among the related human body material features to obtain an overall similarity;
whether the overall similarity exceeds the preset similarity threshold is judged; when the overall similarity does not exceed the preset similarity threshold, the corresponding human body target is judged to be dressed abnormally and its dressing abnormality count is incremented by one;
when the overall similarity exceeds the similarity threshold, the next human body target is judged;
and dressing monitoring is performed according to the dressing abnormality count.
The step of performing dressing monitoring according to the dressing abnormality count comprises:
judging whether the existence time of the human body target exceeds a preset behavior duration threshold;
if the existence time of the human body target exceeds the behavior duration threshold, resetting the dressing abnormality count, recording the existence time of the human body target anew, and judging dressing abnormality again;
if the existence time of the human body target does not exceed the behavior duration threshold, judging whether the dressing abnormality count of the human body target is greater than or equal to a preset abnormality count threshold;
and if the dressing abnormality count of the human body target is greater than or equal to the abnormality count threshold, issuing a dressing abnormality warning or reporting alarm information that the postal savings dressing is non-compliant, completing the dressing monitoring.
The step of obtaining the abnormality count threshold comprises: obtaining the abnormality count threshold according to the existence time of the human body target, the preset image recognition speed and the dressing-abnormality hit rate, wherein the mathematical expression of the abnormality count threshold is:
C=t×V×β
where C is the abnormality count threshold, t is the current existence time of the human body target, V is the image recognition speed, and β is the dressing-abnormality hit rate.
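Putting the pieces together, the whitelist, full-feature monitoring loop of this first embodiment could be sketched as follows, reusing the helper functions and classes from the earlier sketches. The per-target tracking id and the alarm action are assumptions; the threshold values mirror the example values given in the text.

    # Illustrative end-to-end loop for embodiment one (whitelist, full-feature comparison).
    def monitor_postal_uniforms(video_source, base, extractor, target_area):
        monitors = {}                                              # one abnormality counter per tracked target
        for frame in iter_frames(video_source):
            targets = extract_target_info(frame, extractor)
            targets = first_filter(targets, target_area)           # first filtering: target area
            targets = filter_by_upper_body(targets, 0.5)           # second filtering: visible upper body
            for target in targets:
                tid = target.get("track_id", 0)                    # per-target tracking id (assumed)
                monitor = monitors.setdefault(tid, TargetMonitor(
                    recognition_speed_fps=8.0,                     # image recognition speed V
                    hit_rate=0.8,                                  # dressing-abnormality hit rate beta
                    duration_threshold_s=30.0))                    # behavior duration threshold
                overall_sim = best_similarity(target["overall"], [m.overall for m in base])
                abnormal = overall_sim <= 40.0                     # whitelist, full-feature comparison
                if monitor.record(abnormal):
                    print(f"dressing abnormality alarm for target {tid}")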
Embodiment two: detection and monitoring of reflective clothing on a construction site
When reflective clothing detection is performed on a construction site, material pictures of the uniform of the monitored personnel are collected through a camera; these material pictures are taken at multiple angles and in multiple postures against a simple background. The material pictures are labeled with their garment category, the labeled material pictures are uploaded to the feature extraction model, and feature extraction is performed to obtain the human body material features, wherein the human body material features comprise: overall material features and local material features, each local material feature having a corresponding local confidence; the material pictures, the human body material features and the local confidences corresponding to the local material features are stored, completing the construction of the garment base library;
a dressing monitoring task is acquired, wherein the dressing monitoring task comprises: the monitored garment type, the position and size of the target area, the similarity threshold (for example 45), the behavior duration threshold (for example 20 seconds) and the dressing-abnormality hit rate (for example 85%); the related human body material features with the same garment type are acquired from the garment base library according to the monitored garment type and stored in memory as a target base library, and dressing abnormality recognition and dressing monitoring are performed on the basis of this target base library;
a preset comparison mode is acquired; the comparison mode for reflective clothing detection is generally whitelist comparison;
a picture to be recognized is acquired;
the picture to be recognized is input into the feature extraction model for human body target distinction and extraction, and the corresponding human body targets are determined;
whether each human body target is within a preset target area is judged, and a first filtering removes the human body targets that are not within the target area;
the overall human body feature, the local human body features and the local confidence corresponding to each local human body feature are acquired for each human body target; the local human body features comprise: a head region feature, a shoulder-waist region feature, a crotch region feature and a leg region feature;
whether the local confidence of the head region feature and the local confidence of the shoulder-waist region feature among the local human body features of each human body target exceed a preset confidence threshold is judged; if the local confidence of the head region feature and/or the local confidence of the shoulder-waist region feature does not exceed the confidence threshold, the corresponding human body target is judged to have no visible upper body and is removed by a second filtering;
after filtering is completed, a preset feature comparison type is acquired; the feature comparison type for reflective clothing detection is shoulder-waist region feature comparison, so the shoulder-waist region feature is compared with the shoulder-waist region material features among the related human body material features to obtain a shoulder-waist feature similarity;
whether the shoulder-waist feature similarity exceeds the preset similarity threshold is judged; when the shoulder-waist feature similarity does not exceed the preset similarity threshold, the corresponding human body target is judged to be dressed abnormally and its dressing abnormality count is incremented by one;
when the shoulder-waist feature similarity exceeds the similarity threshold, the next human body target is judged;
and dressing monitoring is performed according to the dressing abnormality count.
The step of performing dressing monitoring according to the dressing abnormality count comprises:
judging whether the existence time of the human body target exceeds a preset behavior duration threshold;
if the existence time of the human body target exceeds the behavior duration threshold, resetting the dressing abnormality count, recording the existence time of the human body target anew, and judging dressing abnormality again;
if the existence time of the human body target does not exceed the behavior duration threshold, judging whether the dressing abnormality count of the human body target is greater than or equal to a preset abnormality count threshold;
and if the dressing abnormality count of the human body target is greater than or equal to the abnormality count threshold, issuing a dressing abnormality warning or reporting alarm information that the reflective clothing dressing is non-compliant, completing the dressing monitoring.
The step of obtaining the abnormality count threshold comprises: obtaining the abnormality count threshold according to the existence time of the human body target, the preset image recognition speed and the dressing-abnormality hit rate.
Embodiment three: bare-chest detection and monitoring in a canteen
Firstly, acquiring an optical wing material picture which is multi-angle and multi-pose and has a simple background, marking the material picture with a clothing type, uploading the marked material picture to a feature extraction model, performing feature extraction, and acquiring human body material features, wherein the human body material features comprise: the method comprises the steps that overall material characteristics and local material characteristics are provided, the local material characteristics have corresponding local confidence degrees, and a material picture, human body material characteristics and the local confidence degrees corresponding to the local material characteristics are stored to complete construction of a clothing base library;
then, a dressing monitoring task is obtained, wherein the dressing monitoring task comprises the following steps: monitoring the clothing type, the position and the size of a target area, a similarity threshold (such as 20), a behavior duration threshold (such as 35 seconds) and a dressing abnormality behavior hit rate (such as 75%), acquiring related human body material characteristics with the same clothing type from the clothing base library according to the monitored clothing type, storing the related human body material characteristics in a memory as a target base library, and performing dressing abnormality identification and dressing monitoring on the basis of the target base library;
acquiring a preset comparison mode, wherein the comparison mode for bare-chest detection is generally blacklist comparison;
acquiring a picture to be identified;
inputting the picture to be recognized into the feature extraction model for human body target detection and extraction, and determining the corresponding human body targets;
judging whether each human body target is within the preset target area, and performing first-pass filtering to remove human body targets outside the target area;
acquiring the human body overall features and human body local features of each human body target, together with the local confidence corresponding to each human body local feature; the human body local features include: a head region feature, a shoulder-and-waist region feature, a crotch region feature, and a leg region feature;
judging whether the local confidence of the head region feature and the local confidence of the shoulder-and-waist region feature exceed the preset confidence threshold; if either confidence does not exceed the threshold, judging that the corresponding human body target has no visible upper half body and performing second-pass filtering to remove it;
after filtering is finished, acquiring the preset feature comparison type, wherein the feature comparison type for bare-chest detection is a combined comparison of overall feature comparison and shoulder-and-waist region feature comparison;
performing a first comparison of the human body overall features with the human body overall material features in the related human body material features to obtain an overall similarity;
when the overall similarity exceeds the preset similarity threshold, judging that the corresponding human body target is abnormal in dressing, and accumulating the dressing abnormal times of the human body target once;
when the overall similarity does not exceed the similarity threshold, performing a second comparison of the shoulder-and-waist region features in the human body local features with the shoulder-and-waist region material features in the related human body material features to obtain a shoulder-and-waist similarity;
judging whether the shoulder-and-waist similarity exceeds the similarity threshold; if the shoulder-and-waist similarity exceeds the similarity threshold, judging that the corresponding human body target is abnormal in dressing, and accumulating the dressing abnormal times of the human body target once;
and finally, performing dressing monitoring according to the dressing abnormal times.
The step of performing dressing monitoring according to the dressing abnormal times includes:
judging whether the existence time of the human body target exceeds the preset behavior duration threshold;
if the existence time exceeds the behavior duration threshold, resetting the dressing abnormal times, recording the existence time of the human body target anew, and performing dressing abnormality judgment again;
if the existence time does not exceed the behavior duration threshold, judging whether the dressing abnormal times of the human body target are greater than or equal to the preset abnormal times threshold;
and if the dressing abnormal times of the human body target are greater than or equal to the abnormal times threshold, issuing a dressing abnormality warning or reporting bare-chest alarm information, thereby completing dressing monitoring. The step of acquiring the abnormal times threshold includes: acquiring the abnormal times threshold according to the existence time of the human body target, the preset image recognition speed, and the dressing abnormal behavior hit rate. A non-limiting sketch of this blacklist comparison flow is given below.
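Continuing the illustrative sketch above (and reusing its hypothetical `cosine_similarity` helper and data layout), the blacklist flow of this example could look roughly as follows; the first- and second-pass filtering is assumed to have already been applied, and here a match with the bare-chest base library is what signals the abnormality:

```python
def bare_chest_abnormal(target, base_library, sim_threshold=20.0):
    """Blacklist check: matching the bare-chest base library means abnormal dressing."""
    # First comparison: human body overall features against overall material features.
    overall = max(
        cosine_similarity(target["overall_features"], material)
        for material in base_library["overall_features"]
    )
    if overall > sim_threshold:
        return True  # the target resembles the blacklisted material overall
    # Second comparison: shoulder-and-waist region features against the
    # corresponding material features, used when the overall match is inconclusive.
    shoulder_waist = max(
        cosine_similarity(target["local_features"]["shoulder_waist"], material)
        for material in base_library["shoulder_waist_features"]
    )
    return shoulder_waist > sim_threshold
```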
As shown in fig. 5, the present embodiment further provides a dressing monitoring system, including:
the preprocessing module is used for collecting material pictures, inputting the material pictures into a pre-trained feature extraction model for feature extraction, and obtaining a clothing base library, wherein the clothing base library comprises: human body material characteristics and corresponding garment types;
the resource scheduling module is used for acquiring a dressing monitoring task, and calling related human body material characteristics in the clothing base library according to the type of the monitored garment in the dressing monitoring task;
the dressing monitoring module is used for acquiring a picture to be identified, inputting the picture to be recognized into the feature extraction model for feature extraction, and acquiring target feature associated information, wherein the target feature associated information includes: the human body overall characteristics, the human body local characteristics, and the local confidence corresponding to the human body local characteristics, and for performing dressing monitoring according to the related human body material characteristics and the target feature associated information; the preprocessing module, the resource scheduling module and the dressing monitoring module are connected. An illustrative sketch of how these three modules can be composed is given below.
The system in this embodiment builds the clothing base library from the collected material pictures, calls the base library resources according to the dressing monitoring task, recognizes the human body targets in the picture to be identified on the basis of those resources, and effectively combines the human body overall characteristics, the human body local characteristics and the local confidence corresponding to the human body local characteristics. In this way the accuracy of dressing monitoring is improved and its cost is reduced; the degree of automation is high; no complex dressing monitoring model or dressing detection model needs to be constructed; and the scheme is convenient to implement, requires little specialist expertise, and has high adaptability.
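As a non-limiting illustration of this module arrangement only (class and method names such as `extract` and `detect_targets` are invented here, not taken from the patent), the three modules could be composed roughly as follows:

```python
class PreprocessingModule:
    """Builds the clothing base library from labeled material pictures."""
    def __init__(self, feature_extractor):
        self.feature_extractor = feature_extractor  # the pre-trained feature extraction model

    def build_base_library(self, material_pictures):
        return [
            {"clothing_type": pic["clothing_type"],
             "features": self.feature_extractor.extract(pic["image"])}
            for pic in material_pictures
        ]


class ResourceSchedulingModule:
    """Selects the related human body material features for one monitoring task."""
    def select_related_features(self, base_library, task):
        return [entry for entry in base_library
                if entry["clothing_type"] == task["clothing_type"]]


class DressingMonitoringModule:
    """Runs abnormality checks on every human body target in a picture."""
    def __init__(self, feature_extractor, abnormal_check):
        self.feature_extractor = feature_extractor
        self.abnormal_check = abnormal_check  # e.g. a whitelist or blacklist check

    def monitor(self, picture, related_features):
        targets = self.feature_extractor.detect_targets(picture)
        return [t for t in targets if self.abnormal_check(t, related_features)]
```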
In some embodiments, the step of obtaining the clothing base library comprises:
marking the clothing category of the material picture;
inputting the marked material pictures into the feature extraction model for feature extraction, and acquiring a plurality of human body material features, wherein the human body material features comprise: the overall material characteristics, the local material characteristics and the local confidence degrees corresponding to the local material characteristics;
the local material characteristics comprise head area material characteristics and shoulder and waist area material characteristics, and local confidence degrees corresponding to the head area material characteristics and the shoulder and waist area material characteristics are obtained;
judging whether the local confidence corresponding to the head region material characteristics and the shoulder and waist region material characteristics exceeds a preset confidence threshold value or not;
and if the local confidence of the head area material characteristics and/or the local confidence of the shoulder and waist area material characteristics does not exceed the confidence threshold, filtering out the corresponding human body target, thereby completing acquisition of the clothing base library. A minimal sketch of this construction step is given below.
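The following non-limiting sketch assumes a hypothetical `feature_model.extract` call that returns overall features, local features and local confidences for one picture (the patent does not name such an API), and an assumed confidence threshold of 0.5:

```python
def build_clothing_base_library(material_pictures, feature_model, conf_threshold=0.5):
    """Keep only material entries whose head and shoulder-and-waist regions are reliable."""
    base_library = []
    for pic in material_pictures:
        result = feature_model.extract(pic["image"])  # hypothetical extraction call
        conf = result["local_confidence"]
        # Filter out material with an unreliable head or shoulder-and-waist region.
        if conf["head"] < conf_threshold or conf["shoulder_waist"] < conf_threshold:
            continue
        base_library.append({
            "clothing_type": pic["clothing_type"],        # label added during annotation
            "overall_features": result["overall_features"],
            "local_features": result["local_features"],
            "local_confidence": conf,
            "picture": pic["image"],
        })
    return base_library
```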
In some embodiments, the obtaining of the feature extraction model comprises:
obtaining a training set, the training set comprising: training samples, real sample features corresponding to the training samples, and real confidences corresponding to the real sample features;
inputting the training samples into a neural network for feature extraction, and obtaining prediction sample features and the prediction confidences of the prediction sample features, wherein the prediction sample features include: prediction sample overall features and prediction sample local features;
and training the neural network according to the real sample features, the real confidences, the prediction sample features and the prediction confidences to obtain the feature extraction model. A rough training sketch is given below.
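The following non-limiting training sketch, in PyTorch, assumes the network returns predicted features and predicted confidences together and that both are supervised with a mean-squared-error loss; the patent does not specify the network structure or the loss functions:

```python
import torch
import torch.nn as nn

def train_feature_extractor(model, loader, epochs=10, lr=1e-3):
    """model(images) is assumed to return (predicted_features, predicted_confidence)."""
    feature_loss = nn.MSELoss()      # distance to the real sample features
    confidence_loss = nn.MSELoss()   # distance to the real confidences
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for images, real_features, real_confidence in loader:
            pred_features, pred_confidence = model(images)
            loss = (feature_loss(pred_features, real_features)
                    + confidence_loss(pred_confidence, real_confidence))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```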
In some embodiments, the step of performing dressing monitoring according to the related human material characteristics and target characteristic association information includes:
obtaining the comparison mode in the dressing monitoring task, wherein the comparison mode includes: whitelist comparison and blacklist comparison;
inputting the picture to be recognized into the feature extraction model for human body target extraction, and acquiring the corresponding human body targets;
judging whether each human body target is within the preset target area, and performing first-pass filtering to remove human body targets outside the target area;
judging whether the local confidence of the head region feature and the local confidence of the shoulder-and-waist region feature in the human body local features exceed the preset confidence threshold; if either confidence does not exceed the threshold, judging that the corresponding human body target has no visible upper half body and performing second-pass filtering to remove it;
after filtering is finished, performing a first comparison of the human body overall features with the human body overall material features in the related human body material features to obtain a first similarity;
and according to the comparison mode and the first similarity, dressing monitoring is carried out.
In some embodiments, the step of performing dressing monitoring according to the comparison mode and the first similarity comprises:
if the comparison mode is whitelist comparison, judging that the corresponding human body target is abnormal in dressing when the first similarity does not exceed a preset similarity threshold, and accumulating the dressing abnormal times of the human body target once;
when the first similarity exceeds the similarity threshold, performing a second comparison of the shoulder and waist area characteristics in the human body local characteristics with the shoulder and waist area material characteristics in the related human body material characteristics to obtain a second similarity;
judging whether the second similarity exceeds the similarity threshold, if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target is abnormal in dressing, and accumulating the dressing abnormal times of the human body target once;
and according to the dressing abnormal times, dressing monitoring is carried out.
In some embodiments, the step of performing dressing monitoring according to the comparison mode and the first similarity further includes:
if the comparison mode is blacklist comparison, when the first similarity exceeds a preset similarity threshold, judging that the corresponding human body target is abnormal in dressing, and accumulating the dressing abnormal times of the human body target once;
when the first similarity does not exceed the similarity threshold, performing a third comparison of the shoulder and waist area characteristics in the human body local characteristics with the shoulder and waist area material characteristics in the related human body material characteristics to obtain a third similarity;
judging whether the third similarity exceeds the similarity threshold, if so, judging that the corresponding human body target is abnormal in dressing, and accumulating the dressing abnormal times of the human body target once;
and performing dressing monitoring according to the dressing abnormal times. A compact sketch contrasting the two comparison modes is given below.
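The whitelist and blacklist branches differ only in the direction of the decision; the following non-limiting sketch builds on the hypothetical `cosine_similarity` helper and data layout from the earlier examples:

```python
def dressing_abnormal(target, related_features, mode, sim_threshold=20.0):
    """Comparison step for one already-filtered target; mode is 'whitelist' or 'blacklist'."""
    def local_similarity():
        return max(cosine_similarity(target["local_features"]["shoulder_waist"], m)
                   for m in related_features["shoulder_waist_features"])

    # First comparison: overall features against the overall material features.
    first = max(cosine_similarity(target["overall_features"], m)
                for m in related_features["overall_features"])
    if mode == "whitelist":
        if first <= sim_threshold:       # expected garment not found overall
            return True
        return local_similarity() <= sim_threshold   # second comparison
    # Blacklist mode.
    if first > sim_threshold:            # forbidden dressing found overall
        return True
    return local_similarity() > sim_threshold        # third comparison
```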
In some embodiments, the step of performing dressing monitoring according to the dressing abnormal times includes:
judging whether the existence time of the human body target exceeds a preset behavior duration threshold;
if the existence time of the human body target exceeds the behavior duration threshold, resetting the dressing abnormal times, recording the existence time of the human body target anew, and performing dressing abnormality judgment again;
if the existence time of the human body target does not exceed the behavior duration threshold, judging whether the dressing abnormal times of the human body target are greater than or equal to a preset abnormal times threshold;
if the dressing abnormal times of the human body target are greater than or equal to the abnormal times threshold, issuing a dressing abnormality warning to complete dressing monitoring;
the acquiring step of the abnormal times threshold value comprises the following steps: acquiring the abnormal times threshold according to the existence time of the human body target, the preset image recognition speed and the dressing abnormal behavior hit rate, wherein the mathematical expression of the abnormal times threshold is as follows:
C=t×V×β
wherein C is the abnormal times threshold, t is the current existence time of the human body target, V is the image recognition speed, and β is the dressing abnormal behavior hit rate.
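For example, a human body target that has existed for t = 30 seconds, with an assumed image recognition speed V of 2 pictures per second (the patent gives no example value for V) and the 75% hit rate from Example three, gives C = 30 × 2 × 0.75 = 45. A non-limiting sketch of the surrounding windowed decision, keeping the per-target state in a plain dictionary as a simplifying assumption:

```python
import time

def update_and_check_alarm(state, is_abnormal, behavior_duration_threshold=35.0,
                           recognition_speed=2.0, hit_rate=0.75):
    """state holds 'start_time' and 'abnormal_count' for one tracked human body target."""
    now = time.time()
    existence_time = now - state["start_time"]
    # If the target has existed longer than the behavior duration threshold,
    # reset the count and start timing its existence again.
    if existence_time > behavior_duration_threshold:
        state["start_time"] = now
        state["abnormal_count"] = 0
        existence_time = 0.0
    if is_abnormal:
        state["abnormal_count"] += 1
    # Abnormal times threshold C = t * V * beta, with t the current existence time.
    threshold = existence_time * recognition_speed * hit_rate
    # Raise the dressing abnormality alarm once the accumulated count reaches C.
    return threshold > 0 and state["abnormal_count"] >= threshold
```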
The present embodiment also provides a computer-readable storage medium on which a computer program is stored, which when executed by a processor implements any of the methods in the present embodiments.
The present embodiment further provides an electronic terminal, including: a processor and a memory;
the memory is used for storing computer programs, and the processor is used for executing the computer programs stored by the memory so as to enable the terminal to execute the method in the embodiment.
The computer-readable storage medium in the present embodiment can be understood by those skilled in the art as follows: all or part of the steps for implementing the above method embodiments may be performed by hardware associated with a computer program. The aforementioned computer program may be stored in a computer readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
The electronic terminal provided by the embodiment comprises a processor, a memory, a transceiver and a communication interface, wherein the memory and the communication interface are connected with the processor and the transceiver and are used for completing mutual communication, the memory is used for storing a computer program, the communication interface is used for carrying out communication, and the processor and the transceiver are used for operating the computer program so that the electronic terminal can execute the steps of the method.
In this embodiment, the memory may include a random access memory (RAM), and may also include a non-volatile memory, such as at least one magnetic disk memory.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical concept disclosed by the present invention shall be covered by the claims of the present invention.

Claims (10)

1. A method of dressing monitoring, comprising:
collecting material pictures, inputting the material pictures into a pre-trained feature extraction model for feature extraction, and obtaining a clothing base library, wherein the clothing base library comprises: human body material characteristics and corresponding garment types;
acquiring a dressing monitoring task;
calling related human body material characteristics in the clothing base library according to the type of the monitored garment in the dressing monitoring task;
acquiring a picture to be identified;
inputting the picture to be recognized into a feature extraction model for feature extraction, and acquiring target feature associated information, wherein the target feature associated information comprises: the human body overall characteristic, the human body local characteristic and the local confidence corresponding to the human body local characteristic;
and according to the related human body material characteristics and the target characteristic correlation information, dressing monitoring is carried out.
2. The dressing monitoring method of claim 1, wherein the step of obtaining the clothing base library comprises:
marking the clothing category of the material picture;
inputting the marked material pictures into the feature extraction model for feature extraction, and acquiring a plurality of human body material features, wherein the human body material features comprise: the overall material characteristics, the local material characteristics and the local confidence degrees corresponding to the local material characteristics;
the local material characteristics comprise head area material characteristics and shoulder and waist area material characteristics, and local confidence degrees corresponding to the head area material characteristics and the shoulder and waist area material characteristics are obtained;
judging whether the local confidence corresponding to the head region material characteristics and the shoulder and waist region material characteristics exceeds a preset confidence threshold value or not;
and if the local confidence of the head area material characteristics and/or the local confidence of the shoulder and waist area material characteristics does not exceed the confidence threshold, filtering out the corresponding human body target, thereby completing acquisition of the clothing base library.
3. The dressing monitoring method of claim 1, wherein the step of obtaining the feature extraction model comprises:
obtaining a training set, the training set comprising: training samples, real sample features corresponding to the training samples, and real confidence degrees corresponding to the real sample features;
inputting the training samples into a neural network for feature extraction, and obtaining prediction sample features and prediction confidence degrees of the prediction sample features, wherein the prediction sample features comprise: prediction sample overall features and prediction sample local features;
and training the neural network according to the real sample features, the real confidence degrees, the prediction sample features and the prediction confidence degrees to obtain the feature extraction model.
4. The method of claim 1, wherein the step of performing dressing monitoring according to the related human body material characteristics and the target characteristic association information comprises:
obtaining a comparison mode in the dressing monitoring task, wherein the comparison mode comprises: whitelist comparison and blacklist comparison;
inputting the picture to be recognized into the feature extraction model for human body target extraction, and acquiring the corresponding human body target;
judging whether the human body target is within the preset target area, and performing first-pass filtering on human body targets that are not in the target area;
judging whether the local confidence of the head region feature and the local confidence of the shoulder and waist region feature in the human body local features of the human body target exceed the preset confidence threshold; if the local confidence of the head region feature and/or the local confidence of the shoulder and waist region feature does not exceed the confidence threshold, judging that the corresponding human body target has no upper half body, and performing second-pass filtering on the human body target;
after filtering is finished, performing a first comparison of the human body overall characteristics with the human body overall material characteristics in the related human body material characteristics to obtain a first similarity;
and according to the comparison mode and the first similarity, dressing monitoring is carried out.
5. The dressing monitoring method of claim 4, wherein the step of performing dressing monitoring according to the comparison mode and the first similarity comprises:
if the comparison mode is whitelist comparison, judging that the corresponding human body target is abnormal in dressing when the first similarity does not exceed a preset similarity threshold, and accumulating the dressing abnormal times of the human body target once;
when the first similarity exceeds the similarity threshold, performing a second comparison of the shoulder and waist area characteristics in the human body local characteristics with the shoulder and waist area material characteristics in the related human body material characteristics to obtain a second similarity;
judging whether the second similarity exceeds the similarity threshold, if the second similarity does not exceed the similarity threshold, judging that the corresponding human body target is abnormal in dressing, and accumulating the dressing abnormal times of the human body target once;
and according to the dressing abnormal times, dressing monitoring is carried out.
6. The dressing monitoring method of claim 4, wherein the step of performing dressing monitoring according to the comparison mode and the first similarity further comprises:
if the comparison mode is blacklist comparison, when the first similarity exceeds a preset similarity threshold, judging that the corresponding human body target is abnormal in dressing, and accumulating the dressing abnormal times of the human body target once;
when the first similarity does not exceed the similarity threshold, performing a third comparison of the shoulder and waist area characteristics in the human body local characteristics with the shoulder and waist area material characteristics in the related human body material characteristics to obtain a third similarity;
judging whether the third similarity exceeds the similarity threshold, if so, judging that the corresponding human body target is abnormal in dressing, and accumulating the dressing abnormal times of the human body target once;
and according to the dressing abnormal times, dressing monitoring is carried out.
7. The method of claim 5 or 6, wherein the step of performing dressing monitoring according to the dressing abnormal times comprises:
judging whether the existence time of the human body target exceeds a preset behavior duration threshold;
if the existence time of the human body target exceeds the behavior duration threshold, resetting the dressing abnormal times, recording the existence time of the human body target anew, and performing dressing abnormality judgment again;
if the existence time of the human body target does not exceed the behavior duration threshold, judging whether the dressing abnormal times of the human body target are greater than or equal to a preset abnormal times threshold;
if the dressing abnormal times of the human body target are greater than or equal to the abnormal times threshold, sending out a dressing abnormality warning to finish dressing monitoring;
the acquiring step of the abnormal times threshold value comprises the following steps: acquiring the abnormal times threshold according to the existence time of the human body target, the preset image recognition speed and the dressing abnormal behavior hit rate, wherein the mathematical expression of the abnormal times threshold is as follows:
C=t×V×β
wherein C is the abnormal times threshold, t is the current existence time of the human body target, V is the image recognition speed, and β is the dressing abnormal behavior hit rate.
8. A dressing monitoring system, comprising:
the preprocessing module is used for collecting material pictures, inputting the material pictures into a pre-trained feature extraction model for feature extraction, and obtaining a clothing base library, wherein the clothing base library comprises: human body material characteristics and corresponding garment types;
the resource scheduling module is used for acquiring a dressing monitoring task; calling related human body material characteristics in the clothing base library according to the type of the monitored garment in the dressing monitoring task;
the dressing monitoring module is used for acquiring a picture to be identified; inputting the picture to be recognized into a feature extraction model for feature extraction, and acquiring target feature associated information, wherein the target feature associated information comprises: the human body overall characteristic, the human body local characteristic and the local confidence corresponding to the human body local characteristic; according to the related human body material characteristics and the target characteristic correlation information, dressing monitoring is carried out; the preprocessing module, the resource scheduling module and the dressing monitoring module are connected.
9. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program, when executed by a processor, implements the method of any one of claims 1 to 7.
10. An electronic terminal, comprising: a processor and a memory;
the memory is for storing a computer program and the processor is for executing the computer program stored by the memory to cause the terminal to perform the method of any of claims 1 to 7.
CN202111165053.5A 2021-09-30 2021-09-30 Dressing monitoring method, dressing monitoring system, dressing monitoring medium and electronic terminal Active CN113837138B (en)

Publications (2)

Publication Number Publication Date
CN113837138A true CN113837138A (en) 2021-12-24
CN113837138B CN113837138B (en) 2023-08-29





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant