CN113963258A - Worker card wearing identification method, device, equipment and storage medium - Google Patents

Worker card wearing identification method, device, equipment and storage medium

Info

Publication number
CN113963258A
CN113963258A
Authority
CN
China
Prior art keywords
card
wearing
neural network
worker
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111140116.1A
Other languages
Chinese (zh)
Inventor
徐梦佳
李斯
杨周龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongpu Software Co Ltd
Original Assignee
Dongpu Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongpu Software Co Ltd filed Critical Dongpu Software Co Ltd
Priority to CN202111140116.1A priority Critical patent/CN113963258A/en
Publication of CN113963258A publication Critical patent/CN113963258A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a worker card wearing identification method, device, equipment and storage medium. To address the missed checks and the time and labor consumed by the existing practice of manually supervising whether worker cards are worn in the standard way, images of standard and non-standard worker card wearing are obtained, and the key point information in the images is labeled to form an image data set. Using the image data set as training data, a preset MobileNet neural network model is trained with the TensorFlow framework, and a Momentum optimization method is used to improve the training accuracy; the trained MobileNet neural network model is deployed on the mobile device side as a worker card wearing identification model that detects whether workers' worker card wearing is standard. Based on machine learning and deep learning, the method adopts the lightweight MobileNet network structure together with Momentum optimization to identify whether employees wear their worker cards in the standard way; it can be deployed on mobile terminals, improves inspection accuracy, and saves human resources.

Description

Worker card wearing identification method, device, equipment and storage medium
Technical Field
The invention belongs to the technical field of staff management, and particularly relates to a worker card wearing identification method, device, equipment and storage medium.
Background
The worker card not only displays the wearer's position and identity information but also embodies the enterprise's management culture. From the employee's perspective, the worker card is the enterprise's confirmation of the employee's identity; from the enterprise's perspective, worker cards are part of the enterprise's image. Having staff wear worker cards not only helps strengthen employees' professional pride and sense of responsibility, but also supports the enterprise's standardized management and makes it easy to identify and supervise employees.
Currently, many enterprises require employees to wear worker cards during working hours. However, some employees wear them in the standard way and some do not. Every employee's appearance and dress represents the company's image. To present a professional and reliable corporate image to customers, workers must wear their worker cards correctly. In the past, worker card wearing was checked manually, which suffers from missed checks and consumes time and labor.
Disclosure of Invention
The invention aims to provide a worker card wearing identification method, device, equipment and storage medium that replace the original manual inspection and can be deployed on mobile terminals, thereby improving inspection accuracy and saving human resources.
In order to solve the problems, the technical scheme of the invention is as follows:
a worker card wearing identification method comprises the following steps:
the server side obtains an image of standard worker card wearing and an image of non-standard worker card wearing, and labels key point information in the images to form an image data set;
taking the image data set as training data, training a preset MobileNet neural network model with the TensorFlow framework, and improving the training accuracy with a Momentum optimization method to obtain the trained MobileNet neural network model as the worker card wearing identification model;
and deploying the worker card wearing identification model on the mobile device side to detect whether workers' worker card wearing is standard.
According to an embodiment of the present invention, training the preset MobileNet neural network model with the TensorFlow framework further includes:
converting the pictures of the image data set into TFRecord form for storage, preprocessing the images, and expanding the number of pictures in the image data set;
and building the MobileNet neural network structure, setting parameter information including the learning rate, the minimum batch size and the number of training epochs, and training the MobileNet neural network model.
According to an embodiment of the present invention, the building of the MobileNet neural network structure further includes:
the MobileNet neural network model comprises convolutional layers, excitation layers, pooling layers and a fully connected layer. The convolutional layers are used to extract the feature information and feature mapping relations of the image; the excitation layers are used to apply nonlinear operations to the feature information according to the feature mapping relations and extract deep feature information; the pooling layers are used to compress the image; and the fully connected layer is used to fit the deep feature information of the compressed image and pass the fitted deep feature information to the classification-regression layer, which computes and outputs the identification result.
According to an embodiment of the present invention, the training of the MobileNet neural network model further includes:
and in the process of training the MobileNet neural network model, updating the parameters of the MobileNet neural network model with the SGD (stochastic gradient descent) method.
According to an embodiment of the present invention, the deploying the identification model of wearing the card at the mobile device further includes:
creating a classification server for the worker card wearing identification model, and defining a classification server interface;
and establishing connection between the classification server and the mobile equipment terminal, and applying the classification server to the mobile equipment terminal.
According to an embodiment of the present invention, the creating a classification server for a card wearing identification model and defining a classification server interface further comprises:
generating a classification server interface based on TensorFlow Serving, implementing the worker card wearing classification function, and generating the classification server; when the classification server runs, listening for requests from the mobile device side and returning the service's responses.
A card wear identification device comprising:
the data module is used for the server side to obtain an image of standard worker card wearing and an image of non-standard worker card wearing, and to label key point information in the images to form an image data set;
the model module is used for taking the image data set as training data, training a preset MobileNet neural network model with the TensorFlow framework, improving the training accuracy with a Momentum optimization method, and obtaining the trained MobileNet neural network model as the worker card wearing identification model;
and the deployment module is used for deploying the worker card wearing identification model at the mobile equipment end and detecting whether worker card wearing of workers is standard or not.
According to an embodiment of the present invention, the model module includes a data processing unit and a model training unit;
the data processing unit is used for converting the pictures of the image data set into TFRecord form for storage, preprocessing the images, and expanding the number of pictures in the image data set;
the model training unit is used for building the MobileNet neural network structure, setting parameter information including the learning rate, the minimum batch size and the number of training epochs, and training the MobileNet neural network model.
A worker card wearing identification equipment, comprising:
a memory having instructions stored therein and a processor, the memory and the processor interconnected by a line;
the processor calls the instruction in the memory to realize the card wearing identification method in one embodiment of the invention.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a card wearing identification method in an embodiment of the present invention.
Due to the adoption of the technical scheme, compared with the prior art, the invention has the following advantages and positive effects:
aiming at the problems of missing detection, time consumption, labor consumption and the like existing in the conventional method of regularly wearing the work card by adopting a manual supervision worker, the work card wearing identification method in one embodiment of the invention comprises the steps of obtaining an image of the regularly worn work card and an image of the irregularly worn work card, and labeling key point information in the images to form an image data set; the image data set is used as training data, a preset MobileNet neural network model is trained by adopting a TensorFlow frame, the training accuracy is improved by adopting a Momentum optimization method, the trained MobileNet neural network model is obtained, and the model is deployed at a mobile equipment end as a work card wearing recognition model and used for detecting whether the work card wearing of a worker is standard or not. Based on machine learning and deep learning technologies, a lightweight mobile Net network structure is adopted, meanwhile, a Momentum optimization method is used for improving training accuracy, staff standard wearing work cards are identified, the mobile terminal can be deployed, inspection accuracy is improved, and human resources are saved.
Drawings
FIG. 1 is a flow chart of a method for identifying the wearing of a workcard according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a structure of a convolution kernel of a MobileNet network according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a MobileNet network convolution kernel model according to an embodiment of the present invention;
FIG. 4 is a block diagram of a card wearing identification device according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a card wearing identification device in an embodiment of the invention.
Detailed Description
The worker card wearing identification method, device, equipment and storage medium provided by the invention are further described in detail below with reference to the accompanying drawings and specific embodiments. The advantages and features of the invention will become clearer from the following description and the claims.
Example one
This embodiment provides a worker card wearing identification method, aimed at the missed checks and the time and labor consumed by the existing practice of manually supervising worker card wearing.
Specifically, referring to fig. 1, the identification method for wearing the workmanship card comprises the following steps:
s1: the server side obtains an image of a standard wearing worker card and an image of an irregular wearing worker card, and marks key point information in the images to form an image data set;
s2: taking the image data set as training data, training a preset MobileNet neural network model by adopting a TensorFlow frame, and improving the training accuracy by adopting a Momentum optimization method to obtain the trained MobileNet neural network model as a card wearing recognition model;
s3: and deploying the worker card wearing identification model at the mobile equipment end for detecting whether worker card wearing of workers is standard.
In step S1, the server side obtains images of standard and non-standard worker card wearing, and labels the key point information in the images to form an image data set.
The server side can capture images of workers from video recorded by cameras. Alternatively, a DSS digital surveillance system can be set up: all cameras are connected to a local area network and accessed through the DSS monitoring platform, from which worker images are captured, including images of standard and non-standard worker card wearing.
After the images of standard and non-standard worker card wearing are obtained, the key point information of the images needs to be labeled. The key point information is the position and posture of the worker card. For example, a lanyard-style worker card is worn standardly when it hangs from the neck, sits in front of the chest, and is upright rather than tilted; a pin-style worker card is worn standardly when it is pinned to the left chest, upright and not tilted. In practical application, the corresponding key point information is labeled according to the worker card style. A labeling tool can be used to label the key point information.
After the image key point information is labeled, txt files for the train (training) set and the val (validation) set can be generated directly by a script, and the two sets are then each stored as a separate TFRecord file, forming the image data set used later to train the MobileNet neural network model.
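The split script itself is not disclosed in the patent, but the step above can be sketched in plain Python; the file-name pattern, the 80/20 ratio and the 0/1 labels below are assumptions for illustration:

```python
import random

def split_dataset(samples, val_ratio=0.2, seed=42):
    """Shuffle (path, label) pairs and split them into train/val lists,
    mirroring the script that generates the train and val txt files."""
    items = list(samples)
    random.Random(seed).shuffle(items)
    n_val = int(len(items) * val_ratio)
    return items[n_val:], items[:n_val]  # (train, val)

def to_lines(rows):
    # One "path label" line per sample, as it would appear in the txt file.
    return [f"{path} {label}" for path, label in rows]

# Toy sample list; label 0 = standard wearing, 1 = non-standard (assumed).
samples = [(f"img_{i:04d}.jpg", i % 2) for i in range(100)]
train, val = split_dataset(samples)
```

The seeded shuffle keeps the split reproducible between runs, which matters when the two sets are later frozen into separate TFRecord files.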
In step S2, the image data set is used as training data, a preset MobileNet neural network model is trained with the TensorFlow framework, and a Momentum optimization method is used to improve the training accuracy, yielding the trained MobileNet neural network model, which serves as the worker card wearing identification model.
Training the preset MobileNet neural network model with the TensorFlow framework further includes:
S200: converting the pictures of the image data set into TFRecord form for storage, preprocessing the images, and expanding the number of pictures in the image data set;
S201: building the MobileNet neural network structure, setting parameter information including the learning rate, the minimum batch size and the number of training epochs, and training the MobileNet neural network model.
In step S200, since this embodiment uses the TensorFlow framework, the pictures of the image data set need to be converted into TFRecord form for storage to facilitate data reading.
For the MobileNet neural network model to be adequately trained, the images must be preprocessed. In practical application, the images are read from the TFRecord file, randomly flipped left-right, and randomly cropped to a specified size, so that after the neural network has trained one epoch over the data set, the input pictures of the next epoch differ from those of the previous one. This effectively expands the number of pictures in the image data set and mitigates the network's tendency to overfit.
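The random flip-and-crop augmentation described above can be illustrated in plain Python; a real pipeline would use TensorFlow image ops on tensors read from TFRecord, so this toy version on a nested list only shows the idea:

```python
import random

def random_flip_crop(image, crop_h, crop_w, rng=random):
    """Randomly flip an image (list of rows) left-right, then take a random
    crop of size crop_h x crop_w, so each epoch sees a different picture."""
    if rng.random() < 0.5:
        image = [row[::-1] for row in image]  # horizontal flip
    h, w = len(image), len(image[0])
    top = rng.randrange(h - crop_h + 1)
    left = rng.randrange(w - crop_w + 1)
    return [row[left:left + crop_w] for row in image[top:top + crop_h]]

img = [[10 * r + c for c in range(6)] for r in range(6)]  # toy 6x6 "image"
patch = random_flip_crop(img, 4, 4)
```

Because flip and crop offsets are drawn fresh every call, repeated passes over the same stored file yield different training inputs, which is exactly the data-expansion effect the text describes.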
In step S201, a MobileNet neural network structure is built. MobileNet constructs a lightweight deep neural network using depthwise separable convolutions. Its basic building block is the depthwise separable filter, which consists of a depthwise convolution filter and a pointwise convolution filter. Compared with standard convolution kernels, the depthwise convolution filters let the MobileNet neural network greatly reduce the amount of computation and speed up image classification. Referring to FIG. 2, if the input feature map has size D_F × D_F × M and the output feature map has size D_F × D_F × N, then for a standard D_K × D_K convolution the computation cost is

D_K × D_K × M × D_F × D_F × N.

The cost of the depthwise convolution is D_K × D_K × M × D_F × D_F, and the cost of the pointwise convolution is M × D_F × D_F × N, so the total cost of the depthwise separable convolution is

D_K × D_K × M × D_F × D_F + M × D_F × D_F × N.

The ratio of the depthwise separable cost to the standard cost is therefore

(D_K × D_K × M × D_F × D_F + M × D_F × D_F × N) / (D_K × D_K × M × D_F × D_F × N) = 1/N + 1/(D_K × D_K).

When a 3 × 3 convolution kernel is used, the depthwise separable convolution reduces computation by roughly a factor of 9 compared with standard convolution.
The basic structure of the depthwise separable convolution block is shown in fig. 3, with ReLU used as the activation function. The building block is a 3 × 3 depthwise convolution, and some of the blocks downsample via stride = 2. The feature map is then reduced to 1 × 1 by average pooling, a fully connected layer is added according to the number of prediction categories, and finally a softmax layer is appended. In practical applications, the computation is concentrated almost entirely in the 1 × 1 convolutions. At the low level, convolution is generally implemented via im2col, which requires memory reorganization; for a 1 × 1 convolution kernel, however, im2col and its memory reorganization are unnecessary, so the low-level implementation, and hence the model's inference, is faster.
The basic principle of the structure of the MobileNet neural network is introduced, and in practical application, specific modeling is needed.
For example, step 300: the MobileNet neural network model is decomposed into an input layer, convolutional layers, excitation layers, pooling layers, a fully connected layer and a classification-regression layer. The image enters the MobileNet neural network model through the input layer; the convolutional layers extract the image's feature information and feature mapping relations; the excitation layers apply nonlinear operations to the feature information according to the feature mapping relations and extract deep feature information; the pooling layers compress the image; and the fully connected layer fits the deep feature information of the compressed image and passes the fitted features to the classification-regression layer, which computes and outputs the identification result. In practical applications, the convolutional layers of the MobileNet neural network model may use the ReLU function as the activation function, or alternatively the MaxMin function.
To improve the classification precision of the MobileNet neural network model, the model can be combined with dilated (atrous) convolution to enlarge the receptive field of the feature map. In a convolutional neural network, the receptive field is defined as the area of the input image that a pixel on a layer's output feature map is mapped from; that is, one point on the feature map corresponds to a region of the input image. To enlarge the receptive field and reduce computation, a CNN usually downsamples (pooling or stride-2 convolution); this enlarges the receptive field but lowers the spatial resolution, which in turn loses image information. To enlarge the receptive field without reducing the resolution, this embodiment uses dilated convolution. Its advantages are that, without losing image information, it both enlarges the receptive field, so large targets can be detected and segmented, and preserves resolution, so targets can be located accurately. In practical application, the dilation parameters can be added on top of the MobileNet neural network.
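The effect of dilation on the receptive field can be checked numerically. The recurrence below is the standard receptive-field formula for stacked convolutions, not a formula taken from the patent:

```python
def effective_kernel(k, dilation):
    """Effective size of a kxk kernel with the given dilation rate:
    k + (k - 1) * (dilation - 1)."""
    return k + (k - 1) * (dilation - 1)

def receptive_field(layers):
    """Receptive field after stacked conv layers given (kernel, stride,
    dilation) triples, using RF += (k_eff - 1) * jump, jump *= stride."""
    rf, jump = 1, 1
    for k, stride, dilation in layers:
        rf += (effective_kernel(k, dilation) - 1) * jump
        jump *= stride
    return rf
```

Two plain 3 × 3 layers see a 5 × 5 input region, while dilating the second layer by 2 widens that to 7 × 7 with no stride and no loss of resolution, which is the trade-off the paragraph describes.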
After the MobileNet neural network model is constructed, the parameters for model training need to be set. In practical applications, the model itself already fixes some parameters, such as the number of convolution kernels in each hidden layer, the number of layers of the neural network, and the kind of activation function (i.e. defaults), but several hyperparameters must still be set, such as the learning rate, the minimum batch (mini-batch) size, the number of training epochs, and the Momentum parameter. After the hyperparameters are set, the network is trained and the parameters tuned, so that the chosen hyperparameter values are the best fit for the network. Finally, the network's test precision is obtained through validation.
The training of the MobileNet neural network model further comprises:
s400: and in the process of training the MobileNet neural network model, updating the parameters of the MobileNet neural network model by adopting an SGD gradient descent method.
SGD stands for stochastic gradient descent. In practical applications, the training data set can be divided into n batches, each containing m samples. Each update uses the data of one batch rather than the entire training set, namely:
x_{t+1} = x_t + Δx_t
Δx_t = -η · g_t
where η is the learning rate and g_t is the gradient of the objective with respect to x at step t.
When there is too much training data, updating with the entire data set is often infeasible in time. With the SGD method, the load on the machine is reduced and convergence is faster.
The SGD method converges faster when there is much redundancy in the training set (similar samples occur multiple times). For example, if the first half and the second half of the training set have the same gradient, and the first half is used as one batch and the second half is used as another batch, then two steps can be advanced to the optimal solution by using the SGD method while only one step is advanced by using the entire data set as one batch. It follows that the SGD method converges faster.
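A minimal pure-Python sketch of minibatch SGD as described above, fitting the weight w in y = w·x on a toy data set; the learning rate, batch size and data are all illustrative choices:

```python
import random

def sgd_fit(samples, lr=0.05, epochs=30, batch_size=4, seed=0):
    """Fit w in y = w*x with minibatch SGD: every update uses the gradient
    of one small batch rather than the whole training set."""
    rng = random.Random(seed)
    data = list(samples)
    w = 0.0
    for _ in range(epochs):
        rng.shuffle(data)
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            # gradient of the batch-mean squared error (w*x - y)^2
            g = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            w -= lr * g
    return w

points = [(x / 10.0, 2.0 * (x / 10.0)) for x in range(1, 21)]  # y = 2x, noise-free
w = sgd_fit(points)
```

Because this data set is perfectly redundant (every sample lies on the same line), each small batch already points toward the optimum, which is the redundancy argument made in the text.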
When the model is trained by adopting the SGD method, the gradient can be updated by combining momentum (momentum). momentum simulates the inertia of an object in motion, namely, the previous updated direction is kept to a certain extent during updating, and the final updated direction is fine-tuned by using the gradient of the current batch. Stability can be increased to a certain extent, thereby accelerating model learning. The application formula of momentum is as follows:
Δx_t = ρ · Δx_{t-1} - η · g_t
where ρ, the momentum coefficient, indicates how much of the previous update direction is kept; its value lies between 0 and 1. At the start of training, since the gradients may be large, the initial value is typically chosen to be 0.5; later, when the gradients are no longer so large, it may be changed to 0.9. η is the learning rate, i.e. how much the gradient of the current batch influences the final update direction, with the same meaning as in ordinary SGD. ρ and η need not sum to 1.
In the plain gradient descent method x += v, the update to x at each step is v = -dx · lr, where dx is the first derivative of the objective function func(x) with respect to x.
When momentum (representing the cumulative effect of force over time) is used, the update v to x is taken as the sum of this step's gradient descent term -dx · lr and the previous update v multiplied by a coefficient momentum in [0, 1], i.e. v = -dx · lr + v · momentum.
It can be seen from the formula:
when the gradient descent term -dx · lr points in the same direction as the previous update v, the previous update accelerates the search;
when the gradient descent term -dx · lr points opposite to the previous update v, the previous update decelerates the search.
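The momentum update v = -dx·lr + v·momentum can be traced numerically on f(x) = x², where successive gradients share a sign and the accumulated velocity accelerates the search (the coefficients below are illustrative):

```python
def momentum_step(x, v, grad, lr=0.1, momentum=0.9):
    """One update with classical momentum: v = -grad*lr + v*momentum."""
    v = -grad * lr + v * momentum
    return x + v, v

# Minimize f(x) = x^2, so dx = 2x. The gradient keeps the same sign, so the
# velocity v grows step by step; plain gradient descent with the same lr
# would only reach x = 5 * 0.8**3 = 2.56 after 3 steps.
x, v = 5.0, 0.0
for _ in range(3):
    x, v = momentum_step(x, v, grad=2 * x)
# after 3 steps: x is about 0.31, v about -1.99
```

The three velocities are -1.0, -1.7 and -1.99: each step inherits 90% of the previous one, which is the acceleration effect described for same-direction updates.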
When the training precision of the MobileNet neural network model reaches a preset requirement (e.g. 90% accuracy), parameter tuning of the MobileNet neural network model can be stopped, and the trained MobileNet neural network model is applied as the worker card wearing identification model.
In step S3, a worker card wearing identification model is deployed on the mobile device side for detecting whether worker card wearing of the worker is normative.
Wherein, the step of deploying the identification model of the card wearing on the mobile device further comprises:
s500: creating a classification server for the worker card wearing identification model, and defining a classification server interface; and establishing connection between the classification server and the mobile equipment terminal, and applying the classification server to the mobile equipment terminal.
Specifically, a classification server interface is generated based on TensorFlow Serving, the worker card wearing classification function is implemented, and the classification server is generated; when the classification server runs, it listens for requests from the mobile device side and returns the service's responses.
In practical application, when creating the classification server for the worker card wearing identification model, a service contract can be defined with protocol buffers, the IDL and binary wire format used by gRPC. The .proto file defines a service whose input is a JPEG-encoded image string to be classified and which returns a list of inferred classes ordered by score.
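A hedged sketch of such a service contract follows; the actual .proto file is not disclosed in the patent, so every message, field and service name below is hypothetical:

```protobuf
// Hypothetical gRPC contract of the kind described in the text.
syntax = "proto3";

message ClassificationRequest {
  bytes jpeg_encoded = 1;  // JPEG-encoded image string to classify
}

message ClassificationResponse {
  repeated string classes = 1;  // inferred classes, ordered by score
  repeated float scores = 2;    // score for each class, same order
}

service CardClassificationService {
  rpc Classify (ClassificationRequest) returns (ClassificationResponse);
}
```

Compiling such a contract yields both the binary wire format and the client/server stubs that a TensorFlow Serving style deployment would expose.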
When establishing the connection between the classification server and the mobile device side, a Python web server can be built on BaseHTTPServer; the BaseHTTPServer-based server handles the image file uploaded by the user, calls its inference function to run inference on the image file, and returns the inference result as plain text.
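A minimal sketch of such an HTTP front end, using Python 3's http.server (the successor of the BaseHTTPServer module named in the text) with the inference call stubbed out; a real deployment would forward the upload to the classification model:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def infer(image_bytes):
    # Hypothetical stand-in for the real model call.
    return "standard" if image_bytes else "no image"

class InferenceHandler(BaseHTTPRequestHandler):
    """Accept an uploaded image via POST, run inference, reply in plain text."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(infer(body).encode("utf-8"))

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), InferenceHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"
req = urllib.request.Request(url, data=b"fake-jpeg-bytes")  # POST the "image"
reply = urllib.request.urlopen(req).read().decode()
server.shutdown()
```

Binding to port 0 lets the OS pick a free port, so the demo can run anywhere; the plain-text reply matches the response style described above.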
When the classification server is applied to the mobile device side, it can be deployed as a container. The compiled classification server files can be copied to a permanent location inside the container and all temporary build files cleaned up; the state is then committed outside the container as a new Docker image, creating a snapshot of the changes to the virtual file system, and the image is pushed to the Docker service cloud to which the mobile devices belong and served from there.
In this way, the mobile device side can spot-check whether employees' worker card wearing is standard, and employees wearing their cards non-standardly can be reported to the responsible person for correction or handling according to the corresponding measures. The worker card wearing identification method in this embodiment replaces manual inspection, improves inspection accuracy, and saves human resources.
Example two
The present embodiment provides a worker card wearing identification device, please refer to fig. 4, the worker card wearing identification device includes:
the data module 1 is used for the server side to obtain an image of a standard wearing work card and an image of an irregular wearing work card, and mark key point information in the images to form an image data set;
the model module 2 is used for taking the image data set as training data, training a preset MobileNet neural network model with the TensorFlow framework, improving the training accuracy with a Momentum optimization method, and obtaining the trained MobileNet neural network model as the worker card wearing identification model;
and the deployment module 3 is used for deploying the worker card wearing identification model at the mobile equipment end and detecting whether worker card wearing of workers is standard.
The model module 2 comprises a data processing unit and a model training unit. The data processing unit is used for converting the pictures in the image data set into TFRecord form for storage, preprocessing the images, and expanding the number of pictures in the image data set; the model training unit is used for building the MobileNet neural network structure, setting parameter information including the learning rate, the mini-batch size, and the number of training epochs, and training the MobileNet neural network model.
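As a toy illustration of the Momentum optimization method used by the model training unit, the following plain-Python sketch applies the momentum update rule to a one-dimensional quadratic loss. In the actual embodiment, TensorFlow's momentum optimizer applies the same rule to the MobileNet weights; the hyperparameter values and the loss function here are illustrative assumptions.

```python
# Momentum update rule on a toy loss L(w) = (w - 2)^2, minimum at w = 2.
learning_rate = 0.1
momentum = 0.9
epochs = 300

w = 5.0        # a single weight standing in for the network parameters
velocity = 0.0


def grad(w):
    # Gradient of the toy loss L(w) = (w - 2)^2.
    return 2.0 * (w - 2.0)


for _ in range(epochs):
    # Momentum accumulates an exponentially decaying average of past
    # gradients, damping oscillation and speeding convergence over plain SGD.
    velocity = momentum * velocity - learning_rate * grad(w)
    w += velocity

print(round(w, 3))  # converges toward the minimum at 2.0
```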
The deployment module 3 takes the trained MobileNet neural network model as the worker card wearing identification model, creates a classification server for the model, and defines a classification server interface; it then establishes the connection between the classification server and the mobile device terminal and applies the classification server on the mobile device terminal. The mobile device terminal can thus perform random, unscheduled spot checks of standard worker card wearing by staff, replacing manual inspection, improving inspection accuracy, and saving human resources.
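Where the classification server interface is exposed through TensorFlow Serving's REST API (as the method embodiment describes), the mobile terminal's request can be built as JSON. The sketch below constructs such a predict request; the model name and server address are hypothetical examples, while the `{"instances": [{"b64": ...}]}` payload layout follows the TensorFlow Serving REST API convention for binary inputs.

```python
# Build a TensorFlow Serving REST predict request for an uploaded image.
import base64
import json

# Hypothetical endpoint; TensorFlow Serving's REST port defaults to 8501.
SERVING_URL = "http://serving.example.com:8501/v1/models/worker_card:predict"


def build_predict_request(image_bytes):
    # TensorFlow Serving accepts binary inputs as {"b64": <base64 string>}.
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return json.dumps({"instances": [{"b64": encoded}]})


payload = build_predict_request(b"\x89PNG fake image bytes")
print(payload)
```

The payload would then be POSTed to `SERVING_URL`, and the service's response parsed for the classification result.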
The implementation manners of the data module 1, the model module 2, and the deployment module 3 are as described in the first embodiment, and are not described herein again.
Embodiment Three
This embodiment provides a worker card wearing identification device. Referring to fig. 5, the worker card wearing identification device 500 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 510 (e.g., one or more processors), a memory 520, and one or more storage media 530 (e.g., one or more mass storage devices) storing applications 533 or data 532. The memory 520 and the storage medium 530 may be transient or persistent storage. The program stored on the storage medium 530 may include one or more modules (not shown), each of which may include a series of instruction operations on the worker card wearing identification device 500.
Further, the processor 510 may be configured to communicate with the storage medium 530 to execute the series of instruction operations in the storage medium 530 on the worker card wearing identification device 500.
The worker card wearing identification device 500 may also include one or more power supplies 540, one or more wired or wireless network interfaces 550, one or more input/output interfaces 560, and/or one or more operating systems 531, such as Windows Server, Vista, and the like.
Those skilled in the art will appreciate that the structure of the worker card wearing identification device shown in fig. 5 does not constitute a limitation of the device, which may include more or fewer components than shown, combine some components, or arrange the components differently.
The present invention also provides a computer-readable storage medium, which may be a non-volatile computer-readable storage medium, and which may also be a volatile computer-readable storage medium. The computer-readable storage medium has stored therein instructions which, when executed on a computer, cause the computer to perform the steps of the card wear identification method of the first embodiment.
The modules in the second embodiment, if implemented in the form of software functional modules and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be wholly or partially implemented in software, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or various other media capable of storing program code.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and devices may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the above embodiments. Various changes may be made to the present invention; they remain within the scope of the present invention provided they fall within the scope of the claims of the present invention and their equivalents.

Claims (10)

1. A worker card wearing identification method is characterized by comprising the following steps:
the server side obtains images of a worker card worn in the standard manner and images of a worker card worn irregularly, and marks key point information in the images to form an image data set;
taking the image data set as training data, training a preset MobileNet neural network model using the TensorFlow framework, and improving the training accuracy using the Momentum optimization method, to obtain the trained MobileNet neural network model as a worker card wearing identification model;
and deploying the worker card wearing identification model on the mobile device terminal to detect whether workers wear their worker cards in the standard manner.
2. The worker card wearing identification method according to claim 1, wherein the training of the preset MobileNet neural network model using the TensorFlow framework further comprises:
converting the pictures in the image data set into TFRecord form for storage, preprocessing the images, and expanding the number of pictures in the image data set;
and building the MobileNet neural network structure, setting parameter information including the learning rate, the mini-batch size, and the number of training epochs, and training the MobileNet neural network model.
3. The worker card wearing identification method according to claim 2, wherein the building of the MobileNet neural network structure further comprises:
the MobileNet neural network model comprises a convolution layer, an excitation layer, a pooling layer and a full-connection layer, wherein the convolution layer is used for extracting feature information and a feature mapping relation of an image; the excitation layer is used for carrying out nonlinear operation on the characteristic information according to the characteristic mapping relation and extracting deep characteristic information; the pooling layer is used for compressing the image; and the full connection layer is used for fitting the deep characteristic information of the compressed image, transmitting the fitted deep characteristic information to the classification regression layer for calculation and outputting an identification result.
4. The card wear identification method of claim 2, wherein the training of the MobileNet neural network model further comprises:
and in the process of training the MobileNet neural network model, updating the parameters of the MobileNet neural network model by adopting an SGD gradient descent method.
5. The card wear identification method of claim 1, wherein the deploying the card wear identification model to the mobile device further comprises:
creating a classification server for the worker card wearing identification model, and defining a classification server interface;
and establishing connection between the classification server and the mobile equipment terminal, and applying the classification server to the mobile equipment terminal.
6. The card wear identification method of claim 5, wherein the creating a classification server for the card wear identification model and defining a classification server interface further comprises:
generating a classification server interface based on TensorFlow Serving, implementing the worker card wearing classification function, and generating the classification server; when the classification server runs, it listens for requests from the mobile device terminal and transmits the service's responses.
7. A worker card wearing identification device, characterized by comprising:
the data module, used for a server side to obtain images of a worker card worn in the standard manner and images of a worker card worn irregularly, and to mark key point information in the images to form an image data set;
the model module, used for taking the image data set as training data, training a preset MobileNet neural network model using the TensorFlow framework, and improving the training accuracy using the Momentum optimization method, the trained MobileNet neural network model being used as the worker card wearing identification model;
and the deployment module, used for deploying the worker card wearing identification model on the mobile device terminal to detect whether workers wear their worker cards in the standard manner.
8. The card wear identification device of claim 7, wherein the model module comprises a data processing unit and a model training unit;
the data processing unit is used for converting the pictures in the image data set into TFRecord form for storage, preprocessing the images, and expanding the number of pictures in the image data set;
the model training unit is used for building the MobileNet neural network structure, setting parameter information including the learning rate, the mini-batch size, and the number of training epochs, and training the MobileNet neural network model.
9. A worker card wearing identification device, comprising:
a memory having instructions stored therein and a processor, the memory and the processor interconnected by a line;
the processor invokes the instructions in the memory to implement the card wear identification method of any one of claims 1-6.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the card wear identification method according to any one of claims 1 to 6.
CN202111140116.1A 2021-09-28 2021-09-28 Worker card wearing identification method, device, equipment and storage medium Pending CN113963258A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111140116.1A CN113963258A (en) 2021-09-28 2021-09-28 Worker card wearing identification method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111140116.1A CN113963258A (en) 2021-09-28 2021-09-28 Worker card wearing identification method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113963258A true CN113963258A (en) 2022-01-21

Family

ID=79462580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111140116.1A Pending CN113963258A (en) 2021-09-28 2021-09-28 Worker card wearing identification method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113963258A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114863242A (en) * 2022-04-26 2022-08-05 北京拙河科技有限公司 Deep learning network optimization method and system for image recognition
CN114863242B (en) * 2022-04-26 2022-11-29 北京拙河科技有限公司 Deep learning network optimization method and system for image recognition
CN115063735A (en) * 2022-04-27 2022-09-16 长沙海信智能系统研究院有限公司 Worker card identification method and device and electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination