CN113076998A - Distributed classification method based on a Kubernetes deep neural network model - Google Patents

Distributed classification method based on a Kubernetes deep neural network model

Info

Publication number
CN113076998A
Authority
CN
China
Prior art keywords
model
neural network
data
deep neural
classification method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110357657.3A
Other languages
Chinese (zh)
Inventor
袁正午
林才贵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications filed Critical Chongqing University of Post and Telecommunications
Priority to CN202110357657.3A priority Critical patent/CN113076998A/en
Publication of CN113076998A publication Critical patent/CN113076998A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/25 Fusion techniques
    • G06F18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06F18/259 Fusion by voting
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to a distributed classification method based on a Kubernetes deep neural network model, and belongs to the fields of deep learning and cloud computing. S1: acquire a data set, label it, and standardize it, dividing the data into a training set and a test set; input the standardized training set into several different deep neural network models for training; input the test set to obtain each model's accuracy, and assign weights according to accuracy. S2: deploy the models; the Kubernetes container orchestration system is used to deploy the classification models appropriately. By using container technology, the invention alleviates the large inference latency of deep neural network models, while the idea of ensemble learning makes their inference capability stronger.

Description

Distributed classification method based on a Kubernetes deep neural network model
Technical Field
The invention belongs to the fields of deep learning and cloud computing, and provides a distributed classification method based on a Kubernetes deep neural network model.
Background
Deep learning is a branch of machine learning: algorithms that use artificial neural networks as a framework to perform representation learning on data. With the development of deep learning, classification tasks now almost universally use deep learning models.
Although deep neural network models excel at classification tasks, their parameter counts have grown continuously as they have developed, their computational resource requirements have risen ever higher, and practical deployment and application have become difficult.
When a deep neural network model classifies a multi-attribute data set, attribute combinations are rarely considered: the feature attributes are fed directly into the deep neural network to train the model, and the trained model is then used to classify samples. For some currently popular deep neural networks, for example, the large parameter count and the high difficulty of obtaining the models mean that the effect of actually deploying them is not ideal.
Disclosure of Invention
In view of this, the invention aims to provide a distributed classification method based on a Kubernetes deep neural network model, which alleviates the large inference latency of deep neural network models while drawing on the idea of ensemble learning to make the models' inference capability stronger.
In order to achieve the purpose, the invention provides the following technical scheme:
A distributed classification method based on a Kubernetes deep neural network model comprises the following steps:
s1: training a model;
s11: acquiring a data set, which can be any data set to be classified;
s12: labeling the data set;
s13: carrying out standardization processing on a data set, and then dividing the data set into a training set and a testing set;
S14: respectively input the standardized training set into n different deep neural network models for training, obtaining models f_1, f_2, f_3, ..., f_n; i.e. the same data set is judged comprehensively by a plurality of deep learning models.
S15: input the test set to obtain the accuracy of each corresponding model, and assign the weights W_1, W_2, W_3, ..., W_n according to the accuracy.
S2: deploying a model;
Before deployment, a Kubernetes cluster is established: a master manages a number of worker nodes, and several pods run TensorFlow Serving; the pods can be deployed according to the available resources. All resources are connected through a Service. The n trained deep neural network models are deployed in TensorFlow Serving, and external clients access TensorFlow Serving through its API to obtain inference results.
S3: and testing real-time data to obtain a final result, and specifically comprising the following steps of:
s31: acquiring real-time data;
s32: carrying out standardization processing on the acquired real-time data;
S33: obtain different inference results for the standardized data by accessing the TensorFlow Serving API;
S34: adopt a weighted voting algorithm: the results obtained by the different models' inference on the real-time data are weighted and summed to obtain the final result.
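The weighted voting of step S34 can be sketched as a minimal Python function (names are illustrative): each model contributes its weight W_i to the label it predicted, and the label with the largest total wins.

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Weighted voting: predictions[i] is the label inferred by model i,
    weights[i] its weight W_i; returns the label whose weighted vote
    total is largest."""
    scores = defaultdict(float)
    for label, weight in zip(predictions, weights):
        scores[label] += weight
    return max(scores, key=scores.get)
```

For example, with predictions ["cat", "dog", "cat"] and weights [0.5, 0.3, 0.2], "cat" wins with a total of 0.7 against 0.3.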
Further, the normalization process is:
x_i' = (x_i - μ) / σ
i.e. the characteristic value x_i of the data has the mean μ subtracted and is then divided by the standard deviation σ, so that the data follow a standard normal distribution and the amount of computation is reduced.
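A minimal sketch of this z-score standardization, using the population standard deviation as the formula suggests:

```python
import math

def standardize(values):
    """z-score standardization: x'_i = (x_i - mu) / sigma, so that the
    transformed data have mean 0 and standard deviation 1."""
    mu = sum(values) / len(values)
    sigma = math.sqrt(sum((x - mu) ** 2 for x in values) / len(values))
    return [(x - mu) / sigma for x in values]
```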
Further, the deep neural network models may include ResNet, Inception, VGG, DenseNet, and the like. The hyper-parameter choices of the deep learning models can be varied.
Further, in step S2, Kubernetes is used as the container orchestration system to monitor the TensorFlow Serving pods, scale them, and allocate resources.
Further, in step S34, the final label is calculated according to
y* = argmax_c Σ_{i=1..n} W_i · I(f_i(x) = c)
and the label value c corresponding to the maximum value is taken as the final result.
The beneficial effects of the invention are as follows. The method uses weighted voting to judge the inferred categories comprehensively, similar to ensemble learning. Because deep learning models have higher inference latency than traditional models, a distributed design is adopted, and the Kubernetes container orchestration system deploys the classification models appropriately to reduce that latency. For the final judgment, weights are assigned according to each model's performance on the test set, and the results obtained by the different models' inference on the real-time data are weighted and summed to obtain the final result.
By using container technology, the invention alleviates the large inference latency of deep neural network models, while the idea of ensemble learning makes their inference capability stronger.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention may be realized and attained by the means of the instrumentalities and combinations particularly pointed out hereinafter.
Drawings
For the purposes of promoting a better understanding of the objects, aspects and advantages of the invention, reference will now be made to the following detailed description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a flow chart of the distributed classification method based on a Kubernetes deep neural network model according to the present invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention in a schematic way, and the features in the following embodiments and examples may be combined with each other without conflict.
Referring to fig. 1, the invention provides a preferred distributed classification method based on a Kubernetes deep neural network model, which combines the idea of ensemble learning with container technology to design a distributed classification system and improve the inference capability of the deep learning model.
Fig. 1 is a schematic flow chart of a distributed classification method according to an embodiment of the present invention, and as shown in fig. 1, the method specifically includes the following steps:
1) Data acquisition.
The present embodiment uses arbitrary data sets, mainly data sets of classification tasks.
2) Data annotation.
3) Data preprocessing.
In the model training stage, the original data are subjected to standardization processing, so that the influence of different dimensions on the data can be eliminated.
The processed image data are then sent into a yolov5 model for training; to reduce the training time of the model, pre-training parameters are loaded, and the training hyper-parameters adopt the yolov5 default configuration.
4) Data splitting.
The data is divided into a training set and a testing set. The training set is used for training the models to obtain different models. The test set is used for obtaining the accuracy of different models and distributing weights to the different models.
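The split described above can be sketched as follows; the 80/20 ratio and the fixed seed are illustrative choices, not specified in the source:

```python
import random

def split_dataset(samples, test_fraction=0.2, seed=42):
    """Shuffle the samples and split them into a training set and a
    test set (test_fraction and seed are illustrative defaults)."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]
```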
5) Model deployment.
Different models are acquired and deployed in different TensorFlow Serving pods, which are connected through Kubernetes Services.
6) Real-time data.
After the real-time data are preprocessed, the Kubernetes Service is accessed through the API to obtain the result.
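A minimal sketch of such an API call, assuming TensorFlow Serving's standard REST interface (8501 is TF Serving's default REST port; the service host and model name below are placeholders):

```python
import json
import urllib.request

def build_predict_request(host, model_name, instances):
    """Build a predict request for a TensorFlow Serving REST endpoint
    reachable through the Kubernetes Service (host and model_name are
    placeholders; 8501 is TF Serving's default REST port)."""
    url = f"http://{host}:8501/v1/models/{model_name}:predict"
    body = json.dumps({"instances": instances}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})

def query_model(request):
    """Send the request and return the 'predictions' field of the reply."""
    with urllib.request.urlopen(request) as resp:
        return json.loads(resp.read())["predictions"]
```

The preprocessed real-time sample goes into `instances`; the per-model results returned this way then feed the weighted vote of step S34.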
7) Voting.
The final result is calculated by the weighted voting algorithm.
As another implementation, a pod can be designed to train the model online and then distribute it, so as to obtain a model with better generalization capability.
Finally, the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit the present invention, and although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions, and all of them should be covered by the claims of the present invention.

Claims (7)

1. A distributed classification method based on a Kubernetes deep neural network model, characterized by comprising the following steps:
s1: training a model;
s11: acquiring a data set;
s12: labeling the data set;
s13: carrying out standardization processing on a data set, and then dividing the data set into a training set and a testing set;
S14: respectively input the standardized training set into n different deep neural network models for training, obtaining models f_1, f_2, f_3, ..., f_n;
S15: input the test set to obtain the accuracy of each corresponding model, and assign the weights W_1, W_2, W_3, ..., W_n;
S2: deploying a model;
Before deployment, a Kubernetes cluster is established: a master manages a number of worker nodes, and several pods run TensorFlow Serving; the pods are deployed according to the available resources; all resources are connected through a Service; the n trained deep neural network models are deployed in TensorFlow Serving.
2. The distributed classification method of claim 1, further comprising: and testing real-time data to obtain a final result, and specifically comprising the following steps of:
s31: acquiring real-time data;
s32: carrying out standardization processing on the acquired real-time data;
S33: obtain different inference results for the standardized data by accessing the TensorFlow Serving API;
S34: adopt a weighted voting algorithm: the results obtained by the different models' inference on the real-time data are weighted and summed to obtain the final result.
3. The distributed classification method according to claim 1 or 2, characterized in that the normalization process is:
x_i' = (x_i - μ) / σ
i.e. the characteristic value x_i of the data has the mean μ subtracted and is then divided by the standard deviation σ, so that the data follow a standard normal distribution.
4. The distributed classification method according to claim 1 or 2, wherein in step S15, the weights W_1, W_2, W_3, ..., W_n are assigned according to the accuracy.
5. The distributed classification method according to claim 1 or 2, characterized in that the deep neural network models comprise: ResNet, Inception, VGG or DenseNet network models.
6. The distributed classification method according to claim 1 or 2, wherein in step S2, Kubernetes is used as the container orchestration system to monitor the TensorFlow Serving pods, scale them, and allocate resources.
7. The distributed classification method according to claim 2, wherein in step S34, the final label is calculated according to
y* = argmax_c Σ_{i=1..n} W_i · I(f_i(x) = c)
and the label value c corresponding to the maximum value is taken as the final result.
CN202110357657.3A 2021-04-01 2021-04-01 Distributed classification method based on kubernets deep neural network model Pending CN113076998A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110357657.3A CN113076998A (en) 2021-04-01 2021-04-01 Distributed classification method based on kubernets deep neural network model


Publications (1)

Publication Number Publication Date
CN113076998A true CN113076998A (en) 2021-07-06

Family

ID=76614763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110357657.3A Pending CN113076998A (en) 2021-04-01 2021-04-01 Distributed classification method based on kubernets deep neural network model

Country Status (1)

Country Link
CN (1) CN113076998A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107527024A (en) * 2017-08-08 2017-12-29 北京小米移动软件有限公司 Face face value appraisal procedure and device
CN107665333A (en) * 2017-08-28 2018-02-06 平安科技(深圳)有限公司 A kind of indecency image identification method, terminal, equipment and computer-readable recording medium based on convolutional neural networks
CN111352664A (en) * 2018-12-05 2020-06-30 北京京东尚科信息技术有限公司 Distributed machine learning task starting method, system, equipment and storage medium
CN112015519A (en) * 2020-08-28 2020-12-01 江苏银承网络科技股份有限公司 Model online deployment method and device


Similar Documents

Publication Publication Date Title
CN110798417B (en) Signal modulation identification method and device based on cyclic residual error network
CN112329820B (en) Method and device for sampling unbalanced data under federal learning
CN111371742B (en) SVDD (singular value decomposition and direct data decomposition) -based network slice physical node anomaly detection method
JP2024039598A (en) Multi-task hybrid supervised medical image segmentation method and system based on federated learning
CN110704371A (en) Large-scale data management and data distribution system and method
CN115861246B (en) Product quality abnormality detection method and system applied to industrial Internet
CN112200238B (en) Hard rock pulling shear rupture identification method and device based on sound characteristics
US20230385692A1 (en) Systems and methods for artificial intelligence inference platform and model controller
CN111368911A (en) Image classification method and device and computer readable storage medium
CN113076998A (en) Distributed classification method based on kubernets deep neural network model
CN110825589A (en) Anomaly detection method and device for micro-service system and electronic equipment
CN114333040A (en) Multi-level target detection method and system
US20230162487A1 (en) System and method for deep learning techniques utilizing continuous federated learning with a distributed data generative model
CN115082449B (en) Electronic component defect detection method
CN109145966B (en) Automatic identification method for holed worm fossil
Narayanan et al. On challenges in unsupervised domain generalization
CN110554667A (en) convolutional Neural Network (CNN) based intermittent industrial process fault diagnosis
CN113673174B (en) Super parameter determination method, device, equipment and storage medium
CN112819180B (en) Multi-service data generation method and device based on federal generation model
CN115601747A (en) Method and system for calculating confluency of adherent cells
Ayrapetov et al. Analysis of Work of YOLO v. 3 AND YOLO v. 2 Neural Networks
Ojeda-Magana et al. Images sub-segmentation with the pfcm clustering algorithm
CN115459937A (en) Method for extracting characteristics of encrypted network traffic packet in distributed scene
CN114548295A (en) Bearing fault classification system and method based on multi-scale domain adaptive network
CN112101394B (en) Provider domain deployment method, device, computing equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210706