CN114898220A - Intelligent production control method for structural member of overhead working truck - Google Patents
Intelligent production control method for structural member of overhead working truck Download PDFInfo
- Publication number
- CN114898220A CN114898220A CN202210817835.0A CN202210817835A CN114898220A CN 114898220 A CN114898220 A CN 114898220A CN 202210817835 A CN202210817835 A CN 202210817835A CN 114898220 A CN114898220 A CN 114898220A
- Authority
- CN
- China
- Prior art keywords
- cutter
- parameter
- image
- neuron
- cutting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/06—Recognition of objects for industrial automation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Software Systems (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Mathematical Physics (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to the field of artificial intelligence, and provides an intelligent production control method for a structural member of an overhead working truck, which comprises the following steps: constructing a DNN network; obtaining a dictionary matrix and a sparse vector set of image blocks; obtaining a unique dictionary vector of the cutter in the cutter cutting image; obtaining the identification difficulty of the cutter in the cutter cutting image; obtaining a loss function value of a cutter cutting image in a DNN network and parameter values of all neuron parameters; acquiring a weighted information entropy of the loss function value; obtaining the weighted conditional entropy of the neuron parameters; obtaining the independent importance degree of the neuron parameters; obtaining the correlation degree of each neuron parameter and other neuron parameters; calculating the comprehensive importance degree of the neuron parameters; obtaining a sparse control loss function; performing supervision training on the network by using the comprehensive loss function to complete network training; and obtaining the position of the cutter, and cutting according to the position of the cutter. The invention improves the identification speed of the cutter.
Description
Technical Field
The invention relates to the field of artificial intelligence, in particular to an intelligent production control method for a structural member of an overhead working truck.
Background
With the development of social economy and the promotion of industrialization, the usage amount of the overhead working truck is increased. The structural member is used as a main stressed member of the overhead working truck, and the quality of the structural member directly influences the use safety of the overhead working truck.
The cutting process is used as a necessary process for processing a structural member of the overhead working truck, a common intelligent control method of the cutting process is to program a corresponding program in advance for a cutting route so as to realize cutting control according to the corresponding program, the cutting control method has low production flexibility, a set of cutting route program needs to be programmed by a professional worker when a new product is produced, and the product conversion rate of each set of program is low when the production batch is small. Meanwhile, according to the cutting control method, the production process is not monitored after a program is programmed, once a problem occurs in the middle process, deviation accumulation is easily caused, and the waste of the whole steel can occur when the problem occurs. The cutting control method is also a cutting control method which is used for carrying out cutting control through machine vision, namely, a camera is used for acquiring images of the cutting process in real time, and a cutting instruction at the next moment is determined according to the current cutting result images and the original design drawing. However, when cutting control is performed through machine vision, the position of a cutter needs to be positioned in real time, only when the position of the cutter is accurately positioned, a cutting instruction at each moment can be given, and the cutter can be accurately positioned by a neural network.
The parameters of the neural network can be sparsified to improve the recognition efficiency of the cutter, but the network characterization capability is reduced by parameter sparsification, so that the neural network needs to be analyzed and considered to sparsify the parameter data with poor description capability, and important parameters are reserved, thereby realizing the purpose of improving the recognition efficiency while ensuring that the sparsity of the network parameters is as low as possible.
Disclosure of Invention
The invention provides an intelligent production control method for a structural member of an overhead working truck, which aims to solve the problem of low identification efficiency of the existing cutter.
The invention discloses an intelligent production control method of a structural member of an overhead working truck, which adopts the following technical scheme that the method comprises the following steps:
constructing a DNN network, wherein a loss function of the network is a cross entropy loss function, and performing supervision training on the DNN network through the cross entropy loss function;
partitioning all cutter cutting images in the cutter cutting image data set to obtain an image block set, and performing dictionary training on all image blocks in the image block set to obtain a dictionary matrix and a sparse vector set of the image blocks;
obtaining a dictionary vector unique to the cutter in each cutter cutting image through the sparse vector of each cutter image block in the sparse vector set of the image blocks, the corresponding sparse vectors of all non-cutter image blocks and a dictionary matrix;
obtaining the identification difficulty of the cutter in each cutter cutting image through the unique dictionary vector of the cutter in each cutter cutting image and the times of the unique dictionary vector appearing in the sparse vector of the non-cutter image block of all the cutting images;
acquiring all neuron parameters in the DNN, acquiring loss function values of all cutter cutting images in the DNN and parameter values of all neuron parameters, and grading the loss function values of all cutter cutting images in the cutter cutting image data set and the parameter values of all neuron parameters;
acquiring the weighted information entropy of the loss function value by using the identification difficulty of the cutter in the cutter cutting image corresponding to each loss function value grade;
obtaining the weighted conditional entropy of each neuron parameter by using the identification difficulty of the cutter in the cutter cutting image corresponding to the parameter values of all neuron parameters in the parameter value grade of each neuron parameter;
obtaining the independent importance degree of each neuron parameter according to the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value;
obtaining the association degree of each neuron parameter and each other neuron parameter by using the parameter values of each neuron parameter and each other neuron parameter in all the cutter cutting images;
calculating the comprehensive importance degree of each neuron parameter by using the independent importance degree of each neuron parameter and the association degree of the neuron parameter and each other neuron parameter;
obtaining a sparse control loss function by using the parameter value of each neuron parameter and the comprehensive importance degree of each neuron parameter of the cutter cutting image to be detected in the DNN;
obtaining a comprehensive loss function of the cutter cutting image to be detected in the DNN network through the sparse control loss function and the cross entropy loss function, and performing supervision training on the network by using the comprehensive loss function to complete network training;
and inputting the cutting image of the cutter to be detected into the trained network to obtain the position of the cutter, and cutting according to the position of the cutter.
Further, the method for obtaining the identification difficulty of the cutter in each cutter cutting image comprises the following steps:
obtaining the description capacity of the unique dictionary vector of the cutter in each cutter cutting image through the times that the unique dictionary vector of the cutter in each cutter cutting image appears in the sparse vectors of the non-cutter image blocks of all cutting images;
and obtaining the identification difficulty of the cutters in each cutter cutting image through the unique dictionary vectors of all the cutters in each cutter cutting image and the description capacity of the unique dictionary vectors of the cutters.
Further, the method for controlling the intelligent production of the structural member of the overhead working truck comprises the following steps of:
acquiring a rectangular frame with the largest area in cutter external rectangles in all cutter cutting images in a cutter cutting image data set;
and dividing all the cutter cut images into a plurality of image blocks by taking the size of the rectangular frame with the largest area as a standard to obtain an image block set.
Further, in the method for controlling the intelligent production of the structural member of the aerial work platform, the expression of the comprehensive importance degree of each neuron parameter is as follows:
in the formula:is shown asThe overall importance of the individual neuron parameters,indicating the second in the networkThe parameters of each of the neurons are determined,representing the total number of neuron parameters in the network,is shown asThe individual neuron parameter andthe degree of correlation values of the parameters of the individual neurons,is shown asThe degree of independent importance of individual neuron parameters.
Further, in the intelligent production control method for the structural member of the overhead working truck, the expression of the identification difficulty of the cutter in the cutter cutting image is as follows:
in the formula:is shown asThe difficulty of identifying the tool in the cutting image of the cutter,is shown asCutting the image with a knifeThe descriptive ability of dictionary vectors unique to each tool,is shown asIn the image cut by the sheet knifeThe second in sparse vectors corresponding to dictionary vectors unique to each toolThe value of the one or more of the one,is shown asIn the image cut by the sheet knifeA dictionary vector unique to each tool,is shown asAnd the number of dictionary vectors unique to the cutter in the image cut by the cutter.
Further, in the intelligent production control method for the structural member of the overhead working truck, the expression of the sparse control loss function is as follows:
in the formula:a function representing the loss of sparse control is represented,is shown asThe parameters of each of the neurons are determined,showing the tool to be inspected cutting image in the DNN networkParameter values of individual neuron parameters.
Further, according to the intelligent production control method for the structural member of the aerial work platform, the independent importance degree of each neuron parameter is obtained by making a difference between the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value.
The invention has the beneficial effects that: the invention obtains the sparse necessity of each parameter by analyzing the importance of each network parameter in the training process, and constructs the sparse control loss function according to the sparse necessity, thereby meeting the cutting real-time control requirement by improving the real-time property of cutter identification. Compared with the prior art, the method improves the cutter recognition speed of the network by improving the sparsity of the network, thereby reducing the cutting hysteresis and improving the cutting control quality.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow chart diagram of an embodiment of an intelligent production control method for a structural member of an aerial work platform according to the present invention;
fig. 2 is a schematic diagram of a parameter diagram structure.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
The main purposes of the invention are: the importance degree of each parameter data is obtained by analyzing each parameter data in the cutting process, and a sparse loss function is constructed, so that the recognition speed is improved on the basis of ensuring the recognition precision.
In order to achieve the object of the present invention, an embodiment of an intelligent production control method for a structural member of an aerial work platform is provided, as shown in fig. 1, including:
the scenario addressed by the present embodiment is: firstly, arranging a camera to collect cutting images, training a tool recognition network by using the collected cutting images, obtaining the importance of each parameter by analyzing parameter data of each training stage in the network training process, and constructing a sparse loss function according to the importance of each parameter. The network parameters are thinned as much as possible through the sparse loss function, the identification efficiency of the network is improved, and real-time cutting control is met.
101. And constructing the DNN network, wherein a loss function of the network is a cross entropy loss function, and performing supervised training on the DNN network through the cross entropy loss function.
In the embodiment, cutter identification is needed to realize cutting control, and a cutting image training network is needed to realize cutter identification, so that the cutting image is collected firstly.
Acquiring a cut image dataset: the collected cutting image needs to contain a cutting tool and cutting information, so that a camera is arranged on one side of the cutting machine, the camera collects one cutting image every 10s, and the collected cutting image is labeled to obtain a cutting image data set.
Finishing the first round of training of the tool identification network: the cutter identification network is a DNN network, the network structure is an Encoder-Decoder structure, the network input is a cutting image data set, and the network output is a cutter position hot spot diagram. The loss function of the network is a cross entropy loss function. And inputting all cutting images into a tool recognition network to complete a first round of training of data.
And obtaining a cutting image, labeling and processing to obtain a training data set of the network, and finishing initial training of the network.
In order to improve the identification efficiency of the tool identification network, parameters in the identification network need to be sparse as much as possible, and the identification precision of the network also needs to be guaranteed when the network is sparse, namely, only the parameters with low importance degree can be zero, and the parameters with high importance degree cannot be zero, so that a sparse loss function is constructed based on the method. And the parameter data of each training stage needs to be analyzed in order to reflect the importance degree of each parameter. The method comprises the steps of calculating a cutter identification difficulty weight by considering the cutter identification difficulty of each cutting image during parameter analysis, calculating the independent importance degree of each parameter by combining the weights, analyzing the relation among the parameters to obtain the association degree, constructing a graph structure by taking the independent importance degree of each parameter as a node value and the association degree among the parameters as an edge weight, analyzing the graph structure to obtain the comprehensive importance degree of each parameter node, and constructing a sparse control loss function by combining the comprehensive importance degree of each parameter.
In order to analyze the importance degree of each parameter, the determination degree of the accuracy of each parameter on the identification accuracy of the images with various identification difficulties needs to be considered, wherein the determination degree of the identification accuracy of the images with high identification difficulties can reflect the importance of the parameter, namely, the accuracy degree of each parameter not only affects the identification accuracy of the images with low identification difficulties, but also has great influence on the identification accuracy of the images with high identification difficulties, so that the importance degree of the parameter is higher.
Therefore, the identification difficulty of each image is calculated firstly, and the specific process is as follows:
102. and partitioning all cutter cut images in the cutter cut image data set to obtain an image block set, and performing dictionary training on all image blocks in the image block set to obtain a dictionary matrix and a sparse vector set of the image blocks.
The method comprises the steps of obtaining a circumscribed rectangle of a cutter in a cutting image through a cutter label, obtaining a rectangular frame with the largest area in the circumscribed rectangle of the cutter in all the cutting images, and uniformly dividing each cutting image into a plurality of image blocks with the sizes by utilizing a sliding window to slide in a non-coincident mode according to the size of the rectangular frame as the size of the sliding window. And thus, all the cut images in the data set are subjected to blocking processing to obtain an image block set.
And then performing dictionary training on all image blocks in the data set by utilizing a K-SVD algorithm to obtain a dictionary matrix and a sparse vector set of the image blocks.
And acquiring a sparse vector set corresponding to all the cutter image blocks, recording the sparse vector set as a cutter sparse vector set, and acquiring a sparse vector set corresponding to the non-cutter image blocks, and recording the sparse vector set as a non-cutter sparse vector set.
103. And obtaining a dictionary vector unique to the cutter in each cutter cutting image through the sparse vector of each cutter image block in the sparse vector set of the image blocks, the corresponding sparse vectors of all non-cutter image blocks and the dictionary matrix.
For the convenience of analysis, the following steps are adoptedAnalyzing the cutting image to obtain the firstObtaining the sparse vector corresponding to the cutter image block of each cutting imageAnd sparse vectors corresponding to all non-cutter image blocks of each cutting image. And comparing the sparse vector of the cutter image block with the sparse vectors of the non-cutter image blocks to obtain dictionary vectors corresponding to non-sparse values unique to the cutter image block, wherein for example, the a-th value in the sparse vector corresponding to the cutter image block is a non-sparse value, and the a-th values of the sparse vectors corresponding to all the non-cutter image blocks are sparse values, so that the dictionary vector corresponding to the a-th value is the dictionary vector unique to the cutter. Analogy to this approach yields a dictionary vector unique to the tool in each of the other cutting images.
104. And obtaining the identification difficulty of the cutter in each cutter cutting image through the unique dictionary vector of the cutter in each cutter cutting image and the times of the unique dictionary vector appearing in the sparse vector of the non-cutter image block of all the cutting images.
Analyzing the descriptive ability of the unique dictionary vectors of each tool to obtain the number of times each unique dictionary vector appears in the non-tool image blocks of all cut images, e.g. theS-th unique dictionary of image block of cutting image toolThe vector corresponds to the s-th value in the sparse vector, the number of non-sparse values appearing in the s-th value in the sparse vector set corresponding to all the non-tool image blocks is obtained, the number ratio is obtained by taking the ratio of the number of all the non-tool image blocks as the number occupation ratio, the number occupation ratio is recorded as the description capacity of the dictionary vector unique to the s-th tool, when the description capacity of the dictionary vector is small, the description characteristic sometimes serves as the description characteristic of the non-tool image, sometimes serves as the description characteristic of the tool image, and therefore the segmentation accuracy of the tool for describing the characteristic by using the characteristic as the tool is poor.
And obtaining the description value in each sparse vector corresponding to the tool unique dictionary vector in each cutting image (the description value in each sparse vector is the value in each sparse vector). The difficulty of identifying the tool in each cut image is thus:
in the formula:is shown asThe identification difficulty of the cutter in the cutting image is larger, the image contains a plurality of characteristic information with larger distinguishability, therefore, the identification difficulty of the cutter of the image is smaller,is shown asIn the sheet cut imageThe describing capability of dictionary vectors unique to each tool is larger, the larger the value is, the more accurate the tool identification is performed by taking the feature as the tool feature,is shown asIn the sheet cut imageThe second in sparse vectors corresponding to dictionary vectors unique to each toolA larger value indicates more feature information in the image containing the description of the s-th dictionary vector,is shown asIn the sheet cut imageA dictionary vector unique to each tool,is shown asThe number of dictionary vectors unique to the tool in the cut-open image.
105. Obtaining all neuron parameters in the DNN, obtaining the loss function value of each cutter cutting image in the DNN and the parameter values of all neuron parameters, and grading the loss function values of all cutter cutting images in the cutter cutting image data set and the parameter values of all neuron parameters.
The independent importance degree of each parameter can be reflected by the information gain of the parameter, and only the information gain value of the identification accuracy condition of images with various identification difficulties needs to be considered when the information gain is calculated. (the information gain describes the degree of certainty of each attribute and the degree of certainty of the comprehensive description attribute of the system, the larger the information gain of each attribute is, the larger the purity improvement of the total comprehensive description attribute of the system is (the larger the degree of certainty is), and the determination degree of the accuracy of each network parameter to the loss function is put in this text).
Value data of each parameter (in this embodiment, the parameter refers to a neuron parameter) in the current training stage is obtained, and all the value data of each parameter form a data sequence. And obtaining the loss function value of the loss function in the current training stage, and forming a loss function value sequence by all the loss function values.
Dividing each value in the loss function value sequence into three levels according to the value size, wherein the lower the loss function value, the higher the level, the first, the second and the third level are respectively corresponding, for example, the loss function sequenceThe maximum value 14 and the minimum value 2 of the value sequence are obtained, so that the value width of each grade isThus, the value intervals of the first, second and third levels areThe first, second and third grades are represented by labels 1, 2 and 3, respectively, so that the discrete values of the loss function sequence are。
Dividing each value in each (neuron) parameter sequence into 10 value categories (namely value levels) according to the value taking size, and respectively marking as a value category 1, a value category 2, … and a value category 10.
106. And acquiring the weighted information entropy of the loss function value by using the identification difficulty of the cutter in the cutter cutting image corresponding to each loss function value grade.
In the training process, different loss function values can be obtained due to different identification difficulties of the cutters in different images under the same parameter, so that the identification difficulty of the cutters in each image is different, the loss function value in the image with low identification difficulty is small, and if the problem of identification difficulty is not considered, the problem of inaccurate solved information gain value caused by directly solving the information gain of each parameter can be solved.
Weighted information entropy for calculating loss function value by taking tool identification difficulty of each image as weightThe expression is as follows:
wherein:data set representing cutting image of toolThe difficulty of identifying the tool in the cutting image of each tool,data set representing cutting image of toolThe number of images is one of the number of images,indicating the number of images in the image dataset cut by the tool,the values of the loss function are all expressed in the second placeWithin the individual grade intervalThe identification of the cutting tool in the cutting image of the cutting tool is difficultThe degree of the magnetic field is measured,means that the loss function takes a value ofThe number of sample data corresponding to the cutter cutting the image in each grade interval,indicating loss function value asWithin the individual grade intervalEach cutter cuts the image sample data,expressing values of loss functionsAnd (4) each grade interval.
107. And obtaining the weighted conditional entropy of each neuron parameter by using the identification difficulty of the cutter in the cutter cutting image corresponding to the parameter values of all neuron parameters in the parameter value grade of each neuron parameter.
For the analysis, taking the ith parameter as an example, the weighted conditional entropy of the ith parameter is calculatedThe expression is as follows:
wherein:indicates that the ith parameter is inTo the second in the individual value categoryThe tool identification difficulty value of each tool cutting image,indicates that the ith parameter is inTo the second in the individual value categoryThe image is cut by the cutter and the image is cut by the cutter,indicates that the ith parameter is inThe number of cutter cutting images in each value category;data set representing cutting image of toolThe tool identification difficulty value of each tool cutting image,data set representing cutting image of toolThe image is cut by the cutter and the image is cut by the cutter,indicating the number of tool-cut images in the tool-cut image dataset,indicates that the ith parameter is inIndividual value classes and their corresponding loss function values are in the second placeTo the second in the individual value categoryThe tool identification difficulty value of each tool cutting image,indicates that the ith parameter is inThe value class and the loss function value corresponding to the value class are in the kth value classThe image is cut by the cutter and the image is cut by the cutter,indicates the ith parameter is in the secondIndividual value classes and their corresponding loss function values are in the second placeThe number of the cutter cutting images in each value category,to express parametersAnd (4) each value level.
108. And obtaining the independent importance degree of each neuron parameter according to the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value.
Calculating the weighted information gain value of the ith parameter, and taking the weighted information gain value of the ith parameter as the independent importance degree of the ith parameter, which is recorded as. A larger value indicates that the degree of determination (degree of accuracy) of the parameter is more decisive for the degree of determination of the loss function value. Weighted information gain value of ith parameterThe expression of (a) is:
109. and obtaining the association degree of each neuron parameter and each other neuron parameter by using the parameter values of each neuron parameter and each other neuron parameter in all the cutter cutting images.
The DNN network has a plurality of neuron parameters, parameter values of the same parameter corresponding to all cutter cutting images in the cutter cutting image data set form a parameter sequence of the parameter, according to the method, the parameter sequence of each parameter can be obtained, correlation coefficients are respectively calculated pairwise between the parameter sequences, and the correlation degree between the parameters is expressed by utilizing the square of the correlation coefficients. The expression of the correlation coefficient is as follows:
in the formula:the parameter sequence representing the ith parameterThe value of the one or more of the one,represents the average of all values in the parameter sequence of the ith parameter,in the parameter sequence representing the jth parameterThe value of the one or more of the one,the average of all values in the parameter sequence representing the jth parameter,the parameter sequence representing the ith parameterThe value of the one or more of the one,and the number of values in the parameter sequence of the ith parameter, or the number of elements in the parameter sequence of the ith parameter is expressed, and the number of values is consistent with the number of the cutter cutting images in the cutter cutting image data set.
110. And calculating the comprehensive importance degree of each neuron parameter by using the independent importance degree of each neuron parameter and the association degree of the neuron parameter and each other neuron parameter.
The small independent importance degree of each parameter cannot completely indicate that the parameter is unimportant, because the parameter has a larger correlation with some more important parameters, and the parameter is also more important, so the relationship between each parameter and other parameters and the importance degree of the parameter correlated with the parameter need to be analyzed to calculate the comprehensive importance degree of each parameter.
Taking the independent importance degree of each parameter as a node value, and taking the association degree between the parameters as an edge weight value to construct a parameter map structure, for example, if there are five neuron parameters, the independent importance degrees of neuron parameter 1, neuron parameter 2, neuron parameter 3, neuron parameter 4 and neuron parameter 5 are 10, 4, 15, 3, 20, respectively, the association degree of neuron parameter 1 and neuron parameter 2 is 0.9, the association degree of neuron parameter 1 and neuron parameter 3 is 0.5, the association degree of neuron parameter 1 and neuron parameter 4 is 0.33, the association degree of neuron parameter 1 and neuron parameter 5 is 0.91, the association degree of neuron parameter 2 and neuron parameter 3 is 0.86, the association degree of neuron parameter 2 and neuron parameter 4 is 0.33, the association degree of neuron parameter 2 and neuron parameter 5 is 0.61, the association degree of neuron parameter 3 and neuron parameter 4 is 0.53, and if the association degree of the neuron parameter 3 and the neuron parameter 5 is 0.5, and the association degree of the neuron parameter 4 and the neuron parameter 5 is 0.49, constructing a parameter graph structure by taking the respective independent importance degrees of the five neuron parameters as node values and the association degree between the parameters as an edge weight. As shown in fig. 2.
Because the comprehensive importance degree of each parameter needs to be analyzed, the relationship between each parameter and other parameters needs to be analyzed, and when the parameter has a larger relevance with some parameters with larger independent importance degree, or has a larger relevance with some parameters with smaller independent importance degree, but these parameters have some parameters with higher relevance and larger independent importance, this indicates that the importance of the parameter is larger.
First, theComprehensive importance of individual neuron parameters (i.e., nodes)The expression of (a) is:
in the formula:is shown asThe overall degree of importance of the individual parameters,indicating the second in the networkThe number of the parameters is one,representing the total number of parameters in the network,is shown asA parameter andthe degree of association value of each parameter,is shown asThe degree of independent importance of each parameter.
111. And obtaining a sparse control loss function by using the parameter value of each neuron parameter and the comprehensive importance degree of each neuron parameter in the DNN network of the cutter cutting image to be detected.
Constructing a sparse control loss function:
in the formula:a function representing the loss of sparse control is represented,denotes the firstThe number of the parameters is one,representing the total number of parameters in the network,showing the tool to be inspected cutting image in the DNN networkParameter values of the individual parameters.
112. And obtaining a comprehensive loss function of the cutter cutting image to be detected in the DNN network through the sparse control loss function and the cross entropy loss function, and performing supervision training on the network by using the comprehensive loss function to complete network training.
The original cross entropy loss function of the network is noted asThus the composite loss function is:
at this point, the importance degree of each parameter is obtained by analyzing each parameter in the training process, and the sparse control loss function is set according to the importance degree.
And carrying out supervision training on the network by using the comprehensive loss function until the network training is finished.
113. And inputting the cutting image of the cutter to be detected into the trained network to obtain the position of the cutter, and cutting according to the position of the cutter.
And rapidly recognizing the position of the cutter by using the trained cutter recognition network, and determining a cutting instruction at the next moment according to the current position of the cutter and a design drawing for cutting the structural member. And finishing cutting according to the cutting instruction, and realizing intelligent control on cutting of the structural part.
The invention obtains the sparse necessity of each parameter by analyzing the importance of each network parameter in the training process, and constructs the sparse control loss function according to the sparse necessity, thereby meeting the cutting real-time control requirement by improving the real-time property of cutter identification. Compared with the prior art, the method improves the cutter recognition speed of the network by improving the sparsity of the network, thereby reducing the cutting hysteresis and improving the cutting control quality.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (7)
1. An intelligent production control method for a structural member of an overhead working truck is characterized by comprising the following steps:
constructing a DNN network, wherein a loss function of the network is a cross entropy loss function, and performing supervision training on the DNN network through the cross entropy loss function;
partitioning all cutter cutting images in the cutter cutting image data set to obtain an image block set, and performing dictionary training on all image blocks in the image block set to obtain a dictionary matrix and a sparse vector set of the image blocks;
obtaining a dictionary vector unique to the cutter in each cutter cutting image through the sparse vector of each cutter image block in the sparse vector set of the image blocks, the corresponding sparse vectors of all non-cutter image blocks and a dictionary matrix;
obtaining the identification difficulty of the cutter in each cutter cutting image through the unique dictionary vector of the cutter in each cutter cutting image and the times of the unique dictionary vector appearing in the sparse vector of the non-cutter image block of all the cutting images;
acquiring all neuron parameters in the DNN, acquiring loss function values of all cutter cutting images in the DNN and parameter values of all neuron parameters, and grading the loss function values of all cutter cutting images in the cutter cutting image data set and the parameter values of all neuron parameters;
acquiring the weighted information entropy of the loss function value by using the identification difficulty of the cutter in the cutter cutting image corresponding to each loss function value grade;
obtaining the weighted conditional entropy of each neuron parameter by using the identification difficulty of the cutter in the cutter cutting image corresponding to the parameter values of all neuron parameters in the parameter value grade of each neuron parameter;
obtaining the independent importance degree of each neuron parameter according to the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value;
obtaining the association degree of each neuron parameter and each other neuron parameter by using the parameter values of each neuron parameter and each other neuron parameter in all the cutter cutting images;
calculating the comprehensive importance degree of each neuron parameter by using the independent importance degree of each neuron parameter and the association degree of the neuron parameter and each other neuron parameter;
obtaining a sparse control loss function by using the parameter value of each neuron parameter and the comprehensive importance degree of each neuron parameter of the cutter cutting image to be detected in the DNN;
obtaining a comprehensive loss function of the cutter cutting image to be detected in the DNN network through the sparse control loss function and the cross entropy loss function, and performing supervision training on the network by using the comprehensive loss function to complete network training;
and inputting the cutting image of the cutter to be detected into the trained network to obtain the position of the cutter, and cutting according to the position of the cutter.
2. The intelligent production control method for the structural member of the overhead working truck as claimed in claim 1, wherein the method for obtaining the identification difficulty of the cutter in the cutting image of each cutter comprises:
obtaining the description capacity of the dictionary vector unique to the cutter in each cutter cutting image through the times that the dictionary vector unique to the cutter in each cutter cutting image appears in the sparse vectors of the non-cutter image blocks of all the cutting images;
and obtaining the identification difficulty of the cutters in each cutter cutting image through the unique dictionary vectors of all the cutters in each cutter cutting image and the description capacity of the unique dictionary vectors of the cutters.
3. The intelligent production control method for the structural member of the overhead working truck according to claim 1, wherein the method for obtaining the image block set by partitioning all the cutter cut images in the cutter cut image data set comprises the following steps:
acquiring a rectangular frame with the largest area in cutter external rectangles in all cutter cutting images in a cutter cutting image data set;
and dividing all the cutter cut images into a plurality of image blocks by taking the size of the rectangular frame with the largest area as a standard to obtain an image block set.
4. The intelligent production control method for the structural member of the aerial work platform as claimed in claim 1, wherein the expression of the comprehensive importance degree of each neuron parameter is as follows:
in the formula:is shown asThe overall importance of the individual neuron parameters,indicating the second in the networkThe parameters of each of the neurons are determined,representing the total number of neuron parameters in the network,is shown asThe individual neuron parameter andthe degree of correlation values of the parameters of the individual neurons,is shown asThe degree of independent importance of individual neuron parameters.
5. The intelligent production control method for the structural member of the overhead working truck as claimed in claim 2, wherein the expression of the identification difficulty of the cutter in the cutter cutting image is as follows:
in the formula:is shown asThe difficulty of identifying the tool in the cutting image of the cutter,is shown asIn the image cut by the sheet knifeThe descriptive ability of dictionary vectors unique to each tool,is shown asCutting the image with a knifeThe second in sparse vectors corresponding to dictionary vectors unique to each toolThe value of the one or more of the one,is shown asIn the image cut by the sheet knifeA dictionary vector unique to each tool,is shown asAnd the number of dictionary vectors unique to the cutter in the image cut by the cutter.
6. The intelligent production control method for the structural member of the aerial work platform as claimed in claim 4, wherein the expression of the sparse control loss function is as follows:
7. The intelligent production control method for the structural member of the aerial work platform as claimed in claim 1, wherein the independent importance degree of each neuron parameter is obtained by subtracting the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210817835.0A CN114898220B (en) | 2022-07-13 | 2022-07-13 | Intelligent production control method for structural member of overhead working truck |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210817835.0A CN114898220B (en) | 2022-07-13 | 2022-07-13 | Intelligent production control method for structural member of overhead working truck |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114898220A true CN114898220A (en) | 2022-08-12 |
CN114898220B CN114898220B (en) | 2022-09-09 |
Family
ID=82729981
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210817835.0A Active CN114898220B (en) | 2022-07-13 | 2022-07-13 | Intelligent production control method for structural member of overhead working truck |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114898220B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115049814A (en) * | 2022-08-15 | 2022-09-13 | 聊城市飓风工业设计有限公司 | Intelligent eye protection lamp adjusting method adopting neural network model |
CN115359497A (en) * | 2022-10-14 | 2022-11-18 | 景臣科技(南通)有限公司 | Call center monitoring alarm method and system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109598799A (en) * | 2018-11-30 | 2019-04-09 | 南京信息工程大学 | A kind of Virtual cropping method based on CycleGAN |
WO2021150017A1 (en) * | 2020-01-23 | 2021-07-29 | Samsung Electronics Co., Ltd. | Method for interactive segmenting an object on an image and electronic computing device implementing the same |
CN114140485A (en) * | 2021-11-29 | 2022-03-04 | 昆明理工大学 | Method and system for generating cutting track of main root of panax notoginseng |
CN114742987A (en) * | 2022-06-08 | 2022-07-12 | 苏州市洛肯电子科技有限公司 | Automatic positioning control method and system for cutting of non-metallic materials |
-
2022
- 2022-07-13 CN CN202210817835.0A patent/CN114898220B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109598799A (en) * | 2018-11-30 | 2019-04-09 | 南京信息工程大学 | A kind of Virtual cropping method based on CycleGAN |
WO2021150017A1 (en) * | 2020-01-23 | 2021-07-29 | Samsung Electronics Co., Ltd. | Method for interactive segmenting an object on an image and electronic computing device implementing the same |
CN114140485A (en) * | 2021-11-29 | 2022-03-04 | 昆明理工大学 | Method and system for generating cutting track of main root of panax notoginseng |
CN114742987A (en) * | 2022-06-08 | 2022-07-12 | 苏州市洛肯电子科技有限公司 | Automatic positioning control method and system for cutting of non-metallic materials |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115049814A (en) * | 2022-08-15 | 2022-09-13 | 聊城市飓风工业设计有限公司 | Intelligent eye protection lamp adjusting method adopting neural network model |
CN115359497A (en) * | 2022-10-14 | 2022-11-18 | 景臣科技(南通)有限公司 | Call center monitoring alarm method and system |
Also Published As
Publication number | Publication date |
---|---|
CN114898220B (en) | 2022-09-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114898220B (en) | Intelligent production control method for structural member of overhead working truck | |
CN109523520B (en) | Chromosome automatic counting method based on deep learning | |
CN111079602A (en) | Vehicle fine granularity identification method and device based on multi-scale regional feature constraint | |
Gakii et al. | A classification model for water quality analysis using decision tree | |
CN108711148B (en) | Tire defect intelligent detection method based on deep learning | |
CN106355192A (en) | Support vector machine method based on chaos and grey wolf optimization | |
CN105374209B (en) | A kind of urban area road network running status characteristics information extraction method | |
CN103473786A (en) | Gray level image segmentation method based on multi-objective fuzzy clustering | |
CN111652167A (en) | Intelligent evaluation method and system for chromosome karyotype image | |
CN113658174B (en) | Microkernel histology image detection method based on deep learning and image processing algorithm | |
CN112560722A (en) | Airplane target identification method and device, computer equipment and storage medium | |
CN111832608A (en) | Multi-abrasive-particle identification method for ferrographic image based on single-stage detection model yolov3 | |
CN110288017B (en) | High-precision cascade target detection method and device based on dynamic structure optimization | |
CN115760484A (en) | Method, device and system for improving potential danger identification capability of power distribution station area and storage medium | |
CN104915679A (en) | Large-scale high-dimensional data classification method based on random forest weighted distance | |
CN111784022A (en) | Short-time adjacent fog prediction method based on combination of Wrapper method and SVM method | |
CN106202274B (en) | A kind of defective data automatic abstract classification method based on Bayesian network | |
CN109446964A (en) | Face detection analysis method and device based on end-to-end single-stage multiple scale detecting device | |
US20150242676A1 (en) | Method for the Supervised Classification of Cells Included in Microscopy Images | |
CN117315380B (en) | Deep learning-based pneumonia CT image classification method and system | |
CN115269958A (en) | Internet reliability data information acquisition and analysis system | |
CN117252459A (en) | Fruit quality evaluation system based on deep learning | |
CN115830302A (en) | Multi-scale feature extraction and fusion power distribution network equipment positioning identification method | |
Nagpal et al. | An application of deep learning for sweet cherry phenotyping using yolo object detection | |
CN115293827A (en) | Novel model interpretability analysis method for assisting fine operation of enterprise |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |