CN114898220A - Intelligent production control method for structural member of overhead working truck - Google Patents

Intelligent production control method for structural member of overhead working truck Download PDF

Info

Publication number
CN114898220A
CN114898220A CN202210817835.0A CN202210817835A CN114898220A CN 114898220 A CN114898220 A CN 114898220A CN 202210817835 A CN202210817835 A CN 202210817835A CN 114898220 A CN114898220 A CN 114898220A
Authority
CN
China
Prior art keywords
cutter
parameter
image
neuron
cutting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210817835.0A
Other languages
Chinese (zh)
Other versions
CN114898220B (en
Inventor
姬蕾
郑代顺
路秋媛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jincheng Technology Co ltd
Original Assignee
Jincheng Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jincheng Technology Co ltd filed Critical Jincheng Technology Co ltd
Priority to CN202210817835.0A priority Critical patent/CN114898220B/en
Publication of CN114898220A publication Critical patent/CN114898220A/en
Application granted granted Critical
Publication of CN114898220B publication Critical patent/CN114898220B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/06Recognition of objects for industrial automation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the field of artificial intelligence, and provides an intelligent production control method for a structural member of an overhead working truck, which comprises the following steps: constructing a DNN network; obtaining a dictionary matrix and a sparse vector set of image blocks; obtaining a unique dictionary vector of the cutter in the cutter cutting image; obtaining the identification difficulty of the cutter in the cutter cutting image; obtaining a loss function value of a cutter cutting image in a DNN network and parameter values of all neuron parameters; acquiring a weighted information entropy of the loss function value; obtaining the weighted conditional entropy of the neuron parameters; obtaining the independent importance degree of the neuron parameters; obtaining the correlation degree of each neuron parameter and other neuron parameters; calculating the comprehensive importance degree of the neuron parameters; obtaining a sparse control loss function; performing supervision training on the network by using the comprehensive loss function to complete network training; and obtaining the position of the cutter, and cutting according to the position of the cutter. The invention improves the identification speed of the cutter.

Description

Intelligent production control method for structural member of overhead working truck
Technical Field
The invention relates to the field of artificial intelligence, in particular to an intelligent production control method for a structural member of an overhead working truck.
Background
With the development of social economy and the promotion of industrialization, the usage amount of the overhead working truck is increased. The structural member is used as a main stressed member of the overhead working truck, and the quality of the structural member directly influences the use safety of the overhead working truck.
The cutting process is used as a necessary process for processing a structural member of the overhead working truck, a common intelligent control method of the cutting process is to program a corresponding program in advance for a cutting route so as to realize cutting control according to the corresponding program, the cutting control method has low production flexibility, a set of cutting route program needs to be programmed by a professional worker when a new product is produced, and the product conversion rate of each set of program is low when the production batch is small. Meanwhile, according to the cutting control method, the production process is not monitored after a program is programmed, once a problem occurs in the middle process, deviation accumulation is easily caused, and the waste of the whole steel can occur when the problem occurs. The cutting control method is also a cutting control method which is used for carrying out cutting control through machine vision, namely, a camera is used for acquiring images of the cutting process in real time, and a cutting instruction at the next moment is determined according to the current cutting result images and the original design drawing. However, when cutting control is performed through machine vision, the position of a cutter needs to be positioned in real time, only when the position of the cutter is accurately positioned, a cutting instruction at each moment can be given, and the cutter can be accurately positioned by a neural network.
The parameters of the neural network can be sparsified to improve the recognition efficiency of the cutter, but the network characterization capability is reduced by parameter sparsification, so that the neural network needs to be analyzed and considered to sparsify the parameter data with poor description capability, and important parameters are reserved, thereby realizing the purpose of improving the recognition efficiency while ensuring that the sparsity of the network parameters is as low as possible.
Disclosure of Invention
The invention provides an intelligent production control method for a structural member of an overhead working truck, which aims to solve the problem of low identification efficiency of the existing cutter.
The invention discloses an intelligent production control method of a structural member of an overhead working truck, which adopts the following technical scheme that the method comprises the following steps:
constructing a DNN network, wherein a loss function of the network is a cross entropy loss function, and performing supervision training on the DNN network through the cross entropy loss function;
partitioning all cutter cutting images in the cutter cutting image data set to obtain an image block set, and performing dictionary training on all image blocks in the image block set to obtain a dictionary matrix and a sparse vector set of the image blocks;
obtaining a dictionary vector unique to the cutter in each cutter cutting image through the sparse vector of each cutter image block in the sparse vector set of the image blocks, the corresponding sparse vectors of all non-cutter image blocks and a dictionary matrix;
obtaining the identification difficulty of the cutter in each cutter cutting image through the unique dictionary vector of the cutter in each cutter cutting image and the times of the unique dictionary vector appearing in the sparse vector of the non-cutter image block of all the cutting images;
acquiring all neuron parameters in the DNN, acquiring loss function values of all cutter cutting images in the DNN and parameter values of all neuron parameters, and grading the loss function values of all cutter cutting images in the cutter cutting image data set and the parameter values of all neuron parameters;
acquiring the weighted information entropy of the loss function value by using the identification difficulty of the cutter in the cutter cutting image corresponding to each loss function value grade;
obtaining the weighted conditional entropy of each neuron parameter by using the identification difficulty of the cutter in the cutter cutting image corresponding to the parameter values of all neuron parameters in the parameter value grade of each neuron parameter;
obtaining the independent importance degree of each neuron parameter according to the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value;
obtaining the association degree of each neuron parameter and each other neuron parameter by using the parameter values of each neuron parameter and each other neuron parameter in all the cutter cutting images;
calculating the comprehensive importance degree of each neuron parameter by using the independent importance degree of each neuron parameter and the association degree of the neuron parameter and each other neuron parameter;
obtaining a sparse control loss function by using the parameter value of each neuron parameter and the comprehensive importance degree of each neuron parameter of the cutter cutting image to be detected in the DNN;
obtaining a comprehensive loss function of the cutter cutting image to be detected in the DNN network through the sparse control loss function and the cross entropy loss function, and performing supervision training on the network by using the comprehensive loss function to complete network training;
and inputting the cutting image of the cutter to be detected into the trained network to obtain the position of the cutter, and cutting according to the position of the cutter.
Further, the method for obtaining the identification difficulty of the cutter in each cutter cutting image comprises the following steps:
obtaining the description capacity of the unique dictionary vector of the cutter in each cutter cutting image through the times that the unique dictionary vector of the cutter in each cutter cutting image appears in the sparse vectors of the non-cutter image blocks of all cutting images;
and obtaining the identification difficulty of the cutters in each cutter cutting image through the unique dictionary vectors of all the cutters in each cutter cutting image and the description capacity of the unique dictionary vectors of the cutters.
Further, the method for controlling the intelligent production of the structural member of the overhead working truck comprises the following steps of:
acquiring a rectangular frame with the largest area in cutter external rectangles in all cutter cutting images in a cutter cutting image data set;
and dividing all the cutter cut images into a plurality of image blocks by taking the size of the rectangular frame with the largest area as a standard to obtain an image block set.
Further, in the method for controlling the intelligent production of the structural member of the aerial work platform, the expression of the comprehensive importance degree of each neuron parameter is as follows:
Figure DEST_PATH_IMAGE002
in the formula:
Figure DEST_PATH_IMAGE004
is shown as
Figure DEST_PATH_IMAGE006
The overall importance of the individual neuron parameters,
Figure DEST_PATH_IMAGE008
indicating the second in the network
Figure DEST_PATH_IMAGE010
The parameters of each of the neurons are determined,
Figure DEST_PATH_IMAGE012
representing the total number of neuron parameters in the network,
Figure DEST_PATH_IMAGE014
is shown as
Figure 128216DEST_PATH_IMAGE006
The individual neuron parameter and
Figure 842094DEST_PATH_IMAGE010
the degree of correlation values of the parameters of the individual neurons,
Figure DEST_PATH_IMAGE016
is shown as
Figure 746465DEST_PATH_IMAGE006
The degree of independent importance of individual neuron parameters.
Further, in the intelligent production control method for the structural member of the overhead working truck, the expression of the identification difficulty of the cutter in the cutter cutting image is as follows:
Figure DEST_PATH_IMAGE018
in the formula:
Figure DEST_PATH_IMAGE020
is shown as
Figure DEST_PATH_IMAGE022
The difficulty of identifying the tool in the cutting image of the cutter,
Figure DEST_PATH_IMAGE024
is shown as
Figure 249996DEST_PATH_IMAGE022
Cutting the image with a knife
Figure DEST_PATH_IMAGE026
The descriptive ability of dictionary vectors unique to each tool,
Figure DEST_PATH_IMAGE028
is shown as
Figure 595527DEST_PATH_IMAGE022
In the image cut by the sheet knife
Figure 103475DEST_PATH_IMAGE026
The second in sparse vectors corresponding to dictionary vectors unique to each tool
Figure 167246DEST_PATH_IMAGE026
The value of the one or more of the one,
Figure 303830DEST_PATH_IMAGE026
is shown as
Figure 894080DEST_PATH_IMAGE022
In the image cut by the sheet knife
Figure 825127DEST_PATH_IMAGE026
A dictionary vector unique to each tool,
Figure DEST_PATH_IMAGE030
is shown as
Figure 939976DEST_PATH_IMAGE022
And the number of dictionary vectors unique to the cutter in the image cut by the cutter.
Further, in the intelligent production control method for the structural member of the overhead working truck, the expression of the sparse control loss function is as follows:
Figure DEST_PATH_IMAGE032
in the formula:
Figure DEST_PATH_IMAGE034
a function representing the loss of sparse control is represented,
Figure 690369DEST_PATH_IMAGE006
is shown as
Figure 400705DEST_PATH_IMAGE006
The parameters of each of the neurons are determined,
Figure DEST_PATH_IMAGE036
showing the tool to be inspected cutting image in the DNN network
Figure 456648DEST_PATH_IMAGE006
Parameter values of individual neuron parameters.
Further, according to the intelligent production control method for the structural member of the aerial work platform, the independent importance degree of each neuron parameter is obtained by making a difference between the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value.
The invention has the beneficial effects that: the invention obtains the sparse necessity of each parameter by analyzing the importance of each network parameter in the training process, and constructs the sparse control loss function according to the sparse necessity, thereby meeting the cutting real-time control requirement by improving the real-time property of cutter identification. Compared with the prior art, the method improves the cutter recognition speed of the network by improving the sparsity of the network, thereby reducing the cutting hysteresis and improving the cutting control quality.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow chart diagram of an embodiment of an intelligent production control method for a structural member of an aerial work platform according to the present invention;
fig. 2 is a schematic diagram of a parameter diagram structure.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
The main purposes of the invention are: the importance degree of each parameter data is obtained by analyzing each parameter data in the cutting process, and a sparse loss function is constructed, so that the recognition speed is improved on the basis of ensuring the recognition precision.
In order to achieve the object of the present invention, an embodiment of an intelligent production control method for a structural member of an aerial work platform is provided, as shown in fig. 1, including:
the scenario addressed by the present embodiment is: firstly, arranging a camera to collect cutting images, training a tool recognition network by using the collected cutting images, obtaining the importance of each parameter by analyzing parameter data of each training stage in the network training process, and constructing a sparse loss function according to the importance of each parameter. The network parameters are thinned as much as possible through the sparse loss function, the identification efficiency of the network is improved, and real-time cutting control is met.
101. And constructing the DNN network, wherein a loss function of the network is a cross entropy loss function, and performing supervised training on the DNN network through the cross entropy loss function.
In the embodiment, cutter identification is needed to realize cutting control, and a cutting image training network is needed to realize cutter identification, so that the cutting image is collected firstly.
Acquiring a cut image dataset: the collected cutting image needs to contain a cutting tool and cutting information, so that a camera is arranged on one side of the cutting machine, the camera collects one cutting image every 10s, and the collected cutting image is labeled to obtain a cutting image data set.
Finishing the first round of training of the tool identification network: the cutter identification network is a DNN network, the network structure is an Encoder-Decoder structure, the network input is a cutting image data set, and the network output is a cutter position hot spot diagram. The loss function of the network is a cross entropy loss function. And inputting all cutting images into a tool recognition network to complete a first round of training of data.
And obtaining a cutting image, labeling and processing to obtain a training data set of the network, and finishing initial training of the network.
In order to improve the identification efficiency of the tool identification network, parameters in the identification network need to be sparse as much as possible, and the identification precision of the network also needs to be guaranteed when the network is sparse, namely, only the parameters with low importance degree can be zero, and the parameters with high importance degree cannot be zero, so that a sparse loss function is constructed based on the method. And the parameter data of each training stage needs to be analyzed in order to reflect the importance degree of each parameter. The method comprises the steps of calculating a cutter identification difficulty weight by considering the cutter identification difficulty of each cutting image during parameter analysis, calculating the independent importance degree of each parameter by combining the weights, analyzing the relation among the parameters to obtain the association degree, constructing a graph structure by taking the independent importance degree of each parameter as a node value and the association degree among the parameters as an edge weight, analyzing the graph structure to obtain the comprehensive importance degree of each parameter node, and constructing a sparse control loss function by combining the comprehensive importance degree of each parameter.
In order to analyze the importance degree of each parameter, the determination degree of the accuracy of each parameter on the identification accuracy of the images with various identification difficulties needs to be considered, wherein the determination degree of the identification accuracy of the images with high identification difficulties can reflect the importance of the parameter, namely, the accuracy degree of each parameter not only affects the identification accuracy of the images with low identification difficulties, but also has great influence on the identification accuracy of the images with high identification difficulties, so that the importance degree of the parameter is higher.
Therefore, the identification difficulty of each image is calculated firstly, and the specific process is as follows:
102. and partitioning all cutter cut images in the cutter cut image data set to obtain an image block set, and performing dictionary training on all image blocks in the image block set to obtain a dictionary matrix and a sparse vector set of the image blocks.
The method comprises the steps of obtaining a circumscribed rectangle of a cutter in a cutting image through a cutter label, obtaining a rectangular frame with the largest area in the circumscribed rectangle of the cutter in all the cutting images, and uniformly dividing each cutting image into a plurality of image blocks with the sizes by utilizing a sliding window to slide in a non-coincident mode according to the size of the rectangular frame as the size of the sliding window. And thus, all the cut images in the data set are subjected to blocking processing to obtain an image block set.
And then performing dictionary training on all image blocks in the data set by utilizing a K-SVD algorithm to obtain a dictionary matrix and a sparse vector set of the image blocks.
And acquiring a sparse vector set corresponding to all the cutter image blocks, recording the sparse vector set as a cutter sparse vector set, and acquiring a sparse vector set corresponding to the non-cutter image blocks, and recording the sparse vector set as a non-cutter sparse vector set.
103. And obtaining a dictionary vector unique to the cutter in each cutter cutting image through the sparse vector of each cutter image block in the sparse vector set of the image blocks, the corresponding sparse vectors of all non-cutter image blocks and the dictionary matrix.
For the convenience of analysis, the following steps are adopted
Figure 370378DEST_PATH_IMAGE022
Analyzing the cutting image to obtain the first
Figure 832452DEST_PATH_IMAGE022
Obtaining the sparse vector corresponding to the cutter image block of each cutting image
Figure 148027DEST_PATH_IMAGE022
And sparse vectors corresponding to all non-cutter image blocks of each cutting image. And comparing the sparse vector of the cutter image block with the sparse vectors of the non-cutter image blocks to obtain dictionary vectors corresponding to non-sparse values unique to the cutter image block, wherein for example, the a-th value in the sparse vector corresponding to the cutter image block is a non-sparse value, and the a-th values of the sparse vectors corresponding to all the non-cutter image blocks are sparse values, so that the dictionary vector corresponding to the a-th value is the dictionary vector unique to the cutter. Analogy to this approach yields a dictionary vector unique to the tool in each of the other cutting images.
104. And obtaining the identification difficulty of the cutter in each cutter cutting image through the unique dictionary vector of the cutter in each cutter cutting image and the times of the unique dictionary vector appearing in the sparse vector of the non-cutter image block of all the cutting images.
Analyzing the descriptive ability of the unique dictionary vectors of each tool to obtain the number of times each unique dictionary vector appears in the non-tool image blocks of all cut images, e.g. the
Figure 168679DEST_PATH_IMAGE022
S-th unique dictionary of image block of cutting image toolThe vector corresponds to the s-th value in the sparse vector, the number of non-sparse values appearing in the s-th value in the sparse vector set corresponding to all the non-tool image blocks is obtained, the number ratio is obtained by taking the ratio of the number of all the non-tool image blocks as the number occupation ratio, the number occupation ratio is recorded as the description capacity of the dictionary vector unique to the s-th tool, when the description capacity of the dictionary vector is small, the description characteristic sometimes serves as the description characteristic of the non-tool image, sometimes serves as the description characteristic of the tool image, and therefore the segmentation accuracy of the tool for describing the characteristic by using the characteristic as the tool is poor.
And obtaining the description value in each sparse vector corresponding to the tool unique dictionary vector in each cutting image (the description value in each sparse vector is the value in each sparse vector). The difficulty of identifying the tool in each cut image is thus:
Figure DEST_PATH_IMAGE037
in the formula:
Figure 959917DEST_PATH_IMAGE020
is shown as
Figure 727147DEST_PATH_IMAGE022
The identification difficulty of the cutter in the cutting image is larger, the image contains a plurality of characteristic information with larger distinguishability, therefore, the identification difficulty of the cutter of the image is smaller,
Figure 21862DEST_PATH_IMAGE024
is shown as
Figure 731192DEST_PATH_IMAGE022
In the sheet cut image
Figure 603202DEST_PATH_IMAGE026
The describing capability of dictionary vectors unique to each tool is larger, the larger the value is, the more accurate the tool identification is performed by taking the feature as the tool feature,
Figure 157812DEST_PATH_IMAGE028
is shown as
Figure 195781DEST_PATH_IMAGE022
In the sheet cut image
Figure 810434DEST_PATH_IMAGE026
The second in sparse vectors corresponding to dictionary vectors unique to each tool
Figure 45106DEST_PATH_IMAGE026
A larger value indicates more feature information in the image containing the description of the s-th dictionary vector,
Figure 387094DEST_PATH_IMAGE026
is shown as
Figure 797347DEST_PATH_IMAGE022
In the sheet cut image
Figure 333633DEST_PATH_IMAGE026
A dictionary vector unique to each tool,
Figure 930967DEST_PATH_IMAGE030
is shown as
Figure 342226DEST_PATH_IMAGE022
The number of dictionary vectors unique to the tool in the cut-open image.
105. Obtaining all neuron parameters in the DNN, obtaining the loss function value of each cutter cutting image in the DNN and the parameter values of all neuron parameters, and grading the loss function values of all cutter cutting images in the cutter cutting image data set and the parameter values of all neuron parameters.
The independent importance degree of each parameter can be reflected by the information gain of the parameter, and only the information gain value of the identification accuracy condition of images with various identification difficulties needs to be considered when the information gain is calculated. (the information gain describes the degree of certainty of each attribute and the degree of certainty of the comprehensive description attribute of the system, the larger the information gain of each attribute is, the larger the purity improvement of the total comprehensive description attribute of the system is (the larger the degree of certainty is), and the determination degree of the accuracy of each network parameter to the loss function is put in this text).
Value data of each parameter (in this embodiment, the parameter refers to a neuron parameter) in the current training stage is obtained, and all the value data of each parameter form a data sequence. And obtaining the loss function value of the loss function in the current training stage, and forming a loss function value sequence by all the loss function values.
Dividing each value in the loss function value sequence into three levels according to the value size, wherein the lower the loss function value, the higher the level, the first, the second and the third level are respectively corresponding, for example, the loss function sequence
Figure DEST_PATH_IMAGE039
The maximum value 14 and the minimum value 2 of the value sequence are obtained, so that the value width of each grade is
Figure DEST_PATH_IMAGE041
Thus, the value intervals of the first, second and third levels are
Figure DEST_PATH_IMAGE043
The first, second and third grades are represented by labels 1, 2 and 3, respectively, so that the discrete values of the loss function sequence are
Figure DEST_PATH_IMAGE045
Dividing each value in each (neuron) parameter sequence into 10 value categories (namely value levels) according to the value taking size, and respectively marking as a value category 1, a value category 2, … and a value category 10.
106. And acquiring the weighted information entropy of the loss function value by using the identification difficulty of the cutter in the cutter cutting image corresponding to each loss function value grade.
In the training process, different loss function values can be obtained due to different identification difficulties of the cutters in different images under the same parameter, so that the identification difficulty of the cutters in each image is different, the loss function value in the image with low identification difficulty is small, and if the problem of identification difficulty is not considered, the problem of inaccurate solved information gain value caused by directly solving the information gain of each parameter can be solved.
Weighted information entropy for calculating loss function value by taking tool identification difficulty of each image as weight
Figure DEST_PATH_IMAGE047
The expression is as follows:
Figure DEST_PATH_IMAGE049
wherein:
Figure DEST_PATH_IMAGE051
data set representing cutting image of tool
Figure DEST_PATH_IMAGE053
The difficulty of identifying the tool in the cutting image of each tool,
Figure 128958DEST_PATH_IMAGE053
data set representing cutting image of tool
Figure 98795DEST_PATH_IMAGE053
The number of images is one of the number of images,
Figure DEST_PATH_IMAGE055
indicating the number of images in the image dataset cut by the tool,
Figure DEST_PATH_IMAGE057
the values of the loss function are all expressed in the second place
Figure DEST_PATH_IMAGE059
Within the individual grade interval
Figure DEST_PATH_IMAGE061
The identification of the cutting tool in the cutting image of the cutting tool is difficultThe degree of the magnetic field is measured,
Figure DEST_PATH_IMAGE063
means that the loss function takes a value of
Figure 730895DEST_PATH_IMAGE059
The number of sample data corresponding to the cutter cutting the image in each grade interval,
Figure 680266DEST_PATH_IMAGE061
indicating loss function value as
Figure 533952DEST_PATH_IMAGE059
Within the individual grade interval
Figure 409111DEST_PATH_IMAGE061
Each cutter cuts the image sample data,
Figure 981038DEST_PATH_IMAGE059
expressing values of loss functions
Figure 343886DEST_PATH_IMAGE059
And (4) each grade interval.
107. And obtaining the weighted conditional entropy of each neuron parameter by using the identification difficulty of the cutter in the cutter cutting image corresponding to the parameter values of all neuron parameters in the parameter value grade of each neuron parameter.
For the analysis, taking the ith parameter as an example, the weighted conditional entropy of the ith parameter is calculated
Figure DEST_PATH_IMAGE065
The expression is as follows:
Figure DEST_PATH_IMAGE067
wherein:
Figure DEST_PATH_IMAGE069
indicates that the ith parameter is in
Figure DEST_PATH_IMAGE071
To the second in the individual value category
Figure DEST_PATH_IMAGE073
The tool identification difficulty value of each tool cutting image,
Figure 347352DEST_PATH_IMAGE073
indicates that the ith parameter is in
Figure 894877DEST_PATH_IMAGE071
To the second in the individual value category
Figure 78733DEST_PATH_IMAGE073
The image is cut by the cutter and the image is cut by the cutter,
Figure DEST_PATH_IMAGE075
indicates that the ith parameter is in
Figure 809054DEST_PATH_IMAGE071
The number of cutter cutting images in each value category;
Figure 637333DEST_PATH_IMAGE051
data set representing cutting image of tool
Figure 355759DEST_PATH_IMAGE053
The tool identification difficulty value of each tool cutting image,
Figure 636699DEST_PATH_IMAGE053
data set representing cutting image of tool
Figure 495677DEST_PATH_IMAGE053
The image is cut by the cutter and the image is cut by the cutter,
Figure 444041DEST_PATH_IMAGE055
indicating the number of tool-cut images in the tool-cut image dataset,
Figure DEST_PATH_IMAGE077
indicates that the ith parameter is in
Figure 130107DEST_PATH_IMAGE071
Individual value classes and their corresponding loss function values are in the second place
Figure 898342DEST_PATH_IMAGE059
To the second in the individual value category
Figure DEST_PATH_IMAGE079
The tool identification difficulty value of each tool cutting image,
Figure 970466DEST_PATH_IMAGE079
indicates that the ith parameter is in
Figure 632391DEST_PATH_IMAGE071
The value class and the loss function value corresponding to the value class are in the kth value class
Figure 692620DEST_PATH_IMAGE079
The image is cut by the cutter and the image is cut by the cutter,
Figure DEST_PATH_IMAGE081
indicates the ith parameter is in the second
Figure 633638DEST_PATH_IMAGE071
Individual value classes and their corresponding loss function values are in the second place
Figure 821037DEST_PATH_IMAGE059
The number of the cutter cutting images in each value category,
Figure 993261DEST_PATH_IMAGE071
to express parameters
Figure 975124DEST_PATH_IMAGE071
And (4) each value level.
108. And obtaining the independent importance degree of each neuron parameter according to the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value.
Calculating the weighted information gain value of the ith parameter, and taking the weighted information gain value of the ith parameter as the independent importance degree of the ith parameter, which is recorded as
Figure 734263DEST_PATH_IMAGE016
. A larger value indicates that the degree of determination (degree of accuracy) of the parameter is more decisive for the degree of determination of the loss function value. Weighted information gain value of ith parameter
Figure DEST_PATH_IMAGE083
The expression of (a) is:
Figure DEST_PATH_IMAGE085
109. and obtaining the association degree of each neuron parameter and each other neuron parameter by using the parameter values of each neuron parameter and each other neuron parameter in all the cutter cutting images.
The DNN network has a plurality of neuron parameters, parameter values of the same parameter corresponding to all cutter cutting images in the cutter cutting image data set form a parameter sequence of the parameter, according to the method, the parameter sequence of each parameter can be obtained, correlation coefficients are respectively calculated pairwise between the parameter sequences, and the correlation degree between the parameters is expressed by utilizing the square of the correlation coefficients. The expression of the correlation coefficient is as follows:
Figure DEST_PATH_IMAGE087
in the formula:
Figure DEST_PATH_IMAGE089
the parameter sequence representing the ith parameter
Figure DEST_PATH_IMAGE091
The value of the one or more of the one,
Figure DEST_PATH_IMAGE093
represents the average of all values in the parameter sequence of the ith parameter,
Figure DEST_PATH_IMAGE095
in the parameter sequence representing the jth parameter
Figure 941997DEST_PATH_IMAGE091
The value of the one or more of the one,
Figure DEST_PATH_IMAGE097
the average of all values in the parameter sequence representing the jth parameter,
Figure 404946DEST_PATH_IMAGE091
the parameter sequence representing the ith parameter
Figure 823289DEST_PATH_IMAGE091
The value of the one or more of the one,
Figure DEST_PATH_IMAGE099
and the number of values in the parameter sequence of the ith parameter, or the number of elements in the parameter sequence of the ith parameter is expressed, and the number of values is consistent with the number of the cutter cutting images in the cutter cutting image data set.
110. And calculating the comprehensive importance degree of each neuron parameter by using the independent importance degree of each neuron parameter and the association degree of the neuron parameter and each other neuron parameter.
The small independent importance degree of each parameter cannot completely indicate that the parameter is unimportant, because the parameter has a larger correlation with some more important parameters, and the parameter is also more important, so the relationship between each parameter and other parameters and the importance degree of the parameter correlated with the parameter need to be analyzed to calculate the comprehensive importance degree of each parameter.
Taking the independent importance degree of each parameter as a node value, and taking the association degree between the parameters as an edge weight value to construct a parameter map structure, for example, if there are five neuron parameters, the independent importance degrees of neuron parameter 1, neuron parameter 2, neuron parameter 3, neuron parameter 4 and neuron parameter 5 are 10, 4, 15, 3, 20, respectively, the association degree of neuron parameter 1 and neuron parameter 2 is 0.9, the association degree of neuron parameter 1 and neuron parameter 3 is 0.5, the association degree of neuron parameter 1 and neuron parameter 4 is 0.33, the association degree of neuron parameter 1 and neuron parameter 5 is 0.91, the association degree of neuron parameter 2 and neuron parameter 3 is 0.86, the association degree of neuron parameter 2 and neuron parameter 4 is 0.33, the association degree of neuron parameter 2 and neuron parameter 5 is 0.61, the association degree of neuron parameter 3 and neuron parameter 4 is 0.53, and if the association degree of the neuron parameter 3 and the neuron parameter 5 is 0.5, and the association degree of the neuron parameter 4 and the neuron parameter 5 is 0.49, constructing a parameter graph structure by taking the respective independent importance degrees of the five neuron parameters as node values and the association degree between the parameters as an edge weight. As shown in fig. 2.
Because the comprehensive importance degree of each parameter needs to be analyzed, the relationship between each parameter and other parameters needs to be analyzed, and when the parameter has a larger relevance with some parameters with larger independent importance degree, or has a larger relevance with some parameters with smaller independent importance degree, but these parameters have some parameters with higher relevance and larger independent importance, this indicates that the importance of the parameter is larger.
First, the
Figure 974785DEST_PATH_IMAGE006
Comprehensive importance of individual neuron parameters (i.e., nodes)
Figure 520298DEST_PATH_IMAGE004
The expression of (a) is:
Figure DEST_PATH_IMAGE101
in the formula:
Figure 542481DEST_PATH_IMAGE004
is shown as
Figure 637386DEST_PATH_IMAGE006
The overall degree of importance of the individual parameters,
Figure 620385DEST_PATH_IMAGE010
indicating the second in the network
Figure 936966DEST_PATH_IMAGE010
The number of the parameters is one,
Figure 423442DEST_PATH_IMAGE012
representing the total number of parameters in the network,
Figure 199900DEST_PATH_IMAGE014
is shown as
Figure 263671DEST_PATH_IMAGE006
A parameter and
Figure 400254DEST_PATH_IMAGE010
the degree of association value of each parameter,
Figure 724925DEST_PATH_IMAGE016
is shown as
Figure 921551DEST_PATH_IMAGE006
The degree of independent importance of each parameter.
111. And obtaining a sparse control loss function by using the parameter value of each neuron parameter and the comprehensive importance degree of each neuron parameter in the DNN network of the cutter cutting image to be detected.
Constructing a sparse control loss function:
Figure DEST_PATH_IMAGE032A
in the formula:
Figure 892525DEST_PATH_IMAGE034
a function representing the loss of sparse control is represented,
Figure 317952DEST_PATH_IMAGE006
denotes the first
Figure 779020DEST_PATH_IMAGE006
The number of the parameters is one,
Figure 395815DEST_PATH_IMAGE012
representing the total number of parameters in the network,
Figure 309545DEST_PATH_IMAGE036
showing the tool to be inspected cutting image in the DNN network
Figure 801312DEST_PATH_IMAGE006
Parameter values of the individual parameters.
112. And obtaining a comprehensive loss function of the cutter cutting image to be detected in the DNN network through the sparse control loss function and the cross entropy loss function, and performing supervision training on the network by using the comprehensive loss function to complete network training.
The original cross entropy loss function of the network is noted as
Figure DEST_PATH_IMAGE103
Thus the composite loss function is:
Figure DEST_PATH_IMAGE105
at this point, the importance degree of each parameter is obtained by analyzing each parameter in the training process, and the sparse control loss function is set according to the importance degree.
And carrying out supervision training on the network by using the comprehensive loss function until the network training is finished.
113. And inputting the cutting image of the cutter to be detected into the trained network to obtain the position of the cutter, and cutting according to the position of the cutter.
And rapidly recognizing the position of the cutter by using the trained cutter recognition network, and determining a cutting instruction at the next moment according to the current position of the cutter and a design drawing for cutting the structural member. And finishing cutting according to the cutting instruction, and realizing intelligent control on cutting of the structural part.
The invention obtains the sparse necessity of each parameter by analyzing the importance of each network parameter in the training process, and constructs the sparse control loss function according to the sparse necessity, thereby meeting the cutting real-time control requirement by improving the real-time property of cutter identification. Compared with the prior art, the method improves the cutter recognition speed of the network by improving the sparsity of the network, thereby reducing the cutting hysteresis and improving the cutting control quality.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. An intelligent production control method for a structural member of an overhead working truck is characterized by comprising the following steps:
constructing a DNN network, wherein a loss function of the network is a cross entropy loss function, and performing supervision training on the DNN network through the cross entropy loss function;
partitioning all cutter cutting images in the cutter cutting image data set to obtain an image block set, and performing dictionary training on all image blocks in the image block set to obtain a dictionary matrix and a sparse vector set of the image blocks;
obtaining a dictionary vector unique to the cutter in each cutter cutting image through the sparse vector of each cutter image block in the sparse vector set of the image blocks, the corresponding sparse vectors of all non-cutter image blocks and a dictionary matrix;
obtaining the identification difficulty of the cutter in each cutter cutting image through the unique dictionary vector of the cutter in each cutter cutting image and the times of the unique dictionary vector appearing in the sparse vector of the non-cutter image block of all the cutting images;
acquiring all neuron parameters in the DNN, acquiring loss function values of all cutter cutting images in the DNN and parameter values of all neuron parameters, and grading the loss function values of all cutter cutting images in the cutter cutting image data set and the parameter values of all neuron parameters;
acquiring the weighted information entropy of the loss function value by using the identification difficulty of the cutter in the cutter cutting image corresponding to each loss function value grade;
obtaining the weighted conditional entropy of each neuron parameter by using the identification difficulty of the cutter in the cutter cutting image corresponding to the parameter values of all neuron parameters in the parameter value grade of each neuron parameter;
obtaining the independent importance degree of each neuron parameter according to the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value;
obtaining the association degree of each neuron parameter and each other neuron parameter by using the parameter values of each neuron parameter and each other neuron parameter in all the cutter cutting images;
calculating the comprehensive importance degree of each neuron parameter by using the independent importance degree of each neuron parameter and the association degree of the neuron parameter and each other neuron parameter;
obtaining a sparse control loss function by using the parameter value of each neuron parameter and the comprehensive importance degree of each neuron parameter of the cutter cutting image to be detected in the DNN;
obtaining a comprehensive loss function of the cutter cutting image to be detected in the DNN network through the sparse control loss function and the cross entropy loss function, and performing supervision training on the network by using the comprehensive loss function to complete network training;
and inputting the cutting image of the cutter to be detected into the trained network to obtain the position of the cutter, and cutting according to the position of the cutter.
2. The intelligent production control method for the structural member of the overhead working truck as claimed in claim 1, wherein the method for obtaining the identification difficulty of the cutter in the cutting image of each cutter comprises:
obtaining the description capacity of the dictionary vector unique to the cutter in each cutter cutting image through the times that the dictionary vector unique to the cutter in each cutter cutting image appears in the sparse vectors of the non-cutter image blocks of all the cutting images;
and obtaining the identification difficulty of the cutters in each cutter cutting image through the unique dictionary vectors of all the cutters in each cutter cutting image and the description capacity of the unique dictionary vectors of the cutters.
3. The intelligent production control method for the structural member of the overhead working truck according to claim 1, wherein the method for obtaining the image block set by partitioning all the cutter cut images in the cutter cut image data set comprises the following steps:
acquiring a rectangular frame with the largest area in cutter external rectangles in all cutter cutting images in a cutter cutting image data set;
and dividing all the cutter cut images into a plurality of image blocks by taking the size of the rectangular frame with the largest area as a standard to obtain an image block set.
4. The intelligent production control method for the structural member of the aerial work platform as claimed in claim 1, wherein the expression of the comprehensive importance degree of each neuron parameter is as follows:
Figure DEST_PATH_IMAGE001
in the formula:
Figure 490862DEST_PATH_IMAGE002
is shown as
Figure DEST_PATH_IMAGE003
The overall importance of the individual neuron parameters,
Figure 541601DEST_PATH_IMAGE004
indicating the second in the network
Figure 71940DEST_PATH_IMAGE006
The parameters of each of the neurons are determined,
Figure DEST_PATH_IMAGE007
representing the total number of neuron parameters in the network,
Figure 808820DEST_PATH_IMAGE008
is shown as
Figure 159030DEST_PATH_IMAGE003
The individual neuron parameter and
Figure 875445DEST_PATH_IMAGE006
the degree of correlation values of the parameters of the individual neurons,
Figure DEST_PATH_IMAGE009
is shown as
Figure 650503DEST_PATH_IMAGE003
The degree of independent importance of individual neuron parameters.
5. The intelligent production control method for the structural member of the overhead working truck as claimed in claim 2, wherein the expression of the identification difficulty of the cutter in the cutter cutting image is as follows:
Figure DEST_PATH_IMAGE011
in the formula:
Figure 309017DEST_PATH_IMAGE012
is shown as
Figure DEST_PATH_IMAGE013
The difficulty of identifying the tool in the cutting image of the cutter,
Figure 300850DEST_PATH_IMAGE014
is shown as
Figure 319491DEST_PATH_IMAGE013
In the image cut by the sheet knife
Figure DEST_PATH_IMAGE015
The descriptive ability of dictionary vectors unique to each tool,
Figure 558842DEST_PATH_IMAGE016
is shown as
Figure 138991DEST_PATH_IMAGE013
Cutting the image with a knife
Figure 463793DEST_PATH_IMAGE015
The second in sparse vectors corresponding to dictionary vectors unique to each tool
Figure 958228DEST_PATH_IMAGE015
The value of the one or more of the one,
Figure 317665DEST_PATH_IMAGE015
is shown as
Figure 800206DEST_PATH_IMAGE013
In the image cut by the sheet knife
Figure 674621DEST_PATH_IMAGE015
A dictionary vector unique to each tool,
Figure DEST_PATH_IMAGE017
is shown as
Figure 441588DEST_PATH_IMAGE013
And the number of dictionary vectors unique to the cutter in the image cut by the cutter.
6. The intelligent production control method for the structural member of the aerial work platform as claimed in claim 4, wherein the expression of the sparse control loss function is as follows:
Figure DEST_PATH_IMAGE019
in the formula:
Figure 343948DEST_PATH_IMAGE020
a function representing the loss of sparse control is represented,
Figure 577483DEST_PATH_IMAGE003
is shown as
Figure 611298DEST_PATH_IMAGE003
The parameters of each of the neurons are determined,
Figure DEST_PATH_IMAGE021
showing the tool to be inspected cutting image in the DNN network
Figure 650798DEST_PATH_IMAGE003
Parameter values of individual neuron parameters.
7. The intelligent production control method for the structural member of the aerial work platform as claimed in claim 1, wherein the independent importance degree of each neuron parameter is obtained by subtracting the weighted conditional entropy of each neuron parameter and the weighted information entropy of the loss function value.
CN202210817835.0A 2022-07-13 2022-07-13 Intelligent production control method for structural member of overhead working truck Active CN114898220B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210817835.0A CN114898220B (en) 2022-07-13 2022-07-13 Intelligent production control method for structural member of overhead working truck

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210817835.0A CN114898220B (en) 2022-07-13 2022-07-13 Intelligent production control method for structural member of overhead working truck

Publications (2)

Publication Number Publication Date
CN114898220A true CN114898220A (en) 2022-08-12
CN114898220B CN114898220B (en) 2022-09-09

Family

ID=82729981

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210817835.0A Active CN114898220B (en) 2022-07-13 2022-07-13 Intelligent production control method for structural member of overhead working truck

Country Status (1)

Country Link
CN (1) CN114898220B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115049814A (en) * 2022-08-15 2022-09-13 聊城市飓风工业设计有限公司 Intelligent eye protection lamp adjusting method adopting neural network model
CN115359497A (en) * 2022-10-14 2022-11-18 景臣科技(南通)有限公司 Call center monitoring alarm method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109598799A (en) * 2018-11-30 2019-04-09 南京信息工程大学 A kind of Virtual cropping method based on CycleGAN
WO2021150017A1 (en) * 2020-01-23 2021-07-29 Samsung Electronics Co., Ltd. Method for interactive segmenting an object on an image and electronic computing device implementing the same
CN114140485A (en) * 2021-11-29 2022-03-04 昆明理工大学 Method and system for generating cutting track of main root of panax notoginseng
CN114742987A (en) * 2022-06-08 2022-07-12 苏州市洛肯电子科技有限公司 Automatic positioning control method and system for cutting of non-metallic materials

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109598799A (en) * 2018-11-30 2019-04-09 南京信息工程大学 A kind of Virtual cropping method based on CycleGAN
WO2021150017A1 (en) * 2020-01-23 2021-07-29 Samsung Electronics Co., Ltd. Method for interactive segmenting an object on an image and electronic computing device implementing the same
CN114140485A (en) * 2021-11-29 2022-03-04 昆明理工大学 Method and system for generating cutting track of main root of panax notoginseng
CN114742987A (en) * 2022-06-08 2022-07-12 苏州市洛肯电子科技有限公司 Automatic positioning control method and system for cutting of non-metallic materials

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115049814A (en) * 2022-08-15 2022-09-13 聊城市飓风工业设计有限公司 Intelligent eye protection lamp adjusting method adopting neural network model
CN115359497A (en) * 2022-10-14 2022-11-18 景臣科技(南通)有限公司 Call center monitoring alarm method and system

Also Published As

Publication number Publication date
CN114898220B (en) 2022-09-09

Similar Documents

Publication Publication Date Title
CN114898220B (en) Intelligent production control method for structural member of overhead working truck
CN109523520B (en) Chromosome automatic counting method based on deep learning
CN111079602A (en) Vehicle fine granularity identification method and device based on multi-scale regional feature constraint
Gakii et al. A classification model for water quality analysis using decision tree
CN108711148B (en) Tire defect intelligent detection method based on deep learning
CN106355192A (en) Support vector machine method based on chaos and grey wolf optimization
CN105374209B (en) A kind of urban area road network running status characteristics information extraction method
CN103473786A (en) Gray level image segmentation method based on multi-objective fuzzy clustering
CN111652167A (en) Intelligent evaluation method and system for chromosome karyotype image
CN113658174B (en) Microkernel histology image detection method based on deep learning and image processing algorithm
CN112560722A (en) Airplane target identification method and device, computer equipment and storage medium
CN111832608A (en) Multi-abrasive-particle identification method for ferrographic image based on single-stage detection model yolov3
CN110288017B (en) High-precision cascade target detection method and device based on dynamic structure optimization
CN115760484A (en) Method, device and system for improving potential danger identification capability of power distribution station area and storage medium
CN104915679A (en) Large-scale high-dimensional data classification method based on random forest weighted distance
CN111784022A (en) Short-time adjacent fog prediction method based on combination of Wrapper method and SVM method
CN106202274B (en) A kind of defective data automatic abstract classification method based on Bayesian network
CN109446964A (en) Face detection analysis method and device based on end-to-end single-stage multiple scale detecting device
US20150242676A1 (en) Method for the Supervised Classification of Cells Included in Microscopy Images
CN117315380B (en) Deep learning-based pneumonia CT image classification method and system
CN115269958A (en) Internet reliability data information acquisition and analysis system
CN117252459A (en) Fruit quality evaluation system based on deep learning
CN115830302A (en) Multi-scale feature extraction and fusion power distribution network equipment positioning identification method
Nagpal et al. An application of deep learning for sweet cherry phenotyping using yolo object detection
CN115293827A (en) Novel model interpretability analysis method for assisting fine operation of enterprise

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant