CN111159279B - Model visualization method, device and storage medium - Google Patents
- Publication number
- CN111159279B CN111159279B CN201911422621.8A CN201911422621A CN111159279B CN 111159279 B CN111159279 B CN 111159279B CN 201911422621 A CN201911422621 A CN 201911422621A CN 111159279 B CN111159279 B CN 111159279B
- Authority
- CN
- China
- Prior art keywords
- network model
- target network
- model
- connection
- neuron
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/26—Visual data mining; Browsing structured data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The embodiment of the application discloses a model visualization method, device, and storage medium. The method includes: acquiring the neurons of a target network model and the connection relationships between the neurons; mapping the neurons to nodes, and mapping the connection relationships between the neurons to connecting lines between the corresponding nodes, so as to construct a node relation diagram corresponding to the target network model, where the connecting lines represent the weights of the corresponding connection relationships; and generating a model image corresponding to the target network model based on the node relation diagram, and outputting the model image.
Description
Technical Field
The present application relates to deep learning technology, and in particular, to a model visualization method, apparatus, and storage medium.
Background
At present, in the field of artificial intelligence, deep learning network models trained through deep learning play an increasingly important role. In fields such as natural language processing, image recognition, and recommendation systems in particular, deep learning network models have driven remarkable progress. However, deep learning network models are often difficult to interpret.
In the related art, visualization tools represented by TensorBoard display data such as the model structure and loss function values of a deep learning network model. Such tools can show the layer structure of the model at the network-layer level, but cannot show the internal state of the model at a finer granularity.
Disclosure of Invention
The embodiment of the application provides a model visualization method, equipment and a storage medium.
In one aspect, the method for visualizing a model provided by the embodiment of the application comprises the following steps:
Acquiring the neurons of a target network model and the connection relationships between the neurons;
Mapping the neurons to nodes, and mapping the connection relationships between the neurons to connecting lines between the corresponding nodes, so as to construct a node relation diagram corresponding to the target network model, where the connecting lines represent the weights of the corresponding connection relationships;
and generating a model image corresponding to the target network model based on the node relation diagram, and outputting the model image.
In one aspect, an electronic device provided by an embodiment of the present application includes:
the acquisition module, configured to acquire the neurons of a target network model and the connection relationships between the neurons;
The construction module, configured to map the neurons to nodes, and map the connection relationships between the neurons to connecting lines between the corresponding nodes, so as to construct a node relation diagram corresponding to the target network model, where the connecting lines represent the weights of the corresponding connection relationships;
The generation module is used for generating a model image corresponding to the target network model based on the node relation diagram;
and the output module is used for outputting the model image.
In one aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory for storing a computer program capable of running on the processor, wherein the processor, when running the computer program, performs the steps of the model visualization method described above.
In one aspect, an embodiment of the present application provides a storage medium, where a model visualization program is stored, where the model visualization program, when executed by a processor, implements the steps of the model visualization method described above.
In the embodiment of the application, the neurons of a target network model and the connection relationships between the neurons are acquired; the neurons are mapped to nodes, and the connection relationships between the neurons are mapped to connecting lines between the corresponding nodes, so as to construct a node relation diagram corresponding to the target network model, where the connecting lines represent the weights of the corresponding connection relationships; a model image corresponding to the target network model is then generated based on the node relation diagram and output. In this way, the node relation diagram corresponding to the network model is constructed from the connection relationships between neurons in the network model, and the degree of connection between neurons is displayed through the connecting lines. The network model is thereby visualized, allowing a user to intuitively understand its structure and improving the user experience.
Drawings
FIG. 1 is a schematic diagram of an alternative implementation flow of a model visualization method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an alternative node relationship according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of an alternative implementation of the model visualization method according to the embodiment of the present application;
FIG. 4 is a schematic diagram of an alternative node relationship according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of an alternative implementation of the model visualization method according to the embodiment of the present application;
FIG. 6 is a schematic diagram of an alternative node relationship according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an alternative node relationship according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a model training process according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a model training process according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an alternative system architecture according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an alternative electronic device according to an embodiment of the present application;
Fig. 12 is a schematic diagram of an alternative structure of an electronic device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples. It is to be understood that the examples provided herein are for the purpose of illustration only and are not intended to limit the application. In addition, the embodiments provided below are some of the embodiments for carrying out the present application, but not all of the embodiments for carrying out the present application, and the technical solutions described in the embodiments of the present application may be implemented in any combination without conflict.
In various embodiments of the application, the neurons of a target network model and the connection relationships between the neurons are acquired; the neurons are mapped to nodes, and the connection relationships between the neurons are mapped to connecting lines between the corresponding nodes, so as to construct a node relation diagram corresponding to the target network model, where the connecting lines represent the weights of the corresponding connection relationships; a model image corresponding to the target network model is then generated based on the node relation diagram and output.
The embodiment of the application provides a model visualization method applied to an electronic device. Each functional module in the electronic device can be implemented cooperatively by hardware resources of the device (such as a terminal device), computing resources such as a processor, and communication resources (such as optical cables and cellular links supporting various communication modes).
The electronic device may be any device having information processing capabilities. In one embodiment, the electronic device may be an intelligent terminal, such as a notebook computer or another mobile terminal with wireless communication capabilities, or an AR/VR device. In another embodiment, the electronic device may also be a non-portable terminal device with computing capabilities, such as a desktop computer or a server.
Of course, embodiments of the present application are not limited to being provided as a method and hardware; they may be implemented in a variety of ways, for example as a storage medium storing instructions for performing the model visualization method provided by embodiments of the present application.
Fig. 1 is a schematic flow chart of an alternative implementation of a model visualization method according to an embodiment of the present application, as shown in fig. 1, where the model visualization method includes:
S101, acquiring the neurons of a target network model and the connection relationships between the neurons.
The electronic device can train the target network model itself, or directly acquire the network model data of the target network model from another device or from the network side. The acquired network model data includes the neurons and the connection relationships between the neurons. The connection relationships between neurons represent the data-transfer relationships of the network model.
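As a minimal sketch of this acquisition step, the network model data can be represented as per-layer weight matrices, from which the neurons and weighted connection relationships are enumerated. The matrix representation and the use of `None` for an absent connection are illustrative assumptions, not a format fixed by the text; a real model checkpoint would need its own parsing.

```python
def extract_connections(weight_matrices):
    """Enumerate (src_neuron, dst_neuron, weight) triples layer by layer.

    weight_matrices[k][i][j] is the weight from neuron i of layer k
    to neuron j of layer k + 1; None marks "no connection" (an assumption).
    """
    connections = []
    for layer, matrix in enumerate(weight_matrices):
        for i, row in enumerate(matrix):
            for j, w in enumerate(row):
                if w is not None:
                    connections.append(((layer, i), (layer + 1, j), w))
    return connections

# Two layers of two neurons each; one of the four possible connections is absent.
weights = [[[0.5, None],
            [0.1, 0.9]]]
conns = extract_connections(weights)
```

The triples returned here are exactly the inputs the later construction step (S102) needs: a source neuron, a destination neuron, and the weight of their connection relationship.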
The target network model in the embodiment of the application can be an artificial neural network, that is, a mathematical model that performs information processing using a structure similar to the synaptic connections of the brain.
The target network model derives an output result based on input features. The input features may be content in various formats, such as text or images. In one example, the target network model is a natural language processing model whose input is a sentence; the model can identify the emotional tendency, expressed meaning, and the like of the input sentence. In another example, the target network model is an image detection model whose input is an image; the model can identify the objects included in the input image, or the position of a specified object in it. In another example, the target network model is an image segmentation model whose input is an image; the model can identify the different objects included in the input image and segment them.
In the embodiment of the present application, the algorithm adopted by the target network model may include: convolutional neural network (CNN), region-based convolutional neural network (R-CNN), Fast R-CNN, Faster R-CNN, Single Shot Detector (SSD), and the like. The embodiment of the application does not limit the algorithm adopted by the target network model.
The target network model may include one or more of an input layer, a convolutional layer, an activation layer, a pooling layer, and a fully connected layer. The embodiment of the application does not limit the hierarchical structure of the target network model.
The target network model includes a plurality of network layers, and each network layer includes one or more neurons; a neuron may have one or more inputs and one output. Here, a neuron can be understood as performing a linear partition. When a connection relationship exists between a neuron of an upper layer and a neuron of a lower layer, the output of the upper-layer neuron is input to the lower-layer neuron, weighted by the weight corresponding to the connection relationship. Upper layer and lower layer are relative concepts. In one example, the target network model includes three layers: layer 1, layer 2, and layer 3; the output of layer 1 is input to layer 2, and the output of layer 2 is input to layer 3. Between layer 1 and layer 2, layer 1 is the upper layer and layer 2 the lower layer; between layer 2 and layer 3, layer 2 is the upper layer and layer 3 the lower layer.
In the embodiment of the present application, the number of layers included in the target network model and the number of neurons in each layer are not limited.
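The neuron model described above — several weighted inputs and one output — can be sketched as follows. The ReLU activation and the bias term are illustrative assumptions; the text only fixes that each input is scaled by the weight of its connection relationship.

```python
def neuron_output(inputs, weights, bias=0.0):
    """One neuron: a weighted sum of its inputs followed by an activation.

    Each input is weighted by the weight of the corresponding connection
    relationship, as described in the text; ReLU is an assumed activation.
    """
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return max(0.0, z)

# Two upper-layer outputs feeding one lower-layer neuron.
y = neuron_output([1.0, 2.0], [0.5, -0.25])
```

An upper-layer neuron's output simply appears as one of the `inputs` of every lower-layer neuron it is connected to, which is the transfer relationship the connecting lines of the node relation diagram will depict.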
S102, mapping the neurons to nodes, and mapping the connection relationships between the neurons to connecting lines between the corresponding nodes, so as to construct a node relation diagram corresponding to the target network model.
The connecting lines represent the weights of the corresponding connection relationships.
After obtaining the neurons of the target network model and the connection relationships between them, each neuron is mapped to a corresponding node, and the connection relationships between neurons are mapped to connecting lines between the corresponding nodes, thereby constructing the node relation diagram corresponding to the target network model.
In the embodiment of the application, the nodes in the node relation diagram can be represented by various node identifiers, such as circles, squares, and triangles; in FIG. 2, circles are used as an example.
Based on the neurons included in the target network model, the electronic device retrieves node identifiers, lines, and a canvas from an image library supported by the system, adds nodes to the canvas, and adds connecting lines between the nodes based on the connection relationships between the neurons, thereby generating the node relation diagram.
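The construction step can be sketched without any drawing library: neurons become nodes, and each connection relationship becomes a weighted edge. The dictionary-based graph structure is an illustrative assumption; a real implementation would place these nodes and lines on a canvas as described above.

```python
def build_node_graph(neurons, connections):
    """Build a node relation diagram as plain data.

    neurons: iterable of neuron identifiers (they become nodes).
    connections: iterable of (src, dst, weight) connection relationships
    (they become edges, each carrying its weight).
    """
    graph = {"nodes": list(neurons), "edges": {}}
    for src, dst, weight in connections:
        graph["edges"][(src, dst)] = weight  # the edge represents the weight
    return graph

# A fragment of the FIG. 2 example: two layer-A neurons feeding one layer-B neuron.
g = build_node_graph(
    ["A1", "A2", "B1"],
    [("A1", "B1", 0.8), ("A2", "B1", 0.3)],
)
```

Rendering then only has to walk `g["edges"]` and draw each line styled according to its stored weight, which is the subject of the marking step (S105) below.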
Here, when a transfer relationship exists between two neurons, a connection relationship exists between them, and a connecting line is drawn between the corresponding nodes. When no transfer relationship exists between two neurons, no connection relationship exists between them, and no connecting line is drawn between the corresponding nodes.
In one example, the target network model includes three layers: layer A, layer B, and layer C. Layer A includes 6 neurons: A1 to A6; layer B includes 6 neurons: B1 to B6; layer C includes 8 neurons: C1 to C8. The connection relationships are as follows: A1 is connected to B1, B2, and B4; A2 is connected to B1 to B4; A3 is connected to B1 to B4 and B6; A4 is connected to B1 to B4 and B6; A5 is connected to B1 to B6; A6 is connected to B1 to B6; B1 is connected to C1 to C7; B2 is connected to C1 to C6 and C8; B3 is connected to C1 to C8; B4 is connected to C1, C2, C3, C5, and C8; B5 is connected to C5, C6, C7, and C8; B6 is connected to C1 to C8.
Neuron A1 is mapped to node A1, neuron A2 to node A2, neuron A3 to node A3, ..., and neuron C8 to node C8; connecting lines are then drawn between the corresponding nodes based on the connection relationships between the neurons. The constructed node relation diagram is shown in FIG. 2.
In the embodiment of the application, the connecting lines between the nodes can represent the weights corresponding to the connection relationships between the corresponding neurons. In one example, the corresponding weight values may be marked on the corresponding connecting lines. In another example, different weight values are represented by different colors. In another example, as shown in FIG. 2, different weight values are represented by different line thicknesses: a thicker connecting line indicates a larger weight, and a thinner connecting line indicates a smaller weight.
The above are merely examples of how the connecting lines can represent the weights of the corresponding connection relationships; in practical applications, this representation is not limited in any way.
In the embodiment of the application, the weights corresponding to the connection relationships can be divided into levels, and the connecting line represents the level to which the weight of the corresponding connection relationship belongs. In one example, taking color as the way a connecting line represents the level, the weight magnitude is divided into four levels: the first, second, third, and fourth levels, in ascending order of weight. When the weight belongs to the first level, the connecting line is black; the second level, green; the third level, yellow; the fourth level, red.
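The four-level color scheme described above can be sketched as a small mapping function. The level-to-color order (black, green, yellow, red for ascending weight) is from the text; the numeric thresholds are illustrative assumptions, since the text does not fix where the level boundaries lie.

```python
LEVEL_COLORS = ["black", "green", "yellow", "red"]  # levels 1..4, ascending weight

def weight_color(weight, thresholds=(0.25, 0.5, 0.75)):
    """Map a weight to the color of its level (thresholds are assumed)."""
    level = sum(weight >= t for t in thresholds)  # counts thresholds passed: 0..3
    return LEVEL_COLORS[level]
```

With these assumed thresholds, a weight of 0.1 falls in the first level (black) and a weight of 0.9 in the fourth (red); changing the `thresholds` tuple rebalances the levels without touching the mapping logic.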
In the embodiment of the application, when the electronic device constructs node relation diagrams corresponding to a plurality of network models, the network models can be classified, and the nodes corresponding to the neurons of different types of network models are marked with different node identifiers.
S103, generating a model image corresponding to the target network model based on the node relation diagram, and outputting the model image.
After the electronic device constructs the node relation diagram, it outputs the diagram in an image format. In one example, the electronic device outputs the node relation diagram in an image format to its own display screen. In another example, the electronic device outputs the node relation diagram in an image format to other devices, so that those devices can display it.
According to the model visualization method provided by the embodiment of the application, the neurons of the target network model and the connection relationships between the neurons are acquired; the neurons are mapped to nodes, and the connection relationships are mapped to connecting lines between the corresponding nodes, so as to construct a node relation diagram corresponding to the target network model, where the connecting lines represent the weights of the corresponding connection relationships; a model image corresponding to the target network model is then generated based on the node relation diagram and output. In this way, the node relation diagram is constructed from the connection relationships between neurons in the network model, and the degree of connection between neurons is displayed through the connecting lines. The network model is thereby visualized, allowing a user to intuitively understand its structure and improving the user experience.
In some embodiments, as shown in fig. 3, the execution of S101 includes:
S300, in the process of iteratively updating the parameters of the target network model based on training data, each time the parameters of the target network model are updated once through the training data, acquiring the connection relationships between the neurons of the updated target network model.
From the connection relationships between neurons acquired during the iterative update of the target network model, a model image set corresponding to the iterative update process of the target network model can be obtained.
In the embodiment of the application, during the training of the target network model, each time the parameters of the target network model are updated, the connection relationships between the neurons are acquired once and a node relation diagram is constructed, until the training of the target network model is completed. In this way, a node relation diagram is obtained for each parameter adjustment in the training process of the target network model.
When training the target network model, training it once over all the data of the training set is called one generation of training, that is, one epoch. Within one epoch, a small portion of the training data in the training set, called a batch, is used to perform one back-propagation update of the weights of the target network model. Updating the parameters of the target network model once through one batch of data can be called one training step on the target network model.
In the process of training the target network model on the training data, every time one training step is performed — that is, every time the parameters of the target network model are updated once through a batch of data — the connection relationships between the neurons of the target network model are acquired once, and a node relation diagram is constructed.
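The per-batch capture described above can be sketched as a loop that backs up a copy of the connection weights after every batch update, so that later updates do not mutate the recorded snapshots. The random perturbation standing in for a real back-propagation update is an illustrative assumption.

```python
import copy
import random

def train_with_snapshots(weights, num_batches, rng):
    """Run `num_batches` stand-in updates, snapshotting weights after each.

    weights: {(src, dst): weight} for each connection relationship.
    The perturbation below is a placeholder for one back-propagation step.
    """
    snapshots = []
    for _ in range(num_batches):
        # One parameter update per batch of data (stand-in for backprop).
        weights = {conn: w + rng.uniform(-0.1, 0.1)
                   for conn, w in weights.items()}
        # Back up a copy, as the text suggests, so training is unaffected.
        snapshots.append(copy.deepcopy(weights))
    return snapshots

snaps = train_with_snapshots({("A1", "B1"): 0.5}, num_batches=4,
                             rng=random.Random(0))
```

Each entry of `snaps` is the raw material for one node relation diagram, giving one model image per training step, as the following paragraphs describe.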
In an example, the node relation diagram after the Nth update of the parameters of the target network model may be as shown in FIG. 2, and the node relation diagram after the (N+1)th update may be as shown in FIG. 4. Comparing FIG. 2 and FIG. 4, after the (N+1)th update the thickness of the connecting line between node A1 and node B2 changes, as do the thicknesses of the connecting lines between node A6 and node B5 and between node B1 and node C4. It can therefore be determined that the weights corresponding to the connections between neuron A1 and neuron B2, between neuron A6 and neuron B5, and between neuron B1 and neuron C4 have changed.
In the embodiment of the application, the connection relationships between the neurons of the target network model can be backed up, and the node relation diagram constructed from the backed-up connection relationships, so that constructing the diagram does not affect the training process of the target network model.
After determining the node relation diagram after each training step, the electronic device obtains the corresponding model image based on the constructed diagram, thereby forming a model image set corresponding to the training process of the target network model.
According to the model visualization method provided by the embodiment of the application, during the training of the target network model, a node relation diagram corresponding to the model after each parameter update is constructed, so that a user can intuitively follow the training situation of the target network model and gain a comprehensive understanding of it.
In some embodiments, the model images corresponding to the target network model are ordered according to an update sequence of the parameters of the target network model in an iterative update process.
In the embodiment of the application, updating the parameters of the target network model means updating the weights corresponding to the connection relationships, so that the output of the target network model becomes more accurate. The model images corresponding to the node relation diagrams generated during training are ordered according to the update sequence of the parameters, so that a user can follow how the weight of each connection relationship changes. When the weight of a connection relationship gradually increases as the parameters are updated, the neurons corresponding to that connection are strongly associated neurons in the target network model, i.e., important neurons; when the weight of a connection relationship gradually decreases, the corresponding neurons are weakly associated neurons, i.e., unimportant neurons in the target network model.
After the model images in the model image set are ordered according to the update sequence of the parameters of the target network model during the iterative update process, each model image can be used as a frame, and the model image set can be rendered as an animation according to that ordering. The user then does not need to step through the model images manually and can intuitively observe the training process of the target network model from the generated animation.
In some embodiments, as shown in fig. 5, after S102, further comprising:
S104, obtaining the weights corresponding to the connection relations between the neurons.
In the embodiment of the application, the weight corresponding to the connection relation can reflect the importance degree of the corresponding neuron in the target network model.
S105, marking the connection lines mapped by the corresponding connection relations according to the weights.
After the electronic device acquires the weights corresponding to the connection relations, it identifies the connection lines mapped by the corresponding connection relations according to the acquired weights.
In the embodiment of the present application, the identification manner for identifying the connection line mapped by the corresponding connection relationship according to the weight includes:
Mode one: the color of the connecting line;
Mode two: a numerical value marked on the connecting line;
Mode three: the thickness of the connecting line.
Taking identification mode one as an example, different weights correspond to different colors, and the color of each connecting line is set according to the weight of its connection relation.
In practical application, the weights corresponding to the connection relations can be classified into levels, and the level to which a weight belongs is represented by the connecting line. In one example, taking the color of the connecting line as the representation of the level, the weights are divided into four levels: the first, second, third, and fourth levels, with the corresponding weight magnitudes increasing in that order. When a weight belongs to the first level, the connecting line is black; to the second level, green; to the third level, yellow; and to the fourth level, red.
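The four-level color scheme above can be sketched as a small mapping (the level boundaries are illustrative assumptions; the method does not fix them):

```python
def weight_color(weight, bounds=(0.5, 1.0, 2.0)):
    """Map a connection weight to a display color by level.

    Level boundaries are illustrative assumptions:
    level 1 -> black, level 2 -> green, level 3 -> yellow, level 4 -> red.
    """
    colors = ["black", "green", "yellow", "red"]
    level = sum(weight >= b for b in bounds)   # 0..3, one step per boundary crossed
    return colors[level]

print(weight_color(0.2))  # black
print(weight_color(0.7))  # green
print(weight_color(1.5))  # yellow
print(weight_color(2.5))  # red
```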
Taking identification mode two as an example, the weight of the corresponding connection relation is marked directly on the connecting line. In one example, the neurons in a target network model include neuron D1, neuron D2, neuron D3, neuron E1, neuron E2, neuron E3 and neuron E4. Neuron D1 has connection relations with neuron E1, neuron E2 and neuron E3, with corresponding weights of 0.2, 0.2 and 0.2, respectively; neuron D2 has connection relations with neuron E1, neuron E2 and neuron E3, with corresponding weights of 1.5, 0.2 and 0.2, respectively; neuron D3 has connection relations with neuron E1, neuron E2 and neuron E3, with corresponding weights of 0.2, 2.5 and 0.3, respectively. Each of these weights is marked on its connecting line.
Taking identification mode three as an example, the magnitude of the weight of the corresponding connection relation is represented by the thickness of the connecting line: the larger the weight, the thicker the connecting line; the smaller the weight, the thinner the connecting line. Representing the weight of the corresponding connection relation by line thickness may be as shown in fig. 2, where the weight corresponding to the connection relation between neuron A2 and neuron B2, the weight corresponding to the connection relation between neuron A3 and neuron B3, and the weight corresponding to the connection relation between neuron B2 and neuron C8 are large.
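A minimal sketch of identification mode three, mapping a weight to a line thickness (the width range and clamping bounds are illustrative assumptions):

```python
def line_width(weight, min_w=0.5, max_w=6.0, w_lo=0.0, w_hi=3.0):
    """Map a connection weight to a line width in pixels: the larger the
    weight, the thicker the connecting line. The width range and the
    clamped weight range are illustrative assumptions."""
    w = max(w_lo, min(weight, w_hi))                       # clamp into [w_lo, w_hi]
    return min_w + (max_w - min_w) * (w - w_lo) / (w_hi - w_lo)

print(line_width(0.0))   # thinnest line
print(line_width(3.0))   # thickest line
print(line_width(2.5) > line_width(0.2))
```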
In some embodiments, after S104, the method further comprises: sorting the weights in descending order; determining target weights of a set number in the sorting result; and identifying the target nodes corresponding to the target neurons, the target neurons being the neurons connected by the connection relations with the target weights.
Here, for a node relation graph, the weights corresponding to all the connection relations in the graph may be sorted; the neurons corresponding to the larger weights are treated as strongly associated neurons, i.e. target neurons, and the nodes corresponding to the target neurons, i.e. target nodes, are identified.
In an example, for the node relation diagram shown in fig. 2, when the set number is 5, nodes A2, B1, B6, A3, B2, C4, and C8 are target nodes.
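The selection of target nodes from the top-ranked weights can be sketched as follows (the connection data are illustrative assumptions; as in the example above, a set number of weights can yield a larger number of nodes, since each connection contributes two endpoints):

```python
def target_nodes(connections, top_k=5):
    """Sort connection weights in descending order, take the top_k target
    weights, and return the nodes at either end of those connections as
    target (strongly associated) nodes.

    `connections` maps a (node_a, node_b) pair to its weight.
    """
    ranked = sorted(connections.items(), key=lambda kv: kv[1], reverse=True)
    targets = set()
    for (a, b), _weight in ranked[:top_k]:
        targets.update((a, b))
    return targets

# Hypothetical weights loosely echoing the thick lines of fig. 2.
conns = {("A2", "B2"): 2.5, ("A3", "B3"): 1.8, ("B2", "C8"): 1.6,
         ("A1", "B1"): 0.2, ("A2", "B1"): 1.1}
print(sorted(target_nodes(conns, top_k=3)))  # ['A2', 'A3', 'B2', 'B3', 'C8']
```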
In the embodiment of the application, the manner of identifying the target node is not limited; it may be identified in any way that distinguishes it from non-target nodes, such as highlighting or color, so that a user can intuitively spot the strongly associated nodes.
In one example, as shown in fig. 7, the target nodes of the node relation diagram shown in fig. 2 are identified with a gray fill, while the other nodes are identified with a white fill, thereby distinguishing target nodes from non-target nodes.
In practical application, after determining the target node, the node information of the target node may be directly displayed on the display page.
The electronic device may set a weight threshold, compare the weight corresponding to the connection line in the node relation graph with the weight threshold, and use the node corresponding to the weight greater than the weight threshold as the target node.
In some embodiments, after S104, further comprising: determining the generalization capability of the target network model according to the change degree of the weight in the training process of the target network model; and outputting the generalization capability of the target network model.
In the embodiment of the application, the degree of change of all weights, or of part of the weights, during the training of the target network model is counted to determine how much the weights of the target network model change during training, and the corresponding generalization capability is determined from that degree of change.
In an example, when calculating the degree of change, the strongly associated neurons corresponding to each node relation graph may be determined from the weights, and the changes of the strongly associated neurons across all node relation graphs counted; after training ends, the number of changes of the strongly associated neurons over the last set number of updates is taken as the degree of change of the weights.
In another example, the weight of each connection relation is collected across the different node relation diagrams and fitted into a curve; the fluctuation degree of each curve is determined, the number of weights whose fluctuation exceeds a fluctuation threshold is counted, and this count is used as the degree of change.
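A minimal sketch of this statistic, using the standard deviation of each weight trajectory as a stand-in for the fitted curve's fluctuation degree (both the measure and the threshold are illustrative assumptions):

```python
def degree_of_change(weight_histories, fluctuation_threshold=0.5):
    """Count how many connection weights fluctuate strongly across the
    node relation graphs recorded during training.

    Fluctuation is measured here as the standard deviation of a weight's
    trajectory; the threshold is an illustrative assumption.
    """
    def std(xs):
        mean = sum(xs) / len(xs)
        return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    return sum(1 for h in weight_histories if std(h) > fluctuation_threshold)

histories = [[0.2, 0.21, 0.19, 0.2],   # mild fluctuation
             [0.1, 1.9, 0.2, 2.1],     # severe fluctuation
             [1.0, 1.0, 1.0, 1.0]]     # constant
print(degree_of_change(histories))  # 1
```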
In the embodiment of the application, the corresponding relation between the variation degree and the generalization capability can be established, and the generalization capability corresponding to the variation degree of the weight of the current target network model can be determined based on the established corresponding relation.
It should be noted that the generalization capability of a network model reflects its prediction capability on data beyond the training set used during training: the stronger the generalization capability, the more accurate the output when the test set is input and the closer it is to the true result; the weaker the generalization capability, the less accurate the output when the test set is input and the larger the gap from the true result.
In some embodiments, after S104, further comprising: and under the condition that the convergence degree of the target network model does not reach the convergence condition, determining suspicious connecting lines corresponding to suspicious connection relations according to the change degree of the weights corresponding to the connection relations in the training process of the target network model, and identifying the suspicious connecting lines.
In the embodiment of the application, when training of the target network model ends without convergence, the degree of change of each weight during training is determined from the weights corresponding to the connection relations. A connection relation whose weight changes severely is treated as a suspicious connection relation that may cause the non-convergence, and its suspicious connecting line is identified distinctively, so that a user can intuitively inspect the cause of the target network model's failure to converge.
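The suspicious-connection screening can be sketched as follows (using the range of each weight trajectory as the measure of severity; both the measure and the threshold are illustrative assumptions):

```python
def suspicious_connections(weight_histories, change_threshold=1.0):
    """When the target network model fails to converge, flag the
    connections whose weights changed most severely during training as
    suspicious. Severity is taken here as the trajectory's range
    (max - min); the threshold is an illustrative assumption."""
    return [conn for conn, hist in weight_histories.items()
            if max(hist) - min(hist) > change_threshold]

histories = {("A2", "B2"): [0.2, 1.9, 0.1, 2.3],    # severe change -> suspicious
             ("A1", "B1"): [0.2, 0.25, 0.22, 0.2]}  # mild change
print(suspicious_connections(histories))  # [('A2', 'B2')]
```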
In practical application, the electronic device may display the node relation diagram together with evaluation results of the network model, such as convergence, interpretability, and generalization capability, on a display interface.
According to the model visualization method provided by the embodiment of the application, while the structure of the network model is visualized, its training situation is presented in real time through the node relation diagram, for example evaluation results such as convergence, interpretability, and generalization capability. A user can thus observe the node relation graph and at the same time check the displayed evaluation results of the network model, improving user experience.
The model visualization method provided by the embodiment of the application is further described below, taking the representation of connection weights by the thickness of the connecting lines as an example.
In the embodiment of the application, the connection relation between the neurons in the network model is mapped into the nodes in the node relation diagram and the connection lines between the nodes, so that the network model is mapped into the node relation diagram, wherein the thickness of the connection lines corresponding to the connection relations is related to the weight of the connection relations. The whole structure of the network model can be shown through the mapped node relation diagram, and connecting lines with different thicknesses are shown according to different weight values among neurons in the network structure.
In the process of training the network model, the data of the network model can be acquired at regular time and mapped into the corresponding node relation diagram, so that the training condition of the network model can be presented in real time through the node relation diagram, for example: convergence, interpretability of the model, generalization capability, etc.
Taking the node relation diagram presenting the convergence of the network model as an example, a user can view the convergence of the network model in real time and intuitively through the presented node relation diagrams. When a weight in the network model varies greatly, it can be determined that the network model has not converged, and which weight causes the non-convergence can be determined.
Taking the node relation diagram presenting the interpretability of the network model as an example, a user can intuitively see the thickness of each connecting line in the node relation diagram. A thick line corresponds to a large weight, and the nodes it connects are strongly associated nodes; these nodes likely play a relatively important role in the prediction result of the network model, meaning the features corresponding to those nodes matter to the prediction. This strengthens the user's understanding of the network model's features and improves the interpretability of the network model.
Taking the node relation diagram presenting the generalization capability of the network model as an example, based on the node relation diagrams generated during training, a user can clearly observe whether the weight values of the network model change severely or gently during training. The gentler the change, the more stable the network model, the stronger its credibility, and the stronger its generalization capability; the more severe the change, the weaker the network model's credibility and the weaker its generalization capability.
As shown in fig. 2, neurons in the network model may be drawn as labeled circles, with the connecting line between two neurons representing the transmission relationship of the neural network and the thickness of the line representing the weight between the two neurons. The thicker the line, the higher the weight, meaning the feature of the input neuron is more important; conversely, the thinner the line, the lower the weight, meaning the feature of the input neuron is less important.
During training of the network model, each parameter update produces one visualized image as one frame; the whole training process thus yields multiple frames. Playing the frames continuously in their order of generation forms an animated display of model training, intuitively presenting the structure of the network model throughout the training process.
In the related art, a training flow of a network model is shown in fig. 8, and includes:
s801, preprocessing training data.
S802, initializing a model.
S803, determining whether to perform training based on the next epoch.
If yes, the training process has not ended, and S804 is executed; if no, the training process has ended, and S806 is executed.
S804, determining whether to update the model parameters based on the next batch.
If yes, the training corresponding to the current epoch has not finished, and S805 is executed; if no, the training corresponding to the current epoch has finished, and S803 is executed.
S805, updating network model parameters.
After the parameters of the network model are updated once, S804 is performed.
S806, evaluating the model and storing the model.
In the model visualization method provided by the embodiment of the application, the training flow of the network model is shown in fig. 9, and S807 and S808 are added on the basis of fig. 8.
S807, network model data backup.
S808, visualizing the network model based on the backup network model data.
In S807, the structural data of the network model and the weights between neurons are copied.
In S808, a picture of the nodes and connection lines is generated from the backed-up network model data and saved as one model training frame.
Here, once S805 is executed, S807 and S808 are executed; that is, each time the parameters of the network model are updated, the network model data is backed up and one visualization is performed based on the backed-up data, generating the node relation graph corresponding to the network model after the current parameter update. The generated node relation graph may be packaged as an image and saved as a model training frame.
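The backup-and-render hook of fig. 9 can be sketched as a training loop (`update_fn` and `render_fn` are hypothetical callbacks standing in for S805 and S808; the weight structure is illustrative):

```python
import copy

def train_with_visualization(model_weights, batches, update_fn, render_fn):
    """Training loop sketch following fig. 9: after every parameter
    update (S805), back up the network model data (S807) and render a
    node relation graph from the backup as one model training frame
    (S808)."""
    frames = []
    for batch in batches:
        model_weights = update_fn(model_weights, batch)   # S805: update parameters
        backup = copy.deepcopy(model_weights)             # S807: back up model data
        frames.append(render_fn(backup))                  # S808: visualize backup
    return model_weights, frames

weights, frames = train_with_visualization(
    {"w": 0}, [1, 2, 3],
    update_fn=lambda w, b: {"w": w["w"] + b},
    render_fn=lambda w: f"frame(w={w['w']})")
print(len(frames))   # 3
print(frames[-1])    # frame(w=6)
```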
The structural design of the model visualization method provided by the embodiment of the application is shown in fig. 10 and comprises: layer 1001, layer 1002, layer 1003, and layer 1004.
Layer 1001 is the image library layer supported by the system, comprising the elements from which the node relation diagram is constructed: circle 10011, line 10012, and canvas 10013.
Layer 1002 is a data abstraction layer comprising: node 10021, connection 10022, and model import block 10023. Wherein the model import block 10023 provides an interface between the system for building a node relation graph and the network model, thereby importing network model data into the system for building a node relation graph.
Layer 1003 is a visualization layer and includes model management 10031 and frame generation 10032, where model management 10031 is used to manage network model data, such as classifying, deleting, modifying, and saving network models. Frame generation 10032 is used to construct a node relation graph based on the network model data and generate the corresponding model frame.
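A minimal sketch of the data abstraction layer's node, connection, and model-import elements (class and function names are illustrative assumptions, not the patent's actual identifiers):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str              # corresponds to node 10021

@dataclass
class Connection:
    src: "Node"             # corresponds to connection 10022
    dst: "Node"
    weight: float

@dataclass
class NodeRelationGraph:
    nodes: list = field(default_factory=list)
    connections: list = field(default_factory=list)

def import_model(neurons, weighted_edges):
    """Model import sketch (cf. model import block 10023): map neurons to
    nodes and neuron connection relations to weighted connection lines."""
    graph = NodeRelationGraph()
    node_by_label = {n: Node(n) for n in neurons}
    graph.nodes = list(node_by_label.values())
    for a, b, w in weighted_edges:
        graph.connections.append(Connection(node_by_label[a], node_by_label[b], w))
    return graph

g = import_model(["A1", "B1"], [("A1", "B1", 0.7)])
print(len(g.nodes), len(g.connections))  # 2 1
```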
Layer 1004 is an animation generation layer, comprising animation generation 10041, which generates a model training animation from the model training frames produced by layer 1003.
According to the model visualization method provided by the embodiment of the application, on the one hand, a user can check the convergence of the model in real time and, when it fails to converge, intuitively determine the possible causes; on another, the method provides a reference for the user to explore the interpretability of the model; on a third, it provides a visual reference for evaluating the generalization capability of the model.
In order to implement the method of the embodiment of the present application, an embodiment of the present application provides an electronic device, as shown in fig. 11, an electronic device 1100 includes:
An obtaining module 1101, configured to obtain the connection relationships between the neurons of a target network model;
The construction module 1102 is configured to map the neurons to nodes, and map connection relationships between the neurons to connection lines between the corresponding nodes, so as to construct a node relationship diagram corresponding to the target network model; the connecting lines can represent the weight of the corresponding connection relation;
A generating module 1103, configured to generate a model image corresponding to the target network model based on the node relation graph;
an output module 1104 for outputting the model image.
In some embodiments, the obtaining module 1101 is further configured to:
in the process of iteratively updating the parameters of the target network model based on training data, each time the parameters of the target network model are updated once by the training data, acquiring the connection relationships between the neurons of the updated target network model;
Based on the connection relationships between neurons acquired during the iterative updating of the target network model, a model image set corresponding to the iterative updating process of the target network model can be obtained.
In some embodiments, the electronic device 1100 further comprises: a sequencing module, configured to: and sequencing the model images corresponding to the target network model according to the updating sequence of the parameters of the target network model in the iterative updating process.
In some embodiments, the electronic device 1100 further comprises: an identification module for:
acquiring weights corresponding to the connection relations among the neurons;
And identifying the connecting lines mapped by the corresponding connection relation according to the weight.
In some embodiments, the identification module is further to:
and setting the thickness degree of the connecting line mapped by the corresponding connection relation according to the weight.
In some embodiments, the electronic device 1100 further comprises: a first evaluation module for:
sorting the weights in descending order;
determining target weights of a set number in the sorting result;
and identifying the target nodes corresponding to the target neurons, the target neurons being the neurons connected by the connection relations with the target weights.
In some embodiments, the electronic device 1100 further comprises: a second evaluation module for:
Determining the generalization capability of the target network model according to the change degree of the weight in the training process of the target network model;
And outputting the generalization capability of the target network model.
In some embodiments, the electronic device 1100 further comprises: a third evaluation module for:
and under the condition that the convergence degree of the target network model does not reach the convergence condition, determining suspicious connecting lines corresponding to suspicious connection relations according to the change degree of the weights corresponding to the connection relations in the training process of the target network model, and identifying the suspicious connecting lines.
It should be noted that the electronic device provided by the embodiment of the present application includes the units described above, and each module included in each unit may be implemented by a processor in the electronic device, or, of course, by a specific logic circuit. In an implementation, the processor may be a Central Processing Unit (CPU), a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), etc.
As shown in fig. 12, an electronic device 1200 provided by an embodiment of the present application includes: a processor 1201, at least one communication bus 1202, a user interface 1203, at least one external communication interface 1204, and a memory 1205. Wherein the communication bus 1202 is configured to enable connected communications between these components. The external communication interface 1204 may include standard wired and wireless interfaces, among others.
Wherein the processor 1201 is configured to execute a control program stored in the memory to implement the following steps:
Acquiring the connection relationships between the neurons of a target network model;
Mapping the neurons into nodes, and mapping the connection relations among the neurons into connection lines among the corresponding nodes so as to construct a node relation diagram corresponding to the target network model; the connecting lines can represent the weight of the corresponding connection relation;
and generating a model image corresponding to the target network model based on the node relation diagram, and outputting the model image.
Accordingly, an embodiment of the present application further provides a storage medium, i.e., a computer readable storage medium, where a model visualization program is stored, where the model visualization program, when executed by a processor, implements the steps of the model visualization method described above.
The description of the electronic device and the storage medium embodiments above is similar to that of the method embodiments described above, with similar advantageous effects as the method embodiments. For technical details not disclosed in the electronic device and computer-readable storage medium embodiments of the present application, please refer to the description of the method embodiments of the present application.
In the embodiment of the present application, if the above-mentioned model visualization method is implemented in the form of a software functional module and sold or used as a separate product, it may also be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in essence or a part contributing to the prior art in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, an optical disk, or other various media capable of storing program codes. Thus, embodiments of the application are not limited to any specific combination of hardware and software.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application. The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are only illustrative; e.g. the division of the units is only one logical function division, and there may be other divisions in practice, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units; can be located in one place or distributed to a plurality of network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may be separately used as one unit, or two or more units may be integrated in one unit; the integrated units may be implemented in hardware or in hardware plus software functional units.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the above method embodiments may be implemented by hardware related to program instructions, and the foregoing program may be stored in a computer readable storage medium, where the program, when executed, performs steps including the above method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read Only Memory (ROM), a magnetic disk or an optical disk, or the like, which can store program codes.
Alternatively, the above-described integrated units of the application may be stored in a computer-readable storage medium if implemented in the form of software functional modules and sold or used as separate products. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in essence, or the part contributing to the prior art, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes: various media capable of storing program codes, such as a removable storage device, a ROM, a magnetic disk, or an optical disk.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (8)
1. A method of visualizing a model, the method comprising:
In the process of iteratively updating parameters of a target network model based on training data, each time the parameters of the target network model are updated once by the training data, acquiring the connection relationships between the neurons of the once-updated target network model;
mapping the neurons into nodes, and mapping the connection relations among the neurons into connection lines among the corresponding nodes so as to construct a node relation diagram corresponding to the target network model; the connecting lines can represent the weight of the corresponding connection relation; the weights characterize the importance degree of the neurons in the network model, and the connection relation between the neurons of the target network model is updated based on the update of the parameters of the target network model;
Generating a model image set corresponding to the target network model training process based on at least one node relation graph;
And ordering the model images in the model image set corresponding to the target network model according to the updating sequence of the parameters of the target network model in the iterative updating process, so as to obtain the animation representing the training process of the target network model.
2. The method of claim 1, the method further comprising:
acquiring weights corresponding to the connection relations among the neurons;
And identifying the connecting lines mapped by the corresponding connection relation according to the weight.
3. The method of claim 2, the identifying the connection lines mapped by the corresponding connection relationships according to the weights, comprising:
and setting the thickness degree of the connecting line mapped by the corresponding connection relation according to the weight.
4. The method of claim 2, the method further comprising:
sorting the weights in descending order;
determining target weights of a set number in the sorting result;
and identifying the target nodes corresponding to the target neurons, the target neurons being the neurons connected by the connection relations with the target weights.
5. The method of claim 2, further comprising:
determining the generalization capability of the target network model according to the degree of change of the weights during the training of the target network model; and
outputting the generalization capability of the target network model.
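The patent does not specify how the "degree of change" of the weights is computed. A simple assumed proxy is the mean absolute change of each weight between consecutive updates:

```python
def weight_change_degree(history):
    """history: list of {connection_id: weight} dicts, one per update.

    Returns the mean absolute change of each weight between consecutive
    updates -- an assumed proxy for the "degree of change" from which
    the patent derives a generalization estimate."""
    deltas = []
    for prev, cur in zip(history, history[1:]):
        for conn in prev:
            deltas.append(abs(cur[conn] - prev[conn]))
    return sum(deltas) / len(deltas) if deltas else 0.0
```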
6. The method of claim 2, further comprising:
when the degree of convergence of the target network model does not satisfy the convergence condition, determining, according to the degree of change of the weights corresponding to the connection relationships during the training of the target network model, the suspicious connection lines corresponding to suspicious connection relationships, and identifying the suspicious connection lines.
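The claim leaves the "suspicious" criterion open. One assumed reading, sketched below, is that when training has not converged, connections whose weights have essentially stopped changing are suspect (e.g. dead units); both this criterion and the threshold value are assumptions, not part of the claim text:

```python
def suspicious_connections(history, threshold=1e-4):
    """history: list of {connection_id: weight} dicts, one per update.

    Flags connections whose weight barely changed over the most recent
    update while the model has not yet converged. The caller is expected
    to check the convergence condition before invoking this."""
    last, prev = history[-1], history[-2]
    return [conn for conn in last
            if abs(last[conn] - prev[conn]) < threshold]
```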
7. An electronic device, comprising:
an acquisition module configured to, in the process of iteratively updating the parameters of a target network model based on training data, acquire the neurons of the target network model after one update of the parameters with the training data and the connection relationships between those neurons;
a construction module configured to map the neurons to nodes and map the connection relationships between the neurons to connection lines between the corresponding nodes, so as to construct a node relation diagram corresponding to the target network model; wherein the connection lines represent the weights of the corresponding connection relationships, the weights characterize the importance of the neurons in the network model, and the connection relationships between the neurons of the target network model are updated as the parameters of the target network model are updated;
a generation module configured to generate a model image set corresponding to the training process of the target network model based on at least one node relation diagram; and
an output module configured to order the model images in the model image set corresponding to the target network model according to the order in which the parameters of the target network model were updated during the iterative updating, to obtain an animation representing the training process of the target network model.
8. A storage medium having stored thereon a model visualization program which, when executed by a processor, implements the steps of the model visualization method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911422621.8A CN111159279B (en) | 2019-12-31 | 2019-12-31 | Model visualization method, device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111159279A CN111159279A (en) | 2020-05-15 |
CN111159279B true CN111159279B (en) | 2024-04-26 |
Family
ID=70560823
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911422621.8A Active CN111159279B (en) | 2019-12-31 | 2019-12-31 | Model visualization method, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111159279B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112035649B (en) * | 2020-09-02 | 2023-11-17 | 腾讯科技(深圳)有限公司 | Question-answering model processing method and device, computer equipment and storage medium |
CN112215355B (en) * | 2020-10-09 | 2024-09-13 | 广东弓叶科技有限公司 | Neural network model optimization system and method |
CN112819160B (en) * | 2021-02-24 | 2023-10-31 | 文远鄂行(湖北)出行科技有限公司 | Visualization method, device and equipment for neural network model and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109002879A (en) * | 2018-07-23 | 2018-12-14 | 济南浪潮高新科技投资发展有限公司 | The visual modeling method and device of neural network model |
CN109145288A (en) * | 2018-07-11 | 2019-01-04 | 西安电子科技大学 | Text deep feature extraction method based on a variational auto-encoding model
CN110245980A (en) * | 2019-05-29 | 2019-09-17 | 阿里巴巴集团控股有限公司 | The method and apparatus for determining target user's exiting form based on neural network model |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11556794B2 (en) * | 2017-08-31 | 2023-01-17 | International Business Machines Corporation | Facilitating neural networks |
- 2019-12-31: CN patent application CN201911422621.8A granted as CN111159279B (status: Active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109145288A (en) * | 2018-07-11 | 2019-01-04 | 西安电子科技大学 | Text deep feature extraction method based on a variational auto-encoding model
CN109002879A (en) * | 2018-07-23 | 2018-12-14 | 济南浪潮高新科技投资发展有限公司 | The visual modeling method and device of neural network model |
CN110245980A (en) * | 2019-05-29 | 2019-09-17 | 阿里巴巴集团控股有限公司 | The method and apparatus for determining target user's exiting form based on neural network model |
Also Published As
Publication number | Publication date |
---|---|
CN111159279A (en) | 2020-05-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111754596B (en) | Editing model generation method, device, equipment and medium for editing face image | |
CN111159279B (en) | Model visualization method, device and storage medium | |
CN112434721A (en) | Image classification method, system, storage medium and terminal based on small sample learning | |
CN108345587B (en) | Method and system for detecting authenticity of comments | |
CN111597884A (en) | Facial action unit identification method and device, electronic equipment and storage medium | |
WO2021203865A1 (en) | Molecular binding site detection method and apparatus, electronic device and storage medium | |
CN112330684B (en) | Object segmentation method and device, computer equipment and storage medium | |
CN109522970B (en) | Image classification method, device and system | |
CN110728319B (en) | Image generation method and device and computer storage medium | |
CN111027610B (en) | Image feature fusion method, apparatus, and medium | |
CN113408570A (en) | Image category identification method and device based on model distillation, storage medium and terminal | |
CN116580257A (en) | Feature fusion model training and sample retrieval method and device and computer equipment | |
CN113205017A (en) | Cross-age face recognition method and device | |
CN112818995A (en) | Image classification method and device, electronic equipment and storage medium | |
CN112418256A (en) | Classification, model training and information searching method, system and equipment | |
CN113449840A (en) | Neural network training method and device and image classification method and device | |
CN113408564A (en) | Graph processing method, network training method, device, equipment and storage medium | |
CN112183303A (en) | Transformer equipment image classification method and device, computer equipment and medium | |
CN117475253A (en) | Model training method and device, electronic equipment and storage medium | |
CN113536845A (en) | Face attribute recognition method and device, storage medium and intelligent equipment | |
CN113076755B (en) | Keyword extraction method, keyword extraction device, keyword extraction equipment and storage medium | |
CN117011219A (en) | Method, apparatus, device, storage medium and program product for detecting quality of article | |
CN110826726B (en) | Target processing method, target processing device, target processing apparatus, and medium | |
CN113822291A (en) | Image processing method, device, equipment and storage medium | |
CN114332955B (en) | Pedestrian re-identification method and device and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||