CN117454957B - Reasoning training system for image processing neural network model - Google Patents

Reasoning training system for image processing neural network model Download PDF

Info

Publication number
CN117454957B
CN117454957B CN202311781565.3A CN202311781565A CN117454957B CN 117454957 B CN117454957 B CN 117454957B CN 202311781565 A CN202311781565 A CN 202311781565A CN 117454957 B CN117454957 B CN 117454957B
Authority
CN
China
Prior art keywords
node
neural network
image processing
data
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311781565.3A
Other languages
Chinese (zh)
Other versions
CN117454957A (en
Inventor
张卫平
丁洋
张伟
王晶
李显阔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Global Digital Group Co Ltd
Original Assignee
Global Digital Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Global Digital Group Co Ltd filed Critical Global Digital Group Co Ltd
Priority to CN202311781565.3A priority Critical patent/CN117454957B/en
Publication of CN117454957A publication Critical patent/CN117454957A/en
Application granted granted Critical
Publication of CN117454957B publication Critical patent/CN117454957B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/3003Monitoring arrangements specially adapted to the computing system or computing system component being monitored
    • G06F11/3006Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system is distributed, e.g. networked systems, clusters, multiprocessor systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/3058Monitoring arrangements for monitoring environmental properties or parameters of the computing system or of the computing system component, e.g. monitoring of power, currents, temperature, humidity, position, vibrations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/3065Monitoring arrangements determined by the means or processing involved in reporting the monitored data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/06Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00Computing arrangements using knowledge-based models
    • G06N5/04Inference or reasoning models
    • G06N5/046Forward inferencing; Production systems
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Neurology (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an inference training system for an image processing neural network model, which relates to the field of electric digital data processing and comprises a model access module, a model analysis module, a training monitoring module, an inference analysis module and a model feedback module, wherein the model access module is used for being in butt joint with an image processing neural network model, the model analysis module is used for analyzing the layout condition of neuron nodes in the image processing neural network model, the training monitoring module is used for monitoring the parameter change condition of the neuron nodes in the training process of the image processing neural network model, the inference analysis module is used for carrying out inference analysis on the parameter change condition, and the model feedback module is used for feeding back the inference analysis result to the image processing neural network model; the invention accelerates the learning speed of the image processing neural network model and improves the problems of computer resource waste and poor calculation instantaneity.

Description

Reasoning training system for image processing neural network model
Technical Field
The invention relates to the field of electric digital data processing, in particular to an inference training system for an image processing neural network model.
Background
The neural network model is a calculation model inspired by a biological neural system, has strong learning and pattern recognition capability, can be used for processing the problems of complex image information, unclear image and ambiguous reasoning rules by utilizing the neural network model to recognize or classify the image, but the conventional neural network model for processing the image needs a large amount of data and time to achieve the learning effect in the training process, has low efficiency, wastes computer resources and has poor calculation instantaneity, and an auxiliary system is needed to improve the learning efficiency and improve the problems of wasting computer resources and having poor calculation instantaneity.
The foregoing discussion of the background art is intended to facilitate an understanding of the present invention only. This discussion is not an admission or admission that any of the material referred to was common general knowledge.
Many systems for training neural networks have been developed, and through extensive searching and referencing, existing training systems have been found to have a system as disclosed in publication number CN104978601B, which generally includes a coordinating device and a preset number of computing devices; the coordination equipment is used for synchronously controlling the computing equipment according to the layer of the neural network model; and each computing device is used for processing the nodes divided into the computing devices in the corresponding layers in the neural network model according to the training sequence of the neural network model under the synchronous control of the coordination device according to the layers of the neural network model, and transmitting the data generated by the processing nodes to the model storage device or the computing device where the node of the next layer connected with the node of the device is located until the training of the input training sample is finished. However, the system still adopts a traditional training mode, and does not mine the relation between nodes, so that the learning efficiency is required to be improved.
Disclosure of Invention
The invention aims at providing an inference training system for an image processing neural network model aiming at the defects.
The invention adopts the following technical scheme:
an inference training system for an image processing neural network model comprises a model access module, a model analysis module, a training monitoring module, an inference analysis module and a model feedback module; the image processing neural network model is used for classifying or identifying input image data;
the model access module is used for interfacing with an image processing neural network model, the model analysis module is used for analyzing the layout condition of the neuron nodes in the image processing neural network model, the training monitoring module is used for monitoring the parameter change condition of the neuron nodes in the training process of the image processing neural network model, the reasoning analysis module is used for carrying out reasoning analysis on the parameter change condition, and the model feedback module is used for feeding back the reasoning analysis result to the image processing neural network model;
the model analysis module comprises a node identification unit, a network relation synchronization unit and a node data storage unit, wherein the node identification unit is used for identifying neuron nodes in the image processing neural network model, the network relation synchronization unit is used for synchronously recording connection relations among the neuron nodes, and the node data storage unit is used for creating a storage space for each neuron node and storing parameter change condition data;
the training monitoring module comprises an event monitoring unit and a node conversion unit, wherein the event monitoring unit is used for responding to parameter change events in neuron nodes, and the node conversion unit is used for converting node information into corresponding storage space information and sending new parameters into the corresponding storage spaces;
the reasoning analysis module comprises a data reading unit, a data preprocessing unit and a data analysis unit, wherein the data reading unit is used for reading image data from the node data storage unit, the data preprocessing unit is used for preprocessing the read image data, and the data analysis unit is used for carrying out reasoning analysis on the preprocessed image data;
further, the data preprocessing unit comprises a normalization calculation processor and a difference calculation processor, wherein the normalization calculation processor is used for normalizing all data of the nodes into values in the same interval, and the difference calculation processor is used for calculating the difference value of two adjacent values;
the normalization calculation processor calculates a normalization value Da of one data according to the following formula:
wherein, max represents the maximum value of the parameter in the node, min represents the minimum value of the parameter in the node, ld is the interval length, and da is the original parameter value;
the difference calculation processor calculates the obtained difference value according to DaA representation;
further, the data analysis unit comprises a feature calculation processor, a node classification processor, a node line retrieval processor and a strong logic analysis processor, wherein the feature calculation processor is used for calculating fluctuation features of each node, the node classification processor is used for carrying out weak correlation classification on the nodes according to the fluctuation features, the node line retrieval processor is used for retrieving an inference node line from the classified nodes, and the strong logic analysis processor is used for carrying out strong logic analysis on the inference node line to screen out the inference node line meeting the requirements;
further, the fluctuation feature is a vector Cw, and the elements in the vector areWherein->Is->The number of groups of consecutive positive numbers, +.>Is->The characteristic calculation processor calculates ++according to the following formula>And->
Wherein,cumulative value representing the successive positive numbers of group i, < >>Cumulative value representing the successive negatives of group i, N being +.>Is the number of (3);
further, the strong logic analysis processor calculates and processes two nodes connected on the inference node line to obtain a strong correlation value Q:
wherein,is of strong coefficient->I-th difference value representing a node, < ->An ith difference representing another node;
the strong logic analysis processor refers to two nodes with Q larger than 0 as having strong logics, and sends the reasoning node lines with the strong logics continuously to the model feedback module, wherein the sent reasoning node lines at least comprise m nodes, and m is a reasoning base value.
The beneficial effects obtained by the invention are as follows:
the system monitors the training process of the image processing neural network model and extracts the parameter change condition in each node, performs reasoning analysis on the parameter change condition to obtain reasoning node lines, feeds the reasoning node lines back to the image processing neural network model, wherein the reasoning node lines are continuous nodes with strong logic relations and are selected from a large number of nodes, the reasoning relations among the nodes can be reflected, the training batch required when the function of the image processing neural network model is qualified can be greatly reduced through optimizing the node parameters on the reasoning node lines, the operation time is saved, the training efficiency and effect of the image processing neural network model on the image are improved, the system overcomes the defects of wasting computer resources and poor calculation instantaneity, and the operation performance of electronic equipment is improved.
For a further understanding of the nature and the technical aspects of the present invention, reference should be made to the following detailed description of the invention and the accompanying drawings, which are provided for purposes of reference only and are not intended to limit the invention.
Drawings
FIG. 1 is a schematic diagram of the overall structural framework of the present invention;
FIG. 2 is a schematic diagram of a model access module according to the present invention;
FIG. 3 is a schematic diagram of a model analysis module according to the present invention;
FIG. 4 is a schematic diagram of a training monitor module according to the present invention;
fig. 5 is a schematic diagram of the reasoning analysis module of the present invention.
Detailed Description
The following embodiments of the present invention are described in terms of specific examples, and those skilled in the art will appreciate the advantages and effects of the present invention from the disclosure herein. The invention is capable of other and different embodiments and its several details are capable of modification and variation in various respects, all without departing from the spirit of the present invention. The drawings of the present invention are merely schematic illustrations, and are not intended to be drawn to actual dimensions. The following embodiments will further illustrate the related art content of the present invention in detail, but the disclosure is not intended to limit the scope of the present invention.
Embodiment one: the embodiment provides an inference training system for an image processing neural network model, which comprises a model access module, a model analysis module, a training monitoring module, an inference analysis module and a model feedback module, and is combined with fig. 1; the image processing neural network model is used for classifying or identifying input image data;
the model access module is used for interfacing with an image processing neural network model, the model analysis module is used for analyzing the layout condition of the neuron nodes in the image processing neural network model, the training monitoring module is used for monitoring the parameter change condition of the neuron nodes in the training process of the image processing neural network model, the reasoning analysis module is used for carrying out reasoning analysis on the parameter change condition, and the model feedback module is used for feeding back the reasoning analysis result to the image processing neural network model;
the model analysis module comprises a node identification unit, a network relation synchronization unit and a node data storage unit, wherein the node identification unit is used for identifying neuron nodes in the image processing neural network model, the network relation synchronization unit is used for synchronously recording connection relations among the neuron nodes, and the node data storage unit is used for creating a storage space for each neuron node and storing parameter change condition data;
the training monitoring module comprises an event monitoring unit and a node conversion unit, wherein the event monitoring unit is used for responding to parameter change events in neuron nodes, and the node conversion unit is used for converting node information into corresponding storage space information and sending new parameters into the corresponding storage spaces;
the reasoning analysis module comprises a data reading unit, a data preprocessing unit and a data analysis unit, wherein the data reading unit is used for reading data from the node data storage unit, the data preprocessing unit is used for preprocessing the read data, and the data analysis unit is used for carrying out reasoning analysis on the preprocessed data;
the data preprocessing unit comprises a normalization computing processor and a difference computing processor, wherein the normalization computing processor is used for normalizing all data of the nodes into values in the same interval, and the difference computing processor is used for computing the difference value of two adjacent values;
the normalization calculation processor calculates a normalization value Da of one data according to the following formula:
wherein, max represents the maximum value of the parameter in the node, min represents the minimum value of the parameter in the node, ld is the interval length, and da is the original parameter value;
the difference calculation processor calculates the obtained difference value according to DaA representation;
the data analysis unit comprises a characteristic calculation processor, a node classification processor, a node line retrieval processor and a strong logic analysis processor, wherein the characteristic calculation processor is used for calculating fluctuation characteristics of each node, the node classification processor is used for carrying out weak correlation classification on the nodes according to the fluctuation characteristics, the node line retrieval processor is used for retrieving an reasoning node line from the classified nodes, and the strong logic analysis processor is used for carrying out strong logic analysis on the reasoning node line to screen out the reasoning node line meeting the requirements;
the fluctuation feature is a vector Cw, and the elements in the vector areWherein->Is thatGroup of consecutive positive numbersCount (n)/(l)>Is->The characteristic calculation processor calculates ++according to the following formula>And->
Wherein,cumulative value representing the successive positive numbers of group i, < >>Cumulative value representing the successive negatives of group i, N being +.>Is the number of (3);
the strong logic analysis processor calculates and processes two nodes connected on the reasoning node line to obtain a strong correlation value Q:
wherein,is of strong coefficient->I-th difference value representing a node, < ->An ith difference representing another node;
the strong logic analysis processor refers to two nodes with Q larger than 0 as having strong logics, and sends the reasoning node lines with the strong logics continuously to the model feedback module, wherein the sent reasoning node lines at least comprise m nodes, and m is a reasoning base value.
Embodiment two: the embodiment comprises the whole content of the first embodiment, and provides an inference training system for an image processing neural network model, which comprises a model access module, a model analysis module, a training monitoring module, an inference analysis module and a model feedback module;
the model access module is used for interfacing with an image processing neural network model, the model analysis module is used for analyzing the layout condition of the neuron nodes in the image processing neural network model, the training monitoring module is used for monitoring the parameter change condition of the neuron nodes in the training process of the image processing neural network model, the reasoning analysis module is used for carrying out reasoning analysis on the parameter change condition, and the model feedback module is used for feeding back the reasoning analysis result to the image processing neural network model;
referring to fig. 2, the model access module includes an interface connection unit, a connection control unit and an information storage unit, wherein the information storage unit is used for inputting and storing network information of an image processing neural network model, the interface connection unit completes connection with the image processing neural network model according to the network information, and the connection control unit is used for selecting the network information from the information storage unit, sending the network information to the interface connection unit and controlling the connection state;
referring to fig. 3, the model analysis module includes a node identification unit, a network relationship synchronization unit, and a node data storage unit, where the node identification unit is configured to identify neuronal nodes in the image processing neural network model, the network relationship synchronization unit is configured to synchronously record connection relationships between the neuronal nodes, and the node data storage unit creates a storage space for each neuronal node to store parameter change condition data;
referring to fig. 4, the training monitoring module includes an event monitoring unit for responding to a parameter change event in a neuron node, and a node conversion unit for converting node information into corresponding storage space information and transmitting new parameters to the corresponding storage space;
referring to fig. 5, the reasoning analysis module includes a data reading unit, a data preprocessing unit and a data analysis unit, where the data reading unit is used to read data from the node data storage unit, the data preprocessing unit is used to preprocess the read data, and the data analysis unit is used to perform reasoning analysis on the preprocessed data;
the model feedback module comprises a result storage unit and a result feedback unit, wherein the result storage unit is used for receiving and storing the reasoning analysis result, and the result feedback unit is used for sending the latest reasoning analysis result to the image processing neural network model;
the node identification unit comprises an element feature register, a feature comparison processor and a node information reading processor, wherein the element feature register is used for storing feature information of a neuron node constituent element, the feature comparison processor is used for comparing node feature information in an image processing neural network model and identifying a neuron node, and the node information reading processor is used for reading network information of the neuron node and sending the network information to the network relation synchronization unit;
the network relation synchronization unit comprises a network information receiving processor and a node network synchronization processor, wherein the network information receiving processor is used for receiving the network information sent by the node information reading processor, and the node network synchronization processor is used for analyzing the network information and constructing a node mapping network;
the node network synchronization processor maps the node information to obtain a virtual node, connects the virtual node according to the upper node information and the lower node information, and forms a node mapping network by the virtual node;
the node data storage unit comprises a space creation processor and a space retrieval processor, wherein the space creation processor creates a storage space according to each virtual node number, and the space retrieval processor is used for recording the corresponding relation between the virtual node number and the storage space address and retrieving the corresponding storage space address when receiving the virtual node number;
the event monitoring unit comprises an event triggering processor and a data acquisition processor, wherein the event triggering processor is used for monitoring parameter values in neuron nodes and sending a triggering signal to the data acquisition processor when the parameter values are changed, the data acquisition processor acquires data from corresponding neuron nodes after receiving the triggering signal and sends the data to the node conversion unit, and the acquired data comprises neuron node information and new parameter values;
the node conversion unit comprises a conversion inquiry processor and a data transmission processor, wherein the conversion inquiry processor is used for inquiring corresponding virtual node numbers from the network relation synchronization unit and corresponding storage space address information from the node data storage unit, and the data transmission processor is used for transmitting new parameter values to the corresponding storage spaces;
the data preprocessing unit comprises a normalization computing processor and a difference computing processor, wherein the normalization computing processor is used for normalizing all data of the nodes into values in the same interval, and the difference computing processor is used for computing the difference value of two adjacent values;
the normalization calculation processor calculates a normalization value Da of one data according to the following formula:
wherein, max represents the maximum value of the parameter in the node, min represents the minimum value of the parameter in the node, ld is the interval length, and da is the original parameter value;
the difference calculation processor calculates the obtained difference value according to DaA representation;
the data analysis unit pairThe reasoning analysis process of (1) comprises the following steps:
s1, calculating fluctuation characteristics of each node;
s2, carrying out weak correlation classification on the nodes according to the fluctuation characteristics;
s3, acquiring the connection relation of the nodes from the network relation synchronization unit, and retrieving an inference node line from the classified nodes according to the connection relation, wherein the nodes on the inference node line are sequentially connected and belong to the same weak correlation classification;
s4, carrying out strong logic analysis on the reasoning node line, and sending the reasoning node line meeting the requirements to the model feedback module as an analysis result;
the data analysis unit comprises a characteristic calculation processor, a node classification processor, a node line retrieval processor and a strong logic analysis processor, wherein the characteristic calculation processor is used for executing a step S1, the node classification processor is used for executing a step S2, the node line retrieval processor is used for executing a step S3, and the strong logic analysis processor is used for executing a step S4;
the fluctuation feature is a vector Cw, and the elements in the vector areWherein->Is thatThe number of groups of consecutive positive numbers, +.>Is->The characteristic calculation processor calculates ++according to the following formula>And->
Wherein,cumulative value representing the successive positive numbers of group i, < >>Cumulative value representing the successive negatives of group i, N being +.>Is the number of (3);
the node classification processors will have the sameAnd->The vectors of (2) are classified into one class, and a weak correlation value P is calculated for two vectors in the same class:
wherein,and->Is +.>And->Value of->And->Is +.>Anda value;
the node classification processor continues to divide vectors in the same class into subclasses, so that weak correlation values of any two vectors in each subclass are smaller than a threshold value, and the threshold value is set by a worker;
the strong logic analysis processor calculates and processes two nodes connected on the reasoning node line to obtain a strong correlation value Q:
wherein,is of strong coefficient->I-th difference value representing a node, < ->An ith difference representing another node;
strong coefficientIs set by the staff at his own discretion, but->More than 1 is needed;
the strong logic analysis processor refers to two nodes with Q larger than 0 as having strong logics, and sends the reasoning node lines with the strong logics continuously to the model feedback module, wherein the sent reasoning node lines at least comprise m nodes, and m is a reasoning base value.
Embodiment III: the application scene of the system can be used for image classification and image recognition neural network models in image processing; selecting a neural network model of image classification;
the method comprises the steps of docking a neural network model for image classification with the system, identifying nodes in the neural network model for image classification, and then creating a synchronous network and a corresponding storage space according to an identification result;
inputting a large number of images into a neural network model of image classification for training, and monitoring and storing parameter change conditions of each node in a storage space by the system in the training process;
after image training of one batch is carried out, the system analyzes data in a storage space, finds out reasoning node lines in the neural network model of image classification according to the method in the second embodiment, the weight of a node on each reasoning node line in judging the same type of image is higher than that of a common node, the information of the reasoning node lines is fed back to the neural network model of image classification, the neural network model of image classification optimizes node parameters on the reasoning node lines, and each optimization process is equivalent to image training of a plurality of batches;
the image training and the process of finding out the reasoning node line and optimizing the node parameters are repeated continuously, so that the training efficiency and the effect of the neural network model of image classification on the image are improved, the operation efficiency of the electronic equipment is improved, and the defects of wasting computer resources and poor calculation instantaneity are avoided.
Image training is a process completed inside a butted model, which belongs to the prior art and is not the content of the application;
finding out the reasoning node line is already described in detail in the second embodiment, and will not be described in detail in this embodiment;
the following describes the process of optimizing node parameters:
the variation of the jth parameter of the ith node on the reasoning node line is recorded aspa(i,j);
And optimizing the parameters on the reasoning node line according to the following steps:
wherein Tn is the number of changes of training seed parameters in one batch,pa (i) is the parameter value of the current ith node, and Pa' (i) is the parameter value of the optimized ith node;
if a neural network model for image classification originally needs to be trained for 10 batches to achieve a qualified effect, after the neural network model is processed by the method, parameters are optimized once after each batch is trained, the qualified effect can be achieved only by training 5 batches, and the time for parameter optimization is far less than the time for training one batch, so that the technical effect of efficient training is achieved, and values 10 and 5 are only used for illustration for understanding the training effect and do not represent actual situations;
the model algorithm is optimized after the model is deeply understood by improving a neural network model for image processing, but the training batch required when the function of the image classification neural network model is qualified is greatly reduced by butting the neural network model for image classification processing, so that the operation time is saved, the model algorithm can be directly optimized without learning and knowing the image classification model, and the training efficiency of the image classification neural network model on the image is improved;
by reducing training times, the system saves operation time and improves the performance of the electronic equipment;
i and j appear herein as ordinal numbers used to represent sequence numbers.
The foregoing disclosure is only a preferred embodiment of the present invention and is not intended to limit the scope of the invention, so that all equivalent technical changes made by applying the description of the present invention and the accompanying drawings are included in the scope of the present invention, and in addition, elements in the present invention can be updated as the technology develops.

Claims (5)

1. The reasoning training system for the image processing neural network model is characterized by comprising a model access module, a model analysis module, a training monitoring module, a reasoning analysis module and a model feedback module; the image processing neural network model is used for classifying or identifying input image data;
the model access module is used for interfacing with an image processing neural network model, the model analysis module is used for analyzing the layout condition of the neuron nodes in the image processing neural network model, the training monitoring module is used for monitoring the parameter change condition of the neuron nodes in the training process of the image processing neural network model, the reasoning analysis module is used for carrying out reasoning analysis on the parameter change condition, and the model feedback module is used for feeding back the reasoning analysis result to the image processing neural network model;
the model analysis module comprises a node identification unit, a network relation synchronization unit and a node data storage unit, wherein the node identification unit is used for identifying neuron nodes in the image processing neural network model, the network relation synchronization unit is used for synchronously recording connection relations among the neuron nodes, and the node data storage unit is used for creating a storage space for each neuron node and storing parameter change condition data;
the training monitoring module comprises an event monitoring unit and a node conversion unit, wherein the event monitoring unit is used for responding to parameter change events in neuron nodes, and the node conversion unit is used for converting node information into corresponding storage space information and sending new parameters into the corresponding storage spaces;
the reasoning analysis module comprises a data reading unit, a data preprocessing unit and a data analysis unit, wherein the data reading unit is used for reading data from the node data storage unit, the data preprocessing unit is used for preprocessing the read data, and the data analysis unit is used for carrying out reasoning analysis on the preprocessed data.
2. The inference training system for an image processing neural network model of claim 1, wherein the data preprocessing unit includes a normalization computation processor for normalizing all data of a node to values within a same interval and a differential computation processor for computing differences between adjacent two values;
the normalization calculation processor calculates a normalization value Da of one data according to the following formula:
wherein, max represents the maximum value of the parameter in the node, min represents the minimum value of the parameter in the node, ld is the interval length, and da is the original parameter value;
the difference calculation processor calculates the obtained difference value according to DaAnd (3) representing.
3. The inference training system for an image processing neural network model according to claim 2, wherein the data analysis unit includes a feature calculation processor for calculating a fluctuation feature of each node, a node classification processor for classifying the nodes weakly according to the fluctuation feature, a node line search processor for searching out an inference node line from the classified nodes, and a strong logic analysis processor for screening out the inference node line conforming to the requirement by performing a strong logic analysis on the inference node line.
4. An inference training system for an image processing neural network model as claimed in claim 3, wherein the fluctuation feature is a vector Cw, the elements in the vector beingWherein->Is->The number of groups of consecutive positive numbers, +.>Is->The characteristic calculation processor calculates ++according to the following formula>And->
Wherein,cumulative value representing the successive positive numbers of group i, < >>Cumulative value representing the successive negatives of group i, N beingIs a number of (3).
5. The inference training system for an image processing neural network model of claim 4, wherein the strong logic analysis processor performs a calculation process on two nodes connected on an inference node line to obtain a strong correlation value Q:
wherein,is of strong coefficient->I-th difference value representing a node, < ->An ith difference representing another node;
the strong logic analysis processor refers to two nodes with Q larger than 0 as having strong logics, and sends the reasoning node lines with the strong logics continuously to the model feedback module, wherein the sent reasoning node lines at least comprise m nodes, and m is a reasoning base value.
CN202311781565.3A 2023-12-22 2023-12-22 Reasoning training system for image processing neural network model Active CN117454957B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311781565.3A CN117454957B (en) 2023-12-22 2023-12-22 Reasoning training system for image processing neural network model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311781565.3A CN117454957B (en) 2023-12-22 2023-12-22 Reasoning training system for image processing neural network model

Publications (2)

Publication Number Publication Date
CN117454957A CN117454957A (en) 2024-01-26
CN117454957B true CN117454957B (en) 2024-03-22

Family

ID=89589535

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311781565.3A Active CN117454957B (en) 2023-12-22 2023-12-22 Reasoning training system for image processing neural network model

Country Status (1)

Country Link
CN (1) CN117454957B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015158198A1 (en) * 2014-04-17 2015-10-22 北京泰乐德信息技术有限公司 Fault recognition method and system based on neural network self-learning
CN110334799A (en) * 2019-07-12 2019-10-15 电子科技大学 Integrated ANN Reasoning and training accelerator and its operation method are calculated based on depositing
CN111191769A (en) * 2019-12-25 2020-05-22 中国科学院苏州纳米技术与纳米仿生研究所 Self-adaptive neural network training and reasoning device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015158198A1 (en) * 2014-04-17 2015-10-22 北京泰乐德信息技术有限公司 Fault recognition method and system based on neural network self-learning
CN110334799A (en) * 2019-07-12 2019-10-15 电子科技大学 Integrated ANN Reasoning and training accelerator and its operation method are calculated based on depositing
CN111191769A (en) * 2019-12-25 2020-05-22 中国科学院苏州纳米技术与纳米仿生研究所 Self-adaptive neural network training and reasoning device

Also Published As

Publication number Publication date
CN117454957A (en) 2024-01-26

Similar Documents

Publication Publication Date Title
CN114241282B (en) Knowledge distillation-based edge equipment scene recognition method and device
CN111881714A (en) Unsupervised cross-domain pedestrian re-identification method
US20220375213A1 (en) Processing Apparatus and Method and Storage Medium
CN112132197B (en) Model training, image processing method, device, computer equipment and storage medium
CN110163258A (en) A kind of zero sample learning method and system reassigning mechanism based on semantic attribute attention
CN112633382A (en) Mutual-neighbor-based few-sample image classification method and system
CN114298122B (en) Data classification method, apparatus, device, storage medium and computer program product
CN111524140B (en) Medical image semantic segmentation method based on CNN and random forest method
CN114358250A (en) Data processing method, data processing apparatus, computer device, medium, and program product
Venegas et al. Automatic ladybird beetle detection using deep-learning models
Naumenko et al. Information-extreme machine training of on-board recognition system with optimization of RGB-component digital images
CN113496148A (en) Multi-source data fusion method and system
CN111291785A (en) Target detection method, device, equipment and storage medium
CN113743251B (en) Target searching method and device based on weak supervision scene
CN117454957B (en) Reasoning training system for image processing neural network model
CN114462526B (en) Classification model training method and device, computer equipment and storage medium
CN114638953B (en) Point cloud data segmentation method and device and computer readable storage medium
CN112487927B (en) Method and system for realizing indoor scene recognition based on object associated attention
CN115661923A (en) Domain generalization pedestrian re-identification method of self-adaptive modeling domain features
CN114758135A (en) Unsupervised image semantic segmentation method based on attention mechanism
CN111401519B (en) Deep neural network unsupervised learning method based on similarity distance in object and between objects
CN113537339A (en) Method and system for identifying symbiotic or associated minerals based on multi-label image classification
CN112837701B (en) Voice emotion recognition method based on multi-classifier interactive learning
CN112784674B (en) Cross-domain identification method of key personnel search system based on class center self-adaption
CN114298168B (en) Three-dimensional point cloud processing method and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant