CN115392482A - Deep learning model selection method, device and equipment - Google Patents


Info

Publication number
CN115392482A
Authority
CN
China
Prior art keywords
training
deep learning
model
verification
testing
Prior art date
Legal status
Granted
Application number
CN202211025489.9A
Other languages
Chinese (zh)
Other versions
CN115392482B (en)
Inventor
吕成器
周再达
叶浩晨
张文蔚
陈恺
Current Assignee
Shanghai AI Innovation Center
Original Assignee
Shanghai AI Innovation Center
Priority date
Filing date
Publication date
Application filed by Shanghai AI Innovation Center
Priority to CN202211025489.9A
Priority claimed from CN202211025489.9A
Publication of CN115392482A
Application granted
Publication of CN115392482B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a method for selecting a deep learning model: different deep learning models are trained, verified and tested, the test results are compared, and the deep learning model meeting preset conditions is selected, wherein the training, verification and testing of the different deep learning models are all realized by the same executor. First, the executor constructs the modules required by a given deep learning algorithm: a model, a data set, a training loop controller, a verification loop controller, a test loop controller and a hook; then the training, verification or test loop controller is called according to preset logic to train, verify or test the model. Because no dedicated flow needs to be developed for each deep learning algorithm, and various deep learning algorithms can be conveniently realized by calling the corresponding modules, the advantages and disadvantages of different algorithms can be easily compared and the most appropriate model determined.

Description

Deep learning model selection method, device and equipment
Technical Field
The invention relates to the technical field of computer vision, in particular to a method, a device and equipment for selecting a deep learning model.
Background
Model training, verification and testing based on a deep learning framework generally require the following steps: deep learning environment initialization, data set reading, model initialization, model weight loading, construction of the optimizer and parameter scheduler, and so on. In practical applications, the optimal algorithm often differs for different data inputs; therefore, in order to obtain an optimal model, different deep learning algorithms usually need to be trained, verified and tested.
At present, different deep learning algorithms usually each implement their own training, verification and testing flows without a unified implementation, so the code of different deep learning algorithm libraries contains much redundancy. A user working with multiple deep learning algorithms may encounter different obstacles in the training processes of the different algorithms.
Disclosure of Invention
In view of some or all of the problems in the prior art, an aspect of the present invention provides a method for selecting a deep learning model, including: training, verifying and testing different deep learning models, comparing the test results, and selecting the deep learning model meeting preset conditions, wherein the training, verification and testing of the different deep learning models are all realized by the same executor, and the training, verification and testing of any deep learning model comprise:
constructing, by the executor, the required modules according to a given deep learning algorithm, the modules comprising: a model, a data set, a training loop controller, a verification loop controller, a test loop controller, and a hook; and
calling the training loop controller, the verification loop controller or the test loop controller according to preset logic to train, verify or test the model.
Further, the model training comprises:
iterating repeatedly over the data set up to a preset maximum number of iteration rounds, and performing the forward inference, backward propagation and optimization of the model.
Further, the verification comprises:
verifying the trained model at a preset moment and outputting evaluation indexes on the verification set, wherein the executor adjusts the hyper-parameters of the deep learning training before the next round of iterative training according to the evaluation indexes.
Further, the testing comprises:
testing the trained model at a preset moment and outputting evaluation indexes on the test set.
Further, the method comprises querying, modifying or calling the input or output parameters of the corresponding steps through the hook, wherein the hook is arranged before or after one or more of the following steps:
module construction, training, validation, testing, any training round, any training iteration, any validation round, any validation iteration, any testing round, and any testing iteration.
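The hook arrangement above can be illustrated with a small sketch: a base class with one no-op method per point location, which a concrete hook overrides only where needed. All identifiers here are assumptions of the sketch, not names from the disclosure:

```python
class Hook:
    """Base hook: one no-op method per point location; a subclass
    overrides only the point locations it needs."""
    def before_run(self, executor): pass
    def after_run(self, executor): pass
    def before_train_epoch(self, executor): pass
    def after_train_epoch(self, executor): pass
    def before_train_iter(self, executor): pass
    def after_train_iter(self, executor): pass
    def before_val_iter(self, executor): pass
    def after_val_iter(self, executor): pass
    def before_test_iter(self, executor): pass
    def after_test_iter(self, executor): pass


class IterCounterHook(Hook):
    """Example hook: counts training iterations without modifying
    the training code itself."""
    def __init__(self):
        self.count = 0

    def after_train_iter(self, executor):
        self.count += 1


def call_hooks(hooks, point, executor=None):
    """Invoke the method named `point` on every registered hook."""
    for hook in hooks:
        getattr(hook, point)(executor)
```

A loop would call, for example, `call_hooks(hooks, 'after_train_iter')` at the corresponding point location, so that querying, modifying or logging happens without touching the loop body.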
Further, the modules further comprise: an optimizer, a parameter scheduler, and an evaluator.
Further, the data sets include a training data set, a validation data set, and a test data set.
Further, the training loop controller is used for calling the training data set once per round or per iteration to perform one round of training or one iteration; and/or
the verification loop controller is used for calling the verification data set and the evaluator once at a preset moment to perform one iteration so as to verify the model; and/or
the test loop controller is used for calling the test data set and the evaluator once at a preset moment to perform one iteration so as to test the model.
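The loop controllers above can be sketched as subclasses of one base class, each holding a data-set attribute and exposing a single run interface; how to cycle over the data set is defined in the run function. All names below are illustrative assumptions:

```python
class BaseLoop:
    """Loop-controller base class: holds a data-set attribute and
    exposes a single `run` interface; subclasses define how to
    cycle over the data set."""
    def __init__(self, dataset):
        self.dataset = dataset

    def run(self, model):
        raise NotImplementedError


class TrainLoop(BaseLoop):
    """Calls the training data set once per round, for a preset
    maximum number of rounds."""
    def __init__(self, dataset, max_epochs):
        super().__init__(dataset)
        self.max_epochs = max_epochs

    def run(self, model):
        for _ in range(self.max_epochs):
            for batch in self.dataset:
                model.train_step(batch)   # forward, backward, optimize
        return model


class TestLoop(BaseLoop):
    """Calls the test data set once: forward inference only, and the
    inference results are stored."""
    def run(self, model):
        return [model.predict(batch) for batch in self.dataset]
```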
Another aspect of the present invention provides an apparatus for selecting a deep learning model, including:
a selection module, configured to select a deep learning model according to the test results of different deep learning models; and
a learning module, configured to train, verify and test different deep learning models, and comprising:
a preprocessing module, configured to construct the modules required by a given deep learning algorithm, comprising a model, a data set, a loop controller and a hook; and
a calling module, configured to call the specified module according to preset logic so as to train, verify and test the model.
The invention also provides an electronic device comprising a memory and a processor, wherein the memory is used for storing a computer program which, when run by the processor, executes the method for selecting a deep learning model as described above.
The invention also provides a computer-readable storage medium, in which a computer program is stored, which, when run on a processor, performs the method of selecting a deep learning model as described above.
According to the method, apparatus and equipment for selecting a deep learning model provided by the invention, the training, verification and testing of different deep learning algorithms can all be realized by one executor, so that the execution flow of deep learning tasks is standardized and unified. The corresponding model and dependent modules can be constructed by the executor according to the requirements of different types of algorithms, so that the execution flows of different algorithms can be realized; the scheme is extensible and can effectively reduce the development burden of deep learning algorithms. Meanwhile, hooks arranged before and after the different steps of the whole task flow make the output of the different steps visible, for example the model parameters and evaluation results obtained after each iteration; according to these model parameters and/or evaluation results, the hyper-parameters of the deep learning model and/or the deep learning task can be adjusted by manual intervention, so that the training speed and/or precision of the model are improved.
When the deep learning method is applied to technical fields such as image and speech recognition or classification, on the one hand the overall running speed can be effectively improved; on the other hand, with the hooks in place, the model can be verified or tested at any moment of training, so the current training effect can be obtained in time and the relevant parameters manually adjusted according to actual requirements, thereby accelerating training. Furthermore, because the model can be verified and tested in time and the model parameters and hyper-parameters adjusted according to the verification and test results, the performance of the finally trained model can be effectively improved, further increasing the recognition or classification accuracy. In addition, in practical application, the method does not require developing a dedicated flow for each deep learning algorithm: various deep learning algorithms can be conveniently realized by calling the corresponding modules, so the advantages and disadvantages of different algorithms are easy to compare and the most appropriate model can be determined.
Drawings
To further clarify the above and other advantages and features of various embodiments of the present invention, a more particular description of various embodiments of the invention will be rendered by reference to the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. In the drawings, the same or corresponding parts will be denoted by the same or similar reference numerals for clarity.
FIG. 1 is a flow chart illustrating a method for selecting a deep learning model according to an embodiment of the invention;
FIG. 2 is a schematic diagram of the structure of the modules constructed by the executor according to an embodiment of the invention; and
FIGS. 3a-3c are schematic views of the hook point locations according to an embodiment of the invention.
Detailed Description
In the following description, the present invention is described with reference to various embodiments. One skilled in the relevant art will recognize, however, that the embodiments can be practiced without one or more of the specific details, or with other alternative and/or additional methods or components. In other instances, well-known structures or operations are not shown, or are not described in detail, to avoid obscuring aspects of the invention. Similarly, for purposes of explanation, specific numbers and configurations are set forth in order to provide a thorough understanding of the embodiments of the invention. However, the invention is not limited to these specific details.
Reference in the specification to "one embodiment" or "the embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
It should be noted that the method steps are described in a specific order according to the embodiments of the present invention, which is only for the purpose of illustrating the specific embodiments and not for limiting the sequence of the steps. On the contrary, in different embodiments of the present invention, the sequence of the steps may be adjusted according to actual requirements.
The invention is based on the following insight of the inventors: in existing processes that use deep learning to recognize and classify sound and image targets, in order to select the most appropriate deep learning model, multiple sets of training, verification and testing flows are often built for different deep learning algorithms so as to compare their effects, which makes the overall efficiency low. The inventors found that the execution logic of different types of deep learning algorithms, or of different deep learning frameworks, is similar even though the models adopted differ; and for a given algorithm the overall flows of model training, verification and testing are also similar, differing only in the data set used and in how often the data set is called. Therefore, if the steps involved in model training, verification and testing can be modularized, they can be flexibly combined by calls to obtain the execution flows of different algorithms and different stages, and various customized training strategies can be supported for the convenience of users; this reduces the development burden of deep learning algorithms and effectively improves development efficiency.
Based on this, in one embodiment of the invention, a general deep learning task executor is designed, responsible for environment initialization, module construction and loop-controller invocation. A loop controller iterates over the corresponding data set to complete model training, verification or testing. In addition, at each point location in the loop the corresponding hook function can be called to invoke, query or modify parameters or hyper-parameters, so that model parameters can be modified manually, model convergence accelerated, and training efficiency and model precision improved. In one embodiment of the invention, every module in the executor can be customized according to a given deep learning algorithm or other requirements, so that the execution flows of various algorithms can be combined.
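Such a general executor can be sketched in miniature as follows. The three interfaces follow the description (train, val, test), but every identifier is an assumption of this sketch rather than an identifier from the disclosure:

```python
class Executor:
    """General deep learning task executor: holds the constructed
    modules and exposes the three interfaces train, val and test."""

    def __init__(self, model, datasets, optimizer, evaluator, hooks=()):
        self.model = model
        self.datasets = datasets      # e.g. {'train': ..., 'val': ..., 'test': ...}
        self.optimizer = optimizer
        self.evaluator = evaluator
        self.hooks = list(hooks)

    def call_hook(self, point):
        # Each point location invokes the matching hook function, if any.
        for hook in self.hooks:
            fn = getattr(hook, point, None)
            if fn:
                fn(self)

    def train(self, max_epochs):
        self.call_hook('before_train')
        for _ in range(max_epochs):
            for batch in self.datasets['train']:
                loss = self.model.forward(batch)   # forward inference
                self.model.backward(loss)          # backward propagation
                self.optimizer.step()              # optimization
        self.call_hook('after_train')
        return self.model

    def val(self):
        preds = [self.model.forward(b) for b in self.datasets['val']]
        return self.evaluator(preds)   # evaluation indexes on the verification set

    def test(self):
        preds = [self.model.forward(b) for b in self.datasets['test']]
        return self.evaluator(preds)   # evaluation indexes on the test set
```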
The invention is further described in the following with reference to the drawings of the embodiments.
Fig. 1 is a flowchart illustrating a method for selecting a deep learning model according to an embodiment of the present invention. As shown in Fig. 1, the method includes:
First, in step 101, the models are trained, validated and tested. As described above, in the embodiment of the present invention the training, verification and testing of different deep learning models are implemented by one general executor, which mainly provides three interfaces: train, val (validate) and test. The train interface runs the training loop controller to complete model training and output a trained model; the val interface runs the verification loop controller to complete model verification and output evaluation indexes on the verification set; the test interface runs the test loop controller to complete model testing and output evaluation indexes on the test set. Fig. 2 is a schematic structural diagram of the modules constructed by the executor according to an embodiment of the present invention. As shown in Fig. 2, besides the necessary modules, such as the model, the optimizer and the data set, the executor additionally comprises two kinds of modules: loop controllers and hooks. A loop controller defines the iterative process of an algorithm over the corresponding data set; for example, the training loop controller may iterate repeatedly over the training data up to the set maximum number of iterations, carrying out the forward inference, backward propagation and optimization of the model, while the test loop controller only uses the model for forward inference on the test data set and stores the inference results. Based on this structure, the training, verification and testing of each deep learning model comprise the following steps:
first, initializing the environment through the executor, where different frameworks and hardware environments are initialized for different deep learning algorithms;
next, constructing the model through the executor;
next, loading the data sets, where a training data set, a verification data set and a test data set are loaded respectively, since different data sets are usually adopted for training, verification and testing;
next, constructing the dependent modules, for example an optimizer during training or an evaluator during testing; a parameter scheduler may also be constructed to implement the updating and scheduling of the hyper-parameters of deep learning training;
next, initializing the model weights or loading pre-trained weights;
next, constructing the loop controllers. Since different loop logic is adopted for training, verification and testing, in one embodiment of the present invention a training loop controller, a verification loop controller and a test loop controller need to be constructed respectively. In an embodiment of the invention, the training, verification and test loop controllers are obtained by instantiating the same loop-controller base class; each loop controller has a data-set attribute and exposes a run interface to the outside, and how to cycle over the data set is defined in the run function. By implementing different run functions, a loop controller can define how the model iterates over the corresponding data set. In one embodiment of the invention, the training loop controller is configured to invoke the training data set once per round or per iteration for one round of training or one iteration. In an embodiment of the present invention, the verification loop controller is configured to invoke the verification data set and the evaluator once for one iteration at a preset moment, for example after a training round or after any iteration, so as to verify the model. In an embodiment of the present invention, the test loop controller is configured to invoke the test data set and the evaluator once for one iteration at a preset moment, for example after all training rounds are finished, so as to test the model;
then, registering the hooks. Through a hook, the input or output parameters of the corresponding step can be queried, modified or called, which on the one hand makes the output of different steps visible, such as the model parameters and evaluation results obtained after each iteration, and on the other hand allows the hyper-parameters of the deep learning model and/or the deep learning task to be adjusted by manual intervention according to these model parameters and/or evaluation results, thereby improving the training speed and/or precision of the model. Figs. 3a-3c show schematic views of the hook point locations according to one embodiment of the present invention. As shown, a hook may be arranged before or after one or more of the following steps: run, training, verification, testing, any training round, any training iteration, any verification round, any verification iteration, any test round, and any test iteration. For example, the point before_run, before the run, refers to the point before the executor executes a task; at this point the initialization parameters of the model may be modified, or log recording added to the executor. The point after_run refers to the point after the executor has executed the task; the model weights or the run log may be saved there. The point before_train refers to the point before the training loop starts; there the auxiliary modules used during training, such as a model-parameter smoothing module for improving training precision, are initialized. The point after_train refers to the point after the training loop is completed and is used to close those auxiliary modules. The point before_train_epoch refers to the point before each training round and can be used to adjust training parameters, such as adjusting the learning rate before each round, or adjusting the random seed of data loading before each round to change the randomness. The point after_train_epoch refers to the point after each training round and is generally used to empty the cache after each round to reduce memory occupation; it can also be used to save the model weights once after each round of training. The point before_train_iter refers to the point before each single training iteration; it can be used to modify the data samples used in this iteration for data augmentation, and/or to insert a timer flag to time the single iteration. The point after_train_iter refers to the point after each single training iteration; it may be used to decide, according to the batch number, whether to save the model during the iteration, and/or to update the parameters of the model smoothing module, and/or to insert a timer flag to time the single iteration. Some models have different internal logic during training and inference, so in one embodiment of the present invention a point before_val may further be set before verification starts, to switch the state of the model from training mode to inference mode; the point after_val, after verification is completed, is mainly used to clean the verification cache. The point before_val_epoch refers to the point before each verification round starts and can be used to control the selection of the verification set so as to realize cross-validation; the point after_val_epoch is the point after each verification round is completed and can be used to judge the verification results and adjust the training strategy. The point before_val_iter is the point before each single verification iteration, where a timing point can be inserted to collect verification iteration time; the point after_val_iter is the point after each single verification iteration, where a timing point can be inserted to collect verification iteration time, and/or the data and model outputs of the batch can be passed to the evaluator to compute the evaluation indexes. Similarly, the point before_test, before testing starts, is used to switch the state of the model from training mode to inference mode; the point after_test, after testing is finished, is used to clean the test cache. The point before_test_iter is the point before each single test iteration, where a timing point can be inserted to collect test iteration time; the point after_test_iter is the point after each single test iteration, where a timing point can be inserted to collect test iteration time, and/or the data and model outputs of the batch can be passed to the evaluator to compute the evaluation indexes. In addition, the points before_test_epoch and after_test_epoch can be set before and after each test round;
finally, running the loop controllers: the training loop controller, the verification loop controller or the test loop controller is called according to preset logic to train, verify or test the model. Model training mainly means iterating repeatedly over the data set up to a preset maximum number of rounds, carrying out the forward inference, backward propagation and optimization of the model. Verification means verifying the trained model at a preset moment and outputting evaluation indexes on the verification set, the executor adjusting the hyper-parameters of the deep learning training before the next round of iterative training according to these indexes. Testing means testing the trained model at a preset moment and outputting evaluation indexes on the test set; and
Finally, in step 102, the test results are compared and a model is selected: the deep learning model meeting the preset conditions is chosen to perform subsequent operations such as image and sound recognition and classification. In an embodiment of the present invention, parameters such as the accuracy, precision and recall of each model may be considered comprehensively to select a deep learning model meeting the requirements; in other embodiments, only one parameter may be compared according to actual requirements, for example selecting the model with the best accuracy, precision or recall.
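The comparison-and-selection step can be sketched as a small helper. The metric names (accuracy, precision, recall) follow the description; everything else, including the function name and argument shapes, is an assumption of this sketch:

```python
def select_model(test_results, criteria=None, metric='accuracy'):
    """Select the deep learning model meeting the preset conditions.

    `test_results` maps each model name to its evaluation indexes, e.g.
    {'model_a': {'accuracy': 0.91, 'recall': 0.50}, ...}.  By default the
    model with the best single metric is returned; alternatively a
    `criteria` callable can combine several indexes into one
    comprehensive score.
    """
    score = criteria or (lambda indexes: indexes[metric])
    return max(test_results, key=lambda name: score(test_results[name]))
```

For a comprehensive comparison, a weighted combination of accuracy, precision and recall could be passed as the `criteria` callable; for a single-parameter comparison, only `metric` is needed.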
In an embodiment of the present invention, a deep learning model selection apparatus for implementing the above selection method is also provided; the apparatus includes a selection module and a learning module. The selection module is configured to select a deep learning model according to the test results of different deep learning models. The learning module is configured to train, verify and test different deep learning models, and comprises a preprocessing module and a calling module, wherein the preprocessing module is configured to construct the modules required by a given deep learning algorithm, including a model, a data set, a loop controller and a hook, and the calling module is configured to call the specified module according to preset logic to train, verify and test the model.
In an embodiment of the present invention, an electronic device is also provided, which includes a memory and a processor, wherein the memory is used for storing a computer program which, when run by the processor, executes the method for selecting a deep learning model as described above.
In an embodiment of the present invention, there is also provided a computer readable storage medium storing a computer program which, when run on a processor, performs the method of selecting a deep learning model as described above.
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various combinations, modifications, and changes can be made thereto without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention disclosed herein should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims (11)

1. A method for selecting a deep learning model, characterized by comprising: training, verifying and testing different deep learning models, comparing the test results, and selecting the deep learning model meeting preset conditions, wherein the training, verification and testing of the different deep learning models are all realized by the same executor, and the training, verification and testing of any deep learning model comprise the following steps:
constructing, by the executor, the required modules according to a given deep learning algorithm, the modules comprising: a model, a data set, a training loop controller, a verification loop controller, a test loop controller, and a hook; and
calling the training loop controller, the verification loop controller or the test loop controller according to preset logic to train, verify or test the model.
2. The selection method of claim 1, wherein the model training comprises:
and (4) repeatedly iterating on the data set according to a preset maximum iteration turn, and carrying out forward reasoning and backward propagation and optimization on the model.
3. The selection method of claim 1, wherein the verifying comprises:
and verifying the trained model at a preset moment, outputting an evaluation index on a verification set, and adjusting the hyper-parameters of the deep learning training by the actuator before the next iteration training according to the evaluation index.
4. The selection method of claim 1, wherein the testing comprises:
and testing the trained model at a preset moment, and outputting evaluation indexes on the test set.
5. The selection method of claim 1, further comprising querying, modifying or invoking input or output parameters of corresponding steps through the hook, wherein the hook is disposed before or after one or more of:
module construction, training, validation, testing, any training round, any training iteration, any validation round, any validation iteration, any testing round, and any testing iteration.
6. The selection method of claim 1, wherein the modules further comprise: an optimizer, a parameter scheduler, and an evaluator.
7. The selection method of claim 1, wherein the data sets include a training data set, a validation data set, and a test data set, which are used for model training, validation, and testing, respectively.
8. The selection method of claim 7, wherein the training loop controller is configured to invoke the training data set once per epoch or per iteration to perform one epoch or one iteration of training; and/or
the validation loop controller is configured to invoke the validation data set and the validation evaluator once at a preset moment to perform one validation pass of the model; and/or
the test loop controller is configured to invoke the test data set and the test evaluator once at a preset moment to perform one test pass of the model.
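Claims 7 and 8 together bind each loop controller to exactly one data split: the training loop consumes the training split step by step, while the validation and test loops each make one evaluation pass with their evaluator. A minimal sketch under assumed names (`make_loop`, the `splits` dict):

```python
# Each loop controller is bound to its own data split; loops without an
# evaluator iterate training steps, loops with one run a single pass.
def make_loop(split, evaluator=None):
    def loop():
        if evaluator is None:
            return [f"step({x})" for x in split]   # training iterations
        return evaluator(split)                    # one evaluation pass
    return loop

splits = {"train": [1, 2, 3], "val": [4], "test": [5, 6]}
train_loop = make_loop(splits["train"])
val_loop = make_loop(splits["val"], evaluator=lambda s: {"n": len(s)})
test_loop = make_loop(splits["test"], evaluator=lambda s: {"n": len(s)})
```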
9. An apparatus for selecting a deep learning model, comprising:
a selection module configured to select a deep learning model according to test results of different deep learning models; and
a learning module configured to train, validate and test different deep learning models, and comprising:
a pre-processing module configured to build the modules required for a given deep learning algorithm, including models, datasets, loop controllers, and hooks; and
a calling module configured to call the specified modules according to preset logic to perform model training, validation and testing.
10. An electronic device comprising a memory and a processor, wherein the memory is configured to store a computer program which, when executed by the processor, performs the method of selecting a deep learning model according to any one of claims 1 to 8.
11. A computer-readable storage medium, in which a computer program is stored which, when run on a processor, performs the method of selecting a deep learning model according to any one of claims 1 to 8.
CN202211025489.9A 2022-08-25 Deep learning model selection method, device and equipment Active CN115392482B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211025489.9A CN115392482B (en) 2022-08-25 Deep learning model selection method, device and equipment


Publications (2)

Publication Number Publication Date
CN115392482A true CN115392482A (en) 2022-11-25
CN115392482B CN115392482B (en) 2024-06-28

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110110848A (en) * 2019-05-05 2019-08-09 武汉烽火众智数字技术有限责任公司 A kind of combination forecasting construction method and device
US20190370684A1 (en) * 2018-06-01 2019-12-05 Sas Institute Inc. System for automatic, simultaneous feature selection and hyperparameter tuning for a machine learning model
CN112330029A (en) * 2020-11-08 2021-02-05 上海海洋大学 Fishing ground prediction calculation method based on multilayer convLSTM
CN112472101A (en) * 2020-10-16 2021-03-12 浙江大学山东工业技术研究院 Deep learning electrocardiogram data classification method and device based on conversion technology
US20210264256A1 (en) * 2020-02-25 2021-08-26 Robert Bosch Gmbh Method, device and computer program for predicting a suitable configuration of a machine learning system for a training data set
CN114202500A (en) * 2020-09-02 2022-03-18 卡尔蔡司显微镜有限责任公司 Microscope system and method for monitoring the learning process of a machine learning model
CN114708540A (en) * 2022-04-24 2022-07-05 上海人工智能创新中心 Data processing method, terminal and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yuan Mingxin et al.: "Apple maturity recognition combining transfer learning and an adaptive learning rate", Journal of Chinese Agricultural Mechanization, no. 11, 30 November 2019 (2019-11-30), pages 137-141 *

Similar Documents

Publication Publication Date Title
US20190050734A1 (en) Compression method of deep neural networks
US20210081763A1 (en) Electronic device and method for controlling the electronic device thereof
US20190073591A1 (en) Execution of a genetic algorithm having variable epoch size with selective execution of a training algorithm
CN109978836B (en) User personalized image aesthetic feeling evaluation method, system, medium and equipment based on meta learning
WO2020197529A1 (en) Inverse and forward modeling machine learning-based generative design
Wang et al. Cooling strategies for the moment-generating function in Bayesian global optimization
US20070179917A1 (en) Intelligent design optimization method and system
CN111124916B (en) Model training method based on motion semantic vector and electronic equipment
JP7267966B2 (en) Information processing device and information processing method
CN115392482A (en) Deep learning model selection method, device and equipment
CN115392482B (en) Deep learning model selection method, device and equipment
KR20220032861A (en) Neural architecture search method and attaratus considering performance in hardware
CN112633516B (en) Performance prediction and machine learning compiling optimization method and device
CN111859785B (en) Fluid feature extraction method, system, computer-readable storage medium and device
CN114330119A (en) Deep learning-based pumped storage unit adjusting system identification method
JPH06215163A (en) Method and device for supporting neural network construction, neural network test device and chemical feeding control system
KR20230012790A (en) Method and apparatus for function optimization
CN113435572A (en) Construction method of self-evolution neural network model for intelligent manufacturing industry
CN117933104B (en) Solid attitude and orbit control engine gas regulating valve pressure correction method
JPH0561848A (en) Device and method for selecting and executing optimum algorithm
CN115374910B (en) Method, device and equipment for updating deep learning training super-parameters
US20230169239A1 (en) Device and method for improving simulator parameter
CN117235477B (en) User group evaluation method and system based on deep neural network
CN113657592B (en) Software-defined satellite self-adaptive pruning model compression method
CN117237241B (en) Chromosome enhancement parameter adjustment method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant