WO2021223686A1 - Model training task processing method and apparatus, electronic device, and storage medium - Google Patents

Model training task processing method and apparatus, electronic device, and storage medium

Info

Publication number
WO2021223686A1
Authority
WO
WIPO (PCT)
Prior art keywords
model training
training task
calculation amount
model
real
Prior art date
Application number
PCT/CN2021/091635
Other languages
French (fr)
Chinese (zh)
Inventor
陈伯辉
Original Assignee
深圳市万普拉斯科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市万普拉斯科技有限公司
Publication of WO2021223686A1 publication Critical patent/WO2021223686A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • This application relates to the technical field of mobile terminals, and in particular to a method, device, electronic device, and storage medium for processing model training tasks.
  • Machine learning can be roughly divided into three parts: determining a model, training the model, and using the model.
  • Training the model is an important part of machine learning.
  • Traditional model training generally executes a training task as soon as the new task is added, and the amount of computation involved in model training is generally large, which results in low training efficiency for model training tasks.
  • A method for processing model training tasks, the method comprising: when a mobile terminal receives a model training task to be processed, identifying the model category corresponding to the model training task; determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount; and acquiring the real-time usage status of the mobile terminal, and controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • In an embodiment, the acquiring the real-time usage status of the mobile terminal includes: acquiring temperature information and real-time running process information of the mobile terminal; and determining the real-time usage status of the mobile terminal according to the temperature information and the real-time running process information.
  • In an embodiment, the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount includes: when the model category corresponding to the model training task is a neural network model, obtaining the calculation amount level corresponding to the model training task as the highest level; when the model category corresponding to the model training task is a non-neural network model, obtaining the calculation amount level corresponding to the model training task by running the model training task; and determining the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: obtaining the provided calculation amount data corresponding to the real-time usage status; comparing the provided calculation amount data with the calculation amounts corresponding to the model training tasks, and filtering out the model training target tasks within the range that the provided calculation amount data can bear; and calling the computing resources corresponding to the provided calculation amount data to execute the model training target tasks.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: calling a preset first process, and obtaining executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and respectively calling different preset second processes to execute the executable model training tasks, the second processes being different processes from the first process.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: calling a preset process, and obtaining executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and calling the same process to execute the executable model training tasks.
  • a model training task processing device comprising:
  • the task recognition module is used to recognize the model category corresponding to the model training task when the mobile terminal receives the to-be-processed model training task;
  • the calculation amount determination module is configured to determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
  • the task control module is used to obtain the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • In an embodiment, the task control module is also used to obtain temperature information and real-time running process information of the mobile terminal, and to determine the real-time usage status of the mobile terminal according to the temperature information and the real-time running process information.
  • In an embodiment, the calculation amount determination module is further configured to obtain the calculation amount level corresponding to the model training task as the highest level when the model category corresponding to the model training task is a neural network model; to obtain the calculation amount level corresponding to the model training task by running the model training task when the model category is a non-neural network model; and to determine the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  • In an embodiment, the task control module is further configured to obtain the provided calculation amount data corresponding to the real-time usage status; to compare the provided calculation amount data with the calculation amounts corresponding to the model training tasks and filter out the model training target tasks within the range that the provided calculation amount data can bear; and to call the computing resources corresponding to the provided calculation amount data to execute the model training target tasks.
  • In an embodiment, the task control module is also used to call a preset first process and obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and to respectively call different preset second processes to execute the executable model training tasks, the second processes being different processes from the first process.
  • In an embodiment, the task control module is also used to call a preset process and obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and to call the same process to execute the executable model training tasks.
  • An electronic device includes a memory and a processor, the memory stores a computer program, and when the processor executes the computer program, a method for processing a model training task is implemented, and the method includes:
  • when the mobile terminal receives a model training task to be processed, identifying the model category corresponding to the model training task; determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount; and acquiring the real-time usage status of the mobile terminal, and controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • In an embodiment, the acquiring the real-time usage status of the mobile terminal includes: acquiring temperature information and real-time running process information of the mobile terminal; and determining the real-time usage status of the mobile terminal according to the temperature information and the real-time running process information.
  • In an embodiment, the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount includes: when the model category corresponding to the model training task is a neural network model, obtaining the calculation amount level corresponding to the model training task as the highest level; when the model category corresponding to the model training task is a non-neural network model, obtaining the calculation amount level corresponding to the model training task by running the model training task; and determining the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: obtaining the provided calculation amount data corresponding to the real-time usage status; comparing the provided calculation amount data with the calculation amounts corresponding to the model training tasks, and filtering out the model training target tasks within the range that the provided calculation amount data can bear; and calling the computing resources corresponding to the provided calculation amount data to execute the model training target tasks.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: calling a preset first process, and obtaining executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and respectively calling different preset second processes to execute the executable model training tasks, the second processes being different processes from the first process.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: calling a preset process, and obtaining executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and calling the same process to execute the executable model training tasks.
  • a computer-readable storage medium has a computer program stored thereon, and when the computer program is executed by a processor, a method for processing a model training task is implemented, the method comprising:
  • when the mobile terminal receives a model training task to be processed, identifying the model category corresponding to the model training task; determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount; and acquiring the real-time usage status of the mobile terminal, and controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • In an embodiment, the acquiring the real-time usage status of the mobile terminal includes: acquiring temperature information and real-time running process information of the mobile terminal; and determining the real-time usage status of the mobile terminal according to the temperature information and the real-time running process information.
  • In an embodiment, the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount includes: when the model category corresponding to the model training task is a neural network model, obtaining the calculation amount level corresponding to the model training task as the highest level; when the model category corresponding to the model training task is a non-neural network model, obtaining the calculation amount level corresponding to the model training task by running the model training task; and determining the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: obtaining the provided calculation amount data corresponding to the real-time usage status; comparing the provided calculation amount data with the calculation amounts corresponding to the model training tasks, and filtering out the model training target tasks within the range that the provided calculation amount data can bear; and calling the computing resources corresponding to the provided calculation amount data to execute the model training target tasks.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: calling a preset first process, and obtaining executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and respectively calling different preset second processes to execute the executable model training tasks, the second processes being different processes from the first process.
  • In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: calling a preset process, and obtaining executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and calling the same process to execute the executable model training tasks.
  • With the above model training task processing method, device, electronic device, and storage medium, when a model training task to be processed is received, the model category corresponding to the model training task is identified; the calculation amount corresponding to the model training task is determined according to the model category and the preset relationship between the model category and the calculation amount; and the real-time usage status of the mobile terminal is acquired and the execution of the model training task is controlled according to the real-time usage status and the calculation amount, rather than triggering execution as soon as the model training task is received. In this way the model training task can be executed reasonably by the computing resources of the mobile terminal, thereby improving training efficiency.
  • FIG. 1 is an application environment diagram of a method for processing model training tasks in an embodiment of this application
  • FIG. 2 is a schematic flowchart of a method for processing model training tasks in an embodiment of this application
  • FIG. 3 is a schematic diagram of a master-slave architecture for processing model training tasks in an embodiment of this application
  • FIG. 4 is a schematic flowchart of a step of determining the amount of calculation corresponding to a model training task in an embodiment of the application
  • FIG. 5 is a schematic flowchart of execution control steps of a model training task in an embodiment of this application.
  • Fig. 6 is a structural block diagram of a model training task processing device in an embodiment of the application.
  • Fig. 7 is an internal structure diagram of an electronic device in an embodiment of the application.
  • the method for processing model training tasks provided in this application can be applied to the application environment as shown in FIG. 1.
  • In this application environment, the data required for training the preset model is collected.
  • A corresponding model training request is then triggered, and the model training request carries the model training task to be processed.
  • When the mobile terminal receives the model training task to be processed, it identifies the model category corresponding to the model training task; determines the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount; and acquires the real-time usage status of the mobile terminal and controls the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • the mobile terminal can be, but is not limited to, various personal computers, notebook computers, smart phones, and tablet computers.
  • In the embodiment of the present application, the model training task processing method executed by the mobile terminal may be performed by a processor of the mobile terminal.
  • a method for processing a model training task is provided. Taking the method applied to the mobile terminal in FIG. 1 as an example for description, the method includes the following steps:
  • Step 202: When the mobile terminal receives the to-be-processed model training task, identify the model category corresponding to the model training task.
  • The model training task refers to the data training process that needs to be completed to realize the model's function.
  • The model category refers to the attribute information of the model, such as whether the model is a neural network model or a non-neural network model.
  • A model training task processing architecture, such as the master-slave architecture shown in Figure 3, can be built on the processor of the mobile terminal. A training management module manages all model training tasks: every model training task must be registered with the training management module, the model category is provided when the model training task is registered, and the different model categories corresponding to model training tasks are distinguished by this category.
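  • To make the registration flow above concrete, here is a minimal sketch of such a training management registry; the names (TrainingManager, TrainingTask, register_task) are illustrative assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrainingTask:
    name: str
    model_category: str     # e.g. "xCNN" (neural network) or "other"
    compute_units: int = 0  # filled in once the calculation amount is determined

@dataclass
class TrainingManager:
    """Master-side training management module: every pending task registers here."""
    tasks: List[TrainingTask] = field(default_factory=list)

    def register_task(self, task: TrainingTask) -> None:
        # The model category must be supplied at registration time so that tasks
        # of different categories can be told apart later.
        self.tasks.append(task)

manager = TrainingManager()
manager.register_task(TrainingTask(name="scene-classifier", model_category="xCNN"))
```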
  • Step 204: Determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount.
  • Different model categories correspond to different calculation amounts. For example, the calculation amount corresponding to model category A is 1 unit, the calculation amount corresponding to model category B is 2 units, and so on.
  • According to the model category of the model training task and this preset relationship, the calculation amount corresponding to the model training task is obtained.
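  • As a small illustration of that preset lookup, a sketch using the example figures above (category A = 1 unit, category B = 2 units); the dictionary below is a hypothetical stand-in for the preset relationship, not a table from the patent:

```python
# Hypothetical preset relationship between model category and calculation amount,
# mirroring the "category A -> 1 unit, category B -> 2 units" example above.
PRESET_CATEGORY_TO_UNITS = {"A": 1, "B": 2}

def calculation_amount_for(model_category: str) -> int:
    # Determining the calculation amount reduces to looking up the category.
    return PRESET_CATEGORY_TO_UNITS[model_category]

print(calculation_amount_for("B"))  # 2
```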
  • Step 206: Obtain the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • the real-time usage status of the mobile terminal is used to characterize the usage scenario of the mobile terminal.
  • different mobile terminal usage states are defined. For example, when the screen off time of the mobile terminal exceeds a preset period of time, it is defined that the mobile terminal is in a sleep state at this time.
  • The execution of model training tasks with different calculation amounts is scheduled accordingly, and the execution of the pending model training tasks is controlled according to the computing power that the mobile terminal can provide in its current situation.
  • With this model training task processing method, when a model training task to be processed is received, the model category corresponding to the model training task is identified; the calculation amount corresponding to the task is determined from the model category and the preset relationship between model category and calculation amount; and the real-time usage status of the mobile terminal is acquired and the execution of the model training task is controlled according to the real-time usage status and the calculation amount, instead of triggering execution as soon as the task is received. This allows the model training task to be executed reasonably by the computing resources of the mobile terminal, thereby improving training efficiency.
  • acquiring the real-time usage status of the mobile terminal includes: acquiring temperature information and real-time running process information of the mobile terminal; and determining the real-time usage status of the mobile terminal according to the temperature information and real-time running process information.
  • the temperature information of the mobile terminal can be collected through the temperature sensor, and specifically can be the real-time temperature of the processor of the mobile terminal.
  • Real-time process information refers to the working state of the mobile terminal processor, such as whether the processor is in a working state or an idle state, and the load status in the working state. Specifically, as shown in Table 1, five usage states of the mobile terminal and corresponding state explanations are given.
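  • A minimal sketch of how the temperature reading and the running-process information might be combined into a usage state; the state names and thresholds below are assumptions for illustration, since Table 1 is not reproduced in this text:

```python
from dataclasses import dataclass

@dataclass
class ProcessInfo:
    screen_off_seconds: float  # how long the screen has been off
    cpu_load: float            # 0.0 (idle) .. 1.0 (fully loaded)

def determine_usage_state(temperature_c: float, proc: ProcessInfo) -> str:
    """Map processor temperature and real-time process info to a coarse usage state.

    The thresholds are illustrative, not values from the patent.
    """
    if temperature_c > 45.0:
        return "overheated"   # no training should be scheduled
    if proc.screen_off_seconds > 600 and proc.cpu_load < 0.1:
        return "sleep"        # screen off longer than a preset period
    if proc.cpu_load < 0.3:
        return "light_use"
    return "heavy_use"

print(determine_usage_state(38.5, ProcessInfo(screen_off_seconds=900, cpu_load=0.05)))  # sleep
```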
  • As shown in FIG. 4, determining the calculation amount corresponding to the model training task includes: Step 402: when the model category corresponding to the model training task is a neural network model, obtain the calculation amount level corresponding to the model training task as the highest level; Step 404: when the model category corresponding to the model training task is a non-neural network model, obtain the calculation amount level corresponding to the model training task by running the model training task; Step 406: determine the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
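  • A sketch of the two-branch determination in steps 402-406, assuming a hypothetical level-to-calculation-amount table and trial-run thresholds (the patent's Tables 2 and 3 are not reproduced here):

```python
import time

# Hypothetical mapping from calculation amount level to consumption units (not Table 2).
LEVEL_TO_UNITS = {"ultra_high": 8, "high": 4, "medium": 2, "low": 1}

def level_by_trial_run(run_one_iteration) -> str:
    """Grade a non-neural-network task by timing one training iteration (step 404)."""
    start = time.monotonic()
    run_one_iteration()
    elapsed = time.monotonic() - start
    if elapsed > 1.0:    # thresholds are illustrative assumptions
        return "high"
    if elapsed > 0.1:
        return "medium"
    return "low"

def calculation_amount(model_category: str, run_one_iteration) -> int:
    # Step 402: neural network models go straight to the highest level.
    if model_category == "neural_network":
        level = "ultra_high"
    else:
        level = level_by_trial_run(run_one_iteration)
    # Step 406: map the level to a calculation amount via the preset relationship.
    return LEVEL_TO_UNITS[level]
```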
  • When a model training task is registered, the model category corresponding to the task is provided, for example distinguishing between xCNN (Convolutional Neural Network) model types and other types.
  • the xCNN model represents a derivative model of the CNN model.
  • The convolutional neural network model is often used in deep neural network architectures for image tasks; it is characterized by a deeper network and relatively large computing resource requirements. Model training tasks of the xCNN type are therefore directly classified at the ultra-high calculation amount level.
  • For the other model types, the training management module detects the calculation amount of the model training task when the screen is off and grades the calculation amount according to the detection result; the specific calculation amount levels and the corresponding computing power consumption units are shown in Table 2.
  • The computing power consumption unit refers to the computing power required for the mobile terminal to perform a model training task of a certain calculation amount level; the higher the level, the more computing power is consumed. Take TOPS (Tera Operations Per Second) as an example of a processor computing power unit: 1 TOPS means that the processor can perform 10^12 operations per second.
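  • A worked illustration of the TOPS unit: with an assumed sustained throughput of 1 TOPS and an assumed 5×10^11 operations per training iteration, one iteration takes about 0.5 s:

```python
def iteration_time_seconds(ops_per_iteration: float, device_tops: float) -> float:
    # 1 TOPS = 1e12 operations per second.
    return ops_per_iteration / (device_tops * 1e12)

print(iteration_time_seconds(ops_per_iteration=5e11, device_tops=1.0))  # 0.5
```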
  • During training with SGD (Stochastic Gradient Descent), samples of a preset batch size are selected from the training set for each training step.
  • One iteration means training once with a preset-batch-size set of samples from the training set, and one period (epoch) means training once with all samples in the training set.
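  • A minimal sketch of the mini-batch SGD loop implied by these definitions (one iteration = one preset-size batch, one period = one pass over all samples); the update step itself is left abstract:

```python
from typing import Callable, Sequence

def train_one_period(samples: Sequence, batch_size: int,
                     sgd_step: Callable[[Sequence], None]) -> int:
    """Run one period (epoch): every sample in the training set is used once."""
    iterations = 0
    for start in range(0, len(samples), batch_size):
        batch = samples[start:start + batch_size]  # samples of the preset batch size
        sgd_step(batch)                            # one iteration of stochastic gradient descent
        iterations += 1
    return iterations

# e.g. 1000 samples with a batch size of 32 -> 32 iterations per period
print(train_one_period(list(range(1000)), 32, lambda batch: None))  # 32
```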
  • Table 3 gives the relationship between the calculation amount level and the iteration time, as measured on a certain processor model.
  • As shown in FIG. 5, controlling the execution of the model training task includes: Step 502: obtain the provided calculation amount data corresponding to the real-time usage status; Step 504: compare the provided calculation amount data with the calculation amounts corresponding to the model training tasks, and filter out the model training target tasks within the range that the provided calculation amount data can bear; Step 506: call the computing resources corresponding to the provided calculation amount data to execute the model training target tasks.
  • The provided calculation amount data refers to the computing power that the mobile terminal can provide when it is in a certain usage state. Different usage states of the mobile terminal correspond to different usage scenarios, and in different usage scenarios the mobile terminal can provide different amounts of computing power.
  • In different usage scenarios, the termination conditions for training tasks are also different.
  • the training management module recognizes the usage scenarios of the mobile terminal, and arranges and runs model training tasks with different computing levels in different usage scenarios.
  • the execution of the model training tasks to be processed is controlled according to the computing power provided by the usage scenario, as shown in Table 4.
  • For example, when the real-time usage status of the mobile terminal satisfies the first usage scenario in Table 4, the computing power that the mobile terminal can provide is 8 units. According to Table 2, one model training task at the ultra-high calculation amount level can then be performed, or two model training tasks at the high level, and so on; a sketch of this selection step follows.
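  • The sketch below illustrates that selection: given the computing power the current scenario can provide (8 units in the example), pick registered tasks whose combined calculation amounts stay within the budget. The largest-first greedy policy is an assumption for illustration; the text only requires that the selected tasks fit within the provided range:

```python
from typing import List, Tuple

def select_executable_tasks(pending: List[Tuple[str, int]], provided_units: int) -> List[str]:
    """Filter the model training target tasks that fit within the provided computing power.

    `pending` is a list of (task_name, calculation_amount_in_units) pairs.
    """
    selected, remaining = [], provided_units
    for name, units in sorted(pending, key=lambda t: t[1], reverse=True):
        if units <= remaining:       # the task fits in what the scenario can still bear
            selected.append(name)
            remaining -= units
    return selected

# With 8 units available, one ultra-high (8-unit) task fits, or two high (4-unit) tasks would.
print(select_executable_tasks([("task_a", 8), ("task_b", 4), ("task_c", 4)], 8))  # ['task_a']
```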
  • In an embodiment, controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: calling a preset first process and obtaining the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and respectively calling different preset second processes to execute the executable model training tasks, the second processes being different processes from the first process.
  • the training management module and the execution of multiple model training tasks run in different processes, that is, the control of the model training task and the execution of the model training task are processed through different processes, which is easy to maintain and can improve stability.
  • The model training task processing method in this application can be executed by building a model training task processing structure with a master-slave architecture on the mobile terminal. Under this structure, different model training tasks run in different processes and do not affect each other; abnormal termination of one training process does not affect the other training processes. A multiprocessing sketch of this arrangement follows.
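  • A minimal multiprocessing sketch of that master-slave arrangement: one management (first) process dispatches each executable task to its own (second) process, so an abnormal termination of one training process leaves the others running. The structure and names are illustrative assumptions, not the patent's implementation:

```python
import multiprocessing as mp

def run_training_task(task_name: str) -> None:
    # Placeholder for the actual training loop of one model training task.
    print(f"training {task_name} in its own process")

def management_process(executable_tasks) -> None:
    """First (management) process: start a different second process per executable task."""
    workers = [mp.Process(target=run_training_task, args=(name,)) for name in executable_tasks]
    for w in workers:
        w.start()
    for w in workers:
        w.join()  # a crash in one worker does not bring down the others

if __name__ == "__main__":
    management_process(["task_a", "task_b"])
```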
  • In an embodiment, controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes: calling a preset process and obtaining the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and calling the same process to execute the executable model training tasks.
  • In this case, the training management module and the execution of multiple model training tasks are integrated into the same process, and the different model training tasks are executed on multiple different threads, providing another way to implement the model training task processing method of this application and meet diverse needs; a threaded sketch follows.
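  • The single-process alternative, sketched with threads: the management logic and every executable training task share one process, each task running on its own thread. Again, the names are assumptions for illustration:

```python
import threading

def run_training_task(task_name: str) -> None:
    # Placeholder for the training loop of one task, executed on a thread
    # inside the same process as the management logic.
    print(f"training {task_name} on thread {threading.current_thread().name}")

def run_in_single_process(executable_tasks) -> None:
    threads = [threading.Thread(target=run_training_task, args=(name,), name=name)
               for name in executable_tasks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

run_in_single_process(["task_a", "task_b"])
```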
  • Although the steps in the flowcharts of FIGS. 2 and 4-5 are displayed in sequence as indicated by the arrows, these steps are not necessarily executed in that order. Unless specifically stated herein, the execution of these steps is not strictly limited in order, and they can be executed in other orders. Moreover, at least some of the steps in FIGS. 2 and 4-5 may include multiple sub-steps or multiple stages. These sub-steps or stages are not necessarily completed at the same time and can be executed at different times, and their execution order is not necessarily sequential; they may be executed in turn or alternately with at least some of the sub-steps or stages of other steps.
  • a model training task processing device includes a task identification module 602, a calculation amount determination module 604, and a task control module 606.
  • the task recognition module 602 is configured to recognize the model category corresponding to the model training task when the mobile terminal receives the model training task to be processed.
  • the calculation amount determination module 604 is configured to determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount.
  • the task control module 606 is used to obtain the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • the task control module 606 is also used to obtain temperature information and real-time running process information of the mobile terminal; determine the real-time use status of the mobile terminal according to the temperature information and real-time running process information.
  • In an embodiment, the calculation amount determination module 604 is also used to obtain the calculation amount level corresponding to the model training task as the highest level when the model category corresponding to the model training task is a neural network model; to obtain the calculation amount level corresponding to the model training task by running the model training task when the model category is a non-neural network model; and to determine the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  • In an embodiment, the task control module 606 is also used to obtain the provided calculation amount data corresponding to the real-time usage status; to compare the provided calculation amount data with the calculation amounts corresponding to the model training tasks and filter out the model training target tasks within the range that the provided calculation amount data can bear; and to call the computing resources corresponding to the provided calculation amount data to execute the model training target tasks.
  • In an embodiment, the task control module 606 is also used to call a preset first process and obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and to respectively call different preset second processes to execute the executable model training tasks, the second processes being different processes from the first process.
  • In an embodiment, the task control module 606 is also used to call a preset process and obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and to call the same process to execute the executable model training tasks.
  • Each module in the above-mentioned model training task processing device can be implemented in whole or in part by software, hardware and a combination thereof.
  • The above-mentioned modules may be embedded, in hardware form, in or independently of the processor of the computer device, or may be stored in software form in the memory of the computer device, so that the processor can call and execute the operations corresponding to the above modules.
  • an electronic device may be a mobile terminal.
  • FIG. 7 provides an internal structure diagram of the mobile terminal.
  • the mobile terminal includes a processor, a memory, and a display screen connected through a system bus.
  • the processor is used to provide computing and control capabilities.
  • the memory includes a non-volatile storage medium and internal memory.
  • the non-volatile storage medium stores an operating system and a computer program.
  • the internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium.
  • the method for processing model training tasks performed by the mobile terminal in FIG. 1 may be specifically completed by the processor of the mobile terminal, that is, a method for processing model training tasks is implemented when a computer program is executed by the processor.
  • FIG. 7 is only a block diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the terminal to which the solution of the present application is applied.
  • A specific mobile terminal may include more or fewer parts than shown in the figure, combine certain parts, or have a different arrangement of parts.
  • an electronic device including a memory and a processor
  • the memory stores a computer program
  • The processor implements the following steps when executing the computer program: when the mobile terminal receives a model training task to be processed, identify the model category corresponding to the model training task; determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount; and obtain the real-time usage status of the mobile terminal and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • the processor further implements the following steps when executing the computer program: acquiring temperature information and real-time running process information of the mobile terminal; and determining the real-time usage status of the mobile terminal according to the temperature information and real-time running process information.
  • When the processor executes the computer program, the following steps are also implemented: when the model category corresponding to the model training task is a neural network model, obtain the calculation amount level corresponding to the model training task as the highest level; when the model category corresponding to the model training task is a non-neural network model, run the model training task to obtain the calculation amount level corresponding to the model training task; and determine the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  • When the processor executes the computer program, the following steps are also implemented: obtain the provided calculation amount data corresponding to the real-time usage status; compare the provided calculation amount data with the calculation amounts corresponding to the model training tasks and filter out the model training target tasks within the range that the provided calculation amount data can bear; and call the computing resources corresponding to the provided calculation amount data to execute the model training target tasks.
  • When the processor executes the computer program, the following steps are also implemented: call a preset first process and obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and respectively call different preset second processes to execute the executable model training tasks, the second processes being different processes from the first process.
  • When the processor executes the computer program, the following steps are also implemented: call a preset process and obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and call the same process to execute the executable model training tasks.
  • a computer-readable storage medium on which a computer program is stored.
  • When the computer program is executed by a processor, the following steps are also implemented: when the mobile terminal receives a model training task to be processed, identify the model category corresponding to the model training task; determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount; and obtain the real-time usage status of the mobile terminal and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  • the following steps are further implemented: acquiring temperature information and real-time running process information of the mobile terminal; and determining the real-time use status of the mobile terminal according to the temperature information and real-time running process information.
  • When the computer program is executed by the processor, the following steps are also implemented: when the model category corresponding to the model training task is a neural network model, obtain the calculation amount level corresponding to the model training task as the highest level; when the model category corresponding to the model training task is a non-neural network model, obtain the calculation amount level corresponding to the model training task by running the model training task; and determine the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  • When the computer program is executed by the processor, the following steps are also implemented: obtain the provided calculation amount data corresponding to the real-time usage status; compare the provided calculation amount data with the calculation amounts corresponding to the model training tasks and filter out the model training target tasks within the range that the provided calculation amount data can bear; and call the computing resources corresponding to the provided calculation amount data to execute the model training target tasks.
  • When the computer program is executed by the processor, the following steps are also implemented: call a preset first process and obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and respectively call different preset second processes to execute the executable model training tasks, the second processes being different processes from the first process.
  • When the computer program is executed by the processor, the following steps are also implemented: call a preset process and obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and call the same process to execute the executable model training tasks.
  • Non-volatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory may include random access memory (RAM) or external cache memory.
  • RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Stored Programmes (AREA)

Abstract

A model training task processing method and apparatus, an electronic device, and a storage medium, relating to the technical field of mobile terminals. The method comprises: when a mobile terminal receives a model training task to be processed, identifying a model category corresponding to the model training task (S202); determining a computation burden corresponding to the model training task according to the model category corresponding to the model training task and a preset relationship between the model category and the computation burden (S204); and obtaining a real-time use state of the mobile terminal, and controlling the execution of the model training task according to the real-time use state and the computation burden corresponding to the model training task (S206). In this case, the model training task can be reasonably executed by computation resources of the mobile terminal instead of triggering the execution operation when the model training task is received, so that the training efficiency is improved.

Description

模型训练任务处理方法、装置、电子设备和存储介质Model training task processing method, device, electronic equipment and storage medium 【技术领域】【Technical Field】
本申请涉及移动终端技术领域,特别是涉及一种模型训练任务处理方法、装置、电子设备和存储介质。This application relates to the technical field of mobile terminals, and in particular to a method, device, electronic device, and storage medium for processing model training tasks.
【背景技术】【Background technique】
随着科学技术的发展,各种智能设备应用于日常生活,产生了海量的行为数据和信息资源。机器学习应运而生,给人们的生产和生活带来了翻天覆地的变化,通过机器学习的方法可以实现从海量数据中挖掘有价值信息,以更好地服务用户,给用户带来便利。With the development of science and technology, various smart devices are used in daily life, generating massive amounts of behavioral data and information resources. Machine learning emerged at the historic moment, bringing earth-shaking changes to people's production and life. Through machine learning methods, it is possible to mine valuable information from massive amounts of data to better serve users and bring convenience to users.
机器学习大致可以分为确定模型、训练模型和使用模型三个部分,其中,训练模型是机器学习的重要组成部分。传统的模型训练一般存在新增训练任务时,即触发训练任务的执行,而模型训练的计算量一般较大,这样导致模型训练任务的训练效率不高。Machine learning can be roughly divided into three parts: determining model, training model and using model. Among them, training model is an important part of machine learning. Traditional model training generally triggers the execution of the training task when there is a new training task, and the calculation amount of model training is generally large, which leads to low training efficiency of the model training task.
【发明内容】[Summary of the invention]
基于此,有必要针对上述技术问题,提供一种可以提高训练效率的模型训练任务处理方法、装置、电子设备和存储介质。Based on this, it is necessary to provide a model training task processing method, device, electronic device, and storage medium that can improve training efficiency in response to the above technical problems.
一种模型训练任务处理方法,所述方法包括:A method for processing model training tasks, the method comprising:
当移动终端接收到待处理的模型训练任务时,识别所述模型训练任务对应的模型类别;When the mobile terminal receives the model training task to be processed, identifying the model category corresponding to the model training task;
根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量;Determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
获取所述移动终端的实时使用状态,根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行。Acquire the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
在一个实施例中,所述获取所述移动终端的实时使用状态包括:In an embodiment, the acquiring the real-time usage status of the mobile terminal includes:
获取所述移动终端的温度信息以及实时运行进程信息;Acquiring temperature information and real-time running process information of the mobile terminal;
根据所述温度信息以及所述实时运行进程信息,确定所述移动终端的实时使用状态。According to the temperature information and the real-time running process information, the real-time use status of the mobile terminal is determined.
在一个实施例中,所述根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量包括:In an embodiment, the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount includes:
当所述模型训练任务对应的模型类别为神经网络模型时,得到所述模型训练任务对应的运算量等级为最高等级;When the model category corresponding to the model training task is a neural network model, the calculation amount level corresponding to the model training task is obtained as the highest level;
当所述模型训练任务对应的模型类别为非神经网络模型时,通过运行所述模型训练任务,得到所述模型训练任务对应的运算量等级;When the model category corresponding to the model training task is a non-neural network model, by running the model training task, the calculation amount level corresponding to the model training task is obtained;
根据获得的运算量等级以及预设的运算量等级与运算量的关系,确定所述模型训练任务对应的运算量。The calculation amount corresponding to the model training task is determined according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
获取与所述实时使用状态对应的运算量提供数据;Obtaining provided data corresponding to the real-time usage status of the calculation amount;
将所述运算量提供数据分别与所述模型训练任务对应的运算量进行比较,筛选得到在所述运算量提供数据承载范围内的模型训练目标任务;Comparing the data provided by the calculation amount with the calculation amount corresponding to the model training task respectively, and filter to obtain the model training target tasks within the range of the data provided by the calculation amount;
调用所述运算量提供数据对应的运算资源,执行所述模型训练目标任务。Invoking the computing resources to provide computing resources corresponding to the data to execute the model training target task.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
调用预设的第一进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset first process, and obtaining an executable model training task according to the real-time usage status and the calculation amount corresponding to the model training task;
分别调用不同的预设的第二进程执行所述可执行的模型训练任务,所述第二进程与所述第一进程为不同的进程。Different preset second processes are respectively called to execute the executable model training task, and the second process and the first process are different processes.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
调用预设的进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset process, and obtaining an executable model training task according to the real-time use status and the calculation amount corresponding to the model training task;
调用同一个进程执行所述可执行的模型训练任务。Call the same process to execute the executable model training task.
一种模型训练任务处理装置,所述装置包括:A model training task processing device, the device comprising:
任务识别模块,用于当移动终端接收到待处理的模型训练任务时,识别所述模型训练任务对应的模型类别;The task recognition module is used to recognize the model category corresponding to the model training task when the mobile terminal receives the to-be-processed model training task;
运算量确定模块,用于根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量;The calculation amount determination module is configured to determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
任务控制模块,用于获取所述移动终端的实时使用状态,根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行。The task control module is used to obtain the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
在一个实施例中,所述任务控制模块还用于获取所述移动终端的温度信息以及实时运行进程信息;根据所述温度信息以及所述实时运行进程信息,确定所述移动终端的实时使用状态。In one embodiment, the task control module is also used to obtain temperature information and real-time operation process information of the mobile terminal; and determine the real-time use status of the mobile terminal according to the temperature information and the real-time operation process information .
在一个实施例中,所述运算量确定模块还用于当所述模型训练任务对应的模型类别为神经网络模型时,得到所述模型训练任务对应的运算量等级为最高等级;当所述模型训练任务对应的模型类别为非神经网络模型时,通过运行所述模型训练任务,得到所述模型训练任务对应的运算量等级;根据获得的运算量等级以及预设的运算量等级与运算量的关系,确定所述模型训练任务对应的运算量。In one embodiment, the calculation amount determination module is further configured to obtain the calculation amount level corresponding to the model training task as the highest level when the model category corresponding to the model training task is a neural network model; When the model category corresponding to the training task is a non-neural network model, by running the model training task, the calculation amount level corresponding to the model training task is obtained; according to the obtained calculation amount level and the preset calculation amount level and calculation amount Relationship to determine the amount of calculation corresponding to the model training task.
在一个实施例中,所述任务控制模块还用于获取与所述实时使用状态对应的运算量提供数据;将所述运算量提供数据分别与所述模型训练任务对应的运算量进行比较,筛选得到在所述运算量提供数据承载范围内的模型训练目标任务;调用所述运算量提供数据对应的运算资源,执行所述模型训练目标任务。In one embodiment, the task control module is further configured to obtain the calculation amount provided data corresponding to the real-time use status; compare the calculation amount provided data with the calculation amount corresponding to the model training task, and filter Obtain the model training target task within the range of the data provided by the calculation amount; call the calculation resource corresponding to the data provided by the calculation amount to execute the model training target task.
在一个实施例中,所述任务控制模块还用于调用预设的第一进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;分别调用不同的预设的第二进程执行所述可执行的模型训练任务,所述第二进程与所述第一进程为不同的进程。In one embodiment, the task control module is also used to call a preset first process, and obtain executable model training tasks according to the real-time usage status and the calculation amount corresponding to the model training tasks; respectively call different The preset second process executes the executable model training task, and the second process and the first process are different processes.
在一个实施例中,所述任务控制模块还用于调用预设的进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;调用同一个进程执行所述可执行的模型训练任务。In one embodiment, the task control module is also used to call a preset process, and obtain an executable model training task according to the real-time usage status and the calculation amount corresponding to the model training task; call the same process to execute The executable model training task.
一种电子设备,包括存储器和处理器,所述存储器存储有计算机程序,所述处理器执行所述计算机程序时实现模型训练任务处理方法,所述方法包括:An electronic device includes a memory and a processor, the memory stores a computer program, and when the processor executes the computer program, a method for processing a model training task is implemented, and the method includes:
当移动终端接收到待处理的模型训练任务时,识别所述模型训练任务对应的模型类别;When the mobile terminal receives the model training task to be processed, identifying the model category corresponding to the model training task;
根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量;Determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
获取所述移动终端的实时使用状态,根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行。Acquire the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
在一个实施例中,所述获取所述移动终端的实时使用状态包括:In an embodiment, the acquiring the real-time usage status of the mobile terminal includes:
获取所述移动终端的温度信息以及实时运行进程信息;Acquiring temperature information and real-time running process information of the mobile terminal;
根据所述温度信息以及所述实时运行进程信息,确定所述移动终端的实时使用状态。According to the temperature information and the real-time running process information, the real-time use status of the mobile terminal is determined.
在一个实施例中,所述根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量包括:In one embodiment, the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount includes:
当所述模型训练任务对应的模型类别为神经网络模型时,得到所述模型训练任务对应的运算量等级为最高等级;When the model category corresponding to the model training task is a neural network model, the calculation amount level corresponding to the model training task is obtained as the highest level;
当所述模型训练任务对应的模型类别为非神经网络模型时,通过运行所述模型训练任务,得到所述模型训练任务对应的运算量等级;When the model category corresponding to the model training task is a non-neural network model, by running the model training task, the calculation amount level corresponding to the model training task is obtained;
根据获得的运算量等级以及预设的运算量等级与运算量的关系,确定所述模型训练任务对应的运算量。The calculation amount corresponding to the model training task is determined according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
获取与所述实时使用状态对应的运算量提供数据;Obtaining provided data corresponding to the real-time usage status of the calculation amount;
将所述运算量提供数据分别与所述模型训练任务对应的运算量进行比较,筛选得到在所述运算量提供数据承载范围内的模型训练目标任务;Comparing the data provided by the calculation amount with the calculation amount corresponding to the model training task respectively, and filter to obtain the model training target tasks within the range of the data provided by the calculation amount;
调用所述运算量提供数据对应的运算资源,执行所述模型训练目标任务。Invoking the computing resources to provide computing resources corresponding to the data to execute the model training target task.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
调用预设的第一进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset first process, and obtaining an executable model training task according to the real-time usage status and the calculation amount corresponding to the model training task;
分别调用不同的预设的第二进程执行所述可执行的模型训练任务,所述第二进程与所述第一进程为不同的进程。Different preset second processes are respectively called to execute the executable model training task, and the second process and the first process are different processes.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
调用预设的进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset process, and obtaining an executable model training task according to the real-time use status and the calculation amount corresponding to the model training task;
调用同一个进程执行所述可执行的模型训练任务。Call the same process to execute the executable model training task.
一种计算机可读存储介质,其上存储有计算机程序,所述计算机程序被处理器执行时实现模型训练任务处理方法,所述方法包括:A computer-readable storage medium has a computer program stored thereon, and when the computer program is executed by a processor, a method for processing a model training task is implemented, the method comprising:
当移动终端接收到待处理的模型训练任务时,识别所述模型训练任务对应的模型类别;When the mobile terminal receives the model training task to be processed, identifying the model category corresponding to the model training task;
根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量;Determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
获取所述移动终端的实时使用状态,根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行。Acquire the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
在一个实施例中,所述获取所述移动终端的实时使用状态包括:In an embodiment, the acquiring the real-time usage status of the mobile terminal includes:
获取所述移动终端的温度信息以及实时运行进程信息;Acquiring temperature information and real-time running process information of the mobile terminal;
根据所述温度信息以及所述实时运行进程信息,确定所述移动终端的实时使用状态。According to the temperature information and the real-time running process information, the real-time use status of the mobile terminal is determined.
在一个实施例中,所述根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量包括:In one embodiment, the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount includes:
当所述模型训练任务对应的模型类别为神经网络模型时,得到所述模型训练任务对应的运算量等级为最高等级;When the model category corresponding to the model training task is a neural network model, the calculation amount level corresponding to the model training task is obtained as the highest level;
当所述模型训练任务对应的模型类别为非神经网络模型时,通过运行所述模型训练任务,得到所述模型训练任务对应的运算量等级;When the model category corresponding to the model training task is a non-neural network model, by running the model training task, the calculation amount level corresponding to the model training task is obtained;
根据获得的运算量等级以及预设的运算量等级与运算量的关系,确定所述模型训练任务对应的运算量。The calculation amount corresponding to the model training task is determined according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
Acquiring computation supply data corresponding to the real-time usage status;
Comparing the computation supply data with the calculation amount corresponding to each model training task, and filtering out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data;
Invoking the computing resources corresponding to the computation supply data to execute the model training target tasks.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
调用预设的第一进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset first process, and obtaining an executable model training task according to the real-time usage status and the calculation amount corresponding to the model training task;
分别调用不同的预设的第二进程执行所述可执行的模型训练任务,所述第二进程与所述第一进程为不同的进程。Different preset second processes are respectively called to execute the executable model training task, and the second process and the first process are different processes.
在一个实施例中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:In an embodiment, the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task includes:
调用预设的进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset process, and obtaining an executable model training task according to the real-time use status and the calculation amount corresponding to the model training task;
调用同一个进程执行所述可执行的模型训练任务。Call the same process to execute the executable model training task.
With the above model training task processing method, apparatus, electronic device, and storage medium, when a model training task to be processed is received, the model category corresponding to the task is identified; the calculation amount corresponding to the task is determined according to that model category and the preset relationship between model categories and calculation amounts; and the real-time usage status of the mobile terminal is acquired, so that execution of the model training task is controlled according to the real-time usage status and the calculation amount corresponding to the task, rather than being triggered as soon as the task is received. In this way, model training tasks are executed only when the computing resources of the mobile terminal can reasonably accommodate them, which improves training efficiency.
BRIEF DESCRIPTION OF THE DRAWINGS
图1为本申请一个实施例中模型训练任务处理方法的应用环境图;FIG. 1 is an application environment diagram of a method for processing model training tasks in an embodiment of this application;
图2为本申请一个实施例中模型训练任务处理方法的流程示意图;2 is a schematic flowchart of a method for processing model training tasks in an embodiment of this application;
图3为本申请一个实施例中模型训练任务处理主从架构的示意图;FIG. 3 is a schematic diagram of a master-slave architecture for processing model training tasks in an embodiment of this application;
图4为本申请一个实施例中模型训练任务对应的运算量的确定步骤的流程示意图;FIG. 4 is a schematic flowchart of a step of determining the amount of calculation corresponding to a model training task in an embodiment of the application;
图5为本申请一个实施例中模型训练任务的执行控制步骤的流程示意图;FIG. 5 is a schematic flowchart of execution control steps of a model training task in an embodiment of this application;
图6为本申请一个实施例中模型训练任务处理装置的结构框图;Fig. 6 is a structural block diagram of a model training task processing device in an embodiment of the application;
图7为本申请一个实施例中电子设备的内部结构图。Fig. 7 is an internal structure diagram of an electronic device in an embodiment of the application.
DETAILED DESCRIPTION OF THE EMBODIMENTS
To make the objectives, technical solutions, and advantages of this application clearer, the application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain this application and are not intended to limit it.
The model training task processing method provided in this application can be applied to the application environment shown in FIG. 1. While the user uses the mobile terminal, the data required for training a preset model is collected; when the collected data meets a preset condition, a corresponding model training request is triggered, and the request carries the model training task to be processed. When the mobile terminal receives the model training task to be processed, it identifies the model category corresponding to the task, determines the calculation amount corresponding to the task according to that model category and the preset relationship between model categories and calculation amounts, and acquires the real-time usage status of the mobile terminal, controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the task. The mobile terminal can be, but is not limited to, a personal computer, a notebook computer, a smartphone, or a tablet computer. Specifically, the model training task processing method in the embodiments of this application may be executed by a processor of the mobile terminal.
在一个实施例中,如图2所示,提供了一种模型训练任务处理方法,以该方法应用于图1中的移动终端为例进行说明,包括以下步骤:In an embodiment, as shown in FIG. 2, a method for processing a model training task is provided. Taking the method applied to the mobile terminal in FIG. 1 as an example for description, the method includes the following steps:
步骤202,当移动终端接收到待处理的模型训练任务时,识别模型训练任务对应的模型类别。Step 202: When the mobile terminal receives the to-be-processed model training task, identify the model category corresponding to the model training task.
A model training task refers to the data training process that must be completed to realize a model's function, and the model category refers to attribute information of the model, for example whether the model is a neural network model or a non-neural network model. A model training task processing architecture, for example a master-slave architecture as shown in FIG. 3, can be built on the processor of the mobile terminal. All model training tasks are managed by a training management module and must be registered with it; the model category is provided at registration time, so that training tasks corresponding to different model categories can be distinguished.
步骤204,根据模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定模型训练任务对应的运算量。Step 204: Determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount.
Different model categories correspond to different calculation amounts, and the relationship between them is preset: for example, model category A corresponds to a calculation amount of 1 unit, model category B corresponds to 2 units, and so on. The calculation amount corresponding to the model training task is obtained from the model category of the task and this preset correspondence.
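As an informal illustration of such a preset relationship, the sketch below stores the category-to-calculation-amount mapping as a simple lookup table; the category names and unit values are assumptions made for the example and are not taken from the application.

```python
# Hypothetical preset relationship between model categories and calculation
# amounts (in abstract computing units); the entries are illustrative only.
PRESET_CALC_AMOUNT = {
    "category_A": 1,
    "category_B": 2,
    "xCNN": 8,
}

def calc_amount_for(category: str) -> int:
    """Look up the calculation amount preset for a model category."""
    try:
        return PRESET_CALC_AMOUNT[category]
    except KeyError:
        raise ValueError(f"unregistered model category: {category}")
```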
步骤206,获取移动终端的实时使用状态,根据实时使用状态以及模型训练任务对应的运算量,控制模型训练任务的执行。Step 206: Obtain the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
The real-time usage status of the mobile terminal characterizes the usage scenario it is in. Different usage states are defined for different scenarios; for example, when the screen-off time of the mobile terminal exceeds a preset duration, the terminal is defined as being in a sleep state. Model training tasks of different calculation amounts are scheduled for the different usage scenarios, so that execution of the pending model training tasks is controlled according to the computing power the mobile terminal can provide in its current scenario.
With the above model training task processing method, when a model training task to be processed is received, the model category corresponding to the task is identified; the calculation amount corresponding to the task is determined according to that model category and the preset relationship between model categories and calculation amounts; and the real-time usage status of the mobile terminal is acquired, so that execution of the task is controlled according to the real-time usage status and the calculation amount, rather than being triggered immediately upon receipt. This allows model training tasks to be executed reasonably by the computing resources of the mobile terminal, improving training efficiency.
In one embodiment, acquiring the real-time usage status of the mobile terminal includes acquiring temperature information and real-time running process information of the mobile terminal, and determining the real-time usage status of the mobile terminal from the temperature information and the running process information. The temperature information may be collected by a temperature sensor, specifically as the real-time temperature of the processor of the mobile terminal. The real-time process information refers to the working state of the processor, for example whether the processor is working or idle, and its load when working. Specifically, Table 1 gives five usage states of the mobile terminal and the corresponding state descriptions; a minimal sketch of this determination follows the table.
Table 1  Usage state definitions of the mobile terminal
[Table 1 is reproduced only as an image in the published application; it lists the five usage states of the mobile terminal together with the condition that defines each state.]
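The following is a minimal sketch of how the real-time usage state could be derived from the temperature information and running-process information described above. Because Table 1 is only available as an image, the state names, the temperature threshold, and the idle criterion used here are assumptions rather than values from the application.

```python
def determine_usage_state(cpu_temp_c: float, cpu_load: float,
                          screen_off_seconds: float) -> str:
    """Classify the terminal's real-time usage state (illustrative thresholds)."""
    if cpu_temp_c > 45.0:                      # assumed thermal ceiling
        return "overheated"
    if screen_off_seconds > 600 and cpu_load < 0.1:
        return "sleep"                         # screen off long enough, processor idle
    if cpu_load < 0.3:
        return "light_use"
    return "heavy_use"
```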
In one embodiment, as shown in FIG. 4, determining the calculation amount corresponding to the model training task according to its model category and the preset relationship between model categories and calculation amounts includes: step 402, when the model category corresponding to the model training task is a neural network model, assigning the task the highest calculation amount level; step 404, when the model category is a non-neural network model, obtaining the calculation amount level of the task by running the model training task; and step 406, determining the calculation amount corresponding to the task according to the obtained level and the preset relationship between calculation amount levels and calculation amounts. When a model training task registers with the training management module, it provides the model category of the task, for example distinguishing xCNN (Convolutional Neural Network) model types from other types. An xCNN model is a derivative of the CNN model; convolutional neural network models are commonly used in deep neural network architectures for images, are relatively deep, and require comparatively large computing resources. An xCNN model is therefore classified directly into the ultra-high calculation amount level. For models of other types, the training management module probes the calculation amount of the training task while the terminal is charging with the screen off, and classifies its calculation amount level from the probe result. The specific levels and the corresponding computing power consumption units are shown in Table 2. A computing power consumption unit expresses the computing power the mobile terminal consumes to execute a model training task of a given level; the higher the level, the more computing power is consumed. Taking TOPS (Tera Operations Per Second) as the unit of processor computing capability, 1 TOPS means the processor can perform 10^12 operations per second.
Table 2  Calculation amount levels and corresponding computing power consumption units

  Calculation amount level    Computing power consumption units
  Low                         1
  Middle                      2
  High                        4
  Ultra-high                  8
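The classification rule described above might be sketched as follows. Treating every xCNN-type task as ultra-high and probing other task types comes from the description; the probe itself is represented by a caller-supplied function, which is an assumption made to keep the sketch self-contained.

```python
# Table 2: calculation amount level -> computing power consumption units
LEVEL_UNITS = {"low": 1, "middle": 2, "high": 4, "ultra_high": 8}

def classify_level(model_category: str, probe_level) -> str:
    """Return the calculation amount level of a registered training task."""
    if model_category == "xCNN":
        # CNN-derived models are classified directly as ultra-high.
        return "ultra_high"
    # Other models are probed (e.g. while charging with the screen off);
    # probe_level is a hypothetical callable returning one of the level names.
    return probe_level()

def units_for_level(level: str) -> int:
    """Computing power consumption units for a given level (Table 2)."""
    return LEVEL_UNITS[level]
```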
Model training tasks are generally trained with SGD (Stochastic Gradient Descent), that is, a batch of samples of a preset batch size is selected from the training set for each training step. During training, one iteration means training once on a batch of the preset size, and one epoch means training once on all samples in the training set. For example, if the training set of a model training task contains 1000 samples and the preset batch size is batchsize = 10, then training over the entire sample set takes 100 iterations, i.e., one epoch. Table 3 shows the relationship between calculation amount level and iteration time measured on a particular processor model.
Table 3  Calculation amount levels and corresponding training times

  Calculation amount level    Time to train 1000 iterations
  Low                         Less than 10 seconds
  Middle                      10 seconds to 1 minute
  High                        More than 1 minute
  Ultra-high                  More than 10 minutes
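The iteration arithmetic in the example above, together with a mapping from the measured training time to the levels of Table 3, might look like the sketch below; the thresholds mirror Table 3, while the function names are illustrative.

```python
def iterations_per_epoch(num_samples: int, batch_size: int) -> int:
    """E.g. 1000 samples with batch_size = 10 -> 100 iterations per epoch."""
    return -(-num_samples // batch_size)       # ceiling division

def level_from_probe(seconds_per_1000_iters: float) -> str:
    """Map the time needed for 1000 iterations to a calculation amount level."""
    if seconds_per_1000_iters < 10:
        return "low"
    if seconds_per_1000_iters <= 60:
        return "middle"
    if seconds_per_1000_iters <= 600:
        return "high"
    return "ultra_high"
```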
In one embodiment, as shown in FIG. 5, controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the task includes: step 502, acquiring computation supply data corresponding to the real-time usage status; step 504, comparing the computation supply data with the calculation amount corresponding to each model training task and filtering out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data; and step 506, invoking the computing resources corresponding to the computation supply data to execute the model training target tasks. The computation supply data indicates the computing power the mobile terminal can provide in a given usage state. Different usage states of the mobile terminal correspond to different usage scenarios; in different scenarios the computing power the terminal can provide differs, and so do the termination conditions of the model training tasks. The training management module identifies the usage scenario the mobile terminal is in and schedules model training tasks of different calculation amount levels in different scenarios. When the condition of a usage scenario is met, execution of the pending model training tasks is controlled according to the computing power that scenario can provide, as shown in Table 4. For example, when the real-time usage status of the mobile terminal satisfies the first usage scenario in the table, the terminal can provide 8 units of computing power; referring to Table 2, it can then execute one model training task of the ultra-high level, or two tasks of the high level, and so on.
Table 4
[Table 4 is reproduced only as an image in the published application; it lists the usage scenarios, the condition under which each scenario applies, and the computing power units the mobile terminal can provide in each scenario.]
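A sketch of the filtering step follows: given the computing units the current scenario can supply (for example, 8 units in the first scenario of Table 4), tasks are selected so that their combined consumption stays within that budget. The greedy, largest-first selection order and the task names are assumptions for illustration; the application only requires that the selected tasks fit within the supplied capacity.

```python
def select_executable_tasks(tasks, supplied_units: int):
    """tasks: list of (task_name, units) pairs; return those fitting the budget."""
    selected, remaining = [], supplied_units
    # Larger tasks first so an ultra-high task is not starved by smaller ones
    # (this ordering is an illustrative choice, not specified by the application).
    for name, units in sorted(tasks, key=lambda t: t[1], reverse=True):
        if units <= remaining:
            selected.append((name, units))
            remaining -= units
    return selected

# With 8 units available, one ultra-high task (8 units) fills the budget.
print(select_executable_tasks([("image_model", 8), ("typing_model", 2)], 8))
```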
In one embodiment, controlling the execution of the model training task according to the real-time usage status and the corresponding calculation amount includes: calling a preset first process to obtain the executable model training tasks according to the real-time usage status and the calculation amounts of the tasks, and calling different preset second processes, each different from the first process, to execute the executable model training tasks. The training management module and the execution of the individual model training tasks run in different processes; that is, control of the training tasks and their execution are handled by separate processes, which is easy to maintain and improves stability. The model training task processing method of this application can be implemented by building a master-slave processing architecture on the mobile terminal; under this architecture, different model training tasks run in different processes and do not affect one another, so the abnormal termination of a single training process does not affect the other training processes.
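As an informal sketch of this process isolation, the manager (first process) could launch each executable training task in its own worker (second) process, for example with Python's multiprocessing module; run_training is a stand-in for the real training loop, and the task names are assumed.

```python
import multiprocessing as mp

def run_training(task_name: str) -> None:
    # Stand-in for the actual training loop of one model training task.
    print(f"training {task_name} in its own process")

def manager(executable_tasks):
    # The first process decides which tasks are executable (omitted here) and
    # launches each one in a separate second process, so that the abnormal
    # termination of one training process does not affect the others.
    workers = [mp.Process(target=run_training, args=(t,)) for t in executable_tasks]
    for w in workers:
        w.start()
    for w in workers:
        w.join()

if __name__ == "__main__":
    manager(["model_a", "model_b"])
```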
In one embodiment, controlling the execution of the model training task according to the real-time usage status and the corresponding calculation amount includes: calling a preset process to obtain the executable model training tasks according to the real-time usage status and the calculation amounts of the tasks, and calling the same process to execute them. The training management module and the execution of the multiple model training tasks are integrated into a single process, and the different model training tasks are converted into different threads for execution. This provides an alternative way to implement the model training task processing method of this application and satisfies diverse requirements.
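The single-process variant can be sketched in the same spirit, with each training task converted into a thread inside the one process; again run_training stands in for the actual task and is an assumption of the sketch.

```python
import threading

def run_in_one_process(executable_tasks, run_training) -> None:
    # Management and execution share a single process; each model training
    # task is executed by its own thread rather than by a separate process.
    threads = [threading.Thread(target=run_training, args=(t,))
               for t in executable_tasks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```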
It should be understood that although the steps in the flowcharts of FIG. 2 and FIGS. 4-5 are displayed sequentially as indicated by the arrows, these steps are not necessarily executed in that order. Unless explicitly stated herein, there is no strict order restriction on their execution, and they may be executed in other orders. Moreover, at least some of the steps in FIG. 2 and FIGS. 4-5 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments; their execution order is not necessarily sequential, and they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
在一个实施例中,如图6所示,提供了一种模型训练任务处理装置,模型训练任务处理装置包括任务识别模块602、运算量确定模块604以及任务控制模块606。任务识别模块602用于当移动终端接收到待处理的模型训练任务时,识别模型训练任务对应的模型类别。运算量确定模块604用于根据模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定模型训练任务对应的运算量。任务控制模块606用于获取移动终端的实时使用状态,根据实时使用状态以及模型训练任务对应的运算量,控制模型训练任务的执行。In one embodiment, as shown in FIG. 6, a model training task processing device is provided. The model training task processing device includes a task identification module 602, a calculation amount determination module 604, and a task control module 606. The task recognition module 602 is configured to recognize the model category corresponding to the model training task when the mobile terminal receives the model training task to be processed. The calculation amount determination module 604 is configured to determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount. The task control module 606 is used to obtain the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
在一个实施例中,任务控制模块606还用于获取移动终端的温度信息以及实时运行进程信息;根据温度信息以及实时运行进程信息,确定移动终端的实时使用状态。In one embodiment, the task control module 606 is also used to obtain temperature information and real-time running process information of the mobile terminal; determine the real-time use status of the mobile terminal according to the temperature information and real-time running process information.
In one embodiment, the calculation amount determination module 604 is further configured to: when the model category corresponding to the model training task is a neural network model, assign the task the highest calculation amount level; when the model category is a non-neural network model, obtain the calculation amount level of the task by running the model training task; and determine the calculation amount corresponding to the task according to the obtained level and the preset relationship between calculation amount levels and calculation amounts.
In one embodiment, the task control module 606 is further configured to: acquire computation supply data corresponding to the real-time usage status; compare the computation supply data with the calculation amount corresponding to each model training task and filter out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data; and invoke the computing resources corresponding to the computation supply data to execute the model training target tasks.
In one embodiment, the task control module 606 is further configured to call a preset first process to obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and to call different preset second processes to execute the executable model training tasks, the second processes being different from the first process.
In one embodiment, the task control module 606 is further configured to call a preset process to obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and to call the same process to execute the executable model training tasks.
关于模型训练任务处理装置的具体限定可以参见上文中对于模型训练任务处理方法的限定,在此不再赘述。上述模型训练任务处理装置中的各个模块可全部或部分通过软件、硬件及其组合来实现。上述各模块可以硬件形式内嵌于或独立于计算机设备中的处理器中,也可以以软件形式存储于计算机设备中的存储器中,以便于处理器调用执行以上各个模块对应的操作。For the specific limitation of the model training task processing device, please refer to the above limitation of the model training task processing method, which will not be repeated here. Each module in the above-mentioned model training task processing device can be implemented in whole or in part by software, hardware and a combination thereof. The above-mentioned modules may be embedded in the form of hardware or independent of the processor in the computer equipment, or may be stored in the memory of the computer equipment in the form of software, so that the processor can call and execute the operations corresponding to the above-mentioned modules.
在一个实施例中,提供了一种电子设备,该电子设备可以是移动终端,图7提供了一种移动终端的内部结构图。该移动终端包括通过系统总线连接的处理器、存储器和显示屏。其中,该处理器用于提供计算和控制能力。该存储器包 括非易失性存储介质、内存储器。该非易失性存储介质存储有操作系统和计算机程序。该内存储器为非易失性存储介质中的操作系统和计算机程序的运行提供环境。图1中移动终端执行的模型训练任务处理方法具体可以由该移动终端的处理器来完成,即计算机程序被处理器执行时实现一种模型训练任务处理方法。In one embodiment, an electronic device is provided. The electronic device may be a mobile terminal. FIG. 7 provides an internal structure diagram of the mobile terminal. The mobile terminal includes a processor, a memory, and a display screen connected through a system bus. Among them, the processor is used to provide computing and control capabilities. The memory includes a non-volatile storage medium and internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium. The method for processing model training tasks performed by the mobile terminal in FIG. 1 may be specifically completed by the processor of the mobile terminal, that is, a method for processing model training tasks is implemented when a computer program is executed by the processor.
Those skilled in the art can understand that the structure shown in FIG. 7 is only a block diagram of part of the structure related to the solution of this application and does not limit the terminal to which the solution is applied; a specific mobile terminal may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In one embodiment, an electronic device is provided, including a memory and a processor. The memory stores a computer program, and the processor, when executing the computer program, implements the following steps: when the mobile terminal receives a model training task to be processed, identifying the model category corresponding to the model training task; determining the calculation amount corresponding to the model training task according to that model category and the preset relationship between model categories and calculation amounts; and acquiring the real-time usage status of the mobile terminal and controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the task.
在一个实施例中,处理器执行计算机程序时还实现以下步骤:获取移动终端的温度信息以及实时运行进程信息;根据温度信息以及实时运行进程信息,确定移动终端的实时使用状态。In one embodiment, the processor further implements the following steps when executing the computer program: acquiring temperature information and real-time running process information of the mobile terminal; and determining the real-time usage status of the mobile terminal according to the temperature information and real-time running process information.
In one embodiment, the processor, when executing the computer program, further implements the following steps: when the model category corresponding to the model training task is a neural network model, assigning the task the highest calculation amount level; when the model category is a non-neural network model, obtaining the calculation amount level of the task by running the model training task; and determining the calculation amount corresponding to the task according to the obtained level and the preset relationship between calculation amount levels and calculation amounts.
In one embodiment, the processor, when executing the computer program, further implements the following steps: acquiring computation supply data corresponding to the real-time usage status; comparing the computation supply data with the calculation amount corresponding to each model training task and filtering out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data; and invoking the computing resources corresponding to the computation supply data to execute the model training target tasks.
In one embodiment, the processor, when executing the computer program, further implements the following steps: calling a preset first process to obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and calling different preset second processes, each different from the first process, to execute the executable model training tasks.
In one embodiment, the processor, when executing the computer program, further implements the following steps: calling a preset process to obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and calling the same process to execute the executable model training tasks.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored. When the computer program is executed by a processor, the following steps are implemented: when the mobile terminal receives a model training task to be processed, identifying the model category corresponding to the model training task; determining the calculation amount corresponding to the model training task according to that model category and the preset relationship between model categories and calculation amounts; and acquiring the real-time usage status of the mobile terminal and controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the task.
在一个实施例中,计算机程序被处理器执行时还实现以下步骤:获取移动终端的温度信息以及实时运行进程信息;根据温度信息以及实时运行进程信息,确定移动终端的实时使用状态。In one embodiment, when the computer program is executed by the processor, the following steps are further implemented: acquiring temperature information and real-time running process information of the mobile terminal; and determining the real-time use status of the mobile terminal according to the temperature information and real-time running process information.
In one embodiment, when the computer program is executed by the processor, the following steps are further implemented: when the model category corresponding to the model training task is a neural network model, assigning the task the highest calculation amount level; when the model category is a non-neural network model, obtaining the calculation amount level of the task by running the model training task; and determining the calculation amount corresponding to the task according to the obtained level and the preset relationship between calculation amount levels and calculation amounts.
In one embodiment, when the computer program is executed by the processor, the following steps are further implemented: acquiring computation supply data corresponding to the real-time usage status; comparing the computation supply data with the calculation amount corresponding to each model training task and filtering out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data; and invoking the computing resources corresponding to the computation supply data to execute the model training target tasks.
In one embodiment, when the computer program is executed by the processor, the following steps are further implemented: calling a preset first process to obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks; and calling different preset second processes, each different from the first process, to execute the executable model training tasks.
In one embodiment, when the computer program is executed by the processor, the following steps are further implemented: calling a preset process to obtain the executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and calling the same process to execute the executable model training tasks.
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的计算机程序可存储于一非易失性计算机可读取存储介质中,该计算机程序在执行时,可包括如上述各方法的实施例的流程。其中,本申请所提供的各实施例中所使用的对存储器、存储、数据库或其它介质的任何引用,均可包括非易失性和/或易失性存储器。非易失性存储器可包括只读存储器(ROM)、可编程ROM(PROM)、电可编程ROM(EPROM)、电可擦除可编程ROM(EEPROM)或闪存。易失性存储器可包括随机存取存储器(RAM)或者外部高速缓冲存储器。作为说明而非局限,RAM以多种形式可得,诸如静态RAM(SRAM)、动态RAM(DRAM)、同步DRAM(SDRAM)、双数据率SDRAM(DDRSDRAM)、增强型SDRAM(ESDRAM)、同步链路(Synchlink)DRAM(SLDRAM)、存储器总线(Rambus)直接RAM(RDRAM)、直接存储器总线动态RAM(DRDRAM)、以及存储器总线动态RAM(RDRAM)等。A person of ordinary skill in the art can understand that all or part of the processes in the above-mentioned embodiment methods can be implemented by instructing relevant hardware through a computer program. The computer program can be stored in a non-volatile computer readable storage. In the medium, when the computer program is executed, it may include the processes of the above-mentioned method embodiments. Wherein, any reference to memory, storage, database or other media used in the embodiments provided in this application may include non-volatile and/or volatile memory. Non-volatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. As an illustration and not a limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous chain Channel (Synchlink) DRAM (SLDRAM), memory bus (Rambus) direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), etc.
以上实施例的各技术特征可以进行任意的组合,为使描述简洁,未对上述实施例中的各个技术特征所有可能的组合都进行描述,然而,只要这些技术特征的组合不存在矛盾,都应当认为是本说明书记载的范围。The technical features of the above embodiments can be combined arbitrarily. In order to make the description concise, all possible combinations of the technical features in the above embodiments are not described. However, as long as there is no contradiction in the combination of these technical features, they should be It is considered as the range described in this specification.
以上所述实施例仅表达了本申请的几种实施方式,其描述较为具体和详细,但并不能因此而理解为对申请专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本申请构思的前提下,还可以做出若干变形和改 进,这些都属于本申请的保护范围。因此,本申请专利的保护范围应以所附权利要求为准。The above-mentioned embodiments only express several implementation manners of the present application, and their description is relatively specific and detailed, but they should not be interpreted as a limitation on the scope of the patent application. It should be pointed out that for those of ordinary skill in the art, without departing from the concept of this application, several modifications and improvements can be made, and these all fall within the protection scope of this application. Therefore, the scope of protection of the patent of this application shall be subject to the appended claims.

Claims (24)

  1. 一种模型训练任务处理方法,所述方法包括:A method for processing model training tasks, the method comprising:
    当移动终端接收到待处理的模型训练任务时,识别所述模型训练任务对应的模型类别;When the mobile terminal receives the model training task to be processed, identifying the model category corresponding to the model training task;
    根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量;Determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
    获取所述移动终端的实时使用状态,根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行。Acquire the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  2. 根据权利要求1所述的方法,其中,所述获取所述移动终端的实时使用状态包括:The method according to claim 1, wherein said obtaining the real-time usage status of the mobile terminal comprises:
    获取所述移动终端的温度信息以及实时运行进程信息;Acquiring temperature information and real-time running process information of the mobile terminal;
    根据所述温度信息以及所述实时运行进程信息,确定所述移动终端的实时使用状态。According to the temperature information and the real-time running process information, the real-time use status of the mobile terminal is determined.
  3. 根据权利要求1所述的方法,其中,所述根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量包括:The method according to claim 1, wherein the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount comprises:
    当所述模型训练任务对应的模型类别为神经网络模型时,得到所述模型训练任务对应的运算量等级为最高等级;When the model category corresponding to the model training task is a neural network model, the calculation amount level corresponding to the model training task is obtained as the highest level;
    当所述模型训练任务对应的模型类别为非神经网络模型时,通过运行所述模型训练任务,得到所述模型训练任务对应的运算量等级;When the model category corresponding to the model training task is a non-neural network model, by running the model training task, the calculation amount level corresponding to the model training task is obtained;
    根据获得的运算量等级以及预设的运算量等级与运算量的关系,确定所述模型训练任务对应的运算量。The calculation amount corresponding to the model training task is determined according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  4. 根据权利要求1所述的方法,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The method according to claim 1, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    Acquiring computation supply data corresponding to the real-time usage status;
    Comparing the computation supply data with the calculation amount corresponding to each model training task, and filtering out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data;
    Invoking the computing resources corresponding to the computation supply data to execute the model training target tasks.
  5. 根据权利要求1所述的方法,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The method according to claim 1, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    调用预设的第一进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset first process, and obtaining an executable model training task according to the real-time usage status and the calculation amount corresponding to the model training task;
    分别调用不同的预设的第二进程执行所述可执行的模型训练任务,所述第二进程与所述第一进程为不同的进程。Different preset second processes are respectively called to execute the executable model training task, and the second process and the first process are different processes.
  6. 根据权利要求1所述的方法,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The method according to claim 1, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    调用预设的进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset process, and obtaining an executable model training task according to the real-time use status and the calculation amount corresponding to the model training task;
    调用同一个进程执行所述可执行的模型训练任务。Call the same process to execute the executable model training task.
  7. 一种模型训练任务处理装置,所述装置包括:A model training task processing device, the device comprising:
    任务识别模块,用于当移动终端接收到待处理的模型训练任务时,识别所述模型训练任务对应的模型类别;The task recognition module is used to recognize the model category corresponding to the model training task when the mobile terminal receives the to-be-processed model training task;
    运算量确定模块,用于根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量;The calculation amount determination module is configured to determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
    任务控制模块,用于获取所述移动终端的实时使用状态,根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行。The task control module is used to obtain the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  8. The device according to claim 7, wherein the task control module is further configured to acquire temperature information and real-time running process information of the mobile terminal, and to determine the real-time usage status of the mobile terminal according to the temperature information and the real-time running process information.
  9. The device according to claim 7, wherein the calculation amount determination module is further configured to: when the model category corresponding to the model training task is a neural network model, obtain the calculation amount level corresponding to the model training task as the highest level; when the model category corresponding to the model training task is a non-neural network model, obtain the calculation amount level corresponding to the model training task by running the model training task; and determine the calculation amount corresponding to the model training task according to the obtained calculation amount level and the preset relationship between calculation amount levels and calculation amounts.
  10. The device according to claim 7, wherein the task control module is further configured to acquire computation supply data corresponding to the real-time usage status; compare the computation supply data with the calculation amount corresponding to each model training task and filter out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data; and invoke the computing resources corresponding to the computation supply data to execute the model training target tasks.
  11. The device according to claim 7, wherein the task control module is further configured to call a preset first process to obtain executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and to call different preset second processes to execute the executable model training tasks, the second processes being different from the first process.
  12. The device according to claim 7, wherein the task control module is further configured to call a preset process to obtain executable model training tasks according to the real-time usage status and the calculation amounts corresponding to the model training tasks, and to call the same process to execute the executable model training tasks.
  13. 一种电子设备,包括存储器和处理器,所述存储器存储有计算机程序,所述处理器执行所述计算机程序时实现模型训练任务处理方法,所述方法包括:An electronic device includes a memory and a processor, the memory stores a computer program, and when the processor executes the computer program, a method for processing a model training task is implemented, and the method includes:
    当移动终端接收到待处理的模型训练任务时,识别所述模型训练任务对应的模型类别;When the mobile terminal receives the model training task to be processed, identifying the model category corresponding to the model training task;
    根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量;Determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
    获取所述移动终端的实时使用状态,根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行。Acquire the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  14. 根据权利要求13所述的电子设备,其中,所述获取所述移动终端的实时使用状态包括:The electronic device according to claim 13, wherein said acquiring the real-time usage status of the mobile terminal comprises:
    获取所述移动终端的温度信息以及实时运行进程信息;Acquiring temperature information and real-time running process information of the mobile terminal;
    根据所述温度信息以及所述实时运行进程信息,确定所述移动终端的实时 使用状态。According to the temperature information and the real-time running process information, the real-time usage status of the mobile terminal is determined.
  15. 根据权利要求13所述的电子设备,其中,所述根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量包括:The electronic device according to claim 13, wherein the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount comprises:
    当所述模型训练任务对应的模型类别为神经网络模型时,得到所述模型训练任务对应的运算量等级为最高等级;When the model category corresponding to the model training task is a neural network model, the calculation amount level corresponding to the model training task is obtained as the highest level;
    当所述模型训练任务对应的模型类别为非神经网络模型时,通过运行所述模型训练任务,得到所述模型训练任务对应的运算量等级;When the model category corresponding to the model training task is a non-neural network model, by running the model training task, the calculation amount level corresponding to the model training task is obtained;
    根据获得的运算量等级以及预设的运算量等级与运算量的关系,确定所述模型训练任务对应的运算量。The calculation amount corresponding to the model training task is determined according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  16. 根据权利要求13所述的电子设备,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The electronic device according to claim 13, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    Acquiring computation supply data corresponding to the real-time usage status;
    Comparing the computation supply data with the calculation amount corresponding to each model training task, and filtering out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data;
    Invoking the computing resources corresponding to the computation supply data to execute the model training target tasks.
  17. 根据权利要求13所述的电子设备,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The electronic device according to claim 13, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    调用预设的第一进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset first process, and obtaining an executable model training task according to the real-time usage status and the calculation amount corresponding to the model training task;
    分别调用不同的预设的第二进程执行所述可执行的模型训练任务,所述第二进程与所述第一进程为不同的进程。Different preset second processes are respectively called to execute the executable model training task, and the second process and the first process are different processes.
  18. 根据权利要求13所述的电子设备,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The electronic device according to claim 13, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    调用预设的进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset process, and obtaining an executable model training task according to the real-time use status and the calculation amount corresponding to the model training task;
    调用同一个进程执行所述可执行的模型训练任务。Call the same process to execute the executable model training task.
  19. 一种计算机可读存储介质,其上存储有计算机程序,所述计算机程序被处理器执行时实现模型训练任务处理方法,所述方法包括:A computer-readable storage medium has a computer program stored thereon, and when the computer program is executed by a processor, a method for processing a model training task is implemented, the method comprising:
    当移动终端接收到待处理的模型训练任务时,识别所述模型训练任务对应的模型类别;When the mobile terminal receives the model training task to be processed, identifying the model category corresponding to the model training task;
    根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量;Determine the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the preset relationship between the model category and the calculation amount;
    获取所述移动终端的实时使用状态,根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行。Acquire the real-time usage status of the mobile terminal, and control the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task.
  20. 根据权利要求19所述的计算机可读存储介质,其中,所述获取所述移动终端的实时使用状态包括:The computer-readable storage medium according to claim 19, wherein said obtaining the real-time usage status of the mobile terminal comprises:
    获取所述移动终端的温度信息以及实时运行进程信息;Acquiring temperature information and real-time running process information of the mobile terminal;
    根据所述温度信息以及所述实时运行进程信息,确定所述移动终端的实时使用状态。According to the temperature information and the real-time running process information, the real-time use status of the mobile terminal is determined.
  21. 根据权利要求19所述的计算机可读存储介质,其中,所述根据所述模型训练任务对应的模型类别以及预设的模型类别与运算量的关系,确定所述模型训练任务对应的运算量包括:The computer-readable storage medium according to claim 19, wherein the determining the calculation amount corresponding to the model training task according to the model category corresponding to the model training task and the relationship between the preset model category and the calculation amount comprises :
    当所述模型训练任务对应的模型类别为神经网络模型时,得到所述模型训练任务对应的运算量等级为最高等级;When the model category corresponding to the model training task is a neural network model, the calculation amount level corresponding to the model training task is obtained as the highest level;
    当所述模型训练任务对应的模型类别为非神经网络模型时,通过运行所述模型训练任务,得到所述模型训练任务对应的运算量等级;When the model category corresponding to the model training task is a non-neural network model, by running the model training task, the calculation amount level corresponding to the model training task is obtained;
    根据获得的运算量等级以及预设的运算量等级与运算量的关系,确定所述模型训练任务对应的运算量。The calculation amount corresponding to the model training task is determined according to the obtained calculation amount level and the preset relationship between the calculation amount level and the calculation amount.
  22. 根据权利要求19所述的计算机可读存储介质,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The computer-readable storage medium according to claim 19, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    Acquiring computation supply data corresponding to the real-time usage status;
    Comparing the computation supply data with the calculation amount corresponding to each model training task, and filtering out the model training target tasks whose calculation amounts fall within the capacity indicated by the computation supply data;
    Invoking the computing resources corresponding to the computation supply data to execute the model training target tasks.
  23. 根据权利要求19所述的计算机可读存储介质,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The computer-readable storage medium according to claim 19, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    调用预设的第一进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset first process, and obtaining an executable model training task according to the real-time usage status and the calculation amount corresponding to the model training task;
    分别调用不同的预设的第二进程执行所述可执行的模型训练任务,所述第二进程与所述第一进程为不同的进程。Different preset second processes are respectively called to execute the executable model training task, and the second process and the first process are different processes.
  24. 根据权利要求19所述的计算机可读存储介质,其中,所述根据所述实时使用状态以及所述模型训练任务对应的运算量,控制所述模型训练任务的执行包括:The computer-readable storage medium according to claim 19, wherein the controlling the execution of the model training task according to the real-time usage status and the calculation amount corresponding to the model training task comprises:
    调用预设的进程,根据所述实时使用状态以及所述模型训练任务对应的运算量,获得可执行的模型训练任务;Calling a preset process, and obtaining an executable model training task according to the real-time use status and the calculation amount corresponding to the model training task;
    调用同一个进程执行所述可执行的模型训练任务。Call the same process to execute the executable model training task.
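For illustration only (not part of the claims or the original filing), the following minimal Python sketch shows one way the control flow recited in claims 19 and 20 above could be arranged: identify the model category of an incoming training task, look up its calculation amount from a preset table, derive a real-time usage status from device temperature and running-process information, and decide whether to execute the task now or defer it. All names, thresholds, and the category-to-amount table are hypothetical assumptions, not values taken from the application.

```python
from __future__ import annotations
from dataclasses import dataclass
from enum import Enum

class UsageStatus(Enum):
    IDLE = "idle"   # cool device, light process load
    BUSY = "busy"   # hot device or heavy process load

@dataclass
class TrainingTask:
    name: str
    model_category: str            # e.g. "neural_network", "decision_tree"
    calculation_amount: float = 0.0

# Hypothetical preset relationship between model category and calculation amount.
CATEGORY_TO_CALC_AMOUNT = {"neural_network": 100.0, "decision_tree": 10.0, "linear_model": 1.0}

def get_usage_status(temperature_c: float, running_processes: list[str]) -> UsageStatus:
    """Claim 20: derive the real-time usage status from temperature and process info."""
    if temperature_c > 40.0 or len(running_processes) > 20:   # illustrative thresholds
        return UsageStatus.BUSY
    return UsageStatus.IDLE

def run_training(task: TrainingTask) -> None:
    print(f"training {task.name} (calculation amount {task.calculation_amount})")

def handle_training_task(task: TrainingTask, temperature_c: float,
                         running_processes: list[str]) -> bool:
    """Claim 19: category -> calculation amount -> usage status -> execution control."""
    task.calculation_amount = CATEGORY_TO_CALC_AMOUNT.get(task.model_category, 50.0)
    if get_usage_status(temperature_c, running_processes) is UsageStatus.IDLE:
        run_training(task)
        return True
    return False   # defer the task until the terminal is idle again
```

For example, a call such as handle_training_task(TrainingTask("scene_model", "neural_network"), 35.0, ["launcher"]) would train immediately on an idle device and return True.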
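Similarly, a sketch of the calculation amount determination of claim 21, under the added assumption that a non-neural-network task exposes a hypothetical trial_run() method whose duration is used to estimate its calculation amount level; the level values and the level-to-amount table are illustrative only.

```python
import time

HIGHEST_LEVEL = 3
# Hypothetical preset relationship between calculation amount level and calculation amount.
LEVEL_TO_CALC_AMOUNT = {1: 1.0, 2: 10.0, 3: 100.0}

def calculation_amount_level(task) -> int:
    """Claim 21: neural network models get the highest level; other models are measured."""
    if task.model_category == "neural_network":
        return HIGHEST_LEVEL
    start = time.perf_counter()
    task.trial_run()                       # hypothetical: run a small slice of the training task
    elapsed = time.perf_counter() - start
    return 2 if elapsed > 1.0 else 1       # illustrative one-second threshold

def calculation_amount(task) -> float:
    """Map the obtained level back to a concrete calculation amount."""
    return LEVEL_TO_CALC_AMOUNT[calculation_amount_level(task)]
```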
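For the screening step of claim 22, a sketch that compares each task's calculation amount with the calculation amount the current usage status can provide and keeps only the tasks inside that bearing range; run_on_resources stands in for whatever executor backs the provision data and is purely hypothetical.

```python
def select_target_tasks(tasks, provided_calc_amount: float):
    """Claim 22: screen out the model training target tasks whose calculation amount
    fits within the bearing range of the calculation amount provision data."""
    return [t for t in tasks if t.calculation_amount <= provided_calc_amount]

def execute_target_tasks(tasks, provided_calc_amount: float, run_on_resources):
    """Execute each selected target task on the computing resources corresponding
    to the provision data; run_on_resources is a caller-supplied callable."""
    for task in select_target_tasks(tasks, provided_calc_amount):
        run_on_resources(task)
```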
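Finally, a sketch of the two process arrangements in claims 23 and 24, assuming Python's multiprocessing module as the process mechanism: a first (scheduling) process decides which tasks are executable, then either hands each one to a distinct second process (claim 23) or executes them itself in the same process (claim 24). The task names and the separate_processes switch are illustrative.

```python
from __future__ import annotations
import multiprocessing as mp

def train(task_name: str) -> None:
    # Body of a single executable model training task.
    print(f"[{mp.current_process().name}] training {task_name}")

def schedule_and_run(executable_tasks: list[str], separate_processes: bool = True) -> None:
    """First/preset process: the executable tasks have already been screened; either
    spawn a different second process per task (claim 23) or run them in this process (claim 24)."""
    if separate_processes:
        workers = [mp.Process(target=train, args=(name,)) for name in executable_tasks]
        for p in workers:
            p.start()
        for p in workers:
            p.join()
    else:
        for name in executable_tasks:
            train(name)

if __name__ == "__main__":
    schedule_and_run(["scene_recognition_model", "input_prediction_model"])
```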
PCT/CN2021/091635 2020-05-08 2021-04-30 Model training task processing method and apparatus, electronic device, and storage medium WO2021223686A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010381895.3 2020-05-08
CN202010381895.3A CN111738404B (en) 2020-05-08 2020-05-08 Model training task processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
WO2021223686A1

Family

ID=72647036

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/091635 WO2021223686A1 (en) 2020-05-08 2021-04-30 Model training task processing method and apparatus, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN111738404B (en)
WO (1) WO2021223686A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111738404B (en) * 2020-05-08 2024-01-12 深圳市万普拉斯科技有限公司 Model training task processing method and device, electronic equipment and storage medium
CN114358302A (en) * 2020-10-14 2022-04-15 华为云计算技术有限公司 Artificial intelligence AI training method, system and equipment
CN113742059B (en) * 2021-07-15 2024-03-29 上海朋熙半导体有限公司 Task allocation method, device, computer equipment and storage medium

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140113270A1 (en) * 2012-10-19 2014-04-24 George E. Danis System and method for mobile automated training
CN103577270B (en) * 2013-10-30 2017-05-17 宇龙计算机通信科技(深圳)有限公司 Use method for controlling split type mobile terminal and split type mobile terminal
WO2017045157A1 (en) * 2015-09-16 2017-03-23 Intel Corporation Facial expression recognition using relations determined by class-to-class comparisons
CN108027889B (en) * 2016-01-25 2020-07-28 华为技术有限公司 Training and scheduling method for incremental learning cloud system and related equipment
CN106095556A (en) * 2016-06-20 2016-11-09 惠州Tcl移动通信有限公司 A kind of method and system controlling terminal processes
CN106909488A (en) * 2017-02-23 2017-06-30 深圳市金立通信设备有限公司 A kind of cpu temperature control method and terminal
CN108734293B (en) * 2017-04-13 2023-05-02 北京京东尚科信息技术有限公司 Task management system, method and device
CN107145395B (en) * 2017-07-04 2020-12-08 北京百度网讯科技有限公司 Method and device for processing task
CN107977268B (en) * 2017-10-13 2021-07-20 北京百度网讯科技有限公司 Task scheduling method and device for artificial intelligence heterogeneous hardware and readable medium
CN109768869B (en) * 2017-11-06 2022-05-31 中国移动通信有限公司研究院 Service prediction method, system and computer storage medium
CN111338776B (en) * 2017-12-28 2023-11-28 中科寒武纪科技股份有限公司 Scheduling method and related device
CN111105006B (en) * 2018-10-26 2023-08-04 杭州海康威视数字技术股份有限公司 Deep learning network training system and method
CN109828833B (en) * 2018-11-02 2020-09-29 上海帆一尚行科技有限公司 Queuing system and method for neural network training task
CN109446783B (en) * 2018-11-16 2023-07-25 山东浪潮科学研究院有限公司 Image recognition efficient sample collection method and system based on machine crowdsourcing
CN109634748A (en) * 2018-12-12 2019-04-16 深圳前海微众银行股份有限公司 Cluster resource dispatching method, device, equipment and computer readable storage medium
CN109559734B (en) * 2018-12-18 2022-02-18 百度在线网络技术(北京)有限公司 Acceleration method and device for acoustic model training
CN110413396B (en) * 2019-07-30 2022-02-15 广东工业大学 Resource scheduling method, device and equipment and readable storage medium
CN110618870B (en) * 2019-09-20 2021-11-19 广东浪潮大数据研究有限公司 Working method and device for deep learning training task
CN110796245B (en) * 2019-10-25 2022-03-22 浪潮电子信息产业股份有限公司 Method and device for calculating convolutional neural network model
CN110928689B (en) * 2019-12-05 2020-08-25 中国人民解放军军事科学院国防科技创新研究院 Self-adaptive resource management method and device for distributed reinforcement learning training
CN111104222B (en) * 2019-12-16 2023-06-30 上海众源网络有限公司 Task processing method, device, computer equipment and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190012575A1 (en) * 2017-07-04 2019-01-10 Beijing Baidu Netcom Science And Technology Co., Ltd. Method, apparatus and system for updating deep learning model
CN107480717A (en) * 2017-08-16 2017-12-15 北京奇虎科技有限公司 Train job processing method and system, computing device, computer-readable storage medium
CN110689134A (en) * 2018-07-05 2020-01-14 第四范式(北京)技术有限公司 Method, apparatus, device and storage medium for performing machine learning process
CN110750342A (en) * 2019-05-23 2020-02-04 北京嘀嘀无限科技发展有限公司 Scheduling method, scheduling device, electronic equipment and readable storage medium
CN110175679A (en) * 2019-05-29 2019-08-27 深圳前海微众银行股份有限公司 A kind of method and device of monitoring model training
CN111738404A (en) * 2020-05-08 2020-10-02 深圳市万普拉斯科技有限公司 Model training task processing method and device, electronic equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117086866A (en) * 2023-08-07 2023-11-21 广州中鸣数码科技有限公司 Task planning training method and device based on programming robot
CN117086866B (en) * 2023-08-07 2024-04-12 广州中鸣数码科技有限公司 Task planning training method and device based on programming robot

Also Published As

Publication number Publication date
CN111738404A (en) 2020-10-02
CN111738404B (en) 2024-01-12

Similar Documents

Publication Title
WO2021223686A1 (en) Model training task processing method and apparatus, electronic device, and storage medium
WO2020134991A1 (en) Automatic input method for paper form, apparatus , and computer device and storage medium
CN111145910A (en) Abnormal case identification method and device based on artificial intelligence and computer equipment
WO2021012790A1 (en) Page data generation method and apparatus, computer device, and storage medium
WO2021004324A1 (en) Resource data processing method and apparatus, and computer device and storage medium
CN109815803B (en) Face examination risk control method and device, computer equipment and storage medium
CN112418059B (en) Emotion recognition method and device, computer equipment and storage medium
WO2022252454A1 (en) Abnormal data detection method and apparatus, computer device, and readable storage medium
WO2021068607A1 (en) Multi-system multi-store order integrating method and apparatus, computer device and storage medium
WO2020034801A1 (en) Medical feature screening method and apparatus, computer device, and storage medium
CN111400126B (en) Network service abnormal data detection method, device, equipment and medium
CN112330078B (en) Power consumption prediction method and device, computer equipment and storage medium
CN113435912A (en) Data analysis method, device, equipment and medium based on client portrait
CN108182633A (en) Loan data processing method, device, computer equipment and storage medium
WO2021012861A1 (en) Method and apparatus for evaluating data query time consumption, and computer device and storage medium
DE112021001422T5 (en) Algorithmic learning engine for dynamically generating predictive analytics from high-volume, high-speed streaming data
CN114238715A (en) Question-answering system based on social aid, construction method, computer equipment and medium
CN117252362A (en) Scheduling method and device based on artificial intelligence, computer equipment and storage medium
CN109345184A (en) Nodal information processing method, device, computer equipment and storage medium based on micro- expression
CN111898035A (en) Data processing strategy configuration method and device based on Internet of things and computer equipment
CN116431619A (en) User portrait crowd life cycle control method and device
CN115409345A (en) Service index calculation method and device, computer equipment and storage medium
US20230027309A1 (en) System and method for image de-identification to humans while remaining recognizable by machines
CN112000428B (en) JVM optimization method and device based on machine learning and electronic device
CN113033894A (en) Daily electricity consumption prediction method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21799581

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15.03.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21799581

Country of ref document: EP

Kind code of ref document: A1