CN114168318A - Training method of storage release model, storage release method and equipment - Google Patents

Training method of storage release model, storage release method and equipment

Info

Publication number
CN114168318A
CN114168318A (application CN202111330010.8A)
Authority
CN
China
Prior art keywords
storage
release
target
sample data
release model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111330010.8A
Other languages
Chinese (zh)
Inventor
周培烁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jinan Inspur Data Technology Co Ltd
Original Assignee
Jinan Inspur Data Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jinan Inspur Data Technology Co Ltd filed Critical Jinan Inspur Data Technology Co Ltd
Priority to CN202111330010.8A
Publication of CN114168318A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 - Multiprogramming arrangements
    • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5005 - Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F 9/5011 - Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resources being hardware resources other than CPUs, Servers and Terminals
    • G06F 9/5022 - Mechanisms to release resources
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a training method of a storage release model, a storage release method and a device. The method comprises: acquiring multiple groups of sample data and labels thereof, wherein each group of sample data comprises the access frequencies of multiple storage objects and the costs of the storage objects, and the label of each group of sample data is the target release storage object; inputting each group of sample data into a storage release model and outputting the predicted release storage object corresponding to each group of sample data; and performing a loss function calculation based on the predicted release storage object and the corresponding target release storage object, and updating parameters of the storage release model to determine a target storage release model. The target storage release model constructed by the invention optimizes the file release strategy when Alluxio performs tiered storage across media: on top of considering file access frequency when releasing space, it also takes into account the space release cost of each storage medium and the data access efficiency of different media, which reduces the block release cost and improves the operation efficiency of the system.

Description

Training method of storage release model, storage release method and equipment
Technical Field
The invention relates to the technical field of data storage processing, in particular to a training method of a storage release model, a storage release method and equipment.
Background
Alluxio is a memory-centric virtual distributed storage system commonly used in big data platforms. Its data is usually stored between the computing layer and the storage layer in the form of file blocks, so that the computing layer can access relevant data more quickly and efficiently through Alluxio. In practical applications, tiered storage across storage media (such as MEM, SSD and HDD) is one of its typical application scenarios, and the process is generally as follows:
1. Alluxio is configured in a three-tier storage mode with MEM, SSD and HDD, in that order, as the candidate storage objects.
2. A high-tier-first storage rule is adopted: when the higher-tier storage object has enough space, data is stored in it by default.
3. When the space of the higher-tier storage object overflows, storage space is sought in the next tier. If none of MEM, SSD and HDD has free storage space, the default space release algorithm of the Alluxio system is invoked to release a storage object and allocate storage space for the new data.
The default space release algorithm of the Alluxio system works as follows:
The least recently used storage objects are released in order of access frequency. After a write operation is initiated, if the Alluxio master node finds that none of the three tiers has enough space to allocate, the space release algorithm releases the space occupied by the least-used storage object and allocates it to the data required by the write operation; it considers only the access frequency of the storage objects and does not distinguish between the tiers whose space is being released.
However, releasing data from a storage object based on access frequency alone means that, when new data needs to be stored, it ends up in an essentially arbitrary storage object. This increases the cost of data release and lowers the reliability of storage release.
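For illustration only, the sketch below (not Alluxio's actual evictor code; names and data are hypothetical) shows a purely frequency-based eviction choice of the kind described above, which is blind to the release cost of the tier a block sits on:

```python
# Hypothetical sketch of a frequency-only eviction policy like the one described
# above. It picks the least-used block regardless of which tier it occupies, so the
# release cost of that tier never enters the decision.
def choose_block_to_evict(blocks):
    """blocks: list of dicts like {"id": ..., "tier": "MEM"|"SSD"|"HDD", "access_freq": int}.
    Returns the block with the lowest access frequency."""
    return min(blocks, key=lambda b: b["access_freq"])

blocks = [
    {"id": "blk-1", "tier": "MEM", "access_freq": 12},
    {"id": "blk-2", "tier": "SSD", "access_freq": 3},
    {"id": "blk-3", "tier": "HDD", "access_freq": 3},
]
print(choose_block_to_evict(blocks))  # picks a freq-3 block, blind to tier cost
```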
Disclosure of Invention
In view of this, embodiments of the present invention provide a training method of a storage release model, a storage release method and a device, aiming to solve the problem of low storage release reliability.
According to a first aspect, an embodiment of the present invention provides a method for training a storage release model, including:
acquiring multiple groups of sample data and labels thereof, wherein each group of sample data comprises the access frequencies of multiple storage objects and the costs of the storage objects, and the label of each group of sample data is the target release storage object;
inputting each group of sample data into a storage release model, and outputting the predicted release storage object corresponding to each group of sample data;
and performing a loss function calculation based on the predicted release storage object and the corresponding target release storage object, and updating parameters of the storage release model to determine a target storage release model.
The training method of the storage release model provided by the embodiment of the invention is applied to the Alluxio system, whose data is usually stored between the computing layer and the storage layer in the form of storage objects. Based on a target storage release model generated through deep learning from multiple groups of sample data and their labels, the predicted release storage object is calculated from the access frequency and the cost of each storage object, the loss function is computed between the predicted release storage object and the target release storage object, the parameters of the storage release model are updated based on the result, and the target storage release model is finally determined. Because both influencing factors, access frequency and cost, are taken into account, the accuracy of selecting the release object for the sample data is improved and its release cost is minimized. The target storage release model is deployed into the Alluxio system, so that the computing layer can access relevant data more quickly and efficiently through Alluxio.
With reference to the first aspect, in a first implementation manner of the first aspect, the storage release model includes a first input node, a second input node and an output node, and inputting each group of the sample data into the storage release model and outputting the predicted release storage object corresponding to each group of the sample data includes:
for each group of the sample data, inputting the access frequency of each storage object into the storage release model through the first input node;
inputting the cost of each storage object into the storage release model through the second input node;
acquiring a first weight corresponding to a first input node and a second weight corresponding to a second input node;
and determining the predicted release storage object output by the output node according to the first weight together with the access frequency and the second weight together with the cost.
According to the training method of the storage release model provided by the embodiment of the invention, on the basis of considering the access frequency of the storage objects in the target release storage object, the cost of the storage objects and the access efficiency of different storage objects are also taken into account, so that both factors in the sample data influence the determination of the final target storage release model.
With reference to the first implementation manner of the first aspect, in a second implementation manner of the first aspect, performing a loss function calculation based on the predicted release storage object and the corresponding target release storage object and updating parameters of the storage release model to determine the target storage release model includes:
calculating a loss between the predicted release storage object and the target release storage object through a target loss function and computing a gradient;
adjusting the first weight and the second weight based on the gradient.
According to the training method of the storage release model provided by the embodiment of the invention, on the basis of considering the access frequency of the storage objects in the target release storage object, the cost of the storage objects and the access efficiency of different storage objects are also taken into account, and the weights of these factors are adjusted accordingly to determine the target release storage object, which reduces the release cost of the storage objects and improves the operation efficiency of the system.
With reference to the second embodiment of the first aspect, in a third embodiment of the first aspect, adjusting the first weight and the second weight based on the gradient includes:
analyzing changes of the first weight and the second weight determined by each training;
and when the change of the first weight and the second weight is within a preset range, generating a first target weight and a second target weight.
According to the training method of the storage release model provided by the embodiment of the invention, since the first weight and the second weight are continuously adjusted while multiple groups of sample data and their labels are analyzed, the output predicted release storage object comes closer and closer to the target release storage object; the change of the first weight and the second weight therefore falls within the preset range and gradually stabilizes, and the stabilized first weight and second weight finally yield the first target weight and the second target weight.
According to a second aspect, an embodiment of the present invention further provides a storage release method, where the method includes:
acquiring access frequency and cost of a plurality of storage objects to be released;
inputting the access frequency and the cost of the plurality of storage objects to be released into a target storage release model, and determining a target storage object among the plurality of storage objects to be released, wherein the target storage release model is obtained by training according to the above training method of the storage release model;
and releasing the target storage object.
The storage release method provided by the embodiment of the invention is applied to the Alluxio system, whose data is usually stored between the computing layer and the storage layer in the form of storage objects. The target storage object is calculated by the target storage release model from the access frequency and the cost of each storage object and is then released, so that the storage objects to be released gain free storage space and the computing layer can access relevant data more quickly and efficiently through Alluxio.
With reference to the second aspect, in a first embodiment of the second aspect, the method further comprises:
acquiring data to be stored;
and storing the data to be stored into the target storage object.
The storage release method provided by the embodiment of the invention stores the data to be stored into the storage space of the target storage object, so that the computing layer can access the related data more quickly and efficiently through the Alluxio system.
According to a third aspect, an embodiment of the present invention further provides a training apparatus for a memory release model, including:
the first acquisition module is used for acquiring multiple groups of sample data and labels thereof, wherein each group of sample data comprises the access frequencies of multiple storage objects and the costs of the storage objects, and the label of each group of sample data is the target release storage object;
the first input module is used for inputting each group of sample data into the storage release model and outputting the predicted release storage object corresponding to each group of sample data;
and the target module is used for performing a loss function calculation based on the predicted release storage object and the corresponding target release storage object, and updating parameters of the storage release model to determine a target storage release model.
The training device for the storage release model provided by the embodiment of the invention is applied to the Alluxio system, whose data is usually stored between the computing layer and the storage layer in the form of storage objects. Based on a target storage release model generated through deep learning from multiple groups of sample data and their labels, the predicted release storage object is calculated from the access frequency and the cost of each storage object, the loss function is computed between the predicted release storage object and the target release storage object, the parameters of the storage release model are updated based on the result, and the target storage release model is finally determined. Because both influencing factors, access frequency and cost, are taken into account, the accuracy of selecting the release object for the sample data is improved and its release cost is minimized. The target storage release model is deployed into the Alluxio system, so that the computing layer can access relevant data more quickly and efficiently through Alluxio.
According to a fourth aspect, an embodiment of the present invention further provides a storage release apparatus, including:
the second acquisition module is used for acquiring the access frequency and the cost of a plurality of storage objects to be released;
the second input module is used for inputting the access frequency and the cost of the plurality of storage objects to be released into a target storage release model, and determining a target storage object in the plurality of storage objects to be released, wherein the target storage release model is obtained by training according to the training method of the storage release model;
and the releasing module is used for releasing the target storage object.
The storage release device provided by the embodiment of the invention is applied to the Alluxio system, whose data is usually stored between the computing layer and the storage layer in the form of storage objects. The target storage object is calculated by the target storage release model from the access frequency and the cost of each storage object and is then released, so that the released storage object has free storage space and the computing layer can access relevant data more quickly and efficiently through Alluxio.
According to a fifth aspect, an embodiment of the present invention provides an electronic device, including a memory and a processor which are communicatively connected to each other, wherein the memory stores computer instructions, and the processor executes the computer instructions to perform the training method of the storage release model according to the first aspect or any one of its embodiments, or to perform the storage release method according to the second aspect or any one of its embodiments.
According to a sixth aspect, an embodiment of the present invention provides a computer-readable storage medium storing computer instructions for causing a computer to execute the training method of the storage release model according to the first aspect or any one of its embodiments, or the storage release method according to the second aspect or any one of its embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of a training method of a storage release model provided by an embodiment of the invention;
FIG. 2 is a flow chart illustrating a method for releasing storage according to an embodiment of the present invention;
FIG. 3 is a functional block diagram of a training apparatus of a storage release model provided by an embodiment of the present invention;
FIG. 4 is a functional block diagram of a storage release device provided by an embodiment of the present invention;
fig. 5 is a schematic diagram of a hardware structure of an electronic device to which an embodiment of the present invention is applied.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, for the training method of the storage release model provided in this embodiment of the present application, the execution subject may be a training device of the storage release model, and this training device may be implemented, in software, hardware or a combination of the two, as part or all of a computer device, where the computer device may be a server or a terminal. The server in this embodiment of the present application may be one server or a server cluster composed of multiple servers, and the terminal may be a smart phone, a personal computer, a tablet computer, a wearable device, an intelligent robot or other intelligent hardware device. In the following method embodiments, an electronic device is taken as the execution subject by way of example.
In an embodiment of the present application, as shown in fig. 1, a training method of a storage release model is provided, which is described by taking its application to an electronic device as an example, and includes the following steps:
s100, obtaining a plurality of groups of sample data and labels thereof, wherein each group of sample data comprises access frequencies of a plurality of storage objects and the cost of the storage objects, and the labels release the storage objects for targets in each group of sample data.
In this embodiment, the storage objects are the three-tier storage objects MEM, SSD and HDD, and their release costs are recorded as cost 1, cost 2 and cost 3 respectively. When the storage space of the three tiers is full, the optimal storage object to release, i.e. the target release storage object, needs to be evaluated according to the access frequency of each storage object and its cost.
After a storage object is filled with data, the data in it is accessed and the access frequency of each piece of data is recorded, so that the access frequency of each piece of data over a given period can be obtained. When the storage objects are full, the data with lower access frequency needs to be released so that subsequent data can continue to be stored; and because the storage objects differ, their data processing efficiency also differs.
As an example of sample data, suppose there are 3 groups of sample data, each involving 3 storage objects, and one of the 3 storage objects must be chosen for release. The target release storage object is the storage object that should actually be released. Through training of the storage release model, the release object predicted by the model becomes the same as the target release storage object.
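Purely as an illustration of the sample data described above, one group and its label might be represented as follows (the field names and values are assumptions, not taken from the patent):

```python
# One hypothetical group of sample data: per-object access frequency and release cost,
# plus a label naming the object that should actually be released (the target).
sample = {
    "access_freq": {"MEM": 25, "SSD": 7, "HDD": 2},        # accesses observed in a window
    "cost":        {"MEM": 1.0, "SSD": 2.5, "HDD": 4.0},   # cost 1 / cost 2 / cost 3 above
}
label = "HDD"  # target release storage object for this group
```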
And S200, inputting each group of sample data into a storage release model, and outputting the predicted release storage object corresponding to each group of sample data.
In this embodiment, each group of sample data is input into the storage release model and the predicted release storage object corresponding to each group is output, so that a plurality of data samples pass through the storage release model, and the sample data facilitates verification of the subsequently generated storage release model.
In an optional embodiment, take the storage objects MEM, SSD and HDD as an example. For a group of sample data whose lowest-access-frequency data exists in MEM, SSD and HDD, the sample data is input into the storage release model; if the output predicted release storage object is MEM, it is the target release storage object, while if the output predicted release storage object is SSD or HDD, the predicted release storage object deviates from the target release storage object.
S300, performing a loss function calculation based on the predicted release storage object and the corresponding target release storage object, and updating parameters of the storage release model to determine a target storage release model.
The training method of the storage release model provided by the embodiment of the invention is applied to the Alluxio system, whose data is usually stored between the computing layer and the storage layer in the form of storage objects. Based on a target storage release model generated through deep learning from multiple groups of sample data and their labels, the predicted release storage object is calculated from the access frequency and the cost of each storage object, the loss function is computed between the predicted release storage object and the target release storage object, the parameters of the storage release model are updated based on the result, and the target storage release model is finally determined and imported into the Alluxio system, so that the computing layer can access relevant data more quickly and efficiently through Alluxio.
In this embodiment, parameters of the storage release model such as the learning rate, dropout rate, number of network layers, optimizer and number of neurons are tuned on a validation set generated from the multiple groups of sample data, so that the network has better feature-fitting and decision-making capability. At this point the agreement between the output of the storage release model and the true labels reaches 95% or more; the specific accuracy can be set according to the actual situation and is not limited here. The test set is then input into the storage release model that performs best on the validation set for final performance evaluation, and the quality of the storage release model is measured by the test result. When the result meets the test requirement, the parameters of the storage release model are saved and the model is imported into the Alluxio system as an auxiliary component to select the target release storage object at minimum cost, so that the computing layer can access relevant data more quickly and efficiently through Alluxio.
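As a minimal sketch of the data-set preparation described above, the split below divides the labelled groups into training, validation and test sets; the split fractions are assumptions, not prescribed by the patent text:

```python
# Illustrative train/validation/test split of the labelled sample groups.
def split(samples, train_frac=0.7, val_frac=0.15):
    n = len(samples)
    a, b = int(n * train_frac), int(n * (train_frac + val_frac))
    return samples[:a], samples[a:b], samples[b:]

train_set, val_set, test_set = split(list(range(100)))  # stand-in data
print(len(train_set), len(val_set), len(test_set))      # 70 15 15
```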
In an optional embodiment of the present application, as shown in fig. 1, the storage release model includes a first input node, a second input node and an output node, and the step in S200 of inputting each group of sample data into the storage release model and outputting the predicted release storage object corresponding to each group of sample data includes:
(1) for each group of the sample data, inputting the access frequency of each storage object into the storage release model through the first input node;
(2) inputting the cost of each storage object into the storage release model through the second input node;
(3) acquiring a first weight corresponding to a first input node and a second weight corresponding to a second input node;
(4) and determining the predicted release storage object output by the output node according to the first weight together with the access frequency and the second weight together with the cost.
In this embodiment, the first weight is denoted x and the second weight y. According to the formula release score = access frequency × x + cost × y, the storage object with the minimum score is released as the optimal choice, and the values of x and y can be learned from the multiple groups of sample data.
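A minimal sketch of the decision rule implied by that formula, with placeholder weight values standing in for the learned x and y:

```python
# Sketch of the release decision implied by the formula above: the object minimizing
# access_freq * x + cost * y is released. The weights x and y below are placeholders;
# in the model they are learned from the sample data.
def pick_target_release(access_freq, cost, x, y):
    scores = {obj: access_freq[obj] * x + cost[obj] * y for obj in access_freq}
    return min(scores, key=scores.get)

access_freq = {"MEM": 25, "SSD": 7, "HDD": 2}
cost = {"MEM": 1.0, "SSD": 2.5, "HDD": 4.0}
print(pick_target_release(access_freq, cost, x=0.6, y=0.4))  # "HDD" for these values
```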
According to the training method of the storage release model provided by the embodiment of the invention, on the basis of considering the access frequency of the storage objects in the target release storage object, the cost of the storage objects and the access efficiency of different storage objects are also taken into account, so that both factors in the sample data influence the determination of the final target storage release model.
In an optional embodiment of the present application, as shown in fig. 1, the "performing a loss function calculation based on the storage object predicted to be released and the corresponding storage object target to be released, and updating parameters of the storage release model to determine a target storage release model" in S300 includes:
(1) calculating a loss between the predicted freed memory object and a target freed memory object by an objective loss function and calculating a gradient;
(2) adjusting the first weight and the second weight based on the gradient.
In this embodiment, the target loss function uses cross-entropy. The loss between the output predicted release storage object and the target release storage object is calculated through the cross-entropy loss and a gradient is computed; the first weight and the second weight in the storage release model are then optimized through gradient backpropagation, and the target storage release model is output when the difference between its output and the true labels is within 10%.
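The following PyTorch sketch (the patent does not name a framework, so this is only an assumed illustration, not the full network described above) shows one training step of this kind: two learnable weights score each storage object from its access frequency and cost, cross-entropy loss is computed against the target release object, and the weights are adjusted by gradient descent:

```python
import torch
import torch.nn as nn

freq = torch.tensor([[25.0, 7.0, 2.0]])   # access frequencies of MEM/SSD/HDD (made-up values)
cost = torch.tensor([[1.0, 2.5, 4.0]])    # release costs of MEM/SSD/HDD (made-up values)
target = torch.tensor([2])                # label: index of the target release object (HDD)

w_freq = torch.nn.Parameter(torch.tensor(1.0))   # first weight (access frequency)
w_cost = torch.nn.Parameter(torch.tensor(1.0))   # second weight (cost)
optimizer = torch.optim.SGD([w_freq, w_cost], lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):
    # lower score = more releasable, so negate the score to treat the minimum as the class
    logits = -(freq * w_freq + cost * w_cost)
    loss = loss_fn(logits, target)
    optimizer.zero_grad()
    loss.backward()    # gradient of the cross-entropy loss w.r.t. both weights
    optimizer.step()   # adjust the first and second weights along the gradient

with torch.no_grad():
    pred = (-(freq * w_freq + cost * w_cost)).argmax(dim=1)  # predicted release object index
```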
According to the training method of the storage release model provided by the embodiment of the invention, on the basis of considering the access frequency of the storage objects in the target release storage object, the cost of the storage objects and the access efficiency of different storage objects are also taken into account, and the weights of these factors are adjusted accordingly to determine the target release storage object, which reduces the release cost of the storage objects and improves the operation efficiency of the system.
In an alternative embodiment of the present application, as shown in fig. 1, adjusting the first weight and the second weight based on the gradient includes:
(1) analyzing changes of the first weight and the second weight determined by each training;
(2) and when the change of the first weight and the second weight is within a preset range, generating a first target weight and a second target weight.
According to the training method of the storage release model provided by the embodiment of the invention, trend curves of the first weight and the second weight are plotted. Since the first weight and the second weight are continuously adjusted while multiple groups of sample data and their labels are analyzed, the output predicted release storage object comes closer and closer to the target release storage object; the change of the first weight and the second weight therefore falls within the preset range and gradually stabilizes, and the stabilized first weight and second weight finally yield the first target weight and the second target weight.
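An illustrative convergence check matching this description might look as follows; the tolerance and window size are assumptions rather than values from the patent:

```python
# Stop training once the change in both weights between consecutive iterations
# stays within a preset range over a small window.
def weights_converged(history, tolerance=1e-4, window=5):
    """history: list of (w_freq, w_cost) recorded after each training iteration."""
    if len(history) < window + 1:
        return False
    recent = history[-(window + 1):]
    deltas = [
        max(abs(a[0] - b[0]), abs(a[1] - b[1]))
        for a, b in zip(recent, recent[1:])
    ]
    return max(deltas) < tolerance  # both weights stable -> take them as the target weights

history = [(1.0, 1.0), (0.81, 0.93), (0.80, 0.92), (0.80, 0.92),
           (0.80, 0.92), (0.80, 0.92), (0.80, 0.92)]
print(weights_converged(history, tolerance=0.05))  # True once recent deltas are small
```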
In an embodiment of the present application, as shown in fig. 2, there is further provided a storage release method, including the steps of:
s400, obtaining access frequency and cost of a plurality of storage objects to be released;
s500, inputting the access frequency and the cost of the plurality of storage objects to be released into a target storage and release model, and determining a target storage object in the plurality of storage objects to be released, wherein the target storage and release model is obtained by training according to the training method of the storage and release model;
s600, releasing the target storage object.
The storage release method provided by the embodiment of the invention is applied to the Alluxio system, whose data is usually stored between the computing layer and the storage layer in the form of storage objects. The target storage object is calculated by the target storage release model from the access frequency and the cost of each storage object and is then released, so that the storage objects to be released gain free storage space and the computing layer can access relevant data more quickly and efficiently through Alluxio.
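A hypothetical end-to-end flow for S400-S600, together with the optional storing step described below, could look like the sketch here; all names and the stand-in scorer are assumptions, not Alluxio APIs:

```python
# Score the candidates with the trained model, release the chosen object, then store
# the pending data in the freed space (S400-S600 plus the optional storing step).
def release_and_store(candidates, model, new_data, storage):
    """candidates: dict name -> (access_freq, cost); storage: dict name -> list of blocks."""
    target = model(candidates)        # S500: the model picks the target storage object
    storage[target].clear()           # S600: release the target storage object
    storage[target].append(new_data)  # optional step: store the pending data there
    return target

def scorer(c):
    # stand-in for the trained model: weighted score, lowest wins (weights are assumptions)
    return min(c, key=lambda k: 0.6 * c[k][0] + 0.4 * c[k][1])

storage = {"MEM": ["old-block"], "SSD": ["old-block"], "HDD": ["old-block"]}
candidates = {"MEM": (25, 1.0), "SSD": (7, 2.5), "HDD": (2, 4.0)}
released = release_and_store(candidates, scorer, "new-block", storage)
print(released, storage)  # 'HDD', with the new block stored in the freed object
```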
In an alternative embodiment of the present application, as shown in fig. 2, the method further comprises:
(1) acquiring data to be stored;
(2) and storing the data to be stored into the target storage object.
The storage release method provided by the embodiment of the invention stores the data to be stored into the storage space of the target storage object, so that the computing layer can access the related data more quickly and efficiently through the Alluxio system.
In an embodiment of the present application, as shown in fig. 3, there is further provided a training apparatus of a storage release model, including a first acquisition module 1, a first input module 2 and a target module 3, wherein:
the first acquisition module 1 is used for acquiring multiple groups of sample data and labels thereof, wherein each group of sample data comprises the access frequencies of multiple storage objects and the costs of the storage objects, and the label of each group of sample data is the target release storage object;
the first input module 2 is used for inputting each group of sample data into the storage release model and outputting the predicted release storage object corresponding to each group of sample data;
and the target module 3 is used for performing a loss function calculation based on the predicted release storage object and the corresponding target release storage object, and updating parameters of the storage release model to determine a target storage release model.
The training device for the storage release model provided by the embodiment of the invention is applied to the Alluxio system, whose data is usually stored between the computing layer and the storage layer in the form of storage objects. Based on a target storage release model generated through deep learning from multiple groups of sample data and their labels, the predicted release storage object is calculated from the access frequency and the cost of each storage object, the loss function is computed between the predicted release storage object and the target release storage object, the parameters of the storage release model are updated based on the result, and the target storage release model is finally determined. Because both influencing factors, access frequency and cost, are taken into account, the accuracy of selecting the release object for the sample data is improved and its release cost is minimized. The target storage release model is deployed into the Alluxio system, so that the computing layer can access relevant data more quickly and efficiently through Alluxio.
In an embodiment of the present application, as shown in fig. 4, there is further provided a storage release apparatus, including a second acquisition module 4, a second input module 5 and a release module 6, wherein:
the second acquisition module 4 is used for acquiring the access frequency and the cost of a plurality of storage objects to be released;
the second input module 5 is used for inputting the access frequency and the cost of the plurality of storage objects to be released into a target storage release model and determining a target storage object among the plurality of storage objects to be released, wherein the target storage release model is obtained by training according to the above training method of the storage release model;
and the release module 6 is used for releasing the target storage object.
The storage release device provided by the embodiment of the invention is applied to the Alluxio system, whose data is usually stored between the computing layer and the storage layer in the form of storage objects. The target storage object is calculated by the target storage release model from the access frequency and the cost of each storage object and is then released, so that the released storage object has free storage space and the computing layer can access relevant data more quickly and efficiently through Alluxio.
In an alternative embodiment of the present application, as shown in fig. 3, the apparatus further includes a second obtaining module and a storing module, where:
the second acquisition module is used for acquiring data to be stored;
and the storage module is used for storing the data to be stored into the target storage object.
It should be understood that, although the steps in the flowcharts of fig. 1 and fig. 2 are shown in sequence as indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise herein, the order of execution of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in fig. 1 and fig. 2 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
For specific limitations and beneficial effects of the storage release device, reference may be made to the above description of the storage release method, which is not repeated here. The modules in the storage release device can be implemented wholly or partially by software, hardware or a combination thereof. The modules can be embedded in, or independent of, a processor in the electronic device in hardware form, or stored in a memory of the electronic device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
An embodiment of the present invention further provides an electronic device, which includes the training apparatus of the storage release model shown in fig. 3 and the storage release apparatus shown in fig. 4.
As shown in fig. 5, fig. 5 is a schematic structural diagram of an electronic device according to an alternative embodiment of the present invention. The electronic device may include: at least one processor 71, such as a CPU (Central Processing Unit), at least one communication interface 73, a memory 74 and at least one communication bus 72, where the communication bus 72 is used to enable connection and communication between these components. The communication interface 73 may include a display (Display) and a keyboard (Keyboard), and optionally may also include a standard wired interface and a standard wireless interface. The memory 74 may be a high-speed volatile random access memory (RAM) or a non-volatile memory, such as at least one disk memory; optionally, the memory 74 may also be at least one storage device located remotely from the processor 71. The processor 71 may incorporate the training apparatus of the storage release model shown in fig. 3 and the storage release apparatus shown in fig. 4; an application program is stored in the memory 74, and the processor 71 calls the program code stored in the memory 74 to execute any of the above method steps.
The communication bus 72 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The communication bus 72 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 5, but this is not intended to represent only one bus or type of bus.
The memory 74 may include a volatile memory, such as a random-access memory (RAM); it may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); the memory 74 may also comprise a combination of the above kinds of memory.
The processor 71 may be a Central Processing Unit (CPU), a Network Processor (NP), or a combination of CPU and NP.
The processor 71 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
Optionally, the memory 74 is also used for storing program instructions. Processor 71 may call program instructions to implement a training method of a memory release model as shown in the embodiment of fig. 1 or the memory release method shown in the embodiment of fig. 2 of the present application.
An embodiment of the present invention further provides a non-transitory computer storage medium storing computer-executable instructions, which can execute the training method of the storage release model in any of the above method embodiments or the storage release method shown in the embodiment of fig. 2. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD) or the like; the storage medium may also comprise a combination of the above kinds of memory.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (10)

1. A training method of a storage release model is characterized by comprising the following steps:
acquiring multiple groups of sample data and labels thereof, wherein each group of sample data comprises the access frequencies of multiple storage objects and the costs of the storage objects, and the label of each group of sample data is the target release storage object;
inputting each group of sample data into a storage release model, and outputting the predicted release storage object corresponding to each group of sample data;
and performing a loss function calculation based on the predicted release storage object and the corresponding target release storage object, and updating parameters of the storage release model to determine a target storage release model.
2. The method of claim 1, wherein the storage release model comprises a first input node, a second input node, and an output node, and wherein inputting each set of the sample data into the storage release model and outputting the corresponding predicted release storage object for each set of the sample data comprises:
for each group of the sample data, inputting the access frequency of each storage object into the storage release model through the first input node;
inputting the cost of each storage object into the storage release model through the second input node;
acquiring a first weight corresponding to a first input node and a second weight corresponding to a second input node;
and determining the predicted release storage object output by the output node according to the first weight together with the access frequency and the second weight together with the cost.
3. The method of claim 2, wherein updating parameters of the storage release model to determine a target storage release model based on a loss function calculation of the predicted release storage object and the corresponding target release storage object comprises:
calculating a loss between the predicted release storage object and the target release storage object through a target loss function and computing a gradient;
adjusting the first weight and the second weight based on the gradient.
4. The method of claim 3, wherein adjusting the first weight and the second weight based on the gradient comprises:
analyzing changes of the first weight and the second weight determined by each training;
and when the change of the first weight and the second weight is within a preset range, generating a first target weight and a second target weight.
5. A storage release method, comprising:
acquiring access frequency and cost of a plurality of storage objects to be released;
inputting the access frequency and the cost of a plurality of storage objects to be released into a target storage release model, and determining a target storage object in the plurality of storage objects to be released, wherein the target storage release model is obtained by training according to the training method of the storage release model in any one of claims 1-4;
and releasing the target storage object.
6. The storage release method of claim 5, further comprising:
acquiring data to be stored;
and storing the data to be stored into the target storage object.
7. A training apparatus for a memory release model, comprising:
the first acquisition module is used for acquiring multiple groups of sample data and labels thereof, wherein each group of sample data comprises the access frequencies of multiple storage objects and the costs of the storage objects, and the label of each group of sample data is the target release storage object;
the first input module is used for inputting each group of sample data into the storage release model and outputting the predicted release storage object corresponding to each group of sample data;
and the target module is used for performing a loss function calculation based on the predicted release storage object and the corresponding target release storage object, and updating parameters of the storage release model to determine a target storage release model.
8. A storage release device, comprising:
the second acquisition module is used for acquiring the access frequency and the cost of a plurality of storage objects to be released;
a second input module, configured to input the access frequency and the cost of the multiple storage objects to be released into a target storage release model, and determine a target storage object in the multiple storage objects to be released, where the target storage release model is obtained by training according to the training method of the storage release model according to any one of claims 1 to 4;
and the releasing module is used for releasing the target storage object.
9. An electronic device, comprising a memory and a processor, wherein the memory stores computer instructions, and the processor executes the computer instructions to perform the training method of the storage release model according to any one of claims 1 to 4, or the storage release method according to claim 5 or 6.
10. A computer-readable storage medium storing computer instructions for causing a computer to execute the training method of the storage release model according to any one of claims 1 to 4, or the storage release method according to claim 5 or 6.
CN202111330010.8A 2021-11-10 2021-11-10 Training method of storage release model, storage release method and equipment Pending CN114168318A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111330010.8A CN114168318A (en) 2021-11-10 2021-11-10 Training method of storage release model, storage release method and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111330010.8A CN114168318A (en) 2021-11-10 2021-11-10 Training method of storage release model, storage release method and equipment

Publications (1)

Publication Number Publication Date
CN114168318A true CN114168318A (en) 2022-03-11

Family

ID=80478688

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111330010.8A Pending CN114168318A (en) 2021-11-10 2021-11-10 Training method of storage release model, storage release method and equipment

Country Status (1)

Country Link
CN (1) CN114168318A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117151239A (en) * 2023-03-17 2023-12-01 荣耀终端有限公司 Gradient updating method and related device
CN117193675A (en) * 2023-11-08 2023-12-08 上海飞斯信息科技有限公司 Solid-state storage management system based on distributed computing capacity
CN117193675B (en) * 2023-11-08 2024-02-02 上海飞斯信息科技有限公司 Solid-state storage management system based on distributed computing capacity


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination