CN116522007A - Recommendation system model-oriented data forgetting learning method, device and medium - Google Patents
- Publication number
- CN116522007A (application CN202310814010.8A)
- Authority
- CN
- China
- Prior art keywords
- data
- model
- data set
- recommendation system
- system model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9535—Search customisation based on user profiles and personalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Abstract
The invention discloses a data forgetting learning (machine unlearning) method, device and medium for a recommendation system model. The data forgetting learning method comprises the following steps: first, obtain the calculation function of the recommendation system model for a sample under the training data set, whose output is the prediction for that sample; train the model to optimality on the training data set; then obtain the model after erasing the unavailable data set from the optimal model via an influence-function estimate. The method does not change the training architecture, model architecture or deployment mode of the original model, deletes model parameters that are unimportant for data erasure, and improves both the accuracy and the efficiency of influence-function-based data erasure.
Description
Technical Field
The invention relates to the field of data processing systems and methods, and in particular to a data forgetting learning method, device and medium for a recommendation system model.
Background
Recommendation systems are a key basic tool for coping with the information explosion of the mobile-internet era; they genuinely shape activities such as daily life, entertainment and travel, and are the most common interaction medium between service providers and users. A recommendation system typically infers user interests from the user's historical interactions, which are thereby memorized in the parameters of the deployed model. In some cases, however, part of this historical data must be erased: users may request deletion of their personal history for privacy reasons, and the system itself may need to delete attack-related records, among other cases. For convenience, such to-be-erased information is referred to as unusable data. It should be noted that unusable data must not merely be erased from the database; the essential problem is how to erase it from the parameters of the model.
In current recommendation systems, erasure of unusable data mainly relies on retraining. The first option is full retraining, i.e. training the recommendation system model from scratch without the unusable data. This is usually very time-consuming, and since a recommendation system is a real-time system, it is impractical. The second option is partial retraining: at initial training time the data are divided into independent parts, each part trains a separate sub-model, and when an erasure request for unusable data is received only a small number of sub-models need retraining, which improves retraining efficiency. However, this approach requires that the unusable data affect only a small fraction of the sub-models; since the distribution of unusable data is usually unknown and unrestricted, this assumption limits practical application. Besides retraining, some research records the gradient-update information of the training process and later retroactively cancels the updates attributable to the unusable data; this, however, ignores the interactions between different samples.
In technical fields outside recommendation systems, research efforts have used the influence function to achieve data erasure, but it cannot be applied directly to recommendation systems because it cannot evaluate how erasing unusable data changes the calculation function of the other data; direct application would also incur massive computational overhead. The invention improves the influence function so that it measures the change of the calculation function of the other data caused by erasing the unusable data, and accelerates the computation with a pruning scheme.
Disclosure of Invention
The invention aims to overcome the defects in the prior art by providing a data forgetting learning method, device and medium for a recommendation system model that do not change the training architecture, model architecture or deployment mode of the original model and improve both the accuracy and the efficiency of influence-function-based data erasure.
In order to achieve the aim of the invention, the invention adopts the following technical scheme:
In a first aspect, the present invention provides a data forgetting learning method for a recommendation system model, where the data forgetting learning method includes the following steps:
s1, definitionIs a parameter of->The recommendation system model of the representation is +.>In training data set->The following calculation function, which is output as a prediction for the sample, and is +.>The optimal model ∈>Expressed as->Wherein->For the recommendation system model in the data set +.>The sum of the loss functions below, expressed as +.>,Representing data set +.>One of them, < >>Representing the user->Representing articles->Representing user +.>For articles->Is->Representing a loss function;
s2, in the data setThe unavailable data set is +.>The remaining data set is +.>The model after erasing the data is +.>Expressed as->Estimating ∈according to the influence function>The estimated result is recorded as +.>The resulting erase unavailable data set +.>The latter model is denoted->。
Further, when different data sets $D$ adopt different calculation modes for the input samples, the model calculation function used after erasing the unavailable data set $D_u$ also differs.
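The dataset-dependence of the calculation function can be made concrete with a toy graph-style model in which a user's representation is built from the items that user interacted with, so erasing $D_u$ changes the predictions of some remaining samples. The sketch below (all names illustrative, not from the patent) collects the changed samples $D_c$ and evaluates the quantities $L_1$ and $L_2$ defined in the following steps:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce(p, y, eps=1e-12):
    """Binary cross-entropy loss for a single prediction p and label y."""
    return -(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def build_profiles(dataset, n_users, item_emb):
    """Graph-style part of the calculation function: a user's profile is the
    mean embedding of the items the user interacted with in `dataset`, so the
    prediction for (u, i) depends on the whole data set."""
    _, dim = item_emb.shape
    profiles = np.zeros((n_users, dim))
    counts = np.zeros(n_users)
    for u, i, _ in dataset:
        profiles[u] += item_emb[i]
        counts[u] += 1
    nz = counts > 0
    profiles[nz] /= counts[nz][:, None]
    return profiles

rng = np.random.default_rng(1)
n_users, n_items, dim = 3, 4, 2
item_emb = rng.standard_normal((n_items, dim))

D = [(0, 0, 1), (0, 1, 0), (1, 1, 1), (1, 2, 0), (2, 3, 1)]  # (u, i, y_ui)
D_u = [(0, 1, 0)]                                            # unusable data
D_r = [s for s in D if s not in D_u]                         # remaining data

def f(profiles, u, i):
    return sigmoid(profiles[u] @ item_emb[i])

prof_D = build_profiles(D, n_users, item_emb)
prof_Dr = build_profiles(D_r, n_users, item_emb)

# D_c: remaining samples whose calculation function changed after erasure
D_c = [(u, i, y) for (u, i, y) in D_r
       if not np.isclose(f(prof_D, u, i), f(prof_Dr, u, i))]

# L1: loss change of the affected samples; L2: loss of D_u under data set D
L1 = sum(bce(f(prof_Dr, u, i), y) - bce(f(prof_D, u, i), y) for u, i, y in D_c)
L2 = sum(bce(f(prof_D, u, i), y) for u, i, y in D_u)
print(len(D_c))  # only user 0's profile changed, so exactly one sample
```

For a plain matrix-factorisation model the prediction does not depend on the data set and $D_c$ would be empty; for graph-based recommenders the changed set is generally non-empty, which is the case the patent targets.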
Still further, the specific calculation process of the model calculation function after erasing the unusable data set $D_u$ is as follows:

(1) Let $f_D(u,i;\theta)$ denote the calculation function of the recommendation system model for a user-item pair $(u,i)$ input under the data set $D$. Record all sample points of the remaining data set $D_r$ that satisfy the condition $f_D(u,i;\theta) \neq f_{D_r}(u,i;\theta)$ as the calculation-function-change data set $D_c$. The difference between the loss functions of all sample points in $D_c$, computed by the recommendation system model under the data set $D$ and under the remaining data set $D_r$, is $L_1$, denoted as $L_1(\theta) = \sum_{(u,i)\in D_c}\big[\ell(f_{D_r}(u,i;\theta),\,y_{ui}) - \ell(f_D(u,i;\theta),\,y_{ui})\big]$;

(2) Compute the loss function $L_2$ of the unusable data set $D_u$ under the data set $D$, denoted as $L_2(\theta) = \sum_{(u,i)\in D_u} \ell(f_D(u,i;\theta),\,y_{ui})$;
(3) Based on the obtained $L_1$ and $L_2$, define $\Delta L(\theta) = L_1(\theta) - L_2(\theta)$. Adding $L_1$ and $L_2$ to the training objective with strength $\epsilon$, the perturbed optimal model $\theta_\epsilon$ is expressed as: $\theta_\epsilon = \arg\min_{\theta}\; L(D;\theta) + \epsilon\,\Delta L(\theta)$, wherein $\epsilon$ represents a disturbance term; taking the derivative with respect to $\epsilon$ gives $\left.\frac{d\theta_\epsilon}{d\epsilon}\right|_{\epsilon=0} = -H^{-1}\nabla_\theta \Delta L(\theta^*)$;

(4) Define the influence function of the unusable data set $D_u$ on the recommendation system model as $\mathcal{I}(D_u)$, expressed as: $\mathcal{I}(D_u) = -H^{-1}\nabla_\theta \Delta L(\theta^*)$, wherein $\nabla_\theta$ represents taking the derivative with respect to $\theta$, $H = \nabla^2_\theta L(D;\theta^*)$ denotes the Hessian matrix, and $H^{-1}$ denotes the inverse of the Hessian matrix;

(5) The estimate of $\theta_r^*$ based on the influence function $\mathcal{I}(D_u)$ is then $\hat{\theta}_r = \theta^* + \mathcal{I}(D_u)$; the model after erasing the unavailable data set $D_u$ is: $\hat{\theta}_r = \theta^* - H^{-1}\nabla_\theta \Delta L(\theta^*)$.
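The influence-function estimate of step (5) can be checked on a quadratic problem, where every quantity is available in closed form. The sketch below uses ridge regression as a stand-in for the recommendation model (a fixed linear calculation function, so $D_c = \emptyset$ and $\Delta L = -L_2$) and compares the estimate against exact retraining; it is an assumption-laden toy, not the patented method itself.

```python
import numpy as np

rng = np.random.default_rng(2)

# Ridge regression surrogate: L(D; theta) = 0.5*||X theta - y||^2
# + 0.5*lam*||theta||^2, so optimum, gradient and Hessian are closed-form.
n, d, lam = 50, 3, 1e-2
X = rng.standard_normal((n, d))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(n)

def fit(Xs, ys):
    return np.linalg.solve(Xs.T @ Xs + lam * np.eye(d), Xs.T @ ys)

theta_star = fit(X, y)                   # optimal model theta* on the full D

erase = np.arange(5)                     # indices forming the unusable set D_u
keep = np.setdiff1d(np.arange(n), erase)
theta_retrain = fit(X[keep], y[keep])    # ground truth: retrain on D_r

# Influence estimate: theta_hat = theta* - H^{-1} grad(dL) with dL = -L2,
# since the fixed linear calculation function gives D_c = {} and L1 = 0.
H = X.T @ X + lam * np.eye(d)                          # Hessian of L(D; theta)
grad_dL = -(X[erase].T @ (X[erase] @ theta_star - y[erase]))
theta_hat = theta_star - np.linalg.solve(H, grad_dL)

err_before = np.linalg.norm(theta_star - theta_retrain)
err_after = np.linalg.norm(theta_hat - theta_retrain)
print(err_after < err_before)  # the estimate moves toward the retrained model
```

The only approximation here is using the full-data Hessian $H$ instead of the remaining-data Hessian, which makes the residual error second-order small when $D_u$ is a small fraction of $D$.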
Furthermore, when the data forgetting learning method performs data erasure, acceleration is achieved by a pruning method that deletes model parameters unimportant for data erasure.
Furthermore, in the pruning method, the specific process of deleting the unimportant model parameters is as follows:

(1) Denote the set of all users and items in the data set $D$ as $V$, and for each element $v \in V$ compute the set of elements that interact with it, denoted $N(v)$. Set the maximum number of iterations $K$; for all elements $v \in V$, initialize an importance score $s(v) = 0$ and set the propagation ratio per iteration $\rho$;

(2) Initialize an empty set $V_1$; traverse each user-item pair $(u,i)$ of the unusable data set $D_u$, add $u$ and $i$ to $V_1$, and for each added element $v$ update $s(v) = 1$; after the traversal is completed, let $k = 1$;

(3) If $k < K$, initialize an empty set $V_{k+1}$; traverse each $v \in V_k$, and for any $v' \in N(v)$ update $s(v') \leftarrow s(v') + \rho\, s(v)/|N(v)|$ and add $v'$ to $V_{k+1}$; after the traversal is completed, let $k = k + 1$;

(4) If $k < K$, continue to execute step (3); upon termination, obtain the importance score $s(v)$ of each element, with $\theta_v$ denoting the model parameters corresponding to $v$, and return the parameters of the elements with the highest scores. According to the scores, mark these parameters as $\theta_w$ and the other parameters as $\theta_o$; then $\theta = [\theta_w,\,\theta_o]$, where $\theta_w$ are the model parameters important for erasing data and $\theta_o$ are the model parameters unimportant for erasing data; when estimating $\hat{\theta}_r$, only the variation of $\theta_w$ is considered.
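One plausible reading of the pruning flow above is a breadth-first importance propagation from the unusable data over the user-item interaction graph; the exact rule by which scores are shared with neighbours is this sketch's own assumption (it is not recoverable from the text), as are all names used here.

```python
def propagate_importance(D, D_u, K=2, rho=0.5):
    """Spread importance scores from the unusable data D_u over the user-item
    interaction graph of D for K iterations; parameters of low-scoring nodes
    can then be frozen (pruned from the influence computation)."""
    # neighbour sets N(v): users <-> items that interact in D
    N = {}
    for u, i, _ in D:
        N.setdefault(('u', u), set()).add(('i', i))
        N.setdefault(('i', i), set()).add(('u', u))

    s = {v: 0.0 for v in N}            # importance score s(v) = 0 for all v
    frontier = set()
    for u, i, _ in D_u:                # seed: nodes touched by unusable data
        for v in (('u', u), ('i', i)):
            s[v] = 1.0
            frontier.add(v)

    for _ in range(K):                 # propagate for K iterations
        nxt = set()
        for v in frontier:
            share = rho * s[v] / len(N[v])
            for w in N[v]:
                s[w] += share          # neighbours inherit a discounted score
                nxt.add(w)
        frontier = nxt
    return s

D = [(0, 0, 1), (0, 1, 0), (1, 1, 1), (2, 2, 1)]
D_u = [(0, 1, 0)]
scores = propagate_importance(D, D_u)
# user 2 / item 2 are disconnected from D_u, so their scores stay 0
print(scores[('u', 2)], scores[('i', 2)])  # prints: 0.0 0.0
```

Nodes that the unusable data cannot reach within $K$ hops keep a zero score, so their embedding parameters land in $\theta_o$ and are excluded from the influence computation.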
Further, according to the influence function of the recommendation system model, replace $\theta$ by $\theta_w$ and, in $\Delta L$, fix $\theta_o$ and treat it as a constant; then $\mathcal{I}(D_u)$ is simplified as: $\mathcal{I}_w(D_u) = -H_w^{-1}\nabla_{\theta_w}\Delta L(\theta^*)$, wherein $H_w = \nabla^2_{\theta_w} L(D;\theta^*)$. The model after erasing the unavailable data set $D_u$ is: $\hat{\theta}_r = [\hat{\theta}_w,\,\theta_o^*]$, wherein $\hat{\theta}_w = \theta_w^* - H_w^{-1}\nabla_{\theta_w}\Delta L(\theta^*)$.

Further, if the number of parameters of $\theta_w$ is $n_w$ ($n_w \ll n$, with $n$ the total number of model parameters), the computational complexity of the model update is reduced accordingly, being determined by the number of samples involved in computing $\Delta L$ and by the time complexity in the number of retained model parameters $n_w$.
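Restricting the update to the important parameters $\theta_w$ can be sketched as solving the influence system only on the corresponding coordinate block of the Hessian while freezing $\theta_o$. Again a toy quadratic model stands in for the recommender, and the choice of the block `w` is illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, lam = 300, 50, 0.1
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.05 * rng.standard_normal(n)

H = X.T @ X + lam * np.eye(d)              # full Hessian of L(D; theta)
theta_star = np.linalg.solve(H, X.T @ y)   # optimal model on the full data

erase = np.arange(10)                      # unusable samples D_u
g = -(X[erase].T @ (X[erase] @ theta_star - y[erase]))  # grad of dL at theta*

w = np.arange(5)                           # "important" parameters theta_w
Hw = H[np.ix_(w, w)]                       # n_w x n_w block of the Hessian

delta = np.zeros(d)
delta[w] = -np.linalg.solve(Hw, g[w])      # update restricted to theta_w
theta_hat = theta_star + delta             # theta_o frozen at its old value

# Only n_w coordinates move; the solve costs O(n_w^3) instead of O(d^3)
print(np.count_nonzero(delta) <= len(w))
```

In a real embedding-based recommender, `w` would be the rows of the embedding tables selected by the importance scores, so $n_w$ scales with the number of users and items reachable from $D_u$ rather than with the full model size.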
Further, the loss function $\ell$ is a binary cross-entropy loss function.
In a second aspect, the present invention provides a data recommendation device, comprising a memory storing computer executable instructions and a processor configured to execute the computer executable instructions, the computer executable instructions implementing the data forgetting learning method when executed by the processor.
In a third aspect, the present invention provides a computer readable storage medium having a computer program stored thereon, wherein the computer program when executed by a processor implements the data forgetting learning method.
Compared with the prior art, the invention has the following beneficial effects:
the method improves the accuracy of erasing data based on the influence function, and only considers the loss function when the influence of the traditional influence function calculation data on the modelItem, ignore->And->The prediction function (i.e., the calculation function) that measures other data is the affected part of the removed data, which is widely present in the recommender model, and only taking this affected part into account, complete erasure of the data can be achieved, thereby achieving the same target results as training the recommender model from scratch.
The invention improves the efficiency of erasing data. On one hand, the invention estimates the parameter change $\mathcal{I}(D_u)$ and adds this quantity directly to the original model to obtain the model with the unusable data erased, avoiding retraining and, when the data to be erased are not too numerous, achieving faster erasure. On the other hand, the proposed pruning method accelerates the computation further by restricting the update from $\theta$ to $\theta_w$, deleting model parameters that are unimportant for data erasure.
The method is a post-processing method: it does not change the model architecture or training architecture of the original model, only requires access to the model and its gradients, and can be conveniently grafted directly onto a deployed system model, which is beneficial to its wide application.
Detailed Description
Example 1:
This embodiment discloses a data forgetting learning method oriented to a recommendation system model, comprising the following steps:
S1, let $\theta$ denote the model parameters, and let $f(u,i;\theta)$ be the calculation function of the recommendation system model under the training data set $D$, whose output is the prediction for the sample; the optimal model $\theta^*$ is expressed as $\theta^* = \arg\min_{\theta} L(D;\theta)$, wherein $L(D;\theta)$ is the sum of the loss functions of the recommendation system model over the data set $D$, expressed as $L(D;\theta) = \sum_{(u,i)\in D} \ell(f(u,i;\theta),\,y_{ui})$, $(u,i)$ representing one sample of the data set $D$, $u$ representing a user, $i$ representing an item, $y_{ui}$ representing the label of user $u$ for item $i$, and $\ell$ representing a loss function, the loss function $\ell$ specifically being a binary cross-entropy loss function;

S2, within the data set $D$, the unavailable data set is $D_u$ and the remaining data set is $D_r = D \setminus D_u$; the model after erasing the data is $\theta_r^*$, expressed as $\theta_r^* = \arg\min_{\theta} L(D_r;\theta)$; $\theta_r^*$ is estimated according to the influence function, the estimated result is recorded as $\hat{\theta}_r$, and the resulting model after erasing the unavailable data set $D_u$ is denoted $\hat{\theta}_r$.
When different data sets $D$ adopt different calculation modes for the input samples, the model calculation function after erasing the unavailable data set $D_u$ also differs. The specific calculation process of the model calculation function after erasing the unusable data set $D_u$ is as follows:

(1) Let $f_D(u,i;\theta)$ denote the calculation function of the recommendation system model for a user-item pair $(u,i)$ input under the data set $D$. Record all sample points of the remaining data set $D_r$ that satisfy the condition $f_D(u,i;\theta) \neq f_{D_r}(u,i;\theta)$ as the calculation-function-change data set $D_c$. The difference between the loss functions of all sample points in $D_c$, computed by the recommendation system model under the data set $D$ and under the remaining data set $D_r$, is $L_1$, denoted as $L_1(\theta) = \sum_{(u,i)\in D_c}\big[\ell(f_{D_r}(u,i;\theta),\,y_{ui}) - \ell(f_D(u,i;\theta),\,y_{ui})\big]$;

(2) Compute the loss function $L_2$ of the unusable data set $D_u$ under the data set $D$, denoted as $L_2(\theta) = \sum_{(u,i)\in D_u} \ell(f_D(u,i;\theta),\,y_{ui})$;

(3) Based on the obtained $L_1$ and $L_2$, define $\Delta L(\theta) = L_1(\theta) - L_2(\theta)$. Adding $L_1$ and $L_2$ to the training objective with strength $\epsilon$, the perturbed optimal model $\theta_\epsilon$ is expressed as: $\theta_\epsilon = \arg\min_{\theta}\; L(D;\theta) + \epsilon\,\Delta L(\theta)$, wherein $\epsilon$ represents a disturbance term; taking the derivative with respect to $\epsilon$ gives $\left.\frac{d\theta_\epsilon}{d\epsilon}\right|_{\epsilon=0} = -H^{-1}\nabla_\theta \Delta L(\theta^*)$;

(4) Define the influence function of the unusable data set $D_u$ on the recommendation system model as $\mathcal{I}(D_u)$, expressed as: $\mathcal{I}(D_u) = -H^{-1}\nabla_\theta \Delta L(\theta^*)$, wherein $\nabla_\theta$ represents taking the derivative with respect to $\theta$, $H = \nabla^2_\theta L(D;\theta^*)$ denotes the Hessian matrix, and $H^{-1}$ denotes the inverse of the Hessian matrix;

(5) The estimate of $\theta_r^*$ based on the influence function $\mathcal{I}(D_u)$ is then $\hat{\theta}_r = \theta^* + \mathcal{I}(D_u)$; the model after erasing the unavailable data set $D_u$ is: $\hat{\theta}_r = \theta^* - H^{-1}\nabla_\theta \Delta L(\theta^*)$.
In this embodiment, when the data forgetting learning method performs data erasure, acceleration is achieved by a pruning method that deletes model parameters unimportant for data erasure. In the pruning method, the specific flow of deleting the unimportant model parameters is as follows:

(1) Denote the set of all users and items in the data set $D$ as $V$, and for each element $v \in V$ compute the set of elements that interact with it, denoted $N(v)$. Set the maximum number of iterations $K$; for all elements $v \in V$, initialize an importance score $s(v) = 0$ and set the propagation ratio per iteration $\rho$;

(2) Initialize an empty set $V_1$; traverse each user-item pair $(u,i)$ of the unusable data set $D_u$, add $u$ and $i$ to $V_1$, and for each added element $v$ update $s(v) = 1$; after the traversal is completed, let $k = 1$;

(3) If $k < K$, initialize an empty set $V_{k+1}$; traverse each $v \in V_k$, and for any $v' \in N(v)$ update $s(v') \leftarrow s(v') + \rho\, s(v)/|N(v)|$ and add $v'$ to $V_{k+1}$; after the traversal is completed, let $k = k + 1$;

(4) If $k < K$, continue to execute step (3); upon termination, obtain the importance score $s(v)$ of each element, with $\theta_v$ denoting the model parameters corresponding to $v$, and return the parameters of the elements with the highest scores. According to the scores, mark these parameters as $\theta_w$ and the other parameters as $\theta_o$; then $\theta = [\theta_w,\,\theta_o]$, where $\theta_w$ are the model parameters important for erasing data and $\theta_o$ are the model parameters unimportant for erasing data; when estimating $\hat{\theta}_r$, only the variation of $\theta_w$ is considered.

According to the influence function of the recommendation system model, replace $\theta$ by $\theta_w$ and, in $\Delta L$, fix $\theta_o$ and treat it as a constant; then $\mathcal{I}(D_u)$ is simplified as: $\mathcal{I}_w(D_u) = -H_w^{-1}\nabla_{\theta_w}\Delta L(\theta^*)$, wherein $H_w = \nabla^2_{\theta_w} L(D;\theta^*)$. The model after erasing the unavailable data set $D_u$ is: $\hat{\theta}_r = [\hat{\theta}_w,\,\theta_o^*]$, wherein $\hat{\theta}_w = \theta_w^* - H_w^{-1}\nabla_{\theta_w}\Delta L(\theta^*)$. If the number of parameters of $\theta_w$ is $n_w$ ($n_w \ll n$, with $n$ the total number of model parameters), the computational complexity of the model update is reduced accordingly, being determined by the number of samples involved in computing $\Delta L$ and by the time complexity in the number of retained parameters $n_w$.
To verify the effectiveness of the learning method disclosed in Example 1 when applied to a recommendation model system, experimental verification was performed using a real data set. The data set is the commonly used public data set Amazon; its division and setting follow the common practice of data-erasure research on recommendation models, erasing attack data from the data so as to achieve better model performance. Experiments were performed on two commonly used recommendation models, MF and LGCN, against different data erasure methods such as Retrain, RecEraser and SISA, where Retrain is full training from scratch and yields the reference model performance that erasing the data should achieve. The experimental results are shown in Table 1:
table 1 data erasure effects with different erasure methods
From the data erasing effects recorded in Table 1, it can be found that, compared with the other data erasure methods, the learning method disclosed in this embodiment is consistent with the reference result of Retrain and achieves better results than before erasing the data; this demonstrates that the learning method of this embodiment achieves accurate data erasure.
In addition, efficiency experiments were performed on the data set, testing the time efficiency of the learning method disclosed in this embodiment and of the Retrain method when erasing different proportions of data; the experimental results are shown in Table 2:
TABLE 2 time Performance comparison results
From the time performance comparison recorded in Table 2, it can be found that the method disclosed in this embodiment achieves data erasure faster when deleting different proportions of data. Combining the results of Table 1 and Table 2 verifies that the method of Example 1 achieves data erasure quickly and accurately.
The invention is applicable to any data-driven, differentiable recommendation system, responding to data-erasure requests motivated by user privacy protection, platform security or other concerns, and supporting a friendly, legal and compliant recommendation-system ecology. At the implementation level, the invention can be integrated into a deployed recommendation system as software, or installed on a website online to directly provide data-erasure request interfaces to different users.
Example 2:
This embodiment discloses a data recommendation device comprising a memory and a processor, wherein the memory stores computer-executable instructions and the processor is configured to execute them; the instructions, when executed by the processor, implement the data forgetting learning method disclosed in Example 1.
Example 3:
This embodiment discloses a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, it implements the data forgetting learning method disclosed in Example 1.
Claims (10)
1. A data forgetting learning method facing a recommendation system model is characterized by comprising the following steps:
S1, let $\theta$ denote the model parameters, and let $f(u,i;\theta)$ be the calculation function of the recommendation system model under the training data set $D$, whose output is the prediction for the sample; the optimal model $\theta^*$ is expressed as $\theta^* = \arg\min_{\theta} L(D;\theta)$, wherein $L(D;\theta)$ is the sum of the loss functions of the recommendation system model over the data set $D$, expressed as $L(D;\theta) = \sum_{(u,i)\in D} \ell(f(u,i;\theta),\,y_{ui})$, $(u,i)$ representing one sample of the data set $D$, $u$ representing a user, $i$ representing an item, $y_{ui}$ representing the label of user $u$ for item $i$, and $\ell$ representing a loss function;

S2, within the data set $D$, the unavailable data set is $D_u$ and the remaining data set is $D_r = D \setminus D_u$; the model after erasing the data is $\theta_r^*$, expressed as $\theta_r^* = \arg\min_{\theta} L(D_r;\theta)$; $\theta_r^*$ is estimated according to the influence function, the estimated result is recorded as $\hat{\theta}_r$, and the resulting model after erasing the unavailable data set $D_u$ is denoted $\hat{\theta}_r$.
2. The recommendation system model-oriented data forgetting learning method of claim 1, wherein when the recommendation system model uses different calculation modes for the input samples under different data sets $D$, the model calculation function used after erasing the unavailable data set $D_u$ also differs.
3. The recommendation system model-oriented data forgetting learning method of claim 2, wherein the specific calculation process of the model calculation function after erasing the unusable data set $D_u$ is as follows:

(1) Let $f_D(u,i;\theta)$ denote the calculation function of the recommendation system model for a user-item pair $(u,i)$ input under the data set $D$. Record all sample points of the remaining data set $D_r$ that satisfy the condition $f_D(u,i;\theta) \neq f_{D_r}(u,i;\theta)$ as the calculation-function-change data set $D_c$. The difference between the loss functions of all sample points in $D_c$, computed by the recommendation system model under the data set $D$ and under the remaining data set $D_r$, is $L_1$, denoted as $L_1(\theta) = \sum_{(u,i)\in D_c}\big[\ell(f_{D_r}(u,i;\theta),\,y_{ui}) - \ell(f_D(u,i;\theta),\,y_{ui})\big]$;

(2) Compute the loss function $L_2$ of the unusable data set $D_u$ under the data set $D$, denoted as $L_2(\theta) = \sum_{(u,i)\in D_u} \ell(f_D(u,i;\theta),\,y_{ui})$;

(3) Based on the obtained $L_1$ and $L_2$, define $\Delta L(\theta) = L_1(\theta) - L_2(\theta)$. Adding $L_1$ and $L_2$ to the training objective with strength $\epsilon$, the perturbed optimal model $\theta_\epsilon$ is expressed as: $\theta_\epsilon = \arg\min_{\theta}\; L(D;\theta) + \epsilon\,\Delta L(\theta)$, wherein $\epsilon$ represents a disturbance term; taking the derivative with respect to $\epsilon$ gives $\left.\frac{d\theta_\epsilon}{d\epsilon}\right|_{\epsilon=0} = -H^{-1}\nabla_\theta \Delta L(\theta^*)$;

(4) Define the influence function of the unusable data set $D_u$ on the recommendation system model as $\mathcal{I}(D_u)$, expressed as: $\mathcal{I}(D_u) = -H^{-1}\nabla_\theta \Delta L(\theta^*)$, wherein $\nabla_\theta$ represents taking the derivative with respect to $\theta$, $H = \nabla^2_\theta L(D;\theta^*)$ denotes the Hessian matrix, and $H^{-1}$ denotes the inverse of the Hessian matrix;

(5) The estimate of $\theta_r^*$ based on the influence function $\mathcal{I}(D_u)$ is then $\hat{\theta}_r = \theta^* + \mathcal{I}(D_u)$; the model after erasing the unavailable data set $D_u$ is: $\hat{\theta}_r = \theta^* - H^{-1}\nabla_\theta \Delta L(\theta^*)$.
4. The recommendation system model-oriented data forgetting learning method according to claim 3, wherein when data erasure is performed by the data forgetting learning method, acceleration is achieved by a pruning method that deletes model parameters unimportant for data erasure.
5. The recommendation system model-oriented data forgetting learning method according to claim 4, wherein the specific flow of deleting unimportant model parameters in the pruning method is as follows:

(1) Denote the set of all users and items in the data set $D$ as $V$, and for each element $v \in V$ compute the set of elements that interact with it, denoted $N(v)$. Set the maximum number of iterations $K$; for all elements $v \in V$, initialize an importance score $s(v) = 0$ and set the propagation ratio per iteration $\rho$;

(2) Initialize an empty set $V_1$; traverse each user-item pair $(u,i)$ of the unusable data set $D_u$, add $u$ and $i$ to $V_1$, and for each added element $v$ update $s(v) = 1$; after the traversal is completed, let $k = 1$;

(3) If $k < K$, initialize an empty set $V_{k+1}$; traverse each $v \in V_k$, and for any $v' \in N(v)$ update $s(v') \leftarrow s(v') + \rho\, s(v)/|N(v)|$ and add $v'$ to $V_{k+1}$; after the traversal is completed, let $k = k + 1$;

(4) If $k < K$, continue to execute step (3); upon termination, obtain the importance score $s(v)$ of each element, with $\theta_v$ denoting the model parameters corresponding to $v$, and return the parameters of the elements with the highest scores. According to the scores, mark these parameters as $\theta_w$ and the other parameters as $\theta_o$; then $\theta = [\theta_w,\,\theta_o]$, where $\theta_w$ are the model parameters important for erasing data and $\theta_o$ are the model parameters unimportant for erasing data; when estimating $\hat{\theta}_r$, only the variation of $\theta_w$ is considered.
6. The recommendation system model-oriented data forgetting learning method of claim 5, wherein, according to the influence function of the recommendation system model, $\theta$ is replaced by $\theta_w$ and, in $\Delta L$, $\theta_o$ is fixed and treated as a constant; then $\mathcal{I}(D_u)$ is simplified as: $\mathcal{I}_w(D_u) = -H_w^{-1}\nabla_{\theta_w}\Delta L(\theta^*)$, wherein $H_w = \nabla^2_{\theta_w} L(D;\theta^*)$; the model after erasing the unavailable data set $D_u$ is: $\hat{\theta}_r = [\hat{\theta}_w,\,\theta_o^*]$, wherein $\hat{\theta}_w = \theta_w^* - H_w^{-1}\nabla_{\theta_w}\Delta L(\theta^*)$.
7. The recommendation system model-oriented data forgetting learning method of claim 6, wherein if the number of parameters of $\theta_w$ is $n_w$ ($n_w \ll n$, with $n$ the total number of model parameters), the computational complexity of the model update is reduced accordingly, being determined by the number of samples involved in computing $\Delta L$ and by the time complexity in the number of retained model parameters $n_w$.
8. The recommendation system model-oriented data forgetting learning method of claim 1, wherein the loss function $\ell$ is a binary cross-entropy loss function.
9. A data recommendation device comprising a memory storing computer executable instructions and a processor configured to execute the computer executable instructions, wherein the computer executable instructions when executed by the processor implement the data forgetting learning method of any of claims 1 to 8.
10. A computer readable storage medium having a computer program stored thereon, characterized in that the computer program when executed by a processor implements the data forgetting learning method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310814010.8A CN116522007B (en) | 2023-07-05 | 2023-07-05 | Recommendation system model-oriented data forgetting learning method, device and medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310814010.8A CN116522007B (en) | 2023-07-05 | 2023-07-05 | Recommendation system model-oriented data forgetting learning method, device and medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116522007A true CN116522007A (en) | 2023-08-01 |
CN116522007B CN116522007B (en) | 2023-10-20 |
Family
ID=87408630
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310814010.8A Active CN116522007B (en) | 2023-07-05 | 2023-07-05 | Recommendation system model-oriented data forgetting learning method, device and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116522007B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117390685A (en) * | 2023-12-07 | 2024-01-12 | 湖北省楚天云有限公司 | Pedestrian re-identification data privacy protection method and system based on forgetting learning |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110491146A (en) * | 2019-08-21 | 2019-11-22 | 浙江工业大学 | A kind of traffic signal control scheme real-time recommendation method based on deep learning |
CN111199242A (en) * | 2019-12-18 | 2020-05-26 | 浙江工业大学 | Image increment learning method based on dynamic correction vector |
CN111932512A (en) * | 2020-08-06 | 2020-11-13 | 吉林大学 | Intracranial hemorrhage detection algorithm applied to CT image based on CNN and NLSTM neural network |
CN113590958A (en) * | 2021-08-02 | 2021-11-02 | 中国科学院深圳先进技术研究院 | Continuous learning method of sequence recommendation model based on sample playback |
CN114611631A (en) * | 2022-04-14 | 2022-06-10 | 广州大学 | Method, system, device and medium for fast training a model from a partial training set |
CN114692894A (en) * | 2022-04-02 | 2022-07-01 | 南京大学 | Implementation method of machine learning model supporting dynamic addition and deletion of user data |
CN114863243A (en) * | 2022-04-28 | 2022-08-05 | 国家电网有限公司大数据中心 | Data forgetting method, device, equipment and storage medium of model |
EP4083838A1 (en) * | 2021-04-30 | 2022-11-02 | Hochschule Karlsruhe | Method and system to collaboratively train data analytics model parameters |
CN115329864A (en) * | 2022-08-11 | 2022-11-11 | 北京有竹居网络技术有限公司 | Method and device for training recommendation model and electronic equipment |
US20230093019A1 (en) * | 2021-09-21 | 2023-03-23 | Sap Se | Personalized Evolving Search Assistance |
US20230123322A1 (en) * | 2021-04-16 | 2023-04-20 | Strong Force Vcn Portfolio 2019, Llc | Predictive Model Data Stream Prioritization |
CN116226654A (en) * | 2022-09-09 | 2023-06-06 | 西安电子科技大学 | Machine learning data forgetting method based on mask gradient |
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110491146A (en) * | 2019-08-21 | 2019-11-22 | 浙江工业大学 | A kind of traffic signal control scheme real-time recommendation method based on deep learning |
CN111199242A (en) * | 2019-12-18 | 2020-05-26 | 浙江工业大学 | Image increment learning method based on dynamic correction vector |
CN111932512A (en) * | 2020-08-06 | 2020-11-13 | 吉林大学 | Intracranial hemorrhage detection algorithm for CT images based on CNN and NLSTM neural networks |
US20230123322A1 (en) * | 2021-04-16 | 2023-04-20 | Strong Force Vcn Portfolio 2019, Llc | Predictive Model Data Stream Prioritization |
EP4083838A1 (en) * | 2021-04-30 | 2022-11-02 | Hochschule Karlsruhe | Method and system to collaboratively train data analytics model parameters |
CN113590958A (en) * | 2021-08-02 | 2021-11-02 | 中国科学院深圳先进技术研究院 | Continuous learning method of sequence recommendation model based on sample playback |
US20230093019A1 (en) * | 2021-09-21 | 2023-03-23 | Sap Se | Personalized Evolving Search Assistance |
CN114692894A (en) * | 2022-04-02 | 2022-07-01 | 南京大学 | Implementation method of machine learning model supporting dynamic addition and deletion of user data |
CN114611631A (en) * | 2022-04-14 | 2022-06-10 | 广州大学 | Method, system, device and medium for fast training of a model from a partial training set |
CN114863243A (en) * | 2022-04-28 | 2022-08-05 | 国家电网有限公司大数据中心 | Data forgetting method, device, equipment and storage medium of model |
CN115329864A (en) * | 2022-08-11 | 2022-11-11 | 北京有竹居网络技术有限公司 | Method and device for training recommendation model and electronic equipment |
CN116226654A (en) * | 2022-09-09 | 2023-06-06 | 西安电子科技大学 | Machine learning data forgetting method based on mask gradient |
Non-Patent Citations (1)
Title |
---|
Liang Kun et al.: "A Survey of Research Progress on Deep-Learning-Driven Knowledge Tracing", 《计算机工程与应用》 (Computer Engineering and Applications), pages 41 - 58 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117390685A (en) * | 2023-12-07 | 2024-01-12 | 湖北省楚天云有限公司 | Pedestrian re-identification data privacy protection method and system based on forgetting learning |
CN117390685B (en) * | 2023-12-07 | 2024-04-05 | 湖北省楚天云有限公司 | Pedestrian re-identification data privacy protection method and system based on forgetting learning |
Also Published As
Publication number | Publication date |
---|---|
CN116522007B (en) | 2023-10-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10846643B2 (en) | Method and system for predicting task completion of a time period based on task completion rates and data trend of prior time periods in view of attributes of tasks using machine learning models | |
CN109983483B (en) | Computer-implemented method and computing device for managing machine learning models | |
TWI784941B (en) | A multi-sampling model training method and device | |
US11443015B2 (en) | Generating prediction models in accordance with any specific data sets | |
JP2022548654A (en) | Computer-based system, computer component and computer object configured to implement dynamic outlier bias reduction in machine learning models | |
US11403643B2 (en) | Utilizing a time-dependent graph convolutional neural network for fraudulent transaction identification | |
CN116522007B (en) | Recommendation system model-oriented data forgetting learning method, device and medium | |
WO2018072580A1 (en) | Method and apparatus for detecting illegal transactions | |
CN106471525A (en) | Augmenting neural networks to generate additional outputs | |
WO2020047861A1 (en) | Method and device for generating ranking model | |
US11023819B2 (en) | Machine-learning models applied to interaction data for facilitating experience-based modifications to interface elements in online environments | |
CN111932367A (en) | Pre-credit evaluation method and device | |
CN110555148B (en) | User behavior evaluation method, computing device and storage medium | |
US20190220924A1 (en) | Method and device for determining key variable in model | |
Zanette et al. | Problem dependent reinforcement learning bounds which can identify bandit structure in mdps | |
CN109214647B (en) | Method for analyzing spillover effects among online access channels based on network access log data | |
CN106204597A (en) | VS segmentation method based on self-paced weakly supervised learning | |
Sugiyama et al. | More powerful and general selective inference for stepwise feature selection using homotopy method | |
CN117349899B (en) | Sensitive data processing method, system and storage medium based on forgetting model | |
WO2019047673A1 (en) | Method and device for evaluating longitudinal control model of end-to-end automatic driving system | |
US11055488B2 (en) | Cognitive initialization of large-scale advection-diffusion models | |
WO2021068249A1 (en) | Method and apparatus for hardware simulation and emulation during running, and device and storage medium | |
Zhang et al. | Intrinsic Performance Influence-based Participant Contribution Estimation for Horizontal Federated Learning | |
CN111737491B (en) | Control method, device, storage medium and equipment for interaction process | |
US20170177767A1 (en) | Configuration of large scale advection diffusion models with predetermined rules |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||