CN112734086A - Method and device for updating neural network prediction model - Google Patents

Method and device for updating neural network prediction model

Info

Publication number
CN112734086A
Authority
CN
China
Prior art keywords
importance
feature
prediction model
source
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011557492.6A
Other languages
Chinese (zh)
Inventor
陈杰
陈高均
潘昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beike Technology Co Ltd
Original Assignee
Beike Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beike Technology Co Ltd filed Critical Beike Technology Co Ltd
Priority to CN202011557492.6A
Publication of CN112734086A
Legal status: Pending

Classifications

    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06F18/22 Pattern recognition; analysing; matching criteria, e.g. proximity measures
    • G06F18/23 Pattern recognition; analysing; clustering techniques
    • G06N3/04 Neural networks; architecture, e.g. interconnection topology
    • G06N3/08 Neural networks; learning methods
    • G06Q50/16 ICT specially adapted for specific business sectors; services; real estate


Abstract

The invention provides a method and a device for updating a neural network prediction model. The method comprises the following steps: obtaining a current training sample; training a preset first prediction model to obtain a second prediction model; obtaining, through a model interpretation tool, a first feature importance set consisting of the importance of various features of a first object predicted by the first prediction model, and a second feature importance set consisting of the importance of various features of a second object predicted by the second prediction model; calculating the similarity between the first feature importance set and the second feature importance set; and comparing the similarity with a preset threshold: if the similarity is greater than or equal to the preset threshold, the first prediction model is updated to the second prediction model; if the similarity is smaller than the preset threshold, the first prediction model is not updated. The method and the device provide timely and reliable updates for a neural network prediction model while remaining interpretable.

Description

Method and device for updating neural network prediction model
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method and an apparatus for updating a neural network prediction model.
Background
Currently, neural network models are very powerful: CNNs commonly used in the image field and RNNs used for natural language processing can efficiently solve practical problems such as face recognition and text translation. Likewise, in typical structured-data scenarios, a complex neural network can fully exploit static data and time-series data to make accurate predictions. For example, in AI house-selection technology, a neural network can compute accurate scores for nearly 3 million house sources across 100 cities nationwide by thoroughly mining house source basic data, user behavior data, broker operation data and the like, providing brokers with the probability that a house source will be sold within the next 14 days and improving the turnover rate of listed house sources.
However, as more and more neural network models are deployed online, a model usually needs to be updated continually after going live, in order to adapt to various changing factors and to ensure that it keeps giving correct results. Yet in the field of machine learning, methods for updating models are very limited.
At present, a neural network model is usually updated based on a loss function, that is, by monitoring changes in a mathematical quantity that reflects the difference between predicted values and actual values. The most common loss function in regression tasks is the mean squared error; in classification tasks it is the cross-entropy loss. However, since the variation of the loss function over a given time range is often smaller than one part in a thousand, updating the model only according to the loss function lacks both timeliness and interpretability.
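The two loss functions named above can be written out directly. The following minimal pure-Python sketch is an illustration only, not part of the patented method:

```python
import math

def mse_loss(y_true, y_pred):
    """Mean squared error: average of squared differences (regression tasks)."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def cross_entropy_loss(y_true, y_prob, eps=1e-12):
    """Binary cross-entropy: -mean(y*log(p) + (1-y)*log(1-p)) (classification tasks)."""
    total = 0.0
    for t, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += t * math.log(p) + (1 - t) * math.log(1 - p)
    return -total / len(y_true)
```

Monitoring only these scalar values, as the passage notes, says nothing about *which* features drove a change in model behavior.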
Therefore, there is a need for an interpretability-based model updating method to ensure the reliability of a complex neural network model.
It is to be noted that the information disclosed in the background section above is only for enhancement of understanding of the background of the invention and therefore it may contain information that does not constitute prior art that is already known to a person skilled in the art.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a method for updating a neural network prediction model that provides timely and reliable updates for the model while remaining interpretable, thereby overcoming the problems in the prior art.
The invention provides a method for updating a neural network prediction model, which comprises the following steps: obtaining a current training sample; training a preset first prediction model based on the current training sample to obtain a second prediction model; obtaining, through a model interpretation tool, a first feature importance set consisting of the importance of various features of a first object predicted by the first prediction model and a second feature importance set consisting of the importance of various features of a second object predicted by the second prediction model; calculating the similarity between the first feature importance set and the second feature importance set; and comparing the similarity with a preset threshold: if the similarity is greater than or equal to the preset threshold, updating the first prediction model to the second prediction model; and if the similarity is smaller than the preset threshold, not updating the first prediction model.
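The decision logic of the steps above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; the function and parameter names are hypothetical:

```python
def update_model_if_stable(first_model, second_model, explain, similarity, threshold):
    """Sketch of the core update decision: interpret both models, compare the
    resulting feature-importance sets, and keep or replace the deployed model.

    explain(model)   -> feature-importance set (e.g. a top-m feature set)
    similarity(a, b) -> similarity score, higher means more alike
    Returns the model that should be deployed going forward.
    """
    first_set = explain(first_model)        # importance set of deployed model
    second_set = explain(second_model)      # importance set of retrained model
    sim = similarity(first_set, second_set)
    if sim >= threshold:                    # retrained model behaves consistently
        return second_model                 # accept the update
    return first_model                      # keep the deployed model
```

A retrained model whose important features shift drastically is rejected, which is what gives the update procedure its interpretability.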
According to an embodiment of the present invention, obtaining, by a model interpretation tool, a first feature importance set including importance of various features of a first object predicted by using the first prediction model and a second feature importance set including importance of various features of a second object predicted by using the second prediction model includes: obtaining, by a model interpretation tool, an importance value of each feature of each sub-object included in each of the first object and the second object; clustering the importance value of each feature of each sub-object contained in the first object and the second object respectively to obtain the importance of each feature of the first object and the second object respectively; and respectively sorting the importance of each feature of the first object and the second object to obtain a first importance sorting set corresponding to the first object and a second importance sorting set corresponding to the second object.
According to an embodiment of the present invention, calculating the similarity between the first feature importance set and the second feature importance set includes: calculating a similarity between the first importance ranking set and the second importance ranking set.
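The embodiment does not fix a particular similarity metric for the two importance ranking sets. One simple candidate, shown here purely as an assumption, is the Jaccard overlap of the two top-m feature sets:

```python
def ranking_set_similarity(first_set, second_set):
    """Jaccard overlap between two importance ranking sets: the fraction of
    features shared by both top-m lists. 1.0 means identical feature sets,
    0.0 means no overlap. (The patent leaves the exact metric open; this is
    one plausible choice, not the claimed one.)"""
    a, b = set(first_set), set(second_set)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)
```

Rank-aware alternatives (e.g. Kendall's tau over the two orderings) would also fit the claim language.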
According to an embodiment of the present invention, obtaining, by a model interpretation tool, an importance value of each feature of each sub-object included in each of the first object and the second object includes: inputting the feature set of each sub-object included in the first object into the first prediction model to obtain a first prediction value, and obtaining an importance value of each feature of each sub-object included in the first object through a model interpretation tool based on the first prediction value and the feature set of each sub-object included in the first object; and inputting the feature set of each sub-object included in the second object into the second prediction model to obtain a second prediction value, and obtaining an importance value of each feature of each sub-object included in the second object through a model interpretation tool based on the second prediction value and the feature set of each sub-object included in the second object.
According to an embodiment of the present invention, clustering the importance values of each feature of each sub-object included in each of the first object and the second object respectively, to obtain the importance of each feature of each of the first object and the second object, includes: summing the absolute values of the importance values of features of the same kind across the sub-objects included in the first object, to obtain the importance of each feature of the first object; and summing the absolute values of the importance values of features of the same kind across the sub-objects included in the second object, to obtain the importance of each feature of the second object.
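The absolute-sum aggregation described in this embodiment can be sketched directly (an illustrative sketch; the data layout, a list of per-sub-object feature dicts, is an assumption):

```python
def aggregate_importance(per_sample_values):
    """Sum of absolute per-sample importance values per feature, as the
    embodiment describes. per_sample_values is a list of dicts, one per
    sub-object (e.g. one house source), mapping feature name -> signed
    importance value. Returns feature name -> aggregated importance."""
    totals = {}
    for sample in per_sample_values:
        for feature, value in sample.items():
            totals[feature] = totals.get(feature, 0.0) + abs(value)
    return totals
```

Taking absolute values first matters: a feature that pushes some predictions up and others down would otherwise cancel out and look unimportant.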
According to an embodiment of the present invention, ranking the importance of each feature of the first object and of the second object respectively, to obtain a first importance ranking set corresponding to the first object and a second importance ranking set corresponding to the second object, includes: taking the set of the top m features of the first object by importance as the first importance ranking set; and taking the set of the top m features of the second object by importance as the second importance ranking set.
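Selecting the top-m features by aggregated importance is a one-liner; this sketch (illustrative only) returns them in descending order of importance:

```python
def top_m_features(importance, m):
    """Return the m features with the largest aggregated importance,
    ordered from most to least important (the 'importance ranking set').
    importance: dict mapping feature name -> aggregated importance value."""
    ranked = sorted(importance, key=importance.get, reverse=True)
    return ranked[:m]
```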
According to an embodiment of the present invention, the model interpretation tool is SHAP (SHapley Additive exPlanations), which obtains an importance value of each feature of each sub-object included in each of the first object and the second object by calculating the SHAP value of each feature of each sub-object included in each of the first object and the second object, so as to obtain a first feature importance set composed of the importance of various features of the first object and a second feature importance set composed of the importance of various features of the second object.
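In practice such an embodiment would call the `shap` package's explainers on the trained network. As a self-contained illustration of what a SHAP value is, the closed form for a linear model under the feature-independence assumption is phi_i = w_i * (x_i - E[x_i]); this hand-rolled sketch is not the claimed neural-network workflow:

```python
def linear_shap_values(weights, x, background_mean):
    """Exact SHAP values for a linear model f(x) = b + sum(w_i * x_i),
    assuming independent features: phi_i = w_i * (x_i - E[x_i]).
    The phi_i sum to f(x) - E[f(x)], SHAP's additivity property."""
    return [w * (xi - mu) for w, xi, mu in zip(weights, x, background_mean)]
```

Each phi_i is a signed, per-sample attribution, which is exactly the quantity the previous embodiments aggregate by summed absolute value.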
According to an embodiment of the present invention, the neural network prediction model is a house source prediction model, the current training sample includes a feature set and a label value of the current house source sample, and the method further includes: training a preset first house source prediction model based on the current training sample to obtain a second house source prediction model; and obtaining, through a model interpretation tool, a first feature importance set consisting of the importance of various features of a first target house source predicted by the first house source prediction model and a second feature importance set consisting of the importance of various features of a second target house source predicted by the second house source prediction model.
According to an embodiment of the present invention, obtaining, through the model interpretation tool, a first feature importance set composed of the importance of various features of the first target house source predicted by the first house source prediction model and a second feature importance set composed of the importance of various features of the second target house source predicted by the second house source prediction model includes: obtaining, through a model interpretation tool, an importance value of each feature of each house source contained in the first target house source and the second target house source; clustering the importance values of each feature of each house source contained in the first target house source and the second target house source respectively, to obtain the importance of each feature of the first target house source and of the second target house source; and ranking the importance of each feature of the first target house source and of the second target house source respectively, to obtain a first importance ranking set corresponding to the first target house source and a second importance ranking set corresponding to the second target house source.
According to an embodiment of the invention, the house source prediction model is a house source transaction probability prediction model created by learning the feature sets and transaction probabilities of a plurality of house source samples, wherein the feature set comprises one or more of a house source basic attribute feature, a market related feature, a broker operation feature, an owner willingness feature, and a customer source heat feature.
According to an embodiment of the invention, the house source basic attribute features comprise one or more of floor, house type, area, orientation and geographic position; the market related features comprise one or more of house source price, community average house source transaction price and transaction price of house sources of the same house type; the broker operation features comprise one or more of number of showings, number of interviews, number of stores with showings, number of brokers with showings, and number of listing days; the owner willingness features comprise one or more of owner login frequency and/or times and number of price adjustments; the customer source heat features comprise one or more of house source attention and online visit volume.
According to an embodiment of the present invention, the preset first house source prediction model is constructed by learning the feature set and label values of house source samples at a first time; the second house source prediction model is constructed by learning the feature set and label values of house source samples at a second time, wherein the first time is prior to the second time.
According to an embodiment of the present invention, the first target house source is the target house source at the first time; the second target house source is the target house source at the second time.
According to an embodiment of the present invention, the first target house source and the second target house source are house sources in the same city.
According to another aspect of the present invention, there is also provided an apparatus for updating a neural network prediction model, including: an acquisition module configured to obtain a current training sample; a training module configured to train a preset first prediction model based on the current training sample to obtain a second prediction model; a parsing module configured to obtain, through a model interpretation tool, a first feature importance set consisting of the importance of various features of a first object predicted by the first prediction model and a second feature importance set consisting of the importance of various features of a second object predicted by the second prediction model; a computing module configured to calculate the similarity between the first feature importance set and the second feature importance set; a comparison module configured to compare the similarity with a preset threshold; and an update module configured to update the first prediction model to the second prediction model if the similarity is greater than or equal to the preset threshold, and not to update the first prediction model if the similarity is smaller than the preset threshold.
According to an embodiment of the present invention, the parsing module includes: a model interpretation submodule configured to: obtaining, by a model interpretation tool, an importance value of each feature of each sub-object included in each of the first object and the second object; a feature clustering submodule configured to: clustering the importance value of each feature of each sub-object contained in the first object and the second object respectively to obtain the importance of each feature of the first object and the second object respectively; a feature ordering submodule configured to: and respectively sorting the importance of each feature of the first object and the second object to obtain a first importance sorting set corresponding to the first object and a second importance sorting set corresponding to the second object.
According to another aspect of the present invention, there is also provided a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method for updating a neural network prediction model as described above when executing the program.
According to another aspect of the present invention, there is also provided a computer readable storage medium, having stored thereon a computer program which, when being executed by a processor, carries out the steps of the method for updating a neural network prediction model as set forth above.
According to another aspect of the present invention, there is also provided a computer program product comprising computer instructions which, when executed by a processor, implement the steps of the method for updating a neural network prediction model as described above.
The method and the device for updating the neural network prediction model use an interpretability algorithm to turn the online black-box complex neural network model into a highly white-box one and to diagnose it in real time, ensuring the stability of the model and the reliability of the business, reducing the probability of abnormal scoring results, and improving the operating efficiency of brokers.
Drawings
The above and other features of the present invention will be described in detail below with reference to certain exemplary embodiments thereof, which are illustrated in the accompanying drawings, and which are given by way of illustration only, and thus are not limiting of the invention, wherein:
FIG. 1 shows a flow diagram of a method for updating a neural network predictive model, according to an embodiment of the invention.
Fig. 2 illustrates a flowchart of the process of step S130 of fig. 1 according to an embodiment of the present invention.
FIG. 3 shows a flow diagram of a method for updating a neural network predictive model, according to another embodiment of the invention.
Fig. 4 is a schematic structural diagram of an apparatus for updating a neural network prediction model according to an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of the parsing module of fig. 4 according to an embodiment of the present invention.
Detailed Description
The present invention is described in detail below with reference to specific examples so that those skilled in the art can easily practice the present invention based on the disclosure of the present specification. The embodiments described below are only a part of the embodiments of the present invention, and not all of them. All other embodiments obtained by a person skilled in the art on the basis of the embodiments described in the present specification without inventive step are within the scope of the present invention. It should be noted that the embodiments and features of the embodiments in the present specification may be combined with each other without conflict.
As shown in FIG. 1, the present invention provides a method 100 for updating a neural network predictive model. Specifically, the method 100 includes obtaining a current training sample at S110; training a preset first prediction model based on the current training sample at S120 to obtain a second prediction model; obtaining a first feature importance set composed of the importance of various features of the first object predicted by the first prediction model and a second feature importance set composed of the importance of various features of the second object predicted by the second prediction model through a model interpretation tool at S130; calculating similarities between the first set of feature importance and the second set of feature importance at S140; comparing the similarity with a preset threshold at S150; if the calculated similarity is greater than or equal to the preset threshold, updating the first prediction model to a second prediction model at S160; otherwise, the first prediction model is not updated at S170.
First, the method 100 obtains a current training sample at S110. In one or more embodiments of the invention, the neural network prediction model may be a machine learning model used for prediction in various fields, such as a house source prediction model in the field of real estate. For the house source prediction model, the current training sample comprises a feature set and a label value of the current house source sample.
Further, the house source prediction model may be, for example: a house source transaction probability prediction model that predicts the probability of a future deal on a house source, an owner willingness prediction model that predicts the probability of a future price reduction by the owner, a customer transaction probability prediction model that predicts the probability of a customer deal, and the like. It is worth noting that building the house source prediction model as a neural network model enables deeper learning on the house source samples; compared with a traditional model, it saves feature-engineering cost and yields a better strategy effect. In an alternative embodiment, the house source prediction model may also be constructed as another computational model, such as a linear model or a decision tree model, according to different needs, which the invention does not limit.
For convenience of explanation, the present specification will describe specific embodiments in detail by taking the house source transaction probability prediction model as an example.
In embodiments that employ a house source transaction probability prediction model as the neural network prediction model, the current training sample may include the feature sets and transaction probabilities of a plurality of house source samples. The house source samples may be on-sale house sources over a predetermined historical period, which may be selected based on experience and demand, for example 3 months or half a year. Likewise, the number of house source samples can be selected based on experience and need, enabling different model accuracies, for example 10 million sets.
In one or more embodiments of the invention, the feature set of a house source sample can comprise one or more of a house source basic attribute feature, a market related feature, a broker operation feature, an owner willingness feature, and a customer source heat feature. The house source basic attribute features may include one or more of floor, house type, area, orientation, geographical location, etc.; the market related features may include one or more of house source price, community average house source transaction price, transaction price of house sources of the same house type, and the like; the broker operation features may include one or more of the number of showings, number of interviews, number of stores with showings, number of brokers with showings, number of listing days, and the like; the owner willingness features may include one or more of owner login frequency and/or times, number of price adjustments, and the like; the customer source heat features may include one or more of house source attention, online visit volume, and the like. Additionally or alternatively, the feature set may also include other features; the above list is not intended to limit the invention. The deal probability label of a house source sample may be an indication of whether the house source was sold within a preset number of past days, which may be selected based on experience and demand, for example 14 days.
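The feature groups just described can be pictured as one training record. All field names and values below are illustrative assumptions, not taken from the patent:

```python
# One hypothetical house source training sample, grouped by the feature
# categories described above (illustrative names and values only):
sample = {
    "features": {
        # house source basic attribute features
        "floor": 12, "rooms": 3, "area_sqm": 89.5, "orientation": "south",
        # market related features
        "listing_price": 3_200_000, "community_avg_deal_price": 3_050_000,
        # broker operation features
        "num_showings": 18, "num_listing_days": 42,
        # owner willingness features
        "owner_logins_per_week": 4, "num_price_adjustments": 2,
        # customer source heat features
        "num_followers": 37, "online_visits": 512,
    },
    # label: whether the house source was sold within the past 14 days
    "deal_within_14_days": 1,
}
```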
Returning to fig. 1, the method 100 then trains a preset first prediction model based on the current training sample at step S120 to obtain a second prediction model. Specifically, in the embodiment in which the house source transaction probability prediction model serves as the prediction model, the house source transaction probability prediction model already in use may serve as the first prediction model. The first prediction model is constructed based on historical training samples, which are typically not identical to the current training samples because data is updated and changed over time. In one embodiment of the invention, the current training sample shares part of its house source samples with the historical training sample. For example, assuming the historical training samples include 10 million sets of house source samples, when creating the current training sample, 2 million sets of house source samples are first removed from the 10 million sets; the removed samples may be chosen at random, or selected based on the listing period of the house source, or selected by other rules set by those skilled in the art. Since data changes and is updated over time, for example through house source price adjustments and changes in transaction state, 2 million current new house source samples are then added, which together with the remaining 8 million house source samples form the current training sample. The newly created current training sample containing 10 million house source samples is then used to retrain the first house source transaction probability model, obtaining the second house source transaction probability model.
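The sample-replacement step just described can be sketched as follows. This is an illustrative sketch with random selection; the embodiment equally allows rule-based selection, and the function name is hypothetical:

```python
import random

def rotate_training_samples(history, new_samples, n_drop, seed=0):
    """Drop n_drop old samples from the historical training set (here chosen
    at random) and append the same number of fresh samples, keeping the
    total size of the training set constant."""
    assert len(new_samples) == n_drop
    rng = random.Random(seed)
    kept = rng.sample(history, len(history) - n_drop)  # keep the rest
    return kept + list(new_samples)
```

With the figures above, `history` would hold 10 million samples, `n_drop` would be 2 million, and the returned set would again hold 10 million.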
More specifically, the first house source transaction probability model may be retrained to obtain the second house source transaction probability model based on the house source basic attribute features, market related features, broker operation features, owner willingness features and customer source heat features of the 10 million house source samples, together with the label indicating whether each of the 10 million house source samples was sold within 14 days.
In a preferred embodiment of the present invention, the preset first house source prediction model is constructed by learning the feature set and label values of house source samples at a first time, and the second house source prediction model is constructed by learning the feature set and label values of house source samples at a second time, wherein the first time is prior to the second time. Specifically, the second house source transaction probability prediction model is constructed by learning the feature sets and transaction probabilities of house source samples at time T, while the preset first house source transaction probability prediction model is constructed by learning the feature sets and transaction probabilities of house source samples at time T-n. Here n may be measured in months, weeks, days, hours, minutes, seconds, etc., and n may or may not be an integer. For example, in the embodiment using the house source transaction probability prediction model as the prediction model, suppose the model is to be updated every day: then n is 1, the unit is days, and T-n denotes the day before T. The first house source transaction probability prediction model may be constructed the day before the second one, in which case it is constructed by learning the feature sets and transaction probabilities of house source samples at time T-1. In other embodiments, the first house source transaction probability prediction model may be constructed at other times before the second house source transaction probability prediction model.
The second house source transaction probability prediction model is constructed by learning the feature set and transaction probability of the house source sample at time T, i.e., the current training sample. In other words, the second house source transaction probability prediction model is obtained by retraining the first house source transaction probability prediction model with the feature set and transaction probability of the house source sample at time T.
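As a hedged illustration, the daily retraining step above can be sketched as warm-starting from the previous day's parameters. The sketch below uses a minimal logistic-regression stand-in for the house source transaction probability model; the function names, learning rate, and the two-feature toy data are hypothetical and not part of the patent.

```python
import math

def retrain(weights, samples, labels, lr=0.5, epochs=200):
    """Continue gradient-descent training from the previous model's weights
    (warm start), as in obtaining the second model from the first."""
    w = list(weights)
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))       # predicted deal probability
            for j in range(len(w)):
                w[j] += lr * (y - p) * x[j]      # logistic-loss gradient step
    return w

def predict(weights, x):
    z = sum(wi * xi for wi, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

# Time T-1: first model trained from that day's samples ([bias, feature], label).
first_model = retrain([0.0, 0.0], [[1.0, -1.0], [1.0, 1.0]], [0, 1])

# Time T: second model obtained by retraining the first model on today's samples.
second_model = retrain(first_model, [[1.0, -0.8], [1.0, 1.2]], [0, 1])
```

The warm start means the second model inherits the first model's structure and only adapts it to the current training sample, which matches the retraining described in the text.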
Subsequently, the method 100 may obtain, through the model interpretation tool, a first feature importance set composed of the importance of various features of the first object predicted by the first prediction model and a second feature importance set composed of the importance of various features of the second object predicted by the second prediction model at step S130.
Referring to fig. 2, an embodiment of step S130 of the method of fig. 1 is shown. Step S130 includes obtaining, at S132, an importance value for each feature of each sub-object included in each of the first object and the second object via the model interpretation tool.
In the embodiment using the house source transaction probability prediction model as the prediction model, the house source to be predicted is taken as the target house source, the first object predicted by the first house source transaction probability prediction model is taken as the first target house source, and the second object predicted by the second house source transaction probability prediction model is taken as the second target house source. The target house source has a feature set similar to that of the house source sample, i.e., it may also include one or more of the house source basic attribute features, market related features, broker operation features, owner willingness features, and customer source heat features. As will be understood by those skilled in the art, when the feature set of the target house source is input into the house source transaction probability prediction model, the model outputs a prediction of the probability that the target house source will be transacted within the next 14 days. It should be noted that the feature set input into the house source transaction probability prediction model may cover a single target house source or multiple target house sources; to improve prediction efficiency, the features of multiple target house sources may be input into the model simultaneously for batch prediction.
In an embodiment of the present invention, the first target house source is the target house source at the first time, and the second target house source is the target house source at the second time. Specifically, the first target house source is the target house source at time T-n, and the second target house source is the target house source at time T. For example, as described in the example above, when the house source transaction probability prediction model is updated every day, n is 1 in units of days, and T-n denotes the day before T. The first target house source may be the target house source one day before the second target house source; in that case, the first house source transaction probability model predicts the target house source at time T-1, and the second house source transaction probability prediction model predicts the target house source at time T. Specifically, the features of 1 million first target house sources at time T-1 are input into the first house source transaction probability prediction model simultaneously, so that the transaction probabilities of those 1 million target house sources are obtained at once; likewise, the features of the 1 million second target house sources at time T are input into the second house source transaction probability prediction model simultaneously to obtain their respective transaction probabilities at once. It is worth noting that the target house sources at time T-1 are usually different from the target house sources at time T, i.e., the first object and the second object differ, because the target house sources may change over time. Of course, the first object and the second object may be the same in further embodiments of the invention.
It should also be noted that, the feature set of the room source sample and the feature set of the target room source may be completely the same or partially the same, and the present invention is not limited thereto.
In one or more embodiments of the present invention, model interpretability methods including SHAP (SHapley Additive exPlanations), LIME, Tree Regularization, LRP on LSTM, etc. may be used. The contribution of each feature in the feature set of the target house source input into the house source prediction model to the transaction probability, i.e., the importance value of each feature, can be calculated by the model interpretability methods mentioned above, which are not exhaustively listed in this specification.
In a preferred embodiment of the present invention, the model interpretation tool may be SHAP, which obtains the importance value of each feature of each sub-object included in each of the first object and the second object by calculating the Shapley value of each such feature, thereby yielding a first feature importance set composed of the importance of the various features of the first object and a second feature importance set composed of the importance of the various features of the second object.
In a more preferred embodiment of the present invention, the feature set of each sub-object included in the first object is input into the first prediction model to obtain a first prediction value, and based on the first prediction value and the feature set of each sub-object included in the first object, an importance value of each feature of each sub-object included in the first object is obtained through a model interpretation tool; and inputting the feature set of each sub-object included in the second object into the second prediction model to obtain a second prediction value, and obtaining an importance value of each feature of each sub-object included in the second object through a model interpretation tool based on the second prediction value and the feature set of each sub-object included in the second object.
Those skilled in the art will appreciate that the SHAP algorithm is based on the Shapley value principle in cooperative game theory, which assigns a contribution to each participant by solving a value-allocation problem. In the house source prediction model of the present invention, the output of the model comes from the contributions of the features, so the principle is fully applicable to the interpretation of feature importance. The following formula (1) is the core formula for computing the Shapley value. Let the feature set of the target house source be N = {x_1, x_2, …, x_n}, containing n features x_i, and let any number of features form a subset S ⊆ N \ {x_i}:

φ_i(v) = Σ_{S ⊆ N\{x_i}} [ |S|! (n − |S| − 1)! / n! ] · ( v(S ∪ {x_i}) − v(S) )    (1)

where v(S) denotes the value produced by the cooperation of the features included in subset S; the finally allocated value, the Shapley value φ_i, is in effect the mean of the cumulative marginal contributions of feature x_i.
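To make formula (1) concrete, the following sketch computes exact Shapley values for a small value function v by enumerating every subset S not containing the feature in question. It is a direct brute-force transcription of the formula (exponential in n, so illustrative only); the additive toy game at the end is a hypothetical example, not from the patent.

```python
from itertools import combinations
from math import factorial

def shapley_values(n, v):
    """Exact Shapley values phi_i per formula (1).

    v maps a frozenset of feature indices to the value v(S) produced by
    their cooperation; phi_i is the weighted average of the marginal
    contributions v(S ∪ {i}) - v(S) over all subsets S not containing i.
    """
    phi = [0.0] * n
    for i in range(n):
        others = [p for p in range(n) if p != i]
        for size in range(n):
            for subset in combinations(others, size):
                S = frozenset(subset)
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                phi[i] += weight * (v(S | {i}) - v(S))
    return phi

# Hypothetical additive game: each feature contributes its own weight, so
# the Shapley value of each feature should equal exactly that weight.
weights = [0.1, 0.3, 0.6]
v = lambda S: sum(weights[i] for i in S)
phi = shapley_values(3, v)
```

The efficiency property of Shapley values also holds here: the values sum to v(N) − v(∅), which underlies the additivity to the model output discussed below.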
Based on the SHAP algorithm, the Shapley value, i.e., the importance value, of each feature can be calculated, thereby achieving model interpretability. In the embodiment using the house source transaction probability prediction model as the neural network prediction model, the importance values of all 697-dimensional features can be obtained by applying the SHAP algorithm to the model. By analyzing the magnitudes of the Shapley values, the importance of each feature can be known, thereby determining the influence of each feature on the transaction of the target house source.
The SHAP algorithm can be used to interpret the difference between the output value output_value of the house source transaction probability prediction model and the reference value base_value computed as the mean over a plurality of output values. Here, output_value is the predicted value of the house source transaction probability prediction model, ranging from 0 to 1, and base_value is the SHAP reference value, i.e., the mean of the model's predicted values. Thus the sum of the Shapley values of all features of the target house source plus base_value equals the model's output_value:

output_value = base_value + Σ_{i=1}^{n} φ_i

The Shapley value φ_i computed for each feature may be positive or negative. When φ_i > 0, the feature corresponding to that Shapley value makes a positive contribution to the transaction of the target house source (it favors a deal), and the larger the value, the larger the contribution. When φ_i < 0, the feature corresponding to that Shapley value makes a negative contribution to the transaction of the target house source (it disfavors a deal), and the larger the absolute value, the larger the negative effect.
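The additivity property above (output_value equals base_value plus the sum of all Shapley values, with positive values favoring a deal and negative values disfavoring it) can be illustrated with a small hypothetical example; the feature names and numbers below are made up for illustration and do not come from the patent.

```python
base_value = 0.35                       # hypothetical SHAP reference value
shap_values = {                         # hypothetical per-feature Shapley values
    "online_visit_amount": 0.20,        # > 0: favors a deal
    "number_of_showings": 0.10,         # > 0: favors a deal
    "days_on_market": -0.05,            # < 0: disfavors a deal
}

# Additivity: the model's output is the base value plus all Shapley values.
output_value = base_value + sum(shap_values.values())

positive = [f for f, s in shap_values.items() if s > 0]   # deal-favoring features
negative = [f for f, s in shap_values.items() if s < 0]   # deal-disfavoring features
```

Splitting the features by the sign of their Shapley value is exactly how the text distinguishes positive from negative contributions to the deal.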
In the embodiment using the house source transaction probability prediction model as the neural network prediction model, for example, after the respective transaction probabilities of 1 million target house sources are obtained through the first house source transaction probability prediction model, the Shapley value, i.e., the importance value, of each feature of each target house source can be obtained using SHAP; that is, the importance value of each feature of each sub-object included in the first object is obtained. Similarly, for the second house source transaction probability prediction model, the importance value of each feature of each sub-object included in the second object can be obtained in the same way.
In the above embodiment, the features (or factors) affecting the house source transaction probability can be located more accurately by the SHAP-based interpretation method. In particular, when the house source transaction probability prediction model is constructed as a neural network model, which is generally regarded as a black box, this interpretation method not only yields a relatively good strategic effect but also increases the user's trust in the house source transaction probability prediction model.
Subsequently, in step S134, the importance values of each feature of each sub-object included in each of the first object and the second object are clustered to obtain the importance of each feature of the first object and of the second object. In an embodiment of the present invention, the absolute values of the importance values of the same feature across the sub-objects included in the first object are summed to obtain the importance of each feature of the first object, and the absolute values of the importance values of the same feature across the sub-objects included in the second object are summed to obtain the importance of each feature of the second object. Specifically, in the embodiment using the house source transaction probability prediction model as the neural network prediction model, the Shapley values, i.e., importance values, of each feature of each target house source in the first target house source, as obtained through the first house source transaction probability prediction model, are clustered; in other words, the Shapley values of different house sources belonging to the same feature are grouped together, the absolute values of these Shapley values are summed, and the summed result is used as the importance of that feature of the first target house source. Similarly, the importance of each feature of the second target house source can be obtained. In another embodiment of the present invention, the importance of each feature can also be obtained by averaging the absolute values; the details of the calculation are not repeated here.
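The clustering step of S134 (group the Shapley values of the same feature across house sources, then sum or average their absolute values) can be sketched as follows; the matrix layout, function name, and feature names are hypothetical illustrations.

```python
def cluster_importance(shap_matrix, feature_names, agg="sum"):
    """Aggregate per-house-source Shapley values into per-feature importance.

    shap_matrix: one row per house source, one column per feature.
    agg: "sum" (sum of absolute values) or "mean" (their average), matching
    the two aggregation variants described in the text.
    """
    importance = {}
    for j, name in enumerate(feature_names):
        col = [abs(row[j]) for row in shap_matrix]
        importance[name] = sum(col) if agg == "sum" else sum(col) / len(col)
    return importance

# Hypothetical Shapley values for 3 house sources and 2 features.
matrix = [[0.2, -0.1],
          [-0.4, 0.3],
          [0.1, 0.2]]
imp = cluster_importance(matrix, ["online_visit_amount", "number_of_showings"])
```

Taking absolute values first matters: a feature that pushes some predictions up and others down still counts as influential, which a plain sum would hide through cancellation.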
Then, at step S136, the importance of each feature of the first object and of the second object is ranked, respectively, to obtain a first importance ranking set corresponding to the first object and a second importance ranking set corresponding to the second object. Specifically, in the embodiment using the house source transaction probability prediction model as the neural network prediction model, the obtained importance of each feature of the first target house source and the second target house source is ranked, for example by magnitude, so that a first importance ranking set corresponding to the first target house source and a second importance ranking set corresponding to the second target house source can be obtained.
In a preferred embodiment of the present invention, the importance values of each feature across the house sources contained in the first target house source and the second target house source, respectively, can be clustered and ranked using the following formula (2):

importance = Sort( Σ_{j=1}^{k} | x_shap^(j) | )    (2)

where x_shap^(j) denotes the Shapley value of the feature for the j-th house source, k denotes the number of house sources contained in the target house source, and Sort denotes a sorting function. The feature ranking thus obtained is interpretable. For example, if the highest-ranked feature is the online visit amount, this indicates that this feature has a greater impact on the house source transaction probability than the other features. As another example, if the second-ranked feature is the number of showings, its influence on the house source transaction probability is second only to the online visit amount while being greater than that of the remaining features.
In an embodiment of the present invention, the set of features whose importance ranks in the top m for the first object is used as the first importance ranking set, and the set of features whose importance ranks in the top m for the second object is used as the second importance ranking set. Specifically, still taking the house source transaction probability prediction model as an example, in order to simplify the analysis after feature ranking, the features ranked in, for example, the top 40 (m = 40) of the feature rankings corresponding to the first target house source and the second target house source are selected as the first importance ranking set and the second importance ranking set, respectively. Where necessary, m may take other values, m being an integer.
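The top-m selection described above can be sketched as follows; the function name and the toy importance scores are hypothetical, and m defaults to the patent's example value of 40.

```python
def top_m_features(importance, m=40):
    """Apply formula (2)'s Sort to a feature-importance mapping and keep
    the m highest-ranked features, yielding an importance ranking set."""
    ranked = sorted(importance, key=importance.get, reverse=True)
    return ranked[:m]

# Hypothetical clustered importance scores.
scores = {"online_visit_amount": 0.7, "number_of_showings": 0.6, "days_on_market": 0.2}
top2 = top_m_features(scores, m=2)
```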
On the basis of the feature ranking, an effective basis for the model's prediction results can be obtained in real time. For example, during the epidemic period in February and March, residential communities were placed under closed management, the volume of offline showings decreased, and users preferred to view and learn about house sources of interest through VR showings, so the ranking of the VR features rose in the model; after the epidemic situation improved in May and June, offline showings resumed as usual and the offline showing features were once again ranked high. It can be seen that the feature ranking of the model changes in real time over time, and determining whether to update the model based solely on the loss function is both uninterpretable and inefficient.
Returning to fig. 1, the method 100 calculates a similarity between the first feature importance set and the second feature importance set at S140. Specifically, in the embodiment using the house source transaction probability prediction model as the neural network prediction model, the similarity between the first feature importance set and the second feature importance set is obtained by calculating the similarity between the first importance ranking set and the second importance ranking set. In an embodiment of the present invention, the similarity may be calculated by taking the intersection of the first importance ranking set and the second importance ranking set. For example, in the example where the house source transaction probability model is to be updated daily, the similarity may be calculated by the following formula (3):
c = | feature_{T-1} ∩ feature_T | / m    (3)

where m has the same meaning as above and denotes the number of top-ranked features retained; feature_{T-1} denotes the first importance ranking set, composed of the top-m features of the house source transaction probability prediction model at time T-1; and feature_T denotes the second importance ranking set, composed of the top-m features of the house source transaction probability prediction model at time T. Taking the intersection of the first importance ranking set and the second importance ranking set, i.e., the proportion of features appearing in both sets among all m features, yields the similarity c between the two sets; c is a value between 0 and 1. This similarity can characterize the structural difference between the house source transaction probability prediction models of two adjacent days.
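Formula (3)'s intersection-over-m similarity between the two ranking sets can be sketched as follows; the function name and the toy rankings are hypothetical illustrations.

```python
def ranking_similarity(features_prev, features_curr, m):
    """Similarity c per formula (3): the fraction of the top-m features
    shared by the T-1 and T ranking sets. Only membership in the top m
    counts; the ordering within the top m is ignored."""
    shared = set(features_prev[:m]) & set(features_curr[:m])
    return len(shared) / m

# Hypothetical top-4 feature rankings on two adjacent days.
top_prev = ["visits", "showings", "price", "age"]
top_curr = ["showings", "visits", "age", "vr_views"]
c = ranking_similarity(top_prev, top_curr, m=4)
```

Here three of the four top features ("visits", "showings", "age") appear in both sets, so c = 3/4, a value between 0 and 1 as the text states.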
In other embodiments of the present invention, the first importance ranking set and the second importance ranking set may be compared in other ways known to those skilled in the art, and the similarity between the two sets may be calculated, which is not limited by the present invention.
Subsequently, the method 100 may determine at step S150 whether the similarity is greater than or equal to a preset threshold: if yes, the method 100 proceeds to step S160 to update the first prediction model to the second prediction model; if no, the method 100 proceeds to step S170 and does not update the first prediction model. Specifically, taking the daily update of the house source transaction probability prediction model as an example, since housing transactions are stable over a certain period, the similarity should reflect the stability of the transaction situation of the market as a whole. If the similarity is too small, the structural difference between the house source transaction probability prediction models of two adjacent days is too large; that is, the newly trained second house source transaction probability model may be abnormal, possibly due to factors such as inaccurate data in the current training sample. If the second house source transaction probability model were to replace the previously used first house source transaction probability model in that state, the output of the model might be abnormal and mislead the brokers, thereby affecting the removal rate of house sources. Therefore, to ensure stability when the second house source transaction probability model replaces the first house source transaction probability model, the similarity between the two should be kept above a certain level; for example, a preset threshold a may be set as the criterion for deciding whether to update the model. The specific value of the preset threshold a can be set according to practical conditions and experience in the field and is not limited here.
In an embodiment of the present invention, the first target house source and the second target house source are house sources in the same city. Specifically, in the embodiment using the house source transaction probability prediction model as the prediction model, since each city has its own characteristics (large versus small cities, coastal versus inland cities, northern versus southern cities, and so on), the house source conditions of different cities differ, and the feature rankings of the target house sources of different cities also differ. To adapt to the situations of different cities, the model may be updated separately for each individual city. Specifically, the first house source transaction probability prediction model and the second house source transaction probability prediction model may be trained on training samples comprising nationwide house source samples, while at prediction time the first target house source and the second target house source are set to house sources of the same city, so that the obtained first importance ranking set and second importance ranking set carry the characteristics of that city to a certain extent, making the comparison between them more targeted. In other embodiments, the first house source transaction probability model and the second house source transaction probability prediction model may also be trained on house source samples of the same city.
In a more preferred embodiment of the present invention, different preset thresholds a may be set for different cities, so as to match the characteristics and special requirements of each city and effectively ensure the reliability of each city's prediction results. In addition, the preset threshold a can be adjusted flexibly: a higher preset threshold a means the model is replaced less frequently, while a lower preset threshold a means the on-line model is replaced more frequently.
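The per-city threshold decision of S150 can be sketched as a simple lookup with a fallback; the city names and threshold values below are hypothetical placeholders, not values from the patent.

```python
CITY_THRESHOLDS = {"city_a": 0.8, "city_b": 0.6}   # hypothetical preset thresholds a

def should_update(similarity, city, default_threshold=0.7):
    """Update the on-line model only when the similarity c reaches the
    city's preset threshold a; otherwise keep the first model."""
    a = CITY_THRESHOLDS.get(city, default_threshold)
    return similarity >= a

decisions = [should_update(0.75, "city_a"),   # 0.75 < 0.8: keep the old model
             should_update(0.75, "city_b"),   # 0.75 >= 0.6: update
             should_update(0.75, "city_c")]   # unknown city: default 0.7, update
```

A stricter (higher) threshold for a city means its model is replaced less frequently, matching the trade-off described in the text.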
Referring to FIG. 3, a method 200 for updating a neural network prediction model in accordance with another embodiment of the present invention is shown. Specifically, the method 200 includes obtaining a current training sample at S210; training a preset first house source prediction model based on the current training sample at S220 to obtain a second house source prediction model; obtaining an importance value of each feature of each house source contained in each of the first target house source and the second target house source through a model interpretation tool at S230; at S240, clustering the importance values of each feature of each house source contained in each of the first target house source and the second target house source, respectively, to obtain the importance of each feature of the first target house source and of the second target house source; at S250, ranking the importance of each feature of the first target house source and the second target house source, respectively, to obtain a first importance ranking set corresponding to the first target house source and a second importance ranking set corresponding to the second target house source; calculating a similarity between the first importance ranking set and the second importance ranking set at S260; and at S270, determining whether the similarity is greater than or equal to a preset threshold: if yes, the method 200 proceeds to S280 to update the first house source prediction model to the second house source prediction model; if no, the method 200 proceeds to S290 and does not update the first house source prediction model.
In the method 200, the prediction model is a house source prediction model, and steps S210 to S290 can be implemented based on methods similar to those described above with reference to steps S110 to S170 shown in fig. 1 and steps S132 to S136 shown in fig. 2, and will not be described again here.
Based on the same inventive concept, fig. 4 is a schematic structural diagram illustrating an apparatus for updating a neural network prediction model according to an embodiment of the present invention, the apparatus including: an obtaining module 10 configured to obtain a current training sample, in an embodiment of the present invention, the obtaining module 10 may be configured to perform steps shown in S110 in fig. 1, S210 in fig. 3, and corresponding to S110 in fig. 1, S210 in fig. 3 in this specification; a training module 20 configured to train a preset first prediction model based on the current training sample to obtain a second prediction model, in an embodiment of the present invention, the training module 20 may be configured to perform steps shown in S120 in fig. 1, S220 in fig. 3, and corresponding to S120 in fig. 1 and S220 in fig. 3 in this specification; a parsing module 30 configured to obtain, through a model interpretation tool, a first feature importance set composed of importance of various features of the first object predicted by the first prediction model and a second feature importance set composed of importance of various features of the second object predicted by the second prediction model, in an embodiment of the present invention, the parsing module 30 may be configured to perform steps shown in S130 in fig. 1, S132-S136 in fig. 2, S230-S250 in fig. 3, and corresponding in this specification to S130 in fig. 1, S132-S136 in fig. 2, S230-S250 in fig. 3; a calculating module 40 configured to calculate similarities between the first feature importance set and the second feature importance set, in an embodiment of the present invention, the calculating module 40 may be configured to execute steps shown in S140 in fig. 1, S260 in fig. 3, and corresponding to S140 in fig. 1, S260 in fig. 
3 in this specification; a comparing module 50 configured to compare the similarity with a preset threshold, in an embodiment of the present invention, the comparing module 50 may be configured to perform steps shown in S150 in fig. 1, S270 in fig. 3, and corresponding to S150 in fig. 1, S270 in fig. 3 in this specification; an updating module 60 configured to update the first prediction model to the second prediction model if the similarity is greater than or equal to the preset threshold; if the similarity is less than the preset threshold, the first prediction model is not updated, and in an embodiment of the present invention, the updating module 60 may be configured to execute steps S160-S170 in fig. 1 and S280-S290 in fig. 3, and in this specification, steps corresponding to S160-S170 in fig. 1 and S280-S290 in fig. 3.
Fig. 5 illustrates an embodiment of a parsing module of the apparatus shown in fig. 4. The parsing module 30 may include: a model interpretation sub-module 301 configured to obtain, through a model interpretation tool, an importance value of each feature of each sub-object included in each of the first object and the second object, in an embodiment of the present invention, the model interpretation sub-module 301 may be configured to perform steps shown in S132 in fig. 2 and S230 in fig. 3, and corresponding to S132 in fig. 2 and S230 in fig. 3 in this specification; a feature clustering sub-module 302 configured to cluster the importance value of each feature of each sub-object included in each of the first object and the second object to obtain the importance of each feature of each of the first object and the second object, in an embodiment of the present invention, the feature clustering sub-module 302 may be configured to perform steps shown in S134 in fig. 2 and S240 in fig. 3, and corresponding to S134 in fig. 2 and S240 in fig. 3 in this specification; a feature sorting sub-module 303 configured to sort the importance of each feature of the first object and the second object respectively to obtain a first importance sorting set corresponding to the first object and a second importance sorting set corresponding to the second object, in an embodiment of the present invention, the feature sorting sub-module 303 may be configured to perform steps S136 in fig. 2, S250 in fig. 3, and corresponding to S136 in fig. 2 and S250 in fig. 3 in this specification.
It will be appreciated that the configurations shown in figures 4 and 5 are merely illustrative and that the apparatus may also include more or fewer modules or components than shown in figures 4 and 5 or have a different configuration than shown in figures 4 and 5.
The present application further provides a computer device, which according to an embodiment of the present invention may include a memory, a processor, and a computer program stored on the memory and executable on the processor, and when the processor executes the computer program, the steps of the method for updating a neural network prediction model described in this specification may be implemented.
Further, the present application provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to perform the steps of the method for updating a neural network prediction model described herein.
Furthermore, the present application also provides a computer program product comprising computer instructions which, when executed by a processor, may implement the steps of the method for updating a neural network prediction model described herein.
In particular, the embodiment processes described above with reference to the flowcharts in the figures may be implemented as computer software programs. For example, embodiments disclosed in the present specification include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the methods illustrated in the flowcharts of the figures, the computer program being executable by a processor for performing the methods of the present application.
It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: a computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a flash memory, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules referred to in the embodiments of the present application may be implemented in software or in hardware. They may also be provided in a processor, which may be described as: a processor comprising an acquisition module, a training module, a parsing module, a calculation module, a comparison module and an update module. The names of these units or modules do not, in some cases, limit the units or modules themselves.
All documents mentioned in this specification are herein incorporated by reference as if each were incorporated by reference in its entirety.
Furthermore, it should be understood that various changes or modifications can be made by those skilled in the art after reading the above description of the present invention, and such equivalents also fall within the scope of the present invention.

Claims (10)

1. A method for updating a neural network predictive model, comprising the steps of:
obtaining a current training sample;
training a preset first prediction model based on the current training sample to obtain a second prediction model;
obtaining, by a model interpretation tool, a first feature importance set composed of the importance of each feature of a first object predicted by the first prediction model, and a second feature importance set composed of the importance of each feature of a second object predicted by the second prediction model;
calculating the similarity between the first feature importance set and the second feature importance set;
comparing the similarity with a preset threshold:
if the similarity is greater than or equal to the preset threshold, updating the first prediction model to the second prediction model; and if the similarity is smaller than the preset threshold, not updating the first prediction model.
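The decision rule of claim 1 can be sketched in a few lines. The snippet below is an illustrative sketch, not the patent's implementation: it assumes each feature importance set is a plain mapping from feature name to importance value, and it uses cosine similarity as one possible similarity measure (the claim does not prescribe a particular metric, threshold, or data structure).

```python
import math

def cosine_similarity(imp_a, imp_b):
    """Cosine similarity between two feature-importance sets,
    each given as a dict mapping feature name -> importance value."""
    feats = sorted(set(imp_a) | set(imp_b))
    va = [imp_a.get(f, 0.0) for f in feats]
    vb = [imp_b.get(f, 0.0) for f in feats]
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    return dot / (na * nb) if na and nb else 0.0

def decide_update(first_importance, second_importance, threshold=0.9):
    """Return True when the retrained (second) model should replace the
    first model, i.e. when the importance sets are similar enough."""
    return cosine_similarity(first_importance, second_importance) >= threshold
```

A model whose retrained version attributes importance to roughly the same features in roughly the same proportions passes the threshold and is swapped in; a retrained model whose importance profile drifts sharply is rejected.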
2. The method of claim 1, wherein obtaining the first feature importance set composed of the importance of each feature of the first object predicted by the first prediction model and the second feature importance set composed of the importance of each feature of the second object predicted by the second prediction model comprises:
obtaining, by a model interpretation tool, an importance value of each feature of each sub-object contained in the first object and in the second object;
clustering the importance values of each feature of the sub-objects contained in the first object and in the second object, respectively, to obtain the importance of each feature of the first object and of the second object;
and sorting the importance of each feature of the first object and of the second object, respectively, to obtain a first importance ranking set corresponding to the first object and a second importance ranking set corresponding to the second object.
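The aggregation-and-sort steps of claim 2 might look like the following sketch. Simple averaging stands in for the unspecified clustering step, and all names are illustrative assumptions rather than the patent's own code.

```python
from collections import defaultdict

def aggregate_importance(per_subobject):
    """per_subobject: list of {feature: importance_value} dicts, one per
    sub-object, as produced by a model interpretation tool.
    Averages each feature's importance across sub-objects (a simple stand-in
    for the clustering step), then ranks features by descending importance."""
    sums, counts = defaultdict(float), defaultdict(int)
    for imp in per_subobject:
        for feat, val in imp.items():
            sums[feat] += val
            counts[feat] += 1
    averaged = {f: sums[f] / counts[f] for f in sums}
    ranking = sorted(averaged, key=averaged.get, reverse=True)  # importance ranking set
    return averaged, ranking
```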
3. The method of claim 2, wherein computing the similarity between the first set of feature importance and the second set of feature importance comprises:
calculating a similarity between the first importance ranking set and the second importance ranking set.
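When the importance sets are compared as orderings (claim 3), a rank correlation such as Spearman's rho is one natural choice; the claim itself fixes no metric, so the following is a minimal sketch under that assumption.

```python
def spearman_rank_similarity(ranking_a, ranking_b):
    """Spearman rank correlation between two importance orderings of the
    same feature set; 1.0 for identical orderings, -1.0 for fully reversed."""
    assert set(ranking_a) == set(ranking_b), "rankings must cover the same features"
    n = len(ranking_a)
    pos_a = {f: i for i, f in enumerate(ranking_a)}
    pos_b = {f: i for i, f in enumerate(ranking_b)}
    d2 = sum((pos_a[f] - pos_b[f]) ** 2 for f in ranking_a)
    return 1.0 - 6.0 * d2 / (n * (n * n - 1)) if n > 1 else 1.0
```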
4. The method of claim 1, wherein the neural network prediction model is a house-source prediction model, wherein the current training sample comprises a feature set and a label value of a current house-source sample, and wherein the method further comprises:
training a preset first house-source prediction model based on the current training sample to obtain a second house-source prediction model;
and obtaining, by a model interpretation tool, a first feature importance set composed of the importance of each feature of a first target house source predicted by the first house-source prediction model, and a second feature importance set composed of the importance of each feature of a second target house source predicted by the second house-source prediction model.
5. The method of claim 4, wherein obtaining the first feature importance set composed of the importance of each feature of the first target house source predicted by the first house-source prediction model and the second feature importance set composed of the importance of each feature of the second target house source predicted by the second house-source prediction model comprises:
obtaining, by a model interpretation tool, an importance value of each feature of each house-source listing contained in the first target house source and in the second target house source;
clustering the importance values of each feature of the house-source listings contained in the first target house source and in the second target house source, respectively, to obtain the importance of each feature of the first target house source and of the second target house source;
and ranking the importance of each feature of the first target house source and of the second target house source, respectively, to obtain a first importance ranking set corresponding to the first target house source and a second importance ranking set corresponding to the second target house source.
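As a worked toy for the house-source case of claims 4–5: the sketch below uses a crude |weight × value| attribution for a linear scorer as a stand-in for a real model interpretation tool (such as SHAP), averages the per-listing attributions over the listings in a target house source, and ranks the features. The attribution rule and all names are illustrative assumptions, not the patent's method.

```python
def listing_attribution(weights, listing):
    """Per-listing feature attribution for a toy linear scorer: |w_f * x_f|.
    A crude stand-in for a real interpretation tool's per-sample values."""
    return {f: abs(weights[f] * listing[f]) for f in weights}

def target_importance_ranking(weights, listings):
    """Average the per-listing attributions over all listings in a target
    house source, then return the features ranked by descending importance."""
    totals = {f: 0.0 for f in weights}
    for listing in listings:
        for f, v in listing_attribution(weights, listing).items():
            totals[f] += v
    avg = {f: totals[f] / len(listings) for f in totals}
    return sorted(avg, key=avg.get, reverse=True)
```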
6. An apparatus for updating a neural network prediction model, comprising:
an acquisition module configured to: obtain a current training sample;
a training module configured to: train a preset first prediction model based on the current training sample to obtain a second prediction model;
a parsing module configured to: obtain, by a model interpretation tool, a first feature importance set composed of the importance of each feature of a first object predicted by the first prediction model, and a second feature importance set composed of the importance of each feature of a second object predicted by the second prediction model;
a calculation module configured to: calculate the similarity between the first feature importance set and the second feature importance set;
a comparison module configured to: compare the similarity with a preset threshold;
an update module configured to: update the first prediction model to the second prediction model if the similarity is greater than or equal to the preset threshold, and not update the first prediction model if the similarity is smaller than the preset threshold.
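The six modules of claim 6 could be wired together as a single pipeline object. The class below is a hypothetical sketch: the method and parameter names are invented for illustration, and the interpretation and similarity functions are injected rather than fixed, since the claim leaves both open.

```python
class ModelUpdatePipeline:
    """Illustrative wiring of the claimed modules; not the patent's code."""

    def __init__(self, interpret_fn, similarity_fn, threshold):
        self.interpret_fn = interpret_fn    # model interpretation tool (parsing module)
        self.similarity_fn = similarity_fn  # similarity measure (calculation module)
        self.threshold = threshold          # preset threshold (comparison module)

    def run(self, current_model, train_fn, sample):
        """Train a candidate on the sample, compare importance sets, and
        return the model that should serve after this cycle."""
        candidate = train_fn(current_model, sample)            # training module
        first_imp = self.interpret_fn(current_model, sample)   # parsing module
        second_imp = self.interpret_fn(candidate, sample)
        sim = self.similarity_fn(first_imp, second_imp)        # calculation module
        if sim >= self.threshold:                              # comparison module
            return candidate                                   # update module: replace
        return current_model                                   # update module: keep
```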
7. The apparatus of claim 6, wherein the parsing module comprises:
a model interpretation submodule configured to: obtain, by a model interpretation tool, an importance value of each feature of each sub-object contained in the first object and in the second object;
a feature clustering submodule configured to: cluster the importance values of each feature of the sub-objects contained in the first object and in the second object, respectively, to obtain the importance of each feature of the first object and of the second object;
a feature sorting submodule configured to: sort the importance of each feature of the first object and of the second object, respectively, to obtain a first importance ranking set corresponding to the first object and a second importance ranking set corresponding to the second object.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1-5 are implemented when the program is executed by the processor.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
10. A computer program product comprising computer instructions, characterized in that the computer instructions, when executed by a processor, implement the steps of the method according to any one of claims 1-5.
CN202011557492.6A 2020-12-24 2020-12-24 Method and device for updating neural network prediction model Pending CN112734086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011557492.6A CN112734086A (en) 2020-12-24 2020-12-24 Method and device for updating neural network prediction model


Publications (1)

Publication Number Publication Date
CN112734086A 2021-04-30

Family

ID=75615737

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011557492.6A Pending CN112734086A (en) 2020-12-24 2020-12-24 Method and device for updating neural network prediction model

Country Status (1)

Country Link
CN (1) CN112734086A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110533489A (en) * 2019-09-05 2019-12-03 腾讯科技(深圳)有限公司 Sample acquiring method and device, equipment, storage medium applied to model training
US20200097858A1 (en) * 2018-09-22 2020-03-26 Securonix, Inc. Prediction explainer for ensemble learning
CN111008898A (en) * 2020-03-10 2020-04-14 支付宝(杭州)信息技术有限公司 Method and apparatus for evaluating model interpretation tools
CN111275060A (en) * 2018-12-04 2020-06-12 北京嘀嘀无限科技发展有限公司 Recognition model updating processing method and device, electronic equipment and storage medium
CN111325344A (en) * 2020-02-24 2020-06-23 支付宝(杭州)信息技术有限公司 Method and apparatus for evaluating model interpretation tools
CN111598338A (en) * 2020-05-18 2020-08-28 贝壳技术有限公司 Method, apparatus, medium, and electronic device for updating prediction model
CN111639673A (en) * 2020-04-27 2020-09-08 沃太能源南通有限公司 Self-interpretation protocol modeling method for processing mixed feature data
CN111861588A (en) * 2020-08-06 2020-10-30 网易(杭州)网络有限公司 Training method of loss prediction model, player loss reason analysis method and player loss reason analysis device
CN111861190A (en) * 2020-07-16 2020-10-30 贝壳技术有限公司 Method and device for generating house source task

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115238596A (en) * 2022-09-22 2022-10-25 中科三清科技有限公司 Data processing method and device, readable storage medium and electronic equipment
CN115238596B (en) * 2022-09-22 2023-01-31 中科三清科技有限公司 Data processing method and device, readable storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
US20230237329A1 (en) Method and System Using a Neural Network for Prediction of Stocks and/or Other Market Instruments Price Volatility, Movements and Future Pricing
Pratt et al. Employee attrition estimation using random forest algorithm
CN109032591B (en) Crowdsourcing software developer recommendation method based on meta-learning
EP3699753A1 (en) Systems and methods for virtual programming by artificial intelligence
CN111738504A (en) Enterprise financial index fund amount prediction method and device, equipment and storage medium
EP4352670A1 (en) Resource allocation optimization for multi-dimensional machine learning environments
Briola et al. Deep learning modeling of limit order book: A comparative perspective
CN110929119A (en) Data annotation method, device, equipment and computer storage medium
CN111861190A (en) Method and device for generating house source task
CN117787569B (en) Intelligent auxiliary bid evaluation method and system
US11468352B2 (en) Method and system for predictive modeling of geographic income distribution
CN112734086A (en) Method and device for updating neural network prediction model
CN116596662A (en) Risk early warning method and device based on enterprise public opinion information, electronic equipment and medium
US20210004716A1 (en) Real-time global ai platform
US20200380446A1 (en) Artificial Intelligence Based Job Wages Benchmarks
KR102519878B1 (en) Apparatus, method and recording medium storing commands for providing artificial-intelligence-based risk management solution in credit exposure business of financial institution
Malawana et al. The Public Sentiment analysis within Big data Distributed system for Stock market prediction–A case study on Colombo Stock Exchange
Kolm et al. Improving Deep Learning of Alpha Term Structures from the Order Book
CN113793220A (en) Stock market investment decision method based on artificial intelligence model and related equipment
CN113570455A (en) Stock recommendation method and device, computer equipment and storage medium
Nagashima et al. Data Imputation Method based on Programming by Example: APREP-S
Benáček et al. Postprocessing of Ensemble Weather Forecast Using Decision Tree–Based Probabilistic Forecasting Methods
AU2020201689A1 (en) Cognitive forecasting
Denanti et al. The Correlation of Headline News Sentiment and Stock Return During Dividend Period
Vodithala et al. A Novel Political Optimizer-Based Feature Selection with an Optimal Machine Learning Model for Financial Crisis Prediction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210430