CN115204386A - Training method of prediction model and method for recommending broker

Training method of prediction model and method for recommending broker

Info

Publication number
CN115204386A
CN115204386A (application CN202210834201.6A)
Authority
CN
China
Prior art keywords
broker
data
neural network
network model
brokers
Prior art date
Legal status
Granted
Application number
CN202210834201.6A
Other languages
Chinese (zh)
Other versions
CN115204386B
Inventor
张露露
刘乔杨
魏淑越
童咏昕
叶杰平
Current Assignee
Seashell Housing Beijing Technology Co Ltd
Original Assignee
Seashell Housing Beijing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Seashell Housing Beijing Technology Co Ltd
Priority to CN202210834201.6A
Publication of CN115204386A
Application granted
Publication of CN115204386B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/06 - Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 - Operations research, analysis or management
    • G06Q10/0631 - Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311 - Scheduling, planning or task assignment for a person or group
    • G06Q10/063112 - Skill-based matching of a person or a group to a task

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Game Theory and Decision Science (AREA)
  • Quality & Reliability (AREA)
  • General Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A training method for a prediction model, and a method, an apparatus, a computer device, a computer-readable storage medium, and a computer program product for recommending brokers are provided. The training method comprises the following steps: acquiring a sample data set of a plurality of brokers, wherein the sample data set comprises a data group for each broker; determining a predicted working capacity of the broker based on an output of a neural network model obtained by inputting the feature data in the data group into the neural network model; updating the neural network model based on the conversion data corresponding to the workload data in the data group being the predicted working capacity; determining a pre-trained neural network model in response to the feature data of all of the plurality of brokers having been input into the neural network model; in response to determining to adjust a broker, adjusting the pre-trained neural network model to obtain a prediction model for the broker; and in response to determining not to adjust a broker, determining the pre-trained neural network model to be the prediction model for the broker.

Description

Training method of prediction model and method for recommending broker
Technical Field
The present disclosure relates to the field of artificial intelligence, and in particular to a method of training a predictive model, a method for recommending brokers, an apparatus, a computer device, a computer-readable storage medium and a computer program product.
Background
In transactions in the real estate domain, for example, it is important to match brokers with customers for potential transactions. Various broker recommendation methods have been developed, but problems such as poor matching quality and improper workload distribution remain, which affect not only service quality but also the long-term development of trading platforms.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, the problems mentioned in this section should not be considered as having been acknowledged in any prior art, unless otherwise indicated.
Disclosure of Invention
It would be advantageous to provide a mechanism that alleviates, mitigates or even eliminates one or more of the above-mentioned problems.
According to an aspect of the present disclosure, there is provided a training method for a prediction model, comprising: obtaining a sample data set for a plurality of brokers, the sample data set comprising a data group for each of the plurality of brokers, the data group comprising feature data, workload data, and conversion data of the broker, the conversion data indicating a quality of service of the broker; for each broker, performing the steps of: determining a predicted working capacity of the broker based on an output of a neural network model obtained by inputting the feature data in the data group into the neural network model, wherein the neural network model comprises L layers of neurons cascaded in sequence, the L-th layer of neurons is an output layer, and L is an integer greater than or equal to 2; and updating the neural network model based on the conversion data corresponding to the workload data in the data group being the predicted working capacity; in response to the feature data of all of the plurality of brokers having been input into the neural network model, determining the updated neural network model to be a pre-trained neural network model; in response to determining to adjust a broker of the plurality of brokers, adjusting, based on the data group of the broker, parameters of the output layer of the pre-trained neural network model while keeping the parameters of layers 1 through L-1 of the pre-trained neural network model unchanged, to obtain a prediction model for the broker; and in response to determining not to adjust a broker of the plurality of brokers, determining the pre-trained neural network model to be the prediction model for the broker.
According to an aspect of the present disclosure, there is provided a method for recommending brokers, the method being applied to a computing device running a prediction model, the prediction model being trained according to the method described above, the method comprising: acquiring feature data and workload data of a first broker; determining a predicted working capacity of the first broker based on an output of the prediction model obtained by inputting the feature data of the first broker into the prediction model; and recommending the first broker in response to the workload data of the first broker not exceeding the predicted working capacity of the first broker.
According to an aspect of the present disclosure, there is provided a training apparatus for a prediction model, including: a training data acquisition module configured to acquire a sample data set for a plurality of brokers, the sample data set comprising a data group for each of the plurality of brokers, the data group comprising feature data, workload data, and conversion data of the broker, the conversion data indicating a quality of service of the broker; a working capacity prediction module configured to, for each broker, perform the steps of: determining the predicted working capacity of the broker based on an output of a neural network model obtained by inputting the feature data in the data group into the neural network model, wherein the neural network model comprises L layers of neurons cascaded in sequence, the L-th layer of neurons is an output layer, and L is an integer greater than or equal to 2; and updating the neural network model based on the conversion data corresponding to the workload data in the data group being the predicted working capacity; a pre-trained model determination module configured to determine the updated neural network model as a pre-trained neural network model in response to the feature data of all of the plurality of brokers having been input into the neural network model; and a prediction model output module configured to: in response to determining to adjust a broker of the plurality of brokers, adjust, based on the data group of the broker, parameters of the output layer of the pre-trained neural network model while keeping the parameters of layers 1 through L-1 of the pre-trained neural network model unchanged, to obtain a prediction model for the broker; and in response to determining not to adjust a broker of the plurality of brokers, determine the pre-trained neural network model to be the prediction model for the broker.
According to an aspect of the present disclosure, there is provided an apparatus for recommending brokers, the apparatus being applied to a computing device running a predictive model, the predictive model being trained according to the method described above, the apparatus comprising: a data acquisition module configured to acquire feature data and workload data of a first broker; a working capacity prediction module configured to determine a predicted working capacity of a first broker based on an output of a predictive model resulting from inputting feature data of the first broker into the predictive model; and a broker recommendation module configured to recommend the first broker in response to the workload data of the first broker not exceeding the predicted work capacity of the first broker.
According to an aspect of the present disclosure, there is provided a computer device including: at least one processor; and at least one memory having a computer program stored thereon, wherein the computer program, when executed by the at least one processor, causes the at least one processor to perform any of the methods described above.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to perform any of the methods described above.
According to an aspect of the disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, causes the processor to perform any of the methods described above.
These and other aspects of the disclosure will be apparent from and elucidated with reference to the embodiments described hereinafter.
Drawings
Further details, features and advantages of the disclosure are disclosed in the following description of exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic diagram illustrating an example system in which various methods described herein may be implemented in accordance with example embodiments;
FIG. 2 is a flow chart illustrating a method of training a predictive model in accordance with an exemplary embodiment;
FIG. 3 is a flow chart illustrating a portion of an example process of the method of FIG. 2 in accordance with an example embodiment;
FIG. 4 is a flowchart illustrating a method for recommending brokers in accordance with an illustrative embodiment;
FIG. 5 is a schematic block diagram illustrating a training apparatus of a predictive model in accordance with an exemplary embodiment;
FIG. 6 is a schematic block diagram illustrating an apparatus for recommending brokers in accordance with an illustrative embodiment;
FIG. 7 is a block diagram illustrating an exemplary computer device that can be used in exemplary embodiments.
Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to limit the positional relationship, the timing relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, based on the context, they may also refer to different instances.
The terminology used in the description of the various described examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, there may be one or more elements. As used herein, the term "plurality" means two or more, and the term "based on" should be interpreted as "based, at least in part, on". Furthermore, the terms "and/or" and "at least one of ..." encompass any and all possible combinations of the listed items.
With the development of online merchandise sales platforms, it has become very important to accurately recommend merchandise to customers and to match service personnel with customers for potential transactions.
In the related art, for example in the real estate field, matching brokers with customers for potential house transactions may not take the brokers' own workload into account, which may overload the workload of head (top-performing) brokers. In particular, when a broker is matched with too many customers, the broker may not be able to provide high-quality service, or even to respond at all. Data statistics show that if a broker responds to requests from more than 40 customers per day, the ratio between the number of customers who sign contracts and the total number of customers served drops from 14.3%-27.5% to 2.5%-17.8%. Moreover, because head brokers occupy too many customers, the workload of other brokers may be insufficient.
In addition, broker recommendation systems in the real estate field and similar fields have the characteristic of commodity heterogeneity, i.e., a commodity (e.g., a house) is not purchased repeatedly many times. Moreover, in this field, transaction frequency is low, transaction amounts are large, and customer data is sparse, so existing data-driven techniques do not perform well under such limited training data.
In view of the above, the present disclosure proposes a training method of a prediction model and a method for recommending brokers.
It should be understood that the method proposed by the present disclosure can be applied to brokers, sales personnel, and service providers in fields other than the real estate field as well.
Exemplary embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram illustrating an example system 100 in which various methods described herein may be implemented, according to an example embodiment.
Referring to fig. 1, the system 100 includes a client device 110, a server 120, and a network 130 communicatively coupling the client device 110 and the server 120.
The client device 110 includes a display 114 and a client application (APP) 112 that can be displayed via the display 114. The client application 112 may be an application program that needs to be downloaded and installed before running, or an applet (lite app), which is a lightweight application. In the case where the client application 112 is an application program that needs to be downloaded and installed before running, the client application 112 may be installed on the client device 110 in advance and activated. In the case where the client application 112 is an applet, the user 102 can run the client application 112 directly on the client device 110, without installing it, by searching for the client application 112 in a host application (e.g., by the name of the client application 112) or by scanning a graphical code (e.g., a barcode or two-dimensional code) of the client application 112. In some embodiments, the client device 110 may be any type of mobile computer device, including a mobile computer, a mobile phone, a wearable computer device (e.g., a smart watch, a head-mounted device including smart glasses, etc.), or other type of mobile device. In some embodiments, the client device 110 may alternatively be a stationary computer device, such as a desktop computer, a server computer, or other type of stationary computer device.
The server 120 is typically a server deployed by an Internet Service Provider (ISP) or Internet Content Provider (ICP). Server 120 may represent a single server, a cluster of multiple servers, a distributed system, or a cloud server providing an underlying cloud service (such as cloud database, cloud computing, cloud storage, cloud communications). It will be understood that although the server 120 is shown in fig. 1 as communicating with only one client device 110, the server 120 may provide background services for multiple client devices simultaneously.
Examples of network 130 include a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), and/or a combination of communication networks such as internetworks. The network 130 may be a wired or wireless network. In some embodiments, data exchanged over network 130 is processed using techniques and/or formats including hypertext markup language (HTML), extensible markup language (XML), and the like. In addition, all or some of the links may also be encrypted using encryption techniques such as Secure Sockets Layer (SSL), transport Layer Security (TLS), virtual Private Network (VPN), internet protocol security (IPsec), and so on. In some embodiments, custom and/or dedicated data communication techniques may also be used in place of or in addition to the data communication techniques described above.
For purposes of the disclosed embodiments, in the example of fig. 1, the client application 112 may be an application for recommending brokers, which can provide recommendations of brokers or other service personnel in any business area, such as the real estate area. Accordingly, the server 120 may be a server used with the application for recommending brokers. The server 120 can provide broker recommendations to the client application 112 running in the client device 110 based on information (e.g., feature data) related to brokers.
FIG. 2 is a flowchart illustrating a method 200 of training a predictive model according to an exemplary embodiment.
The method 200 may be performed at a client device (e.g., the client device 110 shown in fig. 1), i.e., the subject of execution of the various steps of the method 200 may be the client device 110 shown in fig. 1. In some embodiments, method 200 may be performed at a server (e.g., server 120 shown in fig. 1). In some embodiments, method 200 may be performed by a client device (e.g., client device 110) in combination with a server (e.g., server 120). In the following, the steps of the method 200 are described in detail by taking the execution subject as the client device 110 as an example.
Referring to fig. 2, method 200 includes steps 210 through 260.
Step 210, a sample data set of the multiple brokers is obtained, where the sample data set includes a data group of each of the multiple brokers, the data group includes feature data, workload data, and conversion data of the broker, and the conversion data indicates quality of service of the broker.
For each broker, steps 220 through 230 are performed.
Step 220, the predicted working capacity of the broker is determined based on an output of a neural network model obtained by inputting the feature data in the data group into the neural network model, where the neural network model includes L layers of neurons cascaded in sequence, the L-th layer of neurons is an output layer, and L is an integer greater than or equal to 2.
Step 230, the neural network model is updated based on the conversion data corresponding to the workload data in the data group being the predicted working capacity.
Step 240, in response to the feature data of all of the plurality of brokers having been input into the neural network model, the updated neural network model is determined to be a pre-trained neural network model.
Step 250, in response to determining to adjust a broker of the plurality of brokers, parameters of the output layer of the pre-trained neural network model are adjusted based on the data group of the broker while the parameters of layers 1 through L-1 of the pre-trained neural network model are kept unchanged, to obtain a prediction model for the broker.
Step 260, in response to determining not to adjust a broker of the plurality of brokers, the pre-trained neural network model is determined to be the prediction model for the broker.
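To make steps 250 and 260 concrete, the sketch below shows one possible way to fine-tune only the output layer of a pre-trained L-layer network for a single broker while keeping layers 1 through L-1 frozen; the use of PyTorch, the layer sizes, and the squared-error objective are assumptions for illustration, not the patent's prescribed implementation.

```python
import torch
import torch.nn as nn

def build_mlp(in_dim: int, hidden_dim: int = 64) -> nn.Sequential:
    """Assumed L-layer MLP (here L = 3): layers 1..L-1 are hidden, layer L is the output layer.
    The input may be a broker's feature vector, optionally concatenated with a workload value."""
    return nn.Sequential(
        nn.Linear(in_dim, hidden_dim), nn.ReLU(),
        nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        nn.Linear(hidden_dim, 1),
    )

def fine_tune_output_layer(pretrained: nn.Sequential, broker_data, epochs: int = 10):
    """Adjust only the output layer's parameters on one broker's data group (steps 250-260)."""
    model = pretrained
    output_layer = model[-1]
    for p in model.parameters():
        p.requires_grad = False          # keep layers 1..L-1 unchanged
    for p in output_layer.parameters():
        p.requires_grad = True           # adjust only the L-th (output) layer
    opt = torch.optim.Adam(output_layer.parameters(), lr=1e-3)
    for _ in range(epochs):
        for x, target in broker_data:    # e.g., (input tensor, observed conversion value)
            opt.zero_grad()
            loss = (model(x) - target) ** 2
            loss.backward()
            opt.step()
    return model
```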
The method 200 trains the prediction model using sample data comprising feature data, workload data, and conversion data of brokers, and updates the neural network model with the conversion data during training. The trained prediction model can therefore fully account for the influence of a broker's working capacity and actual workload on the conversion data during broker recommendation, avoiding overloading brokers and improving the recommendation effect. Furthermore, recommending appropriate brokers to customers improves the overall transaction success rate, which benefits the long-term development of trading platforms.
In addition, the method 200 can mitigate the problem of sparse training data for an individual broker by first determining a pre-trained neural network model and then fine-tuning it for the individual broker. This is particularly beneficial for industries such as real estate, with low transaction frequency, sparse customer data, and non-homogeneous commodities, and can address the cold-start problem in online learning.
According to some embodiments, the feature data of a broker may include, but is not limited to, at least one of the following: the broker's personal information, the broker's business performance information, and the broker's business preference information.
In particular, in some embodiments, the personal information of the broker may include various basic information related to the broker, such as the broker's age, highest education level, and position.
In some embodiments, the business performance information of the broker may include data statistics related to the broker's business, such as the broker's message response rate (the proportion of service request messages responded to), the broker's average number of conversations with clients, the number of houses the broker has shown offline, the number of houses the broker has shown via VR, the time the broker has spent showing houses via VR, the number of clients the broker has answered by telephone, the time the broker has spent answering clients by telephone, the number of clients the broker has answered via the application, the time the broker has spent answering clients via the application, the number of houses currently maintained by the broker, the number of clients served by the broker, the number of house transactions completed by the broker, and so on. The above statistics may be collected over a predetermined period of time, for example over the last 7/14/30/90 days.
In some embodiments, the business preference information of the broker may include the broker's preferred residential communities and the areas around the broker's preferred points of interest, the house prices, areas, and house types preferred by the broker, and so on.
It should be understood that any other form of broker feature data is possible, as long as the feature data is capable of indicating differences between different brokers in relation to corresponding traffic.
Illustratively, a triplet (x_b, w_b, s_b) may be used to represent the data group of broker b, where x_b, w_b, and s_b represent the broker's feature data, workload data, and conversion data, respectively. The workload data w_b may be the number of requests the broker responds to, and the conversion data s_b may be the broker's contract completion rate (i.e., the ratio between the number of customers who sign contracts and the total number of customers responded to).
According to some embodiments, the data set for each broker of the plurality of brokers comprises a plurality of data subsets corresponding to a plurality of time periods, each data subset comprising feature data, workload data, and conversion data for the broker over the corresponding time period. For example, each subset of data may correspond to a 1 day period, i.e., the characteristic data, workload data, and conversion data for the subset of data represent the characteristic data, actual workload data, and actual conversion data for the day for the broker. It should be understood that the time period can be any time period of hours, days, weeks, months, quarters, years, etc. The data subsets may also be determined by specifying a start time point and an end time point.
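As one possible representation (not prescribed by the disclosure), the per-broker triplets and time-period subsets described above could be organized as follows; the field names and the daily granularity are assumptions used for the sketch.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BrokerRecord:
    """One data subset: a broker's data for a single time period (e.g., one day)."""
    features: List[float]   # x_b: personal, performance, and preference features
    workload: float         # w_b: e.g., number of customer requests responded to
    conversion: float       # s_b: e.g., contract completion rate, used as the reward signal

@dataclass
class BrokerDataGroup:
    """Data group for one broker: subsets over a plurality of time periods."""
    broker_id: str
    records: List[BrokerRecord]

# A sample data set covers a plurality of brokers.
sample_data_set: List[BrokerDataGroup] = [
    BrokerDataGroup(
        broker_id="b001",
        records=[BrokerRecord(features=[0.3, 0.8, 0.5], workload=25.0, conversion=0.12)],
    ),
]
```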
Illustratively, the neural network model in the method 200 may be a fully-connected multi-layer Perceptron (MLP) model, but may also be other types of neural network models.
Fig. 3 is a flow chart illustrating a partial example of the method 200 of fig. 2 in accordance with an example embodiment. As shown in fig. 3, according to some embodiments, the step 220 performed for each broker, determining the predicted work capacity of the broker based on the output of the neural network model resulting from inputting the feature data in the data set into the neural network model, may include:
At step 321, a plurality of candidate working capacity data are determined.
At step 322, for each candidate working capacity data of the plurality of candidate working capacity data: the feature data in the data group and the candidate working capacity data are input into the neural network model to obtain a reward function value, output by the neural network model, corresponding to the candidate working capacity data.
At step 323, from the plurality of candidate working capacity data, the candidate working capacity data whose reward function value has the highest upper confidence bound is determined to be the predicted working capacity of the broker.
According to some embodiments, the plurality of candidate working capacity data in step 321 are determined based on the feature data of the broker, e.g., based on x_b in the triplet. In some embodiments, higher candidate working capacity data are selected for a broker when the feature data indicate that the broker is capable of a higher working capacity, and lower candidate working capacity data are selected when the feature data indicate otherwise. For example, the candidate working capacity data may be a set of candidate values, the total number of which is determined based on the accuracy requirements of the prediction model.
According to some embodiments, the upper confidence bound for the reward function value is a function of: a value of the reward function; and a gradient value of the reward function value with respect to both the feature data and a parameter of the neural network model, wherein the parameter of the neural network model comprises a parameter for each of the L layers of neurons.
In some embodiments, the upper confidence bound of the reward function value S_θ(x_b, c) for a candidate working capacity c can be expressed by the formula shown in the original drawing (Figure BDA0003746848060000081), where g(x_b, θ) is the gradient value of the reward function value with respect to the feature data x_b and the parameters θ of the neural network model, a is a preset learning rate parameter, and D is a preset conversion matrix.
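The bound itself appears only as an image in the published text. The sketch below therefore assumes a NeuralUCB-style form, UCB(c) = S_θ(x_b, c) + a * sqrt(g(x_b, θ)^T D^{-1} g(x_b, θ)), and approximates g(x_b, θ) by the gradient of the network output with respect to the model parameters; the class and function names, the use of PyTorch, and the plain MLP architecture are illustrative assumptions rather than the patent's definitive implementation.

```python
import numpy as np
import torch
import torch.nn as nn

class RewardMLP(nn.Module):
    """Assumed L-layer fully connected network (here L = 3); input is the broker's feature
    vector concatenated with a candidate working capacity, output is a reward value."""
    def __init__(self, feature_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feature_dim + 1, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1),  # L-th layer: the output layer
        )

    def forward(self, x):
        return self.net(x)

def predict_working_capacity(model, x_b, candidates, D_inv, a=0.1):
    """Steps 321-323: evaluate each candidate working capacity and return the one whose
    reward function value has the highest upper confidence bound."""
    best_c, best_ucb = None, -float("inf")
    for c in candidates:
        inp = torch.tensor(np.append(x_b, c), dtype=torch.float32)
        model.zero_grad()
        reward = model(inp)
        reward.backward()  # gradient of the reward value w.r.t. the model parameters
        g = torch.cat([p.grad.flatten() for p in model.parameters()]).detach().numpy()
        ucb = reward.item() + a * np.sqrt(g @ D_inv @ g)  # exploration bonus using matrix D
        if ucb > best_ucb:
            best_c, best_ucb = c, ucb
    return best_c
```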
Returning to fig. 2, in some embodiments, step 230 of updating the neural network model based on the conversion data corresponding to the workload data in the data group being the predicted working capacity includes: calculating a loss function based on a difference between the actual conversion data in the data group and the reward function value, and updating the neural network model based on the loss function.
In this way, the method 200 can capture the complex association between the reward function value and the broker's feature data, making the output of the prediction model more accurate.
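A minimal sketch of this update step, assuming a squared-error loss between the reward value predicted for the observed workload and the actual conversion data (the optimizer choice and PyTorch usage are assumptions):

```python
import torch

def update_on_observation(model, optimizer, x_b, w_b, s_b):
    """One training update: compare the predicted reward at the actual workload w_b
    with the observed conversion data s_b and take a gradient step."""
    inp = torch.tensor(list(x_b) + [w_b], dtype=torch.float32)
    predicted_reward = model(inp)
    loss = (predicted_reward - s_b) ** 2  # difference between prediction and actual conversion
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```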
FIG. 4 is a flowchart illustrating a method 400 for recommending brokers, according to an example embodiment.
Method 400 may be performed at a client device (e.g., client device 110 shown in fig. 1), i.e., the subject of execution of the various steps of method 400 may be client device 110 shown in fig. 1. In some embodiments, method 400 may be performed at a server (e.g., server 120 shown in fig. 1). In some embodiments, method 400 may be performed by a client device (e.g., client device 110) in combination with a server (e.g., server 120). In the following, the execution subject is taken as the server 120 for example, and the steps of the method 400 are described in detail.
Referring to fig. 4, method 400 includes steps 410 through 430. The method 400 is applied to a computing device running a predictive model trained according to the method described above.
At step 410, feature data and workload data of a first broker are obtained.
At step 420, a predicted working capacity of the first broker is determined based on an output of the prediction model obtained by inputting the feature data of the first broker into the prediction model.
At step 430, the first broker is recommended in response to the workload data of the first broker not exceeding the predicted working capacity of the first broker.
The method 400 first obtains a broker's predicted working capacity using a prediction model trained by, for example, the method 200, and then checks whether the broker's workload exceeds that capacity. This effectively avoids overloading the broker's workload and thereby improves the broker recommendation effect.
According to some embodiments, the method 400 further comprises: determining a candidate broker set including a plurality of brokers, the plurality of brokers including at least the first broker and a second broker, wherein the second broker has the highest priority among brokers having a lower priority than the first broker; and, in response to the workload data of the first broker exceeding the predicted working capacity: obtaining feature data and workload data of the second broker; determining a predicted working capacity of the second broker based on an output of the prediction model obtained by inputting the feature data of the second broker into the prediction model; and recommending the second broker in response to the workload data of the second broker not exceeding the predicted working capacity of the second broker.
It will be appreciated that the candidate broker set may include further brokers, such as a third broker having the highest priority among brokers whose priority is lower than that of the second broker. In response to the workload data of the second broker exceeding its predicted working capacity, the above steps may be repeated for the further brokers, such as the third broker, until a broker whose workload data does not exceed its predicted working capacity is found and recommended.
In some embodiments, when it is determined that a broker's workload data exceeds its predicted working capacity, that broker may be temporarily removed from the candidate broker set (e.g., removed for the current day), and when a customer again requests a broker recommendation, the feature data of the brokers removed from the candidate broker set will no longer be input into the prediction model.
In some embodiments, the plurality of brokers may be ranked by priority, and in response to the workload of a higher-priority broker exceeding its predicted working capacity, that broker is replaced with a lower-priority broker whose workload does not exceed its predicted working capacity, thereby generating a broker recommendation list, as sketched below.
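As an illustration of how the capacity check of steps 410 through 430 can be combined with this priority-based fallback, a minimal sketch follows; the list ordering, the dictionary fields, and the stand-in capacity predictor are assumptions.

```python
from typing import Callable, List, Optional, Sequence

def recommend_broker(
    predict_capacity: Callable[[str, Sequence[float]], float],
    candidate_brokers: List[dict],
) -> Optional[dict]:
    """candidate_brokers is assumed to be ordered by descending priority; each entry
    carries the broker's id, current feature data, and current workload."""
    for broker in candidate_brokers:
        capacity = predict_capacity(broker["id"], broker["features"])
        if broker["workload"] <= capacity:
            return broker      # recommend the first broker whose workload does not exceed capacity
    return None                # every candidate is overloaded; no recommendation for now

# Example usage with a stand-in capacity predictor.
brokers = [
    {"id": "b001", "features": [0.9, 0.7], "workload": 45},
    {"id": "b002", "features": [0.6, 0.4], "workload": 20},
]
print(recommend_broker(lambda _id, f: 40.0, brokers))  # falls back to b002
```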
According to some embodiments, the method 400 further comprises: obtaining new workload data (e.g., a new w_b) and new conversion data (e.g., a new s_b) of the plurality of brokers after a predetermined period of time; and updating the sample data set of the plurality of brokers based on the new workload data and the new conversion data for use in training the prediction model. For example, the predetermined time period may be 1 day, that is, after the business of the current day is finished, new workload data and new conversion data of the plurality of brokers based on the day's recommendation results may be acquired and added to the sample data set, or may replace part of the data in the sample data set, so as to train the prediction model.
Illustratively, a new triplet (x_b, w_b, s_b) may be derived based on the new workload data and the new conversion data. The triplets may be stored in a buffer OB, and the conversion matrix D may be further updated by

D = D + g(x_b, θ) · g(x_b, θ)^T,

where g(x_b, θ) is the gradient value of the reward function value with respect to the feature data x_b and the parameters θ of the neural network model. A loss function is then calculated, based on the new triplets in the buffer OB, according to the formula shown in the original drawing (Figure BDA0003746848060000091), the neural network is updated, and the buffer is then cleared (see Figure BDA0003746848060000092), where θ is a parameter of the neural network model and λ is a regularization parameter.
Thus, the method 400 may enable further updates to the predictive model to improve recommendation efficiency.
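As an illustration of this online update, the sketch below stores new triplets in a buffer, updates the conversion matrix D with the outer product of the gradient, performs gradient steps on an assumed loss (squared error between the predicted reward and the observed conversion data plus an L2 term weighted by λ, standing in for the formulas shown only as images), and then clears the buffer; the optimizer and PyTorch usage are assumptions.

```python
import numpy as np
import torch

def online_update(model, optimizer, D, buffer, lam=1e-3):
    """Process buffered triplets (x_b, w_b, s_b): update the conversion matrix D,
    take gradient steps on an assumed regularized squared-error loss, then clear the buffer."""
    for x_b, w_b, s_b in buffer:
        inp = torch.tensor(list(x_b) + [w_b], dtype=torch.float32)
        model.zero_grad()
        reward = model(inp)
        reward.backward()
        g = torch.cat([p.grad.flatten() for p in model.parameters()]).detach().numpy()
        D += np.outer(g, g)                     # D = D + g(x_b, θ) · g(x_b, θ)^T

        optimizer.zero_grad()
        loss = (model(inp) - s_b) ** 2          # assumed data term
        loss = loss + lam * sum((p ** 2).sum() for p in model.parameters())  # assumed λ regularizer
        loss.backward()
        optimizer.step()
    buffer.clear()
    return D
```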
Although the operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, nor that all illustrated operations be performed, to achieve desirable results.
Fig. 5 is a schematic block diagram illustrating a training apparatus 500 of a predictive model according to an exemplary embodiment.
As shown in fig. 5, the training apparatus 500 for the prediction model includes: a training data acquisition module 510, a working capacity prediction module 520, a pre-training model determination module 530, and a prediction model output module 540.
A training data acquisition module 510 configured to acquire a sample data set for a plurality of brokers, the sample data set comprising a data set for each of the plurality of brokers, the data set comprising feature data, workload data, and conversion data for the broker, the conversion data indicating a quality of service for the broker.
A working capacity prediction module 520 configured to perform, for each broker, the steps of: determining the predicted working capacity of the broker based on an output of a neural network model obtained by inputting the feature data in the data group into the neural network model, wherein the neural network model comprises L layers of neurons cascaded in sequence, the L-th layer of neurons is an output layer, and L is an integer greater than or equal to 2; and updating the neural network model based on the conversion data corresponding to the workload data in the data group being the predicted working capacity.
A pre-trained model determination module 530 configured to determine the updated neural network model as a pre-trained neural network model in response to feature data of all of the plurality of brokers having been input into the neural network model.
A prediction model output module 540 configured to: in response to determining to adjust a broker of the plurality of brokers, adjust, based on the data group of the broker, parameters of the output layer of the pre-trained neural network model while keeping the parameters of layers 1 through L-1 of the pre-trained neural network model unchanged, to obtain a prediction model for the broker; and in response to determining not to adjust a broker of the plurality of brokers, determine the pre-trained neural network model to be the prediction model for the broker.
The apparatus 500 trains the prediction model using sample data comprising feature data, workload data, and conversion data of brokers, and updates the neural network model with the conversion data during training. The trained prediction model can therefore fully account for the influence of a broker's working capacity and actual workload on the conversion data during broker recommendation, avoiding overloading brokers and improving the recommendation effect. Furthermore, recommending appropriate brokers to customers improves the overall transaction success rate, which benefits the long-term development of trading platforms.
In addition, the apparatus 500 can mitigate the problem of sparse training data for an individual broker by first determining a pre-trained neural network model and then fine-tuning it for the individual broker, which is particularly beneficial for industries such as real estate with low transaction frequency, sparse customer data, and non-homogeneous commodities.
It should be understood that the various modules of the apparatus 500 shown in fig. 5 may correspond to the various steps in the method 200 described with reference to fig. 2. Thus, the operations, features and advantages described above with respect to the method 200 are equally applicable to the apparatus 500 and the modules included therein. Certain operations, features and advantages may not be described in detail herein for the sake of brevity.
Fig. 6 is a schematic block diagram illustrating an apparatus 600 for recommending brokers, according to an example embodiment.
As shown in fig. 6, an apparatus 600 for recommending brokers includes: a data acquisition module 610, a working capacity prediction module 620, and a broker recommendation module 630.
A data acquisition module 610 configured to acquire feature data and workload data of a first broker;
a working capacity prediction module 620 configured to determine a predicted working capacity of the first broker based on an output of the prediction model obtained by inputting the feature data of the first broker into the prediction model; and
a broker recommendation module 630 configured to recommend the first broker in response to the workload data of the first broker not exceeding the predicted work capacity of the first broker.
The apparatus 600 first obtains a broker's predicted working capacity using a prediction model trained by, for example, the method 200, and then checks whether the broker's workload exceeds that capacity. This effectively avoids overloading the broker's workload and thereby improves the broker recommendation effect.
It should be understood that the various modules of the apparatus 600 shown in fig. 6 may correspond to the various steps in the method 400 described with reference to fig. 4. Thus, the operations, features and advantages described above with respect to the method 400 are equally applicable to the apparatus 600 and the modules included therein. Certain operations, features and advantages may not be described in detail herein for the sake of brevity.
Although specific functions are discussed above with reference to specific modules, it should be noted that the functions of the various modules discussed herein can be separated into multiple modules and/or at least some of the functions of multiple modules can be combined into a single module. Performing an action by a particular module as discussed herein includes the particular module itself performing the action, or alternatively the particular module invoking or otherwise accessing another component or module that performs the action (or performs the action in conjunction with the particular module). Thus, a particular module that performs an action can include the particular module that performs the action itself and/or another module that the particular module invokes or otherwise accesses that performs the action. For example, the pre-trained model determination module 530 and the predictive model output module 540 may be combined into a single module in some embodiments. For another example, the predictive model output module 540 may include the pre-trained model determination module 530 in some embodiments. As used herein, the phrase "entity a initiates action B" may refer to entity a issuing instructions to perform action B, but entity a itself does not necessarily perform that action B.
It should also be appreciated that various techniques may be described herein in the general context of software, hardware elements, or program modules. The various modules described above with respect to fig. 5 and 6 may be implemented in hardware or in hardware in combination with software and/or firmware. For example, the modules may be implemented as computer program code/instructions configured to be executed in one or more processors and stored in a computer-readable storage medium. Alternatively, the modules may be implemented as hardware logic/circuitry. For example, in some embodiments, one or more of the training data acquisition module 510, the working capacity prediction module 520, the pre-training model determination module 530, and the prediction model output module 540 may be implemented together in a System on Chip (SoC). The SoC may include an integrated circuit chip (which includes one or more components of a Processor (e.g., a Central Processing Unit (CPU), microcontroller, microprocessor, digital Signal Processor (DSP), etc.), memory, one or more communication interfaces, and/or other circuitry), and may optionally execute received program code and/or include embedded firmware to perform functions.
According to an aspect of the disclosure, a computer device is provided that includes a memory, a processor, and a computer program stored on the memory. The computer program, when executed by the processor, causes the processor to execute the computer program to implement the steps of any of the method embodiments described above.
According to an aspect of the present disclosure, a non-transitory computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, implements the steps of any of the method embodiments described above.
According to an aspect of the present disclosure, a computer program product is provided, comprising a computer program which, when executed by a processor, performs the steps of any of the method embodiments described above.
Illustrative examples of such computer devices, non-transitory computer-readable storage media, and computer program products are described below in connection with FIG. 7.
Fig. 7 illustrates an example configuration of a computer device 700 that may be used to implement the methods described herein. For example, the server 120 and/or the client device 110 shown in fig. 1 may include an architecture similar to the computer device 700. The training arrangement 500 of the predictive model and the arrangement 600 for recommending brokers described above may also be implemented in whole or at least in part by a computer device 700 or similar device or system.
The computer device 700 may be a variety of different types of devices. Examples of computer device 700 include, but are not limited to: a desktop computer, a server computer, a notebook or netbook computer, a mobile device (e.g., a tablet, a cellular or other wireless telephone (e.g., a smartphone), a notepad computer, a mobile station), a wearable device (e.g., glasses, a watch), an entertainment device (e.g., an entertainment appliance, a set-top box communicatively coupled to a display device, a game console), a television or other display device, an automotive computer, and so forth.
The computer device 700 may include at least one processor 702, memory 704, communication interface(s) 706, presentation device 708, other input/output (I/O) devices 710, and one or more mass storage devices 712, capable of communication with each other, such as through a system bus 714 or other suitable connection.
The processor 702 may be a single processing unit or multiple processing units, all of which may include single or multiple computing units or multiple cores. The processor 702 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitry, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 702 can be configured to retrieve and execute computer readable instructions stored in the memory 704, the mass storage device 712, or other computer readable medium, such as program code for an operating system 716, program code for application programs 718, program code for other programs 720, and the like.
Memory 704 and mass storage device 712 are examples of computer-readable storage media for storing instructions that are executed by processor 702 to implement the various functions described above. By way of example, memory 704 may generally include both volatile and nonvolatile memory (e.g., RAM, ROM, and the like). In addition, mass storage device 712 may generally include a hard disk drive, solid state drive, removable media, including external and removable drives, memory cards, flash memory, floppy disks, optical disks (e.g., CD, DVD), storage arrays, network attached storage, storage area networks, and the like. The memory 704 and mass storage device 712 may both be referred to herein collectively as memory or computer-readable storage media, and may be non-transitory media capable of storing computer-readable, processor-executable program instructions as computer program code that may be executed by the processor 702 as a particular machine configured to implement the operations and functions described in the examples herein.
A number of programs may be stored on the mass storage device 712. These programs include an operating system 716, one or more application programs 718, other programs 720, and program data 722, which can be loaded into memory 704 for execution. Examples of such applications or program modules may include, for instance, computer program logic (e.g., computer program code or instructions) for implementing the following components/functions: client application 112, method 200, and/or method 400 (including any suitable steps of methods 200, 400), and/or further embodiments described herein.
Although illustrated in fig. 7 as being stored in memory 704 of computer device 700, modules 716, 718, 720, and 722, or portions thereof, may be implemented using any form of computer-readable media that is accessible by computer device 700. As used herein, "computer-readable media" includes at least two types of computer-readable media, namely computer-readable storage media and communication media.
Computer-readable storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer-readable storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computer device. In contrast, communication media may embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism. Computer-readable storage media, as defined herein, does not include communication media.
One or more communication interfaces 706 are used to exchange data with other devices, such as over a network or a direct connection. Such communication interfaces may be one or more of the following: any type of network interface (e.g., a Network Interface Card (NIC)), a wired or wireless interface (such as an IEEE 802.11 wireless LAN (WLAN) interface), a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth™ interface, a Near Field Communication (NFC) interface, etc. The communication interface 706 may facilitate communication over a variety of networks and protocol types, including wired networks (e.g., LAN, cable, etc.) and wireless networks (e.g., WLAN, cellular, satellite, etc.), the Internet, and so on. The communication interface 706 may also provide communication with external storage devices (not shown), such as storage arrays, network attached storage, storage area networks, and so forth.
In some examples, a display device 708, such as a monitor, may be included for displaying information and images to a user. Other I/O devices 710 may be devices that receive various inputs from a user and provide various outputs to the user, and may include touch input devices, gesture input devices, cameras, keyboards, remote controls, mice, printers, audio input/output devices, and so forth.
The techniques described herein may be supported by these various configurations of the computer device 700 and are not limited to specific examples of the techniques described herein. For example, the functionality may also be implemented in whole or in part on a "cloud" using a distributed system. The cloud includes and/or represents a platform for resources. The platform abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud. The resources may include applications and/or data that may be used when performing computing processes on servers remote from the computer device 700. Resources may also include services provided over the internet and/or over a subscriber network such as a cellular or Wi-Fi network. The platform may abstract resources and functions to connect the computer device 700 with other computer devices. Thus, implementations of the functionality described herein may be distributed throughout the cloud. For example, the functionality may be implemented in part on the computer device 700 and in part by a platform that abstracts the functionality of the cloud.
While the disclosure has been illustrated and described in detail in the drawings and the foregoing description, such illustration and description are to be considered illustrative and exemplary and not restrictive; the present disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps not listed, the indefinite article "a" or "an" does not exclude a plurality, the term "plurality" means two or more, and the term "based on" should be construed as "based at least in part on". The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (12)

1. A method of training a predictive model, comprising:
obtaining a sample data set for a plurality of brokers, the sample data set comprising a data set for each of the plurality of brokers, the data set comprising characteristic data, workload data, and conversion data for the broker, the conversion data indicating a quality of service for the broker;
for each broker, performing the steps of:
determining the predicted working capacity of the broker based on an output of a neural network model obtained by inputting the feature data in the data set into the neural network model, wherein the neural network model comprises L layers of neurons cascaded in sequence, the L-th layer of neurons is an output layer, and L is an integer greater than or equal to 2; and
updating the neural network model based on the conversion data corresponding to the workload data in the data set being the predicted working capacity;
in response to the feature data for all brokers of the plurality of brokers having been input into the neural network model, determining that the updated neural network model is a pre-trained neural network model;
in response to determining to adjust a broker of the plurality of brokers, based on the data set for the broker, adjusting parameters of the output layer of the pre-trained neural network model and maintaining the parameters of layers 1 through L-1 of the pre-trained neural network model unchanged to obtain a predictive model for the broker; and
in response to determining not to adjust a broker of the plurality of brokers, determining that the pre-trained neural network model is a predictive model for the broker.
2. The method of claim 1, wherein determining the predicted work capacity of the broker based on an output of a neural network model from inputting the feature data in the data set into the neural network model comprises:
determining a plurality of candidate working capacity data;
for each candidate working capacity data of the plurality of candidate working capacity data:
inputting the characteristic data and the candidate working capacity data in the data set into the neural network model to obtain a reward function value which is output by the neural network model and corresponds to the candidate working capacity data; and
determining, from the plurality of candidate working capacity data, the candidate working capacity data corresponding to the reward function value with the highest upper confidence bound as the predicted working capacity of the broker.
3. The method of claim 2, wherein the upper confidence bound for the reward function value is a function of:
the value of the reward function; and
a gradient value of the reward function value with respect to the feature data and the parameters of the neural network model,
wherein the parameters of the neural network model comprise parameters for each of the L layers of neurons.
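One way to realize a bound that depends on both the reward value and its gradient with respect to all network parameters is a NeuralUCB-style rule, UCB = r + α·√(gᵀZ⁻¹g), where g is the parameter gradient of the reward and Z accumulates past gradients. The sketch below is an assumed illustration of such a rule rather than the formula claimed here; Z_inv and alpha are hypothetical quantities a caller would maintain (binding Z_inv with functools.partial makes this fit the ucb(model, x, reward) signature used in the previous sketch).

```python
import torch


def gradient_ucb(model, x, reward, Z_inv, alpha=1.0):
    """Upper confidence bound built from the reward value and its parameter gradient."""
    model.zero_grad()
    reward.backward(retain_graph=True)
    # gradient of the reward with respect to the parameters of all L layers
    g = torch.cat([p.grad.flatten() for p in model.parameters()])
    exploration = torch.sqrt(g @ Z_inv @ g)  # width of the confidence interval
    return float(reward.detach() + alpha * exploration)
```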
4. The method of claim 2, wherein the plurality of candidate work capacity data is determined based on the feature data of the broker.
5. The method of any of claims 1-2, wherein the feature data comprises at least one of: personal information of the broker, business performance information of the broker, and business preference information of the broker.
6. The method of any of claims 1-2, wherein the data set includes a plurality of data subsets corresponding to a plurality of time periods, each data subset including the feature data, the workload data, and the conversion data for the broker over a corresponding time period.
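As a purely hypothetical illustration of the layout described in claim 6, a broker's data set can be held as a sequence of per-period subsets; the field names and the quarter label below are assumptions, not part of the claim.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class DataSubset:
    period: str                 # e.g. "2022-Q1" (illustrative label)
    feature_data: List[float]   # the broker's feature data for this period
    workload_data: float        # the broker's workload data for this period
    conversion_data: float      # the broker's conversion data for this period


# a broker's data set is then a sequence of such subsets, one per time period
broker_data_set: List[DataSubset] = []
```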
7. A method for recommending brokers, the method applied to a computing device running a predictive model trained according to the method of any one of claims 1-6, the method comprising:
acquiring feature data and workload data of a first broker;
determining a predicted work capacity of the first broker based on an output of the predictive model from inputting the feature data of the first broker into the predictive model; and
recommending the first broker in response to the workload data of the first broker not exceeding the predicted work capacity of the first broker.
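The recommendation rule of claim 7 reduces to a single comparison between the broker's current workload and the predicted work capacity. A hedged sketch, reusing the hypothetical predict_work_capacity helper from above and assuming the broker object exposes feature_data and workload_data attributes:

```python
def maybe_recommend(model, broker, candidate_capacities, ucb):
    """Return the broker if its workload does not exceed its predicted work capacity."""
    capacity = predict_work_capacity(model, broker.feature_data, candidate_capacities, ucb)
    if broker.workload_data <= capacity:  # workload does not exceed the predicted capacity
        return broker
    return None
```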
8. The method of claim 7, further comprising:
determining a set of candidate brokers including a plurality of brokers, the plurality of brokers including at least the first broker and a second broker, wherein the second broker has the highest priority among brokers having a lower priority than the first broker;
in response to the workload data of the first broker exceeding the predicted work capacity of the first broker:
obtaining the feature data and the workload data of the second broker;
determining a predicted work capacity of the second broker based on an output of the predictive model from inputting the feature data of the second broker into the predictive model; and
recommending the second broker in response to the workload data of the second broker not exceeding the predicted work capacity of the second broker.
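Claims 7 and 8 together describe trying the highest-priority broker first and falling back to the next broker in priority order whose workload is still within its predicted work capacity. The sketch below generalizes that fallback over the whole candidate set; broker.priority and the helper functions are assumptions carried over from the earlier sketches.

```python
def recommend(model, candidate_brokers, candidate_capacities, ucb):
    """Walk the candidate set in descending priority; recommend the first broker within capacity."""
    for broker in sorted(candidate_brokers, key=lambda b: b.priority, reverse=True):
        if maybe_recommend(model, broker, candidate_capacities, ucb) is not None:
            return broker
    return None  # no broker currently has spare capacity
```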
9. The method of claim 8, further comprising:
obtaining new workload data and new conversion data for the plurality of brokers after a predetermined period of time;
updating the sample data set for the plurality of brokers, based on the new workload data and the new conversion data, for training the predictive model.
10. A computer device, comprising:
at least one processor; and
at least one memory having a computer program stored thereon,
wherein the computer program, when executed by the at least one processor, causes the at least one processor to perform the method of any one of claims 1 to 9.
11. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, causes the processor to carry out the method of any one of claims 1 to 9.
12. A computer program product comprising a computer program which, when executed by a processor, causes the processor to carry out the method of any one of claims 1 to 9.
CN202210834201.6A 2022-07-14 2022-07-14 Training method of prediction model and method for recommending broker Active CN115204386B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210834201.6A CN115204386B (en) 2022-07-14 2022-07-14 Training method of prediction model and method for recommending broker

Publications (2)

Publication Number Publication Date
CN115204386A (en) 2022-10-18
CN115204386B CN115204386B (en) 2023-04-07

Family

ID=83582146

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210834201.6A Active CN115204386B (en) 2022-07-14 2022-07-14 Training method of prediction model and method for recommending broker

Country Status (1)

Country Link
CN (1) CN115204386B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110751286A (en) * 2018-07-23 2020-02-04 第四范式(北京)技术有限公司 Training method and training system of neural network model
CN113159552A (en) * 2021-04-13 2021-07-23 重庆锐云科技有限公司 Employee incentive management method, system, equipment and storage medium
CN113570259A (en) * 2021-07-30 2021-10-29 北京房江湖科技有限公司 Data evaluation method and computer program product based on dimension model
CN113570260A (en) * 2021-07-30 2021-10-29 北京房江湖科技有限公司 Task allocation method, computer-readable storage medium and electronic device
CN114240697A (en) * 2021-12-21 2022-03-25 贝壳找房网(北京)信息技术有限公司 Method and device for generating broker recommendation model, electronic equipment and storage medium
CN114707877A (en) * 2022-04-15 2022-07-05 贝壳找房网(北京)信息技术有限公司 Method and device for allocating house resource broker service lines

Also Published As

Publication number Publication date
CN115204386B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
Jiao et al. Toward an automated auction framework for wireless federated learning services market
US11138681B2 (en) Inference model for traveler classification
US20160364783A1 (en) Systems and methods for vehicle purchase recommendations
US20160063065A1 (en) Systems, apparatuses, and methods for providing a ranking based recommendation
US11164199B2 (en) Updating projections using listing data
EP4242955A1 (en) User profile-based object recommendation method and device
CN109685536B (en) Method and apparatus for outputting information
CN107194729A (en) Advertisement price competing method, device, electronic installation and computer-readable medium
US20240095599A1 (en) Simplistic machine learning model generation tool for predictive data analytics
US11410203B1 (en) Optimized management of online advertising auctions
US11741111B2 (en) Machine learning systems architectures for ranking
CN111783810A (en) Method and apparatus for determining attribute information of user
CN112948695A (en) User portrait based general financial fast loan product recommendation method and device
CN115204386B (en) Training method of prediction model and method for recommending broker
CN116910373A (en) House source recommendation method and device, electronic equipment and storage medium
US20230297862A1 (en) Performing predictive inferences using multiple predictive models
CN111768218A (en) Method and device for processing user interaction information
CN114662001A (en) Resource interaction prediction model training method and device and resource recommendation method and device
CN114418609A (en) Deal probability estimation method, storage medium, and program product
CN111695919B (en) Evaluation data processing method, device, electronic equipment and storage medium
US20200090244A1 (en) Vehicle inventory availability notification
US9536267B2 (en) Resolving pairwise links to groups
JP2015103246A (en) Price determination method, system, and computer program (robust solution method of price determination for product and service)
CN110378717A (en) Method and apparatus for output information
US11544774B1 (en) Method, apparatus, and computer program product for device rendered object sets based on multiple objectives

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant