CN111784073A - Deep learning-based work platform task workload prediction method - Google Patents

Deep learning-based work platform task workload prediction method

Info

Publication number
CN111784073A
Authority
CN
China
Prior art keywords
task
model
data
deep learning
workload
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010685539.0A
Other languages
Chinese (zh)
Inventor
王�琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Hollow Technology Co ltd
Original Assignee
Wuhan Hollow Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Hollow Technology Co ltd
Priority to CN202010685539.0A
Publication of CN111784073A
Legal status: Pending (Current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks

Abstract

The invention discloses a deep learning-based method for predicting the task workload of a work platform. The method comprises the following steps: acquiring task data issued by historical clients of the work platform and task data completed by employees; performing missing value interpolation and normalization on the task data issued by clients; training an LSTM deep learning model as a single-factor prediction model; constructing an LSTM deep learning model based on a dual attention mechanism as a multi-factor prediction model; inputting the issued task data into the single-factor prediction model to obtain the single-factor prediction result; and feeding the prediction results into a boosting tree for regression calculation to obtain the predicted workload value. On the basis of constructing the single-factor and multi-factor prediction models, the method analyses the causes of work-task failure, selects a seq2seq model with a dual attention mechanism, and performs regression calculation in a boosting tree to obtain the final predicted workload value, so that the predictions of the multiple models are fused and the optimal predicted value is obtained collaboratively.

Description

Deep learning-based work platform task workload prediction method
Technical Field
The invention belongs to the technical field of computer software, and particularly relates to a work platform task workload prediction method based on deep learning.
Background
A work platform is an internet platform that provides work-management services in a crowdsourcing mode. The task-issuing party (outsourcer) publishes work task requirements to the work platform; the platform decomposes the tasks, searches its talent pool for matching contractors according to the skill requirements of each subtask, and assigns the subtasks to suitable contractors. A contractor starts working after receiving the assigned subtasks and submits the work results to the platform once the subtasks are completed; the outsourcer then receives the contractor's task delivery results and reviews them. When the outsourcer publishes tasks, the task cost is managed on the platform, and after the tasks have been delivered and accepted, the platform settles accounts with the contractors.
At present there are several schemes for estimating the task workload of a work platform, but because they consider only a limited set of factors they generally either cannot be applied to the work platform or produce inaccurate predictions. Task workload estimation also frequently faces data scarcity: historical data are lacking, so not enough data can be provided for workload estimation, and the work platform cannot disclose sufficient data for privacy and security reasons. As a result, the data available inside the work platform are very limited and of low usability. The small data volume and poor data usability make the prediction inaccurate, or even impossible.
It is therefore desirable to have a method for work platform task workload prediction based on deep learning that overcomes or at least alleviates the above-mentioned deficiencies of the prior art.
Disclosure of Invention
The invention aims to provide a deep learning-based work platform task workload prediction method which, on the basis of constructing a single-factor prediction model and a multi-factor prediction model, analyses the causes of work-task failure, selects a seq2seq model with a dual attention mechanism, and performs regression calculation in a boosting tree to obtain the final predicted value of the work platform workload, thereby solving the existing problems that the task workload of client requirements is difficult to predict and the pricing is inaccurate.
In order to solve the technical problems, the invention is realized by the following technical scheme:
the invention relates to a work platform task workload prediction method based on deep learning, which comprises the following steps:
step S1: acquiring task data issued by historical clients of a working platform and task data completed by employees;
step S2: carrying out missing value interpolation and normalization processing on the task data issued by the client, and dividing a data set required by training the deep neural network into a training sample set and a test sample set according to a proportion;
step S3: training the LSTM deep learning model through a training sample set to obtain a parameter-adjusted LSTM deep learning model as a single-factor prediction model, and verifying the parameter-adjusted LSTM deep learning model through a testing sample set;
step S4: constructing an LSTM deep learning model based on a dual attention mechanism as a multi-factor prediction model by utilizing the issued task data and the completed task data;
step S5: inputting the issued task data into the single-factor prediction model to obtain the single-factor prediction result, and inputting the issued task data and the completed task data into the multi-factor prediction model to obtain the multi-factor prediction result;
step S6: feeding the prediction result of the single-factor prediction model and the prediction result of the multi-factor prediction model into a boosting tree for regression calculation to obtain the final predicted value of the work platform workload.
Preferably, in step S1, the task data issued by clients includes task content, task duration, task requirements, task budget and task notes; the completed task data includes the completing personnel, completion duration, completed content, task cost, and failure reason.
Preferably, in step S2, the missing value interpolation interpolates, by means of an interpolation algorithm, the missing values in the task data issued by historical clients of the work platform collected in step S1, and deletes the data that cannot be interpolated.
Preferably, in step S1, the normalization process includes the steps of:
step S11: setting a screening rule, and screening the data that has undergone missing value interpolation and can be used for model training;
step S12: setting a data correction rule, and correcting the data screened in step S11 that is not conducive to model training;
step S13: normalizing the data corrected in step S12 before the model is trained.
Preferably, the interpolation algorithm comprises a regression interpolation algorithm, an interpolation algorithm and a nearest neighbor interpolation algorithm.
Preferably, in step S3, the input layer of the single-factor prediction model is the task data issued by historical clients, the hidden layer is structured as a seq2seq model with an encoding-decoding structure inside, the encoding and decoding units are LSTM (long short-term memory) neural networks, and the output layer is the predicted value of the workload of the client-issued task.
Preferably, in step S4, the input layer of the multi-factor prediction model is the task data issued by historical clients together with the completed task data, the hidden layer is structured as a seq2seq model with a dual attention mechanism and an encoding-decoding structure inside, an attention layer is added before both the encoding and the decoding, and the output layer is the predicted value of the workload of the client-issued task.
Preferably, the encoder encodes the target data set to generate a latent encoding vector; the decoder decodes the latent encoding vector to reconstruct the data as it was before encoding.
Preferably, the encoder is an adversarial autoencoder, which includes a generator and a discriminator.
The invention has the following beneficial effects:
(1) On the basis of constructing a single-factor prediction model and a multi-factor prediction model, the method analyses the causes of work-task failure, extracts important features as model inputs, fuses the prediction results of the multiple models, and obtains the optimal predicted value collaboratively;
(2) The method selects a seq2seq model with a dual attention mechanism and performs regression calculation in a boosting tree to obtain the final predicted value of the work platform workload; sequence data are handled well, and the neural network learns the underlying patterns by itself to predict future trends.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a step diagram of a deep learning-based work platform task workload prediction method according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the present invention is a method for predicting task workload of a working platform based on deep learning, comprising the following steps:
step S1: acquiring task data issued by historical clients of a working platform and task data completed by employees;
step S2: carrying out missing value interpolation and normalization processing on the task data issued by the client, and dividing a data set required by training the deep neural network into a training sample set and a test sample set according to a proportion;
step S3: training the LSTM deep learning model through a training sample set to obtain a parameter-adjusted LSTM deep learning model as a single-factor prediction model, and verifying the parameter-adjusted LSTM deep learning model through a testing sample set;
step S4: constructing an LSTM deep learning model based on a dual attention mechanism as a multi-factor prediction model by utilizing the issued task data and the completed task data;
step S5: inputting the issued task data into the single-factor prediction model to obtain the single-factor prediction result, and inputting the issued task data and the completed task data into the multi-factor prediction model to obtain the multi-factor prediction result;
step S6: feeding the prediction result of the single-factor prediction model and the prediction result of the multi-factor prediction model into a boosting tree for regression calculation to obtain the final predicted value of the work platform workload (a minimal illustrative sketch of this fusion step is given directly below).
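The patent gives no source code for this fusion step; the following Python sketch illustrates one way step S6 could be realised, assuming the boosting tree is a gradient-boosted regression tree (here scikit-learn's GradientBoostingRegressor) whose only input features are the two model outputs. All function names and hyper-parameters below are illustrative assumptions, not details taken from the patent.

# Sketch of step S6: fusing the single-factor and multi-factor predictions
# with a boosting-tree regressor. Hyper-parameters are assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def fit_fusion_booster(single_factor_pred, multi_factor_pred, true_workload):
    """Train a boosting tree that maps the two model outputs to the final workload."""
    # Stack the two prediction series as features: shape (n_samples, 2).
    features = np.column_stack([single_factor_pred, multi_factor_pred])
    booster = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05,
                                        max_depth=3)
    booster.fit(features, true_workload)  # regression on historical workloads
    return booster

# At prediction time the fitted booster combines the two model outputs:
# final_workload = booster.predict(np.column_stack([p_single, p_multi]))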
In step S1, the task data issued by clients includes task content, task duration, task requirements, task budget, and task notes; the completed task data includes the completing personnel, completion duration, completed content, task cost, and failure reason.
In step S2, the missing value interpolation interpolates, by means of an interpolation algorithm, the missing values in the task data issued by historical clients of the work platform collected in step S1, and deletes the data that cannot be interpolated (a minimal sketch of this interpolation step is given below).
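The sketch below illustrates the missing value interpolation just described, assuming the task records are held in a pandas DataFrame and using a nearest-neighbour imputer to stand in for the interpolation algorithm; records whose numeric fields are all missing are dropped as data that cannot be interpolated. The column names are hypothetical examples, not field names from the patent.

# Sketch of the missing-value interpolation in step S2.
import pandas as pd
from sklearn.impute import KNNImputer

def impute_task_data(df: pd.DataFrame, numeric_cols) -> pd.DataFrame:
    # Delete records that cannot be interpolated (all numeric fields missing).
    df = df.dropna(subset=numeric_cols, how="all").copy()
    # Nearest-neighbour interpolation of the remaining missing values.
    imputer = KNNImputer(n_neighbors=5)
    df[numeric_cols] = imputer.fit_transform(df[numeric_cols])
    return df

# Hypothetical usage: impute_task_data(raw_tasks, ["task_duration", "task_budget"])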
In step S1, the normalization process includes the following steps:
step S11: setting a screening rule, and screening the data that has undergone missing value interpolation and can be used for model training;
step S12: setting a data correction rule, and correcting the data screened in step S11 that is not conducive to model training;
step S13: normalizing the data corrected in step S12 before the model is trained (a minimal sketch of the normalization and the proportional train/test split follows this list).
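A minimal sketch of the normalization (step S13) and the proportional training/test split of step S2, assuming min-max scaling and an 80/20 split; both choices are assumptions made for illustration, as the patent does not fix them.

# Sketch of normalization and the proportional train/test split.
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

def normalize_and_split(features, targets, test_ratio=0.2):
    scaler = MinMaxScaler()                        # scale each feature to [0, 1]
    features_scaled = scaler.fit_transform(features)
    # shuffle=False keeps chronological order, since the task records form a time series.
    x_train, x_test, y_train, y_test = train_test_split(
        features_scaled, targets, test_size=test_ratio, shuffle=False)
    return x_train, x_test, y_train, y_test, scaler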
The interpolation algorithm comprises a regression interpolation algorithm, an interpolation algorithm and a nearest neighbor interpolation algorithm.
In step S3, the input layer of the single-factor prediction model is the task data issued by historical clients, the hidden layer is a seq2seq model with an encoding-decoding structure inside, the encoding and decoding units are LSTM (long short-term memory) neural networks, and the output layer is the predicted value of the workload of the client-issued task; the causes of work-task failure are analysed and the important features are extracted as model inputs. A minimal illustrative sketch of such an encoder-decoder network follows.
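The patent describes the single-factor model only at the architectural level. The sketch below is one plausible PyTorch realisation of a seq2seq network whose encoding and decoding units are LSTMs, assuming a univariate input series (the historical client-issued task series) and illustrative layer sizes; it is not the patented implementation.

# Sketch of the single-factor model: an LSTM encoder-decoder (seq2seq).
import torch
import torch.nn as nn

class Seq2SeqLSTM(nn.Module):
    def __init__(self, input_dim=1, hidden_dim=64, output_len=1):
        super().__init__()
        self.encoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)    # workload prediction per step
        self.output_len = output_len

    def forward(self, history):                 # history: (batch, seq_len, 1)
        _, (h, c) = self.encoder(history)       # encode the issued-task series
        dec_in = history[:, -1:, :]             # seed decoder with the last observation
        outputs = []
        for _ in range(self.output_len):        # autoregressive decoding
            dec_out, (h, c) = self.decoder(dec_in, (h, c))
            step = self.head(dec_out)           # (batch, 1, 1)
            outputs.append(step)
            dec_in = step                       # feed the prediction back in (univariate only)
        return torch.cat(outputs, dim=1)        # (batch, output_len, 1)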
In step S4, the input layer of the multi-factor prediction model is the task data issued by historical clients together with the completed task data, the hidden layer is a seq2seq model with a dual attention mechanism and an encoding-decoding structure inside, an attention layer is added before both the encoding and the decoding, and the output layer is the predicted value of the workload of the client-issued task; such a model handles sequence data well and predicts future trends by learning the underlying patterns in the neural network by itself. An illustrative sketch of such a dual-attention network follows.
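Likewise, a compact PyTorch sketch of a seq2seq network with a dual attention mechanism, interpreted here as an input-attention layer that re-weights the factor series before encoding plus a temporal-attention layer that weights the encoder states before decoding; this interpretation, and all dimensions, are assumptions rather than details disclosed by the patent.

# Sketch of the multi-factor model: seq2seq with a dual attention mechanism.
import torch
import torch.nn as nn

class DualAttentionSeq2Seq(nn.Module):
    def __init__(self, n_factors, hidden_dim=64):
        super().__init__()
        self.input_attn = nn.Linear(n_factors, n_factors)    # attention over input factors
        self.encoder = nn.LSTM(n_factors, hidden_dim, batch_first=True)
        self.temporal_attn = nn.Linear(hidden_dim, 1)         # attention over time steps
        self.decoder = nn.LSTMCell(hidden_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, factors):            # factors: (batch, seq_len, n_factors)
        # First attention layer: re-weight the factors at every time step before encoding.
        alpha = torch.softmax(self.input_attn(factors), dim=-1)
        enc_out, _ = self.encoder(alpha * factors)            # (batch, seq_len, hidden)
        # Second attention layer: weight the encoder states over time before decoding.
        beta = torch.softmax(self.temporal_attn(enc_out), dim=1)
        context = (beta * enc_out).sum(dim=1)                 # (batch, hidden)
        h, _ = self.decoder(context)                          # decode the attended context
        return self.head(h)                                   # predicted workload, (batch, 1)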
The encoder encodes the target data set to generate a latent encoding vector; the decoder decodes the latent encoding vector and reconstructs the data as it was before encoding.
The encoder is an adversarial autoencoder, which comprises a generator and a discriminator (a minimal sketch is given below).
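A minimal PyTorch sketch of such an adversarial autoencoder: the encoder-decoder pair acts as the generator, and the discriminator judges whether a latent code comes from the encoder or from a prior distribution. The network sizes and the Gaussian prior are assumptions for illustration only.

# Sketch of an adversarial autoencoder (generator + discriminator).
import torch
import torch.nn as nn

class AdversarialAutoencoder(nn.Module):
    def __init__(self, data_dim, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(),
                                     nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                     nn.Linear(64, data_dim))
        self.discriminator = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                           nn.Linear(64, 1), nn.Sigmoid())

    def forward(self, x):
        z = self.encoder(x)             # latent encoding vector
        recon = self.decoder(z)         # reconstruction of the pre-encoding data
        real_score = self.discriminator(torch.randn_like(z))   # code drawn from the prior
        fake_score = self.discriminator(z)                      # code produced by the encoder
        return recon, real_score, fake_score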
It should be noted that, in the above system embodiment, each included unit is only divided according to functional logic, but is not limited to the above division as long as the corresponding function can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
In addition, it is understood by those skilled in the art that all or part of the steps in the method for implementing the embodiments described above may be implemented by a program instructing associated hardware, and the corresponding program may be stored in a computer-readable storage medium.
The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is limited only by the claims and their full scope and equivalents.

Claims (9)

1. A work platform task workload prediction method based on deep learning is characterized by comprising the following steps:
step S1: acquiring task data issued by historical clients of a working platform and task data completed by employees;
step S2: carrying out missing value interpolation and normalization processing on the task data issued by the client, and dividing a data set required by training the deep neural network into a training sample set and a test sample set according to a proportion;
step S3: training the LSTM deep learning model through a training sample set to obtain a parameter-adjusted LSTM deep learning model as a single-factor prediction model, and verifying the parameter-adjusted LSTM deep learning model through a testing sample set;
step S4: constructing an LSTM deep learning model based on a dual attention mechanism as a multi-factor prediction model by utilizing the issued task data and the completed task data;
step S5: inputting the issued task data into the single-factor prediction model to obtain the single-factor prediction result, and inputting the issued task data and the completed task data into the multi-factor prediction model to obtain the multi-factor prediction result;
step S6: feeding the prediction result of the single-factor prediction model and the prediction result of the multi-factor prediction model into a boosting tree for regression calculation to obtain the final predicted value of the work platform workload.
2. The method for predicting task workload of a working platform based on deep learning as claimed in claim 1, wherein in step S1, the task data issued by clients includes task content, task duration, task requirements, task budget and task notes; the completed task data includes the completing personnel, completion duration, completed content, task cost, and failure reason.
3. The method as claimed in claim 1, wherein in the step S2, the missing value interpolation is to interpolate the missing value of the task data issued by the working platform history client collected in the step S1 by an interpolation algorithm and delete the data that cannot be interpolated.
4. The method for predicting task workload of a working platform based on deep learning as claimed in claim 1, wherein in the step S1, the normalization process includes the following steps:
step S11: setting a screening rule, and screening the data that has undergone missing value interpolation and can be used for model training;
step S12: setting a data correction rule, and correcting the data screened in step S11 that is not conducive to model training;
step S13: normalizing the data corrected in step S12 before the model is trained.
5. The method as claimed in claim 3, wherein the interpolation algorithm comprises a regression interpolation algorithm, an interpolation algorithm and a nearest neighbor interpolation algorithm.
6. The method for predicting the task workload of the working platform based on the deep learning of claim 1, wherein in step S3, the input layer of the single-factor prediction model is the task data issued by historical clients, the hidden layer is structured as a seq2seq model with an encoding-decoding structure inside, wherein the encoding and decoding units are LSTM (long short-term memory) neural networks, and the output layer is the predicted value of the workload of the client-issued task.
7. The method for predicting the task workload of the working platform based on the deep learning of claim 1, wherein in step S4, the input layer of the multi-factor prediction model is the task data issued by historical clients together with the completed task data, the hidden layer is structured as a seq2seq model with a dual attention mechanism and an encoding-decoding structure inside, an attention layer is added before both the encoding and the decoding, and the output layer is the predicted value of the workload of the client-issued task.
8. The method according to claim 7, wherein the encoder encodes a target data set to generate a latent encoding vector; the decoder decodes the latent encoding vector to reconstruct the data as it was before encoding.
9. The method of claim 8, wherein the encoder is an adversarial autoencoder, and the adversarial autoencoder comprises a generator and a discriminator.
CN202010685539.0A 2020-07-16 2020-07-16 Deep learning-based work platform task workload prediction method Pending CN111784073A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010685539.0A CN111784073A (en) 2020-07-16 2020-07-16 Deep learning-based work platform task workload prediction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010685539.0A CN111784073A (en) 2020-07-16 2020-07-16 Deep learning-based work platform task workload prediction method

Publications (1)

Publication Number Publication Date
CN111784073A true CN111784073A (en) 2020-10-16

Family

ID=72768332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010685539.0A Pending CN111784073A (en) 2020-07-16 2020-07-16 Deep learning-based work platform task workload prediction method

Country Status (1)

Country Link
CN (1) CN111784073A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108197835A (en) * 2018-02-05 2018-06-22 北京航空航天大学 Method for allocating tasks, device, computer readable storage medium and electronic equipment
CN109214592A (en) * 2018-10-17 2019-01-15 北京工商大学 A kind of Air Quality Forecast method of the deep learning of multi-model fusion
US20200219020A1 (en) * 2019-01-09 2020-07-09 Hrl Laboratories, Llc System and method of structuring rationales for collaborative forecasting
CN109784343A (en) * 2019-01-25 2019-05-21 上海深杳智能科技有限公司 A kind of resource allocation methods and terminal based on deep learning model
CN110766280A (en) * 2019-09-20 2020-02-07 南京领行科技股份有限公司 Vehicle scheduling method and generation method and device of target order prediction model

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113095491A (en) * 2021-06-09 2021-07-09 北京星天科技有限公司 Sea chart drawing prediction model training and sea chart drawing workload prediction method and device

Similar Documents

Publication Publication Date Title
US8180710B2 (en) System, method and computer program product for an interactive business services price determination and/or comparison model
CN109961248B (en) Method, device, equipment and storage medium for predicting waybill complaints
CN111915231A (en) Deep learning-based work platform task allocation method
CN115122155B (en) Machine tool remote diagnosis method and system based on industrial internet big data
CN111083013B (en) Test method and device based on flow playback, electronic equipment and storage medium
Grosfeld-Nir et al. Production to order with random yields: Single-stage multiple lotsizing
EP3789935A1 (en) Automated data processing based on machine learning
CN111784073A (en) Deep learning-based work platform task workload prediction method
US10824956B1 (en) System and method for price estimation of reports before execution in analytics
CN110930254A (en) Data processing method, device, terminal and medium based on block chain
CN112995288A (en) Knowledge management based maintenance method and device and electronic equipment
CN116384921A (en) Execution method and device of operation and maintenance event, storage medium and electronic equipment
CN116629599A (en) Cloud management evaluation method and device, electronic equipment and storage medium
CN113095515A (en) Service fault information processing method and device
CN116402181A (en) Product quality prediction method and system based on identification analysis
CN115904916A (en) Hard disk failure prediction method and device, electronic equipment and storage medium
Kamala et al. Process Mining and Deep Neural Network approach for the Prediction of Business Process Outcome
Zhang et al. A combinational QoS-prediction approach based on RBF neural network
CN114723239A (en) Multi-party collaborative modeling method, device, equipment, medium and program product
CN114596054A (en) Service information management method and system for digital office
CN113743692A (en) Business risk assessment method and device, computer equipment and storage medium
CN114519445A (en) Prediction method and device of service interaction network
CN113723663A (en) Power work order data processing method and device, electronic equipment and storage medium
CN107392415B (en) Telecommunication salesman portrait information processing method and device based on big data
CN114331227B (en) Data analysis method and device, electronic equipment and readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201016