CN111738422A: Data processing method, device and medium based on recurrent neural network

Info

Publication number
CN111738422A
Authority
CN
China
Prior art keywords
time period
neural network
training
recurrent neural
model
Prior art date
Legal status
Pending
Application number
CN202010591342.0A
Other languages
Chinese (zh)
Inventor
康焱
Current Assignee
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date
Filing date
Publication date
Application filed by WeBank Co Ltd
Priority to CN202010591342.0A
Publication of CN111738422A
Status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses a data processing method, apparatus, device, and medium based on a recurrent neural network. The method comprises: acquiring second time series data to be processed and inputting it into a second data processing model, wherein the second data processing model is obtained by performing federal forward training on second recurrent neural network models for different time periods and then performing federal reverse training on those models after the federal forward training is completed, and the second recurrent neural network model for a given time period is trained jointly on the second training time series data and the first training time series data of a first device for the same target time period; and executing a second preset data processing flow on the second time series data to be processed based on the second data processing model to obtain a target prediction label for that data. The method and device aim to solve the technical problem in the prior art that time series data are difficult to predict accurately.

Description

Data processing method, device and medium based on recurrent neural network
Technical Field
The present application relates to the field of artificial intelligence technology for financial technology (Fintech), and in particular, to a data processing method, device, and medium based on a recurrent neural network.
Background
With the continuous development of financial technology, especially internet finance, more and more technologies (such as distributed computing, blockchain, and artificial intelligence) are applied to the financial field. At the same time, the financial industry places higher requirements on these technologies, including higher requirements on data processing.
A recurrent neural network (RNN) model is often used to process time series data. Here, time series data refers to feature data of the same user generated at different data sources (participants or devices) within the same time period. For example, when user X purchases a commodity at a convenience store, a new sales record is generated on the convenience store's server and, at the same time, a new expenditure record for user X is generated on the bank system's server.
At present, different time series data are stored on the servers of different data holders (participants). To protect the privacy of the time series data, the servers cannot exchange time series data directly, that is, they cannot share their respective time series data for joint modeling. As a result, each server can only build a model from its own small amount of time series data, so the model needs to be trained for a longer time to reach the target performance, which consumes a large amount of computer resources and computing power and leaves the utilization rate of computing resources low.
Disclosure of Invention
The present application mainly aims to provide a data processing method, apparatus, device, and medium based on a recurrent neural network, so as to solve the technical problems in the prior art that time series data cannot be exchanged directly between different participants and that the utilization rate of computer computing resources is low.
In order to achieve the above object, the present application provides a data processing method based on a recurrent neural network, which is applied to a second device, where the second device includes second training time series data with preset labels for each time period, and the data processing method based on the recurrent neural network includes:
acquiring second to-be-processed time sequence data, and inputting the second to-be-processed time sequence data into a second data processing model;
wherein the second data processing model is obtained by performing federal forward training on second recurrent neural network models for different time periods and then performing federal reverse training on the second recurrent neural network models for which the federal forward training has been completed;
the second recurrent neural network model for a given time period is obtained by training jointly on the second training time series data and the first training time series data of a first device for the same target time period;
and executing a second preset data processing flow on the second to-be-processed time sequence data based on the second data processing model to obtain a target prediction tag of the second to-be-processed time sequence data.
The present application further provides a data processing apparatus based on a recurrent neural network, which is applied to a second device, the second device includes second training time series data with a preset tag in each time period, and the data processing apparatus based on the recurrent neural network includes:
the first acquisition module is used for acquiring second to-be-processed time sequence data and inputting the second to-be-processed time sequence data into a second data processing model;
wherein the second data processing model is obtained by performing federal forward training on second recurrent neural network models for different time periods and then performing federal reverse training on the second recurrent neural network models for which the federal forward training has been completed;
the second recurrent neural network model for a given time period is obtained by training jointly on the second training time series data and the first training time series data of a first device for the same target time period;
and the first execution module is used for executing a second preset data processing flow on the second to-be-processed time sequence data based on the second data processing model to obtain a target prediction tag of the second to-be-processed time sequence data.
The present application further provides a data processing device based on a recurrent neural network, where the data processing device based on the recurrent neural network is an entity device, and the data processing device based on the recurrent neural network includes: a memory, a processor and a program of the recurrent neural network-based data processing method stored on the memory and executable on the processor, which when executed by the processor, can implement the steps of the recurrent neural network-based data processing method as described above.
The present application also provides a medium having a program stored thereon for implementing the above-described recurrent neural network-based data processing method, wherein the program for implementing the recurrent neural network-based data processing method, when executed by a processor, implements the steps of the above-described recurrent neural network-based data processing method.
In the present application, second time series data to be processed are acquired and input into a second data processing model; the second data processing model is obtained by performing federal forward training on second recurrent neural network models for different time periods and then performing federal reverse training on the models for which the federal forward training has been completed, where the second recurrent neural network model for a given time period is trained jointly on the second training time series data and the first training time series data of the first device for the same target time period; and a second preset data processing flow is executed on the second time series data to be processed based on the second data processing model to obtain a target prediction label for that data. In the prior art, each participant models only a small amount of its own time series data and then processes the time series data to be processed. In the present application, by contrast, the trained second data processing model is built federally, across time periods, from the time series data of different participants while protecting privacy, because the second recurrent neural network model for each time period is trained jointly on the second training time series data and the first training time series data of the first device for the same target time period. This overcomes the heavy consumption of computer computing resources in the prior art, reduces the time needed for the trained model to reach the target performance, and improves the utilization rate of computing resources. Because the model is constructed federally from the time series data of different participants at different moments, the accuracy of predicting, for example, the type of the time series data to be processed is also improved, which solves the technical problem that, in the prior art, time series data cannot be exchanged directly between different participants and are therefore difficult to predict accurately.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below; other drawings can be obtained by those skilled in the art from these drawings without inventive effort.
FIG. 1 is a schematic flowchart of a first embodiment of a recurrent neural network-based data processing method according to the present application;
FIG. 2 is a schematic flowchart of a second embodiment of a recurrent neural network-based data processing method according to the present application;
FIG. 3 is a schematic diagram of an apparatus configuration of a hardware operating environment according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a first scenario in the data processing method based on the recurrent neural network of the present application;
fig. 5 is a schematic diagram of a second scenario in the data processing method based on the recurrent neural network according to the present application.
The objectives, features, and advantages of the present application will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In a first embodiment of the data processing method based on the recurrent neural network, referring to fig. 1, the method is applied to a second device, where the second device includes second training time series data with preset labels in each time period, and the data processing method based on the recurrent neural network includes:
step S10, acquiring second time sequence data to be processed, and inputting the second time sequence data to be processed into a second data processing model;
wherein the second data processing model is obtained by performing federal forward training on second recurrent neural network models for different time periods and then performing federal reverse training on the second recurrent neural network models for which the federal forward training has been completed;
the second recurrent neural network model for a given time period is obtained by training jointly on the second training time series data and the first training time series data of a first device for the same target time period;
step S20, executing a second preset data processing procedure on the second to-be-processed time series data based on the second data processing model, to obtain a target prediction tag of the second to-be-processed time series data.
The method comprises the following specific steps:
step S10, acquiring second time sequence data to be processed, and inputting the second time sequence data to be processed into a second data processing model;
wherein the second data processing model is obtained by performing federal forward training on second recurrent neural network models for different time periods and then performing federal reverse training on the second recurrent neural network models for which the federal forward training has been completed;
the second recurrent neural network model for a given time period is obtained by training jointly on the second training time series data and the first training time series data of a first device for the same target time period;
in this embodiment, the data processing method based on the recurrent neural network is applied to a second device, the second device is in preset communication connection with a first device, and particularly, the second device is in preset communication connection with a plurality of first devices, where the second device is an active participant, the first device is a passive participant, the active participant refers to a participant having second training timing data with a preset tag, and it should be noted that the active participant has the second training timing data corresponding to the preset tag in each time period, so as to train a second recurrent neural network model based on the second training timing data with the preset tag in each time period.
It should be noted that the second recurrent neural network model and the second training time sequence data are in the second device, and the first recurrent neural network model and the first training time sequence data are in the first device.
In this embodiment, before the second recurrent neural network model is trained, each participant performs data preprocessing to obtain a common sample set, and the training time series data (the first training time series data on the first device and the second training time series data on the second device) are derived from that set. Specifically, as shown in fig. 4, the sample data owned by each device (participant) can be regarded as a three-dimensional matrix in which the x axis represents the time series space, the y axis represents the feature space, and the z axis represents the sample space; the participant holding the preset labels is called the active participant (the left diagram in fig. 4, the second device) and the other participants are called passive participants (the right diagram in fig. 4, the first device). All participants find their common sample set through a preset sample alignment algorithm, where a common sample is a sample that is distributed across the participants with different time series features but describes the same entity, for example a common user. In fig. 4, sample 1, sample 2, sample 3, and sample 4 of active participant A correspond to sample 1, sample 2, sample 3, and sample 4 of passive participant B, respectively, and are common samples. In addition, each participant splits the time series space of each sample according to an agreed method, so that every time series segment contains part of the time series features, and assigns a sequence number to each segment; a segment with a smaller sequence number is located earlier in the overall sequence. The length of a time series segment is not limited: it may be a period of time or a single point in time. For each common sample, all participants collect different features of that sample during the same time series segment. For example, in fig. 4, sample 1 is a common sample of parties A and B, and both parties collected different features of sample 1 during time series segment 1; in practice, for instance, a bank and an e-commerce merchant record different consumption data of the same user at the same point in time.
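To make the alignment and segmentation concrete, the following is a minimal, illustrative Python sketch (not taken from the patent): it intersects the sample identifiers of two participants to obtain a common sample set and splits each aligned sample's time axis into agreed, ordered segments. The helper names, the plain set intersection standing in for the preset sample alignment algorithm, and the segment length are all assumptions.

```python
# Illustrative sketch: common-sample alignment and time-segment splitting.
from typing import Dict, List

import numpy as np


def align_common_samples(ids_a: List[str], ids_b: List[str]) -> List[str]:
    """Return the intersection of sample identifiers, preserving A's order."""
    common = set(ids_a) & set(ids_b)
    return [sid for sid in ids_a if sid in common]


def split_time_segments(series: np.ndarray, segment_len: int) -> List[np.ndarray]:
    """Split a (T, feature_dim) series into ordered segments of `segment_len` steps."""
    return [series[t:t + segment_len] for t in range(0, len(series), segment_len)]


# Toy data: party A (active, holds labels) and party B (passive) both hold
# per-sample time series, but with different feature spaces.
ids_a = ["user1", "user2", "user3", "user4"]
ids_b = ["user2", "user3", "user4", "user5"]
data_a: Dict[str, np.ndarray] = {sid: np.random.rand(8, 5) for sid in ids_a}  # 8 steps, 5 features
data_b: Dict[str, np.ndarray] = {sid: np.random.rand(8, 3) for sid in ids_b}  # 8 steps, 3 features

common_ids = align_common_samples(ids_a, ids_b)
segments_a = {sid: split_time_segments(data_a[sid], segment_len=2) for sid in common_ids}
segments_b = {sid: split_time_segments(data_b[sid], segment_len=2) for sid in common_ids}
print(common_ids, len(segments_a["user2"]))  # same segment numbering on both sides
```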
In this embodiment, the second recurrent neural network model is trained jointly on the training time series data (the first training time series data and the second training time series data) of the same user for the same time period. Specifically, the second data processing model is obtained by performing federal forward training on the second recurrent neural network models for different time periods and then performing federal reverse training on the models for which the federal forward training has been completed, and the second recurrent neural network model for a given time period is trained on the second training time series data and the first training time series data of the first device for the same target time period. Training time series data of the same user and time period means feature data of the same user generated within the same time period at different data sources (participants or devices), such as different servers: for example, when user X purchases a commodity at a convenience store (one participant or device), a new sales record is generated on the convenience store's server while a new expenditure record for user X is generated on the bank system's server (another participant or device).
In this embodiment, the second recurrent neural network model may also be trained federally across three or more different data sources (participants or devices), for example data source A, data source B, and data source C. Since the second training time series data with the preset labels reside in data source A, data source A is the second device (the active participant), while data source B and data source C, which hold no labels, are first devices. For the second device, data source A, training the second recurrent neural network model jointly with each first device, data sources B and C, essentially means iterating the following second training process: in each iteration, the second intermediate parameters of each time period are propagated forward based on the second training data of the second device (data source A) for that time period; after the forward propagation of the second intermediate parameters of every time period is completed, feedback adjustment of the model parameters of the second recurrent neural network model of each time period is performed, and during this feedback adjustment the first training data of data source B and of data source C for each time period are incorporated. In other words, the second recurrent neural network model for a given time period is trained jointly on the second training time series data and the first training time series data of the first devices for the same target time period.
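The following Python sketch illustrates, under stated assumptions, the three-party setup just described: data source A is the active participant because it holds the preset labels, while data sources B and C are passive participants. The Participant dataclass, its fields, and the per-segment data layout are illustrative inventions, not the patent's data structures.

```python
# Illustrative sketch of the active/passive participant roles.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

import numpy as np


@dataclass
class Participant:
    name: str
    # One feature matrix per time segment, keyed by common sample id:
    # segments[sample_id][segment_index] -> (segment_len, feature_dim) array.
    segments: Dict[str, List[np.ndarray]] = field(default_factory=dict)
    # Only the active participant holds preset labels for its time segments.
    labels: Optional[Dict[str, List[float]]] = None

    @property
    def is_active(self) -> bool:
        return self.labels is not None


rng = np.random.default_rng(0)
common_ids = ["user2", "user3", "user4"]

party_a = Participant(
    name="data source A",
    segments={sid: [rng.normal(size=(2, 5)) for _ in range(4)] for sid in common_ids},
    labels={sid: [float(rng.integers(0, 2)) for _ in range(4)] for sid in common_ids},
)
party_b = Participant(
    name="data source B",
    segments={sid: [rng.normal(size=(2, 3)) for _ in range(4)] for sid in common_ids},
)
party_c = Participant(
    name="data source C",
    segments={sid: [rng.normal(size=(2, 2)) for _ in range(4)] for sid in common_ids},
)

for p in (party_a, party_b, party_c):
    role = "active" if p.is_active else "passive"
    print(p.name, role, len(p.segments["user2"]), "time segments")
```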
Step S20, executing a second preset data processing procedure on the second to-be-processed time series data based on the second data processing model, to obtain a target prediction tag of the second to-be-processed time series data.
In this embodiment, because the second data processing model has already been trained, executing the second preset data processing flow on the second time series data to be processed based on the second data processing model yields the target prediction label of that data accurately.
Before the step of executing the second preset data processing flow on the second to-be-processed time series data based on the second data processing model, the method includes:
step S01, carrying out federal forward training on second recurrent neural network models in different time periods based on second training time sequence data with preset labels, and then carrying out federal reverse training on the second recurrent neural network models in different time periods after the federal forward training is completed so as to obtain second target recurrent neural network models, wherein the second recurrent neural network models in the same time period are obtained by training the second training time sequence data and the first training time sequence data of the first equipment in the same target time period in a combined manner;
to accurately execute a second preset data processing flow on the second time series data to be processed, a second data processing model needs to be trained in advance. Therefore, after the federal forward training is performed on the second recurrent neural network models in different time periods, the federal reverse training is performed on the second recurrent neural network models in different time periods after the federal forward training is completed, so as to obtain a second target recurrent neural network model, and further obtain a second data processing model.
The step of performing federal forward training on the second recurrent neural network models of different time periods based on the second training time series data with preset labels, and then performing federal reverse training, time period by time period, on the models for which the federal forward training has been completed so as to obtain the second target recurrent neural network model, comprises the following steps:
step S011, determining a second intermediate parameter of forward propagation in a target time period based on a received second intermediate parameter of forward propagation in the previous time period corresponding to the target time period, second training time sequence data of the target time period, and a model parameter of the second recurrent neural network model in the target time period;
step S012, and based on the second intermediate parameter of forward propagation in the target time slot, performing federal forward training in each subsequent time slot until obtaining the second intermediate parameter in the last time slot;
specifically, for example, the second intermediate parameter h2(t1) of forward propagation corresponding to a previous time period, such as a t1 time period, in the target time period is received, wherein the second intermediate parameter of forward propagation corresponding to the previous time period in the target time period is obtained through the second intermediate parameter or initial parameter h2(t0) corresponding to the previous time period in the target time period, the second training time series data corresponding to the previous time period in the target time period, and the model parameter of the second recurrent neural network model corresponding to the previous time period in the target time period.
As shown in fig. 5, based on the forward propagated second intermediate parameter h2(t1) corresponding to the previous time period in the target time period, the second training timing data of the target time period, the model parameters of the second recurrent neural network model in the target time period, the forward propagated second intermediate parameter in the target time period is determined, and the federal forward training is performed for the subsequent time periods based on the forward propagated second intermediate parameter in the target time period until the second intermediate parameter h2(tn) of the last time period is obtained.
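A minimal numpy sketch of the forward pass in steps S011 and S012 may help: the active party propagates its intermediate parameter h2 through the segments t0..tn using a simple tanh recurrent cell. The cell form, the shapes, and the random data are assumptions for illustration, not the exact model of the patent.

```python
# Illustrative forward propagation of the second intermediate parameter h2.
import numpy as np

rng = np.random.default_rng(0)

hidden_dim, feat_dim, num_segments = 4, 5, 3
W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))   # recurrent weights
W_x = rng.normal(scale=0.1, size=(hidden_dim, feat_dim))     # input weights
b = np.zeros(hidden_dim)

# Second training time series data of the active party, one row per time segment.
x_segments = rng.normal(size=(num_segments, feat_dim))

h2 = np.zeros(hidden_dim)            # initial parameter h2(t0)
h2_history = [h2]
for t in range(num_segments):
    # h2 for the next segment is determined from h2 of the previous segment,
    # this segment's training data, and the segment's model parameters.
    h2 = np.tanh(W_h @ h2 + W_x @ x_segments[t] + b)
    h2_history.append(h2)

h2_last = h2_history[-1]             # h2(tn), later used by the prediction model
print(h2_last.shape)
```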
Step S013, calculating a loss gradient of the last time period based on the second intermediate parameter of the last time period, the second training time sequence data of the last time period, the model parameter of the second recurrent neural network model in the last time period, the preset prediction model of the last time period, and the received first intermediate parameter obtained by training the first training time sequence data of the first device in the last time period, so as to update the model parameter of the preset prediction model of the last time period;
The loss gradient of the last time period is calculated from the second intermediate parameter of the last time period, the preset prediction model of the last time period, and the received first intermediate parameter obtained by training on the first training time series data of the first device for the last time period. Specifically, the second intermediate parameter and the first intermediate parameter of the last time period are spliced to obtain the splicing parameter of the last time period; the loss gradient of the last time period is then calculated from this splicing parameter and the preset prediction model of the last time period; and the model parameters of the preset prediction model of the last time period are updated based on this loss gradient.
The first intermediate parameter is obtained from the first training time series data of the first device for the last time period, the first intermediate parameter of the first device for the preceding time period, and the model parameters of the first recurrent neural network model.
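The following sketch, continuing the same toy conventions, illustrates step S013: the active party splices its last intermediate parameter h2(tn) with the received h1(tn) of the first device, feeds the spliced vector into a simple logistic prediction head, and computes the loss gradient used to update that head. The logistic form, the binary label, and the dimensions are assumptions.

```python
# Illustrative loss-gradient computation at the last time period.
import numpy as np

rng = np.random.default_rng(1)
hidden_dim = 4

h2_last = rng.normal(size=hidden_dim)          # from the active party's forward pass
h1_last = rng.normal(size=hidden_dim)          # received from the first (passive) device
label = 1.0                                    # preset label of the last time period

spliced = np.concatenate([h2_last, h1_last])   # splicing parameter of the last time period
w_pred = rng.normal(scale=0.1, size=spliced.shape)  # preset prediction model parameters
b_pred = 0.0

logit = w_pred @ spliced + b_pred
prob = 1.0 / (1.0 + np.exp(-logit))

# Loss gradient of the last time period (binary cross-entropy w.r.t. the logit),
# used to update the prediction model and to start the reverse training.
dloss_dlogit = prob - label
grad_w_pred = dloss_dlogit * spliced
grad_b_pred = dloss_dlogit

lr = 0.1
w_pred -= lr * grad_w_pred
b_pred -= lr * grad_b_pred
print(prob, dloss_dlogit)
```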
Step S014, calculating a second inverse intermediate gradient corresponding to the last time period based on the loss gradient of the last time period, the updated model parameters of the preset prediction model, and the second training time series data of the last time period, so as to update the model parameters of the second recurrent neural network model of the last time period and to calculate the second inverse intermediate gradient of the time period preceding the last time period;
and step S015, performing federal reverse training based on the updated second recurrent neural network model of each time period and the second inverse intermediate gradient passed back from the following time period, until the second recurrent neural network model of the first time period is updated.
Based on the loss gradient of the last time period, the updated model parameters of the preset prediction model, and the second training time series data of the last time period, the second inverse intermediate gradient g2(tn) of the last time period is calculated. Based on g2(tn), the model parameters of the second recurrent neural network model of the last time period are updated and the second inverse intermediate gradient g2(tn-1) of the preceding time period is calculated. The reverse propagation then continues: for each earlier time period, the model parameters of the second recurrent neural network model of that time period are updated and the second inverse intermediate gradient of the time period before it is calculated.
Federal reverse training is carried out in this way, based on the updated second recurrent neural network model of each time period and the second inverse intermediate gradient of that time period, until the second recurrent neural network model of the initial time period, that is, the t0 time period, is updated.
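The sketch below illustrates steps S014 and S015 under the same toy assumptions: the inverse intermediate gradient g2 is propagated from the last segment back to t0, and the per-segment recurrent parameters are updated along the way. Using one small tanh cell per segment and plain gradient descent is an illustrative simplification, not the patent's exact procedure.

```python
# Illustrative reverse training through the time segments.
import numpy as np

rng = np.random.default_rng(2)
hidden_dim, feat_dim, num_segments = 4, 5, 3
lr = 0.1

# One set of model parameters per time segment of the second recurrent model.
params = [{"W_h": rng.normal(scale=0.1, size=(hidden_dim, hidden_dim)),
           "W_x": rng.normal(scale=0.1, size=(hidden_dim, feat_dim))}
          for _ in range(num_segments)]
x_segments = rng.normal(size=(num_segments, feat_dim))

# Forward pass, caching inputs and pre-activations for the reverse training.
h = np.zeros(hidden_dim)
cache = []
for t in range(num_segments):
    z = params[t]["W_h"] @ h + params[t]["W_x"] @ x_segments[t]
    cache.append((h.copy(), z))
    h = np.tanh(z)

# g2(tn): gradient of the loss with respect to the last intermediate parameter;
# in the patent it comes from the loss gradient of the preset prediction model.
g2 = rng.normal(size=hidden_dim)

for t in reversed(range(num_segments)):
    h_prev, z = cache[t]
    dz = g2 * (1.0 - np.tanh(z) ** 2)   # back through the tanh cell
    g2 = params[t]["W_h"].T @ dz        # inverse intermediate gradient for the previous segment
    # Federal reverse training step: update this segment's model parameters.
    params[t]["W_h"] -= lr * np.outer(dz, h_prev)
    params[t]["W_x"] -= lr * np.outer(dz, x_segments[t])
```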
It should be noted that, in the present embodiment, the time period may refer to a time of day.
Step S02, setting the second target recurrent neural network model as the second data processing model.
And after a second target recurrent neural network model is obtained, setting the second target recurrent neural network model as the second data processing model.
In this embodiment, it should be noted that the method may be applied to a medical institution, where the training time sequence data includes training medical time sequence data, and after performing federal forward training on second recurrent neural network models in different time periods based on second training medical time sequence data with preset labels, the method further performs federal reverse training on second recurrent neural network models in different time periods in which the federal forward training is completed, so as to obtain a second target recurrent neural network model, and the method includes the steps of:
receiving a second intermediate parameter of forward propagation corresponding to a previous time period in the target time period, wherein the second intermediate parameter of forward propagation corresponding to the previous time period in the target time period is obtained through a second intermediate parameter or an initial parameter of a previous time period corresponding to the previous time period in the target time period, second training medical time sequence data corresponding to the previous time period in the target time period, and a model parameter of a second recurrent neural network model corresponding to the previous time period in the target time period;
determining a second intermediate parameter of forward propagation in the target time period based on a second intermediate parameter of forward propagation in the last time period corresponding to the target time period, second training medical time sequence data of the target time period, and a model parameter of the second recurrent neural network model in the target time period, and performing federal forward training on subsequent time periods based on the second intermediate parameter of forward propagation in the target time period until a second intermediate parameter of the last time period is obtained;
calculating a loss gradient of the last time period based on the second intermediate parameter of the last time period, the second training medical time sequence data of the last time period, the model parameter of the second recurrent neural network model in the last time period, the preset prediction model of the last time period, and the received first intermediate parameter obtained by training the first training medical time sequence data of the first device in the last time period, and updating the model parameter of the preset prediction model of the last time period based on the loss gradient of the last time period;
calculating a second inverse intermediate gradient corresponding to the last time period based on the loss gradient of the last time period, the updated model parameters of the preset prediction model and the second training medical time sequence data of the last time period, and updating the model parameters of the second recurrent neural network model of the last time period and calculating a second inverse intermediate gradient corresponding to the last time period of the last time period based on the second inverse intermediate gradient of the last time period;
based on a second inverse intermediate gradient of the last time period corresponding to a previous time period, the second training medical timing data of the last time period corresponding to the previous time period, the updated model parameters of the second recurrent neural network model in the last time period, the model parameters of the second recurrent neural network model in the last time period corresponding to the previous time period are updated, and a second inverse intermediate gradient of the last time period corresponding to the previous time period is calculated;
and performing federal reverse training on the basis of the second cyclic neural network model updated in the last time period corresponding to the previous time period and the second reverse intermediate gradient of the last time period corresponding to the previous time period until the second cyclic neural network model in the initial time period is updated.
Executing a second preset data processing flow on the second to-be-processed time sequence data based on the second data processing model, and obtaining a target prediction tag of the second to-be-processed time sequence data at least comprises the following two modes:
the first method is as follows: and only through the trained second cyclic neural network model, performing prediction processing on the second to-be-processed time sequence data of the current time period, namely executing a second preset data processing flow, and obtaining a target prediction label of the second to-be-processed time sequence data.
The second method comprises the following steps: and acquiring first to-be-processed time sequence data corresponding to second to-be-processed time sequence data of the current time period, and performing prediction processing on the second to-be-processed time sequence data of the current time period based on the combination of the second to-be-processed time sequence data of the current time period and the corresponding first to-be-processed time sequence data, namely executing a second preset data processing flow, so as to obtain a target prediction tag of the second to-be-processed time sequence data.
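A brief sketch of these two prediction modes, reusing the toy conventions of the earlier sketches; the function names and the logistic head are illustrative assumptions.

```python
# Illustrative inference in the two modes described above.
import numpy as np


def predict_local_only(h2_current, w_pred_local, b_pred=0.0):
    """Mode one: prediction from the active party's own intermediate state."""
    return 1.0 / (1.0 + np.exp(-(w_pred_local @ h2_current + b_pred)))


def predict_joint(h2_current, h1_current, w_pred_joint, b_pred=0.0):
    """Mode two: prediction from the spliced active + passive intermediate states."""
    spliced = np.concatenate([h2_current, h1_current])
    return 1.0 / (1.0 + np.exp(-(w_pred_joint @ spliced + b_pred)))


rng = np.random.default_rng(3)
h2_now, h1_now = rng.normal(size=4), rng.normal(size=4)
print(predict_local_only(h2_now, rng.normal(size=4)))
print(predict_joint(h2_now, h1_now, rng.normal(size=8)))
```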
In this embodiment, the training time series data are training medical time series data, the second time series data to be processed are second medical time series data to be processed, and the second data processing model is a medical attribute prediction model,
wherein the medical attribute prediction model is obtained by performing federal forward training on second recurrent neural network models for different time periods and then performing federal reverse training on the models for which the federal forward training has been completed;
and the second recurrent neural network model for a given time period is obtained by training jointly on the second training medical time series data and the first training medical time series data of the first device for the same target time period;
the executing a second preset data processing flow on the second to-be-processed time sequence data based on the second data processing model to obtain a target prediction tag of the second to-be-processed time sequence data includes:
and step M1, executing medical attribute prediction processing on the second medical time series data to be processed based on the medical attribute prediction model to obtain a medical attribute prediction result of the second medical time series data to be processed.
Specifically, the second medical time series data to be processed may be data such as blood pressure, blood lipid, blood sugar, uric acid, cholesterol, number of examinations, and examination duration. These data are input into the trained medical attribute prediction model, which is obtained by performing federal forward training on the second recurrent neural network models for different time periods and then performing federal reverse training on the models for which the federal forward training has been completed, where the second recurrent neural network model for a given time period is trained jointly on second training medical time series data (for example, blood pressure, blood lipid, blood sugar, uric acid, cholesterol, number of examinations, and examination duration previously recorded by a medical institution) and first training medical time series data (for example, consumption data and loan data) for the same target time period. Medical attribute prediction is then performed on the second medical time series data to be processed based on the medical attribute prediction model to obtain a medical attribute prediction result. The result is a first medical attribute prediction result if it is greater than a first preset label value (for example, the probability of a certain medical attribute is more than 90%), a second medical attribute prediction result if it is less than the first preset label value but greater than a second preset label value, or a third medical attribute prediction result if it is less than the second preset label value, and the associated loan amount derived from the medical attribute prediction result differs accordingly.
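The mapping from the prediction score to the three result bands can be sketched as follows; the concrete threshold values 0.9 and 0.5 are assumptions, since the text only refers to a first and a second preset label value.

```python
# Illustrative banding of the medical attribute prediction score.
def medical_attribute_band(score: float,
                           first_threshold: float = 0.9,
                           second_threshold: float = 0.5) -> str:
    if score > first_threshold:
        return "first medical attribute prediction result"
    if score > second_threshold:
        return "second medical attribute prediction result"
    return "third medical attribute prediction result"


print(medical_attribute_band(0.93))  # above the first preset label value
print(medical_attribute_band(0.42))  # below the second preset label value
```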
In summary, second time series data to be processed are acquired and input into the second data processing model, which is obtained by performing federal forward training on second recurrent neural network models for different time periods and then performing federal reverse training on the models for which the federal forward training has been completed, the second recurrent neural network model for a given time period being trained jointly on the second training time series data and the first training time series data of the first device for the same target time period; the second preset data processing flow is then executed on the second time series data to be processed to obtain its target prediction label. Compared with the prior art, in which each participant models only a small amount of its own time series data, this federal construction of the model from the time series data of different participants at different moments protects privacy, overcomes the heavy consumption of computing resources, reduces the time needed for the trained model to reach the target performance, improves the utilization rate of computer computing resources, and improves the accuracy of predicting, for example, the type of the time series data to be processed, thereby solving the technical problem that time series data cannot be exchanged directly between different participants and are therefore difficult to predict accurately.
Further, based on the first embodiment of the present application, in another embodiment, while the second recurrent neural network model of a target time period is being updated based on the second inverse intermediate gradient of each time period, the second training time series data of that time period and the first training time series data of the plurality of first devices for the same target time period are also trained jointly to obtain the loss gradient of the target time period; a second local gradient of the second recurrent neural network model of the target time period is calculated from this loss gradient, and the second recurrent neural network model of the target time period, already updated based on the second inverse intermediate gradient, is updated again based on the second local gradient.
In this embodiment, the second recurrent neural network model is updated based on a joint gradient for each time period. The joint gradient of a time period has two parts: the second inverse intermediate gradient of that time period and a second local gradient. The second local gradient is obtained as follows: the second training time series data of the time period, for example the t1 time period, and the first training time series data of the first devices for the same target time period (the t1 time period) are trained jointly to obtain the loss gradient of the target time period, and the second local gradient of the second recurrent neural network model of the target time period is then calculated from that loss gradient. It should be noted that there may be a plurality of first devices, in which case the first training time series data of all of the first devices for the same target time period are combined with the second training time series data when computing the loss gradient, and hence the second local gradient, of the target time period.
In this embodiment, in the process of updating the second recurrent neural network model of the target time period based on the second inverse intermediate gradient of each time period, the second training time series data of the time period is also combined with the first training time series data of the plurality of first devices of the same target time period of the second training time series data of the time period to train and obtain the loss gradient of the target time period, so as to calculate the second local gradient of the second recurrent neural network model of the target time period based on the loss gradient of the target time period, and the second recurrent neural network model of the target time period updated based on the second inverse intermediate gradient is also updated based on the second local gradient. In this embodiment, the second recurrent neural network model is updated doubly, so as to improve the prediction accuracy of the second recurrent neural network model.
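A minimal sketch of the double update described in this embodiment: the joint gradient of a time period combines the inverse intermediate gradient with the local gradient derived from the segment-level joint loss, and the segment's parameters are updated with both. The additive combination and the shapes are assumptions for illustration.

```python
# Toy sketch: combine the two gradient contributions and update one segment's
# parameters. Random matrices stand in for the real gradients; in the patent the
# first part comes from the federal reverse training and the second from the
# loss computed jointly for this time period.
import numpy as np

rng = np.random.default_rng(4)
hidden_dim = 4

W_seg = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # segment model parameters

g_inverse = rng.normal(size=(hidden_dim, hidden_dim))  # second inverse intermediate gradient (stand-in)
g_local = rng.normal(size=(hidden_dim, hidden_dim))    # second local gradient from the segment loss (stand-in)

joint_gradient = g_inverse + g_local   # assumed additive combination of the two parts
lr = 0.05
W_seg -= lr * joint_gradient           # the segment's model is updated with both parts
print(np.linalg.norm(joint_gradient))
```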
In another embodiment of the data processing method based on the recurrent neural network, referring to fig. 2, the method is applied to the first device, the second training time series data of the second device having preset labels, and the data processing method based on the recurrent neural network includes:
step A1, acquiring first time sequence data to be processed, and inputting the first time sequence data to be processed into a first data processing model;
wherein the first data processing model is obtained by performing federal forward training on first recurrent neural network models for different time periods and then performing federal reverse training on the first recurrent neural network models for which the federal forward training has been completed;
the first recurrent neural network model for a given time period is obtained by training jointly on the first training time series data and the second training time series data of the second device for the same target time period;
step A2, executing a first preset data processing flow on the first time series data to be processed based on the first data processing model, to obtain a first intermediate parameter corresponding to the first time series data to be processed, which is sent to the second device.
In this embodiment, when the first device detects first time series data to be processed, it inputs them into the first data processing model. Because the first data processing model has been trained, the first preset data processing flow is executed on the first time series data to be processed based on the trained model to obtain the first intermediate parameter corresponding to those data, which is sent to the second device. It should be noted that the first data processing model is obtained by performing federal forward training on first recurrent neural network models for different time periods and then performing federal reverse training on the models for which the federal forward training has been completed, and that the first recurrent neural network model for a given time period is trained jointly on the first training time series data and the second training time series data of the second device for the same target time period.
Specifically, before the step of executing the first preset data processing flow on the first time series data to be processed based on the first data processing model, the method includes:
step B1, after carrying out federal forward training on the first cyclic neural network models in different time periods, carrying out federal reverse training on the first cyclic neural network models in different time periods after the federal forward training is completed, so as to obtain a second target cyclic neural network model;
the first cyclic neural network model in the same time period is obtained by combining the first training time sequence data and second training time sequence data of second equipment in the same target time period;
wherein, after carrying out the federal forward training to the first recurrent neural network model of different time quantums, carry out the federal reverse training of each time quantum to the first recurrent neural network model of different time quantums that the federal forward training was accomplished again to obtain the step of first target recurrent neural network model, include:
step C1, based on the received first intermediate parameter of forward propagation corresponding to the previous time period in the target time period, the first training time sequence data of the target time period, and the model parameter of the first recurrent neural network model in the target time period, determining the first intermediate parameter of forward propagation in the target time period, and based on the first intermediate parameter of forward propagation in the target time period, performing federal forward training for each subsequent time period until the first intermediate parameter of the last time period is obtained;
specifically, for example, a forward-propagated first intermediate parameter h1(t1) corresponding to a previous time period, such as a t1 time period, is received, wherein the forward-propagated first intermediate parameter corresponding to the previous time period in the target time period is obtained by a first intermediate parameter or initial parameter h1(t0) corresponding to the previous time period in the target time period, and first training timing data corresponding to the previous time period in the target time period, and the first recurrent neural network model corresponds to the model parameter of the previous time period in the target time period.
During the federal forward training, the first intermediate parameter calculated in each time period is sent to the second device for the same time period so that the second device can calculate the loss gradient of each time period;
as shown in fig. 5, based on the forward propagated first intermediate parameter h1(t1) corresponding to the previous time period in the target time period, the first training timing data of the target time period, the model parameters of the first recurrent neural network model in the target time period, the forward propagated first intermediate parameter in the target time period is determined, and the federal forward training is performed on the subsequent time periods based on the forward propagated first intermediate parameter in the target time period until the first intermediate parameter h1(tn) of the last time period is obtained, wherein in the process of the federal forward training, the calculated first intermediate parameter in each time period is sent to the second device in the same time period so that the second device can calculate the loss gradient of each time period.
Step C2, calculating a first inverse intermediate gradient corresponding to the last time period based on the loss gradient of the last time period, the updated model parameters of the preset prediction model, and the first training time series data of the last time period, so as to update the model parameters of the first recurrent neural network model of the last time period and to calculate the first inverse intermediate gradient of the time period preceding the last time period;
and step C3, performing federal reverse training based on the updated first recurrent neural network model of each time period and the first inverse intermediate gradient passed back from the following time period, until the first recurrent neural network model of the initial time period is updated.
In the same way that the second device updates the second recurrent neural network model, in this embodiment the first inverse intermediate gradient g1(tn) of the last time period is calculated based on the loss gradient of the last time period, the updated model parameters of the preset prediction model, and the first training time series data of the last time period. Based on g1(tn), the model parameters of the first recurrent neural network model of the last time period are updated and the first inverse intermediate gradient g1(tn-1) of the preceding time period is calculated. The reverse propagation then continues: for each earlier time period, the model parameters of the first recurrent neural network model of that time period are updated and the first inverse intermediate gradient of the time period before it is calculated.
Federal reverse training is carried out in this way, based on the updated first recurrent neural network model of each time period and the first inverse intermediate gradient of that time period, until the first recurrent neural network model of the initial time period is updated, so as to obtain the first target recurrent neural network model.
Step B2, setting the first target recurrent neural network model as the first data processing model.
After the first target recurrent neural network model is obtained, it is set as the first data processing model.
In this embodiment, first time series data to be processed are acquired and input into the first data processing model; the first data processing model is obtained by performing federal forward training on first recurrent neural network models for different time periods and then performing federal reverse training on the models for which the federal forward training has been completed, where the first recurrent neural network model for a given time period is trained jointly on the first training time series data and the second training time series data of the second device for the same target time period; and the first preset data processing flow is executed on the first time series data to be processed based on the first data processing model to obtain the first intermediate parameter corresponding to those data, which is sent to the second device. Because the model is constructed federally, across time periods, from the large amount of time series data held by different participants while protecting privacy, the accuracy of predicting, for example, the type of the first time series data to be processed is improved, which solves the technical problem that, in the prior art, time series data cannot be exchanged directly between different participants and are therefore difficult to predict accurately.
Further, based on the foregoing embodiments of the present application, in another embodiment of the present application, in the process of updating the first recurrent neural network model of a target time period based on the first inverse intermediate gradient of each time period, the first training time sequence data of that time period is also combined with the second training time sequence data of the second device for the same target time period to obtain the loss gradient of the target time period, so that a first local gradient of the first recurrent neural network model of the target time period can be calculated based on the loss gradient of the target time period; the first recurrent neural network model of the target time period, already updated based on the first inverse intermediate gradient, is then additionally updated based on the first local gradient.
In this embodiment, the first recurrent neural network model is updated based on a joint gradient for each time period. The joint gradient of each time period consists of two parts: the first inverse intermediate gradient of the time period, and a first local gradient. The first local gradient is obtained as follows: the first training time sequence data of the time period (for example, the t1 time period) is combined with the second training time sequence data of the second device for the same target time period (the t1 time period) to obtain the loss gradient of the target time period, and the first local gradient of the first recurrent neural network model of the target time period is then calculated based on that loss gradient.
In this way, during the update of the first recurrent neural network model of the target time period based on the first inverse intermediate gradient of each time period, the model is additionally updated based on the first local gradient derived from the jointly obtained loss gradient of the target time period. This double update of the first recurrent neural network model improves its prediction accuracy.
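A minimal sketch of this double update for a single time period, assuming the loss gradient of the target time period has already been obtained jointly with the second device; the names and the simple linear parameterization are illustrative assumptions only.

```python
import numpy as np

def joint_update(W, x, g_inverse, g_local_loss, lr=0.01):
    """g_inverse: first inverse intermediate gradient propagated from later periods.
    g_local_loss: gradient derived from the jointly computed loss of this target
    time period, from which the first local gradient is formed."""
    joint_grad = np.outer(g_inverse, x) + np.outer(g_local_loss, x)  # two parts of the joint gradient
    W -= lr * joint_grad                                             # double update of the same parameters
    return W
```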
Referring to fig. 3, fig. 3 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present application.
As shown in fig. 3, the recurrent neural network-based data processing apparatus may include: a processor 1001, such as a CPU, a memory 1005, and a communication bus 1002. The communication bus 1002 is used for realizing connection communication between the processor 1001 and the memory 1005. The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a memory device separate from the processor 1001 described above.
Optionally, the data processing device based on the recurrent neural network may further include a user interface, a network interface, a camera, an RF (Radio Frequency) circuit, a sensor, an audio circuit, a WiFi module, and the like. The user interface may include a display screen (Display) and an input sub-module such as a keyboard (Keyboard); optionally, the user interface may also include a standard wired interface and a wireless interface. The network interface may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface).
Those skilled in the art will appreciate that the recurrent neural network-based data processing apparatus architecture shown in fig. 3 does not constitute a limitation of the recurrent neural network-based data processing apparatus, and may include more or fewer components than those shown, or some components in combination, or a different arrangement of components.
As shown in fig. 3, a memory 1005, which is a kind of computer medium, may include therein an operating system, a network communication module, and a recurrent neural network-based data processing program. The operating system is a program that manages and controls hardware and software resources of the recurrent neural network-based data processing apparatus, and supports the operation of the recurrent neural network-based data processing program as well as other software and/or programs. The network communication module is used to enable communication between the various components within the memory 1005, as well as with other hardware and software in the recurrent neural network-based data processing system.
In the data processing apparatus based on the recurrent neural network shown in fig. 3, the processor 1001 is configured to execute a data processing program based on the recurrent neural network stored in the memory 1005, and implement the steps of the data processing method based on the recurrent neural network described in any one of the above.
The specific implementation of the data processing device based on the recurrent neural network in the present application is basically the same as that of each embodiment of the data processing method based on the recurrent neural network, and is not described herein again.
The present application further provides a data processing apparatus based on a recurrent neural network, which is applied to a second device, where the second device holds second training time sequence data with preset tags for each time period. The data processing apparatus based on the recurrent neural network includes:
the first acquisition module is used for acquiring second to-be-processed time sequence data and inputting the second to-be-processed time sequence data into a second data processing model;
the second data processing model is obtained by carrying out federal forward training on second cyclic neural network models in different time periods and then carrying out federal reverse training on the second cyclic neural network models in different time periods after the federal forward training is finished;
the second cyclic neural network model in the same time period is obtained by combining the second training time sequence data and the first training time sequence data of the first equipment in the same target time period;
and the first execution module is used for executing a second preset data processing flow on the second to-be-processed time sequence data based on the second data processing model to obtain a target prediction tag of the second to-be-processed time sequence data.
Optionally, the training time sequence data of the user in different time periods includes the second training time sequence data and the first training time sequence data;
the data processing device based on the recurrent neural network further comprises:
the first training module is used for carrying out federal forward training on second recurrent neural network models in different time periods based on second training time sequence data with preset labels, and then carrying out federal reverse training on the second recurrent neural network models in different time periods after the federal forward training is finished so as to obtain a second target recurrent neural network model;
a first setting module for setting the second target recurrent neural network model as the second data processing model.
Optionally, the first training module comprises:
a first receiving unit, configured to determine a second intermediate parameter of forward propagation in a target time period based on a received second intermediate parameter of forward propagation in the previous time period corresponding to the target time period, second training timing data of the target time period, and a model parameter of the second recurrent neural network model in the target time period;
the federal forward training unit is used for performing federal forward training on each subsequent time period based on the second intermediate parameter which is transmitted forward in the target time period until the second intermediate parameter of the last time period is obtained;
a first calculating unit, configured to calculate a loss gradient of the last time period based on a second intermediate parameter of the last time period, second training timing data of the last time period, a model parameter of the second recurrent neural network model in the last time period, a preset prediction model of the last time period, and a received first intermediate parameter of the last time period, which is obtained by training the first training timing data of the first device, so as to update the model parameter of the preset prediction model of the last time period;
a first updating unit, configured to calculate a second inverse intermediate gradient corresponding to the last time period based on the loss gradient of the last time period, the updated model parameters of the preset prediction model, and the second training timing data of the last time period, so as to update the model parameters of the second recurrent neural network model in the last time period and calculate the second inverse intermediate gradient corresponding to the time period preceding the last time period;
and a second updating unit, configured to carry out federal reverse training on the basis of the updated second recurrent neural network model of each time period and the second inverse intermediate gradient of the time period preceding it, until the second recurrent neural network model of the initial time period is updated.
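The cooperation of these units on the second device can be sketched roughly as follows. This is a hedged, non-secure illustration only: a linear cell stands in for the second recurrent neural network model, a squared-error loss stands in for the preset prediction model's training objective, encryption of exchanged intermediate parameters is omitted, and none of the names below come from this application.

```python
import numpy as np

def second_device_round(Ws, Us, V, xs, y, first_intermediate_last, lr=0.01):
    """Ws[k], Us[k]: linear cell for time period t_k; V: preset prediction model
    of the last time period; xs[k]: second training time sequence data of t_k;
    y: preset label; first_intermediate_last: first intermediate parameter of
    the last time period received from the first device."""
    # Federal forward training: second intermediate parameter for each period.
    hs, h = [], np.zeros(Ws[0].shape[0])
    for k, x in enumerate(xs):
        h = Ws[k] @ x + Us[k] @ h
        hs.append(h)
    # Loss gradient of the last time period from both parties' intermediate parameters.
    z = np.concatenate([hs[-1], first_intermediate_last])
    g_loss = V @ z - y                        # gradient of a squared-error loss w.r.t. the prediction
    V -= lr * np.outer(g_loss, z)             # update the preset prediction model
    # Federal reverse training: second inverse intermediate gradients.
    g = V[:, : hs[-1].shape[0]].T @ g_loss    # gradient w.r.t. the last second intermediate parameter
    for k in range(len(xs) - 1, -1, -1):
        h_prev = hs[k - 1] if k > 0 else np.zeros_like(hs[0])
        g_prev = Us[k].T @ g
        Ws[k] -= lr * np.outer(g, xs[k])
        Us[k] -= lr * np.outer(g, h_prev)
        g = g_prev                            # propagate to the preceding time period
    return Ws, Us, V
```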
Optionally, in the process of updating the second recurrent neural network model of the target time period based on the second inverse intermediate gradient of each time period, the second training time series data of that time period is also combined with the first training time series data of a plurality of first devices for the same target time period to obtain the loss gradient of the target time period, so that a second local gradient of the second recurrent neural network model of the target time period is calculated based on that loss gradient, and the second recurrent neural network model of the target time period, already updated based on the second inverse intermediate gradient, is additionally updated based on the second local gradient.
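With a plurality of first devices, the loss gradient of a target time period might be formed roughly as below; the concatenation of the parties' intermediate parameters and every name are assumptions for illustration, not details of this application.

```python
import numpy as np

def loss_gradient_multi_party(V, h_second, first_intermediates, y):
    """h_second: second intermediate parameter of the target time period;
    first_intermediates: first intermediate parameters, one per first device."""
    z = np.concatenate([h_second] + list(first_intermediates))
    g_loss = V @ z - y                               # loss gradient of the target time period
    g_local = V[:, : h_second.shape[0]].T @ g_loss   # second local gradient w.r.t. the second intermediate parameter
    return g_loss, g_local
```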
Optionally, the training time series data is medical training time series data, the second to-be-processed time series data is second to-be-processed medical time series data, the second data processing model is a medical attribute prediction model,
the medical attribute prediction model is obtained by carrying out federal forward training on second cyclic neural network models in different time periods and then carrying out federal reverse training on the second cyclic neural network models in different time periods after the federal forward training is finished;
the second cyclic neural network model in the same time period is obtained by combining the second training medical time sequence data and the first training medical time sequence data of the first equipment in the same target time period;
the first execution module includes:
and the acquisition unit is used for executing medical attribute prediction processing on the second medical time sequence data to be processed based on the medical attribute prediction model to obtain a medical attribute prediction result of the second medical time sequence data to be processed.
In order to achieve the above object, the present application further provides a data processing apparatus based on a recurrent neural network, which is applied to a first device, where the second training time sequence data has preset tags. The data processing apparatus based on the recurrent neural network includes:
the second acquisition module is used for acquiring first time sequence data to be processed and inputting the first time sequence data to be processed into the first data processing model;
the first data processing model is obtained by carrying out federal forward training on the first cyclic neural network models at different time periods and then carrying out federal reverse training on the first cyclic neural network models at different time periods after the federal forward training is finished;
the first cyclic neural network model in the same time period is obtained by combining the first training time sequence data and second training time sequence data of second equipment in the same target time period;
and the second execution module is used for executing a first preset data processing flow on the first to-be-processed time sequence data based on the first data processing model to obtain a first intermediate parameter corresponding to the first to-be-processed time sequence data sent to the second equipment.
Optionally, the data processing apparatus based on a recurrent neural network further includes:
the second training module is used for carrying out federal forward training on the first cyclic neural network models in different time periods, and then carrying out federal reverse training on the first cyclic neural network models in different time periods after the federal forward training is finished so as to obtain a first target cyclic neural network model;
the first cyclic neural network model in the same time period is obtained by combining the first training time sequence data and second training time sequence data of second equipment in the same target time period;
a second setting module for setting the first target recurrent neural network model as the first data processing model.
Optionally, the second training module comprises:
a second receiving unit, configured to determine a forward-propagated first intermediate parameter in a target time period based on a received forward-propagated first intermediate parameter in the previous time period corresponding to the target time period, first training timing data in the target time period, and a model parameter of the first recurrent neural network model in the target time period, and perform federal forward training for subsequent time periods based on the forward-propagated first intermediate parameter in the target time period until a first intermediate parameter in a last time period is obtained;
during the federal forward training process, the first intermediate parameter calculated in each time period is sent to the second equipment in the same time period so as to calculate the loss gradient of each time period;
a second calculating unit, configured to calculate a first inverse intermediate gradient corresponding to the last time period based on the loss gradient of the last time period, the updated model parameters of the preset prediction model, and the first training timing data of the last time period, and, based on the first inverse intermediate gradient of the last time period, to update the model parameters of the first recurrent neural network model in the last time period and calculate the first inverse intermediate gradient corresponding to the time period preceding the last time period;
and a third updating unit, configured to carry out federal reverse training on the basis of the updated first cyclic neural network model of each time period and the first inverse intermediate gradient of the time period preceding it, until the first cyclic neural network model of the initial time period is updated.
Optionally, in the process of updating the first recurrent neural network model of the target time period based on the first inverse intermediate gradient of each time period, the first training time series data of the time period and the second training time series data of the second device of the same target time period of the first training time series data of the time period are also combined to train to obtain the loss gradient of the target time period, so as to calculate a first local gradient of the first recurrent neural network model of the target time period based on the loss gradient of the target time period, and the first recurrent neural network model of the target time period updated based on the first inverse intermediate gradient is also updated based on the first local gradient.
The specific implementation of the data processing apparatus based on the recurrent neural network is substantially the same as that of each embodiment of the data processing method based on the recurrent neural network, and is not described herein again.
The present application further provides a medium, where the medium stores one or more programs that can be executed by one or more processors to implement the steps of any of the above-described data processing methods based on a recurrent neural network.
The specific implementation of the medium of the present application is substantially the same as that of each embodiment of the data processing method based on the recurrent neural network, and is not described herein again.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (12)

1. A data processing method based on a recurrent neural network is applied to second equipment, the second equipment comprises second training time sequence data with preset labels in various time periods, and the data processing method based on the recurrent neural network comprises the following steps:
acquiring second to-be-processed time sequence data, and inputting the second to-be-processed time sequence data into a second data processing model;
the second data processing model is obtained by carrying out federal forward training on second cyclic neural network models in different time periods and then carrying out federal reverse training on the second cyclic neural network models in different time periods after the federal forward training is finished;
the second cyclic neural network model in the same time period is obtained by combining the second training time sequence data and the first training time sequence data of the first equipment in the same target time period;
and executing a second preset data processing flow on the second to-be-processed time sequence data based on the second data processing model to obtain a target prediction tag of the second to-be-processed time sequence data.
2. The recurrent neural network-based data processing method of claim 1, wherein said step of performing a second predetermined data processing procedure on said second to-be-processed time-series data based on said second data processing model is preceded by the step of:
after the second recurrent neural network models in different time periods are subjected to federal forward training based on second training time sequence data with preset labels, performing federal reverse training in each time period on the second recurrent neural network models in different time periods after the federal forward training is completed, so as to obtain a second target recurrent neural network model;
setting the second target recurrent neural network model as the second data processing model.
3. The data processing method based on the recurrent neural network as claimed in claim 2, wherein the step of performing federal forward training on the second recurrent neural network models in different time periods based on the second training time sequence data with preset labels, and then performing federal reverse training on the second recurrent neural network models in different time periods after the federal forward training is completed, so as to obtain the second target recurrent neural network model, comprises:
determining a second intermediate parameter of forward propagation in a target time period based on a received second intermediate parameter of forward propagation in the target time period corresponding to a previous time period, second training timing sequence data of the target time period, and a model parameter of the second recurrent neural network model in the target time period;
performing federal forward training on each subsequent time period based on the second intermediate parameter propagated forward in the target time period until a second intermediate parameter of the last time period is obtained;
calculating a loss gradient of the last time period based on the second intermediate parameter of the last time period, the second training time sequence data of the last time period, the model parameter of the second recurrent neural network model in the last time period, the preset prediction model of the last time period, and the received first intermediate parameter obtained by training the first training time sequence data of the first device in the last time period, so as to update the model parameter of the preset prediction model of the last time period;
calculating a second inverse intermediate gradient corresponding to the last time period based on the loss gradient of the last time period, the updated model parameters of the preset prediction model and the second training time sequence data of the last time period, so as to update the model parameters of the second recurrent neural network model in the last time period and calculate the second inverse intermediate gradient corresponding to the time period preceding the last time period;
and performing federal reverse training based on the updated second recurrent neural network model of each time period and the second inverse intermediate gradient of the time period preceding it, until the second recurrent neural network model of the initial time period is updated.
4. The recurrent neural network-based data processing method of claim 3,
in the process of updating the second recurrent neural network model of the target time period based on the second inverse intermediate gradient of each time period, the second training time sequence data of the time period and the first training time sequence data of a plurality of first devices of the same target time period of the second training time sequence data of the time period are also combined to train to obtain the loss gradient of the target time period, so as to calculate the second local gradient of the second recurrent neural network model of the target time period based on the loss gradient of the target time period, and the second recurrent neural network model of the target time period updated based on the second inverse intermediate gradient is also updated based on the second local gradient.
5. The recurrent neural network-based data processing method of claim 1, wherein the training time-series data is medical training time-series data, the second to-be-processed time-series data is second to-be-processed medical time-series data, and the second data processing model is a medical attribute prediction model,
the medical attribute prediction model is obtained by carrying out federal forward training on second cyclic neural network models in different time periods and then carrying out federal reverse training on the second cyclic neural network models in different time periods after the federal forward training is finished;
the second cyclic neural network model in the same time period is obtained by combining the second training medical time sequence data and the first training medical time sequence data of the first equipment in the same target time period;
the executing a second preset data processing flow on the second to-be-processed time sequence data based on the second data processing model to obtain a target prediction tag of the second to-be-processed time sequence data includes:
and executing medical attribute prediction processing on the second medical time series data to be processed based on the medical attribute prediction model to obtain a medical attribute prediction result of the second medical time series data to be processed.
6. A data processing method based on a recurrent neural network is characterized in that second training time sequence data are provided with preset labels and are applied to first equipment, and the data processing method based on the recurrent neural network comprises the following steps:
acquiring first time sequence data to be processed, and inputting the first time sequence data to be processed into a first data processing model;
the first data processing model is obtained by carrying out federal forward training on the first cyclic neural network models at different time periods and then carrying out federal reverse training on the first cyclic neural network models at different time periods after the federal forward training is finished;
the first cyclic neural network model in the same time period is obtained by combining the first training time sequence data and second training time sequence data of second equipment in the same target time period;
and executing a first preset data processing flow on the first to-be-processed time sequence data based on the first data processing model to obtain a first intermediate parameter corresponding to the first to-be-processed time sequence data sent to the second equipment.
7. The recurrent neural network-based data processing method of claim 6, wherein said step of performing a first predetermined data processing procedure on said first to-be-processed time-series data based on said first data processing model is preceded by the steps of:
after the first cyclic neural network models in different time periods are subjected to federal forward training, performing federal reverse training in each time period on the first cyclic neural network models in different time periods after the federal forward training is completed, so as to obtain a first target cyclic neural network model;
the first cyclic neural network model in the same time period is obtained by combining the first training time sequence data and second training time sequence data of second equipment in the same target time period;
setting the first target recurrent neural network model as the first data processing model.
8. The data processing method based on the recurrent neural network as claimed in claim 7, wherein the step of performing federal forward training on the first recurrent neural network models in different time periods, and then performing federal reverse training on the first recurrent neural network models in different time periods after the federal forward training is completed, to obtain the first target recurrent neural network model, comprises:
determining a forward propagation first intermediate parameter in a target time period based on a received forward propagation first intermediate parameter in the target time period corresponding to a previous time period, first training timing data of the target time period, and a model parameter of the first recurrent neural network model in the target time period;
performing federal forward training on subsequent time periods based on the forward-propagated first intermediate parameter in the target time period until the first intermediate parameter in the last time period is obtained;
during the federal forward training process, the first intermediate parameter calculated in each time period is sent to the second equipment in the same time period so as to calculate the loss gradient of each time period;
calculating a first inverse intermediate gradient corresponding to the last time period based on the loss gradient of the last time period, the updated model parameters of the preset prediction model and the first training time sequence data of the last time period, and, based on the first inverse intermediate gradient of the last time period, updating the model parameters of the first recurrent neural network model in the last time period and calculating the first inverse intermediate gradient corresponding to the time period preceding the last time period;
and performing federal reverse training based on the updated first cyclic neural network model of each time period and the first inverse intermediate gradient of the time period preceding it, until the first cyclic neural network model of the initial time period is updated.
9. The recurrent neural network-based data processing method of claim 8, wherein in the updating of the first recurrent neural network model for the target time period based on the first inverse intermediate gradient for each time period, the first training timing data for the time period is also combined with the second training timing data for the second device for the same target time period of the first training timing data for the time period to obtain the loss gradient for the target time period, so as to calculate a first local gradient of the first recurrent neural network model for the target time period based on the loss gradient for the target time period, and the first recurrent neural network model for the target time period updated based on the first inverse intermediate gradient is also updated based on the first local gradient.
10. A data processing device based on a recurrent neural network is applied to a second device, the second device comprises second training time sequence data with preset labels in each time period, and the data processing device based on the recurrent neural network comprises:
the first acquisition module is used for acquiring second to-be-processed time sequence data and inputting the second to-be-processed time sequence data into a second data processing model;
the second data processing model is obtained by carrying out federal forward training on second cyclic neural network models in different time periods and then carrying out federal reverse training on the second cyclic neural network models in different time periods after the federal forward training is finished;
the second cyclic neural network model in the same time period is obtained by combining the second training time sequence data and the first training time sequence data of the first equipment in the same target time period;
and the first execution module is used for executing a second preset data processing flow on the second to-be-processed time sequence data based on the second data processing model to obtain a target prediction tag of the second to-be-processed time sequence data.
11. A recurrent neural network-based data processing apparatus, wherein the recurrent neural network-based data processing apparatus comprises: a memory, a processor, and a program stored on the memory for implementing the recurrent neural network-based data processing method,
the memory is used for storing a program for realizing the data processing method based on the recurrent neural network;
the processor is configured to execute a program implementing the recurrent neural network-based data processing method to implement the steps of the recurrent neural network-based data processing method according to any one of claims 1 to 9.
12. A medium having stored thereon a program for implementing a recurrent neural network-based data processing method, the program being executed by a processor to implement the steps of the recurrent neural network-based data processing method according to any one of claims 1 to 9.
CN202010591342.0A 2020-06-24 2020-06-24 Data processing method, device and medium based on recurrent neural network Pending CN111738422A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010591342.0A CN111738422A (en) 2020-06-24 2020-06-24 Data processing method, device and medium based on recurrent neural network


Publications (1)

Publication Number Publication Date
CN111738422A true CN111738422A (en) 2020-10-02

Family

ID=72651058


Country Status (1)

Country Link
CN (1) CN111738422A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107730087A (en) * 2017-09-20 2018-02-23 平安科技(深圳)有限公司 Forecast model training method, data monitoring method, device, equipment and medium
US20190188572A1 (en) * 2016-05-20 2019-06-20 Deepmind Technologies Limited Memory-efficient backpropagation through time
CN111160484A (en) * 2019-12-31 2020-05-15 腾讯科技(深圳)有限公司 Data processing method and device, computer readable storage medium and electronic equipment
CN111222628A (en) * 2019-11-20 2020-06-02 深圳前海微众银行股份有限公司 Method, device and system for optimizing recurrent neural network training and readable storage medium


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021258882A1 (en) * 2020-06-24 2021-12-30 深圳前海微众银行股份有限公司 Recurrent neural network-based data processing method, apparatus, and device, and medium
CN113139646A (en) * 2021-05-17 2021-07-20 中国水利水电科学研究院 Data correction method and device, electronic equipment and readable storage medium
CN113139646B (en) * 2021-05-17 2023-10-31 中国水利水电科学研究院 Data correction method and device, electronic equipment and readable storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination