CN112446544A - Traffic flow prediction model training method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN112446544A (application number CN202011381851.7A)
- Authority
- CN
- China
- Prior art keywords
- model
- traffic flow
- flow prediction
- gradient
- prediction model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/20—Ensemble learning
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/26—Government or public services
Abstract
The invention relates to artificial intelligence technology and discloses a traffic flow prediction model training method based on federal transfer learning, which comprises the following steps: training a traffic flow prediction model using a local database and obtaining a local model gradient when the loss function converges; transmitting the local model gradient, through transfer learning, to the other participants in the federal transfer learning so that they can train their respective models; when the loss functions of the models of all participants have converged, sending the local model gradient to a cloud; obtaining a standard traffic flow prediction model from the federally learned model gradient returned by the cloud; and analyzing traffic data with the standard traffic flow prediction model to obtain a traffic flow analysis result. The invention also provides a traffic flow prediction model training device, electronic equipment and a storage medium based on federal transfer learning. The invention improves model accuracy and reduces the computational burden of the model while protecting the privacy of user data.
Description
Technical Field
The invention relates to the technical field of artificial intelligence, and in particular to a traffic flow prediction model training method and device based on federal transfer learning, an electronic device and a computer-readable storage medium.
Background Art
With the rise of machine learning and big data, existing model-sharing methods based on deep learning have succeeded in some public scenarios such as traffic flow prediction, but face difficulties in fields involving privacy protection. Because current law is strict about protecting user privacy, each system can only train its model on its locally stored database and cannot fully exploit the advantages of big data. Moreover, the model gradient must be updated continuously while training a good model, which increases the computational burden.
Disclosure of Invention
The invention provides a method, a device, equipment and a medium for training a traffic flow prediction model based on federal transfer learning, and mainly aims to improve the accuracy of the model and reduce the calculation pressure of the model under the condition of protecting the privacy of user data.
In order to achieve the purpose, the invention provides a traffic flow prediction model training method based on federal transfer learning, which comprises the following steps:
training a traffic flow prediction model which is created in advance by using traffic data in a local database of one participant participating in federal transfer learning until a loss function of the traffic flow prediction model is converged to obtain a local model gradient;
transmitting the local model gradient, through transfer learning, to the other participants participating in federal transfer learning to train their respective models;
when the loss functions in the models of all participants participating in federal transfer learning have converged, sending the local model gradient to a cloud for federated learning;
receiving a model gradient returned by the cloud after federal learning, and modifying the model gradient of the traffic flow prediction model by using the model gradient after federal learning to obtain a standard traffic flow prediction model;
and receiving traffic data transmitted by a user, and analyzing the traffic data by using the standard traffic flow prediction model to obtain a traffic flow analysis result.
Optionally, the training of the pre-created traffic flow prediction model using the traffic data in the local database of one of the participants participating in federal transfer learning, until the loss function of the traffic flow prediction model converges to obtain a local model gradient, comprises:
creating a traffic flow prediction model;
training the traffic flow prediction model by using the traffic data in the local database to obtain an output result of the traffic flow prediction model;
calculating a loss function value between the output result and a preset standard result by using a preset loss function;
and when the loss function value tends to converge, obtaining the trained traffic flow prediction model, and taking the gradient parameters of the trained traffic flow prediction model as the local model gradient.
Optionally, the calculating a loss function value between the output result and a preset standard result by using a preset loss function includes:
calculating the loss function value using the following formula:

$$\mathrm{MSE}=\frac{1}{n}\sum_{i=1}^{n}\left(f(x_i)-y_i\right)^2$$

wherein f(x_i) represents the model output result, y_i represents the preset standard result, MSE represents the model loss function value, and n represents the number of calculations.
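The MSE calculation above can be sketched in a few lines (a minimal illustration; the function name and array inputs are ours, not the patent's):

```python
import numpy as np

def mse_loss(f_x: np.ndarray, y: np.ndarray) -> float:
    """Mean squared error between model outputs f(x_i) and preset standard results y_i."""
    n = len(y)
    return float(np.sum((f_x - y) ** 2) / n)
```

The loss value is then compared against the convergence criterion described in the claim.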
Optionally, after the calculating the loss function value between the output result and the preset standard result by using a preset loss function, the method further includes:
when the loss function value does not tend to converge, updating the model gradient of the traffic flow prediction model using the following formula:

$$\theta_j=\theta_{j-1}-\alpha\,\nabla_{\theta}J(\theta_{j-1})$$

wherein θ_j represents the updated model gradient, θ_{j-1} represents the model gradient before the update, θ_0 and θ_1 are preset initial values of the model gradient, α represents the step size of gradient descent, and ∇_θ J(θ_{j-1}) indicates the direction of gradient descent.
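A single update step of this kind can be sketched as follows (a hedged illustration assuming the standard gradient-descent update rule, since the original formula image is not reproduced here; all names are ours):

```python
import numpy as np

def gradient_step(theta_prev: np.ndarray, direction: np.ndarray, alpha: float) -> np.ndarray:
    """One gradient-descent update: move a step of size alpha along the descent direction."""
    return theta_prev - alpha * direction
```

Repeated application of such steps continues until the loss function value tends to converge.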
Optionally, the transmitting the local model gradient, through transfer learning, to the other participants participating in federal transfer learning to train their respective models includes:

judging whether the local model D_s = k_i{x_i, y_i} and the model D_t = k_j{x_j, y_j} in the other participants participating in federal transfer learning have the same data types and user ranges;

and when the data types and user ranges are the same, transmitting the local model gradient k_i to the other participants participating in federal transfer learning to train the model D_t = k_j{x_j, y_j} in those participants.
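The compatibility check described in this claim (same data type, same user range) could look like the following sketch; the metadata keys and function names are hypothetical, chosen only for illustration:

```python
def can_transfer(source_meta: dict, target_meta: dict) -> bool:
    """Transfer is allowed only if both participants hold the same data types
    and cover the same user range, per the judgment step in the claim."""
    return (source_meta["data_type"] == target_meta["data_type"]
            and source_meta["user_range"] == target_meta["user_range"])

def transfer_gradient(source_gradient, source_meta, target_meta):
    """Return the gradient k_i used to train the target model D_t, or None if incompatible."""
    if can_transfer(source_meta, target_meta):
        return list(source_gradient)
    return None
```

With matching metadata the source gradient is handed to the target participant; otherwise no transfer takes place.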
Optionally, the sending the local model gradient to a cloud for federated learning includes:
performing a gradient aggregation operation on the local model gradient of each participant participating in federal transfer learning to obtain a joint model gradient;

and sending the joint model gradient to each participant participating in federal transfer learning.
Optionally, after sending the joint model gradient to each participant participating in federal transfer learning, the method further includes:
loading the federal model gradient into the traffic flow prediction model;
and modifying variables in the traffic flow prediction model according to the federal model gradient to obtain the standard traffic flow prediction model.
In addition, in order to achieve the above object, the present invention further provides a traffic flow prediction model training apparatus based on federal transfer learning, including:
the local model training module is used for training a pre-established traffic flow prediction model by using traffic data in a local database of one participant participating in federal transfer learning until a loss function of the traffic flow prediction model is converged to obtain a local model gradient;
the data migration module is used for transmitting the local model gradient, through transfer learning, to other participants participating in federal transfer learning to train their respective models;

the federated learning module is used for sending the local model gradient to the cloud for federated learning when the loss functions in the models of all participants participating in federal transfer learning converge;
the model updating module is used for receiving the model gradient returned by the cloud after the federal learning and modifying the model gradient of the traffic flow prediction model by using the model gradient after the federal learning to obtain a standard traffic flow prediction model;
and the data analysis module is used for receiving traffic flow data transmitted by a user and analyzing the traffic flow data by using the standard traffic flow prediction model to obtain a traffic flow analysis result.
In addition, to achieve the above object, the present invention also provides an electronic device including:
at least one processor; and

a memory communicatively coupled to the at least one processor; wherein:
the memory stores computer program instructions executable by the at least one processor to cause the at least one processor to perform the federal transfer learning based traffic flow prediction model training method described above.
In order to solve the above problem, the present invention further provides a computer-readable storage medium including a storage data area and a storage program area, the storage data area storing created data and the storage program area storing a computer program; wherein the computer program, when executed by a processor, implements the above traffic flow prediction model training method based on federal transfer learning.
In addition, through the server, the model gradient trained with local data is updated according to the model gradients trained by a plurality of clients in a federated learning manner, which expands the effective training data and improves the performance of the model. Therefore, the traffic flow prediction model training method and device, electronic equipment and computer-readable storage medium based on federal transfer learning provided by the invention improve model accuracy and reduce the computational burden of the model, while protecting user data privacy, by means of federal transfer learning.
Drawings
Fig. 1 is a schematic flow chart of a traffic flow prediction model training method based on federal transfer learning according to an embodiment of the present invention;
fig. 2 is a schematic block diagram of a traffic flow prediction model training device based on federal transfer learning according to an embodiment of the present invention;
fig. 3 is a schematic internal structural diagram of an electronic device for implementing a federal transfer learning-based traffic flow prediction model training method according to an embodiment of the present invention;
the objects, features and advantages of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The embodiment of the application provides a traffic flow prediction model training method based on federal transfer learning. The execution subject of the method includes, but is not limited to, at least one of electronic devices such as a server and a terminal that can be configured to execute the method provided by the embodiments of the present application. In other words, the method may be executed by software or hardware installed in a terminal device or a server device, and the software may be a blockchain platform. The server includes but is not limited to: a single server, a server cluster, a cloud server, a cloud server cluster, and the like.
Referring to fig. 1, a flow chart of a traffic flow prediction model training method based on federal transfer learning according to an embodiment of the present invention is schematically shown. In an embodiment of the present invention, the method for training a traffic flow prediction model based on federal transfer learning includes:
s1, training a traffic flow prediction model which is created in advance by using the traffic data in the local database of one participant participating in the federal transfer learning until the loss function of the traffic flow prediction model is converged to obtain a local model gradient.
In the embodiment of the present invention, the data in the local database may be traffic data. The embodiment of the invention can utilize the data acquisition equipment to acquire the traffic data from each traffic scene.
In detail, the traffic data includes vehicle information, personnel information, violation records, and the like; the data acquisition devices include various cameras, sensors, and the like; and the traffic scenes include expressways, rural roads, roads during morning and evening rush hours, bus stations, and the like.
In detail, the S1 includes:
creating a traffic flow prediction model;
training the traffic flow prediction model by using the traffic data in the local database to obtain an output result of the traffic flow prediction model;
calculating a loss function value between the output result and a preset standard result by using a preset loss function;
and when the loss function value tends to converge, obtaining the trained traffic flow prediction model, and taking the gradient parameters of the trained traffic flow prediction model as the local model gradient.
In detail, in the embodiment of the present invention, the traffic flow prediction model may be created with a convolutional neural network and includes a convolutional layer, a pooling layer, a fully-connected layer, and the like. The convolutional layer extracts features from the data using a pre-constructed function; the pooling layer compresses the extracted feature data to retain the main features, which simplifies the computational complexity; and the fully-connected layer connects all the feature data and outputs the result.
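The three layer types just described can be illustrated with a deliberately tiny numpy sketch. This is not the patent's actual architecture; the kernel, window size, and weights are illustrative only:

```python
import numpy as np

def conv1d(x, kernel):
    """Convolutional layer: extract local features with a pre-constructed kernel."""
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel) for i in range(len(x) - k + 1)])

def max_pool(x, size=2):
    """Pooling layer: compress features by keeping the maximum of each window."""
    return np.array([x[i:i + size].max() for i in range(0, len(x) - size + 1, size)])

def dense(x, w, b):
    """Fully-connected layer: combine all remaining features into the output."""
    return x @ w + b
```

Chaining the three functions gives a minimal forward pass of the kind of network the embodiment describes.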
In detail, the training of the pre-created traffic flow prediction model adjusts the parameters of the algorithm in the traffic flow prediction model according to the traffic data in the local database, so that the trained traffic flow prediction model better maps or reflects the traffic data in the entire local database.
The embodiment of the invention may calculate the loss function value between the output result and the preset standard result using the following Mean Squared Error (MSE) formula:

$$\mathrm{MSE}=\frac{1}{n}\sum_{i=1}^{n}\left(f(x_i)-y_i\right)^2$$

wherein f(x_i) represents the model output result, y_i represents the preset standard result, MSE represents the model loss function value, and n represents the number of calculations.
In one embodiment of the present invention, when the loss function value is less than or equal to a preset threshold value, it is determined that the loss function value tends to converge, and a model gradient of the traffic flow prediction model at that time is obtained as a local model gradient.
In another embodiment of the present invention, when the loss function value is greater than the preset threshold, the loss function value does not tend to converge, and the model gradient of the traffic flow prediction model needs to be further updated.
In the embodiment of the invention, the model gradient of the traffic flow prediction model is updated using the following formula:

$$\theta_j=\theta_{j-1}-\alpha\,\nabla_{\theta}J(\theta_{j-1})$$

wherein θ_j represents the updated model gradient, θ_{j-1} represents the model gradient before the update, θ_0 and θ_1 are preset initial values of the model gradient, α represents the step size of gradient descent, and ∇_θ J(θ_{j-1}) indicates the direction of gradient descent.
S2, transmitting the local model gradient, through transfer learning, to the other participants participating in federal transfer learning to train their respective models.
Transfer Learning is a machine learning method that transfers knowledge from one domain (the source domain) to another domain (the target domain), so that the target domain can obtain a better learning effect.
In detail, when the data and data features of the models overlap little, transfer learning is chosen so that the data need not be segmented, and it is used to overcome a shortage of data or labels. For example, consider two different institutions: one a bank located in China, the other an e-commerce company located in the United States. Owing to regional restrictions, the intersection of the two institutions' user groups is very small, and owing to the difference in institution types, their data features overlap only partially. In this case, transfer learning must be introduced to solve the problems of small unilateral data scale and few labeled samples, so as to improve the model's effect and achieve effective federated learning.
In an embodiment of the present invention, the transmitting the local model gradient, through transfer learning, to the other participants participating in federal transfer learning to train their respective models includes:

judging whether the local model D_s = k_i{x_i, y_i} and the model D_t = k_j{x_j, y_j} in the other participants participating in federal transfer learning have the same data types and user ranges;

and when the data types and user ranges are the same, transmitting the local model gradient k_i to the other participants participating in federal transfer learning to train the model D_t = k_j{x_j, y_j} in those participants.
In the embodiment of the invention, transmitting the local model gradient through transfer learning to the other participants participating in federal transfer learning for training their respective models can reduce the number of model iterations those participants need, save training time, and improve the model training effect.
S3, when the loss functions in the models of all participants participating in federal transfer learning have converged, sending the local model gradient to the cloud for federated learning.
In an embodiment of the present invention, the federated learning includes: performing a gradient aggregation operation on the local model gradient of each participant participating in federal transfer learning to obtain a joint model gradient, and sending the joint model gradient to each participant participating in federal transfer learning.
In an embodiment of the present invention, gradient aggregation is an operation that calculates a single value from a set of values. For example, calculating the average daily temperature from the daily temperatures accumulated over a month is an aggregation operation. One embodiment of the invention may obtain the joint model gradient by taking a weighted average of the local model gradients of the participants participating in federal transfer learning.
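The weighted-average aggregation mentioned here might look like the following sketch. The weighting scheme (e.g. proportional to each participant's data size) is an assumption on our part; the text does not fix it:

```python
import numpy as np

def aggregate_gradients(local_gradients, weights=None):
    """Gradient aggregation: weighted average of the participants' local model
    gradients, yielding the joint model gradient sent back to every participant."""
    grads = np.stack([np.asarray(g, dtype=float) for g in local_gradients])
    if weights is None:
        weights = np.ones(len(local_gradients))  # unweighted mean by default
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                              # normalize so the weights sum to 1
    return w @ grads
```

For two participants with equal weight this reduces to the plain element-wise mean of their gradients.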
S4, receiving the model gradient returned by the cloud after federated learning, and modifying the model gradient of the traffic flow prediction model using the federated model gradient to obtain a standard traffic flow prediction model.
In detail, the updating the traffic flow prediction model by using the model gradient after the federal learning includes: and loading the federal model gradient into the traffic flow prediction model, and modifying variables in the traffic flow prediction model according to the federal model gradient to obtain the standard traffic flow prediction model.
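Modifying the local model's variables with the returned federal gradient could be sketched as below. The exact update rule is not spelled out in the text; we assume a simple per-variable step, and all names here are illustrative:

```python
def apply_federal_gradient(model_vars: dict, federal_gradient: dict, step: float = 1.0) -> dict:
    """Update each model variable using the corresponding federal-gradient entry,
    producing the variables of the standard traffic flow prediction model."""
    return {name: value - step * federal_gradient[name]
            for name, value in model_vars.items()}
```

After this pass over all variables, the resulting model is the standard traffic flow prediction model used in the analysis step.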
S5, receiving traffic data transmitted by a user, and analyzing the traffic data using the standard traffic flow prediction model to obtain a traffic flow analysis result.
According to the embodiment of the invention, the traffic flow data is analyzed with the standard traffic flow prediction model, so that road traffic conditions can be predicted. In addition, through the server, the model gradient trained with local data is updated according to the model gradients trained by a plurality of clients in a federated learning manner, which expands the effective training data and improves the performance of the model. Therefore, the embodiment of the invention improves model accuracy and reduces the computational burden of the model, while protecting user data privacy, through federal transfer learning.
Fig. 2 is a schematic block diagram of a traffic flow prediction model training device based on federal transfer learning according to the present invention.
The traffic flow prediction model training device 100 based on federal transfer learning can be installed in electronic equipment. According to the realized functions, the federal migration learning based traffic flow prediction model training device can comprise a local model training module 101, a data migration module 102, a federal learning module 103, a model updating module 104 and a data analysis module 105. The module of the present invention, which may also be referred to as a unit, refers to a series of computer program segments that can be executed by a processor of an electronic device and that can perform a fixed function, and that are stored in a memory of the electronic device.
In the present embodiment, the functions regarding the respective modules/units are as follows:
the local model training module 101 is configured to train a traffic flow prediction model created in advance by using traffic data in a local database of one of the participants participating in federal transfer learning until a loss function of the traffic flow prediction model converges to obtain a local model gradient.
The data migration module 102 is configured to transmit the local model gradient, through transfer learning, to other participants participating in federal transfer learning for the training of their respective models;

the federal learning module 103 is configured to send the local model gradient to a cloud for federated learning when the loss functions in the models of all participants participating in federal transfer learning converge;
the model updating module 104 is configured to receive the model gradient returned by the cloud after federal learning, and modify the model gradient of the traffic flow prediction model by using the model gradient after federal learning to obtain a standard traffic flow prediction model;
the data analysis module 105 is configured to receive traffic data transmitted by a user, and analyze the traffic data by using the standard traffic flow prediction model to obtain a traffic flow analysis result.
In detail, each module in the federal transfer learning based traffic flow prediction model training device 100 may execute a traffic flow prediction model training method based on federal transfer learning, including the following steps:
step one, a local model training module 101 in one of the participants participating in federal transfer learning trains a traffic flow prediction model created in advance by using traffic data in a local database until a loss function of the traffic flow prediction model converges to obtain a local model gradient.
In the embodiment of the present invention, the data in the local database may be traffic data. The local model training module 101 according to the embodiment of the present invention may acquire the traffic data from each traffic scene by using a data acquisition device.
In detail, the traffic data includes vehicle information, personnel information, violation records, and the like; the data acquisition devices include various cameras, sensors, and the like; and the traffic scenes include expressways, rural roads, roads during morning and evening rush hours, bus stations, and the like.
In detail, the local model training module 101 is specifically configured to: create a traffic flow prediction model; train the traffic flow prediction model using the data in the local database to obtain an output result of the traffic flow prediction model; calculate a loss function value between the output result and a preset standard result using a preset loss function; and, when the loss function value tends to converge, obtain the trained traffic flow prediction model and take its gradient parameters as the local model gradient.
In detail, in the embodiment of the present invention, the traffic flow prediction model may be created with a convolutional neural network and includes a convolutional layer, a pooling layer, a fully-connected layer, and the like. The convolutional layer extracts features from the data using a pre-constructed function; the pooling layer compresses the extracted feature data to retain the main features, which simplifies the computational complexity; and the fully-connected layer connects all the feature data and outputs the result.
In detail, the local model training module 101 trains the pre-created traffic flow prediction model by adjusting the parameters of the algorithm in the traffic flow prediction model according to the traffic data in the local database, so that the trained traffic flow prediction model better maps or reflects the traffic data in the entire local database.
The local model training module 101 of the embodiment of the present invention may calculate the loss function value between the output result and the preset standard result using the following Mean Squared Error (MSE) formula:

$$\mathrm{MSE}=\frac{1}{n}\sum_{i=1}^{n}\left(f(x_i)-y_i\right)^2$$

wherein f(x_i) represents the model output result, y_i represents the preset standard result, MSE represents the model loss function value, and n represents the number of calculations.
In one embodiment of the present invention, when the loss function value is smaller than or equal to a preset threshold, the local model training module 101 determines that the loss function value tends to converge, and obtains a model gradient of the traffic flow prediction model at this time as a local model gradient.
In another embodiment of the present invention, when the loss function value is greater than the preset threshold, the local model training module 101 determines that the loss function value does not tend to converge, and a further update of the model gradient of the traffic flow prediction model is required.
In this embodiment of the present invention, the local model training module 101 updates the model gradient of the traffic flow prediction model by using the following formula:

θ_j = θ_{j−1} − α ∂J(θ_0, θ_1)/∂θ

wherein θ_j represents the updated model gradient, θ_{j−1} represents the model gradient before the update, θ_0 and θ_1 represent the preset initial values of the function in the model, α represents the gradient descent step size, and ∂J(θ_0, θ_1)/∂θ indicates the direction of gradient descent.
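A minimal sketch of the gradient-descent update above, with `theta_prev` playing the role of θ_{j−1} and `alpha` the step size (the function name and list representation are illustrative assumptions):

```python
def gradient_step(theta_prev, grad, alpha=0.01):
    """One gradient-descent update: theta_j = theta_{j-1} - alpha * grad,
    applied element-wise to each model parameter."""
    return [t - alpha * g for t, g in zip(theta_prev, grad)]
```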
And step two, the data migration module 102 transmits the local model gradient to other participants participating in the federal migration learning through the migration learning to train respective models.
Transfer Learning is a machine learning method that transfers knowledge from one domain (the source domain) to another domain (the target domain), so that a better learning effect can be obtained in the target domain.
In detail, when the overlap between the data samples and the data features of the participants is small, transfer learning is selected so that the data need not be segmented; transfer learning is used to overcome a shortage of data or labels. For example, consider two different institutions: a bank located in China and an e-commerce company located in the United States. Because of regional restrictions, the intersection of the user groups of the two institutions is very small, and because the institutions are of different types, their data features only partially overlap. In this case, transfer learning must be introduced to solve the problems of the small scale of unilateral data and the small number of labeled samples, thereby improving the model effect and enabling effective federal learning.
In this embodiment of the present invention, when the data migration module 102 transmits the local model gradient to other participants participating in federated migration learning to perform model training, it performs:
judging whether the data types and the user ranges in the local model D_s = k_i{x_i, y_i} and the models D_t = k_j{x_j, y_j} of the other participants participating in the federal transfer learning are the same;

when the data types and user ranges are the same, transmitting the local model gradient k_i to the other participants participating in the federal transfer learning to train the models D_t = k_j{x_j, y_j} in the other participants.
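The data-type and user-range check described above might be sketched as follows; the metadata keys `data_type` and `user_range` and the function name are assumptions for illustration, not part of the claimed method:

```python
def maybe_transfer(local_meta, remote_meta, local_gradient):
    """Share the local model gradient with another participant only when
    the data type and the user range of the two datasets match;
    otherwise withhold it (return None)."""
    same_type = local_meta["data_type"] == remote_meta["data_type"]
    same_range = local_meta["user_range"] == remote_meta["user_range"]
    return local_gradient if (same_type and same_range) else None
```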
In the embodiment of the present invention, the data migration module 102 transfers the local model gradient to the other participants participating in the federal transfer learning through transfer learning for their respective model training, which can reduce the number of model iterations required by those participants, thereby saving training time and improving the model training effect.
And step three, when loss functions in the models of all the participating parties participating in the federal transfer learning are converged, the federal learning module 103 sends the local model gradient to the cloud for the federal learning.
In an embodiment of the present invention, the federal learning includes: and carrying out gradient aggregation operation on the local model gradient of each participant participating in the federal transfer learning to obtain a combined model gradient, and sending the combined model gradient to each participant participating in the federal transfer learning.
In an embodiment of the present invention, the gradient aggregation is an operation that calculates a single value from a set of values. For example, calculating an average daily temperature value from the daily temperatures accumulated over a month is an aggregation operation. One embodiment of the invention may obtain the combined model gradient by taking a weighted average of the local model gradients of the participants participating in the federal transfer learning.
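The weighted-average gradient aggregation described above can be sketched as follows (the function name and the list-of-lists gradient representation are illustrative assumptions):

```python
def aggregate_gradients(local_gradients, weights):
    """Combine participants' local model gradients by weighted average.
    Each local gradient is a list of parameter gradients of equal length;
    weights might, for example, reflect each participant's data size."""
    total = float(sum(weights))
    n_params = len(local_gradients[0])
    return [
        sum(w * grad[k] for w, grad in zip(weights, local_gradients)) / total
        for k in range(n_params)
    ]
```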
And step four, the model updating module 104 receives the model gradient returned by the cloud after the federal learning, and modifies the model gradient of the traffic flow prediction model by using the model gradient after the federal learning to obtain a standard traffic flow prediction model.
In detail, the model update module 104 updates the traffic flow prediction model with the model gradient after the federal learning, including: and loading the federal model gradient into the traffic flow prediction model, and modifying variables in the traffic flow prediction model according to the federal model gradient to obtain the standard traffic flow prediction model.
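Loading the federal model gradient and modifying the model variables might be sketched as a simple parameter update; the dictionary representation of parameters and the learning rate `lr` are illustrative assumptions, not specified in the embodiment:

```python
def apply_federated_gradient(model_params, fed_gradient, lr=0.01):
    """Modify the variables of the local traffic flow prediction model
    using the combined gradient returned from the cloud."""
    return {
        name: value - lr * fed_gradient[name]
        for name, value in model_params.items()
    }
```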
And step five, the data analysis module 105 receives traffic flow data transmitted by a user, and analyzes the traffic flow data by using the standard traffic flow prediction model to obtain a traffic flow analysis result.
According to the embodiment of the invention, the traffic flow data is analyzed according to the standard traffic flow prediction model, so that the road traffic condition can be predicted.
Fig. 3 is a schematic structural diagram of an electronic device for implementing a traffic flow prediction model training method based on federal transfer learning according to the present invention.
The electronic device 1 may include a processor 10, a memory 11, and a bus, and may further include a computer program stored in the memory 11 and executable on the processor 10, such as a federal migration learning based traffic flow prediction model training program 12.
The memory 11 includes at least one type of readable storage medium, which includes flash memory, removable hard disk, multimedia card, card-type memory (e.g., SD or DX memory, etc.), magnetic memory, magnetic disk, optical disk, etc. The memory 11 may in some embodiments be an internal storage unit of the electronic device 1, such as a removable hard disk of the electronic device 1. The memory 11 may also be an external storage device of the electronic device 1 in other embodiments, such as a plug-in mobile hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the electronic device 1. Further, the memory 11 may also include both an internal storage unit and an external storage device of the electronic device 1. The memory 11 may be used to store not only application software installed in the electronic device 1 and various types of data, such as codes of the traffic flow prediction model training program 12 based on federal transfer learning, but also temporarily store data that has been output or will be output.
The processor 10 may be composed of an integrated circuit in some embodiments, for example, a single packaged integrated circuit, or may be composed of a plurality of integrated circuits packaged with the same or different functions, including one or more Central Processing Units (CPUs), microprocessors, digital Processing chips, graphics processors, and combinations of various control chips. The processor 10 is a Control Unit (Control Unit) of the electronic device, connects various components of the whole electronic device by using various interfaces and lines, and executes various functions and processes data of the electronic device 1 by running or executing a program or a module stored in the memory 11 (for example, executing a traffic flow prediction model training program based on federal migration learning, etc.), and calling data stored in the memory 11.
The bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. The bus is arranged to enable connection communication between the memory 11 and at least one processor 10 or the like.
Fig. 3 shows only an electronic device with components, and it will be understood by those skilled in the art that the structure shown in fig. 3 does not constitute a limitation of the electronic device 1, and may comprise fewer or more components than those shown, or some components may be combined, or a different arrangement of components.
For example, although not shown, the electronic device 1 may further include a power supply (such as a battery) for supplying power to each component, and preferably, the power supply may be logically connected to the at least one processor 10 through a power management device, so as to implement functions of charge management, discharge management, power consumption management, and the like through the power management device. The power supply may also include any component of one or more dc or ac power sources, recharging devices, power failure detection circuitry, power converters or inverters, power status indicators, and the like. The electronic device 1 may further include various sensors, a bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
Further, the electronic device 1 may further include a network interface, and optionally, the network interface may include a wired interface and/or a wireless interface (such as a WI-FI interface, a bluetooth interface, etc.), which are generally used for establishing a communication connection between the electronic device 1 and other electronic devices.
Optionally, the electronic device 1 may further comprise a user interface, which may be a Display (Display), an input unit (such as a Keyboard), and optionally a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable for displaying information processed in the electronic device 1 and for displaying a visualized user interface, among other things.
It is to be understood that the described embodiments are for purposes of illustration only and that the scope of the appended claims is not limited to such structures.
The federal migration learning based traffic flow prediction model training program 12 stored in the memory 11 of the electronic device 1 is a combination of a plurality of computer programs, and when running in the processor 10, can realize:
training a traffic flow prediction model which is created in advance by using traffic data in a local database of one participant participating in federal transfer learning until a loss function of the traffic flow prediction model is converged to obtain a local model gradient;
transmitting the local model gradient to other participants participating in the federal transfer learning through the transfer learning to carry out the training of respective models;
when loss functions in models of all participating parties participating in federated transfer learning are converged, sending the local model gradient to a cloud for federated learning;
receiving a model gradient returned by the cloud after federal learning, and modifying the model gradient of the traffic flow prediction model by using the model gradient after federal learning to obtain a standard traffic flow prediction model;
and receiving traffic data transmitted by a user, and analyzing the traffic data by using the standard traffic flow prediction model to obtain a traffic flow analysis result.
Further, the integrated modules/units of the electronic device 1, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. The computer-readable medium may include: any entity or device capable of carrying said computer program code, recording medium, U-disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM).
Further, the computer usable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus, device and method can be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims should not be construed as limiting the claim concerned.
The block chain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism, an encryption algorithm and the like. A block chain (Blockchain), which is essentially a decentralized database, is a series of data blocks associated by using a cryptographic method, and each data block contains information of a batch of network transactions, so as to verify the validity (anti-counterfeiting) of the information and generate a next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the system claims may also be implemented by one unit or means in software or hardware. Terms such as first and second are used to denote names, and do not denote any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.
Claims (10)
1. A traffic flow prediction model training method based on federal transfer learning is characterized in that the method is applied to one of participants participating in federal transfer learning, and comprises the following steps:
training a traffic flow prediction model which is created in advance by using traffic data in a local database of one participant participating in federal transfer learning until a loss function of the traffic flow prediction model is converged to obtain a local model gradient;
transmitting the local model gradient to other participants participating in the federal transfer learning through the transfer learning to carry out the training of respective models;
when loss functions in models of all participating parties participating in federated transfer learning are converged, sending the local model gradient to a cloud for federated learning;
receiving a model gradient returned by the cloud after federal learning, and modifying the model gradient of the traffic flow prediction model by using the model gradient after federal learning to obtain a standard traffic flow prediction model;
and receiving traffic flow data transmitted by a user, and analyzing the traffic data by using the standard traffic flow prediction model to obtain a traffic flow analysis result.
2. The method for training the traffic flow prediction model based on the federal migration learning of claim 1, wherein the step of training the traffic flow prediction model created in advance by using the traffic data in the local database of one of the participants participating in the federal migration learning until the loss function of the traffic flow prediction model converges to obtain a local model gradient comprises the following steps:
creating a traffic flow prediction model;
training the traffic flow prediction model by using the traffic data in the local database to obtain an output result of the traffic flow prediction model;
calculating a loss function value between the output result and a preset standard result by using a preset loss function;
and when the loss function value tends to be convergent, obtaining a traffic flow prediction model after training, and obtaining gradient parameters of the traffic flow prediction model after training to obtain the local model gradient.
3. The method for training a traffic flow prediction model based on federal transfer learning according to claim 2, wherein the calculating the loss function value between the output result and a preset standard result by using a preset loss function includes:
the loss function value is calculated using the following formula:

MSE = (1/n) Σ_{i=1}^{n} (f(x_i) − y_i)²

wherein f(x_i) represents the model output result, y_i represents the preset standard result, MSE represents the model loss function value, and n represents the number of samples over which the loss is calculated.
4. The method for training a traffic flow prediction model based on federal transfer learning of claim 3, wherein after calculating the loss function value between the output result and a preset standard result by using a preset loss function, the method further comprises:
when the loss function value does not tend to converge, updating the model gradient of the traffic flow prediction model by using the following formula:

θ_j = θ_{j−1} − α ∂J(θ_0, θ_1)/∂θ

wherein θ_j represents the updated model gradient, θ_{j−1} represents the model gradient before the update, θ_0 and θ_1 represent the preset initial values of the function in the model, and α represents the gradient descent step size.
5. The method for training a traffic flow prediction model based on federated transfer learning according to claim 1, wherein the transferring the local model gradient to other participants participating in federated learning through transfer learning for respective model training comprises:
judging whether the data types and the user ranges in the local model D_s = k_i{x_i, y_i} and the models D_t = k_j{x_j, y_j} of the other participants participating in the federal transfer learning are the same;

when the data types and user ranges are the same, transmitting the local model gradient k_i to the other participants participating in the federal transfer learning to train the models D_t = k_j{x_j, y_j} in the other participants.
6. The method for training a traffic flow prediction model based on federal transfer learning of claim 1, wherein the step of sending the local model gradient to a cloud for federal learning comprises the steps of:
carrying out gradient aggregation operation on the local model gradient of each participant participating in the federal migration learning to obtain a combined model gradient;
and sending the combined model gradient to each participant participating in the federal migration learning.
7. The method for training a traffic flow prediction model based on federal transfer learning of claim 6, wherein the step of sending the combined model gradient to each participant participating in federal transfer learning further comprises:
loading the federal model gradient into the traffic flow prediction model;
and modifying variables in the traffic flow prediction model according to the federal model gradient to obtain the standard traffic flow prediction model.
8. A traffic flow prediction model training device based on federal transfer learning is characterized by comprising:
the local model training module is used for training a pre-established traffic flow prediction model by using traffic data in a local database of one participant participating in federal transfer learning until a loss function of the traffic flow prediction model is converged to obtain a local model gradient;
the data migration module is used for transmitting the local model gradient to other participants participating in the federal migration learning through the migration learning to train respective models;
the federated learning module is used for sending the local model gradient to the cloud for federated learning when loss functions in the models of all the participating parties participating in federated transfer learning are converged;
the model updating module is used for receiving the model gradient returned by the cloud after the federal learning and modifying the model gradient of the traffic flow prediction model by using the model gradient after the federal learning to obtain a standard traffic flow prediction model;
and the data analysis module is used for receiving the traffic data transmitted by the user and analyzing the traffic data by using the standard traffic flow prediction model to obtain a traffic flow analysis result.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and

a memory communicatively coupled to the at least one processor; wherein,
the memory stores computer program instructions executable by the at least one processor to enable the at least one processor to perform the federal transfer learning based traffic flow prediction model training method of any of claims 1 to 7.
10. A computer-readable storage medium comprising a data storage area storing created data and a program storage area storing a computer program, wherein the computer program when executed by a processor implements the federal migration learning based traffic flow prediction model training method as claimed in any one of claims 1 to 7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011381851.7A CN112446544A (en) | 2020-12-01 | 2020-12-01 | Traffic flow prediction model training method and device, electronic equipment and storage medium |
PCT/CN2021/083086 WO2022116424A1 (en) | 2020-12-01 | 2021-03-25 | Method and apparatus for training traffic flow prediction model, electronic device, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112446544A true CN112446544A (en) | 2021-03-05 |
Family
ID=74739191
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011381851.7A Pending CN112446544A (en) | 2020-12-01 | 2020-12-01 | Traffic flow prediction model training method and device, electronic equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112446544A (en) |
WO (1) | WO2022116424A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113177595A (en) * | 2021-04-29 | 2021-07-27 | 北京明朝万达科技股份有限公司 | Document classification model construction, training and testing method and model construction system |
CN113313264A (en) * | 2021-06-02 | 2021-08-27 | 河南大学 | Efficient federal learning method in Internet of vehicles scene |
WO2022116424A1 (en) * | 2020-12-01 | 2022-06-09 | 平安科技(深圳)有限公司 | Method and apparatus for training traffic flow prediction model, electronic device, and storage medium |
WO2022250609A1 (en) * | 2021-05-28 | 2022-12-01 | 脸萌有限公司 | Data protection method, network structure training method and apparatus, medium, and device |
WO2024055979A1 (en) * | 2022-09-14 | 2024-03-21 | 抖音视界有限公司 | Model training method and apparatus, system, and storage medium |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115311860A (en) * | 2022-08-09 | 2022-11-08 | 中国科学院计算技术研究所 | Online federal learning method of traffic flow prediction model |
CN116148193B (en) * | 2023-04-18 | 2023-07-18 | 天津中科谱光信息技术有限公司 | Water quality monitoring method, device, equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109189825A (en) * | 2018-08-10 | 2019-01-11 | 深圳前海微众银行股份有限公司 | Lateral data cutting federation learning model building method, server and medium |
CN111739285A (en) * | 2020-05-14 | 2020-10-02 | 南方科技大学 | Traffic flow prediction method, device, equipment and computer storage medium |
CN111899076A (en) * | 2020-08-12 | 2020-11-06 | 科技谷(厦门)信息技术有限公司 | Aviation service customization system and method based on federal learning technology platform |
CN111935156A (en) * | 2020-08-12 | 2020-11-13 | 科技谷(厦门)信息技术有限公司 | Data privacy protection method for federated learning |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10012766B2 (en) * | 2015-04-10 | 2018-07-03 | Google Llc | Monitoring external vibration sources for data collection |
CN111241580B (en) * | 2020-01-09 | 2022-08-09 | 广州大学 | Trusted execution environment-based federated learning method |
CN111611610B (en) * | 2020-04-12 | 2023-05-30 | 西安电子科技大学 | Federal learning information processing method, system, storage medium, program, and terminal |
CN112446544A (en) * | 2020-12-01 | 2021-03-05 | 平安科技(深圳)有限公司 | Traffic flow prediction model training method and device, electronic equipment and storage medium |
- 2020-12-01: CN CN202011381851.7A patent/CN112446544A/en active Pending
- 2021-03-25: WO PCT/CN2021/083086 patent/WO2022116424A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2022116424A1 (en) | 2022-06-09 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||