CN110598918B - Aquatic product price prediction method and device and computer storage medium - Google Patents

Aquatic product price prediction method and device and computer storage medium

Info

Publication number
CN110598918B
CN110598918B (application CN201910804169.5A)
Authority
CN
China
Prior art keywords
data
encrypted
terminal
intermediate server
data set
Prior art date
Legal status
Active
Application number
CN201910804169.5A
Other languages
Chinese (zh)
Other versions
CN110598918A
Inventor
叶宁
乐仁龙
徐智军
徐旭辉
龚泽熙
Current Assignee
Ningbo haihaixian Information Technology Co.,Ltd.
Original Assignee
Ningbo Haishangxian Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Ningbo Haishangxian Information Technology Co ltd
Priority to CN201910804169.5A
Publication of CN110598918A
Application granted
Publication of CN110598918B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/049Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0283Price estimation or determination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Forestry; Mining
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A20/00Water conservation; Efficient water supply; Efficient water use
    • Y02A20/20Controlling water pollution; Waste water treatment

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Development Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Game Theory and Decision Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Mathematical Physics (AREA)
  • Tourism & Hospitality (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Computer Security & Cryptography (AREA)
  • Operations Research (AREA)
  • Bioethics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Primary Health Care (AREA)

Abstract

The invention discloses a method and a device for predicting aquatic product prices and a computer storage medium. The method is applied to a first terminal and comprises the following steps: sending an encrypted first data set to an intermediate server platform, wherein the first data set comprises fishery historical data and historical sales data of a set aquatic product, and the historical sales data at least comprise historical unit parameter data; receiving the same characteristic parameters and the encrypted mapping model sent by the intermediate server platform; determining sample data based on the first data set, the same characteristic parameters and the encrypted mapping model; determining a prediction model based on the sample data, determining a loss value based on the prediction model, and sending the encrypted loss value to the intermediate server platform, the loss value being used by the intermediate server platform to determine the convergence of the prediction model; receiving a training stopping instruction sent by the intermediate server platform and taking the prediction model as a target prediction model; and predicting the unit parameter based on the target prediction model, the current-day fishing ground data and the current-day sales data.

Description

Aquatic product price prediction method and device and computer storage medium
Technical Field
The invention relates to the technical field of data processing, in particular to a method and a device for predicting aquatic product prices and a computer storage medium.
Background
The aquatic product market is an extremely important component of the Chinese market economy, and the unit price of aquatic products is at its core. However, many factors influence the unit price of aquatic products, such as historical unit prices, seasonal factors, supply and demand, and circulation costs, which makes predicting the unit price of aquatic products a challenging task. Because of these many influencing factors, the unit price of aquatic products cannot currently be accurately predicted with data processing technology, which hinders the scientific development of the aquatic product market.
Disclosure of Invention
In order to solve the existing technical problems, the embodiment of the invention provides a method and a device for predicting aquatic product prices and a computer storage medium.
In order to achieve the above purpose, the technical solution of the embodiment of the present invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an aquatic product price prediction method, which is applied to a first terminal, and the method includes:
sending the encrypted first data set to the intermediate server platform; wherein the first data set comprises fishery historical data and historical sales data of a set aquatic product; the historical sales data at least comprises historical unit parameter data;
Receiving the same characteristic parameters and the encrypted mapping model sent by the intermediate server platform, and determining sample data based on the first data set, the same characteristic parameters and the encrypted mapping model; the same characteristic parameter is determined by the intermediate server platform according to the received first data set and the second data set, and the second data set and the encrypted mapping model are sent to the intermediate server platform by the second terminal;
determining a prediction model based on the sample data, determining a loss value based on the prediction model, and sending an encrypted loss value to an intermediate server platform; wherein the loss value is used for the intermediate server platform to determine convergence of a predictive model;
receiving a training stopping instruction sent by an intermediate server platform, and taking the prediction model as a target prediction model; wherein the training stopping instruction is sent when the intermediate server platform determines that the prediction model converges;
and predicting unit parameters of the set aquatic product based on the target prediction model, the current-day fishing ground data and the current-day sales data.
In the above solution, the determining a prediction model based on the sample data includes:
Constructing a long-short term memory (LSTM) recurrent neural network model based on the sample data;
and training the LSTM recurrent neural network model based on the sample data to obtain a prediction model.
In the above solution, the predicting unit parameters of set aquatic products based on the target prediction model, the current-day fishing ground data, and the current-day sales data includes:
receiving the current fishing ground data and the current sales data; carrying out normalization processing on the current-day fishery data and the current-day sales data to obtain normalized current-day fishery data and normalized current-day sales data; inputting the normalized fishery data on the same day and the normalized sales data on the same day into the LSTM recurrent neural network model, and predicting unit parameters of the set aquatic product; and performing reverse normalization processing on the predicted unit parameters of the set aquatic products to obtain the predicted values of the unit parameters of the set aquatic products.
In the above scheme, before receiving a training stopping instruction sent by the intermediate server platform, the method further includes:
receiving a combined gradient value sent by the intermediate server platform; the joint gradient value is determined by the intermediate server platform according to the received encrypted first gradient value and the encrypted second gradient value; the encrypted first gradient value is sent to the intermediate server platform by the first terminal; the encrypted second gradient value is sent to the intermediate server platform by the second terminal;
Updating sample data based on the joint gradient value;
based on the updated sample data, the predictive model is updated.
In a second aspect, an embodiment of the present invention further provides an aquatic product price prediction method, which is applied to an intermediate server platform, and the method includes:
receiving an encrypted first data set sent by a first terminal and an encrypted second data set sent by a second terminal; the first data set comprises fishery historical data and historical sales data of a set aquatic product; the historical sales data at least comprises historical unit parameter data;
decrypting the encrypted first data set and the encrypted second data set to obtain a first data set and a second data set;
determining a same feature parameter based on the first data set and the second data set;
receiving an encrypted mapping model sent by a second terminal, and sending the same characteristic parameters and the encrypted mapping model to the first terminal;
receiving an encrypted loss value sent by a first terminal, and determining the convergence of a prediction model based on the encrypted loss value;
and when the prediction model is determined to be converged, sending a training stopping instruction to the first terminal.
In the above solution, the determining convergence of the prediction model based on the encrypted loss value includes:
Receiving an encrypted second loss value sent by a second terminal;
determining a predictive model convergence based on the encrypted loss value and the encrypted second loss value.
In the foregoing solution, the determining the convergence of the prediction model based on the encrypted loss value and the encrypted second loss value includes:
obtaining a loss sum based on the received encrypted loss value and the encrypted second loss value;
comparing the loss sum with a set loss threshold, and if the loss sum is not greater than the set loss threshold, determining that the prediction model converges; and if the loss sum is larger than a set loss threshold value, determining that the prediction model does not converge.
In the above scheme, before sending the training stopping instruction to the first terminal, the method further includes:
receiving the encrypted first gradient value and the encrypted second gradient value;
and determining a joint gradient value based on the encrypted first gradient value and the encrypted second gradient value, and sending the joint gradient value to the first terminal.
In a third aspect, the present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs any of the steps of the method described above.
In a fourth aspect, an embodiment of the present invention provides an aquatic product price prediction apparatus, where the apparatus includes: a processor and a memory for storing a computer program operable on the processor, wherein the processor is operable to perform any of the steps of the method described above when executing the computer program.
The embodiments of the invention provide a method and a device for predicting aquatic product prices and a computer storage medium. The method is applied to a first terminal and comprises: sending an encrypted first data set to an intermediate server platform, wherein the first data set comprises fishery historical data and historical sales data of a set aquatic product, and the historical sales data at least comprise historical unit parameter data; receiving the same characteristic parameters and the encrypted mapping model sent by the intermediate server platform, and determining sample data based on the first data set, the same characteristic parameters and the encrypted mapping model, wherein the same characteristic parameters are determined by the intermediate server platform according to the received first data set and second data set, and the second data set and the encrypted mapping model are sent to the intermediate server platform by a second terminal; determining a prediction model based on the sample data, determining a loss value based on the prediction model, and sending the encrypted loss value to the intermediate server platform, the loss value being used by the intermediate server platform to determine the convergence of the prediction model; receiving a training stopping instruction sent by the intermediate server platform and taking the prediction model as a target prediction model, the training stopping instruction being sent when the intermediate server platform determines that the prediction model converges; and predicting the unit parameter of the set aquatic product based on the target prediction model, the current-day fishing ground data and the current-day sales data. On this basis, the aquatic product price prediction method provided by the embodiment of the invention can guarantee the security of the data held by each terminal; moreover, each terminal can obtain more comprehensive sample data and therefore a more accurate prediction model, so that each terminal can accurately predict the price of the aquatic product.
Drawings
FIG. 1 is a schematic flow chart of a method for predicting the price of an aquatic product according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an LSTM recurrent neural network model according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of another aquatic product price forecasting method according to an embodiment of the present invention;
FIGS. 4A-4C are schematic diagrams of a specific application flow of the aquatic product price prediction method according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an aquatic product price prediction apparatus applied to a terminal side according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an aquatic product price forecasting apparatus applied to an intermediate server platform according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a hardware structure of the aquatic product price prediction apparatus according to the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the following describes specific technical solutions of the present invention in further detail with reference to the accompanying drawings in the embodiments of the present invention. The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
With the development of intelligent commerce, buyers want a faster, cheaper and better purchasing experience, and sellers likewise want to sell more, sell faster and trade securely. On this basis, the embodiment of the invention provides an aquatic product unit price prediction method based on a federated learning and long short-term memory model, so that aquatic product suppliers can accurately predict the unit price of aquatic products while their information remains secure when data are exchanged in the aquatic product market.
For a better understanding of the embodiments of the present invention, the related art will be briefly described below.
Federated Learning (FL) is a new foundational artificial intelligence technique. Its main research issue is how to carry out machine learning while protecting data privacy and meeting legal compliance requirements; its purpose is to enhance the effectiveness of a model by using the data of two or more parties while satisfying the requirement that the data themselves are not shared.
According to how the participants' data sets differ, federated learning is divided into Horizontal Federated Learning (HFL), Vertical Federated Learning (VFL) and Federated Transfer Learning (FTL), in which:
Horizontal federated learning: mainly used when the data sets of the two (or more) parties have a large overlap in user features but a small overlap in user groups. For example, two banks in different regions serve user groups drawn from their respective regions, so the intersection of their users is small; however, their businesses are very similar, i.e., their user features overlap considerably. In this case, horizontal federated learning can be used to build a model.
Vertical federated learning: mainly used when the data sets of the two (or more) parties have a large overlap in user groups but a small overlap in user features. For example, consider two different institutions, a bank in a certain region and an e-commerce company in the same region: their user groups both contain a large proportion of the local residents, so the user groups overlap considerably. However, the bank records the users' income, expenditure and credit ratings, while the e-commerce company records their browsing and purchase histories, i.e., the user features overlap little. In this case, vertical federated learning can be used to build a model.
Federated transfer learning: mainly used when the data sets of the two (or more) parties overlap little in both user groups and user features. For example, consider two different institutions, a bank located in China and an e-commerce company located in the United States: because of the geographic separation their user groups overlap little, and because the institutions are of different types their user features overlap only partially. In this case, effective federated learning requires introducing transfer learning to overcome the small scale of each party's data and the scarcity of labelled samples, thereby improving the effectiveness of the model.
Fig. 1 is a schematic flow chart of a method for predicting aquatic product prices according to an embodiment of the present invention, as shown in fig. 1, the method is applied to a first terminal, and includes:
s101: sending the encrypted first data set to the intermediate server platform; wherein the first data set comprises fishery historical data and historical sales data of a set aquatic product; the historical sales data at least comprises historical unit parameter data;
s102: receiving the same characteristic parameters and the encrypted mapping model sent by the intermediate server platform, and determining sample data based on the first data set, the same characteristic parameters and the encrypted mapping model; the same characteristic parameter is determined by the intermediate server platform according to the received first data set and the second data set, and the second data set and the encrypted mapping model are sent to the intermediate server platform by the second terminal;
S103: determining a prediction model based on the sample data, determining a loss value based on the prediction model, and sending an encrypted loss value to an intermediate server platform; wherein the loss value is used for the intermediate server platform to determine convergence of a predictive model;
s104: receiving a training stopping instruction sent by an intermediate server platform, and taking the prediction model as a target prediction model; wherein the training stopping instruction is sent when the intermediate server platform determines that the prediction model converges;
s105: and predicting unit parameters of the set aquatic product based on the target prediction model, the current-day fishing ground data and the current-day sales data.
It should be noted that the data sets provided by the aquatic product suppliers have a large overlap in features, but the dates on which the suppliers' fishing boats catch the set aquatic product and the dates on which they sell it differ between suppliers, that is, the overlap of user groups is small; therefore, a horizontal federated learning model can be established based on the data sets provided by the suppliers, so as to predict the unit price of aquatic products while protecting the security of the data provided by each supplier. In addition, aquatic products come in many types, such as the major categories of fish, shrimp and crab, and each major category can be divided into different sub-types; for example, fish can be divided into different species, and in practical applications the unit prices of different species of fish differ. Since the unit price of each aquatic product can be predicted independently with the aquatic product price prediction method provided by the embodiment of the invention, the embodiment of the invention only predicts the unit price of one specific aquatic product, namely the set aquatic product, in order to illustrate the inventive concept of the invention.
The historical unit parameter data mentioned here refers to the unit price of a set aquatic product, that is, the unit price of a certain set aquatic product is predicted in the embodiment of the present invention.
In some embodiments, the fishery historical data include the marine environment data and fishing data recorded for each day on which the set aquatic product was caught in the fishery over a past period of time. The marine environment data are the marine environment parameters of the sea area where the fishery is located, such as Sea Surface Temperature (SST), Sea Surface Height (SSH), Sea Surface Wind (SSW), Sea Surface Chlorophyll content (CHL) and Sea Surface Light Intensity (SSLI). The fishing data are the catch volume of the set aquatic product in the fishery for each day of that period, denoted C and measured in tons.
In some embodiments, the historical sales data are the data recorded when the set aquatic product from the fishery was sold over a past period of time, and at least include the unit price at which the set aquatic product was sold on each day of that period, denoted P, which is the unit parameter data; the sales volume, denoted W, may also be included.
Specifically, the obtained fishery historical data and historical sales data are sorted by date, and the fishery historical data and the historical sales data with the same date are then combined into one row to obtain the first data set.
Illustratively, assume that the first data set is

S^1 = {x^1_{i,z}, y^1_i},

where x^1_{i,z} is the data corresponding to the z-th characteristic parameter in the first data set, such as SST, SSH, SSW, CHL, C and W, with z = 1, 2, 3, 4, 5, 6; i is the serial number obtained after sorting by date, i = 1, 2, 3, ..., N, where N is a positive integer; and y^1_i is the data corresponding to the parameter to be predicted in the first data set, for example the data corresponding to the unit price of the set aquatic product, that is, the data corresponding to the unit parameter.
It should be noted that the first terminal is any electronic device with a data processing function, such as a computer, a server, and the like. As can be seen from the above description, the first data set is a data set that includes a plurality of characteristic parameters, and data corresponding to each characteristic parameter is sorted according to the time sequence. The first data set may be input to the first terminal by a first user. The second terminal may represent a different terminal than the first terminal, the second data set being from a different user than the first user. The second terminal obtains the data set in a similar manner as the first terminal, and the obtained data set includes similar data types, so that the following only takes two vendor terminals as an example to describe in detail the specific implementation of the embodiment of the present invention.
It should be noted that the intermediate server platform is a third-party platform independent of the first terminal and the second terminal. The first terminal and the second terminal can perform indirect data interaction through the intermediate server platform, so that the data security of the first terminal and the second terminal is guaranteed when federated learning is performed.
In some embodiments, prior to sending the encrypted first data set to the intermediate server platform, the method further comprises: the first terminal encrypts the first data set according to a set encryption mode; the set encryption mode may be sent to the first terminal by the intermediate server platform. The set Encryption mode may be a Homomorphic Encryption (HE) algorithm. Here, the same characteristic parameter refers to a characteristic parameter included in both the first data set and the second data set.
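For concreteness, the first terminal's encryption step can be sketched as follows. This is an illustrative sketch only: the concrete HE scheme (Paillier via the open-source `phe` package), the key handling and the example values are assumptions, while the description above only requires that the set encryption mode be a homomorphic encryption algorithm distributed by the intermediate server platform.

```python
# A minimal sketch, assuming Paillier homomorphic encryption via the `phe` package.
from phe import paillier

# The set encryption mode is modelled here as the intermediate server platform
# generating a Paillier key pair and sharing the public key with the terminals.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

def encrypt_dataset(rows, key):
    """Encrypt every numeric value of a per-date data set (list of rows)."""
    return [[key.encrypt(float(value)) for value in row] for row in rows]

# Hypothetical first data set: one row per date with SST, SSH, SSW, CHL, C, W, P.
first_data_set = [
    [28.95, 0.0885, 7.80, 0.0885, 100.0, 80.0, 12.0],
    [28.65, 0.0825, 7.80, 0.0825, 100.0, 50.0, 8.0],
]
encrypted_first_data_set = encrypt_dataset(first_data_set, public_key)
```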
For example, assume the first data set is

S^1 = {x^1_{i,k}, y^1_i}

and the second data set is

S^2 = {x^2_{j,k}, y^2_j},

where k indexes the characteristic parameters of each set. In this case, the first data set S1 and the second data set S2 both include the characteristic parameters with k = 2, 3, 4, 5, 6; that is, x^1_{i,k} and x^2_{j,k} with k = 2, 3, 4, 5, 6 are used to indicate the same characteristic parameters.
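A small sketch of how the intermediate server platform could determine the same characteristic parameters after decryption; the concrete characteristic-parameter names below are assumptions, not taken from the description.

```python
# The same characteristic parameters are taken as the intersection of the
# characteristic-parameter names present in both data sets.
first_set_features = ["SST", "SSH", "SSW", "CHL", "C", "W"]     # from the first terminal
second_set_features = ["SSH", "SSW", "CHL", "C", "W", "SSLI"]   # from the second terminal (hypothetical)

same_characteristic_parameters = [f for f in first_set_features if f in second_set_features]
# -> ["SSH", "SSW", "CHL", "C", "W"], i.e. the five shared parameters of the k = 2..6 example
```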
in some embodiments, the manner in which the second terminal obtains the encrypted mapping model may include the steps of:
the second terminal divides the second data set into a first subset and a second subset according to the same characteristic parameters; the first subset comprises the data corresponding to the same characteristic parameters in the second data set, and the second subset comprises the data corresponding to the non-identical characteristic parameters in the second data set, where the non-identical characteristic parameters are the characteristic parameters in the second data set other than the same characteristic parameters. The second terminal then constructs a mapping function from the first subset to the second subset to obtain a mapping model, such that each datum in the second subset can be represented by the first subset.
Then, the second terminal encrypts the mapping model according to the set encryption mode to obtain the encrypted mapping model. It should be noted that the set encryption mode used by the second terminal may also be sent by the intermediate server platform.
For example, taking S1 and S2 above as an example, the second data set is divided into a first subset and a second subset according to the same characteristic parameters. The first subset is

S' = {x^2_{j,k}}, k = 2, 3, 4, 5, 6,

and the second subset S'' consists of the data corresponding to the non-identical characteristic parameters in the second data set. At this time, a mapping function from S' to S'' needs to be constructed to obtain a mapping model, which is recorded as

F: S' → S''.

After obtaining the mapping model, the second terminal encrypts it according to the set encryption mode, and the obtained encrypted mapping model is denoted [[F]]. The encrypted mapping model is then sent to the intermediate server platform, which sends it on to the first terminal.
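Since the description does not fix the functional form of the mapping model F, the following sketch assumes an ordinary least-squares affine map fitted with scikit-learn on the second terminal, whose parameters are then encrypted element-wise with the assumed Paillier public key from the earlier sketch.

```python
# A minimal sketch of building and encrypting the mapping model F: S' -> S''
# (affine map assumed; not a form specified in the description).
import numpy as np
from sklearn.linear_model import LinearRegression

def build_mapping_model(shared_columns: np.ndarray, non_shared_columns: np.ndarray):
    """Fit one affine map from the shared columns S' to all non-shared columns S''."""
    model = LinearRegression()
    model.fit(shared_columns, non_shared_columns)   # multi-output regression
    return model

def encrypt_mapping_model(model, key):
    """Encrypt the fitted coefficients and intercepts value by value."""
    coef = np.atleast_2d(model.coef_)
    intercept = np.atleast_1d(model.intercept_)
    return {"coef": [[key.encrypt(float(w)) for w in row] for row in coef],
            "intercept": [key.encrypt(float(b)) for b in intercept]}

# e.g. encrypted_mapping_model = encrypt_mapping_model(build_mapping_model(s_prime, s_double_prime), public_key)
```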
In some embodiments, determining the sample data based on the first data set, the same characteristic parameters and the encrypted mapping model in S102 comprises:
determining completion sample data based on the same characteristic parameters and the encrypted mapping model;
and determining sample data based on the first data set and the completion sample data.
Here, after the first terminal obtains the encrypted mapping model, the first terminal substitutes the data corresponding to the same characteristic parameter in the first data set into the encrypted mapping model to obtain the completion sample data.
For example, taking S1 and S2 above as an example, the first terminal receives the encrypted mapping model [[F]] and substitutes the data corresponding to the same characteristic parameters in the first data set, i.e. {x^1_{i,k}} with k = 2, 3, 4, 5, 6, into the encrypted mapping model, thereby obtaining the completion sample data, denoted S~.
in practical application, the complemented sample data is added to the first data set to obtain the sample data.
Illustratively, assume S obtained in the manner described above1The complement sample data is
Figure BDA0002183145630000069
Then, at this time, the sample data
Figure BDA00021831456300000610
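Continuing the hypothetical names from the sketches above (first_data_set, encrypted_mapping_model and the Paillier keys), a minimal sketch of the completion step: Paillier supports ciphertext + ciphertext and ciphertext × plaintext, so an affine mapping can be evaluated directly on the encrypted parameters, and the completed values stay encrypted.

```python
# Positions of the same characteristic parameters within each row (an assumption).
SHARED_IDX = [1, 2, 3, 4, 5]

def complete_row(shared_row, encrypted_model):
    """Apply the encrypted affine mapping to one row of shared-parameter values."""
    completed = []
    for enc_coefs, enc_bias in zip(encrypted_model["coef"], encrypted_model["intercept"]):
        acc = enc_bias
        for enc_w, x in zip(enc_coefs, shared_row):
            acc = acc + enc_w * x      # homomorphic multiply-by-plaintext, then add
        completed.append(acc)
    return completed

# Column-wise concatenation of the first data set and the completed columns gives S = {S1, S~}.
sample_data = [row + complete_row([row[i] for i in SHARED_IDX], encrypted_mapping_model)
               for row in first_data_set]
```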
In some embodiments, determining a predictive model based on the sample data comprises: constructing a Long Short-Term Memory (LSTM) recurrent neural network model based on the sample data;
and training the LSTM recurrent neural network model based on the sample data to obtain a prediction model.
Here, the structure of the LSTM recurrent neural network model may include an input layer, a hidden layer, and an output layer as shown in fig. 2, in which:
the number of neurons in the input layer can be determined according to the number of parameters in the sample data; the parameters referred to here are all the parameters included in the sample data, for example, the sample data S contain 8 parameters, so 8 neurons can be set in the input layer. It should be noted that in S each row represents a group of samples, and N is the total number of sample groups.
The hidden layer may adopt an LSTM network. To some extent, a larger number of hidden layers can more effectively reduce the prediction error of the data corresponding to the parameter to be predicted and improve the prediction accuracy; for example, the parameter to be predicted may be the unit parameter in the first data set. However, increasing the number of hidden layers also complicates the structure of the LSTM recurrent neural network. In some embodiments, the hidden layer may comprise 2 layers. The number of neurons in each hidden layer also needs to be set, and can be chosen by the user according to the actual situation, for example according to the computing capability of the first terminal: when the computing capability of the first terminal is strong, more neurons may be selected, for example more than 20; when the computing capability of the first terminal is weak, fewer neurons may be selected, for example fewer than 20.
The number of neurons in the output layer can be set according to the user's needs; for example, if the user needs to predict several parameters in the first data set, a neuron can be arranged in the output layer for each of those parameters. For example, if in the sample data S the user needs to predict two parameters, such as the unit parameter y^1_i and one further parameter, two neurons can be placed in the output layer.
After the LSTM recurrent neural network model is built, the user can set model parameters such as the learning rate and the number of iterations on the first terminal. The learning rate can be selected according to the actual situation and generally takes a value in the range (0, 1); the number of iterations is the number of times the sample data are used for training and can also be set according to actual needs.
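A minimal Keras sketch of such a structure is given below; the look-back window, the number of units per hidden layer, the optimizer, the learning rate and the epoch count are assumed values rather than values taken from the description.

```python
# Sketch: input sized by the number of parameters in the sample data (8 in the example),
# two hidden LSTM layers, and an output layer with one neuron per parameter to predict.
import tensorflow as tf

N_FEATURES = 8    # parameters per sample group in S
LOOK_BACK = 7     # past days fed to the network for each prediction (assumption)
N_OUTPUTS = 1     # e.g. only the unit parameter (unit price) is predicted

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(LOOK_BACK, N_FEATURES)),
    tf.keras.layers.LSTM(20, return_sequences=True),   # hidden layer 1
    tf.keras.layers.LSTM(20),                           # hidden layer 2
    tf.keras.layers.Dense(N_OUTPUTS),                   # output layer
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss="mse")
# model.fit(x_train, y_train, epochs=100) would then train on the normalized sample data.
```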
It should be noted that, before training the LSTM recurrent neural network model based on the sample data, the first terminal needs to perform normalization processing on the sample data, and in some embodiments, the min-max normalization method may be used to perform normalization processing on the sample data.
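A small sketch of the min-max normalization and its inverse (used again in the prediction step below); the (0, 1) target range follows the description, the implementation details are assumptions.

```python
import numpy as np

def min_max_normalize(data: np.ndarray):
    """Scale each column of `data` into [0, 1]; also return the ranges for later inversion."""
    col_min, col_max = data.min(axis=0), data.max(axis=0)
    return (data - col_min) / (col_max - col_min), (col_min, col_max)

def min_max_denormalize(scaled: np.ndarray, ranges):
    """Undo the scaling, restoring values with physical meaning (e.g. a price in yuan/jin)."""
    col_min, col_max = ranges
    return scaled * (col_max - col_min) + col_min
```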
Then the user inputs the normalized sample data into the LSTM recurrent neural network model on the first terminal, and the first terminal trains the LSTM recurrent neural network model to obtain a prediction model. After the prediction model is obtained, a loss value can be calculated with the loss function (loss) of formula (1), e.g. a mean squared error over the normalized samples:

loss = (1/N) * Σ_{i=1..N} (y^{1'}_i − ŷ^{1'}_i)²    (1)

where N is the number of sample groups contained in the sample data, y^{1'}_i is the normalized data corresponding to the parameter to be predicted in the i-th group of samples, and ŷ^{1'}_i is the predicted value of that data obtained from the prediction model.
After the first terminal obtains the loss value, the first terminal encrypts the loss value according to a set mode and sends the encrypted loss value to the intermediate server platform, and the intermediate server platform determines the convergence of the obtained prediction model according to the encrypted loss value, which may include:
the intermediate server platform receives the encrypted second loss value sent by the second terminal; then determining the convergence of the prediction model based on the encrypted loss value and the encrypted second loss value, and specifically, acquiring loss sum by the intermediate server platform based on the received encrypted loss value and the encrypted second loss value;
comparing the loss sum with a set loss threshold, and if the loss sum is not greater than the set loss threshold, determining that the prediction model converges; and if the loss sum is larger than a set loss threshold value, determining that the prediction model does not converge.
It should be noted that, the above intermediate server platform obtains the loss sum based on the received encrypted loss value and the encrypted second loss value, and includes two ways: the intermediate service side platform decrypts the encrypted loss value and the encrypted second loss value by setting a decryption mode to obtain a loss value and a second loss value; the loss value and the second loss value are then added to obtain a loss sum. And the other method is that the intermediate service side platform firstly sums the encrypted loss value and the encrypted second loss value to obtain the encrypted loss sum, and then decrypts the encrypted loss sum by setting a decryption mode to obtain the loss sum. In practical applications, one of the two implementation manners may be selected. The set decryption method is the reverse of the set encryption method.
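As a sketch of the second way on the intermediate server platform, again with the assumed Paillier keys and an illustrative threshold:

```python
# Sum the two encrypted loss values homomorphically first, then decrypt once.
LOSS_THRESHOLD = 0.01   # the "set loss threshold", chosen in (0, 1) (illustrative value)

def prediction_model_converged(encrypted_loss_1, encrypted_loss_2, key):
    encrypted_loss_sum = encrypted_loss_1 + encrypted_loss_2   # homomorphic addition of ciphertexts
    loss_sum = key.decrypt(encrypted_loss_sum)                 # one decryption with the private key
    return loss_sum <= LOSS_THRESHOLD
```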
In some embodiments, the first terminal takes the prediction model as the target prediction model when it receives the training stopping instruction sent by the intermediate server platform; that is, when the training stopping instruction is received, the trained prediction model meets the requirements, the training of the LSTM recurrent neural network model may be stopped, and the prediction model obtained at this point is used as the target prediction model. The first terminal may then perform data prediction based on the target prediction model and the current data set, for example to predict y^1_i in the sample data S.
In some embodiments, before receiving the stop training instruction sent by the intermediate server platform, the method further includes:
receiving a combined gradient value sent by the intermediate server platform; the joint gradient value is determined by the intermediate server platform according to the received encrypted first gradient value and the encrypted second gradient value; the encrypted first gradient value is sent to the intermediate server platform by the first terminal; the encrypted second gradient value is sent to the intermediate server platform by the second terminal; updating sample data based on the joint gradient value; based on the updated sample data, the predictive model is updated.
In some embodiments, determining the joint gradient value based on the encrypted first gradient value and the encrypted second gradient value includes two ways: the intermediate server side platform decrypts the encrypted first gradient value and the encrypted second gradient value in a set decryption mode to obtain a first gradient value and a second gradient value; then adding the first gradient value and the second gradient value to obtain a combined gradient value; and the other is that the intermediate server side platform sums the encrypted first gradient value and the encrypted second gradient value to obtain an encrypted combined gradient value, and then decrypts the encrypted gradient value by setting a decryption mode to obtain the combined gradient value. In practical applications, one of the two implementation manners may be selected.
It should be noted that, before the receiving of the joint gradient value sent by the intermediate server platform, the first terminal may send the encrypted loss value to the intermediate server platform and also send the encrypted first gradient value to the intermediate server platform; or the first terminal can also send the encrypted first gradient value to the intermediate server platform after receiving the instruction of continuing training sent by the intermediate server platform.
In some embodiments, the first terminal updates sample data based on the joint gradient value, including:
when the first terminal receives the joint gradient value, the first terminal calculates the product between the joint gradient value and the learning rate;
and subtracting the product from each data in the sample data to obtain updated sample data.
After obtaining the updated sample data, the first terminal retrains the LSTM recurrent neural network model based on the updated sample data to update the prediction model, and then determines an updated loss value based on the updated prediction model. The first terminal encrypts the updated loss value in the set manner and sends the encrypted updated loss value to the intermediate server platform; the intermediate server platform receives it and determines whether the updated prediction model converges, using the same judgment process as described above, which is not repeated here. If the updated prediction model still does not converge, the update operation is repeated until the prediction model converges and the intermediate server platform sends the training stopping instruction to the first terminal.
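The first terminal's side of this loop can be sketched as follows; the hooks for training, encryption and transport are illustrative abstractions supplied by the caller rather than names from the description.

```python
def federated_training_loop(sample_data, train_and_eval, encrypt,
                            send_to_server, receive_from_server, learning_rate=0.01):
    """Train on the first terminal until the intermediate server platform says stop.

    `train_and_eval(sample_data)` is expected to train the LSTM on the (normalized)
    sample data and return (model, loss_value, gradient_value).
    """
    while True:
        model, loss, gradient = train_and_eval(sample_data)
        send_to_server({"loss": encrypt(loss), "gradient": encrypt(gradient)})
        message = receive_from_server()
        if message.get("type") == "stop_training":
            return model                                     # the target prediction model
        # Otherwise the joint gradient was returned: subtract its product with the
        # learning rate from every sample datum, then retrain in the next pass.
        step = learning_rate * message["joint_gradient"]
        sample_data = [[x - step for x in row] for row in sample_data]
```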
In practical applications, for S105, the following steps may be included:
Receiving the current fishing ground data and the current sales data;
carrying out normalization processing on the current-day fishery data and the current-day sales data to obtain normalized current-day fishery data and normalized current-day sales data;
inputting the normalized fishery data on the same day and the normalized sales data on the same day into the LSTM recurrent neural network model, and predicting unit parameters of the set aquatic product;
and performing reverse normalization processing on the predicted unit parameters of the set aquatic products to obtain the predicted values of the unit parameters of the set aquatic products.
The current-day fishing ground data contain the same characteristic parameters as the fishery historical data, and the current-day sales data contain the same characteristic parameters as the historical sales data. Here, the predicted value of the unit parameter of the set aquatic product is the future unit price of the set aquatic product.
In practical application, the normalization processing and the inverse normalization processing are inverse processes. The normalization processing can use the min-max standardization method to project the current-day fishing ground data and the current-day sales data into (0, 1); correspondingly, in plain terms, the inverse normalization processing restores the data in (0, 1) to values with physical meaning. For example, after the predicted unit parameter of the set aquatic product is inverse-normalized, the predicted value of the unit parameter of the set aquatic product in the future is obtained, that is, the future unit price of the set aquatic product.
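A sketch of this prediction step, reusing the min-max helpers sketched earlier; the windowing of the preceding days and the argument names are assumptions.

```python
import numpy as np

def predict_unit_price(target_model, recent_rows, today_row, feature_ranges, price_range):
    """`recent_rows`: the last LOOK_BACK - 1 feature rows; `today_row`: the current-day
    fishing ground data and sales data combined into one feature row."""
    window = np.vstack([recent_rows, today_row])                 # shape (LOOK_BACK, N_FEATURES)
    col_min, col_max = feature_ranges                             # ranges kept from training
    window_scaled = (window - col_min) / (col_max - col_min)      # normalization
    predicted_scaled = target_model.predict(window_scaled[np.newaxis, ...])
    return min_max_denormalize(predicted_scaled, price_range)     # back to a price in yuan/jin
```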
In some embodiments, after predicting the unit price of the set aquatic product in the future, the first terminal may send the predicted unit price of the set aquatic product to the intermediate server platform, receive the predicted unit price of the set aquatic product by the intermediate server platform, and push the predicted unit price of the set aquatic product to the client.
It should be noted that the first terminal sends the predicted unit price of the set aquatic product to the intermediate server platform, and the intermediate server platform then pushes the received unit price to the client terminal, so that users can learn the future unit price of the set aquatic product. The client terminal is a terminal held by a user who needs the set aquatic product, and the user using the client terminal can thus learn the future unit price of the set aquatic product. It should be understood that the type of the client terminal may be the same as the type of the first terminal, for example both may be mobile terminals such as mobile phones; or different, for example the client terminal may be a mobile terminal such as a mobile phone while the first terminal is a fixed terminal such as a computer.
Fig. 3 is a schematic flow chart of another aquatic product price prediction method provided in an embodiment of the present invention, and as shown in fig. 3, the method is applied to an intermediate server platform, and includes:
S301: receiving an encrypted first data set sent by a first terminal and an encrypted second data set sent by a second terminal; the first data set comprises fishery historical data and historical sales data of a set aquatic product; the historical sales data at least comprises historical unit parameter data;
s302: decrypting the encrypted first data set and the encrypted second data set to obtain a first data set and a second data set;
s303: determining a same feature parameter based on the first data set and the second data set;
s304: receiving an encrypted mapping model sent by a second terminal, and sending the same characteristic parameters and the encrypted mapping model to the first terminal;
s305: receiving an encrypted loss value sent by a first terminal, and determining the convergence of a prediction model based on the encrypted loss value;
s306: and when the prediction model is determined to be converged, sending a training stopping instruction to the first terminal.
It should be noted that the decryption performed by the intermediate server platform is the reverse process of the encryption. Since the intermediate server platform is the opposite end interacting with the first terminal, some terms and functions on the intermediate server platform side have already been described in detail in the above embodiments and can be understood from that description, so they are not repeated here.
In some embodiments, for S302, comprising:
and decrypting the encrypted first data set and the encrypted second data set according to a set decryption mode to obtain the first data set and the second data set.
The set decryption mode is the reverse process of the set encryption mode.
In other embodiments, determining the convergence of the prediction model based on the encrypted loss value in S305 includes:
receiving an encrypted second loss value sent by a second terminal;
determining a predictive model convergence based on the encrypted loss value and the encrypted second loss value.
In some embodiments, the determining the predictive model convergence based on the encrypted loss value and the encrypted second loss value comprises:
obtaining a loss sum based on the received encrypted loss value and the encrypted second loss value;
comparing the loss sum with a set loss threshold, and if the loss sum is not greater than the set loss threshold, determining that the prediction model converges; and if the loss sum is larger than a set loss threshold value, determining that the prediction model does not converge.
It should be noted that, the above intermediate server platform obtains the loss sum based on the received encrypted loss value and the encrypted second loss value, and includes two ways: the intermediate service side platform decrypts the encrypted loss value and the encrypted second loss value by setting a decryption mode to obtain a loss value and a second loss value; the loss value and the second loss value are then added to obtain a loss sum. And the other method is that the intermediate service side platform firstly sums the encrypted loss value and the encrypted second loss value to obtain the encrypted loss sum, and then decrypts the encrypted loss sum by setting a decryption mode to obtain the loss sum. In practical applications, one of the two implementation manners may be selected. The set decryption method is the reverse of the set encryption method.
The set loss threshold is set according to different prediction requirements, and is usually set to have a value range of (0, 1).
In some further embodiments, before sending the stop training instruction to the first terminal, the method further comprises: receiving the encrypted first gradient value and the encrypted second gradient value; and determining a joint gradient value based on the encrypted first gradient value and the encrypted second gradient value, and sending the joint gradient value to the first terminal.
In some embodiments, determining the joint gradient value based on the encrypted first gradient value and the encrypted second gradient value includes two ways: the intermediate server side platform decrypts the encrypted first gradient value and the encrypted second gradient value in a set decryption mode to obtain a first gradient value and a second gradient value; then adding the first gradient value and the second gradient value to obtain a combined gradient value; and the other is that the intermediate server side platform sums the encrypted first gradient value and the encrypted second gradient value to obtain an encrypted combined gradient value, and then decrypts the encrypted gradient value by setting a decryption mode to obtain the combined gradient value. In practical applications, one of the two implementation manners may be selected.
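The server-side aggregation of the gradients can be sketched in the same way, showing the second of the two ways with the assumed Paillier keys:

```python
# Sum the encrypted first and second gradient values homomorphically, then decrypt once
# to obtain the joint gradient value that is sent back to the first terminal.
def compute_joint_gradient(encrypted_gradient_1, encrypted_gradient_2, key):
    encrypted_joint = encrypted_gradient_1 + encrypted_gradient_2   # homomorphic addition
    return key.decrypt(encrypted_joint)
```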
To better understand the embodiment of the present invention, the following describes the idea of the present invention in detail by taking the prediction of the unit price of the set aquatic product based on the data sets provided by the first supplier and the second supplier as an example.
As shown in fig. 4A to 4C, which illustrate a flow chart of an aquatic product unit price prediction method provided by an embodiment of the present invention, the method includes:
s401: the method comprises the steps that a first supplier terminal receives a first data set input by a first supplier, wherein the first data set comprises first fishery historical data and first historical sales data; the second supplier terminal receives a second data set input by a second supplier, wherein the second data set comprises second fishing ground historical data and second historical sales data.
It should be noted that the first supplier terminal is a specific instance of the first terminal and has the same functions as the first terminal; likewise, the second supplier terminal has the same functions as the second terminal and is not described in detail. Each supplier terminal corresponds to one supplier, that is, the first supplier terminal corresponds to the first supplier and the second supplier terminal corresponds to the second supplier. It should be understood that the terms "first" and "second" are used here for convenience of description and are not intended to be limiting. The term "set aquatic product" means any one of the many varieties of aquatic products. The first supplier may own one fishing ground or several fishing grounds.
Therefore, in some embodiments, when the first supplier has only one fishery, the first historical sales data are the data recorded when the set aquatic product from the fishery was sold over a past period of time, including the unit price and the sales volume of the set aquatic product sold on each day of that period; the unit price is denoted P and measured in yuan per jin (yuan/jin), and the sales volume is denoted W and measured in tons. These data may be recorded in a table, as shown in Table 1 below.
TABLE 1
Time of sale P (Yuan/jin) W (ton)
20190101 12 80
…… …… ……
20191231 8 50
…… …… ……
It is understood that, to predict the unit price of the set aquatic product, it is necessary to obtain the factors that affect this unit price. According to the law of prices in economics, within a certain period of time the price of a commodity, or a type of commodity, is determined by its value and fluctuates around that value under the influence of the market supply and demand relationship. In other words, the unit price of the set aquatic product is affected by supply and demand, where "supply" refers to the quantity of the set aquatic product provided by each supplier, namely the catch volume, and "demand" refers to the quantity of the set aquatic product needed by buyers, namely the sales volume. In practical application, the catch volume of the set aquatic product is affected by the marine environment of the sea area where each supplier's fishery is located; therefore, in the embodiment of the present invention, the marine environment data corresponding to the fishery on the days when the set aquatic product was caught over a past period of time also need to be acquired.
In practical applications, the first fishery historical data include the marine environment data and fishing data recorded for each day on which the set aquatic product was caught in the fishery over a past period of time, where the marine environment data are the marine environment parameters of the sea area where the fishery is located, such as Sea Surface Temperature (SST), Sea Surface Height (SSH), Sea Surface Wind (SSW), Sea Surface Chlorophyll content (CHL) and Sea Surface Light Intensity (SSLI). When recorded, SST is usually measured in degrees Celsius (°C); SSH in millimetres (mm); SSW in metres per second (m/s); CHL in grams per cubic centimetre (g/cm³); and SSLI in lux (Lux). It should be understood that each of these marine environment parameters varies with time, i.e., the SST, SSH, SSW, CHL and SSLI of the sea area where the fishery is located differ at different times. In practical application, the marine environment data can also be recorded in a table. It should be noted that, because there are many kinds of marine environment parameters, in practice the marine environment parameters obtained by each aquatic product supplier include only one or several of them, and not all marine environment parameters can be obtained completely. Thus, for this fishery, the marine environment data of the sea area obtained by the first supplier may include SST, SSH, SSW and CHL, as shown in Table 2 below.
TABLE 2
Fishing time SST (℃) SSH (mm) SSW (m/s) CHL (g/cm³)
20190101 28.95 0.0885 7.80 0.0885
…… …… …… …… ……
20191231 28.65 0.0825 7.80 0.0825
…… …… …… …… ……
Here, the fishing data refers to the fishing amount corresponding to each day when the set aquatic product is fished in the fishing ground within a past period of time, and is recorded as C, and the measurement unit is ton. In practical applications, the fishing data may also be recorded using a table, as shown in table 3 below.
TABLE 3
Fishing time C (ton)
20190101 100
…… ……
20191231 100
…… ……
After the first fishing ground historical data and the first historical sales data are obtained, the two need to be fused into the first data set that the first supplier inputs into the first supplier terminal. Specifically, the obtained first fishing ground historical data and first historical sales data are sorted by date, and the entries of both that share the same date are combined into one row, yielding the first data set. That is, first fishing ground historical data and first historical sales data whose fishing date and sale date are the same are grouped into one row, as shown in Table 4.
TABLE 4
Date SST (℃) SSH (mm) SSW (m/s) CHL (g/cm³) C (ton) W (ton) P (yuan/jin)
20190101 28.95 0.0885 7.80 0.0885 100 80 12
…… …… …… …… …… …… …… ……
20191231 28.65 0.0825 7.80 0.0825 100 50 8
…… …… …… …… …… …… …… ……
Based on this, the first data set is:
S1 = (X1, X2, X3, X4, X5, X6, P)

wherein S1 represents the first data set; X1 is the column vector composed of the data recorded on different dates for the characteristic parameter SST in the first data set; X2, X3, X4, X5 and X6 are, likewise, the column vectors composed of the data recorded on different dates for the characteristic parameters SSH, SSW, CHL, the fishing amount C and the sales volume W, respectively; P is the column vector composed of the data recorded on different dates for the unit price P of the set aquatic product in the first data set. The entries of each column vector are indexed by q, the sequence number of the dates arranged in order, q = 1, 2, 3, …, N, with N a positive integer, so that Xt,q denotes the q-th entry of the t-th characteristic parameter.
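As an illustration only, the date-wise fusion of the fishing ground records and the sales records into the first data set could be sketched with pandas; the file and column names below are hypothetical, not prescribed by this embodiment.

```python
import pandas as pd

# Hypothetical CSV files holding Table 2/3-style fishing ground data and
# Table 1-style sales data; column names are illustrative only.
fishery = pd.read_csv("first_fishery_history.csv")   # date, SST, SSH, SSW, CHL, C
sales = pd.read_csv("first_sales_history.csv")       # date, P, W

# Keep only days present in both records (same fishing date and sale date),
# sort by date, and combine them into one row per date, as in Table 4.
first_data_set = (
    fishery.merge(sales, on="date", how="inner")
           .sort_values("date")
           .reset_index(drop=True)
)

# Columns SST, SSH, SSW, CHL, C, W correspond to X1..X6; P is the unit price.
print(first_data_set.head())
```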
It should be noted that tables 1-4 are merely exemplary of the first fishing ground historical data and the first historical sales data of the present invention, i.e., the data values in tables 1-4 do not limit the present invention itself. It should be noted that the first data set is the same as the first data set in the above embodiment, and each parameter in the first data set is simply replaced by a parameter with a physical meaning here, so as to implement the application of the aquatic product price prediction method provided by the above embodiment in a practical scenario.
In other embodiments, where the first supplier has a plurality of fishing grounds, each fishing ground corresponds to a data set stored in the form of Table 4, and the first data set then includes the first fishing ground historical data and first historical sales data of all of these fishing grounds, each stored in the form of Table 4.
For example, assuming that the first supplier has two fishing grounds A and B, the first data set includes two tables in the form of Table 4, namely Table 5 and Table 6.
TABLE 5
Date SST (℃) SSH (mm) SSW (m/s) CHL (g/cm³) C (ton) W (ton) P (yuan/jin)
XXi XX XX XX XX XX XX XX
…… …… …… …… …… …… …… ……
TABLE 6
Date SST (℃) SSH (mm) SSW (m/s) CHL (g/cm³) C (ton) W (ton) P (yuan/jin)
XXi XX XX XX XX XX XX XX
…… …… …… …… …… …… …… ……
XXi indicates a date on which the sale date and the fishing date coincide, in year-month-day format; XX represents the specific value of each characteristic parameter on the corresponding date and is obtained from the actual records. Table 5 corresponds to fishing ground A and can be expressed as S1A, the combination of the column vectors X1, X2, X3, X4, X5, X6 and P recorded for fishing ground A; Table 6 corresponds to fishing ground B and can be expressed as S1B, the corresponding combination recorded for fishing ground B.
At this time, S1 = (S1A, S1B). As can be seen from Tables 4, 5 and 6, whether the first supplier has one fishing ground or several, the first data set provided by the first supplier contains characteristic parameters of the following types: X1, X2, X3, X4, X5, X6.
It should be noted that the embodiment of the present invention employs horizontal federated learning. Horizontal federated learning is characterized by a large overlap of user features, which here means that the characteristic parameters overlap to a large extent, and a small overlap of the user population, which here means that the dates on which each supplier fishes and sells the set aquatic product are different. When horizontal federated learning is performed with the data sets provided by the suppliers, the characteristic parameters of the same type shared by those data sets need to be extracted first, to facilitate the subsequent federated learning. Moreover, the types of characteristic parameters contained in the data set provided by one supplier are the same regardless of the number of its fishing grounds; for example, the first data set provided by the first supplier contains the types X1, X2, X3, X4, X5, X6 whether it covers one fishing ground or several. Based on this, the embodiment of the present invention illustrates the idea of the invention by taking a first supplier with a single fishing ground as an example.
It should be noted that the first data set and the second data set are from different supplier terminals, and therefore, the characteristic parameters included in the two data sets are not identical.
In the present application scenario, the first data set is:
S1 = (X1, X2, X3, X4, X5, X6, P)

and the second data set may be

S2 = (X2, X3, X4, X5, X6, X7, P)

wherein S2 represents the second data set provided by the second supplier; the characteristic parameters X2, X3, X4, X5, X6 and the unit price P in S2 have the same meaning as the corresponding parameters of S1; X7 represents the Sea Surface Light Intensity (SSLI); the entries of the column vectors of S2 are indexed by j, the sequence number of the dates of the second data set arranged in order, j = 1, 2, 3, …, m. It should be noted that N and m may be equal or unequal. At this time, X1 (recorded only in the first data set) and X7 (recorded only in the second data set) are the characteristic parameters that are not identical between the first data set and the second data set.
S402: the first supplier terminal encrypts the first data set according to a set encryption mode and sends the encrypted first data set to the intermediate server platform; and the second supplier terminal encrypts the second data set according to the set encryption mode and sends the encrypted second data set to the intermediate server platform.
It should be noted that the set encryption method may be sent to the first provider terminal and the second provider terminal by the intermediate server platform. The set Encryption mode may be a Homomorphic Encryption (HE) algorithm.
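As an illustration only, and assuming the homomorphic scheme is Paillier as implemented by the python-paillier (phe) package (the embodiment only names homomorphic encryption in general), a supplier terminal could encrypt the values it uploads as follows:

```python
from phe import paillier

# In practice the key material would be generated and distributed together
# with the set encryption mode by the intermediate server platform.
public_key, private_key = paillier.generate_paillier_keypair()

# A supplier terminal encrypts one value of its data set before upload.
encrypted_sst = public_key.encrypt(28.95)

# The holder of the private key can recover the plaintext.
assert abs(private_key.decrypt(encrypted_sst) - 28.95) < 1e-9
```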
S403: the intermediate server platform receives the encrypted first data set and the encrypted second data set, decrypts the encrypted first data set and the encrypted second data set according to a set decryption mode, and obtains the first data set and the second data set;
S404: the intermediate server platform determines the same characteristic parameters based on the first data set and the second data set, and sends the same characteristic parameters to the first supplier terminal and the second supplier terminal.
It should be noted that the specific functions of the intermediate server platform are described above; the meaning of the same characteristic parameters is also described above and will not be described again. Here, the same characteristic parameters of the first data set and the second data set are:
the characteristic parameters Xk of the first data set and Xk of the second data set, where k = 2, 3, 4, 5, 6; that is, SSH, SSW, CHL, the fishing amount C and the sales volume W, which are recorded in both data sets, are the same characteristic parameters.
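A minimal sketch of how the intermediate server platform might determine the same characteristic parameters once the two data sets are decrypted; the column names are the hypothetical ones used in the earlier sketch.

```python
# Columns (characteristic parameters) of the decrypted data sets, as assumed above.
first_columns = ["SST", "SSH", "SSW", "CHL", "C", "W", "P"]
second_columns = ["SSH", "SSW", "CHL", "C", "W", "SSLI", "P"]

# The unit price P is the prediction target, not a characteristic parameter,
# so it is excluded before intersecting.
same_parameters = sorted(
    (set(first_columns) & set(second_columns)) - {"P"}
)
print(same_parameters)  # ['C', 'CHL', 'SSH', 'SSW', 'W'], i.e. X2..X6
```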
In some embodiments, the set decryption mode is the inverse of the set encryption mode.
S405: the first supplier terminal receives the same characteristic parameters, acquires a first mapping model based on the same characteristic parameters, encrypts the first mapping model according to the set encryption mode, and sends the encrypted first mapping model to the intermediate server platform; the second supplier terminal receives the same characteristic parameters, acquires a second mapping model based on the same characteristic parameters, encrypts the second mapping model according to the set encryption mode, and sends the encrypted second mapping model to the intermediate server platform.
It should be noted that the determination method of the first mapping model and the second mapping model and the method provided by the intermediate service platform may be the same as those described above, and are not described herein again.
Here, the first mapping model can be written as a mapping f1 from the same characteristic parameters of the first data set to its non-identical characteristic parameter X1 (SST):

X1,q = f1(X2,q, X3,q, X4,q, X5,q, X6,q), q = 1, 2, …, N

After encryption according to the set encryption mode, the encrypted first mapping model, denoted [[f1]], is obtained.

The second mapping model can be written as a mapping f2 from the same characteristic parameters of the second data set to its non-identical characteristic parameter X7 (SSLI):

X7,j = f2(X2,j, X3,j, X4,j, X5,j, X6,j), j = 1, 2, …, m

and the encrypted second mapping model, denoted [[f2]], is obtained after encryption in the same way.
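The embodiment does not fix the functional form of the mapping models; as an illustration only, a supplier terminal could fit a simple linear mapping from the same characteristic parameters to its non-identical characteristic parameter, for example with scikit-learn. The data frame and column names are the hypothetical ones from the earlier sketch, and encryption of the fitted model is omitted here.

```python
from sklearn.linear_model import LinearRegression

# Same characteristic parameters (X2..X6) shared by both data sets.
SAME_PARAMS = ["SSH", "SSW", "CHL", "C", "W"]

def fit_mapping_model(data_set, target_column):
    """Fit a mapping from the same characteristic parameters to the target
    (SST for the first terminal, SSLI for the second terminal)."""
    model = LinearRegression()
    model.fit(data_set[SAME_PARAMS], data_set[target_column])
    return model

# e.g. on the first terminal: first_mapping = fit_mapping_model(first_data_set, "SST")
# Its parameters (model.coef_, model.intercept_) would then be encrypted
# according to the set encryption mode before being sent via the platform.
```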
s406: the intermediate service side platform receives the encrypted first mapping model sent by the first supplier terminal and sends the encrypted first mapping model to the second supplier terminal; and receiving the encrypted second mapping model sent by the second provider terminal, and sending the encrypted second mapping model to the first provider terminal.
It should be noted that S406 is similar to S304 described above, and can be understood based on the description of S304 described above.
S407: the first supplier terminal receives the encrypted second mapping model sent by the intermediate server platform; determining first completion sample data based on the same characteristic parameters and the encrypted second mapping model; the second supplier terminal receives the encrypted first mapping model sent by the intermediate server platform; determining second completion sample data based on the same feature parameters and the encrypted first mapping model.
In some embodiments, when the first supplier terminal obtains the encrypted second mapping model, the data corresponding to the same characteristic parameters in the first data set are substituted into the encrypted second mapping model to obtain the first completion sample data, namely an estimate of the SSLI column for the dates of the first data set:

X7,q = f2(X2,q, X3,q, X4,q, X5,q, X6,q), q = 1, 2, …, N

Similarly, the second supplier terminal obtains the second completion sample data, namely an estimate of the SST column for the dates of the second data set:

X1,j = f1(X2,j, X3,j, X4,j, X5,j, X6,j), j = 1, 2, …, m
S408: the first supplier terminal determines first sample data based on the first completion sample data and the first data set, and carries out normalization processing on the first sample data; and the second supplier terminal determines second sample data based on the second completion sample data and the second data set, and performs normalization processing on the second sample data.
In practical applications, the first completion sample data is added to the first data set to obtain the first sample data.
By way of example, assume that the first completion sample data obtained for S1 in the above manner is the column vector X7 composed of the estimated SSLI values X7,q, q = 1, 2, …, N; then the first sample data is (X1, X2, X3, X4, X5, X6, X7, P).
In some embodiments, each of the first sample data needs to be projected to the interval [0, 1], and optionally, the first sample data is normalized by a min-max normalization method, where the calculation formula is as follows:
X't,q=(Xt,q-Xt,min)/(Xt,max-Xt,min) (2)
Wherein X't,q is the normalized value of the q-th datum of the t-th characteristic parameter in the first sample data, t = 1, 2, 3, 4, 5, 6, 7; q is the sequence number of the dates arranged in order, q = 1, 2, 3, …, N, N a positive integer; Xt,q is the q-th datum of the t-th characteristic parameter in the first sample data; Xt,min is the minimum of the data of the t-th characteristic parameter in the first sample data; Xt,max is the maximum of the data of the t-th characteristic parameter in the first sample data. Note that the unit price P of the set aquatic product in the first sample data is normalized in the same way as the characteristic parameters.
Illustratively, the first sample data (X1, X2, X3, X4, X5, X6, X7, P) becomes (X'1, X'2, X'3, X'4, X'5, X'6, X'7, P') after the normalization treatment.
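A minimal sketch of the min-max normalization of formula (2), assuming the first sample data are held in a pandas DataFrame as in the earlier sketches:

```python
def min_max_normalize(sample_data):
    """Project every column of the sample data onto [0, 1], per formula (2)."""
    col_min = sample_data.min()
    col_max = sample_data.max()
    normalized = (sample_data - col_min) / (col_max - col_min)
    # col_min and col_max are kept so the predicted unit price can later be
    # de-normalized by the inverse transformation.
    return normalized, col_min, col_max

# normalized_first_sample, mins, maxs = min_max_normalize(first_sample_data)
```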
It should be noted that the second provider terminal may perform the same processing as the first provider terminal, and details are not repeated herein.
S409: the first supplier terminal obtains a first prediction model based on the normalized first sample data; acquiring a first loss value based on the first prediction model, encrypting the first loss value according to a set encryption mode, and sending the encrypted first loss value to the intermediate server platform; the second supplier terminal obtains a second prediction model based on the normalized second sample data; and acquiring a second loss value based on the second prediction model, encrypting the second loss value according to a set encryption mode, and sending the encrypted second loss value to the intermediate server platform.
It should be noted that the first prediction model and the second prediction model are obtained in a similar manner, and the embodiment of the present invention is described only by taking the first prediction model as an example.
In some embodiments, the first vendor terminal obtains the first predictive model based on the normalized first sample data, including:
building an LSTM recurrent neural network model as shown in fig. 2, wherein in the application scenario, the number of neurons included in the input layer may be determined according to the number of factors affecting the unit price P of the set aquatic product, for example, the factors affecting the unit price P of the set aquatic product provided by the first supplier terminal include first fishery historical data and first historical sales data, that is: the characteristic parameters SST, SSH, SSW, CHL, C, W and the complementary SSLI contained in the first sample data; also, since the unit price P of a set aquatic product of the previous day also affects the unit price P of a set aquatic product of the next day, in some embodiments, the unit price P of a past set aquatic product is also selected as a factor for predicting the unit price P of a future set aquatic product. Based on this, in the embodiment of the present invention, the input layer includes a total of 8 inputs SST, SSH, SSW, CHL, C, W, SSLI, and P, that is, 8 neurons need to be set.
Specifically, the input of the LSTM recurrent neural network model is the normalized first sample data (X'1, X'2, X'3, X'4, X'5, X'6, X'7, P'), whose rows are indexed by q, the sequence number of the dates arranged in order, q = 1, 2, 3, 4, …, N. It should be noted that each row (X'1,q, X'2,q, …, X'7,q, P'q) represents one group of samples, and N is the total number of sample groups.
The hidden layers may be LSTM layers. Using more hidden layers can, to a certain extent, reduce the error in predicting the unit price P of the set aquatic product and improve the prediction accuracy; however, increasing the number of hidden layers also makes the structure of the LSTM recurrent neural network more complex. In some embodiments, the network may comprise 2 hidden layers. The number of neurons contained in each hidden layer also needs to be set, and the first supplier can set it according to the actual situation, for example according to the computing capability of the first supplier terminal: when the computing capability of the first supplier terminal is strong, more neurons may be selected, for example more than 20; when the computing capability of the first supplier terminal is weak, fewer neurons may be selected, for example fewer than 20.
The output layer outputs the unit price P of the set aquatic product in the present embodiment, and therefore only one neuron needs to be provided.
After the LSTM recurrent neural network model is built, the first supplier inputs a set learning rate, a number of iterations, and so on into the first supplier terminal. The learning rate can be selected according to the actual situation and generally takes a value in the range (0, 1); the number of iterations refers to the number of times the first sample data are used for training and can also be set according to actual needs.
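As an illustration only, an LSTM network with the structure just described (8 input neurons, 2 hidden LSTM layers, 1 output neuron) could be built, for example, with Keras; the layer sizes, learning rate and epoch count below are hypothetical placeholders chosen within the ranges mentioned above, not values prescribed by this embodiment.

```python
from tensorflow import keras
from tensorflow.keras import layers

N_FEATURES = 8   # SST, SSH, SSW, CHL, C, W, SSLI and the past unit price P
TIME_STEPS = 1   # here each day is treated as one sample group

model = keras.Sequential([
    layers.Input(shape=(TIME_STEPS, N_FEATURES)),
    layers.LSTM(20, return_sequences=True),  # first hidden LSTM layer
    layers.LSTM(20),                         # second hidden LSTM layer
    layers.Dense(1),                         # one output neuron: unit price P
])

model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.01),  # learning rate in (0, 1)
    loss="mse",                                           # mean squared error
)

# x: normalized first sample data reshaped to (N, TIME_STEPS, N_FEATURES);
# y: normalized unit price targets for the corresponding dates.
# model.fit(x, y, epochs=100)   # "epochs" plays the role of the iteration count
```

Here each day is fed as a single time step; a supplier could equally feed a window of several past days along the time dimension, which is a design choice left open by the embodiment.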
Then, the first supplier inputs the normalized first sample data into the LSTM recurrent neural network model for training to obtain a first prediction model, and, after obtaining the first prediction model, obtains a first loss value using the loss function (loss) given in formula (3) below.
loss = (1/N) ∑ (P'q - P̂q)², summed over q = 1, 2, …, N    (3)

wherein N is the number of sample groups contained in the first sample data; P'q is the normalized value of the unit price P of the set aquatic product in the q-th group of samples of the first sample data; P̂q is the predicted value of the unit price P of the set aquatic product for the q-th group obtained from the first prediction model.
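For illustration, a first loss value of this mean-squared-error form could be computed as follows; the array names are hypothetical.

```python
import numpy as np

def first_loss_value(p_true_normalized, p_predicted):
    """Mean squared error over the N sample groups, per formula (3)."""
    p_true_normalized = np.asarray(p_true_normalized, dtype=float)
    p_predicted = np.asarray(p_predicted, dtype=float)
    return float(np.mean((p_true_normalized - p_predicted) ** 2))

# loss_1 = first_loss_value(y_normalized, model.predict(x).ravel())
```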
S4010: the first supplier terminal encrypts the first loss value according to a set encryption mode and sends the encrypted first loss value to the intermediate server platform; the second supplier terminal encrypts the second loss value according to a set encryption mode and sends the encrypted second loss value to the intermediate server platform;
S4011: the intermediate server platform receives the encrypted first loss value and the encrypted second loss value; determining convergence of the first prediction model and the second prediction model based on the encrypted first loss value and the encrypted second loss value.
In some embodiments, the determining, by the intermediary server platform, the convergence of the first predictive model and the second predictive model based on the encrypted first loss value and the encrypted second loss value comprises: the intermediate server platform obtains loss sum based on the received encrypted first loss value and the encrypted second loss value; comparing the loss sum with a set loss threshold, and if the loss sum is not greater than the set loss threshold, determining that the first prediction model and the second prediction model are in a convergence state; and if the loss sum is larger than the set loss threshold value, determining that the first prediction model and the second prediction model are in an unconverged state.
It should be noted that the intermediate server platform can obtain the loss sum from the received encrypted first loss value and encrypted second loss value in two ways. In the first way, the intermediate server platform decrypts the encrypted first loss value and the encrypted second loss value with the set decryption mode to obtain the first loss value and the second loss value, and then adds them to obtain the loss sum. In the second way, the intermediate server platform first sums the encrypted first loss value and the encrypted second loss value to obtain an encrypted loss sum, and then decrypts the encrypted loss sum with the set decryption mode to obtain the loss sum. In practical applications, either of the two implementations may be selected. The set decryption mode is the inverse of the set encryption mode.
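A minimal sketch of the two ways, assuming the set encryption mode is an additively homomorphic scheme such as Paillier via the python-paillier (phe) package (an assumption; the embodiment only requires a homomorphic encryption algorithm), with illustrative numbers and simplified key handling:

```python
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Loss values encrypted on the supplier terminals before upload (hypothetical values).
enc_loss_1 = public_key.encrypt(0.042)
enc_loss_2 = public_key.encrypt(0.057)

# Way 1: decrypt each loss value, then add.
loss_sum_a = private_key.decrypt(enc_loss_1) + private_key.decrypt(enc_loss_2)

# Way 2: add the ciphertexts homomorphically, then decrypt the sum once.
loss_sum_b = private_key.decrypt(enc_loss_1 + enc_loss_2)

SET_LOSS_THRESHOLD = 0.05  # hypothetical threshold
converged = loss_sum_a <= SET_LOSS_THRESHOLD
```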
S4012: when the first prediction model and the second prediction model are in a convergence state, the intermediate service side platform sends a training stopping instruction to the first supplier terminal and the second supplier terminal;
S4013: the first supplier terminal receives the training stopping instruction to complete the training of the first prediction model; the second supplier terminal receives the training stopping instruction to complete the training of the second prediction model;
S4014: when the first prediction model and the second prediction model are in an unconverged state, the intermediate server platform sends a continuous training instruction to the first supplier terminal and the second supplier terminal;
S4015: when the first supplier terminal receives the continuous training instruction, it determines a first gradient value corresponding to the first loss value, encrypts the first gradient value according to the set encryption mode, and sends the encrypted first gradient value to the intermediate server platform; when the second supplier terminal receives the continuous training instruction, it determines a second gradient value corresponding to the second loss value, encrypts the second gradient value according to the set encryption mode, and sends the encrypted second gradient value to the intermediate server platform;
S4016: the intermediate server platform receives the encrypted first gradient value and the encrypted second gradient value, determines a joint gradient value based on the encrypted first gradient value and the encrypted second gradient value, and sends the joint gradient value to the first supplier terminal and the second supplier terminal.
In some embodiments, the joint gradient value can be determined from the encrypted first gradient value and the encrypted second gradient value in two ways. In the first way, the intermediate server platform decrypts the encrypted first gradient value and the encrypted second gradient value with the set decryption mode to obtain the first gradient value and the second gradient value, and then adds them to obtain the joint gradient value. In the second way, the intermediate server platform sums the encrypted first gradient value and the encrypted second gradient value to obtain an encrypted joint gradient value, and then decrypts it with the set decryption mode to obtain the joint gradient value. In practical applications, either of the two implementations may be selected.
S4017: receiving the joint gradient value by the first provider terminal; updating a first prediction model based on the joint gradient value; updating the first loss value based on the updated first prediction model, encrypting the updated first loss value according to a set encryption mode, and sending the encrypted updated first loss value to the intermediate server platform; receiving the joint gradient value by a second provider terminal; updating a second prediction model based on the joint gradient value; and updating the second loss value based on the updated second prediction model, encrypting the updated second loss value according to a set encryption mode, and sending the encrypted updated second loss value to the intermediate server platform.
In some embodiments, the first provider terminal updates the first predictive model based on the joint gradient value, including: the first provider terminal updating the first sample data based on the joint gradient value; an updated first predictive model is determined based on the updated first sample data. Specifically, the updating, by the first provider terminal, the first sample data based on the joint gradient value includes:
when the first provider terminal receives the joint gradient value, the first provider terminal calculates a product between the joint gradient value and a learning rate;
and subtracts the product from each datum in the first sample data to obtain the updated first sample data.
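A minimal sketch of this update, assuming the normalized first sample data are held in a NumPy array and the joint gradient value and the learning rate are scalars as described above:

```python
import numpy as np

def update_sample_data(sample_data, joint_gradient, learning_rate):
    """Subtract learning_rate * joint_gradient from every datum (step S4017)."""
    sample_data = np.asarray(sample_data, dtype=float)
    return sample_data - learning_rate * joint_gradient

# updated_first_sample = update_sample_data(normalized_first_sample, joint_grad, 0.01)
```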
After obtaining the updated first sample data, the first supplier terminal retrains the first prediction model in the manner of S409 to obtain an updated first prediction model, and then determines an updated first loss value based on the updated first prediction model. It should be noted that the second supplier terminal may update the second prediction model in the same manner, which is not described herein again.
S4018: the intermediate server platform receives the encrypted updated first loss value and the encrypted updated second loss value, and determines the convergence of the updated first prediction model and the updated second prediction model based on the encrypted updated first loss value and the encrypted updated second loss value;
It should be noted that, this step S4018 is the same as the judgment process in step S4011, and is not described herein again.
S4019: when the updated first prediction model and the updated second prediction model are in a convergence state, the intermediate service side platform sends a training stopping instruction to the first supplier terminal and the second supplier terminal;
S4020: executing S4013;
S4021: when the updated first prediction model and the updated second prediction model are in an unconverged state, the intermediate server platform sends a continuous training instruction to the first supplier terminal and the second supplier terminal;
S4022: S4015-S4018 are executed repeatedly until the first prediction model and the second prediction model are in a convergence state, whereupon the intermediate server platform sends a training stopping instruction to the first supplier terminal and the second supplier terminal;
S4023: the first supplier terminal predicts the unit price of the set aquatic product based on the obtained first prediction model and the current-day fishing ground data and current-day sales data it receives.
Specifically, the first supplier inputs the current-day fishing ground data and the current-day sales data into the first supplier terminal. The first supplier terminal normalizes the current-day fishing ground data and the current-day sales data in the normalization manner described above, and inputs the normalized data into the first prediction model, i.e. the LSTM recurrent neural network model, to obtain an output result for the future, for example for the next day. The first supplier terminal then performs inverse normalization on the output result to obtain the future unit price P of the set aquatic product, for example the unit price P for the next day, where the inverse normalization is the inverse of the normalization described above.
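A minimal sketch of this prediction step, reusing the hypothetical model from the earlier sketch and the per-column minima and maxima kept by the normalization sketch:

```python
FEATURES = ["SST", "SSH", "SSW", "CHL", "C", "W", "SSLI", "P"]  # 8 model inputs

def predict_unit_price(model, today, col_min, col_max):
    """Normalize today's data, run the LSTM model, then de-normalize the output.

    today, col_min, col_max: pandas Series indexed by the feature names above;
    col_min and col_max come from the training-time normalization.
    """
    normalized = (today[FEATURES] - col_min[FEATURES]) / (col_max[FEATURES] - col_min[FEATURES])
    x = normalized.to_numpy(dtype=float).reshape(1, 1, len(FEATURES))
    p_normalized = float(model.predict(x)[0, 0])
    # Inverse of the min-max normalization applied to the unit price P.
    return p_normalized * (col_max["P"] - col_min["P"]) + col_min["P"]
```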
S4024: the method comprises the steps that a first supplier terminal sends a predicted unit price of a set aquatic product to an intermediate server platform;
s4025: and the intermediate server platform receives the predicted unit price of the set aquatic product and pushes the predicted unit price of the set aquatic product to the client.
Here, the first supplier terminal sends the predicted future unit price of the set aquatic product to the intermediate server platform, and the intermediate server platform pushes the received unit price to the client terminal, where the client terminal is a terminal held by a user who needs the set aquatic product, so that the user of the client terminal can conveniently learn the first supplier's unit price for the set aquatic product. It should be understood that the type of the client terminal may be the same as that of the first supplier terminal, for example both mobile terminals such as mobile phones, or different, for example the client terminal being a mobile terminal such as a mobile phone and the first supplier terminal being a fixed terminal such as a computer.
The aquatic product price prediction method provided by the embodiment of the invention can ensure the safety of data in each terminal; and each terminal can obtain more comprehensive sample data, so that more accurate prediction models can be obtained at each terminal, and each terminal can predict the price of the set aquatic product.
Based on the same inventive concept, fig. 5 is a schematic structural diagram of an aquatic product price prediction apparatus applied to a terminal side according to an embodiment of the present invention, and as shown in fig. 5, the aquatic product price prediction apparatus 50 applied to a first terminal includes: a first sending module 501, a first receiving module 502, a first determining module 503, and a predicting module 504, wherein:
The first sending module 501 is configured to send the encrypted first data set to the intermediate server platform, where the first data set includes historical data and historical sales data of a fishing ground for the set aquatic product, and the historical sales data at least include historical unit parameter data; it is further configured to send the encrypted loss value to the intermediate server platform, where the loss value is used by the intermediate server platform to determine the convergence of the prediction model;
the first receiving module 502 is configured to receive the same characteristic parameters and the encrypted mapping model sent by the intermediate server platform; it is further configured to receive the training stopping instruction sent by the intermediate server platform and to take the prediction model as the target prediction model, where the training stopping instruction is sent when the intermediate server platform determines that the prediction model converges;
the first determining module 503 is configured to determine sample data based on the first data set, the same characteristic parameters, and the encrypted mapping model, where the same characteristic parameters are determined by the intermediate server platform according to the received first data set and second data set, and the second data set and the encrypted mapping model are sent to the intermediate server platform by the second terminal; it is further configured to determine a prediction model based on the sample data and to determine a loss value based on the prediction model;
the prediction module 504 is configured to predict unit parameters of the set aquatic product based on the target prediction model, the data of the fishing ground on the same day, and the sales data on the same day.
In some embodiments, the first determining module 503 is specifically configured to: determining completion sample data based on the same characteristic parameters and the encrypted mapping model; and determining sample data based on the first data set and the completion sample data.
In some embodiments, the first determining module is further specifically configured to: constructing a long-short term memory (LSTM) recurrent neural network model based on the sample data; and training the LSTM recurrent neural network model based on the sample data to obtain a prediction model.
In some embodiments, the aquatic product price prediction apparatus 50 further includes: an update module 505, wherein:
the first receiving module 502 is further configured to receive a continuous training instruction sent by the intermediate server platform, where the continuous training instruction is sent when the intermediate server platform determines that the prediction model has not converged; it is also configured to receive the joint gradient value sent by the intermediate server platform, where the joint gradient value is determined by the intermediate server platform according to the received encrypted first gradient value and encrypted second gradient value, and the encrypted second gradient value is sent to the intermediate server platform by the second terminal;
the first determining module 503 is further configured to determine a first gradient value corresponding to the loss value;
the first sending module 501 is further configured to send the encrypted first gradient value to the intermediate server platform;
the updating module 505 is configured to update the prediction model based on the joint gradient value.
In some embodiments, the update module 505 is specifically configured to: updating sample data based on the joint gradient value; based on the updated sample data, the predictive model is updated.
It should be noted that the process of predicting the aquatic product price by the aquatic product price predicting device 50 is similar to the aquatic product price predicting method applied to the first terminal, and therefore, the specific implementation process and implementation principle of the aquatic product price predicting device 50 can refer to the foregoing method, implementation process and description of the implementation principle, and repeated parts are not described again.
Based on the same inventive concept, fig. 6 is a schematic structural diagram of an aquatic product price prediction apparatus applied to an intermediate service platform according to an embodiment of the present invention, and as shown in fig. 6, the aquatic product price prediction apparatus 60 applied to the intermediate service platform includes: a second receiving module 601, a decrypting module 602, a second determining module 603, and a second sending module 604, wherein:
the second receiving module 601 is configured to receive the encrypted first data set sent by the first terminal and the encrypted second data set sent by the second terminal, where the first data set includes historical data and historical sales data of a fishing ground for the set aquatic product, and the historical sales data at least include historical unit parameter data; it is further configured to receive the encrypted mapping model sent by the second terminal, and to receive the encrypted loss value sent by the first terminal, based on which the convergence of the prediction model is determined;
the decryption module 602 is configured to decrypt the encrypted first data set and the encrypted second data set to obtain a first data set and a second data set.
A second determining module 603 configured to determine a same feature parameter based on the first data set and the second data set; further for determining a convergence of a predictive model based on the encrypted loss value;
A second sending module 604, configured to send the same characteristic parameters and the encrypted mapping model to the first terminal; it is further configured to send a training stopping instruction to the first terminal when it is determined that the prediction model converges.
It should be noted that the aquatic product price prediction process of the aquatic product price prediction apparatus 60 is similar to the aquatic product price prediction method applied to the intermediate service platform.
In some embodiments, the decryption module 602 is specifically configured to decrypt the encrypted first data set and the encrypted second data set according to a set decryption manner, so as to obtain the first data set and the second data set.
In some embodiments, the second receiving module 601 is further configured to receive an encrypted second loss value sent by a second terminal;
the second determining module 603 is specifically configured to: determining a predictive model convergence based on the encrypted loss value and the encrypted second loss value.
In some embodiments, the second determining module 603 is specifically configured to obtain a loss sum based on the received encrypted loss value and the encrypted second loss value; comparing the loss sum with a set loss threshold, and if the loss sum is not greater than the set loss threshold, determining that the prediction model converges; and if the loss sum is larger than a set loss threshold value, determining that the prediction model does not converge.
In some embodiments, the second receiving module 601 is further configured to receive the encrypted first gradient value and the encrypted second gradient value;
the second determining module 603 is further configured to determine a joint gradient value based on the encrypted first gradient value and the encrypted second gradient value;
the second sending module 604 is further configured to send the joint gradient value to the first terminal.
It should be noted that the process of predicting the aquatic product price by the aquatic product price predicting device 60 is similar to the aquatic product price predicting method applied to the intermediate service platform, and therefore, the specific implementation process and implementation principle of the aquatic product price predicting device 60 can refer to the foregoing method, implementation process and description of the implementation principle, and repeated details are not repeated.
The aquatic product price prediction device provided by the embodiment of the invention can ensure the safety of data in each terminal; and each terminal can obtain more comprehensive sample data, so that more accurate prediction models can be obtained at each terminal, and each terminal can predict the price of the set aquatic product.
Embodiments of the present invention further provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the foregoing method embodiments, and the foregoing storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The embodiment of the invention also provides a device for predicting the price of aquatic products, which comprises: a processor and a memory for storing a computer program capable of running on the processor, wherein the processor is configured to execute the steps of the above-described method embodiments stored in the memory when running the computer program.
Fig. 7 is a schematic diagram of a hardware structure of an aquatic product price prediction apparatus according to an embodiment of the present invention, where the aquatic product price prediction apparatus 70 includes: the at least one processor 701, the memory 702, and optionally, the aquatic product price predicting apparatus 70 may further include at least one communication interface 703, and the various components in the aquatic product price predicting apparatus 70 are coupled together through a bus system 704, and it is understood that the bus system 704 is used to implement connection communication between these components. The bus system 704 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled in fig. 7 as the bus system 704.
It will be appreciated that the memory 702 can be either volatile memory or nonvolatile memory, and can include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a ferromagnetic random access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disk, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be a disk memory or a tape memory. The volatile Memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDRSDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 702 described in connection with the embodiments of the invention is intended to comprise, without being limited to, these and any other suitable types of memory.
The memory 702 in the embodiment of the present invention is used to store various types of data to support the operation of the aquatic product price prediction apparatus 70. Examples of such data include any computer program used to operate on the aquatic product price prediction apparatus 70, as well as stored sample data, prediction models, and the like; a program implementing a method of an embodiment of the present invention may be contained in the memory 702.
The method disclosed in the above embodiments of the present invention may be applied to the processor 701, or implemented by the processor 701. The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The Processor may be a general purpose Processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. The processor may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed by the embodiment of the invention can be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium, the storage medium being located in the memory; the processor reads the information in the memory and completes the steps of the foregoing method in combination with its hardware.
In an exemplary embodiment, the aquatic product price prediction apparatus 70 may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, Micro Controllers (MCUs), microprocessors (microprocessors), or other electronic components for performing the above-described methods.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment. In addition, all the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (10)

1. An aquatic product price prediction method applied to a first terminal, the method comprising:
sending the encrypted first data set to the intermediate server platform; wherein the first data set comprises historical data and historical sales data of a fishery for setting aquatic products; the historical sales data at least comprises historical unit parameter data;
receiving the same characteristic parameters and the encrypted mapping model sent by the intermediate server platform, and determining sample data based on the first data set, the same characteristic parameters and the encrypted mapping model; the same characteristic parameter is determined by the intermediate server platform according to the received first data set and the second data set, and the second data set and the encrypted mapping model are sent to the intermediate server platform by the second terminal;
The encrypted mapping model is determined by the steps of:
dividing, by the second terminal, the second data set into a first subset and a second subset according to the same feature parameter; a mapping function from the first subset to the second subset is established to obtain a mapping model, the mapping model is encrypted according to a set encryption mode based on the second terminal to obtain an encrypted mapping model, and the encrypted mapping model is sent to the intermediate server platform;
the first subset comprises data corresponding to the same characteristic parameters in the second data set; the second subset comprises data corresponding to the non-identical characteristic parameters in the second data set; the non-identical characteristic parameters refer to characteristic parameters in the second data set except identical characteristic parameters;
said determining sample data based on said first data set, said same feature parameters, and said encrypted mapping model, comprising: determining completion sample data based on the same characteristic parameters and the encrypted mapping model; determining sample data based on the first data set and the completion sample data;
determining a prediction model based on the sample data, determining a loss value based on the prediction model, and sending an encrypted loss value to an intermediate server platform; wherein the loss value is used for the intermediate server platform to determine convergence of a predictive model;
Receiving a training stopping instruction sent by an intermediate server platform, and taking the prediction model as a target prediction model; wherein the training stopping instruction is sent when the intermediate server platform determines that the prediction model converges;
and predicting unit parameters of the set aquatic product based on the target prediction model, the current-day fishing ground data and the current-day sales data.
2. The method of claim 1, wherein said determining a predictive model based on said sample data comprises:
constructing a long-short term memory (LSTM) recurrent neural network model based on the sample data;
and training the LSTM recurrent neural network model based on the sample data to obtain a prediction model.
3. The method of claim 2, wherein predicting unit parameters of a set water product based on the target prediction model, the day fishing ground data, and the day sales data comprises:
receiving the current fishing ground data and the current sales data;
carrying out normalization processing on the current-day fishery data and the current-day sales data to obtain normalized current-day fishery data and normalized current-day sales data;
Inputting the normalized fishery data on the same day and the normalized sales data on the same day into the LSTM recurrent neural network model, and predicting unit parameters of the set aquatic product;
and performing reverse normalization processing on the predicted unit parameters of the set aquatic products to obtain the predicted values of the unit parameters of the set aquatic products.
4. The method of claim 1, wherein prior to receiving a stop training instruction sent by an intermediate server platform, the method further comprises:
receiving a combined gradient value sent by the intermediate server platform; the joint gradient value is determined by the intermediate server platform according to the received encrypted first gradient value and the encrypted second gradient value; the encrypted first gradient value is sent to the intermediate server platform by the first terminal; the encrypted second gradient value is sent to the intermediate server platform by the second terminal;
updating sample data based on the joint gradient value;
based on the updated sample data, the predictive model is updated.
5. An aquatic product price prediction method applied to an intermediate server platform, the method comprising:
receiving an encrypted first data set sent by a first terminal and an encrypted second data set sent by a second terminal; the first data set comprises historical data and historical sales data of a fishery of set aquatic products; the historical sales data at least comprises historical unit parameter data;
Decrypting the encrypted first data set and the encrypted second data set to obtain a first data set and a second data set;
determining a same feature parameter based on the first data set and the second data set;
receiving an encrypted mapping model sent by a second terminal, and sending the same characteristic parameters and the encrypted mapping model to the first terminal;
wherein the encrypted mapping model is obtained by:
dividing, by the second terminal, the second data set into a first subset and a second subset according to the same feature parameter; constructing a mapping function from the first subset to the second subset to obtain a mapping model; encrypting the mapping model according to a set encryption mode based on the second terminal to obtain an encrypted mapping model, and sending the encrypted mapping model to the intermediate server platform;
the first subset comprises data corresponding to the same characteristic parameters in the second data set; the second subset comprises data corresponding to the non-identical characteristic parameters in the second data set; the non-identical characteristic parameters refer to characteristic parameters in the second data set except identical characteristic parameters;
receiving an encrypted loss value sent by a first terminal, and determining the convergence of a prediction model based on the encrypted loss value;
And when the prediction model is determined to be converged, sending a training stopping instruction to the first terminal.
6. The method of claim 5, wherein determining convergence of a predictive model based on the encrypted loss values comprises:
receiving an encrypted second loss value sent by a second terminal;
determining a predictive model convergence based on the encrypted loss value and the encrypted second loss value.
7. The method of claim 6, wherein determining predictive model convergence based on the encrypted loss value and the encrypted second loss value comprises:
obtaining a loss sum based on the received encrypted loss value and the encrypted second loss value;
comparing the loss sum with a set loss threshold, and if the loss sum is not greater than the set loss threshold, determining that the prediction model converges; and if the loss sum is larger than a set loss threshold value, determining that the prediction model does not converge.
8. The method of claim 6, wherein prior to sending the stop training instruction to the first terminal, the method further comprises:
receiving the encrypted first gradient value and the encrypted second gradient value;
and determining a joint gradient value based on the encrypted first gradient value and the encrypted second gradient value, and sending the joint gradient value to the first terminal.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 4 or 5 to 8.
10. An aquatic product price prediction device, comprising: a processor and a memory for storing a computer program operable on the processor, wherein the processor is operable to perform the steps of the method of any one of claims 1 to 4 or 5 to 8 when the computer program is executed.
CN201910804169.5A 2019-08-28 2019-08-28 Aquatic product price prediction method and device and computer storage medium Active CN110598918B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910804169.5A CN110598918B (en) 2019-08-28 2019-08-28 Aquatic product price prediction method and device and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910804169.5A CN110598918B (en) 2019-08-28 2019-08-28 Aquatic product price prediction method and device and computer storage medium

Publications (2)

Publication Number Publication Date
CN110598918A CN110598918A (en) 2019-12-20
CN110598918B true CN110598918B (en) 2021-06-11

Family

ID=68856389

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910804169.5A Active CN110598918B (en) 2019-08-28 2019-08-28 Aquatic product price prediction method and device and computer storage medium

Country Status (1)

Country Link
CN (1) CN110598918B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113112307A (en) * 2021-04-30 2021-07-13 欧冶云商股份有限公司 Steel price prediction method, device, equipment and medium based on federal learning

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105844353A (en) * 2016-03-22 2016-08-10 中国农业大学 Aquatic product price prediction method and device
WO2018014658A1 (en) * 2016-07-22 2018-01-25 上海海洋大学 Ommastrephidae central fishing ground prediction method
KR20190002987U (en) * 2018-04-30 2019-12-06 임락견 Optimum on-line distribution structure of agricultural and marine products from big-data based on cooperative union of regional distribution specialists

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107181776B (en) * 2016-03-10 2020-04-28 华为技术有限公司 Data processing method and related equipment and system
CN110110932A (en) * 2019-05-09 2019-08-09 上汽安吉物流股份有限公司 Order forecast method and device, logistics system and computer-readable medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105844353A (en) * 2016-03-22 2016-08-10 中国农业大学 Aquatic product price prediction method and device
WO2018014658A1 (en) * 2016-07-22 2018-01-25 上海海洋大学 Ommastrephidae central fishing ground prediction method
KR20190002987U (en) * 2018-04-30 2019-12-06 임락견 Optimum on-line distribution structure of agricultural and marine products from big-data based on cooperative union of regional distribution specialists

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on an aquatic product price prediction model based on neural networks; 贺艳辉 (He Yanhui) et al.; 《农业网络信息》 (Agricultural Network Information); 2010-11-26, No. 11, pp. 20-24 *

Also Published As

Publication number Publication date
CN110598918A (en) 2019-12-20

Similar Documents

Publication Publication Date Title
TWI788529B (en) Credit risk prediction method and device based on LSTM model
TWI689841B (en) Data encryption, machine learning model training method, device and electronic equipment
CN106462795B (en) System and method for allocating capital to trading strategies for big data trading in financial markets
CN107358247A (en) A kind of method and device for determining to be lost in user
Shevchenko et al. A unified pricing of variable annuity guarantees under the optimal stochastic control framework
Bousoño-Calzón et al. On the economic significance of stock market prediction and the no free lunch theorem
CN108734587A (en) The recommendation method and terminal device of financial product
CN110598918B (en) Aquatic product price prediction method and device and computer storage medium
Buff Uncertain volatility models: theory and application
Zhang The value of Monte Carlo model-based variance reduction technology in the pricing of financial derivatives
Belak et al. Worst-case portfolio optimization with proportional transaction costs
Nouri et al. Implementation of the modified Monte Carlo simulation for evaluate the barrier option prices
Eberlein et al. A simple stochastic rate model for rate equity hybrid products
Aseeri Effective short-term forecasts of Saudi stock price trends using technical indicators and large-scale multivariate time series
Raddant et al. Transitions in the stock markets of the US, UK and Germany
Malyscheff et al. Natural gas storage valuation via least squares Monte Carlo and support vector regression
EWALD Derivatives on nonstorable renewable resources: fish futures and options, not so fishy after all
Song et al. Stochastic optimization methods for buying-low-and-selling-high strategies
US20170177767A1 (en) Configuration of large scale advection diffusion models with predetermined rules
CN114219184A (en) Product transaction data prediction method, device, equipment, medium and program product
Jeong et al. Nonparametric estimation of value-at-risk
KR102102788B1 (en) Artificial intelligence based financial product ordering system and method
Shokrollahi Subdiffusive fractional Black–Scholes model for pricing currency options under transaction costs
Brody et al. Rational term structure models with geometric Lévy martingales
Paulot et al. One-dimensional pricing of CPPI

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 315500 No.98, Huiming Road, economic development zone, Fenghua District, Ningbo City, Zhejiang Province

Patentee after: Ningbo haihaixian Information Technology Co.,Ltd.

Address before: 315500 No.98, Huiming Road, economic development zone, Fenghua District, Ningbo City, Zhejiang Province

Patentee before: NINGBO HAISHANGXIAN INFORMATION TECHNOLOGY Co.,Ltd.

CP01 Change in the name or title of a patent holder