
Bank loan model construction method, loan amount calculation method, device and system

Info

Publication number
CN116051260A
Authority
CN
China
Prior art keywords: model, local, local model, gradient parameters, gradient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211570838.5A
Other languages
Chinese (zh)
Inventor
王湘灵
梁观术
马希佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Citic Bank Corp Ltd
Original Assignee
China Citic Bank Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Citic Bank Corp Ltd filed Critical China Citic Bank Corp Ltd
Priority to CN202211570838.5A priority Critical patent/CN116051260A/en
Publication of CN116051260A publication Critical patent/CN116051260A/en
Pending legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 - Network architectures or network communication protocols for network security
    • H04L63/04 - ... for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0407 - ... wherein the identity of one or more communicating identities is hidden
    • H04L63/0414 - ... wherein the identity of one or more communicating identities is hidden during transmission, i.e. the party's identity is protected against eavesdropping, e.g. by using temporary identifiers, but is known to the other party or parties involved in the communication
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/008 - ... involving homomorphic encryption
    • H04L2209/00 - Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
    • H04L2209/56 - Financial cryptography, e.g. electronic payment or e-cash

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)

Abstract

The present invention relates to the field of data processing technologies, and in particular, to a method for constructing a bank lending model, and a method, device and system for calculating a lending amount. The construction method includes: generating local model initial gradient parameters for the multi-party institutions; sending the local model initial gradient parameters to the institutions, so that each institution trains its local model with its own local training data set and the local model initial gradient parameters to obtain local model gradient parameters; receiving the local model gradient parameters sent by the multi-party institutions; performing weight aggregation on the local model gradient parameters to obtain global gradient parameters; updating the model of the central server with the global gradient parameters and calculating a loss function; and judging whether the loss function converges. If not, the global gradient parameters are used as the local model initial gradient parameters of the multi-party institutions for iterative training; if yes, the model is used as the bank lending model. Through the embodiments herein, the problems of low data security and easy leakage during model training are solved.

Description

Bank loan model construction method, loan amount calculation method, device and system
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method for constructing a bank lending model, and a method, device and system for calculating a lending amount.
Background
Under the existing policy and credit system, multi-party institutions each need to provide their own customer data in order to calculate a customer's maximum loan amount, so there is a risk that the customer data of the multi-party institutions will be leaked; once customer data is leaked, the customer is exposed to a certain risk, and the reputation, finances and the like of the multi-party institutions are seriously damaged.
In the prior art, a model can be trained by a federated learning method. Although the local raw data of each user is not disclosed during federated model training, the limitation is that if there are "dishonest" or "honest-but-curious" servers, or malicious clients, a user's local data may still be reverse-derived from the updated model parameters (an inference attack), and various other attacks, such as attacks on the network based on generative models, are possible, which poses a huge threat to the privacy and security of federated learning.
A bank lending model construction method is therefore needed to solve the problems of low user-data security and easy data leakage in the existing model training process.
Disclosure of Invention
In order to solve the problems of low user-data security and easy data leakage in the existing model training process, the embodiments of the invention provide a bank lending model construction method and a lending amount calculation method and device, which construct a bank lending model by federated learning based on homomorphic encryption combined with a weight aggregation method: without the multi-party institutions providing their raw data, the local models of the multi-party institutions are trained separately by federated learning so as to analyze the user's maximum lending amount.
In order to solve the technical problems, the specific technical scheme is as follows:
in one aspect, embodiments herein provide a bank lending model building method, performed by a central server, the method comprising,
generating local model initial gradient parameters of each multiparty institution;
the initial gradient parameters of the local model are sent to corresponding mechanisms, so that each mechanism trains the local model by utilizing respective local training data sets and the initial gradient parameters of the local model to obtain the gradient parameters of the local model, and the local training data sets of the multiparty mechanisms correspond to the same user;
Receiving the local model gradient parameters sent by the multiparty mechanism;
performing weight aggregation on the local model gradient parameters to obtain global gradient parameters;
updating a model of the central server by using the global gradient parameters and calculating a loss function;
judging whether the loss function converges or not;
if not, the global gradient parameter is used as a local model initial gradient parameter of the multiparty institution, and the step of sending the local model initial gradient parameter to the corresponding institution is repeatedly executed;
if yes, the model is used as a bank lending model.
Further, after sending the local model initial gradient parameters to the corresponding institution, the method further comprises,
receiving local model encryption gradient parameters sent by the multiparty mechanism, wherein the local model encryption gradient parameters are obtained by encrypting the local model gradient parameters by the mechanism by utilizing a homomorphic encryption algorithm;
decrypting the local model encryption gradient parameters to obtain the local model gradient parameters.
Further, receiving the local model gradient parameters sent by the multiparty organization further comprises,
Receiving accuracy sent by the multiparty institutions, wherein the accuracy is obtained by respectively calculating the local models of the multiparty institutions;
weight aggregation is carried out on the local model gradient parameters, and the obtaining of global gradient parameters further comprises,
and carrying out weight aggregation on the local model gradient parameters according to the accuracy to obtain global gradient parameters.
Further, weight aggregation is performed on the local model gradient parameters according to the accuracy, and obtaining global gradient parameters further comprises,
calculating a sum of a plurality of said accuracies;
calculating the ratio of the corresponding accuracy to the sum of the accuracies for the same local model gradient parameter;
taking the ratio as a coefficient of the local model gradient parameter, calculating a product of the coefficient and the local model gradient parameter, and taking the product as a sub-gradient parameter;
and calculating the sum of a plurality of sub-gradient parameters to obtain the global gradient parameters.
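Stated as a formula (the notation below is introduced here for illustration and does not appear in the original text), if a_k denotes the accuracy reported by the k-th institution and g_k its local model gradient parameters, the steps above amount to

```latex
w_{\mathrm{global}} = \sum_{k=1}^{K} \frac{a_k}{\sum_{j=1}^{K} a_j}\, g_k
```

where each ratio a_k / (a_1 + ... + a_K) is the coefficient, each product is a sub-gradient parameter, and the sum of the sub-gradient parameters is the global gradient parameter.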
On the other hand, the embodiment also provides a bank lending model construction device, which comprises,
the local model initial gradient parameter generation unit is used for generating local model initial gradient parameters of each multiparty institution;
The local model initial gradient parameter sending unit is used for sending the local model initial gradient parameters to corresponding mechanisms so that each mechanism trains the local model by utilizing respective local training data sets and the local model initial gradient parameters to obtain local model gradient parameters of the local model, and the local training data sets of the multiparty mechanisms correspond to the same user;
the local model gradient parameter receiving unit is used for receiving the local model gradient parameters sent by the multiparty mechanism;
the global gradient parameter calculation unit is used for carrying out weight aggregation on the local model gradient parameters to obtain global gradient parameters;
a model updating unit for updating a model of the central server and calculating a loss function using the global gradient parameters;
the iterative training unit is used for judging whether the loss function converges or not; if not, the global gradient parameter is used as a local model initial gradient parameter of the multiparty institution, and the step of sending the local model initial gradient parameter to the corresponding institution is repeatedly executed; if yes, the model is used as a bank lending model.
Based on the same inventive concept, embodiments herein also provide a bank lending model construction method, performed by any one of a plurality of institutions, the method comprising,
Receiving a local model initial gradient parameter sent by a central server;
training the local model by using the local training data set and the initial gradient parameters of the local model to obtain the gradient parameters of the local model, wherein the local training data set of the multiparty mechanism corresponds to the same user;
the local model gradient parameters are sent to the central server, so that the central server carries out weight aggregation on the local model gradient parameters of a multiparty mechanism to obtain global gradient parameters, the global gradient parameters are utilized to update a model of the central server, a loss function is calculated, and whether the loss function is converged is judged; if yes, the model is used as a bank lending model; if not, taking the updated global gradient parameter as the initial gradient parameter of the local model of the multiparty institution and sending the initial gradient parameter to the corresponding institution;
receiving updated initial gradient parameters of the local model sent by the central server;
the step of training the local model with the local training dataset and the local model initial gradient parameters is performed again.
Further, after obtaining the local model gradient parameters of the local model, the method further comprises,
Encrypting the local model gradient parameters by using a homomorphic encryption algorithm to obtain local model encryption gradient parameters;
and sending the local model encryption gradient parameters to the central server so that the central server decrypts the local model encryption gradient parameters to obtain the local model gradient parameters.
Further, before training the local model using the local training data set and the local model initial gradient parameters, the method further includes extracting data features of the user using a neural network to construct a local training data set.
Further, extracting data features of the user using a neural network, constructing a local training data set further includes,
and encrypting the identity information of the user by using an irreversible encryption algorithm to obtain a unique identifier of the user, extracting a data set corresponding to the unique identifier from a user data set of the mechanism to be used as the local training data set of the mechanism, and sending the unique identifier to other mechanisms so that the other mechanisms respectively extract the data sets corresponding to the unique identifier from the respective user data sets to be used as the respective local training data sets.
Further, extracting a data set corresponding to the unique identification in the user data set as the local training data set further comprises,
and extracting the h1 layer characteristic of the unique identification from the user data set as the local training data set.
Further, training the local model using the local training dataset and the local model initial gradient parameters further comprises,
and calculating the accuracy of the local model, and sending the accuracy to the central server so that the central server carries out weight aggregation on the local model gradient parameters according to the accuracy of a multiparty organization to obtain global gradient parameters.
On the other hand, the embodiment also provides a bank lending model construction device, which comprises,
the local model initial gradient parameter receiving unit is used for receiving the local model initial gradient parameters sent by the central server;
the local model training unit is used for training the local model by utilizing the local training data set and the initial gradient parameters of the local model to obtain the gradient parameters of the local model, and the local training data set of the multiparty mechanism corresponds to the same user;
The local model gradient parameter sending unit is used for sending the local model gradient parameters to the central server so that the central server carries out weight aggregation on the local model gradient parameters of a multiparty mechanism to obtain global gradient parameters, updating a model of the central server by using the global gradient parameters, calculating a loss function and judging whether the loss function is converged or not; if yes, the model is used as a bank lending model; if not, taking the updated global gradient parameter as the initial gradient parameter of the local model of the multiparty institution and sending the initial gradient parameter to the corresponding institution;
the local model initial gradient parameter receiving unit is further used for receiving the updated local model initial gradient parameters sent by the central server;
and the iterative training unit is used for executing the step of training the local model by using the local training data set and the initial gradient parameters of the local model again.
On the other hand, the embodiment also provides a bank lending model construction system, which comprises a multi-party mechanism and a central server;
when any party of the multi-party mechanism builds a bank lending model, executing the method for building the bank lending model;
And the central server executes the construction method of the bank lending model when constructing the bank lending model.
Based on the same inventive concept, embodiments herein also provide a lending amount calculating method, including,
receiving user data of the same target user sent by a multiparty mechanism;
and calculating the maximum loan amount of the target user according to the user data and a bank loan model corresponding to the target user, wherein the bank loan model is constructed by using the construction method of the bank loan model.
Further, receiving user data of the same target user sent by the multiparty institution further comprises,
receiving user data and a corresponding unique identifier sent by the multiparty mechanism, wherein the unique identifier is obtained by encrypting the identity information of the target user by the multiparty mechanism by using an irreversible encryption algorithm;
and determining the user data belonging to the same target user according to the unique identification.
In another aspect, embodiments herein also provide a lending amount calculating apparatus, comprising,
the user data receiving unit is used for receiving the user data of the same target user sent by the multiparty mechanism;
And the maximum loan amount calculation unit is used for calculating the maximum loan amount of the target user according to the user data and a bank loan model corresponding to the target user, wherein the bank loan model is constructed by using the construction method of the bank loan model.
In another aspect, embodiments herein also provide a computer device including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the above method when executing the computer program.
Finally, an embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above method.
When the multi-party institutions need to generate a bank lending model, the central server first generates local model initial gradient parameters for the multi-party institutions and sends them to the corresponding institutions. Each institution trains its own local model with its local training data set and the local model initial gradient parameters generated by the central server to obtain the local model gradient parameters, and then sends these gradient parameters to the central server. The local data of each institution never has to leave the institution, and the user's real data cannot be derived in reverse while the intermediate parameters are exchanged, so the data privacy of each participating institution is guaranteed. The central server then performs weight aggregation on the received local model gradient parameters of the multi-party institutions to obtain global gradient parameters, so that when distributing gradient parameters the central server leans toward the parameters trained on features that are useful to the finally trained bank lending model. Next, the central server updates its model with the global gradient parameters and calculates a loss function; if the loss function has not converged, the global gradient parameters are used as the local model initial gradient parameters of the multi-party institutions and sent to the corresponding institutions for further training until the loss function converges, at which point the model is used as the bank lending model. Compared with the prior art in which a model is trained only by federated learning, the embodiments herein construct a bank lending model by federated learning based on homomorphic encryption combined with a weight aggregation method: without the multi-party institutions providing their raw data, the local models of the multi-party institutions are trained separately by federated learning to analyze the user's maximum lending amount, which solves the problems of low user-data security and easy data leakage during model training.
Drawings
In order to more clearly illustrate the embodiments herein or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments herein and that other drawings may be obtained according to these drawings without inventive effort to a person skilled in the art.
FIG. 1 is a schematic diagram of an implementation system of a bank lending model construction method according to an embodiment of the disclosure;
FIG. 2 is a flowchart of a method for constructing a bank lending model according to an embodiment of the disclosure;
FIG. 3 is a flowchart of a method for constructing a bank lending model according to an embodiment of the disclosure;
fig. 4 is a schematic structural diagram of a bank lending model construction device according to an embodiment of the disclosure;
fig. 5 is a schematic structural diagram of a bank lending model construction device according to an embodiment of the disclosure;
FIG. 6 is a data flow diagram illustrating a bank lending model building system according to one embodiment of the disclosure;
FIG. 7 is a flowchart of a lending amount calculation method according to an embodiment of the present disclosure;
FIG. 8 is a schematic structural diagram of a lending amount calculation device according to an embodiment of the present disclosure;
Fig. 9 is a schematic structural diagram of a computer device according to an embodiment of the present disclosure.
[ reference numerals description ]:
101. a mechanism;
102. a central server;
401. a local model initial gradient parameter generation unit;
402. a local model initial gradient parameter sending unit;
403. a local model gradient parameter receiving unit;
404. a global gradient parameter calculation unit;
405. a model updating unit;
406. an iterative training unit;
501. a local model initial gradient parameter receiving unit;
502. a local model training unit;
503. a local model gradient parameter transmitting unit;
504. an iterative training unit;
801. a user data receiving unit;
802. a maximum loan amount calculation unit;
902. a computer device;
904. a processing device;
906. a storage resource;
908. a drive mechanism;
910. an input/output module;
912. an input device;
914. an output device;
916. a presentation device;
918. a graphical user interface;
920. a network interface;
922. a communication link;
924. a communication bus.
Detailed Description
The following description of the embodiments of the present disclosure will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the disclosure. All other embodiments, based on the embodiments herein, which a person of ordinary skill in the art would obtain without undue burden, are within the scope of protection herein.
It should be noted that the terms "first," "second," and the like in the description and claims herein and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, apparatus, article, or device that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or device.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order other than that illustrated herein.
Fig. 1 is a schematic diagram of an implementation system of a bank lending model construction method according to an embodiment of the disclosure, which may include multi-party institutions 101 and a central server 102. Each institution 101 communicates with the central server 102 via a network, which may include a local area network (Local Area Network, LAN), a wide area network (Wide Area Network, WAN), the Internet, or a combination thereof, and connects web sites, user devices (e.g., computing devices), and back-end systems.
Each institution 101 includes different user data for the same user, and each institution may use the respective data to train the local model during the training of the model. The central server 102 has disposed thereon a processor that trains the bank lending model, alternatively the central server 102 may be a node (not shown) of a cloud computing system, or each server may be a separate cloud computing system including a plurality of computers interconnected by a network and operating as a distributed processing system.
Specifically, the embodiments of the invention provide a bank lending model construction method that constructs a bank lending model by federated learning based on homomorphic encryption combined with a weight aggregation method; without the multi-party institutions providing their raw data, the local models of the multi-party institutions are trained separately by federated learning to analyze the user's maximum lending amount. Fig. 2 is a flowchart of a bank lending model construction method according to an embodiment of the disclosure. The figure describes the process of constructing a bank lending model, but the process may include more or fewer operational steps based on conventional or non-inventive labor. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. In practice, a system or apparatus product may execute the steps sequentially or in parallel according to the methods shown in the embodiments or the drawings. As shown in fig. 2, the method may be performed by the central server 102 and may include:
Step 201: generating local model initial gradient parameters of each multiparty institution;
step 202: the initial gradient parameters of the local model are sent to corresponding mechanisms, so that each mechanism trains the local model by utilizing respective local training data sets and the initial gradient parameters of the local model to obtain the gradient parameters of the local model, and the local training data sets of the multiparty mechanisms correspond to the same user;
step 203: receiving the local model gradient parameters sent by the multiparty mechanism;
step 204: performing weight aggregation on the local model gradient parameters to obtain global gradient parameters;
step 205: updating a model of the central server by using the global gradient parameters and calculating a loss function;
step 206: judging whether the loss function converges or not;
step 207: if not, the global gradient parameter is used as a local model initial gradient parameter of the multiparty institution, and the step of sending the local model initial gradient parameter to the corresponding institution is repeatedly executed;
step 208: if yes, the model is used as a bank lending model.
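As an illustration only, the following Python sketch shows how steps 201 to 208 could be organized on the central server. The class and function names (CentralServer, train_bank_lending_model, the institutions objects and their train_local interface), the squared-error loss, and the tolerance-based convergence test are assumptions introduced here for the sketch and are not taken from the patent text; the accuracy-weighted aggregation anticipates the weight aggregation described later in the description.

```python
# Hypothetical sketch of the central-server side (steps 201-208); all names are illustrative.
import numpy as np

class CentralServer:
    def __init__(self, n_features, tol=1e-4, max_rounds=100):
        self.w = np.zeros(n_features)   # model held by the central server
        self.tol = tol                  # convergence threshold on the change of the loss
        self.max_rounds = max_rounds

    def aggregate(self, local_grads, weights):
        """Weight-aggregate the local model gradient parameters (step 204)."""
        weights = np.asarray(weights, dtype=float)
        coeffs = weights / weights.sum()                 # e.g. accuracy ratios
        return sum(c * g for c, g in zip(coeffs, local_grads))

    def loss(self, X_val, y_val):
        """Loss of the server model (step 205); a squared error is assumed here."""
        return float(np.mean((X_val @ self.w - y_val) ** 2))

def train_bank_lending_model(server, institutions, X_val, y_val, lr=0.1):
    """institutions: objects exposing train_local(init_grad) -> (gradient, accuracy)."""
    init_grad = np.zeros_like(server.w)                  # step 201: initial gradient parameters
    prev_loss = np.inf
    for _ in range(server.max_rounds):
        # steps 202-203: distribute the initial gradients, collect the trained local gradients
        grads, accs = zip(*[inst.train_local(init_grad) for inst in institutions])
        global_grad = server.aggregate(grads, accs)      # step 204: weight aggregation
        server.w -= lr * global_grad                     # step 205: update the server model
        cur_loss = server.loss(X_val, y_val)
        if abs(prev_loss - cur_loss) < server.tol:       # steps 206 and 208: converged
            break                                        # server.w is the bank lending model
        prev_loss = cur_loss
        init_grad = global_grad                          # step 207: redistribute as init grads
    return server.w
```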
Correspondingly, the embodiment also provides a construction method of the bank lending model, as shown in fig. 3, comprising the following steps of,
Step 301: receiving a local model initial gradient parameter sent by a central server;
step 302: training the local model by using the local training data set and the initial gradient parameters of the local model to obtain the gradient parameters of the local model, wherein the local training data set of the multiparty mechanism corresponds to the same user;
step 303: transmitting the local model gradient parameters to the central server;
after the step, the central server performs weight aggregation on the local model gradient parameters of the multiparty mechanism to obtain global gradient parameters, updates a model of the central server by using the global gradient parameters, calculates a loss function, and judges whether the loss function is converged or not; if yes, the model is used as a bank lending model; if not, taking the updated global gradient parameter as the initial gradient parameter of the local model of the multiparty institution and sending the initial gradient parameter to the corresponding institution;
step 304: receiving updated initial gradient parameters of the local model sent by the central server;
step 305: the step of training the local model with the local training dataset and the local model initial gradient parameters is performed again.
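A corresponding sketch of one institution's side (steps 301 to 303) is shown below; the linear local model, the squared-error gradient and the tolerance-based accuracy proxy are assumptions made for the sketch, not the patent's own implementation. Steps 304 and 305 are covered by the central server calling train_local again with the new global gradient parameters in each round.

```python
# Hypothetical sketch of an institution's local training (steps 301-303); names are illustrative.
import numpy as np

class Institution:
    def __init__(self, X_local, y_local, lr=0.1):
        self.X = X_local        # local training data set (same users as the other parties)
        self.y = y_local        # local labels, e.g. historically granted maximum loan amounts
        self.w = None           # local model parameters
        self.lr = lr

    def train_local(self, init_grad):
        """Step 302: train the local model starting from the received initial gradient parameters."""
        if self.w is None:
            self.w = np.zeros(self.X.shape[1])
        self.w -= self.lr * init_grad                    # apply the server-provided gradients
        residual = self.X @ self.w - self.y
        grad = self.X.T @ residual / len(self.y)         # local model gradient parameters
        # accuracy proxy: fraction of predictions within 10% of the actual value
        accuracy = float(np.mean(np.abs(residual) <= 0.1 * np.abs(self.y) + 1e-8))
        return grad, accuracy                            # step 303: sent to the central server
```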
According to the method provided by the embodiments of the invention, when the multi-party institutions need to generate a bank lending model, the central server first generates local model initial gradient parameters for the multi-party institutions and sends them to the corresponding institutions. Each institution trains its own local model with its own local training data set and the local model initial gradient parameters generated by the central server to obtain the local model gradient parameters, and then sends these gradient parameters to the central server. Each institution's local data never needs to be sent out, and the user's real data cannot be derived in reverse while the intermediate parameters are exchanged, so the data privacy of every participating institution is guaranteed. The central server then performs weight aggregation on the received local model gradient parameters of the multi-party institutions to obtain global gradient parameters, so that when distributing gradient parameters it leans toward the parameters trained on features that are useful to the finally trained bank lending model; it updates its model with the global gradient parameters, calculates a loss function, and judges whether the loss function converges. If not, the global gradient parameters are sent to the multi-party institutions as the local model initial gradient parameters for further training until the loss function converges; if so, the model is used as the bank lending model. Compared with the prior art in which a model is trained only by federated learning, the embodiments herein construct a bank lending model by federated learning based on homomorphic encryption combined with a weight aggregation method: without the multi-party institutions providing their raw data, the local models of the multi-party institutions are trained separately by federated learning to analyze the user's maximum lending amount, which solves the problems of low user-data security and easy data leakage in the model training process.
In order to further increase the security of the model gradient parameters during transmission and to prevent them from being maliciously cracked, according to one embodiment herein, after the local model gradient parameters of the local model are obtained in step 302, the method further comprises,
encrypting the local model gradient parameters by using a homomorphic encryption algorithm to obtain local model encryption gradient parameters;
and sending the local model encryption gradient parameters to the central server so that the central server decrypts the local model encryption gradient parameters to obtain the local model gradient parameters.
Correspondingly, according to one embodiment herein, after step 202 sends the local model initial gradient parameters to the corresponding institution, the method further comprises,
receiving local model encryption gradient parameters sent by the multiparty mechanism, wherein the local model encryption gradient parameters are obtained by encrypting the local model gradient parameters by the mechanism by utilizing a homomorphic encryption algorithm;
decrypting the local model encryption gradient parameters to obtain the local model gradient parameters.
In this embodiment, a homomorphic encryption algorithm is an encryption algorithm that satisfies the homomorphic operation property of ciphertext: after data is homomorphically encrypted, a specific computation is performed on the ciphertext, and the plaintext obtained by homomorphically decrypting the result of that ciphertext computation is equivalent to performing the same computation directly on the plaintext data, thereby realizing the "computability and invisibility" of the data. The central server may generate a set of public keys based on an RSA algorithm and send the public keys to the multi-party institutions. After obtaining its local model gradient parameters, each institution encrypts them with the public key provided by the central server to obtain local model encrypted gradient parameters and sends them to the central server, and the central server decrypts the received local model encrypted gradient parameters with its private key to obtain the local model gradient parameters. Encryption of the model gradient parameters during transmission is thus realized with a public/private key mechanism; no plaintext data is transmitted in the whole process, and the original user data cannot be recovered even by brute-force or collision attacks.
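By way of illustration, the sketch below encrypts gradient values with the Paillier scheme from the python-paillier (phe) package, which is additively homomorphic, so ciphertexts can be summed without decryption. The patent text itself only says that the key pair is generated based on an RSA algorithm, so the choice of Paillier and of this package is an assumption made for the sketch.

```python
# Hypothetical sketch: homomorphic encryption of local model gradient parameters.
# Assumes the python-paillier package (pip install phe).
from phe import paillier

# Central server: generate a key pair and distribute the public key to the institutions.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

def encrypt_gradients(grad, pub):
    """Institution side: encrypt every gradient component before it leaves the institution."""
    return [pub.encrypt(float(g)) for g in grad]

def decrypt_gradients(enc_grad, priv):
    """Central-server side: decrypt the received encrypted gradient parameters."""
    return [priv.decrypt(c) for c in enc_grad]

# Ciphertexts can be combined without decryption ("computability and invisibility").
enc_a = encrypt_gradients([0.12, -0.30], public_key)
enc_b = encrypt_gradients([0.05, 0.10], public_key)
enc_sum = [x + y for x, y in zip(enc_a, enc_b)]    # homomorphic addition on ciphertexts
print(decrypt_gradients(enc_sum, private_key))      # approximately [0.17, -0.20]
```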
According to one embodiment herein, prior to training the local model using the local training dataset and the local model initial gradient parameters, step 302 further comprises, using a neural network to extract data features of the user, constructing a local training dataset.
In this embodiment, the user data that each institution stores for the same user contains many features, and many of them contribute little to federated model training. Therefore, before each institution trains its local model with the local training data set and the local model initial gradient parameters, a neural network is used to extract the user's data features and to preliminarily screen out the features that are useful to the federated model, thereby reducing the computation required for model training.
In the present embodiment, because the parties need to train their respective local models with data of the same users, the parties need to agree on which users are used for training. In view of this, to further prevent the user's identity from being leaked, according to one embodiment herein, extracting the user's data features with a neural network to construct the local training data set further comprises,
And encrypting the identity information of the user by using an irreversible encryption algorithm to obtain a unique identifier of the user, extracting a data set corresponding to the unique identifier from a user data set of the mechanism to be used as the local training data set of the mechanism, and sending the unique identifier to other mechanisms so that the other mechanisms respectively extract the data sets corresponding to the unique identifier from the respective user data sets to be used as the respective local training data sets.
It can be understood that any one of the multi-party institutions encrypts the user's identity information with an irreversible encryption algorithm to obtain the user's unique identifier and sends the unique identifier to the other institutions, which realizes encrypted transmission of the user's identity information and, to a certain extent, prevents the user's identity from being revealed.
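A minimal sketch of deriving such a unique identifier with an irreversible (one-way) algorithm follows; SHA-256 with a salt shared by the institutions is an illustrative choice, since the patent does not name a specific algorithm, and the identity string format is hypothetical.

```python
# Hypothetical sketch: derive a user's unique identifier from identity information.
import hashlib

SHARED_SALT = b"value-agreed-between-institutions"   # assumption: all parties use the same salt

def unique_id(identity_info: str) -> str:
    """Irreversibly hash the identity information (e.g. name plus ID number)."""
    return hashlib.sha256(SHARED_SALT + identity_info.encode("utf-8")).hexdigest()

def extract_local_training_set(user_dataset: dict, uid: str):
    """Pick the records matching the unique identifier from this institution's own data set."""
    return user_dataset.get(uid, [])

uid = unique_id("ZhangSan|110101199001011234")   # illustrative identity string
# uid, not the raw identity information, is what gets sent to the other institutions
```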
Further, extracting a data set corresponding to the unique identification in the user data set as the local training data set further comprises,
and extracting the h1 layer characteristic of the unique identification from the user data set as the local training data set.
In the prior art, a sparse autoencoding network is mostly adopted when training such a model: first the parameters from the input layer to the h1 layer are trained, and after training the decoding layer is removed, leaving only the encoding stage from the input layer to the hidden layer; then the parameters from the h1 layer to the h2 layer are trained, with the activation values of the h1 layer on unlabeled data used as the input of the h2 layer, and after this self-encoding training the decoding layer of the h2 layer is removed; the above steps are repeated to train higher layers of the network. Therefore, directly extracting the h1-layer features corresponding to the unique identifier from the user data set as the local training data set can speed up the subsequent federated learning training.
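As a sketch of this idea only (the patent does not give a concrete network architecture), the snippet below trains the first autoencoder layer on an institution's raw records, discards the decoder, and keeps the h1-layer activations as the local training data set; PyTorch is used here as an assumed framework and the layer sizes are arbitrary.

```python
# Hypothetical sketch: use h1-layer activations of a (sparse) autoencoder as training features.
import torch
import torch.nn as nn

def h1_features(raw: torch.Tensor, h1_dim: int = 16, epochs: int = 50) -> torch.Tensor:
    """raw: (n_users, n_raw_features) records selected by the unique identifiers."""
    encoder = nn.Sequential(nn.Linear(raw.shape[1], h1_dim), nn.Sigmoid())
    decoder = nn.Linear(h1_dim, raw.shape[1])
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):                          # train input -> h1 -> reconstruction
        opt.zero_grad()
        loss = loss_fn(decoder(encoder(raw)), raw)   # a sparsity penalty could be added here
        loss.backward()
        opt.step()
    with torch.no_grad():                            # drop the decoding layer, keep only h1
        return encoder(raw)

# local_training_set = h1_features(torch.tensor(raw_records, dtype=torch.float32))
```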
According to one embodiment herein, in order for the central server to assign gradient parameters more accurately (i.e., to favor the features useful to the finally trained bank lending model), training the local model with the local training data set and the local model initial gradient parameters in step 302 further comprises,
and calculating the accuracy of the local model, and sending the accuracy to the central server so that the central server carries out weight aggregation on the local model gradient parameters according to the accuracy of a multiparty organization to obtain global gradient parameters.
Correspondingly, step 203 of receiving the local model gradient parameters sent by the multiparty institution further comprises,
receiving accuracy sent by the multiparty institutions, wherein the accuracy is obtained by respectively calculating the local models of the multiparty institutions;
weight aggregation is carried out on the local model gradient parameters, and the obtaining of global gradient parameters further comprises,
and carrying out weight aggregation on the local model gradient parameters according to the accuracy to obtain global gradient parameters.
In this embodiment, after its local model is trained, each institution calculates the accuracy of the local model: the original data is fed into the trained local model to obtain a predicted value, and the accuracy is then calculated from the predicted value and the actual value corresponding to the original data. The greater the accuracy, the more useful the local model gradient parameters of that local model are to the final bank lending model. The multi-party institutions therefore each send the accuracy of their local model to the central server, and the central server performs weight aggregation on the local model gradient parameters according to these accuracies to obtain the global gradient parameters, thereby improving the accuracy of the central server's training.
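A minimal sketch of that accuracy computation is given below; the tolerance-based definition of a "correct" prediction is an assumption, since the patent does not fix one.

```python
# Hypothetical sketch: accuracy of a trained local model on the institution's own data.
import numpy as np

def local_accuracy(model_w, X, y, rel_tol=0.1):
    """Fraction of users whose predicted amount lies within rel_tol of the actual value."""
    y_pred = X @ model_w                              # predicted values of the trained local model
    return float(np.mean(np.abs(y_pred - y) <= rel_tol * np.abs(y)))
```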
In embodiments herein, the global gradient parameters could simply be computed by averaging the local model gradient parameters. However, in order to take into account how much each institution's user features influence the final model during training, according to one embodiment herein, performing weight aggregation on the local model gradient parameters according to the accuracy to obtain the global gradient parameters further comprises,
calculating a sum of a plurality of said accuracies;
calculating the ratio of the corresponding accuracy to the sum of the accuracies for the same local model gradient parameter;
taking the ratio as a coefficient of the local model gradient parameter, calculating a product of the coefficient and the local model gradient parameter, and taking the product as a sub-gradient parameter;
and calculating the sum of a plurality of sub-gradient parameters to obtain the global gradient parameters.
It will be appreciated that the benefit of doing so is that the useful features held by the various institutions do not contribute equally to the final model: some institutions may hold more features that are useful for the user, and their local models reach a somewhat higher accuracy in federated learning, so the above approach, when distributing the gradient parameters, is biased toward the local model gradient parameters of the local models trained on useful features.
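The accuracy-weighted aggregation just described can be sketched as follows; the function and variable names are illustrative only.

```python
# Hypothetical sketch: weight aggregation of local gradients by reported accuracy.
import numpy as np

def aggregate_by_accuracy(local_grads, accuracies):
    """local_grads: list of gradient vectors; accuracies: accuracy reported by each institution."""
    acc = np.asarray(accuracies, dtype=float)
    coeffs = acc / acc.sum()                                   # ratio of each accuracy to the sum
    sub_grads = [c * g for c, g in zip(coeffs, local_grads)]   # sub-gradient parameters
    return np.sum(sub_grads, axis=0)                           # global gradient parameters

# An institution reporting higher accuracy contributes more to the global gradient:
g = aggregate_by_accuracy([np.array([1.0, 0.0]), np.array([0.0, 1.0])], [0.9, 0.6])
print(g)  # [0.6 0.4]
```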
Based on the same inventive concept, the embodiments herein also provide a device for constructing a bank lending model, as shown in fig. 4, including,
a local model initial gradient parameter generating unit 401, configured to generate local model initial gradient parameters of each multiparty mechanism;
a local model initial gradient parameter sending unit 402, configured to send the local model initial gradient parameters to corresponding institutions, so that each institution trains a local model by using a respective local training data set and the local model initial gradient parameters to obtain local model gradient parameters of the local model, where the local training data sets of the multiparty institutions correspond to the same user;
a local model gradient parameter receiving unit 403, configured to receive the local model gradient parameter sent by the multiparty mechanism;
a global gradient parameter calculation unit 404, configured to aggregate weights of the local model gradient parameters to obtain global gradient parameters;
a model updating unit 405 for updating a model of the central server and calculating a loss function using the global gradient parameters;
an iterative training unit 406, configured to determine whether the loss function converges; if not, the global gradient parameter is used as a local model initial gradient parameter of the multiparty institution, and the step of sending the local model initial gradient parameter to the corresponding institution is repeatedly executed; if yes, the model is used as a bank lending model.
Correspondingly, the embodiment also provides a device for constructing the bank lending model, as shown in fig. 5, which comprises,
a local model initial gradient parameter receiving unit 501, configured to receive a local model initial gradient parameter sent by a central server;
the local model training unit 502 is configured to train the local model by using a local training data set and the initial gradient parameters of the local model to obtain the gradient parameters of the local model, where the local training data set of the multiparty mechanism corresponds to the same user;
a local model gradient parameter sending unit 503, configured to send the local model gradient parameter to the central server, so that the central server performs weight aggregation on the local model gradient parameter of a multiparty mechanism to obtain a global gradient parameter, update a model of the central server by using the global gradient parameter, calculate a loss function, and determine whether the loss function converges; if yes, the model is used as a bank lending model; if not, taking the updated global gradient parameter as the initial gradient parameter of the local model of the multiparty institution and sending the initial gradient parameter to the corresponding institution;
The local model initial gradient parameter receiving unit 501 is further configured to receive the updated local model initial gradient parameter sent by the central server;
an iterative training unit 504 for performing again the step of training the local model with the local training dataset and the local model initial gradient parameters.
The beneficial effects obtained by the device are consistent with those obtained by the method, and the embodiments of the present disclosure are not repeated.
Based on the same inventive concept, the embodiments herein further provide a bank lending model construction system, which includes multi-party institutions and a central server. Specifically, a data flow diagram of the bank lending model construction system of the embodiments herein is shown in fig. 6. It should be noted that fig. 6 only shows the data interaction between one institution and the central server; those skilled in the art can readily derive the data interaction between the multi-party institutions and the central server from it, which is not repeated herein. Specifically, as shown in fig. 6, the flow includes:
step 601: generating local model initial gradient parameters of a mechanism by a central server;
step 602: the central server sends the initial gradient parameters of the local model to the corresponding mechanism;
Step 603: the mechanism trains the local model by using the local training data set and the initial gradient parameters of the local model to obtain the gradient parameters of the local model;
step 604: a mechanism calculates the accuracy of the local model;
step 605: encrypting the local model gradient parameters by a mechanism to obtain local model encrypted gradient parameters;
step 606: the mechanism sends the encryption gradient parameters of the local model and the accuracy of the local model to a central server;
step 607: the central server decrypts the local model encryption gradient parameters to obtain local model gradient parameters;
step 608: the central server carries out weight aggregation on the local model gradient parameters according to the accuracy to obtain global gradient parameters;
step 609: the central server updates a model of the central server by using the global gradient parameters and calculates a loss function;
step 610: the central server judges whether the loss function is converged or not;
step 611: if not, the central server re-executes step 602 by taking the global gradient parameter as the initial gradient parameter of the local model of the multiparty institution;
step 612: if yes, the central server takes the model as a bank lending model.
Based on the same inventive concept, the embodiments herein also provide a lending amount calculating method, as shown in fig. 7, including,
step 701: receiving user data of the same target user sent by a multiparty mechanism;
step 702: and calculating the maximum loan amount of the target user according to the user data and a bank loan model corresponding to the target user, wherein the bank loan model is constructed by using the construction method of the bank loan model.
In this embodiment, the method shown in fig. 7 may be performed by a third party trusted server, where the multiparty organization sends data of the same target user to the third party trusted server, and then the third party trusted server calculates the maximum loan amount of the target user according to the received user data and the bank loan model corresponding to the target user.
In order to prevent a malicious attacker from intercepting the user data sent by the multi-party institutions and determining the user's identity, which would compromise the user data, according to one embodiment herein, receiving the user data of the same target user sent by the multi-party institutions in step 701 further comprises,
receiving user data and a corresponding unique identifier sent by the multiparty mechanism, wherein the unique identifier is obtained by encrypting the identity information of the target user by the multiparty mechanism by using an irreversible encryption algorithm;
And determining the user data belonging to the same target user according to the unique identification.
In this embodiment, the multi-party institutions use the same irreversible encryption algorithm to encrypt the identity information of the same user, obtaining the user's unique identifier, and send the unique identifier together with the corresponding user data to the third-party trusted server. The third-party trusted server does not need to parse the unique identifier; it only needs to determine which user data correspond to the same unique identifier. Even if the user data are intercepted by a malicious attacker during transmission, the attacker cannot reverse-derive the user's identity information from the unique identifier, and therefore cannot obtain the user's identity or make use of the user data. This improves the security of the user data to a certain extent.
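For illustration, the sketch below shows how a third-party trusted server might line up the user data received from the institutions by the unique identifier and apply the bank lending model; the record format, the feature concatenation and the linear scoring model are assumptions introduced here.

```python
# Hypothetical sketch: match multi-party user data by unique identifier and score it.
from collections import defaultdict
import numpy as np

def group_by_uid(records):
    """records: iterable of (uid, feature_vector) pairs sent by the institutions."""
    grouped = defaultdict(list)
    for uid, features in records:
        grouped[uid].append(np.asarray(features, dtype=float))
    return grouped

def max_loan_amount(model_w, records, target_uid):
    """Concatenate the target user's features from all parties and apply the lending model."""
    features = np.concatenate(group_by_uid(records)[target_uid])  # same user, known only by hash
    return float(features @ model_w)                              # illustrative linear model

# The trusted server only ever sees the irreversible identifiers, never raw identity information.
```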
Based on the same inventive concept, as shown in fig. 8, the embodiments herein also provide a lending amount calculating apparatus, including,
a user data receiving unit 801, configured to receive user data of the same target user sent by a multiparty mechanism;
and a maximum loan amount calculating unit 802, configured to calculate the maximum loan amount of the target user according to the user data and a bank loan model corresponding to the target user, where the bank loan model is constructed by using the method for constructing the bank loan model.
The beneficial effects obtained by the device are consistent with those obtained by the method, and the embodiments of the present disclosure are not repeated.
Fig. 9 is a schematic structural diagram of a computer device according to an embodiment of the present disclosure, where the apparatus may be a computer device according to the present disclosure, and perform the method of the present disclosure. The computer device 902 may include one or more processing devices 904, such as one or more Central Processing Units (CPUs), each of which may implement one or more hardware threads. The computer device 902 may also include any storage resources 906 for storing any kind of information, such as code, settings, data, etc. For example, and without limitation, storage resources 906 may include any one or more of the following combinations: any type of RAM, any type of ROM, flash memory devices, hard disks, optical disks, etc. More generally, any storage resource may store information using any technology. Further, any storage resource may provide volatile or non-volatile retention of information. Further, any storage resources may represent fixed or removable components of computer device 902. In one case, the computer device 902 may perform any of the operations of the associated instructions when the processing device 904 executes the associated instructions stored in any storage resource or combination of storage resources. The computer device 902 also includes one or more drive mechanisms 908, such as a hard disk drive mechanism, an optical disk drive mechanism, and the like, for interacting with any storage resources.
The computer device 902 may also include an input/output module 910 (I/O) for receiving various inputs (via an input device 912) and for providing various outputs (via an output device 914). One particular output mechanism may include a presentation device 916 and an associated Graphical User Interface (GUI) 918. In other embodiments, input/output module 910 (I/O), input device 912, and output device 914 may not be included, but merely as a computer device in a network. The computer device 902 may also include one or more network interfaces 920 for exchanging data with other devices via one or more communication links 922. One or more communication buses 924 couple the above-described components together.
The communication link 922 may be implemented in any manner, for example, through a local area network, a wide area network (e.g., the internet), a point-to-point connection, etc., or any combination thereof. Communication link 922 may include any combination of hardwired links, wireless links, routers, gateway functions, name servers, etc., governed by any protocol or combination of protocols.
Corresponding to the methods in fig. 2, 3, 6, 7, embodiments herein also provide a computer readable storage medium having a computer program stored thereon, which computer program when run by a processor performs the above steps.
Embodiments herein also provide a computer readable instruction wherein the program therein causes the processor to perform the method as shown in fig. 2, 3, 6, 7 when the processor executes the instruction.
It should be understood that, in the various embodiments herein, the sequence number of each process described above does not mean the sequence of execution, and the execution sequence of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments herein.
It should also be understood that in embodiments herein, the term "and/or" is merely one relationship that describes an associated object, meaning that three relationships may exist. For example, a and/or B may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied in electronic hardware, in computer software, or in a combination of the two, and that the elements and steps of the examples have been generally described in terms of function in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided herein, it should be understood that the disclosed systems, devices, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or elements, or may be an electrical, mechanical, or other form of connection.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments herein.
In addition, each functional unit in the embodiments herein may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer readable storage medium. Based on such understanding, the technical solutions herein, in essence, or the parts contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments herein. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Specific examples are set forth herein to illustrate the principles and embodiments of this disclosure; the above description is intended only to help in understanding the methods herein and their core ideas. As will be apparent to those of ordinary skill in the art in light of the teachings herein, many variations are possible in the specific embodiments and in their scope of application, and nothing in this specification should be construed as limiting the invention.

Claims (18)

1. A bank loan model construction method, characterized by being executed by a central server, the method comprising,
generating local model initial gradient parameters for each of a plurality of multi-party institutions;
sending the local model initial gradient parameters to the corresponding institutions, so that each institution trains its local model using its own local training data set and the local model initial gradient parameters to obtain local model gradient parameters, wherein the local training data sets of the multi-party institutions correspond to the same users;
receiving the local model gradient parameters sent by the multi-party institutions;
performing weight aggregation on the local model gradient parameters to obtain global gradient parameters;
updating a model of the central server using the global gradient parameters and calculating a loss function;
judging whether the loss function converges;
if not, taking the global gradient parameters as the local model initial gradient parameters of the multi-party institutions and repeating the step of sending the local model initial gradient parameters to the corresponding institutions;
if yes, taking the model as the bank loan model.
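Claim 1 above is, in substance, a federated-averaging loop run by the central server: distribute initial gradient parameters, collect the institutions' locally trained gradients, aggregate them, update the global model, and repeat until the loss function converges. The following Python sketch illustrates that loop under stated assumptions; the institution interface (train_local), the model interface (initial_parameters, apply_gradients, loss), the round limit, and the convergence threshold are hypothetical and are not prescribed by the claim.

import numpy as np

def build_bank_loan_model(institutions, model, max_rounds=100, tol=1e-4):
    # Step 1 (claim 1): generate local model initial gradient parameters
    # for each multi-party institution.
    params = {inst.name: model.initial_parameters() for inst in institutions}
    prev_loss = float("inf")
    for _ in range(max_rounds):
        # Step 2: send the parameters; each institution trains its local
        # model on its own local training data set and returns gradients.
        local_grads = [inst.train_local(params[inst.name]) for inst in institutions]
        # Step 3: weight aggregation into global gradient parameters
        # (uniform weights here; claims 3 and 4 refine this with accuracies).
        global_grad = np.mean(local_grads, axis=0)
        # Step 4: update the central server's model and compute the loss.
        model.apply_gradients(global_grad)
        loss = model.loss()
        # Step 5: convergence check; if converged, the model is used as the
        # bank loan model, otherwise redistribute the global gradients.
        if abs(prev_loss - loss) < tol:
            break
        prev_loss = loss
        params = {inst.name: global_grad for inst in institutions}
    return model
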
2. The method of claim 1, wherein after sending the local model initial gradient parameters to the corresponding institutions, the method further comprises,
receiving local model encrypted gradient parameters sent by the multi-party institutions, wherein the local model encrypted gradient parameters are obtained by each institution encrypting its local model gradient parameters using a homomorphic encryption algorithm;
decrypting the local model encrypted gradient parameters to obtain the local model gradient parameters.
3. The method of claim 1, wherein receiving the local model gradient parameters sent by the multi-party institutions further comprises,
receiving accuracies sent by the multi-party institutions, wherein each accuracy is calculated by the respective institution on its own local model;
and wherein performing weight aggregation on the local model gradient parameters to obtain global gradient parameters further comprises,
performing weight aggregation on the local model gradient parameters according to the accuracies to obtain the global gradient parameters.
4. The method of claim 3, wherein performing weight aggregation on the local model gradient parameters according to the accuracies further comprises,
calculating the sum of the plurality of accuracies;
for each local model gradient parameter, calculating the ratio of the corresponding accuracy to the sum of the accuracies;
taking the ratio as the coefficient of that local model gradient parameter, calculating the product of the coefficient and the local model gradient parameter, and taking the product as a sub-gradient parameter;
and calculating the sum of a plurality of sub-gradient parameters to obtain the global gradient parameters.
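Claim 4 spells out the aggregation rule used in claim 3: each institution's gradient is scaled by the ratio of its accuracy to the sum of all accuracies, and the resulting sub-gradients are summed into the global gradient parameters. A minimal sketch with illustrative numbers (the accuracies and gradients below are made-up example values):

import numpy as np

def aggregate_by_accuracy(local_grads, accuracies):
    # Coefficient of institution i is accuracy_i / sum(accuracies) (claim 4).
    total = sum(accuracies)
    coefficients = [acc / total for acc in accuracies]
    # Sub-gradient = coefficient * local gradient; the global gradient
    # parameters are the sum of the sub-gradients.
    sub_grads = [c * np.asarray(g) for c, g in zip(coefficients, local_grads)]
    return sum(sub_grads)

# Three institutions, each reporting a two-element gradient and an accuracy.
grads = [np.array([0.2, 0.4]), np.array([0.1, 0.3]), np.array([0.4, 0.2])]
accs = [0.90, 0.80, 0.70]          # coefficients become 0.375, 0.333..., 0.292...
print(aggregate_by_accuracy(grads, accs))   # accuracy-weighted global gradient
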
5. A bank loan model construction device, characterized by comprising,
a local model initial gradient parameter generation unit for generating local model initial gradient parameters for each of a plurality of multi-party institutions;
a local model initial gradient parameter sending unit for sending the local model initial gradient parameters to the corresponding institutions, so that each institution trains its local model using its own local training data set and the local model initial gradient parameters to obtain local model gradient parameters, wherein the local training data sets of the multi-party institutions correspond to the same users;
a local model gradient parameter receiving unit for receiving the local model gradient parameters sent by the multi-party institutions;
a global gradient parameter calculation unit for performing weight aggregation on the local model gradient parameters to obtain global gradient parameters;
a model updating unit for updating a model of the central server using the global gradient parameters and calculating a loss function;
and an iterative training unit for judging whether the loss function converges; if not, taking the global gradient parameters as the local model initial gradient parameters of the multi-party institutions and repeating the step of sending the local model initial gradient parameters to the corresponding institutions; if yes, taking the model as the bank loan model.
6. A bank loan model construction method, performed by any one institution of a plurality of multi-party institutions, the method comprising,
receiving local model initial gradient parameters sent by a central server;
training a local model using a local training data set and the local model initial gradient parameters to obtain local model gradient parameters, wherein the local training data sets of the multi-party institutions correspond to the same users;
sending the local model gradient parameters to the central server, so that the central server performs weight aggregation on the local model gradient parameters of the multi-party institutions to obtain global gradient parameters, updates a model of the central server using the global gradient parameters, calculates a loss function, and judges whether the loss function converges; if yes, the model is taken as the bank loan model; if not, the updated global gradient parameters are taken as the local model initial gradient parameters of the multi-party institutions and sent to the corresponding institutions;
receiving the updated local model initial gradient parameters sent by the central server;
and performing again the step of training the local model using the local training data set and the local model initial gradient parameters.
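Claim 6 is the institution side of the same protocol: receive the server's parameters, train the local model on the locally held data for the shared users, and return the resulting local model gradient parameters (together with the local accuracy required by claim 11). The sketch below assumes a logistic-regression local model purely for illustration; the patent leaves the local model architecture open, and the data values are made up.

import numpy as np

def local_training_step(weights, features, labels):
    # One institution-side step: compute the local model gradient parameters
    # and the local accuracy to be reported to the central server (claim 11).
    logits = features @ weights
    preds = 1.0 / (1.0 + np.exp(-logits))                 # sigmoid outputs
    gradient = features.T @ (preds - labels) / len(labels)
    accuracy = float(np.mean((preds > 0.5) == labels))
    return gradient, accuracy

# Illustrative local data: four sample-aligned users, three features each.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                               # local training data set
y = np.array([1.0, 0.0, 1.0, 0.0])                        # loan repayment labels
w = np.zeros(3)                                           # parameters from the server
grad, acc = local_training_step(w, X, y)
print(grad.shape, acc)                                    # (3,) and the local accuracy
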
7. The method of claim 6, wherein after obtaining the local model gradient parameters of the local model, the method further comprises,
encrypting the local model gradient parameters using a homomorphic encryption algorithm to obtain local model encrypted gradient parameters;
and sending the local model encrypted gradient parameters to the central server, so that the central server decrypts the local model encrypted gradient parameters to obtain the local model gradient parameters.
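Claims 2 and 7 protect the gradients in transit by having each institution homomorphically encrypt its local model gradient parameters before upload and the central server decrypt them. The claims do not name a specific scheme; the sketch below assumes the Paillier cryptosystem via the third-party python-paillier (phe) package as one possible additively homomorphic choice.

from phe import paillier   # third-party python-paillier package, assumed installed

# The central server generates the key pair and distributes the public key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# An institution encrypts its local model gradient parameters element-wise.
local_gradient = [0.12, -0.07, 0.30]
encrypted_gradient = [public_key.encrypt(g) for g in local_gradient]

# Paillier is additively homomorphic, so ciphertexts can even be summed
# without decryption (not required by the claims, shown for illustration).
encrypted_sum = encrypted_gradient[0] + encrypted_gradient[1]

# The central server decrypts to recover the local model gradient parameters.
decrypted_gradient = [private_key.decrypt(c) for c in encrypted_gradient]
print(decrypted_gradient)   # approximately [0.12, -0.07, 0.30]
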
8. The method of claim 6, wherein before training the local model using the local training data set and the local model initial gradient parameters, the method further comprises extracting data features of the users using a neural network to construct the local training data set.
9. The method of claim 8, wherein extracting data features of the users using a neural network to construct the local training data set further comprises,
encrypting identity information of a user using an irreversible encryption algorithm to obtain a unique identifier of the user, extracting a data set corresponding to the unique identifier from the user data set of the institution as the local training data set of the institution, and sending the unique identifier to the other institutions so that the other institutions each extract a data set corresponding to the unique identifier from their respective user data sets as their respective local training data sets.
10. The method of claim 9, wherein extracting the data set corresponding to the unique identifier from the user data set as the local training data set further comprises,
extracting h1-layer features corresponding to the unique identifier from the user data set as the local training data set.
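Claims 9 and 10 align the institutions' data sets on the same users without exchanging raw identity information: each institution hashes the user's identity details with an irreversible algorithm and shares only the resulting unique identifier. A minimal sketch, assuming SHA-256 as the irreversible algorithm and plain dictionaries as the user data sets; the identity strings and feature vectors are made-up examples (in the claims the features would be the neural network's h1-layer outputs).

import hashlib

def unique_identifier(identity_info: str) -> str:
    # Irreversible encryption of the identity information (SHA-256 assumed).
    return hashlib.sha256(identity_info.encode("utf-8")).hexdigest()

def extract_local_training_set(user_data_set: dict, shared_identifiers: set) -> dict:
    # Keep only records whose identifier appears in the shared identifier set.
    return {uid: feats for uid, feats in user_data_set.items()
            if uid in shared_identifiers}

# Institution A hashes its users' identities and shares only the identifiers.
bank_a = {unique_identifier("user-identity-001"): [0.3, 0.7],
          unique_identifier("user-identity-002"): [0.1, 0.9]}
shared_ids = set(bank_a)

# Institution B extracts the records of the same users from its own data set
# without ever receiving the raw identity information.
bank_b = {unique_identifier("user-identity-001"): [5.2, 1.1],
          unique_identifier("user-identity-003"): [2.0, 0.4]}
bank_b_training_set = extract_local_training_set(bank_b, shared_ids)
print(len(bank_b_training_set))   # 1 overlapping user
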
11. The method of claim 6, wherein training the local model using the local training data set and the local model initial gradient parameters further comprises,
calculating the accuracy of the local model and sending the accuracy to the central server, so that the central server performs weight aggregation on the local model gradient parameters of the multi-party institutions according to their accuracies to obtain the global gradient parameters.
12. A bank loan model construction device, characterized by comprising,
a local model initial gradient parameter receiving unit for receiving local model initial gradient parameters sent by a central server;
a local model training unit for training a local model using a local training data set and the local model initial gradient parameters to obtain local model gradient parameters, wherein the local training data sets of the multi-party institutions correspond to the same users;
a local model gradient parameter sending unit for sending the local model gradient parameters to the central server, so that the central server performs weight aggregation on the local model gradient parameters of the multi-party institutions to obtain global gradient parameters, updates a model of the central server using the global gradient parameters, calculates a loss function, and judges whether the loss function converges; if yes, the model is taken as the bank loan model; if not, the updated global gradient parameters are taken as the local model initial gradient parameters of the multi-party institutions and sent to the corresponding institutions;
the local model initial gradient parameter receiving unit being further used for receiving the updated local model initial gradient parameters sent by the central server;
and an iterative training unit for performing again the step of training the local model using the local training data set and the local model initial gradient parameters.
13. A bank loan model construction system, characterized by comprising a plurality of multi-party institutions and a central server, wherein
any one institution of the multi-party institutions performs the bank loan model construction method according to any one of claims 6 to 11 when constructing the bank loan model; and
the central server performs the bank loan model construction method according to any one of claims 1 to 4 when constructing the bank loan model.
14. A loan amount calculation method, characterized by comprising the steps of,
receiving user data of a same target user sent by multi-party institutions;
calculating a maximum loan amount of the target user according to the user data and a bank loan model corresponding to the target user, wherein the bank loan model is constructed using the construction method of any one of claims 1 to 4 and 6 to 11.
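At inference time, claim 14 gathers the per-institution data for one target user and feeds it to the trained bank loan model to obtain the maximum loan amount. How the institutions' features are fused is not specified in the claim; the sketch below assumes simple concatenation and a stand-in model object, both purely illustrative.

import numpy as np

class StubLoanModel:
    # Stand-in for the trained bank loan model (illustration only).
    def predict(self, user_vector):
        return float(10000.0 * user_vector.sum())   # placeholder scoring rule

def max_loan_amount(bank_loan_model, institution_features):
    # Concatenate the features each institution holds for the same target user
    # (matched via the unique identifier of claim 15) and score them.
    user_vector = np.concatenate(institution_features)
    return bank_loan_model.predict(user_vector)

features_from_institutions = [np.array([0.3, 0.7]),    # e.g. deposit behaviour
                              np.array([5.2, 1.1]),    # e.g. repayment history
                              np.array([0.9])]         # e.g. income bracket
print(max_loan_amount(StubLoanModel(), features_from_institutions))
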
15. The method of claim 14, wherein receiving the user data of the same target user sent by the multi-party institutions further comprises,
receiving the user data and corresponding unique identifiers sent by the multi-party institutions, wherein the unique identifiers are obtained by the multi-party institutions encrypting identity information of the target user using an irreversible encryption algorithm;
and determining the user data belonging to the same target user according to the unique identifiers.
16. A loan amount calculation device, characterized by comprising,
a user data receiving unit for receiving user data of a same target user sent by multi-party institutions;
and a maximum loan amount calculation unit for calculating a maximum loan amount of the target user according to the user data and a bank loan model corresponding to the target user, wherein the bank loan model is constructed using the construction method of any one of claims 1 to 4 and 6 to 11.
17. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of claims 1 to 4, 6 to 11 when executing the computer program.
18. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1 to 4, 6 to 11.
CN202211570838.5A 2022-12-08 2022-12-08 Bank loan model construction method, loan amount calculation method, device and system Pending CN116051260A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211570838.5A CN116051260A (en) 2022-12-08 2022-12-08 Bank loan model construction method, loan amount calculation method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211570838.5A CN116051260A (en) 2022-12-08 2022-12-08 Bank loan model construction method, loan amount calculation method, device and system

Publications (1)

Publication Number Publication Date
CN116051260A true CN116051260A (en) 2023-05-02

Family

ID=86126343

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211570838.5A Pending CN116051260A (en) 2022-12-08 2022-12-08 Bank loan model construction method, loan amount calculation method, device and system

Country Status (1)

Country Link
CN (1) CN116051260A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116828453A (en) * 2023-06-30 2023-09-29 华南理工大学 Unmanned aerial vehicle edge computing privacy protection method based on self-adaptive nonlinear function
CN116828453B (en) * 2023-06-30 2024-04-16 华南理工大学 Unmanned aerial vehicle edge computing privacy protection method based on self-adaptive nonlinear function

Similar Documents

Publication Publication Date Title
Pieprzyk et al. Fundamentals of computer security
CN109787743B (en) Verifiable fully homomorphic encryption method based on matrix operation
Abid et al. RETRACTED ARTICLE: An optimised homomorphic CRT-RSA algorithm for secure and efficient communication
CN108259158A (en) Efficient and secret protection individual layer perceptron learning method under a kind of cloud computing environment
Nagaraju et al. Trusted framework for online banking in public cloud using multi-factor authentication and privacy protection gateway
CN109600228B (en) Anti-quantum-computation signature method and system based on public key pool
CN107359998A (en) A kind of foundation of portable intelligent password management system and operating method
CN112115201B (en) Transaction processing method and device based on block chain and transaction tracking method and device
CN107276752A (en) The methods, devices and systems that limitation key is decrypted are paid to cloud
CN108155994A (en) Safely outsourced computational methods applied to RSA decryption
Ganesh et al. Efficient adaptively secure zero-knowledge from garbled circuits
Barta et al. On succinct arguments and witness encryption from groups
CN116051260A (en) Bank loan model construction method, loan amount calculation method, device and system
CN109586918B (en) Anti-quantum-computation signature method and signature system based on symmetric key pool
Liu et al. DHSA: efficient doubly homomorphic secure aggregation for cross-silo federated learning
Shivaramakrishna et al. A novel hybrid cryptographic framework for secure data storage in cloud computing: Integrating AES-OTP and RSA with adaptive key management and Time-Limited access control
Sivasundari et al. RETRACTED ARTICLE: Hybrid aggregated signcryption scheme using multi-constraints differential evolution algorithm for security
Arulananth et al. Multi party secure data access management in cloud using user centric block chain data encryption
Gou et al. A novel quantum E-payment protocol based on blockchain
Zhang et al. Understanding privacy-preserving techniques in digital cryptocurrencies
Zehtabchi et al. A new method for privacy preserving association rule mining using homomorphic encryption with a secure communication protocol
Yang et al. Federated Medical Learning Framework Based on Blockchain and Homomorphic Encryption
CN108632033B (en) Homomorphic encryption method based on random weighted unitary matrix in outsourcing calculation
Aragona et al. Several proofs of security for a tokenization algorithm
Nikishova et al. Cryptographic Protection of Data Transmission Channel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination