CN113472805A - Model training method and device, storage medium and electronic equipment - Google Patents

Model training method and device, storage medium and electronic equipment

Info

Publication number
CN113472805A
CN113472805A CN202110794955.9A CN202110794955A
Authority
CN
China
Prior art keywords
model training
ciphertext
model
identity verification
cloud platform
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110794955.9A
Other languages
Chinese (zh)
Other versions
CN113472805B (en)
Inventor
车瑞红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bank of China Ltd
Original Assignee
Bank of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bank of China Ltd filed Critical Bank of China Ltd
Priority to CN202110794955.9A priority Critical patent/CN113472805B/en
Publication of CN113472805A publication Critical patent/CN113472805A/en
Application granted granted Critical
Publication of CN113472805B publication Critical patent/CN113472805B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/04 Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0428 Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/08 Network architectures or network communication protocols for network security for authentication of entities

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides a model training method and device, a storage medium and an electronic device, wherein the method comprises the following steps: sending encrypted model training data and an encrypted model training algorithm to a cloud platform, performing model training on the encrypted model training data through the cloud platform based on the encrypted model training algorithm to obtain an encrypted model, and then decrypting the encrypted model to obtain a target model. In this scheme, the model training data and the model training algorithm are transmitted to the cloud platform in encrypted form, and the cloud platform performs model training directly on the encrypted model training data based on the encrypted model training algorithm, so leakage of the model training data is avoided. Moreover, because the model training is performed on the cloud platform rather than locally, the resource consumption of the computer is reduced, which in turn reduces the impact on the running performance of the computer.

Description

Model training method and device, storage medium and electronic equipment
Technical Field
The present application relates to the field of artificial intelligence, and in particular, to a model training method and apparatus, a storage medium, and an electronic device.
Background
With the development of artificial intelligence, neural network models have been widely applied in various fields, including speech recognition, computer vision, medical treatment, intelligent gaming and the like. A neural network model must be trained before it can be put into use.
In the prior art, model training is usually performed locally, and the training process often occupies a large amount of computer resources, which affects the running performance of the computer and prevents other services from running normally.
Disclosure of Invention
The application provides a model training method and device, a storage medium and electronic equipment, aiming to solve the problem that training a model locally occupies a large amount of computer resources, thereby affecting the running performance of the computer and preventing other services from running normally.
In order to achieve the above object, the present application provides the following technical solutions:
a model training method, comprising:
obtaining model training data and a model training algorithm;
respectively encrypting the model training data and the model training algorithm to obtain a model training data ciphertext and a model training algorithm ciphertext;
sending the model training data ciphertext and the model training algorithm ciphertext to a cloud platform so that the cloud platform can train the model training data according to the model training algorithm ciphertext to obtain an encrypted model;
and acquiring an encryption model fed back by the cloud platform, and decrypting the encryption model to obtain a target model.
Optionally, the encrypting the model training data and the model training algorithm to obtain the model training data ciphertext and the model training algorithm ciphertext respectively includes:
respectively encrypting the model training data and the model training algorithm by using the encrypted public key to obtain a model training data ciphertext and a model training algorithm ciphertext;
the decrypting the encrypted model to obtain the target model includes:
and decrypting the encrypted model by using a decryption private key to obtain the target model.
Optionally, in the above method, the obtaining of the model training data and the model training algorithm includes:
responding to a model training trigger instruction of a user, and acquiring a user information ciphertext of the user;
sending an identity verification request to a cloud platform according to the user information ciphertext to trigger the cloud platform to verify the identity of the user according to each pre-stored user information ciphertext to generate an identity verification result;
receiving an identity verification result fed back by the cloud platform;
and if the identity verification result represents that the user passes the identity verification, obtaining model training data and a model training algorithm.
The above method, optionally, further includes:
if the identity verification result represents that the user does not pass the identity verification, outputting prompt information; the prompt message is used for prompting that the identity verification is not passed.
Optionally, in the above method, the triggering of the cloud platform to perform identity verification on the user according to each pre-stored user information ciphertext to generate an identity verification result includes:
and triggering the cloud platform to match the user information ciphertext with each pre-stored user information ciphertext, if the pre-stored user information ciphertext has a user information ciphertext matched with the user information ciphertext carried in the identity verification request, generating an identity verification result representing that the identity verification is passed, and if the pre-stored user information ciphertext does not have a user information ciphertext matched with the user information ciphertext carried in the identity verification request, generating an identity verification result representing that the identity verification is not passed.
In the foregoing method, optionally, the cloud platform includes a private cloud platform.
A model training apparatus comprising:
the acquisition unit is used for acquiring model training data and a model training algorithm;
the encryption unit is used for encrypting the model training data and the model training algorithm respectively to obtain a model training data ciphertext and a model training algorithm ciphertext;
the sending unit is used for sending the model training data ciphertext and the model training algorithm ciphertext to a cloud platform so that the cloud platform can train the model training data according to the model training algorithm ciphertext to obtain an encrypted model;
and the decryption unit is used for decrypting the encryption model after receiving the encryption model fed back by the cloud platform to obtain the target model.
Optionally, the above apparatus, wherein the obtaining unit is specifically configured to:
responding to a model training trigger instruction of a user, and acquiring a user information ciphertext of the user;
sending an identity verification request to a cloud platform according to the user information ciphertext of the user to trigger the cloud platform to verify the identity of the user according to each pre-stored user information ciphertext to generate an identity verification result;
receiving an identity verification result fed back by the cloud platform;
and if the identity verification result represents that the user passes the identity verification, obtaining model training data and a model training algorithm.
A storage medium storing a set of instructions, wherein the set of instructions, when executed by a processor, implements any of the model training methods described above.
An electronic device, comprising:
a memory for storing at least one set of instructions;
a processor for executing a set of instructions stored in the memory, the execution of the set of instructions implementing any of the model training methods described above.
Compared with the prior art, the method has the following advantages:
the application provides a model training method and a device, wherein the method comprises the following steps: and sending the encrypted model training data and the encrypted model training algorithm to the cloud platform, performing model training on the encrypted model training data through the cloud platform based on the encrypted model training algorithm to obtain an encrypted model, and then decrypting the encrypted model to obtain the target model. Therefore, according to the scheme, the model training data and the model training algorithm are transmitted to the cloud platform in an encrypted mode, the cloud platform conducts model training on the encrypted model training data directly based on the encrypted model training algorithm, so that model training data leakage is avoided, and the model training is conducted on the cloud platform instead of locally, so that the resource loss of a computer is reduced, and the influence on the operation performance of the computer is further reduced.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from the provided drawings without creative effort.
FIG. 1 is a method flow diagram of a model training method provided herein;
FIG. 2 is a flow chart of yet another method of a model training method provided herein;
FIG. 3 is an exemplary diagram of a model training method provided herein;
FIG. 4 is a schematic structural diagram of a model training apparatus provided in the present application;
FIG. 5 is a schematic structural diagram of an electronic device provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the disclosure of the present application are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an" and "the" in the disclosure herein are exemplary rather than limiting, and those skilled in the art will understand them to mean "one or more" unless the context clearly dictates otherwise.
The application is operational with numerous general purpose or special purpose computing device environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multi-processor apparatus, distributed computing environments that include any of the above devices or equipment, and the like.
The embodiment of the present application provides a model training method, which may be applied to a variety of system platforms, and an execution subject of the method may be a computer terminal or a processor of various mobile devices, and a flowchart of the method is shown in fig. 1, and specifically includes:
s101, obtaining model training data and a model training algorithm.
In this embodiment, model training data and a model training algorithm for model training are obtained.
The model training algorithm may be obtained as an algorithm selected by the user through a display interface, or as an algorithm entered by the user through the display interface.
The model training data may be obtained as data uploaded by the user through the display interface, or as data stored locally in advance.
Referring to fig. 2, the process of obtaining model training data and a model training algorithm specifically includes the following steps:
s201, responding to a model training trigger instruction of a user, and acquiring a user information ciphertext of the user.
In this embodiment, an encryption public key and a decryption private key are generated in advance, where the encryption public key and the decryption private key are generated based on an encryption algorithm, where the encryption algorithm includes, but is not limited to, a homomorphic encryption algorithm.
Optionally, the encrypted public key may be sent to a cloud platform, where the cloud platform may be a private cloud platform or a public cloud platform.
In this embodiment, after a model training trigger instruction of a user is received, the instruction is responded to in order to obtain a user information ciphertext of the user. Specifically, the user information of the user is obtained and encrypted to obtain the user information ciphertext. The user information is information representing the identity of the user; optionally, it may be a user identity card number or a user work number.
The encrypting the user information specifically includes: and encrypting the user information by using the pre-generated encryption public key.
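For illustration only, and not as part of the claimed method, the sketch below shows what S201 could look like with the open-source python-paillier package standing in for the homomorphic encryption algorithm; the patent does not name a specific scheme, and the variable names and sample work number are assumptions. The same kind of key pair is assumed in the later sketches.

```python
# Sketch of S201 under the assumption of a Paillier-style homomorphic scheme
# (python-paillier / "phe"); the patent does not prescribe a concrete algorithm.
from phe import paillier

# Generate the encryption public key and the decryption private key locally.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# User information representing the user's identity, e.g. an ID or work number
# (hypothetical value used purely for illustration).
user_info = 20210714

# Encrypt the user information with the pre-generated encryption public key.
user_info_ciphertext = public_key.encrypt(user_info)

# Only the ciphertext (and, optionally, the public key) leaves the terminal;
# the decryption private key is stored locally and never published.
```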
S202, sending an identity verification request to the cloud platform according to the user information ciphertext to trigger the cloud platform to perform identity verification on the user according to each pre-stored user information ciphertext, and generating an identity verification result.
In this embodiment, an identity verification request is sent to the cloud platform according to the user information ciphertext, where the identity verification request carries the user information ciphertext.
After receiving the identity verification request, the cloud platform performs identity verification on the user according to each pre-stored user information ciphertext. Specifically, the user information ciphertext carried in the request is matched against the pre-stored user information ciphertexts. If a pre-stored user information ciphertext matches the user information ciphertext carried in the identity verification request, it is determined that the user passes the identity verification, and an identity verification result indicating that the user passes the identity verification is generated; if no pre-stored user information ciphertext matches the user information ciphertext carried in the identity verification request, it is determined that the user does not pass the identity verification, and an identity verification result indicating that the user does not pass the identity verification is generated.
The pre-stored user information ciphertexts are the user information ciphertexts uploaded when the users register with the private cloud platform.
In this embodiment, the cloud platform directly performs identity verification on the encrypted user information without decrypting the user information, thereby preventing leakage of the cloud platform user information.
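The matching step itself can be sketched in plain Python. Note that a probabilistic scheme such as Paillier yields different ciphertexts for the same plaintext, so the sketch below assumes the user information ciphertext behaves as a stable opaque token (for example, the same ciphertext uploaded at registration is presented again in the request); the function and parameter names are hypothetical.

```python
import hmac

def verify_identity(request_ciphertext: bytes, stored_ciphertexts: list) -> bool:
    """Cloud-side check of S202: match the ciphertext carried in the identity
    verification request against each pre-stored user information ciphertext,
    without ever decrypting the user information."""
    for stored in stored_ciphertexts:
        # Constant-time comparison of the opaque ciphertext tokens.
        if hmac.compare_digest(request_ciphertext, stored):
            return True   # identity verification passed
    return False          # identity verification not passed
```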
In this embodiment, the cloud platform feeds back the identity verification result of the identity verification performed on the user.
And S203, receiving an identity verification result fed back by the cloud platform.
And S204, judging whether the identity verification result indicates that the user passes the identity verification, if not, executing S205, and if so, executing S206.
And S205, outputting the prompt message.
In this embodiment, if the identity verification result indicates that the user fails to pass the identity verification, a prompt message is output, where the prompt message is used to prompt the user that the user fails to pass the identity verification.
And S206, obtaining model training data and a model training algorithm.
In this embodiment, if the user identity verification result indicates that the user passes the identity verification, model training data and a model training algorithm for model training are obtained.
S102, respectively encrypting the model training data and the model training algorithm to obtain a model training data ciphertext and a model training algorithm ciphertext.
In this embodiment, the model training data and the model training algorithm are encrypted respectively to obtain a model training data ciphertext and a model training algorithm ciphertext, that is, the model training data is encrypted to obtain a model training data ciphertext, and the model training algorithm is encrypted to obtain a model training algorithm ciphertext.
In this embodiment, the process of encrypting the model training data and the model training algorithm to obtain the model training data ciphertext and the model training algorithm ciphertext respectively includes: and respectively encrypting the model training data and the model training algorithm by using the encryption public key to obtain a model training data ciphertext and a model training algorithm ciphertext, wherein the encryption public key is generated based on an encryption algorithm, and the encryption algorithm comprises but is not limited to a homomorphic encryption algorithm.
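A minimal sketch of S102 under the same Paillier assumption as above; how the model training algorithm is encrypted is not detailed in the patent, so the sketch simply represents it as a numeric hyperparameter vector encrypted with the same public key, which is an assumption made only for illustration.

```python
# Sketch of S102: encrypt the model training data and the model training
# algorithm with the encryption public key (python-paillier assumed).
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

training_data = [
    [1.0, 2.5, 0.3],   # sample 1 features (illustrative values)
    [0.7, 1.1, 4.2],   # sample 2 features
]

# Element-wise encryption yields the model training data ciphertext.
training_data_ciphertext = [
    [public_key.encrypt(value) for value in row] for row in training_data
]

# Hypothetical stand-in for the model training algorithm: its hyperparameters,
# encrypted with the same public key to form the algorithm ciphertext.
algorithm_params = [0.01, 100]   # e.g. learning rate, iteration count
algorithm_ciphertext = [public_key.encrypt(p) for p in algorithm_params]
```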
S103, sending the model training data ciphertext and the model training algorithm ciphertext to the cloud platform, so that the cloud platform can train the model training data according to the model training algorithm ciphertext to obtain the encrypted model.
In this embodiment, the model training data ciphertext and the model training algorithm ciphertext are sent to the cloud platform.
After receiving the model training data ciphertext and the model training algorithm ciphertext, the cloud platform directly trains the model training data based on the model training algorithm ciphertext, so that the encrypted model is obtained.
And the cloud platform feeds back the encryption model.
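The patent does not describe how the cloud platform trains on ciphertexts. Purely as an illustration of the kind of computation an additively homomorphic scheme permits, the sketch below assumes the training step reduces to adding ciphertexts and multiplying them by plaintext scalars (for example, averaging encrypted per-sample gradient contributions into encrypted model parameters); the cloud side never holds the decryption private key, and the function name is hypothetical.

```python
# Cloud-side sketch of S103 (illustrative only): derive encrypted model
# parameters from encrypted per-sample contributions, without the private key.
# `encrypted_gradients` is assumed to be a list of lists of python-paillier
# EncryptedNumber objects received from the client terminal.

def aggregate_encrypted_gradients(encrypted_gradients):
    n_samples = len(encrypted_gradients)
    n_features = len(encrypted_gradients[0])
    encrypted_params = []
    for j in range(n_features):
        # Homomorphic addition of ciphertexts for feature j.
        total = encrypted_gradients[0][j]
        for i in range(1, n_samples):
            total = total + encrypted_gradients[i][j]
        # Multiplication by a plaintext scalar is also supported.
        encrypted_params.append(total * (1.0 / n_samples))
    return encrypted_params   # the "encrypted model" fed back to the client
```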
And S104, after receiving the encryption model fed back by the cloud platform, decrypting the encryption model to obtain the target model.
In this embodiment, after the encryption model fed back by the cloud platform is received, the encryption model is decrypted, specifically by using a decryption private key, to obtain the target model. The decryption private key is generated based on an encryption algorithm, where the encryption algorithm includes, but is not limited to, a homomorphic encryption algorithm.
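The following is a self-contained round-trip check of the property that S103 and S104 rely on, again under the Paillier assumption: computing on ciphertexts and then decrypting with the locally held decryption private key gives the same result as computing on the plaintexts.

```python
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

plaintext_values = [0.5, 1.5, 2.0]
ciphertexts = [public_key.encrypt(v) for v in plaintext_values]

# A cloud-style computation performed entirely on ciphertexts (S103).
encrypted_mean = (ciphertexts[0] + ciphertexts[1] + ciphertexts[2]) * (1.0 / 3)

# S104: decrypting with the decryption private key recovers the plaintext result.
assert abs(private_key.decrypt(encrypted_mean) - sum(plaintext_values) / 3) < 1e-6
```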
According to the model training method provided by the embodiment of the application, the model training data and the model training algorithm are encrypted and transmitted to the cloud platform, and the cloud platform performs model training directly on the encrypted model training data based on the encrypted model training algorithm, thereby avoiding leakage of the model training data. Because the model training is performed on the cloud platform rather than locally, the resource consumption of the computer is reduced, which in turn reduces the impact on the running performance of the computer. In addition, because the decryption private key is not published externally and is stored only locally, the source of any data leak can be traced directly.
It should be noted that the cloud platform mentioned in the embodiment of the present application may be a public cloud platform or a private cloud platform. Since a private cloud offers higher security than a public cloud, a private cloud platform is preferably used for model training, so as to further improve data security.
Referring to fig. 3, a model training process provided by the embodiment of the present application is illustrated as follows:
for the private cloud user 1, the terminal corresponding to the private cloud user 1 generates an encrypted public key pk1 and a decrypted private key sk1 corresponding to the private cloud user 1 based on a homomorphic encryption algorithm, and stores the encrypted public key pk1 and the decrypted private key sk1, and optionally, the encrypted public key pk1 may be published to the outside.
For the private cloud user 2, the terminal corresponding to the private cloud user 2 generates an encrypted public key pk2 and a decrypted private key sk2 corresponding to the private cloud user 2 based on a homomorphic encryption algorithm, and stores the encrypted public key pk2 and the decrypted private key sk2, and optionally, the encrypted public key pk2 may be published to the outside.
By analogy, for the private cloud user n, the terminal corresponding to the private cloud user n generates the encrypted public key pkn and the decrypted private key skn corresponding to the private cloud user n based on the homomorphic encryption algorithm, and stores the encrypted public key pkn and the decrypted private key skn, and optionally, the encrypted public key pkn may be published to the outside.
The terminal corresponding to each private cloud user comprises a user registration module, a homomorphic encryption module and a model training algorithm module. The user registration module is used for encrypting the user information with the encryption public key to obtain a user information ciphertext and sending the user information ciphertext to the private cloud platform to complete registration with the private cloud platform. The homomorphic encryption module is used for encrypting the user information and the model training data with the encryption public key corresponding to the private cloud user, sending the encrypted user information and the encrypted model training data to the private cloud platform, and decrypting the encryption model fed back by the private cloud platform with the corresponding decryption private key to obtain the target model. The model training algorithm module is used for encrypting the model training algorithm with the encryption public key.
The private cloud platform comprises an identity authentication module and a model training module, wherein the identity authentication module is used for carrying out identity verification on a user according to a pre-stored user information ciphertext; and the model training module is used for training the encrypted model training data based on the encrypted model training algorithm to obtain an encrypted model.
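Purely to illustrate how the modules of FIG. 3 might fit together, the sketch below groups the client-side user registration, homomorphic encryption and model training algorithm modules into one class, and the private cloud platform's identity authentication and model training modules into another. Every class, method and field name is hypothetical and not taken from the patent, and the cloud-side "training" is the same simplified homomorphic aggregation used above.

```python
from phe import paillier

class PrivateCloudUserTerminal:
    """Terminal for one private cloud user (FIG. 3, client side)."""

    def __init__(self):
        # Per-user key pair; the decryption private key is stored locally only.
        self.public_key, self._private_key = paillier.generate_paillier_keypair()

    def register(self, user_info: int):
        # User registration module: produce the user information ciphertext.
        return self.public_key.encrypt(user_info)

    def encrypt_training_job(self, training_data, algorithm_params):
        # Homomorphic encryption module + model training algorithm module.
        data_ct = [[self.public_key.encrypt(v) for v in row] for row in training_data]
        algo_ct = [self.public_key.encrypt(p) for p in algorithm_params]
        return data_ct, algo_ct

    def decrypt_model(self, encrypted_model):
        # Decrypt the encryption model fed back by the private cloud platform.
        return [self._private_key.decrypt(p) for p in encrypted_model]


class PrivateCloudPlatform:
    """Private cloud platform (identity authentication and model training modules)."""

    def __init__(self):
        self.registered_ciphertexts = []

    def register(self, user_info_ciphertext):
        # Store the user information ciphertext uploaded at registration.
        self.registered_ciphertexts.append(user_info_ciphertext)

    def train(self, data_ct, algo_ct):
        # Model training module: simplified homomorphic aggregation; the patent
        # does not specify the encrypted training procedure.
        n = len(data_ct)
        return [sum(col[1:], col[0]) * (1.0 / n) for col in zip(*data_ct)]
```

In this sketch a terminal would call register, encrypt_training_job and decrypt_model in turn, while the platform exposes register and train; this mirrors the message flow of FIG. 3 without claiming to reproduce the patent's internal procedures.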
It should be noted that while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous.
It should be understood that the various steps recited in the method embodiments disclosed herein may be performed in a different order and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the disclosure is not limited in this respect.
Corresponding to the method described in fig. 1, an embodiment of the present application further provides a model training apparatus, which is used for implementing the method in fig. 1 specifically, and a schematic structural diagram of the model training apparatus is shown in fig. 4, and specifically includes:
an obtaining unit 401, configured to obtain model training data and a model training algorithm;
an encrypting unit 402, configured to encrypt the model training data and the model training algorithm respectively to obtain a model training data ciphertext and a model training algorithm ciphertext;
a sending unit 403, configured to send the model training data ciphertext and the model training algorithm ciphertext to a cloud platform, so that the cloud platform trains the model training data according to the model training algorithm ciphertext to obtain an encrypted model;
and the decryption unit 404 is configured to obtain the encryption model fed back by the cloud platform, and decrypt the encryption model to obtain the target model.
According to the model training device provided by the embodiment of the application, the model training data and the model training algorithm are encrypted and transmitted to the cloud platform, and the cloud platform performs model training directly on the encrypted model training data based on the encrypted model training algorithm, thereby avoiding leakage of the model training data. Because the model training is performed on the cloud platform rather than locally, the resource consumption of the computer is reduced, which in turn reduces the impact on the running performance of the computer. In addition, because the decryption private key is not published externally and is stored only locally, the source of any data leak can be traced directly.
In an embodiment of the present application, based on the foregoing scheme, the encryption unit 402 is specifically configured to:
respectively encrypting the model training data and the model training algorithm by using the encrypted public key to obtain a model training data ciphertext and a model training algorithm ciphertext;
the decryption unit 404 is specifically configured to:
and decrypting the encrypted model by using a decryption private key to obtain the target model.
In an embodiment of the present application, based on the foregoing scheme, the obtaining unit 401 is specifically configured to:
responding to a model training trigger instruction of a user, and acquiring a user information ciphertext of the user;
sending an identity verification request to a cloud platform according to the user information ciphertext of the user to trigger the cloud platform to verify the identity of the user according to each pre-stored user information ciphertext to generate an identity verification result;
receiving an identity verification result fed back by the cloud platform;
and if the identity verification result represents that the user passes the identity verification, obtaining model training data and a model training algorithm.
In an embodiment of the present application, based on the foregoing scheme, the apparatus may further include:
the output unit is used for outputting prompt information if the identity verification result represents that the user does not pass the identity verification; the prompt message is used for prompting that the identity verification is not passed.
In an embodiment of the application, based on the foregoing scheme, the sending unit 403, in triggering the cloud platform to perform identity verification on the user according to each pre-stored user information ciphertext and generate an identity verification result, is specifically configured to:
and triggering the cloud platform to match the user information ciphertext with each pre-stored user information ciphertext, if the pre-stored user information ciphertext has a user information ciphertext matched with the user information ciphertext carried in the identity verification request, generating an identity verification result representing that the identity verification is passed, and if the pre-stored user information ciphertext does not have a user information ciphertext matched with the user information ciphertext carried in the identity verification request, generating an identity verification result representing that the identity verification is not passed.
In an embodiment of the present application, based on the foregoing solution, the cloud platform includes a private cloud platform.
An embodiment of the present application further provides a storage medium, where the storage medium includes stored instructions, where the following operations are performed when the instructions are executed:
obtaining model training data and a model training algorithm;
respectively encrypting the model training data and the model training algorithm to obtain a model training data ciphertext and a model training algorithm ciphertext;
sending the model training data ciphertext and the model training algorithm ciphertext to a cloud platform so that the cloud platform can train the model training data according to the model training algorithm ciphertext to obtain an encrypted model;
and acquiring an encryption model fed back by the cloud platform, and decrypting the encryption model to obtain a target model.
An embodiment of the present application further provides an electronic device, a schematic structural diagram of which is shown in FIG. 5. The electronic device specifically includes a memory 501, configured to store at least one instruction set, and a processor 502, configured to execute an instruction set stored in the memory, where execution of the instruction set performs the following operations:
obtaining model training data and a model training algorithm;
respectively encrypting the model training data and the model training algorithm to obtain a model training data ciphertext and a model training algorithm ciphertext;
sending the model training data ciphertext and the model training algorithm ciphertext to a cloud platform so that the cloud platform can train the model training data according to the model training algorithm ciphertext to obtain an encrypted model;
and acquiring an encryption model fed back by the cloud platform, and decrypting the encryption model to obtain a target model.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
While several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
The foregoing description is only exemplary of the preferred embodiments disclosed herein and illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to the particular combination of features described above, but also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by interchanging the above features with (but not limited to) technical features having similar functions disclosed in the present disclosure.

Claims (10)

1. A method of model training, comprising:
obtaining model training data and a model training algorithm;
respectively encrypting the model training data and the model training algorithm to obtain a model training data ciphertext and a model training algorithm ciphertext;
sending the model training data ciphertext and the model training algorithm ciphertext to a cloud platform so that the cloud platform can train the model training data according to the model training algorithm ciphertext to obtain an encrypted model;
and acquiring an encryption model fed back by the cloud platform, and decrypting the encryption model to obtain a target model.
2. The method of claim 1, wherein the encrypting the model training data and the model training algorithm to obtain the model training data ciphertext and the model training algorithm ciphertext respectively comprises:
respectively encrypting the model training data and the model training algorithm by using the encrypted public key to obtain a model training data ciphertext and a model training algorithm ciphertext;
the decrypting the encrypted model to obtain the target model includes:
and decrypting the encrypted model by using a decryption private key to obtain the target model.
3. The method of claim 1, wherein the obtaining model training data and model training algorithms comprises:
responding to a model training trigger instruction of a user, and acquiring a user information ciphertext of the user;
sending an identity verification request to a cloud platform according to the user information ciphertext to trigger the cloud platform to verify the identity of the user according to each pre-stored user information ciphertext to generate an identity verification result;
receiving an identity verification result fed back by the cloud platform;
and if the identity verification result represents that the user passes the identity verification, obtaining model training data and a model training algorithm.
4. The method of claim 3, further comprising:
if the identity verification result represents that the user does not pass the identity verification, outputting prompt information; the prompt message is used for prompting that the identity verification is not passed.
5. The method according to claim 3, wherein the triggering the cloud platform to perform identity verification on the user according to each pre-stored user information ciphertext to generate an identity verification result includes:
and triggering the cloud platform to match the user information ciphertext with each pre-stored user information ciphertext, if the pre-stored user information ciphertext has a user information ciphertext matched with the user information ciphertext carried in the identity verification request, generating an identity verification result representing that the identity verification is passed, and if the pre-stored user information ciphertext does not have a user information ciphertext matched with the user information ciphertext carried in the identity verification request, generating an identity verification result representing that the identity verification is not passed.
6. The method of any one of claims 1 to 5, wherein the cloud platform comprises a private cloud platform.
7. A model training apparatus, comprising:
the acquisition unit is used for acquiring model training data and a model training algorithm;
the encryption unit is used for encrypting the model training data and the model training algorithm respectively to obtain a model training data ciphertext and a model training algorithm ciphertext;
the sending unit is used for sending the model training data ciphertext and the model training algorithm ciphertext to a cloud platform so that the cloud platform can train the model training data according to the model training algorithm ciphertext to obtain an encrypted model;
and the decryption unit is used for decrypting the encryption model after receiving the encryption model fed back by the cloud platform to obtain the target model.
8. The apparatus according to claim 7, wherein the obtaining unit is specifically configured to:
responding to a model training trigger instruction of a user, and acquiring a user information ciphertext of the user;
sending an identity verification request to a cloud platform according to the user information ciphertext of the user to trigger the cloud platform to verify the identity of the user according to each pre-stored user information ciphertext to generate an identity verification result;
receiving an identity verification result fed back by the cloud platform;
and if the identity verification result represents that the user passes the identity verification, obtaining model training data and a model training algorithm.
9. A storage medium storing a set of instructions, wherein the set of instructions, when executed by a processor, implement the model training method of any one of claims 1 to 6.
10. An electronic device, comprising:
a memory for storing at least one set of instructions;
a processor for executing a set of instructions stored in the memory, the set of instructions being executable to implement the model training method of any one of claims 1 to 6.
CN202110794955.9A 2021-07-14 2021-07-14 Model training method and device, storage medium and electronic equipment Active CN113472805B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110794955.9A CN113472805B (en) 2021-07-14 2021-07-14 Model training method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110794955.9A CN113472805B (en) 2021-07-14 2021-07-14 Model training method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113472805A true CN113472805A (en) 2021-10-01
CN113472805B CN113472805B (en) 2022-11-18

Family

ID=77880167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110794955.9A Active CN113472805B (en) 2021-07-14 2021-07-14 Model training method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113472805B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116151370A (en) * 2023-04-24 2023-05-23 西南石油大学 Model parameter optimization selection system
WO2024000571A1 (en) * 2022-07-01 2024-01-04 Intel Corporation Network architecture for artificial intelligence model protection

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107124276A (en) * 2017-04-07 2017-09-01 西安电子科技大学 A kind of safe data outsourcing machine learning data analysis method
CN111241559A (en) * 2020-01-07 2020-06-05 深圳壹账通智能科技有限公司 Training model protection method, device, system, equipment and computer storage medium
CN111371734A (en) * 2018-12-26 2020-07-03 美的集团股份有限公司 Identity verification and upgrade method, medium, cloud platform, equipment and upgrade server
CN111898145A (en) * 2020-07-22 2020-11-06 苏州浪潮智能科技有限公司 Neural network model training method, device, equipment and medium
CN111914281A (en) * 2020-08-18 2020-11-10 中国银行股份有限公司 Bayes model training method and device based on block chain and homomorphic encryption
CN112822005A (en) * 2021-02-01 2021-05-18 福州大学 Secure transfer learning system based on homomorphic encryption

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107124276A (en) * 2017-04-07 2017-09-01 西安电子科技大学 A kind of safe data outsourcing machine learning data analysis method
CN111371734A (en) * 2018-12-26 2020-07-03 美的集团股份有限公司 Identity verification and upgrade method, medium, cloud platform, equipment and upgrade server
CN111241559A (en) * 2020-01-07 2020-06-05 深圳壹账通智能科技有限公司 Training model protection method, device, system, equipment and computer storage medium
CN111898145A (en) * 2020-07-22 2020-11-06 苏州浪潮智能科技有限公司 Neural network model training method, device, equipment and medium
CN111914281A (en) * 2020-08-18 2020-11-10 中国银行股份有限公司 Bayes model training method and device based on block chain and homomorphic encryption
CN112822005A (en) * 2021-02-01 2021-05-18 福州大学 Secure transfer learning system based on homomorphic encryption

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024000571A1 (en) * 2022-07-01 2024-01-04 Intel Corporation Network architecture for artificial intelligence model protection
CN116151370A (en) * 2023-04-24 2023-05-23 西南石油大学 Model parameter optimization selection system
CN116151370B (en) * 2023-04-24 2023-07-21 西南石油大学 Model parameter optimization selection system

Also Published As

Publication number Publication date
CN113472805B (en) 2022-11-18

Similar Documents

Publication Publication Date Title
US10003582B2 (en) Technologies for synchronizing and restoring reference templates
CN111654367B (en) Method for cryptographic operation and creation of working key, cryptographic service platform and device
CN109039628A (en) Cryptographic key negotiation method, Cloud Server, equipment, storage medium and system
EP3232634A1 (en) Identity authentication method and device
CN113472805B (en) Model training method and device, storage medium and electronic equipment
CN110535641B (en) Key management method and apparatus, computer device, and storage medium
CN107360002B (en) Application method of digital certificate
CN107171796A (en) A kind of many KMC key recovery methods
CN115022102B (en) Transmission line monitoring data transmission method and device, computer equipment and storage medium
CN113961956A (en) Method, device, equipment and medium for generating and applying tagged network information service
CN112491529A (en) Data file encryption and integrity verification method and system used in untrusted server environment
CN112948883B (en) Method, device and system for multiparty joint modeling of privacy data protection
CN111079178A (en) Method for desensitizing and backtracking trusted electronic medical record
CN113672957A (en) Method, device and equipment for processing buried point data and storage medium
CN111159727B (en) Multi-party cooperation oriented Bayes classifier safety generation system and method
CN116340918A (en) Full-secret-text face comparison method, device, equipment and storage medium
CN115022012B (en) Data transmission method, device, system, equipment and storage medium
CN114157473A (en) Biometric technology sharing and verification method, system, device and medium
CN113672954A (en) Feature extraction method and device and electronic equipment
CN109242591B (en) Shared unmanned aerial vehicle renting method, device and system
Swathi et al. Privacy-Cheating Discouragement: A New Homomorphic Encryption Scheme for Cloud Data Security
CN113051587A (en) Privacy protection intelligent transaction recommendation method, system and readable medium
CN113761513A (en) Data processing method, device, equipment and computer readable storage medium
CN113434177A (en) Medical software updating method and device based on medical data safety
CN113746836B (en) Data holding verification method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant