CN116861485A - Student information privacy protection method based on deep learning fusion - Google Patents

Student information privacy protection method based on deep learning fusion

Info

Publication number
CN116861485A
Authority
CN
China
Prior art keywords
data
key
information
algorithm
encryption
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310924681.XA
Other languages
Chinese (zh)
Inventor
谢升余
潘赟
高春芳
党中华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiaxing Vocational and Technical College
Original Assignee
Jiaxing Vocational and Technical College
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiaxing Vocational and Technical College filed Critical Jiaxing Vocational and Technical College
Priority to CN202310924681.XA priority Critical patent/CN116861485A/en
Priority to ZA2023/08103A priority patent/ZA202308103B/en
Publication of CN116861485A publication Critical patent/CN116861485A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/554Detecting local intrusion or implementing counter-measures involving event detection and direct action
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/604Tools and structures for managing or administering access control systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • G06Q50/205Education administration or guidance

Abstract

The invention discloses a student information privacy protection method based on deep learning fusion, which combines deep learning with blockchain technology to keep students' personal information secure. Student personal information uploaded by a user is pre-processed, and a deep learning algorithm is trained on the pre-processed data, mainly to encrypt and decrypt the personal information. The encrypted information is stored on a blockchain in a Hyperledger Fabric network, and a smart contract together with an access control policy manages the access rights to the student information, so that only authorized users can view or modify it. The immutability and distributed nature of the blockchain further ensure the security and privacy of the students' personal information, and when an authorized user needs to use the information, the encrypted data is decrypted with the corresponding key and the deep learning model.

Description

Student information privacy protection method based on deep learning fusion
Technical Field
The invention relates to the technical field of data security, in particular to a student information privacy protection method based on deep learning fusion.
Background
In modern educational environments, the storage and processing of students' personal information has become an important issue. Traditional storage and processing approaches, however, carry a risk of data leakage, which threatens students' privacy. Blockchain technology provides a distributed and tamper-proof way of storing data and has been widely applied in various fields; Hyperledger Fabric is an open-source blockchain framework that offers customizable and scalable blockchain solutions.
Disclosure of Invention
In order to achieve the above purpose, the invention is realized by the following technical scheme: a student information privacy protection method based on deep learning fusion, in which a user side uploads data, the uploaded content comprising data codes, numerical features and data set classification content;
a series of preprocessing operations, covering data cleaning, feature extraction, conversion and validity checks, is carried out on the uploaded content (an illustrative sketch of steps S1-S4 is given after step S4);
s1: checking and processing errors, missing values, outliers, etc. in the data, which may involve repairing errors, filling missing values, deleting outliers, or processing outliers to ensure quality and consistency of the data, otherwise ending;
s2: extracting relevant features from the original data, capturing key information in the data, and converting the data into a form which can be processed by a machine learning algorithm, wherein the form comprises numerical features, classification features, text features and the like;
s3: the data is subjected to standardized conversion including feature scaling, discretization, normalization and the like so as to better adapt to the requirements of a machine learning algorithm, and the conversion can improve the performance and convergence speed of the algorithm and ensure that different features have similar scales;
s4: the data set is divided into a training set, a verification set and a test set, wherein the training set is used for training and parameter adjustment of a model, the verification set is used for selecting and optimizing the model, the test set is used for evaluating the performance of a final model, the data set is divided to ensure that the model is effectively evaluated and verified, the classification characteristic is encoded and converted into a digital form which can be understood by an algorithm, common encoding methods comprise single-heat encoding, label encoding and the like, and the encoded data can be correctly processed and analyzed by the algorithm.
Preferably, after the data preprocessing is completed, a deep learning algorithm, such as a Convolutional Neural Network (CNN) or a Recurrent Neural Network (RNN), is used to perform data model training on the preprocessed data to identify private and non-private data, so as to encrypt and decrypt a part of personal privacy information of the student;
the CNN is used for extracting characteristics in the personal information, the characteristics are used as input and transmitted to an encryption algorithm, the complexity of the personal information can be increased by using abstract characteristics learned by the CNN, so that the security is improved, and then the characteristics are encrypted and decrypted by using a corresponding secret key to restore the original personal information;
the RNN is used for processing the coded personal information sequence, inputting the sequence data into the RNN for encryption, decrypting the encrypted sequence data by using a corresponding key and a decryption algorithm, and restoring the encrypted sequence data into the original personal information sequence by the RNN through reverse operation.
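Purely as a sketch, and not the patent's reference architecture, a small recurrent classifier of the kind described above could look as follows in PyTorch; the vocabulary size, layer sizes, character-level encoding and class labels are illustrative assumptions.

```python
# Sketch of an RNN-based classifier that flags a text field as private or
# non-private (PyTorch; architecture and sizes are illustrative).
import torch
import torch.nn as nn

class PrivacyClassifier(nn.Module):
    def __init__(self, vocab_size=128, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, 2)        # 2 classes: private / non-private

    def forward(self, x):                         # x: (batch, seq_len) of character ids
        emb = self.embed(x)
        _, h = self.rnn(emb)                      # h: (1, batch, hidden_dim)
        return self.fc(h.squeeze(0))              # class logits

# Example: encode an ASCII field and score it
model = PrivacyClassifier()
field = "id_no: 330402200501012345"               # hypothetical sensitive field
ids = torch.tensor([[min(ord(c), 127) for c in field]])
logits = model(ids)
is_private = logits.argmax(dim=1).item() == 0     # assuming class 0 = private
```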
Preferably, in the training stage of the data model, in order to effectively prevent leakage of the training data while preserving its usability, a differential privacy training method is used to reduce the degree of correlation between the shared weights and the original data, as follows (an illustrative sketch is given after step (3)):
(1) Calculating the gradient: g_t(x_k) = ∇_θ ℓ(θ_{t-1}, x_k), where ℓ is the training loss, θ_{t-1} denotes the model parameters after initialization, x_k is a training sample, T is the total number of iterations, k ∈ [0, length(data(T_i))], and T_i, i ∈ [0, n] is the number of training passes;
(2) Clipping the gradient g_t(x_k) and adding Gaussian noise N(0, σ²C²I) to obtain g′_t(x_k), where C is the clipping constant;
(3) Updating the weights: θ_t ← θ_{t-1} − η·T_i·g′_t(x_k), where η is the learning rate.
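A minimal NumPy sketch of one update following steps (1)-(3) is given below; the toy loss gradient, clipping constant C, noise scale σ and learning rate η are placeholder values chosen only for illustration.

```python
# One differentially private update: per-sample gradient, clipping to norm C,
# Gaussian noise N(0, sigma^2 C^2 I), then the weight update with rate eta.
import numpy as np

def dp_update(theta, grad_fn, x_k, C=1.0, sigma=1.0, eta=0.01, T_i=1):
    g = grad_fn(theta, x_k)                            # (1) gradient g_t(x_k)
    g = g / max(1.0, np.linalg.norm(g) / C)            # (2) clip to L2 norm <= C
    g = g + np.random.normal(0.0, sigma * C, g.shape)  # (2) add N(0, sigma^2 C^2 I)
    return theta - eta * T_i * g                       # (3) weight update

# Toy usage with a squared-error loss gradient (purely illustrative)
grad_fn = lambda theta, x: 2 * (theta - x)
theta = np.zeros(4)
theta = dp_update(theta, grad_fn, x_k=np.ones(4))
```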
Preferably, the data is encrypted with the AES symmetric encryption algorithm and the AES algorithm key is encrypted with the RSA asymmetric encryption algorithm; the encryption and decryption policy specifically includes:
the uploaded data is encrypted using the AES symmetric encryption algorithm, and the AES key is encrypted and stored using the RSA public key and decrypted using the RSA private key;
the AES-256 algorithm is adopted, with a 256-bit key; the plaintext P is the file to be encrypted, and the ciphertext E is the result of encrypting the plaintext P with the AES algorithm;
the AES algorithm key K is generated according to a set rule, and the key ciphertext S is generated by encrypting the AES key with the RSA public key;
let the AES encryption function be F, then E = F(K, P): F takes the plaintext P and the key K as input parameters and outputs the ciphertext E; let the AES decryption function be D, then P = D(K, E): D takes the ciphertext E and the key K as input parameters and outputs the plaintext P;
the RSA public key PK and private key SK are produced by the key generation function GenKey(): GenKey(C) = (PK, SK), which takes a constant C as input and outputs the public key PK and the private key SK;
the RSA encryption function is EncryptData(), so the AES key ciphertext S = EncryptData(PK, K); the RSA decryption function is DecryptData(), so the AES key K = DecryptData(SK, S).
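The following sketch shows one possible realization of the above scheme with the Python cryptography package; AES-GCM is an assumed mode of operation (the description above does not fix one), and the RSA-2048 key size is likewise an assumption.

```python
# Hybrid encryption sketch: AES-256 for the data, RSA-OAEP to wrap the AES key.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# GenKey(): generate the RSA key pair (PK, SK)
sk = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pk = sk.public_key()

# E = F(K, P): encrypt plaintext P with a fresh 256-bit AES key K
K = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
P = b"student record: name=..., id=..."
E = AESGCM(K).encrypt(nonce, P, None)

# S = EncryptData(PK, K): wrap the AES key with the RSA public key
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
S = pk.encrypt(K, oaep)

# K = DecryptData(SK, S) and P = D(K, E): unwrap the key, then decrypt
K_rec = sk.decrypt(S, oaep)
P_rec = AESGCM(K_rec).decrypt(nonce, E, None)
assert P_rec == P
```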
Preferably, in the encryption and decryption policy, in order to protect the privacy of students' personal information, a smart contract and an access control policy are used to manage the access rights of the encryption and decryption methods; roles and rights are defined in the smart contract to determine which roles are authorized to use the encryption and decryption methods, for example only users with specific roles (such as an administrator) may call the encryption and decryption functions, and a digital signature is used to ensure the legitimacy of access to the encryption key.
Preferably, data-permission management is applied to the above data training and encryption/decryption methods: when an authorized user needs to use students' personal information, the encrypted information is decrypted with the corresponding key and the deep learning model, and the decrypted information can be used within the authorized scope to meet the corresponding educational or administrative requirements;
the user information, access rights and corresponding rules of the different groups are written into the smart contracts and loaded onto each node; the corresponding encrypted files are created by invoking the smart contracts, the information is recorded on the blockchain, a user's access rights are queried through the smart contracts, and the encrypted files of the different partitions can be accessed when the preset rules are satisfied.
The application of the student information privacy protection method based on deep learning fusion in a blockchain network comprises selecting a Hyperledger Fabric permissioned blockchain as the underlying platform to store private data, storing Docker images, and storing non-private data in a Postgres database, so as to improve the security and efficiency of access to private data.
The invention provides a student information privacy protection method based on deep learning fusion, which has the following beneficial effects compared with the prior art:
according to the student information privacy protection method based on deep learning fusion, malicious attacks are analyzed and blocked by utilizing artificial intelligence on data uploaded by a user, data are classified, validity of core data is verified, protected data are identified and hidden, and the processed data are stored in a partition mode on a blockchain according to specific rules after being encrypted according to different attributes of importance and protection degree.
Drawings
FIG. 1 is a schematic diagram of a system model provided by the present invention;
fig. 2 is a schematic block diagram of a flow provided by the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, but not all embodiments, and all other embodiments obtained by those skilled in the art without making creative efforts based on the embodiments of the present invention are included in the protection scope of the present invention.
Examples:
referring to fig. 1-2, a student information privacy protection method based on deep learning fusion includes that a user side performs data uploading, wherein the data uploading content comprises data coding, numerical characteristics and data set classification content;
a series of data cleaning, extraction, conversion and validity pretreatment are carried out on the data uploading content;
s1: checking and processing errors, missing values, outliers, etc. in the data, which may involve repairing errors, filling missing values, deleting outliers, or processing outliers to ensure quality and consistency of the data, otherwise ending;
s2: extracting relevant features from the original data, capturing key information in the data, and converting the data into a form which can be processed by a machine learning algorithm, wherein the form comprises numerical features, classification features, text features and the like;
s3: the data is subjected to standardized conversion including feature scaling, discretization, normalization and the like so as to better adapt to the requirements of a machine learning algorithm, and the conversion can improve the performance and convergence speed of the algorithm and ensure that different features have similar scales;
s4: the data set is divided into a training set, a verification set and a test set, wherein the training set is used for training and parameter adjustment of a model, the verification set is used for selecting and optimizing the model, the test set is used for evaluating the performance of a final model, the data set is divided to ensure that the model is effectively evaluated and verified, the classification characteristic is encoded and converted into a digital form which can be understood by an algorithm, common encoding methods comprise single-heat encoding, label encoding and the like, and the encoded data can be correctly processed and analyzed by the algorithm.
In this embodiment, after the data preprocessing is completed, a deep learning algorithm, such as a Convolutional Neural Network (CNN) or a Recurrent Neural Network (RNN), is used to perform data model training on the preprocessed data to identify private and non-private data, so as to encrypt and decrypt a portion of the personal privacy information of the student;
the CNN is used for extracting characteristics in the personal information, the characteristics are used as input and transmitted to an encryption algorithm, the complexity of the personal information can be increased by using abstract characteristics learned by the CNN, so that the security is improved, and then the characteristics are encrypted and decrypted by using a corresponding secret key to restore the original personal information;
the RNN is used for processing the coded personal information sequence, inputting the sequence data into the RNN for encryption, decrypting the encrypted sequence data by using a corresponding key and a decryption algorithm, and restoring the encrypted sequence data into the original personal information sequence by the RNN through reverse operation.
In this embodiment, in the data model training stage, in order to effectively prevent leakage of training data on the premise of ensuring usability, a differential privacy training method is used to reduce the correlation degree between the sharing weight and the original data, and the method is as follows:
(1) Calculating the gradient: g_t(x_k) = ∇_θ ℓ(θ_{t-1}, x_k), where ℓ is the training loss, θ_{t-1} denotes the model parameters after initialization, x_k is a training sample, T is the total number of iterations, k ∈ [0, length(data(T_i))], and T_i, i ∈ [0, n] is the number of training passes;
(2) Clipping the gradient g_t(x_k) and adding Gaussian noise N(0, σ²C²I) to obtain g′_t(x_k), where C is the clipping constant;
(3) Updating the weights: θ_t ← θ_{t-1} − η·T_i·g′_t(x_k), where η is the learning rate.
In this embodiment, the data is encrypted with the AES symmetric encryption algorithm and the AES algorithm key is encrypted with the RSA asymmetric encryption algorithm; the encryption and decryption policy specifically includes:
the uploaded data is encrypted using the AES symmetric encryption algorithm, and the AES key is encrypted and stored using the RSA public key and decrypted using the RSA private key;
the AES-256 algorithm is adopted, with a 256-bit key; the plaintext P is the file to be encrypted, and the ciphertext E is the result of encrypting the plaintext P with the AES algorithm;
the AES algorithm key K is generated according to a set rule, and the key ciphertext S is generated by encrypting the AES key with the RSA public key;
let the AES encryption function be F, then E = F(K, P): F takes the plaintext P and the key K as input parameters and outputs the ciphertext E; let the AES decryption function be D, then P = D(K, E): D takes the ciphertext E and the key K as input parameters and outputs the plaintext P;
the RSA public key PK and private key SK are produced by the key generation function GenKey(): GenKey(C) = (PK, SK), which takes a constant C as input and outputs the public key PK and the private key SK;
the RSA encryption function is EncryptData(), so the AES key ciphertext S = EncryptData(PK, K); the RSA decryption function is DecryptData(), so the AES key K = DecryptData(SK, S).
In this embodiment, in the encryption and decryption policy, in order to protect the privacy of students' personal information, a smart contract and an access control policy are used to manage the access rights of the encryption and decryption methods; roles and rights are defined in the smart contract to determine which roles are authorized to use the encryption and decryption methods, for example only users with specific roles (such as an administrator) may call the encryption and decryption functions, and a digital signature is used to ensure the legitimacy of access to the encryption key.
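As an illustration of the digital-signature check on key-access requests mentioned above, the following sketch uses RSA-PSS from the Python cryptography package; the request format and field names are assumptions.

```python
# Signature check on an access request before releasing the wrapped AES key.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

# The authorized user signs the access request with their private key...
user_sk = rsa.generate_private_key(public_exponent=65537, key_size=2048)
request = b"role=admin;action=decrypt;record=stu-0042"   # hypothetical request format
signature = user_sk.sign(request, pss, hashes.SHA256())

# ...and the contract-side check verifies it against the registered public key
def request_is_legitimate(pub_key, req: bytes, sig: bytes) -> bool:
    try:
        pub_key.verify(sig, req, pss, hashes.SHA256())
        return True
    except InvalidSignature:
        return False

assert request_is_legitimate(user_sk.public_key(), request, signature)
```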
In this embodiment, data-permission management is applied to the data training and encryption/decryption methods: when an authorized user needs to use students' personal information, the encrypted information is decrypted with the corresponding key and the deep learning model, and the decrypted information can be used within the authorized scope to meet the corresponding educational or administrative requirements;
the user information, access rights and corresponding rules of the different groups are written into the smart contracts and loaded onto each node; the corresponding encrypted files are created by invoking the smart contracts, the information is recorded on the blockchain, a user's access rights are queried through the smart contracts, and the encrypted files of the different partitions can be accessed when the preset rules are satisfied.
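The role-and-rule logic that such a smart contract would encode can be sketched as follows; this is a simplified Python illustration rather than actual Hyperledger Fabric chaincode (which is normally written in Go or Node.js), and the role names, actions and partition names are assumptions.

```python
# Simplified stand-in for the contract's role/permission check.
ACCESS_RULES = {
    "admin":   {"decrypt": True,  "partitions": {"core", "general"}},
    "teacher": {"decrypt": True,  "partitions": {"general"}},
    "student": {"decrypt": False, "partitions": set()},
}

def check_access(role: str, action: str, partition: str) -> bool:
    """Return True only if the role is authorized for the action and partition."""
    rule = ACCESS_RULES.get(role)
    if rule is None:
        return False
    if action == "decrypt" and not rule["decrypt"]:
        return False
    return partition in rule["partitions"]

# Example query before handing out the wrapped AES key
assert check_access("admin", "decrypt", "core") is True
assert check_access("student", "decrypt", "core") is False
```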
The application of the student information privacy protection method based on deep learning fusion in a blockchain network comprises selecting a Hyperledger Fabric permissioned blockchain as the underlying platform to store private data, storing Docker images, and storing non-private data in a Postgres database, so as to improve the security and efficiency of access to private data.
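A possible routing of private versus non-private records between the permissioned ledger and the Postgres database is sketched below; `ledger.submit_transaction` stands in for whatever Fabric client or gateway call the deployment actually uses, and the table and field names are assumptions.

```python
# Routing sketch: encrypted private records go to the ledger, non-private
# records go to Postgres.
import json
import psycopg2  # Postgres driver; connection is created by the caller

def store_record(record: dict, is_private: bool, ledger, pg_conn) -> None:
    if is_private:
        # ciphertext + wrapped AES key are written to the blockchain
        payload = json.dumps({
            "student_id": record["student_id"],
            "ciphertext": record["ciphertext_hex"],
            "wrapped_key": record["wrapped_key_hex"],
        })
        ledger.submit_transaction("StoreEncryptedRecord", payload)  # hypothetical chaincode call
    else:
        with pg_conn.cursor() as cur:
            cur.execute(
                "INSERT INTO public_student_info (student_id, info) VALUES (%s, %s)",
                (record["student_id"], json.dumps(record["info"])),
            )
        pg_conn.commit()

# pg_conn = psycopg2.connect(dbname="edu", user="app", password="...", host="db")
```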
And all that is not described in detail in this specification is well known to those skilled in the art.
It is noted that relational terms such as first and second, and the like, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions, and further, that the terms "comprise," "include," or any other variation thereof, are intended to cover a non-exclusive inclusion such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (7)

1. The student information privacy protection method based on deep learning fusion is characterized by comprising the following steps of:
the user side uploads data, wherein the data uploading content comprises data codes, numerical characteristics and data set classification content;
a series of preprocessing operations, covering data cleaning, feature extraction, conversion and validity checks, is carried out on the uploaded content;
s1: checking and processing errors, missing values and abnormal value problems in the data, and ensuring the quality and consistency of the data;
s2: extracting relevant features from the original data, capturing key information in the data, and converting the data into a form which can be processed by a machine learning algorithm, wherein the form comprises numerical features, classification features and text features;
s3: the standardized conversion of the data comprises feature scaling, discretization and normalization, so that the requirements of a machine learning algorithm can be better met, and different features are ensured to have similar scales;
s4: the data set is divided into a training set, a verification set and a test set, wherein the training set is used for training and parameter adjustment of a model, the verification set is used for selecting and optimizing the model, the test set is used for evaluating the performance of a final model, the data set is divided to ensure that the model is effectively evaluated and verified, the classification characteristic is encoded and converted into a digital form which can be understood by an algorithm, a common encoding method comprises single-heat encoding and label encoding, and the encoded data can be correctly processed and analyzed by the algorithm.
2. The student information privacy protection method based on deep learning fusion according to claim 1, wherein, after the data preprocessing is completed, a deep learning algorithm such as a convolutional neural network (CNN) or a recurrent neural network (RNN) is used to perform data model training on the preprocessed data to identify private and non-private data, so as to encrypt and decrypt a portion of the students' personal privacy information;
the CNN is used for extracting characteristics in the personal information, the characteristics are used as input and transmitted to an encryption algorithm, the complexity of the personal information can be increased by using abstract characteristics learned by the CNN, so that the security is improved, and then the characteristics are encrypted and decrypted by using a corresponding secret key to restore the original personal information;
the RNN is used for processing the coded personal information sequence, inputting the sequence data into the RNN for encryption, decrypting the encrypted sequence data by using a corresponding key and a decryption algorithm, and restoring the encrypted sequence data into the original personal information sequence by the RNN through reverse operation.
3. The student information privacy protection method based on deep learning fusion according to claim 2, wherein in the data model training stage, in order to prevent leakage of training data on the premise of ensuring usability, a differential privacy training method is used to reduce the correlation degree between the sharing weight and the original data, and the method is as follows:
(1) Calculating the gradient: g_t(x_k) = ∇_θ ℓ(θ_{t-1}, x_k), where ℓ is the training loss, θ_{t-1} denotes the model parameters after initialization, x_k is a training sample, T is the total number of iterations, k ∈ [0, length(data(T_i))], and T_i, i ∈ [0, n] is the number of training passes;
(2) Clipping the gradient g_t(x_k) and adding Gaussian noise N(0, σ²C²I) to obtain g′_t(x_k), where C is the clipping constant;
(3) Updating the weights: θ_t ← θ_{t-1} − η·T_i·g′_t(x_k), where η is the learning rate.
4. The student information privacy protection method based on deep learning fusion according to claim 1, wherein the data is encrypted with the AES symmetric encryption algorithm and the AES algorithm key is encrypted with the RSA asymmetric encryption algorithm, the encryption and decryption policy specifically comprising:
the uploaded data is encrypted using the AES symmetric encryption algorithm, and the AES key is encrypted and stored using the RSA public key and decrypted using the RSA private key;
the AES-256 algorithm is adopted, with a 256-bit key; the plaintext P is the file to be encrypted, and the ciphertext E is the result of encrypting the plaintext P with the AES algorithm;
the AES algorithm key K is generated according to a set rule, and the key ciphertext S is generated by encrypting the AES key with the RSA public key;
let the AES encryption function be F, then E = F(K, P): F takes the plaintext P and the key K as input parameters and outputs the ciphertext E; let the AES decryption function be D, then P = D(K, E): D takes the ciphertext E and the key K as input parameters and outputs the plaintext P;
the RSA public key PK and private key SK are produced by the key generation function GenKey(): GenKey(C) = (PK, SK), which takes a constant C as input and outputs the public key PK and the private key SK;
the RSA encryption function is EncryptData(), so the AES key ciphertext S = EncryptData(PK, K); the RSA decryption function is DecryptData(), so the AES key K = DecryptData(SK, S).
5. The student information privacy protection method based on deep learning fusion according to claim 1, wherein, in the encryption and decryption policy, in order to protect the privacy of students' personal information, a smart contract and an access control policy are used to manage the access rights of the encryption and decryption methods; roles and rights are defined in the smart contract to determine which roles are authorized to use the encryption and decryption methods, for example only users with specific roles (such as an administrator) may call the encryption and decryption functions, and a digital signature is used to ensure the legitimacy of access to the encryption key.
6. The student information privacy protection method based on deep learning fusion according to claim 1, wherein data-permission management is applied to the data training and encryption/decryption methods: when an authorized user needs to use students' personal information, the encrypted information is decrypted with the corresponding key and the deep learning model, and the decrypted information can be used within the authorized scope to meet the corresponding educational or administrative requirements;
the user information, access rights and corresponding rules of the different groups are written into the smart contracts and loaded onto each node; the corresponding encrypted files are created by invoking the smart contracts, the information is recorded on the blockchain, a user's access rights are queried through the smart contracts, and the encrypted files of the different partitions can be accessed when the preset rules are satisfied.
7. Application of the student information privacy protection method based on deep learning fusion according to any one of claims 1-6 in a blockchain network, wherein a Hyperledger Fabric permissioned blockchain is selected as the underlying platform to store private data, and non-private data is stored in a Postgres database, so as to improve the security and efficiency of access to private data.
CN202310924681.XA 2023-07-26 2023-07-26 Student information privacy protection method based on deep learning fusion Pending CN116861485A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310924681.XA CN116861485A (en) 2023-07-26 2023-07-26 Student information privacy protection method based on deep learning fusion
ZA2023/08103A ZA202308103B (en) 2023-07-26 2023-08-22 A method for students' private information protection based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310924681.XA CN116861485A (en) 2023-07-26 2023-07-26 Student information privacy protection method based on deep learning fusion

Publications (1)

Publication Number Publication Date
CN116861485A true CN116861485A (en) 2023-10-10

Family

ID=88232159

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310924681.XA Pending CN116861485A (en) 2023-07-26 2023-07-26 Student information privacy protection method based on deep learning fusion

Country Status (2)

Country Link
CN (1) CN116861485A (en)
ZA (1) ZA202308103B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117436132A (en) * 2023-12-21 2024-01-23 福建中科星泰数据科技有限公司 Data privacy protection method integrating blockchain technology and artificial intelligence
CN117436132B (en) * 2023-12-21 2024-03-05 福建中科星泰数据科技有限公司 Data privacy protection method integrating blockchain technology and artificial intelligence

Also Published As

Publication number Publication date
ZA202308103B (en) 2024-03-27


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination