CN116861459A - Credit scoring model modeling method and system based on privacy calculation - Google Patents

Credit scoring model modeling method and system based on privacy calculation

Info

Publication number
CN116861459A
CN116861459A (application CN202310821730.7A)
Authority
CN
China
Prior art keywords
data
privacy
model
encryption
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310821730.7A
Other languages
Chinese (zh)
Inventor
胡亮 (Hu Liang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Tengsuo Technology Co ltd
Original Assignee
Shanghai Tengsuo Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Tengsuo Technology Co ltd filed Critical Shanghai Tengsuo Technology Co ltd
Priority to CN202310821730.7A priority Critical patent/CN116861459A/en
Publication of CN116861459A publication Critical patent/CN116861459A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • G06F21/6254Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/03Credit; Loans; Processing thereof
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Abstract

The invention provides a credit scoring model modeling method and system based on privacy computing, which enables data analysis and modeling without exposing sensitive data. In credit scoring modeling, privacy computing allows models to be built while the underlying data remains available but invisible. Specifically, the method comprises: data encryption, in which sensitive data is encrypted to protect its privacy using either a symmetric or an asymmetric algorithm; data desensitization, in which the encrypted data is desensitized to reduce the risk of leaking sensitive information; privacy computation, in which the encrypted and desensitized data is computed and analyzed using privacy computing techniques, including secure multiparty computation and homomorphic encryption, to generate a credit scoring model; and model verification, in which verification and evaluation ensure that the credit scoring model generated by privacy computation is accurate and reliable. The invention thus protects the user's private data while providing an accurate and reliable credit scoring service.

Description

Credit scoring model modeling method and system based on privacy calculation
Technical Field
The invention relates to the field of computers, and in particular to a credit scoring model modeling method and system based on privacy computing.
Background
Existing credit scoring models are based primarily on the collection and analysis of personal information, an approach that tends to leak users' private information. Privacy computing has therefore emerged as a new computational model.
However, existing credit scoring models still have shortcomings in use: because they rely mainly on the collection and analysis of personal information, users' private information is easily leaked.
The present invention therefore improves on this with a credit scoring model modeling method and system based on privacy computing, in which personal data is processed on the terminal, so that the user's privacy is protected while the data analysis task is still completed.
Disclosure of Invention
The invention aims to address the problems of the prior art described above. To achieve the dual goals of credit scoring and privacy protection by performing computation on the user terminal, the present invention provides the following scheme. The credit scoring model modeling method and system based on privacy computing comprise a user terminal module and a privacy computing module, wherein the user terminal collects user data, the user data is processed by an encryption module, and the perturbed data is imported into a model training module; the encryption module comprises a key generation unit, a data encryption unit and a data decryption unit;
the privacy computing module comprises secure multiparty computation and homomorphic encryption: secure multiparty computation performs data computation and analysis with multiple parties participating while protecting data privacy, and homomorphic encryption performs data computation and analysis on ciphertext while protecting data privacy;
the model training module comprises a data collection unit, a data feature extraction unit, a data analysis unit and a model building unit, and imports data into a model application algorithm module, which calculates the user's credit score.
In a preferred scheme of the invention, the data collection unit in the model training module collects credit-score-related data from the user terminal module, and the data feature extraction unit performs feature extraction on the differentially-private data, converting the collected user information into numerical features.
In a preferred scheme of the invention, the data analysis unit in the model training module processes and optimizes the extracted numerical features with a learning algorithm; the optimization uses principal component analysis (PCA) to reduce the dimensionality of the features.
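As an illustrative sketch (not part of the claimed invention), the PCA-based dimensionality reduction can be implemented via the eigendecomposition of the feature covariance matrix. The function name and the synthetic data below are hypothetical:

```python
import numpy as np

def pca_reduce(X: np.ndarray, n_components: int) -> np.ndarray:
    """Project feature matrix X onto its top principal components."""
    Xc = X - X.mean(axis=0)                 # center each feature
    cov = np.cov(Xc, rowvar=False)          # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # symmetric matrix -> eigh
    order = np.argsort(eigvals)[::-1]       # sort by descending variance
    components = eigvecs[:, order[:n_components]]
    return Xc @ components

# Hypothetical numerical features: 100 users x 5 attributes.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 5))
Z = pca_reduce(X, 2)                        # reduced to 2 dimensions
```

In practice the number of retained components would be chosen from the explained-variance ratio rather than fixed in advance.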
In a preferred scheme of the invention, the key point of the model training module is to protect the privacy of the data while modeling and optimizing it with a machine learning algorithm to obtain a reliable credit scoring model.
In a preferred scheme of the invention, the encryption module comprises a key generation unit, a data encryption unit and a data decryption unit. The key pair comprises a public key, used to encrypt data, and a private key, which alone has the authority to decrypt data. The keys are generated through several mathematical components, including a large-prime factorization unit, an Euler function unit and a modular exponentiation unit.
In a preferred scheme of the invention, the data encryption unit of the encryption module encrypts plaintext data with the public key to obtain ciphertext data, the key applying a mathematical transformation to the plaintext during encryption.
In a preferred scheme of the invention, the data decryption unit of the encryption module decrypts the ciphertext data with the private key, restoring the original plaintext data; during decryption the private key applies a mathematical transformation to the ciphertext, and only the holder of the private key can decrypt the ciphertext and recover the original plaintext.
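The key-generation ingredients named above (large-prime factorization, the Euler function, and modular exponentiation) match textbook RSA, so the public/private key mechanism can be sketched as follows. This is an illustrative toy with deliberately tiny primes, not a secure or claimed implementation (real keys use primes of 1024+ bits):

```python
def toy_rsa_keypair(p: int, q: int, e: int = 65537):
    """Build an RSA key pair from two primes p and q."""
    n = p * q                       # modulus from large-prime product
    phi = (p - 1) * (q - 1)         # Euler's totient of n
    d = pow(e, -1, phi)             # private exponent: e^-1 mod phi (Python 3.8+)
    return (e, n), (d, n)           # (public key, private key)

def encrypt(m: int, pub) -> int:
    e, n = pub
    return pow(m, e, n)             # modular exponentiation with public key

def decrypt(c: int, priv) -> int:
    d, n = priv
    return pow(c, d, n)             # modular exponentiation with private key

pub, priv = toy_rsa_keypair(p=61, q=53)   # toy primes, illustration only
ciphertext = encrypt(42, pub)
plaintext = decrypt(ciphertext, priv)      # recovers 42
```

Only someone holding `priv` can invert the transformation, which is the property the patent's decryption unit relies on.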
A credit scoring model modeling method based on privacy computing comprises the following steps. Step 1: collect user data, including the user's age, occupation and income, through the user terminal module;
step 2: perturb the encrypted data with differential-privacy noise, protecting the user's privacy while ensuring the perturbed data retains a degree of accuracy;
step 3: transmit the perturbed data for model training, and calculate the user's credit score on the user terminal using the trained model.
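Step 2's differential-privacy perturbation is commonly realized with the Laplace mechanism, sketched below. The sensitivity and epsilon values, and the sample incomes, are illustrative assumptions, not values from the patent:

```python
import math
import random

def add_laplace_noise(value: float, sensitivity: float, epsilon: float) -> float:
    """Perturb one numeric attribute with Laplace(0, sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5                 # uniform on [-0.5, 0.5)
    # Inverse-transform sampling of the Laplace distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return value + noise

random.seed(7)
incomes = [52000.0, 48000.0, 61000.0]         # hypothetical raw attributes
noisy = [add_laplace_noise(x, sensitivity=1000.0, epsilon=0.5) for x in incomes]
```

A smaller epsilon gives stronger privacy but noisier data, which is the accuracy trade-off step 2 alludes to.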
In a preferred scheme of the invention, the model training in step 3 builds a logistic unit, which fits the feature variables and the target variable to obtain a logistic regression model used to predict the classification result.
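The logistic-unit fitting can be sketched as plain gradient-descent logistic regression. The feature values (e.g. normalized age and income) and labels below are hypothetical stand-ins for the patent's perturbed data:

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=1000):
    """Fit a logistic regression model by stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                       # gradient of log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Hypothetical normalized features [age, income]; label 1 = good credit.
X = [[0.2, 0.3], [0.9, 0.8], [0.1, 0.2], [0.8, 0.9]]
y = [0, 1, 0, 1]
w, b = train_logistic(X, y)
score = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.85, 0.9])) + b)
```

The fitted sigmoid output is directly interpretable as a credit-worthiness probability, which suits the scoring use case.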
In a preferred scheme of the invention, in step 4, the credit score data is classified and estimated through a network of computation nodes formed by a neural network.
Compared with the prior art, the invention has the following beneficial effects:
in the scheme of the invention, data analysis and modeling can be performed without exposing sensitive data, through privacy computing. In credit scoring modeling, privacy computing allows models to be built while the data remains available but invisible. Specifically, the method comprises: data encryption, in which sensitive data is encrypted to protect its privacy using either a symmetric or an asymmetric algorithm; data desensitization, in which the encrypted data is desensitized to reduce the risk of leaking sensitive information; privacy computation, in which the encrypted and desensitized data is computed and analyzed using privacy computing techniques, including secure multiparty computation and homomorphic encryption, to generate a credit scoring model; and model verification, in which verification and evaluation ensure that the credit scoring model generated by privacy computation is accurate and reliable. The invention thus protects the user's private data while providing an accurate and reliable credit scoring service.
Drawings
FIG. 1 is a flow chart provided by the present invention;
FIG. 2 is a flow chart of a model training module provided by the present invention;
FIG. 3 is a flowchart of an encryption module provided by the present invention;
FIG. 4 is a flowchart of the privacy computation provided by the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the embodiments are described more fully below with reference to the accompanying drawings. The described embodiments are some, but not all, of the embodiments of the invention.
The following detailed description is therefore not intended to limit the scope of the claimed invention, but merely represents selected embodiments. All other embodiments obtained by those skilled in the art without creative effort, based on the embodiments of the present invention, fall within its protection scope. The embodiments of the invention, and the features and schemes of the embodiments, may be combined with one another provided there is no conflict. Like reference numerals and letters denote like items in the figures; once an item is defined in one figure, it need not be defined or explained again in subsequent figures.
Example 1: a credit scoring model modeling method based on privacy computing comprises the following steps. Step 1: collect user data, including the user's age, occupation and income, through the user terminal module;
step 2: perturb the encrypted data with differential-privacy noise, protecting the user's privacy while ensuring the perturbed data retains a degree of accuracy;
step 3: transmit the perturbed data for model training, and calculate the user's credit score on the user terminal using the trained model.
In step 3, model training builds a logistic unit, which fits the feature variables and the target variable to obtain a logistic regression model used to predict the classification result.
In step 4, the credit score data is classified and estimated through a network of computation nodes formed by a neural network.
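Step 4's neural-network scorer can be sketched as a single-hidden-layer forward pass. The weights below are hand-set, hypothetical values purely for illustration; in practice they would be learned during training:

```python
import math

def relu(z: float) -> float:
    return max(0.0, z)

def mlp_forward(x, W1, b1, W2, b2) -> float:
    """Features -> hidden layer (ReLU) -> credit score (sigmoid)."""
    h = [relu(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    z = sum(w * hi for w, hi in zip(W2, h)) + b2
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights for a 2-feature, 2-hidden-unit scoring network.
W1 = [[1.0, 0.5], [-0.5, 1.0]]
b1 = [0.0, 0.1]
W2 = [0.8, 0.6]
b2 = -0.7
score = mlp_forward([0.9, 0.8], W1, b1, W2, b2)   # in (0, 1)
```

Each list element plays the role of one computation node in the network of nodes the step describes.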
Example 2: referring to figs. 1-4, a credit scoring model modeling system based on privacy computing comprises a user terminal module and a privacy computing module, wherein the user terminal collects user data, the user data is processed by the encryption module, and the perturbed data is imported into the model training module; the encryption module comprises a key generation unit, a data encryption unit and a data decryption unit;
the privacy computing module comprises secure multiparty computation and homomorphic encryption: secure multiparty computation performs data computation and analysis with multiple parties participating while protecting data privacy, and homomorphic encryption performs data computation and analysis on ciphertext while protecting data privacy;
the model training module comprises a data collection unit, a data feature extraction unit, a data analysis unit and a model building unit, and imports data into a model application algorithm module, which calculates the user's credit score.
The data collection unit in the model training module collects credit-score-related data from the user terminal module, and the data feature extraction unit extracts features from the differentially-private data, converting the collected user information into numerical features.
The data analysis unit in the model training module processes and optimizes the extracted numerical features, using principal component analysis to reduce their dimensionality; the training and modeling of the model are completed in the cloud on encrypted data.
The key point of the model training module is to protect the privacy of the data while modeling and optimizing it with a machine learning algorithm to obtain a reliable credit scoring model. Because training and modeling are completed in the cloud on protected data, the user's privacy is fully preserved and the risk of information leakage is avoided.
The encryption module comprises a key generation unit, a data encryption unit and a data decryption unit. The key pair comprises a public key, used to encrypt data, and a private key, which alone has the authority to decrypt data; the keys are generated through several mathematical components, including a large-prime factorization unit, an Euler function unit and a modular exponentiation unit.
The data encryption unit of the encryption module encrypts plaintext data with the public key to obtain ciphertext data, the key applying a mathematical transformation to the plaintext during encryption.
The data decryption unit of the encryption module decrypts the ciphertext data with the private key, restoring the original plaintext data; during decryption the private key applies a mathematical transformation to the ciphertext, and only the holder of the private key can decrypt the ciphertext and recover the original plaintext.
Privacy computing enables data analysis and modeling without exposing sensitive data. In credit scoring modeling, privacy computing can build models while the data remains available but invisible, through the following steps. Step 1, data encryption: sensitive data is encrypted to protect its privacy; the encryption algorithm may be symmetric or asymmetric.
Step 2, data desensitization: the encrypted data is desensitized to reduce the risk of leaking sensitive information. Desensitization methods include data perturbation, noise addition and data masking.
Step 3, privacy computation: the encrypted and desensitized data is computed and analyzed using privacy computing techniques to generate a credit scoring model. Common methods include secure multiparty computation, homomorphic encryption and differential privacy.
Step 4, model verification: model verification and evaluation ensure that the credit scoring model generated by privacy computation is accurate and reliable.
Through these steps, privacy computing realizes credit score modeling while the data remains available but invisible, protecting the user's private data while providing an accurate and reliable credit scoring service.
Privacy computing is a family of techniques that preserves data privacy while still allowing data to be analyzed and processed. The concrete computation steps differ between techniques; two common ones, secure multiparty computation and homomorphic encryption, are introduced below.
Secure multiparty computation: secure multiparty computation performs data computation and analysis with multiple parties participating while protecting data privacy. Its steps are as follows:
(1) Preparation stage: the parties participating in the computation perform identity verification and key negotiation according to a security protocol, ensuring secure communication and computation.
(2) Input stage: each participant encrypts its input data and sends the ciphertext to the other participants.
(3) Computation stage: each party computes on the ciphertexts, encrypts its result, and sends it to the other parties.
(4) Output stage: each party decrypts the encrypted results to obtain the final result.
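A minimal way to see these stages in code is additive secret sharing, a standard building block of secure multiparty computation. Note this sketch substitutes random share-splitting for the encrypted-exchange protocol outlined above, and the two party values are hypothetical:

```python
import random

PRIME = 2**61 - 1  # field modulus for additive secret sharing

def share(secret: int, n_parties: int):
    """Split a value into n additive shares that sum to the secret mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares) -> int:
    return sum(shares) % PRIME

# Two parties (e.g. two banks) jointly sum incomes without revealing either value.
a_shares = share(52000, 3)                  # input stage: party A distributes shares
b_shares = share(48000, 3)                  # input stage: party B distributes shares
sum_shares = [(x + y) % PRIME for x, y in zip(a_shares, b_shares)]  # computation stage
total = reconstruct(sum_shares)             # output stage: recovers 100000
```

No single share reveals anything about the underlying value, yet the combined shares yield the correct sum.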
Homomorphic encryption performs data computation and analysis on ciphertext while protecting data privacy. Its steps are as follows:
(1) Data encryption: the data owner encrypts the data to produce ciphertext.
(2) Homomorphic computation: the data analyst computes and analyzes on the ciphertext using a homomorphic encryption algorithm, producing a ciphertext of the result.
(3) Decryption and output: the ciphertext of the result is decrypted to obtain the final result.
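These three steps can be sketched with the Paillier cryptosystem, a well-known additively homomorphic scheme: multiplying two ciphertexts yields an encryption of the sum of their plaintexts. The primes are toy values for illustration only, and the sketch assumes Python 3.9+ (`math.lcm`, modular inverse via `pow`):

```python
import math
import random

def paillier_keypair(p: int = 61, q: int = 53):
    """Toy Paillier key pair; real deployments use primes of 1024+ bits."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                                   # standard simple choice of g
    x = pow(g, lam, n * n)
    mu = pow((x - 1) // n, -1, n)               # mu = L(g^lam mod n^2)^-1 mod n
    return (n, g), (lam, mu, n)

def encrypt(m: int, pub) -> int:
    n, g = pub
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:                  # r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(c: int, priv) -> int:
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n              # L(c^lam mod n^2) * mu mod n

pub, priv = paillier_keypair()
c1, c2 = encrypt(20, pub), encrypt(22, pub)     # step 1: owner encrypts
c_sum = (c1 * c2) % (pub[0] ** 2)               # step 2: analyst computes on ciphertext
result = decrypt(c_sum, priv)                   # step 3: decrypt -> 20 + 22 = 42
```

The analyst never sees 20 or 22, only their ciphertexts, yet the decrypted result is the correct sum.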
In use, the invention processes the user data through the encryption module and imports the perturbed data into the model training module. The encryption module comprises a key generation unit, a data encryption unit and a data decryption unit. The key pair comprises a public key, used to encrypt data, and a private key, which alone has the authority to decrypt it; the keys are generated through several mathematical components, including a large-prime factorization unit, an Euler function unit and a modular exponentiation unit. The data encryption unit encrypts plaintext with the public key to obtain ciphertext, the key applying a mathematical transformation during encryption. The data decryption unit decrypts the ciphertext with the private key to restore the original plaintext; only the holder of the private key can do so.
The model training module trains the model and builds the credit scoring model in the cloud. It comprises a data collection unit, a data feature extraction unit, a data analysis unit and a model building unit, and imports data into the model application algorithm module, which calculates the user's credit score on the user terminal. The data collection unit collects credit-score-related data from the user terminal module, and the data feature extraction unit extracts features from the differentially-private data, converting the collected user information into numerical features. The data analysis unit processes and optimizes the extracted numerical features, using principal component analysis to reduce their dimensionality; the differentially-private encrypted data is perturbed with noise to protect user privacy.
The above embodiments only illustrate the present invention and do not limit the scheme described herein. Although the invention has been described in detail with reference to these embodiments, it is not limited to them, and modifications or equivalent substitutions are possible; all modifications and variations that do not depart from the spirit and scope of the invention fall within the scope of the appended claims.

Claims (10)

1. A credit scoring model modeling system based on privacy computing, characterized by comprising a user terminal module and a privacy computing module, wherein the user terminal collects user data, the user data is processed by an encryption module, and the perturbed data is imported into a model training module; the encryption module comprises a key generation unit, a data encryption unit and a data decryption unit;
the privacy computing module comprises secure multiparty computation and homomorphic encryption, wherein the secure multiparty computation performs data computation and analysis with multiple parties participating while protecting data privacy, and the homomorphic encryption performs data computation and analysis on ciphertext while protecting data privacy;
the model training module comprises a data collection unit, a data feature extraction unit, a data analysis unit and a model building unit, and imports data into a model application algorithm module, which calculates the user's credit score.
2. The privacy-computing-based credit scoring model modeling system of claim 1, wherein the data collection unit in the model training module collects credit-score-related data from the user terminal module, and the data feature extraction unit performs feature extraction on the differentially-private data, converting the collected user information into numerical features.
3. The privacy-computing-based credit scoring model modeling system of claim 2, wherein the data analysis unit in the model training module processes and optimizes the extracted numerical features with a learning algorithm.
4. The privacy-computing-based credit scoring model modeling system of claim 3, wherein the model training module protects the privacy of the data while modeling and optimizing it with a machine learning algorithm to obtain the credit scoring model.
5. The system of claim 4, wherein the encryption module comprises a key generation unit, a data encryption unit and a data decryption unit; the key pair comprises a public key, used to encrypt data, and a private key, which alone has the authority to decrypt data; and the keys are generated through several mathematical algorithms, including large-prime factorization, an Euler function and modular exponentiation.
6. The system of claim 5, wherein the data encryption unit of the encryption module encrypts plaintext data with the public key to obtain ciphertext data, the key applying a mathematical transformation to the plaintext during encryption.
7. The system of claim 6, wherein the data decryption unit of the encryption module decrypts the ciphertext data with the private key to restore the original plaintext data, the private key applying a mathematical transformation to the ciphertext during decryption, such that only the holder of the private key can decrypt the ciphertext and recover the original plaintext.
8. A credit scoring model modeling method based on privacy computing, characterized by comprising the following steps: step 1: collecting user data, including the user's age, occupation and income, through a user terminal module;
step 2: perturbing the encrypted data with differential-privacy noise, protecting the user's privacy while ensuring the accuracy of the perturbed data;
step 3: transmitting the perturbed data for model training, and calculating the user's credit score on the user terminal using the trained model.
9. The privacy-computing-based credit scoring model modeling method of claim 8, wherein the model training in step 3 builds a logistic unit that fits the feature variables and the target variable to obtain a logistic regression model used to predict the classification result.
10. The method of claim 9, wherein in step 4 the credit score data is classified and estimated through a network of computation nodes formed by a neural network.
CN202310821730.7A 2023-07-05 2023-07-05 Credit scoring model modeling method and system based on privacy calculation Pending CN116861459A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310821730.7A CN116861459A (en) 2023-07-05 2023-07-05 Credit scoring model modeling method and system based on privacy calculation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310821730.7A CN116861459A (en) 2023-07-05 2023-07-05 Credit scoring model modeling method and system based on privacy calculation

Publications (1)

Publication Number Publication Date
CN116861459A true CN116861459A (en) 2023-10-10

Family

ID=88218490

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310821730.7A Pending CN116861459A (en) 2023-07-05 2023-07-05 Credit scoring model modeling method and system based on privacy calculation

Country Status (1)

Country Link
CN (1) CN116861459A (en)

Similar Documents

Publication Publication Date Title
CN110084063B (en) Gradient descent calculation method for protecting private data
CN111259443B (en) PSI (program specific information) technology-based method for protecting privacy of federal learning prediction stage
CN113537633B (en) Prediction method, device, equipment, medium and system based on longitudinal federal learning
Barni et al. SEMBA: secure multi‐biometric authentication
Tan et al. High-secure fingerprint authentication system using ring-LWE cryptography
CN101331706A (en) Secure threshold decryption protocol computation
CN110400162B (en) Data processing method, device, server and system
Alberto Torres et al. Privacy-preserving biometrics authentication systems using fully homomorphic encryption
CN112185395B (en) Federal voiceprint recognition method based on differential privacy
CN110674941B (en) Data encryption transmission method and system based on neural network
Torres et al. Effectiveness of fully homomorphic encryption to preserve the privacy of biometric data
CN112532383B (en) Privacy protection calculation method based on secret sharing
Liu et al. An efficient biometric identification in cloud computing with enhanced privacy security
CN115913537A (en) Data intersection method and system based on privacy protection and related equipment
CN111490995A (en) Model training method and device for protecting privacy, data processing method and server
CN116561787A (en) Training method and device for visual image classification model and electronic equipment
Luo et al. Anonymous biometric access control based on homomorphic encryption
CN112380404B (en) Data filtering method, device and system
CN105897401B (en) General differential power consumption analysis method and system based on bit
Eltaieb et al. Efficient implementation of cancelable face recognition based on elliptic curve cryptography
Hamian et al. Blockchain-based User Re-enrollment for Biometric Authentication Systems
CN116861459A (en) Credit scoring model modeling method and system based on privacy calculation
CN115456766A (en) Credit risk prediction method and device
CN111475690B (en) Character string matching method and device, data detection method and server
CN114547684A (en) Method and device for protecting multi-party joint training tree model of private data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination