CN112308236A - Method, device, electronic equipment and storage medium for processing user request - Google Patents

Method, device, electronic equipment and storage medium for processing user request

Info

Publication number
CN112308236A
Authority
CN
China
Prior art keywords
prediction model
user request
user
data
actual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011191057.6A
Other languages
Chinese (zh)
Inventor
尉乃升
张梦
陈浩
孙冠超
卢瑶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202011191057.6A priority Critical patent/CN112308236A/en
Publication of CN112308236A publication Critical patent/CN112308236A/en
Priority to EP21179364.1A priority patent/EP3869374B1/en
Priority to US17/304,281 priority patent/US20210312017A1/en
Priority to JP2021103326A priority patent/JP7223067B2/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/10Protecting distributed programs or content, e.g. vending or licensing of copyrighted material ; Digital rights management [DRM]
    • G06F21/12Protecting executable software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/629Protecting access to data via a platform, e.g. using keys or access control rules to features or functions of an application
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/52Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow
    • G06F21/53Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow by executing in a restricted environment, e.g. sandbox or secure virtual machine
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/57Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/57Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • G06F21/577Assessing vulnerabilities and evaluating computer system security
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/64Protecting data integrity, e.g. using checksums, certificates or signatures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • G06N20/20Ensemble learning
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/04Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0428Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0823Network architectures or network communication protocols for network security for authentication of entities using certificates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2141Access rights, e.g. capability lists, access control lists, access tables, access matrices

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • Medical Informatics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Technology Law (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Information Transfer Between Computers (AREA)
  • Storage Device Security (AREA)

Abstract

The embodiment of the application discloses a method and an apparatus for processing a user request, an electronic device and a computer-readable storage medium, relating to the technical fields of intelligent cloud and machine learning. One embodiment of the method comprises: receiving an incoming user request; sending the user request to a target prediction model stored in a secure container, wherein the secure container is created in a local storage space using the Software Guard Extensions (SGX) technology, the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeling result samples, and the encrypted feature samples are transmitted from the feature data provider through a ciphertext transmission channel established between the feature data provider and the secure container; and receiving a prediction result output by the target prediction model. With this implementation, sensitive data of the data consumer is kept from leaving its domain, and performance and time overhead are reduced, so that the user-facing data consumer can respond to user requests more quickly.

Description

Method, device, electronic equipment and storage medium for processing user request
Technical Field
The present application relates to the field of artificial intelligence, in particular to the fields of intelligent cloud and machine learning technologies, and more particularly to a method and an apparatus for processing a user request, an electronic device, and a computer-readable storage medium.
Background
With the development of electronic informatization, user data collected by any single party is rarely sufficient to evaluate a user's actual situation comprehensively. Multiple parties therefore often need to pool the user data they have each collected (with the user's authorization) in order to train a machine learning model that can output more comprehensive and accurate results.
The prior art provides a scheme in which the feature data provider builds an initial model locally, trains it on its own feature data together with labeling results supplied by the data consumer, and then exposes a calling interface of the trained model to the data consumer.
Disclosure of Invention
The embodiment of the application provides a method and a device for processing a user request, electronic equipment and a computer-readable storage medium.
In a first aspect, an embodiment of the present application provides a method for processing a user request, including: receiving an incoming user request; sending the user request to a target prediction model stored in a secure container, wherein the secure container is created in a local storage space using the Software Guard Extensions (SGX) technology, the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeling result samples, and the encrypted feature samples are transmitted from the feature data provider through a ciphertext transmission channel established between the feature data provider and the secure container; and receiving a prediction result output by the target prediction model.
In a second aspect, an embodiment of the present application provides an apparatus for processing a user request, including: a user request receiving unit configured to receive an incoming user request; a user request sending unit configured to send the user request to a target prediction model stored in a secure container, wherein the secure container is created in a local storage space using the Software Guard Extensions (SGX) technology, the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeling result samples, and the encrypted feature samples are transmitted from the feature data provider through a ciphertext transmission channel established between the feature data provider and the secure container; and a prediction result receiving unit configured to receive a prediction result output by the target prediction model.
In a third aspect, an embodiment of the present application provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions, when executed, cause the at least one processor to perform the method for processing a user request as described in any implementation of the first aspect.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium storing computer instructions which, when executed, cause a computer to perform the method for processing a user request as described in any implementation of the first aspect.
According to the method, apparatus, electronic device and computer-readable storage medium for processing a user request, the data consumer first receives an incoming user request; the data consumer then sends the user request to a target prediction model stored in a secure container, wherein the secure container is created in a local storage space using the Software Guard Extensions (SGX) technology, the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeling result samples, and the encrypted feature samples are transmitted from the feature data provider through a ciphertext transmission channel established between the feature data provider and the secure container; finally, the data consumer receives the prediction result output by the target prediction model.
Unlike the prior art, in which the prediction model is created and stored at the feature data provider, the present application creates and stores the prediction model at the data consumer. The labeling results, which are more sensitive than the feature data supplied by the feature data provider, therefore never need to leave the data consumer, reducing the risk that exporting such data might entail. Moreover, because the target model is deployed locally at the data consumer, which directly receives the user's request, the performance and time overhead incurred by routing requests through the feature data provider and over long-distance data transmission is avoided, so that user requests can be answered more quickly.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture to which the present application may be applied;
fig. 2 is a flowchart of a method for processing a user request according to an embodiment of the present application;
FIG. 3 is a flow chart of another method for processing a user request provided by an embodiment of the present application;
FIG. 4 is a timing diagram illustrating a method for processing a user request according to an embodiment of the present disclosure;
fig. 5 is a flowchart of a method for verifying the identity validity of incoming actual data in the method for processing a user request according to the embodiment of the present application;
FIG. 6 is a flowchart illustrating a method for processing a user request in an application scenario according to an embodiment of the present application;
fig. 7 is a block diagram illustrating an apparatus for processing a user request according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic device adapted to execute a method for processing a user request according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the method, apparatus, electronic device, and computer-readable storage medium for processing a user request of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include a feature data provider 101, a data consumer 102, and a user terminal 103. Data transmission between the feature data provider 101 and the data consumer 102, and between the data consumer 102 and the user terminal 103, may be implemented over a network, which may take the form of wired or wireless communication links or optical fiber cables.
A user may initiate a user request to the data consumer 102 through the user terminal 103, and the data consumer 102 may process the received user request through various applications installed or running on it, such as a risk rating application or an identity authentication application. In order to process the user request correctly, the data consumer 102 needs to combine the relevant feature data provided by the feature data provider 101 to judge, before processing the request, whether the content of the request initiated by the user is reasonable. To achieve this, the data consumer 102 and the feature data provider 101 may also have corresponding types of applications installed or running on them, such as a multi-party training application or a secure container construction application.
The feature data provider 101, the data consumer 102 and the user terminal 103 may each be hardware or software. When the user terminal 103 is hardware, it may be any of various electronic devices with a display screen, including but not limited to smart phones, tablet computers, laptop computers and desktop computers; when it is software, it may be installed in the electronic devices listed above and implemented either as multiple pieces of software or software modules or as a single piece of software or software module, which is not limited here. When the feature data provider 101 and the data consumer 102 are hardware, they may each be implemented as a single server or as a distributed cluster of multiple servers; when they are software, they may likewise be implemented as multiple pieces of software or software modules or as a single piece of software or software module, which is not specifically limited here.
The user-facing data consumer 102 can provide various services through various built-in applications. Taking as an example a request processing application that judges whether a user request should pass, the data consumer 102 can achieve the following effects when running this application: first, it receives a user request sent by a user through the user terminal 103; then, it sends the user request to a target prediction model stored in a secure container, where the secure container is created by the data consumer 102 in its local storage space using the Software Guard Extensions (SGX) technology, the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeling result samples, and the encrypted feature samples are transmitted from the feature data provider through a ciphertext transmission path established between the feature data provider and the secure container; finally, it receives the prediction result output by the target prediction model for the user request. The data consumer 102 may further return the prediction result to the user terminal 103, so as to inform the user of the outcome of the request.
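To make the above flow concrete, the following is a minimal Python sketch of the request-processing loop at the data consumer 102. The class, method and field names (RequestProcessingService, predict, and so on) are hypothetical placeholders introduced purely for illustration and are not part of the disclosure.

```python
class RequestProcessingService:
    """Runs at the data consumer (102); the secure-container handle is assumed to
    expose a predict() call backed by the target prediction model in the Enclave."""

    def __init__(self, secure_container):
        self.secure_container = secure_container

    def handle_user_request(self, user_request: dict) -> dict:
        # 1) the user request arrives from the user terminal (103);
        # 2) it is sent to the target prediction model stored in the secure container;
        prediction = self.secure_container.predict(user_request)
        # 3) the prediction result is returned so it can be passed back to the user terminal.
        return {"request_id": user_request.get("id"), "prediction": prediction}
```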
The method for processing the user request provided in the following embodiments of the present application is generally performed by the data consumer 102, and accordingly, the apparatus for processing the user request is also generally disposed in the data consumer 102.
It should be understood that the numbers of feature data providers, data consumers and user terminals in fig. 1 are merely illustrative. For a given data consumer, there may be any number of feature data providers and user terminals, as required by the implementation.
Referring to fig. 2, fig. 2 is a flowchart of a method for processing a user request according to an embodiment of the present application, where the process 200 includes the following steps:
step 201: receiving an incoming user request;
In this step, the execution subject of the method for processing a user request (for example, the data consumer 102 shown in fig. 1) receives a user request sent by a user through a user terminal (for example, the user terminal 103 shown in fig. 1).
Specifically, the user request may take various forms. An identity authentication request is a request by which a user asks an application running on the execution subject, and having an identity authentication requirement, to verify the user's real identity, so that the application triggers subsequent operations once it determines that the user is legitimate; this determination may be made by a related judgment (or decision) model that inspects the identity authentication data uploaded by the user along with the request. A sensitive data reading request is a request by which a user asks an application running on the execution subject, and storing sensitive data, to read certain sensitive data, so that the application can decide whether the user may be allowed to read it; this decision may be made by a related judgment model that inspects the user's permission data. A fund lending request is a request by which a user asks a lending application running on the execution subject, and having certified qualifications, to lend a certain amount of funds, so that the application grants the loan only if it judges that the user has sufficient capacity to repay on time; this judgment may be made by a related judgment model that inspects data characterizing the user's financial strength.
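Purely as an illustration of how different request types map to the data a judgment model inspects, a hypothetical lookup table might look as follows; the request-type keys and field names are invented for this sketch and do not appear in the disclosure.

```python
# Each request type is judged by a related judgment model that inspects a
# different kind of user data carried with the request.
JUDGMENT_INPUT_FIELD = {
    "identity_authentication": "identity_authentication_data",  # is this a legitimate user?
    "sensitive_data_read": "permission_data",                    # may this user read the data?
    "fund_lending": "repayment_capability_data",                 # can this user repay on time?
}

def extract_judgment_input(user_request: dict) -> dict:
    # Pull out the field that the corresponding judgment model needs.
    field_name = JUDGMENT_INPUT_FIELD[user_request["type"]]
    return {field_name: user_request.get(field_name)}
```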
Step 202: sending the user request to a target prediction model stored in a secure container;
On the basis of step 201, this step is intended to send the user request, by the execution subject, to the target prediction model stored in the secure container. The secure container (Enclave) is created by the execution subject in its local storage space using the SGX technology; the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeling result samples; the encrypted feature samples are transmitted from the feature data provider through a ciphertext transmission channel established between the feature data provider and the secure container; and the labeling result samples corresponding to the encrypted feature samples are provided by the execution subject and transmitted locally to the initial prediction model in the secure container for training.
SGX (Software Guard Extensions) is an extension of the Intel architecture that adds a new set of instructions and a memory access mechanism to the original architecture. These extensions allow an application to set up a container called an Enclave: a protected area carved out of the application's address space that provides confidentiality and integrity protection for the code and data inside it, even against malicious software running with special privileges. An Enclave is thus a protected content container for storing the application's sensitive data and code. SGX allows an application to specify the portions of code and data that need to be protected; these portions do not have to be examined or analyzed before the Enclave is created, but they must be measured. After the protected parts of the application are loaded into the Enclave, SGX shields them from external software. An Enclave can prove its identity to a remote verifier and provides the structures necessary for securely provisioning keys. The application may also request a unique key, derived from the combination of the Enclave identity and the platform identity, which can be used to protect keys or data stored outside the Enclave.
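SGX itself is exposed through CPU instructions and a C/C++ SDK rather than Python. Solely to keep the notions of loading and measurement concrete, the following toy Python stand-in (not the real SGX API) mimics an Enclave that accumulates a measurement over the protected content loaded into it; all names here are invented for the sketch.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class ToyEnclave:
    """Toy stand-in for an SGX Enclave, used only to illustrate the ideas above:
    a protected container whose loaded contents are measured, so it can later
    prove its identity to a remote verifier."""
    protected_content: dict = field(default_factory=dict)
    _measurement: bytes = b""

    def load(self, name: str, blob: bytes) -> None:
        # Loading protected code/data extends the measurement (here: a running hash).
        self.protected_content[name] = blob
        self._measurement = hashlib.sha256(self._measurement + blob).digest()

    def measurement_report(self) -> str:
        # Reported during remote attestation so a verifier can check what was loaded.
        return self._measurement.hex()
```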
It can be seen that the present application uses the SGX technology to create the prediction model inside an Enclave container because the training samples used to obtain a usable prediction model consist of two parts from two different execution subjects: the data consumer (e.g., the data consumer 102 shown in fig. 1), which directly faces the user and directly serves the user, and the feature data provider (e.g., the feature data provider 101 shown in fig. 1), which supplies part of the feature data needed for the prediction result but missing from (or unavailable or hard to obtain for) the user-facing data consumer.
It should be understood that, in the absence of a third party that could receive the sensitive data of both the data consumer and the feature data provider and train the prediction model, the creation and training of the prediction model must be carried out by one of the two parties. Whether it is the data consumer or the feature data provider, the training samples each party provides are sensitive data that must be kept secret from the other party. Therefore, to prevent either party from obtaining the other's sensitive data, the present application uses the SGX technology to construct a "black box" (i.e., the Enclave) that is opaque to both parties, thereby ensuring that neither party can obtain the sensitive data of the other.
Unlike the prior art, which places the creation and training of the prediction model at the feature data provider (also called the model provider), the present application places them at the data consumer that directly faces the user. The feature data provider therefore needs to transmit its training samples to the initial prediction model inside the secure container (i.e., the Enclave). To avoid loss of sensitive data should the training samples be intercepted in transit, a ciphertext transmission path between the feature data provider and the secure container is constructed to achieve more secure data transmission. The ciphertext transmission path may be created on the basis of a creation request initiated by the execution subject to the feature data provider, or on the basis of a creation request initiated by the feature data provider to the execution subject.
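The following is a minimal sketch of what the ciphertext transmission path amounts to once it has been negotiated, assuming the two endpoints already share a symmetric key (for example, as the outcome of remote attestation and key exchange, which is omitted here). The use of AES-GCM and the helper names are assumptions made for this illustration.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Assumed to be the outcome of attestation / key exchange between the feature data
# provider and the Enclave; the data consumer's untrusted code never holds it.
shared_key = AESGCM.generate_key(bit_length=128)

def provider_send(plaintext_sample: bytes) -> tuple[bytes, bytes]:
    # Feature data provider side: the sample is encrypted before it leaves the provider.
    nonce = os.urandom(12)
    return nonce, AESGCM(shared_key).encrypt(nonce, plaintext_sample, b"feature-sample")

def enclave_receive(nonce: bytes, ciphertext: bytes) -> bytes:
    # Secure container side: only code inside the Enclave can decrypt, so the
    # plaintext features stay invisible to the data consumer itself.
    return AESGCM(shared_key).decrypt(nonce, ciphertext, b"feature-sample")
```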
Step 203: and receiving a prediction result output by the target prediction model.
On the basis of step 202, this step is intended to receive, by the execution subject, the prediction result output by the target prediction model for the user request. The prediction result output by the target prediction model can carry different meanings depending on the specific content of the user request.
For example, when the user request is an identity authentication request, the data consumer and the feature data provider respectively provide a first and a second class of information for determining the user's real identity, and the trained target prediction model then predicts, from the user data extracted from the user request, the likelihood that the user is a legitimate user; this likelihood serves as the prediction result. When the user request is a sensitive data reading request, the data consumer and the feature data provider respectively provide first and second permission information of the user, and the trained target prediction model then predicts, from the user data extracted from the user request, the likelihood that the user has permission to read the sensitive data; this likelihood serves as the prediction result. When the user request is a fund lending request, the data consumer may provide the user's deposit information held with UnionPay, the feature data provider may provide a behavior portrait of the user built on a social or shopping platform, and the trained target prediction model then predicts, from the user identity information extracted from the user request, the user's capacity to repay on time, or a risk rating of failing to repay on time; this capacity or rating serves as the prediction result.
Further, the actual risk rating of the user corresponding to the user request can be determined from the prediction result; in response to the actual risk rating being not higher than a preset rating, response information indicating that the user request does not pass is returned, i.e., the request initiated by the user is rejected.
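An illustrative decision rule for the above paragraph, with the preset rating, the 0-100 scale, and the field names chosen arbitrarily for the sketch:

```python
PRESET_RATING = 60  # assumed pass line on a 0-100 scale

def decide(prediction_score: float) -> dict:
    # Map the model output (taken here as a probability-like score in [0, 1])
    # to an actual risk rating on the 0-100 scale.
    actual_rating = round(prediction_score * 100)
    if actual_rating <= PRESET_RATING:
        # Not higher than the preset rating: return a "request does not pass" response.
        return {"passed": False, "reason": "actual risk rating not higher than preset rating"}
    return {"passed": True, "actual_rating": actual_rating}
```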
Unlike the prior art, in which the prediction model is created and stored at the feature data provider, the method for processing a user request provided by this embodiment of the application creates and stores the prediction model at the data consumer. The labeling results, which are more sensitive than the feature data supplied by the feature data provider, therefore never need to leave the data consumer, reducing the risk that exporting such data might entail. Moreover, because the target model is deployed locally at the data consumer, which directly receives the user's request, the performance and time overhead incurred by the feature data provider and long-distance data transmission is avoided, so that user requests can be answered more quickly.
Referring to fig. 3, fig. 3 is a flowchart of another method for processing a user request according to an embodiment of the present application, where the process 300 includes the following steps:
step 301: creating a secure container in a local storage space by using an SGX technology;
this step is intended to create a secure container in the local storage space by the executing agent described above using SGX techniques.
Step 302: establishing an initial prediction model in the secret container, and establishing a ciphertext transmission path between the secret container and a characteristic data providing end;
On the basis of step 301, in this step the execution subject creates an initial prediction model in the secure container and actively establishes a ciphertext transmission path between the secure container and the feature data provider.
Step 303: receiving an encrypted feature sample transmitted by a feature data providing end through a ciphertext transmission path;
The encrypted feature samples are obtained by encrypting the plaintext feature samples provided by the feature data provider. To prevent the execution subject from learning the encryption scheme and thus unilaterally obtaining the sensitive data provided by the feature data provider, the encryption scheme should be made known only to the secure container, so that only inside the secure container can the samples be restored to identifiable plaintext data for use.
Step 304: training an initial prediction model by using the encrypted characteristic samples and the marked result samples corresponding to the encrypted characteristic samples to obtain a target prediction model;
On the basis of step 303, in this step the execution subject trains the initial prediction model using the encrypted feature samples and the labeling result samples corresponding to them, thereby obtaining a trained, usable target prediction model. The labeling result samples are the sensitive data contributed by the execution subject, and they are provided in an over-covering manner, because only in this way can the execution subject, without knowing the specific content of the encrypted feature samples, provide labeling results that correspond as far as possible to the actual content of those samples.
Step 305: receiving an incoming user request;
step 306: sending the user request to a target prediction model stored in a secure container;
step 307: and receiving a prediction result output by the target prediction model.
Steps 305 to 307 above are consistent with steps 201 to 203 of the embodiment shown in fig. 2; for the same contents, reference may be made to the corresponding descriptions of that embodiment, which are not repeated here.
On the basis of the above embodiment, this embodiment provides, through steps 301 to 304, a specific scheme for training the target prediction model stored at the data using end, and details the operation of each step, making the scheme more practicable.
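A condensed Python sketch of steps 301 to 304 as seen from the data consumer's side follows; every callable passed in (create_secure_container, create_initial_model, build_ciphertext_path) is a hypothetical stand-in for the operation named in the corresponding step, not an API defined by the disclosure or by the SGX SDK.

```python
def prepare_target_prediction_model(create_secure_container, create_initial_model,
                                    build_ciphertext_path, labeling_result_samples):
    enclave = create_secure_container()                       # step 301: SGX Enclave in local storage
    model = create_initial_model(enclave)                     # step 302: initial model inside the Enclave
    channel = build_ciphertext_path(enclave)                  # step 302: path to the feature data provider
    encrypted_features = channel.receive_encrypted_samples()  # step 303: encrypted feature samples arrive
    # Step 304: training happens inside the Enclave, so neither party sees the
    # other's plaintext sensitive data.
    model.train(encrypted_features, labeling_result_samples)
    return model
```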
Fig. 2 and fig. 3 above each describe an implementation of the present application solely from the perspective of the user-facing data using end. To show more intuitively the operations performed by the different execution subjects over the whole course of the scheme, fig. 4 provides a timing diagram of a method for processing a user request, comprising the following steps:
Step 401: the data using end creates a secure container using the SGX technology;
Step 402: the data using end creates an initial prediction model in the secure container;
Step 403: the feature data providing end sends the encrypted feature samples to the secure container;
Step 404: the data using end trains with the encrypted feature samples and the corresponding labeling result samples to obtain the target prediction model;
Step 405: the user terminal sends a user request to the data using end;
Step 406: the data using end inputs the user request into the target prediction model for processing;
Step 407: the data using end returns the prediction result output by the target prediction model to the user terminal.
The timing diagram of fig. 4 shows the operations performed in the whole scheme by the three execution subjects, namely the feature data providing end, the data using end and the user terminal. Steps 401 to 404 describe the preliminary data interaction between the feature data providing end and the data using end, which builds the trained target prediction model at the data using end; on this basis, steps 405 to 407 describe how, in an actual application scenario, the user terminal interacts with the data using end, which invokes the trained target prediction model to output the prediction result.
On the basis of any of the above embodiments, in order to prevent erroneous or malicious data from being passed into the secure container and interfering with the model training process, the identity of the actual data transmitted through the ciphertext transmission path may be verified: that is, whether the sender of the actual data is a legitimate feature data provider, or whether the feature data provider, as the generator of the data, is in a normal working state. Taking the verification of whether the sender is a legitimate feature data provider as an example, the process 500 shown in fig. 5 provides specific authentication steps:
Step 501: extracting the actual certificate from the actual data transmitted through the ciphertext transmission path;
A legitimate feature data provider and a legitimate data consumer can each apply to Intel's electronic certificate issuing authority for a certified electronic certificate and embed it in the data they send, so that the receiving party can conveniently extract the actual certificate from the received data and verify identity by submitting the actual certificate to the issuing authority for legitimacy certification.
Step 502: judging whether the actual certificate is a legitimate certificate issued by the SGX certificate authority; if so, executing step 503, otherwise executing step 505;
Step 503: determining that the feature data provider that transmitted the actual data is a legitimate feature data provider;
This step follows from the determination in step 502 that the actual certificate is a legitimate certificate issued by the SGX certificate authority, from which it can be concluded that the feature data provider that transmitted the actual data is a legitimate feature data provider.
Step 504: allowing the actual data to pass into the secure container;
On the basis of step 503, this step is intended to have the above execution subject allow the actual data to be passed into the secure container.
Step 505: determining that the feature data provider that transmitted the actual data is an illegitimate feature data provider;
This step follows from the determination in step 502 that the actual certificate is not a legitimate certificate issued by the SGX certificate authority, from which it can be concluded that the feature data provider that transmitted the actual data is an illegitimate feature data provider.
Step 506: refusing to pass the actual data into the secure container.
On the basis of step 505, this step is intended to have the above execution subject refuse to pass the actual data into the secure container.
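A minimal sketch of the admission check of flow 500 follows; is_issued_by_sgx_authority stands in for the external check against the SGX certificate authority, and the message field name is hypothetical.

```python
def admit_incoming_data(message: dict, is_issued_by_sgx_authority) -> bool:
    # Step 501: extract the actual certificate embedded in the incoming data.
    actual_certificate = message.get("certificate")
    # Step 502: check that it is a legitimate certificate issued by the SGX certificate authority.
    if actual_certificate is not None and is_issued_by_sgx_authority(actual_certificate):
        # Steps 503-504: legitimate feature data provider, pass the data into the secure container.
        return True
    # Steps 505-506: illegitimate provider, refuse to pass the data in.
    return False
```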
In addition, when verifying whether the feature data provider, as the generator of the data, is in a normal working state, the implementation can be adapted accordingly, for example by combining checks such as whether the content of the incoming data is abnormal, whether it arrives at the predetermined time, and whether a specific identifier is present; details are not repeated here.
On the basis of any of the above embodiments, in order to ensure the continued usability of the target prediction model, the execution subject may further receive incremental encrypted features transmitted by the feature data provider through the ciphertext transmission path, and update the target prediction model with the incremental encrypted features and the labeling results corresponding to them. The update frequency may be fixed or adjustable, as appropriate.
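A sketch of this incremental update, with hypothetical channel and model interfaces; how often it runs is left to the deployment, matching the fixed-or-adjustable frequency mentioned above.

```python
def incremental_update(channel, model, get_labels_for):
    # Incremental encrypted features arrive over the same ciphertext transmission path.
    batch = channel.receive_incremental_encrypted_features()
    # The data consumer supplies the labeling results it holds for this batch.
    labels = get_labels_for(batch)
    # Refresh the target prediction model in place (e.g. warm-start / online
    # training carried out inside the Enclave).
    model.update(batch, labels)
```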
For further understanding, the present application further provides a specific implementation scheme in combination with a specific application scenario, please refer to the flowchart shown in fig. 6:
the feature data provider is a party A, the data user is a party B, and the online prediction service is developed based on SGX, deployed on an SGX machine in a party B machine room and specifically operated in Enclave created by an SGX technology.
Early preparation work:
configuring a B-party SGX machine certificate (issued by Intel's certificate issuing architecture) into an A-party machine for subsequent feature services to authenticate whether the request is from an authorized SGX machine; configuring a client identifier and a key distributed by the A party to an online prediction service of a machine of the A party for making a request signature and encrypting request data; and deploying the prediction model to the B-party machine online prediction service for subsequent model operation.
1) A decision engine of the B party requests an online prediction service by taking a user ID + B party characteristics X (optional, according to the specific type of a model) + the model ID as a request parameter;
2) the online prediction service requests the local DCAP (Data Center Attestation Primitives) service to obtain information signed with the current machine's SGX (Software Guard Extensions), i.e., an SGX signature;
3) the online prediction service uses the client key sk distributed by the A-party service to AES-encrypt (AES: Advanced Encryption Standard) the user ID and the SGX signature, obtaining the encrypted information encrypt; it then computes over encrypt a sha256 (a hash algorithm producing a 256-bit digest) integrity check signature sign, and sends these together with the B-party client identifier ak to the A-party feature service (a sketch of this step is given after this list); communication between party A and party B is secured using the HTTPS (Hypertext Transfer Protocol over Secure Socket Layer) protocol;
4) after the A-party feature service receives the B-party request, it checks the validity of ak; once this passes it obtains sk, uses sk to check sign and the data integrity, then decrypts encrypt to obtain the plaintext user ID and the SGX signature, and requests the local DCAP service to check the validity of the SGX signature. If any check fails, access is denied;
5) the A-party feature service queries the database for the user's desensitized feature information according to the user ID;
6) the A-party feature service AES-encrypts the user's feature information with sk and generates a sha256 integrity check signature sign;
7) the A-party service returns the user ID + the A-party feature X + the sha256 signature to the online prediction service;
8) the online prediction service receives the information returned by party A, decrypts it inside the Enclave to obtain the A-party feature X and the B-party feature X (if present), invokes the model to perform the computation, generates the user's model score, and returns the model score to the decision engine of party B.
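A sketch of the request protection performed in step 3): encrypt the user ID and SGX signature with the client key sk distributed by party A, derive a sha256 integrity check value over the ciphertext, and send these with the client identifier ak (transport over HTTPS is not shown). The choice of AES-GCM as the AES mode and all field names are assumptions made for this sketch.

```python
import os, json, hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def build_party_a_request(user_id: str, sgx_signature: bytes, ak: str, sk: bytes) -> dict:
    # AES-encrypt the user ID and the SGX signature with the client key sk (step 3).
    nonce = os.urandom(12)
    payload = json.dumps({"user_id": user_id, "sgx_signature": sgx_signature.hex()}).encode()
    encrypt = AESGCM(sk).encrypt(nonce, payload, None)
    # sha256 integrity check signature over the ciphertext.
    sign = hashlib.sha256(encrypt).hexdigest()
    # Sent together with the B-party client identifier ak; party A re-derives and
    # compares sign, then decrypts encrypt after validating ak (step 4).
    return {"ak": ak, "nonce": nonce.hex(), "encrypt": encrypt.hex(), "sign": sign}
```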
To further aid understanding of the above process, a concrete implementation flow is also given for the case where the user request is a fund lending request:
In this case the feature provider is a user behavior portrait provider capable of collecting the user's social and/or shopping behaviors (such as a social application and/or a shopping application) and supplying the user behavior portrait, while the data consumer is a fund lender that records the user's asset information (such as a qualified credit institution, e.g., a bank). With the present scheme, a target model that can risk-rate whether a borrowing user will repay on schedule is trained in the fund lender's machine room using the user behavior portrait and the user asset information.
Upon receiving a fund borrowing request from a user, the fund lender feeds the user identity information contained in the request into the target model as input, so that the target model outputs a risk rating that combines the user's behavior portrait and asset information; the rating may specifically be output as a quantitative score. The fund lender finally decides, according to the output risk rating score, whether to approve the user's fund borrowing request.
With further reference to fig. 7, as an implementation of the method shown in the above-mentioned figures, the present application provides an embodiment of an apparatus for processing a user request, where the embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2, and the apparatus may be applied to various electronic devices.
As shown in fig. 7, the apparatus 700 for processing a user request of this embodiment may include: a user request receiving unit 701, a user request sending unit 702, and a prediction result receiving unit 703. The user request receiving unit 701 is configured to receive an incoming user request; the user request sending unit 702 is configured to send the user request to a target prediction model stored in a secure container, where the secure container is created in a local storage space using the Software Guard Extensions (SGX) technology, the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeling result samples, and the encrypted feature samples are transmitted from the feature data provider through a ciphertext transmission channel established between the feature data provider and the secure container; and the prediction result receiving unit 703 is configured to receive a prediction result output by the target prediction model.
In the apparatus 700 for processing a user request of this embodiment, for the detailed processing and the technical effects of the user request receiving unit 701, the user request sending unit 702 and the prediction result receiving unit 703, reference may be made to the descriptions of steps 201 to 203 in the embodiment corresponding to fig. 2, which are not repeated here.
In some optional implementations of this embodiment, the apparatus 700 for processing a user request may further include a target prediction model training unit configured to train a target prediction model, and the target prediction model training unit may be further configured to:
creating a secure container in a local storage space by using an SGX technology;
establishing an initial prediction model in the secure container, and establishing a ciphertext transmission path between the secure container and the feature data provider;
receiving an encrypted feature sample transmitted by a feature data providing end through a ciphertext transmission path;
and training the initial prediction model with the encrypted feature samples and the corresponding labeling result samples to obtain the target prediction model.
In some optional implementations of this embodiment, the apparatus 700 for processing a user request may further include:
an actual risk rating determination unit configured to determine an actual risk rating of the user corresponding to the user request according to the prediction result;
and a non-passing response information returning unit configured to return response information that the user request does not pass in response to the actual risk rating not being higher than the preset rating.
In some optional implementations of this embodiment, the apparatus 700 for processing a user request may further include:
an identity verification unit configured to verify the identity of the actual data transmitted through the ciphertext transmission path;
and an identity-legitimate processing unit configured to allow the actual data to be passed into the secure container in response to the identity verification result indicating that the feature data provider is legitimate.
In some optional implementations of this embodiment, the identity verification unit may be further configured to:
extracting an actual certificate from actual data transmitted through a ciphertext transmission path;
judging whether the actual certificate is a legitimate certificate issued by the SGX certificate authority;
if the actual certificate is a legitimate certificate, determining that the feature data provider that transmitted the actual data is a legitimate feature data provider;
and if the actual certificate is not a legitimate certificate, determining that the feature data provider that transmitted the actual data is an illegitimate feature data provider.
In some optional implementations of this embodiment, the apparatus 700 for processing a user request may further include:
an incremental encrypted feature receiving unit configured to receive incremental encrypted features transmitted by the feature data provider through the ciphertext transmission path;
and a model updating unit configured to update the target prediction model using the incremental encrypted features and the labeling results corresponding to them.
This apparatus embodiment corresponds to the method embodiment above. Unlike the prior art, in which the prediction model is created and stored at the feature data provider, the apparatus for processing a user request provided by this embodiment of the application creates and stores the prediction model at the data consumer, so that the labeling results, which are more sensitive than the feature data supplied by the feature data provider, never need to leave the data consumer, reducing the risk that exporting such data might entail. Moreover, because the target model is deployed locally at the data consumer, which directly receives the user's request, the performance and time overhead incurred by the feature data provider and long-distance data transmission is avoided, so that user requests can be answered more quickly.
According to an embodiment of the present application, an electronic device and a computer-readable storage medium are also provided.
FIG. 8 shows a block diagram of an electronic device suitable for use in implementing the method for processing a user request of an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 8, the electronic device includes: one or more processors 801, a memory 802, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The components are interconnected by different buses and may be mounted on a common motherboard or in other ways as required. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, if desired. Likewise, multiple electronic devices may be connected, each providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). Fig. 8 takes one processor 801 as an example.
The memory 802 is a non-transitory computer-readable storage medium provided herein. The memory stores instructions executable by the at least one processor, so that the at least one processor performs the method for processing a user request provided herein. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to perform the method for processing a user request provided herein.
The memory 802, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the method for processing a user request in the embodiment of the present application (e.g., the user request receiving unit 701, the user request transmitting unit 702, and the prediction result receiving unit 703 shown in fig. 7). The processor 801 executes various functional applications of the server and data processing by running non-transitory software programs, instructions, and modules stored in the memory 802, that is, implements the method for processing a user request in the above-described method embodiment.
The memory 802 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store various types of data, etc., created by the electronic device in performing the method for processing the user request. Further, the memory 802 may include high speed random access memory and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 802 optionally includes memory located remotely from the processor 801, which may be connected via a network to an electronic device adapted to perform a method for processing user requests. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device adapted to perform the method for processing a user request may further include: an input device 803 and an output device 804. The processor 801, the memory 802, the input device 803, and the output device 804 may be connected by a bus or other means, and are exemplified by a bus in fig. 8.
The input device 803 may receive input numeric or character information and generate key signal inputs related to user settings and function control of an electronic apparatus suitable for performing a method for processing a user request, such as an input device like a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, etc. The output devices 804 may include a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, ASICs (application-specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose and which can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or a cloud host, which is a host product in the cloud computing service system and addresses the drawbacks of difficult management and weak service extensibility in conventional physical host and virtual private server (VPS) services.
Different from the prior art, in which the prediction model is created and stored at the feature data provider, the embodiment of the present application creates and stores the prediction model at the data user. Because the labeled results held by the data user are more sensitive than the feature data supplied by the feature data provider, keeping the model at the data user means those labeled results never need to be exported, which reduces the risks that such export could cause. Meanwhile, the data user can apply the target model directly to the ordinary requests sent by users, and deploying the model locally at the data user also avoids the performance and time overhead of long-distance data transmission to and from the feature data provider, so that user requests can be responded to more quickly.
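For illustration only, the data-user-side arrangement described above could be sketched as follows in Python. Every name here (SecureContainer, DataUser, the decrypt and trainer callbacks) is a hypothetical stand-in rather than part of this application; a real implementation would create the secure container with an SGX SDK and keep all plaintext inside enclave memory.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, List


@dataclass
class SecureContainer:
    """Stand-in for the SGX secure container created in local storage space."""
    model: Any = None
    decrypted_features: List[Any] = field(default_factory=list)

    def ingest_encrypted_features(self, ciphertexts, decrypt):
        # In a real enclave the decrypted plaintext never leaves protected memory.
        self.decrypted_features.extend(decrypt(c) for c in ciphertexts)


class DataUser:
    """The data user keeps both the labeled results and the trained model locally."""

    def __init__(self, trainer: Callable):
        self.container = SecureContainer()
        self.trainer = trainer        # any supervised training routine
        self.labeled_results = []     # sensitive labels, never exported

    def train(self, encrypted_samples, decrypt, labeled_results):
        # Encrypted feature samples arrive over the ciphertext channel (not shown)
        # and are decrypted only inside the secure container.
        self.container.ingest_encrypted_features(encrypted_samples, decrypt)
        self.labeled_results = labeled_results
        self.container.model = self.trainer(
            self.container.decrypted_features, self.labeled_results)

    def handle_user_request(self, request_features):
        # The incoming user request is forwarded to the target prediction model
        # stored in the secure container and the prediction result is returned
        # locally, with no round trip to the feature data provider.
        return self.container.model(request_features)
```

In this sketch the labeled results and the trained model live only in local objects of the data user, which mirrors the point above: nothing more sensitive than the already encrypted features ever crosses the boundary between the two parties.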
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders; this is not limited herein, as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (14)

1. A method for processing a user request, comprising:
receiving an incoming user request;
sending the user request to a target prediction model stored in a secure container; wherein the secure container is created in a local storage space by using a software guard extensions (SGX) technology, the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeled result samples, and the encrypted feature samples are transmitted from a feature data provider through a ciphertext transmission channel built between the feature data provider and the secure container;
and receiving a prediction result output by the target prediction model.
2. The method of claim 1, wherein training the target prediction model comprises:
creating the secure container in a local storage space by using the SGX technology;
creating the initial prediction model in the secure container, and building the ciphertext transmission channel between the secure container and the feature data provider;
receiving the encrypted feature samples transmitted by the feature data provider through the ciphertext transmission channel;
and training the initial prediction model by using the encrypted feature samples and the labeled result samples corresponding to the encrypted feature samples to obtain the target prediction model.
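Read as a procedure, the training steps of claim 2 could be sketched as below. This is a minimal sketch only; create_enclave, create_initial_model, open_ciphertext_channel and trainer are hypothetical callables standing in for the SGX SDK, the attested ciphertext channel and the learning algorithm, none of which are specified by the claim.

```python
def train_target_model(create_enclave, create_initial_model,
                       open_ciphertext_channel, trainer, labeled_result_samples):
    # Create the secure container in local storage space via SGX.
    enclave = create_enclave()

    # Create the initial prediction model inside the container and build the
    # ciphertext transmission channel to the feature data provider.
    initial_model = create_initial_model(enclave)
    channel = open_ciphertext_channel(enclave)

    # Receive the encrypted feature samples over the channel.
    encrypted_feature_samples = list(channel.receive_all())

    # Train with the encrypted samples and the locally held labeled result
    # samples to obtain the target prediction model.
    return trainer(initial_model, encrypted_feature_samples, labeled_result_samples)
```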
3. The method of claim 1, further comprising:
determining, according to the prediction result, an actual risk rating of the user corresponding to the user request;
and in response to the actual risk rating being not higher than a preset rating, returning response information indicating that the user request is rejected.
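As a small illustration, the check of claim 3 might look like the sketch below; to_rating is a hypothetical helper mapping the prediction result to a rating, and the rejection condition simply mirrors the claim's wording.

```python
def respond_to_user_request(prediction_result, preset_rating, to_rating):
    actual_risk_rating = to_rating(prediction_result)
    if not actual_risk_rating > preset_rating:   # "not higher than a preset rating"
        return {"passed": False, "message": "user request rejected"}
    return {"passed": True, "actual_risk_rating": actual_risk_rating}
```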
4. The method of claim 1, further comprising:
performing identity verification on actual data transmitted through the ciphertext transmission channel;
and in response to a result of the identity verification indicating a legal feature data provider, allowing the actual data to be transmitted into the secure container.
5. The method of claim 4, wherein the performing identity verification on the actual data transmitted through the ciphertext transmission channel comprises:
extracting an actual certificate from the actual data transmitted through the ciphertext transmission channel;
determining whether the actual certificate is a legal certificate issued by an authoritative organization of the SGX;
if the actual certificate is the legal certificate, determining that the feature data provider of the actual data is a legal feature data provider;
and if the actual certificate is not the legal certificate, determining that the feature data provider of the actual data is an illegal feature data provider.
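The identity verification of claims 4 and 5 could be sketched as below. The helpers extract_certificate and issued_by_sgx_authority are hypothetical; an actual implementation would validate the SGX attestation certificate chain with a cryptographic library rather than a boolean callback.

```python
def admit_actual_data(actual_data, extract_certificate, issued_by_sgx_authority):
    actual_certificate = extract_certificate(actual_data)
    if issued_by_sgx_authority(actual_certificate):
        # Legal feature data provider: the data may enter the secure container.
        return True
    # Illegal feature data provider: the data is not admitted.
    return False
```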
6. The method of any of claims 1 to 5, further comprising:
receiving incremental encrypted features transmitted by the feature data provider through the ciphertext transmission channel;
and updating the target prediction model by using the incremental encrypted features and the labeled results corresponding to the incremental encrypted features.
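The incremental update of claim 6 could be sketched as below, again with hypothetical names: channel, update_fn and labels_for stand in for the ciphertext transmission channel, the incremental learning routine and the locally held labeled results.

```python
def incremental_update(channel, target_model, labels_for, update_fn):
    # New encrypted features arrive over the same ciphertext transmission channel.
    incremental_features = list(channel.receive_all())
    # Update the target prediction model with the new features and the labeled
    # results that the data user holds for them locally.
    return update_fn(target_model, incremental_features,
                     labels_for(incremental_features))
```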
7. An apparatus for processing a user request, comprising:
a user request receiving unit configured to receive an incoming user request;
a user request transmitting unit configured to transmit the user request to a target prediction model stored in a secure container; wherein the secure container is created in a local storage space by using a software guard extensions (SGX) technology, the target prediction model is obtained by training an initial prediction model with encrypted feature samples and corresponding labeled result samples, and the encrypted feature samples are transmitted from a feature data provider through a ciphertext transmission channel built between the feature data provider and the secure container;
a prediction result receiving unit configured to receive a prediction result output by the target prediction model.
8. The apparatus of claim 7, further comprising a target prediction model training unit configured to train the target prediction model, the target prediction model training unit further configured to:
creating the secure container in a local storage space by using the SGX technology;
creating the initial prediction model in the secure container, and building the ciphertext transmission channel between the secure container and the feature data provider;
receiving the encrypted feature samples transmitted by the feature data provider through the ciphertext transmission channel;
and training the initial prediction model by using the encrypted feature samples and the labeled result samples corresponding to the encrypted feature samples to obtain the target prediction model.
9. The apparatus of claim 7, further comprising:
an actual risk rating determination unit configured to determine an actual risk rating of a user corresponding to the user request according to the prediction result;
a rejection response information returning unit configured to return, in response to the actual risk rating being not higher than a preset rating, response information indicating that the user request is rejected.
10. The apparatus of claim 7, further comprising:
an identity verification unit configured to perform identity verification on actual data transmitted through the ciphertext transmission channel;
and a legal identity processing unit configured to allow, in response to a result of the identity verification indicating a legal feature data provider, the actual data to be transmitted into the secure container.
11. The apparatus of claim 10, wherein the identity verification unit is further configured to:
extracting an actual certificate from the actual data transmitted through the ciphertext transmission channel;
determining whether the actual certificate is a legal certificate issued by an authoritative organization of the SGX;
if the actual certificate is the legal certificate, determining that the feature data provider of the actual data is a legal feature data provider;
and if the actual certificate is not the legal certificate, determining that the feature data provider of the actual data is an illegal feature data provider.
12. The apparatus of any of claims 7 to 11, further comprising:
an incremental encrypted feature receiving unit configured to receive incremental encrypted features transmitted by the feature data provider through the ciphertext transmission channel;
and a model updating unit configured to update the target prediction model by using the incremental encrypted features and the labeled results corresponding to the incremental encrypted features.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method for processing user requests of any of claims 1-6.
14. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method for processing a user request of any one of claims 1-6.
CN202011191057.6A 2020-10-30 2020-10-30 Method, device, electronic equipment and storage medium for processing user request Pending CN112308236A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202011191057.6A CN112308236A (en) 2020-10-30 2020-10-30 Method, device, electronic equipment and storage medium for processing user request
EP21179364.1A EP3869374B1 (en) 2020-10-30 2021-06-14 Method, apparatus and electronic device for processing user request and storage medium
US17/304,281 US20210312017A1 (en) 2020-10-30 2021-06-17 Method, apparatus and electronic device for processing user request and storage medium
JP2021103326A JP7223067B2 (en) 2020-10-30 2021-06-22 Methods, apparatus, electronics, computer readable storage media and computer programs for processing user requests

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011191057.6A CN112308236A (en) 2020-10-30 2020-10-30 Method, device, electronic equipment and storage medium for processing user request

Publications (1)

Publication Number Publication Date
CN112308236A true CN112308236A (en) 2021-02-02

Family

ID=74332850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011191057.6A Pending CN112308236A (en) 2020-10-30 2020-10-30 Method, device, electronic equipment and storage medium for processing user request

Country Status (4)

Country Link
US (1) US20210312017A1 (en)
EP (1) EP3869374B1 (en)
JP (1) JP7223067B2 (en)
CN (1) CN112308236A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115168848B (en) * 2022-09-08 2022-12-16 南京鼎山信息科技有限公司 Interception feedback processing method based on big data analysis interception

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9584517B1 (en) * 2014-09-03 2017-02-28 Amazon Technologies, Inc. Transforms within secure execution environments
WO2020257783A1 (en) * 2019-06-21 2020-12-24 nference, inc. Systems and methods for computing with private healthcare data

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160162693A1 (en) * 2014-12-09 2016-06-09 International Business Machines Corporation Automated management of confidential data in cloud environments
CN109308418A (en) * 2017-07-28 2019-02-05 阿里巴巴集团控股有限公司 A kind of model training method and device based on shared data
US10534933B1 (en) * 2017-12-27 2020-01-14 Symantec Corporation Encrypting and decrypting sensitive files on a network device
WO2019217876A1 (en) * 2018-05-10 2019-11-14 Equifax Inc. Training or using sets of explainable machine-learning modeling algorithms for predicting timing of events
CN110738323A (en) * 2018-07-03 2020-01-31 百度在线网络技术(北京)有限公司 Method and device for establishing machine learning model based on data sharing
CN111027870A (en) * 2019-12-14 2020-04-17 支付宝(杭州)信息技术有限公司 User risk assessment method and device, electronic equipment and storage medium
CN111310204A (en) * 2020-02-10 2020-06-19 北京百度网讯科技有限公司 Data processing method and device
CN111401558A (en) * 2020-06-05 2020-07-10 腾讯科技(深圳)有限公司 Data processing model training method, data processing device and electronic equipment
CN111681091A (en) * 2020-08-12 2020-09-18 腾讯科技(深圳)有限公司 Financial risk prediction method and device based on time domain information and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115118470A (en) * 2022-06-16 2022-09-27 深圳乐播科技有限公司 Processing method and device for content mis-uploading, computer equipment and storage medium
CN115118470B (en) * 2022-06-16 2023-11-17 深圳乐播科技有限公司 Processing method, device, computer equipment and storage medium for content error uploading
CN116305071A (en) * 2023-03-18 2023-06-23 广州锦拓信息科技有限公司 Account password security system based on artificial intelligence
CN116305071B (en) * 2023-03-18 2023-09-26 广州锦拓信息科技有限公司 Account password security system based on artificial intelligence

Also Published As

Publication number Publication date
JP2022006164A (en) 2022-01-12
EP3869374B1 (en) 2023-09-06
US20210312017A1 (en) 2021-10-07
EP3869374A2 (en) 2021-08-25
JP7223067B2 (en) 2023-02-15
EP3869374A3 (en) 2022-01-05

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination