CN111797851A - Feature extraction method and device, storage medium and electronic equipment

Info

Publication number
CN111797851A
Authority
CN
China
Prior art keywords
data
model
feature extraction
server
target
Prior art date
Legal status
Withdrawn
Application number
CN201910282011.6A
Other languages
Chinese (zh)
Inventor
何明
陈仲铭
黄粟
刘耀勇
陈岩
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910282011.6A
Publication of CN111797851A
Legal status: Withdrawn

Classifications

    • G06F18/253 Pattern recognition; Analysing; Fusion techniques of extracted features
    • G06F18/217 Pattern recognition; Design or setup of recognition systems or techniques; Validation; Performance evaluation; Active pattern learning techniques
    • G06N3/045 Computing arrangements based on biological models; Neural networks; Architecture, e.g. interconnection topology; Combinations of networks
    • G06N3/08 Computing arrangements based on biological models; Neural networks; Learning methods

Abstract

The embodiment of the application discloses a feature extraction method and device, a storage medium and electronic equipment. A terminal receives a feature extraction instruction sent by a server and acquires target data of a target user according to the instruction; trains a preset deep learning model on the target data to obtain first model parameters, encrypts the first model parameters and sends them to the server; receives second model parameters returned by the server, the second model parameters being obtained by the server merging the first model parameters sent by each user in a user set to which the target user belongs; retrains the deep learning model based on the second model parameters and extracts first data features from the target data with the trained model; and encrypts the first data features and sends them to the server. On the basis of privacy protection, collaborative learning and training among a plurality of users are realized.

Description

Feature extraction method and device, storage medium and electronic equipment
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method and an apparatus for feature extraction, a storage medium, and an electronic device.
Background
Because the feature data of a single user are limited in both quantity and quality, it is difficult to complete a data analysis task when extracting data features from the data of that user alone. Cooperative processing of the data of many users, however, raises user privacy concerns: sensitive data such as user income and home address would pose a serious privacy risk if uploaded directly to the cloud for cooperative processing. How to extract features from such sensitive data so that the corresponding task is completed well while user privacy is not disclosed is a practical problem to be solved urgently.
Disclosure of Invention
The embodiment of the application provides a feature extraction method and device, a storage medium and electronic equipment, which can realize collaborative learning and training among a plurality of users while ensuring data privacy.
In a first aspect, an embodiment of the present application provides a feature extraction method, applied to a terminal, including:
receiving a feature extraction instruction sent by a server, and acquiring target data of a target user according to the feature extraction instruction;
training a preset deep learning model according to the target data, acquiring a first model parameter, encrypting the first model parameter and sending the encrypted first model parameter to the server;
receiving second model parameters returned by the server, wherein the second model parameters are obtained by the server merging the first model parameters sent by each user in a user set, and the user set comprises the target user;
retraining the deep learning model based on the second model parameter, and extracting a first data feature from the target data according to the deep learning model obtained through training;
and encrypting the first data characteristic and then sending the encrypted first data characteristic to the server.
In a second aspect, an embodiment of the present application provides a feature extraction method, applied to a server, including:
sending a feature extraction instruction to a terminal corresponding to a target user in a user set;
receiving a first encrypted model parameter returned by the terminal according to the feature extraction instruction, wherein the first model parameter is obtained by training a preset deep learning model by the terminal according to target data of the target user;
decrypting the encrypted first model parameter to obtain the first model parameter;
merging the first model parameters of all target users in the user set to generate second model parameters;
sending the second model parameters to the terminal, and receiving first data features sent by the terminal, wherein the first data features are extracted from the target data by the deep learning model obtained after the terminal retrains the deep learning model according to the second model parameters;
and merging the first data features of the target users in the user set to generate second data features.
In a third aspect, an embodiment of the present application provides a feature extraction apparatus, which is applied to a terminal, and includes:
the data acquisition module is used for receiving a feature extraction instruction sent by the server and acquiring target data of a target user according to the feature extraction instruction;
the model training module is used for training a preset deep learning model according to the target data to obtain a first model parameter;
the data sending module is used for encrypting the first model parameter and then sending the encrypted first model parameter to the server;
the parameter receiving module is used for receiving second model parameters returned by the server, wherein the second model parameters are obtained by the server merging the first model parameters sent by each user in a user set, and the user set comprises the target user;
the model training module is further configured to: retraining the deep learning model based on the second model parameter, and extracting a first data feature from the target data according to the deep learning model obtained through training;
the data sending module is further configured to: and encrypting the first data characteristic and then sending the encrypted first data characteristic to the server.
In a fourth aspect, an embodiment of the present application provides a feature extraction apparatus, which is applied to a server, and includes:
the instruction sending module is used for sending a feature extraction instruction to the electronic equipment corresponding to the target user in the user set;
the data receiving module is used for receiving encrypted first model parameters returned by the electronic equipment according to the feature extraction instruction, wherein the first model parameters are obtained by training a preset deep learning model by the electronic equipment according to target data of the target user;
the data decryption module is used for decrypting the encrypted first model parameter to obtain the first model parameter;
the parameter merging module is used for merging the first model parameters of all target users in the user set to generate second model parameters;
the parameter sending module is used for sending the second model parameter to the electronic equipment;
the feature receiving module is used for receiving first data features sent by the electronic equipment, wherein the first data features are extracted from the target data by the deep learning model obtained after the electronic equipment retrains the deep learning model according to the second model parameters;
and the characteristic merging module is used for merging the first data characteristics of the target users in the user set to generate second data characteristics.
In a fifth aspect, a storage medium is provided in this application, and has a computer program stored thereon, where the computer program is configured to, when run on a computer, cause the computer to perform a feature extraction method as provided in any of the embodiments of this application.
In a sixth aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory has a computer program, and the processor is configured to execute the feature extraction method provided in any embodiment of the present application by calling the computer program.
According to the technical scheme, a feature extraction instruction sent by a server is received, and target data of a target user are obtained according to the feature extraction instruction; a preset deep learning model is trained according to the target data to obtain first model parameters, which are encrypted and sent to the server; second model parameters returned by the server are received, the second model parameters being obtained by the server merging the first model parameters sent by each user in a user set; the deep learning model is retrained based on the second model parameters, first data features are extracted from the target data by the trained model, and the encrypted first data features are sent to the server. Based on the idea of federated learning, each user trains a deep learning model locally with local data to obtain model parameters, encrypts the parameters and sends them to the server, and the server jointly processes the model parameters of a plurality of users so that the knowledge of other users can be incorporated before the merged parameters are returned for further local learning and training. On the basis of privacy protection, collaborative learning and training among a plurality of users are realized, it is guaranteed that the finally learned sensitive features fuse the knowledge and preferences of other users, and the extraction quality and accuracy of the sensitive features can be improved to a large extent.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic view of a panoramic sensing architecture of a feature extraction method according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of a first feature extraction method according to an embodiment of the present application.
Fig. 3 is a federated learning framework diagram provided in an embodiment of the present application.
Fig. 4 is a schematic flowchart of a second method for feature extraction according to an embodiment of the present disclosure.
Fig. 5 is a schematic structural diagram of a first feature extraction device according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a second feature extraction device according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of a first electronic device according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of a second electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without inventive step, are within the scope of the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic view of a panoramic sensing architecture of a feature extraction method provided in an embodiment of the present application. The feature extraction method is applied to electronic equipment. A panoramic perception framework is arranged in the electronic equipment. The panoramic sensing architecture is an integration of hardware and software used for implementing the feature extraction method in electronic equipment.
The panoramic perception architecture comprises an information perception layer, a data processing layer, a feature extraction layer, a scene modeling layer and an intelligent service layer.
The information perception layer is used for acquiring information of the electronic equipment or information in an external environment. The information-perceiving layer may include a plurality of sensors. For example, the information sensing layer includes a plurality of sensors such as a distance sensor, a magnetic field sensor, a light sensor, an acceleration sensor, a fingerprint sensor, a hall sensor, a position sensor, a gyroscope, an inertial sensor, an attitude sensor, a barometer, and a heart rate sensor.
Among other things, a distance sensor may be used to detect a distance between the electronic device and an external object. The magnetic field sensor may be used to detect magnetic field information of the environment in which the electronic device is located. The light sensor can be used for detecting light information of the environment where the electronic equipment is located. The acceleration sensor may be used to detect acceleration data of the electronic device. The fingerprint sensor may be used to collect fingerprint information of a user. The Hall sensor is a magnetic field sensor manufactured according to the Hall effect, and can be used for realizing automatic control of electronic equipment. The location sensor may be used to detect the geographic location where the electronic device is currently located. Gyroscopes may be used to detect angular velocity of an electronic device in various directions. Inertial sensors may be used to detect motion data of an electronic device. The gesture sensor may be used to sense gesture information of the electronic device. A barometer may be used to detect the barometric pressure of the environment in which the electronic device is located. The heart rate sensor may be used to detect heart rate information of the user.
And the data processing layer is used for processing the data acquired by the information perception layer. For example, the data processing layer may perform data cleaning, data integration, data transformation, data reduction, and the like on the data acquired by the information sensing layer.
The data cleaning refers to cleaning a large amount of data acquired by the information sensing layer to remove invalid data and repeated data. The data integration refers to integrating a plurality of single-dimensional data acquired by the information perception layer into a higher or more abstract dimension so as to comprehensively process the data of the plurality of single dimensions. The data transformation refers to performing data type conversion or format conversion on the data acquired by the information sensing layer so that the transformed data can meet the processing requirement. The data reduction means that the data volume is reduced to the maximum extent on the premise of keeping the original appearance of the data as much as possible.
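As an illustration only (not part of the application), the following sketch shows what these four operations might look like on a small sensor table using pandas; the column names, sampling rate and aggregation choices are assumptions made here for demonstration.

```python
# Illustrative sketch only: the column names and the one-second resampling
# interval are assumptions for demonstration, not part of the application.
import pandas as pd

def process_sensor_frame(df: pd.DataFrame) -> pd.DataFrame:
    # Data cleaning: drop duplicate rows and rows with invalid (null) readings.
    df = df.drop_duplicates().dropna()

    # Data transformation: convert the timestamp column to a datetime type
    # so later layers can work with a uniform format.
    df["timestamp"] = pd.to_datetime(df["timestamp"])

    # Data integration: combine single-dimensional accelerometer axes into a
    # higher, more abstract magnitude dimension.
    df["accel_magnitude"] = (df[["ax", "ay", "az"]] ** 2).sum(axis=1) ** 0.5

    # Data reduction: downsample to one record per second to shrink the data
    # volume while keeping the overall shape of the signal.
    return df.set_index("timestamp").resample("1s").mean().reset_index()
```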
The characteristic extraction layer is used for extracting characteristics of the data processed by the data processing layer so as to extract the characteristics included in the data. The extracted features may reflect the state of the electronic device itself or the state of the user or the environmental state of the environment in which the electronic device is located, etc.
The feature extraction layer may extract features or process the extracted features by methods such as filter methods, wrapper methods, or ensemble methods.
Filter methods filter the extracted features to remove redundant feature data. Wrapper methods screen the extracted features. Ensemble methods integrate a plurality of feature extraction methods to construct a more efficient and more accurate feature extraction method.
The scene modeling layer is used for building a model according to the features extracted by the feature extraction layer, and the obtained model can be used for representing the state of the electronic equipment, the state of a user, the environment state and the like. For example, the scenario modeling layer may construct a key value model, a pattern identification model, a graph model, an entity relation model, an object-oriented model, and the like according to the features extracted by the feature extraction layer.
The intelligent service layer is used for providing intelligent services for the user according to the model constructed by the scene modeling layer. For example, the intelligent service layer can provide basic application services for users, perform system intelligent optimization for electronic equipment, and provide personalized intelligent services for users.
In addition, the panoramic perception architecture can further comprise a plurality of algorithms, each of which can be used for analyzing and processing data; together these algorithms form an algorithm library. For example, the algorithm library may include the Markov algorithm, latent Dirichlet allocation, the Bayesian classification algorithm, support vector machines, K-means clustering, the K-nearest neighbor algorithm, conditional random fields, residual networks, long short-term memory networks, convolutional neural networks, recurrent neural networks, and the like.
Based on the panoramic sensing architecture, the electronic equipment acquires panoramic data of a target user through the information perception layer and/or other means as target data. The data processing layer processes the target data, for example by performing data cleaning, data integration, and the like on the acquired target data. Next, the feature extraction layer extracts features from the target data according to the feature extraction scheme provided by the embodiment of the application: it receives a feature extraction instruction sent by the server, obtains the target data of the target user according to the feature extraction instruction, trains a preset deep learning model on the target data to obtain first model parameters, encrypts the first model parameters and sends them to the server, receives second model parameters returned by the server, which are obtained by the server merging the first model parameters sent by each user in the user set, retrains the deep learning model based on the second model parameters, extracts first data features from the target data with the trained model, and encrypts the first data features and sends them to the server. The extracted features can be used as the input of models such as the Bayesian classification algorithm and the support vector machine in the scene modeling layer, or as the input of models or algorithms in the intelligent service layer. In the feature extraction layer, based on the idea of federated learning, each user trains a deep learning model locally with local data to obtain model parameters, encrypts the parameters and sends them to the server, and the server jointly processes the model parameters of a plurality of users so that the knowledge of other users can be incorporated before the merged parameters are returned for further local learning and training. On the basis of privacy protection, collaborative learning and training among a plurality of users are realized, it is guaranteed that the finally learned sensitive features fuse the knowledge and preferences of other users, and the extraction quality and accuracy of the sensitive features can be improved to a large extent.
An execution main body of the feature extraction method may be the feature extraction device provided in the embodiment of the present application, or an electronic device integrated with the feature extraction device, where the feature extraction device may be implemented in a hardware or software manner. The electronic device may be a smart phone, a tablet computer, a palm computer, a notebook computer, or a desktop computer.
Referring to fig. 2, fig. 2 is a first flowchart of a feature extraction method according to an embodiment of the present disclosure. The specific process of the feature extraction method provided by the embodiment of the application can be as follows:
step 101, receiving a feature extraction instruction sent by a server, and acquiring target data of a target user according to the feature extraction instruction.
Referring to fig. 3, fig. 3 is a diagram of a federated learning framework provided in an embodiment of the present application. In the embodiment of the application, based on the idea of federated learning, each user trains a deep learning model locally with local data to obtain model parameters, encrypts the model parameters and sends them to a server; the server jointly processes the model parameters of the plurality of users and returns the combined parameters to each user, and each user then retrains its model locally with the combined model parameters, as sketched below.
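The following minimal sketch (not taken from the application itself) illustrates one such round with a placeholder local training routine and a simple element-wise average as the joint processing; encryption of the exchanged parameters is omitted here and sketched separately further below.

```python
# Minimal sketch of one federated round as described above. The model, its
# parameter layout and the training routine are placeholders; encryption of
# the exchanged parameters is shown in a separate sketch further below.
import numpy as np

def local_train(params: np.ndarray, local_data) -> np.ndarray:
    """Placeholder for training the preset deep learning model on local data,
    starting from `params`; returns the first model parameters."""
    # ... real training happens on the terminal ...
    return params  # stand-in

def server_merge(all_params: list[np.ndarray]) -> np.ndarray:
    """Server-side joint processing: here the element-wise average."""
    return np.mean(np.stack(all_params), axis=0)

def federated_round(init_params: np.ndarray, user_datasets: list) -> list[np.ndarray]:
    # Each terminal trains locally and (after encryption) uploads its parameters.
    first_params = [local_train(init_params.copy(), d) for d in user_datasets]
    # The server merges them into the second model parameters ...
    second_params = server_merge(first_params)
    # ... and each terminal retrains locally from the merged parameters.
    return [local_train(second_params.copy(), d) for d in user_datasets]
```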
Firstly, a user set is determined by a server, and the user set comprises a plurality of target users. And the server sends a feature extraction instruction to a terminal, namely the electronic equipment, corresponding to the target user. After receiving the feature extraction instruction, the electronic equipment acquires user data stored locally by the target user according to the feature extraction instruction, and the user data is used as target data for training a deep learning model and extracting data features.
Specifically, which data is selected as the target data may be predetermined by the server according to the task to be executed, and the training task or the target data is indicated in the feature extraction instruction. After receiving the feature extraction instruction, the electronic equipment acquires needed target data from local according to the feature extraction instruction.
For example, for an image classification task, the target data is image data; aiming at the translation task, the target data is text data; for the classification task of the panoramic category, the target data is panoramic data of a target user collected by the electronic device, wherein the panoramic data may include terminal usage data, sensor data, terminal operating state data and the like. The target data can be collected and acquired by the electronic equipment according to the use condition of the user in the operation process.
And 102, training a preset deep learning model according to the target data, acquiring a first model parameter, encrypting the first model parameter and sending the encrypted first model parameter to the server.
The deep learning model is selected according to the task: for an image classification task, a convolutional neural network model is constructed; for a translation task, an LSTM (Long Short-Term Memory) neural network model is constructed; and for a classification task over panoramic categories, a classification model such as a Bayesian classification model or a support vector machine classification model is constructed.
After the electronic equipment acquires the target data, the electronic equipment trains the deep learning model by using the target data, wherein before the model training, model parameters are initialized according to a preset parameter initialization method.
The electronic equipment takes the target data as training data, inputs a preset deep learning model for learning, and obtains a first model parameter after repeated iterative training.
When the electronic equipment sends the first model parameter to the server, in order to further avoid user data leakage, the first model parameter is sent to the server after being encrypted. Specifically, the step of sending the encrypted first model parameter to the server includes: and encrypting the first model parameter by using a public key issued by the server.
In the embodiment of the application, the model parameters are encrypted and transmitted in an asymmetric encryption mode: the server generates a pair of public and private keys, issues the public key to each user participating in federated learning for encrypting the first model parameters, and keeps the private key on the server for decrypting the first model parameters that each target user has encrypted with the public key.
And after receiving the encrypted first model parameter sent by the electronic equipment, the server decrypts the encrypted first model parameter by using the stored private key to obtain the first model parameter.
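The application does not prescribe a particular cipher. The sketch below is one possible realization of the scheme above, assuming a hybrid construction built on the Python cryptography package: the server's RSA public key wraps a symmetric session key, and the session key encrypts the (much larger) serialized model parameters.

```python
# Sketch of the asymmetric scheme described above, using the `cryptography`
# package. RSA-OAEP alone cannot encrypt a large serialized parameter blob,
# so a hybrid scheme (RSA for the session key, Fernet for the payload) is
# assumed here; the application itself does not prescribe a specific cipher.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Server side: generate the key pair and issue the public key to every user.
server_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
server_public_key = server_private_key.public_key()

# Terminal side: encrypt the serialized first model parameters.
def encrypt_parameters(serialized_params: bytes, public_key):
    session_key = Fernet.generate_key()
    payload = Fernet(session_key).encrypt(serialized_params)
    wrapped_key = public_key.encrypt(session_key, oaep)
    return wrapped_key, payload

# Server side: recover the first model parameters with the stored private key.
def decrypt_parameters(wrapped_key: bytes, payload: bytes, private_key) -> bytes:
    session_key = private_key.decrypt(wrapped_key, oaep)
    return Fernet(session_key).decrypt(payload)
```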
And 103, receiving second model parameters returned by the server, wherein the second model parameters are obtained by the server merging the first model parameters sent by each user in a user set, and the user set comprises the target user.
After the server obtains the first model parameters of the users, the first model parameters of the users are subjected to combined processing to obtain second model parameters. The first model parameters of each user are obtained through local data training, the parameters can reflect the characteristics and the preference of each user data, and the second model parameters generated by the server are integrated with the characteristics and the preference of a plurality of user data, so that the knowledge of other users can be well coordinated. The server may calculate an average value, a maximum value, a median, or the like of the first model parameters of the plurality of users as the second model parameter.
And 104, retraining the deep learning model based on the second model parameter, and extracting a first data feature from the target data according to the deep learning model obtained by training.
Referring to fig. 3, the server sends the second model parameters obtained by the fusion processing to each user. At this time, the received second model parameters are the same for each user.
The step of retraining the deep learning model based on the second model parameters includes: initializing the deep learning model according to the second model parameters; retraining the initialized deep learning model using the target data.
The electronic device may update the locally trained deep learning model with these parameters, and then train it again with the local user data until the model converges. Because the training of each user's local deep learning model now starts from the joint parameters, the knowledge of other users is well integrated into the learning process.
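As a hedged illustration, this retraining step might look as follows in PyTorch; the network, optimizer settings and number of epochs are assumptions, since the application only requires re-initializing the model with the second model parameters and training it again on the local target data.

```python
# Sketch of the retraining step in PyTorch. The optimizer settings and the
# fixed epoch count are illustrative assumptions; the application only requires
# re-initialization with the second model parameters and further local training.
import torch

def retrain(model: torch.nn.Module, second_params: dict, loader, epochs: int = 5):
    # Initialize the local deep learning model with the merged (second) parameters.
    model.load_state_dict(second_params)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):              # iterate until the model is stable
        for inputs, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(inputs), labels)
            loss.backward()
            optimizer.step()
    return model
```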
And extracting data characteristics corresponding to the target data from the deep learning model with stable iteration. Specifically, in an optional embodiment, the deep learning model is a convolutional neural network model, and the step of extracting the first data feature from the target data according to the deep learning model obtained by training includes: inputting the target data into the convolutional neural network model for operation; and acquiring the neuron output characteristics of the last hidden layer of the convolutional neural network model, and taking the neuron output characteristics as the first data characteristics.
Referring to fig. 3, the deep learning model includes an input layer, an output layer, and a plurality of hidden layers between the input layer and the output layer, and the present embodiment takes the neuron output feature of the last hidden layer of the plurality of hidden layers as the final first data feature.
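One way to capture those neuron outputs, assuming a PyTorch model, is a forward hook on the module that closes the last hidden layer; the small network below is only a placeholder for whatever model the terminal actually trains.

```python
# One way to take the neuron outputs of the last hidden layer as the first
# data feature, using a PyTorch forward hook. The small CNN here is only a
# placeholder for whatever model the terminal actually trains.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
    nn.Flatten(),
    nn.Linear(16 * 4 * 4, 64), nn.ReLU(),   # last hidden layer
    nn.Linear(64, 10),                      # output layer
)

captured = {}
def save_output(module, inputs, output):
    captured["first_data_feature"] = output.detach()

# Hook the ReLU that closes the last hidden layer (index 5 in this placeholder).
model[5].register_forward_hook(save_output)

with torch.no_grad():
    _ = model(torch.randn(1, 3, 32, 32))   # stand-in for the target data
features = captured["first_data_feature"]  # shape: (1, 64)
```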
And 105, encrypting the first data characteristic and sending the encrypted first data characteristic to the server.
And after the first data characteristic is obtained from the deep learning model obtained through training, encrypting the first data characteristic according to a public key issued by the server, and sending the encrypted first data characteristic to the server. And the server decrypts the received first data characteristic after encryption to obtain the first data characteristic. Then, the server carries out merging processing on the plurality of first data characteristics to generate second data characteristics coordinating with the plurality of user characteristics. The feature merging method is similar to the merging method of the first model parameter, and an average value, a median or a maximum value of the first data feature may be calculated as the second feature data.
Optionally, in some embodiments, before the step of receiving a feature extraction instruction sent by the server and obtaining target data of the target user according to the feature extraction instruction, the method further includes:
when a quality evaluation instruction sent by a server is received, calculating evaluation values of the target data on a plurality of preset quality evaluation indexes; and sending the evaluation value to the server, wherein the server selects a target user from a plurality of users according to the evaluation value.
In the embodiment, the server sets quality evaluation indexes, each user evaluates the quality of target data according to the preset quality evaluation indexes, the evaluation values are sent to the server, and the server selects a user with high data quality as a target user.
The quality evaluation index includes, but is not limited to, the following items: fluctuation degree, redundancy, missing degree, noise degree and the like of the target data. Each quality evaluation index has a corresponding evaluation method.
For example, a variance of the target data is calculated, and a fluctuation degree of the target data is determined based on the variance, wherein the variance is proportional to the fluctuation degree.
The redundancy is obtained by calculating the difference between adjacent data in the target data, determining the quantity of redundant data from those differences, and calculating the redundancy from that quantity. For example, if the target data is GPS (Global Positioning System) data collected by the electronic device at 50 samples per second, but the data hardly changes between adjacent samples and effectively only changes 25 times per second, the remaining 25 samples per second can be regarded as redundant data, giving a redundancy of 50%. The difference between adjacent data is used to represent how much the data actually changes.
And searching the number of null values in the target data, and determining the missing degree according to the number of the null values. The target data is stored in the database according to the attributes of the data, that is, each data attribute has a value in the database, and if the value is missing, the value is directly set to null, that is, null. Therefore, the number of missing data can be determined by counting the number of null, and the missing degree of the target data is determined according to the proportion of the number of the missing data to the total number of the data.
The noise degree is obtained by calculating the mean value of the target data, counting the number of data items larger than a preset multiple of the mean value, and calculating the noise degree from the number of data items deviating from the mean. The preset multiple can be set according to actual requirements, for example between 3 and 10. For example, if the average of all data is 5 and one data value is 1000, this value of 1000 may be considered a noise value, which may affect the accuracy of subsequent data analysis or feature extraction.
The electronic equipment calculates the evaluation value of the target data on each quality evaluation index according to the corresponding evaluation method, and sends the evaluation values to the server. The server is preset with a threshold for each quality evaluation index; after receiving the evaluation values sent by each user, it compares them with the corresponding preset thresholds to judge whether the user's data meet the quality requirement. For example, if the preset threshold for redundancy is 5% and the data redundancy of a certain user is 3%, that user's data redundancy meets the quality requirement; otherwise it does not. Users with high data quality are selected from all users in this manner.
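A minimal sketch of the four indicators described above is given below; the exact formulas and the noise multiple are illustrative readings of this section, not a prescribed evaluation API.

```python
# Sketch of the four quality indicators described above. The formulas and the
# default noise multiple are illustrative readings of the text, not a fixed API.
import numpy as np

def quality_evaluation(x: np.ndarray, noise_multiple: float = 3.0) -> dict:
    valid = x[~np.isnan(x)]
    return {
        # Fluctuation degree: proportional to the variance of the data.
        "fluctuation": float(np.var(valid)),
        # Redundancy: share of samples whose difference to the previous
        # sample is (near) zero, as in the GPS example above.
        "redundancy": float(np.mean(np.isclose(np.diff(valid), 0.0))),
        # Missing degree: share of null (NaN) values in the data.
        "missing": float(np.mean(np.isnan(x))),
        # Noise degree: share of samples larger than the preset multiple
        # of the mean value.
        "noise": float(np.mean(valid > noise_multiple * np.mean(valid))),
    }

scores = quality_evaluation(np.array([5.0, 5.0, 4.8, 5.2, np.nan, 1000.0]))
```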
In particular implementation, the present application is not limited by the execution sequence of the described steps, and some steps may be performed in other sequences or simultaneously without conflict.
As can be seen from the above, in the feature extraction method provided in this embodiment of the application, the terminal receives a feature extraction instruction sent by a server and obtains target data of a target user according to the instruction; trains a preset deep learning model on the target data to obtain first model parameters, which are encrypted and sent to the server; receives second model parameters returned by the server, the second model parameters being obtained by the server merging the first model parameters sent by each user in a user set; retrains the deep learning model based on the second model parameters, extracts first data features from the target data with the trained model, and encrypts the first data features and sends them to the server. Based on the idea of federated learning, each user trains a deep learning model locally with local data to obtain model parameters, encrypts the parameters and sends them to the server, and the server jointly processes the model parameters of a plurality of users so that the knowledge of other users can be incorporated before the merged parameters are returned for further local learning and training. On the basis of privacy protection, collaborative learning and training among a plurality of users are realized, it is guaranteed that the finally learned sensitive features fuse the knowledge and preferences of other users, and the extraction quality and accuracy of the sensitive features can be improved to a large extent.
In addition, the application also provides a feature extraction method, and an execution subject of the feature extraction method can be a server. Referring to fig. 4, fig. 4 is a schematic flow chart of a second feature extraction method according to an embodiment of the present application. The specific process of the feature extraction method provided by the embodiment of the application can be as follows:
step S201, a characteristic extraction instruction is sent to a terminal corresponding to a target user in a user set.
The server selects users with high data quality from all its users as target users, and the target users form a user set. For the user screening method, refer to the above embodiments; it is not described herein again.
And the server sends a feature extraction instruction to the electronic equipment corresponding to the target user, and indicates which data are to be selected as target data in the feature extraction instruction.
Step S202, receiving a first encrypted model parameter returned by the terminal according to the feature extraction instruction, wherein the first model parameter is obtained by training a preset deep learning model by the terminal according to target data of the target user.
Step S203, decrypting the encrypted first model parameter to obtain the first model parameter.
Step S204, merging the first model parameters of all target users in the user set to generate second model parameters.
Please refer to the above embodiments for the decryption process of the first model parameter by the server, which is not described herein again.
In some embodiments, assume that the first model parameters of user $u$ are $w_u = (w_{u1}, w_{u2}, \ldots, w_{un})$, where $n$ represents the number of parameters. One model parameter combination method is to take the average value of the first model parameters of all users:

$$\bar{w} = \frac{1}{U} \sum_{u=1}^{U} w_u$$

where $U$ represents the total number of users and $\bar{w}$ is the second model parameter obtained after the joint processing.
Alternatively, in other embodiments, the maximum value or median of the first model parameters of all users may also be taken.
Whether the average value, the maximum value or the median is taken, the combined parameters $\bar{w}$ finally obtained from the $U$ users incorporate the knowledge of a plurality of users, so that this knowledge can be coordinated without any user having to upload local data to the server.
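For illustration, the merge step corresponding to the formula above can be written with NumPy as follows, together with the maximum and median alternatives; the parameter values are made up.

```python
# The merge step from the formula above, element-wise over the stacked
# first model parameters w_u of the U users (values are made up).
import numpy as np

first_params = [np.array([0.2, 1.0, -0.5]),   # user 1
                np.array([0.4, 0.8, -0.3]),   # user 2
                np.array([0.0, 1.2, -0.1])]   # user 3

stacked = np.stack(first_params)                    # shape (U, n)
second_params_mean = stacked.mean(axis=0)           # average, as in the formula
second_params_max = stacked.max(axis=0)             # alternative: maximum
second_params_median = np.median(stacked, axis=0)   # alternative: median
```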
Step S205, sending the second model parameters to the terminal, and receiving first data features sent by the terminal, wherein the first data features are extracted from the target data by the deep learning model obtained after the terminal retrains the deep learning model according to the second model parameters.
Step S206, merging the first data characteristics of the target users in the user set to generate second data characteristics.
And the server sends the second model parameters obtained by the joint processing to the electronic equipment of each user, the electronic equipment retrains the deep learning model by using the local user data of each electronic equipment to obtain an iterative stable model, and then extracts the first data characteristics to send to the server, and the server combines the first data characteristics to obtain the second data characteristics. For the specific implementation, please refer to the above embodiments, which are not described herein again.
In the feature extraction method provided by this embodiment of the application, the server sends a feature extraction instruction to the terminal corresponding to each target user in a user set; receives the encrypted first model parameters returned by the terminals and decrypts them to obtain the first model parameters; merges the first model parameters of all users to generate second model parameters; and sends the second model parameters back to the users, where model training is performed locally again. Each terminal then extracts first data features with the stable model obtained by training and sends them to the server, which merges the first data features to obtain second data features. Encrypting the parameters provides double protection of user privacy data: first, the sensitive user data themselves are never uploaded, only model parameters; second, what is uploaded is the encrypted form of those model parameters. Through the idea of federated learning, collaborative learning and training among a plurality of users are realized on the basis of privacy protection, it is guaranteed that the finally learned sensitive features fuse the knowledge and preferences of other users, and the extraction quality and accuracy of the sensitive features can be improved to a large extent.
In one embodiment, a feature extraction apparatus is also provided. Referring to fig. 5, fig. 5 is a schematic view illustrating a first structure of a feature extraction device according to an embodiment of the present disclosure. The feature extraction apparatus 400 is applied to an electronic device, and the feature extraction apparatus 400 includes a data obtaining module 401, a model training module 402, a data sending module 403, and a parameter receiving module 404, as follows:
the data acquisition module 401 is configured to receive a feature extraction instruction sent by a server, and acquire target data of a target user according to the feature extraction instruction;
a model training module 402, configured to train a preset deep learning model according to the target data, and obtain a first model parameter;
a data sending module 403, configured to encrypt the first model parameter and send the encrypted first model parameter to the server;
a parameter receiving module 404, configured to receive a second model parameter returned by a server, where the second model parameter is obtained by the server after merging according to a first model parameter sent by each user in a user set, and the user set includes the target user;
the model training module 402 is further configured to: retraining the deep learning model based on the second model parameter, and extracting a first data feature from the target data according to the deep learning model obtained through training;
the data sending module 403 is further configured to: and encrypting the first data characteristic and then sending the encrypted first data characteristic to the server.
In some embodiments, the data sending module 403 is further configured to encrypt the first model parameter by using a public key issued by the server.
In some embodiments, model training module 402 is further configured to: initializing the deep learning model according to the second model parameters; retraining the initialized deep learning model using the target data.
In some embodiments, the deep learning model is a convolutional neural network model, and the model training module 402 is further configured to: inputting the target data into the convolutional neural network model for operation; and acquiring the neuron output characteristics of the last hidden layer of the convolutional neural network model, and taking the neuron output characteristics as the first data characteristics.
In some embodiments, the feature extraction apparatus 400 further includes a quality evaluation module, configured to, when receiving a quality evaluation instruction sent by the server, calculate evaluation values of the target data on a plurality of preset quality evaluation indexes; and sending the evaluation value to the server, wherein the server selects a target user from a plurality of users according to the evaluation value.
In addition, please refer to fig. 6, fig. 6 is a second structural schematic diagram of the feature extraction device according to the embodiment of the present application. Wherein the feature extraction apparatus 400 is applied to a server, the feature extraction apparatus 500 includes: the instruction sending module 501, the data receiving module 502, the data decrypting module 503, the parameter combining module 504, the parameter sending module 505, the feature receiving module 506, and the feature combining module 507 are as follows:
an instruction sending module 501, configured to send a feature extraction instruction to an electronic device corresponding to a target user in a user set;
a data receiving module 502, configured to receive an encrypted first model parameter returned by the electronic device according to the feature extraction instruction, where the first model parameter is obtained by the electronic device through training a preset deep learning model according to target data of the target user;
a data decryption module 503, configured to decrypt the encrypted first model parameter to obtain the first model parameter;
a parameter merging module 504, configured to merge the first model parameters of all target users in the user set to generate second model parameters;
a parameter sending module 505, configured to send the second model parameter to the electronic device;
a feature receiving module 506, configured to receive a first data feature sent by the electronic device, where the first data feature is obtained by the electronic device retraining the deep learning model according to the second model parameter, and extracting from target data according to the deep learning model obtained through training;
a feature merging module 507, configured to merge the first data features of the target users in the user set to generate second data features.
In the feature extraction device provided in this embodiment of the application, the instruction sending module 501 sends a feature extraction instruction to the terminal corresponding to each target user in a user set; the data receiving module 502 receives the encrypted first model parameters returned by the terminals; the data decryption module 503 decrypts them to obtain the first model parameters; the parameter merging module 504 merges the first model parameters of all users to generate second model parameters; the parameter sending module 505 sends the second model parameters back to the users, where model training is performed locally again; each terminal extracts first data features with the stable model obtained by training and sends them to the server, where the feature receiving module 506 receives them and the feature merging module 507 merges them to obtain second data features. Encrypting the parameters provides double protection of user privacy data: first, the sensitive user data themselves are never uploaded, only model parameters; second, what is uploaded is the encrypted form of those model parameters. Through the idea of federated learning, collaborative learning and training among a plurality of users are realized on the basis of privacy protection, it is guaranteed that the finally learned sensitive features fuse the knowledge and preferences of other users, and the extraction quality and accuracy of the sensitive features can be improved to a large extent.
The embodiment of the application also provides the electronic equipment. The electronic device can be a smart phone, a tablet computer and the like. As shown in fig. 7, fig. 7 is a schematic view of a first structure of an electronic device according to an embodiment of the present application. The electronic device 300 comprises a processor 301 and a memory 302. The processor 301 is electrically connected to the memory 302.
The processor 301 is a control center of the electronic device 300, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or calling a computer program stored in the memory 302 and calling data stored in the memory 302, thereby performing overall monitoring of the electronic device.
In this embodiment, the processor 301 in the electronic device 300 loads instructions corresponding to one or more processes of the computer program into the memory 302 according to the following steps, and the processor 301 runs the computer program stored in the memory 302, so as to implement various functions:
receiving a feature extraction instruction sent by a server, and acquiring target data of a target user according to the feature extraction instruction;
training a preset deep learning model according to the target data, acquiring a first model parameter, encrypting the first model parameter and sending the encrypted first model parameter to the server;
receiving second model parameters returned by the server, wherein the second model parameters are obtained by the server merging the first model parameters sent by each user in a user set, and the user set comprises the target user;
retraining the deep learning model based on the second model parameter, and extracting a first data feature from the target data according to the deep learning model obtained through training;
and encrypting the first data characteristic and then sending the encrypted first data characteristic to the server.
In some embodiments, when the first model parameter is encrypted and sent to the server, the processor 301 performs the following steps:
and encrypting the first model parameter by using a public key issued by the server.
In some embodiments, when retraining the deep learning model based on the second model parameters, the processor 301 performs the following steps:
initializing the deep learning model according to the second model parameters;
and retraining the initialized deep learning model using the target data.
in some embodiments, when the deep learning model is a convolutional neural network model, and the first data feature is extracted from the target data according to the deep learning model obtained by training, the processor 301 performs the following steps:
inputting the target data into the convolutional neural network model for operation;
and acquiring the neuron output characteristics of the last hidden layer of the convolutional neural network model, and taking the neuron output characteristics as the first data characteristics.
In some embodiments, the processor 301 performs the following steps before the step of receiving a feature extraction instruction sent by the server and acquiring target data of the target user according to the feature extraction instruction:
when a quality evaluation instruction sent by a server is received, calculating evaluation values of the target data on a plurality of preset quality evaluation indexes;
and sending the evaluation value to the server, wherein the server selects a target user from a plurality of users according to the evaluation value.
Memory 302 may be used to store computer programs and data. The memory 302 stores computer programs containing instructions executable in the processor. The computer program may constitute various functional modules. The processor 301 executes various functional applications and data processing by calling a computer program stored in the memory 302.
In some embodiments, as shown in fig. 8, fig. 8 is a second schematic structural diagram of an electronic device provided in the embodiments of the present application. The electronic device 300 further includes: radio frequency circuit 303, display screen 304, control circuit 305, input unit 306, audio circuit 307, sensor 308, and power supply 309. The processor 301 is electrically connected to the rf circuit 303, the display 304, the control circuit 305, the input unit 306, the audio circuit 307, the sensor 308, and the power source 309, respectively.
The radio frequency circuit 303 is used for transceiving radio frequency signals to communicate with a network device or other electronic devices through wireless communication.
The display screen 304 may be used to display information entered by or provided to the user as well as various graphical user interfaces of the electronic device, which may be comprised of images, text, icons, video, and any combination thereof.
The control circuit 305 is electrically connected to the display screen 304, and is used for controlling the display screen 304 to display information.
The input unit 306 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. The input unit 306 may include a fingerprint recognition module.
Audio circuitry 307 may provide an audio interface between the user and the electronic device through a speaker, microphone. Where audio circuitry 307 includes a microphone. The microphone is electrically connected to the processor 301. The microphone is used for receiving voice information input by a user.
The sensor 308 is used to collect external environmental information. The sensor 308 may include one or more of an ambient light sensor, an acceleration sensor, a gyroscope, and the like.
The power supply 309 is used to power the various components of the electronic device 300. In some embodiments, the power source 309 may be logically coupled to the processor 301 through a power management system, such that functions to manage charging, discharging, and power consumption management are performed through the power management system.
Although not shown in fig. 8, the electronic device 300 may further include a camera, a bluetooth module, and the like, which are not described in detail herein.
Therefore, the embodiment of the application provides an electronic device. The electronic device receives a feature extraction instruction sent by a server and obtains target data of a target user according to the instruction; trains a preset deep learning model on the target data to obtain first model parameters, which are encrypted and sent to the server; receives second model parameters returned by the server, the second model parameters being obtained by the server merging the first model parameters sent by each user in a user set; retrains the deep learning model based on the second model parameters, extracts first data features from the target data with the trained model, and encrypts the first data features and sends them to the server. Based on the idea of federated learning, each user trains a deep learning model locally with local data to obtain model parameters, encrypts the parameters and sends them to the server, and the server jointly processes the model parameters of a plurality of users so that the knowledge of other users can be incorporated before the merged parameters are returned for further local learning and training. On the basis of privacy protection, collaborative learning and training among a plurality of users are realized, it is guaranteed that the finally learned sensitive features fuse the knowledge and preferences of other users, and the extraction quality and accuracy of the sensitive features can be improved to a large extent.
Alternatively, in other embodiments, the electronic device 300 is a server. The processor 301 in the electronic device 300 loads instructions corresponding to one or more processes of a computer program into the memory 302, and executes the computer program stored in the memory 302, so as to implement the following functions:
sending a feature extraction instruction to a terminal corresponding to a target user in a user set;
receiving an encrypted first model parameter returned by the terminal according to the feature extraction instruction, wherein the first model parameter is obtained by the terminal by training a preset deep learning model on target data of the target user;
decrypting the encrypted first model parameter to obtain the first model parameter;
merging the first model parameters of all target users in the user set to generate second model parameters;
sending the second model parameters to the terminal, and receiving a first data feature sent by the terminal, wherein the first data feature is obtained by the terminal by retraining the deep learning model according to the second model parameters and extracting the first data feature from the target data using the deep learning model obtained through training;
and merging the first data features of the target users in the user set to generate second data features.
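The server-side flow can likewise be sketched briefly. In the fragment below the decryption step is a placeholder, element-wise averaging is used as one possible merging rule, and the names decrypt, merge_model_parameters, and merge_data_features are hypothetical rather than part of the embodiments.

    import numpy as np

    def decrypt(payload, private_key=None):
        # Placeholder: a real server would decrypt with the private key paired with the issued public key.
        return payload

    def merge_model_parameters(encrypted_params_per_user):
        # Decrypt the first model parameters uploaded by the target users, then merge them
        # (here by an element-wise average) into the second model parameters.
        params = [decrypt(p) for p in encrypted_params_per_user]
        return np.mean(np.stack(params), axis=0)

    def merge_data_features(first_features_per_user):
        # Merge the first data features returned by the terminals into the second data feature.
        return np.mean(np.stack(first_features_per_user), axis=0)

The second model parameters produced by merge_model_parameters are what the server sends back to each terminal before collecting and merging the first data features.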
An embodiment of the present application further provides a storage medium, where a computer program is stored in the storage medium, and when the computer program runs on a computer, the computer executes the feature extraction method according to any of the above embodiments.
It should be noted that all or part of the steps in the methods of the above embodiments may be implemented by instructing relevant hardware through a computer program, and the computer program may be stored in a computer-readable storage medium, which may include, but is not limited to: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
Furthermore, the terms "first", "second", "third", and the like in this application are used to distinguish different objects and are not used to describe a particular order. In addition, the terms "include" and "have", as well as any variations thereof, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a series of steps or modules is not limited to the listed steps or modules, but may further include steps or modules that are not listed or that are inherent to such a process, method, article, or apparatus.
The feature extraction method and device, storage medium, and electronic device provided in the embodiments of the present application are described in detail above. The principles and implementations of the present application are explained herein by using specific examples, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may, according to the idea of the present application, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (13)

1. A feature extraction method, applied to a terminal, characterized by comprising the following steps:
receiving a feature extraction instruction sent by a server, and acquiring target data of a target user according to the feature extraction instruction;
training a preset deep learning model according to the target data, acquiring a first model parameter, encrypting the first model parameter and sending the encrypted first model parameter to the server;
receiving second model parameters returned by the server, wherein the second model parameters are obtained by the server by merging the first model parameters sent by each user in a user set, and the user set comprises the target user;
retraining the deep learning model based on the second model parameters, and extracting a first data feature from the target data according to the deep learning model obtained through training;
and encrypting the first data feature and then sending the encrypted first data feature to the server.
2. The feature extraction method of claim 1, further comprising:
and encrypting the first model parameter by using a public key issued by the server.
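As one possible realization of this encryption step (claim 2 does not specify a scheme; the hybrid RSA-plus-symmetric construction, the third-party "cryptography" package, and the function name encrypt_for_server below are illustrative assumptions only):

    import pickle
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def encrypt_for_server(model_params, server_public_key_pem):
        # Model parameters are usually too large for RSA alone, so a symmetric session key
        # encrypts the serialized parameters, and the session key itself is wrapped with the
        # server-issued RSA public key.
        public_key = serialization.load_pem_public_key(server_public_key_pem)
        session_key = Fernet.generate_key()
        ciphertext = Fernet(session_key).encrypt(pickle.dumps(model_params))
        wrapped_key = public_key.encrypt(
            session_key,
            padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                         algorithm=hashes.SHA256(), label=None),
        )
        return wrapped_key, ciphertext

The server, holding the matching private key, would unwrap the session key and then decrypt the parameter payload.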
3. The feature extraction method of claim 1, further comprising:
initializing the deep learning model according to the second model parameters;
retraining the initialized deep learning model using the target data.
4. The feature extraction method according to claim 1, wherein the deep learning model is a convolutional neural network model, and the step of extracting the first data feature from the target data according to the deep learning model obtained by training comprises:
inputting the target data into the convolutional neural network model for operation;
and acquiring neuron output features of the last hidden layer of the convolutional neural network model, and taking the neuron output features as the first data feature.
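A rough illustration of this step follows; the PyTorch framework, the network shape, and the class name SmallCNN are assumptions made for the sketch and are not mandated by the claim. The activations of the last hidden layer are returned alongside the normal prediction and used as the first data feature.

    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        def __init__(self, num_classes=10):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
            )
            self.hidden = nn.Linear(8 * 14 * 14, 64)   # last hidden layer
            self.out = nn.Linear(64, num_classes)      # output layer, not used for features

        def forward(self, x):
            h = torch.relu(self.hidden(self.conv(x).flatten(1)))
            return self.out(h), h                      # logits and last-hidden-layer activations

    model = SmallCNN()
    target_data = torch.randn(4, 1, 28, 28)            # stand-in for the target data
    _, first_data_feature = model(target_data)         # first_data_feature has shape (4, 64)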
5. The feature extraction method of any one of claims 1 to 4, further comprising:
when a quality evaluation instruction sent by the server is received, calculating evaluation values of the target data on a plurality of preset quality evaluation indexes;
and sending the evaluation values to the server, wherein the server selects a target user from a plurality of users according to the evaluation values.
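The quality evaluation step can be sketched as follows; the particular indexes (missing-value ratio, sample count, dispersion), the composite score, and the function names are purely hypothetical examples of "preset quality evaluation indexes" and a selection rule, since the claim does not fix them.

    import numpy as np

    def evaluate_quality(data):
        # data: 2-D array of one user's target data, rows are samples.
        return {
            "missing_ratio": float(np.isnan(data).mean()),   # lower is better
            "sample_count": int(data.shape[0]),              # more samples preferred
            "dispersion": float(np.nanstd(data)),            # degenerate (constant) data scores low
        }

    def select_target_users(evaluations, k=2):
        # Server side: rank users by a simple composite score and keep the top k as target users.
        def score(ev):
            return ev["sample_count"] * (1.0 - ev["missing_ratio"]) * min(ev["dispersion"], 1.0)
        ranked = sorted(evaluations.items(), key=lambda item: score(item[1]), reverse=True)
        return [user for user, _ in ranked[:k]]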
6. A feature extraction method, applied to a server, characterized by comprising the following steps:
sending a feature extraction instruction to a terminal corresponding to a target user in a user set;
receiving an encrypted first model parameter returned by the terminal according to the feature extraction instruction, wherein the first model parameter is obtained by the terminal by training a preset deep learning model on target data of the target user;
decrypting the encrypted first model parameter to obtain the first model parameter;
merging the first model parameters of all target users in the user set to generate second model parameters;
sending the second model parameters to the terminal, and receiving a first data feature sent by the terminal, wherein the first data feature is obtained by the terminal by retraining the deep learning model according to the second model parameters and extracting the first data feature from the target data using the deep learning model obtained by training;
and merging the first data features of the target users in the user set to generate second data features.
7. The feature extraction method according to claim 6, wherein the step of combining the first model parameters of all target users in the user set to generate second model parameters comprises:
calculating an average value of the first model parameters of all target users in the user set;
and taking the average value as a second model parameter.
8. The feature extraction method according to claim 6, wherein the step of combining the first model parameters of all target users in the user set to generate second model parameters comprises:
acquiring first model parameters of all target users in the user set, and determining the maximum value or median of the acquired first model parameters;
and taking the maximum value or the median as a second model parameter.
9. The feature extraction method according to any one of claims 6 to 8, wherein the step of generating a second data feature by merging the first data features of the target users in the user set comprises:
calculating an average value of the first data features of all target users in the user set;
and taking the average value as the second data feature.
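A tiny worked example of the merging rules in claims 7 to 9 (the numbers are invented purely for illustration):

    import numpy as np

    first_params = [np.array([0.8, 1.2]), np.array([1.0, 1.0]), np.array([1.2, 0.8])]

    mean_params   = np.mean(first_params, axis=0)     # claim 7: [1.0, 1.0]
    max_params    = np.max(first_params, axis=0)      # claim 8: [1.2, 1.2]
    median_params = np.median(first_params, axis=0)   # claim 8: [1.0, 1.0]

    first_features = [np.array([0.2, 0.4]), np.array([0.4, 0.6])]
    second_feature = np.mean(first_features, axis=0)  # claim 9: [0.3, 0.5]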
10. A feature extraction device applied to a terminal is characterized by comprising:
the data acquisition module is used for receiving a feature extraction instruction sent by the server and acquiring target data of a target user according to the feature extraction instruction;
the model training module is used for training a preset deep learning model according to the target data to obtain a first model parameter;
the data sending module is used for encrypting the first model parameter and then sending the encrypted first model parameter to the server;
the parameter receiving module is used for receiving second model parameters returned by the server, wherein the second model parameters are obtained by the server by merging the first model parameters sent by each user in a user set, and the user set comprises the target user;
the model training module is further configured to: retrain the deep learning model based on the second model parameters, and extract a first data feature from the target data according to the deep learning model obtained through training;
the data sending module is further configured to: encrypt the first data feature and then send the encrypted first data feature to the server.
11. A feature extraction device applied to a server is characterized by comprising:
the instruction sending module is used for sending a feature extraction instruction to the electronic equipment corresponding to the target user in the user set;
the data receiving module is used for receiving an encrypted first model parameter returned by the electronic equipment according to the feature extraction instruction, wherein the first model parameter is obtained by the electronic equipment by training a preset deep learning model on target data of the target user;
the data decryption module is used for decrypting the encrypted first model parameter to obtain the first model parameter;
the parameter merging module is used for merging the first model parameters of all target users in the user set to generate second model parameters;
the parameter sending module is used for sending the second model parameter to the electronic equipment;
the feature receiving module is used for receiving a first data feature sent by the electronic equipment, wherein the first data feature is obtained by the electronic equipment by retraining the deep learning model according to the second model parameter and extracting the first data feature from the target data using the deep learning model obtained by training;
and the feature merging module is used for merging the first data features of the target users in the user set to generate second data features.
12. A storage medium on which a computer program is stored, characterized in that, when the computer program is run on a computer, the computer is caused to execute the feature extraction method according to any one of claims 1 to 5;
or, when the computer program is run on a computer, causes the computer to execute the feature extraction method according to any one of claims 6 to 9.
13. An electronic device comprising a processor and a memory, the memory storing a computer program, wherein the processor is configured to execute the feature extraction method according to any one of claims 1 to 5 by calling the computer program;
alternatively, the processor is configured to execute the feature extraction method according to any one of claims 6 to 9 by calling the computer program.
CN201910282011.6A 2019-04-09 2019-04-09 Feature extraction method and device, storage medium and electronic equipment Withdrawn CN111797851A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910282011.6A CN111797851A (en) 2019-04-09 2019-04-09 Feature extraction method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN111797851A true CN111797851A (en) 2020-10-20

Family

ID=72805283

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910282011.6A Withdrawn CN111797851A (en) 2019-04-09 2019-04-09 Feature extraction method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111797851A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180018590A1 (en) * 2016-07-18 2018-01-18 NantOmics, Inc. Distributed Machine Learning Systems, Apparatus, and Methods
CN108021819A (en) * 2016-11-04 2018-05-11 西门子保健有限责任公司 Anonymity and security classification using deep learning network
US20190012592A1 (en) * 2017-07-07 2019-01-10 Pointr Data Inc. Secure federated neural networks
CN109347668A (en) * 2018-10-17 2019-02-15 网宿科技股份有限公司 A kind of training method and device of service quality assessment model
CN109413087A (en) * 2018-11-16 2019-03-01 京东城市(南京)科技有限公司 Data sharing method, device, digital gateway and computer readable storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549951A (en) * 2020-11-26 2022-05-27 未岚大陆(北京)科技有限公司 Method for obtaining training data, related device, system and storage medium
CN114549951B (en) * 2020-11-26 2024-04-23 未岚大陆(北京)科技有限公司 Method for obtaining training data, related device, system and storage medium
CN112508101A (en) * 2020-12-07 2021-03-16 杭州海康威视数字技术股份有限公司 System, method and equipment for adjusting neural network model
WO2022121840A1 (en) * 2020-12-07 2022-06-16 杭州海康威视数字技术股份有限公司 Neural network model adjustment system and method, and device
CN113408745A (en) * 2021-08-20 2021-09-17 北京瑞莱智慧科技有限公司 Task scheduling method, device, equipment and storage medium
CN114520817A (en) * 2022-02-18 2022-05-20 中国农业银行股份有限公司 Data transmission method and device, storage medium and electronic equipment
CN114520817B (en) * 2022-02-18 2024-04-16 中国农业银行股份有限公司 Data transmission method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20201020