CN111832787B - Teacher style prediction model training method and computer storage medium


Info

Publication number
CN111832787B
CN111832787B (application CN201910330162.4A)
Authority
CN
China
Prior art keywords
data
teacher
style
low
teaching content
Prior art date
Legal status
Active
Application number
CN201910330162.4A
Other languages
Chinese (zh)
Other versions
CN111832787A (en)
Inventor
杨嵩
黄健
杨非
刘子韬
黄琰
Current Assignee
Beijing Xintang Sichuang Educational Technology Co Ltd
Original Assignee
Beijing Xintang Sichuang Educational Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xintang Sichuang Educational Technology Co Ltd filed Critical Beijing Xintang Sichuang Educational Technology Co Ltd
Priority to CN201910330162.4A priority Critical patent/CN111832787B/en
Priority to PCT/CN2020/086363 priority patent/WO2020216286A1/en
Publication of CN111832787A publication Critical patent/CN111832787A/en
Application granted granted Critical
Publication of CN111832787B publication Critical patent/CN111832787B/en

Classifications

    • G06F18/00 Pattern recognition
    • G06N3/02 Neural networks; G06N3/08 Learning methods
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q50/20 Education; G06Q50/205 Education administration or guidance
    • G06V40/168 Feature extraction; Face representation (human faces)

Abstract

The embodiments of the invention provide a training method for a teacher style prediction model and a computer storage medium. The method comprises the following steps: determining multiple groups of low-dimensional feature data of a teaching content sample based on high-dimensional feature data of the teaching content sample; obtaining, through a teacher style prediction model to be trained, teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data; and training the teacher style prediction model based on the teacher style annotation data and the teacher style prediction data of the teaching content sample. By grouping the high-dimensional feature data of the teaching content sample into multiple groups of low-dimensional feature data, the embodiments of the invention greatly reduce the dimension of the input features of the teacher style prediction model to be trained, so the teacher style prediction performance of the trained teacher style prediction model can be effectively improved.

Description

Teacher style prediction model training method and computer storage medium
Technical Field
The embodiment of the invention relates to the field of artificial intelligence, in particular to a training method of a teacher style prediction model and a computer storage medium.
Background
The teacher style is a judgment of a teacher's individual value and is an important part of education evaluation. Predicting the teaching style of teachers helps school teaching management departments and the teachers themselves understand teaching conditions, find problems, summarize experience and improve their work, so as to ultimately improve teaching quality. Therefore, how to predict the teacher style fairly, justly and accurately has long been a question explored by the education community.
At present, the teaching style of a teacher is predicted mainly through modeling, and the input data of the model may include teaching data such as the teacher's teaching audio and video. Because it is difficult to obtain teaching data samples of different teacher styles, the amount of such sample data is often small. In addition, the dimensionality of the features extracted from the teaching data samples is often high, so over-fitting easily occurs during model training and a well-performing model cannot be trained. To address the small amount of teaching data samples and the high dimensionality of the extracted features, most existing approaches use principal component analysis to reduce the dimensionality of the high-dimensional features extracted from the teaching data samples, and then train the model with the reduced features. However, this approach inevitably loses some characteristics of the original extracted features, cannot make full use of the information they carry, and the meaning of the reduced features can no longer be interpreted. Therefore, no model training method so far can effectively improve teacher style prediction performance.
Disclosure of Invention
In view of the above, an objective of the present invention is to provide a method for training a teacher-style prediction model and a computer storage medium, so as to solve at least one of the above problems.
The embodiment of the invention provides a method for training a teacher style prediction model. The method comprises the following steps: determining multiple groups of low-dimensional feature data of a teaching content sample based on high-dimensional feature data of the teaching content sample; obtaining teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through a teacher style prediction model to be trained; and training the teacher style prediction model based on the teacher style annotation data and the teacher style prediction data of the teaching content sample.
An embodiment of the present invention further provides a computer-readable medium, where a readable program is stored in the computer-readable medium, and the readable program includes: instructions for determining sets of low-dimensional feature data for a teaching content sample based on high-dimensional feature data for the teaching content sample; instructions for obtaining teacher style prediction data corresponding to the teaching content sample based on the plurality of groups of low-dimensional feature data through a teacher style prediction model to be trained; instructions for training the teacher-style prediction model based on the teacher-style annotation data and the teacher-style prediction data of the teaching content samples.
According to the training scheme of the teacher style prediction model provided by the embodiments of the invention, multiple groups of low-dimensional feature data of a teaching content sample are determined based on the high-dimensional feature data of the teaching content sample; teacher style prediction data corresponding to the teaching content sample is obtained through the teacher style prediction model to be trained based on the multiple groups of low-dimensional feature data; and the teacher style prediction model is then trained based on the teacher style annotation data and the teacher style prediction data of the teaching content sample. By grouping the high-dimensional feature data into multiple groups of low-dimensional feature data, the dimension of the input features of the teacher style prediction model to be trained is greatly reduced, so the teacher style prediction performance of the trained model can be effectively improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and a person skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a flow chart illustrating the steps of a method for training a teacher-style predictive model according to a first embodiment of the invention;
FIG. 2 is a schematic diagram illustrating a structure of a teacher-style prediction model according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating steps of a teacher style prediction method according to a second embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the embodiments of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the embodiments of the present invention should fall within the scope of protection of the embodiments of the present invention.
Specific implementations of the embodiments of the invention are further described below with reference to the drawings.
Embodiment One
Referring to fig. 1, a flowchart illustrating steps of a method for training a teacher-style prediction model according to a first embodiment of the present invention is shown.
Specifically, the teacher style prediction model training method provided by the embodiment of the invention comprises the following steps:
in step S101, a plurality of sets of low-dimensional feature data of a teaching content sample are determined based on high-dimensional feature data of the teaching content sample.
In this embodiment, the teaching content sample may include audio data or video data of teaching content as a training sample. The high-dimensional feature data may be understood as a feature vector of a higher dimension, for example, a feature vector of 1000 dimensions, a feature vector of 2000 dimensions, and the like. When the teaching content sample is audio data of teaching content serving as a training sample, the high-dimensional feature data of the teaching content sample may be high-dimensional speech acoustic feature data extracted from the audio data, the speech acoustic feature data may include prosodic feature data, spectral feature data, voice quality feature data, and the like of audio, and the speech acoustic feature data is specifically a speech acoustic feature vector. In a specific embodiment, existing speech acoustic feature extraction algorithms can be used to extract high-dimensional speech acoustic feature data from the audio data. When the teaching content sample is video data of teaching content serving as a training sample, the high-dimensional feature data of the teaching content sample may be high-dimensional facial feature data extracted from the video data, and the facial feature data may include feature data of a mouth region, feature data of an eye region, feature data of a cheek region, and the like, and the facial feature data is specifically a facial feature vector. In a specific embodiment, the existing facial feature extraction algorithm can be used to extract high-dimensional facial feature data from the video data.
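As a hedged illustration of the feature extraction described above: the patent does not name a particular toolkit, so the sketch below assumes the librosa library and uses pooled MFCC statistics as a stand-in for the high-dimensional speech acoustic feature vector of an audio teaching content sample.

```python
# Illustrative only: the patent does not specify a feature-extraction toolkit.
# MFCC statistics stand in for the high-dimensional speech acoustic feature vector.
import numpy as np
import librosa

def extract_acoustic_features(audio_path, n_mfcc=40):
    """Return one fixed-length, high-dimensional acoustic feature vector per sample."""
    y, sr = librosa.load(audio_path, sr=16000)                 # mono waveform
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)     # (n_mfcc, frames)
    # Pool frame-level features into a single vector for the whole sample:
    # per-coefficient mean and standard deviation -> 2 * n_mfcc dimensions.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])
```

A facial feature vector for video samples would be built analogously from region-level descriptors; the choice of extractor is left open by the description.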
In this embodiment, when multiple groups of low-dimensional feature data of a teaching content sample are determined based on the high-dimensional feature data of the teaching content sample, correlation analysis may be performed on the high-dimensional feature data to determine a grouping of the high-dimensional feature data, and the high-dimensional feature data may then be divided based on this grouping to obtain the multiple groups of low-dimensional feature data of the teaching content sample. In this way, the dimensionality of the input features of the teacher style prediction model is greatly reduced.
Specifically, when the high-dimensional feature data is high-dimensional speech acoustic feature data, the speech acoustic feature data includes prosodic features, spectral features and voice quality features. Correlation analysis may therefore be performed on the high-dimensional speech acoustic feature data based on the prior knowledge that speech acoustic features include prosodic, spectral and voice quality features, so as to determine the grouping of the high-dimensional speech acoustic feature data. The high-dimensional speech acoustic feature data is then divided according to this grouping to obtain the prosodic feature data, spectral feature data and voice quality feature data of the teaching content sample. In brief, the types of features contained in the speech acoustic features are determined from acoustic prior knowledge, and the correlation analysis of the high-dimensional speech acoustic feature data is carried out according to these feature types. When the high-dimensional feature data is high-dimensional facial feature data, the prior knowledge is that a human face contains a mouth region, an eye region, a nose region and a cheek region. Correlation analysis may therefore be performed on the high-dimensional facial feature data based on this prior knowledge to determine the grouping of the high-dimensional facial feature data. The high-dimensional facial feature data is then divided according to this grouping to obtain the mouth region feature data, eye region feature data, nose region feature data and cheek region feature data of the teaching content sample. In brief, the different regions of the face are determined from facial prior knowledge, and the correlation analysis of the high-dimensional facial feature data is carried out according to these regions.
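The patent relies on prior knowledge to guide the correlation analysis; the sketch below shows one possible purely data-driven variant that clusters features by their correlation, so the clustering step and its parameters are assumptions rather than the method prescribed above.

```python
# A minimal sketch of correlation-based grouping. The clustering procedure is an
# assumption; the patent groups features using prior knowledge instead.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def group_features_by_correlation(X, num_groups):
    """X: (num_samples, D) matrix of high-dimensional features.
    Returns a list of column-index arrays, one per feature group."""
    corr = np.abs(np.corrcoef(X, rowvar=False))            # (D, D) feature correlation
    dist = 1.0 - corr                                       # correlated features are "close"
    Z = linkage(squareform(dist, checks=False), method="average")
    labels = fcluster(Z, t=num_groups, criterion="maxclust")
    return [np.where(labels == g)[0] for g in range(1, num_groups + 1)]
```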
In this embodiment, when no prior knowledge is available, the multiple groups of low-dimensional feature data of a teaching content sample may be determined by dividing the high-dimensional feature data of the teaching content sample into groups of equal dimension. For example, when the high-dimensional feature data is 1000-dimensional, it may be equally divided into 10 groups of low-dimensional feature data, each of 100 dimensions. In a specific implementation, the number of groups and the dimension of each group can be determined experimentally. In this way, the dimensionality of the input features of the teacher style prediction model is greatly reduced.
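A minimal sketch of the equal-dimension split described in the 1000-dimensional example, assuming NumPy; the concrete dimensions are only illustrative.

```python
# Equal-dimension splitting without prior knowledge, e.g. 1000 dims -> 10 x 100 dims.
# np.array_split also tolerates dimensions that do not divide evenly.
import numpy as np

v = np.random.randn(1000)          # one sample's high-dimensional feature vector
groups = np.array_split(v, 10)     # 10 groups of low-dimensional feature data
assert all(g.shape == (100,) for g in groups)
assert np.allclose(np.concatenate(groups), v)   # splicing the groups recovers the original
```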
Specifically, teaching content samples are input. Let the high-dimensional feature data of the nth sample (n = 1, 2, ..., N, where N is the total number of samples) be $v_n$, with dimension $D$. Using prior knowledge, correlation analysis is performed on $v_n$ to divide it into $K$ groups (k = 1, 2, ..., K), where the dimension of the kth group is $D_k$ and satisfies

$$\sum_{k=1}^{K} D_k = D.$$

For the nth sample, the original high-dimensional feature data is $v_n$ and the low-dimensional feature data of the kth group after division is $v_n^k$, so that

$$v_n = \mathrm{concat}(v_n^1, v_n^2, \ldots, v_n^K),$$

where $\mathrm{concat}(\cdot)$ denotes splicing the feature data together in sequence. If correlation analysis cannot be carried out for lack of prior knowledge, the features can simply be divided into $K$ equal parts, and the above relations still hold.
In step S102, teacher-style prediction data corresponding to the teaching content sample is obtained based on the plurality of sets of low-dimensional feature data through a teacher-style prediction model to be trained.
In this embodiment, the teacher-style prediction model may be any suitable neural network model capable of feature extraction or target object detection, including but not limited to convolutional neural networks, reinforcement learning neural networks, generator networks of generative adversarial networks, deep neural networks, and so on. The specific structure of the neural network, such as the number of convolution layers, the size of the convolution kernels and the number of channels, may be set by those skilled in the art according to actual requirements. The teacher-style prediction data may be a predicted teacher-style category, a predicted teacher-style value, or the like.
In this embodiment, the teacher-style prediction model includes a plurality of low-level models and a high-level model connected to the output ends of the plurality of low-level models, and both the low-level models and the high-level model are deep neural network models. When teacher-style prediction data corresponding to the teaching content sample is obtained through the teacher-style prediction model to be trained based on the multiple groups of low-dimensional feature data, a plurality of teacher-style preliminary prediction data corresponding to the teaching content sample is first obtained through the plurality of low-level models based on the multiple groups of low-dimensional feature data, and the final teacher-style prediction data corresponding to the teaching content sample is then obtained through the high-level model based on the plurality of teacher-style preliminary prediction data. In this way, the teaching style of the teaching content sample is preliminarily predicted by the low-level models included in the teacher-style prediction model, and the final prediction is made by the high-level model based on these preliminary results, which can improve the accuracy with which the teacher-style prediction model predicts the teacher style corresponding to the teaching content sample.
In this embodiment, each of the plurality of low-level models includes a hidden layer and a prediction layer connected to the output end of the hidden layer, where the hidden layer is a fully connected layer or a convolutional layer and the prediction layer is a fully connected layer. When the plurality of teacher-style preliminary prediction data corresponding to the teaching content sample is obtained through the plurality of low-level models based on the multiple groups of low-dimensional feature data, feature extraction is first performed on each group of low-dimensional feature data through the hidden layer to obtain the feature characterization data corresponding to each group, and the feature characterization data corresponding to the groups of low-dimensional feature data is then mapped through the prediction layer to obtain the plurality of teacher-style preliminary prediction data corresponding to the teaching content sample. The feature characterization data is specifically a feature characterization vector. Performing feature extraction on the groups of low-dimensional feature data through the hidden layer re-encodes each group of features, which improves the robustness of the corresponding feature characterization data and, in turn, the accuracy of the low-level models' preliminary prediction of the teacher style corresponding to the teaching content sample.
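The following is a minimal PyTorch sketch of one low-level model as described above: fully connected hidden layers followed by a one-unit prediction layer. The layer widths and the sigmoid on the output are assumptions consistent with the description (the preliminary prediction is a real value between 0 and 1), not the patent's exact configuration.

```python
# A minimal sketch of one low-level model; widths and the output sigmoid are assumptions.
import torch
import torch.nn as nn

class LowLevelModel(nn.Module):
    def __init__(self, input_dim, hidden_dims=(64, 32)):
        super().__init__()
        layers, prev = [], input_dim
        for h in hidden_dims:                       # hidden (fully connected) layers
            layers += [nn.Linear(prev, h), nn.Sigmoid()]
            prev = h
        self.hidden = nn.Sequential(*layers)
        self.predict = nn.Linear(prev, 1)           # prediction layer

    def forward(self, x):
        rep = self.hidden(x)                        # feature characterization vector
        score = torch.sigmoid(self.predict(rep))    # preliminary teacher-style prediction
        return rep, score
```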
In this embodiment, when final teacher-style prediction data corresponding to the teaching content sample is obtained through the high-level model based on the plurality of teacher-style preliminary prediction data, high-level feature characterization data corresponding to the high-level model is generated based on the plurality of teacher-style preliminary prediction data; and acquiring final teacher style prediction data corresponding to the teaching content sample through the high-level model based on the high-level characteristic characterization data. The high-level feature characterization data is specifically a high-level feature characterization vector. Therefore, high-level feature characterization data corresponding to the high-level model are generated based on the teacher style preliminary prediction data, and final teacher style prediction data corresponding to the teaching content sample are obtained through the high-level model based on the high-level feature characterization data, so that the accuracy of final teacher style prediction of the high-level model corresponding to the teaching content sample can be improved.
In this embodiment, when generating the high-level feature characterization data corresponding to the high-level model based on the plurality of teacher-style preliminary prediction data, the high-level feature characterization data is generated based on the feature characterization data corresponding to each of the plurality of teacher-style preliminary prediction data and the plurality of sets of low-dimensional feature data. Therefore, high-level feature characterization data are generated based on the teacher style preliminary prediction data and feature characterization data corresponding to the low-dimensional feature data, the robustness of the high-level feature characterization data can be improved, and the accuracy of final prediction of the teacher style corresponding to the teaching content sample by the high-level model is further improved.
In this embodiment, when final prediction data of a teacher style corresponding to the teaching content sample is obtained through the high-level model based on the high-level feature characterization data, feature extraction operation is performed on the high-level feature characterization data through a hidden layer in the high-level model to obtain feature characterization data corresponding to the high-level feature characterization data; and mapping the characteristic characterization data corresponding to the high-level characteristic characterization data through a prediction layer in the high-level model to obtain final prediction data of a teacher style corresponding to the teaching content sample. The hidden layer is a full-link layer or a convolutional layer, the prediction layer is a full-link layer, and the feature characterization data is a feature characterization vector. Therefore, the high-level feature characterization data is subjected to feature extraction operation through the hidden layer, feature recoding can be performed on the high-level feature characterization data, the robustness of the feature characterization data corresponding to the high-level feature characterization data is improved, and the accuracy of final prediction of a teacher style corresponding to a teaching content sample by a high-level model is further improved.
Specifically, as shown in fig. 2, the teacher-style prediction model includes a plurality of low-level models and a high-level model connected to the outputs of the plurality of low-level models. After the high-dimensional feature data is divided, a plurality of feature groups is obtained, and each feature group is input into the corresponding low-level model. The corresponding low-level model makes a preliminary prediction of the teaching style of the teaching content sample based on its feature group, yielding the teacher-style preliminary prediction data corresponding to the teaching content sample. Each low-level model comprises a plurality of hidden layers connected in sequence and a prediction layer connected to the output end of the last of these hidden layers. High-level feature characterization data corresponding to the high-level model is generated from the teacher-style preliminary prediction data output by the low-level models and the feature characterization data of the feature groups output by the last hidden layer of each low-level model. Finally, the high-level model makes the final prediction of the teaching style of the teaching content sample based on the high-level feature characterization data, yielding the final teacher-style prediction data corresponding to the teaching content sample.
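Under the same assumptions, the sketch below combines K low-level models with a high-level model in the arrangement of fig. 2, reusing the LowLevelModel sketch above. Group dimensions, hidden widths and the sigmoid on the final output are illustrative choices, not values taken from the patent.

```python
# Sketch of the two-level predictor of Fig. 2 (reuses LowLevelModel defined earlier).
import torch
import torch.nn as nn

class TeacherStyleModel(nn.Module):
    def __init__(self, group_dims, low_hidden=(64, 32), high_hidden=(32, 16)):
        super().__init__()
        self.lows = nn.ModuleList([LowLevelModel(d, low_hidden) for d in group_dims])
        high_in = len(group_dims) * (low_hidden[-1] + 1)   # reps + preliminary scores
        layers, prev = [], high_in
        for h in high_hidden:                              # high-level hidden layers
            layers += [nn.Linear(prev, h), nn.Sigmoid()]
            prev = h
        self.high_hidden = nn.Sequential(*layers)
        self.high_predict = nn.Linear(prev, 1)             # high-level prediction layer

    def forward(self, groups):                  # groups: list of (batch, D_k) tensors
        reps, prelim = zip(*[m(x) for m, x in zip(self.lows, groups)])
        h = torch.cat(list(reps) + list(prelim), dim=1)    # high-level characterization
        s = torch.sigmoid(self.high_predict(self.high_hidden(h)))
        return s, prelim                         # final and preliminary predictions
```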
Specifically, the high-dimensional feature data is divided into $K$ groups of low-dimensional feature data, and each group corresponds to one low-level model, so the kth low-level model corresponds to the kth group of low-dimensional feature data $v_n^k$ of the nth sample. Let the number of hidden layers of the kth low-level model be $L_k$ ($l_k = 1, 2, \ldots, L_k$), and let the hidden node dimension of the $l_k$th hidden layer be $D_{l_k}^k$.

When $l_k = 1$:

$$y_{1n}^k = W_1^k v_n^k + b_1^k, \qquad g_{1n}^k = f(y_{1n}^k),$$

where $W_1^k$ is the weight matrix of the first hidden layer of the kth low-level model, with dimension $D_1^k \times D_k$; $b_1^k$ is the bias vector of the first hidden layer of the kth low-level model, with dimension $D_1^k \times 1$; $f(\cdot)$ is a non-linear function, typically a sigmoid function; and $g_{1n}^k$ is the first hidden vector representation of the kth low-level model for the nth sample, with dimension $D_1^k \times 1$.

When $1 < l_k < L_k$:

$$y_{l_k n}^k = W_{l_k}^k g_{(l_k-1)n}^k + b_{l_k}^k, \qquad g_{l_k n}^k = f(y_{l_k n}^k),$$

where $W_{l_k}^k$ is the weight matrix of the $l_k$th hidden layer of the kth low-level model, with dimension $D_{l_k}^k \times D_{l_k-1}^k$; $b_{l_k}^k$ is the bias vector of the $l_k$th hidden layer of the kth low-level model, with dimension $D_{l_k}^k \times 1$; and $g_{l_k n}^k$ is the $l_k$th hidden vector representation of the kth low-level model for the nth sample, with dimension $D_{l_k}^k \times 1$.

When $l_k = L_k$:

$$y_{L_k n}^k = W_{L_k}^k g_{(L_k-1)n}^k + b_{L_k}^k, \qquad h_{L_k n}^k = f(y_{L_k n}^k),$$

where $W_{L_k}^k$ is the weight matrix of the $L_k$th hidden layer of the kth low-level model, with dimension $D_{L_k}^k \times D_{L_k-1}^k$; $b_{L_k}^k$ is the bias vector of the $L_k$th hidden layer of the kth low-level model, with dimension $D_{L_k}^k \times 1$; and $h_{L_k n}^k$ is the $L_k$th hidden vector representation of the kth low-level model for the nth sample, with dimension $D_{L_k}^k \times 1$.

The output $h_{L_k n}^k$ of the hidden layers of the kth low-level model is taken as the input of the prediction layer of the kth low-level model:

$$s_n^k = W^k h_{L_k n}^k + b^k,$$

where $W^k$ is the weight matrix of the prediction layer of the kth low-level model, with dimension $1 \times D_{L_k}^k$; $b^k$ is the bias vector of the prediction layer of the kth low-level model, with dimension 1; and $s_n^k$ is the preliminary teacher-style prediction data of the kth low-level model for the nth sample, with dimension 1 and a real value between 0 and 1.

The hidden vector representation of the last hidden layer of each low-level model and the corresponding teacher-style preliminary prediction data are combined to obtain the high-level feature characterization data:

$$h_n = \mathrm{concat}(h_{L_1 n}^1, s_n^1, h_{L_2 n}^2, s_n^2, \ldots, h_{L_K n}^K, s_n^K),$$

where $h_n$ has dimension $\left(\sum_{k=1}^{K} D_{L_k}^k + K\right) \times 1$.
The teacher-style preliminary prediction data of each low-level model are combined and added into the hidden vector representation of the last hidden layer, so that more information can be obtained, and the high-level model can predict more accurately.
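A short dimension check of the concatenation just described, assuming PyTorch; the number of groups and the hidden size are placeholders.

```python
# Dimension check for the high-level characterization h_n: each group contributes its
# last hidden representation (D_Lk^k dims) plus one preliminary score.
import torch

reps   = [torch.randn(1, 32) for _ in range(4)]   # K = 4 groups, D_Lk^k = 32 (illustrative)
scores = [torch.rand(1, 1)   for _ in range(4)]   # preliminary predictions in [0, 1]
h_n = torch.cat([t for pair in zip(reps, scores) for t in pair], dim=1)
assert h_n.shape == (1, 4 * (32 + 1))             # sum_k (D_Lk^k + 1)
```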
The high-level feature characterization data is used as the input of the high-level model for the final prediction. The high-level model comprises a plurality of hidden layers connected in sequence and a prediction layer connected to the output end of the last of these hidden layers. Let the number of hidden layers of the high-level model be $L$, and let the hidden node dimension of the $l$th hidden layer be $D_l$.

When $l = 1$:

$$y_{1n} = W_1 h_n + b_1, \qquad g_{1n} = f(y_{1n}),$$

where $W_1$ is the weight matrix of the first hidden layer of the high-level model, with dimension $D_1 \times \left(\sum_{k=1}^{K} D_{L_k}^k + K\right)$; $b_1$ is the bias vector of the first hidden layer of the high-level model, with dimension $D_1 \times 1$; $f(\cdot)$ is a non-linear function, typically a sigmoid function; and $g_{1n}$ is the first hidden vector representation of the high-level model for the nth sample, with dimension $D_1 \times 1$.

When $1 < l < L$:

$$y_{ln} = W_l g_{(l-1)n} + b_l, \qquad g_{ln} = f(y_{ln}),$$

where $W_l$ is the weight matrix of the $l$th hidden layer of the high-level model, with dimension $D_l \times D_{l-1}$; $b_l$ is the bias vector of the $l$th hidden layer of the high-level model, with dimension $D_l \times 1$; and $g_{ln}$ is the $l$th hidden vector representation of the high-level model for the nth sample, with dimension $D_l \times 1$.

When $l = L$:

$$y_{Ln} = W_L g_{(L-1)n} + b_L, \qquad h_{Ln} = f(y_{Ln}),$$

where $W_L$ is the weight matrix of the $L$th hidden layer of the high-level model, with dimension $D_L \times D_{L-1}$; $b_L$ is the bias vector of the $L$th hidden layer of the high-level model, with dimension $D_L \times 1$; and $h_{Ln}$ is the $L$th hidden vector representation of the high-level model for the nth sample, with dimension $D_L \times 1$.

The output $h_{Ln}$ of the hidden layers of the high-level model is taken as the input of the prediction layer of the high-level model:

$$s_n = W h_{Ln} + b,$$

where $W$ is the weight matrix of the prediction layer of the high-level model, with dimension $1 \times D_L$; $b$ is the bias vector of the prediction layer of the high-level model, with dimension 1; and $s_n$ is the final teacher-style prediction data for the nth sample, with dimension 1 and a real value between 0 and 1.
As can be seen from the above description, in a specific implementation the low-level models and the high-level model have similar structures. They are used together because the teaching style of the teaching content sample is first predicted preliminarily by the low-level models and then predicted finally by the high-level model based on the low-level models' preliminary results, which improves the accuracy with which the teacher-style prediction model predicts the teacher style corresponding to the teaching content sample. In addition, because the amount of teaching content sample data is small and the dimensionality of its feature data is very high, modeling directly with a single model (for example, only one bottom-level model) would cause a dimensionality disaster: the trained model would fit only the training data, would not perform well on test data, and would suffer from over-fitting.
In step S103, the teacher style prediction model is trained based on the teacher style labeling data and the teacher style prediction data of the teaching content sample.
In this embodiment, the teacher-style annotation data may be understood as teacher-style real data of the teaching content sample.
In this embodiment, when the teacher-style prediction model is trained based on the teacher-style annotation data and the teacher-style prediction data of the teaching content sample, a difference value between the teacher-style annotation data and the teacher-style prediction data is determined through a target loss function, and the parameters of the teacher-style prediction model are adjusted based on the difference value.
In this embodiment, the difference value determined through the target loss function is specifically the difference value between the teacher-style annotation data and the final teacher-style prediction data, and the parameters adjusted based on the difference value are the parameters of the plurality of low-level models and the high-level model in the teacher-style prediction model.
In this embodiment, the objective loss function includes a mean square error term and an L2 regularization term. Thereby, the training process of the teacher-style prediction model can be prevented from being influenced by overfitting.
Specifically, given the high-dimensional feature data $v_n$ of the nth sample, the final teacher-style prediction data $s_n$ is obtained from the prediction layer of the high-level model after computation through the low-level models and the high-level model. Let the teacher-style ground-truth (annotation) data of the nth sample be $s_n'$; the teacher-style prediction model is trained so that $s_n$ and $s_n'$ are as close as possible. During training, the following function is selected as the loss function of the teacher-style prediction model:

$$\mathcal{L} = \frac{1}{N}\sum_{n=1}^{N}\left(s_n - s_n'\right)^2 + \frac{1}{N}\sum_{n=1}^{N}\sum_{k=1}^{K}\left(s_n^k - s_n'\right)^2 + \lambda\left(\sum_{k=1}^{K}\sum_{l_k=1}^{L_k}\big\|W_{l_k}^k\big\|_2^2 + \sum_{k=1}^{K}\big\|W^k\big\|_2^2 + \sum_{l=1}^{L}\big\|W_l\big\|_2^2 + \big\|W\big\|_2^2\right),$$

where $s_n$ is the final teacher-style prediction data of the high-level model for the nth sample, $s_n^k$ is the teacher-style preliminary prediction data of the kth low-level model for the nth sample, $s_n'$ is the teacher-style ground-truth data of the nth sample, $W_{l_k}^k$ are the weight matrices of the hidden layers of the low-level models, $W^k$ are the weight matrices of the prediction layers of the low-level models, $W_l$ are the weight matrices of the hidden layers of the high-level model, $W$ is the weight matrix of the prediction layer of the high-level model, and $\lambda$ is the weight-decay coefficient, with a value between 0 and 1. The first two terms compute mean square errors, and the last four terms add L2 regularization to prevent the teacher-style prediction model from over-fitting.
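A minimal sketch of this objective, assuming PyTorch and the TeacherStyleModel sketch given earlier: mean-squared error on the final and preliminary predictions plus L2 regularization over all weight matrices, with `lam` standing in for the weight-decay coefficient.

```python
# Hedged sketch of the target loss: MSE terms plus L2 regularization on weights.
import torch

def teacher_style_loss(s_final, s_prelim, s_true, model, lam=1e-4):
    """s_final: (N, 1) final predictions; s_prelim: list of K (N, 1) preliminary
    predictions; s_true: (N, 1) annotation data; model: the TeacherStyleModel above."""
    mse = torch.mean((s_final - s_true) ** 2)
    mse = mse + sum(torch.mean((p - s_true) ** 2) for p in s_prelim)
    l2 = sum((w ** 2).sum() for name, w in model.named_parameters() if "weight" in name)
    return mse + lam * l2
```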
The teacher-style prediction model is trained by combining the low-level models and the high-level model and optimizing the whole model through the target loss function. That is, the whole model is trained by minimizing the target loss function, so that the parameters of the teacher-style prediction model are obtained as

$$\Theta^{*} = \arg\min_{\Theta} \mathcal{L}(\Theta), \qquad \Theta = \left\{W_{l_k}^k, b_{l_k}^k, W^k, b^k, W_l, b_l, W, b\right\}.$$
Specifically, the currently obtained final teacher-style prediction data is evaluated by determining the difference value between the teacher-style annotation data and the final teacher-style prediction data, and this evaluation serves as the basis for the subsequent training of the teacher-style prediction model. Specifically, the difference values may be propagated back to the teacher-style prediction model so as to train it iteratively. The teacher-style prediction model is trained in an iterative process; only one training pass is described in this embodiment, but it should be understood by those skilled in the art that the same procedure can be applied to each training pass until the training of the teacher-style prediction model is completed.
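The iterative training just described can be sketched as follows, reusing `teacher_style_loss` from the earlier sketch. The optimizer, learning rate, epoch count and the structure of `data_loader` are assumptions; the patent only specifies minimizing the target loss over all low-level and high-level parameters.

```python
# Iterative training: forward pass, loss, back-propagation of difference values, update.
import torch

def train(model, data_loader, epochs=50, lr=1e-3, lam=1e-4):
    opt = torch.optim.Adam(model.parameters(), lr=lr)   # optimizer choice is an assumption
    for _ in range(epochs):
        for groups, s_true in data_loader:    # groups: list of (batch, D_k) tensors
            s_final, s_prelim = model(groups)
            loss = teacher_style_loss(s_final, s_prelim, s_true, model, lam)
            opt.zero_grad()
            loss.backward()                   # propagate the difference values back
            opt.step()                        # adjust low-level and high-level parameters
    return model
```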
According to the teacher-style prediction model training method provided by this embodiment, multiple groups of low-dimensional feature data of a teaching content sample are determined based on the high-dimensional feature data of the teaching content sample; teacher-style prediction data corresponding to the teaching content sample is obtained through the teacher-style prediction model to be trained based on the multiple groups of low-dimensional feature data; and the teacher-style prediction model is trained based on the teacher-style annotation data and the teacher-style prediction data of the teaching content sample. Compared with existing approaches, this method greatly reduces the dimension of the input features of the teacher-style prediction model to be trained by grouping the high-dimensional feature data of the teaching content sample into multiple groups of low-dimensional feature data, so the teacher-style prediction performance of the trained teacher-style prediction model can be effectively improved.
Embodiment Two
Referring to fig. 3, a flowchart illustrating steps of a teacher style prediction method according to a second embodiment of the present invention is shown.
Specifically, the teacher style prediction method provided by the embodiment of the invention comprises the following steps:
in step S201, a plurality of sets of low-dimensional feature data of the teaching content data are determined based on the high-dimensional feature data of the teaching content data.
In this embodiment, the tutorial data may include audio data or video data of the tutorial. When the teaching content data is audio data of teaching content, the high-dimensional feature data of the teaching content data may be high-dimensional speech acoustic feature data extracted from the audio data. When the teaching content data is video data of teaching content, the high-dimensional feature data of the teaching content data may be high-dimensional facial feature data extracted from the video data.
In this embodiment, the specific implementation of step S201 is similar to the specific implementation of step S101, and is not described herein again.
In step S202, teacher style prediction data corresponding to the teaching content data is obtained based on the multiple sets of low-dimensional feature data of the teaching content data through the teacher style prediction model trained in Embodiment One.
In this embodiment, when the teacher-style prediction data corresponding to the teaching content data is obtained based on the plurality of sets of low-dimensional feature data by using the teacher-style prediction model obtained through the training in the first embodiment, a plurality of teacher-style preliminary prediction data corresponding to the teaching content data is obtained based on the plurality of sets of low-dimensional feature data by using the plurality of low-level models; and acquiring final teacher style prediction data corresponding to the teaching content data based on the plurality of teacher style preliminary prediction data through the high-level model. Therefore, the teaching style of the teaching content data is preliminarily predicted through the plurality of low-layer models included in the teacher style prediction model, and the teaching style of the teaching content data is finally predicted through the high-layer models included in the teacher style prediction model based on the preliminary prediction result of the teaching style, so that the accuracy of the teacher style prediction model for predicting the teacher style corresponding to the teaching content data can be improved.
In this embodiment, when multiple teacher-style preliminary prediction data corresponding to the teaching content data are obtained based on the multiple sets of low-dimensional feature data through the multiple low-level models, feature extraction operations are respectively performed on the multiple sets of low-dimensional feature data through the hidden layer to obtain feature characterization data corresponding to the multiple sets of low-dimensional feature data; and mapping the characteristic representation data corresponding to the plurality of groups of low-dimensional characteristic data through the prediction layer to obtain a plurality of teacher style preliminary prediction data corresponding to the teaching content data. The feature characterization data is specifically a feature characterization vector. Therefore, through the hidden layer, the characteristic extraction operation is respectively carried out on the multiple groups of low-dimensional characteristic data, the characteristic recoding can be respectively carried out on the multiple groups of low-dimensional characteristic data, the robustness of the characteristic representation data respectively corresponding to the multiple groups of low-dimensional characteristic data is improved, and the accuracy of the primary prediction of the teacher style corresponding to the teaching content data by the low-layer model is further improved.
In this embodiment, when final teacher-style prediction data corresponding to the teaching content data is obtained through the high-level model based on the plurality of teacher-style preliminary prediction data, high-level feature characterization data corresponding to the high-level model is generated based on the plurality of teacher-style preliminary prediction data; and acquiring final teacher style prediction data corresponding to the teaching content data through the high-level model based on the high-level characteristic representation data. The high-level feature characterization data is specifically a high-level feature characterization vector. Therefore, high-level feature characterization data corresponding to the high-level model are generated based on the teacher style preliminary prediction data, and final teacher style prediction data corresponding to the teaching content data are obtained through the high-level model based on the high-level feature characterization data, so that the accuracy of final teacher style prediction corresponding to the teaching content data by the high-level model can be improved.
In this embodiment, when generating the high-level feature characterization data corresponding to the high-level model based on the plurality of teacher-style preliminary prediction data, the high-level feature characterization data is generated based on the feature characterization data corresponding to each of the plurality of teacher-style preliminary prediction data and the plurality of sets of low-dimensional feature data. Therefore, high-level feature characterization data are generated based on the teacher style preliminary prediction data and feature characterization data corresponding to the low-dimensional feature data, the robustness of the high-level feature characterization data can be improved, and the accuracy of final prediction of the teacher style corresponding to the teaching content data by the high-level model is further improved.
In this embodiment, when final predicted data of a teacher style corresponding to the teaching content data is obtained through the high-level model based on the high-level feature characterization data, feature extraction operation is performed on the high-level feature characterization data through a hidden layer in the high-level model to obtain feature characterization data corresponding to the high-level feature characterization data; and mapping the characteristic representation data corresponding to the high-level characteristic representation data through a prediction layer in the high-level model to obtain final prediction data of a teacher style corresponding to the teaching content data. Therefore, the high-level feature characterization data is subjected to feature extraction operation through the hidden layer, feature recoding can be performed on the high-level feature characterization data, the robustness of the feature characterization data corresponding to the high-level feature characterization data is improved, and the accuracy of final prediction of a teacher style corresponding to teaching content data by a high-level model is further improved.
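A short usage sketch of the forward pass just described, under the same PyTorch assumptions as before. The group dimensions are placeholders, and in practice `model` would be the trained TeacherStyleModel from Embodiment One; an untrained instance is constructed here only so the sketch runs end to end.

```python
# Applying the (trained) two-level model to new teaching content data. Feature
# extraction and grouping must match what was used during training.
import numpy as np
import torch

model = TeacherStyleModel(group_dims=(250, 250, 250, 250))   # stands in for the trained model
low_dim_groups = [np.random.randn(d).astype("float32") for d in (250, 250, 250, 250)]

model.eval()
with torch.no_grad():
    groups = [torch.from_numpy(g).unsqueeze(0) for g in low_dim_groups]  # batch of 1
    s_final, _ = model(groups)
print(float(s_final))        # teacher-style prediction data, a real value in [0, 1]
```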
In this embodiment, the method further includes: and mapping the teacher style prediction data to obtain a teacher style category corresponding to the teaching content data. Therefore, the teacher style category corresponding to the teaching content data can be obtained.
Specifically, based on teacher-style prediction data, mapping operation is performed in a pre-constructed teacher-style semantic space to obtain a teacher-style category corresponding to the teaching content data. Wherein the teacher-style semantic space is understood to be a mapping space between teacher-style prediction data and teacher-style categories.
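The patent does not define the teacher-style semantic space concretely, so the sketch below uses a simple threshold mapping; both the thresholds and the category names are hypothetical placeholders for whatever mapping space is actually constructed.

```python
# Hypothetical mapping from teacher-style prediction data to a style category.
def map_to_style_category(score):
    """score: teacher-style prediction data, a real value in [0, 1]."""
    bins = [(0.33, "style category A"), (0.66, "style category B"), (1.01, "style category C")]
    for upper, label in bins:
        if score < upper:
            return label

print(map_to_style_category(0.72))   # -> "style category C"
```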
According to the teacher style prediction method provided by this embodiment, multiple groups of low-dimensional feature data of the teaching content data are determined based on the high-dimensional feature data of the teaching content data, and teacher style prediction data corresponding to the teaching content data is then obtained through the teacher style prediction model trained in Embodiment One based on the multiple groups of low-dimensional feature data of the teaching content data. Because the input features of the model are the grouped low-dimensional features, the teacher style corresponding to the teaching content data can be predicted effectively.
Embodiment Three
An embodiment of the present invention further provides a computer-readable medium, where a readable program is stored in the computer-readable medium, and the readable program includes: instructions for determining sets of low-dimensional feature data for a teaching content sample based on high-dimensional feature data for the teaching content sample; instructions for obtaining teacher style prediction data corresponding to the teaching content samples based on the plurality of groups of low-dimensional feature data through a teacher style prediction model to be trained; and instructions for training the teacher-style prediction model based on the teacher-style annotation data and the teacher-style prediction data of the teaching content samples.
Optionally, the instruction for obtaining, by the teacher-style prediction model to be trained, teacher-style prediction data corresponding to the teaching content sample based on the plurality of sets of low-dimensional feature data includes: instructions for obtaining, by the plurality of low-level models, a plurality of teacher-style preliminary prediction data corresponding to the teaching content sample based on the plurality of sets of low-dimensional feature data; and instructions for obtaining final teacher-style prediction data corresponding to the teaching content sample based on the plurality of teacher-style preliminary prediction data through the high-level model.
Optionally, each of the plurality of low-level models includes a hidden layer and a prediction layer connected to an output end of the hidden layer, and correspondingly, the instructions for obtaining, through the plurality of low-level models and based on the plurality of sets of low-dimensional feature data, a plurality of teacher-style preliminary prediction data corresponding to the teaching content sample include: instructions for performing feature extraction operations on the sets of low-dimensional feature data through the hidden layer to obtain feature characterization data corresponding to the sets of low-dimensional feature data; and the instruction is used for mapping the characteristic characterization data corresponding to the plurality of groups of low-dimensional characteristic data through the prediction layer so as to obtain a plurality of teacher style preliminary prediction data corresponding to the teaching content samples.
Optionally, the instructions for obtaining, by the high-level model, final teacher-style prediction data corresponding to the teaching content sample based on the preliminary teacher-style prediction data include: instructions for generating high-level feature characterization data corresponding to the high-level model based on the plurality of teacher-style preliminary prediction data; and instructions for obtaining final teacher style prediction data corresponding to the teaching content sample based on the high-level feature characterization data through the high-level model.
Optionally, the instructions for generating high-level feature characterization data corresponding to the high-level model based on the plurality of teacher-style preliminary prediction data include: and generating the high-level feature characterization data based on feature characterization data corresponding to the teacher-style preliminary prediction data and the plurality of sets of low-dimensional feature data.
Optionally, the instructions for obtaining final teacher-style prediction data corresponding to the teaching content sample based on the high-level feature characterization data through the high-level model include: instructions for performing feature extraction operations on the high-level feature characterization data through a hidden layer in the high-level model to obtain feature characterization data corresponding to the high-level feature characterization data; and the instruction is used for mapping the characteristic representation data corresponding to the high-level characteristic representation data through a prediction layer in the high-level model so as to obtain final prediction data of the teacher style corresponding to the teaching content sample.
Optionally, the instructions for training the teacher-style prediction model based on the teacher-style annotation data and the teacher-style prediction data of the teaching content sample comprise: instructions for determining a difference value between the teacher-style annotation data and the final teacher-style prediction data using a target loss function; instructions for adjusting parameters of the plurality of lower-level models and the higher-level model in the teacher-style predictive model based on the difference values.
Optionally, the readable program further comprises: instructions for determining a plurality of sets of low dimensional feature data of the instructional content data based on high dimensional feature data of the instructional content data; and instructions for obtaining teacher style prediction data corresponding to the teaching content data based on the plurality of groups of low-dimensional feature data of the teaching content data through the trained teacher style prediction model.
Optionally, the readable program further comprises: and the instruction is used for carrying out mapping operation on the teacher style prediction data so as to obtain a teacher style category corresponding to the teaching content data.
According to the computer readable medium provided by this embodiment, multiple groups of low-dimensional feature data of a teaching content sample are determined based on the high-dimensional feature data of the teaching content sample; teacher style prediction data corresponding to the teaching content sample is obtained through the teacher style prediction model to be trained based on the multiple groups of low-dimensional feature data; and the teacher style prediction model is trained based on the teacher style annotation data and the teacher style prediction data of the teaching content sample. By grouping the high-dimensional feature data of the teaching content sample into multiple groups of low-dimensional feature data, the dimension of the input features of the teacher style prediction model to be trained is greatly reduced, so the teacher style prediction performance of the trained teacher style prediction model can be effectively improved.
It should be noted that, according to implementation requirements, each component/step described in the embodiment of the present invention may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present invention.
The above-described methods according to the embodiments of the present invention may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk or a magneto-optical disk, or as computer code that is downloaded over a network, originally stored in a remote recording medium or a non-transitory machine-readable medium, and then stored in a local recording medium, so that the methods described herein can be processed by such software stored on a recording medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or FPGA. It will be understood that the computer, processor, microprocessor controller or programmable hardware includes memory components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code which, when accessed and executed by the computer, processor or hardware, implements the teacher style prediction model training method described herein. Further, when a general-purpose computer accesses code for implementing the teacher style prediction model training method shown herein, execution of the code transforms the general-purpose computer into a special-purpose computer for performing that method.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
The above embodiments are provided only to illustrate the embodiments of the present invention, not to limit them. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present invention, so all equivalent technical solutions also fall within the scope of the embodiments of the present invention, whose scope of patent protection shall be defined by the claims.

Claims (9)

1. A method of training a teacher-style prediction model, the method comprising:
determining multiple groups of low-dimensional feature data of a teaching content sample based on high-dimensional feature data of the teaching content sample, wherein the determining comprises: dividing the high-dimensional feature data to obtain the multiple groups of low-dimensional feature data of the teaching content sample;
obtaining teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through a teacher style prediction model to be trained, wherein the teacher style prediction model comprises a plurality of low-level models and a high-level model connected to the output ends of the plurality of low-level models;
wherein the obtaining of the teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through the teacher style prediction model to be trained comprises: inputting each group of the multiple groups of low-dimensional feature data into a corresponding low-level model to obtain multiple teacher style preliminary prediction data corresponding to the teaching content sample; and obtaining final teacher style prediction data corresponding to the teaching content sample based on the multiple teacher style preliminary prediction data through the high-level model;
and training the teacher style prediction model based on the teacher style annotation data and the teacher style prediction data of the teaching content sample.
2. The method of claim 1, wherein each of the plurality of low-level models comprises a hidden layer and a prediction layer connected to an output of the hidden layer,
and wherein obtaining, through the plurality of low-level models and based on the multiple groups of low-dimensional feature data, the multiple teacher style preliminary prediction data corresponding to the teaching content sample comprises:
performing a feature extraction operation on each of the multiple groups of low-dimensional feature data through the corresponding hidden layer to obtain feature characterization data corresponding to the multiple groups of low-dimensional feature data;
and mapping the feature characterization data corresponding to the multiple groups of low-dimensional feature data through the corresponding prediction layer to obtain the multiple teacher style preliminary prediction data corresponding to the teaching content sample.
3. The method of claim 1, wherein obtaining, through the high-level model, the final teacher style prediction data corresponding to the teaching content sample based on the plurality of teacher style preliminary prediction data comprises:
generating high-level feature characterization data corresponding to the high-level model based on the plurality of teacher style preliminary prediction data;
and obtaining the final teacher style prediction data corresponding to the teaching content sample through the high-level model based on the high-level feature characterization data.
4. The method of claim 3, wherein generating high-level feature characterization data corresponding to the high-level model based on the plurality of teacher-style preliminary prediction data comprises:
and generating the high-level feature characterization data based on the plurality of teacher style preliminary prediction data and the feature characterization data respectively corresponding to the plurality of groups of low-dimensional feature data.
5. The method of claim 3, wherein obtaining the final teacher style prediction data corresponding to the teaching content sample based on the high-level feature characterization data through the high-level model comprises:
performing a feature extraction operation on the high-level feature characterization data through a hidden layer in the high-level model to obtain feature characterization data corresponding to the high-level feature characterization data;
and mapping the feature characterization data corresponding to the high-level feature characterization data through a prediction layer in the high-level model to obtain the final teacher style prediction data corresponding to the teaching content sample.
6. The method of claim 1, wherein training the teacher-style prediction model based on the teacher-style annotation data and the teacher-style prediction data of the teaching content sample comprises:
determining a difference value between the teacher style annotation data and the final teacher style prediction data through a target loss function;
and adjusting parameters of the plurality of low-level models and the high-level model in the teacher style prediction model based on the difference value.
7. The method according to any one of claims 1-6, further comprising:
determining multiple groups of low-dimensional feature data of the teaching content data based on the high-dimensional feature data of the teaching content data;
and obtaining teacher style prediction data corresponding to the teaching content data based on the multiple groups of low-dimensional feature data of the teaching content data through the trained teacher style prediction model.
8. The method of claim 7, further comprising:
and mapping the teacher style prediction data to obtain a teacher style category corresponding to the teaching content data.
9. A computer-readable medium, characterized in that the computer-readable medium stores a readable program, the readable program comprising:
the instruction for determining multiple groups of low-dimensional feature data of the teaching content sample based on the high-dimensional feature data of the teaching content sample comprises an instruction for dividing the high-dimensional feature data to obtain multiple groups of low-dimensional feature data of the teaching content sample;
instructions for obtaining teacher-style prediction data corresponding to the teaching content samples based on the plurality of groups of low-dimensional feature data through a teacher-style prediction model to be trained, wherein the teacher-style prediction model comprises a plurality of low-level models and a high-level model connected with output ends of the plurality of low-level models;
the instruction for obtaining the teacher style prediction data corresponding to the teaching content sample based on the plurality of groups of low-dimensional feature data through the teacher style prediction model to be trained specifically includes: inputting each group of low-dimensional feature data in the multiple groups of low-dimensional feature data into a corresponding low-level model respectively to obtain multiple teacher style preliminary prediction data corresponding to the teaching content samples; obtaining final teacher style prediction data corresponding to the teaching content sample based on the plurality of teacher style preliminary prediction data through the high-level model; instructions for training the teacher-style prediction model based on the teacher-style annotation data and the teacher-style prediction data of the teaching content samples.
CN201910330162.4A 2019-04-23 2019-04-23 Teacher style prediction model training method and computer storage medium Active CN111832787B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910330162.4A CN111832787B (en) 2019-04-23 2019-04-23 Teacher style prediction model training method and computer storage medium
PCT/CN2020/086363 WO2020216286A1 (en) 2019-04-23 2020-04-23 Method for training teaching style prediction model, and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910330162.4A CN111832787B (en) 2019-04-23 2019-04-23 Teacher style prediction model training method and computer storage medium

Publications (2)

Publication Number Publication Date
CN111832787A CN111832787A (en) 2020-10-27
CN111832787B true CN111832787B (en) 2022-12-09

Family

ID=72911994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910330162.4A Active CN111832787B (en) 2019-04-23 2019-04-23 Teacher style prediction model training method and computer storage medium

Country Status (2)

Country Link
CN (1) CN111832787B (en)
WO (1) WO2020216286A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113408571B (en) * 2021-05-08 2022-07-19 浙江智慧视频安防创新中心有限公司 Image classification method and device based on model distillation, storage medium and terminal
CN114780785B (en) * 2022-06-23 2022-09-13 新缪斯(深圳)音乐科技产业发展有限公司 Music teaching recommendation method and system based on knowledge graph

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104392174A (en) * 2014-10-23 2015-03-04 腾讯科技(深圳)有限公司 Generation method and device for characteristic vectors of dynamic behaviors of application program
CN104933588A (en) * 2015-07-01 2015-09-23 北京京东尚科信息技术有限公司 Data annotation platform for expanding merchandise varieties and data annotation method
CN107423442A (en) * 2017-08-07 2017-12-01 火烈鸟网络(广州)股份有限公司 Method and system, storage medium and computer equipment are recommended in application based on user's portrait behavioural analysis
CN107577943A (en) * 2017-09-08 2018-01-12 北京奇虎科技有限公司 Sample predictions method, apparatus and server based on machine learning
CN108021947A (en) * 2017-12-25 2018-05-11 北京航空航天大学 A kind of layering extreme learning machine target identification method of view-based access control model
CN108897834A (en) * 2018-06-22 2018-11-27 招商信诺人寿保险有限公司 Data processing and method for digging

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7684390B2 (en) * 2004-12-30 2010-03-23 Intel Corporation Integrated circuit capable of transmitting probe packets across a stack of switches
CN106250403A (en) * 2016-07-19 2016-12-21 北京奇艺世纪科技有限公司 Customer loss Forecasting Methodology and device
CN107045673B (en) * 2017-03-31 2020-09-29 杭州电子科技大学 Public bicycle flow variation prediction method based on stack model fusion
KR102570278B1 (en) * 2017-07-31 2023-08-24 삼성전자주식회사 Apparatus and method for generating training data used to training student model from teacher model
CN108073888A (en) * 2017-08-07 2018-05-25 中国科学院深圳先进技术研究院 A kind of teaching auxiliary and the teaching auxiliary system using this method
CN107992887B (en) * 2017-11-28 2021-02-19 东软集团股份有限公司 Classifier generation method, classification device, electronic equipment and storage medium
CN108090857B (en) * 2017-12-29 2021-06-22 复旦大学 Multi-mode student classroom behavior analysis system and method

Also Published As

Publication number Publication date
WO2020216286A1 (en) 2020-10-29
CN111832787A (en) 2020-10-27


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant