CN111832787A - Teacher style prediction model training method and computer storage medium - Google Patents

Teacher style prediction model training method and computer storage medium

Info

Publication number
CN111832787A
Authority
CN
China
Prior art keywords
data
teacher
style
prediction
teaching content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910330162.4A
Other languages
Chinese (zh)
Other versions
CN111832787B (en)
Inventor
杨嵩
黄健
杨非
刘子韬
黄琰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xintang Sichuang Educational Technology Co Ltd
Original Assignee
Beijing Xintang Sichuang Educational Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xintang Sichuang Educational Technology Co Ltd filed Critical Beijing Xintang Sichuang Educational Technology Co Ltd
Priority to CN201910330162.4A priority Critical patent/CN111832787B/en
Priority to PCT/CN2020/086363 priority patent/WO2020216286A1/en
Publication of CN111832787A publication Critical patent/CN111832787A/en
Application granted granted Critical
Publication of CN111832787B publication Critical patent/CN111832787B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G06Q50/205 Education administration or guidance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation

Abstract

An embodiment of the invention provides a training method for a teacher style prediction model and a computer storage medium. The method comprises: determining multiple groups of low-dimensional feature data of a teaching content sample based on high-dimensional feature data of the teaching content sample; obtaining, through a teacher style prediction model to be trained, teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data; and training the teacher style prediction model based on teacher style annotation data and the teacher style prediction data of the teaching content sample. By grouping the high-dimensional feature data of the teaching content sample into multiple groups of low-dimensional feature data, the embodiment greatly reduces the dimensionality of the input features of the teacher style prediction model to be trained, so the teacher style prediction performance of the trained model can be effectively improved.

Description

Teacher style prediction model training method and computer storage medium
Technical Field
The embodiment of the invention relates to the field of artificial intelligence, in particular to a training method of a teacher style prediction model and a computer storage medium.
Background
A teacher's teaching style reflects the individual value of the teacher and is an important element of educational evaluation. Predicting a teacher's teaching style allows school teaching administrators and teachers themselves to understand teaching conditions, identify problems, summarize experience, and improve their work, with the ultimate goal of improving teaching quality. How to predict teacher style fairly, impartially, and accurately has therefore long been an open question in the education community.
At present, a teacher's teaching style is predicted mainly by modeling, where the input data of the model may include teaching data such as the teacher's teaching audio and video. Because it is difficult to obtain teaching data samples covering different teacher styles, the amount of such sample data is often small. In addition, the dimensionality of the features extracted from the teaching data samples is often high, so overfitting easily occurs during model training and a well-performing model cannot be trained. To deal with the small amount of teaching sample data and the high dimensionality of the extracted features, most existing approaches use principal component analysis to reduce the dimensionality of the high-dimensional features extracted from the teaching data samples and then train the model with the reduced features. However, this approach inevitably loses some characteristics of the original features, cannot make full use of the information in the extracted original features, and makes the meaning of the reduced features hard to interpret. Consequently, no model training method so far has been able to effectively improve teacher style prediction performance.
Disclosure of Invention
In view of the above, an objective of the present invention is to provide a method for training a teacher-style prediction model and a computer storage medium, so as to solve at least one of the above problems.
The embodiment of the invention provides a method for training a teacher style prediction model. The method comprises the following steps: determining multiple groups of low-dimensional feature data of a teaching content sample based on high-dimensional feature data of the teaching content sample; obtaining teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through a teacher style prediction model to be trained; and training the teacher style prediction model based on the teacher style annotation data and the teacher style prediction data of the teaching content sample.
An embodiment of the present invention further provides a computer-readable medium, where a readable program is stored in the computer-readable medium, and the readable program includes: instructions for determining sets of low-dimensional feature data for a teaching content sample based on high-dimensional feature data for the teaching content sample; instructions for obtaining teacher style prediction data corresponding to the teaching content samples based on the plurality of groups of low-dimensional feature data through a teacher style prediction model to be trained; instructions for training the teacher-style prediction model based on the teacher-style annotation data and the teacher-style prediction data of the teaching content samples.
According to the training scheme for the teacher style prediction model provided by the embodiment of the invention, multiple groups of low-dimensional feature data of a teaching content sample are determined based on the high-dimensional feature data of the teaching content sample; teacher style prediction data corresponding to the teaching content sample are obtained, based on the multiple groups of low-dimensional feature data, through the teacher style prediction model to be trained; and the teacher style prediction model is then trained based on the teacher style annotation data and the teacher style prediction data of the teaching content sample. Because the high-dimensional feature data are grouped into multiple groups of low-dimensional feature data, the dimensionality of the input features of the teacher style prediction model to be trained is greatly reduced, and the teacher style prediction performance of the trained model can be effectively improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below cover only some of the embodiments of the present invention; a person skilled in the art can obtain other drawings based on these drawings.
FIG. 1 is a flow chart illustrating the steps of a method for training a teacher-style predictive model according to a first embodiment of the invention;
FIG. 2 is a schematic diagram illustrating a structure of a teacher-style prediction model according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating steps of a teacher style prediction method according to a second embodiment of the present invention.
Detailed Description
To help those skilled in the art better understand the technical solutions in the embodiments of the present invention, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention shall fall within the protection scope of the embodiments of the present invention.
The following further describes specific implementation of the embodiments of the present invention with reference to the drawings.
Example one
Referring to fig. 1, a flowchart illustrating steps of a method for training a teacher-style prediction model according to a first embodiment of the present invention is shown.
Specifically, the teacher style prediction model training method provided by the embodiment of the invention comprises the following steps:
in step S101, a plurality of sets of low-dimensional feature data of a teaching content sample are determined based on high-dimensional feature data of the teaching content sample.
In this embodiment, the teaching content sample may include audio data or video data of teaching content serving as a training sample. High-dimensional feature data can be understood as feature vectors of relatively high dimension, such as 1000-dimensional or 2000-dimensional feature vectors. When the teaching content sample is audio data of teaching content serving as a training sample, the high-dimensional feature data of the teaching content sample may be high-dimensional speech acoustic feature data extracted from the audio data; the speech acoustic feature data may include prosodic feature data, spectral feature data, voice quality feature data and the like of the audio, and is specifically a speech acoustic feature vector. In a specific embodiment, existing speech acoustic feature extraction algorithms can be used to extract the high-dimensional speech acoustic feature data from the audio data. When the teaching content sample is video data of teaching content serving as a training sample, the high-dimensional feature data of the teaching content sample may be high-dimensional facial feature data extracted from the video data; the facial feature data may include feature data of the mouth region, the eye region, the cheek region and the like, and is specifically a facial feature vector. In a specific embodiment, existing facial feature extraction algorithms can be used to extract the high-dimensional facial feature data from the video data.
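For illustration only, the following minimal sketch shows how such a high-dimensional speech acoustic feature vector might be extracted from a teaching-audio sample. The embodiment only requires an existing speech acoustic feature extraction algorithm; the use of librosa and the particular features below (MFCCs, pitch, zero-crossing rate) are assumptions of this sketch, not part of the described method.

```python
# Hypothetical extraction of a high-dimensional speech acoustic feature vector
# from a teaching-audio sample; librosa and the chosen features are assumptions.
import numpy as np
import librosa

def extract_acoustic_features(wav_path: str) -> np.ndarray:
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40)   # spectral features
    f0 = librosa.yin(y, fmin=50, fmax=500, sr=sr)        # prosodic feature (pitch)
    zcr = librosa.feature.zero_crossing_rate(y)          # voice-quality-related feature
    # Aggregate frame-level features into one fixed-length, high-dimensional vector
    parts = [mfcc.mean(axis=1), mfcc.std(axis=1),
             np.array([f0.mean(), f0.std()]),
             np.array([zcr.mean(), zcr.std()])]
    return np.concatenate(parts)
```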
In this embodiment, when the multiple groups of low-dimensional feature data of the teaching content sample are determined based on the high-dimensional feature data of the teaching content sample, correlation analysis is performed on the high-dimensional feature data to determine a grouping of the high-dimensional feature data, and the high-dimensional feature data are then divided based on this grouping to obtain the multiple groups of low-dimensional feature data of the teaching content sample. The dimensionality of the input features of the teacher style prediction model is thereby greatly reduced.
Specifically, when the high-dimensional feature data are high-dimensional speech acoustic feature data, the speech acoustic feature data include prosodic features, spectral features and voice quality features. Correlation analysis can therefore be performed on the high-dimensional speech acoustic feature data based on the prior knowledge that speech acoustic features comprise prosodic, spectral and voice quality features, so as to determine the grouping of the high-dimensional speech acoustic feature data. The high-dimensional speech acoustic feature data are then divided according to this grouping to obtain prosodic feature data, spectral feature data and voice quality feature data of the teaching content sample. In short, the kinds of features contained in the speech acoustic features are determined from prior acoustic knowledge, and correlation analysis is then performed on the high-dimensional speech acoustic feature data according to those kinds. When the high-dimensional feature data are specifically high-dimensional facial feature data, prior knowledge of the human face indicates that a face comprises a mouth region, an eye region, a nose region and a cheek region, so correlation analysis can be performed on the high-dimensional facial feature data based on this prior knowledge to determine the grouping of the high-dimensional facial feature data. The high-dimensional facial feature data are then divided according to this grouping to obtain mouth region feature data, eye region feature data, nose region feature data and cheek region feature data of the teaching content sample. In short, the different regions of the face are determined from prior knowledge of the face, and correlation analysis is then performed on the high-dimensional facial feature data according to those regions.
In this embodiment, when prior knowledge is not available, the multiple groups of low-dimensional feature data of the teaching content sample can be determined by dividing the high-dimensional feature data of the teaching content sample into groups of equal dimension. For example, 1000-dimensional high-dimensional feature data may be equally divided into 10 groups of low-dimensional feature data of 100 dimensions each. In a specific embodiment, the number of groups into which a given number of dimensions is divided can be set through experiments. The dimensionality of the input features of the teacher style prediction model is thereby greatly reduced.
In particular, let the high-dimensional feature data of the n-th teaching content sample (n = 1, 2, ..., N) input into the system be $v_n$, with dimension $D$. Using certain prior knowledge, correlation analysis is performed on the high-dimensional feature data $v_n$, and $v_n$ is divided into $K$ groups (k = 1, 2, ..., K), where the dimension of the k-th group is $D_k$ and satisfies

$$\sum_{k=1}^{K} D_k = D.$$

For the n-th sample, denote the original high-dimensional feature data by $v_n$ and the k-th group of low-dimensional feature data after division by $v_n^k$; then

$$v_n = \mathrm{concat}\left(v_n^1, v_n^2, \ldots, v_n^K\right),$$

where $\mathrm{concat}(\cdot)$ denotes sequential concatenation of feature data. If correlation analysis of the features cannot be performed for lack of prior knowledge, the features can simply be divided equally into $K$ parts, and the relationships above still hold.
In step S102, teacher style prediction data corresponding to the teaching content sample is obtained based on the plurality of sets of low-dimensional feature data through a teacher style prediction model to be trained.
In this embodiment, the teacher-style prediction model may be any suitable neural network model capable of feature extraction or target object detection, including but not limited to convolutional neural networks, reinforcement learning neural networks, the generative network in a generative adversarial network, deep neural networks, and so on. The specific configuration of the neural network, such as the number of convolutional layers, the convolution kernel size and the number of channels, can be set by those skilled in the art according to actual requirements. The teacher-style prediction data may be a predicted teacher-style category, a predicted teacher-style numerical value, or the like.
In this embodiment, the teacher-style prediction model includes a plurality of low-level models and a high-level model connected to the outputs of the plurality of low-level models, and both the low-level models and the high-level model are deep neural network models. When teacher-style prediction data corresponding to the teaching content sample are obtained through the teacher-style prediction model to be trained based on the multiple groups of low-dimensional feature data, a plurality of teacher-style preliminary prediction data corresponding to the teaching content sample are first obtained through the plurality of low-level models based on the multiple groups of low-dimensional feature data, and final teacher-style prediction data corresponding to the teaching content sample are then obtained through the high-level model based on the plurality of teacher-style preliminary prediction data. In this way, the teaching style of the teaching content sample is preliminarily predicted by the low-level models, and the final prediction is made by the high-level model based on those preliminary results, which improves the accuracy with which the teacher-style prediction model predicts the teacher style corresponding to the teaching content sample.
In this embodiment, each of the plurality of low-level models includes a hidden layer and a prediction layer connected to the output of the hidden layer, where the hidden layer is a fully connected layer or a convolutional layer and the prediction layer is a fully connected layer. When the plurality of teacher-style preliminary prediction data corresponding to the teaching content sample are obtained through the plurality of low-level models based on the multiple groups of low-dimensional feature data, feature extraction is first performed on the groups of low-dimensional feature data through the hidden layers to obtain feature characterization data corresponding to each group of low-dimensional feature data, and the feature characterization data are then mapped through the prediction layers to obtain the plurality of teacher-style preliminary prediction data corresponding to the teaching content sample. The feature characterization data are specifically feature characterization vectors. Performing feature extraction on the groups of low-dimensional feature data through the hidden layers re-encodes each group of features, which improves the robustness of the corresponding feature characterization data and thus the accuracy of the low-level models' preliminary prediction of the teacher style corresponding to the teaching content sample.
In this embodiment, when the final teacher-style prediction data corresponding to the teaching content sample are obtained through the high-level model based on the plurality of teacher-style preliminary prediction data, high-level feature characterization data corresponding to the high-level model are first generated based on the plurality of teacher-style preliminary prediction data, and the final teacher-style prediction data corresponding to the teaching content sample are then obtained through the high-level model based on the high-level feature characterization data. The high-level feature characterization data are specifically a high-level feature characterization vector. Generating the high-level feature characterization data from the preliminary predictions and then predicting from them improves the accuracy of the high-level model's final prediction of the teacher style corresponding to the teaching content sample.
In this embodiment, when the high-level feature characterization data corresponding to the high-level model are generated based on the plurality of teacher-style preliminary prediction data, they are generated from the plurality of teacher-style preliminary prediction data together with the feature characterization data corresponding to each group of low-dimensional feature data. Building the high-level feature characterization data from both sources improves its robustness and thus the accuracy of the high-level model's final prediction of the teacher style corresponding to the teaching content sample.
In this embodiment, when the final teacher-style prediction data corresponding to the teaching content sample are obtained through the high-level model based on the high-level feature characterization data, feature extraction is performed on the high-level feature characterization data through the hidden layers in the high-level model to obtain feature characterization data corresponding to the high-level feature characterization data, and these are then mapped through the prediction layer in the high-level model to obtain the final teacher-style prediction data corresponding to the teaching content sample. The hidden layer is a fully connected layer or a convolutional layer, the prediction layer is a fully connected layer, and the feature characterization data are feature characterization vectors. Feature extraction through the hidden layers re-encodes the high-level feature characterization data, improving the robustness of the corresponding feature characterization data and thus the accuracy of the high-level model's final prediction of the teacher style corresponding to the teaching content sample.
Specifically, as shown in fig. 2, the teacher-style prediction model includes a plurality of low-level models and a high-level model connected to the outputs of the plurality of low-level models. After the high-dimensional feature data are divided, a plurality of feature groups are obtained, and each feature group is input into the corresponding low-level model. Each low-level model performs a preliminary prediction of the teaching style of the teaching content sample based on its feature group to obtain preliminary teacher-style prediction data corresponding to the teaching content sample. A low-level model comprises a plurality of sequentially connected hidden layers and a prediction layer connected to the output of the last hidden layer. High-level feature characterization data corresponding to the high-level model are generated from the preliminary teacher-style prediction data output by the low-level models and the feature characterization data of the feature groups output by the last hidden layer of each low-level model. Finally, the high-level model makes the final prediction of the teaching style of the teaching content sample based on the high-level feature characterization data to obtain the final teacher-style prediction data corresponding to the teaching content sample.
Specifically, the high-dimensional feature data is divided into K groups of low-dimensional feature data, each group corresponding to one low-level model, so the k-th low-level model corresponds to the k-th group of low-dimensional feature data $v^k_n$. For the k-th group of low-dimensional feature data $v^k_n$ of the n-th sample, let the number of hidden layers of the low-level model be $L_k$ ($l_k = 1, 2, \ldots, L_k$), and let the hidden node dimension of the $l_k$-th hidden layer be $D^k_{l_k}$.

When $l_k = 1$:

$$y^k_{1n} = W^k_1 v^k_n + b^k_1, \qquad g^k_{1n} = f\left(y^k_{1n}\right),$$

where $W^k_1$ is the weight matrix of the first hidden layer of the k-th low-level model, with dimension $D^k_1 \times D_k$; $b^k_1$ is the bias vector of the first hidden layer of the k-th low-level model, with dimension $D^k_1 \times 1$; $f(\cdot)$ is a non-linear function, typically a sigmoid function; and $g^k_{1n}$ is the first hidden vector representation of the k-th low-level model for the n-th sample, with dimension $D^k_1 \times 1$.

When $1 < l_k < L_k$:

$$y^k_{l_k n} = W^k_{l_k}\, g^k_{(l_k-1)n} + b^k_{l_k}, \qquad g^k_{l_k n} = f\left(y^k_{l_k n}\right),$$

where $W^k_{l_k}$ is the weight matrix of the $l_k$-th hidden layer of the k-th low-level model, with dimension $D^k_{l_k} \times D^k_{l_k-1}$; $b^k_{l_k}$ is the bias vector of the $l_k$-th hidden layer of the k-th low-level model, with dimension $D^k_{l_k} \times 1$; and $g^k_{l_k n}$ is the $l_k$-th hidden vector representation of the k-th low-level model for the n-th sample, with dimension $D^k_{l_k} \times 1$.

When $l_k = L_k$:

$$y^k_{L_k n} = W^k_{L_k}\, g^k_{(L_k-1)n} + b^k_{L_k}, \qquad g^k_{L_k n} = f\left(y^k_{L_k n}\right),$$

where $W^k_{L_k}$ is the weight matrix of the $L_k$-th hidden layer of the k-th low-level model, with dimension $D^k_{L_k} \times D^k_{L_k-1}$; $b^k_{L_k}$ is the bias vector of the $L_k$-th hidden layer of the k-th low-level model, with dimension $D^k_{L_k} \times 1$; and $g^k_{L_k n}$ is the $L_k$-th hidden vector representation of the k-th low-level model for the n-th sample, with dimension $D^k_{L_k} \times 1$.

The output $g^k_{L_k n}$ of the hidden layers of the k-th low-level model serves as the input of the prediction layer of the k-th low-level model:

$$s^k_n = W^k g^k_{L_k n} + b^k,$$

where $W^k$ is the weight matrix of the prediction layer of the k-th low-level model, with dimension $1 \times D^k_{L_k}$; $b^k$ is the bias vector of the prediction layer of the k-th low-level model, with dimension 1; and $s^k_n$ is the preliminary teacher-style prediction data of the k-th low-level model for the n-th sample, with dimension 1, taking a real value between 0 and 1.
The hidden vector representation of the last hidden layer of each low-level model is combined with its preliminary teacher-style prediction data to obtain the high-level feature characterization data:

$$h_n = \mathrm{concat}\left(g^1_{L_1 n}, s^1_n, g^2_{L_2 n}, s^2_n, \ldots, g^K_{L_K n}, s^K_n\right),$$

where $h_n$ has dimension $\sum_{k=1}^{K}\left(D^k_{L_k} + 1\right)$.
Combining the preliminary teacher-style prediction data of each low-level model with the hidden vector representation of its last hidden layer provides more information, so the high-level model can predict more accurately.
The high-level feature characterization data serve as the input of the high-level model for the final prediction. The high-level model comprises a plurality of sequentially connected hidden layers and a prediction layer connected to the output of the last of these hidden layers. Let the number of hidden layers of the high-level model be $L$, and let the hidden node dimension of the l-th hidden layer be $D_l$.

When $l = 1$:

$$y_{1n} = W_1 h_n + b_1, \qquad g_{1n} = f\left(y_{1n}\right),$$

where $W_1$ is the weight matrix of the first hidden layer of the high-level model, with dimension $D_1 \times \sum_{k=1}^{K}\left(D^k_{L_k}+1\right)$; $b_1$ is the bias vector of the first hidden layer of the high-level model, with dimension $D_1$; $f(\cdot)$ is a non-linear function, typically a sigmoid function; and $g_{1n}$ is the first hidden vector representation of the high-level model for the n-th sample, with dimension $D_1 \times 1$.

When $1 < l < L$:

$$y_{ln} = W_l\, g_{(l-1)n} + b_l, \qquad g_{ln} = f\left(y_{ln}\right),$$

where $W_l$ is the weight matrix of the l-th hidden layer of the high-level model, with dimension $D_l \times D_{l-1}$; $b_l$ is the bias vector of the l-th hidden layer of the high-level model, with dimension $D_l$; and $g_{ln}$ is the l-th hidden vector representation of the high-level model for the n-th sample, with dimension $D_l \times 1$.

When $l = L$:

$$y_{Ln} = W_L\, g_{(L-1)n} + b_L, \qquad h_{Ln} = f\left(y_{Ln}\right),$$

where $W_L$ is the weight matrix of the L-th hidden layer of the high-level model, with dimension $D_L \times D_{L-1}$; $b_L$ is the bias vector of the L-th hidden layer of the high-level model, with dimension $D_L$; and $h_{Ln}$ is the L-th hidden vector representation of the high-level model for the n-th sample, with dimension $D_L \times 1$.

The output $h_{Ln}$ of the hidden layers of the high-level model serves as the input of the prediction layer of the high-level model:

$$s_n = W h_{Ln} + b,$$

where $W$ is the weight matrix of the prediction layer of the high-level model, with dimension $1 \times D_L$; $b$ is the bias vector of the prediction layer of the high-level model, with dimension 1; and $s_n$ is the final teacher-style prediction data of the high-level model for the n-th sample, with dimension 1, taking a real value between 0 and 1.
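Continuing the sketch, the high-level model and the full teacher-style prediction model can be assembled as follows, reusing the LowLevelModel sketched earlier; the concatenation of each $g^k_{L_k n}$ with $s^k_n$ follows the formulas above, while the framework and concrete dimensions remain assumptions.

```python
# Illustrative high-level model and full model chaining K low-level models to it.
import torch
import torch.nn as nn

class HighLevelModel(nn.Module):
    def __init__(self, input_dim: int, hidden_dims: list[int]):
        super().__init__()
        layers, prev = [], input_dim
        for d in hidden_dims:                      # hidden layers l = 1..L
            layers += [nn.Linear(prev, d), nn.Sigmoid()]
            prev = d
        self.hidden = nn.Sequential(*layers)
        self.predict = nn.Linear(prev, 1)          # s_n = W h_Ln + b

    def forward(self, h_n):
        return self.predict(self.hidden(h_n))

class TeacherStyleModel(nn.Module):
    def __init__(self, group_dims, low_hidden, high_hidden):
        super().__init__()
        self.lows = nn.ModuleList(LowLevelModel(d, low_hidden) for d in group_dims)
        high_in = len(group_dims) * (low_hidden[-1] + 1)   # sum_k (D^k_{L_k} + 1)
        self.high = HighLevelModel(high_in, high_hidden)

    def forward(self, groups):                     # groups: list of K tensors v^k_n
        gs, ss = zip(*(low(v) for low, v in zip(self.lows, groups)))
        h_n = torch.cat([t for pair in zip(gs, ss) for t in pair], dim=-1)
        s_n = self.high(h_n)                       # final teacher-style prediction
        return s_n, list(ss)                       # also return preliminary predictions
```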
As can be seen from the above description, in a specific embodiment the low-level models and the high-level model have a similar structure. Both are used because the teaching style of the teaching content sample is first predicted preliminarily by the low-level models and then predicted finally by the high-level model based on those preliminary results, which improves the accuracy with which the teacher-style prediction model predicts the teacher style corresponding to the teaching content sample. In addition, because the amount of teaching content sample data is small and the dimensionality of its feature data is very high, directly modeling with a single model (for example, only one bottom-level model) would lead to the curse of dimensionality: the trained model would fit only the training data, would not perform well on test data, and would suffer from overfitting.
In step S103, the teacher-style prediction model is trained based on the teacher-style labeling data and the teacher-style prediction data of the teaching content sample.
In this embodiment, the teacher-style annotation data may be understood as teacher-style real data of the teaching content sample.
In this embodiment, when the teacher-style prediction model is trained based on the teacher-style annotation data and the teacher-style prediction data of the teaching content sample, a difference value between the teacher-style annotation data and the teacher-style prediction data is determined through a target loss function, and the parameters of the teacher-style prediction model are adjusted based on the difference value.
In this embodiment, when the difference value between the teacher-style labeling data and the teacher-style prediction data is determined by the target loss function, the difference value between the teacher-style labeling data and the teacher-style final prediction data is determined by the target loss function. When parameters of the teacher-style prediction model are adjusted based on the difference values, parameters of the plurality of lower-level models and the higher-level model in the teacher-style prediction model are adjusted based on the difference values.
In this embodiment, the objective loss function includes a mean square error term and an L2 regularization term. Thereby, the training process of the teacher-style prediction model can be prevented from being influenced by overfitting.
Specifically, given the high-dimensional feature data $v_n$ of the n-th sample, the final teacher-style prediction data $s_n$ is obtained from the prediction layer of the high-level model through the computations of the low-level models and the high-level model. Let the teacher-style real data (annotation data) of the n-th sample be $s_n'$. The teacher-style prediction model is trained so that $s_n$ and $s_n'$ become as close as possible. During training, the following function is selected as the loss function of the teacher-style prediction model:

$$\mathrm{Loss} = \frac{1}{N}\sum_{n=1}^{N}\left(s_n - s_n'\right)^2 + \frac{1}{N}\sum_{n=1}^{N}\sum_{k=1}^{K}\left(s^k_n - s_n'\right)^2 + \lambda\left(\sum_{k=1}^{K}\sum_{l_k=1}^{L_k}\left\lVert W^k_{l_k}\right\rVert^2 + \sum_{k=1}^{K}\left\lVert W^k\right\rVert^2 + \sum_{l=1}^{L}\left\lVert W_l\right\rVert^2 + \lVert W\rVert^2\right),$$

where $s_n'$ is the teacher-style real data of the n-th sample, $s^k_n$ is the preliminary teacher-style prediction data of the k-th low-level model for the n-th sample, $s_n$ is the final teacher-style prediction data of the high-level model for the n-th sample, $W^k_{l_k}$ are the weight matrices of the hidden layers of the low-level models, $W^k$ are the weight matrices of the prediction layers of the low-level models, $W_l$ are the weight matrices of the hidden layers of the high-level model, $W$ is the weight matrix of the prediction layer of the high-level model, and $\lambda$ is a weight decay coefficient taking a value between 0 and 1. The first and second terms of the above equation compute the mean square error, and the last four terms add L2 regularization to prevent overfitting of the teacher-style prediction model.

The teacher-style prediction model combines the low-level models and the high-level model and optimizes them jointly through the target loss function. The whole model is trained by minimizing the target loss function, i.e., the parameters of the teacher-style prediction model

$$\left\{W^k_{l_k},\, b^k_{l_k},\, W^k,\, b^k,\, W_l,\, b_l,\, W,\, b\right\}$$

are obtained through training as the minimizers of $\mathrm{Loss}$.
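A sketch of this target loss function, with the two mean-square-error terms and the L2 regularization of all weight matrices scaled by $\lambda$, is given below; the batch averaging and the PyTorch-style implementation are assumptions of the sketch.

```python
# Illustrative loss: MSE of final and preliminary predictions against the label,
# plus L2 regularization over all weight matrices (lambda is the decay weight).
import torch

def teacher_style_loss(s_n, s_k_list, s_true, model, lam=0.01):
    mse = torch.mean((s_n - s_true) ** 2)                            # final prediction term
    mse += sum(torch.mean((s_k - s_true) ** 2) for s_k in s_k_list)  # low-level terms
    l2 = sum((p ** 2).sum() for name, p in model.named_parameters()
             if "weight" in name)                                    # W^k_{l_k}, W^k, W_l, W
    return mse + lam * l2
```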
Specifically, the currently obtained final teacher-style prediction data are evaluated by determining the difference value between the teacher-style annotation data and the final teacher-style prediction data, which serves as the basis for subsequently training the teacher-style prediction model. The difference value can be propagated back into the teacher-style prediction model, so that the model is trained iteratively. The teacher-style prediction model is trained through an iterative process; only a single training pass is described in this embodiment, but it should be understood by those skilled in the art that this training manner can be used for every training pass until the training of the teacher-style prediction model is completed.
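An illustrative training loop for this iterative process, reusing the model and loss sketches above, could look as follows; the optimizer choice and hyper-parameters are assumptions not specified in the embodiment.

```python
# Hypothetical iterative training: compute the difference between annotation
# and prediction via the loss, back-propagate it, and adjust the parameters of
# all low-level models and the high-level model together.
import torch

def train(model, dataset, epochs=50, lr=1e-3, lam=0.01):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):                       # one pass per training iteration
        for groups, s_true in dataset:            # grouped low-dim features + label
            s_n, s_k_list = model(groups)
            loss = teacher_style_loss(s_n, s_k_list, s_true, model, lam)
            opt.zero_grad()
            loss.backward()                       # propagate the difference back
            opt.step()
    return model
```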
According to the teacher-style prediction model training method provided by this embodiment, multiple groups of low-dimensional feature data of a teaching content sample are determined based on the high-dimensional feature data of the teaching content sample; teacher-style prediction data corresponding to the teaching content sample are obtained, based on the multiple groups of low-dimensional feature data, through the teacher-style prediction model to be trained; and the teacher-style prediction model is trained based on the teacher-style annotation data and the teacher-style prediction data of the teaching content sample. Compared with other existing approaches, grouping the high-dimensional feature data of the teaching content sample into multiple groups of low-dimensional feature data greatly reduces the dimensionality of the input features of the teacher-style prediction model to be trained, so the teacher-style prediction performance of the trained model can be effectively improved.
Example two
Referring to fig. 3, a flowchart illustrating steps of a teacher style prediction method according to a second embodiment of the present invention is shown.
Specifically, the teacher style prediction method provided by the embodiment of the invention comprises the following steps:
in step S201, a plurality of sets of low-dimensional feature data of the teaching content data are determined based on the high-dimensional feature data of the teaching content data.
In this embodiment, the teaching content data may include audio data or video data of teaching content. When the teaching content data is audio data of teaching content, the high-dimensional feature data of the teaching content data may be high-dimensional speech acoustic feature data extracted from the audio data. When the teaching content data is video data of teaching content, the high-dimensional feature data of the teaching content data may be high-dimensional facial feature data extracted from the video data.
In this embodiment, the specific implementation of step S201 is similar to the specific implementation of step S101, and is not described herein again.
In step S202, teacher style prediction data corresponding to the teaching content data are obtained, based on the multiple groups of low-dimensional feature data of the teaching content data, through the teacher style prediction model trained as described in the first embodiment.
In this embodiment, when the teacher-style prediction data corresponding to the teaching content data are obtained through the teacher-style prediction model trained in the first embodiment based on the multiple groups of low-dimensional feature data, a plurality of teacher-style preliminary prediction data corresponding to the teaching content data are first obtained through the plurality of low-level models based on the multiple groups of low-dimensional feature data, and final teacher-style prediction data corresponding to the teaching content data are then obtained through the high-level model based on the plurality of teacher-style preliminary prediction data. Preliminarily predicting the teaching style with the low-level models and making the final prediction with the high-level model based on those preliminary results improves the accuracy with which the teacher-style prediction model predicts the teacher style corresponding to the teaching content data.
In this embodiment, when the plurality of teacher-style preliminary prediction data corresponding to the teaching content data are obtained through the plurality of low-level models based on the multiple groups of low-dimensional feature data, feature extraction is performed on the groups of low-dimensional feature data through the hidden layers to obtain feature characterization data corresponding to each group, and the feature characterization data are then mapped through the prediction layers to obtain the plurality of teacher-style preliminary prediction data corresponding to the teaching content data. The feature characterization data are specifically feature characterization vectors. Re-encoding each group of features through the hidden layers improves the robustness of the corresponding feature characterization data and thus the accuracy of the low-level models' preliminary prediction of the teacher style corresponding to the teaching content data.
In this embodiment, when the final teacher-style prediction data corresponding to the teaching content data are obtained through the high-level model based on the plurality of teacher-style preliminary prediction data, high-level feature characterization data corresponding to the high-level model are first generated based on the plurality of teacher-style preliminary prediction data, and the final teacher-style prediction data corresponding to the teaching content data are then obtained through the high-level model based on the high-level feature characterization data. The high-level feature characterization data are specifically a high-level feature characterization vector. This improves the accuracy of the high-level model's final prediction of the teacher style corresponding to the teaching content data.
In this embodiment, when the high-level feature characterization data corresponding to the high-level model are generated based on the plurality of teacher-style preliminary prediction data, they are generated from the plurality of teacher-style preliminary prediction data together with the feature characterization data corresponding to each group of low-dimensional feature data, which improves the robustness of the high-level feature characterization data and thus the accuracy of the high-level model's final prediction of the teacher style corresponding to the teaching content data.
In this embodiment, when the final teacher-style prediction data corresponding to the teaching content data are obtained through the high-level model based on the high-level feature characterization data, feature extraction is performed on the high-level feature characterization data through the hidden layers in the high-level model to obtain feature characterization data corresponding to the high-level feature characterization data, and these are then mapped through the prediction layer in the high-level model to obtain the final teacher-style prediction data corresponding to the teaching content data. Re-encoding the high-level feature characterization data through the hidden layers improves the robustness of the corresponding feature characterization data and thus the accuracy of the high-level model's final prediction of the teacher style corresponding to the teaching content data.
In this embodiment, the method may further include mapping the teacher-style prediction data to obtain the teacher-style category corresponding to the teaching content data.
Specifically, based on teacher-style prediction data, mapping operation is performed in a pre-constructed teacher-style semantic space to obtain a teacher-style category corresponding to the teaching content data. Wherein the teacher-style semantic space is understood to be a mapping space between teacher-style prediction data and teacher-style categories.
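Purely as an illustration of such a mapping operation, the scalar teacher-style prediction value could be binned into categories of a hypothetical teacher-style semantic space as sketched below; the category names and thresholds are invented placeholders, since the embodiment does not define the concrete mapping.

```python
# Hypothetical mapping from the scalar prediction value to a style category;
# the bins and labels are illustrative assumptions only.
STYLE_BINS = [
    (0.25, "calm / lecture-oriented"),
    (0.50, "rational / analytical"),
    (0.75, "warm / interactive"),
    (1.01, "passionate / expressive"),
]

def to_style_category(s_n: float) -> str:
    for upper, label in STYLE_BINS:
        if s_n < upper:
            return label
    return STYLE_BINS[-1][1]
```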
According to the teacher-style prediction method provided by this embodiment of the application, multiple groups of low-dimensional feature data of the teaching content data are determined based on the high-dimensional feature data of the teaching content data, and teacher-style prediction data corresponding to the teaching content data are then obtained, based on these multiple groups of low-dimensional feature data, through the teacher-style prediction model trained in the first embodiment, so that the teacher style corresponding to the teaching content data can be predicted.
EXAMPLE III
An embodiment of the present invention further provides a computer-readable medium, where a readable program is stored in the computer-readable medium, and the readable program includes: instructions for determining sets of low-dimensional feature data for a teaching content sample based on high-dimensional feature data for the teaching content sample; instructions for obtaining teacher style prediction data corresponding to the teaching content samples based on the plurality of groups of low-dimensional feature data through a teacher style prediction model to be trained; instructions for training the teacher-style prediction model based on the teacher-style annotation data and the teacher-style prediction data of the teaching content samples.
Optionally, the instruction for obtaining, by the teacher-style prediction model to be trained, teacher-style prediction data corresponding to the teaching content sample based on the plurality of sets of low-dimensional feature data includes: instructions for obtaining, by the plurality of low-level models, a plurality of teacher-style preliminary prediction data corresponding to the teaching content sample based on the plurality of sets of low-dimensional feature data; and instructions for obtaining final teacher-style prediction data corresponding to the teaching content sample based on the plurality of teacher-style preliminary prediction data through the high-level model.
Optionally, each of the plurality of low-level models includes a hidden layer and a prediction layer connected to an output end of the hidden layer, and correspondingly, the instructions for obtaining, through the plurality of low-level models and based on the plurality of sets of low-dimensional feature data, a plurality of teacher-style preliminary prediction data corresponding to the teaching content sample include: instructions for performing feature extraction operations on the sets of low-dimensional feature data through the hidden layer to obtain feature characterization data corresponding to the sets of low-dimensional feature data; and the instruction is used for mapping the characteristic representation data corresponding to the plurality of groups of low-dimensional characteristic data through the prediction layer to obtain a plurality of teacher style preliminary prediction data corresponding to the teaching content samples.
Optionally, the instructions for obtaining, by the high-level model, final teacher-style prediction data corresponding to the teaching content sample based on the preliminary teacher-style prediction data include: instructions for generating high-level feature characterization data corresponding to the high-level model based on the plurality of teacher-style preliminary prediction data; and instructions for obtaining final teacher style prediction data corresponding to the teaching content sample based on the high-level feature characterization data through the high-level model.
Optionally, the instructions for generating high-level feature characterization data corresponding to the high-level model based on the plurality of teacher-style preliminary prediction data include: and generating the high-level feature characterization data based on feature characterization data corresponding to the teacher-style preliminary prediction data and the plurality of sets of low-dimensional feature data.
Optionally, the instructions for obtaining final teacher-style prediction data corresponding to the teaching content sample based on the high-level feature characterization data through the high-level model include: instructions for performing feature extraction operations on the high-level feature characterization data through a hidden layer in the high-level model to obtain feature characterization data corresponding to the high-level feature characterization data; and the instruction is used for mapping the characteristic representation data corresponding to the high-level characteristic representation data through a prediction layer in the high-level model so as to obtain final prediction data of the teacher style corresponding to the teaching content sample.
Optionally, the instructions for training the teacher-style prediction model based on the teacher-style annotation data and the teacher-style prediction data of the teaching content sample comprise: instructions for determining a difference value between the teacher-style annotation data and the final teacher-style prediction data via a target loss function; instructions for adjusting parameters of the plurality of lower-level models and the higher-level model in the teacher-style predictive model based on the difference values.
Optionally, the readable program further comprises: instructions for determining sets of low-dimensional feature data of the instructional content data based on high-dimensional feature data of the instructional content data; and instructions for obtaining teacher style prediction data corresponding to the teaching content data based on the plurality of groups of low-dimensional feature data of the teaching content data through the trained teacher style prediction model.
Optionally, the readable program further comprises: and the instruction is used for carrying out mapping operation on the teacher style prediction data so as to obtain a teacher style category corresponding to the teaching content data.
According to the computer-readable medium provided by this embodiment of the application, multiple groups of low-dimensional feature data of a teaching content sample are determined based on the high-dimensional feature data of the teaching content sample; teacher-style prediction data corresponding to the teaching content sample are obtained, based on the multiple groups of low-dimensional feature data, through the teacher-style prediction model to be trained; and the teacher-style prediction model is trained based on the teacher-style labeling data and the teacher-style prediction data of the teaching content sample.
It should be noted that, according to the implementation requirement, each component/step described in the embodiment of the present invention may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present invention.
The above-described method according to an embodiment of the present invention may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk or a magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium, downloaded over a network, and stored in a local recording medium. The method described herein can thus be processed by software stored on a recording medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or FPGA. It will be appreciated that the computer, processor, microprocessor controller or programmable hardware includes memory components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code which, when accessed and executed by the computer, processor or hardware, implements the teacher-style prediction model training method described herein. Further, when a general-purpose computer accesses code for implementing the teacher-style prediction model training method shown herein, execution of the code transforms the general-purpose computer into a special-purpose computer for performing that method.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
The above embodiments are intended only to illustrate the embodiments of the present invention, not to limit them. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present invention, so all equivalent technical solutions also fall within the scope of the embodiments of the present invention, whose scope of patent protection shall be defined by the claims.

Claims (10)

1. A method for training a teacher style prediction model, the method comprising:
determining multiple groups of low-dimensional feature data of a teaching content sample based on high-dimensional feature data of the teaching content sample;
obtaining teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through a teacher style prediction model to be trained;
and training the teacher style prediction model based on the teacher style annotation data and the teacher style prediction data of the teaching content sample.
2. The method of claim 1, wherein the teacher style prediction model comprises a plurality of low-level models and a high-level model coupled to outputs of the plurality of low-level models,
correspondingly, the obtaining of teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through the teacher style prediction model to be trained comprises:
obtaining a plurality of teacher style preliminary prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through the plurality of low-level models;
and obtaining final teacher style prediction data corresponding to the teaching content sample based on the plurality of teacher style preliminary prediction data through the high-level model.
3. The method of claim 2, wherein each of the plurality of low-level models comprises a hidden layer and a prediction layer connected to an output of the hidden layer,
correspondingly, the obtaining of the plurality of teacher style preliminary prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through the plurality of low-level models comprises:
respectively performing a feature extraction operation on the multiple groups of low-dimensional feature data through the hidden layer to obtain feature characterization data corresponding to the multiple groups of low-dimensional feature data;
and mapping the feature characterization data corresponding to the multiple groups of low-dimensional feature data through the prediction layer to obtain the plurality of teacher style preliminary prediction data corresponding to the teaching content sample.
4. The method of claim 3, wherein the obtaining of final teacher style prediction data corresponding to the teaching content sample based on the plurality of teacher style preliminary prediction data through the high-level model comprises:
generating high-level feature characterization data corresponding to the high-level model based on the plurality of teacher style preliminary prediction data;
and obtaining the final teacher style prediction data corresponding to the teaching content sample based on the high-level feature characterization data through the high-level model.
5. The method of claim 4, wherein the generating of the high-level feature characterization data corresponding to the high-level model based on the plurality of teacher style preliminary prediction data comprises:
and generating the high-level feature characterization data based on the plurality of teacher style preliminary prediction data and the feature characterization data respectively corresponding to the multiple groups of low-dimensional feature data.
6. The method of claim 4, wherein the obtaining of the final teacher style prediction data corresponding to the teaching content sample based on the high-level feature characterization data through the high-level model comprises:
performing a feature extraction operation on the high-level feature characterization data through a hidden layer in the high-level model to obtain feature characterization data corresponding to the high-level feature characterization data;
and mapping the feature characterization data corresponding to the high-level feature characterization data through a prediction layer in the high-level model to obtain the final teacher style prediction data corresponding to the teaching content sample.
7. The method of claim 2, wherein the training of the teacher style prediction model based on the teacher style annotation data and the teacher style prediction data of the teaching content sample comprises:
determining a difference value between the teacher style annotation data and the final teacher style prediction data through a target loss function;
and adjusting parameters of the plurality of low-level models and the high-level model in the teacher style prediction model based on the difference value.
8. The method according to any one of claims 1-7, further comprising:
determining multiple groups of low-dimensional feature data of teaching content data based on high-dimensional feature data of the teaching content data;
and obtaining teacher style prediction data corresponding to the teaching content data based on the multiple groups of low-dimensional feature data of the teaching content data through the trained teacher style prediction model.
9. The method of claim 8, further comprising:
and performing a mapping operation on the teacher style prediction data to obtain a teacher style category corresponding to the teaching content data.
10. A computer-readable medium, characterized in that the computer-readable medium stores a readable program, the readable program comprising:
instructions for determining multiple groups of low-dimensional feature data of a teaching content sample based on high-dimensional feature data of the teaching content sample;
instructions for obtaining teacher style prediction data corresponding to the teaching content sample based on the multiple groups of low-dimensional feature data through a teacher style prediction model to be trained;
and instructions for training the teacher style prediction model based on the teacher style annotation data and the teacher style prediction data of the teaching content sample.
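The stacked structure recited in claims 2 to 7 can be pictured with the following Python (PyTorch) sketch, offered as one possible reading rather than the claimed implementation: each low-level model is a hidden layer followed by a prediction layer, the high-level model receives the preliminary predictions concatenated with the low-level feature characterizations, and a target loss (assumed here to be cross-entropy) drives parameter adjustment of all sub-models. Layer sizes, the concatenation scheme, and the optimizer are illustrative assumptions.

import torch
import torch.nn as nn

class LowLevelModel(nn.Module):
    # A hidden layer followed by a prediction layer, as in claim 3.
    def __init__(self, in_dim, hidden_dim, num_styles):
        super().__init__()
        self.hidden = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.prediction = nn.Linear(hidden_dim, num_styles)

    def forward(self, x):
        characterization = self.hidden(x)            # feature characterization data
        return self.prediction(characterization), characterization

class TeacherStylePredictor(nn.Module):
    # Several low-level models whose outputs feed one high-level model, as in claim 2.
    def __init__(self, group_dims, hidden_dim, num_styles):
        super().__init__()
        self.low_level = nn.ModuleList(
            LowLevelModel(d, hidden_dim, num_styles) for d in group_dims)
        # High-level feature characterization data: preliminary predictions plus
        # low-level characterizations, concatenated (one reading of claim 5).
        high_in = len(group_dims) * (num_styles + hidden_dim)
        self.high_level = LowLevelModel(high_in, hidden_dim, num_styles)

    def forward(self, feature_groups):
        preds, chars = zip(*(m(x) for m, x in zip(self.low_level, feature_groups)))
        high_features = torch.cat(preds + chars, dim=1)
        final_prediction, _ = self.high_level(high_features)
        return final_prediction

# One training step in the sense of claim 7: a difference value between annotation data
# and final prediction data, back-propagated through all low-level models and the
# high-level model. Dimensions and hyperparameters below are placeholders.
model = TeacherStylePredictor(group_dims=[32, 16, 8], hidden_dim=64, num_styles=5)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
feature_groups = [torch.randn(4, d) for d in (32, 16, 8)]   # a mini-batch of 4 samples
labels = torch.randint(0, 5, (4,))                          # teacher style annotation data
loss = nn.CrossEntropyLoss()(model(feature_groups), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()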
CN201910330162.4A 2019-04-23 2019-04-23 Teacher style prediction model training method and computer storage medium Active CN111832787B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910330162.4A CN111832787B (en) 2019-04-23 2019-04-23 Teacher style prediction model training method and computer storage medium
PCT/CN2020/086363 WO2020216286A1 (en) 2019-04-23 2020-04-23 Method for training teaching style prediction model, and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910330162.4A CN111832787B (en) 2019-04-23 2019-04-23 Teacher style prediction model training method and computer storage medium

Publications (2)

Publication Number Publication Date
CN111832787A true CN111832787A (en) 2020-10-27
CN111832787B CN111832787B (en) 2022-12-09

Family

ID=72911994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910330162.4A Active CN111832787B (en) 2019-04-23 2019-04-23 Teacher style prediction model training method and computer storage medium

Country Status (2)

Country Link
CN (1) CN111832787B (en)
WO (1) WO2020216286A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113408571B (en) * 2021-05-08 2022-07-19 浙江智慧视频安防创新中心有限公司 Image classification method and device based on model distillation, storage medium and terminal
CN114780785B (en) * 2022-06-23 2022-09-13 新缪斯(深圳)音乐科技产业发展有限公司 Music teaching recommendation method and system based on knowledge graph

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104392174A (en) * 2014-10-23 2015-03-04 腾讯科技(深圳)有限公司 Generation method and device for characteristic vectors of dynamic behaviors of application program
CN104933588A (en) * 2015-07-01 2015-09-23 北京京东尚科信息技术有限公司 Data annotation platform for expanding merchandise varieties and data annotation method
CN107423442A (en) * 2017-08-07 2017-12-01 火烈鸟网络(广州)股份有限公司 Method and system, storage medium and computer equipment are recommended in application based on user's portrait behavioural analysis
CN107577943A (en) * 2017-09-08 2018-01-12 北京奇虎科技有限公司 Sample predictions method, apparatus and server based on machine learning
CN108021947A (en) * 2017-12-25 2018-05-11 北京航空航天大学 A kind of layering extreme learning machine target identification method of view-based access control model
CN108897834A (en) * 2018-06-22 2018-11-27 招商信诺人寿保险有限公司 Data processing and method for digging
US20190034764A1 (en) * 2017-07-31 2019-01-31 Samsung Electronics Co., Ltd. Method and apparatus for generating training data to train student model using teacher model

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7684390B2 (en) * 2004-12-30 2010-03-23 Intel Corporation Integrated circuit capable of transmitting probe packets across a stack of switches
CN106250403A (en) * 2016-07-19 2016-12-21 北京奇艺世纪科技有限公司 Customer loss Forecasting Methodology and device
CN107045673B (en) * 2017-03-31 2020-09-29 杭州电子科技大学 Public bicycle flow variation prediction method based on stack model fusion
CN108073888A (en) * 2017-08-07 2018-05-25 中国科学院深圳先进技术研究院 A kind of teaching auxiliary and the teaching auxiliary system using this method
CN107992887B (en) * 2017-11-28 2021-02-19 东软集团股份有限公司 Classifier generation method, classification device, electronic equipment and storage medium
CN108090857B (en) * 2017-12-29 2021-06-22 复旦大学 Multi-mode student classroom behavior analysis system and method

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104392174A (en) * 2014-10-23 2015-03-04 腾讯科技(深圳)有限公司 Generation method and device for characteristic vectors of dynamic behaviors of application program
CN104933588A (en) * 2015-07-01 2015-09-23 北京京东尚科信息技术有限公司 Data annotation platform for expanding merchandise varieties and data annotation method
US20190034764A1 (en) * 2017-07-31 2019-01-31 Samsung Electronics Co., Ltd. Method and apparatus for generating training data to train student model using teacher model
CN107423442A (en) * 2017-08-07 2017-12-01 火烈鸟网络(广州)股份有限公司 Method and system, storage medium and computer equipment are recommended in application based on user's portrait behavioural analysis
CN107577943A (en) * 2017-09-08 2018-01-12 北京奇虎科技有限公司 Sample predictions method, apparatus and server based on machine learning
CN108021947A (en) * 2017-12-25 2018-05-11 北京航空航天大学 A kind of layering extreme learning machine target identification method of view-based access control model
CN108897834A (en) * 2018-06-22 2018-11-27 招商信诺人寿保险有限公司 Data processing and method for digging

Also Published As

Publication number Publication date
WO2020216286A1 (en) 2020-10-29
CN111832787B (en) 2022-12-09

Similar Documents

Publication Publication Date Title
CN105975573B (en) A kind of file classification method based on KNN
CN108038107B (en) Sentence emotion classification method, device and equipment based on convolutional neural network
CN110210032B (en) Text processing method and device
KR20180125905A (en) Method and apparatus for classifying a class to which a sentence belongs by using deep neural network
CN114021524B (en) Emotion recognition method, device, equipment and readable storage medium
CN110348352B (en) Training method, terminal and storage medium for human face image age migration network
CN114332578A (en) Image anomaly detection model training method, image anomaly detection method and device
CN109711356B (en) Expression recognition method and system
CN111832787B (en) Teacher style prediction model training method and computer storage medium
CN112115967A (en) Image increment learning method based on data protection
CN111598153B (en) Data clustering processing method and device, computer equipment and storage medium
KR20200143450A (en) Image processing method, device, electronic device and storage medium
CN111667016A (en) Incremental information classification method based on prototype
CN115511069A (en) Neural network training method, data processing method, device and storage medium
AU2022221471A1 (en) Automatic photo editing via linguistic request
CN114332565A (en) Method for generating image by generating confrontation network text based on distribution estimation condition
CN111292715B (en) Speech synthesis method, speech synthesis device, electronic equipment and computer-readable storage medium
CN111145787B (en) Voice emotion feature fusion method and system based on main and auxiliary networks
CN113011532A (en) Classification model training method and device, computing equipment and storage medium
CN110598737A (en) Online learning method, device, equipment and medium of deep learning model
CN115131646A (en) Deep network model compression method based on discrete coefficient
JP2021077352A (en) Information processing device and method, and device for performing classification by using model
CN111259138A (en) Tax field short text emotion classification method and device
CN117332090B (en) Sensitive information identification method, device, equipment and storage medium
CN117251599B (en) Video corpus intelligent test optimization method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant