CN113793239B - Personalized knowledge tracking method and system integrating learning behavior characteristics - Google Patents

Personalized knowledge tracking method and system integrating learning behavior characteristics

Info

Publication number
CN113793239B
Authority
CN
China
Prior art keywords
vector
sequence
learning behavior
answer
student
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110928810.3A
Other languages
Chinese (zh)
Other versions
CN113793239A (en)
Inventor
袁华
王兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202110928810.3A priority Critical patent/CN113793239B/en
Publication of CN113793239A publication Critical patent/CN113793239A/en
Application granted granted Critical
Publication of CN113793239B publication Critical patent/CN113793239B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • G06Q50/205Education administration or guidance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a personalized knowledge tracking method and system integrating learning behavior characteristics. A convolutional neural network is used to extract effective features from a composite vector formed from learning behavior and answer result data; topic information features, including the knowledge points, are extracted through a denoising autoencoder; finally, the learning behavior features are combined with the topic information features, and the students' knowledge mastery state is obtained through an LSTM network and a fully connected layer. In the modeling process, the invention fuses a series of learning behavior features of students during the learning process with the rich information of the exercises themselves and their knowledge points, and predicts each student's degree of knowledge mastery more accurately. The invention can be applied to hybrid teaching and provides a quantitative basis for personalized teaching.

Description

Personalized knowledge tracking method and system integrating learning behavior characteristics
Technical Field
The invention relates to the technical field of hybrid teaching, in particular to a personalized knowledge tracking method and system integrating learning behavior characteristics.
Background
Hybrid teaching is a deep fusion of online and offline teaching modes. The rapid development of information technology has made the hybrid teaching mode increasingly common in teaching: it retains the opportunity for face-to-face communication between teachers and students while breaking through the limits of time and space to the greatest extent. Learners can use an online platform for teaching activities such as pre-class study, the flipped classroom, and post-class review and consolidation; these activities leave traces on the teaching platform and thus generate learning behavior data. Typically, the teaching platform provides visual dashboards, logs, and the like to show the statistical characteristics of the data, but it lacks deep analysis and mining of the data and cannot provide a direct basis for personalized learning.
In fact, a great benefit of hybrid teaching is that it facilitates individualized learning. If these data could be used to provide each student with a quantified report of knowledge mastery, this would clearly give personalized learning a quantitative basis and reduce blindness.
The task of knowledge tracking (Knowledge Tracing, KT) is to model students' knowledge over a time series so that their performance in future interactions can be accurately predicted. The main current research direction combines deep learning with knowledge tracking. Historically, Chris Piech first introduced RNNs into the knowledge tracking field in 2015, creating the first deep knowledge tracking model (Deep Knowledge Tracing, DKT). On this basis, several variants such as DKT-t, E2E-DKT and DKT-DSC were developed in succession to increase the predictive power of the model. A further breakthrough came in 2017 with the Dynamic Key-Value Memory Network (DKVMN), which uses a Memory-Augmented Neural Network (MANN) instead of an RNN for deep knowledge tracking. In 2019, Pandey proposed the Self-Attention Knowledge Tracing model for the knowledge tracking problem, which has better long-sequence learning capability than RNNs. Although these models achieve good results through technical improvements, most studies consider only knowledge points and answer results and neglect the influence of learning behavior on knowledge mastery; only a small amount of work considers behaviors tied to answering, such as answering, checking, and viewing analyses, so the prediction accuracy of knowledge tracking still needs improvement. Behavior during the learning process has not yet been incorporated into these studies; the authors believe one main reason is that, before educational platforms became widespread, data from students' learning processes were difficult to obtain, making it hard to quantify the influence of the learning state on learning effectiveness and thus limiting research progress. With the rapid development of educational platforms today, rich learning log data give researchers the conditions to conduct in-depth research.
Disclosure of Invention
The first aim of the invention is to overcome the defect that most existing knowledge tracking models and methods consider only question and answer result features while neglecting the influence of students' learning behavior features during the learning process on their degree of knowledge mastery (in practice, students' learning situation, learning methods and effort are reflected in these behavior features and directly influence knowledge mastery). The invention integrates the learning behavior features into knowledge tracking, combines the influence of the learning behavior features, the exercise texts and the knowledge points on answering, and thereby predicts each student's mastery of knowledge points more accurately.
A second object of the present invention is to provide a personalized knowledge tracking system that incorporates learning behavior features.
The first object of the invention is achieved by the following technical scheme: the personalized knowledge tracking method integrating the learning behavior characteristics comprises the following steps:
s1, learning behavior characteristic data and response data of students in a teaching process are obtained; the acquired learning behavior characteristic data comprise data generated by learning activities of students on a teaching platform; the acquired answering data comprise questions and answering results of the students;
s2, preprocessing the acquired learning behavior characteristic data and response data to obtain a corresponding sequence; for learning behavior feature data, cleaning is needed first, and then standard normalization processing is carried out to obtain an original learning behavior feature vector; for answer data, a problem text sequence, a problem related knowledge point sequence and an answer result sequence are required to be separated and extracted from the answer data;
S3, encoding the answer result sequence with a one-hot encoding rule to obtain an answer result vector, forming a two-dimensional vector from the answer result vector and the original learning behavior feature vector, and learning with a convolutional neural network model to obtain the learning behavior feature vector that influences the answer result; splicing the problem text sequence and the problem-related knowledge point sequence and inputting them into a denoising autoencoder to obtain a problem encoding vector;
S4, splicing the learning behavior feature vector, the problem encoding vector and the answer result vector to obtain a feature set, then crossing the features in the feature set, performing feature cascading, and finally reducing the dimension of the feature set through an autoencoder to obtain an answer record vector;
S5, taking the answer record vector as input, training an LSTM-based deep knowledge tracking model, inputting one answer record vector at each time step to obtain the knowledge state hidden vector at the corresponding time, and inputting the obtained knowledge state hidden vector into a fully connected layer to obtain the student's knowledge point mastery state vector, thereby realizing personalized knowledge tracking.
Further, in step S1, the students' learning behavior feature data and answer data are acquired, by teaching unit, from online teaching platforms including MOOC and Rain Classroom; the students' learning behavior features are counted with one teaching unit as a stage.
Further, the step S2 includes the steps of:
S201, cleaning the learning behavior feature data, removing student individuals for whom more than 80% of the selected features are missing, and performing simple numerical operations on some of the original learning behavior features to extract learning behavior features that better reflect the students' status;
s202, carrying out Max-Min normalization on the cleaned learning behavior feature data to keep data balance, obtaining an original learning behavior feature vector of each student, and marking the original learning behavior feature vector as F, wherein the original learning behavior feature vector is expressed as follows:
where N is the number of students, c is the number of course chapters (units), n is a positive integer, the learning behavior vector of the n-th student in the c-th unit can be written as {b_1, b_2, …, b_fk}, b_fk denotes one learning behavior feature of the n-th student in the c-th unit, and fk is the total number of statistical learning behavior features;
extracting a problem text sequence of the student and marking the problem text sequence as Q; sequencing the answering data according to the serial numbers of the students, splicing the answering records of the same student in the same unit into a record according to the sequence of the answering, and unifying the expression forms of the data by using a standardized unit, wherein the formalized expression is as follows:
where each element of Q indicates a question answered by the n-th student; the number of questions answered by each student (or contained in each unit) may differ and is denoted (t1, t2, …, tn), with tn indicating the number of questions answered by the n-th student;
extracting a problem related knowledge point sequence, and marking the sequence as K; according to the answer data and the extracted answer series of the student problems and the corresponding relation between the problems and the knowledge points, the relevant knowledge point sequence of the problems of each student answer is obtained, and formalized representation is as follows:
wherein,the n-th question contains the sn knowledge points, and the number of the knowledge points contained in each question is different and is respectively marked as (s 1, s2, …, sn);
extracting a student answer result sequence, and marking the student answer result sequence as A; according to the answer data and the extracted answer series of the student problems, an answer result sequence of each student is obtained, and formalized representation is as follows:
wherein,indicating that the answering result of the nth student on the nth problem is correct or incorrect.
Further, the step S3 includes the steps of:
S301, encoding the answer result sequence with a one-hot encoding rule, where 1 denotes a correct answer and 0 an incorrect answer, to obtain the answer result vector; constructing a two-dimensional vector from the student's learning behavior features and the answer result vector; splicing the problem text sequence and the problem-related knowledge point sequence and inputting them into a denoising autoencoder to obtain the problem encoding vector; the specific steps are as follows:
S3011, preprocessing the problem text by removing punctuation, extra spacing and meaningless characters, then performing word segmentation and stop-word removal, and finally extracting the keyword sequence of the problem, i.e. the problem text sequence; the keyword sequence representing the problem is spliced with the related knowledge point sequence of the corresponding problem to obtain the problem feature sequence;
S3012, converting the obtained problem feature sequence into an index sequence and inputting it into an embedding layer, or initializing the text embedding layer directly with pre-trained word vectors; assuming the embedding dimension is d and the vocabulary size of the corpus is m, the embedding layer is randomly initialized as a d × m matrix, and each word contained in a question obtains its corresponding word vector, i.e. the embedding vector, through its index;
S3013, inputting the word vectors into a denoising autoencoder and reconstructing to obtain the problem encoding vector; the denoising autoencoder consists of a multi-layer feedforward neural network comprising an encoding layer, a hidden layer and a decoding layer; with the hidden layer as the boundary, the left side is the encoder and the right side is the decoder, and the denoising autoencoder can restore the original information of the text through the decoder; in this process the hidden layer captures an implicit description of the text with fewer neurons and is a more abstract, low-dimensional representation of the text's information, and in terms of interpretability the hidden layer extracts the topic information of the text; the weights of each feedforward layer are randomly initialized from a Gaussian distribution;
The encoding layer maps the word vector input to a low-dimensional space, expressed as:
h = f(w^T x' + d)
where h represents the encoded problem feature, x' is the noise-corrupted version of the word vector, w^T is the weight matrix of the encoding layer, d is the encoding layer bias term, and f(·) is an element-wise mapping function such as the identity function f(g) = g or the sigmoid function f(x) = 1/(1 + e^(-x));
The decoding layer reconstructs the original input data from the noisy data, expressed as:
x̂ = g(w'^T h + d')
where x̂ is the reconstruction of the problem encoding produced by the denoising autoencoder, w'^T is the weight matrix of the decoding layer, d' is the decoding layer bias term, and g(·) is an element-wise mapping function;
S302, performing feature learning based on a convolutional neural network, where the convolutional layer uses several convolution kernels of different sizes to extract several groups of local features; the output after the convolution operation is:
co = fr(wd * x_{i:i+cw-1} + br)
where wd is the shared weight parameter, cw is the sliding window size, x_{i:i+cw-1} is the window-sized slice of the two-dimensional vector formed from the learning behavior features and the answer result vector, br is a bias term, and fr is the activation function;
S303, applying max pooling to the features extracted by the convolutional layer and then computing the result through the softmax function of the fully connected layer, so that the learning behavior feature vector influencing the answer result is extracted according to the computed probability values.
Further, the step S4 includes the steps of:
S401, performing vector splicing on the learning behavior feature vector, the problem encoding vector and the answer result vector to obtain the feature set of each student that influences the answer result;
S402, crossing the features in the feature set that influence the answer result, performing feature cascading on this basis, and reducing the dimension of the feature vector with an autoencoder to obtain the answer record vector.
Further, the step S5 includes the steps of:
S501, taking the answer record vectors as input, training an LSTM-based deep knowledge tracking model, inputting one answer record vector at each time step and obtaining the knowledge state hidden vector at the corresponding time; the transition formulas of the model are as follows:
i_t = σ(W_ri r_t + U_ri h_{t-1} + b_i)
f_t = σ(W_rf r_t + U_rf h_{t-1} + b_f)
c_t = f_t * c_{t-1} + i_t * tanh(W_rc r_t + U_rc h_{t-1})
o_t = σ(W_ro r_t + U_ro h_{t-1} + b_o)
h_t = o_t * tanh(c_t)
where i_t, f_t, o_t and c_t denote the input gate, forget gate, output gate and memory cell of the LSTM respectively; h_t is the hidden vector output of the current layer; r_t is the input at time t; U_ri, U_rf, U_rc and U_ro are the weights applied to the previous hidden vector h_{t-1} in the respective gates; W_ri, W_rf, W_rc and W_ro are the input weights of the corresponding gates; b_i, b_f and b_o are the biases of the corresponding gates; and σ is the activation function;
S502, inputting the knowledge state hidden vector at the corresponding time into the fully connected layer to obtain the student's knowledge point mastery state vector K_t, expressed as:
K_t = σ(w_k o_t + b_k)
where w_k and b_k are parameters to be learned;
the cross entropy loss function L is adopted in the training process, and the formula is as follows:
L = Σ_t l(y_t^T · δ(q_{t+1}), a_{t+1})
where q_{t+1} represents the question answered by the student at time t+1, a_{t+1} indicates whether the question answered at time t+1 is correct, δ denotes the one-hot encoding after dimension reduction, l is the cross entropy function, and y_t^T is the output at time t.
The second object of the invention is achieved by the following technical scheme: a personalized knowledge tracking system integrating learning behavior features comprises:
a data preprocessing unit for preparing the input data set: the learning behavior data are first cleaned and students for whom more than 80% of the selected learning behavior data are missing are removed, Max-Min normalization is then applied to the learning behavior data to keep the data balanced and obtain each student's original learning behavior feature vector, and the problem text sequence, the problem-related knowledge point sequence and the answer result sequence are separated and extracted from the answer data;
a learning behavior feature extraction unit for extracting the learning behavior features that influence the answer result: the answer result sequence is encoded with a one-hot encoding rule to obtain the answer result vector, a two-dimensional vector is formed from the answer result vector and the original learning behavior feature vector, and a convolutional neural network model is used to learn the learning behavior feature vector that influences the answer result;
a problem information extraction unit for obtaining the problem text and the information of the knowledge points it contains: the problem text is processed by removing punctuation, extra spacing and meaningless characters, followed by word segmentation and stop-word removal, and the keyword sequence of the problem, i.e. the problem text sequence, is extracted; the keyword sequence representing the problem is spliced with the related knowledge point sequence of the corresponding problem to obtain the problem feature sequence, which is converted into an index sequence and input into an embedding layer, or the text embedding layer is initialized directly with pre-trained word vectors; assuming the embedding dimension is d and the vocabulary size of the corpus is m, the embedding layer is randomly initialized as a d × m matrix, each word contained in a question obtains its corresponding word vector, i.e. the embedding vector, through its index, and the word vectors are input into a denoising autoencoder and reconstructed to obtain the problem encoding vector;
a feature dimension reduction unit for reducing the dimension of the obtained feature set: the learning behavior feature vector, the problem encoding vector and the answer result vector are spliced to obtain the feature set, the features in the feature set are then crossed and cascaded, and finally the dimension of the feature set is reduced by an autoencoder to obtain the answer record vector;
a knowledge tracking training unit for training the knowledge tracking model and predicting the students' mastery of the knowledge points: taking the answer record vector as input, an LSTM-based deep knowledge tracking model is trained, one answer record vector is input at each time step to obtain the knowledge state hidden vector at the corresponding time, and the obtained knowledge state hidden vector is input into the fully connected layer to obtain the student's knowledge point mastery state vector, realizing personalized knowledge tracking.
Compared with the prior art, the invention has the following advantages and beneficial effects:
the invention integrates the learning behavior characteristics of students in the learning process, fully considers the influence of individual learning habit and state change of the students on the learning effect in the learning process, automatically learns the influence of the data on knowledge mastering of the students by using a convolutional neural network, comprehensively considers learning behavior characteristics, problem texts and problem knowledge point information, comprehensively and accurately predicts the knowledge mastering degree of each student, has important application value for hybrid teaching, and can predict the mastering degree of the students on the learning content by using the model in practical teaching, for example, provides individualized learning recommendation for learners. The invention can be applied to hybrid teaching, provides a quantization basis for personalized teaching, and is worth popularizing.
Drawings
FIG. 1 is a schematic logic flow diagram of the method of the present invention.
FIG. 2 is a block diagram of a system according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but embodiments of the present invention are not limited thereto.
Example 1
As shown in fig. 1, the personalized knowledge tracking method for fusing learning behavior features provided in this embodiment includes the following steps:
s1, learning behavior characteristic data and response data of students in a teaching process are obtained; the acquired learning behavior characteristic data comprise data generated by learning activities of students on a teaching platform; the acquired answering data comprise questions and answering results of the students; the method comprises the following steps:
the method comprises the steps that learning behavior characteristic data and response data of students are obtained from an online teaching platform comprising MOOC and rain class according to a teaching unit; wherein, a teaching unit is used as a stage to count the learning behavior characteristics of students.
S2, preprocessing the acquired learning behavior characteristic data and response data to obtain a corresponding sequence; for learning behavior feature data, cleaning is needed first, and then standard normalization processing is carried out to obtain an original learning behavior feature vector; for answer data, a problem text sequence, a problem related knowledge point sequence and an answer result sequence need to be separated and extracted from the answer data. The method comprises the following specific steps:
S201, cleaning the learning behavior feature data, removing students for whom more than 80% of the selected features are missing, and performing simple numerical operations on some of the original learning behavior features to extract learning behavior features that better reflect the students' status; for example, the average video-watching duration per session is obtained as the ratio of the total video-watching duration to the number of video-watching sessions.
S202, considering the large differences in the numerical ranges of the learning behavior features, Max-Min normalization is applied to the cleaned learning behavior feature data to keep the data balanced, and each student's original learning behavior feature vector is obtained and denoted F, represented as follows:
where N is the number of students, c is the number of course chapters (units), n is a positive integer, the learning behavior vector of the n-th student in the c-th unit can be written as {b_1, b_2, …, b_fk}, b_fk denotes one learning behavior feature of the n-th student in the c-th unit, and fk is the total number of statistical learning behavior features.
And extracting a problem text sequence of the student and marking the sequence as Q. The answering data are ordered according to the serial numbers of the students, then the answering records of the same student in the same unit are spliced into a record according to the sequence of the answering, and then a standardized unit is used for unifying the expression form of the data. Formalized representation is as follows:
wherein,the question number of each student or each unit may be different, and is respectively denoted as (t 1, t2, …, tn), and tn represents the question number of the nth student.
And extracting a problem related knowledge point sequence, and marking the sequence as K. According to the answer data and the extracted answer series of the student problems and the corresponding relation between the problems and the knowledge points, the knowledge point sequence related to the problems of each student answer is obtained, and formalized representation is as follows:
wherein,the n-th question contains the sn-th knowledge points, and the number of knowledge points contained in each question is different and is respectively denoted as (s 1, s2, …, sn).
And extracting a student answer result sequence, and marking the answer result sequence as A. According to the answer data and the extracted answer series of the student problems, an answer result sequence of each student is obtained, and formalized representation is as follows:
wherein,indicating the result (correct or incorrect) of the nth student's answer on the nth problem.
S3, encoding the answer result sequence by using a single-heat encoding rule to obtain an answer result vector, forming a two-dimensional vector by the answer result vector and the original learning behavior feature vector, and learning by using a convolutional neural network model to obtain a learning behavior feature vector influencing the answer result; and splicing the problem text sequence and the problem related knowledge point sequence, and inputting the problem text sequence and the problem related knowledge point sequence into a noise reduction self-encoder to obtain a problem encoding vector. The method comprises the following specific steps:
S301, encoding the answer result sequence with a one-hot encoding rule, where 1 denotes a correct answer and 0 an incorrect answer, to obtain the answer result vector; constructing a two-dimensional vector from the student's learning behavior features and the answer result vector; splicing the problem text sequence and the problem-related knowledge point sequence and inputting them into a denoising autoencoder to obtain the problem encoding vector. The specific steps are as follows:
S3011, preprocessing the problem text by removing punctuation, extra spacing and meaningless characters, then performing word segmentation and stop-word removal, and finally extracting the keyword sequence of the problem, i.e. the problem text sequence; the keyword sequence representing the problem is spliced with the related knowledge point sequence of the corresponding problem to obtain the problem feature sequence.
S3012, converting the obtained problem feature sequence into an index sequence and inputting it into an embedding layer, or initializing the text embedding layer directly with pre-trained word vectors. Assuming the embedding dimension is d and the vocabulary size of the corpus is m, the embedding layer is randomly initialized as a d × m matrix, and each word contained in a question obtains its corresponding word vector (embedding vector) through its index, as sketched below.
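As an illustration of S3012, the following sketch builds a randomly initialized embedding layer and looks up word vectors by index; the dimensions and token indices are purely hypothetical, and torch.nn.Embedding.from_pretrained would cover the pre-trained word vector variant.

    import torch
    import torch.nn as nn

    d, m = 128, 5000                                    # embedding dimension and corpus vocabulary size (illustrative)
    embedding = nn.Embedding(num_embeddings=m, embedding_dim=d)   # randomly initialized lookup table

    token_ids = torch.tensor([[12, 407, 88, 1532]])     # hypothetical indices for one problem feature sequence
    word_vectors = embedding(token_ids)                 # shape (1, 4, d): one embedding vector per token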
S3013, inputting the word vectors into a denoising autoencoder and reconstructing to obtain the problem encoding vector. The denoising autoencoder consists of a multi-layer feedforward neural network comprising an encoding layer, a hidden layer and a decoding layer; with the hidden layer as the boundary, the left side is the encoder and the right side is the decoder, and the denoising autoencoder can restore the original information of the text through the decoder. In this process, the hidden layer captures an implicit description of the text with fewer neurons and is a more abstract, low-dimensional representation of the text's information; analyzed in terms of interpretability, the hidden layer extracts the topic information of the text. The weights of each feedforward layer are randomly initialized from a Gaussian distribution.
The encoding layer maps the word vector input to a low-dimensional space, expressed as:
h = f(w^T x' + d)
where h represents the encoded problem feature, x' is the noise-corrupted version of the word vector, w^T is the weight matrix of the encoding layer, d is the encoding layer bias term, and f(·) is an element-wise mapping function such as the identity function f(g) = g or the sigmoid function f(x) = 1/(1 + e^(-x)).
The decoding layer reconstructs the original input data from the noisy data, expressed as:
x̂ = g(w'^T h + d')
where x̂ is the reconstruction of the problem encoding produced by the denoising autoencoder, w'^T is the weight matrix of the decoding layer, d' is the decoding layer bias term, and g(·) is an element-wise mapping function.
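A minimal single-hidden-layer denoising autoencoder matching the encode/decode formulas above might look like the following sketch; the layer sizes, the Gaussian noise level, and the choice of the sigmoid for both f(·) and g(·) are assumptions.

    import torch
    import torch.nn as nn

    class DenoisingAutoencoder(nn.Module):
        def __init__(self, in_dim=128, hidden_dim=32, noise_std=0.1):
            super().__init__()
            self.noise_std = noise_std
            self.encoder = nn.Linear(in_dim, hidden_dim)   # h = f(w^T x' + d)
            self.decoder = nn.Linear(hidden_dim, in_dim)   # x_hat = g(w'^T h + d')

        def forward(self, x):
            x_noisy = x + self.noise_std * torch.randn_like(x)   # corrupted input x'
            h = torch.sigmoid(self.encoder(x_noisy))             # hidden layer / problem encoding vector
            x_hat = torch.sigmoid(self.decoder(h))               # reconstruction of the original input
            return h, x_hat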
S302, performing feature learning based on a convolutional neural network, where the convolutional layer uses several convolution kernels of different sizes to extract several groups of local features; the output after the convolution operation is:
co = fr(wd * x_{i:i+cw-1} + br)
where wd is the shared weight parameter, cw is the sliding window size, x_{i:i+cw-1} is the window-sized slice of the two-dimensional vector formed from the learning behavior features and the answer result vector, br is a bias term, and fr is the activation function.
S303, applying max pooling to the features extracted by the convolutional layer and then computing the result through the softmax function of the fully connected layer, so that the learning behavior feature vector influencing the answer result is extracted according to the computed probability values.
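The convolution, max pooling and softmax steps of S302 and S303 could be sketched as follows; the channel counts, kernel sizes and output dimension are illustrative assumptions rather than values taken from the description.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BehaviorCNN(nn.Module):
        def __init__(self, in_channels=2, out_channels=16, kernel_sizes=(2, 3, 4), out_dim=32):
            super().__init__()
            # One 1-D convolution per kernel size, sliding over the composite vector
            self.convs = nn.ModuleList(
                [nn.Conv1d(in_channels, out_channels, k, padding=k // 2) for k in kernel_sizes]
            )
            self.fc = nn.Linear(out_channels * len(kernel_sizes), out_dim)

        def forward(self, x):
            # x: (batch, 2, length) -- row 0: behavior features, row 1: one-hot answer results
            pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]   # max pooling per kernel
            features = torch.cat(pooled, dim=1)
            return F.softmax(self.fc(features), dim=1)   # probability values over output features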
And S4, splicing the learning behavior feature vector, the problem coding vector and the answer result vector to obtain a feature set, then intersecting features in the feature set, carrying out feature cascading, and finally carrying out dimension reduction on the feature set through a self-encoder to obtain an answer record vector. The method comprises the following specific steps:
S401, performing vector splicing on the learning behavior feature vector, the problem encoding vector and the answer result vector to obtain the feature set of each student that influences the answer result.
S402, crossing the features in the feature set that influence the answer result, performing feature cascading on this basis, and reducing the dimension of the feature vector with an autoencoder to obtain the answer record vector.
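The splicing and autoencoder-based dimension reduction of S401 and S402 can be illustrated as below; the vector sizes are hypothetical, and only the encoder half of a trained autoencoder is shown, since that is the part used to produce the answer record vector.

    import torch
    import torch.nn as nn

    behavior_vec = torch.randn(1, 32)   # learning behavior feature vector (illustrative size)
    problem_vec  = torch.randn(1, 32)   # problem encoding vector
    answer_vec   = torch.randn(1, 2)    # one-hot answer result vector

    record = torch.cat([behavior_vec, problem_vec, answer_vec], dim=1)   # feature cascading

    encoder = nn.Sequential(nn.Linear(record.size(1), 16), nn.Tanh())    # autoencoder encoder half
    answer_record_vec = encoder(record)                                  # reduced answer record vector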
S5, taking the answer record vector as input, training a depth knowledge tracking model based on LSTM, inputting one answer record vector at each moment to obtain a knowledge state hidden vector at the corresponding moment, and inputting the obtained knowledge state hidden vector to a full connection layer to obtain a knowledge point mastering state vector of a student to realize personalized knowledge tracking. The method comprises the following specific steps:
S501, taking the answer record vectors as input, training an LSTM-based deep knowledge tracking model and inputting one answer record vector at each time step to obtain the knowledge state hidden vector at the corresponding time. The transition formulas of the model are as follows:
i_t = σ(W_ri r_t + U_ri h_{t-1} + b_i)
f_t = σ(W_rf r_t + U_rf h_{t-1} + b_f)
c_t = f_t * c_{t-1} + i_t * tanh(W_rc r_t + U_rc h_{t-1})
o_t = σ(W_ro r_t + U_ro h_{t-1} + b_o)
h_t = o_t * tanh(c_t)
where i_t, f_t, o_t and c_t denote the input gate, forget gate, output gate and memory cell of the LSTM respectively; h_t is the hidden vector output of the current layer; r_t is the input at time t; U_ri, U_rf, U_rc and U_ro are the weights applied to the previous hidden vector h_{t-1} in the respective gates; W_ri, W_rf, W_rc and W_ro are the input weights of the corresponding gates; b_i, b_f and b_o are the biases of the corresponding gates; and σ is the activation function.
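In practice the gate equations above correspond to a standard LSTM layer; a minimal sketch of feeding one student's answer record sequence through torch.nn.LSTM, with illustrative (assumed) dimensions, is:

    import torch
    import torch.nn as nn

    input_dim, hidden_dim, seq_len = 16, 64, 20   # answer record size, hidden size, sequence length (assumed)
    lstm = nn.LSTM(input_size=input_dim, hidden_size=hidden_dim, batch_first=True)

    answer_records = torch.randn(1, seq_len, input_dim)   # r_1 ... r_T for one student
    hidden_states, _ = lstm(answer_records)               # h_t (knowledge state hidden vector) at each step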
S502, inputting the knowledge state hidden vector at the corresponding time into the fully connected layer to obtain the student's knowledge point mastery state vector K_t, expressed as:
K_t = σ(w_k o_t + b_k)
where w_k and b_k are parameters to be learned.
The cross entropy loss function L is adopted in the training process, and the formula is as follows:
L = Σ_t l(y_t^T · δ(q_{t+1}), a_{t+1})
where q_{t+1} represents the question answered by the student at time t+1, a_{t+1} indicates whether the question answered at time t+1 is correct, δ denotes the one-hot encoding after dimension reduction, l is the cross entropy function, and y_t^T is the output at time t.
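A sketch of S502 and of the training loss, continuing the LSTM example above, is given below; the number of knowledge points, the random tensors standing in for hidden states and labels, and the use of binary cross entropy over the predicted mastery of the next answered question are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    seq_len, hidden_dim, num_skills = 20, 64, 50             # illustrative sizes
    hidden_states = torch.randn(1, seq_len, hidden_dim)      # stands in for the LSTM outputs h_t

    fc = nn.Linear(hidden_dim, num_skills)                   # fully connected layer producing K_t
    K_t = torch.sigmoid(fc(hidden_states))                   # knowledge point mastery state vectors

    q_next = torch.randint(0, num_skills, (1, seq_len - 1))  # hypothetical knowledge point index answered at t+1
    a_next = torch.randint(0, 2, (1, seq_len - 1)).float()   # hypothetical correctness label a_{t+1}

    # y_t^T . delta(q_{t+1}): pick the predicted mastery of the knowledge point answered next
    y_pred = K_t[:, :-1, :].gather(2, q_next.unsqueeze(2)).squeeze(2)
    loss = F.binary_cross_entropy(y_pred, a_next)            # cross entropy loss L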
Example 2
The embodiment also discloses a personalized knowledge tracking system integrating learning behavior characteristics, as shown in fig. 2, the system comprises the following functional units:
a data preprocessing unit for preparing the input data set: the learning behavior data are first cleaned and students for whom more than 80% of the selected learning behavior data are missing are removed, Max-Min normalization is then applied to the learning behavior data to keep the data balanced and obtain each student's original learning behavior feature vector, and the problem text sequence, the problem-related knowledge point sequence and the answer result sequence are separated and extracted from the answer data;
a learning behavior feature extraction unit for extracting the learning behavior features that influence the answer result: the answer result sequence is encoded with a one-hot encoding rule to obtain the answer result vector, a two-dimensional vector is formed from the answer result vector and the original learning behavior feature vector, and a convolutional neural network model is used to learn the learning behavior feature vector that influences the answer result;
a problem information extraction unit for obtaining the problem text and the information of the knowledge points it contains: the problem text is processed by removing punctuation, extra spacing and meaningless characters, followed by word segmentation and stop-word removal, and the keyword sequence of the problem, i.e. the problem text sequence, is extracted; the keyword sequence representing the problem is spliced with the related knowledge point sequence of the corresponding problem to obtain the problem feature sequence, which is converted into an index sequence and input into an embedding layer, or the text embedding layer is initialized directly with pre-trained word vectors; assuming the embedding dimension is d and the vocabulary size of the corpus is m, the embedding layer is randomly initialized as a d × m matrix, each word contained in a question obtains its corresponding word vector, i.e. the embedding vector, through its index, and the word vectors are input into a denoising autoencoder and reconstructed to obtain the problem encoding vector;
a feature dimension reduction unit for reducing the dimension of the obtained feature set: the learning behavior feature vector, the problem encoding vector and the answer result vector are spliced to obtain the feature set, the features in the feature set are then crossed and cascaded, and finally the dimension of the feature set is reduced by an autoencoder to obtain the answer record vector;
a knowledge tracking training unit for training the knowledge tracking model and predicting the students' mastery of the knowledge points: taking the answer record vector as input, an LSTM-based deep knowledge tracking model is trained, one answer record vector is input at each time step to obtain the knowledge state hidden vector at the corresponding time, and the obtained knowledge state hidden vector is input into the fully connected layer to obtain the student's knowledge point mastery state vector, realizing personalized knowledge tracking.
The above examples are preferred embodiments of the present invention, but the embodiments of the present invention are not limited to the above examples; any other changes, modifications, substitutions, combinations and simplifications made without departing from the spirit and principle of the present invention are equivalent replacements and are included within the protection scope of the present invention.

Claims (6)

1. The personalized knowledge tracking method integrating the learning behavior characteristics is characterized by comprising the following steps of:
s1, learning behavior characteristic data and response data of students in a teaching process are obtained; the acquired learning behavior characteristic data comprise data generated by learning activities of students on a teaching platform; the acquired answering data comprise questions and answering results of the students;
s2, preprocessing the acquired learning behavior characteristic data and response data to obtain a corresponding sequence; for learning behavior feature data, cleaning is needed first, and then standard normalization processing is carried out to obtain an original learning behavior feature vector; for answer data, a problem text sequence, a problem related knowledge point sequence and an answer result sequence are required to be separated and extracted from the answer data;
S3, encoding the answer result sequence with a one-hot encoding rule to obtain an answer result vector, forming a two-dimensional vector from the answer result vector and the original learning behavior feature vector, and learning with a convolutional neural network model to obtain the learning behavior feature vector that influences the answer result; splicing the problem text sequence and the problem-related knowledge point sequence and inputting them into a denoising autoencoder to obtain a problem encoding vector; the method comprises the following steps:
S301, encoding the answer result sequence with a one-hot encoding rule, where 1 denotes a correct answer and 0 an incorrect answer, to obtain the answer result vector; constructing a two-dimensional vector from the student's learning behavior features and the answer result vector; splicing the problem text sequence and the problem-related knowledge point sequence and inputting them into a denoising autoencoder to obtain the problem encoding vector; the specific steps are as follows:
S3011, preprocessing the problem text by removing punctuation, extra spacing and meaningless characters, then performing word segmentation and stop-word removal, and finally extracting the keyword sequence of the problem, i.e. the problem text sequence; the keyword sequence representing the problem is spliced with the related knowledge point sequence of the corresponding problem to obtain the problem feature sequence;
S3012, converting the obtained problem feature sequence into an index sequence and inputting it into an embedding layer, or initializing the text embedding layer directly with pre-trained word vectors; assuming the embedding dimension is d and the vocabulary size of the corpus is m, the embedding layer is randomly initialized as a d × m matrix, and each word contained in a question obtains its corresponding word vector, i.e. the embedding vector, through its index;
S3013, inputting the word vectors into a denoising autoencoder and reconstructing to obtain the problem encoding vector; the denoising autoencoder consists of a multi-layer feedforward neural network comprising an encoding layer, a hidden layer and a decoding layer; with the hidden layer as the boundary, the left side is the encoder and the right side is the decoder, and the denoising autoencoder can restore the original information of the text through the decoder; in this process the hidden layer captures an implicit description of the text with fewer neurons and is a more abstract, low-dimensional representation of the text's information, and in terms of interpretability the hidden layer extracts the topic information of the text; the weights of each feedforward layer are randomly initialized from a Gaussian distribution;
The encoding layer maps the word vector input to a low-dimensional space, expressed as:
h = f(w^T x' + d)
where h represents the encoded problem feature, x' is the noise-corrupted version of the word vector, w^T is the weight matrix of the encoding layer, d is the encoding layer bias term, and f(·) is an element-wise mapping function such as the identity function f(g) = g or the sigmoid function f(x) = 1/(1 + e^(-x));
The decoding layer reconstructs the original input data from the noisy data, expressed as:
x̂ = g(w'^T h + d')
where x̂ is the reconstruction of the problem encoding produced by the denoising autoencoder, w'^T is the weight matrix of the decoding layer, d' is the decoding layer bias term, and g(·) is an element-wise mapping function;
S302, performing feature learning based on a convolutional neural network, where the convolutional layer uses several convolution kernels of different sizes to extract several groups of local features; the output after the convolution operation is:
co = fr(wd * x_{i:i+cw-1} + br)
where wd is the shared weight parameter, cw is the sliding window size, x_{i:i+cw-1} is the window-sized slice of the two-dimensional vector formed from the learning behavior features and the answer result vector, br is a bias term, and fr is the activation function;
S303, applying max pooling to the features extracted by the convolutional layer and then computing the result through the softmax function of the fully connected layer, so as to extract the learning behavior feature vector influencing the answer result according to the computed probability values;
S4, splicing the learning behavior feature vector, the problem encoding vector and the answer result vector to obtain a feature set, then crossing the features in the feature set, performing feature cascading, and finally reducing the dimension of the feature set through an autoencoder to obtain an answer record vector;
S5, taking the answer record vector as input, training an LSTM-based deep knowledge tracking model, inputting one answer record vector at each time step to obtain the knowledge state hidden vector at the corresponding time, and inputting the obtained knowledge state hidden vector into a fully connected layer to obtain the student's knowledge point mastery state vector, thereby realizing personalized knowledge tracking.
2. The personalized knowledge tracking method according to claim 1, wherein in step S1 the students' learning behavior feature data and answer data are acquired, by teaching unit, from online teaching platforms including MOOC and Rain Classroom; the students' learning behavior features are counted with one teaching unit as a stage.
3. The personalized knowledge tracking method according to claim 1, wherein the step S2 comprises the steps of:
S201, cleaning the learning behavior feature data, removing student individuals for whom more than 80% of the selected features are missing, and performing simple numerical operations on some of the original learning behavior features to extract learning behavior features that better reflect the students' status;
s202, carrying out Max-Min normalization on the cleaned learning behavior feature data to keep data balance, obtaining an original learning behavior feature vector of each student, and marking the original learning behavior feature vector as F, wherein the original learning behavior feature vector is expressed as follows:
where N is the number of students, c is the number of course chapters (units), n is a positive integer, the learning behavior vector of the n-th student in the c-th unit can be written as {b_1, b_2, …, b_fk}, b_fk denotes one learning behavior feature of the n-th student in the c-th unit, and fk is the total number of statistical learning behavior features;
extracting a problem text sequence of the student and marking the problem text sequence as Q; sequencing the answering data according to the serial numbers of the students, splicing the answering records of the same student in the same unit into a record according to the sequence of the answering, and unifying the expression forms of the data by using a standardized unit, wherein the formalized expression is as follows:
where each element of Q indicates a question answered by the n-th student; the number of questions answered by each student (or contained in each unit) may differ and is denoted (t1, t2, …, tn), with tn indicating the number of questions answered by the n-th student;
extracting a problem related knowledge point sequence, and marking the sequence as K; according to the answer data and the extracted answer series of the student problems and the corresponding relation between the problems and the knowledge points, the relevant knowledge point sequence of the problems of each student answer is obtained, and formalized representation is as follows:
wherein,the n-th question contains the sn knowledge points, and the number of the knowledge points contained in each question is different and is respectively marked as (s 1, s2, …, sn);
extracting a student answer result sequence, and marking the student answer result sequence as A; according to the answer data and the extracted answer series of the student problems, an answer result sequence of each student is obtained, and formalized representation is as follows:
wherein,indicating that the answering result of the nth student on the nth problem is correct or incorrect.
4. The personalized knowledge tracking method according to claim 1, wherein the step S4 comprises the steps of:
S401, performing vector splicing on the learning behavior feature vector, the problem encoding vector and the answer result vector to obtain the feature set of each student that influences the answer result;
S402, crossing the features in the feature set that influence the answer result, performing feature cascading on this basis, and reducing the dimension of the feature vector with an autoencoder to obtain the answer record vector.
5. The personalized knowledge tracking method according to claim 1, wherein the step S5 comprises the steps of:
S501, taking the answer record vectors as input, training an LSTM-based deep knowledge tracking model, inputting one answer record vector at each time step and obtaining the knowledge state hidden vector at the corresponding time; the transition formulas of the model are as follows:
i_t = σ(W_ri r_t + U_ri h_{t-1} + b_i)
f_t = σ(W_rf r_t + U_rf h_{t-1} + b_f)
c_t = f_t * c_{t-1} + i_t * tanh(W_rc r_t + U_rc h_{t-1})
o_t = σ(W_ro r_t + U_ro h_{t-1} + b_o)
h_t = o_t * tanh(c_t)
where i_t, f_t, o_t and c_t denote the input gate, forget gate, output gate and memory cell of the LSTM respectively; h_t is the hidden vector output of the current layer; r_t is the input at time t; U_ri, U_rf, U_rc and U_ro are the weights applied to the previous hidden vector h_{t-1} in the respective gates; W_ri, W_rf, W_rc and W_ro are the input weights of the corresponding gates; b_i, b_f and b_o are the biases of the corresponding gates; and σ is the activation function;
S502, inputting the knowledge state hidden vector at the corresponding time into the fully connected layer to obtain the student's knowledge point mastery state vector K_t, expressed as:
K_t = σ(w_k o_t + b_k)
where w_k and b_k are parameters to be learned;
the cross entropy loss function L is adopted in the training process, and the formula is as follows:
L = Σ_t l(y_t^T · δ(q_{t+1}), a_{t+1})
where q_{t+1} represents the question answered by the student at time t+1, a_{t+1} indicates whether the question answered at time t+1 is correct, δ denotes the one-hot encoding after dimension reduction, l is the cross entropy function, and y_t^T is the output at time t.
6. A personalized knowledge tracking system integrating learning behavior features, characterized in that it is used for realizing the personalized knowledge tracking method integrating learning behavior features according to any one of claims 1-5, the system comprising:
a data preprocessing unit for preparing an input data set; firstly, cleaning learning behavior data, removing students with the learning behavior data missing more than 80% according to the selected learning behavior data, then carrying out Max-Min normalization on the learning behavior data to keep data balance, obtaining original learning behavior feature vectors of each student, and separating and extracting problem text sequences, problem related knowledge point sequences and answer result sequences from answer data;
the learning behavior feature extraction unit is used for extracting learning behavior features affecting the answer result; encoding the answer result sequence by using a single-heat encoding rule to obtain an answer result vector, forming a two-dimensional vector by the answer result vector and the original learning behavior feature vector, and learning by using a convolutional neural network model to obtain a learning behavior feature vector influencing the answer result;
the problem information extraction unit is used for obtaining the problem text and the information of the included knowledge points; the method comprises the steps of processing a problem text, including punctuation and text interval, removing nonsensical characters, then word segmentation, word deactivation, finally extracting a keyword sequence of the problem, namely a problem text sequence, splicing the keyword sequence representing the problem with a related knowledge point sequence of a corresponding problem to obtain a problem feature sequence, converting the obtained problem feature sequence into a bit sequence code, inputting the bit sequence code into an embedded layer, or initializing the embedded layer of the text directly through a pre-training word vector; assuming that the dimension of the embedding layer is d and the vocabulary quantity in the corpus is m, the embedding layer is randomly initialized to a matrix with the size of d x m, and the vocabulary contained in the topic can obtain a corresponding word vector, namely an embedding vector through the bit sequence index, and the word vector is input into a noise reduction self-encoder to be reconstructed to obtain a problem coding vector;
a feature dimension reduction unit for reducing the dimension of the obtained feature set: splicing the learning behavior feature vector, the problem coding vector and the answer result vector to obtain a feature set, then crossing and cascading the features in the feature set, and finally reducing the dimension of the feature set through an autoencoder to obtain the answer record vector;
a knowledge tracking training unit for training the knowledge tracking model and predicting the students' mastery of knowledge points: taking the answer record vectors as input to train an LSTM-based deep knowledge tracking model, inputting one answer record vector at each moment to obtain the knowledge state hidden vector at the corresponding moment, and inputting the obtained knowledge state hidden vector into a fully connected layer to obtain the student's knowledge point mastery state vector, thereby realizing personalized knowledge tracking.
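To make the data preprocessing unit of claim 6 concrete, here is a minimal sketch assuming the raw learning behavior records sit in a pandas DataFrame with one row per student and one column per selected behavior feature; the 80% missing-data threshold and the Max-Min normalization come from the claim, while the function name and DataFrame layout are assumptions.

```python
# Sketch of the data preprocessing unit: drop students with >80% missing
# learning behavior data, then Max-Min normalize each behavior feature.
import pandas as pd

def preprocess_behavior(behavior: pd.DataFrame) -> pd.DataFrame:
    # behavior: one row per student, one column per selected learning behavior feature.
    keep = behavior.isna().mean(axis=1) <= 0.8            # remove students >80% missing
    cleaned = behavior.loc[keep]
    return (cleaned - cleaned.min()) / (cleaned.max() - cleaned.min())  # Max-Min normalization
```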
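The learning behavior feature extraction unit can likewise be sketched as a small convolutional network over the two-channel input formed by the one-hot answer result vector and the original learning behavior feature vector; the channel layout, kernel size and output dimension below are assumptions, and both vectors are assumed to be padded to a common length.

```python
# Sketch of the learning behavior feature extraction unit: a CNN over the
# 2-D input of (answer result vector, original behavior feature vector).
import torch
import torch.nn as nn

class BehaviorCNN(nn.Module):
    def __init__(self, feature_len, out_dim):
        super().__init__()
        self.conv = nn.Conv1d(in_channels=2, out_channels=8, kernel_size=3, padding=1)
        self.fc = nn.Linear(8 * feature_len, out_dim)

    def forward(self, x):             # x: (batch, 2, feature_len) stacked vectors
        z = torch.relu(self.conv(x))
        return self.fc(z.flatten(1))  # learning behavior feature vector influencing the answer
```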
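Finally, the problem information extraction and feature dimension reduction units can be sketched as an embedding lookup followed by a denoising autoencoder, and a plain autoencoder that compresses the concatenated feature set into the answer record vector fed to the LSTM model above. The mean pooling of word vectors, the noise level, all layer sizes, and the omission of the explicit feature crossing step are assumptions made only to keep the sketch short.

```python
# Sketch of the problem information extraction and feature dimension reduction units.
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Denoising autoencoder when noise > 0, plain autoencoder otherwise."""
    def __init__(self, in_dim, code_dim, noise=0.1):
        super().__init__()
        self.noise = noise
        self.encoder = nn.Linear(in_dim, code_dim)
        self.decoder = nn.Linear(code_dim, in_dim)

    def forward(self, x):
        corrupted = x + self.noise * torch.randn_like(x) if self.training else x
        code = torch.relu(self.encoder(corrupted))
        return code, self.decoder(code)          # (coding vector, reconstruction)

d, m = 64, 5000                                  # embedding dimension d, vocabulary size m (assumed)
embedding = nn.Embedding(m, d)                   # randomly initialized d x m embedding layer
word_ids = torch.tensor([[12, 7, 391, 40]])      # index codes of one problem feature sequence (illustrative)
word_vec = embedding(word_ids).mean(dim=1)       # simple mean pooling of the word vectors (assumption)
problem_vec, _ = Autoencoder(d, 32)(word_vec)    # problem coding vector

behavior_vec = torch.randn(1, 16)                # from the learning behavior feature extraction unit
answer_vec = torch.tensor([[0.0, 1.0]])          # one-hot answer result vector
feature_set = torch.cat([behavior_vec, problem_vec, answer_vec], dim=1)
reducer = Autoencoder(feature_set.size(1), 20, noise=0.0)
answer_record_vec, _ = reducer(feature_set)      # input at one time step of the LSTM model
```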
CN202110928810.3A 2021-08-13 2021-08-13 Personalized knowledge tracking method and system integrating learning behavior characteristics Active CN113793239B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110928810.3A CN113793239B (en) 2021-08-13 2021-08-13 Personalized knowledge tracking method and system integrating learning behavior characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110928810.3A CN113793239B (en) 2021-08-13 2021-08-13 Personalized knowledge tracking method and system integrating learning behavior characteristics

Publications (2)

Publication Number Publication Date
CN113793239A CN113793239A (en) 2021-12-14
CN113793239B true CN113793239B (en) 2023-12-19

Family

ID=79181650

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110928810.3A Active CN113793239B (en) 2021-08-13 2021-08-13 Personalized knowledge tracking method and system integrating learning behavior characteristics

Country Status (1)

Country Link
CN (1) CN113793239B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114238613B (en) * 2022-02-22 2022-05-27 北京一起航帆科技有限公司 Method and device for determining mastery degree of knowledge points and electronic equipment
CN114781710B (en) * 2022-04-12 2022-12-23 云南师范大学 Knowledge tracking method for difficulty characteristics of comprehensive learning process and question knowledge points
CN114971066A (en) * 2022-06-16 2022-08-30 兰州理工大学 Knowledge tracking method and system integrating forgetting factor and learning ability
CN116127048B (en) * 2023-04-04 2023-06-27 江西师范大学 Sequential self-attention knowledge tracking model integrating exercises and learning behavior characterization
CN117291775B (en) * 2023-11-27 2024-03-01 山东多科科技有限公司 Depth knowledge tracking accurate teaching method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110428010A (en) * 2019-08-05 2019-11-08 中国科学技术大学 Knowledge method for tracing
CN111831831A (en) * 2020-07-17 2020-10-27 广东金融学院 Knowledge graph-based personalized learning platform and construction method thereof
CN112182308A (en) * 2020-09-29 2021-01-05 华中师范大学 Multi-feature fusion depth knowledge tracking method and system based on multi-thermal coding
CN112800323A (en) * 2021-01-13 2021-05-14 中国科学技术大学 Intelligent teaching system based on deep learning
CN112990464A (en) * 2021-03-12 2021-06-18 东北师范大学 Knowledge tracking method and system
CN112949935A (en) * 2021-03-26 2021-06-11 华中师范大学 Knowledge tracking method and system fusing student knowledge point question interaction information

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A Personalized Exercise Recommendation Method Combining Deep Knowledge Tracing; Ma Xiaorui; Xu Yuan; Zhu Qunxiong; Journal of Chinese Computer Systems (Issue 05); pp. 96-101 *

Also Published As

Publication number Publication date
CN113793239A (en) 2021-12-14

Similar Documents

Publication Publication Date Title
CN113793239B (en) Personalized knowledge tracking method and system integrating learning behavior characteristics
CN110379225B (en) System and method for interactive language acquisition
CN107608943B (en) Image subtitle generating method and system fusing visual attention and semantic attention
CN110717431A (en) Fine-grained visual question and answer method combined with multi-view attention mechanism
CN111695779A (en) Knowledge tracking method, knowledge tracking device and storage medium
CN108257052B (en) Online student knowledge assessment method and system
CN111159419B (en) Knowledge tracking data processing method, system and storage medium based on graph convolution
CN111291940B (en) Student class dropping prediction method based on Attention deep learning model
Suresh et al. Automating analysis and feedback to improve mathematics teachers’ classroom discourse
CN108228674B (en) DKT-based information processing method and device
CN112529155B (en) Dynamic knowledge mastering modeling method, modeling system, storage medium and processing terminal
CN110704510A (en) User portrait combined question recommendation method and system
CN111444432A (en) Domain-adaptive deep knowledge tracking and personalized exercise recommendation method
CN112116137A (en) Student class dropping prediction method based on mixed deep neural network
CN113569001A (en) Text processing method and device, computer equipment and computer readable storage medium
CN113408852B (en) Meta-cognition ability evaluation model based on online learning behavior and deep neural network
CN113610235A (en) Adaptive learning support device and method based on deep knowledge tracking
CN114254127A (en) Student ability portrayal method and learning resource recommendation method and device
CN114881331A (en) Learner abnormal learning state prediction method facing online education
CN114861754A (en) Knowledge tracking method and system based on external attention mechanism
Cai Automatic essay scoring with recurrent neural network
CN115080715A (en) Span extraction reading understanding method based on residual error structure and bidirectional fusion attention
CN113283488B (en) Learning behavior-based cognitive diagnosis method and system
CN112949935B (en) Knowledge tracking method and system fusing student knowledge point question interaction information
CN113378581A (en) Knowledge tracking method and system based on multivariate concept attention model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant