CN113792626A - Teaching process evaluation method based on teacher non-verbal behaviors - Google Patents

Teaching process evaluation method based on teacher non-verbal behaviors

Info

Publication number
CN113792626A
Authority
CN
China
Prior art keywords
teacher
behavior
nonverbal
feature
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111002386.6A
Other languages
Chinese (zh)
Inventor
刘海
张昭理
李林峰
赵万里
张胜强
时振武
童宇航
吴远芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Central China Normal University
Original Assignee
Central China Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Central China Normal University filed Critical Central China Normal University
Priority to CN202111002386.6A priority Critical patent/CN113792626A/en
Publication of CN113792626A publication Critical patent/CN113792626A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a teaching process evaluation method based on teacher nonverbal behaviors, comprising the following steps: 1) nonverbal behavior data acquisition: collect the nonverbal behavior data of teachers in the teaching classroom; 2) quantify the nonverbal behaviors in the teacher's teaching comprehensively by means of a behavior feature matrix; 3) train a behavior feature classifier: train a classifier with emotion labels to obtain the corresponding teacher personality feature representation and classroom nonverbal behavior feature representation; 4) from the nonverbal behavior data collected for a specific teacher, perform feature characterization and feature decomposition to obtain the teacher's feature representation and the classroom behavior feature representation, and multiply their feature vectors to obtain the teacher's teaching evaluation expression. Through three modalities of data (emotion, posture and physiological signals), the invention establishes a comprehensive and accurate depiction of teacher classroom nonverbal behavior and a quantitative evaluation framework for the teaching process based on teacher nonverbal behaviors.

Description

Teaching process evaluation method based on teacher non-verbal behaviors
Technical Field
The invention relates to a teaching evaluation informatization technology, in particular to a teaching process evaluation method based on teacher nonverbal behaviors.
Background
Traditional classroom evaluation relies on classroom observation, which is time-consuming, labor-intensive and insufficiently objective. A teacher's classroom nonverbal behaviors embody the teacher's level of teaching artistry and are a necessary complement to the teacher's classroom verbal behaviors; they set a typical example for students and influence students' communication skills and their comprehension of the information the teacher conveys, and therefore ultimately affect teaching quality. Studying teachers' classroom nonverbal behaviors thus plays a key role in comprehensively improving teaching quality.
In the past, people evaluated teachers' professional literacy from the perspective of verbal behaviors, such as language-use skills and teaching-organization skills. Traditional understandings of nonverbal behaviors lack a full description of teaching effect, so the interpretability of quantifying teachers' classroom nonverbal behaviors cannot be guaranteed. Moreover, individual differences in behavior patterns are large, and the existing approaches are inefficient, small in scale, strongly subjective, lack scientific judging standards, and are insufficiently intelligent overall. Traditional schemes rely on outcome evaluation via questionnaires, whereas a teacher's nonverbal behavior runs through the whole teaching process, and its unique role in teaching practice is valued by many excellent teachers. Teaching activities are cognitive processes in special situations; a teacher's nonverbal behaviors can invoke factors such as students' motivation, willingness, attitude and interest, factors that do not directly participate in the cognitive process but promote and regulate cognitive acquisition. In the current era of deep integration of information technology and education theory, establishing digital teaching scenarios and effectively quantifying teachers' nonverbal behaviors with artificial intelligence technology is a key link in realizing the innovation and reform of teaching modes.
Evaluating the classroom teaching process by quantitatively computing teacher nonverbal behaviors in an intelligent teaching environment with big data technology has the characteristics of objectivity, continuity and iterability, and breaks through the traditional scale-based evaluation form.
Disclosure of Invention
The invention aims to solve the technical problem of providing a teaching process evaluation method based on teacher non-verbal behaviors aiming at the defects in the prior art.
The technical scheme adopted by the invention for solving the technical problems is as follows: a teaching process evaluation method based on teacher nonverbal behaviors comprises the following steps:
1) nonverbal behavior data acquisition: obtain the nonverbal behavior data of teachers in the teaching classroom, including three modalities of data: emotion, posture and physiological signals; the emotion modality data are the teacher's facial expression images and the corresponding gesture-action pictures under each facial expression; the posture modality data are pictures of the teacher's head-turning posture; the physiological signals are the teacher's electroencephalogram (EEG) data;
2) quantify the nonverbal behaviors in the teacher's teaching comprehensively by means of a behavior feature matrix;
2.1) data preprocessing, including the handling of noisy data, redundant data and missing values;
2.2) characterizing and quantifying non-verbal behaviors in teacher teaching;
2.2.1) feature representation of the teacher's facial expressions and gesture actions: input the emotion modality data, i.e. the facial expression images and gesture actions, into a LeNet-5 neural network for feature extraction; the features output by the last layer are taken as the emotion feature vector;
the LeNet-5 neural network training method comprises the following steps:
First, calculate the similarity between different teachers' facial expressions and gesture actions with the cosine similarity formula; second, rank the emotion similarities obtained by fusing the facial-expression and gesture-action similarities according to a Hump ranking algorithm, fit the ranking result with a Student-t distribution function to obtain the true label distribution of each teaching emotion, and after several iterative optimizations obtain the optimal solution, i.e. the teacher's facial emotion result. The Student-t distribution function used for expression-label construction is given by formula (1), where Γ(·) denotes the gamma function and ν the degrees of freedom:

f(x) = [Γ((ν+1)/2) / (√(νπ) · Γ(ν/2))] · (1 + x²/ν)^(−(ν+1)/2)    (1)
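As a minimal sketch, the Student-t density of formula (1) can be evaluated directly in Python (the function name is ours; the Hump ranking and the iterative label fitting are not shown):

```python
import math

def student_t_pdf(x, nu):
    """Student-t density of formula (1):
    Gamma((nu+1)/2) / (sqrt(nu*pi)*Gamma(nu/2)) * (1 + x^2/nu)^(-(nu+1)/2)."""
    coeff = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return coeff * (1 + x * x / nu) ** (-(nu + 1) / 2)

# Sanity check: with nu = 1 the density reduces to the Cauchy density 1/(pi*(1+x^2)).
print(student_t_pdf(0.0, 1))  # ≈ 0.3183, i.e. 1/pi
```

Heavier tails (small ν) spread label mass over more ranked emotions; larger ν approaches a Gaussian-shaped label distribution.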
2.2.2) extract the teacher's head-turning posture images, take them as input, and extract features with an EfficientNet network;
the method for training the EfficientNet network comprises the following steps:
Calculate the triplet pose-aware loss and the von Mises-Fisher distribution loss. The triplet pose-aware loss is produced by computing the similarity between different features: first define a similarity measure D (formula (2)), then obtain the perception loss L_t as formula (3):

L_t(x_n, x_a, x_p) = max(0, D(x_a, x_n) − D(x_a, x_p))    (3)

so that the loss is positive when the anchor sample x_a is more similar to the dissimilar-pose sample x_n than to the similar-pose sample x_p;
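The triplet pose-aware loss above can be sketched as follows, with cosine similarity standing in for the measure D (an assumption made for illustration, since formula (2) is not reproduced in this copy; all names are ours):

```python
import numpy as np

def cosine_sim(u, v):
    # Stand-in for the similarity measure D of formula (2) (assumed cosine).
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def triplet_pose_loss(x_a, x_n, x_p, sim=cosine_sim):
    """L_t = max(0, D(x_a, x_n) - D(x_a, x_p)): penalize the network when
    the anchor is more similar to the dissimilar-pose sample x_n than to
    the similar-pose sample x_p."""
    return max(0.0, sim(x_a, x_n) - sim(x_a, x_p))

anchor   = np.array([1.0, 0.0, 0.0])
positive = np.array([0.9, 0.1, 0.0])   # nearly the same head pose
negative = np.array([0.0, 1.0, 0.0])   # very different head pose
print(triplet_pose_loss(anchor, negative, positive))  # 0.0, triplet well separated
```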
For the distribution loss, the respective features are input to the distribution module. In the von Mises-Fisher distribution module, first feed the obtained features into a fully connected layer and output an unconstrained matrix M, from which the von Mises-Fisher distribution is constructed; then calculate the distribution loss with the rotation matrix R that represents the teacher's head pose;
Extract the image of the teacher's face, set the width and height of the image to w and h respectively, obtain the coordinates B1 and B2 of the centres of the left and right pupils, and calculate the pitch and yaw angles of the teacher's left and right eyes. Adopt the mean-square error and the KL-divergence loss as the loss of the network model, and improve the accuracy of the left- and right-eye pitch and yaw angles through multiple iterations. The learning label is constructed in the form of a Gaussian distribution over the pitch and yaw angles of the eyeball, where x denotes a face image of the teacher in class.
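One possible sketch of the Gaussian learning label is a discretized Gaussian over candidate pitch (or yaw) angles centred on the measured angle. The grid, the width sigma and the function name are illustrative assumptions, not the patent's exact construction:

```python
import numpy as np

def gaussian_gaze_labels(true_angle_deg, grid_deg, sigma=3.0):
    """Soft label distribution: a Gaussian over candidate gaze angles,
    peaked at the measured pitch (or yaw) angle."""
    g = np.exp(-((grid_deg - true_angle_deg) ** 2) / (2 * sigma ** 2))
    return g / g.sum()   # normalize to a probability distribution

grid = np.arange(-30.0, 31.0, 1.0)        # candidate angles in degrees
labels = gaussian_gaze_labels(7.0, grid)  # measured yaw of 7 degrees
print(grid[np.argmax(labels)])            # 7.0: the peak sits on the true angle
```

Training against such soft labels with a KL-divergence loss, as the text describes, tolerates small annotation errors better than a one-hot angle label.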
2.2.3) divide the acquired multichannel electroencephalogram signals into several segments, extract time-domain, frequency-domain and nonlinear-dynamics features from each segment of the signal, and construct a feature sequence;
Construct an emotion recognition model based on a long short-term memory (LSTM) network and EEG. First, preprocess the electroencephalogram signal to obtain efficient and robust EEG data; then extract various features from the preprocessed EEG data and construct feature sequences; finally, train a classifier with the feature sequences and the corresponding emotion labels.
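The per-segment EEG feature extraction can be sketched as below: time-domain statistics plus band power from an FFT. The sampling rate, band limits and feature choice are illustrative assumptions:

```python
import numpy as np

FS = 128  # assumed sampling rate, Hz

def band_power(seg, lo, hi, fs=FS):
    # Mean spectral power of the segment within the [lo, hi) Hz band.
    freqs = np.fft.rfftfreq(len(seg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(seg)) ** 2 / len(seg)
    mask = (freqs >= lo) & (freqs < hi)
    return float(psd[mask].mean())

def eeg_segment_features(seg):
    """Time-domain plus frequency-domain features for one EEG segment."""
    return np.array([
        seg.mean(), seg.std(),       # time domain
        band_power(seg, 4, 8),       # theta band
        band_power(seg, 8, 13),      # alpha band
        band_power(seg, 13, 30),     # beta band
    ])

rng = np.random.default_rng(0)
signal = rng.standard_normal(FS * 4)              # 4 s of one channel
segments = signal.reshape(4, FS)                  # split into 1 s segments
feature_seq = np.stack([eeg_segment_features(s) for s in segments])
print(feature_seq.shape)  # (4, 5): a feature sequence for the LSTM classifier
```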
2.3) form a real-time and accurate behavior feature matrix for each teacher from the completed feature vectors; in the behavior feature matrix H_ij, i denotes the teacher's number and j the feature representation of the three modalities of data;
After the teacher's nonverbal behaviors have been characterized and quantified in step 2.2), the teacher's behavior representation vectors are obtained: facial emotion E = <e_1, e_2, …, e_t>, posture B = <b_1, b_2, …, b_t> and physiology P = <p_1, p_2, …, p_t>.
First, through a multi-channel attention mechanism, learn a behavior weight W_E, W_B and W_P for each behavior vector and multiply it with that vector; input the result into the frame attention model and learn the frame-sequence weights within each behavior vector, such as W_{e,1}, W_{e,2}, W_{e,3}, and so on. After the behavior states at the different time steps have been weighted, input them into an LSTM module for learning; the LSTM layer at time t aggregates and learns from the previous layer's memory h_{t-1}. For example, in the LSTM layer for facial emotion, the output at time t is h_t = σ(W_0 · W_{e,t} · e_t + U_0 · h_{t-1} + b_0). Finally, perform feature compression coding with the Hadamard product to obtain the final behavior feature matrix H_ij = (E_ij, B_ij, P_ij).
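The attention-weighted recurrence for one channel (facial emotion) can be sketched as follows; this collapses the full LSTM gates into the single quoted update h_t = σ(W_0 · W_{e,t} · e_t + U_0 · h_{t-1} + b_0), with toy random weights standing in for learned ones:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def attended_recurrence(E, frame_w, W0, U0, b0):
    """Run h_t = sigmoid(W0 @ (frame_w[t] * E[t]) + U0 @ h_{t-1} + b0)
    over the frame sequence E and return the final hidden state."""
    h = np.zeros(U0.shape[0])
    for t, e_t in enumerate(E):
        h = sigmoid(W0 @ (frame_w[t] * e_t) + U0 @ h + b0)
    return h

rng = np.random.default_rng(1)
T, d_in, d_hid = 6, 4, 3
E = rng.standard_normal((T, d_in))   # emotion feature vectors e_1 .. e_T
frame_w = np.full(T, 1.0 / T)        # frame-attention weights W_{e,t}
W0 = rng.standard_normal((d_hid, d_in))
U0 = rng.standard_normal((d_hid, d_hid))
b0 = np.zeros(d_hid)
h_T = attended_recurrence(E, frame_w, W0, U0, b0)
print(h_T.shape)  # (3,): one channel's contribution to the behavior matrix H_ij
```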
3) Train a behavior feature classifier: train a classifier with the emotion labels to obtain the corresponding teacher personality feature representation and the corresponding classroom nonverbal behavior feature representation;
3.1) construct a time-recursion-based neural network over the emotion, posture and physiological indices to perform matrix feature decomposition, obtaining the corresponding teacher personality feature representation and the corresponding classroom behavior feature representation;
Using non-negative matrix factorization (NMF), decompose the teacher behavior feature matrix H_ij into the teacher personality feature representation θ_{i*} and the classroom nonverbal behavior feature vector β_{j*}.
Taking the minimum of

Σ_{i=1}^{N} Σ_{j=1}^{M} a_i (H_ij − θ_{i*} β_{j*}^T)²

as the objective function, calculate and determine the teacher personality feature representation θ_{i*} and the classroom nonverbal behavior feature vector β_{j*}, where the superscript T denotes transposition, N is the total number of teachers, M is the number of hidden classroom-behavior features obtained by the matrix decomposition, and a_i is the learned self-attention weight;
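The NMF step can be sketched with the classic multiplicative update rules; for brevity this uses the plain reconstruction error without the self-attention weights a_i, and the names are ours:

```python
import numpy as np

def nmf(H, k, iters=200, eps=1e-9, seed=0):
    """Factor a non-negative matrix H (teachers x behavior features) into
    Theta (N x k) and Beta (M x k) so that H is approximated by Theta @ Beta.T,
    using Lee-Seung multiplicative updates (all factors stay non-negative)."""
    rng = np.random.default_rng(seed)
    N, M = H.shape
    Theta = rng.random((N, k))
    Beta = rng.random((M, k))
    for _ in range(iters):
        Theta *= (H @ Beta) / (Theta @ Beta.T @ Beta + eps)
        Beta *= (H.T @ Theta) / (Beta @ Theta.T @ Theta + eps)
    return Theta, Beta

H = np.abs(np.random.default_rng(2).random((8, 6)))  # toy behavior matrix
Theta, Beta = nmf(H, k=2)
err = np.linalg.norm(H - Theta @ Beta.T)
print(err < np.linalg.norm(H))  # True: the factorization reduces the residual
```

Each row of Theta plays the role of a teacher personality representation θ_{i*}; each row of Beta is a classroom behavior feature vector β_{j*}.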
3.2) construct the behavior feature matrix H_ij together with the learning labels into a Gaussian distribution form, and finally train the classifier with the feature sequences and the corresponding emotion labels;
4) according to the nonverbal behavior data collected for a specific teacher, perform feature characterization and feature decomposition to obtain the teacher's feature representation and the classroom behavior feature representation, and multiply their feature vectors to obtain the teacher's teaching evaluation expression.
4.1) with the teacher feature vector obtained by characterizing the collected nonverbal behavior data of the specific teacher, pass the teacher personality feature representation obtained in step 3.1) through a recurrent neural network (RNN) and apply a direct aggregation operation to obtain the feature representation θ_i corresponding to the specific teacher;
4.2) with the teacher feature vector obtained by characterizing the collected nonverbal behavior data of the specific teacher, apply a direct aggregation operation to the classroom nonverbal behavior representation obtained in step 3.1) to obtain the specific teacher's nonverbal behavior representation β_j;
4.3) multiply the specific teacher's feature representation vector with the nonverbal behavior representation vector to obtain the teacher's teaching evaluation expression.
The invention has the following beneficial effects:
1. Starting from three modalities of data (emotion, posture and physiological signals), the invention establishes a comprehensive and accurate nonverbal characterization of teacher classroom behavior, and establishes a quantitative evaluation framework that fuses multimodal data and combines the macro and the micro;
2. The invention introduces a time-series recurrent neural network for teaching behavior perception, identifies classroom teaching scenarios and fuses their data, thereby realizing the perception and quantification of teachers' nonverbal behaviors and providing refined data support for teaching-effect evaluation;
drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flow chart of a method of an embodiment of the present invention;
FIG. 2 is a construction of a temporal recurrent neural network of an embodiment of the present invention;
FIG. 3 is a schematic diagram of a temporal recurrent neural network training in accordance with an embodiment of the present invention;
FIG. 4 is a flow chart of training the hidden factor model according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, a teaching process evaluation method based on teacher nonverbal behavior includes the following steps:
Step 1: extract the three modalities of data in the nonverbal behaviors.
Obtain the teacher's nonverbal behavior data in the teaching classroom through multi-camera devices and smart wearable equipment, including three modalities of data: emotion, posture and physiological signals; the emotion modality data are the teacher's facial expression images and the corresponding gesture-action pictures under each facial expression; the posture modality data are pictures of the teacher's head-turning posture; the physiological signals are the teacher's electroencephalogram (EEG) data.
Step 2: extract the features of the emotion, posture and physiological signal data.
Delete irrelevant data and duplicate data from the original data set, smooth the noisy data, filter out data unrelated to the mining topic, and handle missing and abnormal values. Then put the completed feature vectors into a multi-label network learning module to form a real-time and accurate behavior feature matrix for each teacher.
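The cleaning steps above can be sketched in numpy as follows (the window size and the helper name are illustrative assumptions):

```python
import numpy as np

def preprocess(series, window=3):
    """Handle missing values, then smooth noise with a moving average."""
    x = np.asarray(series, dtype=float)
    # 1) fill missing values (NaN) by linear interpolation
    idx = np.arange(len(x))
    good = ~np.isnan(x)
    x = np.interp(idx, idx[good], x[good])
    # 2) damp noisy data with a centred moving average
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

raw = [1.0, np.nan, 3.0, 100.0, 5.0, 6.0]   # one noisy feature channel
clean = preprocess(raw)
print(clean.shape)  # (6,): same length, gap filled and spike damped
```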
The specific steps are as follows: 2.1) data preprocessing, including the handling of noisy data, redundant data and missing values;
2.2) characterizing and quantifying non-verbal behaviors in teacher teaching;
2.2.1) feature representation of the teacher's facial expressions and gesture actions: input the emotion modality data, i.e. the facial expression images and gesture actions, into a LeNet-5 neural network for feature extraction; the features output by the last layer are taken as the emotion feature vector;
the LeNet-5 neural network training method comprises the following steps:
First, calculate the similarity between different teachers' facial expressions and gesture actions with the cosine similarity formula; second, rank the emotion similarities obtained by fusing the facial-expression and gesture-action similarities according to a Hump ranking algorithm, fit the ranking result with a Student-t distribution function to obtain the true label distribution of each teaching emotion, and after several iterative optimizations obtain the optimal solution, i.e. the teacher's facial emotion result. The Student-t distribution function used for expression-label construction is given by formula (1), where Γ(·) denotes the gamma function and ν the degrees of freedom:

f(x) = [Γ((ν+1)/2) / (√(νπ) · Γ(ν/2))] · (1 + x²/ν)^(−(ν+1)/2)    (1)
2.2.2) extract the teacher's head-turning posture images, take them as input, and extract features with an EfficientNet network;
the method for training the EfficientNet network comprises the following steps:
Calculate the triplet pose-aware loss and the von Mises-Fisher distribution loss. The triplet pose-aware loss is produced by computing the similarity between different features: first define a similarity measure D (formula (2)), then obtain the perception loss L_t as formula (3):

L_t(x_n, x_a, x_p) = max(0, D(x_a, x_n) − D(x_a, x_p))    (3)

so that the loss is positive when the anchor sample x_a is more similar to the dissimilar-pose sample x_n than to the similar-pose sample x_p;
For the distribution loss, the respective features are input to the distribution module. In the von Mises-Fisher distribution module, first feed the obtained features into a fully connected layer and output an unconstrained matrix M, from which the von Mises-Fisher distribution is constructed; then calculate the distribution loss with the rotation matrix R that represents the teacher's head pose;
Extract the image of the teacher's face, set the width and height of the image to w and h respectively, obtain the coordinates B1 and B2 of the centres of the left and right pupils, and calculate the pitch and yaw angles of the teacher's left and right eyes. Adopt the mean-square error and the KL-divergence loss as the loss of the network model, and improve the accuracy of the left- and right-eye pitch and yaw angles through multiple iterations. The learning label is constructed in the form of a Gaussian distribution over the pitch and yaw angles of the eyeball, where x denotes a face image of the teacher in class.
2.2.3) divide the acquired multichannel electroencephalogram signals into several segments, extract time-domain, frequency-domain and nonlinear-dynamics features from each segment of the signal, and construct a feature sequence;
Construct an emotion recognition model based on a long short-term memory (LSTM) network and EEG. First, preprocess the electroencephalogram signal to obtain efficient and robust EEG data; then extract various features from the preprocessed EEG data and construct feature sequences; finally, train a classifier with the feature sequences and the corresponding emotion labels.
2.3) form a real-time and accurate behavior feature matrix for each teacher from the completed feature vectors; in the behavior feature matrix H_ij, i denotes the teacher's number and j the feature representation of the three modalities of data;
After the teacher's nonverbal behaviors have been characterized and quantified in step 2.2), the teacher's behavior representation vectors are obtained: facial emotion E = <e_1, e_2, …, e_t>, posture B = <b_1, b_2, …, b_t> and physiology P = <p_1, p_2, …, p_t>.
First, through a multi-channel attention mechanism, learn a behavior weight W_E, W_B and W_P for each behavior vector and multiply it with that vector; input the result into the frame attention model and learn the frame-sequence weights within each behavior vector, such as W_{e,1}, W_{e,2}, W_{e,3}, and so on. After the behavior states at the different time steps have been weighted, input them into an LSTM module for learning; the LSTM layer at time t aggregates and learns from the previous layer's memory h_{t-1}. For example, in the LSTM layer for facial emotion, the output at time t is h_t = σ(W_0 · W_{e,t} · e_t + U_0 · h_{t-1} + b_0). Finally, perform feature compression coding with the Hadamard product to obtain the final behavior feature matrix H_ij = (E_ij, B_ij, P_ij).
The behavior feature matrix of each teacher needs to be trained separately, and the training modality data are generated from large-scale integrated data. For the collaborative network learning module, the model can be trained with either offline learning or online learning.
If offline learning is selected, a canonical-correlated autoencoder (C2AE) is adopted for the collaborative network learning module (see: Yeh C K, Wu W C, Ko W J, et al.). Unlike most label-embedding-based methods, which generally treat label embedding and prediction as two separate tasks, C2AE is an autoencoder that performs deep canonical correlation analysis (DCCA) to learn a feature-aware latent subspace for label embedding and collaborative classification of each modality's information.
If online learning is selected, modality data acquisition follows the characteristic that the data arrive as an online time series and each datum occurs only once. On this basis, the training method that can be selected is cost-sensitive dynamic principal projection (CS-DPP) (see: Chu H M, Huang K H, Lin H T. Dynamic principal projection for cost-sensitive online multi-label classification [J]. Machine Learning, 2019, 108(8): 1193-). Its basis is an online framework derived from a leading label-space dimension reduction (LSDR) algorithm. In particular, CS-DPP is equipped with an efficient online dimensionality reducer motivated by matrix stochastic gradient, combined with a carefully designed online regression learner that establishes the theoretical basis of the online dimensionality reducer. In addition, CS-DPP embeds cost information into the label weights.
Step 3: fuse the matrix-factorization features under a recurrent (time-recursive) neural network, and train the classifier with the emotion labels.
According to the important characteristic that a teacher's classroom nonverbal behaviors are observable and decomposable, construct a time-recursion-based neural-network matrix feature decomposition from the emotion, posture and physiological indices. Specifically, through non-negative matrix factorization (NMF), the nonverbal behavior information conveyed by the teacher's individual body posture and physiological signs yields the corresponding teacher personality feature representation and the corresponding classroom behavior feature representation. On this basis, the feature sequences and the learning labels are constructed into a Gaussian distribution form, and finally the classifier is trained with the feature sequences and the corresponding emotion labels.
3.1) construct a time-recursion-based neural network over the emotion, posture and physiological indices to perform matrix feature decomposition, obtaining the corresponding teacher personality feature representation and the corresponding classroom behavior feature representation;
Using non-negative matrix factorization (NMF), decompose the teacher behavior feature matrix H_ij into the teacher personality feature representation θ_{i*} and the corresponding classroom behavior representation β_{j*}.
Taking the minimum of

Σ_{i=1}^{N} Σ_{j=1}^{M} a_i (H_ij − θ_{i*} β_{j*}^T)²

as the objective function, calculate and determine the teacher feature vector θ_{i*} and the classroom nonverbal behavior feature vector β_{j*}, where the superscript T denotes transposition, N is the total number of teachers, M is the number of hidden classroom-behavior features obtained by the matrix decomposition, and a_i is the self-attention weight learned by the time-recursion-based neural network;
3.2) construct the behavior feature matrix H_ij together with the learning labels into a Gaussian distribution form, and finally train the classifier with the feature sequences and the corresponding emotion labels;
Step 4: complete the teacher teaching evaluation representation.
Collect the nonverbal behavior data of a specific teacher and perform feature characterization and feature decomposition to obtain the teacher's feature representation θ_{i*} and the classroom behavior representation β_{j*}; multiply the two feature vectors to obtain the teacher's teaching evaluation expression. The specific steps are as follows:
Pass the teacher feature vector obtained by initialization and the teacher personality feature representation obtained in the previous step through a recurrent (time-recursive) neural network (RNN), whose structure is shown in FIG. 2, and use a direct aggregation operation to obtain the feature representation θ_i corresponding to the specific teacher.
Directly aggregate the teacher behavior feature vector obtained by initialization with the classroom nonverbal behavior feature vector obtained above to obtain the specific teacher's nonverbal behavior representation β_j.
Finally, multiply the specific teacher's feature representation vector with the nonverbal behavior representation vector to obtain the teacher's teaching evaluation expression. Taking a utility function as the objective, determine the teacher feature vector θ_{i*} and the classroom nonverbal behavior feature vector β_{j*} by gradient iterative optimization calculation.
The specific solving process is shown in fig. 3 and 4.
The number of columns of the teacher feature representation vector θ_{i*} equals the number of rows of the nonverbal behavior representation vector β_{j*}. The specific size can be determined by the number of training samples: the larger the number of training samples, the larger the number of columns and rows, and conversely the smaller; it can be adjusted according to the recommendation results.
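The dimension constraint just described, that the number of columns of θ_{i*} match the number of rows of β_{j*}, is exactly the condition for the final matrix product; a minimal sketch with toy values (the numbers are illustrative, not trained representations):

```python
import numpy as np

def teaching_evaluation(theta_i, beta_j):
    """Teaching evaluation expression: the product of the teacher feature
    representation (1 x k) and the classroom nonverbal behavior
    representation (k x 1)."""
    assert theta_i.shape[1] == beta_j.shape[0], "columns of theta must equal rows of beta"
    return float(theta_i @ beta_j)

theta_i = np.array([[0.2, 0.5, 0.1, 0.7]])       # teacher feature row vector, k = 4
beta_j = np.array([[0.3], [0.4], [0.9], [0.1]])  # classroom behavior column vector
score = teaching_evaluation(theta_i, beta_j)
print(round(score, 2))  # 0.42
```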
In specific experiments with the present invention, the lesson is divided by time slice into closely connected micro-segments, within which the motion can be considered stationary, so that each segment can be observed, judged and recorded in a specific manner. A classroom teaching evaluation method assisted by student evaluation and classroom investigation is adopted, and the teacher nonverbal evaluation mechanism is fine-tuned to fit the teacher's individual teaching style and taken as the experimental group's teaching feedback mode; statistical calculation is performed on a semester of data collected in the experimental scenario to obtain the final quantitative feedback result on the teacher's nonverbal behaviors and a practical case analysis.
The experiment is carried out in two stages. In the first stage, before the experiment, the individualized-teaching functions and parameters of the teacher nonverbal behavior evaluation system are trialled and tuned: teachers of several subjects and grade levels are organized to try the system, and the tuning parameters of the key technologies in the different modules are adjusted dynamically, forming initialization parameters that cover the teaching styles and the teacher dashboard display. In the second stage, a classroom teaching evaluation method is constructed that takes teacher nonverbal behavior as the primary criterion, assisted by student evaluation and classroom observation; the parameters of the experimental group's teacher nonverbal evaluation system are fine-tuned to fit each teacher's individual teaching style and used as the experimental group's teaching feedback mode. Finally, through an evaluation mode combining whole-semester teaching with single-session classroom teaching, the before-and-after differences between the two groups' evaluation results are compared and analyzed to obtain the influence of adaptive teaching feedback. On the basis of characterizing teacher nonverbal behavior, the performance of different nonverbal behaviors over time is explored, and the teacher's teaching effect is analyzed and summarized.
It will be understood that modifications and variations can be made by persons skilled in the art in light of the above teachings and all such modifications and variations are intended to be included within the scope of the invention as defined in the appended claims.

Claims (5)

1. A teaching process evaluation method based on teacher nonverbal behaviors is characterized by comprising the following steps:
1) nonverbal behavior data acquisition: obtain the teacher's nonverbal behavior data in the teaching classroom, comprising three modalities of data: emotion, posture and physiological signals; the emotion modal data are the teacher's facial expression images and the corresponding gesture pictures under each facial expression; the posture modal data are head-turning posture pictures of the teacher; the physiological signals are electroencephalogram (EEG) data of the teacher;
2) carry out all-round characterization and quantification of the nonverbal behaviors in the teacher's teaching by using a behavior feature matrix;
2.1) data preprocessing, including the handling of noisy data, redundant data and missing values;
2.2) characterizing and quantifying non-verbal behaviors in teacher teaching;
2.2.1) feature representation of the teacher's facial expressions and gesture movements: input the emotion modal data, i.e. the facial expression images and gesture movements, into a LeNet-5 neural network for feature extraction; the features output by the last layer are taken as the emotion feature vector;
2.2.2) extract the teacher's head-turning posture images, take them as input, and extract features with an EfficientNet network;
2.2.3) divide the acquired multichannel EEG signals into several segments, extract time-domain, frequency-domain and nonlinear-dynamics features from each segment, construct the expression labels with a Student-t distribution function, and build a feature sequence;
2.3) form a real-time, accurate behavior feature matrix for each teacher from the characterized feature vectors; in the behavior characterization matrix Hij, i denotes the teacher number and j denotes the feature representation of the three modal data;
after the teacher's nonverbal behaviors have been characterized and quantified in step 2.2), the teacher's behavior representation vectors are obtained, comprising: facial emotion E = <e1, e2, …, et>, body posture B = <b1, b2, …, bt> and physiology P = <p1, p2, …, pt>;
first, through a multi-channel attention mechanism, behavior weights WE, WB and WP are learned for the different behavior vectors and multiplied with them, and the results are input into a frame attention model to learn the frame-sequence weights within each behavior vector; after the behavior states at the different time steps have been weighted, they are input into an LSTM model for learning, and the LSTM output is feature-compression-encoded with the Hadamard product to obtain the final behavior feature matrix Hij = (Eij, Bij, Pij);
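As a rough illustration of this weighting-and-compression idea (channel weights, frame attention, then Hadamard-product compression), the following numpy sketch omits the LSTM stage entirely; all function and variable names are hypothetical:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def fuse_behaviors(E, B, P, wE, wB, wP, frame_scores):
    """Apply learned channel weights (multi-channel attention), then learned
    frame-sequence weights, then compress the three weighted channels with
    the Hadamard (element-wise) product."""
    alpha = softmax(frame_scores)            # frame attention weights
    channels = [wE * E, wB * B, wP * P]      # behavior-level weighting
    weighted = [alpha * c for c in channels] # weight each time step
    return weighted[0] * weighted[1] * weighted[2]  # Hadamard compression

t = 4                                        # frames in the sequence
E = np.ones(t)
B = np.full(t, 2.0)
P = np.full(t, 0.5)
h = fuse_behaviors(E, B, P, wE=0.5, wB=0.3, wP=0.2,
                   frame_scores=np.zeros(t))  # uniform frame attention
```

With uniform frame scores every frame receives weight 1/t, so the fused vector is constant across frames in this toy example.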
3) train a behavior feature classifier: train the classifier with the emotion labels to obtain the teacher personality-type feature representation and the classroom nonverbal behavior features;
3.1) construct a time-recursive neural network from the emotion, posture and physiological indices and perform matrix feature decomposition to obtain the teacher personality-type feature representation and the classroom nonverbal behavior feature representation;
using non-negative matrix factorization, decompose the teacher behavior feature matrix Hij into the teacher personality-type characterization θi* and the classroom nonverbal behavior characterization βj*;
taking

min Σ_{i=1..N} Σ_{j=1..M} a_i (H_ij − θ_i* β_j*^T)^2

as the objective function, calculate and determine the teacher personality-type feature vectors θ_i* and the classroom nonverbal behavior feature vectors β_j*, wherein superscript T denotes transposition, N is the total number of teachers, M is the number of hidden classroom-behavior features obtained by the matrix decomposition, and a_i is a self-attention weight obtained by learning;
3.2) construct the behavior characterization matrix Hij and the learning labels into a Gaussian distribution form, and finally train the time-recursive neural network classifier with the feature sequences and the corresponding emotion labels;
4) according to the nonverbal behavior data collected for a specific teacher, perform feature characterization and feature decomposition to obtain the teacher feature characterization and the classroom behavior feature characterization, and multiply their feature vectors to obtain the teacher's teaching evaluation expression.
2. The teacher non-verbal behavior based teaching process evaluation method according to claim 1, wherein in the step 2.2.1), the LeNet-5 neural network training method is as follows:
first, the similarity of different teachers' facial expressions and gesture movements is calculated with the cosine similarity formula; then the emotion similarities, obtained by fusing each teacher's facial-expression and gesture similarities, are ranked with the Hump ranking algorithm, and the ranking result is fitted with a Student-t distribution function to obtain the true label distribution of each teaching emotion; after several iterative optimizations, the optimal solution, i.e. the teacher's facial emotion result, is obtained; the Student-t distribution function used to construct the expression labels is expressed as formula (1), wherein Γ(·) denotes the Gamma function and ν the degrees of freedom;
f(x) = Γ((ν+1)/2) / (√(νπ) Γ(ν/2)) · (1 + x²/ν)^(−(ν+1)/2)   (1)
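Formula (1) is the standard Student-t density and can be evaluated directly with the standard-library Gamma function; the sketch below only evaluates the density (the fitting of ranked similarities is not shown, and the function name is illustrative):

```python
import math

def student_t_pdf(x, v):
    """Student-t density of formula (1): Gamma(.) is the Gamma function,
    v the degrees of freedom."""
    coef = math.gamma((v + 1) / 2) / (math.sqrt(v * math.pi) * math.gamma(v / 2))
    return coef * (1 + x * x / v) ** (-(v + 1) / 2)
```

With v = 1 the density reduces to the Cauchy distribution, f(0) = 1/π, which gives a quick sanity check.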
3. the teacher non-verbal behavior based teaching process evaluation method according to claim 1, wherein in step 2.2.2), the training method of the EfficientNet network is as follows:
calculate the triplet posture-perception loss and the von Mises-Fisher distribution loss; the triplet posture-perception loss is produced by computing the similarity between different features: a similarity measure D is first defined, and the perception loss Lt is then obtained, as expressed in formulas (2) and (3);
D(x₁, x₂) = x₁ᵀx₂ / (‖x₁‖‖x₂‖)   (2)
Lt(xn, xa, xp) = max(0, D(xa, xn) − D(xa, xp))   (3)
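Formula (3) is a hinge over similarity differences: the loss is positive only when the anchor is more similar to the negative sample than to the positive one. A small sketch, assuming D is cosine similarity (the exact form of the patent's formula (2) defining D is an assumption here):

```python
import numpy as np

def D(x1, x2):
    """Similarity measure; cosine similarity is assumed."""
    return float(x1 @ x2 / (np.linalg.norm(x1) * np.linalg.norm(x2)))

def perception_loss(xn, xa, xp):
    """Triplet posture-perception loss of formula (3): penalizes the
    anchor xa being more similar to the negative xn than to the
    positive xp."""
    return max(0.0, D(xa, xn) - D(xa, xp))

anchor = np.array([1.0, 0.0])
positive = np.array([1.0, 0.1])   # nearly the same head pose
negative = np.array([0.0, 1.0])   # orthogonal pose
loss = perception_loss(negative, anchor, positive)
```

For this triplet the anchor is far more similar to the positive than to the negative, so the hinge is inactive and the loss is zero; swapping the roles of positive and negative makes it positive.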
for the distribution loss, the respective features are input into a distribution module; in the von Mises-Fisher distribution module, the obtained features are first input into a fully connected layer, whose output gives an unconstrained matrix M from which a von Mises-Fisher distribution is constructed; the distribution loss is then calculated with the rotation matrix R representing the teacher's head posture;
extract the teacher's face image and let its width and height be w and h respectively; obtain the coordinates B₁ and B₂ of the centers of the left and right pupils, and calculate the pitch and yaw angles of the teacher's left and right eyes; mean squared error and KL-divergence loss are adopted as the loss function of the network model, and the accuracy of the pitch and yaw angles is improved through multiple iterations; the learning labels are constructed in a Gaussian distribution form, with the specific formula:
G(p, y | x) = 1/(2πσ²) · exp(−((p − μp)² + (y − μy)²) / (2σ²))   (4)

wherein x denotes the face image of the teacher while teaching, and (μp, μy) denote the pitch and yaw angles of the eyeball.
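One common way to realize "learning labels constructed in a Gaussian distribution form" is a normalized Gaussian over discretized angle bins; the sketch below is an assumption about that construction, not the patent's exact formula, and all names are illustrative:

```python
import numpy as np

def gaussian_label(grid, mu, sigma):
    """Soft label over discretized angles: a Gaussian centred on the
    ground-truth angle mu, normalized to sum to 1, so a scalar pitch or
    yaw angle becomes a learnable distribution."""
    g = np.exp(-((grid - mu) ** 2) / (2 * sigma ** 2))
    return g / g.sum()

angles = np.arange(-90, 91, 3, dtype=float)   # pitch (or yaw) bins, degrees
label = gaussian_label(angles, mu=12.0, sigma=3.0)
```

The resulting label peaks at the ground-truth bin and decays smoothly, which plays well with the KL-divergence loss mentioned above.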
4. The teacher non-verbal behavior based teaching process evaluation method according to claim 1, wherein in the step 2.2.3), the EEG signals are first preprocessed to obtain efficient and robust EEG data; then time-domain, frequency-domain and nonlinear-dynamics features are extracted from the preprocessed EEG data and feature sequences are constructed; and finally, a classifier is trained with the Student-t distribution function on the feature sequences and the corresponding emotion labels, the classifier being an emotion recognition model based on a long short-term memory (LSTM) network and EEG.
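A minimal sketch of the per-segment time-domain and frequency-domain feature extraction described in this claim (nonlinear-dynamics features are omitted; the alpha-band choice, sampling rate and all names are assumptions):

```python
import numpy as np

def eeg_segment_features(seg, fs=128):
    """Per-segment features: simple time-domain statistics plus the
    relative power of the alpha band (8-13 Hz) from the FFT spectrum."""
    feats = [seg.mean(), seg.std(), np.abs(np.diff(seg)).mean()]  # time domain
    freqs = np.fft.rfftfreq(len(seg), d=1 / fs)
    power = np.abs(np.fft.rfft(seg)) ** 2
    alpha = power[(freqs >= 8) & (freqs <= 13)].sum() / power.sum()
    feats.append(alpha)                                           # frequency domain
    return np.array(feats)

fs = 128
t = np.arange(fs * 2) / fs                 # one 2-second segment
seg = np.sin(2 * np.pi * 10 * t)           # pure 10 Hz (alpha-band) tone
f = eeg_segment_features(seg, fs)
```

For a pure 10 Hz tone almost all spectral power falls in the alpha band, so the last feature is close to 1.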
5. The teacher non-verbal behavior-based teaching process evaluation method according to claim 1, wherein in the step 4), the following steps are specifically performed:
4.1) using the teacher feature vector characterized from the specific teacher's collected nonverbal behavior data, pass it and the teacher personality-type characterization obtained in step 3.1) through a recurrent neural network and then apply a direct aggregation operation to obtain the feature representation θi corresponding to the specific teacher;
4.2) using the teacher feature vector characterized from the specific teacher's collected nonverbal behavior data, perform a direct aggregation operation with the classroom nonverbal behavior characterization obtained in step 3.1) to obtain the specific teacher's nonverbal behavior characterization βj;
And 4.3) multiplying the feature expression vector of the specific teacher by the nonverbal behavior characterization vector to obtain the teaching evaluation expression of the teacher.
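The final step 4.3) is a plain vector-matrix product; a toy numeric example with hypothetical dimensions and values:

```python
import numpy as np

theta_i = np.array([0.8, 0.3, 0.5])     # specific teacher's feature vector
beta_j = np.array([[0.9, 0.2],          # classroom nonverbal behavior
                   [0.1, 0.7],          # characterization: 3 features x
                   [0.4, 0.6]])         # 2 evaluation dimensions
evaluation = theta_i @ beta_j           # teaching evaluation expression
```

Each entry of the result weights the teacher's features by how strongly they load on one evaluation dimension.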
CN202111002386.6A 2021-08-30 2021-08-30 Teaching process evaluation method based on teacher non-verbal behaviors Pending CN113792626A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111002386.6A CN113792626A (en) 2021-08-30 2021-08-30 Teaching process evaluation method based on teacher non-verbal behaviors


Publications (1)

Publication Number Publication Date
CN113792626A true CN113792626A (en) 2021-12-14

Family

ID=78876649

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111002386.6A Pending CN113792626A (en) 2021-08-30 2021-08-30 Teaching process evaluation method based on teacher non-verbal behaviors

Country Status (1)

Country Link
CN (1) CN113792626A (en)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200175264A1 (en) * 2017-08-07 2020-06-04 Shenzhen Institutes Of Advanced Technology Chinese Academy Of Sciences Teaching assistance method and teaching assistance system using said method
CN109657529A (en) * 2018-07-26 2019-04-19 台州学院 Classroom teaching effect evaluation system based on human facial expression recognition
CN110334610A (en) * 2019-06-14 2019-10-15 华中师范大学 A kind of various dimensions classroom based on computer vision quantization system and method
CN111353439A (en) * 2020-03-02 2020-06-30 北京文香信息技术有限公司 Method, device, system and equipment for analyzing teaching behaviors

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MOU ZHIJIA; SU XIULING; YAN DAHU: "Modeling the Evaluation of Teachers' Teaching Engagement Based on Teaching Behavior in the Classroom Environment", Modern Distance Education, no. 03 *
LU YUAN; LI YANMIN: "Design of an Online Classroom Attention Evaluation System Based on RealSense", China Medical Education Technology, no. 03 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115331154A (en) * 2022-10-12 2022-11-11 成都西交智汇大数据科技有限公司 Method, device and equipment for scoring experimental steps and readable storage medium
CN115331154B (en) * 2022-10-12 2023-01-24 成都西交智汇大数据科技有限公司 Method, device and equipment for scoring experimental steps and readable storage medium

Similar Documents

Publication Publication Date Title
Pabba et al. An intelligent system for monitoring students' engagement in large classroom teaching through facial expression recognition
Cukurova et al. The promise and challenges of multimodal learning analytics
Dewan et al. A deep learning approach to detecting engagement of online learners
US20200379575A1 (en) Systems and methods for facilitating accessible virtual education
Kacorri Teachable machines for accessibility
CN105493130A (en) Adaptive learning environment driven by real-time identification of engagement level
Cheng Lin et al. Facial emotion recognition towards affective computing‐based learning
Zakka et al. Estimating student learning affect using facial emotions
Casalino et al. Deep learning for knowledge tracing in learning analytics: an overview.
CN112529054A (en) Multi-dimensional convolution neural network learner modeling method for multi-source heterogeneous data
Villegas-Ch et al. Identification of emotions from facial gestures in a teaching environment with the use of machine learning techniques
CN113792626A (en) Teaching process evaluation method based on teacher non-verbal behaviors
Chen et al. Developing AI into Explanatory Supporting Models: An Explanation-visualized Deep Learning Prototype for Computer Supported Collaborative Learning
CN117237766A (en) Classroom cognition input identification method and system based on multi-mode data
CN116244474A (en) Learner learning state acquisition method based on multi-mode emotion feature fusion
Vishnumolakala et al. In-class student emotion and engagement detection system (iSEEDS): an AI-based approach for responsive teaching
Gouraguine et al. Handwriting treatment and acquisition in dysgraphic children using a humanoid robot-assistant
RU2751759C2 Software and hardware complex of the training system with automatic assessment of the student's emotions
Gambo et al. A conceptual framework for detection of learning style from facial expressions using convolutional neural network
CN112818741A (en) Behavior etiquette dimension evaluation method and device for intelligent interview
Komaravalli et al. Detecting Academic Affective States of Learners in Online Learning Environments Using Deep Transfer Learning
Kadyrov et al. Automated Reading Detection in an Online Exam.
Lu et al. Design and implementation of a virtual teacher teaching system algorithm based on facial expression recognition in the era of big data
Horvat et al. Quantitative measures for classification of human upper body posture in video signal to improve online learning
Zheng et al. Automated Multi-Mode Teaching Behavior Analysis: A Pipeline Based Event Segmentation and Description

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination