CN111640341B - Smart classroom interaction analysis method based on face emotion recognition - Google Patents

Smart classroom interaction analysis method based on face emotion recognition Download PDF

Info

Publication number
CN111640341B
CN111640341B CN202010628581.9A
Authority
CN
China
Prior art keywords
interactive
student
interaction
behavior
technology
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010628581.9A
Other languages
Chinese (zh)
Other versions
CN111640341A (en)
Inventor
闫强
易兰丽
张笑妍
夏宇
周思敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Posts and Telecommunications
Original Assignee
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications filed Critical Beijing University of Posts and Telecommunications
Priority to CN202010628581.9A priority Critical patent/CN111640341B/en
Publication of CN111640341A publication Critical patent/CN111640341A/en
Application granted granted Critical
Publication of CN111640341B publication Critical patent/CN111640341B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/08Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
    • G09B5/14Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations with provision for individual teacher-student communication
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • G06Q50/205Education administration or guidance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

A smart classroom interaction analysis method based on face emotion recognition comprises the following steps: capturing a plurality of pictures from a teaching video at a fixed time interval to form a picture data set; recording the interactive behaviors, interactive devices and interactive technologies appearing in the picture data set and matching them with corresponding codes to form a coding table; establishing a data matrix based on the coding table to obtain the proportion of each interactive behavior, interactive device and interactive technology; performing face emotion recognition on the picture data set to generate face emotion recognition results; and analyzing the relation between student emotion and each interactive behavior, interactive device and interactive technology by statistical tests. By combining an analysis of actual teacher-student interaction behavior in the smart classroom, the method provides a more scientific and effective way to evaluate the teaching interaction effect of present-day smart classrooms and enriches the methods for evaluating interaction behavior in the smart classroom.

Description

Smart classroom interaction analysis method based on face emotion recognition
Technical Field
The invention relates to the technical field of smart classroom interaction, and in particular to a smart classroom interaction analysis method based on face emotion recognition.
Background
A Smart Classroom is a typical smart learning environment that optimizes the presentation of teaching content, facilitates access to learning resources, and supports classroom interaction. In the smart classroom, the use of novel interactive technologies and the design of the interactive environment enrich the means of teaching interaction and supplement traditional verbal communication. With the rapid development of innovative applications such as 5G and artificial intelligence, smart classrooms built on these new technologies are beginning to be deployed, effectively improving learners' motivation and engagement through richer modes of interaction and noticeably improving learning performance.
The Information Technology-Based Interaction Analysis System (ITIAS) is an analysis method that uses information technology to evaluate classroom teaching interaction. ITIAS records the interaction events occurring in a classroom as codes according to a specific coding table and thereby analyzes classroom teaching behaviors and patterns. It is an extension of the Flanders Interaction Analysis System (FIAS) adapted to changes in the teaching situation.
However, both the information technologies ITIAS considers for interaction and the teaching situations it assumes differ from today's smart-classroom environment. As new technologies such as the mobile internet, artificial intelligence and 5G are gradually applied to smart classrooms, the interaction behaviors between teachers and students and between students and the technologies change. If the ITIAS coding method is still used, the interaction in a smart classroom cannot be effectively evaluated and the resulting evaluation is inaccurate.
Disclosure of Invention
Objects of the invention
The invention aims to provide a smart classroom interaction analysis method based on face emotion recognition, which analyzes a teaching video with face emotion recognition technology to classify and evaluate the emotional changes of learners during teaching interaction, thereby verifying the influence of teaching interaction characteristics on student emotions.
(II) technical scheme
In order to solve the above problems, according to one aspect of the present invention, there is provided a smart classroom interaction analysis method based on face emotion recognition, comprising: capturing a plurality of pictures from a teaching video at a fixed time interval to form a picture data set; recording the interactive behaviors, interactive devices and interactive technologies appearing in the picture data set, matching them with corresponding codes, and recording a code for every occurrence to form an interaction coding table; establishing a data matrix based on the coding table to obtain the proportion of each interactive behavior, interactive device and interactive technology; performing face emotion recognition on the picture data set to generate a face emotion recognition result; and analyzing the relation between student emotion and each interactive behavior, interactive device and interactive technology by statistical tests, which includes: determining statistical test variables from the proportions of the interactive behaviors, interactive devices and interactive technologies together with the face emotion recognition result, and inputting the statistical test variables for analysis.
Further, the interactive behavior comprises: teacher operation interactive device behavior, student operation interactive device behavior, teacher use interactive technology behavior, student use interactive technology behavior, teacher-student interactive behavior, student-content interactive behavior, and other teaching behaviors.
Further, the teacher's use of interactive technology includes: operating demonstration content, displaying student achievements, teacher evaluation and technical guidance; the students' use of interactive technology includes: cooperative practice, sharing and display, and student evaluation; teacher-student interaction behaviors include: teacher instruction, questioning, feedback and evaluation, organizational instruction, student passive response, student active response, student active questioning, operating demonstration content, displaying student achievements, teacher evaluation, technical guidance, and student sharing and display; student-student interaction behaviors include: communication and discussion, speech evaluation, cooperative practice, and sharing and display; student-content interaction behaviors include: autonomous learning, student evaluation, and silence conducive to teaching; other teaching behaviors include pauses or confusion that do not contribute to teaching.
Further, the interaction device comprises: an electronic whiteboard, a tablet computer, an electronic schoolbag, an Augmented Reality (AR) device, a projection device, an electronic desk, or a traditional blackboard.
Further, the interaction technique includes: interactive electronic whiteboard technology, classroom response systems, virtual reality technology, Computer Aided Instruction (CAI) classroom interaction technology, multimedia courseware, multimedia projection, or interactive Application (APP).
Further, establishing a data matrix based on the coding table, and obtaining the proportion of each interactive behavior, each interactive device and each interactive technology comprises: matching corresponding codes for the interactive behaviors, interactive devices and interactive technologies of each sample in the picture data set based on the coding table to form an original data table; combining the interactive behavior codes of adjacent samples in the original data sheet to form a sequence pair, combining the interactive equipment codes to form a sequence pair, and combining the interactive technology codes to form a sequence pair to obtain an interactive behavior data matrix, an interactive equipment data matrix or an interactive technology data matrix; the data matrix comprises a row sequence, a column sequence and data units, wherein the row sequence and the column sequence represent interactive behavior codes, interactive equipment codes or interactive technology codes, and the data units represent the occurrence times of sequence pairs.
Further, performing face emotion recognition based on the picture data set and generating a face emotion recognition result includes: reading a picture sample from the picture data set; identifying picture samples that contain a valid face and storing them in a face data list; and extracting the attributes corresponding to the faces in the face data list to obtain the face emotion recognition result; the attributes include: anger, disgust, fear, joy, neutrality, sadness or surprise.
Further, the face emotion recognition API provided by the Face++ platform is adopted for face emotion recognition.
Further, the statistical test methods include: the normality test, the independence test, the test of homogeneity of variance, and multivariate analysis of variance.
Further, the statistical test variables include: the teacher's technical operation ratio, the students' technical operation ratio, the classroom technology usage rate, the teacher-student interaction frequency, the student-student interaction frequency, the student-content interaction frequency, and the emotion score of a given emotion category.
Further, capturing a plurality of pictures from the teaching video at a fixed time interval to form the picture data set includes: capturing the pictures from the teaching video at the same time interval with the FFmpeg software; the time interval is 10 seconds.
(III) advantageous effects
The technical scheme of the invention has the following beneficial technical effects:
The method fully considers how novel information technologies change smart classroom interaction and enriches the methods for evaluating interaction behavior in the smart classroom. It allows the interaction behavior of teachers and students in the smart classroom to be analyzed in combination with actual practice, evaluates the effect of present-day smart classroom teaching interaction more scientifically, and provides a more effective means of measurement.
Drawings
FIG. 1 is a table of interaction behavior codes in an intelligent classroom interaction analysis provided by the present invention;
FIG. 2 is a table of codes of interactive devices in an intelligent classroom interaction analysis provided by the present invention;
FIG. 3 is a table of interaction technique codes in an intelligent classroom interaction analysis provided by the present invention;
FIG. 4 is a table of raw data for one embodiment of intelligent classroom interaction analysis provided by the present invention;
FIG. 5 is a data matrix for one embodiment of the intelligent classroom interaction analysis provided by the present invention;
FIG. 6 is a flow chart of face recognition with the face emotion recognition API provided by the Face++ platform in the intelligent classroom interaction analysis provided by the present invention;
FIG. 7 is a detailed flow chart of face recognition with the face emotion recognition API provided by the Face++ platform in the intelligent classroom interaction analysis provided by the present invention;
fig. 8 is an exemplary table of facial emotion recognition results in an embodiment of the intelligent classroom interaction analysis provided by the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings in conjunction with the following detailed description. It should be understood that the description is intended to be exemplary only, and is not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
The present invention will be described in detail below with reference to the accompanying drawings and examples.
The invention provides a smart classroom interaction analysis method based on face emotion recognition, which comprises the following steps:
S1: picture extraction, which comprises capturing a plurality of pictures from the teaching video at the same time interval to form a picture data set.
Optionally, the FFmpeg software is used to capture the pictures to be analyzed from the teaching video, forming the picture data set. The time interval is 10 seconds (i.e., a sampling frequency of 0.1 Hz).
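For illustration only, the frame-capture step could be scripted roughly as follows; invoking FFmpeg from Python and the file names shown are assumptions, not part of the patented method.

```python
import subprocess
from pathlib import Path

def extract_frames(video_path: str, out_dir: str, interval_s: int = 10) -> None:
    """Capture one frame every `interval_s` seconds from the teaching video."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [
            "ffmpeg", "-i", video_path,
            "-vf", f"fps=1/{interval_s}",      # one frame per 10 s, i.e. 0.1 Hz
            f"{out_dir}/frame_%05d.jpg",
        ],
        check=True,
    )

extract_frames("lesson.mp4", "frames")  # hypothetical file and directory names
```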
S2: and interactive code extraction, which comprises recording interactive behaviors, interactive equipment and interactive technologies appearing in the picture data set, matching corresponding codes, recording the codes once every appearance, and finally forming an interactive code table.
Optionally, the interaction behavior includes: teacher operation interactive device behavior, student operation interactive device behavior, teacher use interactive technology behavior, student use interactive technology behavior, teacher-student interactive behavior, student-content interactive behavior, and other teaching behaviors.
Fig. 1 is a table of interaction behavior codes in the interaction analysis of the intelligent classroom provided by the present invention, please refer to fig. 1.
The teacher operation interaction equipment behavior represents that the teacher operates corresponding interaction equipment; the behavior of the student operating the interactive device represents that the student operates the corresponding interactive device.
Teacher usage interactive technical behaviors include: the teacher operates demonstration contents (comprising operation demonstration steps, demonstration learning contents or resources, and the corresponding code is a), the teacher demonstrates student achievements (comprising student works, student homework and the like, and the corresponding code is b), the teacher evaluates (comprising evaluating students by using technology, such as arranging interactive questions, demonstrating gauges, evaluating works and the like, and the corresponding code is c), and technical guidance (comprising technical guidance, participation in student-technology activities, and the corresponding code is d).
Student usage interaction technology behaviors include: cooperative practice (including cooperative creation, cooperative exploration, corresponding code of f), sharing display (including displaying works, demonstration steps, etc., corresponding code of g), student evaluation (including interactive practice, self evaluation, companion evaluation, etc., corresponding code of h).
The teacher-student interaction behaviors comprise: teacher's instruction (including explaining teaching content, corresponding code is 1), question (including proposing question, corresponding code is 2), feedback and evaluation (including solving student's question, to student's answer or reflect instant comment, corresponding code is 3), organization instruction (including organizing learning activity flow, such as explaining task, individual group course guidance, etc., corresponding code is 4), student's passive answer (including student's passive request answering teacher's question, corresponding code is 5), student's active answer (including active answer teacher's question, corresponding code is 6), student's active question (including student's proposing own question, corresponding code is 7), operation demonstration content (including operation demonstration step, demonstration content or resource, corresponding code is a), demonstration student's achievement (including student's work, student's homework, etc., the corresponding code is b), teacher evaluation (including using technology to evaluate students, such as arranging interactive questions, displaying gauges, selecting works, etc., and the corresponding code is c), technical guidance (including performing technical guidance, participating in student-technology activities, and the corresponding code is d), and student sharing display (including displaying works, demonstrating steps, etc., and the corresponding code is g).
The life interactive behavior comprises the following steps: the method comprises the following steps of communication discussion (including peer discussion, expressing the viewpoint of work of the user, sharing group communication viewpoint, corresponding coding being 8), speech evaluation (including evaluation description on the answer or viewpoint of the peer, corresponding coding being 9), cooperation practice (including cooperation creation, cooperation exploration, corresponding coding being f), sharing display (including displaying work, demonstration steps and the like, corresponding coding being g).
The student interaction with the content comprises the following actions: autonomous learning (including autonomous learning activities such as personal operation, video watching, webpage browsing and the like, corresponding code is e), student evaluation (including interactive exercise, self evaluation, fellow evaluation and the like, corresponding code is h), silence helpful for teaching (including thinking, contact of traditional learning tools, note making and the like, corresponding code is i).
Other instructional activities represent pauses or confusion that do not contribute to the instruction (including pauses or classroom confusion that do not contribute to the progress of the instruction, with the corresponding code being j).
Optionally, fig. 2 is a coding table of an interactive device in the interaction analysis of the intelligent classroom provided by the invention, please refer to fig. 2. The interaction device includes: the system comprises an electronic whiteboard (corresponding to a code w), a tablet personal computer or an electronic schoolbag (corresponding to a code p), an Augmented Reality (AR) device (corresponding to a code r), a projection device (corresponding to a code m), an electronic desk (corresponding to a code t) and a traditional blackboard (corresponding to a code l).
Optionally, fig. 3 is a coding table of interaction techniques in the interaction analysis of the intelligent classroom provided by the present invention, please refer to fig. 3. The interaction technology comprises the following steps: the interactive electronic whiteboard technology (corresponding code is w), the classroom response system (corresponding code is s), the virtual reality technology (corresponding code is r), the classroom interaction technology of Computer Aided Instruction (CAI) (corresponding code is m), the multimedia courseware (corresponding code is m), the multimedia projection (corresponding code is m), and the interactive application program (i.e. APP, corresponding code is k).
S3: and establishing a data matrix based on the coding table, and analyzing the proportion of each interactive behavior, interactive equipment and interactive technology.
Wherein, S3 includes the following steps:
S31: matching corresponding codes to the interactive behaviors, interactive devices and interactive technologies of each sample in the picture data set based on the coding table to form an original data table.
Optionally, FIG. 4 is the original data table of one embodiment of the intelligent classroom interaction analysis provided by the present invention, in which six picture samples are selected and ordered; the interactive behaviors, interactive devices and interactive technologies appearing in each picture are recorded together with their corresponding codes, finally forming the original data table.
S32: and combining the interactive behavior codes of adjacent samples in the original data table to form a sequence pair, combining the interactive equipment codes to form a sequence pair, and combining the interactive technology codes to form a sequence pair to obtain an interactive behavior data matrix, an interactive equipment data matrix or an interactive technology data matrix.
The data matrix comprises a row sequence, a column sequence and data units, wherein the row sequence and the column sequence represent interactive behavior codes, interactive equipment codes or interactive technology codes, and the data units represent the occurrence times of sequence pairs.
Specifically, the two codes of adjacent sampling points in the original data are combined into an ordered pair; from the original data table in the above embodiment, the ordered pairs (6,6), (6,a), (a,g), (g,g) and so on are obtained. Except for the first and last codes, which are each used once, every code is used twice; thus n codes form n-1 sequence pairs.
The smart classroom teaching behaviors of the invention comprise 19 classes in total, and these 19 classes of classroom teaching behaviors form a 19 × 19 matrix. The first code of each sequence pair determines the row and the second code determines the column, and the number of occurrences is written into the corresponding data unit, finally forming the data matrix.
FIG. 5 is a data matrix of an embodiment of the intelligent classroom interaction analysis provided by the present invention, which is encoded based on interaction behavior in the raw data table of FIG. 4. The data matrix of the interaction device and the interaction technology may refer to the data matrix of the interaction behavior, which is not described herein again.
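A minimal Python sketch of the sequence-pair counting and matrix construction described above; the example code sequence matches the ordered pairs mentioned in this embodiment, while the variable names and the derivation of the label set are assumptions for illustration.

```python
from collections import Counter

# Hypothetical sequence of interaction-behavior codes, one per picture sample
codes = ["6", "6", "a", "g", "g", "g"]

# Adjacent codes form sequence pairs: n codes yield n - 1 pairs
pairs = Counter(zip(codes, codes[1:]))

# Rows are the first code of a pair, columns the second (19 x 19 with the full coding table)
labels = sorted({c for pair in pairs for c in pair})
matrix = [[pairs.get((row, col), 0) for col in labels] for row in labels]

for label, row in zip(labels, matrix):
    print(label, row)
```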
S4: and carrying out face emotion recognition based on the image data set to generate a face emotion recognition result.
Learner emotions such as anxiety, boredom, joy and anger strongly affect learning motivation and engagement, yet questionnaire scales alone cannot obtain such measurement data objectively, quickly and accurately. The invention therefore applies face recognition technology in the smart classroom to efficiently acquire a large amount of objective emotion data.
Wherein, S4 includes the following steps:
S41: reading a picture sample from the picture data set;
S42: identifying picture samples that contain a valid face and storing them in a face data list;
S43: extracting the attributes corresponding to the faces in the face data list to obtain the face emotion recognition result; the attributes include: anger, disgust, fear, joy, neutrality, sadness or surprise.
Optionally, the face emotion recognition API provided by the Face++ platform is used for face emotion recognition. FIG. 6 is a flow chart of face recognition with the face emotion recognition API provided by the Face++ platform in the smart classroom interaction analysis provided by the present invention; see FIG. 6. First, a picture sample in the picture data set is read in binary form and input into the face emotion recognition API provided by the Face++ platform; the API analyzes the binary picture data and finally generates a face emotion analysis result.
FIG. 7 is a detailed flow chart of face recognition with the face emotion recognition API provided by the Face++ platform in the smart classroom interaction analysis provided by the present invention; see FIG. 7. First, a picture sample in the picture data set is input into the Face++ platform in binary form; after the Face++ platform transcodes the input data, it judges whether a valid face exists in the picture sample.
If a valid face exists, the picture sample is stored in the face data list; if no valid face exists, the program exits and the next picture sample is processed.
After the picture samples have been stored in the face data list, the method checks whether the face data list is empty. If it is not empty, the attributes corresponding to the faces in the face data list are extracted; if it is empty, a "no face matched" result is output and the next picture sample is processed.
FIG. 8 is an example table of face emotion recognition results in one embodiment of the smart classroom interaction analysis provided by the present invention; see FIG. 8. Here "face_id" identifies a picture sample in which a valid face was recognized, and anger, disgust, fear, happiness, neutral, sadness and surprise denote the corresponding emotions; the data in the table are the proportions of anger, disgust, fear, joy, neutrality, sadness and surprise.
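The following Python sketch shows one plausible way to call a face detection service with an emotion attribute, in the spirit of the flow in FIG. 6 and FIG. 7. The endpoint, parameters and response fields follow the publicly documented Face++ Detect API as generally known and should be verified against the current documentation; the API key and file paths are placeholders.

```python
import requests

DETECT_URL = "https://api-us.faceplusplus.com/facepp/v3/detect"  # assumed endpoint
API_KEY, API_SECRET = "your_key", "your_secret"                  # placeholders

def recognize_emotions(image_path: str) -> list[dict]:
    """Return one emotion dict per valid face found in the picture sample."""
    with open(image_path, "rb") as f:                 # read the sample in binary form
        resp = requests.post(
            DETECT_URL,
            data={"api_key": API_KEY, "api_secret": API_SECRET,
                  "return_attributes": "emotion"},
            files={"image_file": f},
        )
    resp.raise_for_status()
    faces = resp.json().get("faces", [])              # empty list means no valid face
    return [face["attributes"]["emotion"] for face in faces if "attributes" in face]

# Each dict holds proportions for anger, disgust, fear, happiness, neutral, sadness, surprise.
print(recognize_emotions("frames/frame_00001.jpg"))   # hypothetical sample path
```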
S5: analyzing the relationship between student emotion and each interactive behavior, interactive device and interactive technology by statistical tests, which includes: determining statistical test variables from the proportions of the interactive behaviors, interactive devices and interactive technologies together with the face emotion recognition result, and inputting the statistical test variables for analysis.
Optionally, the variables of the statistical test include:
the teacher's technical operation ratio, which represents the proportion of teacher technical operations in teaching;
the students' technical operation ratio, which represents the proportion of student technical operations in teaching;
the classroom technology usage rate, which represents the rate of technology use in teaching and reflects how teachers and students use the interactive devices and interactive technologies; a higher value indicates more frequent use of technology in teaching;
the teacher-student interaction frequency, which represents the proportion of teacher-student interaction in teaching;
the student-student interaction frequency, which represents the proportion of student-student interaction in teaching;
the student-content interaction frequency, which represents the proportion of student-content interaction in teaching;
the emotion score of a given emotion category, which represents the average score of that emotion across the students in a picture sample.
The variables of the statistical test are calculated using corresponding formulas; in the published document these formulas appear only as embedded images and are not reproduced here.
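Because the formulas are published only as images, the following LaTeX sketch records one plausible reading of them, inferred solely from the variable descriptions above; the exact definitions in the original may differ.

```latex
\begin{align*}
\text{teacher technical operation ratio} &= \frac{\text{count of teacher device-operation codes}}{\text{total number of codes}}\\
\text{student technical operation ratio} &= \frac{\text{count of student device-operation codes}}{\text{total number of codes}}\\
\text{classroom technology usage rate}   &= \frac{\text{count of codes involving interactive devices or technologies}}{\text{total number of codes}}\\
\text{teacher-student interaction frequency} &= \frac{\text{count of teacher-student interaction codes}}{\text{total number of codes}}\\
\text{student-student interaction frequency} &= \frac{\text{count of student-student interaction codes}}{\text{total number of codes}}\\
\text{student-content interaction frequency} &= \frac{\text{count of student-content interaction codes}}{\text{total number of codes}}\\
\text{emotion score}_{k} &= \frac{1}{S}\sum_{s=1}^{S} \text{score of emotion } k \text{ for student } s \text{ in the picture sample}
\end{align*}
```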
Optionally, the statistical test methods include: the normality test, the independence test, the test of homogeneity of variance, and multivariate analysis of variance. The statistical data are analyzed with these four statistical tests, and finally whether student emotion changes with different interaction forms and technology usage is analyzed.
(1) Normality test
In a normal distribution, two main components are considered: kurtosis and skewness. Studies show that if skewness and kurtosis values are between ± 2, data is normally distributed; if the skewness and kurtosis values of the experimental data are not within this range, a square root transform is required on the data.
Checking steps: execute the Analyze - Descriptive Statistics - Explore function, select the data column and add it to the "Dependent List". Then click the "Plots" button and check "Normality plots with tests". Click "OK" to obtain the output; if the significance level sig. is greater than 0.05, the data column can be considered normally distributed in the statistical sense. If the data do not follow a normal distribution, the square root of the column is taken and the normality test is performed again.
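As a rough, non-SPSS illustration of the same check, a Python sketch under assumed file and column names might look like this; the ±2 rule on skewness and kurtosis mirrors the criterion above.

```python
import numpy as np
from scipy import stats

data = np.loadtxt("interaction_frequency.csv", delimiter=",")  # hypothetical data column

skew, kurt = stats.skew(data), stats.kurtosis(data)
if not (-2 <= skew <= 2 and -2 <= kurt <= 2):
    data = np.sqrt(data)            # square-root transform, as described above
    skew, kurt = stats.skew(data), stats.kurtosis(data)

stat, p = stats.shapiro(data)       # complementary normality test
print(f"skewness={skew:.2f}, kurtosis={kurt:.2f}, Shapiro-Wilk p={p:.3f}")
```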
(2) Independence test
The independence between the statistical test variables is tested in SPSS with a general linear model; if the Sig. value of the corrected model is less than the 0.05 significance level, fitting the model linearly is effective and the variables are relatively independent. Variables that do not pass the independence test are tested separately with one-way analysis of variance.
Checking steps: execute the Analyze - Descriptive Statistics - Crosstabs function, set the interaction frequency data column as rows and the emotion score data column as columns, and output the analysis result. If the asymptotic significance sig. is less than 0.05, the groups of data can be considered statistically independent.
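For illustration, an analogous crosstab check could be sketched in Python with a chi-square test of independence instead of the SPSS procedure; the file and column names are assumptions, and the interpretation of the p-value here follows standard chi-square usage rather than the SPSS output described above.

```python
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("classroom_stats.csv")            # hypothetical file and column names
table = pd.crosstab(df["interaction_frequency"], df["emotion_score_group"])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")               # small p suggests the two variables are associated
```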
(3) Homogeneity test of variance
Box's M statistic is used to test the homogeneity of covariance, and the Levene test is used to test the homogeneity of variance of the dependent variables.
Checking steps: add a column of data as the grouping variable, execute the Analyze - Classify - Discriminant function, set the newly added data column as the grouping variable and the interaction frequency data column as the independent variable, and obtain the output after calculation. If the sig. value of Box's M is greater than 0.05, the homogeneity of covariance can be considered to pass the test in the statistical sense.
Then execute Analyze - Compare Means - One-Way ANOVA, set the newly added data column as the factor and the emotion score data column as the dependent variable, and obtain the output after calculation. If the sig. value of the Levene test is greater than 0.05, the homogeneity of variance of the dependent variable can be considered to pass the test in the statistical sense.
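A sketch of the dependent-variable part of this check with a Levene test in Python (SciPy does not provide Box's M, so the covariance part is omitted); the grouping and column names are assumptions.

```python
import pandas as pd
from scipy.stats import levene

df = pd.read_csv("classroom_stats.csv")            # hypothetical file and column names
groups = [g["emotion_score"].values for _, g in df.groupby("interaction_group")]

stat, p = levene(*groups)
print("variance homogeneity holds" if p > 0.05 else "variances differ", f"(p={p:.3f})")
```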
(4) Multivariate analysis of variance
The three tests above are carried out independently: the normality test checks whether the data follow a normal distribution; the independence test checks whether the variables are relatively independent; and the test of homogeneity of variance checks whether the data of the different emotion groups have similar distribution characteristics.
When the assumptions of the three tests hold, multivariate analysis of variance is used to determine the influence of the interaction form on student emotion; that is, when all the assumptions are satisfied, multivariate analysis of variance is used to determine whether student emotion changes with different interaction forms and technology usage.
When multivariate analysis of variance is used, a Bonferroni correction is applied in the data analysis to the students in the different emotion groups, and the corrected value is used to check whether their emotions differ significantly due to differences in the variables. For the Bonferroni correction, the significance level p is taken as 0.01 in this study.
Checking steps: execute the Analyze - General Linear Model - Multivariate function, set the emotion score data column as the dependent variable and the interaction frequency data column as the fixed factor, and obtain the output after calculation. If the Bonferroni-corrected sig. value is less than 0.01, it can be considered in the statistical sense that student emotion changes with different interaction forms and technology usage.
The invention seeks to protect a smart classroom interaction analysis method based on face emotion recognition, comprising: capturing a plurality of pictures from a teaching video at the same time interval to form a picture data set; recording the interactive behaviors, interactive devices and interactive technologies appearing in the picture data set, matching them with corresponding codes, and recording a code for every occurrence to form an interaction coding table; establishing a data matrix based on the coding table to obtain the proportion of each interactive behavior, interactive device and interactive technology; performing face emotion recognition on the picture data set to generate a face emotion recognition result; and analyzing the relation between student emotion and each interactive behavior, interactive device and interactive technology by statistical tests, which includes determining statistical test variables from those proportions together with the face emotion recognition result and inputting the statistical test variables for analysis. The teaching video is analyzed with face emotion recognition technology, and the emotional changes of learners during teaching interaction are classified and evaluated, thereby verifying the influence of teaching interaction characteristics on student emotions. The method fully considers how novel information technologies change smart classroom interaction, enriches the methods for evaluating interaction behavior in the smart classroom, allows teacher-student interaction in the smart classroom to be analyzed in combination with actual practice, evaluates the effect of present-day smart classroom teaching interaction more scientifically, and provides a more effective means of measurement.
It is to be understood that the above-described embodiments of the present invention are merely illustrative of or explaining the principles of the invention and are not to be construed as limiting the invention. Therefore, any modification, equivalent replacement, improvement and the like made without departing from the spirit and scope of the present invention should be included in the protection scope of the present invention. Further, it is intended that the appended claims cover all such variations and modifications as fall within the scope and boundaries of the appended claims or the equivalents of such scope and boundaries.

Claims (11)

1. A smart classroom interaction analysis method based on face emotion recognition is characterized by comprising the following steps:
intercepting a plurality of pictures at the same time interval in a teaching video to form a picture data set;
recording interactive behaviors, interactive equipment and interactive technologies appearing in the picture data set, matching corresponding codes, and recording the codes once every appearance to form an interactive code table;
establishing a data matrix based on the coding table to obtain the proportion of each interactive behavior, the interactive equipment and the interactive technology;
carrying out face emotion recognition based on the picture data set to generate a face emotion recognition result;
analyzing the relation of the emotion of the student to each of the interaction behaviors, the interaction equipment and the interaction technology through a statistical test method, and finally analyzing whether the emotion of the student changes according to the use conditions of different interaction forms and technologies, wherein the method comprises the following steps: and determining a statistical test variable by combining the proportion of each interactive behavior, interactive equipment and interactive technology and the face emotion recognition result, and inputting and analyzing the statistical test variable.
2. The method of claim 1,
the interactive behavior comprises: teacher operation interactive device behavior, student operation interactive device behavior, teacher use interactive technology behavior, student use interactive technology behavior, teacher-student interactive behavior, student-content interactive behavior, and other teaching behaviors.
3. The method of claim 2,
the teacher using interactive technical behavior comprises: the teacher operates the demonstration content, the teacher shows student achievements, and the teacher evaluates and guides the technology;
the student using interactive technical behaviors comprising: cooperative practice, sharing display and student evaluation;
the teacher-student interaction behaviors comprise: teacher teaching, questioning, feedback and evaluation, organization instruction, student passive response, student active response, student active questioning, operation demonstration content, student achievement display, teacher evaluation, technical guidance and student sharing display;
the student-student interaction behavior comprises: communication discussion, speech evaluation, cooperation practice and sharing display;
the student and content interaction behavior comprises: autonomous learning, student evaluation and silence contributing to teaching;
other instructional activities include pauses or confusion that do not contribute to instruction.
4. The method of claim 1,
the interaction device includes: a whiteboard, a tablet, an e-bag, an Augmented Reality (AR) device, a projection device, an electronic desk, or a traditional blackboard.
5. The method of claim 1,
the interaction technology comprises the following steps: interactive electronic whiteboard technology, classroom response systems, virtual reality technology, Computer Aided Instruction (CAI) classroom interaction technology, multimedia courseware, multimedia projection, or interactive Application (APP).
6. The method of claim 1, wherein the building a data matrix based on the code table to obtain the weight of each of the interaction behavior, the interaction device, and the interaction technology comprises:
matching corresponding codes for the interactive behaviors, interactive devices and interactive technologies of each sample in the picture data set based on the coding table to form an original data table;
combining the interactive behavior codes of adjacent samples in the original data sheet to form a sequence pair, combining the interactive equipment codes to form a sequence pair, and combining the interactive technology codes to form a sequence pair to obtain an interactive behavior data matrix, an interactive equipment data matrix or an interactive technology data matrix;
the data matrix comprises a row sequence, a column sequence and data units, wherein the row sequence and the column sequence represent interactive behavior codes, interactive equipment codes or interactive technology codes, and the data units represent the times of occurrence of the sequence pairs.
7. The method of claim 1, wherein performing facial emotion recognition based on the picture data set, and generating a facial emotion recognition result comprises:
reading a picture sample in the picture data set;
identifying the picture sample containing the effective face, and storing the picture sample into a face data list;
extracting attributes corresponding to the faces in the face data list to obtain a face emotion recognition result; the attributes include: anger, disgust, fear, joy, neutrality, sadness or surprise.
8. The method of claim 7,
and adopting the face emotion recognition API provided by the Face++ platform for face emotion recognition.
9. The method of claim 1,
the statistical test method comprises the following steps: normal test, independence test, homogeneity of variance test, and multivariate analysis of variance test.
10. The method of claim 7,
the statistical test variables include: the teacher's technical operation ratio, the students' technical operation ratio, the classroom technology usage rate, the teacher-student interaction frequency, the student-student interaction frequency, the student-content interaction frequency, and the emotion score of a given emotion category.
11. The method of claim 1, wherein the capturing multiple pictures at the same time interval in the instructional video, and the forming the picture data set comprises:
intercepting a plurality of pictures at the same time interval in a teaching video based on the FFmpeg software to form a picture data set;
the time interval is 10 seconds.
CN202010628581.9A 2020-07-01 2020-07-01 Smart classroom interaction analysis method based on face emotion recognition Active CN111640341B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010628581.9A CN111640341B (en) 2020-07-01 2020-07-01 Smart classroom interaction analysis method based on face emotion recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010628581.9A CN111640341B (en) 2020-07-01 2020-07-01 Smart classroom interaction analysis method based on face emotion recognition

Publications (2)

Publication Number Publication Date
CN111640341A CN111640341A (en) 2020-09-08
CN111640341B true CN111640341B (en) 2022-04-12

Family

ID=72331527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010628581.9A Active CN111640341B (en) 2020-07-01 2020-07-01 Smart classroom interaction analysis method based on face emotion recognition

Country Status (1)

Country Link
CN (1) CN111640341B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105894413A (en) * 2016-05-04 2016-08-24 华中师范大学 Method for analysis and encoding of classroom teaching interactive behaviors
US9767349B1 (en) * 2016-05-09 2017-09-19 Xerox Corporation Learning emotional states using personalized calibration tasks
CN108648757A (en) * 2018-06-14 2018-10-12 北京中庆现代技术股份有限公司 A kind of analysis method based on various dimensions Classroom Information
CN109657529A (en) * 2018-07-26 2019-04-19 台州学院 Classroom teaching effect evaluation system based on human facial expression recognition
CN110765417A (en) * 2019-09-29 2020-02-07 昆明医科大学 Advanced medical classroom teaching interactive behavior analysis and evaluation method
CN111027865A (en) * 2019-12-12 2020-04-17 山东大学 Classroom teaching analysis and quality assessment system and method based on intelligent behavior and expression recognition


Also Published As

Publication number Publication date
CN111640341A (en) 2020-09-08

Similar Documents

Publication Publication Date Title
Zinovieva et al. The use of online coding platforms as additional distance tools in programming education
Pellas et al. A systematic literature review of mixed reality environments in K-12 education
Ulum The effects of online education on academic success: A meta-analysis study
Chu et al. Impacts of concept map-based collaborative mobile gaming on English grammar learning performance and behaviors
Vidakis et al. In-game raw data collection and visualization in the context of the “ThimelEdu” educational game
Sun et al. Applying learning analytics to explore the effects of motivation on online students' reading behavioral patterns
Watts et al. An examination of children's learning progression shifts while using touch screen virtual manipulative mathematics apps
Zielezinski et al. Promising practices: A literature review of technology use by underserved students
US20140045162A1 (en) Device of Structuring Learning Contents, Learning-Content Selection Support System and Support Method Using the Device
Agbo et al. Smart mobile learning environment for programming education in Nigeria: adaptivity and context-aware features
Arthur et al. The impact of emerging technologies on selection models and research: Mobile devices and gamification as exemplars
Zapata-Rivera et al. Assessing science inquiry skills in an immersive, conversation-based scenario
Oliva Córdova et al. An experience making use of learning analytics techniques in discussion forums to improve the interaction in learning ecosystems
Tsoni et al. From Analytics to Cognition: Expanding the Reach of Data in Learning.
Lashari et al. The impact of mobile assisted language learning (MALL) on ESL students’ learning
Li et al. Virtual reality in foreign language learning: A review of the literature
López-Fernández et al. Learning and motivational impact of using a virtual reality serious video game to learn scrum
Villegas-Ch et al. Identification of emotions from facial gestures in a teaching environment with the use of machine learning techniques
Sozcu The relationships between cognitive style of field dependence and learner variables in e-learning instruction
CN111640341B (en) Smart classroom interaction analysis method based on face emotion recognition
Alonso-Fernández et al. Game Learning Analytics:: Blending Visual and Data Mining Techniques to Improve Serious Games and to Better Understand Player Learning
Rodriguez et al. Gamifying users’ learning experience of scrum
Halverson et al. Games and learning
KR20190096508A (en) Imaginary job experience system using aprirude test
Hundhausen Evaluating visualization environments: Cognitive, social, and cultural perspectives

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant