CN114187640A - Learning situation observation method, system, equipment and medium based on online classroom - Google Patents

Learning situation observation method, system, equipment and medium based on online classroom

Info

Publication number
CN114187640A
CN114187640A (application number CN202111541953.5A)
Authority
CN
China
Prior art keywords
student
classroom
expression
students
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111541953.5A
Other languages
Chinese (zh)
Inventor
海克洪
杨俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hubei Meihe Yisi Education Technology Co ltd
Original Assignee
Hubei Meihe Yisi Education Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hubei Meihe Yisi Education Technology Co ltd filed Critical Hubei Meihe Yisi Education Technology Co ltd
Priority to CN202111541953.5A priority Critical patent/CN114187640A/en
Publication of CN114187640A publication Critical patent/CN114187640A/en
Pending legal-status Critical Current

Classifications

    • G06F 18/2135 — Feature extraction by transforming the feature space, based on approximation criteria, e.g. principal component analysis
    • G06F 18/22 — Matching criteria, e.g. proximity measures
    • G06F 18/241 — Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/253 — Fusion techniques of extracted features
    • G06N 3/045 — Combinations of networks
    • G06N 3/08 — Learning methods
    • G06Q 50/205 — Education administration or guidance
    • G09B 19/00 — Teaching not covered by other main groups of this subclass

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Educational Administration (AREA)
  • General Health & Medical Sciences (AREA)
  • Educational Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a learning situation observation method, system, equipment and medium based on an online classroom. Student expression data are obtained at preset intervals and analyzed by an expression recognition model that combines HOG features with an improved LBP+ algorithm, and whether a student is inattentive is judged from the expression analysis result. Classroom exercise questions are set after each teaching segment; students' answers are analyzed flexibly in combination with the difficulty of each question, and the students' mastery of the knowledge points is judged from the answer analysis results. Classroom learning situation statistics are then generated from the expression analysis results and the exercise answer analysis results and fed back to students and teachers. By detecting student expressions in real time to learn their concentration and obtaining their knowledge-point mastery through classroom exercises, the invention lets the teacher observe students' learning situation in the online classroom in real time and adjust the lesson in time accordingly, which helps to improve teaching quality.

Description

Learning situation observation method, system, equipment and medium based on online classroom
Technical Field
The invention relates to the technical field of teaching, and in particular to a learning situation observation method, system, equipment and medium based on an online classroom.
Background
Current online teaching is still mainly class-based: one teacher teaches many students at the same time, and because of the nature of online teaching, the teacher can only observe students' behavior through their cameras. With many video feeds it is inconvenient for the teacher to observe students' emotional states and notice their inattentive behavior, and students' learning results can only be assessed after class, when the teacher reviews the homework the students hand in. As a result, it is difficult for the teacher to observe students' learning situation in time during online teaching and to adjust the pace of the lesson according to their performance.
Moreover, without the teacher's supervision, students often fail to realize their own distracted behavior in time, easily miss key teaching content in class and then cannot follow what comes next, which lowers teaching quality and does not help students to discipline themselves.
Therefore, there is at present no generally applicable method that solves the problem that students' learning situation cannot be observed and understood in time in an online classroom.
Disclosure of Invention
In view of the above, the invention provides an online-classroom-based learning situation observation method to solve the problem that students' learning situation cannot be observed and understood in time in an online classroom.
The technical scheme of the invention is realized as follows:
the invention discloses a learning situation observation method based on an online classroom, which comprises the following steps:
s1, obtaining student expression data at preset intervals, analyzing the expression data, and judging whether the student is inattentive according to the expression analysis result;
s2, setting classroom exercise questions after each teaching link, analyzing classroom exercise answers of students, and judging mastering conditions of knowledge points according to classroom exercise answer analysis results of the students;
and S3, generating classroom learning situation statistics according to the student expression analysis result and the exercise answer analysis result, and feeding the statistics back to students and teachers.
With this method, students' expressions are detected in real time to learn their concentration, and their mastery of knowledge points is obtained through classroom exercises, so that the teacher can observe students' learning situation in the classroom in real time, adjust the lesson in time according to it, and thereby improve teaching quality.
On the basis of the above technical solution, preferably, step S1 specifically includes:
s1-1, constructing an expression recognition model based on HOG and LBP+ features, and training the expression recognition model;
s1-2, obtaining student expression data at preset intervals, feeding the data into the trained expression recognition model for analysis to obtain the student expression analysis result; when the result shows that a student is inattentive, sending warning information to the student and recording the inattentive behavior; a prompt about the student's inattentive behavior may also be sent to the teacher;
and S1-3, in the identification process, storing the student expression data into a sample data set for sample data expansion.
With this method, students' expressions are observed at regular intervals and warning information is issued when a student's attention lapses, so that students become aware of their own behavior, restrain themselves, and keep up with the lesson in time.
On the basis of the above technical solution, preferably, step S1-1 specifically includes:
s1-1-1, collecting a number of face photographs with different expressions as sample data, performing geometric correction and normalization preprocessing on the sample data, and extracting texture features to obtain the LBP+ feature image of the sample data; meanwhile, obtaining expression feature sample pictures;
s1-1-2, performing uniform-pattern pixel histogram statistics: dividing the LBP+ feature image into 256 bins according to gray levels 0-255, and counting the number of pixels at the corresponding gray level in each bin, in gray-level order, to obtain the LBP+ uniform-pattern histogram; normalizing this histogram to obtain the LBP+ uniform-pattern histogram feature of the image;
s1-1-3, performing HOG feature extraction on the sample data; serially fusing (concatenating) the LBP+ features and HOG features of the same pattern type; randomly drawing training samples from the fused feature space and taking the remaining samples as test samples;
s1-1-4, performing sample expansion with a generative adversarial network: extracting an expression from an expression feature sample picture, attaching it to a single face image through a convolutional neural network, training to generate face images under different expressions, and adding these to the training samples;
s1-1-5, performing PCA (principal component analysis) dimensionality-reduction calculation with the training samples to obtain a projection matrix W, and projecting the training samples through W into a low-dimensional subspace to obtain the feature representation of the facial expression images in that subspace, thereby training the expression recognition model;
s1-1-6, projecting the test samples into the low-dimensional subspace through the projection matrix W and classifying the test-sample features with a sparse representation classifier to obtain the class each test sample belongs to; when the error of the test-sample output falls below the preset error, training of the expression recognition model is finished; otherwise the process returns to step S1-1-5.
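The histogram, fusion and dimensionality-reduction steps above (S1-1-2, S1-1-3, S1-1-5) can be sketched as follows. This is a minimal NumPy-only illustration: the 256-bin count follows the description, but the HOG vector length, sample counts and random data are stand-in assumptions, not the patent's actual parameters.

```python
import numpy as np

def lbp_histogram(lbp_image, bins=256):
    # Step S1-1-2: count pixels per gray level 0..255 and normalize.
    hist, _ = np.histogram(lbp_image, bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)

def fuse_features(lbp_hist, hog_vec):
    # Step S1-1-3: serial (concatenation) fusion of texture and shape features.
    return np.concatenate([lbp_hist, hog_vec])

def pca_projection(X, k):
    # Step S1-1-5: projection matrix W spanning the top-k principal components,
    # obtained from the SVD of the mean-centred sample matrix.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T

# Toy demonstration with random stand-in data.
rng = np.random.default_rng(0)
lbp_img = rng.integers(0, 256, size=(48, 48))
hist = lbp_histogram(lbp_img)
hog_vec = rng.random(36)            # stand-in for a real HOG descriptor
fused = fuse_features(hist, hog_vec)
X = rng.random((20, fused.size))    # 20 stand-in training samples
W = pca_projection(X, k=5)
low_dim = X @ W                     # features in the low-dimensional subspace
print(low_dim.shape)
```

The key point of the pipeline is that the fused vector carries both texture (histogram) and shape (HOG) information before PCA compresses it to k dimensions.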
With this method, an expression recognition model based on HOG and LBP+ features is constructed and trained. The LBP+ uniform-pattern histogram statistics fix the data at 256 dimensions, avoiding the high-dimensional data produced when a two-dimensional image matrix is converted into an image vector; the histogram counts the pixels of the whole texture feature image and so extracts the texture gray-level information of different expression images. HOG feature extraction represents the appearance and shape of an image target well through edge gradient direction and gradient strength, so applying HOG to the sample data effectively extracts the pixel gradient direction information of different expression images and captures the shape changes of the expressions. Fusing the LBP+ uniform-pattern histogram features with the HOG features makes full use of the complementarity of the two feature extractors, and using PCA for dimensionality reduction represents the expression features, carrying both texture and shape information, in fewer dimensions, improving computational efficiency.
On the basis of the above technical solution, preferably, step S1-1-1 specifically includes:
for the preprocessed sample data, a pixel region of size n×n is set; within this region each neighborhood edge pixel is compared with the edge pixel in the centrally symmetric direction, i.e. only 4 directions are encoded, yielding a 4-bit binary number:
LBP+(x, y) = Σ_{i=0}^{P/2−1} s(N_i − N_{i+P/2}) · 2^i
s(t) = 1 if t > 0, and s(t) = 0 otherwise
wherein P represents the number of uniformly selected pixel points in the neighborhood with R as the radius, N_i is the value of the i-th neighborhood pixel, and (x, y) are the coordinates of the center pixel.
With this method, the improved LBP+ algorithm compares each neighborhood edge pixel with the edge pixel in the centrally symmetric direction within the defined pixel region, i.e. encodes only 4 directions to obtain a 4-bit binary number. This shortens the code and feature-value length while still capturing sufficient feature detail; the smaller feature values speed up computation and reduce the feature-calculation workload.
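A minimal sketch of this 4-direction encoding for a single 3×3 pixel region (P = 8, R = 1); the patch values are illustrative, and the clockwise neighbor ordering is an assumption since the patent does not fix one.

```python
import numpy as np

def lbp_plus_code(patch):
    # 4-bit centre-symmetric code for one 3x3 region (P = 8 neighbors, R = 1):
    # each neighbor N_i is compared with the neighbor N_{i+4} opposite it,
    # so only P/2 = 4 directions are encoded.
    n = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
         patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]  # clockwise ring
    code = 0
    for i in range(4):
        if n[i] - n[i + 4] > 0:   # s(t) = 1 when t > 0
            code |= 1 << i
    return code

patch = np.array([[9, 5, 7],
                  [2, 6, 3],
                  [1, 4, 12]])
print(lbp_plus_code(patch))  # a value in 0..15
```

Because only 4 comparisons are made, the code fits in 4 bits (16 possible values) instead of the 8 bits (256 values) of classic LBP, which is the source of the shorter feature length claimed above.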
On the basis of the above technical solution, preferably, step S1-2 specifically includes:
the method comprises the steps of obtaining student expression data at preset intervals, inputting the expression data into a trained expression recognition model, carrying out classification judgment on the student expression data by the expression recognition model, determining the current student expression type, and judging whether the student expression type is not attentive or not according to the student expression type.
On the basis of the above technical solution, preferably, step S2 specifically includes:
s2-1, when the teacher sets a classroom exercise question, the difficulty of the question is preset;
s2-2, the student inputs a classroom exercise answer, and the similarity between the student's answer and the standard answer is calculated with a kmeans algorithm;
s2-3, the student's mastery of the knowledge point is determined from the question difficulty and the similarity;
when the question difficulty is high: if the similarity is above a first preset threshold, the knowledge point is fully mastered; if it is below the first but above a second preset threshold, the knowledge point is basically mastered; if it is below the second threshold, the knowledge point is not mastered;
when the question difficulty is medium: if the similarity is above a third preset threshold, the knowledge point is fully mastered; if it is below the third but above a fourth preset threshold, the knowledge point is basically mastered; if it is below the fourth threshold, the knowledge point is not mastered;
when the question difficulty is low: if the similarity is above a fifth preset threshold, the knowledge point is fully mastered; if it is below the fifth but above a sixth preset threshold, the knowledge point is basically mastered; if it is below the sixth threshold, the knowledge point is not mastered.
With this method, students' mastery of knowledge points is judged flexibly according to the difficulty of the classroom exercise question and how the student answered it.
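The three-tier judgment in step S2-3 can be sketched as a lookup of a per-difficulty threshold pair. The threshold values below are illustrative assumptions; the patent only states that each difficulty level has its own pair of preset thresholds.

```python
# Hypothetical (full, basic) similarity thresholds per question difficulty.
THRESHOLDS = {"high": (0.5, 0.3), "medium": (0.6, 0.4), "low": (0.8, 0.6)}

def mastery_level(difficulty, similarity):
    # Apply the difficulty-specific threshold pair: above the first threshold
    # the knowledge point is fully mastered, between the two it is basically
    # mastered, below the second it is not mastered.
    full, basic = THRESHOLDS[difficulty]
    if similarity > full:
        return "fully mastered"
    if similarity > basic:
        return "basically mastered"
    return "not mastered"

print(mastery_level("high", 0.45))
```

Note the thresholds fall as difficulty rises, so the same similarity score counts for more on a hard question than on an easy one, which matches the "flexible" judgment described above.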
On the basis of the above technical solution, preferably, step S3 specifically includes:
the classroom learning situation statistics comprise a concentration report and a knowledge-point mastery report;
a student concentration report is generated from the record of inattentive behavior and fed back to the student; the inattentive-behavior records of all students in the class are aggregated into a class concentration report fed back to the teacher;
a student knowledge-point mastery report is generated from the classroom exercise answer analysis and fed back to the student; the answer-analysis results of all students in the class are aggregated into a class knowledge-point mastery report fed back to the teacher.
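A minimal sketch of the class-level aggregation for the concentration report; the event record format and the report fields are hypothetical, since the patent does not specify them.

```python
from collections import Counter

def class_concentration_report(records):
    # records: list of (student_id, timestamp) inattentive-behavior events
    # collected during the lesson (step S1-2).
    per_student = Counter(sid for sid, _ in records)
    return {
        "total_events": len(records),
        "students_flagged": len(per_student),
        "per_student": dict(per_student),
    }

events = [("s01", "09:05"), ("s01", "09:20"), ("s07", "09:12")]
report = class_concentration_report(events)
print(report["students_flagged"])
```

Each student would receive only their own `per_student` entry, while the teacher receives the whole aggregated report.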
On the basis of the above technical solution, preferably, when a student is judged inattentive, warning information is sent to the student during the lesson, and a reminder about the student's inattention may also be sent to the teacher; after a student finishes the classroom exercise questions, the analysis of the knowledge-point mastery reflected by the exercise can be sent directly to the student.
With this method, the classroom situation is fed back to students and teachers in time through intuitive statistical reports, so that the teacher can understand the students and adjust the lesson promptly, while students learn their own situation, which helps them restrain themselves.
In a second aspect of the present invention, an online classroom-based learning situation observation system is disclosed, the system comprising:
the image module: used for obtaining student expression data at preset intervals, analyzing the data through the expression recognition model, and judging whether a student is inattentive according to the expression analysis result;
the answering module: used for providing interfaces through which the teacher sets classroom exercise questions and students input answers, and for analyzing the students' answers to determine their mastery of the knowledge points;
and the feedback module: used for generating classroom learning situation statistics from the expression analysis results and the exercise answer analysis results and feeding them back to students and teachers.
In a third aspect of the present invention, an electronic device is disclosed, the device comprising: at least one processor, at least one memory, a communication interface, and a bus; the processor, the memory and the communication interface communicate with one another through the bus; the memory stores a program, executable by the processor, that implements the online-classroom-based learning situation observation method according to the first aspect of the present invention.
In a fourth aspect of the present invention, a computer-readable storage medium is disclosed, on which a program of the online-classroom-based learning situation observation method is stored; when executed, the program implements the online-classroom-based learning situation observation method according to the first aspect of the present invention.
Compared with the prior art, the online-classroom-based learning situation observation method of the invention has the following beneficial effects:
(1) students' expressions are detected in real time to learn their concentration, and their mastery of knowledge points is obtained through classroom exercises, so that the teacher can observe students' learning situation online in real time, adjust the lesson in time accordingly, and improve teaching quality;
(2) the features extracted by the LBP+ uniform-pattern histogram are fused with the HOG features, making full use of the complementarity of the two extractors; the PCA algorithm then reduces the data dimension, so that expression features carrying both texture and shape information are represented in fewer dimensions, improving computational efficiency;
(3) the improved LBP+ algorithm compares neighborhood edge pixels with the edge pixels in the centrally symmetric direction within a defined pixel region, i.e. encodes only 4 directions to obtain a 4-bit binary number, shortening the code and feature-value length while retaining sufficient feature detail; the smaller feature values speed up computation and reduce the feature-calculation workload.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a workflow diagram of the online-classroom-based learning situation observation method of the invention;
FIG. 2 is a workflow diagram of feature extraction by the LBP+ algorithm in the online-classroom-based learning situation observation method of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
Examples
The workflow of the online-classroom-based learning situation observation method is shown in FIG. 1 and proceeds as follows:
First, student expression data are obtained at preset intervals and analyzed, and whether a student is inattentive is judged from the expression analysis result. Then proceed to the second step.
It should be understood that, on the basis of the above scheme, step S1 specifically includes:
s1-1, constructing an expression recognition model based on HOG and LBP+ features, and training the expression recognition model;
s1-2, obtaining student expression data at preset intervals, feeding the data into the trained expression recognition model for analysis to obtain the student expression analysis result; when the result shows that a student is inattentive, sending warning information to the student and recording the inattentive behavior;
and S1-3, in the identification process, storing the student expression data into a sample data set for sample data expansion.
It should be understood that, on the basis of the above scheme, step S1-1 specifically includes:
s1-1-1, collecting a number of face photographs with different expressions as sample data, performing geometric correction and normalization preprocessing on the sample data, and extracting texture features to obtain the LBP+ feature image of the sample data; meanwhile, obtaining expression feature sample pictures;
s1-1-2, performing uniform-pattern pixel histogram statistics: dividing the LBP+ feature image into 256 bins according to gray levels 0-255, and counting the number of pixels at the corresponding gray level in each bin, in gray-level order, to obtain the LBP+ uniform-pattern histogram; normalizing this histogram to obtain the LBP+ uniform-pattern histogram feature of the image;
s1-1-3, performing HOG feature extraction on the sample data; serially fusing (concatenating) the LBP+ features and HOG features of the same pattern type; randomly drawing training samples from the fused feature space and taking the remaining samples as test samples;
s1-1-4, performing sample expansion with a generative adversarial network: extracting an expression from an expression feature sample picture, attaching it to a single face image through a convolutional neural network, training to generate face images under different expressions, and adding these to the training samples;
s1-1-5, performing PCA (principal component analysis) dimensionality-reduction calculation with the training samples to obtain a projection matrix W, and projecting the training samples through W into a low-dimensional subspace to obtain the feature representation of the facial expression images in that subspace, thereby training the expression recognition model;
s1-1-6, projecting the test samples into the low-dimensional subspace through the projection matrix W and classifying the test-sample features with a sparse representation classifier to obtain the class each test sample belongs to; when the error of the test-sample output falls below the preset error, training of the expression recognition model is finished; otherwise the process returns to step S1-1-5.
It should be understood that, on the basis of the above scheme, the working flow of feature extraction performed by LBP + is shown in fig. 2, and step S1-1-1 specifically includes:
for the preprocessed sample data, a pixel region of size n×n is set; within this region each neighborhood edge pixel is compared with the edge pixel in the centrally symmetric direction, i.e. only 4 directions are encoded, yielding a 4-bit binary number:
LBP+(x, y) = Σ_{i=0}^{P/2−1} s(N_i − N_{i+P/2}) · 2^i
s(t) = 1 if t > 0, and s(t) = 0 otherwise
wherein P represents the number of uniformly selected pixel points in the neighborhood with R as the radius, N_i is the value of the i-th neighborhood pixel, and (x, y) are the coordinates of the center pixel.
The invention adopts the improved LBP+ algorithm: within a defined pixel region, neighborhood edge pixels are compared with the edge pixels in the centrally symmetric direction, i.e. only 4 directions are encoded to obtain a 4-bit binary number, shortening the code and feature-value length while retaining sufficient feature detail; the smaller feature values speed up computation and reduce the feature-calculation workload.
The invention also fuses the features extracted by the LBP+ uniform-pattern histogram with the HOG features, making full use of the complementarity of the two extractors and effectively extracting expression feature information; the extracted features discriminate strongly between different classes. The normalized pixel statistical histogram and the gradient-direction statistical histogram of the image are serially fused, and dimensionality reduction is applied to the fused feature data, so that expression features carrying texture and shape information are represented in fewer dimensions, yielding a higher expression recognition rate and higher recognition accuracy.
It should be understood that, on the basis of the above scheme, step S1-2 specifically includes:
the method comprises the steps of obtaining student expression data at preset intervals, inputting the expression data into a trained expression recognition model, carrying out classification judgment on the student expression data by the expression recognition model, determining the current student expression type, and judging whether the student expression type is not attentive or not according to the student expression type.
Second, classroom exercise questions are set after each teaching segment, the students' answers are analyzed, and the students' mastery of the knowledge points is judged from the answer analysis results. Then proceed to the third step.
It should be understood that, on the basis of the above scheme, step S2 specifically includes:
s2-1, when a teacher sets a classroom exercise question, the difficulty of the classroom exercise question is preset;
s2-2, the student inputs classroom exercise answers, and the similarity between the student's classroom exercise answer and the standard answer is calculated using the k-means algorithm;
s2-3, determining the mastery degree of the knowledge points of the students according to the difficulty and the similarity of the classroom practice problems;
when the topic difficulty is high difficulty: the similarity is higher than a first preset threshold value, and the knowledge point mastering degree is completely mastered; the similarity is lower than a first preset threshold and higher than a second preset threshold, and the mastery degree of the knowledge points is basic mastery; the similarity is lower than a second preset threshold value, and the mastery degree of the knowledge points is not mastered;
when the topic difficulty is medium difficulty: the similarity is higher than a third preset threshold value, and the knowledge point mastering degree is completely mastered; the similarity is lower than a third preset threshold and higher than a fourth preset threshold, and the mastery degree of the knowledge points is basic mastery; the similarity is lower than a fourth preset threshold value, and the mastery degree of the knowledge points is not mastered;
when the topic difficulty is low difficulty: the similarity is higher than a fifth preset threshold value, and the knowledge point mastering degree is completely mastered; the similarity is lower than a fifth preset threshold and higher than a sixth preset threshold, and the mastery degree of the knowledge points is basic mastery; and if the similarity is lower than a sixth preset threshold value, the mastery degree of the knowledge points is not mastered.
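The three-tier mapping above can be sketched as a small lookup. The threshold values and the handling of exact boundary cases are placeholders chosen for illustration, not values given in the patent:

```python
# Placeholder thresholds per difficulty level: (full-mastery cutoff,
# basic-mastery cutoff). The patent leaves the actual values preset
# by the teacher; these numbers are illustrative only.
THRESHOLDS = {
    "high":   (0.60, 0.40),   # first / second preset thresholds
    "medium": (0.75, 0.55),   # third / fourth preset thresholds
    "low":    (0.90, 0.70),   # fifth / sixth preset thresholds
}

def mastery_level(difficulty, similarity):
    """Map answer similarity to a mastery level for a given difficulty."""
    full, basic = THRESHOLDS[difficulty]
    if similarity >= full:
        return "fully mastered"
    if similarity >= basic:
        return "basically mastered"
    return "not mastered"

print(mastery_level("high", 0.65))      # -> fully mastered
print(mastery_level("low", 0.65))       # -> not mastered
```

Note how the same similarity score maps to different mastery levels depending on question difficulty, which is the point of presetting difficulty in step S2-1.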
And thirdly, generating classroom learning situation statistics according to the student expression analysis result and the exercise answer analysis result, and feeding the statistics back to students and teachers.
It should be understood that, on the basis of the above scheme, step S3 specifically includes:
the classroom learning situation statistics comprises a concentration degree report and a knowledge point mastering degree report;
according to the non-concentration behavior record, generating a student concentration degree report and feeding the student concentration degree report back to the student; counting the non-concentration behavior records of students in the whole class, and generating a class concentration degree report to be fed back to a teacher;
generating a student knowledge point mastery degree report fed back to the student according to the classroom exercise answer analysis result; and counting the classroom exercise answer analysis results of the students in the whole class to generate a class knowledge point mastery degree report fed back to the teacher.
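A minimal sketch of aggregating per-student records into class-level statistics; the record fields and report keys are hypothetical, standing in for the outputs of steps S1 and S2:

```python
from collections import Counter

# Hypothetical per-student records produced by the earlier steps.
students = [
    {"name": "A", "inattentive_count": 1, "mastery": "fully mastered"},
    {"name": "B", "inattentive_count": 4, "mastery": "not mastered"},
    {"name": "C", "inattentive_count": 0, "mastery": "basically mastered"},
]

def class_report(records):
    """Aggregate individual records into class-level statistics."""
    total_inattentive = sum(r["inattentive_count"] for r in records)
    mastery_counts = Counter(r["mastery"] for r in records)
    return {
        "avg_inattentive": total_inattentive / len(records),
        "mastery_distribution": dict(mastery_counts),
    }

report = class_report(students)
print(report["mastery_distribution"])
```

Each student would receive their own record as an individual report, while the teacher receives the aggregated class-level figures.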
The invention combines students' concentration and knowledge point mastery to issue timely reminders, making it convenient for teachers to observe students' learning conditions in real time in the online class. Intuitive classroom learning situation statistics are generated so that the class can be adjusted promptly according to the students' situation, which helps improve teaching quality.
The invention also discloses an on-line classroom-based learning situation observation system, which comprises:
an image module: used for acquiring student expression data at preset time intervals, analyzing the expression data through an expression recognition module, and judging whether the student is inattentive according to the student expression analysis result;
an answering module: used for providing interfaces for teachers to set classroom exercise questions and for students to input classroom exercise answers, and for analyzing the students' classroom exercise answers to determine their mastery of the knowledge points;
and the feedback module is used for generating classroom learning situation statistics according to the student expression analysis result and the exercise answer analysis result and feeding the statistics back to students and teachers.
The invention also discloses an electronic device, comprising: at least one processor, at least one memory, a communication interface, and a bus; the processor, the memory, and the communication interface communicate with one another through the bus; the memory stores an online classroom-based learning situation observation method program executable by the processor, the program being configured to implement the online classroom-based learning situation observation method according to an embodiment of the present invention.
The invention also discloses a computer-readable storage medium storing an online classroom-based learning situation observation method program which, when executed, implements the online classroom-based learning situation observation method.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A learning situation observation method based on an online classroom, characterized by comprising the following steps:
s1, obtaining student expression data at preset intervals, analyzing the expression data, and judging whether the student is inattentive according to the student expression analysis result;
s2, setting classroom exercise questions after each teaching link, analyzing classroom exercise answers of students, and judging mastering conditions of knowledge points according to classroom exercise answer analysis results of the students;
and S3, generating classroom learning situation statistics according to the student expression analysis result and the exercise answer analysis result, and feeding the statistics back to students and teachers.
2. The method for observing learning situations in on-line class according to claim 1, wherein the step S1 specifically comprises:
s1-1, constructing an expression recognition model based on HOG features, and training the expression recognition model;
s1-2, obtaining student expression data at preset time intervals, putting the expression data into a trained expression recognition model for analysis, obtaining student expression analysis results, sending warning information to students when the analysis results show that the students are not attentive, and recording the non-attentive behaviors;
and S1-3, in the identification process, storing the student expression data into a sample data set for sample data expansion.
3. The method for observing learning situations in class on line according to claim 2, wherein the step S1-1 specifically comprises:
s1-1-1, collecting a plurality of pieces of identification photo data with different expressions as sample data, carrying out geometric correction and normalization preprocessing on the sample data, and carrying out texture feature extraction to obtain an LBP + feature image of the sample data; meanwhile, obtaining an expression feature sample picture;
s1-1-2, performing unified mode pixel histogram statistics: dividing the LBP + characteristic image into 256 collecting boxes according to the gray scale of 0-255, and counting the number of pixels in the corresponding gray scale in each collecting box according to the sequence of the gray scale to obtain an LBP + unified mode histogram; normalizing the histogram of the LBP + unified mode to obtain the histogram characteristic of the LBP + unified mode of the image;
s1-1-3, carrying out HOG feature extraction on the sample data; performing tandem feature fusion on LBP + features and HOG features under the same type of modes; randomly extracting training samples in the fused feature space, and taking the rest samples as test samples;
s1-1-4, performing sample expansion by using a confrontation generating network, extracting an expression from the expression feature sample picture, attaching the expression to a single face image through a convolutional neural network, training to generate face images under different expressions, and storing the face images into the training sample;
s1-1-5, carrying out PCA (principal component analysis) dimensionality reduction calculation by using a training sample to obtain a projection matrix W, projecting the training sample to a low-dimensional subspace through the projection matrix W to obtain a feature representation of a facial expression image in the low-dimensional subspace, and training the expression recognition model;
s1-1-6, projecting the test sample to a low-dimensional subspace through a projection matrix W, classifying the characteristics of the test sample by using a sparse representation classifier to obtain the class to which the test sample belongs, and finishing the training of the expression recognition model when the error value of the output result of the test sample is lower than the preset error; otherwise, the process continues to step S1-1-5.
4. The method for observing learning situations in on-line class according to claim 3, wherein the step S1-1-1 comprises:
setting the size of the pixel region to n×n for the preprocessed sample data, and comparing each neighborhood edge pixel in the set pixel region with the edge pixel in its centrally symmetric direction, i.e. coding only 4 directions to obtain a 4-bit binary number:

LBP+_{R,P}(x, y) = Σ_{i=0}^{P/2−1} s(N_i − N_{i+P/2}) · 2^i

s(t) = 1 if t > 0; s(t) = 0 otherwise

wherein P represents the number of uniformly selected pixel points in the neighborhood with radius R, N_i is the i-th neighborhood pixel, and (x, y) are the coordinates of the pixel point.
5. The method for observing learning situations in class on line according to claim 3, wherein the step S1-2 comprises:
the method comprises the steps of obtaining student expression data at preset intervals, inputting the expression data into a trained expression recognition model, carrying out classification judgment on the student expression data by the expression recognition model, determining the current student expression type, and judging whether the student expression type is not attentive or not according to the student expression type.
6. The method for observing learning situations in on-line class according to claim 4, wherein the step S2 specifically comprises:
s2-1, when a teacher sets a classroom exercise question, the difficulty of the classroom exercise question is preset;
s2-2, the student inputs classroom exercise answers, and the similarity between the student's classroom exercise answer and the standard answer is calculated using the k-means algorithm;
s2-3, determining the mastery degree of the knowledge points of the students according to the difficulty and the similarity of the classroom practice problems;
when the topic difficulty is high difficulty: the similarity is higher than a first preset threshold value, and the knowledge point mastering degree is completely mastered; the similarity is lower than a first preset threshold and higher than a second preset threshold, and the mastery degree of the knowledge points is basic mastery; the similarity is lower than a second preset threshold value, and the mastery degree of the knowledge points is not mastered;
when the topic difficulty is medium difficulty: the similarity is higher than a third preset threshold value, and the knowledge point mastering degree is completely mastered; the similarity is lower than a third preset threshold and higher than a fourth preset threshold, and the mastery degree of the knowledge points is basic mastery; the similarity is lower than a fourth preset threshold value, and the mastery degree of the knowledge points is not mastered;
when the topic difficulty is low difficulty: the similarity is higher than a fifth preset threshold value, and the knowledge point mastering degree is completely mastered; the similarity is lower than a fifth preset threshold and higher than a sixth preset threshold, and the mastery degree of the knowledge points is basic mastery; and if the similarity is lower than a sixth preset threshold value, the mastery degree of the knowledge points is not mastered.
7. The method for observing learning situations in on-line class according to claim 6, wherein the step S3 specifically comprises:
the classroom learning situation statistics comprises a concentration degree report and a knowledge point mastering degree report;
according to the non-concentration behavior record, generating a student concentration degree report and feeding the student concentration degree report back to the student; counting the non-concentration behavior records of students in the whole class, and generating a class concentration degree report to be fed back to a teacher;
generating a student knowledge point mastery degree report fed back to the student according to the classroom exercise answer analysis result; and counting the classroom exercise answer analysis results of the students in the whole class to generate a class knowledge point mastery degree report fed back to the teacher.
8. An on-line classroom based learning situation observation system, the system comprising:
an image module: used for acquiring student expression data at preset time intervals, analyzing the expression data through an expression recognition module, and judging whether the student is inattentive according to the student expression analysis result;
an answering module: used for providing interfaces for teachers to set classroom exercise questions and for students to input classroom exercise answers, and for analyzing the students' classroom exercise answers to determine their mastery of the knowledge points;
and the feedback module is used for generating classroom learning situation statistics according to the student expression analysis result and the exercise answer analysis result and feeding the statistics back to students and teachers.
9. An electronic device comprising at least one processor, at least one memory, a communication interface, and a bus; the processor, the memory, and the communication interface communicate with one another through the bus; the memory stores an online classroom-based learning situation observation method program executable by the processor, the program being configured to implement the online classroom-based learning situation observation method according to any one of claims 1 to 7.
10. A computer-readable storage medium, wherein the storage medium stores an online classroom-based learning situation observation method program, and when the program is executed, the online classroom-based learning situation observation method according to any one of claims 1 to 7 is implemented.
CN202111541953.5A 2021-12-16 2021-12-16 Learning situation observation method, system, equipment and medium based on online classroom Pending CN114187640A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111541953.5A CN114187640A (en) 2021-12-16 2021-12-16 Learning situation observation method, system, equipment and medium based on online classroom


Publications (1)

Publication Number Publication Date
CN114187640A true CN114187640A (en) 2022-03-15

Family

ID=80605302

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111541953.5A Pending CN114187640A (en) 2021-12-16 2021-12-16 Learning situation observation method, system, equipment and medium based on online classroom

Country Status (1)

Country Link
CN (1) CN114187640A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114937383A (en) * 2022-06-02 2022-08-23 北京新唐思创教育科技有限公司 Interactive online teaching method, device, equipment and medium
CN116259004A (en) * 2023-01-09 2023-06-13 盐城工学院 Student learning state detection method and system applied to online education
CN116259004B (en) * 2023-01-09 2023-08-15 盐城工学院 Student learning state detection method and system applied to online education

Similar Documents

Publication Publication Date Title
CN109522815B (en) Concentration degree evaluation method and device and electronic equipment
US11790641B2 (en) Answer evaluation method, answer evaluation system, electronic device, and medium
CN111027865B (en) Teaching analysis and quality assessment system and method based on behavior and expression recognition
CN110175501B (en) Face recognition-based multi-person scene concentration degree recognition method
CN112183238B (en) Remote education attention detection method and system
CN111242049A (en) Student online class learning state evaluation method and system based on facial recognition
CN114187640A (en) Learning situation observation method, system, equipment and medium based on online classroom
CN113657168B (en) Student learning emotion recognition method based on convolutional neural network
CN111507227A (en) Multi-student individual segmentation and state autonomous identification method based on deep learning
CN115205764B (en) Online learning concentration monitoring method, system and medium based on machine vision
CN112883867A (en) Student online learning evaluation method and system based on image emotion analysis
CN112686462A (en) Student portrait-based anomaly detection method, device, equipment and storage medium
CN111178263B (en) Real-time expression analysis method and device
CN115546861A (en) Online classroom concentration degree identification method, system, equipment and medium
CN113762107A (en) Object state evaluation method and device, electronic equipment and readable storage medium
CN116403262A (en) Online learning concentration monitoring method, system and medium based on machine vision
CN112528777A (en) Student facial expression recognition method and system used in classroom environment
CN111507467A (en) Neural network model training method and device, computer equipment and storage medium
CN115797829A (en) Online classroom learning state analysis method
CN114638988A (en) Teaching video automatic classification method and system based on different presentation modes
CN111914801A (en) Classroom analysis method for intelligent education
CN115546692A (en) Remote education data acquisition and analysis method, equipment and computer storage medium
Huang et al. Research on learning state based on students’ attitude and emotion in class learning
Madake et al. Vision-based Monitoring of Student Attentiveness in an E-Learning Environment
Ramos et al. A Facial Expression Emotion Detection using Gabor Filter and Principal Component Analysis to identify Teaching Pedagogy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination