CN110766580A - Classroom quality monitoring system based on human face characteristics - Google Patents


Info

Publication number
CN110766580A
CN110766580A (application CN201911020705.9A)
Authority
CN
China
Prior art keywords
information
student
face
module
analysis module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911020705.9A
Other languages
Chinese (zh)
Inventor
邹杨
韦鹏程
李洋
郭红利
万莉
管树林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Education
Original Assignee
Chongqing University of Education
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Education filed Critical Chongqing University of Education
Priority to CN201911020705.9A priority Critical patent/CN110766580A/en
Publication of CN110766580A publication Critical patent/CN110766580A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/20 Education
    • G06Q 50/205 Education administration or guidance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063 Operations research, analysis or management
    • G06Q 10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q 10/06395 Quality analysis or management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 Detection; Localisation; Normalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 Facial expression recognition

Abstract

The invention relates to a classroom quality monitoring system based on human face characteristics, in particular to the field of image communication. The system comprises a data acquisition module, a face recognition module, a model analysis module and an output module; the data acquisition module is used for acquiring a video file and a data file and sending the video file and the data file to the face recognition module; the face recognition module is used for performing image framing processing on the video file to obtain an image file, the face recognition module is also used for performing feature extraction on the image file to obtain face feature information, the face recognition module is also used for obtaining expression recognition information according to the face feature information, and the face recognition module is also used for sending the face feature information and the expression recognition information to the model analysis module. The technical problem of how to improve the accuracy of the classroom quality monitoring system is solved, and the classroom quality monitoring system is suitable for classroom quality monitoring.

Description

Classroom quality monitoring system based on human face characteristics
Technical Field
The invention relates to the field of image communication, in particular to a classroom quality monitoring system based on human face characteristics.
Background
Existing real-time classroom monitoring systems do not perform multidimensional analysis of student states. For example, Chinese patent CN109684984A discloses an expression recognition method suitable for classroom teaching, comprising the following steps: recording classroom conditions in real time with a camera device to obtain real-time classroom video information; processing the real-time video information to obtain the facial image information of each student and determining the disappearance time of that facial image information; and, when the disappearance time of a student's facial image information exceeds a preset value T1, judging that the student is distracted and counting the student's distraction rate. By arranging the camera device, that invention can monitor real-time conditions in the classroom, detect when a face appears by combining the camera device with a processing device, and automatically judge a student to be distracted when the time during which the student's face is not visible exceeds the preset value; otherwise the absence is regarded as a normal pause, covering activities such as lowering the head to take notes. In this way, whether a student is distracted in class can be detected effectively.
However, this scheme requires students to maintain a listening posture, so the error in judging student states is large and the resulting real-time classroom monitoring information is not highly reliable.
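As a rough illustration of the prior-art rule described above (a face absent for longer than the preset value T1 implies distraction), a minimal sketch follows. The input format of timestamped detection flags is an assumption for illustration, not part of either patent.

```python
# Sketch of the prior-art rule: flag an interval as distraction when the
# face has been absent for longer than t1 seconds. The (timestamp,
# face_detected) input format is a hypothetical simplification.
def distraction_intervals(frames, t1=3.0):
    """frames: list of (timestamp_seconds, face_detected) pairs, in order.
    Returns (start, end) intervals where the face was absent longer than t1;
    shorter absences count as normal pauses (e.g. taking notes)."""
    intervals = []
    absent_since = None
    for ts, face in frames:
        if not face and absent_since is None:
            absent_since = ts                      # absence begins
        elif face and absent_since is not None:
            if ts - absent_since > t1:             # long enough to flag
                intervals.append((absent_since, ts))
            absent_since = None                    # absence ends
    return intervals
```

The distraction rate would then follow from the total flagged time divided by class duration.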
Disclosure of Invention
The invention aims to solve the technical problem of how to improve the accuracy of a classroom quality monitoring system.
The technical scheme for solving the technical problem is as follows: a classroom quality monitoring system based on human face features, comprising a data acquisition module, a face recognition module, a model analysis module and an output module;
the data acquisition module is used for acquiring a video file and a data file and sending the video file and the data file to the face recognition module;
the face recognition module is used for performing image framing processing on the video file to obtain an image file, the face recognition module is also used for performing feature extraction on the image file to obtain face feature information, the face recognition module is also used for obtaining expression recognition information according to the face feature information, and the face recognition module is also used for sending the face feature information and the expression recognition information to the model analysis module;
the model analysis module is used for obtaining a student basic information table from the data file, the table comprising student information together with the student classroom exercise score information and student face feature information corresponding to each piece of student information; the model analysis module is used for obtaining the student information corresponding to the face feature information according to the face feature information and the student basic information table; the model analysis module is further used for obtaining each student's head-up rate according to the face feature information and the student information; the model analysis module is further used for obtaining each student's learning state according to the face feature information and the expression recognition information; the model analysis module is used for creating a regression model and a time series model; and the model analysis module is further used for taking each student's student information, head-up rate, learning state and classroom exercise score information as preprocessing information and combining the regression model and the time series model to obtain a relation table of expression time and scores;
and the output module is used for acquiring and outputting the expression time and score relation table.
The invention has the beneficial effects that: classroom quality is monitored mainly through OpenCV + Dlib face feature technology, specifically by analyzing the students' head-up rate and deriving the influence of the head-up rate on scores. This realizes automatic detection of classroom quality; in later stages each student can be guided in a targeted manner according to his or her head-up rate, and the quality of a teacher's classroom teaching can also be obtained.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, the image framing processing comprises OpenCV + Dlib automatic framing, Gaussian filter noise reduction and image grayscale processing.
The benefit of this further scheme is that feature extraction from the resulting image file is more accurate than from an image file obtained through OpenCV + Dlib automatic framing alone.
Further, the face recognition module is used for classifying the expressions according to the face feature information and the calculation parameter indexes to obtain expression recognition information.
The benefit of this further scheme is that adding the calculation parameter indexes compensates for the deficiencies of OpenCV + Dlib and increases accuracy.
Further, the expression recognition information includes 4 expressions of happiness, curiosity, confusion and nature.
The benefit of this further scheme is that, once so configured, the number of times students raise their heads and the expression shown at each head-up are considered together as two dimensions, each reflecting characteristics of the data from a different aspect, so that classroom quality can be checked from multiple angles; at the same time, problems can be located at a specific level, making them easier to find and solve.
Further, video files containing teaching contents with different difficulty gradients are obtained, and a relation table of expression time and achievement corresponding to the different difficulty gradients is obtained through the face recognition module and the model analysis module.
The beneficial effect of adopting the further scheme is that after the setting, the contingency of a single classroom can be eliminated, the relation between the difficulty degree of the classroom and the expression and the test result of the student is explored, and the difficulty degree of the classroom can be the difficulty degree of the test question in the classroom.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
Fig. 1 is a schematic diagram of a system structure of an embodiment of a class quality monitoring system based on human face features;
fig. 2 is a system flow diagram of an embodiment of a class quality monitoring system based on human face features according to the present invention;
fig. 3 is a facial feature point diagram of another embodiment of the class quality monitoring system based on facial features according to the present invention.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
The embodiment is basically as shown in the attached figure 1:
The classroom quality monitoring system based on human face characteristics in this embodiment comprises: a data acquisition module 1, a face recognition module 2, a model analysis module 3 and an output module 4;
the data acquisition module 1 is used for acquiring video files and data files and sending the video files and the data files to the face recognition module 2, and the data acquisition module 1 in the embodiment can acquire and process data based on a learning leveling platform, and specifically comprises a multimedia classroom, teaching images and classroom exercise data in a future intelligent classroom, and a large data center for children;
the face recognition module 2 is used for performing image framing processing on the video file to obtain an image file, the face recognition module 2 is also used for performing feature extraction on the image file to obtain face feature information, the face recognition module 2 is also used for obtaining expression recognition information according to the face feature information, and the face recognition module 2 is also used for sending the face feature information and the expression recognition information to the model analysis module 3;
the model analysis module 3 is used for obtaining a student basic information table according to the data file, wherein the student basic information table comprises student information, student classroom exercise score information and student face feature information corresponding to each piece of student information, the student information in the embodiment comprises the school number, name and gender of a student, and the privacy of the student is prevented from being revealed due to excessive information; the model analysis module 3 is used for obtaining student information corresponding to the face feature information according to the face feature information and the student basic information table, the model analysis module 3 is further used for obtaining the head raising rate of each student according to the face feature information and the student information, the model analysis module 3 is further used for obtaining the learning state of each student according to the face feature information and the expression recognition information, the model analysis module 3 is used for creating a regression model and a time series model, the model analysis module 3 is further used for taking the student information, the head raising rate, the learning state and the student classroom exercise achievement information of each student as preprocessing information, and a relation table of expression time and achievement is obtained by combining the regression model and the time series model;
the output module 4 is used for acquiring and outputting the expression time and achievement relation table, and the user checks the expression time and achievement relation table through the output module 4.
The univariate regression model in this example is:
Let $y = f(x) + \varepsilon$ be the univariate regression model, where $\varepsilon \sim N(0, \sigma^2)$. The fitted univariate regression equation is
$$\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x,$$
and the residual sum of squares is
$$Q = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2.$$
The smaller the residual sum of squares, the better the univariate regression model fits.
When studying the correlation between the number of classroom head-ups and classroom test results, the univariate regression model is adopted to perform regression analysis on the two, thereby obtaining the correspondence between the two variables. The established model takes the number of head-ups as the only independent variable $x_1$, the corresponding classroom test result as the dependent variable $y$, and an error term $\varepsilon$ with $\varepsilon \sim N(0, \sigma^2)$; the univariate regression equation with the minimum residual sum of squares is solved from the data to obtain the correlation between the two variables.
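The univariate step above amounts to an ordinary least-squares line fit. A sketch with NumPy, using illustrative head-up counts and scores rather than data from the patent:

```python
import numpy as np

def fit_univariate(x, y):
    """Least-squares fit of y_hat = b0 + b1 * x.
    Returns (b0, b1, rss) where rss is the residual sum of squares."""
    b1, b0 = np.polyfit(x, y, 1)              # slope, intercept
    y_hat = b0 + b1 * np.asarray(x, dtype=float)
    rss = float(np.sum((np.asarray(y, dtype=float) - y_hat) ** 2))
    return b0, b1, rss
```

A smaller returned `rss` indicates a better-fitting line, matching the criterion stated above.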
The multiple regression model is:
In multiple regression analysis, the linear relationships among the independent variables are analyzed first, and independent variables with strong collinearity are excluded. The degree of linear correlation can be characterized by the correlation coefficient:
$$r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2 \sum_{i=1}^{n}(y_i - \bar{y})^2}}.$$
Let $y = g(x_1, x_2, \ldots, x_n) + \varepsilon$ be the multiple regression model, where $\varepsilon \sim N(0, \sigma^2)$. The solved multiple regression equation is expressed as
$$\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2 + \cdots + \hat{\beta}_n x_n.$$
The smaller the residual sum of squares, the better the multiple regression model fits.
When studying the correlation between changes in students' classroom emotions and the corresponding students' online test results, the multiple regression model is adopted for regression analysis. Four kinds of facial emotion change are set in the multiple regression model: happy, curious, confused and natural. The counts of these four variables for each student within a given time are taken as $x_2, x_3, x_4, x_5$, the personal online test result is taken as the dependent variable $y$, and the error term is $\varepsilon$ with $\varepsilon \sim N(0, \sigma^2)$. The multiple regression equation with the minimum residual sum of squares is obtained from analysis of the actual data.
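The multiple regression step can be sketched as a collinearity screen followed by a least-squares fit. The 0.9 correlation cut-off below is an assumed value, not one stated in the patent, and the data format is illustrative.

```python
import numpy as np

def fit_multiple(X, y, collinearity_cutoff=0.9):
    """X: (n_samples, n_features) counts of the four expressions; y: scores.
    Drops regressors strongly correlated with an already-kept one, then
    fits the rest by least squares. coef[0] is the intercept."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    corr = np.corrcoef(X, rowvar=False)        # pairwise correlation matrix
    kept = []
    for j in range(X.shape[1]):
        # keep column j only if it is not strongly collinear with a kept one
        if all(abs(corr[j, k]) < collinearity_cutoff for k in kept):
            kept.append(j)
    design = np.column_stack([np.ones(len(y)), X[:, kept]])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return kept, coef
```

With two perfectly collinear expression counts, only the first column survives the screen and the fit uses it alone.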
The invention has the beneficial effects that: classroom quality is monitored mainly through OpenCV + Dlib face feature technology, specifically by analyzing the students' head-up rate and deriving the influence of the head-up rate on scores. This realizes automatic detection of classroom quality; in later stages each student can be guided in a targeted manner according to his or her head-up rate, and the quality of a teacher's classroom teaching can also be obtained.
On the basis of the technical scheme, the invention can be further improved as follows.
Optionally, in some other embodiments, the image framing process includes OpenCV + Dlib automatic framing, gaussian image filter noise reduction, and image grayscale processing.
Compared with an image file obtained through OpenCV + Dlib automatic framing alone, feature extraction from the image file of this scheme is more accurate.
Optionally, in some other embodiments, as shown in fig. 3, the face recognition module 2 is configured to classify the expressions according to the facial feature information and the calculation parameter index, so as to obtain expression recognition information.
The calculation parameter indexes in this embodiment may include the following 7 parameters:
Parameter 1: face range parameters $(x_1, y_1, x_2, y_2)$, where $x_1$ and $y_1$ are the abscissa and ordinate of the lower-left corner of the face region, and $x_2$ and $y_2$ are the abscissa and ordinate of the upper-right corner of the face region.
Parameter 2: face boundary feature point coordinates, denoted $(a_{i,x}, a_{i,y})$, $i = 1, \ldots, 17$, from which an aspect ratio $\Delta a_x$ and a lower-face length ratio $\Delta a_y$ are computed (the ratio formulas appear only as images in the source).
Parameter 3: eyebrow feature point coordinates, denoted $(b_{i,x}, b_{i,y})$, $i = 1, \ldots, 10$, from which a brow expansion degree $\Delta b_x$ with critical point $k_1$ and a brow raising degree $\Delta b_y$ with critical point $k_2$ are computed (formulas given as images in the source).
Parameter 4: eye feature point coordinates, denoted $(c_{i,x}, c_{i,y})$, $i = 1, \ldots, 12$, from which an interocular distance degree $\Delta c_x$ with critical point $k_3$ and an eye opening degree $\Delta c_y$ with critical point $k_4$ are computed (formulas given as images in the source).
Parameter 5: nose feature point coordinates, denoted $(d_{i,x}, d_{i,y})$, $i = 1, \ldots, 9$; nose width $\Delta d_x = (d_{9,x} - d_{5,x})/(x_2 - x_1)$ with critical point $k_5$, and nose height $\Delta d_y = (d_{7,y} - d_{1,y})/(y_2 - y_1)$ with critical point $k_6$.
Parameter 6: mouth feature point coordinates, denoted $(e_{i,x}, e_{i,y})$, $i = 1, \ldots, 20$; mouth spread degree $\Delta e_x = (e_{20,x} - e_{1,x})/(x_2 - x_1)$ with critical point $k_7$, and mouth opening degree $\Delta e_y = (e_{10,y} - e_{11,y})/(y_2 - y_1)$ with critical point $k_8$.
Parameter 7: expression parameter thresholds $(k_1, k_2, \ldots, k_8)$. The relationships between the eyebrow, eye, nose and mouth ratios and the thresholds for the four expressions are:
Happy: $\Delta b_x > k_1,\ \Delta b_y > k_2,\ \Delta c_x > k_3,\ \Delta c_y < k_4,\ \Delta d_x > k_5,\ \Delta d_y < k_6,\ \Delta e_x > k_7,\ \Delta e_y > k_8$
Curious: $\Delta b_x < k_1,\ \Delta b_y > k_2,\ \Delta c_x < k_3,\ \Delta c_y > k_4,\ \Delta d_x < k_5,\ \Delta d_y > k_6,\ \Delta e_x < k_7,\ \Delta e_y > k_8$
Confused: $\Delta b_x < k_1,\ \Delta b_y < k_2,\ \Delta c_x < k_3,\ \Delta c_y < k_4,\ \Delta d_x > k_5,\ \Delta d_y < k_6,\ \Delta e_x < k_7,\ \Delta e_y < k_8$
Natural: $\Delta b_x > k_1,\ \Delta b_y < k_2,\ \Delta c_x > k_3,\ \Delta c_y > k_4,\ \Delta d_x < k_5,\ \Delta d_y > k_6,\ \Delta e_x > k_7,\ \Delta e_y < k_8$
By adding the calculation parameter index, the deficiency of OpenCV + Dlib can be optimized, and the accuracy is improved.
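The four comparison patterns above can be sketched as a simple lookup over the signs of the eight ratio-versus-threshold comparisons. The tuple ordering, threshold values, and example measurements below are illustrative assumptions, not values from the patent.

```python
# Comparison pattern per expression, following the relations above:
# True means the normalized ratio exceeds its critical point k_i.
PATTERNS = {
    "happy":    (True,  True,  True,  False, True,  False, True,  True),
    "curious":  (False, True,  False, True,  False, True,  False, True),
    "confused": (False, False, False, False, True,  False, False, False),
    "natural":  (True,  False, True,  True,  False, True,  True,  False),
}

def classify(ratios, thresholds):
    """ratios and thresholds are 8-tuples in the order
    (brow_x, brow_y, eye_x, eye_y, nose_x, nose_y, mouth_x, mouth_y)
    and (k1..k8). Returns the matching expression name, or None."""
    signs = tuple(r > k for r, k in zip(ratios, thresholds))
    for name, pattern in PATTERNS.items():
        if signs == pattern:
            return name
    return None
```

Returning `None` for an unmatched pattern corresponds to a frame that fits none of the four defined expressions.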
Optionally, in some other embodiments, the expression recognition information includes 4 expressions: curiosity, nature, confusion and happiness.
After this setting, the number of head-ups and the expression appearing at each head-up are considered together as two dimensions, each reflecting characteristics of the data from a different aspect, so that classroom quality can be checked from multiple angles; at the same time, problems can be located at a specific level, making them easier to find and solve.
As shown in fig. 2, in this embodiment a camera included in the data acquisition module 1 is first opened to obtain a video file. The face recognition module 2 then performs image framing on the video file to obtain an image file and extracts face feature information from it; when no face is detected, "no face" is output. When face feature information is extracted, the face features are located to obtain expression recognition information, and the calculation parameter indexes then yield an expression classification. Multiple listening states of a student are thereby obtained, which may include: no face, curious, natural, confused and happy.
Optionally, in some other embodiments, video files containing teaching contents with different difficulty gradients are obtained, and a relationship table of expression time and achievement corresponding to different difficulty gradients is obtained through the face recognition module 2 and the model analysis module 3.
After the arrangement, the contingency of a single classroom can be eliminated, the relation between the difficulty level of the classroom and the expression and test result of students can be explored, and the difficulty level of the classroom can be the difficulty level of the test questions in the classroom.
It should be noted that the above embodiments are product embodiments corresponding to the above method embodiments, and for the description of each structural device and the optional implementation in this embodiment, reference may be made to the corresponding description in the above method embodiments, and details are not repeated herein.
The reader should understand that in the description of this specification, reference to the description of the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (5)

1. A classroom quality monitoring system based on human face features, characterized by comprising: a data acquisition module, a face recognition module, a model analysis module and an output module;
the data acquisition module is used for acquiring a video file and a data file and sending the video file and the data file to the face recognition module;
the face recognition module is used for performing image framing processing on the video file to obtain an image file, the face recognition module is also used for performing feature extraction on the image file to obtain face feature information, the face recognition module is also used for obtaining expression recognition information according to the face feature information, and the face recognition module is also used for sending the face feature information and the expression recognition information to the model analysis module;
the model analysis module is used for obtaining a student basic information table according to the data file, the student basic information table comprising student information and, for each piece of student information, the corresponding student classroom exercise score information and student face feature information; the model analysis module is used for obtaining the student information corresponding to the face feature information according to the face feature information and the student basic information table; the model analysis module is further used for obtaining the head-up rate of each student according to the face feature information and the student information; the model analysis module is further used for obtaining the learning state of each student according to the face feature information and the expression recognition information; the model analysis module is used for creating a regression model and a time series model; and the model analysis module is further used for taking the student information, head-up rate, learning state and student classroom exercise score information of each student as preprocessing information and obtaining a relation table of expression time and scores by combining the regression model and the time series model;
and the output module is used for acquiring and outputting the expression time and score relation table.
2. The class quality monitoring system based on the human face features as claimed in claim 1, wherein: the image framing processing comprises automatic framing of OpenCV + Dlib, Gaussian image filter noise reduction and image gray level processing.
3. The class quality monitoring system based on the human face features as claimed in claim 1, wherein: the face recognition module is used for classifying the expressions according to the face feature information and the calculation parameter indexes to obtain expression recognition information.
4. The class quality monitoring system based on the human face features as claimed in claim 1, wherein: the expression identification information includes 4 expressions of happiness, curiosity, confusion and nature.
5. The class quality monitoring system based on the human face features as claimed in claim 1, wherein: and acquiring a video file containing teaching contents with different difficulty gradients, and acquiring a relation table of expression time and achievement corresponding to the different difficulty gradients through the face recognition module and the model analysis module.
CN201911020705.9A 2019-10-25 2019-10-25 Classroom quality monitoring system based on human face characteristics Pending CN110766580A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911020705.9A CN110766580A (en) 2019-10-25 2019-10-25 Classroom quality monitoring system based on human face characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911020705.9A CN110766580A (en) 2019-10-25 2019-10-25 Classroom quality monitoring system based on human face characteristics

Publications (1)

Publication Number Publication Date
CN110766580A true CN110766580A (en) 2020-02-07

Family

ID=69333510

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911020705.9A Pending CN110766580A (en) 2019-10-25 2019-10-25 Classroom quality monitoring system based on human face characteristics

Country Status (1)

Country Link
CN (1) CN110766580A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190034779A1 (en) * 2017-07-28 2019-01-31 Cannex Technology Inc. Digital Learning Smart Identification Card Structure
CN108596057A (en) * 2018-04-11 2018-09-28 重庆第二师范学院 A kind of Information Security Management System based on recognition of face
CN109034014A (en) * 2018-07-10 2018-12-18 天津瑟威兰斯科技有限公司 Biopsy method based on the micro- Expression Recognition of face
CN109815795A (en) * 2018-12-14 2019-05-28 深圳壹账通智能科技有限公司 Classroom student's state analysis method and device based on face monitoring
CN110232343A (en) * 2019-06-04 2019-09-13 重庆第二师范学院 Children personalized behavioral statistics analysis system and method based on latent variable model

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111444389A (en) * 2020-03-27 2020-07-24 焦点科技股份有限公司 Conference video analysis method and system based on target detection
CN112464896A (en) * 2020-12-14 2021-03-09 北京易华录信息技术股份有限公司 Physical and mental state analysis system based on student behaviors
CN112598550A (en) * 2020-12-24 2021-04-02 苏州大学 Student activity multidimensional management system and management method based on behavior analysis
CN112598550B (en) * 2020-12-24 2024-03-26 苏州大学 Student activity multidimensional management system and method based on behavior analysis
CN112732770A (en) * 2021-02-05 2021-04-30 嘉兴南洋职业技术学院 Educational administration management system and method based on artificial intelligence

Similar Documents

Publication Publication Date Title
CN110766580A (en) Classroom quality monitoring system based on human face characteristics
CN108399376B (en) Intelligent analysis method and system for classroom learning interest of students
CN106878677B (en) Student classroom mastery degree evaluation system and method based on multiple sensors
CN105516280B (en) A kind of Multimodal Learning process state information packed record method
WO2021077382A1 (en) Method and apparatus for determining learning state, and intelligent robot
CN111242049A (en) Student online class learning state evaluation method and system based on facial recognition
CN112183238B (en) Remote education attention detection method and system
CN111523444B (en) Classroom behavior detection method based on improved Openpost model and facial micro-expression
CN113657168B (en) Student learning emotion recognition method based on convolutional neural network
CN108876195A (en) A kind of intelligentized teachers ' teaching quality evaluating system
CN112883867A (en) Student online learning evaluation method and system based on image emotion analysis
CN113920534A (en) Method, system and storage medium for extracting video highlight
CN111523445A (en) Examination behavior detection method based on improved Openpos model and facial micro-expression
Villegas-Ch et al. Identification of emotions from facial gestures in a teaching environment with the use of machine learning techniques
CN113989608A (en) Student experiment classroom behavior identification method based on top vision
CN113609963A (en) Real-time multi-human-body-angle smoking behavior detection method
Shah et al. Assessment of student attentiveness to e-learning by monitoring behavioural elements
CN115937928A (en) Learning state monitoring method and system based on multi-vision feature fusion
CN115689340A (en) Classroom quality monitoring system based on colorful dynamic human face features
Jiang Analysis of Students’ Role Perceptions and their Tendencies in Classroom Education Based on Visual Inspection
CN109447863A (en) A kind of 4MAT real-time analysis method and system
EP4187504A8 (en) Method for training text classification model, apparatus, storage medium and computer program product
CN115050075A (en) Cross-granularity interactive learning micro-expression image labeling method and device
CN114783242A (en) Drawing teaching method and device for online education
CN114550918A (en) Mental disorder evaluation method and system based on drawing characteristic data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200207