CN115829234A - Automatic supervision system based on classroom detection and working method thereof

Info

Publication number: CN115829234A
Application number: CN202211408418.7A
Authority: CN (China)
Prior art keywords: classroom, teaching, head, class, students
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 肖宇, 许炜, 刘馨, 刘侃
Current Assignee: Wuhan Tiantian Interactive Technology Co ltd
Original Assignee: Wuhan Tiantian Interactive Technology Co ltd
Application filed by: Wuhan Tiantian Interactive Technology Co ltd
Priority date / Filing date: 2022-11-10
Publication date: 2023-03-21

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The invention belongs to the field of intelligent applications of educational big data, and provides an automatic supervision system based on classroom detection and a working method thereof. The automatic supervision system comprises a classroom information acquisition module, a video inspection module, an automatic attendance statistics module, an attention detection module, a teaching activity calculation module, a teaching visual analysis module and a supervision evaluation module. The working method of the automatic supervision system comprises the following steps: (1) classroom information acquisition; (2) video inspection; (3) automatic attendance statistics; (4) attention detection; (5) teaching activity calculation; (6) teaching visual analysis; and (7) supervision and evaluation. The invention provides a new approach and an intelligent method for classroom teaching supervision.

Description

Automatic supervision system based on classroom detection and working method thereof
Technical Field
The invention belongs to the field of intelligent applications of educational big data, and particularly relates to an automatic supervision system based on classroom detection and a working method thereof.
Background
The rapid development of new technologies such as big data, artificial intelligence and learning analytics has raised the requirements for teaching resource application, teacher teaching evaluation and teaching behavior analysis. Many smart classrooms built in schools at all levels are equipped with recording and broadcasting devices that support AI-based analysis of classroom teaching; they can record course videos and provide functions such as automatic uploading and live classroom broadcasting, so visual teaching supervision through online class patrol has become possible. However, in the current teaching supervision work of many schools, experts are still invited to evaluate courses on site or to review videos, checking, evaluating and giving feedback to correct deviations in teachers' teaching. This supervision mode is time-consuming and labor-intensive, and although massive course video data are collected, no systematic overall evidence is formed. Introducing computer vision technology into classroom behavior analysis and classroom evaluation has already produced numerous research results. Combined with related techniques for measuring classroom attention, intelligent and precise supervision of teaching can be realized: the attention of students and the behavioral state of teachers during teaching can be characterized, the application level of classroom teaching in intelligent perception, data modeling and precise assistance can be further improved, the upgrading of teaching spaces, teaching modes and teaching processes can be promoted, and important support can be provided for the digital transformation of education.
Current video supervision systems for classroom teaching still have many problems. (1) Data acquisition is not intelligent enough. A patrol supervision system based on network video can only be watched, not analyzed: a supervisor can remotely call up the live video of a course or review a recorded course video frame by frame, but the system cannot automatically generate analysis and supervision evaluation reports for classroom teaching, so the time supervisors spend watching videos is difficult to reduce substantially. (2) Auxiliary supervision is not precise enough. Although existing systems can visually display information such as classroom operation, equipment operation, teaching environment, operation and maintenance support and time prompts with line charts, bar charts, pie charts and tables, no supervision evaluation model is established in the system. If classroom teaching quality cannot be assessed on the basis of teacher instruction, student attention and interactive behavior, the system cannot help supervisors make correct decision feedback or precisely intervene in the teaching process.
Disclosure of Invention
Aiming at the defects of the prior art and the requirements for improvement, the invention provides an automatic supervision system based on classroom detection and a working method thereof, providing a new approach and an intelligent method for classroom teaching supervision.
The object of the invention is achieved by the following technical measures.
The invention provides an automatic supervision system based on classroom detection, which comprises a classroom information acquisition module, a video inspection module, an automatic attendance statistics module, an attention detection module, a teaching activity calculation module, a teaching visual analysis module and a supervision evaluation module.
The classroom information acquisition module is used for recording and uploading high-definition video of student learning, teacher teaching and the teaching content on the classroom computer.
The video inspection module is used for generating a patrol timetable, so that a supervisor can conveniently watch live video or review recordings and view the associated video, audio and related statistics.
The automatic attendance statistics module is used for counting the attendance rate, on-time class start and early dismissal of inspected classes, and for issuing an early warning when an abnormal event is detected.
The attention detection module is used for detecting the head-up rate and nodding frequency of the students in each class.
The teaching activity calculation module is used for calculating the classroom activity index, constructing a classroom attention model and calculating the students' classroom attention value.
The teaching visual analysis module is used for displaying the analysis results of classroom interaction events and student attention with thumbnails, scatter diagrams, bar charts and line charts.
The supervision evaluation module is used for making a comprehensive evaluation of the students' learning effect, the teacher's teaching behavior and the overall teaching effect of a given class.
The invention also provides a working method of the automatic supervision system based on classroom detection, which comprises the following steps:
(1) Classroom information acquisition: high-definition pictures of teacher teaching and student learning are acquired with cameras at the rear and front of the classroom respectively; a plug-in captures the teaching content and the teacher's on-screen annotations on the classroom computer, which are uploaded to the school server in MP4 format for storage;
(2) Video inspection: course scheduling information is exported from the educational administration system, a patrol timetable is generated, and the states of courses are distinguished by different colors; the supervisor selects a course to watch live or to review the recorded course video; for a reviewed course, the supervisor can consult the attendance and activity indexes of the course and call up the teaching content and audio at any time;
(3) Automatic attendance statistics: course and teacher-student information is collected according to the patrol timetable, and an image feature data set of classroom attendees in each period is constructed; the attendance rate, on-time class start and early dismissal are calculated with a video people-counting algorithm; if a relevant abnormal event is detected, an abnormality early warning is issued;
(4) Attention detection: the head regions and postures of students in the video sequence are obtained with an object detection algorithm, and the postures are classified by classifiers of different scales; the head-up time of students in each class is counted and the head-up rate is calculated; based on the head posture detection results, nodding actions are recognized and the average nodding frequency of the students in each class is calculated;
(5) Teaching activity calculation: the number of head posture changes of students is counted with the head posture detection algorithm, the activity index is calculated, and the classroom atmosphere is graded according to set thresholds; the students' head-up rate, nodding frequency and activity index are obtained with a multi-scale detection scheme and an RFB model; the students' classroom attention value is calculated by combining the head-up rate and the activity index;
(6) Teaching visual analysis: classroom interaction events are presented with thumbnails, audio data are superimposed, and start and end times are marked on the time axis; with the head-up rate and nodding frequency as the horizontal and vertical axes and the per-minute values as data points, an attention scatter diagram is drawn; the trends of the nodding frequency and head-up rate are drawn as bar charts and line charts, with the maximum and minimum values labeled;
(7) Supervision and evaluation: the student attention scatter points are re-partitioned according to head-up rate and nodding frequency thresholds, and the learning effect is evaluated by how the data points cluster; the teacher's teaching style and ability level are judged from the speech recognition and visual detection results; the supervisor is assisted in making a comprehensive teaching effect evaluation based on the prerequisite judgment, learning effect, teaching behavior and classroom atmosphere indexes.
The beneficial effects of the invention are as follows:
the method comprises the steps of collecting high-definition pictures of learning of students and teaching of teachers by using front and rear cameras of a classroom respectively, intercepting teaching contents on a computer screen of the classroom by using plug-ins, and uploading the teaching contents to a server in the classroom for storage. The generation patrols and examines the course table, uses the state that different colours distinguished the course, and the governor can choose to watch live broadcast or watch the course video again, consults the attendance of course and active index, the teaching content of giving lessons and pronunciation. And calculating the attendance rate, the on-time class opening and the class early-quit condition by using a video people counting algorithm, and sending out an abnormal condition early warning if a related abnormal event is detected. And acquiring the head-up rate and the average head nodding frequency of the students in the video sequence by using a target detection algorithm. And calculating the classroom activity index, dividing classroom atmosphere, and calculating the classroom attention value of the student by combining the head raising rate and the activity index. And displaying analysis results of classroom interaction events and student attention by using the thumbnails, the scatter diagrams and the bar charts and the line graphs. Evaluating the learning effect according to the head-up rate and the focusing degree of head nodding frequency, and judging the teaching style and the ability level of a teacher by using voice recognition and visual detection results; and assisting a supervisor to make comprehensive teaching effect evaluation according to the prerequisites, the learning effect, the teaching behaviors and the classroom atmosphere indexes. With the rapid application and fusion of artificial intelligence, big data and data analysis technologies in the education field, the interference-free identification and analysis of classroom detection has wide application prospects in teaching supervision.
Drawings
Fig. 1 is a diagram of an architecture of an automated supervision system based on classroom teaching detection in an embodiment of the present invention.
Fig. 2 is a schematic diagram of a teacher teaching high-definition picture collected in the embodiment of the present invention.
Fig. 3 is a schematic diagram of a student learning high-definition picture collected in the embodiment of the invention.
Fig. 4 is a flow chart of the video inspection work in the embodiment of the present invention.
FIG. 5 is a diagram illustrating inspection course presentation according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of detecting the head posture of a student in the embodiment of the invention.
Fig. 7 is a schematic diagram of classroom head-up rate and nodding frequency statistics in an embodiment of the invention.
Fig. 8 is a schematic view of evaluation of learning effect in the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1, this embodiment provides an automatic supervision system based on classroom detection, comprising a classroom information acquisition module, a video inspection module, an automatic attendance statistics module, an attention detection module, a teaching activity calculation module, a teaching visual analysis module and a supervision evaluation module.
The working method of the automatic supervision system based on classroom detection in this embodiment is described in further detail below; the method comprises the following steps:
(1) Classroom information acquisition. High-definition pictures of teacher teaching and student learning are acquired with cameras at the rear and front of the classroom respectively; the teaching content and the teacher's on-screen annotations on the classroom computer are captured with a plug-in and uploaded to the school server in MP4 format for storage.
(1-1) Teacher teaching picture acquisition. A camera installed in the middle of the rear of the classroom collects video of classroom teaching from back to front according to the high-definition video acquisition requirement, recording in high definition the teacher's teaching behavior as shown in fig. 2, the calculation process written on the whiteboard, the demonstrated teaching videos and PPT pictures, and the interaction between the teacher and the students.
(1-2) Student learning behavior acquisition. A camera above the electronic whiteboard in the middle of the front of the classroom collects high-definition video of classroom teaching from front to back, recording in real time the students' learning behaviors as shown in fig. 3: raising their heads to look at the whiteboard and the teacher, lowering their heads to read books or look at mobile phones, and exchanges and discussions among the students.
(1-3) Classroom computer screen acquisition. A screen content acquisition plug-in is installed on the classroom host computer. When the teacher starts the class with the PPT, video and web teaching resources prepared for the lesson, the acquisition plug-in is started; it captures the teaching content and the teacher's writing and annotations on the classroom computer at a rate of one frame per second, compresses the result into an MP4 video, and uploads it to the server in the classroom for storage.
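As an illustration of step (1-3), the following is a minimal Python sketch of one-frame-per-second screen capture compressed to MP4 using the mss and OpenCV libraries; the output path, capture length and monitor index are assumptions for illustration, not values from the original disclosure.

```python
import time
import cv2
import mss
import numpy as np

# Capture the classroom computer screen at one frame per second and
# write the frames to an MP4 file (path and duration are assumed).
OUTPUT_PATH = "classroom_screen.mp4"   # hypothetical output location
CAPTURE_SECONDS = 60                   # capture length for the sketch

with mss.mss() as screen:
    monitor = screen.monitors[1]                     # primary monitor
    width, height = monitor["width"], monitor["height"]
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    writer = cv2.VideoWriter(OUTPUT_PATH, fourcc, 1.0, (width, height))

    for _ in range(CAPTURE_SECONDS):
        frame = np.array(screen.grab(monitor))       # BGRA screenshot
        frame = cv2.cvtColor(frame, cv2.COLOR_BGRA2BGR)
        writer.write(frame)                          # one frame per second
        time.sleep(1)

    writer.release()
```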
(2) Video inspection. Course scheduling information is exported from the educational administration system, a patrol timetable is generated, and the states of courses are distinguished by light yellow, green and white. As shown in fig. 4, the supervisor may select a green or light-yellow entry to watch the live broadcast or review the course video; for a reviewed course, the supervisor can consult the attendance and activity indexes of the course and call up the teaching content and audio at any time.
(2-1) Patrol timetable generation. The classroom scheduling information and the teacher and student timetables for the current term are exported from the school educational administration system, and the patrol timetable is generated hierarchically by teaching building, floor and classroom. As shown in fig. 5, for every class period in each classroom it displays the teacher, course and class, and the states finished, in progress and not yet started are shown in light yellow, green and white respectively.
(2-2) Course video review. The supervisor logs in to the patrol timetable page and clicks a green entry to view the live video from the front and rear classroom cameras and watch the process and state of the teacher teaching and the students listening in real time, or selects a light-yellow entry to call up and review the finished course video from the school server.
(2-3) Associated information viewing. While watching the video, the supervisor can check the students' attendance and any teacher-absence warning for the class, line charts and bar charts of the students' head-up rate and nodding frequency during the lesson, and the classroom interactive teaching events at any moment, including the teacher's annotations, the blackboard-writing process and teacher-student interaction, as well as the audio and video of the classroom teaching.
(3) Automatic attendance statistics. Course and teacher-student information is collected according to the patrol timetable, and an image feature data set of classroom attendees in each period is constructed; the attendance rate, on-time class start and early dismissal are calculated with a video people-counting algorithm; if a relevant abnormal event is detected, an abnormality early warning is issued.
(3-1) Course and teacher-student information collection. According to the patrol timetable, the classroom location, class time, teacher and student ID numbers and the image features of the registered teacher and student photos for each course are retrieved from the educational administration information system, and an image feature data set of the faces of the teachers and students of every class in each classroom is constructed.
(3-2) Attendance rate acquisition. One minute after each course begins, the video people-counting algorithm is started to detect the number of people present in the classroom, and the ratio of the number present to the number enrolled gives the attendance rate of the class; student identities in the video are recognized with a face detection algorithm, and after recognition and comparison the unmatched and absent students are automatically uploaded to the educational administration system. The video people-counting algorithm comprises the following steps:
i: calling a front-end camera video of a classroom, and extracting one frame in a video sequence every second;
II: using an image filter algorithm, converting the extracted image frames into a grayscale map using equation 1:
gray = Red 0.299+ Green 0.587+ blue 0.114 (equation 1)
Wherein Red, green and Blue are the Red, green and Blue values of the pixels in the image frame respectively;
III: and (3) extracting the face characteristics of the students in the gray image by using a Haar-like operator, wherein a formula 2 is a calculation process of the Haar-like operator:
value=p white -αp black (formula 2)
Wherein p is white Indicating a white area in a gray image, p black Denotes a black region in a gray image, and α denotes a ratio of areas of a white region and a black region;
IV: the method of the integral graph is used for accelerating the operation speed of the extraction of the Haar-like operator characteristics, and a formula 3 is a specific calculation formula of the integral graph:
I(x,y)=∑ x′≤x,y′≤y img (x ', y') (equation 3)
Wherein I (x, y) represents a pixel value at a point (x, y) on the integral map, img (x ', y') represents a pixel value at (x ', y') in the gray face image;
v: using Haar-like extraction features as weak classifiers, increasing the weight of the wrongly classified samples, reducing the weight of correctly classified samples, obtaining new sample distribution, and training the next weak classifier by using the new sample distribution;
VI: and obtaining T weak classifiers through T cycles, wherein a formula 4 is that the T weak classifiers form a final classifier model in a cascading structure:
Figure BDA0003937523020000091
wherein w i For the classification result of the ith sample, if the sample is misclassified, w i =1, otherwise w i =0,h t Weak classifiers obtained for the t-th cycle.
VII: and detecting the faces of the students in each frame of image picture by using the final classifier model, and acquiring the number of the real students in the classroom.
(3-3) Teacher absence warning. After the class begins, the video from the rear classroom camera is called up and the face of the teacher at the front of the classroom is detected. If a face is successfully extracted, it is compared with the stored image features of the teachers, and a teacher-change notification is sent to the educational administration department when it does not match the scheduled teacher; if no face is extracted, a teacher late-arrival warning is sent to the educational administration department, and the comparison is repeated every minute until recognition succeeds.
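A minimal sketch of the comparison loop in step (3-3), using the open-source face_recognition library; the image paths, the 60-second retry interval and the notify/grab helper functions are assumptions for illustration, not part of the original disclosure.

```python
import time
import face_recognition

# Hypothetical inputs: a registered photo of the scheduled teacher and
# frames grabbed from the rear classroom camera once per minute.
registered = face_recognition.load_image_file("teacher_registered.jpg")
registered_encoding = face_recognition.face_encodings(registered)[0]

def notify(message: str) -> None:
    # Placeholder for the report to the educational administration system.
    print(message)

def grab_rear_camera_frame():
    # Placeholder: in the real system this would read the rear-camera stream.
    return face_recognition.load_image_file("rear_camera_frame.jpg")

recognized = False
while not recognized:
    frame = grab_rear_camera_frame()
    encodings = face_recognition.face_encodings(frame)
    if not encodings:
        notify("teacher late-arrival warning")       # no face at the podium
    elif face_recognition.compare_faces([registered_encoding], encodings[0])[0]:
        recognized = True                            # scheduled teacher present
    else:
        notify("teacher-change notification")        # a different teacher detected
    if not recognized:
        time.sleep(60)                               # retry every minute
```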
(4) Attention detection. The head regions and postures of students in the video sequence are obtained with an object detection algorithm, and the postures are classified by classifiers of different scales; the head-up time of students in each class is counted and the head-up rate is calculated; based on the head posture detection results, the students' nodding actions are recognized and the average nodding frequency of the students in each class is calculated.
(4-1) Student head posture detection. An object-detection-based method is used to obtain the head regions of students in the video sequence from the front classroom camera, as shown in fig. 6; the head regions are divided into small-scale and normal-scale regions according to their size, and classifiers of the corresponding scales are used to classify the head postures of the front-row and back-row students respectively. The student posture detection algorithm is as follows:
i: calling a high-definition video sequence collected by a front-end camera of a classroom;
II: constructing a head detection network by using a convolutional neural network layer, an RFB module and a down-sampling layer, wherein the input of the head detection network is each frame of a video sequence, and the output of the head detection network is a student head area which succeeds in advance;
III: the total loss value of the head detection network comprises two parts of a classification branch loss value and a regression branch loss value, and a cross entropy loss function is adopted for the loss function of the classification branch by formula 5:
Figure BDA0003937523020000101
wherein y is i Is the true category of the ith sample,
Figure BDA0003937523020000102
predicting the class of the ith sample for the model, iBelongs to pos and i belongs to neg respectively represent the classification correctness and classification error of the ith sample point, N and M respectively represent the number of the sample points with the classification correctness and the classification error,
The regression branch uses the Balanced L1 loss function, Equation 6:
L_b(x) = (α/b)·(b·|x| + 1)·ln(b·|x| + 1) - α·|x|, if |x| < 1
L_b(x) = γ·|x| + C, otherwise   (Equation 6)
where α = 0.5, γ = 1.5, b satisfies α·ln(b + 1) = γ, and C is an arbitrary constant.
IV: constructing an ECA-ResNet module by using a global convolution, a one-dimensional convolution with the size of an adaptive convolution kernel, a sigmoid activation function and a residual network, wherein the global convolution can down-sample the size of an input H x W x C feature graph to (1 x C), and calculating the size of the convolution kernel by adopting a formula 7:
Figure BDA0003937523020000105
wherein K is the size of the convolution kernel, C is the number of channels of the feature map, γ =2, b =1, and if K is an even number, the size of the convolution kernel is K +1;
v: the convolutional layers, the batch normalization layer, the correction linear unit, the maximum pooling layer, the four ECA-ResNet modules, the average pooling layer, the full-connection layer and the normalization index function layer are sequentially stacked to construct a multi-scale head posture classification network, the input is a student head area, and the output is two head postures of head raising and head lowering.
(4-2) Head-up rate statistics. Along the time axis of the front-camera video sequence, the number of students with their heads up in each second is calculated and divided by the total number of students in the classroom to give the instantaneous head-up rate; according to the patrol timetable information, the average of all instantaneous head-up rates of the class is taken as the overall head-up rate of the class.
(4-3) Nodding frequency calculation. Based on the head posture detection results of all students, the pitch, yaw and roll of each student's head are estimated to recognize nodding, head-shaking and head-turning actions; the class is divided into one-minute intervals, the number of nodding actions of each student is counted, and the average nodding frequency of all students in the class is calculated.
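A minimal sketch of one possible way to count nods from a per-second head-pitch sequence: a nod is counted when the pitch drops below a threshold and then comes back up. The threshold values and the per-second sampling are illustrative assumptions; the patent does not specify them.

```python
from typing import List

def count_nods(pitch_deg: List[float], down_thresh: float = -15.0,
               up_thresh: float = -5.0) -> int:
    """Count nods in a sequence of head pitch angles (degrees, one per second).

    A nod is one excursion below `down_thresh` followed by a return above
    `up_thresh` (simple hysteresis to avoid double-counting jitter).
    """
    nods = 0
    head_down = False
    for pitch in pitch_deg:
        if not head_down and pitch < down_thresh:
            head_down = True                 # head has dipped
        elif head_down and pitch > up_thresh:
            head_down = False                # head came back up: one nod
            nods += 1
    return nods

# Example: a one-minute pitch trace containing two nods.
trace = [0, -2, -18, -20, -4, 0, -1, -17, -3, 0] + [0] * 50
print(count_nods(trace))   # 2
```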
(5) Teaching activity calculation. The number of head posture changes of students is counted with the head posture detection algorithm, the activity index is calculated, and the classroom atmosphere is graded according to set thresholds; the students' head-up rate, nodding frequency and activity index are obtained with a multi-scale detection scheme and an RFB model; the students' classroom attention value is calculated by combining the head-up rate and the activity index.
(5-1) Activity index calculation. The number of head posture changes of the students per minute is counted with the head posture detection algorithm to obtain the frequency of head posture change; the counts of all students are summed and averaged, and with set thresholds the classroom atmosphere is graded as highly active, neutral, inactive or highly inactive.
A T-minute interval is taken as the observation window of classroom activity; the number of head posture changes of each student within the T minutes is counted and averaged over the number of students present, and the classroom activity index for that window is calculated with Equation 8:
AR_T = (1/N_T)·Σ_{i=1}^{N_T} Σ_{t=2}^{T} | s_i(t) - s_i(t-1) |   (Equation 8)
where N_T is the mode of the number of students detected during the period T, and s_i(t) is the head-up state of student i at time t, taking the value 1 if a head-up is detected and 0 otherwise.
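A small numerical sketch of the activity index described in (5-1), following the reconstruction of Equation 8 above: the index over a window is the average number of head-up/head-down transitions per detected student. The example values are made up.

```python
import numpy as np

def activity_index(head_up: np.ndarray) -> float:
    """Equation 8 (as reconstructed): average number of head-state changes.

    head_up: array of shape (students, T) with 1 = head up, 0 = head down,
    one sample per time step of the observation window.
    """
    changes = np.abs(np.diff(head_up, axis=1))   # |s_i(t) - s_i(t-1)|
    return changes.sum() / head_up.shape[0]      # average over detected students

# Example: 3 students observed over 6 time steps.
window = np.array([[1, 1, 0, 0, 1, 1],    # 2 changes
                   [1, 1, 1, 1, 1, 1],    # 0 changes
                   [0, 1, 0, 1, 0, 1]])   # 5 changes
print(activity_index(window))             # (2 + 0 + 5) / 3 = 2.33...
```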
(5-2) Classroom attention model construction. Considering that the heads of students seated in a classroom are fairly uniform overall, a two-stage multi-scale detection-then-classification scheme is adopted: the RFB model is used to detect the student head regions and head postures, and the students' head-up rate, nodding frequency and activity index are calculated.
(5-3) Student attention value calculation. The overall attention of the students in a period is judged by combining the head-up rate and the activity index: the head-up rate and activity index are calculated for every minute of the class, each is summed over the class and divided by the class duration, and the two results are multiplied to give the students' classroom attention value for that class. Equation 9 gives the calculation:
Attention = ( (1/T)·Σ_{t=1}^{T} AR_t ) × ( (1/T)·Σ_{t=1}^{T} UR_t )   (Equation 9)
where T is the duration of the whole class in minutes, AR_t is the classroom activity index in the t-th minute, and UR_t is the classroom head-up rate in the t-th minute.
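A short sketch of Equation 9 under the same reconstruction caveat: the per-minute activity indexes and head-up rates are each averaged over the class and the two averages are multiplied. The numbers are illustrative only.

```python
def classroom_attention(activity_per_min, headup_per_min):
    """Equation 9 (as reconstructed): product of the two per-class averages."""
    T = len(activity_per_min)
    assert T == len(headup_per_min) and T > 0
    mean_activity = sum(activity_per_min) / T
    mean_headup = sum(headup_per_min) / T
    return mean_activity * mean_headup

# Example: a 5-minute class fragment (illustrative numbers only).
print(classroom_attention([2.0, 1.5, 3.0, 2.5, 2.0],     # AR_t per minute
                          [0.8, 0.7, 0.9, 0.85, 0.75]))  # UR_t per minute
```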
(6) Teaching visual analysis. Classroom interaction events are displayed with thumbnails, the audio data are superimposed, and start and end times are marked on the time axis; with the head-up rate and nodding frequency as the horizontal and vertical axes and the per-minute values as data points, an attention scatter diagram is drawn; the trends of the nodding frequency and head-up rate are drawn as bar charts and line charts, with the maximum and minimum values labeled.
(6-1) Classroom interaction event recording. The teacher computer pictures captured in step (1-3) are displayed as thumbnails along the time axis of the course; the writing process on the electronic whiteboard, the students' answer records and the teacher's annotations are superimposed along the timeline, together with the classroom audio recorded by the classroom microphone, and the moments when teacher-student interaction starts and ends are marked with red marker lines.
(6-2) Student attention analysis. With the head-up rate and nodding frequency as the horizontal and vertical axes of a Cartesian coordinate system, the head-up rate and nodding frequency of the students in each minute are calculated; as shown in fig. 7, the data points of the whole class are located in the coordinate system and plotted as solid blue circles, presenting the student attention scatter diagram of the class.
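A minimal matplotlib sketch of the scatter diagram described in (6-2); the per-minute values are made up for illustration.

```python
import matplotlib.pyplot as plt

# Illustrative per-minute values (not real data): head-up rate in [0, 1]
# on the horizontal axis, nodding frequency (nods/min) on the vertical axis.
headup_rate = [0.55, 0.62, 0.70, 0.48, 0.80, 0.75, 0.66]
nod_frequency = [2.0, 2.5, 3.5, 1.5, 4.0, 3.0, 2.8]

plt.scatter(headup_rate, nod_frequency, color="blue")   # solid blue circles
plt.xlabel("head-up rate")
plt.ylabel("nodding frequency (per minute)")
plt.title("Student attention scatter diagram")
plt.savefig("attention_scatter.png")
```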
(6-3) Classroom head-up rate and nodding frequency statistics. As shown in fig. 8, the time reference of each classroom interaction event is marked along the time axis of the course, the nodding frequency and head-up rate are drawn as a bar chart and a line chart respectively with their maximum and minimum values labeled, and the supervisor is supported in consulting the numerical changes at each moment on the time axis.
(7) Supervision and evaluation. The student attention scatter points are re-partitioned according to head-up rate and nodding frequency thresholds, and the learning effect is evaluated by how the data points cluster; the teacher's teaching style and ability level are judged from the speech recognition and visual detection results; the supervisor is assisted in making a comprehensive teaching effect evaluation based on the prerequisite judgment, learning effect, teaching behavior and classroom atmosphere indexes.
(7-1) Learning effect evaluation. Thresholds for the students' head-up rate and nodding frequency are set for each class period; with that point as a new coordinate origin, a new coordinate system parallel to the original X and Y axes is drawn in the student attention scatter diagram, the learning effects of quadrants 1 to 4 are defined as excellent, good, poor and good respectively, and the quadrant in which the data points cluster most densely gives the learning effect evaluation for the class. The specific steps of the learning effect evaluation are as follows:
i: extracting the average values of the student raising rate and the nodding frequency accumulated in the system to be 3% and 50%;
II: according to the value ranges (0, 100%) and (0, 5) of the horizontal and vertical axes in the student attention scatter diagram, taking (3, 50%) as the origin of coordinates, drawing a new coordinate system;
III: the new coordinate system divides the student attention scatter diagram into four quadrants, the head raising rate and the head nodding frequency of the first quadrant exceed the average value and are defined as excellent, only one index of the head raising rate and the head nodding frequency of the second quadrant and the fourth quadrant exceeds the average value and are defined as good, and the head raising rate and the head nodding frequency of the third quadrant do not exceed the average value and are defined as poor;
IV: and counting the number of data points in the new quadrant, and taking the attention state of the student in the classroom presented at most of time as the basis of the final study effect evaluation.
(7-2) Teaching behavior evaluation. Speech recognition and visual detection are combined to classify the classroom teacher's behavior into teaching, inspection, discussion and silence; the duration and sequencing of each behavior are counted by time slice to derive the teacher's teaching strategy, teaching path and level of engagement, and the teacher's teaching style and ability level are judged.
(7-3) Comprehensive teaching effect evaluation. The supervisor gives a prerequisite judgment according to whether the class teacher arrived late, was absent or was substituted; corresponding weights are assigned to the students' learning effect and the teacher's teaching behavior during the lesson, combined with the activity level of the classroom atmosphere, and the weighted terms are multiplied and accumulated to produce the comprehensive teaching effect evaluation.
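For illustration only, a minimal sketch of one way such a weighted aggregation could look; the weights, score scales and the gating by the prerequisite judgment are assumptions, since the patent does not give concrete values.

```python
def comprehensive_evaluation(prerequisite_ok: bool, learning_effect: float,
                             teaching_behavior: float, atmosphere: float,
                             weights=(0.4, 0.4, 0.2)) -> float:
    """Weighted accumulation of the three indexes (all assumed to lie in [0, 1]).

    The prerequisite judgment (late arrival / absence / substitution) gates
    the score; the weights are illustrative, not from the original.
    """
    if not prerequisite_ok:
        return 0.0
    w_learn, w_teach, w_atmos = weights
    return (w_learn * learning_effect
            + w_teach * teaching_behavior
            + w_atmos * atmosphere)

# Example: a class with good indexes and no prerequisite problems.
print(comprehensive_evaluation(True, 0.85, 0.80, 0.70))   # 0.80
```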
Details not described in the present specification belong to the prior art known to those skilled in the art.
It will be understood by those skilled in the art that the foregoing is merely a preferred embodiment of the present invention, and is not intended to limit the invention, such that any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall be included within the scope of the present invention.

Claims (9)

1. An automatic supervision system based on classroom detection, characterized in that it comprises a classroom information acquisition module, a video inspection module, an automatic attendance statistics module, an attention detection module, a teaching activity calculation module, a teaching visual analysis module and a supervision evaluation module;
the classroom information acquisition module is used for recording and uploading high-definition video of student learning, teacher teaching and the teaching content on the classroom computer;
the video inspection module is used for generating a patrol timetable, so that a supervisor can conveniently watch live video or review recordings and view the associated video, audio and related statistics;
the automatic attendance statistics module is used for counting the attendance rate, on-time class start and early dismissal of inspected classes, and for issuing an early warning when an abnormal event is detected;
the attention detection module is used for detecting the head-up rate and nodding frequency of the students in each class; the teaching activity calculation module is used for calculating the classroom activity index, constructing a classroom attention model and calculating the students' classroom attention value;
the teaching visual analysis module is used for displaying the analysis results of classroom interaction events and student attention;
the supervision evaluation module is used for making a comprehensive evaluation of the students' learning effect, the teacher's teaching behavior and the overall teaching effect of a given class.
2. A working method of the automatic supervision system based on classroom detection, characterized by comprising the following steps:
(1) Classroom information acquisition: high-definition pictures of teacher teaching and student learning are acquired with cameras at the rear and front of the classroom respectively; a plug-in captures the teaching content and the teacher's on-screen annotations on the classroom computer, which are uploaded to the school server in MP4 format for storage;
(2) Video inspection: course scheduling information is exported from the educational administration system to generate a patrol timetable, and the states of courses are distinguished by different colors; the supervisor selects a course to watch live or to review the recorded course video; for a reviewed course, the supervisor can consult the attendance and activity indexes of the course and call up the teaching content and audio at any time;
(3) Automatic attendance statistics: course and teacher-student information is collected according to the patrol timetable, and an image feature data set of classroom attendees in each period is constructed; the attendance rate, on-time class start and early dismissal are calculated with a video people-counting algorithm; if a relevant abnormal event is detected, an abnormality early warning is issued;
(4) Attention detection: the head regions and postures of students in the video sequence are obtained with an object detection algorithm, and the postures are classified by classifiers of different scales; the head-up time of students in each class is counted and the head-up rate is calculated; based on the head posture detection results, nodding actions are recognized and the average nodding frequency of the students in each class is calculated;
(5) Teaching activity calculation: the number of head posture changes of students is counted with the head posture detection algorithm, the activity index is calculated, and the classroom atmosphere is graded according to set thresholds; the students' head-up rate, nodding frequency and activity index are obtained with a multi-scale detection scheme and an RFB model; the students' classroom attention value is calculated by combining the head-up rate and the activity index;
(6) Teaching visual analysis: classroom interaction events are presented with thumbnails, audio data are superimposed, and start and end times are marked on the time axis; with the head-up rate and nodding frequency as the horizontal and vertical axes and the per-minute values as data points, an attention scatter diagram is drawn; the trends of the nodding frequency and head-up rate are drawn as bar charts and line charts, with the maximum and minimum values labeled;
(7) Supervision and evaluation: the student attention scatter points are re-partitioned according to head-up rate and nodding frequency thresholds, and the learning effect is evaluated by how the data points cluster; the teacher's teaching style and ability level are judged from the speech recognition and visual detection results; the supervisor is assisted in making a comprehensive teaching effect evaluation based on the prerequisite judgment, learning effect, teaching behavior and classroom atmosphere indexes.
3. The working method of the automatic supervision system based on classroom detection according to claim 2, characterized in that the classroom information acquisition of step (1) specifically comprises:
(1-1) Teacher teaching picture acquisition: a camera installed in the middle of the rear of the classroom collects video of classroom teaching from back to front according to the high-definition video acquisition requirement, recording in high definition the teacher's teaching behavior, the calculation process written on the whiteboard, the demonstrated teaching videos and PPT pictures, and the interaction between the teacher and the students;
(1-2) Student learning behavior acquisition: a camera above the electronic whiteboard in the middle of the front of the classroom collects video of classroom teaching from front to back according to the high-definition video acquisition requirement, recording in real time the students' learning behaviors: raising their heads to look at the whiteboard and the teacher, lowering their heads to read books or look at mobile phones, and exchanges and discussions among the students;
(1-3) Classroom computer screen acquisition: a screen content acquisition plug-in is installed on the classroom host computer; when the teacher starts the class with the PPT, video and web teaching resources prepared for the lesson, the acquisition plug-in is started, the teaching content and the teacher's writing and annotations on the classroom computer are captured at a rate of one frame per second, the result is compressed into an MP4 video, and the video is uploaded to the server in the classroom for storage.
4. The working method of the automatic supervision system based on classroom detection according to claim 2, characterized in that the video inspection of step (2) specifically comprises:
(2-1) Patrol timetable generation: the classroom scheduling information and the teacher and student timetables for the current term are exported from the school educational administration system, and the patrol timetable is generated hierarchically by teaching building, floor and classroom; the patrol timetable displays the teacher, course and class corresponding to every class period in each classroom, and the states finished, in progress and not yet started are shown in the timetable in light yellow, green and white respectively;
(2-2) Course video viewing: the supervisor logs in to the patrol timetable page and clicks a green entry to view the live video from the front and rear classroom cameras and watch the process and state of the teacher teaching and the students listening in real time, or selects a light-yellow entry to call up and review the finished course video from the school server;
(2-3) While watching the video, the supervisor views the students' attendance and any teacher-absence warning for the class, line charts and bar charts of the students' head-up rate and nodding frequency during the lesson, and the classroom interactive teaching events at any moment, including the teacher's annotations, the blackboard-writing process and teacher-student interaction, as well as the audio and video of the classroom teaching.
5. The working method of the automatic supervision system based on classroom detection according to claim 2, characterized in that the automatic attendance statistics of step (3) specifically comprises:
(3-1) Course and teacher-student information collection: according to the patrol timetable, the classroom location, class time, teacher and student ID numbers and the image features of the registered teacher and student photos for each course are retrieved from the educational administration information system, and an image feature data set of the faces of the teachers and students of every class in each classroom is constructed;
(3-2) Attendance rate acquisition: one minute after each course begins, the video people-counting algorithm is started to detect the actual number of people present in the classroom, and the ratio of the number present to the number enrolled gives the attendance rate of the class; student identities in the video are recognized with a face detection algorithm, and after recognition and comparison the unmatched and absent students are automatically uploaded to the educational administration system;
(3-3) Teacher absence warning: after the class begins, the video from the rear classroom camera is called up and the face of the teacher at the front of the classroom is detected; if a face is successfully extracted, it is compared with the stored image features of the teachers, and a teacher-change notification is sent to the educational administration department when it does not match the scheduled teacher; if no face is extracted, a teacher late-arrival warning is sent to the educational administration department, and the comparison is repeated every minute until recognition succeeds.
6. The working method of the automatic supervision system based on classroom detection according to claim 2, characterized in that the attention detection of step (4) specifically comprises:
(4-1) Student head posture detection: an object-detection-based method is used to obtain the head regions of students in the video sequence from the front classroom camera, the head regions are divided into small-scale and normal-scale regions according to their size, and classifiers of the corresponding scales are used to classify the head postures of the front-row and back-row students respectively;
(4-2) Head-up rate statistics: along the time axis of the front-camera video sequence, the number of students with their heads up in each second is calculated and divided by the total number of students in the classroom to give the instantaneous head-up rate; according to the preset class timetable, the average of all instantaneous head-up rates of the class is taken as the overall head-up rate of the class;
(4-3) Nodding frequency calculation: based on the head posture detection results of the students, the pitch, yaw and roll of each student's head are estimated to recognize nodding, head-shaking and head-turning actions; the class is divided into one-minute intervals, the number of nodding actions of each student is counted, and the average nodding frequency of all students in the class is calculated.
7. The working method of the automatic supervision system based on classroom detection according to claim 2, characterized in that the teaching activity calculation of step (5) specifically comprises:
(5-1) Activity index calculation: the number of head posture changes of the students per minute is counted with the head posture detection algorithm to obtain the frequency of head posture change; the counts of all students are summed and averaged, and with set thresholds the classroom atmosphere is graded as highly active, neutral, inactive or highly inactive;
(5-2) Classroom attention model construction: considering that the heads of students seated in a classroom are fairly uniform overall, a two-stage multi-scale detection-then-classification scheme is adopted, the RFB model is used to detect the student head regions and postures, and the students' head-up rate, nodding frequency and activity index are calculated;
(5-3) Student attention value calculation: the overall attention of the students in a period is judged by combining the head-up rate and the activity index; the head-up rate and activity index are calculated for every minute of the class, each is summed over the class and divided by the class duration, and the two results are multiplied to give the students' classroom attention value for that class.
8. The working method of the automatic supervision system based on classroom detection according to claim 2, characterized in that the teaching visual analysis of step (6) specifically comprises:
(6-1) Classroom interaction event recording: the captured teacher computer pictures are displayed as thumbnails along the course time axis; the writing process on the electronic whiteboard, the students' answer records and the teacher's annotations are superimposed along the timeline, together with the classroom audio recorded by the classroom microphone, and the start and end moments of teacher-student interaction are marked with red marker lines;
(6-2) Student attention analysis: with the head-up rate and nodding frequency as the horizontal and vertical axes of a Cartesian coordinate system, the head-up rate and nodding frequency of the students in each minute are calculated, the data points of the whole class are located in the coordinate system and plotted as solid blue circles, presenting the student attention scatter diagram of the class;
(6-3) Classroom head-up rate and nodding frequency statistics: the time reference of each classroom interaction event is marked along the course time axis, the nodding frequency and head-up rate are drawn as a bar chart and a line chart respectively with their maximum and minimum values labeled, and the supervisor is supported in consulting the numerical changes at each moment on the time axis.
9. The working method of the automatic supervision system based on classroom detection according to claim 2, characterized in that the supervision and evaluation of step (7) specifically comprises:
(7-1) Learning effect evaluation: thresholds for the students' head-up rate and nodding frequency are set for each class period; with that point as a new coordinate origin, a new coordinate system parallel to the original X and Y axes is drawn in the student attention scatter diagram, the learning effects of quadrants 1 to 4 are defined as excellent, good, poor and good respectively, and the quadrant in which the data points cluster most densely gives the learning effect evaluation for the class;
(7-2) Teaching behavior evaluation: speech recognition and visual detection are combined to classify the classroom teacher's behavior into teaching, inspection, discussion and silence; the duration and sequencing of each behavior are counted by time slice to derive the teacher's teaching strategy, teaching path and level of engagement, and the teacher's teaching style and ability level are judged;
(7-3) Comprehensive teaching effect evaluation: the supervisor gives a prerequisite judgment according to whether the class teacher arrived late, was absent or was substituted; corresponding weights are assigned to the students' learning effect and the teacher's teaching behavior during the lesson, combined with the activity level of the classroom atmosphere, and the weighted terms are multiplied and accumulated to produce the comprehensive teaching effect evaluation.
Priority Applications (1)

  • Application number: CN202211408418.7A | Priority date: 2022-11-10 | Filing date: 2022-11-10 | Title: Automatic supervision system based on classroom detection and working method thereof

Publications (1)

  • Publication number: CN115829234A | Publication date: 2023-03-21

Family

  • ID: 85527627

Country Status (1)

  • CN: CN115829234A

Patent Citations (5)

* Cited by examiner, † Cited by third party

  • WO2019095446A1 * | 2017-11-17 | 2019-05-23 | 深圳市鹰硕音频科技有限公司 | Following teaching system having speech evaluation function
  • CN110334610A * | 2019-06-14 | 2019-10-15 | 华中师范大学 | Multi-dimensional classroom quantification system and method based on computer vision
  • CN111275345A * | 2020-01-22 | 2020-06-12 | 重庆大学 | Classroom informatization evaluation and management system and method based on deep learning
  • CN113255572A * | 2021-06-17 | 2021-08-13 | 华中科技大学 | Classroom attention assessment method and system
  • CN115169970A * | 2022-07-29 | 2022-10-11 | 四川果仁飞翔科技有限公司 | Intelligent classroom teaching supervision system and method

Cited By (2)

  • CN117574008A * | 2024-01-16 | 2024-02-20 | 成都泰盟软件有限公司 | Course data arrangement processing method and device, server and storage medium
  • CN117574008B * | 2024-01-16 | 2024-04-02 | 成都泰盟软件有限公司 | Course data arrangement processing method and device, server and storage medium


Legal Events

  • PB01: Publication
  • SE01: Entry into force of request for substantive examination
  • RJ01: Rejection of invention patent application after publication (application publication date: 2023-03-21)