CN106228293A - teaching evaluation method and system - Google Patents

Teaching evaluation method and system

Info

Publication number
CN106228293A
CN106228293A
Authority
CN
China
Prior art keywords
user
face
key point
eyes
yawning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610564845.2A
Other languages
Chinese (zh)
Inventor
周曦
陈杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Zhongke Yuncong Technology Co Ltd
Original Assignee
Chongqing Zhongke Yuncong Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Zhongke Yuncong Technology Co Ltd filed Critical Chongqing Zhongke Yuncong Technology Co Ltd
Priority to CN201610564845.2A priority Critical patent/CN106228293A/en
Publication of CN106228293A publication Critical patent/CN106228293A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393Score-carding, benchmarking or key performance indicator [KPI] analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Tourism & Hospitality (AREA)
  • Marketing (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • General Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Educational Technology (AREA)
  • Primary Health Care (AREA)
  • Game Theory and Decision Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a teaching evaluation method and system. The method includes: acquiring a video recording the user's face; determining the face position of the user in the video and extracting the face key point positions of that face position; judging whether the user closes the eyes, yawns, or nods according to the change state of the face key point positions, and counting the respective times; and evaluating the teaching quality of the course according to the times the user closes the eyes, yawns, and nods. By collecting video containing the user's facial expressions, analyzing whether eye-closing, nodding, and yawning actions occur, and counting their frequency, the teaching effect is evaluated objectively and fairly from the state of the participants. This helps teachers improve their lesson-preparation efficiency according to the evaluation, and in both online and offline teaching it allows teachers to accurately understand the state of the participants and make timely adjustments, further improving the teaching effect.

Description

Teaching evaluation method and system
Technical Field
The invention relates to the technical field of evaluation, in particular to a teaching evaluation method and system based on face recognition facial actions.
Background
With the development of internet technology, distance education (online education) has gradually entered daily life and become a common means of acquiring knowledge. Distance education breaks through regional limitations, so course capacity is large and one teacher often corresponds to many students; how to evaluate the teaching quality is therefore a major problem.
Traditionally, teaching assessment considers factors such as classroom atmosphere and teaching content, and the quality of teaching is then judged by having students score the teacher's courses. However, students often hide their real opinions and do not cooperate fully, so the evaluation work becomes tedious and the evaluation effect poor, and neither the teaching level of teachers nor the actual reception by students can be reflected quickly and accurately.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, an object of the present invention is to provide a teaching evaluation method and system for solving the problems of slow evaluation speed and inaccurate evaluation effect in teaching evaluation in the prior art.
To achieve the above and other related objects, the present invention provides a teaching assessment method, comprising:
acquiring a video for recording the face of a user;
determining the face position of a user in the video, and extracting the face key point position of the face position;
judging whether the user closes the eyes, yawns and nods according to the change state of the positions of the key points of the face, and respectively calculating the times of closing the eyes, yawns and nods of the user;
and evaluating the teaching quality of the course according to the times of closing eyes, yawning and nodding of the user.
Another object of the present invention is to provide a teaching evaluation system, comprising:
the acquisition module is used for acquiring and recording a video of the face of the user;
the key point position extraction module is used for determining the face position of the user in the video and extracting the face key point position of the face position;
the detection module is used for judging whether the user closes the eyes, yawns and nods according to the change state of the positions of the key points of the face, and respectively calculating the times of closing the eyes, yawns and nods of the user;
and the evaluation module is used for evaluating the teaching quality of the course according to the times of closing eyes, yawning and nodding of the user.
As described above, the teaching evaluation method and system of the present invention have the following advantages:
in view of the large labor input and poor evaluation effect of the traditional method, the present method needs no scoring by students: it obtains the facial expression state of a lessee by collecting video of the user (student), analyzes whether eye-closing, nodding, and yawning actions appear in the facial expression, and counts the frequency of those actions, so that the teaching effect is evaluated objectively and fairly according to the state of the lessee. The system makes it convenient for teachers to improve their lesson-preparation efficiency according to the assessment, and whether teaching online or offline, teachers can accurately know the states of lessees through the assessment system and make corresponding adjustments in time, which further improves the teaching effect.
Drawings
FIG. 1 is a flow chart of a teaching assessment method according to the present invention;
FIG. 2 shows a detailed flowchart of step S2 provided for by the present invention;
FIG. 3 is a diagram showing a distribution of feature locations of key points of a human face according to the present invention;
FIG. 4 shows a detailed flowchart of step S3 provided for by the present invention;
FIG. 5 is a block diagram of a teaching assessment system according to the present invention;
FIG. 6 is a block diagram of a key point location extraction module in the teaching assessment system according to the present invention;
FIG. 7 is a block diagram of a detection module in the teaching evaluation system according to the present invention;
fig. 8 is a block diagram showing the overall structure of the teaching evaluation system provided by the present invention.
Element number description:
1 acquisition Module
2 Key point position extraction module
3 detection module
4 evaluation module
11 acquisition unit
21 calibration unit
22 first extraction unit
23 second extraction unit
24 acquisition unit
31 closed eye detection unit
32 Yawning detection unit
33 nodding detection unit
34 statistical unit
41 evaluation unit
S1-S4 Steps 1-4
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
EXAMPLE 1
Referring to fig. 1, a flowchart of a teaching evaluation method provided by the present invention includes:
step S1, acquiring a video for recording the face of a user;
wherein a camera or a video recorder is used to record the video containing the user's face.
step S2, determining the face position of the user in the video, and extracting the face key point position of the face position;
the method comprises the steps of obtaining a face image of a user through a video;
step S3, judging whether the user closes the eyes, yawns and nods according to the change state of the positions of the key points of the human face, and respectively calculating the times of closing the eyes, yawns and nods of the user;
in the application, the change state of the positions of the key points of the face, namely the change conditions of the local key points such as the vertical rotation angles of the eyes, the mouth and the head of a user, is not a normal blink but a dozing or sleepiness presenting eye closing state, the nodding action comprises a head-down action and a head-up action, is also a dozing or sleepiness presenting action, and is a normal instinctive response of yawning and sleepiness or sleepiness presenting action.
And step S4, evaluating the teaching quality of the course according to the times of closing eyes, yawning and nodding of the user.
It is detected whether the times of eye closing, yawning, and nodding of a user exceed a preset threshold range within a preset time, and the teaching quality is evaluated according to the interval into which these counts fall. That is, the times and frequency of eye closing, yawning, and nodding within the preset time are counted to determine the learning state of the lessees, which reflects the teaching quality of the teacher and achieves the purpose of teaching evaluation.
In this embodiment, no manual assistance is needed: the teaching quality of online or classroom education can be evaluated in a timely manner, objectively and fairly reflecting the teacher's teaching quality and facilitating its improvement.
Example 2
As shown in fig. 2, a detailed flowchart of step S2 provided by the present invention is as follows:
step S201, extracting a user face image in a video by adopting a face detector, and calibrating the initial position of each key point in the face image;
the face detector can adopt a detector based on HAAR characteristics, and can also adopt a model based on deep learning to train and obtain a face image of the user.
Step S202, extracting SURF characteristics of initial positions of all key points, and splicing the corresponding SURF characteristics into a global characteristic;
the method has the advantages that the extracted SURF features have good adaptability to the scale and rotation characteristics of the face image, the preprocessing is facilitated, and meanwhile, the method has the advantages of simplicity in operation, high efficiency and the like; and splicing and fusing the local feature descriptors of the SURF features into a global feature.
Step S203, based on the global characteristics, obtaining the translation amount of each key point by adopting a random forest algorithm;
The random forest algorithm randomly samples the corresponding global features and classifies them with the same kind of classifier, so overfitting need not be considered when obtaining the translation amount of each key point, which improves computational efficiency.
And step S204, iteratively calculating the translation amount of each key point to obtain the positions of the key points of the face image.
The human face key point position distribution diagram shown in fig. 3 is obtained through steps S201 to S204, and 68 key points are used for positioning in the diagram, so that the human face key point positions can be completely displayed.
In this embodiment, the above algorithm identifies the user's face and acquires the face key point positions quickly and accurately, which makes it convenient to precisely locate the lip, eyelid, and head features of the face and improves the efficiency of face key point localization.
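The iterative extraction of steps S202-S204 can be sketched as a cascade-regression loop. This is a schematic only: the SURF extractor and the trained random-forest regressor are not fully specified in the patent, so they are passed in as caller-supplied stand-ins.

```python
import numpy as np

def refine_keypoints(image, init_points, extract_feature, predict_offsets, iters=5):
    """Schematic cascade regression for steps S202-S204: extract a local
    feature at each current keypoint, splice the local descriptors into one
    global feature, predict a per-point translation, and iterate.
    `extract_feature` and `predict_offsets` are hypothetical stand-ins for
    the SURF extractor and the trained random-forest regressor."""
    points = np.asarray(init_points, dtype=float)
    for _ in range(iters):
        local = [extract_feature(image, p) for p in points]           # step S202 (local SURF)
        global_feat = np.concatenate(local)                           # step S202 (splice)
        offsets = predict_offsets(global_feat)                        # step S203
        points = points + np.asarray(offsets).reshape(points.shape)   # step S204
    return points
```

With a well-trained regressor, the predicted translations shrink as the points approach the true landmark positions, so a small fixed number of iterations suffices.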
Example 3
As shown in fig. 4, a detailed flowchart of step S3 provided by the present invention includes:
step S301, calculating a key point distance between an upper eyelid and a lower eyelid of each eye in the key point positions of the human face, and when the key point distance between the upper eyelid and the lower eyelid is lower than a first threshold and the duration time of the key point distance is higher than a second threshold, determining that the user is in an eye closing state;
Specifically, the key points between the upper and lower eyelids of each eye can be clearly located from the face key point positions; when the user closes the eyes, the key points of the upper and lower eyelids approximately coincide. Assuming the coordinates of two such key points are $(x_1, y_1)$ and $(x_2, y_2)$, the corresponding key point distance is:

$$d = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2} \qquad (1)$$

In equation (1), $d$ is the distance between two key points in the face image.
As shown in fig. 3, the key point distances between the upper and lower eyelids of the left and right eyes are:

$$d_{lefteye} = \frac{d_{38,42} + d_{39,41}}{2} \qquad (2)$$

$$d_{righteye} = \frac{d_{44,48} + d_{45,47}}{2} \qquad (3)$$

In equation (2), $d_{lefteye}$ is the key point distance between the upper and lower eyelids of the left eye, $d_{38,42}$ is the distance between face key points 38 and 42, and $d_{39,41}$ is the distance between face key points 39 and 41. In equation (3), $d_{righteye}$ is the key point distance between the upper and lower eyelids of the right eye, $d_{44,48}$ is the distance between face key points 44 and 48, and $d_{45,47}$ is the distance between face key points 45 and 47.
When the distances of the key points corresponding to the left eye and the right eye are both lower than a first threshold and the duration time of the key points is higher than a second threshold, the user is judged to be in the eye closing state, wherein the first threshold can be set to be close to zero, and the time corresponding to the second threshold is at least 3 seconds, so that the user can be accurately determined to be tired or inattentive.
If the distance of the key points between the upper eyelid and the lower eyelid of the two eyes is just lower than a first threshold value but the duration time of the key points is lower than a second threshold value, the user is determined to be in the blinking motion rather than the eye closing state.
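The eye-closure test of step S301 can be sketched as follows. The eyelid point pairs follow equations (2) and (3); the 2-pixel default for the near-zero first threshold is illustrative, and the second-threshold (at least 3 seconds) duration check, which would run over consecutive video frames, is omitted here.

```python
import math

# Eyelid point pairs per the 68-point layout of fig. 3 (1-indexed):
# left eye: (38, 42) and (39, 41); right eye: (44, 48) and (45, 47)

def keypoint_distance(p, q):
    """Euclidean distance between two keypoints, per equation (1)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eyelid_gap(landmarks, pairs):
    """Mean upper/lower eyelid keypoint distance, per equations (2) and (3).
    `landmarks` maps 1-based point indices to (x, y) image coordinates."""
    return sum(keypoint_distance(landmarks[a], landmarks[b])
               for a, b in pairs) / len(pairs)

def is_eye_closed(landmarks, first_threshold=2.0):
    """True when both eyes' eyelid gaps fall below the near-zero first
    threshold (value is illustrative); the duration check is applied
    separately across frames."""
    left = eyelid_gap(landmarks, [(38, 42), (39, 41)])
    right = eyelid_gap(landmarks, [(44, 48), (45, 47)])
    return left < first_threshold and right < first_threshold
```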
Step S302, calculating the distance of key points between the upper lip and the lower lip at the key point position of the face, and when the distance of the key points between the upper lip and the lower lip is higher than a third threshold value and the duration time of the distance is higher than a fourth threshold value, judging that the user is in a yawning state;
Specifically, the key point distance between the upper and lower lips is calculated as in equation (1). As shown in fig. 3,

$$d_{lip} = \frac{d_{51,59} + d_{52,58} + d_{53,57}}{3} \qquad (4)$$

In equation (4), $d_{lip}$ is the key point distance between the upper and lower lips, $d_{51,59}$ is the distance between face key points 51 and 59, $d_{52,58}$ is the distance between face key points 52 and 58, and $d_{53,57}$ is the distance between face key points 53 and 57. The third threshold is at least 10 pixels, and the time corresponding to the fourth threshold is at least 3 seconds; either threshold may be reset according to the habits of the user.
The user is judged to be in a yawning state only when the key point distance between the upper and lower lips is higher than 10 pixels and this persists for more than 3 seconds.
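A minimal sketch of the yawn counting in step S302, assuming a fixed frame rate and the 10-pixel third threshold and 3-second fourth threshold given above (both adjustable):

```python
import math

def lip_gap(landmarks):
    """Mean keypoint distance between the upper and lower lip, per equation (4);
    `landmarks` maps 1-based point indices of fig. 3 to (x, y) coordinates."""
    pairs = [(51, 59), (52, 58), (53, 57)]
    return sum(math.dist(landmarks[a], landmarks[b]) for a, b in pairs) / len(pairs)

def count_yawns(gaps_per_frame, fps, gap_threshold=10.0, min_seconds=3.0):
    """Count runs of frames whose lip gap stays above the third threshold
    for at least the fourth-threshold duration."""
    min_frames = int(min_seconds * fps)
    yawns = run = 0
    for gap in list(gaps_per_frame) + [0.0]:   # sentinel closes a trailing run
        if gap > gap_threshold:
            run += 1
        else:
            if run >= min_frames:
                yawns += 1
            run = 0
    return yawns
```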
Step S303, calculating the head rotation angle of the user according to the rotation angle of the standard 3D face and the corresponding mapping matrix, and judging that the user is in a nodding state when the change value of the head rotation angle reaches a threshold angle within a preset time;
Specifically, the face shape is located from the face key point positions to obtain the rotation angle a of the current face relative to a standard 3D face, together with the mapping matrix R that transforms the standard face into the current face:

$$R = \begin{pmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{pmatrix} \qquad (5)$$

Each row of R gives the coordinates, in the world coordinate system, of the unit vector along one coordinate axis of the camera coordinate system; correspondingly, each column gives the coordinates, in the camera coordinate system, of the unit vector along one axis of the world coordinate system. The head rotation angle is then obtained as $\theta_z = \operatorname{atan}(R_{21}, R_{11})$ (the two-argument arctangent), which corresponds to the up-down rotation of the face, i.e. the angle about the Z axis of the real X/Y/Z coordinate frame. In this application, the head angle of the user is updated in real time, and the nodding state includes both head-raising and head-lowering. The specific determination is: within the preset time, when the decrease of the head angle reaches the threshold angle, the user is in a head-lowered state; when the increase reaches the threshold angle, the user is in a head-raised state. Both head-raising and head-lowering are signs of inattention or drowsiness.
The threshold angle is preferably 15 degrees or more, and the preset time may be 1, 2, or 3 seconds; the threshold angles corresponding to increases and decreases of the head angle may be set to different values.
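The head-angle computation and nod counting of step S303 might be sketched as follows. Treating the "preset time" as a sliding window of a few angle samples is an assumption for illustration, since the patent does not fix the sampling scheme.

```python
import math

def head_pitch_deg(R):
    """Head rotation angle theta_z = atan2(R21, R11) from the 3x3 mapping
    matrix of equation (5); indices are 0-based here."""
    return math.degrees(math.atan2(R[1][0], R[0][0]))

def count_nods(angles_deg, threshold_deg=15.0, window=3):
    """Count head-lowering/raising events: each time the head angle changes
    by at least `threshold_deg` within `window` samples (a stand-in for the
    preset time), one event is recorded. Both directions count, matching the
    patent's nodding state (head-down and head-up)."""
    events, i = 0, 0
    while i < len(angles_deg) - 1:
        for j in range(i + 1, min(i + window + 1, len(angles_deg))):
            if abs(angles_deg[j] - angles_deg[i]) >= threshold_deg:
                events += 1
                i = j
                break
        else:
            i += 1
    return events
```

Note that a full nod (head down, then back up) registers as two events under this scheme, consistent with the description counting both movements as signs of drowsiness.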
And step S304, counting the times of closing eyes, yawning and nodding of the user.
In this embodiment, steps S301, S302, and S303 are executed in no particular order, and the corresponding times are counted according to the actions of closing the eyes, yawning, and nodding of the user.
In this embodiment, the counted times of eye closing, yawning, and nodding of the user within a preset time are checked against preset threshold ranges, and the teaching quality is evaluated according to the interval into which each count falls. The preset time may be one lesson or several lessons, and the preset threshold ranges may be set as required. For example, attention can be classified into three grades (high, medium, and low): if no eye-closing action occurs, attention concentration is high; if the eye-closing action occurs 1-3 times, it is medium; if more than three times, it is low. If the yawning action occurs 0-2 times, attention concentration is high; 3-5 times, medium; more than five times, low. Likewise, if the head-lowering action occurs 0-2 times, attention concentration is high; 3-5 times, medium; more than five times, low.
If the counts of eye closing, yawning, and nodding correspond to different attention levels, the lowest level is taken. The teaching quality of the teacher is then evaluated from the attention state of the users (lessees) during the lesson, thereby achieving the purpose of teaching evaluation.
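The level-combination rule described above (grade each action count, then take the lowest resulting level) can be sketched as follows; the band edges follow the example tiers in the description and are adjustable.

```python
def grade(count, high_max, medium_max):
    """Map an action count to an attention level for one action type."""
    if count <= high_max:
        return "high"
    if count <= medium_max:
        return "medium"
    return "low"

def lesson_attention(eye_closings, yawns, nods):
    """Combine per-action levels by taking the lowest, as described.
    Bands (illustrative, per the description's example): eye closings
    0 / 1-3 / >3, yawns and nods 0-2 / 3-5 / >5."""
    levels = [grade(eye_closings, 0, 3), grade(yawns, 2, 5), grade(nods, 2, 5)]
    rank = {"high": 2, "medium": 1, "low": 0}
    return min(levels, key=rank.__getitem__)
```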
Example 4
Referring to fig. 5, a teaching evaluation system according to the present invention includes:
the acquisition module 1 is used for acquiring and recording a video of the face of a user;
a key point position extraction module 2, configured to determine a face position of a user in the video, and extract a face key point position of the face position;
the detection module 3 is used for judging whether the user closes the eyes, yawns and nods according to the change state of the positions of the key points of the face, and respectively calculating the times of closing the eyes, yawns and nods of the user;
and the evaluation module 4 is used for evaluating the teaching quality of the courses according to the times of eye closing, yawning and nodding of the user.
The system is mostly used for online network education; when used in a classroom, each user needs to be equipped with a separate corresponding system.
Example 5
Referring to fig. 6, a block diagram of a key point location extraction module in the teaching evaluation system provided by the present invention specifically includes:
a calibration unit 21, configured to extract a user face image in the video by using a face detector, and calibrate initial positions of key points in the face image;
the first extraction unit 22 is configured to extract SURF features at initial positions of the key points, and splice corresponding SURF features into a global feature;
the second extraction unit 23 is configured to obtain the translation amount of each key point by using a random forest algorithm based on the global features;
and the calculating unit 24 is used for iteratively calculating the translation amount of each key point to obtain the position of the key point of the face image.
Example 6
Referring to fig. 7, a block diagram of a detection module in a teaching evaluation system according to the present invention includes:
the eye closing detection unit 31 is configured to calculate a key point distance between an upper eyelid and a lower eyelid of each eye in key point positions of the human face, and determine that the user is in an eye closing state when the key point distance between the upper eyelid and the lower eyelid is lower than a first threshold and the duration of the key point distance is higher than a second threshold;
the yawning detection unit 32 is configured to calculate a key point distance between upper and lower lips in a key point position of a face, and determine that a user is in a yawning state when the key point distance between the upper and lower lips is higher than a third threshold and a duration of the key point distance is higher than a fourth threshold;
the nodding detection unit 33 is configured to calculate a head rotation angle of the user according to a rotation angle of the standard 3D face and the corresponding mapping matrix, and determine that the user is in a nodding state when a change value of the head rotation angle reaches a threshold angle within a preset time;
and the counting unit 34 is used for counting the times of eye closing, yawning and nodding of the user.
Referring to fig. 8, a block diagram of a complete structure of the teaching evaluation system provided by the present invention is shown, specifically as follows:
the acquisition module 1 comprises an acquisition unit 11 for recording a video comprising a face of a user using a camera or a video recorder.
The evaluation module 4 includes an evaluation unit 41, configured to detect whether the number of times of eye closure, yawning, and nodding of the user exceeds a preset threshold range within a preset time, and evaluate the teaching quality according to an interval in which the number of times of eye closure, yawning, and nodding is within the preset threshold range.
The teaching evaluation system and the teaching evaluation method have the same corresponding principle and flow, and are not repeated herein.
In summary, in view of the large labor input and poor evaluation effect of the traditional method, the present method needs no scoring by students: it obtains the facial expression state of a user (lessee) by collecting video of the user, analyzes whether eye-closing, nodding, and yawning actions appear in the facial expression, and counts the frequency of those actions, so that the teaching effect is evaluated objectively and fairly according to the state of the lessee. The system makes it convenient for teachers to improve their lesson-preparation efficiency according to the assessment, and whether teaching online or offline, teachers can accurately know the states of lessees through the assessment system and make corresponding adjustments in time, which further improves the teaching effect. Therefore, the invention effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art can modify or change the above-mentioned embodiments without departing from the spirit and scope of the present invention. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical spirit of the present invention be covered by the claims of the present invention.

Claims (10)

1. A teaching assessment method, comprising:
acquiring a video for recording the face of a user;
determining the face position of a user in the video, and extracting the face key point position of the face position;
judging whether the user closes the eyes, yawns and nods according to the change state of the positions of the key points of the face, and respectively calculating the times of closing the eyes, yawns and nods of the user;
and evaluating the teaching quality of the course according to the times of closing eyes, yawning and nodding of the user.
2. The teaching evaluation method of claim 1, wherein the step of acquiring a video for recording the face of the user comprises: recording the video containing the face of the user with a camera or a video recorder.
3. The teaching assessment method of claim 1, wherein said step of determining the face position of the user in said video and extracting the face key point position of said face position comprises:
extracting a user face image in a video by adopting a face detector, and calibrating the initial position of each key point in the face image;
extracting SURF characteristics of the initial positions of the key points, and splicing the corresponding SURF characteristics into a global characteristic;
based on the global features, obtaining the translation amount of each key point by adopting a random forest algorithm;
and iteratively calculating the translation amount of each key point to obtain the positions of the key points of the face image.
4. The teaching assessment method of claim 1, wherein the step of determining whether the user closes the eyes, yawns, or nods according to changes in the facial key point positions, and counting each respectively, comprises:
calculating the key point distance between the upper and lower eyelids of each eye, and judging the user to be in an eye-closed state when that distance stays below a first threshold for a duration exceeding a second threshold;
calculating the key point distance between the upper and lower lips, and judging the user to be in a yawning state when that distance stays above a third threshold for a duration exceeding a fourth threshold;
calculating the user's head rotation angle from the rotation angle of a standard 3D face and the corresponding mapping matrix, and judging the user to be in a nodding state when the change in head rotation angle reaches a threshold angle within a preset time;
and counting the numbers of eye closures, yawns, and nods of the user.
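The eye-closure and yawning tests share one shape: a key-point distance must stay past a value threshold for at least a duration threshold. A minimal sketch, expressing the duration threshold in frames; the specific threshold values and the per-frame distances are illustrative assumptions, not values from the patent.

```python
def detect_events(distances, threshold, min_frames, below=True):
    """Count events where the per-frame key-point distance stays past
    `threshold` for at least `min_frames` consecutive frames.
    below=True  -> eye-closure style (eyelid distance under threshold)
    below=False -> yawning style (lip distance over threshold)"""
    events, run = 0, 0
    for d in distances:
        hit = d < threshold if below else d > threshold
        if hit:
            run += 1
            if run == min_frames:  # count the event once, when it qualifies
                events += 1
        else:
            run = 0
    return events

# eyelid key-point distances per frame (pixels); only the long dip counts,
# the single low frame near the end is too brief to qualify
eyelid = [6, 6, 1, 1, 1, 1, 6, 6, 1, 6]
print(detect_events(eyelid, threshold=2, min_frames=3, below=True))  # -> 1
```

The same function with `below=False` covers the claim's yawning branch, where the lip distance must exceed the third threshold for longer than the fourth; the nodding branch differs in that it compares a head rotation angle change against a threshold angle instead.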
5. The teaching assessment method of claim 1, wherein the step of evaluating the teaching quality of the course according to the numbers of eye closures, yawns, and nods of the user comprises:
detecting whether the numbers of eye closures, yawns, and nods of the user within a preset time exceed a preset threshold range, and evaluating the teaching quality according to the interval within that threshold range into which the counts fall.
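The claim-5 evaluation can be read as binning the event counts into intervals and mapping each interval to a quality grade. A hedged sketch under that reading; the interval boundaries, the grade labels, and summing the three counts into one total are all illustrative assumptions.

```python
def evaluate_lesson(closures, yawns, nods, bands=((5, "good"), (15, "fair"))):
    """Map the per-lesson drowsiness-event counts to a quality grade.
    `bands` lists (upper_bound, grade) intervals in ascending order;
    a total beyond the last bound exceeds the preset threshold range
    and is graded 'poor'."""
    total = closures + yawns + nods
    for bound, grade in bands:
        if total <= bound:
            return grade
    return "poor"

print(evaluate_lesson(1, 0, 2))  # -> good
print(evaluate_lesson(6, 5, 7))  # -> poor
```

A real system might instead weight the three behaviours differently (a nod of agreement is not a nod of drowsiness), but the interval-lookup structure matches the claim's wording.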
6. A teaching assessment system, comprising:
an acquisition module for acquiring a video recording the face of a user;
a key point position extraction module for determining the position of the user's face in the video and extracting facial key point positions from the face region;
a detection module for determining whether the user closes the eyes, yawns, or nods according to changes in the facial key point positions, and counting the number of times the user closes the eyes, yawns, and nods, respectively;
and an evaluation module for evaluating the teaching quality of the course according to the numbers of eye closures, yawns, and nods of the user.
7. The teaching assessment system of claim 6, wherein the acquisition module comprises an acquisition unit for recording a video containing the face of the user with a camera or video recorder.
8. The teaching assessment system of claim 6, wherein the key point position extraction module comprises:
a calibration unit for extracting the user's face image from the video with a face detector and initializing the position of each key point in the face image;
a first extraction unit for extracting SURF features at the initial key point positions and concatenating the corresponding SURF features into a global feature;
a second extraction unit for obtaining a translation amount for each key point from the global feature using a random forest algorithm;
and a calculation unit for iteratively applying the translation amounts to obtain the key point positions of the face image.
9. The teaching assessment system of claim 6, wherein the detection module comprises:
an eye-closure detection unit for calculating the key point distance between the upper and lower eyelids of each eye, and judging the user to be in an eye-closed state when that distance stays below a first threshold for a duration exceeding a second threshold;
a yawning detection unit for calculating the key point distance between the upper and lower lips, and judging the user to be in a yawning state when that distance stays above a third threshold for a duration exceeding a fourth threshold;
a nodding detection unit for calculating the user's head rotation angle from the rotation angle of a standard 3D face and the corresponding mapping matrix, and judging the user to be in a nodding state when the change in head rotation angle reaches a threshold angle within a preset time;
and a counting unit for counting the numbers of eye closures, yawns, and nods of the user.
10. The teaching assessment system of claim 6, wherein the evaluation module comprises an evaluation unit for detecting whether the numbers of eye closures, yawns, and nods of the user within a preset time exceed a preset threshold range, and evaluating the teaching quality according to the interval within that threshold range into which the counts fall.
CN201610564845.2A 2016-07-18 2016-07-18 teaching evaluation method and system Pending CN106228293A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610564845.2A CN106228293A (en) 2016-07-18 2016-07-18 teaching evaluation method and system

Publications (1)

Publication Number Publication Date
CN106228293A true CN106228293A (en) 2016-12-14

Family

ID=57519356

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610564845.2A Pending CN106228293A (en) 2016-07-18 2016-07-18 teaching evaluation method and system

Country Status (1)

Country Link
CN (1) CN106228293A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101032405A (en) * 2007-03-21 2007-09-12 汤一平 Safe driving auxiliary device based on omnidirectional computer vision
CN102436715A (en) * 2011-11-25 2012-05-02 大连海创高科信息技术有限公司 Detection method for fatigue driving
CN103927848A (en) * 2014-04-18 2014-07-16 南京通用电器有限公司 Safe driving assisting system based on biological recognition technology
CN104382607A (en) * 2014-11-26 2015-03-04 重庆科技学院 Fatigue detecting method based on driver video images in vehicle working condition
CN105512627A (en) * 2015-12-03 2016-04-20 腾讯科技(深圳)有限公司 Key point positioning method and terminal

Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106778658A (en) * 2016-12-28 2017-05-31 辽宁师范大学 Method based on classroom scene and learner's Retina transplantation learner's notice
CN107122789A (en) * 2017-03-14 2017-09-01 华南理工大学 The study focus analysis method of multimodal information fusion based on depth camera
CN107122789B (en) * 2017-03-14 2021-10-26 华南理工大学 Learning concentration degree analysis method based on multi-mode information fusion of depth camera
CN107133892A (en) * 2017-03-29 2017-09-05 华东交通大学 The real-time estimating method and system of a kind of network Piano lesson
CN107145863A (en) * 2017-05-08 2017-09-08 湖南科乐坊教育科技股份有限公司 A kind of Face detection method and system
CN107145864A (en) * 2017-05-08 2017-09-08 湖南科乐坊教育科技股份有限公司 A kind of concentration appraisal procedure and system
WO2018229592A1 (en) * 2017-06-15 2018-12-20 Lam Yuen Lee Viola Method and system for evaluting and monitoring compliance using emotion detection
US11475788B2 (en) 2017-06-15 2022-10-18 Yuen Lee Viola Lam Method and system for evaluating and monitoring compliance using emotion detection
CN107527159A (en) * 2017-09-20 2017-12-29 江苏经贸职业技术学院 One kind teaching quantitative estimation method
CN108009754A (en) * 2017-12-26 2018-05-08 重庆大争科技有限公司 Method of Teaching Quality Evaluation
CN108154304A (en) * 2017-12-26 2018-06-12 重庆大争科技有限公司 There is the server of Teaching Quality Assessment
CN108257056A (en) * 2018-01-23 2018-07-06 余绍志 A kind of classroom assisted teaching system for the big data for being applied to teaching industry
CN108256490A (en) * 2018-01-25 2018-07-06 上海交通大学 Method for detecting human face and listen to the teacher rate appraisal procedure, system based on Face datection
CN108491781A (en) * 2018-03-16 2018-09-04 福州外语外贸学院 A kind of classroom focus appraisal procedure and terminal
CN108491781B (en) * 2018-03-16 2020-10-23 福州外语外贸学院 Classroom concentration degree evaluation method and terminal
WO2019205633A1 (en) * 2018-04-27 2019-10-31 京东方科技集团股份有限公司 Eye state detection method and detection apparatus, electronic device, and computer readable storage medium
US11386710B2 (en) 2018-04-27 2022-07-12 Boe Technology Group Co., Ltd. Eye state detection method, electronic device, detecting apparatus and computer readable storage medium
CN108596395A (en) * 2018-04-28 2018-09-28 北京比特智学科技有限公司 The distribution method and device of course
CN108875785B (en) * 2018-05-17 2021-04-06 深圳市鹰硕技术有限公司 Attention degree detection method and device based on behavior feature comparison
CN108875785A (en) * 2018-05-17 2018-11-23 深圳市鹰硕技术有限公司 The attention rate detection method and device of Behavior-based control Characteristic Contrast
CN108776794A (en) * 2018-06-20 2018-11-09 华南师范大学 Based on the teaching efficiency of big data and artificial intelligence portrait method and robot system
CN108776794B (en) * 2018-06-20 2023-03-28 华南师范大学 Teaching effect image drawing method based on big data and artificial intelligence and robot system
CN108805770A (en) * 2018-06-20 2018-11-13 华南师范大学 Content of courses portrait method based on big data and artificial intelligence and robot system
CN108961115A (en) * 2018-07-02 2018-12-07 百度在线网络技术(北京)有限公司 Method, apparatus, equipment and the computer readable storage medium of teaching data analysis
CN109165552B (en) * 2018-07-14 2021-02-26 深圳神目信息技术有限公司 Gesture recognition method and system based on human body key points and memory
CN109165552A (en) * 2018-07-14 2019-01-08 深圳神目信息技术有限公司 A kind of gesture recognition method based on human body key point, system and memory
CN110765814A (en) * 2018-07-26 2020-02-07 杭州海康威视数字技术股份有限公司 Blackboard writing behavior recognition method and device and camera
CN109344682A (en) * 2018-08-02 2019-02-15 平安科技(深圳)有限公司 Classroom monitoring method, device, computer equipment and storage medium
CN109165578A (en) * 2018-08-08 2019-01-08 盎锐(上海)信息科技有限公司 Expression detection device and data processing method based on filming apparatus
CN108764860A (en) * 2018-08-21 2018-11-06 合肥创旗信息科技有限公司 A kind of business Education Administration Information System based on big data
CN109214673A (en) * 2018-08-27 2019-01-15 南昌理工学院 Method of Teaching Quality Evaluation and system
CN109446880A (en) * 2018-09-05 2019-03-08 广州维纳斯家居股份有限公司 Intelligent subscriber participation evaluation method, device, intelligent elevated table and storage medium
CN109272780A (en) * 2018-11-12 2019-01-25 重庆靶向科技发展有限公司 A kind of intelligent teaching method based on PPT
CN109243228A (en) * 2018-11-12 2019-01-18 重庆靶向科技发展有限公司 A kind of intelligence teaching platform system
CN109243228B (en) * 2018-11-12 2021-03-23 重庆靶向科技发展有限公司 Intelligent teaching platform system
CN109614934B (en) * 2018-12-12 2023-06-06 易视腾科技股份有限公司 Online teaching quality assessment parameter generation method and device
CN109614934A (en) * 2018-12-12 2019-04-12 易视腾科技股份有限公司 Online teaching quality assessment parameter generation method and device
CN109345156A (en) * 2018-12-12 2019-02-15 范例 A kind of Classroom Teaching system based on machine vision
CN109858809A (en) * 2019-01-31 2019-06-07 浙江传媒学院 Learning quality appraisal procedure and system based on the analysis of classroom students ' behavior
CN109784313A (en) * 2019-02-18 2019-05-21 上海骏聿数码科技有限公司 A kind of blink detection method and device
CN110276246A (en) * 2019-05-09 2019-09-24 威比网络科技(上海)有限公司 Course index detects alarm method, device, electronic equipment, storage medium
CN110197169A (en) * 2019-06-05 2019-09-03 南京邮电大学 A kind of contactless learning state monitoring system and learning state detection method
CN110197169B (en) * 2019-06-05 2022-08-26 南京邮电大学 Non-contact learning state monitoring system and learning state detection method
CN110348328A (en) * 2019-06-24 2019-10-18 北京大米科技有限公司 Appraisal procedure, device, storage medium and the electronic equipment of quality of instruction
CN110796005A (en) * 2019-09-27 2020-02-14 北京大米科技有限公司 Method, device, electronic equipment and medium for online teaching monitoring
CN111507555A (en) * 2019-11-05 2020-08-07 浙江大华技术股份有限公司 Human body state detection method, classroom teaching quality evaluation method and related device
CN111507555B (en) * 2019-11-05 2023-11-14 浙江大华技术股份有限公司 Human body state detection method, classroom teaching quality evaluation method and related device
CN113456058A (en) * 2020-03-30 2021-10-01 Oppo广东移动通信有限公司 Head posture detection method and device, electronic equipment and readable storage medium
CN111402096A (en) * 2020-04-03 2020-07-10 广州云从鼎望科技有限公司 Online teaching quality management method, system, equipment and medium
CN116018789A (en) * 2020-09-14 2023-04-25 华为技术有限公司 Method, system and medium for context-based assessment of student attention in online learning
CN112380261A (en) * 2020-10-10 2021-02-19 杭州翔毅科技有限公司 Remote tutoring method, device and system based on 5G technology and storage medium
CN112883867A (en) * 2021-02-09 2021-06-01 广州汇才创智科技有限公司 Student online learning evaluation method and system based on image emotion analysis
CN113052064B (en) * 2021-03-23 2024-04-02 北京思图场景数据科技服务有限公司 Attention detection method based on face orientation, facial expression and pupil tracking
CN113052064A (en) * 2021-03-23 2021-06-29 北京思图场景数据科技服务有限公司 Attention detection method based on face orientation, facial expression and pupil tracking
CN113112187A (en) * 2021-05-13 2021-07-13 北京一起教育科技有限责任公司 Student attention assessment method and device and electronic equipment
CN113967014A (en) * 2021-12-22 2022-01-25 成都航空职业技术学院 Student behavior analysis device, system and method based on big data
CN115937961A (en) * 2023-03-02 2023-04-07 济南丽阳神州智能科技有限公司 Online learning identification method and equipment

Similar Documents

Publication Publication Date Title
CN106228293A (en) teaching evaluation method and system
WO2021077382A1 (en) Method and apparatus for determining learning state, and intelligent robot
Whitehill et al. The faces of engagement: Automatic recognition of student engagementfrom facial expressions
CN108399376A (en) Student classroom learning interest intelligent analysis method and system
CN111291613B (en) Classroom performance evaluation method and system
CN105224285A (en) Eyes open and-shut mode pick-up unit and method
Indi et al. Detection of malpractice in e-exams by head pose and gaze estimation
Zaletelj Estimation of students' attention in the classroom from kinect features
Abdulkader et al. Optimizing student engagement in edge-based online learning with advanced analytics
Dubbaka et al. Detecting learner engagement in MOOCs using automatic facial expression recognition
CN114708658A (en) Online learning concentration degree identification method
CN115937928A (en) Learning state monitoring method and system based on multi-vision feature fusion
Khan et al. Human distraction detection from video stream using artificial emotional intelligence
CN113239794B (en) Online learning-oriented learning state automatic identification method
JP7099377B2 (en) Information processing equipment and information processing method
Meriem et al. Determine the level of concentration of students in real time from their facial expressions
KR102245319B1 (en) System for analysis a concentration of learner
CN116994465A (en) Intelligent teaching method, system and storage medium based on Internet
KR20200000680U (en) The Device for improving the study concentration
CN116341983A (en) Concentration evaluation and early warning method, system, electronic equipment and medium
Gogia et al. Multi-modal affect detection for learning applications
Hwang et al. Attentiveness assessment in learning based on fuzzy logic analysis
CN113688739A (en) Classroom learning efficiency prediction method and system based on emotion recognition and visual analysis
CN111507555B (en) Human body state detection method, classroom teaching quality evaluation method and related device
Boels et al. Automated gaze-based identification of students’ strategies in histogram tasks through an interpretable mathematical model and a machine learning algorithm

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20161214