CN111967350A - Remote classroom concentration analysis method and device, computer equipment and storage medium

Info

Publication number
CN111967350A
CN111967350A (application CN202010748481.XA)
Authority
CN
China
Prior art keywords: student, data, students, face, generating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010748481.XA
Other languages
Chinese (zh)
Inventor
白永睿
徐宋传
陈晓宇
罗帅
刘娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Everbright Education Software Technology Co ltd
Original Assignee
Guangzhou Everbright Education Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2020-07-30 Application filed by Guangzhou Everbright Education Software Technology Co ltd
2020-07-30 Priority to CN202010748481.XA
2020-11-20 Publication of CN111967350A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G06Q50/205 Education administration or guidance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction

Abstract

The invention relates to the technical field of remote classrooms, and in particular to a remote classroom concentration analysis method and device, computer equipment and a storage medium. The remote classroom concentration analysis method comprises the following steps: acquiring student face data from each student end in real time, and extracting the corresponding student face features from each piece of student face data; generating a face rectangular frame corresponding to each piece of student face data from the student face features, and acquiring the rectangular frame size data of each face rectangular frame; calculating student viewing distance data from the rectangular frame size data, and generating a student viewing state from the student viewing distance data; and extracting student eye features from the student face features, generating student sight line data from the student eye features, and generating student gaze point information from the student sight line data. The invention helps teachers assess the performance of students participating in remote classes.

Description

Remote classroom concentration analysis method and device, computer equipment and storage medium
Technical Field
The invention relates to the technical field of remote classrooms, and in particular to a remote classroom concentration analysis method, a remote classroom concentration analysis device, computer equipment and a storage medium.
Background
Remote video technology is an audio-visual conferencing service that provides two-way, real-time transmission of voice and moving color pictures among users at two or more sites. It has been widely adopted in many fields to enable voice and video communication between users in different regions.
Remote video technology is now also used in teaching: teachers and students can hold remote classes with it, which removes the geographic limits on instruction and makes teaching more convenient.
With respect to the above related technologies, the inventors consider that they share a defect: in a remote class, it is difficult for the teacher to observe how students are performing.
Disclosure of Invention
The aim of the invention is to provide a remote classroom concentration analysis method and device, computer equipment and a storage medium that help teachers assess the performance of students participating in remote classes.
The above object of the present invention is achieved by the following technical solutions:
A remote classroom concentration analysis method, comprising the following steps:
acquiring student face data from each student end in real time, and extracting the corresponding student face features from each piece of student face data;
generating a face rectangular frame corresponding to each piece of student face data from the student face features, and acquiring the rectangular frame size data of each face rectangular frame;
calculating student viewing distance data from the rectangular frame size data, and generating a student viewing state from the student viewing distance data;
extracting student eye features from the student face features, generating student sight line data from the student eye features, and generating student gaze point information from the student sight line data;
and sending the student viewing state and the student gaze point information to a teacher end as student class-attention information.
By adopting this technical solution, student face data is acquired from each student end in real time and the corresponding student face features are extracted, which makes it easy to identify how each student is following the lecture. Acquiring the viewing distance data makes it possible to judge whether the student is in front of the student end and what posture the student is in, so the teacher can intervene in time. Generating the students' sight line data yields the student gaze point information, which can be used to detect whether each student's attention is focused during class, so the teacher can adjust the teaching strategy in time to liven up the classroom atmosphere.
The present invention in a preferred example may be further configured as follows: the student viewing state includes student out-of-state information and student viewing-too-close information, and calculating the student viewing distance data according to the rectangular frame size data and generating the student viewing state according to the student viewing distance data specifically comprises the following steps:
acquiring a preset standard viewing distance range, and comparing the student viewing distance data with the standard range in real time to obtain a corresponding comparison result;
if the comparison result shows that the student viewing distance data exceeds the standard viewing distance range, generating student out-of-state information;
and if the comparison result shows that the student viewing distance data is smaller than the standard viewing distance range, generating student viewing-too-close information.
By adopting this technical solution, setting a standard viewing distance range and comparing the student viewing distance data against it in real time reveals the state of the student in front of the student-end screen. When the student viewing distance data exceeds the standard range, especially for a long time, student out-of-state information is generated, so the teacher can intervene and help the student adjust their class state in time. If the student viewing distance data is smaller than the standard range, student viewing-too-close information is generated, indicating that the student is so close to the student-end screen that eyesight may be harmed, and the teacher can intervene and correct the student's sitting posture in time.
The present invention in a preferred example may be further configured as follows: after calculating the student viewing distance data according to the rectangular frame size data and generating the student viewing state according to the student viewing distance data, the remote classroom concentration analysis method further comprises the following steps:
acquiring a plurality of frames of student face data from each student end;
and inputting the student face data into a preset emotion detection model to obtain student emotion data and student emotion change data, and sending the student emotion data and the student emotion change data to the teacher end.
By adopting this technical solution, changes in each student's facial expression during class are observed through facial expression analysis. Facial expressions reflect the student's attitude toward the class or the teacher as well as the student's emotional state. If a student's mood in class is found to be poor, for example fear or distress appears frequently, this reflects a positive or negative reaction to the class and the teacher; feedback is then given to the teacher and parents in time, so that attention is paid to the student's emotional changes and the student's mental health is supported.
The present invention in a preferred example may be further configured as follows: extracting the student eye features from the student face features, generating student sight line data according to the student eye features, and generating student gaze point information from the student sight line data specifically comprises the following steps:
acquiring the student-end camera position information;
and obtaining, for each piece of student sight line data, the focus point position data relative to the camera position information, and generating the student gaze point information according to the focus point position data.
By adopting this technical solution, the concentration analysis detects whether each student's attention is focused during class; if a student's attention is found to be unfocused, an early warning is given to the teacher in time.
The present invention in a preferred example may be further configured as follows: generating the student gaze point information according to the focus point position data specifically comprises the following steps:
acquiring the mean and the standard deviation of all the focus point position data;
and generating a focus point detection circle with the mean as its center and N times the standard deviation as its radius, acquiring the focus point proportion data within the detection circle, and taking the focus point proportion data as the student gaze point information, where N is greater than 0.
By adopting this technical solution, generating the detection circle from the mean and standard deviation of all focus point position data makes it easier to detect whether a student's focus points are concentrated in one region, and hence to judge the student's concentration in class.
The second object of the invention is achieved by the following technical solution:
a remote classroom concentration analysis device, comprising:
a feature extraction module, configured to acquire student face data from each student end in real time and extract the corresponding student face features from the student face data;
a face framing module, configured to generate a face rectangular frame corresponding to each piece of student face data from the student face features and acquire the rectangular frame size data of each face rectangular frame;
a distance calculation module, configured to calculate the student viewing distance data according to the rectangular frame size data and generate the student viewing state according to the student viewing distance data;
an attention calculation module, configured to extract the student eye features from the student face features, generate student sight line data according to the student eye features and generate student gaze point information according to the student sight line data;
and a sending module, configured to send the student viewing state and the student gaze point information to a teacher end as student class-attention information.
By adopting this technical solution, student face data is acquired from each student end in real time and the corresponding student face features are extracted, which makes it easy to identify how each student is following the lecture. Acquiring the viewing distance data makes it possible to judge whether the student is in front of the student end and what posture the student is in, so the teacher can intervene in time. Generating the students' sight line data yields the student gaze point information, which can be used to detect whether each student's attention is focused during class, so the teacher can adjust the teaching strategy in time to liven up the classroom atmosphere.
The third object of the invention is achieved by the following technical solution:
a computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the remote classroom concentration analysis method when executing the computer program.
The fourth object of the invention is achieved by the following technical solution:
a computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned remote classroom concentration analysis method.
In summary, the invention includes at least one of the following beneficial technical effects:
1. Student face data is acquired from each student end in real time and the corresponding student face features are extracted, making it easy to identify how each student is following the lecture; meanwhile, although students' faces differ in size, the relative positions and sizes of the facial organs do not differ greatly, so generating the face rectangular frame from the student face features reduces the interference that differing face sizes would cause in recognition. Acquiring the viewing distance data makes it possible to judge whether the student is in front of the student end and what posture the student is in, so the teacher can intervene in time;
2. Generating the students' sight line data yields the student gaze point information, which can be used to detect whether each student's attention is focused during class, so the teacher can adjust the teaching strategy in time to liven up the classroom atmosphere;
3. Facial expression analysis observes the changes in each student's expression during class. Expressions reflect the student's attitude toward the class or the teacher as well as the student's emotional state; if poor class moods such as fear or distress appear frequently, reflecting a positive or negative reaction to the class and the teacher, feedback is given to the teacher and parents in time, so that attention is paid to the student's emotional changes and the student's mental health is supported;
4. The concentration analysis detects whether each student's attention is focused during class; if a student's attention is not focused, an early warning is given to the teacher in time.
Drawings
FIG. 1 is a diagram of an application environment of the remote classroom concentration analysis method according to an embodiment of the present invention;
FIG. 2 is a flowchart of the remote classroom concentration analysis method according to an embodiment of the present invention;
FIG. 3 is a flowchart of the implementation of step S30 in the remote classroom concentration analysis method according to an embodiment of the present invention;
FIG. 4 is a flowchart of another implementation of the remote classroom concentration analysis method according to an embodiment of the present invention;
FIG. 5 is a flowchart of the implementation of step S40 in the remote classroom concentration analysis method according to an embodiment of the present invention;
FIG. 6 is a flowchart of the implementation of step S42 in the remote classroom concentration analysis method according to an embodiment of the present invention;
FIG. 7 is a schematic block diagram of the remote classroom concentration analysis apparatus according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a computer device according to an embodiment of the invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
The remote classroom concentration analysis method provided by this application can be applied in the application environment shown in FIG. 1, in which a teacher end and student ends communicate with a server through a network. The camera of each student end captures student face data in real time and sends it to the server; the server analyzes the face data to obtain information on the student's concentration in class and sends this concentration information to the teacher end. The student ends and the teacher end (computer devices) may be, but are not limited to, personal computers, notebook computers, smartphones, tablet computers and portable wearable devices. The server may be implemented as an independent server or as a server cluster composed of multiple servers.
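For illustration only, the following is a minimal sketch, not taken from the patent text, of the per-student payload the server might assemble before pushing it to the teacher end; all field names here are assumptions.

```python
# Hypothetical concentration report forwarded from the server to the teacher end.
from dataclasses import dataclass, asdict
import json

@dataclass
class ConcentrationReport:
    student_id: str          # assumed identifier for the student end
    viewing_state: str       # e.g. "watching", "out_of_state", "too_close"
    gaze_point_ratio: float  # share of focus points inside the detection circle
    emotion: str             # latest facial emotion detected

report = ConcentrationReport("s001", "watching", 0.87, "neutral")
print(json.dumps(asdict(report)))  # JSON payload sent over the network
```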
In an embodiment, as shown in FIG. 2, the invention discloses a remote classroom concentration analysis method, which specifically includes the following steps:
S10: acquiring student face data from each student end in real time, and extracting the corresponding student face features from the student face data.
In this embodiment, the student end corresponds to the teacher end and refers to the terminal used by a student in an online class. The student face data is the captured image data of the face of the student attending class in front of the student end. The student face features are the data of key-point features extracted from the student face data.
Specifically, when a student attends a remote class at the student end, the student face data of the student in front of the student end is captured in real time through the application programming interface (API) of the student-end camera.
Further, key points such as the eyebrows, eyes, nose and mouth are defined, and the features of these key points are extracted from each piece of student face data; that is, the key points identified in the student face data serve as the student face features.
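As a minimal sketch of step S10 (the patent does not name a specific detector, so OpenCV's stock Haar cascades are used here purely as an assumed stand-in):

```python
# Sketch of S10: detect the face and eye regions that stand in for the
# "key point" features; the cascade choice is an assumption, not the patent's.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def extract_face_features(frame):
    """Return (face_box, eye_boxes) for the largest detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                      # no student in front of the camera
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest face
    eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
    return (x, y, w, h), eyes
```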
S20: generating a face rectangular frame corresponding to each piece of student face data from the student face features, and acquiring the rectangular frame size data of each face rectangular frame.
In this embodiment, the face rectangular frame is a rectangle indicating the size of the face captured by the student-end camera. The rectangular frame size data is the specific size of each face rectangular frame.
Specifically, after the student face features of each piece of face data are recognized, the smallest rectangle that can enclose all the student face features is drawn to obtain the face rectangular frame, and the area of this rectangle is computed as the rectangular frame size data.
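A sketch of step S20, assuming the key points are available as pixel coordinates:

```python
# Sketch of S20: the smallest axis-aligned rectangle covering all facial
# key points, plus its area (the "rectangular frame size data").
import numpy as np

def face_rectangle(keypoints):
    """keypoints: iterable of (x, y) facial landmark coordinates in pixels."""
    pts = np.asarray(keypoints)
    x_min, y_min = pts.min(axis=0)
    x_max, y_max = pts.max(axis=0)
    width, height = x_max - x_min, y_max - y_min
    return (x_min, y_min, width, height), width * height  # frame, area
```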
S30: calculating the student viewing distance data according to the rectangular frame size data, and generating the student viewing state according to the student viewing distance data.
In this embodiment, the student viewing distance data is the distance at which the student views the content on the student-end screen. The student viewing state describes the learning state of the student while participating in the remote class.
Specifically, an optimum value or optimum range of the rectangular frame size data is set, and the student viewing distance data is calculated from the ratio of the measured rectangular frame size data to that optimum value or range.
Further, if the student viewing distance data shows that the student is too far from the student-end screen, the student may not be taking the remote class seriously; if it shows that the student is too close to the screen, the student's sitting posture during the class may be incorrect, which affects eyesight and physical health. The corresponding student viewing state is generated from the student viewing distance data.
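The patent only states that distance is derived from the ratio of the measured frame size to a preset optimum. One concrete choice, assumed here and not stated in the text, is a pinhole-camera model in which apparent face area falls off with the square of distance:

```python
# Sketch of S30 under an assumed pinhole-camera model: area ~ 1/distance^2,
# so distance = reference_distance * sqrt(reference_area / measured_area).
import math

REF_AREA_PX = 40_000.0   # assumed calibrated frame area at the reference distance
REF_DISTANCE_CM = 50.0   # assumed reference viewing distance for that area

def estimate_viewing_distance(frame_area_px):
    return REF_DISTANCE_CM * math.sqrt(REF_AREA_PX / frame_area_px)

print(round(estimate_viewing_distance(10_000.0), 1))  # smaller face -> farther: 100.0
```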
S40: extracting the student eye features from the student face features, generating student sight line data according to the student eye features, and generating student gaze point information from the student sight line data.
In this embodiment, the student eye features are the features of the student's eyes while attending the remote class. The student sight line data describes the line from the student's eyes to the viewing target. The student gaze point information describes the target the student's eyes fixate on during the online class.
Specifically, the student eye features extracted from the student face features are input into a preset sight-line recognition model to obtain the student sight line data.
Further, the target the student is watching while participating in the remote class is derived from the student sight line data, yielding the student gaze point information.
S50: sending the student viewing state and the student gaze point information to the teacher end as the student class-attention information.
In this embodiment, the teacher end refers to the terminal used by the teacher in an online class.
Specifically, after each student's viewing state and gaze point information are obtained, they are sent to the teacher end, so that the teacher can intervene with students and adjust the teaching approach in time.
In this embodiment, student face data is acquired from each student end in real time and the corresponding student face features are extracted, which makes it easy to identify how each student is following the lecture. Moreover, although students' faces differ in size, the relative positions and sizes of the facial organs do not differ greatly, so generating the face rectangular frame from the student face features reduces the interference that differing face sizes would otherwise cause in recognition. Acquiring the viewing distance data makes it possible to judge whether the student is in front of the student end and what posture the student is in, so the teacher can intervene in time. Generating the students' sight line data yields the student gaze point information, which can be used to detect whether each student's attention is focused during class, so the teacher can adjust the teaching strategy in time to liven up the classroom atmosphere.
In one embodiment, the student viewing state includes student out-of-state information and student viewing-too-close information. As shown in FIG. 3, step S30, namely calculating the student viewing distance data according to the rectangular frame size data and generating the student viewing state according to the student viewing distance data, specifically includes the following steps:
S31: acquiring a preset standard viewing distance range, and comparing the student viewing distance data with the standard range in real time to obtain a corresponding comparison result.
In this embodiment, the standard viewing distance range is the ideal range of distances from the student to the student-end screen while participating in the remote class.
Specifically, the standard viewing distance range is set through experimental tests, big data analysis or similar means. The student viewing distance data calculated in real time is compared against this standard range to obtain the comparison result.
S32: if the comparison result shows that the student viewing distance data exceeds the standard viewing distance range, generating student out-of-state information.
In this embodiment, student out-of-state information indicates that the student is in a poor state while attending the remote class.
Specifically, if the comparison result shows that the student's viewing distance data exceeds the standard range, and the time spent beyond the range is longer than a certain duration, for example 5 minutes, the student's state in the remote class is judged to be poor and student out-of-state information is triggered. If no student face data is detected for a certain time, for example more than 5 minutes, the student is deemed absent, and the absence can be marked in the out-of-state information.
S33: if the comparison result shows that the student viewing distance data is smaller than the standard viewing distance range, generating student viewing-too-close information.
In this embodiment, student viewing-too-close information indicates that the student is too close to the screen while watching the remote class.
Specifically, if the comparison result shows that the student's viewing distance data is smaller than the standard range, the student's posture in front of the student-end screen is judged to be incorrect, and student viewing-too-close information is triggered.
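A sketch of steps S31 to S33, combining the comparison with the 5-minute persistence check described above; the concrete thresholds are assumptions:

```python
# Sketch of S31-S33: compare the live viewing distance with the preset
# standard range and emit a status string for the student viewing state.
import time

STANDARD_RANGE_CM = (40.0, 80.0)  # assumed standard viewing distance range
PERSIST_SECONDS = 5 * 60          # "for example, 5 minutes" per the text

_too_far_since = None             # timestamp when the student first went too far

def viewing_status(distance_cm, now=None):
    global _too_far_since
    now = time.time() if now is None else now
    lo, hi = STANDARD_RANGE_CM
    if distance_cm > hi:
        if _too_far_since is None:
            _too_far_since = now
        if now - _too_far_since >= PERSIST_SECONDS:
            return "student_out_of_state"   # S32: too far for too long
        return "watching"                   # too far, but not yet for 5 minutes
    _too_far_since = None
    if distance_cm < lo:
        return "student_too_close"          # S33: posture likely incorrect
    return "watching"
```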
In one embodiment, as shown in FIG. 4, after step S30, the remote classroom concentration analysis method further includes the following steps:
S301: acquiring a plurality of frames of student face data from each student end.
Specifically, from the student face data acquired in real time, the student face data of 5 frames is selected out of every 10 consecutive frames.
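The text does not say which 5 of the 10 frames are kept; taking every other frame, as in the sketch below, is one assumed reading:

```python
# Sketch of the sampling rule in S301: every other frame yields exactly
# 5 frames out of every 10 consecutive frames.
def sample_frames(frames):
    return frames[::2]

print(sample_frames(list(range(10))))  # [0, 2, 4, 6, 8]
```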
S302: inputting the student face data into a preset emotion detection model to obtain student emotion data and student emotion change data, and sending the student emotion data and the student emotion change data to the teacher end.
In this embodiment, the emotion detection model is a model for detecting students' facial expressions while they participate in the remote class. The student emotion data is the data of a student's facial emotion during the class. The student emotion change data describes how the student's facial emotion changes during the class.
Specifically, an emotion detection model capable of detecting facial expressions is trained from sample videos covering several expression types, including sad, happy, surprised, angry, disgusted, fearful and neutral.
Furthermore, inputting the student face data into the emotion detection model yields real-time student emotion data. If the student's facial expression changes, the change is recorded as student emotion change data.
Further, the student emotion data and the student emotion change data are sent to the teacher end. From each student's emotions and how strongly they change, the teacher end can judge the student's state while listening, which provides an effective reference for judging whether the student understands the lecture content.
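A sketch of the bookkeeping in step S302; `detect_emotion` is a placeholder for the preset emotion detection model, which the patent does not specify:

```python
# Sketch of S302: classify sampled frames and record each expression change.
def detect_emotion(frame):
    raise NotImplementedError  # stands in for the trained emotion model

def track_emotions(frames):
    emotions, changes = [], []
    last = None
    for idx, frame in enumerate(frames):
        emotion = detect_emotion(frame)
        emotions.append(emotion)
        if last is not None and emotion != last:
            changes.append({"frame": idx, "from": last, "to": emotion})
        last = emotion
    return emotions, changes  # both lists are sent to the teacher end
```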
In one embodiment, as shown in FIG. 5, step S40, namely extracting the student eye features from the student face features, generating student sight line data according to the student eye features and generating student gaze point information from the student sight line data, specifically includes the following steps:
S41: acquiring the student-end camera position information.
In this embodiment, the student-end camera position information describes the position of the camera relative to the device on which the student end runs.
Specifically, the model and dimension data of the device running the student end is obtained through the student end, and the student-end camera position information is calculated from this data.
S42: obtaining, for each piece of student sight line data, the focus point position data relative to the camera position information, and generating the student gaze point information according to the focus point position data.
In this embodiment, the focus point position data is the position of the object the student's eyes are focused on.
Specifically, a subset of frames is extracted from the student face data in the video stream and fed into a deep learning model for analysis; the model outputs the gaze position as x and y coordinates (in cm) relative to the camera position information, and these coordinates serve as the focus point position data.
Furthermore, all the focus point position data are aggregated statistically to obtain the student gaze point information.
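A sketch of the collection step; `predict_gaze_cm` is a placeholder for the deep learning model mentioned in the text:

```python
# Sketch of S42's bookkeeping: collect the (x, y) gaze positions, in cm
# relative to the camera, emitted by the gaze model for sampled frames.
def predict_gaze_cm(frame, camera_position):
    raise NotImplementedError  # stands in for the gaze-estimation model

def collect_focus_points(frames, camera_position, stride=10):
    points = []
    for frame in frames[::stride]:  # "a subset of frames is extracted"
        points.append(predict_gaze_cm(frame, camera_position))
    return points                   # list of (x_cm, y_cm) focus points
```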
In an embodiment, as shown in FIG. 6, step S42, namely generating the student gaze point information according to the focus point position data, specifically includes the following steps:
S421: acquiring the mean and the standard deviation of all the focus point position data.
Specifically, since each piece of focus point position data is recorded as a coordinate point, the mean and standard deviation of the x coordinates and the mean and standard deviation of the y coordinates are calculated over the coordinate points of all the focus point position data.
S422: generating a focus point detection circle with the mean as its center and N times the standard deviation as its radius, acquiring the focus point proportion data within the detection circle, and taking the focus point proportion data as the student gaze point information, where N is greater than 0.
In this embodiment, the focus point detection circle is the detection range used to count the student's focus points.
Specifically, the circle is drawn with the mean of the x and y coordinates as its center and N times the standard deviation as its radius to obtain the focus point detection circle; in this embodiment N is 1, 1.5 or 2.
Further, a first number, the total count of all focus point position data, and a second number, the count of focus points falling inside the detection circle, are acquired; the ratio of the second number to the first number, i.e. the focus point proportion data, is taken as the student gaze point information.
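A sketch of steps S421 and S422; since the text leaves open which standard deviation sets the radius, the sketch averages the x and y deviations as an assumption:

```python
# Sketch of S421-S422: build the detection circle from the mean and N times
# the standard deviation of the focus points, then report the share inside.
import numpy as np

def gaze_concentration(points, n=1.5):
    """points: (x, y) focus positions in cm; n is 1, 1.5 or 2 per the text."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)            # mean of the x and y coordinates
    radius = n * pts.std(axis=0).mean()  # N times the (averaged) standard deviation
    inside = np.linalg.norm(pts - center, axis=1) <= radius
    return inside.mean()                 # focus point proportion data

points = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0), (5.0, 5.0)]
print(gaze_concentration(points))  # 0.75: three of four points are concentrated
```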
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not limit the implementation of the embodiments of the present invention in any way.
In an embodiment, a remote classroom concentration analysis device is provided, and it corresponds one-to-one to the remote classroom concentration analysis method in the above embodiments. As shown in FIG. 7, the remote classroom concentration analysis device includes a feature extraction module 10, a face framing module 20, a distance calculation module 30, an attention calculation module 40 and a sending module 50. The functional modules are described in detail as follows:
the feature extraction module 10 is configured to acquire student face data from each student end in real time and extract the corresponding student face features from the student face data;
the face framing module 20 is configured to generate a face rectangular frame corresponding to each piece of student face data from the student face features and acquire the rectangular frame size data of each face rectangular frame;
the distance calculation module 30 is configured to calculate the student viewing distance data according to the rectangular frame size data and generate the student viewing state according to the student viewing distance data;
the attention calculation module 40 is configured to extract the student eye features from the student face features, generate student sight line data according to the student eye features and generate student gaze point information according to the student sight line data;
and the sending module 50 is configured to send the student viewing state and the student gaze point information to the teacher end as the student class-attention information.
Preferably, the distance calculation module 30 includes:
a comparison submodule 31, configured to acquire a preset standard viewing distance range and compare the student viewing distance data with the standard range in real time to obtain a corresponding comparison result;
a first result generation submodule 32, configured to generate student out-of-state information if the comparison result shows that the student viewing distance data exceeds the standard viewing distance range;
and a second result generation submodule 33, configured to generate student viewing-too-close information if the comparison result shows that the student viewing distance data is smaller than the standard viewing distance range.
Preferably, the remote classroom concentration analysis device further includes:
an image extraction module 301, configured to acquire a plurality of frames of student face data from each student end;
and an emotion detection module 302, configured to input the student face data into a preset emotion detection model to obtain student emotion data and student emotion change data, and send the student emotion data and the student emotion change data to the teacher end.
Preferably, the attention calculation module 40 includes:
a position acquisition submodule 41, configured to acquire the student-end camera position information;
and a focus point calculation submodule 42, configured to obtain, for each piece of student sight line data, the focus point position data relative to the camera position information and generate the student gaze point information according to the focus point position data.
Preferably, the focus point calculation submodule 42 includes:
a first calculation submodule 421, configured to acquire the mean and the standard deviation of all the focus point position data;
and a second calculation submodule 422, configured to generate a focus point detection circle with the mean as its center and N times the standard deviation as its radius, acquire the focus point proportion data within the detection circle, and take the focus point proportion data as the student gaze point information, where N is greater than 0.
For the specific limitations of the remote classroom concentration analysis device, reference may be made to the limitations of the remote classroom concentration analysis method above, which are not repeated here. Each module of the remote classroom concentration analysis device may be implemented wholly or partly in software, hardware or a combination of the two. The modules may be embedded in or independent of the processor of the computer device in hardware form, or stored in the memory of the computer device in software form, so that the processor can invoke them and perform the corresponding operations.
In one embodiment, a computer device is provided, which may be a server, and its internal structure may be as shown in FIG. 8. The computer device includes a processor, a memory, a network interface and a database connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device stores the emotion detection model and the model for detecting focus point position data. The network interface of the computer device communicates with external terminals through a network connection. The computer program, when executed by the processor, implements the remote classroom concentration analysis method.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
acquiring student face data from each student end in real time, and extracting the corresponding student face features from each piece of student face data;
generating a face rectangular frame corresponding to each piece of student face data from the student face features, and acquiring the rectangular frame size data of each face rectangular frame;
calculating the student viewing distance data according to the rectangular frame size data, and generating the student viewing state according to the student viewing distance data;
extracting student eye features from the student face features, generating student sight line data according to the student eye features, and generating student gaze point information from the student sight line data;
and sending the student viewing state and the student gaze point information to the teacher end as the student class-attention information.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon which, when executed by a processor, performs the following steps:
acquiring student face data from each student end in real time, and extracting the corresponding student face features from each piece of student face data;
generating a face rectangular frame corresponding to each piece of student face data from the student face features, and acquiring the rectangular frame size data of each face rectangular frame;
calculating the student viewing distance data according to the rectangular frame size data, and generating the student viewing state according to the student viewing distance data;
extracting student eye features from the student face features, generating student sight line data according to the student eye features, and generating student gaze point information from the student sight line data;
and sending the student viewing state and the student gaze point information to the teacher end as the student class-attention information.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the above method embodiments. Any reference to memory, storage, database or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A remote classroom concentration analysis method, characterized by comprising the following steps:
acquiring student face data from each student end in real time, and extracting the corresponding student face features from each piece of student face data;
generating a face rectangular frame corresponding to each piece of student face data from the student face features, and acquiring the rectangular frame size data of each face rectangular frame;
calculating the student viewing distance data according to the rectangular frame size data, and generating the student viewing state according to the student viewing distance data;
extracting student eye features from the student face features, generating student sight line data according to the student eye features, and generating student gaze point information from the student sight line data;
and sending the student viewing state and the student gaze point information to a teacher end as the student class-attention information.
2. The remote classroom concentration analysis method according to claim 1, wherein the student viewing state includes student out-of-state information and student viewing-too-close information, and calculating the student viewing distance data according to the rectangular frame size data and generating the student viewing state according to the student viewing distance data specifically comprises the following steps:
acquiring a preset standard viewing distance range, and comparing the student viewing distance data with the standard range in real time to obtain a corresponding comparison result;
if the comparison result shows that the student viewing distance data exceeds the standard viewing distance range, generating student out-of-state information;
and if the comparison result shows that the student viewing distance data is smaller than the standard viewing distance range, generating student viewing-too-close information.
3. The remote classroom concentration analysis method according to claim 1, wherein after calculating the student viewing distance data according to the rectangular frame size data and generating the student viewing state according to the student viewing distance data, the remote classroom concentration analysis method further comprises the following steps:
acquiring a plurality of frames of student face data from each student end;
and inputting the student face data into a preset emotion detection model to obtain student emotion data and student emotion change data, and sending the student emotion data and the student emotion change data to the teacher end.
4. The remote classroom concentration analysis method according to claim 1, wherein extracting the student eye features from the student face features, generating student sight line data according to the student eye features and generating student gaze point information from the student sight line data specifically comprises the following steps:
acquiring the student-end camera position information;
and obtaining, for each piece of student sight line data, the focus point position data relative to the camera position information, and generating the student gaze point information according to the focus point position data.
5. The remote classroom concentration analysis method according to claim 4, wherein generating the student gaze point information according to the focus point position data specifically comprises the following steps:
acquiring the mean and the standard deviation of all the focus point position data;
and generating a focus point detection circle with the mean as its center and N times the standard deviation as its radius, acquiring the focus point proportion data within the detection circle, and taking the focus point proportion data as the student gaze point information, where N is greater than 0.
6. A remote classroom concentration analysis device, characterized by comprising:
a feature extraction module, configured to acquire student face data from each student end in real time and extract the corresponding student face features from the student face data;
a face framing module, configured to generate a face rectangular frame corresponding to each piece of student face data from the student face features and acquire the rectangular frame size data of each face rectangular frame;
a distance calculation module, configured to calculate the student viewing distance data according to the rectangular frame size data and generate the student viewing state according to the student viewing distance data;
an attention calculation module, configured to extract the student eye features from the student face features, generate student sight line data according to the student eye features and generate student gaze point information according to the student sight line data;
and a sending module, configured to send the student viewing state and the student gaze point information to a teacher end as the student class-attention information.
7. The remote classroom concentration analysis device according to claim 6, wherein the student viewing state includes student out-of-state information and student viewing-too-close information, and the distance calculation module comprises:
a comparison submodule, configured to acquire a preset standard viewing distance range and compare the student viewing distance data with the standard range in real time to obtain a corresponding comparison result;
a first result generation submodule, configured to generate student out-of-state information if the comparison result shows that the student viewing distance data exceeds the standard viewing distance range;
and a second result generation submodule, configured to generate student viewing-too-close information if the comparison result shows that the student viewing distance data is smaller than the standard viewing distance range.
8. The remote classroom concentration analysis device according to claim 6, further comprising:
an image extraction module, configured to acquire a plurality of frames of student face data from each student end;
and an emotion detection module, configured to input the student face data into a preset emotion detection model to obtain student emotion data and student emotion change data, and send the student emotion data and the student emotion change data to the teacher end.
9. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, carries out the steps of the remote classroom concentration analysis method as claimed in any one of claims 1 to 5.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the remote classroom concentration analysis method as defined in any one of claims 1 to 5.
CN202010748481.XA 2020-07-30 2020-07-30 Remote classroom concentration analysis method and device, computer equipment and storage medium Pending CN111967350A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010748481.XA CN111967350A (en) 2020-07-30 2020-07-30 Remote classroom concentration analysis method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010748481.XA CN111967350A (en) 2020-07-30 2020-07-30 Remote classroom concentration analysis method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111967350A true CN111967350A (en) 2020-11-20

Family

ID=73363576

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010748481.XA Pending CN111967350A (en) 2020-07-30 2020-07-30 Remote classroom concentration analysis method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111967350A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106599881A (en) * 2016-12-30 2017-04-26 首都师范大学 Student state determination method, device and system
US20200175264A1 (en) * 2017-08-07 2020-06-04 Shenzhen Institutes Of Advanced Technology Chinese Academy Of Sciences Teaching assistance method and teaching assistance system using said method
CN108830164A (en) * 2018-05-22 2018-11-16 北京小鱼在家科技有限公司 Reminding method, device, computer equipment and the storage medium of screen viewed status
CN109815795A (en) * 2018-12-14 2019-05-28 深圳壹账通智能科技有限公司 Classroom student's state analysis method and device based on face monitoring
CN110393539A (en) * 2019-06-21 2019-11-01 合肥工业大学 Psychological abnormality detection method, device, storage medium and electronic equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112861650A (en) * 2021-01-19 2021-05-28 北京百家科技集团有限公司 Behavior evaluation method, device and system
CN113239841A (en) * 2021-05-24 2021-08-10 桂林理工大学博文管理学院 Classroom concentration state detection method based on face recognition and related instrument
CN113592237A (en) * 2021-07-01 2021-11-02 中国联合网络通信集团有限公司 Teaching quality assessment method and electronic equipment
CN113592237B (en) * 2021-07-01 2023-06-09 中国联合网络通信集团有限公司 Teaching quality assessment method and electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 510032 Part 402, No. 7, caipin Road, Science City, Luogang District, Guangzhou City, Guangdong Province

Applicant after: Guangdong Everbright Information Technology Co.,Ltd.

Address before: 510032 Part 402, No. 7, caipin Road, Science City, Luogang District, Guangzhou City, Guangdong Province

Applicant before: GUANGZHOU EVERBRIGHT EDUCATION SOFTWARE TECHNOLOGY CO.,LTD.
