CN114332719A - Student classroom learning motivation analysis method, device, equipment and storage medium


Info

Publication number
CN114332719A
CN114332719A
Authority
CN
China
Prior art keywords
teaching
student
video
students
feedback information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111667138.3A
Other languages
Chinese (zh)
Inventor
孟繁华
方海光
蔡春
朱晓宏
刘文龙
孔新梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Capital Normal University
Original Assignee
Capital Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Capital Normal University
Priority to CN202111667138.3A
Publication of CN114332719A
Legal status: Pending

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The invention provides a student classroom learning motivation analysis method, device, equipment and storage medium, applied to a teaching scene provided with audio acquisition equipment and video acquisition equipment. A teaching video of the teaching activities in the teaching scene is acquired through the audio acquisition equipment and the video acquisition equipment, and the students' facial expressions are obtained by analyzing the teaching video. The facial expressions are then evaluated to obtain the emotional feedback information expressed by the students in the teaching activities, and based on this emotional feedback information the result of analyzing the students' learning motivation for the teaching activities can be effectively determined. Furthermore, the students' listening states and degree of understanding in the teaching activities can be obtained by analysis, which helps teachers reflect on and improve their teaching based on the students' learning motivation. The method meets the requirements of facial expression recognition in complex teaching activities, with high recognition accuracy and speed, and improves the accuracy of student learning motivation analysis.

Description

Student classroom learning motivation analysis method, device, equipment and storage medium
Technical Field
The present disclosure relates to the technical field of the Internet, and in particular to a student classroom learning motivation analysis method, device, equipment and storage medium.
Background
With the development and progress of the times, growing attention has been paid to the comprehensive quality education and the ideological and moral education of young people, which has given rise to the intelligent classroom teaching mode. An intelligent classroom is a classroom learning environment built from teaching media based on information technology: information technology and terminal equipment are introduced into the whole teaching process, teaching is driven by modern technology, and the classroom becomes more intelligent and efficient.
In the existing teaching mode, the analysis of students' classroom learning motivation is mainly performed manually. The analysis results are subjective and can only be obtained after lengthy analysis and collation, making it difficult to obtain fast, real-time results on students' reactions in teaching activities. The accuracy with which students' learning emotions and attention in teaching activities are read is low, and it is difficult to comprehensively capture and record students' classroom learning motivation.
Disclosure of Invention
The embodiments of the present disclosure provide at least a student classroom learning motivation analysis method, device, equipment and storage medium.
The embodiment of the disclosure provides a student classroom learning motivation analysis method, which is applied to a teaching scene provided with an audio acquisition device and a video acquisition device, and the method comprises the following steps:
acquiring a teaching video which is acquired by the audio acquisition equipment and the video acquisition equipment and used for teaching activities in the teaching scene;
analyzing the teaching video to obtain the facial expression of the student;
according to the facial expressions of the students, evaluating emotional feedback information expressed by the students in teaching activities;
and determining the result of analyzing the learning motivation of the student on the teaching activity based on the emotional feedback information.
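For illustration only, the following Python sketch shows how the four steps above could be chained together. The function names, the stubbed step bodies and the example data are assumptions introduced here and are not defined by this disclosure.

```python
# Minimal end-to-end sketch of the four steps listed above. Every function body
# is a stand-in stub; the point is only the order of the data flow:
# video -> expressions -> emotional feedback -> motivation result.
from typing import Dict, List

def acquire_teaching_video(audio_dev: str, video_dev: str) -> Dict:
    # Step 1: a real system would pull synchronized audio/image data
    # from the devices installed in the teaching scene.
    return {"frames": ["frame-0", "frame-1"], "audio": "audio-track"}

def parse_facial_expressions(video: Dict) -> List[str]:
    # Step 2: face detection + expression recognition per video slice (stubbed).
    return ["happiness", "neutrality", "disgust"]

def evaluate_feedback(expressions: List[str]) -> List[str]:
    # Step 3: map each expression to positive/negative emotional feedback.
    positive = {"happiness", "surprise", "neutrality"}
    return ["positive" if e in positive else "negative" for e in expressions]

def determine_motivation(feedback: List[str]) -> str:
    # Step 4: summarize the feedback into a learning-motivation result.
    pos = feedback.count("positive")
    return "positive learning motivation" if pos > len(feedback) / 2 else "low learning motivation"

if __name__ == "__main__":
    video = acquire_teaching_video("classroom-mic", "classroom-camera")
    print(determine_motivation(evaluate_feedback(parse_facial_expressions(video))))
```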
In an optional embodiment, the acquiring a teaching video acquired by the audio acquisition device and the video acquisition device and used for teaching activities in the teaching scene includes:
determining the audio acquisition equipment and the video acquisition equipment which are arranged in the teaching scene;
acquiring audio data acquired by the audio acquisition equipment and image data acquired by the video acquisition equipment;
screening and denoising collected audio data to obtain processed audio data;
carrying out region division processing on the acquired image data, and extracting to obtain processed image data, wherein the processed image data is student image data for displaying student pictures;
and integrating the processed student image data and the processed audio data to obtain the teaching video.
In an optional implementation manner, the parsing the teaching video to obtain the facial expression of the student includes:
aiming at the teaching video, extracting a student video image file from the teaching video;
based on a preset time interval, slicing the student video image file according to the preset time interval to obtain a student image slicing file;
and analyzing the student image fragment file to obtain the facial expression of the student.
In an optional implementation manner, the parsing the student image fragment file to obtain the facial expression of the student includes:
extracting at least one face image of a student from the student image fragment file based on the student image fragment file;
and determining, for the extracted at least one facial image, the facial expression corresponding to each facial image.
In an optional embodiment, the determining, for the extracted at least one facial image, a facial expression corresponding to each facial image includes:
judging whether each facial image can be matched with one sample expression or not based on a plurality of sample expressions stored in a preset expression database;
if the facial image can be matched with one sample expression, determining the corresponding facial expression of the facial image based on the matched sample expression, wherein the facial expression comprises one or more of happiness, surprise, neutrality, disgust, anger and fear.
In an optional embodiment, the evaluating emotional feedback information, which is expressed by the student in the teaching activities, according to the facial expression of the student comprises:
aiming at one or more facial expressions of happiness, surprise and neutrality, the emotional feedback information expressed by the students in the teaching activities is evaluated as positive information;
and evaluating the emotional feedback information presented by the students in the teaching activities into negative information aiming at one or more facial expressions of disgust, anger and fear.
In an optional embodiment, the determining, based on the emotional feedback information, a result of analyzing a learning motivation of the student for the teaching activity includes:
confirming at least one piece of emotional feedback information of at least one student in the teaching video based on the teaching video;
judging the actual ratio of positive information to negative information in the at least one piece of emotion feedback information based on the at least one piece of emotion feedback information;
based on a predetermined preset ratio, comparing the actual ratio with the preset ratio to obtain a comparison result;
and determining the result of the learning motivation analysis of the teaching activities by the students based on the comparison result.
In an optional embodiment, the determining, based on the emotional feedback information, a result of analyzing a learning motivation of the student for the teaching activity includes:
processing the teaching video to obtain at least one teacher action and at least one student action, wherein the teacher action and the student action are in corresponding relation;
integrating to obtain teacher-student interaction behaviors based on the teacher actions and the student actions;
and determining the result of analyzing the learning motivation of the teaching activities by students based on the emotional feedback information and the teacher-student interaction behaviors.
In an optional embodiment, after determining the result of analyzing the learning motivation of the student for the teaching activity based on the emotional feedback information, the method comprises:
determining a teaching quality evaluation result for a teacher in the teaching activity based on the learning motivation analysis result of the student on the teaching activity;
and generating prompt information based on the teaching quality evaluation result, and feeding the prompt information back to the teacher.
The disclosed embodiment also provides a student classroom learning motivation analysis device, the device includes:
the acquisition module is used for acquiring a teaching video which is acquired by the audio acquisition equipment and the video acquisition equipment and used for teaching activities in the teaching scene;
the analysis module is used for analyzing the teaching video to obtain the facial expression of the student;
the evaluation module is used for evaluating emotion feedback information expressed by the students in the teaching activities according to the facial expressions of the students;
and the judging module is used for determining the result of analyzing the learning motivation of the student on the teaching activity based on the emotion feedback information.
In an optional implementation manner, the acquisition module is specifically configured to:
determining the audio acquisition equipment and the video acquisition equipment which are arranged in the teaching scene;
acquiring audio data acquired by the audio acquisition equipment and image data acquired by the video acquisition equipment;
screening and denoising collected audio data to obtain processed audio data;
carrying out region division processing on the acquired image data, and extracting to obtain processed image data, wherein the processed image data is student image data for displaying student pictures;
and integrating the processed student image data and the processed audio data to obtain the teaching video.
In an optional implementation manner, the analysis module is specifically configured to:
aiming at the teaching video, extracting a student video image file from the teaching video;
based on a preset time interval, slicing the student video image file according to the preset time interval to obtain a student image slicing file;
and analyzing the student image fragment file to obtain the facial expression of the student.
In an optional implementation manner, when the analysis module is configured to perform analysis processing on the student image fragment file to obtain a facial expression of a student, the analysis module is specifically configured to:
extracting at least one face image of a student from the student image fragment file based on the student image fragment file;
and determining, for the extracted at least one facial image, the facial expression corresponding to each facial image.
In an optional embodiment, when the analysis module is configured to determine, for the extracted at least one facial image, a facial expression corresponding to each facial image, the analysis module is specifically configured to:
judging whether each facial image can be matched with one sample expression or not based on a plurality of sample expressions stored in a preset expression database;
if the facial image can be matched with one sample expression, determining the corresponding facial expression of the facial image based on the matched sample expression, wherein the facial expression comprises one or more of happiness, surprise, neutrality, disgust, anger and fear.
In an optional embodiment, the evaluation module is specifically configured to:
aiming at one or more facial expressions of happiness, surprise and neutrality, the emotional feedback information expressed by the students in the teaching activities is evaluated as positive information;
and evaluating the emotional feedback information presented by the students in the teaching activities into negative information aiming at one or more facial expressions of disgust, anger and fear.
In an optional implementation manner, the determining module is specifically configured to:
confirming at least one piece of emotional feedback information of at least one student in the teaching video based on the teaching video;
judging the actual ratio of positive information to negative information in the at least one piece of emotion feedback information based on the at least one piece of emotion feedback information;
based on a predetermined preset ratio, comparing the actual ratio with the preset ratio to obtain a comparison result;
and determining the result of the learning motivation analysis of the teaching activities by the students based on the comparison result.
In an optional implementation manner, the determining module is specifically configured to:
processing the teaching video to obtain at least one teacher action and at least one student action, wherein the teacher action and the student action are in corresponding relation;
integrating to obtain teacher-student interaction behaviors based on the teacher actions and the student actions;
and determining the result of analyzing the learning motivation of the teaching activities by students based on the emotional feedback information and the teacher-student interaction behaviors.
In an optional embodiment, the apparatus further comprises a feedback module configured to:
determining a teaching quality evaluation result for a teacher in the teaching activity based on the learning motivation analysis result of the student on the teaching activity;
and generating prompt information based on the teaching quality evaluation result, and feeding the prompt information back to the teacher.
An embodiment of the present disclosure further provides an electronic device, including a processor, a memory and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the student classroom learning motivation analysis method.
The disclosed embodiments also provide a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, performs the steps of the student classroom learning motivation analysis method.
The student classroom learning motivation analysis method, device, equipment and storage medium provided by the embodiments of the present disclosure are applied to a teaching scene provided with audio acquisition equipment and video acquisition equipment, and can acquire a teaching video of the teaching activities in the teaching scene collected by the audio acquisition equipment and the video acquisition equipment; analyze the teaching video to obtain the students' facial expressions; evaluate, according to the students' facial expressions, the emotional feedback information expressed by the students in the teaching activities; and determine, based on the emotional feedback information, the result of analyzing the students' learning motivation for the teaching activities.
Therefore, the teaching video of the teaching activities in the teaching scene is collected through the audio acquisition equipment and the video acquisition equipment, the students' facial expressions are obtained by analyzing the teaching video, and the facial expressions are evaluated to obtain the emotional feedback information expressed by the students in the teaching activities. Based on the emotional feedback information, the result of analyzing the students' learning motivation for the teaching activities can be effectively determined; furthermore, the students' listening states and degree of understanding in the teaching activities can be obtained by analysis, which helps teachers reflect on and improve their teaching based on the students' learning motivation. This meets the requirements of facial expression recognition in complex teaching activities, with high recognition accuracy and speed, and improves the accuracy of student learning motivation analysis.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. The drawings, which are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive additional related drawings from them without creative effort.
Fig. 1 is a schematic diagram illustrating an application scenario provided by an embodiment of the present disclosure;
fig. 2 shows a flowchart of a student classroom learning motivation analysis method provided by an embodiment of the present disclosure;
FIG. 3 is a flow chart illustrating another student classroom learning motivation analysis method provided by an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a student classroom learning motivation analysis apparatus provided by an embodiment of the present disclosure;
fig. 5 is a second schematic diagram of an apparatus for analyzing student classroom learning motivation according to an embodiment of the present disclosure;
fig. 6 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The term "and/or" herein merely describes an associative relationship, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Research shows that in existing teaching modes the analysis of students' classroom learning motivation is mainly performed manually: for example, the teacher observes and summarizes during teaching activities, or a teaching case is analyzed by hand afterwards. This disperses the teacher's attention while giving lessons, and results can only be obtained after lengthy analysis and collation.
Based on this research, the present disclosure provides a student classroom learning motivation analysis method, applied to a teaching scene provided with an audio acquisition device and a video acquisition device, which can acquire a teaching video of the teaching activities in the teaching scene collected by the audio acquisition device and the video acquisition device; analyze the teaching video to obtain the students' facial expressions; evaluate, according to the students' facial expressions, the emotional feedback information expressed by the students in the teaching activities; and determine, based on the emotional feedback information, the result of analyzing the students' learning motivation for the teaching activities.
Therefore, the teaching video of the teaching activities in the teaching scene is collected through the audio acquisition equipment and the video acquisition equipment, the students' facial expressions are obtained by analyzing the teaching video, and the facial expressions are evaluated to obtain the emotional feedback information expressed by the students in the teaching activities. Based on the emotional feedback information, the result of analyzing the students' learning motivation for the teaching activities can be effectively determined; furthermore, the students' listening states and degree of understanding in the teaching activities can be obtained by analysis, which helps teachers reflect on and improve their teaching based on the students' learning motivation. This meets the requirements of facial expression recognition in complex teaching activities, with high recognition accuracy and speed, and improves the accuracy of student learning motivation analysis.
To facilitate understanding of the present embodiment, the student classroom learning motivation analysis method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the student classroom learning motivation analysis method provided in the embodiments of the present disclosure is generally an electronic device with a certain computing capability, including, for example, a terminal device, a server or another processing device. In some possible implementations, the student classroom learning motivation analysis method may be implemented by a processor calling computer-readable instructions stored in a memory.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario provided in the present disclosure. As shown in fig. 1, in a teaching activity the main body of interaction is the student, who interacts with the teacher, the teaching content, the learning environment and other students. The teaching content mainly includes teaching materials and resources; the teaching materials may include paper and electronic teaching materials, and the resources may include text, pictures, audio, video and the like. The learning environment mainly includes information technology and terminal equipment; the information technology mainly involves the Internet, big data, the Internet of Things, artificial intelligence and the like, and the terminal equipment may include mobile phones, tablet computers, electronic whiteboards, educational robots and the like.
In a teaching activity, teachers and students give feedback to each other, ask questions and respond, and students cooperate and hold discussions with one another. The teacher designs the teaching content, which the students can study and watch on their own; the teacher also creates situations in the learning environment, which the students can operate and experiment in. In addition, the teaching content needs the learning environment as the carrier on which knowledge and resources are presented, and the learning environment in turn provides the corresponding functions to support the teaching.
Referring to fig. 2, fig. 2 is a flowchart of a student classroom learning motivation analysis method according to an embodiment of the present disclosure. As shown in fig. 2, a student classroom learning motivation analysis method provided by the embodiment of the present disclosure includes:
s201: and acquiring a teaching video which is acquired by the audio acquisition equipment and the video acquisition equipment and used for teaching activities in the teaching scene.
In this step, when student classroom learning motivation analysis is needed for the teaching activities performed in the teaching scene, the teaching video of those teaching activities can be acquired by means of the audio acquisition device and the video acquisition device arranged in the teaching scene, so that the teaching video can subsequently be analyzed to obtain the students' classroom learning motivation.
It can be understood that the teaching scene is provided with the audio acquisition device and the video acquisition device, so that the teaching video can be obtained by means of the audio acquisition device and the video acquisition device.
Optionally, the audio capture device and the video capture device may be pre-set in the instructional scene. For example, when a student classroom learning motivation analysis is needed for a teaching activity in a teaching scene to determine the student classroom learning motivation in the teaching activity, the teaching scene may be a multimedia classroom in which the audio acquisition device and the video acquisition device are pre-installed, so that the acquisition of the teaching video may be implemented by the audio acquisition device and the video acquisition device pre-installed in the multimedia classroom.
Optionally, when the teaching activities performed in the teaching scene need to be recorded, the audio acquisition device and the video acquisition device are set before recording. Illustratively, when student classroom learning motivation analysis is needed for teaching activities in a teaching scene to determine student classroom learning motivation in the teaching activities, the teaching scene may be a common classroom in which the audio acquisition device and the video acquisition device are not installed in advance, and at this time, the acquisition of the teaching video may be realized by temporarily setting the audio acquisition device and the video acquisition device.
The audio acquisition equipment may be worn by the teacher and the students, in which case it may be a microphone or a headset; it may also be fixed on the lectern, a desk or a wall, in which case it may be a boom microphone or a lifting microphone. No limitation is imposed here.
The video acquisition equipment may be fixed cameras arranged in different areas of the teaching scene, or cameras installed in fixed areas of the teaching scene that can swivel through preset angles; no limitation is imposed here.
S202: and analyzing the teaching video to obtain the facial expression of the student.
In this step, under the condition that the teaching video is collected, the facial expressions of the students can be obtained from the teaching video in the modes of analysis processing and the like.
Here, the students' facial expressions, which may include one or more of happiness, surprise, neutrality, disgust, sadness, anger and fear, may be automatically recognized using artificial intelligence techniques such as convolutional neural networks.
Alternatively, key parts of the face of the student, such as the positions of the eyebrows, the eyes, the nose, the mouth, the chin and the like of the face, can be firstly located in the teaching video, and then a face image including the key parts of the face can be determined, so that the facial expression of the student can be identified.
Further, if the picture of the teaching video is too bright or too dark, the picture brightness of the teaching video can be adjusted first, and then the facial expression is identified.
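As a rough illustration of the pre-processing described above (locating the key facial parts and adjusting picture brightness), the following sketch uses OpenCV's stock Haar face detector and histogram equalization; the library choice, the brightness thresholds and the frame path are assumptions, and any other detector or enhancement method could be substituted.

```python
# Sketch: normalize frame brightness, then locate candidate face regions
# before expression recognition.
import cv2

def locate_faces(frame_path: str):
    frame = cv2.imread(frame_path)
    if frame is None:
        raise FileNotFoundError(frame_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # If the picture is too bright or too dark, equalize the histogram first
    # so the expression recognizer sees a more uniform input.
    if gray.mean() < 60 or gray.mean() > 190:        # thresholds are illustrative
        gray = cv2.equalizeHist(gray)

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Each (x, y, w, h) box is a candidate face region covering the key parts
    # (eyebrows, eyes, nose, mouth, chin) mentioned above.
    return [gray[y:y + h, x:x + w] for (x, y, w, h) in faces]

# Example usage (hypothetical frame file extracted from the teaching video):
# face_crops = locate_faces("teaching_frame.jpg")
```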
S203: and evaluating emotional feedback information expressed by the students in the teaching activities according to the facial expressions of the students.
In this step, once the student's facial expression has been determined, feature extraction may be performed on the facial expression so as to evaluate the emotional feedback information expressed by the student in the teaching activities.
Specifically, the emotional feedback information expressed by the students in the teaching activities is evaluated as positive information aiming at one or more facial expressions of happiness, surprise and neutrality; and evaluating the emotional feedback information presented by the students in the teaching activities into negative information aiming at one or more facial expressions of disgust, anger and fear.
Therefore, according to the students' facial expressions, the emotional feedback information expressed by the students in the teaching activities can be evaluated, so that the result of analyzing the students' learning motivation for the teaching activities can be obtained through subsequent processing. In addition, the students' continuous learning states in the teaching activities can be recorded based on the emotional feedback information, so that the results of the learning motivation analysis can be better summarized.
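A minimal sketch of the expression-to-feedback mapping just described, written in Python; the dictionary below simply encodes the grouping stated above, and the handling of unrecognized labels is an assumption introduced here.

```python
# Map each recognized facial expression to positive or negative emotional
# feedback. Grouping "neutrality" with the positive class follows this
# disclosure's own convention.
FEEDBACK_BY_EXPRESSION = {
    "happiness":  "positive",
    "surprise":   "positive",
    "neutrality": "positive",
    "disgust":    "negative",
    "anger":      "negative",
    "fear":       "negative",
}

def emotional_feedback(expression: str):
    # Unknown labels (e.g. an unrecognized face) yield None and can be
    # skipped upstream.
    return FEEDBACK_BY_EXPRESSION.get(expression)

print(emotional_feedback("surprise"))   # -> positive
print(emotional_feedback("anger"))      # -> negative
```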
S204: and determining the result of analyzing the learning motivation of the student on the teaching activity based on the emotional feedback information.
In this step, once the emotional feedback information has been obtained, the ratio of positive information to negative information in the teaching activities can be calculated, so as to determine the result of analyzing the students' learning motivation for the teaching activities.
In order to obtain a learning motivation analysis result, in some possible embodiments, the determining a learning motivation analysis result of the student on the teaching activity based on the emotional feedback information includes:
confirming at least one piece of emotional feedback information of at least one student in the teaching video based on the teaching video;
judging the actual ratio of positive information to negative information in the at least one piece of emotion feedback information based on the at least one piece of emotion feedback information;
based on a predetermined preset ratio, comparing the actual ratio with the preset ratio to obtain a comparison result;
and determining the result of the learning motivation analysis of the teaching activities by the students based on the comparison result.
In this step, it can be understood that a picture of the teaching video may include at least one student, so that at least one piece of emotional feedback information of the at least one student in the teaching video can be obtained. The actual ratio of positive information to negative information in the at least one piece of emotional feedback information is then determined, the actual ratio is compared with a predetermined preset ratio to obtain a comparison result, and the result of analyzing the students' learning motivation for the teaching activity is determined based on the comparison result.
Optionally, the comparison result can be obtained based on the weight by calculating the weight of each of the positive information and the negative information in the at least one piece of emotional feedback information, so that the result of the learning motivation analysis of the teaching activity by the student can be determined.
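The following sketch illustrates, under assumed configuration values, both the ratio comparison and the optional weighted variant mentioned above; the preset ratio, the threshold and the example feedback lists are illustrative only.

```python
# Ratio-based and weight-based evaluation of the collected emotional feedback.
from typing import List, Tuple

def motivation_by_ratio(feedback: List[str], preset_ratio: float = 1.5) -> str:
    pos = feedback.count("positive")
    neg = feedback.count("negative")
    actual_ratio = pos / neg if neg else float("inf")
    return ("positive learning motivation"
            if actual_ratio >= preset_ratio else "low learning motivation")

def motivation_by_weight(feedback: List[Tuple[str, float]], threshold: float = 0.0) -> str:
    # Weighted variant: each piece of feedback carries a weight (e.g. how long
    # the expression lasted); positive weight counts up, negative counts down.
    score = sum(w if label == "positive" else -w for label, w in feedback)
    return ("positive learning motivation"
            if score >= threshold else "low learning motivation")

print(motivation_by_ratio(["positive", "positive", "negative"]))      # 2.0 >= 1.5
print(motivation_by_weight([("positive", 0.6), ("negative", 0.9)]))   # score = -0.3
```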
The student classroom learning motivation analysis method provided by the embodiment of the present disclosure is applied to a teaching scene provided with an audio acquisition device and a video acquisition device, and can acquire a teaching video of the teaching activities in the teaching scene collected by the audio acquisition device and the video acquisition device; analyze the teaching video to obtain the students' facial expressions; evaluate, according to the students' facial expressions, the emotional feedback information expressed by the students in the teaching activities; and determine, based on the emotional feedback information, the result of analyzing the students' learning motivation for the teaching activities.
Therefore, the teaching video of the teaching activities in the teaching scene is collected through the audio acquisition equipment and the video acquisition equipment, the students' facial expressions are obtained by analyzing the teaching video, and the facial expressions are evaluated to obtain the emotional feedback information expressed by the students in the teaching activities. Based on the emotional feedback information, the result of analyzing the students' learning motivation for the teaching activities can be effectively determined; furthermore, the students' listening states and degree of understanding in the teaching activities can be obtained by analysis, which helps teachers reflect on and improve their teaching based on the students' learning motivation. This meets the requirements of facial expression recognition in complex teaching activities, with high recognition accuracy and speed, and improves the accuracy of student learning motivation analysis.
Referring to fig. 3, fig. 3 is a flowchart of another student classroom learning motivation analysis method according to an embodiment of the present disclosure. As shown in fig. 3, a student classroom learning motivation analysis method provided by the embodiment of the present disclosure includes:
s301: and acquiring a teaching video which is acquired by the audio acquisition equipment and the video acquisition equipment and used for teaching activities in the teaching scene.
S302: and analyzing the teaching video to obtain the facial expression of the student.
S303: and evaluating emotional feedback information expressed by the students in the teaching activities according to the facial expressions of the students.
S304: and determining the result of analyzing the learning motivation of the student on the teaching activity based on the emotional feedback information.
The descriptions of steps S301 to S304 may refer to the descriptions of steps S201 to S204; the same technical effects can be achieved and the same technical problems solved, which are not repeated here.
S305: and determining a teaching quality evaluation result for a teacher in the teaching activity based on the learning motivation analysis result of the student on the teaching activity.
In this step, in order to help the teacher intuitively understand the students' learning motivation, the teaching quality evaluation result for the teacher may be obtained by collating the results of analyzing the students' learning motivation for the teaching activity. Based on the teaching quality evaluation result, teachers can more easily supervise the teaching process and teaching efficiency can be improved.
S306: and generating prompt information based on the teaching quality evaluation result, and feeding the prompt information back to the teacher.
In this step, in order to facilitate the teacher to check the teaching condition, prompt information may be generated for the teaching quality evaluation result, and the prompt information is fed back to the teacher, so that the teacher can adaptively adjust the teaching method and the teaching progress for the teaching quality evaluation result.
Further, for each teaching activity, a teaching quality evaluation result may be generated for that teaching activity. The teaching quality evaluation result may convert the students' learning motivation analysis results into a data visualization and present them as a classroom report, which may include a classroom timeline, a classroom migration matrix and classroom behavior proportions: the classroom timeline may be used to describe changes in the students' facial expressions during the teaching activity, the classroom migration matrix may be used to describe the teaching structure of the teaching activity, and the classroom behavior proportions may be used to describe the overall structure of the teaching activity.
Furthermore, classroom reports for the same class, the same subject, the same grade and so on may be integrated into, for example, a class report, a subject report or a grade report, so that teachers can intuitively understand the teaching situation from different angles.
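As one possible illustration of such a report, the sketch below aggregates per-slice expressions into a timeline, behavior proportions and a simple transition count standing in for the classroom migration matrix; the data layout and field names are assumptions, not structures defined by this disclosure.

```python
# Build a rough per-lesson classroom report from per-slice expression lists.
from collections import Counter
from typing import Dict, List

def build_classroom_report(slices: List[Dict]) -> Dict:
    # Timeline: the dominant student expression in each time slice.
    timeline = [(s["t"], Counter(s["expressions"]).most_common(1)[0][0]) for s in slices]
    # Behavior/feedback proportions over the whole lesson.
    all_expr = [e for s in slices for e in s["expressions"]]
    proportions = {k: v / len(all_expr) for k, v in Counter(all_expr).items()}
    # Simple transition counts between consecutive dominant expressions.
    transitions = Counter(
        (timeline[i][1], timeline[i + 1][1]) for i in range(len(timeline) - 1))
    return {"timeline": timeline,
            "behavior_proportion": proportions,
            "migration_matrix": dict(transitions)}

report = build_classroom_report([
    {"t": "00:00", "expressions": ["happiness", "neutrality", "happiness"]},
    {"t": "00:03", "expressions": ["neutrality", "disgust", "neutrality"]},
])
print(report["timeline"])   # [('00:00', 'happiness'), ('00:03', 'neutrality')]
print(report["behavior_proportion"])
```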
The above-described aspects will now be described with reference to specific embodiments.
In order to obtain a teaching video, in some possible embodiments, the obtaining of the teaching video of teaching activities in the teaching scene, which is acquired by the audio acquisition device and the video acquisition device, includes:
determining the audio acquisition equipment and the video acquisition equipment which are arranged in the teaching scene;
acquiring audio data acquired by the audio acquisition equipment and image data acquired by the video acquisition equipment;
screening and denoising collected audio data to obtain processed audio data;
carrying out region division processing on the acquired image data, and extracting to obtain processed image data, wherein the processed image data is student image data for displaying student pictures;
and integrating the processed student image data and the processed audio data to obtain the teaching video.
In this step, in order to obtain the teaching video, the audio acquisition device and the video acquisition device set in the teaching scene may first be determined. It can be understood that the audio acquisition device collects audio data of the teaching activities performed in the teaching scene and the video acquisition device collects image data of those teaching activities, so the audio data collected by the audio acquisition device and the image data collected by the video acquisition device can be acquired. Here, in order to obtain audio data and image data of higher quality that are convenient for subsequent recognition processing, invalid audio may be screened out and noise reduction applied to the screened audio data; for the image data, the images may be divided into regions and the student image data showing the student pictures extracted. The processed student image data and the processed audio data are then integrated to obtain the teaching video.
Optionally, the student image data may include a panoramic image for the student and may also include a close-up image for the student, which is not limited herein.
Furthermore, in response to the operation of the administrator, the operations such as modification or deletion of the teaching video can be performed, and the administrator can also view the processing state and details of the teaching video.
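A minimal sketch of this pre-processing, assuming numpy arrays stand in for the raw audio frames and camera frames and that the student area occupies a fixed region of the frame; a real implementation would operate on the actual media streams and apply proper noise reduction before integration.

```python
# Screen near-silent audio frames, crop the student area out of each camera
# frame, and pair the two streams up as the "teaching video".
import numpy as np

STUDENT_REGION = (slice(200, 720), slice(0, 1280))   # rows, cols of the student area (assumed layout)

def screen_audio(frames: np.ndarray, energy_threshold: float = 0.01) -> np.ndarray:
    # Keep only audio frames whose RMS energy clears the threshold (screening);
    # a real pipeline would also run spectral noise reduction afterwards.
    rms = np.sqrt((frames ** 2).mean(axis=1))
    return frames[rms > energy_threshold]

def extract_student_images(images: np.ndarray) -> np.ndarray:
    # Region division: cut every frame down to the part that shows the students.
    return images[:, STUDENT_REGION[0], STUDENT_REGION[1]]

audio = np.random.randn(100, 1024) * 0.02             # 100 fake audio frames
video = np.zeros((10, 1080, 1920), dtype=np.uint8)    # 10 fake grayscale frames
teaching_video = {"audio": screen_audio(audio), "images": extract_student_images(video)}
print(teaching_video["images"].shape)                 # (10, 520, 1280)
```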
In some possible embodiments, the analyzing the teaching video to obtain the facial expression of the student includes:
aiming at the teaching video, extracting a student video image file from the teaching video;
based on a preset time interval, slicing the student video image file according to the preset time interval to obtain a student image slicing file;
and analyzing the student image fragment file to obtain the facial expression of the student.
In this step, once the teaching video has been determined, it can be understood that the teaching video is the video to be analyzed. In order to analyze it, the images of the teaching video may first be extracted to obtain a student video image file, and the student video image file may then be sliced according to a preset time interval to obtain the sliced student image fragment files, so that the student image fragment files can be recognized to obtain the students' facial expressions.
It should be noted that the specific duration of the preset time interval may be set according to actual conditions, as long as the sliced student image fragment files can still be accurately identified and analyzed; no limitation is imposed here. Illustratively, when the preset time interval is three seconds, the student video image file is sliced every three seconds to obtain the student image fragment files.
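The slicing step could be realized, for example, as in the following OpenCV-based sketch, which groups frames into three-second slices using the stream's frame rate; the input path is hypothetical and the fallback frame rate is an assumption.

```python
# Split a student video image file into fixed-length slices of frames.
import cv2

def slice_video(path: str, interval_s: float = 3.0):
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0           # fall back if fps is unknown
    frames_per_slice = max(int(round(fps * interval_s)), 1)

    slices, current = [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        current.append(frame)
        if len(current) == frames_per_slice:
            slices.append(current)                    # one "student image fragment file"
            current = []
    if current:
        slices.append(current)
    cap.release()
    return slices

slices = slice_video("student_video.mp4")             # hypothetical file
print(f"{len(slices)} slice(s) of ~3 s each")
```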
In some possible embodiments, the analyzing the student image fragment file to obtain the facial expression of the student includes:
extracting at least one face image of a student from the student image fragment file based on the student image fragment file;
and determining the facial expression corresponding to each facial image aiming at least one extracted facial image.
In this step, in the case of determining the student image fragment file, the global image included in the student image fragment file may be determined first, and then the facial image for each student is determined from the global image, respectively, so as to determine the facial expression of the student from each of the facial images.
Further, if the facial image in the student image fragment file is blurred, adjustments such as image enhancement may first be performed on the close-up image of the student, and then the facial expression may be recognized.
Further, if the facial image in the student image fragment file is partially blocked, the close-up image of the student may first be adjusted by local cropping or occlusion handling, and then the facial expression may be recognized.
Optionally, the facial images of the students can be copied in multiple copies, each copy is adjusted by different methods, and then the facial expressions are recognized, so that the accuracy of the facial expression recognition is further ensured.
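The multi-copy idea above might look like the following sketch, in which several differently enhanced copies of a face crop are passed to a (stubbed) expression recognizer and the majority answer is kept; the particular enhancement operations are illustrative choices.

```python
# Run recognition on several adjusted copies of a face crop and keep the
# majority vote; the recognizer itself is a stub to be replaced by a real model.
from collections import Counter
import cv2
import numpy as np

def enhanced_copies(face: np.ndarray):
    sharpen_kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=np.float32)
    return [
        face,                                  # original crop
        cv2.equalizeHist(face),                # contrast adjustment
        cv2.GaussianBlur(face, (3, 3), 0),     # light denoising
        cv2.filter2D(face, -1, sharpen_kernel) # sharpening
    ]

def recognize_expression(face: np.ndarray) -> str:
    # Stub recognizer; plug in any expression classifier (e.g. a CNN) here.
    return "neutrality"

def robust_expression(face: np.ndarray) -> str:
    votes = Counter(recognize_expression(c) for c in enhanced_copies(face))
    return votes.most_common(1)[0][0]

face = np.full((64, 64), 128, dtype=np.uint8)  # fake grayscale face crop
print(robust_expression(face))                 # -> neutrality
```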
In some possible embodiments, the determining, for the extracted at least one facial image, a facial expression corresponding to each facial image includes:
judging whether each facial image can be matched with one sample expression or not based on a plurality of sample expressions stored in a preset expression database;
if the facial image can be matched with one sample expression, determining the corresponding facial expression of the facial image based on the matched sample expression, wherein the facial expression comprises one or more of happiness, surprise, neutrality, disgust, anger and fear.
In this step, for the extracted at least one facial image, a preset expression database storing a plurality of sample expressions may be used. Each facial image is matched against the plurality of sample expressions to obtain the matched sample expression, so as to determine the facial expression corresponding to that facial image, where the facial expression includes one or more of happiness, surprise, neutrality, disgust, sadness, anger and fear.
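As one possible realization of this matching (the disclosure does not fix the metric), the sketch below compares a face image with one reference image per expression label using a mean absolute difference and a distance threshold; the threshold and the toy database are illustrative only.

```python
# Nearest-neighbour matching of a face image against a preset expression database.
import numpy as np

def match_expression(face: np.ndarray, database: dict, max_distance: float = 40.0):
    face = face.astype(np.float32)
    best_label, best_dist = None, float("inf")
    for label, sample in database.items():
        dist = np.abs(face - sample.astype(np.float32)).mean()  # mean absolute difference
        if dist < best_dist:
            best_label, best_dist = label, dist
    # "Can be matched with one sample expression" only if the distance is small enough.
    return best_label if best_dist <= max_distance else None

rng = np.random.default_rng(0)
database = {lbl: rng.integers(0, 256, (48, 48), dtype=np.uint8)
            for lbl in ["happiness", "surprise", "neutrality", "disgust", "anger", "fear"]}
query = database["happiness"].copy()           # a face that should match
print(match_expression(query, database))       # -> happiness
```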
In order to improve the accuracy of the student classroom learning motivation analysis, the teacher-student interaction behaviors in the teaching activity may also be combined into the analysis. In some possible implementations, the determining, based on the emotional feedback information, a result of analyzing the students' learning motivation for the teaching activity includes:
processing the teaching video to obtain at least one teacher action and at least one student action, wherein the teacher action and the student action are in corresponding relation;
integrating to obtain teacher-student interaction behaviors based on the teacher actions and the student actions;
and determining the result of analyzing the learning motivation of the teaching activities by students based on the emotional feedback information and the teacher-student interaction behaviors.
In this step, at least one teacher action and at least one student action may be obtained from the teaching video by means such as parsing. Once the teacher actions and student actions have been determined, the corresponding teacher actions and student actions may be integrated to obtain teacher-student interaction behaviors, so that the result of analyzing the students' learning motivation for the teaching activity is jointly evaluated based on the emotional feedback information and the teacher-student interaction behaviors.
It is to be understood that a teacher action is an action recognized with the teacher as the acting subject, and a student action is an action recognized with the student as the acting subject. A teacher action and a student action may form part of one continuous sequence of actions, and are therefore in a corresponding relationship with each other.
Illustratively, a teacher action of the teacher asking a question and a student action of a student raising a hand may be linked into one continuous sequence of actions, and the two actions correspond to each other.
Therefore, the learning motivation analysis result of the student is determined together according to the emotion feedback information and the teacher-student interaction behaviors, and the accuracy and the integrity of teaching evaluation are further guaranteed.
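A simple way to realize this pairing, assuming each recognized action carries a timestamp, is sketched below; the pairing window and the data layout are assumptions introduced for illustration.

```python
# Pair teacher actions with student actions that follow within a short window,
# yielding teacher-student interaction behaviors.
from typing import Dict, List

def pair_interactions(teacher_actions: List[Dict], student_actions: List[Dict],
                      window_s: float = 10.0) -> List[Dict]:
    interactions = []
    for t_act in teacher_actions:
        for s_act in student_actions:
            if 0 <= s_act["t"] - t_act["t"] <= window_s:
                interactions.append({"teacher": t_act["action"],
                                     "student": s_act["action"],
                                     "t": t_act["t"]})
                break                                  # one student response per teacher action
    return interactions

teacher = [{"t": 120.0, "action": "asks a question"}]
students = [{"t": 123.5, "action": "raises a hand"}]
print(pair_interactions(teacher, students))
# [{'teacher': 'asks a question', 'student': 'raises a hand', 't': 120.0}]
```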
The student classroom learning motivation analysis method provided by the embodiment of the present disclosure is applied to a teaching scene provided with an audio acquisition device and a video acquisition device, and can acquire a teaching video of the teaching activities in the teaching scene collected by the audio acquisition device and the video acquisition device; analyze the teaching video to obtain the students' facial expressions; evaluate, according to the students' facial expressions, the emotional feedback information expressed by the students in the teaching activities; and determine, based on the emotional feedback information, the result of analyzing the students' learning motivation for the teaching activities.
Therefore, the teaching video of the teaching activities in the teaching scene is collected through the audio acquisition equipment and the video acquisition equipment, the students' facial expressions are obtained by analyzing the teaching video, and the facial expressions are evaluated to obtain the emotional feedback information expressed by the students in the teaching activities. Based on the emotional feedback information, the result of analyzing the students' learning motivation for the teaching activities can be effectively determined; furthermore, the students' listening states and degree of understanding in the teaching activities can be obtained by analysis, which helps teachers reflect on and improve their teaching based on the students' learning motivation. This meets the requirements of facial expression recognition in complex teaching activities, with high recognition accuracy and speed, and improves the accuracy of student learning motivation analysis.
It will be understood by those skilled in the art that, in the method of the present invention, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation; the specific order of execution of the steps should be determined by their functions and possible internal logic.
Based on the same inventive concept, the embodiment of the present disclosure further provides a student classroom learning motivation analysis device corresponding to the student classroom learning motivation analysis method, and as the principle of solving the problem of the device in the embodiment of the present disclosure is similar to the student classroom learning motivation analysis method in the embodiment of the present disclosure, the implementation of the device can refer to the implementation of the method, and repeated details are not repeated.
Please refer to fig. 4 and fig. 5: fig. 4 is a schematic diagram of a student classroom learning motivation analysis apparatus provided by an embodiment of the present disclosure, and fig. 5 is a second schematic diagram of the student classroom learning motivation analysis apparatus provided by an embodiment of the present disclosure. As shown in fig. 4, a student classroom learning motivation analysis apparatus 400 provided by an embodiment of the present disclosure includes:
the acquisition module 410 is used for acquiring a teaching video which is acquired by the audio acquisition equipment and the video acquisition equipment and is used for teaching activities in the teaching scene;
the analysis module 420 is configured to perform analysis processing on the teaching video to obtain facial expressions of the students;
the evaluation module 430 is used for evaluating emotional feedback information expressed by the students in the teaching activities according to the facial expressions of the students;
and the judging module 440 is configured to determine a result of analyzing the learning motivation of the student for the teaching activity based on the emotional feedback information.
In an optional implementation manner, the acquisition module 410 is specifically configured to:
determining the audio acquisition equipment and the video acquisition equipment which are arranged in the teaching scene;
acquiring audio data acquired by the audio acquisition equipment and image data acquired by the video acquisition equipment;
screening and denoising collected audio data to obtain processed audio data;
carrying out region division processing on the acquired image data, and extracting to obtain processed image data, wherein the processed image data is student image data for displaying student pictures;
and integrating the processed student image data and the processed audio data to obtain the teaching video.
In an optional implementation manner, the analysis module 420 is specifically configured to:
aiming at the teaching video, extracting a student video image file from the teaching video;
based on a preset time interval, slicing the student video image file according to the preset time interval to obtain a student image slicing file;
and analyzing the student image fragment file to obtain the facial expression of the student.
In an optional implementation manner, when the analysis module 420 is configured to perform parsing processing on the student image fragment file to obtain a facial expression of a student, specifically:
extracting at least one face image of a student from the student image fragment file based on the student image fragment file;
and determining, for the extracted at least one facial image, the facial expression corresponding to each facial image.
In an optional embodiment, when the analyzing module 420 is configured to determine, for the extracted at least one facial image, a facial expression corresponding to each facial image, specifically:
judging whether each facial image can be matched with one sample expression or not based on a plurality of sample expressions stored in a preset expression database;
if the facial image can be matched with one sample expression, determining the corresponding facial expression of the facial image based on the matched sample expression, wherein the facial expression comprises one or more of happiness, surprise, neutrality, disgust, anger and fear.
In an alternative embodiment, the evaluation module 430 is specifically configured to:
aiming at one or more facial expressions of happiness, surprise and neutrality, the emotional feedback information expressed by the students in the teaching activities is evaluated as positive information;
and evaluating the emotional feedback information presented by the students in the teaching activities into negative information aiming at one or more facial expressions of disgust, anger and fear.
In an optional implementation manner, the determining module 440 is specifically configured to:
confirming at least one piece of emotional feedback information of at least one student in the teaching video based on the teaching video;
judging the actual ratio of positive information to negative information in the at least one piece of emotion feedback information based on the at least one piece of emotion feedback information;
based on a predetermined preset ratio, comparing the actual ratio with the preset ratio to obtain a comparison result;
and determining the result of the learning motivation analysis of the teaching activities by the students based on the comparison result.
In an optional implementation manner, the determining module 440 is specifically configured to:
processing the teaching video to obtain at least one teacher action and at least one student action, wherein the teacher action and the student action are in corresponding relation;
integrating to obtain teacher-student interaction behaviors based on the teacher actions and the student actions;
and determining the result of analyzing the learning motivation of the teaching activities by students based on the emotional feedback information and the teacher-student interaction behaviors.
In an alternative embodiment, as shown in fig. 5, the apparatus further comprises a feedback module 450, wherein the feedback module 450 is configured to:
determine, based on the result of the students' learning motivation analysis for the teaching activities, a teaching quality evaluation result for the teacher in the teaching activities;
and generate prompt information based on the teaching quality evaluation result and feed the prompt information back to the teacher (a prompt sketch follows this list).
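The feedback module's behaviour can be sketched as mapping the motivation result to an evaluation and a prompt string; the wording and the motivation labels below are illustrative only, carried over from the ratio sketch above rather than taken from the disclosure.

```python
def generate_prompt(motivation_result: str) -> str:
    """Turn the learning-motivation analysis result into a prompt for the teacher."""
    if motivation_result == "strong motivation":
        return ("Teaching quality evaluation: good. Students showed predominantly "
                "positive emotional feedback during this lesson.")
    return ("Teaching quality evaluation: needs attention. Negative emotional "
            "feedback was frequent; consider adjusting pacing or interaction.")
```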
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
The student classroom learning motivation analysis apparatus provided by the embodiment of the disclosure is applied to a teaching scene provided with an audio acquisition device and a video acquisition device, and can acquire a teaching video, captured by the audio acquisition device and the video acquisition device, of the teaching activities in the teaching scene; parse the teaching video to obtain the facial expressions of the students; evaluate, according to the facial expressions of the students, the emotional feedback information expressed by the students in the teaching activities; and determine, based on the emotional feedback information, the result of the students' learning motivation analysis for the teaching activities.
In this way, the teaching video of the teaching activities in the teaching scene is acquired through the audio acquisition device and the video acquisition device, the facial expressions of the students are obtained by parsing the teaching video, and the facial expressions are evaluated to obtain the emotional feedback information expressed by the students in the teaching activities. Based on the emotional feedback information, the result of the students' learning motivation analysis for the teaching activities can be effectively determined; furthermore, the listening states and the degree of understanding of the students in the teaching activities can be analysed, which facilitates teaching reflection and improvement by teachers on the basis of the students' learning motivation. The scheme meets the requirement of facial expression recognition in complex teaching activities, with high recognition accuracy and speed, and improves the accuracy of student learning motivation analysis.
Corresponding to the student classroom learning motivation analysis method in fig. 2 and fig. 3, an embodiment of the present disclosure further provides an electronic device 600. As shown in fig. 6, which is a schematic structural diagram of the electronic device 600 provided by an embodiment of the present disclosure, the electronic device 600 includes:
a processor 610, a memory 620 and a bus 630. The memory 620 is used for storing execution instructions and includes an internal memory 621 and an external storage 622. The internal memory 621 is used for temporarily storing operation data in the processor 610 and data exchanged with the external storage 622, such as a hard disk; the processor 610 exchanges data with the external storage 622 through the internal memory 621. When the electronic device 600 runs, the processor 610 communicates with the memory 620 through the bus 630, so that the processor 610 executes the steps of the student classroom learning motivation analysis method.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the electronic device 600. In other embodiments of the present application, the electronic device 600 may include more or fewer components than illustrated, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The disclosed embodiments also provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the student classroom learning motivation analysis method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The embodiments of the present disclosure further provide a computer program product, where the computer program product includes computer instructions, and when the computer instructions are executed by a processor, the steps of the student classroom learning motivation analysis method in the above method embodiments may be executed.
The computer program product may be implemented by hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium; in another alternative embodiment, the computer program product is embodied in a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus described above may refer to the corresponding process in the foregoing method embodiments and is not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical division, and other divisions are possible in actual implementation: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through communication interfaces, and may be electrical, mechanical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art can, within the technical scope of the present disclosure, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions for some of their technical features; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure and shall be covered by it. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A student classroom learning motivation analysis method is applied to a teaching scene provided with an audio acquisition device and a video acquisition device, and comprises the following steps:
acquiring a teaching video which is acquired by the audio acquisition equipment and the video acquisition equipment and used for teaching activities in the teaching scene;
analyzing the teaching video to obtain the facial expression of the student;
according to the facial expressions of the students, evaluating emotional feedback information expressed by the students in teaching activities;
and determining the result of analyzing the learning motivation of the student on the teaching activity based on the emotional feedback information.
2. The method of claim 1, wherein the parsing the teaching video to obtain the facial expression of the student comprises:
extracting a student video image file from the teaching video;
slicing the student video image file at a preset time interval to obtain a student image fragment file;
and analyzing the student image fragment file to obtain the facial expression of the student.
3. The method according to claim 2, wherein the parsing the student image fragment file to obtain the facial expression of the student comprises:
extracting at least one face image of a student from the student image fragment file;
and determining, for each of the at least one extracted face image, the corresponding facial expression.
4. The method of claim 3, wherein the determining, for each of the at least one extracted face image, the corresponding facial expression comprises:
judging, based on a plurality of sample expressions stored in a preset expression database, whether each face image matches one of the sample expressions;
and if the face image matches a sample expression, determining the facial expression corresponding to the face image based on the matched sample expression, wherein the facial expression comprises one or more of happiness, surprise, neutrality, disgust, anger and fear.
5. The method of claim 4, wherein the step of evaluating emotional feedback information presented by the student in the teaching activity according to the facial expression of the student comprises:
for one or more of the facial expressions of happiness, surprise and neutrality, evaluating the emotional feedback information expressed by the students in the teaching activities as positive information;
and for one or more of the facial expressions of disgust, anger and fear, evaluating the emotional feedback information expressed by the students in the teaching activities as negative information.
6. The method of claim 5, wherein determining a result of student analysis of learning motivation for the teaching activity based on the emotional feedback information comprises:
confirming at least one piece of emotional feedback information of at least one student in the teaching video based on the teaching video;
determining, based on the at least one piece of emotional feedback information, the actual ratio of positive information to negative information therein;
comparing the actual ratio with a predetermined preset ratio to obtain a comparison result;
and determining the result of the learning motivation analysis of the teaching activities by the students based on the comparison result.
7. The method of claim 6, wherein determining a result of student analysis of learning motivation for the teaching activity based on the emotional feedback information comprises:
processing the teaching video to obtain at least one teacher action and at least one student action, wherein the teacher actions and the student actions are in a corresponding relationship;
integrating the teacher actions and the student actions to obtain teacher-student interaction behaviors;
and determining, based on the emotional feedback information and the teacher-student interaction behaviors, the result of the students' learning motivation analysis for the teaching activities.
8. A student classroom learning motivation analysis device, characterized in that the device is applied to a teaching scene provided with an audio acquisition device and a video acquisition device, and comprises:
the acquisition module is used for acquiring a teaching video which is acquired by the audio acquisition equipment and the video acquisition equipment and used for teaching activities in the teaching scene;
the analysis module is used for analyzing the teaching video to obtain the facial expression of the student;
the evaluation module is used for evaluating emotion feedback information expressed by the students in the teaching activities according to the facial expressions of the students;
and the judging module is used for determining the result of analyzing the learning motivation of the student on the teaching activity based on the emotion feedback information.
9. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the student classroom learning motivation analysis method of any of claims 1-7.
10. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the steps of the student classroom learning motivation analysis method as recited in any one of claims 1-7.
CN202111667138.3A 2021-12-31 2021-12-31 Student classroom learning motivation analysis method, device, equipment and storage medium Pending CN114332719A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111667138.3A CN114332719A (en) 2021-12-31 2021-12-31 Student classroom learning motivation analysis method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114332719A 2022-04-12

Family

ID=81021413

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111667138.3A Pending CN114332719A (en) 2021-12-31 2021-12-31 Student classroom learning motivation analysis method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114332719A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination