CN117095464A - Student classroom learning habit analysis method and system based on image recognition - Google Patents

Student classroom learning habit analysis method and system based on image recognition

Info

Publication number
CN117095464A
Authority
CN
China
Prior art keywords
image
classroom
images
student
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311171959.7A
Other languages
Chinese (zh)
Inventor
廖劲光
唐武雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Logansoft Technology Co ltd
Original Assignee
Guangzhou Logansoft Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Logansoft Technology Co ltd
Priority to CN202311171959.7A
Publication of CN117095464A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 - Movements or behaviour, e.g. gesture recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/06 - Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 - Operations research, analysis or management
    • G06Q10/0639 - Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06398 - Performance of employee with respect to a job function
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • G06Q50/20 - Education
    • G06Q50/205 - Education administration or guidance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/62 - Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757 - Matching configurations of points or features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 - Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 - Character recognition
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Strategic Management (AREA)
  • Multimedia (AREA)
  • Educational Administration (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Development Economics (AREA)
  • Educational Technology (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Evolutionary Computation (AREA)
  • Operations Research (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Game Theory and Decision Science (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Psychiatry (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Social Psychology (AREA)
  • Primary Health Care (AREA)
  • Electrically Operated Instructional Devices (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application relates to a method and system for analyzing students' classroom learning habits based on image recognition. The method comprises: receiving classroom images from a shooting terminal at intervals; sending the received classroom images to an image comparison model in real time, wherein the image comparison model splits each classroom image, based on a preset area division rule for the classroom image, into a plurality of feature images that each represent the classroom state of one student; performing feature comparison between the feature images and a target feature image set prestored in the image comparison model, and judging and outputting the current classroom scene information, wherein the target feature image set contains a plurality of target feature images of different behavior types; judging the behavior type of each feature image; and identifying the pre-bound student information based on the feature image, then binding and archiving the behavior type of the feature image with that student information. The application makes it convenient to supervise students' classroom behavior and thereby to analyze and correct their learning habits.

Description

Student classroom learning habit analysis method and system based on image recognition
Technical Field
The application relates to the technical field of intelligent analysis, in particular to a student classroom learning habit analysis method and system based on image recognition.
Background
Students' classroom behavior largely reflects their classroom learning habits and also affects their academic performance. At present, students' classroom behavior is generally supervised manually by teachers, but a teacher's attention is limited and it is difficult to monitor the behavior of every student while teaching. It is therefore hard to learn about and understand students' classroom learning habits, and hard to analyze and correct each student's learning habits, so there is room for improvement.
Disclosure of Invention
In order to make it easier to supervise students' classroom behavior, and thereby to analyze and correct their learning habits, the application provides a student classroom learning habit analysis method and system based on image recognition.
The first object of the present application is achieved by the following technical solutions:
a student classroom learning habit analysis method based on image recognition comprises the following steps:
receiving classroom images from a shooting terminal at intervals, wherein the shooting terminal is fixed in a classroom to shoot the classroom images of students in class;
Transmitting the received classroom images to an image comparison model in real time, wherein the image comparison model splits the classroom images to obtain a plurality of characteristic images representing the classroom states of each student based on a preset area division rule of the classroom images;
feature comparison is carried out on the feature images and a target feature image set prestored in an image comparison model, current classroom scene information is judged and output, and the target feature image set contains a plurality of target feature images with different behavior types;
judging the behavior type of the feature image based on the current classroom scene information and the comparison result of the feature image and the target feature image;
and identifying pre-bound student information based on the characteristic image, binding and archiving the behavior type of the characteristic image and the student information.
By adopting the above technical solution, several shooting terminals can be installed according to the size of the classroom. Because the shooting terminals are fixed in the classroom, each captured classroom image has a fixed angle, so the classroom image can be divided into regions and the feature image at each student's own position can be split out. The plurality of feature images are then compared by image features; through the feature images continuously received at intervals, it can be judged whether a feature image matches a preset target feature image. A target feature image is an image of a student's classroom behavior, for example taking notes, playing with or spinning a pen, raising a hand to answer a question, lowering the head, not looking at the blackboard, or sleeping with the head down. If a feature image matching such a behavior is recognized, the corresponding student has shown that behavior in class.
Before judging a behavior, the current classroom scene must first be judged from the classroom images, for example whether the teacher is currently lecturing at the podium or the whole class is writing classroom exercises with heads down, so that the behavior type is recognized correctly. Each feature image is then compared against the target feature image set to obtain the behavior of every student in the class, and the behavior is bound to that student's information. For example, it can be recognized which students stare blankly away from the blackboard, play with a pen, or lower the head instead of listening carefully while the teacher is lecturing, or which students lower the head and work on exercises with pen in hand during classroom practice. Intelligent supervision of students' classroom behavior can thus be realized, so that students' learning quality can be evaluated and students can be criticized or encouraged according to their classroom behavior.
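As an informal illustration of the flow just described, the following sketch strings the split, scene judgment, behavior comparison and student binding together. Every name in it, such as crop_region or match_behavior, is a hypothetical placeholder supplied by the caller rather than anything defined in the application, and the scene judgment is simplified to a majority vote over behaviors:

```python
from collections import Counter

def analyze_frame(classroom_image, region_rules, student_map,
                  crop_region, match_behavior, scene_threshold=0.9):
    """region_rules: {region_id: bounding_box}; student_map: {region_id: student name}.
    crop_region and match_behavior are caller-supplied stand-ins for the area
    division rule and the image comparison model."""
    # Split the fixed-angle classroom image into one feature image per seat region.
    feature_images = {rid: crop_region(classroom_image, box)
                      for rid, box in region_rules.items()}
    if not feature_images:
        return None, {}
    # Compare each feature image against the target feature image set.
    behaviors = {rid: match_behavior(img) for rid, img in feature_images.items()}
    # Judge the classroom scene from the behavior shown by most of the class
    # (simplified here: the dominant behavior stands for the scene bound to it).
    top_behavior, top_count = Counter(behaviors.values()).most_common(1)[0]
    scene = top_behavior if top_count / len(behaviors) >= scene_threshold else None
    # Bind each behavior type to the pre-bound student information.
    return scene, {student_map[rid]: b for rid, b in behaviors.items()}
```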
In a preferred example of the present application: the target feature image set comprises a scene feature image set bound with classroom scene information, and the step of comparing the feature images with the target feature image set prestored in the image comparison model and judging and outputting the current classroom scene information comprises the following steps:
Comparing the characteristics of the characteristic images with the characteristics of the target characteristic images in the target characteristic image set;
based on the feature comparison result, if the feature images exceeding the preset quantity proportion are all judged to be the same scene feature image set, the current class scene information is judged to be the class scene information which is bound in advance with the scene feature image set.
By adopting this technical scheme, the number of feature images obtained from one classroom image of a given class is fixed. If feature images exceeding the preset number proportion are judged to belong to the same scene feature image set, for example more than 45 of 50 feature images show students looking at the blackboard after feature comparison, the current classroom scene is judged to be a teacher-lecture scene; if 45 of the 50 feature images show heads-down writing, the current classroom scene is judged to be a classroom exercise scene. Recognition and judgment of the current classroom scene information is thus realized.
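A minimal sketch of this proportion-based scene judgment is shown below; the 90% threshold follows the embodiment described later, while the label names and the scene_bindings mapping are assumptions made for illustration only:

```python
from collections import Counter

def judge_scene(frame_behaviors, scene_bindings, ratio=0.9):
    """frame_behaviors: one behavior label per feature image split from a frame.
    scene_bindings: maps a scene-level behavior (e.g. looking at the blackboard)
    to the classroom scene information it is pre-bound to."""
    if not frame_behaviors:
        return None
    counts = Counter(b for b in frame_behaviors if b in scene_bindings)
    for behavior, n in counts.most_common(1):
        if n / len(frame_behaviors) >= ratio:
            return scene_bindings[behavior]   # e.g. 46 of 50 students agree
    return None

bindings = {"looking_at_blackboard": "teacher_lecture",
            "head_down_writing": "classroom_exercise"}
labels = ["looking_at_blackboard"] * 46 + ["head_down_writing"] * 4
print(judge_scene(labels, bindings))   # -> teacher_lecture
```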
In a preferred example of the present application: the step of judging the behavior type of the feature image based on the current classroom scene information and the comparison result between the feature image and the target feature images comprises the following steps:
Feature comparison is carried out on the feature images and target feature images in the target feature image set, and a plurality of target feature images similar to the feature images are obtained;
screening one target feature image conforming to the classroom scene information from a plurality of target feature images based on the classroom scene information, and identifying the behavior type corresponding to the screened target feature image;
judging whether the behavior types of the obtained target feature images are the same after feature comparison of the feature images of the same splitting area in a plurality of received classroom images in a preset time period;
if the behavior types are the same, taking the behavior type corresponding to the screened target feature image as the behavior type of the feature image.
By adopting this technical scheme, a student's behavior in the feature image is judged in combination with the classroom scene. If the classroom scene information indicates classroom exercises, a head-down pose cannot simply be judged as not listening carefully, because the student may just be looking down at the exercise; the judgment therefore has to combine the classroom scene information. Furthermore, a student's classroom behavior cannot be judged from a single classroom image. For example, the shooting terminal acquires one classroom image every 50 milliseconds; only if the feature images of the same region are compared continuously across all classroom images within a consecutive 1 to 2 seconds, that is, within a preset time period, can it be judged whether the student really shows the same behavior as the target feature image, which makes the behavior type judgment of the feature image more accurate.
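The time-window check described here can be sketched roughly as follows, assuming 50-millisecond frames and a 1-2 second window as in the later embodiment; the function and label names are illustrative, not part of the application:

```python
def confirm_behavior(window_behaviors):
    """window_behaviors: behavior labels judged for the same seat region in the
    consecutive classroom images of the preset time period (e.g. 20-40 frames
    at 50 ms intervals, covering 1-2 seconds)."""
    # Record the behavior only if every frame in the window agrees; a pose that
    # appears in a single frame (e.g. a pen briefly raised) is ignored.
    if window_behaviors and all(b == window_behaviors[0] for b in window_behaviors):
        return window_behaviors[0]
    return None

print(confirm_behavior(["spinning_pen"] * 20))                       # -> spinning_pen
print(confirm_behavior(["spinning_pen"] + ["taking_notes"] * 19))    # -> None
```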
In a preferred example of the present application: the step of identifying the pre-bound student information based on the feature image and binding and archiving the behavior type of the feature image with the student information comprises the following steps:
identifying the identification information bound after the feature images are split based on the region division rule, wherein the identification information of each feature image in the class images is different;
identifying student information pre-bound with the identification information, binding and archiving the behavior types of the feature image with the student information, and binding a plurality of behavior types of each student information according to time sequence.
By adopting this technical scheme, because the angle of the classroom image is fixed, the classroom image can be divided into areas, each area corresponding to one student seat, and identification information is bound to each area, so the name of the student in each area is known. Binding of a student's classroom behavior type to the student's identity information is thereby realized, and each student can be bound to a plurality of behavior types, so that all of the student's behaviors in each lesson, each week, each month or each school term can be known.
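The binding of seat regions, identification information and student information might look roughly like the sketch below; the mapping structure and names are assumptions, and real student records would of course carry more fields:

```python
import datetime

# Hypothetical pre-bound mapping from region identification number to student info.
REGION_TO_STUDENT = {1: {"class": "Class 3", "name": "Student A"},
                     2: {"class": "Class 3", "name": "Student B"}}

archive = {}   # student name -> behavior records in time order

def bind_and_archive(region_id, behavior_type, timestamp=None):
    student = REGION_TO_STUDENT[region_id]
    record = (timestamp or datetime.datetime.now(), behavior_type)
    # Records are appended chronologically, building up the student's periodic history.
    archive.setdefault(student["name"], []).append(record)

bind_and_archive(1, "raising_hand")
bind_and_archive(1, "taking_notes")
print(archive["Student A"])
```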
In a preferred example of the present application: the feature image further comprises a student desktop image, the classroom scene information comprises a classroom practice scene and a teacher lecture scene, and the following steps are executed after the step of identifying the pre-bound student information based on the feature image and binding and archiving the behavior type of the feature image with the student information:
When the classroom scene information is a classroom practice scene, extracting a student desktop image from the characteristic image;
transmitting the desktop images of the students to a question analysis model for monitoring the question making progress of the students;
the method comprises the steps that a question analysis model identifies and extracts exercise text data from a student desktop image, judges the answer progress of a current student based on the exercise text data, and records a time node of the current answer progress;
based on the exercise question text data and the time nodes extracted from the front and the back, outputting question duration data representing the time spent by students for doing each exercise question.
By adopting this technical scheme, the feature image includes the picture of the student and of the student's desk, so when the student is doing classroom exercises, the student's exercise habits and progress can be supervised. From the exercise text data extracted at intervals it can be known whether the student skipped questions while answering and whether the student finished all the exercises. Further, by calculating the time nodes, the time the student spent on each exercise can be computed, so the student's mastery of the knowledge behind different exercises can be analyzed, students who work through exercises efficiently can be distinguished from those who work inefficiently, and targeted tutoring can be given.
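A simplified sketch of the question-progress monitoring is given below. It assumes that a separate OCR and handwriting-detection step has already reduced the desktop image to a list of question numbers with a flag for whether handwriting is present at the answer position; that input format and the function name are assumptions for illustration:

```python
def answer_progress(exercise_items):
    """exercise_items: list of (question_number, has_handwriting_at_answer)
    extracted from one student desktop image."""
    answered = [q for q, done in exercise_items if done]
    # A question is considered skipped if it is unanswered but a later one is done.
    skipped = [q for q, done in exercise_items
               if not done and answered and q < max(answered)]
    return {"answered": answered,
            "skipped": skipped,
            "finished_all": len(answered) == len(exercise_items)}

# Questions 1 and 3 show handwriting, so question 2 was skipped.
print(answer_progress([(1, True), (2, False), (3, True), (4, False)]))
```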
The present application is in a preferred example: after the step of identifying the pre-bound student information based on the characteristic image, binding and archiving the behavior type of the characteristic image and the student information, the following steps are further executed:
acquiring all behavior types bound by each student information in a preset period; counting the occurrence times of different behavior types;
acquiring class discipline information corresponding to the moment based on the time point of the behavior type record; and counting class discipline information with the largest occurrence times of different behavior types.
By adopting this technical scheme, the preset period may be each lesson, each week, each month or each school term. The behavior types bound to each student's information are classified and counted, so each student's learning habits can be known. Feature images showing normal listening and exercising can be filtered out according to behavior type, keeping only the feature images that show a student actively answering questions or not studying properly, which relieves storage pressure. Further, by identifying the class subject information corresponding to the behavior types, it can be determined whether students listen carefully in classes of different subjects, or in which subject inattentive behaviors occur most often; combined with the students' periodic examination results, the behavior types of students with good results and of students with poor results can be analyzed.
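The periodic statistics could be computed along the lines of the sketch below; the record layout and the timetable lookup are assumptions made purely for illustration:

```python
import datetime as dt
from collections import Counter

def summarize(records, timetable):
    """records: list of (timestamp, behavior_type) archived for one student.
    timetable: callable mapping a timestamp to the subject taught at that time."""
    behavior_counts = Counter(b for _, b in records)
    # For each behavior type, find the subject in whose lessons it occurs most often.
    per_subject = {}
    for ts, b in records:
        per_subject.setdefault(b, Counter())[timetable(ts)] += 1
    top_subject = {b: c.most_common(1)[0][0] for b, c in per_subject.items()}
    return behavior_counts, top_subject

timetable = lambda ts: "Maths" if ts.hour == 8 else "Chinese"
recs = [(dt.datetime(2023, 9, 1, 8, 10), "spinning_pen"),
        (dt.datetime(2023, 9, 1, 8, 40), "spinning_pen"),
        (dt.datetime(2023, 9, 1, 9, 5), "raising_hand")]
print(summarize(recs, timetable))
```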
The second object of the present application is achieved by the following technical solutions:
a system for analyzing learning habits of students in a class based on image recognition, comprising:
the classroom image acquisition module is used for receiving classroom images from the shooting terminal at intervals, and the shooting terminal is fixed in the classroom to shoot the classroom images of students when they take class;
the characteristic image splitting module is used for sending the received classroom images to the image comparison model in real time, and the image comparison model splits the classroom images to obtain a plurality of characteristic images representing the classroom states of each student based on the preset area division rules of the classroom images;
the scene recognition module is used for comparing the characteristic images with a target characteristic image set prestored in the image comparison model, judging and outputting current classroom scene information, wherein the target characteristic image set comprises a plurality of target characteristic images with different behavior types;
the behavior judging module is used for judging the behavior type of the characteristic image based on the current classroom scene information and the comparison result of the characteristic image and the target characteristic image;
and the behavior binding module is used for identifying pre-bound student information based on the characteristic image, binding and archiving the behavior type of the characteristic image and the student information.
By adopting the above technical solution, several shooting terminals can be installed according to the size of the classroom. Because the shooting terminals are fixed in the classroom, each captured classroom image has a fixed angle, so the classroom image can be divided into regions and the feature image at each student's own position can be split out. The plurality of feature images are then compared by image features; through the feature images continuously received at intervals, it can be judged whether a feature image matches a preset target feature image. A target feature image is an image of a student's classroom behavior, for example taking notes, playing with or spinning a pen, raising a hand to answer a question, lowering the head, not looking at the blackboard, or sleeping with the head down. If a feature image matching such a behavior is recognized, the corresponding student has shown that behavior in class.
Before judging a behavior, the current classroom scene must first be judged from the classroom images, for example whether the teacher is currently lecturing at the podium or the whole class is writing classroom exercises with heads down, so that the behavior type is recognized correctly. Each feature image is then compared against the target feature image set to obtain the behavior of every student in the class, and the behavior is bound to that student's information. For example, it can be recognized which students stare blankly away from the blackboard, play with a pen, or lower the head instead of listening carefully while the teacher is lecturing, or which students lower the head and work on exercises with pen in hand during classroom practice. Intelligent supervision of students' classroom behavior can thus be realized, so that students' learning quality can be evaluated and students can be criticized or encouraged according to their classroom behavior.
Optionally, the target feature image set includes a scene feature image set bound with classroom scene information, and the scene recognition module includes:
the image and set comparison sub-module is used for comparing the characteristics of the plurality of characteristic images with the characteristics of the target characteristic images in the target characteristic image set;
and the scene judging sub-module is used for judging that the current class scene information is class scene information which is bound in advance by the scene feature image set if the feature images exceeding the preset quantity proportion are all judged to be the same scene feature image set based on the feature comparison result.
The third object of the present application is achieved by the following technical solutions:
a computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the above student classroom learning habit analysis method based on image recognition.
The fourth object of the present application is achieved by the following technical solutions:
a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above student classroom learning habit analysis method based on image recognition.
In summary, the present application includes at least one of the following beneficial technical effects:
1. Through the comparison of feature images, students who stare blankly away from the blackboard, play with pens, keep their heads down and otherwise fail to listen carefully while the teacher is lecturing can be identified, as can students who are carefully working on exercises with pen in hand during classroom practice, realizing intelligent supervision of students' classroom behavior; this achieves the aim of evaluating students' learning habits, so that students showing inattentive behaviors can be criticized and students showing attentive behaviors can be encouraged;
2. For example, if more than 45 of 50 feature images show students looking at the blackboard after feature comparison, the current classroom scene is judged to be a teacher-lecture scene; if 45 of the 50 feature images show heads-down behavior, the current classroom scene is judged to be a classroom exercise scene, realizing the recognition and judgment of the current classroom scene information;
3. The shooting terminal acquires one classroom image every 50 milliseconds, and the feature images in all the classroom images within a consecutive 1 to 2 seconds, that is, within a preset time period, are compared continuously, so it can be judged whether a student really shows the same behavior as the target feature image, making the behavior type judgment of the feature image more accurate;
4. Through the calculation of time nodes, the time a student spends on each exercise can be calculated, the student's mastery of the knowledge behind different exercises can be analyzed, and students who work through exercises efficiently can be distinguished from those who work inefficiently, so that tutoring and correction can be carried out in a targeted way.
Drawings
FIG. 1 is a flow chart of an embodiment of a method for analyzing learning habits of students in a classroom based on image recognition according to the present application;
FIG. 2 is a flowchart showing an implementation of step S40 in a method for analyzing learning habits of students in class based on image recognition according to the present application;
FIG. 3 is a flowchart showing an implementation of the method for analyzing learning habits of students in class based on image recognition according to the present application after step S50;
fig. 4 is a schematic block diagram of a computer device of the present application.
Detailed Description
The application is described in further detail below with reference to fig. 1-4.
In an embodiment, as shown in fig. 1, the application discloses a method for analyzing learning habits of students in a classroom based on image recognition, which specifically comprises the following steps:
s10: receiving classroom images from a shooting terminal at intervals, wherein the shooting terminal is fixed in a classroom to shoot the classroom images of students in class;
In this embodiment, the shooting terminal is a ceiling-mounted camera fixed to the classroom ceiling, which is used for shooting images of the students in the classroom and of their desktops. The ceiling-mounted camera has a communication function and can receive instructions and send stored images. The number of shooting terminals installed is chosen according to the size of the classroom.
The classroom image includes an image picture of each student in the shooting range, the student seat, and the desk top.
The specific time interval of the interval receiving is 50 milliseconds, namely the shooting terminal collects a class image every 50 milliseconds.
Specifically, the classroom image within the shooting range acquired by the shooting terminal is received every 50 milliseconds.
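For illustration only, receiving a classroom image every 50 milliseconds could be as simple as the polling loop below; fetch_frame stands in for whatever interface the shooting terminal actually exposes and is purely an assumption:

```python
import time

def receive_frames(fetch_frame, handle_frame, interval_ms=50, max_frames=None):
    """Poll the shooting terminal every interval_ms milliseconds."""
    count = 0
    while max_frames is None or count < max_frames:
        frame = fetch_frame()    # hypothetical call to the camera terminal
        handle_frame(frame)      # e.g. forward the frame to the image comparison model
        count += 1
        time.sleep(interval_ms / 1000.0)

# Stand-in callables, just to show the shape of the loop.
receive_frames(fetch_frame=lambda: "frame", handle_frame=print, max_frames=3)
```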
S20: transmitting the received classroom images to an image comparison model in real time, wherein the image comparison model splits the classroom images to obtain a plurality of characteristic images representing the classroom states of each student based on a preset area division rule of the classroom images;
in this embodiment, the image comparison model is a trained model for performing image feature comparison on student class behavior and outputting a student behavior type.
The area division rule is set according to angles and pictures of classroom images shot by the shooting terminal, and aims to divide the classroom images so that each student is located in a single image area; the characteristic image comprises an area picture of the whole position of the student, the student desktop and the student. Each characteristic image contains a student and a corresponding desktop.
Specifically, the classroom images received every 50 milliseconds are sent to an image comparison model in real time, after the image comparison model receives the classroom images, the classroom images are divided based on the areas corresponding to the shooting terminals for shooting the classroom images, and feature images of students on the whole class in the classroom are obtained, wherein each feature image comprises an area picture of one student and the corresponding desktop and the integral position of the student.
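Because the camera angle is fixed, the area division rule can in the simplest case be a table of pixel boxes, one per seat; the coordinates below are invented for illustration and a real deployment would calibrate them per camera:

```python
import numpy as np

# Hypothetical area division rule: region id -> (top, bottom, left, right) in pixels.
REGION_RULE = {1: (0, 400, 0, 300), 2: (0, 400, 300, 600)}

def split_classroom_image(image):
    # Crop one feature image per seat region; each crop covers the student,
    # the seat position and the desktop at that position.
    return {rid: image[t:b, l:r] for rid, (t, b, l, r) in REGION_RULE.items()}

crops = split_classroom_image(np.zeros((400, 600)))
print({rid: c.shape for rid, c in crops.items()})   # {1: (400, 300), 2: (400, 300)}
```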
S30: feature comparison is carried out on the feature images and a target feature image set prestored in an image comparison model, current classroom scene information is judged and output, and the target feature image set contains a plurality of target feature images with different behavior types;
in this embodiment, feature contrast, that is, the action features of the student in the image are compared with the action features of the person in the target feature image.
The target feature images in the target feature image set include, but are not limited to, action features or images such as taking notes, playing with or spinning a pen, raising a hand to answer a question, lowering the head, not looking toward the podium and blackboard, lying on the desk with the head down, listening carefully with both hands folded on the desk, whispering to a neighbor, and doing exercises with the head down. There is more than one image for each action; for example, the set may contain several target feature images of spinning or playing with a pen, or several corresponding action features.
The classroom scene information includes, but is not limited to, a scene of teacher lecture, a scene of classroom exercises where class students collectively do classroom exercises, and a scene of discussion of student groups.
Specifically, after the image comparison model receives the feature images, the prestored target feature image set is called, the received feature images are compared with the target feature images in the target feature image set, and the classroom scene to which the current classroom image belongs is judged according to the comparison results of all the feature images.
S40: judging the behavior type of the feature image based on the current classroom scene information and the comparison result of the feature image and the target feature image;
in this embodiment, the behavior types are preset student class behaviors, such as taking notes, playing pens and rotating pens, answering questions by hand, lowering the head, not looking at the direction of a podium blackboard, lowering the head, lying on a desk, stacking two hands on the desk to carefully listen and talk, engaging in the ears, lowering the head to do exercises, and the like.
Specifically, based on the current class scene and the image feature comparison result of the feature image and the target feature image, the preset behavior of the student in the feature image is determined.
S50: and identifying pre-bound student information based on the characteristic image, binding and archiving the behavior type of the characteristic image and the student information.
In this embodiment, the student information includes the student's class, name and similar information. Because the archiving edge node has 2 TB of storage, when storage runs low the feature images that were stored earliest, or that show neither active participation nor inattentive behavior in class, are deleted.
Specifically, based on the position of the characteristic image in the area of the class image, identifying student information pre-bound by the characteristic image, and binding and archiving the behavior type of the characteristic image and the student information.
In one embodiment, the target feature image set includes a scene feature image set bound with classroom scene information, and step S30 includes the steps of:
s31: comparing the characteristics of the characteristic images with the characteristics of the target characteristic images in the target characteristic image set;
s32: based on the feature comparison result, if the feature images exceeding the preset quantity proportion are all judged to be the same scene feature image set, the current class scene information is judged to be the class scene information which is bound in advance with the scene feature image set.
In this embodiment, the scene feature image set refers to a target feature image bound with classroom scene information in the target feature image set, for example, a target feature image bound with a classroom scene of a teacher lecture and a target feature image bound with a classroom practice scene; the preset number ratio may be set manually, in this embodiment 90%.
Specifically, feature comparison is performed on a plurality of feature images and target feature images in a target feature image set, and if feature images exceeding a preset quantity proportion belong to the same scene feature image set in a plurality of feature images obtained by splitting the classroom images, the current classroom scene information is judged to be the classroom scene bound by the scene feature image set.
In one embodiment, referring to fig. 2, step S40 includes:
s41: feature comparison is carried out on the feature images and target feature images in the target feature image set, and a plurality of target feature images similar to the feature images are obtained;
s42: screening one target feature image conforming to the classroom scene information from a plurality of target feature images based on the classroom scene information, and identifying the behavior type corresponding to the screened target feature image;
S43: judging whether the behavior types of the obtained target feature images are the same after feature comparison of the feature images of the same splitting area in a plurality of received classroom images in a preset time period;
s44: if the behavior types are the same, taking the behavior type corresponding to the screened target feature image as the behavior type of the feature image.
In this embodiment, the plurality of similar target feature images do not by themselves determine the classroom scene information. For example, a student may have the head lowered, but the classroom scene for that behavior could be either a classroom practice scene or a teacher lecture scene, and the same pose in different scenes would otherwise cause misjudgment of the student's learning habits, so the classroom scene information must be confirmed as well.
Since the shooting terminal captures classroom images at 50-millisecond intervals, the preset time period is generally set to 1 to 2 seconds. If the behavior types of the target feature images matched to the feature images within the preset time period are all the same, that is, the student is spinning or playing with a pen throughout the 1 to 2 seconds, it is confirmed that the student shows the behavior of playing with a pen and not listening carefully.
Further, if the feature images of the same split area are compared within the preset time period and the resulting behavior types differ within that second, for example the pose appears in only one image within the preset 1 to 2 seconds while the student is actually just raising a pen normally, this shows that the student only resembled the inattentive pen-spinning pose at the instant the image was captured, and it is not recorded as inattentive behavior.
Specifically, the feature images are compared with the target feature images in the target feature image set to obtain a plurality of target feature images similar to each feature image, one target feature image that conforms to the classroom scene information is then screened out from the plurality of target feature images based on the classroom scene information, and it is judged whether the behavior types obtained by feature comparison for the feature images of the same split area in the classroom images received within the preset time period are all the same;
if the behavior types are the same, taking the behavior type corresponding to the screened target feature image as the behavior type of the feature image;
If the behavior types are different, the behavior types of the inappropriately listening class in the preset time period are not recorded.
In one embodiment, step S50 includes the steps of:
identifying the identification information bound after the feature images are split based on the region division rule, wherein the identification information of each feature image in the class images is different;
identifying student information pre-bound with the identification information, binding and archiving the behavior types of the feature image with the student information, and binding a plurality of behavior types of each student information according to time sequence.
In this embodiment, the identification information is a number. Each student's information is bound to a plurality of behavior types in time order. The edge cache has 2 TB of storage; when it is full, the records with the earliest save time are deleted first, or the behavior types that reflect neither active participation in class nor inattentive listening are removed. Active classroom behavior is embodied, for example, in raising a hand to answer questions.
Specifically, the classroom image is split based on the region division rule to obtain a plurality of feature images, each feature image is bound with identification information, and the identification information bound by different feature images in the same classroom image is different. And further identifying student information pre-bound with the identification information to bind and archive the behavior types of the feature images with the student information, and binding a plurality of behavior types by each student information according to the time sequence along with the continuous judgment and output of the behavior types of the feature images to form periodic behavior types of the students.
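The pruning behaviour described for the 2 TB edge store could be sketched as below; the record format, the set of behaviors treated as worth keeping, and the record limit are all assumptions for illustration:

```python
# Behaviors treated as salient (active participation or inattention); illustrative only.
SALIENT = {"raising_hand", "sleeping", "spinning_pen"}

def prune(records, max_records):
    """records: list of (timestamp, behavior_type) for one student, oldest first."""
    # First drop records that show neither active participation nor inattention.
    kept = [r for r in records if r[1] in SALIENT]
    # If still over the limit, drop the oldest records.
    return kept[-max_records:] if len(kept) > max_records else kept

recs = [(1, "taking_notes"), (2, "raising_hand"), (3, "spinning_pen")]
print(prune(recs, max_records=1))   # -> [(3, 'spinning_pen')]
```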
In an embodiment, the feature image further includes a student desktop image, and the classroom scene information includes a classroom practice scene and a teacher lecture scene, referring to fig. 3, the following steps are further executed after step S50:
s51: when the classroom scene information is a classroom practice scene, extracting a student desktop image from the characteristic image;
s52: transmitting the desktop images of the students to a question analysis model for monitoring the question making progress of the students;
s53: the method comprises the steps that a question analysis model identifies and extracts exercise text data from a student desktop image, judges the answer progress of a current student based on the exercise text data, and records a time node of the current answer progress;
s54: based on the exercise question text data and the time nodes extracted from the front and the back, outputting question duration data representing the time spent by students for doing each exercise question.
In this embodiment, the desktop picture in the student desktop image is captured at the viewing angle of the ceiling-mounted camera, looking down from the ceiling.
When the teacher lectures at the podium, students normally raise their heads to listen to the teacher; in a classroom practice scene, students normally look down at the desktop to read the exercises.
The question analysis model can find the outline of the exercise book on the student's desktop and extract the text in it. By identifying whether handwriting is present at the answer position of a question, it can recognize whether the student has finished answering that question, and thereby recognize and judge the answer progress, for example for multiple-choice and fill-in-the-blank questions.
The time the student spends on each question can be obtained by marking the time points of the answer progress, thereby obtaining the student's answering efficiency.
Specifically, when the current classroom scene information is judged to be a classroom practice scene, the student desktop image in each feature image is obtained and sent to the question analysis model for monitoring the student's progress. The question analysis model extracts the text on the exercise book in the student desktop image, judges whether the student has finished answering an exercise by identifying whether handwriting is present at the answer positions in that text, determines the student's answer progress in combination with the numbering of the exercises, and records the time point of each answer progress, so that the time the student spends on each answer can be calculated.
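Given the recorded time nodes of each change in answer progress, the time spent on each exercise follows from simple differences, roughly as in the sketch below; the record layout is an assumption:

```python
def question_durations(progress_nodes):
    """progress_nodes: list of (timestamp_seconds, question_just_completed),
    in the order the completed answers were detected on the desktop image."""
    durations = {}
    prev_ts = None
    for ts, question in progress_nodes:
        if prev_ts is not None:
            # The interval between two progress nodes is attributed to the
            # question completed at the later node.
            durations[question] = ts - prev_ts
        prev_ts = ts
    return durations

# Question 2 took 180 s and question 3 took 240 s after question 1 was finished.
print(question_durations([(0, 1), (180, 2), (420, 3)]))   # -> {2: 180, 3: 240}
```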
In one embodiment, the following steps are further performed after step S50:
S51A: acquiring all behavior types bound by each student information in a preset period; counting the occurrence times of different behavior types;
S52A: acquiring class discipline information corresponding to the moment based on the time point of the behavior type record; and counting class discipline information with the largest occurrence times of different behavior types.
In this embodiment, the preset period includes periods such as each lesson, each day, each week, each month and each school term. By counting the occurrences of different behavior types and combining them with the students' periodic examination results, an association analysis between students' results and their learning habits is realized, and the behavior types of students with good results and of students with poor results can be counted.
By matching the class subject information with the time points at which behavior types were recorded, the classroom situation of each subject can be obtained, and analysis can reveal the classes in which most students participate actively and the classes in which most students do not listen carefully.
It should be understood that the sequence number of each step in the foregoing embodiment does not mean that the execution sequence of each process should be determined by the function and the internal logic, and should not limit the implementation process of the embodiment of the present application.
In an embodiment, a system for analyzing students' classroom learning habits based on image recognition is provided, which corresponds one to one to the method for analyzing students' classroom learning habits based on image recognition in the above embodiments. The system for analyzing students' classroom learning habits based on image recognition comprises the following modules:
the classroom image acquisition module is used for receiving classroom images from the shooting terminal at intervals, and the shooting terminal is fixed in the classroom to shoot the classroom images of students when they take class;
the characteristic image splitting module is used for sending the received classroom images to the image comparison model in real time, and the image comparison model splits the classroom images to obtain a plurality of characteristic images representing the classroom states of each student based on the preset area division rules of the classroom images;
the scene recognition module is used for comparing the characteristic images with a target characteristic image set prestored in the image comparison model, judging and outputting current classroom scene information, wherein the target characteristic image set comprises a plurality of target characteristic images with different behavior types;
the behavior judging module is used for judging the behavior type of the characteristic image based on the current classroom scene information and the comparison result of the characteristic image and the target characteristic image;
And the behavior binding module is used for identifying pre-bound student information based on the characteristic image, binding and archiving the behavior type of the characteristic image and the student information.
Optionally, the target feature image set includes a scene feature image set bound with classroom scene information, and the scene recognition module includes:
the image and set comparison sub-module is used for comparing the characteristics of the plurality of characteristic images with the characteristics of the target characteristic images in the target characteristic image set;
and the scene judging sub-module is used for judging that the current class scene information is class scene information which is bound in advance by the scene feature image set if the feature images exceeding the preset quantity proportion are all judged to be the same scene feature image set based on the feature comparison result.
Optionally, the behavior determination module includes:
the approximate image acquisition sub-module is used for comparing the characteristic image with the target characteristic image in the target characteristic image set to acquire a plurality of target characteristic images approximate to the characteristic image;
the target feature image determining sub-module is used for screening one target feature image which accords with the classroom scene information from a plurality of target feature images based on the classroom scene information, and identifying the behavior type corresponding to the screened target feature image;
The behavior type consistency judging sub-module is used for judging whether the behavior types of the obtained target feature images are the same after feature comparison of the feature images of the same splitting area in a plurality of received classroom images in a preset time period;
and the behavior type same submodule is used for taking the behavior type corresponding to the screened target characteristic image as the behavior type of the characteristic image if the behavior types are the same.
Optionally, the behavior binding module includes:
the identification sub-module is used for identifying the identification information bound after the characteristic images are split based on the region division rule, and the identification information of each characteristic image in the classroom images is different from each other;
and the binding sub-module is used for identifying student information pre-bound with the identification information, binding and archiving the behavior types of the characteristic image with the student information, and binding a plurality of behavior types of each student information according to time sequence.
Optionally, the method further comprises:
the desktop image acquisition module is used for extracting desktop images of students from the characteristic images when the classroom scene information is a classroom practice scene;
the question analysis module is used for sending the desktop image of the student to a question analysis model for monitoring the question making progress of the student;
The progress judging module is used for identifying and extracting exercise problem text data from the desktop images of the students by the exercise problem analysis model, judging the answer progress of the current students based on the exercise problem text data, and recording the time node of the current answer progress;
the time length calculation module is used for outputting the exercise time length data representing the time spent by students for doing exercise each time based on the exercise text data extracted from front and back and the time nodes.
Optionally, the method further comprises:
the behavior statistics module is used for acquiring all behavior types bound by each student information in a preset period; counting the occurrence times of different behavior types;
the class discipline statistics module is used for acquiring class discipline information corresponding to the moment based on the time point of the behavior type record; and counting class discipline information with the largest occurrence times of different behavior types.
For specific limitations of the system for analyzing students' classroom learning habits based on image recognition, reference may be made to the limitations of the method for analyzing students' classroom learning habits based on image recognition described above, which are not repeated here. All or part of the modules in the above system may be implemented in software, hardware or a combination thereof. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 4. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and the computer programs in the non-volatile storage medium. The database of the computer device is used for storing classroom images, feature images, image comparison models, behavior types, target feature image sets, classroom scene information and the like. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements a method for analyzing students' classroom learning habits based on image recognition.
In one embodiment, a computer device is provided, including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing a method for analyzing learning habits of students based on image recognition when executing the computer program;
In one embodiment, a computer readable storage medium having a computer program stored thereon is provided, which when executed by a processor implements a method for analyzing learning habits of students in a class based on image recognition.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), memory bus direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A student classroom learning habit analysis method based on image recognition, characterized in that the method comprises the following steps:
receiving classroom images from a shooting terminal at intervals, wherein the shooting terminal is fixed in a classroom to shoot the classroom images of students in class;
transmitting the received classroom images to an image comparison model in real time, wherein the image comparison model splits each classroom image, based on a preset area division rule of the classroom image, to obtain a plurality of feature images representing the classroom state of each student;
performing feature comparison between the feature images and a target feature image set prestored in the image comparison model, and determining and outputting current classroom scene information, wherein the target feature image set contains a plurality of target feature images of different behavior types;
determining the behavior type of each feature image based on the current classroom scene information and the comparison result between the feature image and the target feature images; and
identifying pre-bound student information based on the feature image, and binding and archiving the behavior type of the feature image with the student information.
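For illustration only, the following is a minimal Python sketch of the pipeline described in claim 1. The class names, the split_by_regions, match_scene, and match_behavior helpers, and the dictionary-based archive are hypothetical placeholders, not the patented implementation.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class FeatureImage:
    region_id: str          # identification info bound to a split area
    pixels: bytes           # placeholder for the cropped image data

@dataclass
class ClassroomArchive:
    # student_id -> time-ordered list of (timestamp, behavior_type)
    records: Dict[str, List[tuple]] = field(default_factory=dict)

    def bind(self, student_id: str, timestamp: float, behavior: str) -> None:
        self.records.setdefault(student_id, []).append((timestamp, behavior))

def analyze_classroom_image(image, region_rules, comparison_model,
                            student_bindings, archive, timestamp):
    """One pass of the claimed pipeline (hypothetical helper names)."""
    # 1. Split the classroom image into per-student feature images.
    feature_images = comparison_model.split_by_regions(image, region_rules)
    # 2. Determine the current classroom scene from all feature images.
    scene = comparison_model.match_scene(feature_images)
    # 3. Determine each feature image's behavior type under that scene,
    #    then bind and archive it to the pre-bound student information.
    for fi in feature_images:
        behavior = comparison_model.match_behavior(fi, scene)
        student_id = student_bindings[fi.region_id]
        archive.bind(student_id, timestamp, behavior)
    return scene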
2. The student classroom learning habit analysis method based on image recognition according to claim 1, characterized in that the target feature image set comprises a scene feature image set bound with classroom scene information, and the step of performing feature comparison between the feature images and the target feature image set prestored in the image comparison model and determining and outputting the current classroom scene information comprises the following steps:
performing feature comparison between the plurality of feature images and the target feature images in the target feature image set; and
based on the feature comparison result, if more than a preset proportion of the feature images are matched to the same scene feature image set, determining that the current classroom scene information is the classroom scene information pre-bound to that scene feature image set.
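As a toy illustration of the proportion-based scene decision in claim 2, the sketch below assumes each feature image has already been matched to the name of its most similar scene feature image set; the 0.6 threshold and the Counter-based vote are illustrative assumptions only.

from collections import Counter
from typing import List, Optional

def decide_scene(matched_scene_sets: List[str],
                 scene_bindings: dict,
                 min_proportion: float = 0.6) -> Optional[str]:
    """Return the pre-bound classroom scene if enough feature images agree."""
    if not matched_scene_sets:
        return None
    scene_set, count = Counter(matched_scene_sets).most_common(1)[0]
    if count / len(matched_scene_sets) > min_proportion:
        return scene_bindings[scene_set]   # e.g. "teacher lecture scene"
    return None

# Example: 7 of 9 students match the "lecture" scene feature image set.
scene = decide_scene(["lecture"] * 7 + ["exercise"] * 2,
                     {"lecture": "teacher lecture scene",
                      "exercise": "classroom practice scene"})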
3. The student classroom learning habit analysis method based on image recognition according to claim 1, characterized in that the step of determining the behavior type of the feature image based on the current classroom scene information and the comparison result between the feature image and the target feature images comprises the following steps:
performing feature comparison between the feature image and the target feature images in the target feature image set to obtain a plurality of target feature images similar to the feature image;
screening, based on the classroom scene information, one target feature image conforming to the classroom scene information from the plurality of similar target feature images, and identifying the behavior type corresponding to the screened target feature image;
determining whether the behavior types obtained by feature comparison of the feature images of the same split area in a plurality of classroom images received within a preset time period are the same; and
if the behavior types are the same, taking the behavior type corresponding to the screened target feature image as the behavior type of the feature image.
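A minimal sketch of the temporal consistency check in claim 3 follows; the function name, the example behavior labels, and the unanimity rule are assumptions for illustration only.

from typing import List, Optional

def confirm_behavior(behaviors_in_window: List[str]) -> Optional[str]:
    """Confirm a behavior type only if every frame in the time window agrees.

    behaviors_in_window holds the behavior type screened for the same split
    area in each classroom image received during the preset time period.
    """
    if behaviors_in_window and len(set(behaviors_in_window)) == 1:
        return behaviors_in_window[0]
    return None   # inconsistent observations: withhold judgement

# Example: the same student's region is judged "writing" in three
# consecutive classroom images, so the behavior is confirmed.
assert confirm_behavior(["writing", "writing", "writing"]) == "writing"
assert confirm_behavior(["writing", "looking away", "writing"]) is None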
4. The student classroom learning habit analysis method based on image recognition according to claim 1, characterized in that the step of identifying pre-bound student information based on the feature image and binding and archiving the behavior type of the feature image with the student information comprises the following steps:
identifying the identification information bound to the feature image after it is split according to the area division rule, wherein the identification information of each feature image in a classroom image is different; and
identifying the student information pre-bound to the identification information, binding and archiving the behavior type of the feature image with the student information, and binding the plurality of behavior types of each piece of student information in time order.
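A small sketch of the region-to-student binding and the time-ordered archive of claim 4; the seat identifiers, student IDs, and bisect-based insertion are illustrative assumptions, not the patented data structure.

import bisect
from collections import defaultdict

# Hypothetical pre-binding of split-area identification info to students.
region_to_student = {"row1_seat3": "student_0421", "row2_seat1": "student_0388"}

# student -> chronologically ordered list of (timestamp, behavior_type)
archive = defaultdict(list)

def bind_and_archive(region_id: str, timestamp: float, behavior: str) -> None:
    """Bind a behavior to the student pre-bound to this split area, keeping
    each student's records in time order even if frames arrive out of order."""
    student = region_to_student[region_id]
    bisect.insort(archive[student], (timestamp, behavior))

bind_and_archive("row1_seat3", 10.0, "writing")
bind_and_archive("row1_seat3", 5.0, "listening")
# archive["student_0421"] == [(5.0, "listening"), (10.0, "writing")]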
5. The student classroom learning habit analysis method based on image recognition according to claim 1, characterized in that the feature image further comprises a student desktop image, the classroom scene information comprises a classroom practice scene and a teacher lecture scene, and the following steps are executed after the step of identifying pre-bound student information based on the feature image and binding and archiving the behavior type of the feature image with the student information:
when the classroom scene information is the classroom practice scene, extracting the student desktop image from the feature image;
transmitting the student desktop image to a question analysis model for monitoring the student's question-answering progress;
identifying and extracting, by the question analysis model, exercise text data from the student desktop image, determining the current answer progress of the student based on the exercise text data, and recording the time node of the current answer progress; and
outputting, based on the exercise text data and the time nodes extracted before and after, question duration data representing the time the student spends on each exercise.
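The question-duration output of claim 5 can be illustrated with the following sketch, which assumes the question analysis model has already reduced each desktop image to a (time node, current question index) pair; the function and variable names are hypothetical.

from typing import Dict, List, Tuple

def question_durations(progress_nodes: List[Tuple[float, int]]) -> Dict[int, float]:
    """Derive per-question time spent from successive (time_node, progress) pairs.

    progress is the index of the exercise the student is currently working on,
    as judged from the exercise text recognized on the desktop image.
    """
    durations: Dict[int, float] = {}
    for (t_prev, q_prev), (t_next, q_next) in zip(progress_nodes, progress_nodes[1:]):
        if q_next != q_prev:
            # The student moved on: attribute the elapsed time to the
            # question that was just completed.
            durations[q_prev] = durations.get(q_prev, 0.0) + (t_next - t_prev)
    return durations

# Example: question 1 took 120 s, question 2 took 180 s.
nodes = [(0.0, 1), (120.0, 2), (300.0, 3)]
print(question_durations(nodes))   # {1: 120.0, 2: 180.0}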
6. The student classroom learning habit analysis method based on image recognition according to claim 1, characterized in that the following steps are further executed after the step of identifying pre-bound student information based on the feature image and binding and archiving the behavior type of the feature image with the student information:
acquiring all behavior types bound to each piece of student information within a preset period, and counting the number of occurrences of the different behavior types; and
acquiring, based on the time point at which each behavior type was recorded, the class discipline information corresponding to that moment, and counting, for each behavior type, the class discipline information under which it occurs most frequently.
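As a minimal illustration of the statistics in claim 6, the sketch below summarizes one student's archived records; the record format, behavior labels, and discipline names are assumptions for illustration only.

from collections import Counter, defaultdict
from typing import List, Tuple

def summarize_habits(records: List[Tuple[str, str, str]]):
    """Summarize one student's records of (time_point, behavior_type, discipline).

    Returns the occurrence count of each behavior type, plus, for each behavior
    type, the discipline (class subject) in which it occurred most often.
    """
    behavior_counts = Counter(behavior for _, behavior, _ in records)
    per_behavior_disciplines = defaultdict(Counter)
    for _, behavior, discipline in records:
        per_behavior_disciplines[behavior][discipline] += 1
    top_discipline = {b: c.most_common(1)[0][0]
                      for b, c in per_behavior_disciplines.items()}
    return behavior_counts, top_discipline

records = [("09:05", "looking away", "math"),
           ("09:20", "looking away", "math"),
           ("10:15", "writing", "english")]
print(summarize_habits(records))
# (Counter({'looking away': 2, 'writing': 1}), {'looking away': 'math', 'writing': 'english'})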
7. A student classroom learning habit analysis system based on image recognition, characterized by comprising:
a classroom image acquisition module, configured to receive classroom images from a shooting terminal at intervals, wherein the shooting terminal is fixed in a classroom to shoot the classroom images of students in class;
a feature image splitting module, configured to transmit the received classroom images to an image comparison model in real time, wherein the image comparison model splits each classroom image, based on a preset area division rule of the classroom image, to obtain a plurality of feature images representing the classroom state of each student;
a scene recognition module, configured to perform feature comparison between the feature images and a target feature image set prestored in the image comparison model, and to determine and output current classroom scene information, wherein the target feature image set contains a plurality of target feature images of different behavior types;
a behavior determination module, configured to determine the behavior type of each feature image based on the current classroom scene information and the comparison result between the feature image and the target feature images; and
a behavior binding module, configured to identify pre-bound student information based on the feature image, and to bind and archive the behavior type of the feature image with the student information.
8. The student classroom learning habit analysis system based on image recognition according to claim 7, characterized in that the target feature image set comprises a scene feature image set bound with classroom scene information, and the scene recognition module comprises:
an image-to-set comparison sub-module, configured to perform feature comparison between the plurality of feature images and the target feature images in the target feature image set; and
a scene determination sub-module, configured to determine, based on the feature comparison result, that the current classroom scene information is the classroom scene information pre-bound to a scene feature image set if more than a preset proportion of the feature images are matched to that scene feature image set.
9. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the student classroom learning habit analysis method based on image recognition according to any one of claims 1 to 6.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the student classroom learning habit analysis method based on image recognition according to any one of claims 1 to 6.
CN202311171959.7A 2023-09-12 2023-09-12 Student classroom learning habit analysis method and system based on image recognition Pending CN117095464A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311171959.7A CN117095464A (en) 2023-09-12 2023-09-12 Student classroom learning habit analysis method and system based on image recognition

Publications (1)

Publication Number Publication Date
CN117095464A true CN117095464A (en) 2023-11-21

Family

ID=88769839

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311171959.7A Pending CN117095464A (en) 2023-09-12 2023-09-12 Student classroom learning habit analysis method and system based on image recognition

Country Status (1)

Country Link
CN (1) CN117095464A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108875606A (en) * 2018-06-01 2018-11-23 重庆大学 A kind of classroom teaching appraisal method and system based on Expression Recognition
CN109461104A (en) * 2018-10-22 2019-03-12 杭州闪宝科技有限公司 Classroom monitoring method, device and electronic equipment
CN109815795A (en) * 2018-12-14 2019-05-28 深圳壹账通智能科技有限公司 Classroom student's state analysis method and device based on face monitoring
CN111914801A (en) * 2020-08-17 2020-11-10 四川创客知佳科技有限公司 Classroom analysis method for intelligent education
CN115223179A (en) * 2021-04-19 2022-10-21 腾讯科技(深圳)有限公司 Classroom teaching data processing method and system based on answer codes
CN115907507A (en) * 2022-10-13 2023-04-04 华中科技大学 Classroom behavior detection and learning situation analysis method for students in combined classroom scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination