CN112597813A - Teaching evaluation method and device and computer readable storage medium - Google Patents


Info

Publication number
CN112597813A
CN112597813A (application CN202011410428.5A)
Authority
CN
China
Prior art keywords
information
behavior
attribute library
real-time
analysis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011410428.5A
Other languages
Chinese (zh)
Inventor
蔡丹丰
金佳
卢尧
张武科
徐瑶洁
吴亦心
李萍
许添杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
College of Science and Technology of Ningbo University
Original Assignee
College of Science and Technology of Ningbo University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by College of Science and Technology of Ningbo University filed Critical College of Science and Technology of Ningbo University
Priority to CN202011410428.5A priority Critical patent/CN112597813A/en
Publication of CN112597813A publication Critical patent/CN112597813A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174: Facial expression recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/06: Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063: Operations research, analysis or management
    • G06Q10/0639: Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06398: Performance of employee with respect to a job function
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • G06Q50/20: Education
    • G06Q50/205: Education administration or guidance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/193: Preprocessing; Feature extraction

Abstract

The embodiment of the invention discloses a teaching evaluation method, a device, and a computer-readable storage medium, applied to a control center that is communicatively connected to an information acquisition device. The method comprises the following steps: performing real-time information acquisition on a specified object through the information acquisition device to obtain real-time collected information, wherein the real-time collected information comprises action characteristic information and biological characteristic information; performing behavior analysis on the action characteristic information according to a behavior characteristic attribute library to obtain behavior information; performing learning state analysis on the biological characteristic information according to a biological characteristic attribute library to obtain learning state information; and performing integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information. The method, device, and storage medium can accurately obtain a teaching evaluation.

Description

Teaching evaluation method and device and computer readable storage medium
Technical Field
The invention relates to the technical field of teaching evaluation, in particular to a teaching evaluation method, teaching evaluation equipment and a computer-readable storage medium.
Background
To continuously improve teaching quality, both a teacher's teaching level and the quality of the class need to be evaluated. In the existing teaching evaluation method, students attend classes on site and score the teacher and the class on paper during or after class; the score sheets are then collected and a designated person tallies the results. However, students are influenced by physiological, psychological, cognitive, and environmental factors while scoring, so the resulting scores cannot accurately yield a teaching evaluation covering the teacher's teaching level and the quality of the class.
Disclosure of Invention
The embodiment of the invention provides a teaching evaluation method, a teaching evaluation device, and a computer-readable storage medium that can accurately obtain a teaching evaluation.
An embodiment of the invention provides a teaching evaluation method applied to a control center, wherein the control center is communicatively connected to an information acquisition device, and the method comprises the following steps: performing real-time information acquisition on a specified object through the information acquisition device to obtain real-time collected information, wherein the real-time collected information comprises action characteristic information and biological characteristic information; performing behavior analysis on the action characteristic information according to a behavior characteristic attribute library to obtain behavior information; performing learning state analysis on the biological characteristic information according to a biological characteristic attribute library to obtain learning state information; and performing integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information.
In one embodiment, the information acquisition device comprises an image acquisition device and a biological characteristic acquisition device, and the specified object comprises a first specified object and a second specified object; correspondingly, the information acquisition equipment acquires real-time information of the designated object to obtain real-time acquisition information, wherein the real-time acquisition information comprises action characteristic information and biological characteristic information, and the method comprises the following steps: acquiring images of the first specified object and the second specified object through the image acquisition equipment to obtain action characteristic information; acquiring real-time biological characteristics of the first specified object and/or the second specified object through the biological characteristic acquisition equipment to obtain biological characteristic information; and integrating the action characteristic information and the biological characteristic information to determine the real-time acquisition information.
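As a concrete illustration of integrating the two streams into one real-time collection record, the following sketch uses simple Python record types. The patent does not specify any data format, so every type and field name here is a hypothetical assumption:

```python
from dataclasses import dataclass

# All record types and field names are hypothetical; the patent does not
# specify how the collected information is structured.

@dataclass
class ActionFeatures:
    frame_id: int        # index of the captured classroom image
    image_path: str      # frame from the image acquisition device

@dataclass
class BiometricFeatures:
    subject_id: str      # which specified object the readings belong to
    eeg: list            # raw brain wave samples
    eye_movement: dict   # e.g. {"state": "open"}
    face_image: str      # cropped face frame for facial emotion analysis

@dataclass
class RealTimeInfo:
    """One integrated real-time collection record (action + biometric)."""
    actions: ActionFeatures
    biometrics: BiometricFeatures

def integrate(actions: ActionFeatures, biometrics: BiometricFeatures) -> RealTimeInfo:
    # Combine the two feature streams into a single record for the control center.
    return RealTimeInfo(actions=actions, biometrics=biometrics)
```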
In one embodiment, the behavior feature attribute library comprises a first behavior feature attribute library, a second behavior feature attribute library, and a third behavior feature attribute library; correspondingly, performing behavior analysis on the action characteristic information according to the behavior feature attribute library to obtain behavior information comprises the following steps: performing image analysis on the action characteristic information according to the first behavior feature attribute library to obtain first behavior information, wherein the first behavior feature attribute library is the behavior feature attribute library corresponding to the first specified object; performing image analysis on the action characteristic information according to the second behavior feature attribute library to obtain second behavior information, wherein the second behavior feature attribute library is the behavior feature attribute library corresponding to the second specified object; performing image analysis on the action characteristic information according to the third behavior feature attribute library to obtain third behavior information, wherein the third behavior feature attribute library is the behavior feature attribute library corresponding to both the first specified object and the second specified object; and performing integrated analysis on the first behavior information, the second behavior information, and the third behavior information to determine the behavior information.
In one embodiment, the biometric attribute library comprises a brain wave attribute library, an eye movement attribute library and a facial attribute library; correspondingly, the learning state analysis is carried out on the biological characteristic information according to the biological characteristic attribute library to obtain the learning state information, and the method comprises the following steps: performing feature analysis on the biological feature information to obtain real-time brain wave information, real-time eye movement information and real-time facial information; performing cranial nerve activity degree analysis on the real-time brain wave information according to the brain wave attribute library to obtain cranial nerve activity degree information; performing eye movement analysis on the real-time eye movement information according to the eye movement attribute library to obtain eye movement analysis information; performing facial emotion analysis on the real-time facial information according to the facial attribute library to obtain facial emotion information; and performing integrated analysis on the cranial nerve activity information, the eye movement analysis information and the facial emotion information to determine the learning state information.
In one embodiment, performing cranial nerve activity analysis on the real-time brain wave information according to the brain wave attribute library to obtain cranial nerve activity information comprises: generating fatigue information under the condition that the real-time brain wave information meets a first preset threshold, wherein the first preset threshold represents the threshold corresponding to the cranial nerve activity of the specified object in a fatigued condition; generating attention information under the condition that the real-time brain wave information meets a second preset threshold, wherein the second preset threshold represents the threshold corresponding to the cranial nerve activity of the specified object in an attentive condition; generating activity information under the condition that the real-time brain wave information meets a third preset threshold, wherein the third preset threshold represents the threshold corresponding to the cranial nerve activity of the specified object in an active condition; and performing integrated analysis on the fatigue information, the attention information, and the activity information to determine the cranial nerve activity information.
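The threshold logic above can be sketched as follows. The patent only states that first, second, and third preset thresholds exist; the numeric values, units, and the function name below are purely illustrative assumptions:

```python
def classify_brain_activity(eeg_value, fatigue_max=4.0, attention_min=8.0, active_min=13.0):
    """Classify one brain-wave activity reading against three preset thresholds.

    The threshold values are illustrative placeholders; the patent does not
    disclose actual values or units for the three preset thresholds.
    """
    labels = []
    if eeg_value <= fatigue_max:                  # first preset threshold: fatigued
        labels.append("fatigue")
    if attention_min <= eeg_value < active_min:   # second preset threshold: attentive
        labels.append("attention")
    if eeg_value >= active_min:                   # third preset threshold: active
        labels.append("active")
    return labels
```

A reading that falls between thresholds (e.g. 6.0 with the defaults) yields no label, which is one possible reading of "meets a preset threshold"; a real system would need the intervals to be specified.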
In an embodiment, the method further comprises: generating a teaching evaluation report according to the teaching evaluation information; and generating a first display instruction and sending the first display instruction to display equipment so as to instruct the display equipment to display the teaching evaluation report.
Another aspect of the embodiments of the present invention provides a teaching evaluation device, which is applied to a control center, where the control center is in communication connection with an information acquisition device, and the device includes: the first acquisition module is used for acquiring real-time information of the specified object through the information acquisition equipment to acquire real-time acquisition information, wherein the real-time acquisition information comprises action characteristic information and biological characteristic information; the second obtaining module is used for performing behavior analysis on the action characteristic information according to the behavior characteristic attribute library to obtain behavior information; the third obtaining module is used for carrying out learning state analysis on the biological characteristic information according to the biological characteristic attribute library to obtain learning state information; and the fourth obtaining module is used for performing integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information.
In one embodiment, the information acquisition device comprises an image acquisition device and a biological characteristic acquisition device, and the specified object comprises a first specified object and a second specified object; accordingly, the first obtaining module comprises: the first obtaining submodule is used for carrying out image acquisition on the first specified object and the second specified object through the image acquisition equipment to obtain action characteristic information; the second obtaining submodule is used for carrying out real-time biological characteristic collection on the first specified object and/or the second specified object through the biological characteristic collecting equipment to obtain biological characteristic information; and the first determining submodule is used for integrating the action characteristic information and the biological characteristic information to determine the real-time acquisition information.
In one embodiment, the behavior feature attribute library comprises a first behavior feature attribute library, a second behavior feature attribute library, and a third behavior feature attribute library; accordingly, the second obtaining module comprises: a third obtaining sub-module, configured to perform image analysis on the action feature information according to the first behavior feature attribute library to obtain first behavior information, where the first behavior feature attribute library is the behavior feature attribute library corresponding to the first specified object; a fourth obtaining sub-module, configured to perform image analysis on the action feature information according to the second behavior feature attribute library to obtain second behavior information, where the second behavior feature attribute library is the behavior feature attribute library corresponding to the second specified object; a fifth obtaining sub-module, configured to perform image analysis on the action feature information according to the third behavior feature attribute library to obtain third behavior information, where the third behavior feature attribute library is the behavior feature attribute library corresponding to both the first specified object and the second specified object; and a second determining sub-module, configured to perform integrated analysis on the first behavior information, the second behavior information, and the third behavior information to determine the behavior information.
In one embodiment, the biometric attribute library comprises a brain wave attribute library, an eye movement attribute library and a facial attribute library; accordingly, the third obtaining module comprises: a sixth obtaining submodule, configured to perform feature analysis on the biological feature information to obtain real-time brain wave information, real-time eye movement information, and real-time facial information; the seventh obtaining submodule is used for carrying out cranial nerve activity degree analysis on the real-time brain wave information according to the brain wave attribute library to obtain cranial nerve activity degree information; the eighth obtaining submodule is used for carrying out eye movement analysis on the real-time eye movement information according to the eye movement attribute library to obtain eye movement analysis information; a ninth obtaining sub-module, configured to perform facial emotion analysis on the real-time facial information according to the facial attribute library, so as to obtain facial emotion information; and the third determining submodule is used for performing integrated analysis on the cranial nerve activity information, the eye movement analysis information and the facial emotion information to determine the learning state information.
In an embodiment, the seventh obtaining sub-module includes: the first generation unit is used for generating fatigue information under the condition that the real-time brain wave information meets a first preset threshold value; the first preset threshold is used for representing a threshold corresponding to the brain nerve activity degree under the fatigue condition of the specified object; the second generation unit is used for generating attention information under the condition that the real-time brain wave information meets a second preset threshold value; the second preset threshold is used for representing a threshold corresponding to the brain nerve activity degree under the attention condition of the specified object; the third generation unit is used for generating active information under the condition that the real-time brain wave information meets a third preset threshold value; the third preset threshold is used for representing a threshold corresponding to the brain nerve activity degree under the active condition of the specified object; the first determining unit is used for performing integration analysis on the fatigue information, the attention information and the activity information to determine the cranial nerve activity information.
In an embodiment, the apparatus further comprises: the first generation module is used for generating a teaching evaluation report according to the teaching evaluation information; and the second generation module is used for generating a first display instruction and sending the first display instruction to the display equipment so as to indicate the display equipment to display the teaching evaluation report.
Embodiments of the present invention also provide a computer-readable storage medium, where the storage medium includes a set of computer-executable instructions which, when executed, perform any one of the teaching evaluation methods described above.
According to the teaching evaluation method, device, and computer-readable storage medium of the embodiments, the control center is communicatively connected to the information acquisition device and receives real-time collected information from it, the real-time collected information comprising action characteristic information and biological characteristic information. The control center performs behavior analysis on the action characteristic information according to the behavior characteristic attribute library to obtain behavior information, performs learning state analysis on the biological characteristic information according to the biological characteristic attribute library to obtain learning state information, and performs integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information. This avoids the inaccurate scoring that arises when students are influenced by physiological, psychological, cognitive, and environmental factors during scoring, so the teaching evaluation is obtained accurately.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present invention will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
FIG. 1 is a schematic diagram of an implementation flow of a teaching evaluation method according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a teaching evaluation method for obtaining real-time collected information according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of behavior information obtained by a teaching evaluation method according to an embodiment of the present invention;
fig. 4 is a schematic flow chart of learning state information obtained by a teaching evaluation method according to an embodiment of the present invention;
FIG. 5 is a schematic flow chart of a teaching evaluation method for obtaining information on cranial nerve activity according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating a teaching evaluation report displayed by a teaching evaluation method according to an embodiment of the present invention;
fig. 7 is a schematic block diagram of a teaching evaluation device according to an embodiment of the present invention.
Detailed Description
To make the objects, features, and advantages of the invention clearer and easier to understand, the technical solutions in the embodiments of the invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are evidently only some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the invention.
Fig. 1 is a schematic flow chart illustrating an implementation of a teaching evaluation method according to an embodiment of the present invention.
Referring to fig. 1, in one aspect, an embodiment of the present invention provides a teaching evaluation method applied to a control center, where the control center is communicatively connected to an information acquisition device, and the method includes: step 101, performing real-time information acquisition on a specified object through the information acquisition device to obtain real-time collected information, where the real-time collected information includes action characteristic information and biological characteristic information; step 102, performing behavior analysis on the action characteristic information according to a behavior characteristic attribute library to obtain behavior information; step 103, performing learning state analysis on the biological characteristic information according to a biological characteristic attribute library to obtain learning state information; and step 104, performing integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information.
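The four steps can be sketched as a simple pipeline. Every callable below is a placeholder standing in for the acquisition device or an attribute-library analysis; the patent does not prescribe an implementation, so the data shapes are assumptions:

```python
def evaluate_teaching(collect, analyze_behavior, analyze_state, integrate):
    """Sketch of steps 101-104; all four callables are assumed placeholders."""
    info = collect()                                # step 101: real-time acquisition
    behavior = analyze_behavior(info["actions"])    # step 102: behavior analysis
    state = analyze_state(info["biometrics"])       # step 103: learning state analysis
    return integrate(state, behavior)               # step 104: integrated evaluation
```

In use, `collect` would wrap the information acquisition device and the two `analyze_*` callables would consult the behavior and biological characteristic attribute libraries.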
The teaching evaluation method provided by the embodiment of the invention is applied to classroom teaching evaluation. In the method, the control center is communicatively connected to the information acquisition device and receives real-time collected information from it, the real-time collected information comprising action characteristic information and biological characteristic information. The control center performs behavior analysis on the action characteristic information according to the behavior characteristic attribute library to obtain behavior information, performs learning state analysis on the biological characteristic information according to the biological characteristic attribute library to obtain learning state information, and performs integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information. This avoids inaccurate scoring caused by students being influenced by physiological, psychological, cognitive, and environmental factors during scoring, so the teaching evaluation is obtained accurately. The method thus solves the problem of inaccurate teaching evaluation caused by class attendees scoring the teaching subjectively.
In the embodiment of the invention, the control center can comprise information analysis equipment and a management platform, wherein the information analysis equipment is used for analyzing the real-time collected information to obtain the behavior information and the learning state information, and the management platform is used for integrating and analyzing the behavior information and the learning state information to obtain the teaching evaluation information.
In step 101, the information acquisition device monitors the specified object in real time to obtain real-time collected data and transmits the data to the control center, so that the control center obtains real-time collected information comprising action characteristic information and biological characteristic information. The real-time collected data may include brain wave (electroencephalogram) data, eye movement data, facial data, motion data, and the like. The brain wave data may be collected in real time by brain wave collection devices to obtain biological characteristic information containing brain wave information; for example, each specified object in a classroom is equipped with one brain wave collection device, each device collects brain wave data from its corresponding specified object in real time and sends the data to the information analysis device in the control center, and the information analysis device generates biological characteristic information containing brain wave information. The eye movement data and facial data may be collected in real time by an image collection device to obtain biological characteristic information containing eye movement information and facial information; for example, the image collection device may be a camera, several cameras may be installed in a classroom to capture real-time images of each specified object, and the information analysis device generates the biological characteristic information containing eye movement information and facial information after receiving the image data. The motion data may likewise be collected in real time by an image collection device to obtain the action characteristic information; for example, several cameras may capture real-time images of the whole classroom, and the information analysis device generates the action characteristic information after receiving the image data.
In step 102, the behavior characteristic attribute library provides an attribute library matched against the action characteristic information, and an operator may adjust the behavior characteristic attribute library in the control center according to the actual situation. The control center performs behavior analysis on the action characteristic information according to the behavior characteristic attribute library to obtain behavior information, which may include interaction behaviors of a specified object such as raising a hand and communicating, a clapping behavior of a specified object, a sleeping behavior of a specified object, and the like. In one implementable case, the information analysis device in the control center performs the behavior analysis on the action characteristic information according to the behavior characteristic attribute library. For example, the behavior characteristic attribute library may be an image attribute library and the action characteristic information may be represented by images; the information analysis device compares the images against the image attribute library and selects images containing behaviors such as a specified object raising a hand and communicating, clapping, or sleeping as the behavior information.
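A minimal sketch of matching captured action frames against a behavior attribute library follows. Frames and library entries are reduced here to sets of feature tags; a real system would use image recognition, so this representation and the function name are assumptions:

```python
def match_behaviors(action_frames, behavior_attribute_library):
    """Keep the frames whose feature tags match a behavior in the library.

    `action_frames` is a list of {"id": int, "features": set}; the library
    maps a behavior name to the set of tags that characterize it. Both
    shapes are illustrative stand-ins for real image analysis.
    """
    behavior_info = []
    for frame in action_frames:
        for behavior, signature in behavior_attribute_library.items():
            if signature <= frame["features"]:  # all signature tags present
                behavior_info.append({"frame": frame["id"], "behavior": behavior})
    return behavior_info
```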
In step 103, the biological characteristic attribute library provides an attribute library matched against the biological characteristic information, and the operator may adjust the biological characteristic attribute library in the control center according to the actual situation. The control center performs learning state analysis on the biological characteristic information according to the biological characteristic attribute library to obtain learning state information, which includes information that the specified object is in a fatigued state during learning, in an attentive state during learning, in an excited state during learning, and the like. In one implementable case, the information analysis device in the control center performs the learning state analysis on the biological characteristic information according to the biological characteristic attribute library. For example, the biological characteristic information includes real-time brain wave information, real-time eye movement information, real-time facial information, and the like. The information analysis device analyzes the real-time brain wave information against the biological characteristic attribute library to obtain cranial nerve states such as cranial nerve fatigue, cranial nerve attention, and cranial nerve activity; analyzes the real-time eye movement information to obtain eye movement states such as eyes open and eyes closed; and analyzes the real-time facial information to obtain facial states such as a fatigued, excited, or joyful face. The information analysis device then judges the cranial nerve state, the eye movement state, and the facial state belonging to the same moment to obtain the learning state corresponding to that moment. For example, for a given moment, a cranial nerve state representing cranial nerve fatigue, an eye movement state representing closed eyes, and a facial state representing a fatigued face are obtained; judging these three states together yields learning state information representing that the specified object is fatigued at that moment.
In step 104, the operator sets evaluation criteria in advance, and the control center analyzes and evaluates, according to the evaluation criteria, the learning state information including learning states such as learning fatigue, learning attention, and learning excitement, together with the behavior information including raising hands, clapping hands, sleeping, and the like, to obtain a plurality of evaluation results, which are integrated to determine the teaching evaluation information. The teaching evaluation information can be represented by a score. In one implementable case, after a course is finished, scoring can be carried out in proportion according to preset scoring standards and the scoring items corresponding to each item of information. For example, with a base score of 0, the learning state information over one course is scored: when the learning state information representing learning excitement accounts for 80% of all the learning state information, meeting the scoring standard of 80%, 20 points can be added; and when the learning state information representing learning fatigue accounts for 70% of all the learning state information, meeting the scoring standard of 70%, 20 points can be subtracted. The learning state information at a single moment is also scored: when the learning state information representing learning excitement at that moment accounts for 60% of all the learning state information at that moment, meeting the scoring standard of 60%, 30 points can be added; and when the learning state information representing learning attention at that moment accounts for 65% of all the learning state information at that moment against a scoring standard of 70%, 20 points are added. The behavior information over one course is likewise scored: when the behavior information representing raising hands accounts for 80% of all the behavior information, meeting the scoring standard of 80%, 70 points are added;
and when the behavior information representing sleeping accounts for 10% of all the behavior information, meeting the scoring standard of 10%, 10 points are subtracted. The behavior information at a single moment is also scored: when the behavior information representing hand raising at that moment accounts for 90% of all the behavior information at that moment, meeting the scoring standard of 90%, 80 points can be added; and when the behavior information representing sleeping at that moment accounts for 20% of all the behavior information at that moment, meeting the scoring standard of 20%, 20 points can be subtracted, giving 190 points in total; that is, the teaching evaluation information can be represented by 190 points.
In the method, the information acquisition equipment can further comprise a learning platform. In the learning platform, a specified object can preview before class, answer questions in class, and complete homework after class, and the learning platform sends the corresponding learning platform data, such as pre-class preview data, in-class answers, and after-class homework, to the control center. The control center can take the learning platform data as one of the evaluation items. For example, when the teaching evaluation information is expressed by a score, the learning platform data can be used as an additional evaluation item with its own scoring standard: when the pre-class preview rate in the learning platform data reaches 90%, meeting the scoring standard of 90%, 30 points can be added; and when the in-class answer accuracy rate in the learning platform data is 0%, meeting the scoring standard of 0%, 0 points are added.
In the method, the information acquisition equipment can further comprise an attendance device, which sends the acquired attendance data to the control center. The control center can take the attendance data as one of the evaluation items. For example, when the teaching evaluation information is represented by a score, the attendance data can be taken as an additional evaluation item with its own scoring standard: when the on-time attendance rate reaches 100%, meeting the scoring standard, 20 points can be added.
In an implementable scenario, the control center may include information analysis equipment and a management platform, an operator adjusts the behavior characteristic attribute library and the biological characteristic attribute library in advance, and information acquisition equipment including brain wave acquisition equipment, image acquisition equipment, a learning platform, an attendance device and the like is arranged in a classroom.
First, real-time acquisition information including action characteristic information, biological characteristic information, and the like is obtained, specifically as follows: each specified object can be provided with a corresponding brain wave acquisition device, which acquires the brain waves of the specified object in real time, obtains real-time brain wave data, and sends that data to the information analysis device, generating biological characteristic information containing brain waves; the image acquisition equipment acquires real-time images of the specified object to obtain image data, which is sent to the information analysis equipment to generate biological characteristic information containing eye movement and facial data as well as action characteristic information containing actions; the specified object can preview before class, answer questions in class, and complete homework after class in the learning platform, through which the corresponding learning platform data can be obtained; and attendance data including the on-time attendance rate, late arrival rate, early exit rate, and the like for the class are obtained through the attendance device.
Then, behavior analysis is performed on the action characteristic information according to the behavior characteristic attribute library to obtain behavior information. Specifically, the behavior characteristic attribute library can be an image attribute library and the action characteristic information can be represented by images; the information analysis equipment compares and identifies the images against the image attribute library, and selects images containing actions such as a specified object raising a hand to communicate, a specified object clapping, and a specified object sleeping as the behavior information;
then, learning state analysis is performed on the biological characteristic information according to the biological characteristic attribute library to obtain learning state information. Specifically, the biological characteristic information includes real-time brain wave information, real-time eye movement information, real-time facial information, and the like. The information analysis equipment can analyze the real-time brain wave information for the cranial nerve state according to the biological characteristic attribute library to obtain the cranial nerve state; analyze the real-time eye movement information for the eye movement state according to the biological characteristic attribute library to obtain the eye movement state; and analyze the real-time facial information for the facial state according to the biological characteristic attribute library to obtain the facial state. The information analysis equipment judges the cranial nerve state, the eye movement state, and the facial state belonging to the same moment, and obtains the learning state corresponding to that moment;
and finally, the learning state information and the behavior information are integrated and analyzed to obtain teaching evaluation information, specifically as follows: the teaching evaluation information can be represented by a score, and scoring is carried out according to preset scoring standards and scoring items with a base score of 0. When the learning state information representing learning fatigue in a class accounts for 50% of all the learning state information in the class, meeting the scoring standard of 50%, 20 points are subtracted; when the learning state information representing learning excitement in the class accounts for 30%, meeting the scoring standard of 30%, 30 points are added; when the learning state information representing learning attention in the class accounts for 20%, meeting the scoring standard of 20%, 15 points are added; when the behavior information representing sleeping in the class accounts for 20% of all the behavior information in the class, meeting the scoring standard of 20%, 20 points are subtracted; when the behavior information representing raising hands in the class accounts for 70%, meeting the scoring standard of 70%, 70 points are added; when the pre-class preview rate in the learning platform data is 94%, meeting the scoring standard of 90%, 50 points are added; when the on-time attendance rate in the attendance data is 100%, meeting the scoring standard of 100%, 60 points are added; and when the early exit rate in the attendance data is 100%, meeting the scoring standard of 100%, 60 points are subtracted. The final teaching evaluation score is 125 points.
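The proportion-threshold scoring just described can be sketched as follows. The proportions, standards, and point values are taken from the example in this paragraph; the function and variable names are illustrative assumptions, not part of the disclosed system.

```python
# Minimal sketch of the proportion-threshold scoring described above.
# Each scoring item is (observed proportion, scoring-standard threshold, points);
# points apply only when the observed proportion meets the standard, and
# negative points model deductions such as sleeping or early exit.

def teaching_score(items, base=0):
    """Sum the points of every item whose proportion meets its standard."""
    score = base
    for proportion, standard, points in items:
        if proportion >= standard:
            score += points
    return score

# Values from the worked example above, giving a final score of 125.
items = [
    (0.50, 0.50, -20),  # learning fatigue in the class
    (0.30, 0.30, +30),  # learning excitement
    (0.20, 0.20, +15),  # learning attention
    (0.20, 0.20, -20),  # sleeping
    (0.70, 0.70, +70),  # raising hands
    (0.94, 0.90, +50),  # pre-class preview rate
    (1.00, 1.00, +60),  # on-time attendance rate
    (1.00, 1.00, -60),  # early exit rate
]
print(teaching_score(items))  # → 125
```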
By obtaining the teaching evaluation information in this way, inaccurate scoring caused by physiological, psychological, semantic, environmental, and other factors affecting the students during scoring is avoided, achieving the aim of obtaining an accurate teaching evaluation.
Fig. 2 is a schematic flow chart of a teaching evaluation method for obtaining real-time collected information according to an embodiment of the present invention.
Referring to fig. 2, in an embodiment of the present invention, the information acquisition equipment includes image acquisition equipment and biological characteristic acquisition equipment, and the specified objects include a first specified object and a second specified object. Correspondingly, performing real-time information acquisition on the specified objects through the information acquisition equipment to obtain real-time acquisition information containing action characteristic information and biological characteristic information includes: step 201, acquiring images of the first specified object and the second specified object through the image acquisition equipment to obtain action characteristic information; step 202, performing real-time biological characteristic acquisition on the first specified object and/or the second specified object through the biological characteristic acquisition equipment to obtain biological characteristic information; and step 203, integrating the action characteristic information and the biological characteristic information to determine the real-time acquisition information.
In the embodiment of the present invention, the first specified object may be a student and the second specified object may be a lecturer. For example, in a mathematics class at a school, the first specified object is a student and the second specified object is the teacher teaching mathematics.
In step 201, the image acquisition device acquires images of the whole classroom to obtain image acquisition data covering both the lecturer and the students. The image acquisition device transmits the image acquisition data to the information analysis device in the control center, which selects images containing actions such as raising hands and sleeping; the selected images serve as the action characteristic information.
In step 202, the biological characteristic acquisition equipment may include brain wave acquisition equipment and image acquisition equipment for acquiring data including eye movement, face, and the like. A plurality of image acquisition devices may be provided to ensure real-time image acquisition of the face and eye movement of each specified object in the classroom, and the image acquisition devices capture the lecturer and the students in real time to obtain biological characteristic acquisition data including eye movement and faces. The brain wave acquisition equipment acquires biological characteristic acquisition data containing brain waves in real time for the corresponding specified object. The biological characteristic acquisition equipment transmits the acquired data to the control center, which generates the biological characteristic information, where the biological characteristic information may include information on brain waves, eye movement, and the face.
In step 203, the control center integrates the motion characteristic information and the biological characteristic information to determine real-time acquisition information.
Fig. 3 is a schematic flow chart of behavior information obtained by a teaching evaluation method according to an embodiment of the present invention.
Referring to fig. 3, in the embodiment of the present invention, the behavior characteristic attribute library includes a first behavior characteristic attribute library, a second behavior characteristic attribute library, and a third behavior characteristic attribute library. Correspondingly, performing behavior analysis on the action characteristic information according to the behavior characteristic attribute library to obtain behavior information includes: step 301, performing image analysis on the action characteristic information according to the first behavior characteristic attribute library to obtain first behavior information, where the first behavior characteristic attribute library corresponds to the first specified object; step 302, performing image analysis on the action characteristic information according to the second behavior characteristic attribute library to obtain second behavior information, where the second behavior characteristic attribute library corresponds to the second specified object; step 303, performing image analysis on the action characteristic information according to the third behavior characteristic attribute library to obtain third behavior information, where the third behavior characteristic attribute library corresponds to both the first specified object and the second specified object; and step 304, performing integrated analysis on the first behavior information, the second behavior information, and the third behavior information to determine the behavior information.
In step 301, for the first specified object, the control center performs image analysis on the action characteristic information according to the first behavior characteristic attribute library to obtain the first behavior information, where the first behavior characteristic attribute library can be adjusted by an operator according to the actual situation, and the first behavior information only includes behavior information about the first specified object. For example, the first behavior characteristic attribute library may include a sleep image attribute library; when a student sleeps, the control center performs image recognition and analysis on images containing the student's sleeping action according to the sleep image attribute library, obtaining first behavior information containing the student's sleeping action.
In step 302, for the second specified object, the control center performs image analysis on the action characteristic information according to the second behavior characteristic attribute library to obtain the second behavior information, where the second behavior characteristic attribute library can be adjusted by an operator according to the actual situation, and the second behavior information only includes behavior information about the second specified object. For example, the second behavior characteristic attribute library may include an image attribute library for writing on a blackboard; when the lecturer writes on the blackboard, the control center performs image recognition and analysis on images containing the lecturer's blackboard-writing action according to that image attribute library, obtaining second behavior information containing the lecturer's blackboard-writing action.
In step 303, for the first specified object and the second specified object, the control center performs image analysis on the action characteristic information according to the third behavior characteristic attribute library to obtain the third behavior information, where the third behavior characteristic attribute library can be adjusted by an operator according to the actual situation, and the third behavior information includes behavior information involving both the first specified object and the second specified object. For example, the third behavior characteristic attribute library may include an image attribute library of interactive actions; when an interactive action occurs between the lecturer and a student, the control center performs image recognition and analysis on images containing the interactive action according to that image attribute library, obtaining third behavior information containing the interactive action.
In step 304, the control center performs an integrated analysis on the first behavior information, the second behavior information, and the third behavior information to determine behavior information.
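As a rough illustration of steps 301 through 304, the three attribute libraries can be modeled as sets of recognizable action labels, with each library matched only against frames involving its corresponding object(s). The label-matching stand-in for image recognition, and all names below, are simplifying assumptions rather than the disclosed image-analysis method.

```python
# Sketch of steps 301-304: each behavior characteristic attribute library is
# applied to the action characteristic information of its own specified
# object(s), and the three partial results are merged into the final behavior
# information.  Real image recognition is replaced by label matching.

FIRST_LIB = {"hand_raising", "sleeping"}          # first specified object (student)
SECOND_LIB = {"blackboard_writing", "clapping"}   # second specified object (teacher)
THIRD_LIB = {"interaction"}                       # both objects jointly

def analyze(frames):
    """frames: list of (subjects, action_label) tuples from the image stream."""
    first, second, third = [], [], []
    for subjects, action in frames:
        if subjects == {"student"} and action in FIRST_LIB:
            first.append(action)                  # step 301
        elif subjects == {"teacher"} and action in SECOND_LIB:
            second.append(action)                 # step 302
        elif subjects == {"student", "teacher"} and action in THIRD_LIB:
            third.append(action)                  # step 303
    return first + second + third                 # step 304: integrated analysis

frames = [({"student"}, "hand_raising"),
          ({"teacher"}, "blackboard_writing"),
          ({"student", "teacher"}, "interaction")]
print(analyze(frames))  # → ['hand_raising', 'blackboard_writing', 'interaction']
```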
Fig. 4 is a schematic flow chart of obtaining learning state information in a teaching evaluation method according to an embodiment of the present invention.
Referring to fig. 4, in an embodiment of the present invention, the biological characteristic attribute library includes a brain wave attribute library, an eye movement attribute library, and a face attribute library. Correspondingly, performing learning state analysis on the biological characteristic information according to the biological characteristic attribute library to obtain learning state information includes: step 401, performing characteristic analysis on the biological characteristic information to obtain real-time brain wave information, real-time eye movement information, and real-time facial information; step 402, performing cranial nerve activity analysis on the real-time brain wave information according to the brain wave attribute library to obtain cranial nerve activity information; step 403, performing eye movement analysis on the real-time eye movement information according to the eye movement attribute library to obtain eye movement analysis information; step 404, performing facial emotion analysis on the real-time facial information according to the face attribute library to obtain facial emotion information; and step 405, performing integrated analysis on the cranial nerve activity information, the eye movement analysis information, and the facial emotion information to determine the learning state information.
In step 401, the control center may include a feature analysis device, where the feature analysis device performs feature analysis on the biological feature information according to the brain wave comparison library, the eye movement comparison library, and the face comparison library, determines information corresponding to the brain wave comparison library in the biological feature information as real-time brain wave information, determines information corresponding to the eye movement comparison library in the biological feature information as real-time eye movement information, and determines information corresponding to the face comparison library in the biological feature information as real-time face information.
In step 402, the control center performs cranial nerve activity analysis on the real-time brain wave information according to the brain wave attribute library to obtain cranial nerve activity information, where the brain wave attribute library can be adjusted by an operator according to the actual situation. In one implementable case, when the cranial nerves of the specified object are fatigued, the control center can analyze the cranial nerve fatigue information in the real-time brain wave information according to the brain wave attribute library, thereby obtaining cranial nerve activity information reflecting cranial nerve fatigue.
In step 403, the control center performs eye movement analysis on the real-time eye movement information according to the eye movement attribute library to obtain eye movement analysis information, wherein the eye movement attribute library can be adjusted by an operator according to actual conditions. In one implementation, when the specified object blinks, the control center may analyze information of the blinks in the real-time eye movement information according to the eye movement attribute library to obtain eye movement analysis information about the blinks.
In step 404, the control center performs facial emotion analysis on the real-time facial information according to a facial attribute library to obtain facial emotion information, wherein the facial attribute library can be adjusted by an operator according to actual conditions. In one implementable case, when the specified subject smiles, the control center may analyze information of the smile in the real-time face information according to the face attribute library, thereby obtaining face emotion information about the smile.
In step 405, the control center judges the cranial nerve activity information, eye movement analysis information, and facial emotion information belonging to the same moment, and determines the learning state information representing that the student is in a learning fatigue state, a learning attention state, or a learning excitement state at that moment. For example, when the cranial nerve activity information, the eye movement analysis information, and the facial emotion information at the same moment all represent learning fatigue, the control center determines learning state information representing that the student is in a learning fatigue state at that moment.
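One possible reading of the "same time" integration in step 405 is a timestamp-keyed join of the three information streams. The unanimous-agreement rule and the names used below are illustrative assumptions, since the patent does not specify the fusion rule.

```python
# Sketch of step 405: group cranial nerve activity, eye movement, and facial
# emotion information by timestamp, and emit a learning state for a moment
# when all three streams agree (an assumed, simplified fusion rule).

def fuse(cranial, eye, face):
    """Each argument maps timestamp -> state label, e.g. 'fatigue'."""
    learning_states = {}
    for t in cranial.keys() & eye.keys() & face.keys():
        if cranial[t] == eye[t] == face[t]:
            learning_states[t] = cranial[t]   # unanimous -> learning state
    return learning_states

cranial = {"10:00": "fatigue", "10:01": "excitement"}
eye     = {"10:00": "fatigue", "10:01": "attention"}
face    = {"10:00": "fatigue", "10:01": "excitement"}
print(fuse(cranial, eye, face))  # → {'10:00': 'fatigue'}
```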
Fig. 5 is a schematic flow chart of a teaching evaluation method for obtaining information on the activity of cranial nerves according to an embodiment of the present invention.
Referring to fig. 5, in the embodiment of the present invention, performing brain neural activity analysis on real-time brain wave information according to a brain wave attribute library to obtain brain neural activity information, including: step 501, when the real-time brain wave information meets a first preset threshold value, generating fatigue information, wherein the first preset threshold value is used for representing a threshold value corresponding to the brain nerve activity degree of a specified object under the fatigue condition; step 502, when the real-time brain wave information meets a second preset threshold value, attention information is generated, and the second preset threshold value is used for representing a threshold value corresponding to the brain nerve activity degree of a specified object under the attention condition; step 503, when the real-time brain wave information meets a third preset threshold, generating active information, wherein the third preset threshold is used for representing a threshold corresponding to the brain nerve activity degree of the specified object under the active condition; and step 504, performing integrated analysis on the fatigue information, the attention information and the activity information to determine the information of the brain nerve activity.
In step 501, the real-time brain wave information may be represented by a numerical value, and the fatigue information is generated when the control center determines that the real-time brain wave information satisfies a first preset threshold, where the first preset threshold is used to represent a threshold corresponding to a brain nerve activity level in a fatigue condition of the specified object.
In step 502, the real-time brain wave information may be represented by a numerical value, and the attention information is generated when the control center determines that the real-time brain wave information satisfies a second preset threshold, where the second preset threshold is used to represent a threshold corresponding to a brain nerve activity level in a state of attention of the specified object.
In step 503, the real-time brain wave information may be represented by a numerical value, and the activity information is generated when the control center determines that the real-time brain wave information satisfies a third preset threshold, where the third preset threshold is used to represent a threshold corresponding to the brain nerve activity level in the active condition of the specified object.
In step 504, the control center performs an integration analysis on the fatigue information, the attention information and the activity information to determine the information of the brain nerve activity.
In an implementation case, the first preset threshold is set to be 1-50, the second preset threshold is set to be 50-80, the third preset threshold is set to be 80-100, when the control center judges that the numerical value represented by the real-time brain wave information is between 1-50, fatigue information is generated, when the control center judges that the numerical value represented by the real-time brain wave information is between 50-80, attention information is generated, when the control center judges that the numerical value represented by the real-time brain wave information is between 80-100, active information is generated, and the control center performs integration analysis on the fatigue information, the attention information and the active information to determine the information of the brain nerve activity.
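The threshold logic of steps 501 through 504 can be sketched directly. Because the ranges in the example (1-50, 50-80, 80-100) share their endpoints, the sketch assumes half-open intervals at the shared boundaries; the function name is illustrative.

```python
# Sketch of steps 501-504: classify a real-time brain wave value against the
# three preset thresholds.  Half-open intervals are assumed at the shared
# boundaries 50 and 80, since the example leaves them ambiguous.

def classify_brain_wave(value):
    if 1 <= value < 50:
        return "fatigue"     # first preset threshold: fatigue information
    if 50 <= value < 80:
        return "attention"   # second preset threshold: attention information
    if 80 <= value <= 100:
        return "active"      # third preset threshold: active information
    return "unknown"         # outside the calibrated range

print(classify_brain_wave(30))   # → fatigue
print(classify_brain_wave(65))   # → attention
print(classify_brain_wave(90))   # → active
```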
Fig. 6 is a flowchart illustrating a teaching evaluation method displaying a teaching evaluation report according to an embodiment of the present invention.
Referring to fig. 6, in an embodiment of the present invention, the method further includes: 601, generating a teaching evaluation report according to the teaching evaluation information; step 602, generating a first display instruction and sending the first display instruction to a display device to instruct the display device to display the teaching evaluation report.
In the embodiment of the invention, the control center generates the teaching evaluation report according to the teaching evaluation information. In one implementable case, the control center includes a display screen, and after generating the teaching evaluation report, the control center displays the report on that screen. In another implementable case, the control center is in communication connection with a display device, and the control center generates a first display instruction and sends it to the display device to instruct the display device to display the teaching evaluation report.
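Steps 601 and 602 amount to wrapping the evaluation information in a report and sending a display command to the display device. The field names and message format below are hypothetical, not a protocol disclosed by the patent.

```python
import json

# Sketch of steps 601-602: generate a teaching evaluation report from the
# evaluation information and build the first display instruction for the
# display device.  All field names are illustrative assumptions.

def build_report(course_id, score):
    return {"course_id": course_id, "teaching_evaluation_score": score}

def build_display_instruction(report):
    # The control center would transmit this payload to the display device.
    return json.dumps({"command": "display_report", "report": report})

report = build_report("math-101", 125)
print(build_display_instruction(report))
```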
To facilitate a further understanding of the above implementation, a specific implementation scenario is provided below. In this scenario, the first specified object is a student and the second specified object is a teacher. The operator sets a behavior characteristic attribute library and a biological characteristic attribute library in advance, where the behavior characteristic attribute library can comprise a first behavior characteristic attribute library, a second behavior characteristic attribute library, a third behavior characteristic attribute library, and the like, and the biological characteristic attribute library can comprise a brain wave attribute library, an eye movement attribute library, a face attribute library, and the like. Information acquisition equipment is arranged in the classroom and is in communication connection with the control center; the control center comprises characteristic analysis equipment, the information acquisition equipment comprises image acquisition equipment and biological characteristic acquisition equipment, the biological characteristic acquisition equipment comprises a plurality of brain wave acquisition devices, and each student is provided with one brain wave acquisition device.
First, real-time acquisition information including action characteristic information, biological characteristic information, and the like is acquired. For example, each brain wave acquisition device acquires the brain waves of the corresponding student in real time and sends the acquired brain wave data to the biological nerve data analysis equipment, generating real-time brain wave information; the image acquisition equipment for eye movement and faces acquires images containing the students' eye movement and faces and sends the acquired image data to the eye movement and face data analysis equipment, generating real-time eye movement information and facial information; the image acquisition equipment for actions acquires images containing the actions of the students and the teacher and sends the acquired image data to the action characteristic analysis equipment, generating action characteristic information; learning platform data related to pre-class preview, in-class answering, after-class homework completion, and the like are obtained through the learning platform; attendance data including the on-time attendance rate, late arrival rate, early exit rate, and the like for the class are obtained through the attendance device; and the action characteristic information, the biological characteristic information, the learning platform data, and the attendance data are integrated to determine the real-time acquisition information.
Then, the control center performs behavior analysis on the action characteristic information according to the behavior characteristic attribute library to obtain behavior information. Specifically, for the student, the action characteristic analysis device performs image analysis on the action characteristic information according to the first behavior characteristic attribute library to obtain first behavior information, which may include behavior information such as a student raising a hand or a student sleeping; for example, the device performs image recognition and analysis on image information containing a student's hand-raising action to obtain first behavior information containing that action. For the teacher, the action characteristic analysis device performs image analysis on the action characteristic information according to the second behavior characteristic attribute library to obtain second behavior information, which may include behavior information such as the teacher writing on the blackboard or clapping; for example, the device performs image recognition and analysis on an image of the teacher writing on the blackboard to obtain second behavior information containing the teacher's blackboard-writing action.
For the students and the teacher together, the action characteristic analysis device performs image analysis on the action characteristic information according to the third behavior characteristic attribute library to obtain third behavior information, which comprises information on interactions between students and the teacher; for example, the device performs image recognition and analysis on an image of a teacher-student interaction to obtain third behavior information including that interaction. The action characteristic analysis device then performs integrated analysis on the first behavior information, the second behavior information and the third behavior information to determine the behavior information.
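The three-library dispatch described above can be sketched as follows. The libraries here are plain label sets and the label names are invented for illustration; a real system would hold trained recognition models behind each attribute library.

```python
# Hypothetical attribute libraries, represented as label sets.
STUDENT_LIB = {"raise_hand", "sleep"}               # first behavior library
TEACHER_LIB = {"write_blackboard", "clap"}          # second behavior library
INTERACTION_LIB = {"q_and_a", "group_discussion"}   # third behavior library

def analyze_behavior(detected_labels):
    """Split recognized action labels into first/second/third behavior
    information and merge them into one behavior record."""
    return {
        "student": [l for l in detected_labels if l in STUDENT_LIB],
        "teacher": [l for l in detected_labels if l in TEACHER_LIB],
        "interaction": [l for l in detected_labels if l in INTERACTION_LIB],
    }
```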
Next, the information analysis device performs learning state analysis on the biological characteristic information according to the biological characteristic attribute library to obtain learning state information covering states such as learning fatigue and learning excitement. Specifically, the biological characteristic attribute library comprises a brain wave attribute library, an eye movement attribute library and a face attribute library. The biological nerve data analysis device performs characteristic analysis on the biological characteristic information to obtain real-time brain wave information, real-time eye movement information and real-time face information. It then performs brain nerve activity analysis on the real-time brain wave information according to the brain wave attribute library to obtain brain nerve activity information, performs eye movement analysis on the real-time eye movement information according to the eye movement attribute library to obtain eye movement analysis information, and performs facial emotion analysis on the real-time face information according to the face attribute library to obtain facial emotion information.
The biological nerve data analysis device performs integrated analysis on the brain nerve activity information, the eye movement analysis information and the facial emotion information to determine learning state information covering states such as learning fatigue and learning excitement. For example, when, based on the same biological characteristic information, the device simultaneously outputs brain nerve activity information indicating brain fatigue, eye movement analysis information indicating eye fatigue and facial emotion information indicating facial fatigue, it integrates these results and determines a learning state of learning fatigue.
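The fusion step can be sketched as a vote over the three per-channel results. The patent's example uses unanimous agreement (fatigue from all three channels yields learning fatigue); the majority-vote generalization below is an assumption for illustration.

```python
from collections import Counter

def fuse_learning_state(brain_state, eye_state, face_state):
    """Combine the three per-channel results into one learning state.
    Majority vote is an assumed generalization of the patent's
    unanimous-agreement example."""
    votes = Counter([brain_state, eye_state, face_state])
    state, count = votes.most_common(1)[0]
    # At least two channels must agree; otherwise no state is determined.
    return state if count >= 2 else "undetermined"
```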
Finally, the management platform performs integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information. The teaching evaluation information can be expressed as a score: the management platform adds points for learning state information such as learning excitement, deducts points for learning state information such as learning fatigue, adds points for behavior information such as hand-raising, and sums all points to obtain a teaching score. For example: when learning state information representing learning fatigue accounts for 50% of all learning state information in a class, meeting the 50% scoring criterion, 20 points are deducted; when learning state information representing learning excitement accounts for 30%, meeting the 30% criterion, 30 points are added; when learning state information representing attention accounts for 20%, meeting the 20% criterion, 15 points are added; when behavior information representing sleeping accounts for 20% of all behavior information in the class, meeting the 20% criterion, 20 points are deducted; and when behavior information representing hand-raising accounts for 70%, meeting the 70% criterion, 70 points are added, giving a teaching evaluation score of 75 points. When the after-class homework completion rate in the learning platform data is 94%, meeting the 90% criterion, 30 points are added, raising the score to 105; and when the early-exit rate in the attendance data is 100%, meeting the 100% criterion, 60 points are deducted, for a final teaching evaluation score of 45 points.
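The worked example above can be reproduced with a small rule table. The thresholds and point values come from the example in the text; the rule-table structure itself is an assumption.

```python
# (metric key, threshold that triggers the rule, points applied)
RULES = [
    ("fatigue_share",    0.50, -20),
    ("excited_share",    0.30, +30),
    ("attentive_share",  0.20, +15),
    ("sleep_share",      0.20, -20),
    ("hand_raise_share", 0.70, +70),
    ("homework_rate",    0.90, +30),
    ("early_exit_rate",  1.00, -60),
]

def teaching_score(metrics):
    """Sum the points of every rule whose threshold the class metrics meet."""
    score = 0
    for key, threshold, points in RULES:
        if metrics.get(key, 0.0) >= threshold:
            score += points
    return score

example = {
    "fatigue_share": 0.50, "excited_share": 0.30, "attentive_share": 0.20,
    "sleep_share": 0.20, "hand_raise_share": 0.70,
    "homework_rate": 0.94, "early_exit_rate": 1.00,
}
print(teaching_score(example))  # 45, matching the final score in the example
```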
The control center can also generate a teaching evaluation report according to the teaching evaluation information, containing the teacher's lecture evaluation result, the course scoring result, and the scoring result of how well the students absorbed the content.
Through the method, data such as students' brain waves, eye movements and facial expressions can be monitored in real time, and the students' excitement, attention and brain fatigue toward new knowledge points can be analyzed to judge teaching quality. This enables comprehensive teaching scoring, avoids inaccurate scores caused by physiological, psychological, semantic and environmental factors affecting the teachers during scoring, and achieves the purpose of obtaining an accurate teaching evaluation.
Fig. 7 is a schematic block diagram of a teaching evaluation device according to an embodiment of the present invention.
Referring to fig. 7, another aspect of the embodiment of the present invention provides a teaching evaluation device, which is applied to a control center, where the control center is in communication connection with an information acquisition device, and the device includes: a first obtaining module 701, configured to perform real-time information collection on a specified object through an information collection device, so as to obtain real-time collected information, where the real-time collected information includes action characteristic information and biological characteristic information; a second obtaining module 702, configured to perform behavior analysis on the action feature information according to the behavior feature attribute library, so as to obtain behavior information; a third obtaining module 703, configured to perform learning state analysis on the biological feature information according to the biological feature attribute library, so as to obtain learning state information; a fourth obtaining module 704, configured to perform integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information.
In an embodiment of the present invention, an information acquisition apparatus includes an image acquisition apparatus and a biometric acquisition apparatus, and a specified object includes a first specified object and a second specified object; accordingly, the first obtaining module 701 includes: a first obtaining sub-module 7011, configured to perform image acquisition on the first specified object and the second specified object through an image acquisition device to obtain action feature information; the second obtaining sub-module 7012 is configured to perform real-time biometric acquisition on the first specified object and/or the second specified object by using a biometric acquisition device to obtain biometric information; the first determining sub-module 7013 is configured to integrate the motion characteristic information and the biometric characteristic information to determine real-time collected information.
In the embodiment of the invention, the behavior feature attribute library comprises a first behavior feature attribute library, a second behavior feature attribute library and a third behavior feature attribute library; accordingly, the second obtaining module 702 includes: a third obtaining sub-module 7021, configured to perform image analysis on the action feature information according to the first behavior feature attribute library to obtain first behavior information, where the first behavior feature attribute library is the behavior feature attribute library corresponding to the first specified object; a fourth obtaining sub-module 7022, configured to perform image analysis on the action feature information according to the second behavior feature attribute library to obtain second behavior information, where the second behavior feature attribute library is the behavior feature attribute library corresponding to the second specified object; a fifth obtaining sub-module 7023, configured to perform image analysis on the action feature information according to the third behavior feature attribute library to obtain third behavior information, where the third behavior feature attribute library is the behavior feature attribute library corresponding to the first specified object and the second specified object; the second determining sub-module 7024 is configured to perform integrated analysis on the first behavior information, the second behavior information, and the third behavior information to determine behavior information.
In the embodiment of the invention, the biological characteristic attribute library comprises a brain wave attribute library, an eye movement attribute library and a face attribute library; accordingly, the third obtaining module 703 includes: a sixth obtaining sub-module 7031, configured to perform feature analysis on the biological feature information to obtain real-time brain wave information, real-time eye movement information, and real-time face information; a seventh obtaining sub-module 7032, configured to perform, according to the brain wave attribute library, brain nerve activity analysis on the real-time brain wave information to obtain brain nerve activity information; an eighth obtaining sub-module 7033, configured to perform eye movement analysis on the real-time eye movement information according to the eye movement attribute library, to obtain eye movement analysis information; a ninth obtaining sub-module 7034, configured to perform facial emotion analysis on the real-time facial information according to the facial attribute library, to obtain facial emotion information; and a third determining sub-module 7035, configured to perform integrated analysis on the cranial nerve activity information, the eye movement analysis information, and the facial emotion information, and determine learning state information.
In this embodiment of the present invention, the seventh obtaining sub-module 7032 includes: a first generating unit 70321, configured to generate fatigue information when the real-time brain wave information satisfies a first preset threshold, where the first preset threshold represents the brain nerve activity level corresponding to the fatigue condition of the specified object; a second generating unit 70322, configured to generate attention information when the real-time brain wave information satisfies a second preset threshold, where the second preset threshold represents the brain nerve activity level corresponding to the attention condition of the specified object; a third generating unit 70323, configured to generate active information when the real-time brain wave information satisfies a third preset threshold, where the third preset threshold represents the brain nerve activity level corresponding to the active condition of the specified object; and a first determining unit 70324, configured to perform integrated analysis on the fatigue information, the attention information, and the active information to determine the brain nerve activity information.
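The three-threshold logic can be sketched as mapping a scalar brain activity level to one of the three states. The concrete threshold values below are placeholders, not values from the patent.

```python
# Placeholder thresholds on a normalized 0..1 activity scale (assumed).
FATIGUE_MAX = 0.3    # first preset threshold: at or below -> fatigue
ATTENTION_MAX = 0.7  # second preset threshold: at or below -> attention
# Above ATTENTION_MAX corresponds to the third preset threshold -> active.

def classify_brain_activity(level):
    """Map a brain nerve activity level to fatigue/attention/active
    information, mirroring the three generating units."""
    if level <= FATIGUE_MAX:
        return "fatigue"
    if level <= ATTENTION_MAX:
        return "attention"
    return "active"
```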
In an embodiment of the present invention, the apparatus further includes: the first generating module 705 is used for generating a teaching evaluation report according to the teaching evaluation information; the second generating module 706 is configured to generate a first display instruction and send the first display instruction to the display device, so as to instruct the display device to display the teaching evaluation report.
Embodiments of the present invention also provide a computer-readable storage medium comprising a set of computer-executable instructions which, when executed, perform any one of the teaching evaluation methods described above.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A teaching evaluation method, applied to a control center which is in communication connection with information acquisition equipment, the method comprising the following steps:
acquiring real-time information of the designated object through the information acquisition equipment to obtain real-time acquisition information, wherein the real-time acquisition information comprises action characteristic information and biological characteristic information;
performing behavior analysis on the action characteristic information according to a behavior characteristic attribute library to obtain behavior information;
performing learning state analysis on the biological characteristic information according to a biological characteristic attribute library to obtain learning state information;
and performing integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information.
2. The method according to claim 1, wherein the information acquisition device includes an image acquisition device and a biometric acquisition device, and the specified object includes a first specified object and a second specified object;
correspondingly, the information acquisition equipment acquires real-time information of the designated object to obtain real-time acquisition information, wherein the real-time acquisition information comprises action characteristic information and biological characteristic information, and the method comprises the following steps:
acquiring images of the first specified object and the second specified object through the image acquisition equipment to obtain action characteristic information;
acquiring real-time biological characteristics of the first specified object and/or the second specified object through the biological characteristic acquisition equipment to obtain biological characteristic information;
and integrating the action characteristic information and the biological characteristic information to determine the real-time acquisition information.
3. The method of claim 2, wherein the behavior feature attribute library comprises a first behavior feature attribute library, a second behavior feature attribute library, and a third behavior feature attribute library;
correspondingly, performing behavior analysis on the action characteristic information according to a behavior characteristic attribute library to obtain behavior information, wherein the behavior analysis comprises the following steps:
performing image analysis on the action characteristic information according to the first behavior characteristic attribute library to obtain first behavior information, wherein the first behavior characteristic attribute library is the behavior characteristic attribute library corresponding to the first specified object;
performing image analysis on the action characteristic information according to the second behavior characteristic attribute library to obtain second behavior information, wherein the second behavior characteristic attribute library is the behavior characteristic attribute library corresponding to the second specified object;
performing image analysis on the action characteristic information according to the third behavior characteristic attribute library to obtain third behavior information, wherein the third behavior characteristic attribute library is a behavior characteristic attribute library corresponding to the first specified object and the second specified object;
and performing integration analysis on the first behavior information, the second behavior information and the third behavior information to determine the behavior information.
4. The method of claim 1, wherein the biometric attribute library comprises a brain wave attribute library, an eye movement attribute library, and a facial attribute library;
correspondingly, the learning state analysis is carried out on the biological characteristic information according to the biological characteristic attribute library to obtain the learning state information, and the method comprises the following steps:
performing feature analysis on the biological feature information to obtain real-time brain wave information, real-time eye movement information and real-time facial information;
performing cranial nerve activity degree analysis on the real-time brain wave information according to the brain wave attribute library to obtain cranial nerve activity degree information;
performing eye movement analysis on the real-time eye movement information according to the eye movement attribute library to obtain eye movement analysis information;
performing facial emotion analysis on the real-time facial information according to the facial attribute library to obtain facial emotion information;
and performing integrated analysis on the cranial nerve activity information, the eye movement analysis information and the facial emotion information to determine the learning state information.
5. The method according to claim 4, wherein performing brain neural activity analysis on the real-time brain wave information according to the brain wave attribute library to obtain brain neural activity information comprises:
generating fatigue information under the condition that the real-time brain wave information meets a first preset threshold value, wherein the first preset threshold value is used for representing a threshold value corresponding to the brain nerve activity degree of the specified object under the fatigue condition;
generating attention information under the condition that the real-time brain wave information meets a second preset threshold value, wherein the second preset threshold value is used for representing a threshold value corresponding to the brain nerve activity degree of the specified object under the attention condition;
when the real-time brain wave information meets a third preset threshold value, generating active information, wherein the third preset threshold value is used for representing a threshold value corresponding to the brain nerve activity degree of the specified object under the active condition;
and performing integrated analysis on the fatigue information, the attention information and the activity information to determine the cranial nerve activity information.
6. The method of claim 1, further comprising:
generating a teaching evaluation report according to the teaching evaluation information;
and generating a first display instruction and sending the first display instruction to display equipment so as to instruct the display equipment to display the teaching evaluation report.
7. A teaching evaluation device, applied to a control center which is in communication connection with information acquisition equipment, the device comprising:
the first acquisition module is used for acquiring real-time information of the specified object through the information acquisition equipment to acquire real-time acquisition information, wherein the real-time acquisition information comprises action characteristic information and biological characteristic information;
the second obtaining module is used for performing behavior analysis on the action characteristic information according to the behavior characteristic attribute library to obtain behavior information;
the third obtaining module is used for carrying out learning state analysis on the biological characteristic information according to the biological characteristic attribute library to obtain learning state information;
and the fourth obtaining module is used for performing integrated analysis on the learning state information and the behavior information to obtain teaching evaluation information.
8. The apparatus according to claim 7, wherein the information acquisition apparatus includes an image acquisition apparatus and a biometric acquisition apparatus, and the specified object includes a first specified object and a second specified object;
accordingly, the first obtaining module comprises:
the first obtaining submodule is used for carrying out image acquisition on the first specified object and the second specified object through the image acquisition equipment to obtain action characteristic information;
the second obtaining submodule is used for carrying out real-time biological characteristic collection on the first specified object and/or the second specified object through the biological characteristic collecting equipment to obtain biological characteristic information;
and the first determining submodule is used for integrating the action characteristic information and the biological characteristic information to determine the real-time acquisition information.
9. The apparatus of claim 8, wherein the behavior feature attribute library comprises a first behavior feature attribute library, a second behavior feature attribute library, and a third behavior feature attribute library;
accordingly, the second obtaining module comprises:
a third obtaining sub-module, configured to perform image analysis on the action feature information according to the first behavior feature attribute library, so as to obtain first behavior information, where the first behavior feature attribute library is a behavior feature attribute library corresponding to the first specified object;
a fourth obtaining submodule, configured to perform image analysis on the action feature information according to the second behavior feature attribute library, so as to obtain second behavior information, where the second behavior feature attribute library is a behavior feature attribute library corresponding to the second specified object;
a fifth obtaining sub-module, configured to perform image analysis on the action feature information according to the third behavior feature attribute library, so as to obtain third behavior information, where the third behavior feature attribute library is the behavior feature attribute library corresponding to the first specified object and the second specified object;
and the second determining submodule is used for performing integration analysis on the first behavior information, the second behavior information and the third behavior information to determine the behavior information.
10. A computer-readable storage medium comprising a set of computer-executable instructions that, when executed, perform the teaching evaluation method of any one of claims 1 to 6.
CN202011410428.5A 2020-12-03 2020-12-03 Teaching evaluation method and device and computer readable storage medium Pending CN112597813A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011410428.5A CN112597813A (en) 2020-12-03 2020-12-03 Teaching evaluation method and device and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN112597813A true CN112597813A (en) 2021-04-02

Family

ID=75188562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011410428.5A Pending CN112597813A (en) 2020-12-03 2020-12-03 Teaching evaluation method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112597813A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150002272A1 (en) * 2013-06-06 2015-01-01 Zih Corp. Method and apparatus for associating radio frequency identification tags with participants
CN108805009A (en) * 2018-04-20 2018-11-13 华中师范大学 Classroom learning state monitoring method based on multimodal information fusion and system
CN109461104A (en) * 2018-10-22 2019-03-12 杭州闪宝科技有限公司 Classroom monitoring method, device and electronic equipment
CN110991381A (en) * 2019-12-12 2020-04-10 山东大学 Real-time classroom student state analysis and indication reminding system and method based on behavior and voice intelligent recognition
CN111402096A (en) * 2020-04-03 2020-07-10 广州云从鼎望科技有限公司 Online teaching quality management method, system, equipment and medium
CN111861146A (en) * 2020-06-29 2020-10-30 武汉科技大学 Teaching evaluation and real-time feedback system based on micro-expression recognition


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
蔡丹丰 et al.: "Review and prospects of social crowding research in mobile e-commerce", E-Commerce (《电子商务》), pages 34-35 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113223356A (en) * 2021-05-13 2021-08-06 深圳市技成科技有限公司 Skill training and checking system for PLC control technology
CN113223356B (en) * 2021-05-13 2022-12-13 深圳市技成科技有限公司 Skill training and checking system for PLC control technology
CN113509189A (en) * 2021-07-07 2021-10-19 科大讯飞股份有限公司 Learning state monitoring method and related equipment thereof

Similar Documents

Publication Publication Date Title
CN110334610B (en) Multi-dimensional classroom quantification system and method based on computer vision
Ysseldyke et al. Evaluating students' instructional environments
CN110678935A (en) Interactive adaptive learning and neurocognitive disorder diagnosis system applying face tracking and emotion detection and related methods thereof
JP4631014B2 (en) Electronic teaching material learning support device, electronic teaching material learning support system, electronic teaching material learning support method, and electronic learning support program
CN112597813A (en) Teaching evaluation method and device and computer readable storage medium
CN111814587A (en) Human behavior detection method, teacher behavior detection method, and related system and device
WO2019180652A1 (en) Interactive, adaptive, and motivational learning systems using face tracking and emotion detection with associated methods
CN115177253A (en) Student psychological crisis early warning system based on multi-mode data
Clarke et al. Classification accuracy of easyCBM first-grade mathematics measures: Findings and implications for the field
Pise et al. Estimation of learning affects experienced by learners: an approach using relational reasoning and adaptive mapping
Brooker et al. Improving the assessment of practice teaching: a criteria and standards framework
Gilliam et al. Effects of question difficulty and post-question wait-time on cognitive engagement: A psychophysiological analysis
Irajzad et al. Student socioeconomic status and teacher stroke: A case of female students in Iran
Byrne et al. Leveraging eye tracking in digital classrooms: A step towards multimodal model for learning assistance
CN111611896B (en) Anti-cheating management system for examination
Adams et al. Business students' ranking of reasons for assessment: Gender differences
CN112651602A (en) Classroom mode evaluation method and device
Kuklick et al. Developing physical educators’ knowledge of opaque and transparent technologies and its implications for student learning
Veerabhadrappa et al. Identification and evaluation of effective strategies in a dynamic visual task using eye gaze dynamics
Medina et al. Sensing behaviors of students in online vs. face-to-face lecturing contexts
Ramos et al. Aggregating attention, emotion, and cognition load as basis for developing a rule-based pedagogy for online learner
CN117496787B (en) Six-ability assessment and training system for children
TWI769580B (en) System for judging cognitive dimensions based on brainwaves to arrange classes and method thereof
Swathi et al. Student Engagement Prediction in an E-Learning Platform
Amarasinghe et al. Remotify: The Emergency Remote Learning Solution using Learning Analytics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination