WO2022160842A1 - Method and system for evaluating student collaboration state based on EEG data - Google Patents

Method and system for evaluating student collaboration state based on EEG data

Info

Publication number
WO2022160842A1
WO2022160842A1 (PCT/CN2021/128838)
Authority
WO
WIPO (PCT)
Prior art keywords
learning
classifier
eeg data
data
behavior
Prior art date
Application number
PCT/CN2021/128838
Other languages
English (en)
French (fr)
Inventor
杨宗凯
张立山
陈宇飞
戴志诚
Original Assignee
华中师范大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202110105420.6A (CN112932507A)
Priority claimed from CN202110989581.6A (CN113742396B)
Application filed by 华中师范大学
Publication of WO2022160842A1

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • A61B5/375Electroencephalography [EEG] using biofeedback
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/06Electrically-operated educational appliances with both visual and audible presentation of the material to be studied

Definitions

  • the invention belongs to the field of student state monitoring, and more particularly, relates to a method and system for evaluating student collaboration state based on EEG data.
  • Patent CN201810429318.X discloses a learning state monitoring system based on brain wave analysis and a method of using the same.
  • the system uses a brain wave collector; the collected brain wave information is preprocessed and, together with the exercise information collected by a question-bank app on a tablet computer, sent to a signal classification and feature extraction module; a background diagnosis server processes the extracted EEG signals with the PIK core algorithm and performs correlation analysis, so that the students' learning state can be diagnosed.
  • although that invention acquires students' EEG signals and thus monitors their state, during teaching the teacher can only learn about the students' learning status from the final diagnosis report; the learning process cannot be monitored in real time, nor can students' learning states be predicted from their EEG data.
  • in addition, that patent applies only to diagnosing student states in ordinary learning scenarios and is not applicable to collaborative learning scenarios.
  • Patent CN201510629507.8 relates to a classroom teaching monitoring method and system based on brain wave acquisition, which mainly monitors the fluctuation of brain waves of students in classroom teaching.
  • the acquisition terminal of the system obtains the monitored students' brain wave data through a brain wave sensor; after processing and framing, the data is sent to the server over the wireless local area network through a WiFi module.
  • the monitoring terminal of the system obtains and displays the brainwave data of each acquisition terminal from the database of the server through the local area network, and displays the individual fluctuation information and the overall fluctuation information of the brainwaves of the collected students in real time.
  • although this patent monitors students' learning status, it can only observe students' real-time attention and relaxation values, so it is suitable only for monitoring student states in ordinary learning scenarios, not in collaborative learning scenarios.
  • the purpose of the present invention is to provide a method and system for evaluating students' collaboration state based on EEG data, aiming to solve the problem that existing EEG-based learning state monitoring systems are not suitable for diagnosing student states in collaborative learning scenarios.
  • the present invention provides a method for evaluating student collaboration status based on EEG data, comprising the following steps:
  • the EEG data is collected by using an EEG sensor
  • the states of collaborative learning include: interactive learning, constructive learning, active learning and passive learning;
  • the classifier is constructed based on the long short-term memory network LSTM, and is used to classify the students' collaborative learning state according to the EEG data collected during their collaborative learning.
  • the method for obtaining the vectorized behavior sequence includes the following steps:
  • the behavior in the behavior sequence database is vectorized by learning behavior embedding, and the vectorized behavior sequence is obtained.
  • when the classifier's input is EEG data and the vectorized behavior sequence, the trained classifier is obtained through the following steps:
  • the training data includes: pre-collected EEG data, vectorized behavior sequences and the corresponding collaborative learning states from students' collaborative learning;
  • the EEG data and vectorized behavior sequences from the collaborative learning process are used as the input of the classifier, and the corresponding collaborative learning state is used as the output of the classifier;
  • the training data is divided into a training set and a test set; the training set is used to train the classifier, and the test set is used to adjust the parameters of the trained classifier to ensure the classification accuracy of the classifier;
  • when the classifier's input is EEG data only, the trained classifier is obtained through the following steps:
  • the training data includes: pre-collected EEG data from students' collaborative learning and the corresponding collaborative learning states;
  • the EEG data from the collaborative learning process is used as the input of the classifier, and the corresponding collaborative learning state is used as the output of the classifier;
  • the training data is divided into a training set and a test set; the training set is used to train the classifier, and the test set is then used to adjust the parameters of the trained classifier to ensure the classification accuracy of the classifier.
  • the classifier first uses an LSTM layer as the input layer to connect the input EEG data and vectorized behavior sequence, or the input EEG data alone, to the hidden layers, then uses an LSTM layer and a Flatten layer as the hidden layers, and finally uses a Dense layer as the output layer to connect the hidden layers and the output data.
  • the EEG data includes: lowAlpha wave, highAlpha wave, lowBeta wave, highBeta wave, lowGamma wave, highGamma wave, delta wave and theta wave.
  • the collaborative learning state in the training data is determined by manual annotation according to the performance of each student in the video of the collaborative learning.
  • the present invention provides a system for evaluating student collaboration status based on EEG data, comprising: an EEG sensor and a classifier or including: an EEG sensor, a builder of vectorized behavior sequences, and a classifier;
  • the EEG sensor is used to obtain EEG data during the collaborative learning process of the students; the EEG data is collected by using the EEG sensor;
  • the builder of the vectorized behavior sequence is used to arrange the behavior data in the chronological order in which the students perform actions, construct the behavior sequence corresponding to each student, and build a behavior sequence database; learning behavior embedding is used to vectorize the behaviors in the behavior sequence database and obtain the vectorized behavior sequence;
  • the classifier is constructed based on the long short-term memory network LSTM, and is used for classifying the collaborative learning state of the students according to the EEG data or EEG data and the vectorized behavior sequence in the collaborative learning process of the students;
  • the classifier is trained; the trained classifier receives the EEG data, or the EEG data and the vectorized behavior sequence, and classifies the state of the students' collaborative learning based on them;
  • the states of collaborative learning include: interactive learning, constructive learning, active learning and passive learning.
  • the classifier is trained based on pre-collected training data;
  • the training data includes: pre-collected EEG data, vectorized behavior sequences and the corresponding collaborative learning states from students' collaborative learning, or pre-collected EEG data from students' collaborative learning and the corresponding collaborative learning states;
  • the EEG data and vectorized behavior sequences from the collaborative learning process, or the EEG data alone, are used as the input of the classifier, and the corresponding collaborative learning state is used as the output of the classifier;
  • the classifier divides the training data into a training set and a test set, trains on the training set, and then uses the test set to adjust the parameters of the trained classifier to ensure the classification accuracy of the classifier.
  • the classifier first uses an LSTM layer as the input layer to connect the input EEG data and the vectorized behavior sequence, or the input EEG data alone, to the hidden layers, then uses an LSTM layer and a Flatten layer as the hidden layers, and finally uses a Dense layer as the output layer to connect the hidden layers and the output data.
  • the EEG data includes: lowAlpha wave, highAlpha wave, lowBeta wave, highBeta wave, lowGamma wave, highGamma wave, delta wave and theta wave.
  • the collaborative learning state in the training data is determined by manual annotation according to the performance of each student in the video of the collaborative learning.
  • the present invention provides a method and system for evaluating students' collaboration state based on EEG data, which adopts brain-computer interface and machine learning technology to automatically monitor students' collaborative learning state; this is more scientific and efficient than the traditional way of judging students' states through teacher observation.
  • the present invention collects the students' EEG data during collaborative learning by having each student wear a portable EEG device, extracts the log information of each student operating the online learning platform to collect the students' behavior data during collaborative learning, and uses the EEG data together with the vectorized behavior sequences, or the EEG data alone, as the raw input of the automatic collaboration state classifier.
  • the invention provides training labels for the automatic collaboration classifier by compiling a collaboration classification coding table and using manual coding.
  • the present invention uses a deep learning algorithm (taking LSTM as an example) and combines the synchronized EEG data and behavior data of multiple learners with collaboration state labels manually coded from each student's performance in the video of the collaborative learning, to construct an automatic classifier of students' collaborative learning states.
  • FIG. 1 is a flowchart of a method for evaluating student collaboration status provided by an embodiment of the present invention
  • FIG. 2 is an architectural diagram of a student collaboration state assessment system provided by an embodiment of the present invention
  • FIG. 3 is a structural diagram of an EEG sensor provided by an embodiment of the present invention.
  • FIG. 4 is a flowchart of the state classification part provided by an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a state classification effect provided by an embodiment of the present invention.
  • the invention relates to a student collaboration state evaluation technique based on EEG data, which is mainly used to classify the collaboration state of students during collaborative learning in the classroom, so that timely intervention can be made when students are in inefficient or ineffective collaboration.
  • the present invention mainly includes three parts: the collection of EEG data, the collection of behavior data, and the classification of the collaboration state.
  • the EEG data of each student is collected through a portable single-channel EEG sensor.
  • the raw EEG data is processed by the sensor's processing module and then sent to the acquisition end through the Bluetooth module; after brief processing at the acquisition end, it is forwarded to the server.
  • the EEG data of multiple students are aggregated on the server and stored in the database. Finally, these data are displayed on the monitoring side.
  • each student's learning behavior data is collected by extracting the log information of the online learning platform; the students' learning behaviors are arranged in the order in which the actions are performed to construct the behavior sequence corresponding to each student, and the behavior sequence database is input into the word2vec model.
  • the word2vec model interprets each behavior sequence as a sentence and each behavior as a word; through the learning behavior embedding process, each action is converted into a behavior vector.
  • teachers can grasp each student's collaboration state during collaborative learning from the classification results and give appropriate interventions to students in inefficient or ineffective collaboration, so as to improve students' learning efficiency.
  • the invention can help teachers to grasp the participation of each student in collaborative learning, and also provide data support for the improvement and evaluation of the classroom teaching process.
  • FIG. 1 is a flowchart of a method for evaluating student collaboration status provided by an embodiment of the present invention; as shown in FIG. 1 , it includes the following steps:
  • S101: determine the EEG data and the vectorized behavior sequence, or the EEG data, of students during collaborative learning;
  • S102: input the EEG data and the vectorized behavior sequence, or the EEG data, into the trained classifier to classify the state of the students' collaborative learning;
  • the states of collaborative learning include: interactive learning, constructive learning, active learning and passive learning;
  • the classifier is constructed based on the long short-term memory network LSTM, and is used to classify the collaborative learning state of the students according to the EEG data and the vectorized behavior sequences or EEG data during the collaborative learning process of the students.
  • FIG. 2 is an architectural diagram of a student collaboration state evaluation system provided by an embodiment of the present invention; as shown in FIG. 2 , each student wears a portable EEG device to collect EEG data during the students' collaborative learning, and brain-computer interface and machine learning technology is adopted to automatically monitor the students' collaborative learning status.
  • a portable single-channel EEG sensor is used to collect the EEG data of students during collaborative learning, which includes a reference electrode, an active electrode and a bracket.
  • Figure 3 shows the EEG sensor used.
  • the reference electrode is located at the earlobe, which is used to set the potential there to 0.
  • Active electrodes are located on the forehead and are used to record potentials there.
  • the recorded EEG signal is the absolute value of the potential change between the reference electrode at the earlobe and the active electrode at the forehead.
  • the students participating in the test are divided into several groups; the students in each group wear the EEG sensors and carry out collaborative learning according to the learning tasks arranged by the teacher. Before the formal start, the participating students also need to sit still for a few minutes to eliminate the interference of pre-test behavior on the data.
  • the collected raw EEG data is first processed by the chip in the sensor, which distinguishes the lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta waves and computes the attention and meditation (relaxation) values.
  • the attention value ranges from 0 to 100; the larger the value, the more concentrated the attention.
  • the meditation (relaxation) value ranges from 0 to 100; the larger the value, the more relaxed the brain.
  • delta represents the delta rhythm wave in the EEG signal, which is related to fatigue and drowsiness; its frequency range is 0.5 Hz to 2.75 Hz.
  • theta represents the theta rhythm wave in the EEG signal, which is related to relaxation, sleepiness and sleep; its frequency range is 3.5 Hz to 6.75 Hz.
  • lowAlpha represents the low-frequency alpha rhythm wave in the EEG signal, which is related to a relaxed state; its frequency range is 7.5 Hz to 9.25 Hz.
  • highAlpha represents the high-frequency alpha rhythm wave in the EEG signal, which is related to relaxation, calm and similar states; its frequency range is 10 Hz to 11.75 Hz.
  • lowBeta represents the low-frequency beta rhythm wave in the EEG signal, which is related to thinking, concentration and similar states; its frequency range is 13 Hz to 16.75 Hz.
  • highBeta represents the high-frequency beta rhythm wave in the EEG signal, which is related to anxiety, tension, fear and similar states; its frequency range is 18 Hz to 29.75 Hz.
  • lowGamma represents the low-frequency gamma rhythm wave in the EEG signal, which is related to a meditative state; its frequency range is 31 Hz to 39.75 Hz.
  • highGamma represents the high-frequency gamma rhythm wave in the EEG signal, which is related to behaviors such as listening, speaking and reading; its frequency range is 41 Hz to 49.75 Hz.
  • the acquisition terminal receives one data record per second, and each record contains the attention, meditation, lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta values.
  • timestamps and the ID of the device used are added to each piece of data to distinguish the EEG data of each student.
  • the data is aggregated to the server.
  • the server stores these data into the database, and Table 1 shows the collected EEG data.
  • each student's learning behavior data is collected by extracting the log information of the online learning platform, including the student's id, time, and behavior.
  • Table 2 shows the extracted log data.
  • the word2vec model interprets each behavior sequence as a sentence and each behavior as a word; through the learning behavior embedding process, each action is converted into a behavior vector. Assuming that each action is converted into a 10-dimensional behavior vector, the behavior vector for each action is shown in Table 3.
  • id behavior vector
    10001 [0.04835496, -0.13685343, 0.15464433, -0.00095755, -0.02065742, 0.0218445, -0.05245922, 0.07527179, 0.12930311, -0.11722008]
    10002 [-0.01208691, -0.09034875, -0.0200465, -0.09869331, -0.05632833, -0.10120608, -0.08197978, 0.03664138, -0.05778247, 0.02686951]
    10003 [-0.1529376, -0.02754693, 0.04391915, -0.11126086, 0.14165899, -0.11321357, 0.10645257, -0.02427305, -0.14912353, -0.06264896]
    10004 [-0.02985449, -0.01967788, -0.0035726, -0.01048924, 0.15300834, 0.02249564, 0.11786396, 0.18152717, 0.00466656, -0.04310976]
  • a Long Short-Term Memory network (LSTM) is a special kind of recurrent neural network (RNN) that can learn long-term dependencies in sequences. Because the collected EEG data and vectorized behavior sequences are time series, an LSTM can be used to classify the EEG data and vectorized behavior sequences over a period of time, or to classify the EEG data alone over a period of time.
  • the collaborative state of students in collaborative learning can be divided into four types: interactive learning (Interactive), constructive learning (Constructive), active learning (Active), and passive learning (Passive).
  • Interactive learning is a type of learning activity with dialogue as an interactive prototype, emphasizing that two or more students learn around the same theme or concept, and "collaborative innovation" is its distinctive feature.
  • Constructive learning requires students to generate new ideas and perspectives through learning that go beyond the textbook or learning materials provided.
  • Active learning refers to an activity in which students actively participate in teaching in the process of learning and manipulate learning materials through actual learning behaviors.
  • Passive learning means that students have no other explicit activities in the process of learning except to concentrate on listening to others' explanations.
  • the EEG data and the vectorized behavior sequence at the same moment are combined into one piece of data, or the EEG data alone is used. Since EEG data arrives once per second while the output frequency of the vectorized behavior sequence is not fixed, the combined data follows the output frequency of the EEG data: if no behavior occurs at the current moment, the behavior that occurred last is regarded as the behavior occurring at the current time, and the behavior over this period is then vectorized into a vectorized behavior sequence.
  • the students' collaborative state is classified.
  • in this embodiment, EEG data and vectorized behavior sequences are used as the input of the classifier; the main steps in the classification of student collaboration states are introduced as follows:
  • the data stored in the database includes time, attention, meditation, lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta; only the eight band values from lowAlpha to theta are extracted to build the collaboration classification model.
  • Collect student behavior data through log records of the online learning platform, and extract behavior data including student ID, time, and actions.
  • the behavior data is divided into student units, the actions in each student's behavior data are arranged in sequence according to the time when the student performs the action, and then the student's ID and time are deleted, and a session corresponding to each student is constructed.
  • the session corresponding to each student is regarded as the behavior sequence corresponding to the student, and then the database of student learning behavior sequence is constructed.
  • the learning behavior embedding is used to vectorize the behavior sequence, and the NLP (Natural Language Processing) method is used to learn the behavior embedding.
  • a behavior sequence database is formed for the behavior sequences corresponding to all students, wherein each behavior sequence includes a variety of behaviors.
  • the standard word2vec model is directly applied to implement learning behavior embedding; the word2vec model takes the behavior sequences as input, interprets each behavior sequence as a sentence and each behavior as a word, outputs the behavior vector corresponding to each action, and generates useful behavior embeddings; the behavior vectors can capture students' learning or browsing patterns and reveal the similarity between students who learn in similar ways.
  • the specific process of using the word2vec model to implement learning behavior embedding is as follows: the behavior sequences corresponding to the students are input into the word2vec model, which interprets each behavior sequence as a sentence and each behavior as a word; after the learning behavior embedding process, each action is transformed into a behavior vector.
  • the EEG data of each second and the vectorized behavior sequence are combined into one piece of data. Since EEG data arrives once per second while the output frequency of the vectorized behavior sequence is not fixed, the combined data follows the output frequency of the EEG data: if no behavior occurs at the current moment, the behavior that occurred last is regarded as the behavior occurring at the current time, and that behavior is then vectorized into the vectorized behavior sequence;
  • the current EEG data is one record per second with 8 columns: lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta. Since the data frequency of the vectorized behavior sequence is not fixed, if no vectorized behavior sequence is produced at the current moment, the previous vectorized behavior sequence is used as the current one; the dimension of each vectorized behavior sequence can be defined as desired.
  • the EEG data and the vectorized behavior sequence are combined into one piece of data. Assuming the dimension of the vectorized behavior sequence is 10, each piece of data has 18 columns: lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta, theta and behavior, as shown in Table 4.
  • every 20 seconds of data, i.e. every 20 pieces of data, is set as one group; each 20-second group is then labeled using the manually annotated collaboration classification coding table.
  • the LSTM model requires the input data to be in the two-dimensional format (D, N), where D represents the number of pieces of data input each time, set to 20 here, which means that 20 seconds of data is input at a time, and N represents the number of feature variables in each piece of data, set to 18 here, which means that each piece of data contains the 8 values lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta plus the 10 values of the vectorized behavior sequence (behavior) corresponding to each 20-second time period.
  • This classifier uses an LSTM layer as the input layer to connect the input information and the hidden state, then uses an LSTM layer and a Flatten layer as the hidden layer, and finally uses a Dense layer as the output layer to connect the hidden state and the output state.
  • the original EEG data and vectorized behavior sequences are divided into training set and test set, the classifier is trained on the training set, and the parameters are adjusted according to the training results.
  • the test set is used as the input of the classification function, the classification result is output, and the accuracy of the output is checked against the original data; the parameters are adjusted according to the results so that the accuracy of the classifier reaches a high level.
  • the trained classifier is used to classify the students' collaboration status.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The present invention provides a method and system for evaluating student collaboration state based on EEG data, comprising: determining the EEG data and vectorized behavior sequences, or the EEG data, of students during collaborative learning, the EEG data being collected with an EEG sensor; and inputting the EEG data and vectorized behavior sequences, or the EEG data, into a trained classifier to classify the students' collaborative learning state. The states of collaborative learning include interactive learning, constructive learning, active learning and passive learning. The classifier is built on a long short-term memory (LSTM) network and classifies the students' collaborative learning state according to the EEG data and vectorized behavior sequences collected during collaborative learning. By adopting brain-computer interface and machine learning technology, the invention automatically monitors students' collaborative learning state, which is more scientific and efficient than the traditional approach of judging student states through teacher observation.

Description

Method and system for evaluating student collaboration state based on EEG data
[Technical Field]
The present invention belongs to the field of student state monitoring, and more particularly relates to a method and system for evaluating student collaboration state based on EEG data.
[Background Art]
Patent CN201810429318.X discloses a learning state monitoring system based on brain wave analysis and a method of using it. A brain wave collector acquires brain wave information, which is preprocessed and, together with the exercise information collected by a question-bank app on a tablet computer, sent to a signal classification and feature extraction module; a background diagnosis server processes the extracted EEG signals with the PIK core algorithm and performs correlation analysis, thereby diagnosing the students' learning state. Although that invention acquires students' EEG signals and thus monitors their state, during teaching the teacher can only learn about the students' learning status from the final diagnosis report; the learning process cannot be monitored in real time, nor can students' learning states be predicted from their EEG data. In addition, that patent applies only to diagnosing student states in ordinary learning scenarios, not in collaborative learning scenarios.
Patent CN201510629507.8 relates to a classroom teaching monitoring method and system based on brain wave acquisition, which mainly monitors the fluctuation of students' brain waves during classroom teaching. The acquisition terminal of the system obtains the monitored students' brain wave data through a brain wave sensor; after processing and framing, the data is sent to the server over the wireless local area network through a WiFi module. The monitoring terminal obtains and displays the brain wave data of each acquisition terminal from the server's database over the local area network, showing in real time the individual and overall fluctuation of the monitored students' brain waves. Although this patent monitors students' learning states, it can only observe students' real-time attention and relaxation values, so it is suitable only for ordinary learning scenarios, not for collaborative learning scenarios.
[Summary of the Invention]
In view of the defects of the prior art, the purpose of the present invention is to provide a method and system for evaluating students' collaboration state based on EEG data, aiming to solve the problem that existing EEG-based learning state monitoring systems are not suitable for diagnosing student states in collaborative learning scenarios.
To achieve the above purpose, in a first aspect the present invention provides a method for evaluating student collaboration state based on EEG data, comprising the following steps:
determining the EEG data of students during collaborative learning, the EEG data being collected with an EEG sensor;
inputting the EEG data, or the EEG data and the vectorized behavior sequence, into a trained classifier to classify the students' collaborative learning state;
the states of collaborative learning include interactive learning, constructive learning, active learning and passive learning;
the classifier is built on a long short-term memory (LSTM) network and classifies the students' collaborative learning state according to the EEG data collected during collaborative learning.
Preferably, the vectorized behavior sequence is obtained through the following steps:
arranging the behavior data in the chronological order in which the students perform actions, constructing the behavior sequence corresponding to each student, and building a behavior sequence database;
vectorizing the behaviors in the behavior sequence database by learning-behavior embedding to obtain the vectorized behavior sequence.
Preferably, when the classifier's input is EEG data and the vectorized behavior sequence, the trained classifier is obtained through the following steps:
training the classifier on pre-collected training data;
the training data comprises pre-collected EEG data, vectorized behavior sequences and the corresponding collaborative learning states from students' collaborative learning;
the EEG data and vectorized behavior sequences from the collaborative learning process are used as the classifier's input, and the corresponding collaborative learning state as its output;
the training data is divided into a training set and a test set; the training set is used to train the classifier, and the test set is then used to adjust the parameters of the trained classifier to ensure its classification accuracy;
when the classifier's input is EEG data only, the trained classifier is obtained through the following steps:
training the classifier on pre-collected training data;
the training data comprises pre-collected EEG data from students' collaborative learning and the corresponding collaborative learning states;
the EEG data from the collaborative learning process is used as the classifier's input, and the corresponding collaborative learning state as its output;
the training data is divided into a training set and a test set; the training set is used to train the classifier, and the test set is then used to adjust the parameters of the trained classifier to ensure its classification accuracy.
Preferably, the classifier first uses an LSTM layer as the input layer to connect the input EEG data and vectorized behavior sequence, or the input EEG data alone, to the hidden layers; it then uses an LSTM layer and a Flatten layer as hidden layers, and finally a Dense layer as the output layer connecting the hidden layers to the output.
Preferably, the EEG data comprises the lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta waves.
Preferably, the collaborative learning states in the training data are determined by manual annotation according to each student's performance in the video of the collaborative learning session.
In a second aspect, the present invention provides a system for evaluating student collaboration state based on EEG data, comprising an EEG sensor and a classifier, or comprising an EEG sensor, a builder of vectorized behavior sequences, and a classifier;
the EEG sensor is used to obtain the EEG data of students during collaborative learning; the EEG data is collected with the EEG sensor;
the builder of vectorized behavior sequences is used to arrange the behavior data in the chronological order in which the students perform actions, construct the behavior sequence corresponding to each student, build a behavior sequence database, and vectorize the behaviors in the database by learning-behavior embedding to obtain the vectorized behavior sequence;
the classifier is built on a long short-term memory (LSTM) network and classifies the students' collaborative learning state according to the EEG data, or the EEG data and the vectorized behavior sequence, collected during collaborative learning;
the classifier is trained; the trained classifier receives the EEG data, or the EEG data and the vectorized behavior sequence, and classifies the students' collaborative learning state accordingly; the states of collaborative learning include interactive learning, constructive learning, active learning and passive learning.
Preferably, the classifier is trained on pre-collected training data; the training data comprises pre-collected EEG data, vectorized behavior sequences and the corresponding collaborative learning states from students' collaborative learning, or pre-collected EEG data and the corresponding collaborative learning states; the EEG data and vectorized behavior sequences from the collaborative learning process, or the EEG data alone, are used as the classifier's input, and the corresponding collaborative learning state as its output; the classifier divides the training data into a training set and a test set, trains on the training set, and then uses the test set to adjust the parameters of the trained classifier to ensure its classification accuracy.
Preferably, the classifier first uses an LSTM layer as the input layer to connect the input EEG data and vectorized behavior sequence, or the input EEG data alone, to the hidden layers; it then uses an LSTM layer and a Flatten layer as hidden layers, and finally a Dense layer as the output layer connecting the hidden layers to the output.
Preferably, the EEG data comprises the lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta waves.
In an optional embodiment, the collaborative learning states in the training data are determined by manual annotation according to each student's performance in the video of the collaborative learning session.
In general, compared with the prior art, the above technical solutions conceived by the present invention have the following beneficial effects:
The present invention provides a method and system for evaluating students' collaboration state based on EEG data that adopts brain-computer interface and machine learning technology to automatically monitor students' collaborative learning state, which is more scientific and efficient than the traditional approach of judging student states through teacher observation. Each student wears a portable EEG device to collect EEG data during collaborative learning; the log information of each student operating the online learning platform is extracted to collect behavior data during collaborative learning; and the EEG data together with the vectorized behavior sequences, or the EEG data alone, serve as the raw input of the automatic collaboration state classifier. By compiling a collaboration classification coding table and using manual coding, the invention provides training labels for the automatic collaboration classifier. Using a deep learning algorithm (LSTM as an example), the invention combines the synchronized EEG data and behavior data of multiple learners with collaboration state labels manually coded from each student's performance in the collaborative learning video to construct an automatic classifier of students' collaborative learning states.
[Brief Description of the Drawings]
FIG. 1 is a flowchart of the student collaboration state evaluation method provided by an embodiment of the present invention;
FIG. 2 is an architecture diagram of the student collaboration state evaluation system provided by an embodiment of the present invention;
FIG. 3 is a structural diagram of the EEG sensor provided by an embodiment of the present invention;
FIG. 4 is a flowchart of the state classification part provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of the state classification effect provided by an embodiment of the present invention.
[Detailed Description of the Embodiments]
To make the purpose, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it.
The present invention relates to a technique for evaluating student collaboration state based on EEG data, which is mainly used to classify the collaboration state of students during collaborative learning in the classroom so that timely intervention can be made when students are in inefficient or ineffective collaboration. The invention mainly comprises three parts: EEG data collection, behavior data collection, and collaboration state classification. In the EEG data collection part, each student's EEG data is collected with a portable single-channel EEG sensor; the raw EEG data is processed by the sensor's processing module and then sent to the acquisition end through a Bluetooth module; after brief processing at the acquisition end it is forwarded to the server, where the EEG data of multiple students is aggregated and stored in a database; finally, the data is displayed on the monitoring end.
In the behavior data collection part, each student's learning behavior data is collected by extracting the log information of the online learning platform. The students' learning behaviors are arranged in the order in which the actions are performed to construct the behavior sequence corresponding to each student; the behavior sequence database is fed into a word2vec model, which interprets each behavior sequence as a sentence and each behavior as a word, and through the learning-behavior embedding process converts each action into a behavior vector.
In the collaboration state classification part, a deep learning algorithm is combined with the synchronized EEG data and vectorized behavior sequences, or the EEG data, of multiple students, together with collaboration state labels manually coded from each student's performance in the collaborative learning video, to construct an automatic classifier of students' collaborative learning states.
In this way the students' collaboration state can be classified while they are collaborating. Teachers can grasp each student's collaboration state from the classification results and give appropriate interventions to students in inefficient or ineffective collaboration, thereby improving learning efficiency. The invention helps teachers grasp each student's participation in collaborative learning and also provides data support for improving and evaluating the classroom teaching process.
FIG. 1 is a flowchart of the student collaboration state evaluation method provided by an embodiment of the present invention; as shown in FIG. 1, it comprises the following steps:
S101: determining the EEG data and vectorized behavior sequences, or the EEG data, of students during collaborative learning;
S102: inputting the EEG data and vectorized behavior sequences, or the EEG data, into the trained classifier to classify the students' collaborative learning state; the states of collaborative learning include interactive learning, constructive learning, active learning and passive learning; the classifier is built on a long short-term memory (LSTM) network and classifies the students' collaborative learning state according to the EEG data and vectorized behavior sequences, or the EEG data, collected during collaborative learning.
The present invention relates to a method and system for evaluating student collaboration state based on EEG data, which mainly comprises three parts: EEG data collection, vectorized behavior sequence collection, and collaboration state classification. FIG. 2 is an architecture diagram of the student collaboration state evaluation system provided by an embodiment of the present invention; as shown in FIG. 2, each student wears a portable EEG device to collect EEG data during collaborative learning, and brain-computer interface and machine learning technology is used to automatically monitor the students' collaborative learning state.
In the EEG data collection part, a portable single-channel EEG sensor is used to collect the students' EEG data during collaborative learning; it comprises a reference electrode, an active electrode and a bracket. FIG. 3 shows the EEG sensor used. The reference electrode is located at the earlobe and sets the potential there to 0; the active electrode is located on the forehead and records the potential there. The recorded EEG signal is the absolute value of the potential change between the reference electrode at the earlobe and the active electrode at the forehead.
Before data collection, the students participating in the test are divided into several groups; the students in each group wear the EEG sensors and carry out collaborative learning according to the learning tasks arranged by the teacher. Before the formal start, the participating students also need to sit still for a few minutes to eliminate the interference of pre-test behavior on the data.
The collected raw EEG data is first processed by the chip in the sensor, which distinguishes the lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta waves and computes the attention and meditation (relaxation) values using the relevant algorithms.
The attention value ranges from 0 to 100; the larger the value, the more concentrated the attention. The meditation (relaxation) value ranges from 0 to 100; the larger the value, the more relaxed the brain.
delta denotes the delta rhythm wave in the EEG signal, which is related to fatigue and drowsiness; its frequency range is 0.5 Hz to 2.75 Hz.
theta denotes the theta rhythm wave in the EEG signal, which is related to relaxation, sleepiness and sleep; its frequency range is 3.5 Hz to 6.75 Hz.
lowAlpha denotes the low-frequency alpha rhythm wave in the EEG signal, which is related to a relaxed state; its frequency range is 7.5 Hz to 9.25 Hz.
highAlpha denotes the high-frequency alpha rhythm wave in the EEG signal, which is related to relaxation, calm and similar states; its frequency range is 10 Hz to 11.75 Hz.
lowBeta denotes the low-frequency beta rhythm wave in the EEG signal, which is related to thinking, concentration and similar states; its frequency range is 13 Hz to 16.75 Hz.
highBeta denotes the high-frequency beta rhythm wave in the EEG signal, which is related to anxiety, tension, fear and similar states; its frequency range is 18 Hz to 29.75 Hz.
lowGamma denotes the low-frequency gamma rhythm wave in the EEG signal, which is related to a meditative state; its frequency range is 31 Hz to 39.75 Hz.
highGamma denotes the high-frequency gamma rhythm wave in the EEG signal, which is related to behaviors such as listening, speaking and reading; its frequency range is 41 Hz to 49.75 Hz.
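For illustration only, the band definitions above can be gathered into a small lookup structure, as in the following Python sketch; the constant names are not part of the original disclosure.

    # Frequency bands (in Hz) as described above; keys mirror the field names
    # reported by the single-channel EEG sensor.
    EEG_BANDS = {
        "delta":     (0.5, 2.75),    # fatigue, drowsiness
        "theta":     (3.5, 6.75),    # relaxation, sleepiness, sleep
        "lowAlpha":  (7.5, 9.25),    # relaxed state
        "highAlpha": (10.0, 11.75),  # relaxation, calm
        "lowBeta":   (13.0, 16.75),  # thinking, concentration
        "highBeta":  (18.0, 29.75),  # anxiety, tension, fear
        "lowGamma":  (31.0, 39.75),  # meditative state
        "highGamma": (41.0, 49.75),  # listening, speaking, reading
    }

    # The eight band columns later used as EEG features by the classifier.
    FEATURE_COLUMNS = list(EEG_BANDS)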
These data are then processed and packaged by the chip and sent to the acquisition end through the Bluetooth module. The acquisition end receives one record per second, and each record contains the attention, meditation, lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta values. At the acquisition end, a timestamp and the ID of the device used are added to each record to distinguish each student's EEG data. After this simple processing, the data is aggregated to the server, which stores it in the database; Table 1 shows the collected EEG data.
Table 1
time attention meditation lowAlpha highAlpha lowBeta highBeta lowGamma highGamma delta theta
2021-11-02 18:45:00 69 43 11112 3332 2717 1942 641 790 38561 12442
2021-11-02 18:45:01 60 63 21618 4063 4830 2269 4715 2248 160965 17156
2021-11-02 18:45:02 77 64 11284 24833 5109 23873 3778 5468 201152 26482
2021-11-02 18:45:03 77 87 1745 13417 8182 5069 4374 2230 81039 27731
2021-11-02 18:45:04 74 91 54431 16710 18071 5003 5179 4033 301337 33552
In the behavior data collection part, each student's learning behavior data, including the student's id, time and behavior, is collected by extracting the log information of the online learning platform; Table 2 shows the extracted log data. The students' learning behaviors are arranged in the order in which the actions are performed to construct the behavior sequence corresponding to each student; the behavior sequence database is fed into a word2vec model, which interprets each behavior sequence as a sentence and each behavior as a word, and through the learning-behavior embedding process converts each action into a behavior vector. Assuming each action is converted into a 10-dimensional behavior vector, the resulting vectors are shown in Table 3.
Table 2
id time behavior
10001 2021-11-02 18:45:00 registered
10002 2021-11-02 18:45:01 opened
10001 2021-11-02 18:45:02 added
10003 2021-11-02 18:45:03 recover
10002 2021-11-02 18:45:04 set
10004 2021-11-02 18:45:05 logged-out
Table 3
id behavior vector
10001 [0.04835496,-0.13685343,0.15464433,-0.00095755,-0.02065742,0.0218445,-0.05245922,0.07527179,0.12930311,-0.11722008]
10002 [-0.01208691,-0.09034875,-0.0200465,-0.09869331,-0.05632833,-0.10120608,-0.08197978,0.03664138,-0.05778247,0.02686951]
10003 [-0.1529376,-0.02754693,0.04391915,-0.11126086,0.14165899,-0.11321357,0.10645257,-0.02427305,-0.14912353,-0.06264896]
10004 [-0.02985449,-0.01967788,-0.0035726,-0.01048924,0.15300834,0.02249564,0.11786396,0.18152717,0.00466656,-0.04310976]
10005 [-0.06719482,-0.1107032,0.0341095,0.05950354,-0.09434665,0.0660033,-0.04190211,0.19142264,0.10968415,0.18288177]
In the collaboration state classification part, a deep learning algorithm is combined with the synchronized EEG data, vectorized behavior sequences and manually coded collaboration state labels of multiple learners, or with the synchronized EEG data and manually coded collaboration state labels alone, to construct an automatic classifier of students' collaborative learning states, thereby classifying the learners' collaboration state during collaborative learning. Based on the classification results, the teacher gives learners appropriate interventions so that students participate better in collaborative learning and improve their learning efficiency. FIG. 4 is a flowchart of the student collaboration state evaluation and classification.
A long short-term memory network (Long Short-Term Memory, LSTM) is a special kind of recurrent neural network (RNN) that can learn long-term dependencies in sequences. Because the collected EEG data and vectorized behavior sequences are time series, an LSTM can be used to classify the EEG data and vectorized behavior sequences over a period of time, or to classify the EEG data alone over a period of time.
First, the students' learning process during collaborative learning is recorded on video. After the class, the video is reviewed to observe each student's performance, and each student's collaboration state is manually annotated minute by minute. By compiling a collaboration classification coding table, manual coding provides training labels for the automatic collaboration classifier.
According to the ICAP learning framework, students' collaboration states during collaborative learning can be divided into four types: interactive learning (Interactive), constructive learning (Constructive), active learning (Active) and passive learning (Passive).
Interactive learning is a type of learning activity that takes dialogue as its prototypical form of interaction, emphasizing that two or more students learn around the same topic or concept, with "collaborative innovation" as its distinctive feature. Constructive learning requires students to generate new ideas and viewpoints that go beyond the textbook or the provided learning materials. Active learning refers to an activity in which students actively participate in teaching and manipulate the learning materials through overt learning behaviors. Passive learning means that students show no explicit activity during learning other than concentrating on listening to others' explanations.
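Since the classifier treats these four ICAP states as class labels, they can be encoded as integers, for example as in the Python sketch below; the numeric order is an assumption of this illustration and is not fixed by the patent.

    from enum import IntEnum

    class CollaborationState(IntEnum):
        """ICAP collaboration states used as classification labels."""
        PASSIVE = 0       # listening only, no other explicit activity
        ACTIVE = 1        # manipulating learning materials through overt actions
        CONSTRUCTIVE = 2  # generating ideas beyond the provided materials
        INTERACTIVE = 3   # dialogue-based co-construction around a shared topic

    NUM_CLASSES = len(CollaborationState)  # 4 output classes for the classifier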
The collected EEG data and vectorized behavior sequences, or the EEG data, are then preprocessed to construct the standard data format required by the LSTM model. The data is divided into a training set and a test set, the LSTM model is defined and built, and the automatic classifier of students' collaborative learning states is constructed; finally, the classifier, combined with the learners' EEG data and vectorized behavior sequences, or EEG data, classifies the learners' collaboration state. FIG. 5 shows the effect of group collaboration state classification.
The method is implemented in the following steps:
(1) Obtain historical EEG data. These data include time, attention, meditation, lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta; only the eight band values lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta need to be extracted. Convert the data into the data format required by the LSTM model.
(2) Obtain learning behavior data. These data include logged-out join, created, applauded, closed, exported, imported, started, registered, opened, answered, submitted, sign-in, downloaded, logged-in, copied, pasted, added, recover, set, deleted, edited, arranged, drag-droped, folded, unfolded, zoom-in, zoom-out, undo and redo.
The learning behaviors are first arranged in the order in which the students perform the actions to construct the behavior sequence corresponding to each student and build a student learning behavior sequence database; learning-behavior embedding is then used to vectorize the behaviors in the behavior sequence database.
(3) Combine the EEG data and the vectorized behavior sequence at the same moment into one record, or use the EEG data alone. Since EEG data arrives once per second while the output frequency of the vectorized behavior sequence is not fixed, the combined data follows the output frequency of the EEG data: if no behavior occurs at the current moment, the most recently occurring behavior is taken as the behavior at the current time, and the behaviors over this period are then vectorized into a vectorized behavior sequence.
(4) Using the video of the students' collaborative learning, compile a collaboration classification coding table; manual coding provides training labels for the automatic collaboration classifier.
(5) Construct the automatic student collaboration state classifier, train it on the training set of EEG data and vectorized behavior sequences, or of EEG data, test its accuracy on the test set, and adjust the classifier's parameters according to the test results so that the classifier reaches a high accuracy.
(6) Use the trained automatic classifier of students' collaborative learning states, combined with the students' EEG data and vectorized behavior sequences, or EEG data, during collaborative learning, to classify the students' collaboration state.
Embodiment
This embodiment uses EEG data and vectorized behavior sequences as the classifier input; the main steps of student collaboration state classification are as follows:
1. Data preprocessing
After the raw EEG data collected by the EEG sensor is processed, the data stored in the database includes time, attention, meditation, lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta. To build the student collaboration classification model, only the eight band values lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta need to be extracted.
Student behavior data is collected from the log records of the online learning platform, extracting records that include the student's ID, time and action. The behavior data is split by student according to the student ID; the actions in each student's behavior data are ordered by the time the student performed them, the student ID and time are then removed, and a session corresponding to each student is constructed, containing only the student's action data. Each student's session is taken as that student's behavior sequence, and the student learning behavior sequence database is built. Learning-behavior embedding is used to vectorize the behavior sequences, using an NLP (natural language processing) method to learn the behavior embedding. Specifically, the behavior sequences of all students form the behavior sequence database, and each sequence contains multiple kinds of behavior; in this embodiment the standard word2vec model is applied directly to implement the learning-behavior embedding. The word2vec model takes the behavior sequences as input, interprets each behavior sequence as a sentence and each behavior as a word, outputs the behavior vector corresponding to each action, and generates useful behavior embeddings; the behavior vectors can capture students' learning or browsing patterns and reveal the similarity of students who learn in similar ways. The word2vec-based learning-behavior embedding proceeds as follows: the behavior sequences corresponding to the students are input into the word2vec model, which interprets each behavior sequence as a sentence and each behavior as a word, and through the learning-behavior embedding process converts each action into a behavior vector.
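The following is a minimal Python sketch of this step. It assumes the platform logs are available as a pandas DataFrame with columns id, time and behavior, and uses gensim's Word2Vec as one standard implementation of the word2vec model; vector_size=10 matches the 10-dimensional behavior vectors of Table 3, and the other hyperparameters are illustrative assumptions.

    import pandas as pd
    from gensim.models import Word2Vec

    def build_behavior_sequences(logs: pd.DataFrame) -> list:
        """Order each student's actions by time and keep only the action names."""
        logs = logs.sort_values("time")
        return [group["behavior"].tolist() for _, group in logs.groupby("id")]

    def train_behavior_embedding(sequences: list, dim: int = 10) -> Word2Vec:
        """Treat each behavior sequence as a sentence and each behavior as a word."""
        return Word2Vec(sentences=sequences, vector_size=dim, window=5,
                        min_count=1, sg=1, epochs=50)

    # Example usage (file name is hypothetical):
    # logs = pd.read_csv("behavior_log.csv")   # columns: id, time, behavior
    # w2v = train_behavior_embedding(build_behavior_sequences(logs))
    # vec = w2v.wv["opened"]                   # 10-dimensional behavior vector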
Each second's EEG data and the vectorized behavior sequence are then combined into one record. Since EEG data arrives once per second while the output frequency of the vectorized behavior sequence is not fixed, the combined data follows the output frequency of the EEG data: if no behavior occurs at the current moment, the most recently occurring behavior is taken as the behavior at the current time, and that behavior is then vectorized into the vectorized behavior sequence;
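A sketch of this per-second alignment is shown below, assuming one EEG record per second in a DataFrame eeg with a sorted timestamp index, and a DataFrame vecs of behavior vectors indexed by the (irregular) times at which actions occurred; all names are illustrative.

    import pandas as pd

    def align_behavior_to_eeg(eeg: pd.DataFrame, vecs: pd.DataFrame) -> pd.DataFrame:
        """Attach to every one-second EEG record the most recent behavior vector.

        eeg  : one row per second, eight band columns, sorted timestamp index.
        vecs : one row per logged action, columns b0..b9, timestamp index.
        """
        # If no action occurred at the current second, carry the last action's
        # vector forward (forward fill onto the EEG timeline).
        aligned = vecs.sort_index().reindex(eeg.index, method="ffill")
        return pd.concat([eeg, aligned], axis=1)   # 8 + 10 = 18 columns per second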
2. Compiling the collaboration classification coding table
The students' learning process during collaborative learning is recorded on video. After the class, the video is reviewed to observe each student's performance, and each student's collaboration state is manually annotated every 20 seconds. By compiling a collaboration classification coding table, manual coding provides training labels for the automatic collaboration classifier.
3. Constructing the standard data format
Before defining the LSTM model, the input data must be arranged in the format the model requires. The current EEG data is one record per second with 8 columns: lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta. Since the data frequency of the vectorized behavior sequence is not fixed, if no vectorized behavior sequence is produced at the current moment, the previous vectorized behavior sequence is used as the current one; the dimension of each vectorized behavior sequence can be defined as desired.
The EEG data and the vectorized behavior sequence are concatenated into one record. Assuming the vectorized behavior sequence has 10 dimensions, each record has 18 columns in total: lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta, theta and behavior, as shown in Table 4.
Table 4
Figure PCTCN2021128838-appb-000001
Every 20 seconds of data, i.e. every 20 records, is grouped together as one sample; each 20-second group is then labeled using the manually annotated collaboration classification coding table.
The LSTM model requires input data in a two-dimensional (D, N) format, where D is the number of records fed in at a time, set to 20 here, meaning that 20 seconds of data is input at a time, and N is the number of feature variables per record, set to 18 here, meaning that each record contains the 8 values lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta plus the 10 values of the vectorized behavior sequence (behavior) corresponding to each 20-second period.
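To make the (D, N) requirement concrete, the sketch below cuts the per-second 18-column data into non-overlapping 20-second windows and pairs each window with its manually coded label. It assumes data is a numpy array of shape (T, 18) and labels holds one ICAP label per complete 20-second segment; both names are illustrative.

    import numpy as np

    def make_windows(data: np.ndarray, labels, window: int = 20):
        """Group every `window` consecutive one-second records into one LSTM sample.

        data    : array of shape (T, 18) - 8 EEG band values + 10 behavior values.
        labels  : one collaboration-state label per complete window.
        Returns : X of shape (num_windows, 20, 18) and y of shape (num_windows,).
        """
        num_windows = data.shape[0] // window
        X = data[:num_windows * window].reshape(num_windows, window, data.shape[1])
        y = np.asarray(labels[:num_windows])
        return X, y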
4. Constructing the automatic student collaboration state classifier
A deep learning algorithm is used to build the classifier for student collaboration states. The classifier first uses an LSTM layer as the input layer to connect the input information to the hidden state, then an LSTM layer and a Flatten layer as hidden layers, and finally a Dense layer as the output layer connecting the hidden state to the output. The original EEG data and vectorized behavior sequences are divided into a training set and a test set; the classifier is trained on the training set and its parameters are adjusted according to the training results.
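For example, the layer sequence described above (an LSTM input layer, an LSTM layer and a Flatten layer as hidden layers, and a Dense output layer) could be expressed with Keras as in the sketch below; the unit counts, optimizer and loss are illustrative assumptions, since the patent does not specify them.

    from tensorflow import keras
    from tensorflow.keras import layers

    def build_classifier(timesteps: int = 20, features: int = 18, classes: int = 4):
        """LSTM -> LSTM -> Flatten -> Dense classifier for 20-second windows."""
        model = keras.Sequential([
            layers.Input(shape=(timesteps, features)),
            layers.LSTM(64, return_sequences=True),       # input LSTM layer
            layers.LSTM(32, return_sequences=True),       # hidden LSTM layer
            layers.Flatten(),                             # hidden Flatten layer
            layers.Dense(classes, activation="softmax"),  # output layer
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model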
5. Classifying students' collaboration states and checking accuracy with the automatic classifier
After the classifier has been trained, the test set is fed to the classification function, the classification results are output, and the output is checked for accuracy against the original data; the parameters are adjusted according to the results so that the classifier reaches a high accuracy.
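A sketch of this training and accuracy check, continuing the illustrative X, y and build_classifier names from the previous snippets; the split ratio, epoch count and batch size are assumptions.

    from sklearn.model_selection import train_test_split

    # X: (num_windows, 20, 18) windows, y: (num_windows,) labels from the steps above.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=42)

    model = build_classifier()
    model.fit(X_train, y_train, epochs=50, batch_size=16, validation_split=0.1)

    loss, acc = model.evaluate(X_test, y_test)   # accuracy check on the test set
    print(f"test accuracy: {acc:.3f}")           # tune hyperparameters if too low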
6. Using the automatic classifier to classify students' collaboration states
For newly acquired student EEG data and vectorized behavior sequences, the trained classifier is used to classify the students' collaboration state.
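Applying the trained classifier to new data might then look like the sketch below, where new_windows is built from the new EEG data and forward-filled behavior vectors exactly as during training, and CollaborationState is the illustrative label mapping introduced earlier.

    import numpy as np

    # new_windows: array of shape (num_windows, 20, 18) from a new collaboration session.
    probs = model.predict(new_windows)
    pred = np.argmax(probs, axis=1)
    states = [CollaborationState(int(i)).name for i in pred]
    print(states)   # e.g. ['INTERACTIVE', 'ACTIVE', ...], one label per 20-second window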
Those skilled in the art will readily understand that the above are only preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (10)

  1. A method for evaluating student collaboration state based on EEG data, characterized by comprising the following steps:
    determining the EEG data of students during collaborative learning, the EEG data being collected with an EEG sensor;
    inputting the EEG data, or the EEG data and a vectorized behavior sequence, into a trained classifier to classify the students' collaborative learning state;
    the states of collaborative learning comprising interactive learning, constructive learning, active learning and passive learning;
    the classifier being built on a long short-term memory (LSTM) network and used to classify the students' collaborative learning state according to the EEG data collected during collaborative learning.
  2. The method for evaluating student collaboration state according to claim 1, characterized in that the vectorized behavior sequence is obtained through the following steps:
    arranging the behavior data in the chronological order in which the students perform actions, constructing the behavior sequence corresponding to each student, and building a behavior sequence database;
    vectorizing the behaviors in the behavior sequence database by learning-behavior embedding to obtain the vectorized behavior sequence.
  3. The method for evaluating student collaboration state based on EEG data according to claim 1, characterized in that, when the classifier's input is EEG data and the vectorized behavior sequence, the trained classifier is obtained through the following steps:
    training the classifier on pre-collected training data;
    the training data comprising pre-collected EEG data, vectorized behavior sequences and the corresponding collaborative learning states from students' collaborative learning, the EEG data and vectorized behavior sequences from the collaborative learning process being used as the classifier's input and the corresponding collaborative learning state as its output;
    dividing the training data into a training set and a test set, training the classifier on the training set, and then adjusting the parameters of the trained classifier with the test set to ensure its classification accuracy;
    when the classifier's input is EEG data only, the trained classifier is obtained through the following steps:
    training the classifier on pre-collected training data;
    the training data comprising pre-collected EEG data from students' collaborative learning and the corresponding collaborative learning states;
    the EEG data from the collaborative learning process being used as the classifier's input and the corresponding collaborative learning state as its output;
    dividing the training data into a training set and a test set, training the classifier on the training set, and then adjusting the parameters of the trained classifier with the test set to ensure its classification accuracy.
  4. The method for evaluating student collaboration state based on EEG data according to claim 1 or 2, characterized in that the classifier first uses an LSTM layer as the input layer to connect the input EEG data and vectorized behavior sequence, or the input EEG data alone, to the hidden layers, then uses an LSTM layer and a Flatten layer as hidden layers, and finally uses a Dense layer as the output layer connecting the hidden layers to the output.
  5. The method for evaluating student collaboration state based on EEG data according to claim 2, characterized in that the EEG data comprises the lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta waves.
  6. The method for evaluating student collaboration state based on EEG data according to claim 3, characterized in that the collaborative learning states in the training data are determined by manual annotation according to each student's performance in the video of the collaborative learning session.
  7. A system for evaluating student collaboration state based on EEG data, characterized by comprising an EEG sensor and a classifier, or comprising an EEG sensor, a builder of vectorized behavior sequences and a classifier;
    the EEG sensor being used to obtain the EEG data of students during collaborative learning, the EEG data being collected with the EEG sensor;
    the builder of vectorized behavior sequences being used to arrange the behavior data in the chronological order in which the students perform actions, construct the behavior sequence corresponding to each student, and build a behavior sequence database;
    the behaviors in the behavior sequence database being vectorized by learning-behavior embedding to obtain the vectorized behavior sequence;
    the classifier being built on a long short-term memory (LSTM) network and used to classify the students' collaborative learning state according to the EEG data, or the EEG data and the vectorized behavior sequence, collected during collaborative learning;
    the classifier being trained, the trained classifier receiving the EEG data, or the EEG data and the vectorized behavior sequence, and classifying the students' collaborative learning state accordingly;
    the states of collaborative learning comprising interactive learning, constructive learning, active learning and passive learning.
  8. The system for evaluating student collaboration state based on EEG data according to claim 7, characterized in that the classifier is trained on pre-collected training data;
    the training data comprising pre-collected EEG data, vectorized behavior sequences and the corresponding collaborative learning states from students' collaborative learning, or pre-collected EEG data from students' collaborative learning and the corresponding collaborative learning states;
    the EEG data and vectorized behavior sequences from the collaborative learning process, or the EEG data alone, being used as the classifier's input and the corresponding collaborative learning state as its output;
    the classifier dividing the training data into a training set and a test set, training on the training set, and then adjusting the parameters of the trained classifier with the test set to ensure its classification accuracy.
  9. The system for evaluating student collaboration state based on EEG data according to claim 8, characterized in that the classifier first uses an LSTM layer as the input layer to connect the input EEG data and vectorized behavior sequence, or the input EEG data alone, to the hidden layers, then uses an LSTM layer and a Flatten layer as hidden layers, and finally uses a Dense layer as the output layer connecting the hidden layers to the output.
  10. The system for evaluating student collaboration state based on EEG data according to claim 7 or 8, characterized in that the EEG data comprises the lowAlpha, highAlpha, lowBeta, highBeta, lowGamma, highGamma, delta and theta waves.
PCT/CN2021/128838 2021-01-26 2021-11-05 Method and system for evaluating student collaboration state based on EEG data WO2022160842A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202110105420.6 2021-01-26
CN202110105420.6A CN112932507A (zh) 2021-01-26 2021-01-26 Method and system for evaluating student collaboration state based on a brain-computer interface
CN202110989581.6A CN113742396B (zh) 2021-08-26 2021-08-26 Method and device for mining object learning behavior patterns
CN202110989581.6 2021-08-26

Publications (1)

Publication Number Publication Date
WO2022160842A1 true WO2022160842A1 (zh) 2022-08-04

Family

ID=82652958

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/128838 WO2022160842A1 (zh) 2021-01-26 2021-11-05 Method and system for evaluating student collaboration state based on EEG data

Country Status (1)

Country Link
WO (1) WO2022160842A1 (zh)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140117826A (ko) * 2013-03-27 2014-10-08 인텔렉추얼디스커버리 주식회사 뇌파 측정을 이용한 사용자 상태 분류 방법 및 장치
CN107343786A (zh) * 2016-05-05 2017-11-14 胡渐佳 基于精神状态脑波数据分组和统计方法以及记录显示装置
CN111213197A (zh) * 2017-06-13 2020-05-29 Fuvi认知网络公司 基于洞察力的认知辅助装置、方法及系统
CN108670276A (zh) * 2018-05-29 2018-10-19 南京邮电大学 基于脑电信号的学习注意力评价系统
KR20210006486A (ko) * 2018-08-13 2021-01-18 한국과학기술원 강화학습을 이용한 적응형 뇌파 분석 방법 및 장치
CN110148318A (zh) * 2019-03-07 2019-08-20 上海晨鸟信息科技有限公司 一种数字助教系统、信息交互方法和信息处理方法
WO2020226257A1 (ko) * 2019-05-09 2020-11-12 엘지전자 주식회사 인공 지능형 뇌파 측정 밴드와 이를 이용한 학습 시스템 및 방법
AU2019101151A4 (en) * 2019-09-30 2020-01-23 Chen, Ke MISS Classify Mental States from EEG Signal Using Xgboost Algorithm
CN111631736A (zh) * 2020-06-03 2020-09-08 安徽建筑大学 建筑室内学习效率检测方法与系统
CN112932507A (zh) * 2021-01-26 2021-06-11 华中师范大学 一种基于脑机接口的学生协作状态评估方法及系统
CN113742396A (zh) * 2021-08-26 2021-12-03 华中师范大学 一种对象学习行为模式的挖掘方法及装置

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CAO XIAOMING;ZHANG YONGHE;PAN MENG;ZHU SHAN;YAN HAILIANG: "Research on Student Engagement Recognition Method from the Perspective of Artificial Intelligence: Analysis of Deep Learning Experiment based on a Multimodal Data Fusion", JOURNAL OF DISTANCE EDUCATION, 20 January 2019 (2019-01-20), pages 32 - 44, XP055955630, ISSN: 1672-0008, DOI: 10.15881/j.cnki.cn33-1304/g4.2019.01.003 *
CHENG PENGXIANG: "Learner Personalized Modeling and User Portrait System for Online Education Platform", CHINESE MASTER'S THESES FULL-TEXT DATABASE, 1 June 2019 (2019-06-01), pages 1 - 67, XP055955633, DOI: 10.27061/d.cnki.ghgdu.2019.002729 *
ZHU JIAQI: "Study on Distributed EEG Signal Acquisition System", INSTRUMENTATION USERS, vol. 2, no. 1, 6 January 2020 (2020-01-06), pages 6 - 9, XP055955624, ISSN: 1671-1041, DOI: 10.3969/j.issn.1671-1041.2020.01.002 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116304358A (zh) * 2023-05-17 2023-06-23 济南安迅科技有限公司 User data collection method
CN116304358B (zh) * 2023-05-17 2023-08-08 济南安迅科技有限公司 User data collection method

Similar Documents

Publication Publication Date Title
Xu et al. Review on portable EEG technology in educational research
Chen et al. Assessing the attention levels of students by using a novel attention aware system based on brainwave signals
Hussain et al. Affect detection from multichannel physiology during learning sessions with AutoTutor
CN107080546A (zh) 基于脑电图的青少年环境心理的情绪感知系统及方法、刺激样本选择方法
CN109637222B (zh) 脑科学智慧教室
CN108478224A (zh) 基于虚拟现实与脑电的紧张情绪检测系统及检测方法
CN111695442A (zh) 一种基于多模态融合的在线学习智能辅助系统
WO2022160842A1 (zh) 一种基于脑电数据的学生协作状态评估方法及系统
Kawamura et al. Detecting drowsy learners at the wheel of e-learning platforms with multimodal learning analytics
Zhao et al. Research and development of autism diagnosis information system based on deep convolution neural network and facial expression data
Gupta et al. Attention recognition system in online learning platform using eeg signals
Liu Optimization of college students’ mental health education based on improved intelligent recognition model
Al-Azani et al. A comprehensive literature review on children’s databases for machine learning applications
KR101118276B1 (ko) 생체 감성 지표 및 상황 정보로부터 학습 집중도에 관련된 학습 감성 지표를 생성하기 위한 유비쿼터스-러닝용 미들웨어 장치
CN110569968B (zh) 基于电生理信号的创业失败恢复力评估方法及评估系统
He et al. EEG-based confusion recognition using different machine learning methods
KR20120097098A (ko) 생체 감성 지표 및 상황 정보로부터 생성된 학습 감성 지표에 기반하여 사용자의 학습 효과를 향상시키기 위한 유비쿼터스-러닝용 학습 효과 향상 장치
CN112932507A (zh) 一种基于脑机接口的学生协作状态评估方法及系统
García-Monge et al. Potentialities and Limitations of the Use of EEG Devices in Educational Contexts.
Agarwal et al. Semi-Supervised Learning to Perceive Children's Affective States in a Tablet Tutor
Kwan et al. Designing VR training systems for children with attention deficit hyperactivity disorder (ADHD)
Cowen et al. Facial movements have over twenty dimensions of perceived meaning that are only partially captured with traditional methods
Zhai et al. HREmo: A measurement system of students’ studying state in online group class based on rPPG technology
Dong Analysis of Emotional Stress of Teachers in Japanese Teaching Process Based on EEG Signal Analysis
CN117547271B (zh) 一种心理素质智能评估分析仪

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21922432

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21922432

Country of ref document: EP

Kind code of ref document: A1