CN111738177A - Student classroom behavior recognition method based on posture information extraction - Google Patents


Info

Publication number
CN111738177A
CN111738177A
Authority
CN
China
Prior art keywords
image
recognition
behavior
classroom behavior
neural network
Prior art date
Legal status: Granted
Application number
CN202010595034.5A
Other languages
Chinese (zh)
Other versions
CN111738177B (en)
Inventor
高绍兵
蒋沁沂
谭敏洁
彭舰
Current Assignee
Sichuan University
Original Assignee
Sichuan University
Priority date
Filing date
Publication date
Application filed by Sichuan University filed Critical Sichuan University
Priority to CN202010595034.5A priority Critical patent/CN111738177B/en
Publication of CN111738177A publication Critical patent/CN111738177A/en
Application granted granted Critical
Publication of CN111738177B publication Critical patent/CN111738177B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods

Abstract

The invention discloses a student classroom behavior recognition method based on posture information extraction, comprising the following steps: S1, model training: establishing a local feature recognition convolutional neural network using known student classroom behavior images and the class labels corresponding to those images; and S2, recognizing unknown student classroom behavior images using the local feature recognition convolutional neural network established in step S1. The invention uses posture information and local image information to recognize student classroom behavior images, effectively reducing interference from behavior-irrelevant information in the image, such as clothing color and body size. The behaviors the method can recognize include listening attentively, looking around, sleeping, playing with a mobile phone, taking notes, and reading. Compared with a traditional image recognition method based on a convolutional neural network, the method can effectively improve the generalization accuracy of student classroom behavior recognition.

Description

Student classroom behavior recognition method based on posture information extraction
Technical Field
The invention belongs to the technical field of computer vision and behavior recognition, and particularly relates to a student classroom behavior recognition method based on posture information extraction.
Background
In a traditional classroom, a teacher often takes on the responsibility of maintaining classroom order in addition to teaching; if classroom order breaks down, students often absorb far less of the material. With the continuing informatization and intelligentization of teaching, and so that teachers can concentrate on teaching their subject, people hope to build an automatic classroom teaching management system; accurately recognizing student behavior in the classroom has therefore become a challenging problem.
Common student classroom behavior recognition methods include traditional machine-learning-based methods and the deep learning methods that have emerged in recent years. Traditional machine-learning methods require manually extracting suitable features and training a classifier for recognition; designing the features is difficult and recognition accuracy is low. Deep learning methods developed in recent years can be trained end-to-end given large amounts of data, need no hand-designed features, and are convenient to train. For example, the deep-learning-based student classroom behavior recognition method proposed by Wei Yantao et al. (reference: Wei Yantao, Qin Daoying, Hu Jiamin, Yao Huang, Shi Yafei. Student classroom behavior recognition based on deep learning. Modern Educational Technology, 2019, 29(07): 87-91.) reduces training difficulty and improves recognition accuracy compared with traditional machine learning, but when the trained network is used to recognize people who do not appear in the dataset, recognition accuracy drops sharply.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a student classroom behavior recognition method based on posture information extraction, which uses posture information and local image information to recognize student classroom behavior images, effectively reduces the influence of behavior-irrelevant information in the images, and can effectively improve the generalization accuracy of student classroom behavior recognition.
The purpose of the invention is realized by the following technical scheme: a student classroom behavior recognition method based on posture information extraction comprises the following steps:
s1, model training, namely establishing a local feature recognition convolutional neural network by utilizing known student classroom behavior images and class labels corresponding to the classroom behavior images;
s2, recognizing unknown student classroom behavior images using the local feature recognition convolutional neural network established in step S1, comprising the following substeps:
s21, inputting a student classroom behavior image to be recognized;
s22, inputting the image into an OpenPose pose recognition network to extract posture information;
s23, performing head posture recognition on the extracted posture information using a head posture recognition neural network; if attentive listening, looking around, or sleeping is recognized, executing step S25, otherwise executing step S24;
s24, extracting local image information and further recognizing the behavior using the local feature recognition convolutional neural network;
and S25, outputting the final recognition result.
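The substeps above can be sketched as a two-stage dispatch. The function names below (extract_pose, classify_head_pose, crop_local_image, classify_local) are hypothetical placeholders, not the patent's implementation:

```python
# Hedged sketch of the S21-S25 control flow: the head-pose stage decides
# three behaviors on its own; everything else falls through to the
# local-feature CNN. All callables here are placeholders.

HEAD_POSE_FINAL = {"attentive listening", "looking around", "sleeping"}

def recognize_behavior(image, extract_pose, classify_head_pose,
                       crop_local_image, classify_local):
    """Route an image through head-pose recognition first (S22-S23);
    fall back to local-feature recognition (S24) only when the head
    pose alone does not decide the behavior."""
    pose = extract_pose(image)              # S22: pose keypoints
    head_label = classify_head_pose(pose)   # S23: head-pose network
    if head_label in HEAD_POSE_FINAL:
        return head_label                   # S25: head pose is decisive
    local = crop_local_image(image, pose)   # S24: wrist-centred crop
    return classify_local(local)            # S25: local-feature CNN
```

The early return mirrors the branch in S23: only behaviors the head pose cannot distinguish (e.g. hands occupied below the desk) reach the second network.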
Further, the step S1 includes the following sub-steps:
s11, making a student classroom behavior recognition dataset from student classroom behavior images and their corresponding class labels:
S = {(x, y) | x ∈ R³, y ∈ N}
where x is a student classroom behavior image, y is the class label corresponding to the image, R³ denotes three-dimensional Euclidean space, and N denotes the set of natural numbers;
s12, extracting posture information from the images in the classroom behavior recognition dataset using an OpenPose pose recognition network, and assembling the results into a student classroom behavior recognition posture information dataset:
S_g = {(p_1, p_2, ..., p_n, y) | p_i ∈ R² (i = 1, 2, ..., n), y ∈ N}
where p_i denotes the position of the i-th body part, i = 1, 2, ..., n; n denotes the total number of body parts; and R² denotes two-dimensional Euclidean space;
s13, constructing a head posture recognition neural network, and training the neural network by using the head posture information in the student classroom behavior recognition posture information data set;
s14, extracting discriminative local images from the student classroom behavior recognition dataset using the hand posture information in the student classroom behavior recognition posture information dataset. Let (wl_x, wl_y) and (wr_x, wr_y) be the coordinates of the left wrist and the right wrist in image M, respectively; the extraction is as follows:
wx_max = max{wl_x, wr_x}
wx_min = min{wl_x, wr_x}
wy_max = max{wl_y, wr_y}
wy_min = min{wl_y, wr_y}
M′ = M(wx_min - a : wx_max + b)×(wy_min - c : wy_max + d)
M′_128×128 = Resize(M′)
where M′ is the extracted local image and the constants a, b, c, d are image range correction parameters; M(wx_min - a : wx_max + b)×(wy_min - c : wy_max + d) means that image M is cropped to the pixels whose x coordinate lies in the range wx_min - a : wx_max + b and whose y coordinate lies in the range wy_min - c : wy_max + d; Resize(M′) denotes image reshaping of M′; the reshaped image is denoted M′_128×128, indicating an image resolution of 128 × 128;
these images are then used to produce a labeled local image information dataset:
S′ = {(x′, y′) | x′ ∈ R³, y′ ∈ N}
where x′ is a local image and y′ is the class label corresponding to the local image;
and S15, building a local feature recognition convolutional neural network.
Further, the input layer of the head posture recognition neural network in step S13 has five neuron groups, each containing two neurons, X and Y; the hidden layer contains five neurons and receives the outputs of the five input-layer neuron groups; the output layer contains a neuron that receives the outputs of the five hidden-layer neurons and outputs the recognition result.
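As a rough illustration, a forward pass through a network of these dimensions can be sketched in NumPy. The weights below are random placeholders, not trained parameters, and the 4 output classes follow the embodiment described later; this is an assumed shape, not the patent's implementation:

```python
import numpy as np

# Sketch of the described head-pose network shape: five (X, Y) input
# neuron groups (10 values), a hidden layer of five neurons, and an
# output layer; the embodiment uses 4 output classes. Random weights
# stand in for the trained parameters.

rng = np.random.default_rng(0)
W1 = rng.normal(size=(10, 5))   # input (5 groups x 2) -> hidden (5)
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 4))    # hidden (5) -> output (4 classes)
b2 = np.zeros(4)

def hprnn_forward(diffs):
    """diffs: five (u, v) coordinate differences, shape (5, 2)."""
    x = np.asarray(diffs, dtype=float).reshape(10)
    h = np.tanh(x @ W1 + b1)        # hidden activations (activation assumed)
    logits = h @ W2 + b2
    return int(np.argmax(logits))   # predicted head-pose class
```

The patent does not state the activation function; tanh is an assumption for the sketch.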
Further, the partial image extraction method in step S14 is an image cropping method, and the cropping coordinates take the coordinates of the left and right wrists as a reference and contain four parameters, a, b, c, and d.
Further, the number of output classes of the local feature recognition convolutional neural network of the step S15 is 3.
The invention has the following beneficial effects: the method uses posture information and local image information to recognize student classroom behavior images, effectively reducing interference from behavior-irrelevant information in the image, such as clothing color and body size. The behaviors the method can recognize include listening attentively, looking around, sleeping, playing with a mobile phone, taking notes, and reading. Compared with a traditional image recognition method based on a convolutional neural network, the method can effectively improve the generalization accuracy of student classroom behavior recognition, and still achieves high generalization accuracy with relatively few training samples.
Drawings
FIG. 1 is a flow chart of a student classroom behavior identification method proposed by the present invention;
FIG. 2 is a diagram illustrating an example of model training performed in the present embodiment;
FIG. 3 is an example visualization of the pose information for some of the images in FIG. 2;
FIG. 4 is a block diagram of the HPRNN;
FIG. 5 is a structure diagram of the fine-tuned ResNet18 network;
FIG. 6 is an example of a partial image used to train LFRCNN;
FIG. 7 is a student classroom behavior image to be recognized;
FIG. 8 shows the pose information extracted from the student classroom behavior image to be recognized;
fig. 9 is the local image extracted from the student classroom behavior image to be recognized.
Detailed Description
The technical scheme of the invention is further explained by combining the attached drawings.
Deep learning methods based on convolutional neural networks need large numbers of images for training when recognizing student classroom behaviors, and practical constraints make it impossible to collect sufficiently rich samples of student classroom behavior images, so recognition accuracy drops when recognizing the behaviors of people who do not appear in the sample set. The OpenPose pose recognition network, trained on other large datasets, can effectively recognize a person's pose information in an image, and combining this with the hand information in the image that aids recognition allows classroom behaviors to be recognized further, effectively mitigating the drop in recognition accuracy caused by differences in clothing and body shape. On this basis, the invention provides a student classroom behavior recognition method based on posture information extraction which, as shown in fig. 1, comprises the following steps:
s1, model training, namely establishing a local feature recognition convolutional neural network by utilizing known student classroom behavior images and class labels corresponding to the classroom behavior images;
in this embodiment, 8201 images were collected during the experiment to make the classroom behavior recognition dataset; some example images are shown in fig. 2.
The method comprises the following substeps:
s11, making a student classroom behavior recognition dataset from student classroom behavior images and their corresponding class labels:
S = {(x, y) | x ∈ R³, y ∈ N}
where x is a student classroom behavior image, y is the class label corresponding to the image, R³ denotes three-dimensional Euclidean space, and N denotes the set of natural numbers;
s12, extracting posture information from the images in the classroom behavior recognition dataset using an OpenPose pose recognition network (reference: Cao, Zhe, et al. "Realtime multi-person 2D pose estimation using part affinity fields." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2017.), and assembling the results into a student classroom behavior recognition posture information dataset:
S_g = {(p_1, p_2, ..., p_n, y) | p_i ∈ R² (i = 1, 2, ..., n), y ∈ N}
where p_i denotes the position of the i-th body part, i = 1, 2, ..., n; n denotes the total number of body parts; and R² denotes two-dimensional Euclidean space;
specifically, only upper-body posture information is used in the experiment, with n = 18; the body parts include the nose, neck, right shoulder, right elbow, right wrist, left shoulder, left elbow, left wrist, right eye, left eye, right ear, and left ear. A visualization of some of the pose information is shown in fig. 3.
S13, constructing a Head Posture Recognition Neural Network (HPRNN) and training it using the head posture information in the student classroom behavior recognition posture information dataset; its structure is shown in fig. 4. The input layer of the head posture recognition neural network has five neuron groups, each containing two neurons, X and Y; the hidden layer contains five neurons and receives the outputs of the five input-layer neuron groups; the output layer contains a neuron that receives the outputs of the five hidden-layer neurons and outputs the recognition result. In this embodiment, the five input-layer neuron groups receive as input the coordinate difference between the right ear and right eye (u_1, v_1), between the right eye and nose (u_2, v_2), between the neck and nose (u_3, v_3), between the left ear and left eye (u_4, v_4), and between the left eye and nose (u_5, v_5); the number of output classes of the output layer is 4.
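Computing those five coordinate differences from the keypoints can be sketched as below. The index constants assume OpenPose's 18-keypoint COCO ordering, which the patent does not spell out, so treat them as an assumption:

```python
# Hedged sketch of the HPRNN input features. Keypoint indices assume the
# OpenPose COCO 18-keypoint layout (0 nose, 1 neck, 14 right eye,
# 15 left eye, 16 right ear, 17 left ear) - an assumption, not stated
# in the patent.

NOSE, NECK = 0, 1
R_EYE, L_EYE, R_EAR, L_EAR = 14, 15, 16, 17

PAIRS = [                 # (from, to) per the embodiment's description
    (R_EAR, R_EYE),       # (u1, v1): right ear - right eye
    (R_EYE, NOSE),        # (u2, v2): right eye - nose
    (NECK, NOSE),         # (u3, v3): neck - nose
    (L_EAR, L_EYE),       # (u4, v4): left ear - left eye
    (L_EYE, NOSE),        # (u5, v5): left eye - nose
]

def head_pose_features(keypoints):
    """keypoints: list of 18 (x, y) tuples; returns 5 (u, v) differences."""
    return [(keypoints[a][0] - keypoints[b][0],
             keypoints[a][1] - keypoints[b][1]) for a, b in PAIRS]
```

Using differences rather than raw coordinates makes the features invariant to the person's position in the frame, which matches the generalization goal stated above.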
S14, extracting discriminative local images from the student classroom behavior recognition dataset using the hand posture information in the student classroom behavior recognition posture information dataset. Let (wl_x, wl_y) and (wr_x, wr_y) be the coordinates of the left wrist and the right wrist in image M, respectively; the extraction is as follows:
wx_max = max{wl_x, wr_x}
wx_min = min{wl_x, wr_x}
wy_max = max{wl_y, wr_y}
wy_min = min{wl_y, wr_y}
M′ = M(wx_min - a : wx_max + b)×(wy_min - c : wy_max + d)
M′_128×128 = Resize(M′)
where M′ is the extracted local image and the constants a, b, c, d are image range correction parameters; M(wx_min - a : wx_max + b)×(wy_min - c : wy_max + d) means that image M is cropped to the pixels whose x coordinate lies in the range wx_min - a : wx_max + b and whose y coordinate lies in the range wy_min - c : wy_max + d; Resize(M′) denotes image reshaping of M′; the reshaped image is denoted M′_128×128, indicating an image resolution of 128 × 128;
these images are then used to produce a labeled local image information dataset:
S′ = {(x′, y′) | x′ ∈ R³, y′ ∈ N}
where x′ is a local image and y′ is the class label corresponding to the local image;
specifically, in the experiment the image range correction parameters take the values a = 0, b = 0, c = 10, d = 10. When the left-wrist coordinate data is missing, the image cropping formula is adjusted to:
M′ = M(wr_x - a : wr_x + b)×(wr_y - c : wr_y + d)
where a = 0, b = 60, c = 10, and d = 10;
when the right-wrist coordinate data is missing, the image cropping formula is adjusted to:
M′ = M(wl_x - a : wl_x + b)×(wl_y - c : wl_y + d)
where a = 60, b = 0, c = 10, and d = 10;
when coordinate data for both hands is missing, the image cropping formula is adjusted to:
M′ = M(a : b)×(c : d)
where a = 1, b = 128, c = 91, and d = 128.
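The cropping rules above, including the fallbacks, can be sketched in NumPy. It assumes x indexes columns, y indexes rows, and 0-based indexing (the patent's formulas are 1-based); the nearest-neighbour resize is also an assumption, since the patent does not name an interpolation method:

```python
import numpy as np

# Hedged sketch of the S14 crop with the embodiment's missing-wrist
# fallbacks. Axis convention (x = column, y = row) is assumed.

def crop_local_image(M, wl=None, wr=None, a=0, b=0, c=10, d=10):
    """M: H x W image array; wl, wr: (x, y) wrist coordinates or None."""
    if wl is None and wr is None:
        # both wrists missing: fixed region a=1, b=128, c=91, d=128
        return M[90:128, 0:128]
    if wl is None:            # left wrist missing: a=0, b=60
        a, b = 0, 60
    elif wr is None:          # right wrist missing: a=60, b=0
        a, b = 60, 0
    xs = [p[0] for p in (wl, wr) if p is not None]
    ys = [p[1] for p in (wl, wr) if p is not None]
    x0, x1 = max(min(xs) - a, 0), min(max(xs) + b, M.shape[1])
    y0, y1 = max(min(ys) - c, 0), min(max(ys) + d, M.shape[0])
    return M[y0:y1, x0:x1]

def resize_nn(img, size=128):
    """Nearest-neighbour stand-in for Resize(M')."""
    h, w = img.shape[:2]
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    return img[rows][:, cols]
```

For example, with wl = (50, 60) and wr = (100, 80) and the default parameters, the crop spans x ∈ [50, 100) and y ∈ [50, 90) before being reshaped to 128 × 128.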
S15, building a Local Feature Recognition Convolutional Neural Network (LFRCNN).
The local feature recognition convolutional neural network is a fine-tuned version of ResNet18: the number of neurons in the output layer of the ResNet18 network is set to 3. The structure of the fine-tuned ResNet18 network is shown in fig. 5, where in denotes the number of channels of the input feature map, out the number of channels of the output feature map, k the convolution kernel size, s the stride, and p the padding size. Examples of the local images used to train the LFRCNN are shown in fig. 6.
S2, identifying unknown class behavior images of the students by using the local feature recognition convolutional neural network established in the step S1, wherein the method comprises the following substeps:
s21, inputting the student classroom behavior image M_m×n to be recognized; in this embodiment the image size is 128 × 128, as shown in fig. 7;
s22, inputting the image into the OpenPose pose recognition network to extract posture information, obtaining the result shown in fig. 8;
s23, performing head posture recognition on the extracted posture information using the head posture recognition neural network; if attentive listening, looking around, or sleeping is recognized, executing step S25, otherwise executing step S24. In this embodiment the posture information is recognized as head-down, so step S24 is executed;
s24, extracting the local image information using the method described in S14, obtaining the local image shown in fig. 9, and further recognizing the behavior using the local feature recognition convolutional neural network;
and S25, outputting the final recognition result; the final recognition result for this image is "playing with a mobile phone".
Further, the partial image extraction method in step S14 is an image cropping method, and the cropping coordinates take the coordinates of the left and right wrists as a reference and contain four parameters, a, b, c, and d. The number of output categories of the local feature recognition convolutional neural network of the step S15 is 3.
It will be appreciated by those of ordinary skill in the art that the embodiments described herein are intended to help the reader understand the principles of the invention, and the invention is not limited to the specifically described embodiments and examples. Those skilled in the art can make various other specific changes and combinations based on the teachings of the present invention without departing from its spirit, and these changes and combinations remain within the scope of the invention.

Claims (5)

1. A student classroom behavior recognition method based on posture information extraction is characterized by comprising the following steps:
s1, model training, namely establishing a local feature recognition convolutional neural network by utilizing known student classroom behavior images and class labels corresponding to the classroom behavior images;
s2, recognizing unknown student classroom behavior images using the local feature recognition convolutional neural network established in step S1, comprising the following substeps:
s21, inputting a student classroom behavior image to be recognized;
s22, inputting the image into an OpenPose pose recognition network to extract posture information;
s23, performing head posture recognition on the extracted posture information using a head posture recognition neural network; if attentive listening, looking around, or sleeping is recognized, executing step S25, otherwise executing step S24;
s24, extracting local image information and further recognizing the behavior using the local feature recognition convolutional neural network;
and S25, outputting the final recognition result.
2. The student classroom behavior recognition method based on posture information extraction as recited in claim 1, wherein the step S1 includes the following sub-steps:
s11, making a student classroom behavior recognition dataset from student classroom behavior images and their corresponding class labels:
S = {(x, y) | x ∈ R³, y ∈ N}
where x is a student classroom behavior image, y is the class label corresponding to the image, R³ denotes three-dimensional Euclidean space, and N denotes the set of natural numbers;
s12, extracting posture information from the images in the classroom behavior recognition dataset using an OpenPose pose recognition network, and assembling the results into a student classroom behavior recognition posture information dataset:
S_g = {(p_1, p_2, ..., p_n, y) | p_i ∈ R² (i = 1, 2, ..., n), y ∈ N}
where p_i denotes the position of the i-th body part, i = 1, 2, ..., n; n denotes the total number of body parts; and R² denotes two-dimensional Euclidean space;
s13, constructing a head posture recognition neural network, and training the neural network by using the head posture information in the student classroom behavior recognition posture information data set;
s14, extracting discriminative local images from the student classroom behavior recognition dataset using the hand posture information in the student classroom behavior recognition posture information dataset. Let (wl_x, wl_y) and (wr_x, wr_y) be the coordinates of the left wrist and the right wrist in image M, respectively; the extraction is as follows:
wx_max = max{wl_x, wr_x}
wx_min = min{wl_x, wr_x}
wy_max = max{wl_y, wr_y}
wy_min = min{wl_y, wr_y}
M′ = M(wx_min - a : wx_max + b)×(wy_min - c : wy_max + d)
M′_128×128 = Resize(M′)
where M′ is the extracted local image and the constants a, b, c, d are image range correction parameters; M(wx_min - a : wx_max + b)×(wy_min - c : wy_max + d) means that image M is cropped to the pixels whose x coordinate lies in the range wx_min - a : wx_max + b and whose y coordinate lies in the range wy_min - c : wy_max + d; Resize(M′) denotes image reshaping of M′; the reshaped image is denoted M′_128×128, indicating an image resolution of 128 × 128;
these images are then used to produce a labeled local image information dataset:
S′ = {(x′, y′) | x′ ∈ R³, y′ ∈ N}
where x′ is a local image and y′ is the class label corresponding to the local image;
and S15, building a local feature recognition convolutional neural network.
3. The student classroom behavior recognition method based on posture information extraction according to claim 2, wherein the input layer of the head posture recognition neural network in step S13 has five neuron groups, each containing two neurons, X and Y; the hidden layer contains five neurons and receives the outputs of the five input-layer neuron groups; the output layer contains a neuron that receives the outputs of the five hidden-layer neurons and outputs the recognition result.
4. The student classroom behavior recognition method based on posture information extraction as recited in claim 2, wherein the local image extraction method of step S14 is an image cropping method, wherein the cropping coordinates are referenced by left and right wrist coordinates and include four parameters a, b, c, and d.
5. The student classroom behavior recognition method based on posture information extraction as recited in claim 2, wherein the number of output classes of the local feature recognition convolutional neural network of step S15 is 3.
CN202010595034.5A 2020-06-28 2020-06-28 Student classroom behavior recognition method based on posture information extraction Active CN111738177B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010595034.5A CN111738177B (en) 2020-06-28 2020-06-28 Student classroom behavior recognition method based on posture information extraction


Publications (2)

Publication Number Publication Date
CN111738177A true CN111738177A (en) 2020-10-02
CN111738177B CN111738177B (en) 2022-08-02

Family

ID=72651268



Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160104385A1 (en) * 2014-10-08 2016-04-14 Maqsood Alam Behavior recognition and analysis device and methods employed thereof
WO2018218286A1 (en) * 2017-05-29 2018-12-06 Saltor Pty Ltd Method and system for abnormality detection
CN109241917A (en) * 2018-09-12 2019-01-18 南京交通职业技术学院 A kind of classroom behavior detection system based on computer vision
CN109344682A (en) * 2018-08-02 2019-02-15 平安科技(深圳)有限公司 Classroom monitoring method, device, computer equipment and storage medium
CN109635725A (en) * 2018-12-11 2019-04-16 深圳先进技术研究院 Detect method, computer storage medium and the computer equipment of student's focus
CN109740446A (en) * 2018-12-14 2019-05-10 深圳壹账通智能科技有限公司 Classroom students ' behavior analysis method and device
CN110287792A (en) * 2019-05-23 2019-09-27 华中师范大学 A kind of classroom Middle school students ' learning state real-time analysis method in nature teaching environment
CN110443226A (en) * 2019-08-16 2019-11-12 重庆大学 A kind of student's method for evaluating state and system based on gesture recognition
CN111291840A (en) * 2020-05-12 2020-06-16 成都派沃智通科技有限公司 Student classroom behavior recognition system, method, medium and terminal device


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Sheng Li et al.: "A Neural Network-Based Teaching Style Analysis Model", 2019 11th International Conference on Intelligent Human-Machine Systems and Cybernetics, 25 August 2019 *
Xu Jiazhen et al.: "Automatic recognition of student classroom behavior based on human skeleton information extraction", Modern Educational Technology, no. 05, 15 May 2020 *
Jiang Qinyi et al.: "Student classroom behavior recognition based on residual networks", Modern Computer, 15 July 2019 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112329634A (en) * 2020-11-05 2021-02-05 华中师范大学 Classroom behavior recognition method and device, electronic equipment and storage medium
CN112329634B (en) * 2020-11-05 2024-04-02 华中师范大学 Classroom behavior identification method and device, electronic equipment and storage medium
CN113139530A (en) * 2021-06-21 2021-07-20 城云科技(中国)有限公司 Method and device for detecting sleep post behavior and electronic equipment thereof
CN113139530B (en) * 2021-06-21 2021-09-03 城云科技(中国)有限公司 Method and device for detecting sleep post behavior and electronic equipment thereof

Also Published As

Publication number Publication date
CN111738177B (en) 2022-08-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant