CN110009539A - Smart profiling system for a student's in-school learning state and method of using it - Google Patents
Smart profiling system for a student's in-school learning state and method of using it - Download PDF / Info
- Publication number
- CN110009539A (publication number); CN201910293925.2A (application number)
- Authority
- CN
- China
- Prior art keywords
- module
- student
- behavior
- learning state
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/20—Education
- G06Q50/205—Education administration or guidance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
Abstract
The present invention relates to a smart profiling system for a student's in-school learning state and a method of using it. The technical problem solved is the high computational complexity of behavioral analysis. The system comprises an image data acquisition module M1, a face recognition and localization module M2, a behavior analysis module M3 that uses a generalized comparison method, a learning state analysis module M6, and a filing processing module M7. The system further comprises a scene behavior library module M10 and a student archive module M8, and the student archive module M8 is also connected to a data aggregation module M9. The filing processing module M7 packages and processes the information from the learning state analysis module M6 and pushes it to the student archive module M8. This technical solution effectively resolves the stated problem and is applicable to smart-education informatization.
Description
Technical field
The present invention relates to the fields of smart-education informatization and intelligent image processing, and in particular to a smart profiling system for a student's in-school learning state and a method of using it.
Background technique
In the field of smart teaching, a student's in-school learning state is one of the key elements of concern. In the prior art, research on students' in-school learning state can recognize only a limited set of actions through image recognition technology, such as reading, writing, and looking at the blackboard. To a certain extent this supports research on the automated analysis of learning state, but many actions go unrecognized, systematic study of both the commonality and the individuality of students' learning states is lacking, and analysis error is excessive, so the applicability of the existing technology is low.
Existing smart profiling systems for students' in-school learning state, and their usage methods, cannot balance school-wide commonality against individual differences; their analysis efficiency is low and their recognition error is large. The present invention provides a smart profiling system for a student's in-school learning state and a usage method that solve these technical problems.
Summary of the invention
The technical problem to be solved by the present invention is the high computational complexity of behavioral analysis in the prior art. A new smart profiling system for a student's in-school learning state is provided; it reduces the computational complexity of behavioral analysis and is suitable for real-time analysis of the learning state of student groups.
In order to solve the above technical problems, the adopted technical solution is as follows:
A smart profiling system for a student's in-school learning state, comprising an image data acquisition module M1, a face recognition and localization module M2, a behavior analysis module M3 using the generalized comparison method, a learning state analysis module M6, and a filing processing module M7;
the system further comprises a scene behavior library module M10 and a student archive module M8, and the student archive module M8 is also connected to a data aggregation module M9;
the filing processing module M7 packages and processes the information from the learning state analysis module M6 and pushes it to the student archive module M8;
the data aggregation module M9 interfaces with the teaching administration system to realize the convergence and application of data;
the scene behavior library module M10 contains a basic behavior-recognition data set.
Working principle of the invention: in order to balance school-wide commonality and individual differences, improve analysis efficiency, and reduce recognition error, the invention adds a generalized comparison method, which greatly reduces the computational complexity of behavioral analysis and is especially suitable for real-time analysis of the learning state of student groups. It realizes smart filing of students' in-school learning state and intelligent discovery of individual students' outlier learning states and their causes, which can then be used to guide teaching scientifically and improve teaching quality. On the one hand it greatly reduces the computational complexity of behavioral analysis; on the other hand its outlier analysis provides a direct basis for assessing students' in-class behavior.
As a further optimization of the above scheme, the behavior analysis module M3 is connected to a difficult-sample utilization module M4, and the difficult-sample utilization module is connected to an abnormal scene processing module M5; the scene behavior library module M10 also receives the new scene behavior data set passed in by the abnormal scene processing module M5.
The present invention also provides a method of using the smart profiling system for a student's in-school learning state. The method is based on the aforementioned system and comprises:
Step 1: the image data acquisition module M1 acquires image data, including the facial image information of students and teachers, used as the key identification attribute of the archives, and the real-time image information of students and teachers, used as the input information for state analysis;
Step 2: the face recognition and localization module M2 receives the real-time image information of students and teachers passed in by the image data acquisition module M1, and recognizes and localizes the face information in the images;
Step 3: after the behavior analysis module M3 receives the real-time image information and personnel information transmitted by the face recognition and localization module M2, it performs behavioral analysis with the generalized comparison method, based on the scene behavior library module M10;
Step 4: the learning state analysis module M6 performs outlier behavior screening, outlier behavior type evaluation, analysis of the teacher's in-class behavior, and cause analysis of outlier behaviors, and generates the group learning state at a given moment;
Step 5: the information from the learning state analysis module M6 is packaged, processed, and pushed to the student archive module M8, which stores personal information, examination results, and historical learning state information;
Step 6: the data aggregation module M9 exchanges data with external systems.
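The data flow of Steps 1-6 can be sketched as a minimal pipeline. The function names and record shapes below are illustrative assumptions, not part of the patent; each function stands in for one of the modules M1-M9:

```python
# Minimal sketch of the Step 1-6 data flow. The functions stand in for
# modules M1-M8; their names and payloads are illustrative assumptions.

def acquire_image():                      # M1, Step 1: real-time image
    return {"frame": "raw-pixels", "timestamp": 0}

def recognize_faces(image):               # M2, Step 2: identify + localize
    return [{"student_id": "S001", "bbox": (10, 10, 60, 60)}]

def analyze_behavior(image, faces):       # M3, Step 3: per-member behavior
    return [{"student_id": f["student_id"], "behavior": "reading"} for f in faces]

def evaluate_state(behaviors):            # M6, Step 4: outlier screening
    outliers = [b for b in behaviors if b["behavior"] not in ("reading", "writing")]
    return {"behaviors": behaviors, "outliers": outliers}

def package_and_store(state, archive):    # M7 + M8, Step 5: file the result
    archive.append(state)
    return state

archive = []                              # stands in for the student archive M8
img = acquire_image()
faces = recognize_faces(img)
state = package_and_store(evaluate_state(analyze_behavior(img, faces)), archive)
print(len(archive), len(state["outliers"]))   # 1 0
```

A real deployment would replace each stub with the corresponding module's implementation; only the direction of data flow is taken from the patent.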
Further, step 2 comprises:
S201: for the registered facial images of students and teachers, perform face recognition to obtain the features of each facial image; the recognized feature result is denoted Rc and stored in the student archive module M8 for personnel archive registration;
S202: for the collected real-time image information of students and teachers, perform face recognition and localization, recognizing and localizing all face information in the image field of view; the feature results of recognition and localization are denoted Rlc. After obtaining the feature results, the face recognition and localization module M2 calls the face registration information in the student archive module M8 and, by comparison, obtains the information of all personnel in the field of view;
S203: the face recognition and localization module M2 outputs the real-time image information of students and teachers, together with the corresponding personnel information, to the behavior analysis module M3.
Further, step 3 comprises:
Step S301, fast member grouping: the behavior analysis module M3 first reads from the student archive module M8 the school grades of the students recognized in the real-time image. If academic records exist in the student archive module M8, the members are divided into n groups by grade gradient according to their score ranking; if no academic records exist in the student archive module M8, all members in the video image are defined as one group;
Step S3012, decomposing the image by member: taking the grouping result of step S301 as the unit, the members of each group are identified from the image with the face recognition and localization method, and the image is decomposed by member region with a fast edge detection method, generating the decomposed images used for behavioral analysis;
Step S302: a deep convolutional neural network behavior analyzer is constructed, and typical behavior analysis is carried out on the s classes.
Further, the fast edge detection method comprises:
Step 30121: using the member region localized by face recognition as the target area, perform fast edge detection;
Step 30122: repeat the above detection in a loop, with the number of iterations defined as the number of members in the image, decomposing the image into independent decomposed images that serve as the input to the behavior classifier;
Step 3013: the decomposed images of step S3012 are input to the convolutional neural network behavior classifier and classified by behavior into s classes.
Further, the construction of the convolutional neural network behavior classifier comprises the following steps:
Step 30131: construct the data set of the convolutional neural network behavior classifier, with the large public behavior-recognition data sets as the basic data set and the user scene behavior data set generated by the abnormal scene processing module M5 as the extended data set; the basic data set and the extended data set together constitute the data set of the convolutional neural network behavior classifier, which is divided into a training set, a test set, and a validation set;
Step 30132: construct a convolutional neural network suitable for behavior classification. Defining the number of members in a group as m, the behaviors of the members are divided into s classes, s ≤ m; one behavior is selected at random from each of the s classes as the typical behavior, and behavioral analysis is done on it.
Further, step S302 comprises:
Step 3021: establish the sample data set of the deep convolutional neural network behavior analyzer, with the large behavior-recognition data sets as the basic data set and the user scene behavior data set generated by the abnormal scene processing module M5 as the extended data set; the basic data set and the extended data set together constitute the sample data set of the deep convolutional neural network behavior analyzer, which is divided into a training set, a test set, and a validation set. The deep convolutional neural network is trained and tested until a deep convolutional neural network behavior analyzer meeting a pre-defined threshold requirement is obtained;
Step 3022: the decomposed images are input into the deep convolutional neural network behavior analyzer to obtain the behavioral analysis result; the result either states which specific class the target behavior belongs to, or states that the target behavior cannot be parsed;
Step 3023: for a target behavior that cannot be parsed, the corresponding original target-behavior picture is input to the difficult-sample sampling module M4.
Further, for a target behavior that the behavior analysis module M3 cannot parse, the difficult-sample sampling module M4 intercepts, in its time domain, the successive-frame pictures from n1 seconds before to n1 seconds after, and then pushes the successive-frame pictures to the abnormal scene processing module M5;
the abnormal scene processing module M5 pushes the successive-frame image information to the user with an event-triggering method; the user performs manual identification, and after the identification result is entered, the successive-frame pictures and the identification result are stored in the scene behavior library module M10, so as to extend the basic behavior-recognition database and drive the update iteration of the network models of the convolutional neural network behavior classifier and the deep convolutional neural network behavior analyzer in the behavior analysis module M3.
Beneficial effects of the invention: in order to balance school-wide commonality and individual differences, improve analysis efficiency, and reduce recognition error, the invention adds a generalized comparison method, which greatly reduces the computational complexity of behavioral analysis and is especially suitable for real-time analysis of the learning state of student groups. It realizes smart filing of students' in-school learning state and intelligent discovery of individual students' outlier learning states and their causes, which can then be used to guide teaching scientifically and improve teaching quality. The abnormal scene processing module is added, solving the prior-art problem that a model cannot automatically iterate the recognition of unfamiliar scene actions in its application scenarios.
Brief description of the drawings
The present invention will be further explained below with reference to the attached drawings and embodiments.
Fig. 1: block diagram of the smart profiling system and method for a student's in-school learning state.
Fig. 2: flow chart of group behavior analysis.
Fig. 3: flow chart of decomposing the image by member.
Fig. 4: flow chart of learning state analysis.
Specific embodiment
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further elaborated below with reference to the embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention and are not intended to limit it.
Embodiment 1
This embodiment provides a smart profiling system for a student's in-school learning state. As shown in Fig. 1, the system comprises an image data acquisition module M1, a face recognition and localization module M2, a behavior analysis module M3 using the generalized comparison method, a learning state analysis module M6, and a filing processing module M7;
the system further comprises a scene behavior library module M10 and a student archive module M8, and the student archive module M8 is also connected to a data aggregation module M9;
the filing processing module M7 packages and processes the information from the learning state analysis module M6 and pushes it to the student archive module M8;
the data aggregation module M9 interfaces with the teaching administration system to realize the convergence and application of data;
the scene behavior library module M10 contains a basic behavior-recognition data set.
In order to balance school-wide commonality and individual differences, improve analysis efficiency, and reduce recognition error, this embodiment also adds the generalized comparison method, which greatly reduces the computational complexity of behavioral analysis and is especially suitable for real-time analysis of the learning state of student groups. It realizes smart filing of students' in-school learning state and intelligent discovery of individual students' outlier learning states and their causes, used to guide teaching scientifically and improve teaching quality. On the one hand it greatly reduces the computational complexity of behavioral analysis; on the other hand its outlier analysis provides a direct basis for assessing students' in-class behavior.
Specifically, the behavior analysis module M3 is connected to a difficult-sample utilization module M4, which in turn is connected to an abnormal scene processing module M5; the scene behavior library module M10 also receives the new scene behavior data set passed in by the abnormal scene processing module M5. The abnormal scene processing module solves the prior-art problem that a model cannot automatically iterate the recognition of unfamiliar scene actions in its application scenarios.
The facial image information of students and teachers can be acquired by camera, or uploaded as pictures through the management software.
This embodiment also provides a method of using the smart profiling system for a student's in-school learning state, the method comprising:
Step 1: the image data acquisition module M1 acquires image data, including the facial image information of students and teachers, used as the key identification attribute of the archives, and the real-time image information of students and teachers, used as the input information for state analysis;
Step 2: the face recognition and localization module M2 receives the real-time image information of students and teachers passed in by the image data acquisition module M1, and recognizes and localizes the face information in the images;
Step 3: after the behavior analysis module M3 receives the real-time image information and personnel information transmitted by the face recognition and localization module M2, it performs behavioral analysis with the generalized comparison method, based on the scene behavior library module M10;
Step 4: the learning state analysis module M6 performs outlier behavior screening, outlier behavior type evaluation, analysis of the teacher's in-class behavior, and cause analysis of outlier behaviors, and generates the group learning state at a given moment;
Step 5: the information from the learning state analysis module M6 is packaged, processed, and pushed to the student archive module M8, which stores personal information, examination results, and historical learning state information;
Step 6: the data aggregation module M9 exchanges data with external systems.
The facial image information collected by the image data acquisition module M1, together with the real-time image information of students and teachers, is input to the face recognition and localization module M2.
The face recognition and localization module M2 receives the real-time image information of students and teachers transmitted by the image data acquisition module M1, and first recognizes and localizes the face information in the images. The specific steps of face recognition and localization are not repeated in this embodiment; those skilled in the art can refer to the prior art.
Specifically, the process of recognizing and localizing the face information in the images comprises:
S201: for the registered facial images of students and teachers, perform face recognition to obtain the features of each facial image; the recognized feature result is denoted Rc and stored in the student archive module M8 for personnel archive registration;
S202: for the collected real-time image information of students and teachers, perform face recognition and localization, recognizing and localizing all face information in the image field of view; the feature results of recognition and localization are denoted Rlc. After obtaining the feature results, the face recognition and localization module M2 also calls the face registration information in the student archive module M8 and, by comparison, obtains the information of all personnel in the field of view;
S203: after step S202, the face recognition and localization module M2 outputs the real-time image information of students and teachers, together with the corresponding personnel information, to the behavior analysis module M3.
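The patent does not specify how the live feature results Rlc are compared against the registered features Rc in S202. A common choice for face-feature matching is cosine similarity against the registry with a decision threshold; the sketch below assumes that metric, and all names, vectors, and the threshold value are illustrative:

```python
import math

# Sketch of the S202 comparison of a live feature vector (Rlc) against the
# registered feature vectors (Rc) stored in the student archive M8.
# Cosine similarity with a threshold is assumed; the patent leaves the
# metric unspecified.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(rlc, registry, threshold=0.8):
    """Return the best-matching registered person id, or None if no match."""
    best_id, best_sim = None, threshold
    for person_id, rc in registry.items():
        sim = cosine(rlc, rc)
        if sim >= best_sim:
            best_id, best_sim = person_id, sim
    return best_id

# Toy 3-dimensional "features"; real embeddings are much longer.
registry = {"S001": [1.0, 0.0, 0.2], "S002": [0.0, 1.0, 0.1]}
print(identify([0.9, 0.1, 0.2], registry))  # S001
```

Running identification per detected face yields the "information of all personnel in the field of view" that S203 forwards to the behavior analysis module M3.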
After the behavior analysis module M3 receives the real-time image information and personnel information transmitted by the face recognition and localization module M2, it performs behavioral analysis.
In order to solve the recognition efficiency and defect-rate problems of crowd behavior analysis, improve the analysis rate, and reduce recognition error, this embodiment uses the generalized comparison method.
As shown in Fig. 2, the processing steps of crowd behavior analysis are as follows:
S3011, fast member grouping:
Step 3011A: the behavior analysis module M3 first reads from the student archive module M8 the school grades of the students recognized in the real-time image. If academic records exist in the system, the students are divided into n groups by grade gradient, denoted group 1 ... group n. In this embodiment, the grouping rule is: the top 5% by score ranking form group 1, the next 10% form group 2, and the rest form group 3.
Step 3011B: if no academic records exist in the system, all members in the video image are treated as one large group.
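The grouping rule of Step 3011A can be sketched directly; the function name, record shape, and tie-breaking behavior below are illustrative assumptions, while the 5% / 10% / remainder cutoffs come from this embodiment:

```python
# Sketch of the Step 3011A grouping rule: rank students by score,
# top 5% -> group 1, next 10% -> group 2, the rest -> group 3.
# Function name and record shape are illustrative assumptions.

def group_by_score(scores):
    """scores: dict of student_id -> score. Returns student_id -> group number."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    cut1 = max(1, round(n * 0.05))          # top 5%
    cut2 = max(cut1 + 1, round(n * 0.15))   # next 10% (cumulative 15%)
    groups = {}
    for i, sid in enumerate(ranked):
        groups[sid] = 1 if i < cut1 else (2 if i < cut2 else 3)
    return groups

scores = {f"S{i:03d}": 100 - i for i in range(40)}  # 40 students, descending scores
g = group_by_score(scores)
print(g["S000"], g["S003"], g["S039"])  # 1 2 3
```

Step 3011B is the degenerate case: with no scores available, every detected member simply receives the same group label.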
Step 3012: referring to Fig. 3, the process of decomposing the image by member is as follows:
Taking the grouping of step S3011 as the unit, the members of each group are first identified from the image with existing face recognition and localization technology; each recognized member provides the target basis for fast edge detection. The image is then decomposed by member region with the fast edge detection method, generating the decomposed images used for behavioral analysis. The steps are as follows:
Step 30121: using the member region localized by face recognition as the target area, perform fast edge detection.
The Canny edge detection method is used in this embodiment, processed as follows:
(1) Along the rows and columns of the image coordinates, construct smoothing and denoising filters respectively.
The common two-dimensional Gaussian filter is used, expressed as G(x, y) = (1 / (2πσ²)) exp(−(x² + y²) / (2σ²)). Its gradient is ∇G = (∂G/∂x, ∂G/∂y); because the Gaussian is separable, the two-dimensional filter decomposes into two one-dimensional smoothing and denoising filters, one along the rows and one along the columns of the image coordinates.
(2) For the digital image, the direction and amplitude of the image gradient are computed with the finite-difference approximation of the first-order partial derivatives over a neighborhood. For an image M(x, y), where (x, y) indexes the pixel value, the partial derivatives with respect to x and y are obtained as Mx = ∂M/∂x and My = ∂M/∂y. The amplitude of the gradient is therefore |∇M| = √(Mx² + My²), and the direction of the gradient is θ = arctan(My / Mx).
(3) Non-maximum suppression is applied to the image to obtain a more accurate edge region.
The amplitude of every pixel in the two-dimensional image is computed in turn and compared with the two neighboring amplitude values along the gradient direction. A point whose amplitude is greater than both neighboring values is kept as an edge point, while a point greater than only one of the two values is rejected. This suppresses spurious local maxima that would interfere with the edge, and a more accurate edge region is finally obtained.
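Steps (2) and (3) can be illustrated on a tiny synthetic image. A real system would call an optimized implementation (e.g. OpenCV's Canny); this pure-Python toy, whose image values and function names are assumptions, only shows the finite-difference amplitude and the neighbor comparison of non-maximum suppression:

```python
import math

# 4 identical rows with a ramp edge between columns 1 and 3.
img = [[0, 0, 3, 9, 9, 9] for _ in range(4)]

def gradient(img, x, y):
    """Central-difference partial derivatives Mx, My at interior pixel (x, y)."""
    mx = (img[y][x + 1] - img[y][x - 1]) / 2.0
    my = (img[y + 1][x] - img[y - 1][x]) / 2.0
    return mx, my

def amplitude(img, x, y):
    mx, my = gradient(img, x, y)
    return math.hypot(mx, my)            # |grad M| = sqrt(Mx^2 + My^2)

def is_edge(img, x, y):
    """Non-maximum suppression along x (the gradient direction here):
    keep a pixel only if its amplitude exceeds both x-neighbors."""
    a = amplitude(img, x, y)
    return a > amplitude(img, x - 1, y) and a > amplitude(img, x + 1, y)

print(is_edge(img, 2, 1), is_edge(img, 3, 1))  # True False
```

The steepest point of the ramp (x = 2) survives suppression while its weaker neighbor does not, which is exactly the "more accurate edge region" step (3) describes.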
Step 30122: the above detection is repeated in a loop, with the number of iterations equal to the number of members in the image. The edge regions of all members are obtained and decomposed into independent decomposed images, which serve as the input to the behavior classifier.
Step 3013: the decomposed images of step 3012 are input to the convolutional neural network behavior classifier and classified by behavior.
The construction of the convolutional neural network behavior classifier comprises the following steps:
Step 30131: construct the data set of the convolutional neural network behavior classifier. The major behavior-recognition data sets, including HMDB51, UCF101, and the like, serve as the basic data set, and during the routine use of the system, the user scene behavior data set generated by the abnormal scene processing module M5 serves as the extended data set. Together they constitute the data set of the convolutional neural network behavior classifier, which is divided into a training set, a test set, and a validation set;
Step 30132: construct a convolutional neural network suitable for behavior classification.
In this embodiment, the construction of the convolutional neural network is not specifically limited; those skilled in the art can refer to the prior art and directly use a suitable convolutional neural network, such as the ResNet32 network or the LeNet-5 network, training and testing it until a convolutional neural network behavior classifier meeting a pre-defined threshold requirement is obtained.
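The data-set assembly of step 30131 is a merge followed by a three-way split. The sketch below uses a 70/15/15 ratio as an assumption (the patent only requires training, test, and validation sets), and the item labels are placeholders:

```python
import random

# Sketch of step 30131: merge a basic data set (e.g. HMDB51/UCF101 clips)
# with the extended user-scene data set produced by module M5, then split
# into training / validation / test sets. The 70/15/15 ratio is assumed.

def build_splits(basic, extended, seed=0):
    data = list(basic) + list(extended)
    random.Random(seed).shuffle(data)    # mix public and M5-generated samples
    n = len(data)
    n_train, n_val = int(n * 0.70), int(n * 0.15)
    return (data[:n_train],                      # training set
            data[n_train:n_train + n_val],       # validation set
            data[n_train + n_val:])              # test set

basic = [("clip%03d" % i, "public") for i in range(90)]
extended = [("scene%02d" % i, "M5") for i in range(10)]
train, val, test = build_splits(basic, extended)
print(len(train), len(val), len(test))  # 70 15 15
```

As the abnormal scene processing module M5 keeps contributing new scene behaviors, the extended set grows and the splits are rebuilt, which is what drives the update iteration described later.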
Assuming the number of members in group 1 is m, the behaviors of the members are divided into s classes, s ≤ m. One behavior is selected at random from each of the s classes as the typical behavior, and behavioral analysis is done on it; the behavioral analysis of a group of m members is thus simplified to the analysis of its s behavior classes. On the one hand this greatly reduces the computational complexity of behavioral analysis; on the other hand it again provides a direct basis for the outlier analysis of students' in-class behavior.
S302. typical behaviour is analyzed: for the s class behavior in step 30132, carrying out typical behaviour analysis, processing step
It is rapid as follows:
Step 3021: build the deep convolutional neural network behavior analyzer. The present embodiment does not specifically limit the construction of the deep convolutional neural network; those skilled in the art may refer to the prior art, for example a ResNet-32 network;
Step 3022: establish the sample data set of the deep convolutional neural network behavior analyzer. First, major activity-recognition data sets, including HMDB51, UCF101, etc., serve as the basic data set; during routine use of the system, the user-scene behavior data set generated by the abnormal scene processing module M5 serves as the extended data set. Together they constitute the sample data set of the deep convolutional neural network behavior analyzer, which is divided into a training set, a test set, and a validation set. The deep convolutional neural network is trained and tested until a deep convolutional neural network behavior analyzer meeting the pre-defined threshold requirement is obtained;
Step 3023: as described in step 3012, the decomposed images are input to the deep convolutional neural network behavior analyzer to obtain the behavioral analysis result. The result is of two kinds. Result 1: the specific behavior class to which the target behavior belongs; Result 2: the target behavior cannot be resolved;
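The patent does not state how "cannot be resolved" is decided; a common convention, assumed here purely for illustration, is a confidence threshold on the analyzer's softmax output:

```python
def interpret_analysis(scores, threshold=0.5):
    """scores: behavior class -> softmax probability from the analyzer.
    Returns ('class', name) when the top score clears the threshold
    (result 1), else ('unresolved', None) (result 2), which would route
    the original image to the hard sample sampling module M4."""
    best = max(scores, key=scores.get)
    if scores[best] >= threshold:
        return ("class", best)
    return ("unresolved", None)
```

The 0.5 threshold is an assumption; the patent only requires a pre-defined threshold criterion.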
Step 3024: for a target behavior that cannot be resolved, the original image of the behavior is input to the hard sample sampling module M4.
Hard sample sampling module M4: for a target behavior that the behavioral analysis module M3 cannot resolve, the system intercepts, within its time domain, the related pictures from n1 seconds before to n1 seconds after; the present embodiment uses the consecutive frames from 3 seconds before to 3 seconds after. The consecutive frames are then pushed to the abnormal scene processing module M5.
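A minimal sketch of module M4's interception window, assuming frames are held in an indexed list at a fixed frame rate; the function name and the clipping at clip boundaries are assumptions:

```python
def intercept_frames(frames, fps, event_index, n1=3):
    """frames: chronological list of frames; event_index: index of the frame
    holding the unresolvable behavior.  Returns the consecutive frames from
    n1 seconds before to n1 seconds after the event, clipped to the clip
    boundaries, for pushing to module M5."""
    span = n1 * fps
    start = max(0, event_index - span)
    end = min(len(frames), event_index + span + 1)
    return frames[start:end]

# 100 frames at 10 fps; event at frame 50 -> frames 20..80 inclusive.
clip = intercept_frames(list(range(100)), fps=10, event_index=50, n1=3)
```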
Abnormal scene processing module M5: after receiving the consecutive frames that cannot be resolved, the module pushes the image information to the user interface by an event-triggering method. The unresolvable consecutive frames are judged manually, and after the result is entered, the system deposits the difficult scene pictures and the result into the scene behavior library module M10, thereby expanding the basic activity-recognition database and realizing the update iteration of the network models of the convolutional neural network behavior classifier and the deep convolutional neural network behavior analyzer in the behavioral analysis module M3.
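The deposit of human-judged hard samples into module M10 can be sketched as a simple label-keyed store; the dictionary data structure and function name are assumptions for illustration:

```python
def archive_judged_sample(library, frames, human_label):
    """When module M5 receives frames the analyzer could not resolve, a
    human judges them; the frames plus the entered label are deposited in
    the scene behavior library (module M10) as new training data for the
    next classifier/analyzer iteration."""
    library.setdefault(human_label, []).append(frames)
    return library

# One judged clip of two frames labeled with a hypothetical behavior name.
lib = archive_judged_sample({}, ["f1", "f2"], "stretch")
```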
Learning state analysis module M6: as shown in Fig. 4, the processing steps of the learning state analysis module include the following:
S601. Outlier behavior screening: after the typical behavior analysis of step S302 in the behavioral analysis module M3, the group behavioral analysis result is obtained. The result is presented as data; in the present embodiment it is expressed with a scatter plot.
The screening method for outlier behavior is:
(1) when the behavioral analysis module M3 classifies the group behavior with the convolutional neural network behavior classifier, the individuals of the two behavior classes with the fewest members are extracted;
(2) after the deep convolutional neural network behavior analyzer performs behavioral analysis on each typical behavior, the specific behavior of the outlier classes is obtained according to the result.
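Step (1) above, extracting the individuals of the two least-populated behavior classes, can be sketched as follows; the dictionary representation and all names are illustrative assumptions:

```python
from collections import Counter

def screen_outliers(member_behaviors, k=2):
    """member_behaviors: member id -> behavior class.  Returns the members
    belonging to the k behavior classes with the fewest individuals, which
    the embodiment treats as the outlier candidates."""
    counts = Counter(member_behaviors.values())
    rare = [cls for cls, _ in sorted(counts.items(), key=lambda kv: kv[1])[:k]]
    return {m: b for m, b in member_behaviors.items() if b in rare}

# "listen" has 3 members, "talk" 2, "sleep" 1 -> "sleep" and "talk" are rare.
outliers = screen_outliers({"a": "listen", "b": "listen", "c": "listen",
                            "d": "sleep", "e": "talk", "f": "talk"})
```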
S602. Outlier behavior type evaluation: in-class behaviors are divided into three types, positive, neutral, and negative, and the behaviors in the scene behavior library module M10 are labeled accordingly for evaluating students' in-class behavior. For example: paying attention in class is positive, talking is neutral, and sleeping is negative.
Using the outlier behavior types in the scene behavior library module, a type evaluation is carried out on the specific outlier behaviors obtained from the analysis.
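A minimal sketch of the S602 type evaluation, assuming module M10's labels can be represented as a plain lookup table (the table entries below are the patent's own examples; the fallback to neutral for unlabeled behaviors is an assumption):

```python
# Hypothetical label table; the actual scene behavior library M10 entries
# are defined by the system operator.
BEHAVIOR_TYPES = {"pay attention": "positive", "talk": "neutral", "sleep": "negative"}

def evaluate_outliers(outlier_behaviors, type_table=BEHAVIOR_TYPES):
    """Map each screened outlier behavior to positive/neutral/negative
    using the labels stored in the scene behavior library module M10."""
    return {member: type_table.get(behavior, "neutral")
            for member, behavior in outlier_behaviors.items()}

ratings = evaluate_outliers({"d": "sleep", "e": "talk"})
```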
S603. Teacher in-class behavior analysis: the real-time video images of the teacher are sampled and input to the behavioral analysis module M3 to obtain the teacher's in-class behavioral analysis result. The teacher's behaviors include writing on the blackboard, lecturing, reading, presenting multimedia, etc. In the study of students' learning state analysis, the teacher's in-class behavior is also an important reference.
In the present embodiment, a student's sleeping behavior is evaluated as negative only when the teacher's lecturing behavior is recognized; otherwise it is evaluated as neutral.
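The teacher-context rule of this embodiment (sleeping is negative only during recognized lecturing) can be expressed as a small conditional; the static-table fallback for other behaviors and all names are assumptions:

```python
def rate_behavior(student_behavior, teacher_behavior, type_table):
    """Context-dependent rating from the embodiment: sleeping is negative
    only while the teacher's recognized behavior is lecturing, otherwise
    neutral; other behaviors fall back to the static labels held in the
    scene behavior library module M10 (type_table)."""
    if student_behavior == "sleep":
        return "negative" if teacher_behavior == "lecturing" else "neutral"
    return type_table.get(student_behavior, "neutral")
```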
S604. Outlier behavior cause analysis: with reference to the teacher's in-class behavior and the outlier behavior screening result, the cause of a student's outlier behavior can be determined. For example: a student sleeps while the teacher is lecturing.
S605. Generate the group learning state at a given moment: student status information includes the behavior record along the time trajectory, the outlier behavior record, etc.
The archiving processing module M7 packages and processes the information of the learning state analysis module M6 and pushes it to the student profile module M8.
The student profile module M8 stores the students' personal information, examination results, historical learning state information, etc.
The data aggregation module M9 docks with the teaching administration system through data flows and other means, realizing the convergence and application of data.
Scene behavior library module M10: each activity-recognition basic data set is stored in module M10; combined with the new scene behavior data sets transmitted by the abnormal scene processing module M5, they constitute the scene behavior library suited to this scene. The library is used to train and iterate the network models of the convolutional neural network behavior classifier needed for fast behavior classification and the deep convolutional neural network behavior analyzer needed for real-time behavioral analysis, so that the models adapt to the application scenario and their adaptability and precision improve continuously with use.
The present embodiment completes the student in-school learning state smart profile system and method of the present invention. The system can realize intelligent monitoring of students' learning state, reduce the manpower cost of teaching management, and discover students' outlier behaviors in study and their causes, facilitating timely feedback and guidance for teaching; it has significant effect and meaning for improving teaching quality.
Although illustrative specific embodiments of the present invention are described above so that those skilled in the art can understand the present invention, the present invention is not limited to the scope of the specific embodiments. To those of ordinary skill in the art, as long as various changes fall within the spirit and scope of the present invention as defined and determined by the appended claims, all innovations and creations using the concept of the present invention are within the scope of protection.
Claims (9)
1. A student in-school learning state smart profile system, characterized in that: the student in-school learning state smart profile system comprises an image data acquisition module M1, a face recognition and positioning module M2, a behavioral analysis module M3 using an extensive control method, a learning state analysis module M6, and an archiving processing module M7;
The student in-school learning state smart profile system further comprises a scene behavior library module M10 and a student profile module M8; the student profile module M8 is also connected with a data aggregation module M9;
The archiving processing module M7 packages the information of the learning state analysis module M6 and pushes it to the student profile module M8;
The data aggregation module M9 docks with the teaching administration system, realizing the convergence and application of data;
The scene behavior library module M10 includes an activity-recognition basic data set.
2. The student in-school learning state smart profile system according to claim 1, characterized in that: the behavioral analysis module M3 is connected with a hard sample sampling module M4, and the hard sample sampling module is connected with an abnormal scene processing module M5; the scene behavior library module M10 further includes the new scene behavior data set transmitted by the abnormal scene processing module M5.
3. A method of using a student in-school learning state smart profile system, characterized in that: the method is based on the student in-school learning state smart profile system according to claim 1 or 2, and comprises:
Step 1: the image data acquisition module M1 acquires image data, including the facial image information of students and teachers, used as the key identification attribute of the archives, and the real-time image information of students and teachers, used as the input information for state analysis;
Step 2: the face recognition and positioning module M2 receives the real-time image information of students and teachers transmitted by the image data acquisition module M1, and identifies and positions the face information in the images;
Step 3: after receiving the real-time image information and personnel information transmitted by the face recognition and positioning module M2, the behavioral analysis module M3 performs behavioral analysis with the extensive control method according to the scene behavior library module M10;
Step 4: the learning state analysis module M6 performs outlier behavior screening, outlier behavior type evaluation, teacher in-class behavioral analysis, and outlier behavior cause analysis, and generates the group learning state at a given moment;
Step 5: the information of the learning state analysis module M6 is packaged and processed, and pushed to the student profile module M8; the student profile module M8 stores personal information, examination results, and historical learning state information;
Step 6: the data aggregation module M9 exchanges data with external systems.
4. The method according to claim 3, characterized in that step 2 comprises:
S201: face recognition is performed on the registered facial image information of students and teachers to obtain the features of the facial images; the recognized feature results are denoted Rc and stored in the student profile module M8 for personnel archive registration;
S202: face recognition and positioning are performed on the collected real-time image information of students and teachers; all face information in the field of view is identified and positioned, and the feature results of identification and positioning are denoted Rlc; after the feature results are obtained, the face recognition and positioning module M2 calls the face registration information in the student profile module M8 and obtains all personnel information in the field of view by comparison;
S203: the face recognition and positioning module M2 outputs the real-time image information of students and teachers and the corresponding personnel information to the behavioral analysis module M3.
5. The method according to claim 3, characterized in that step 3 comprises:
Step S301: fast member grouping; the behavioral analysis module M3 first reads from the student profile module M8 the academic records of the students recognized in the real-time image; if academic records exist, the members are divided into n groups by score gradient; if no academic records exist in the student profile module M8, all members in the video image are defined as one group;
Step 3012: image decomposition by member; taking the grouping result of step S301 as the unit, the members of each group are identified from the image by the face recognition and positioning method, and the image is decomposed by member region through a fast edge detection method, generating the decomposed images for behavioral analysis;
Step S302: a deep convolutional neural network behavior analyzer is constructed, and typical behavior analysis is carried out on the s classes.
6. The method according to claim 5, characterized in that the fast edge detection method comprises:
Step 30121: the member region located by face recognition is taken as the target area for fast edge detection;
Step 30122: the above detection is carried out in a loop, the number of cycles being defined as the number of members in the image; the decomposition yields independent decomposed images as the input of the behavior classifier;
Step 3013: the decomposed images from step 3012 are input to the convolutional neural network behavior classifier for behavior classification into s classes.
7. The method according to claim 6, characterized in that the construction of the convolutional neural network behavior classifier comprises the following steps:
Step 30131: building the data set of the convolutional neural network behavior classifier, with major activity-recognition data sets as the basic data set and the user-scene behavior data set generated by the abnormal scene processing module M5 as the extended data set; the basic data set and the extended data set together constitute the data set of the convolutional neural network behavior classifier, which is divided into a training set, a test set, and a validation set;
Step 30132: building a convolutional neural network suitable for behavior classification, with the number of members in one group defined as m and the members' behaviors divided into s classes, s ≤ m; one behavior is selected at random from each of the s classes as a typical case for behavioral analysis.
8. The method according to claim 7, characterized in that step S302 comprises:
Step 3021: the sample data set of the deep convolutional neural network behavior analyzer is established, with major activity-recognition data sets as the basic data set and the user-scene behavior data set generated by the abnormal scene processing module M5 as the extended data set; the basic data set and the extended data set together constitute the sample data set of the deep convolutional neural network behavior analyzer, which is divided into a training set, a test set, and a validation set; the deep convolutional neural network is trained and tested until a deep convolutional neural network behavior analyzer meeting the pre-defined threshold requirement is obtained;
Step 3022: the decomposed images are input to the deep convolutional neural network behavior analyzer to obtain the behavioral analysis result, which is either the specific behavior class to which the target behavior belongs or an indication that the target behavior cannot be resolved;
Step 3023: for a target behavior whose behavioral analysis result is that it cannot be resolved, the corresponding original image of the behavior is input to the hard sample sampling module M4.
9. The method according to claim 8, characterized in that: for a target behavior that the behavioral analysis module M3 cannot resolve, the hard sample sampling module M4 intercepts, within its time domain, the consecutive frames from n1 seconds before to n1 seconds after, and then pushes the consecutive frames to the abnormal scene processing module M5;
the abnormal scene processing module M5 pushes the consecutive frame image information to the user by an event-triggering method; the user performs manual recognition, and after the judgment result is entered, the consecutive frames and the judgment result are deposited into the scene behavior library module M10, thereby expanding the basic activity-recognition database and realizing the update iteration of the network models of the convolutional neural network behavior classifier and the deep convolutional neural network behavior analyzer in the behavioral analysis module M3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910293925.2A CN110009539A (en) | 2019-04-12 | 2019-04-12 | A kind of student is in school learning state smart profile system and application method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110009539A true CN110009539A (en) | 2019-07-12 |
Family
ID=67171519
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910293925.2A Pending CN110009539A (en) | 2019-04-12 | 2019-04-12 | A kind of student is in school learning state smart profile system and application method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110009539A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105678284A (en) * | 2016-02-18 | 2016-06-15 | 浙江博天科技有限公司 | Fixed-position human behavior analysis method |
CN108073888A (en) * | 2017-08-07 | 2018-05-25 | 中国科学院深圳先进技术研究院 | A kind of teaching auxiliary and the teaching auxiliary system using this method |
CN207424928U (en) * | 2017-09-29 | 2018-05-29 | 烟台工程职业技术学院 | A kind of intelligence compares financial audit system |
US20180211117A1 (en) * | 2016-12-20 | 2018-07-26 | Jayant Ratti | On-demand artificial intelligence and roadway stewardship system |
CN109034020A (en) * | 2018-07-12 | 2018-12-18 | 重庆邮电大学 | A kind of community's Risk Monitoring and prevention method based on Internet of Things and deep learning |
CN109241946A (en) * | 2018-10-11 | 2019-01-18 | 平安科技(深圳)有限公司 | Abnormal behaviour monitoring method, device, computer equipment and storage medium |
CN109284737A (en) * | 2018-10-22 | 2019-01-29 | 广东精标科技股份有限公司 | A kind of students ' behavior analysis and identifying system for wisdom classroom |
CN109522793A (en) * | 2018-10-10 | 2019-03-26 | 华南理工大学 | More people's unusual checkings and recognition methods based on machine vision |
CN109543627A (en) * | 2018-11-27 | 2019-03-29 | 西安电子科技大学 | A kind of method, apparatus and computer equipment judging driving behavior classification |
Non-Patent Citations (1)
Title |
---|
Luan Xidao et al.: "Multimedia Intelligence Processing Technology", National Defense Industry Press, pages: 68 - 69 *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110716920A (en) * | 2019-09-27 | 2020-01-21 | 成都驰通数码系统有限公司 | Student behavior automatic analysis method and system based on face recognition |
CN113256453A (en) * | 2020-02-07 | 2021-08-13 | 顾得科技教育股份有限公司 | Learning state improvement management system |
CN111091484A (en) * | 2020-03-19 | 2020-05-01 | 浙江正元智慧科技股份有限公司 | Student learning behavior analysis system based on big data |
CN112732770A (en) * | 2021-02-05 | 2021-04-30 | 嘉兴南洋职业技术学院 | Educational administration management system and method based on artificial intelligence |
CN115909152A (en) * | 2022-11-16 | 2023-04-04 | 北京师范大学 | Teaching scene intelligent analysis system and method based on group behaviors |
CN115909152B (en) * | 2022-11-16 | 2023-08-29 | 北京师范大学 | Intelligent teaching scene analysis system based on group behaviors |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110009539A (en) | A kind of student is in school learning state smart profile system and application method | |
CN110991381B (en) | Real-time classroom student status analysis and indication reminding system and method based on behavior and voice intelligent recognition | |
WO2019028592A1 (en) | Teaching assistance method and teaching assistance system using said method | |
Collins et al. | Introduction: Ten points about mixed methods research to be considered by the novice researcher | |
CN109165552A (en) | A kind of gesture recognition method based on human body key point, system and memory | |
CN108073888A (en) | A kind of teaching auxiliary and the teaching auxiliary system using this method | |
CN111027865B (en) | Teaching analysis and quality assessment system and method based on behavior and expression recognition | |
CN109344682A (en) | Classroom monitoring method, device, computer equipment and storage medium | |
CN108596046A (en) | A kind of cell detection method of counting and system based on deep learning | |
CN109034036A (en) | A kind of video analysis method, Method of Teaching Quality Evaluation and system, computer readable storage medium | |
CN108154304A (en) | There is the server of Teaching Quality Assessment | |
CN111046819A (en) | Behavior recognition processing method and device | |
CN109431523A (en) | Autism primary screening apparatus based on asocial's sonic stimulation behavior normal form | |
CN108182649A (en) | For the intelligent robot of Teaching Quality Assessment | |
CN110097283A (en) | Education Administration Information System and method based on recognition of face | |
CN109472464A (en) | A kind of appraisal procedure of the online course quality based on eye movement tracking | |
CN111178263B (en) | Real-time expression analysis method and device | |
Kumar et al. | Automated Attendance System Based on Face Recognition Using Opencv | |
Munshi et al. | Modeling the relationships between basic and achievement emotions in computer-based learning environments | |
CN113255572B (en) | Classroom attention assessment method and system | |
CN113434229A (en) | Education and teaching cloud desktop intelligent analysis management method and system and computer storage medium | |
CN111199378B (en) | Student management method, device, electronic equipment and storage medium | |
CN111444877B (en) | Classroom people number identification method based on video photos | |
CN115829234A (en) | Automatic supervision system based on classroom detection and working method thereof | |
WO2023284067A1 (en) | Facial nerve function evaluation method and apparatus, and computer device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190712 |