CN114612977A - Big data based acquisition and analysis method - Google Patents


Info

Publication number
CN114612977A
CN114612977A
Authority
CN
China
Prior art keywords
module
student
class
computer
standard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210228837.6A
Other languages
Chinese (zh)
Inventor
吴嘉明
谢中淮
郭卫星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Weike Suyuan New Energy Technology Co ltd
Original Assignee
Suzhou Weike Suyuan New Energy Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Weike Suyuan New Energy Technology Co ltd filed Critical Suzhou Weike Suyuan New Energy Technology Co ltd
Priority to CN202210228837.6A
Publication of CN114612977A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • G06Q50/205Education administration or guidance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume

Abstract

The invention discloses a big data-based acquisition and analysis method comprising a big data-based acquisition and analysis system. The system comprises a data acquisition module, a computer-aided analysis module and a result display module, the computer-aided analysis module being electrically connected with the result display module. The data acquisition module collects the class-state information of students sitting in front of a computer screen during class, the computer-aided analysis module analyzes the students' expressions and actions from the collected data, and the result display module displays the analysis results. The data acquisition module comprises a thermal imaging module, an image acquisition module and a positioning module: the thermal imaging module thermally images students in class, the image acquisition module collects image information of the students, and the positioning module locates the eyebrow region in the collected images. The method has the characteristics of real-time feedback and timely reminding.

Description

Big data based acquisition and analysis method
Technical Field
The invention relates to the technical field of big data, in particular to a big data-based acquisition and analysis method.
Background
In order to protect the health and safety of teachers and students, online teaching has gradually become an increasingly popular teaching mode.
In online teaching, the teacher publishes the teaching content through a computer and gives a live online class at an appointed time, while students listen to the course online through a computer, mobile phone or other mobile device. Compared with traditional teaching, online teaching is not restricted by geography or space, reduces contact between classmates, and effectively prevents the spread of viruses.
However, online teaching also has disadvantages. For teachers, limited by the computer screen and display area, it is often impossible to observe students' expressions and actions in class in time, so teaching quality cannot be fed back promptly. Students with poor study habits, with no teacher present to supervise them, may become lax; dozing in class with no one to remind the student, or even absenteeism, easily occurs, reducing teaching quality. It is therefore necessary to design a big data-based acquisition and analysis method with real-time feedback and timely reminding.
Disclosure of Invention
The invention aims to provide a big data-based acquisition and analysis method to solve the problems in the background technology.
In order to solve the technical problems, the invention provides the following technical scheme: a big data-based acquisition and analysis system comprising a data acquisition module, a computer-aided analysis module and a result display module, wherein the data acquisition module is electrically connected with the computer-aided analysis module and the computer-aided analysis module is electrically connected with the result display module; the data acquisition module is used for collecting the class-state information of students in front of a computer screen during class, the computer-aided analysis module is used for analyzing the students' expressions and actions from the collected data, and the result display module is used for displaying the analysis results.
According to the technical scheme, the data acquisition module comprises a thermal imaging module, an image acquisition module and a positioning module, the thermal imaging module is electrically connected with the image acquisition module, the image acquisition module is electrically connected with the positioning module, the thermal imaging module is used for thermally imaging students in class, the image acquisition module is used for acquiring image information of the students, and the positioning module is used for positioning an eyebrow area on an acquired image.
According to the technical scheme, the computer-aided analysis module comprises an expression analysis module and an action analysis module, the expression analysis module is electrically connected with the action analysis module, the expression analysis module is used for analyzing expressions of students according to positions of eyebrows, and the action analysis module is used for analyzing real-time actions of the students.
According to the technical scheme, the result display module comprises a state classification module, a state judgment module and a feedback module, the state classification module is electrically connected with the state judgment module, the state judgment module is electrically connected with the feedback module, the state classification module is used for determining state types according to different characteristics of students, the state judgment module is used for judging the real-time state of the students, and the feedback module is used for feeding the state of the students back to a teacher.
According to the technical scheme, the big data-based acquisition and analysis method comprises the following steps:
step S1: the student sits in front of the computer, and once class begins the data acquisition module located at the computer camera starts working;
step S2: the data acquisition module begins to collect image information of the student attending class in front of the computer;
step S3: the computer-aided analysis module analyzes the class-attending state of the student and analyzes the student's current class expression in real time;
step S4: the result display module judges the student's class state from the computer-aided analysis result and feeds the information back to the teacher giving the lesson through the feedback module.
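The four-step flow above can be sketched as a small pipeline. This is a minimal illustration only: the class, function and field names are my own assumptions (they do not appear in the patent), and the acquisition step is a stub returning fixed values rather than reading a real camera or thermal sensor.

```python
# Minimal sketch of the S1-S4 pipeline described above. All names are
# illustrative assumptions; acquire() is a stand-in for the real sensors.
from dataclasses import dataclass


@dataclass
class Observation:
    posture_overlap: float   # area of real-time posture inside the standard region (m^2)
    standard_area: float     # area of the established standard posture region (m^2)
    eyebrow_offset: float    # displacement of an eyebrow point from its baseline


def acquire() -> Observation:
    # Step S2: stand-in for the thermal-imaging / image-acquisition modules.
    return Observation(posture_overlap=1.8, standard_area=2.0, eyebrow_offset=0.0)


def analyze(obs: Observation) -> dict:
    # Step S3: compute the sitting-posture standard degree Q = s / S.
    return {"Q": obs.posture_overlap / obs.standard_area,
            "confusion": obs.eyebrow_offset}


def display(result: dict) -> str:
    # Step S4: judge the state and produce a message for the teacher.
    return "ok" if result["Q"] >= 0.75 else "check student"


print(display(analyze(acquire())))  # ok
```

With the stubbed values (1.8 m² of a 2 m² standard region), Q = 0.9 clears the 75% threshold stated later in the description, so no alert is raised.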
According to the above technical solution, the step S2 further includes the following steps:
step S21: once the student starts class, the thermal imaging module arranged at the computer camera detects infrared energy and converts it into an electrical signal for thermal imaging, obtaining a thermal image of the student in class;
step S22: meanwhile, the image acquisition module captures an image of the position in front of the computer and sends the collected picture information, as an electrical signal, to the positioning module;
step S23: the positioning module positions an eyebrow area in the picture according to the collected picture information.
According to the above technical solution, the step S3 further includes the following steps:
step S31: the action analysis module establishes a standard lecture-listening posture region according to the student's sitting position in front of the computer, determines the optimal distance between the student's body and the screen, and gives a prompt when the distance is too close or too far;
step S32: a real-time class-attending posture contour region of the student is fitted from the thermal image and compared with the established standard class-attending posture region;
step S33: the standard degree of the student's current sitting posture is judged from the degree of coincidence between the real-time posture contour region and the standard posture region;
step S34: when class begins, the positioning module locates the student's eyebrow picture information in the image captured by the image acquisition module, and the expression analysis module establishes a plane rectangular coordinate system with the nose tip at that moment as the origin and the horizontal direction as the x axis, recording the multi-point coordinates of the eyebrows at that moment as the standard eyebrow positions;
step S35: as the lesson proceeds, the expression analysis module updates the coordinates of the current eyebrow positions every few seconds;
step S36: the student's expression is judged from the change in distance between points symmetric about the coordinate axis; when the distance between the symmetric eyebrow points is smaller than at the standard position, the student is frowning, and the confusion value of the expression is calculated.
According to the above technical solution, in the step S33, the calculation formula of the standard degree of sitting posture is:
Q = (λ × s / S) × 100%
wherein s is the area of the student's real-time class-attending posture region that falls within the standard region, S is the area of the established standard class-attending posture region, λ is the standard-degree conversion coefficient, and Q is the sitting-posture standard value. Q is proportional to the real-time area: the larger the real-time area, the more correct the sitting posture and the more the student is concentrating on the lecture; the smaller the real-time area, the more the sitting posture has changed, indicating a possibility of dozing in class or even absence.
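The sitting-posture calculation can be written directly from these definitions. The function name is my own; the formula Q = λ·s/S follows from the variable descriptions here and reproduces the Q values given in all three embodiments below.

```python
def sitting_posture_standard(s: float, S: float, lam: float = 1.0) -> float:
    """Sitting-posture standard degree Q = lam * s / S, as a fraction.

    s   -- area of the real-time posture region inside the standard region
    S   -- area of the established standard lecture-listening posture region
    lam -- standard-degree conversion coefficient (1 in the embodiments)
    """
    if S <= 0:
        raise ValueError("standard region area S must be positive")
    return lam * s / S


# Embodiment 1: S = 2 m^2, s = 1.8 m^2, lam = 1  ->  Q = 90%
print(f"{sitting_posture_standard(1.8, 2.0):.0%}")  # 90%
```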
In step S36, the formula for calculating the confusion value of the student in class is:
Y = μ√((x_i − x_0)² + (y_i − y_0)²)
wherein (x_i, y_i) is the real-time coordinate of a point on the student's eyebrow, (x_0, y_0) is the coordinate of that eyebrow point in the standard state recorded when the student first enters class, and μ is a conversion coefficient. The larger the distance between a real-time eyebrow point and its standard-state position, the deeper the student's confusion; when Y = 0, the student's expression is unchanged and the probability of confusion is low.
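A sketch of the confusion value under these definitions. Note the caveats: the original formula appears only as a figure reference, so the closed form here (μ-scaled Euclidean distance between one real-time eyebrow point and its baseline) is a reconstruction from the surrounding text, and the worked embodiments report larger Y values than a single-point distance yields, so the patent's formula may aggregate several eyebrow points.

```python
import math


def confusion_value(xi: float, yi: float,
                    x0: float, y0: float, mu: float = 1.0) -> float:
    """Confusion value Y, reconstructed (assumption) as the mu-scaled
    Euclidean distance between a real-time eyebrow point (xi, yi) and its
    baseline position (x0, y0) recorded at the start of class.
    Y == 0 means the expression is unchanged."""
    return mu * math.hypot(xi - x0, yi - y0)


print(confusion_value(-1.0, 0.5, -1.0, 0.5))  # 0.0 -> expression unchanged
```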
According to the above technical solution, in step S4, the operating method of state classification includes:
when the action analysis module measures in real time that the student's standard degree Q is ≥ 75%, the student's sitting posture is relatively standard and the probability of dozing or leaving class is low; when it measures Q < 75%, the student's sitting posture is askew and the student may be dozing or even absent; when the student's sitting posture is standard but a large confusion value is measured, the current lesson progress point and the student's name are recorded.
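The 75% threshold rule above reduces to a one-line classifier. The function name and state labels are illustrative; only the threshold comes from the description.

```python
def classify_state(Q: float, threshold: float = 0.75) -> str:
    """Map the sitting-posture standard degree Q to a coarse class state.

    The 75% threshold is taken from the description; the label strings
    are my own shorthand for the two cases it distinguishes.
    """
    return "posture standard" if Q >= threshold else "possibly dozing or absent"


print(classify_state(0.90))  # posture standard
print(classify_state(0.50))  # possibly dozing or absent
```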
According to the above technical solution, in step S4, the feedback module operates in the following manner:
when the student's sitting posture is relatively standard, no feedback to the teacher is needed; when the sitting posture is relatively askew, the student's name and sitting-posture standard degree are sent to the lecturing teacher's terminal, and on receiving the information the teacher opens the corresponding student's camera for secondary confirmation; the students' confusion values are monitored throughout, and after class a two-dimensional plot of time points against confusion values is generated for the lecturing teacher to summarize the teaching situation.
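The post-class summary step can be sketched as a filter over (time, Y) samples that surfaces the lesson points where confusion peaked. The function name, the peak threshold and the sample data are all illustrative assumptions, not values from the patent.

```python
def confusion_peaks(samples: list[tuple[float, float]],
                    threshold: float = 2.0) -> list[float]:
    """Return the lesson timestamps (minutes) whose confusion value Y
    exceeds threshold -- the points a teacher would review after class."""
    return [t for t, y in samples if y > threshold]


# Hypothetical per-lesson log of (minute, confusion value Y) samples.
log = [(5.0, 0.4), (12.0, 2.6), (27.0, 1.1), (40.0, 3.6)]
print(confusion_peaks(log))  # [12.0, 40.0]
```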
Compared with the prior art, the invention has the following beneficial effects:
(1) the thermal imaging module fits a body-contour image of the student in front of the computer, making it convenient for the system to judge the student's posture;
(2) the positioning module locates the eyebrow positions in the collected student images, making it convenient to detect the students' expressions;
(3) the computer-aided analysis module uses the strong computing power of the computer to judge students' expressions and postures accurately, which helps restrain students attending class and cultivate good study habits;
(4) the result display module displays the student state identified by the system, making reminders convenient;
(5) the feedback module feeds the students' class-state information back to the lecturing teacher, making it convenient for the teacher to manage class discipline and restrain student behavior;
(6) through the feedback module, the teacher can retrieve the class log after class and screen out the points where students were frequently confused, making it convenient for the lecturing teacher to summarize the lesson and prepare for the next one.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of the system module composition of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the present invention provides a technical solution: a big data-based collection and analysis method comprising a big data-based collection and analysis system. The system comprises a data collection module, a computer-aided analysis module and a result display module; the data collection module is electrically connected with the computer-aided analysis module, and the computer-aided analysis module is electrically connected with the result display module. The data collection module is used for collecting the class-state information of students in front of a computer screen during class, the computer-aided analysis module is used for analyzing the students' expressions and actions from the collected data, and the result display module is used for displaying the analysis results.
The data acquisition module comprises a thermal imaging module, an image acquisition module and a positioning module, the thermal imaging module is electrically connected with the image acquisition module, the image acquisition module is electrically connected with the positioning module, the thermal imaging module is used for thermally imaging students in class, the image acquisition module is used for acquiring image information of the students, and the positioning module is used for positioning an eyebrow area on an acquired image.
The computer-aided analysis module comprises an expression analysis module and an action analysis module, the expression analysis module is electrically connected with the action analysis module, the expression analysis module is used for analyzing the expressions of students according to the positions of eyebrows, and the action analysis module is used for analyzing the real-time actions of the students.
The result display module comprises a state classification module, a state judgment module and a feedback module, wherein the state classification module is electrically connected with the state judgment module, the state judgment module is electrically connected with the feedback module, the state classification module is used for determining the state types according to different characteristics of students, the state judgment module is used for judging the real-time states of the students, and the feedback module is used for feeding the states of the students back to teachers.
The big data-based acquisition and analysis method comprises the following steps:
step S1: the student sits in front of the computer, and once class begins the data acquisition module located at the computer camera starts working;
step S2: the data acquisition module begins to collect image information of the student attending class in front of the computer;
step S3: the computer-aided analysis module analyzes the class-attending state of the student and analyzes the student's current class expression in real time;
step S4: the result display module judges the student's class state from the computer-aided analysis result and feeds the information back to the teacher giving the lesson through the feedback module.
Step S2 further includes the steps of:
step S21: once the student starts class, the thermal imaging module arranged at the computer camera detects infrared energy and converts it into an electrical signal for thermal imaging, obtaining a thermal image of the student in class;
step S22: meanwhile, the image acquisition module captures an image of the position in front of the computer and sends the collected picture information, as an electrical signal, to the positioning module;
step S23: the positioning module positions an eyebrow area in the picture according to the collected picture information.
Step S3 further includes the steps of:
step S31: the action analysis module establishes a standard lecture-listening posture region according to the student's sitting position in front of the computer, determines the optimal distance between the student's body and the screen, and gives a prompt when the distance is too close or too far;
step S32: a real-time class-attending posture contour region of the student is fitted from the thermal image and compared with the established standard class-attending posture region;
step S33: the standard degree of the student's current sitting posture is judged from the degree of coincidence between the real-time posture contour region and the standard posture region;
step S34: when class begins, the positioning module locates the student's eyebrow picture information in the image captured by the image acquisition module, and the expression analysis module establishes a plane rectangular coordinate system with the nose tip at that moment as the origin and the horizontal direction as the x axis, recording the multi-point coordinates of the eyebrows at that moment as the standard eyebrow positions;
step S35: as the lesson proceeds, the expression analysis module updates the coordinates of the current eyebrow positions every few seconds;
step S36: the student's expression is judged from the change in distance between points symmetric about the coordinate axis; when the distance between the symmetric eyebrow points is smaller than at the standard position, the student is frowning, and the confusion value of the expression is calculated.
In step S33, the calculation formula of the sitting posture standard degree is:
Q = (λ × s / S) × 100%
wherein s is the area of the student's real-time class-attending posture region that falls within the standard region, S is the area of the established standard class-attending posture region, λ is the standard-degree conversion coefficient, and Q is the sitting-posture standard value. Q is proportional to the real-time area: the larger the real-time area, the more correct the sitting posture and the more the student is concentrating on the lecture; the smaller the real-time area, the more the sitting posture has changed, indicating a possibility of dozing in class or even absence.
In step S36, the formula for calculating the confusion value of the student in class is:
Y = μ√((x_i − x_0)² + (y_i − y_0)²)
wherein (x_i, y_i) is the real-time coordinate of a point on the student's eyebrow, (x_0, y_0) is the coordinate of that eyebrow point in the standard state recorded when the student first enters class, and μ is a conversion coefficient. The larger the distance between a real-time eyebrow point and its standard-state position, the deeper the student's confusion; when Y = 0, the student's expression is unchanged and the probability of confusion is low.
In step S4, the operating method of state classification is:
when the action analysis module measures in real time that the student's standard degree Q is ≥ 75%, the student's sitting posture is relatively standard and the probability of dozing or leaving class is low; when it measures Q < 75%, the student's sitting posture is askew and the student may be dozing or even absent; when the student's sitting posture is standard but a large confusion value is measured, the current lesson progress point and the student's name are recorded.
In step S4, the feedback module operates as follows:
when the student's sitting posture is relatively standard, no feedback to the teacher is needed; when the sitting posture is relatively askew, the student's name and sitting-posture standard degree are sent to the lecturing teacher's terminal, and on receiving the information the teacher opens the corresponding student's camera for secondary confirmation; the students' confusion values are monitored throughout, and after class a two-dimensional plot of time points against confusion values is generated for the lecturing teacher to summarize the teaching situation.
The first embodiment is as follows: when student A attends class, the area of the standard class-attending posture region is S = 2 m², the real-time class-attending posture area is s = 1.8 m², and λ = 1. According to the formula, Q = (1 × 1.8 / 2) × 100% = 90%. The real-time coordinate of a point on the student's eyebrow is (x_i, y_i) = (−2, 1.5), the coordinate of that eyebrow point in the standard state at the start of class is (x_0, y_0) = (−1, 0.5), and μ = 1; according to the confusion formula, Y = 2.6 is obtained, and the information is fed back to the teacher.
Example two: when student B attends class, the area of the standard class-attending posture region is S = 2.5 m², the real-time class-attending posture area is s = 2 m², and λ = 1. According to the formula, Q = (1 × 2 / 2.5) × 100% = 80%. The real-time x coordinate of a point on the student's eyebrow is x_i = −2, the coordinate of that eyebrow point in the standard state at the start of class is (x_0, y_0) = (−1, 0.5), and μ = 1; according to the confusion formula, Y = 3.6 is obtained, and the information is fed back to the teacher.
Example three: when student C attends class, the area of the standard class-attending posture region is S = 2.8 m², the real-time class-attending posture area is s = 1.4 m², and λ = 1. According to the formula, Q = (1 × 1.4 / 2.8) × 100% = 50%. The real-time coordinate of a point on the student's eyebrow is (x_i, y_i) = (0, 1.5), the coordinate of that eyebrow point in the standard state at the start of class is (x_0, y_0) = (1.2, 0.5), and μ = 1; according to the confusion formula, Y = 2.3 is obtained, and the information is fed back to the teacher.
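The sitting-posture figures in the three embodiments can be cross-checked directly (λ = 1 in each case, so Q reduces to s / S). The dictionary below simply restates the embodiment data; the Y values are not recomputed here because the confusion formula itself appears only as a figure in the source.

```python
# Cross-check of the sitting-posture standard degree across the three
# embodiments: Q = s / S with lambda = 1.
cases = {  # student: (s real-time area m^2, S standard area m^2, expected Q)
    "A": (1.8, 2.0, 0.90),
    "B": (2.0, 2.5, 0.80),
    "C": (1.4, 2.8, 0.50),
}
for name, (s, S, expected) in cases.items():
    q = s / S
    assert abs(q - expected) < 1e-9  # matches the embodiment's stated Q
    print(f"student {name}: Q = {q:.0%}")
```

All three stated percentages (90%, 80%, 50%) follow from the area values, which supports the reconstructed form of the Q formula.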
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A big data-based acquisition and analysis method comprising a big data-based acquisition and analysis system, characterized in that: the system comprises a data acquisition module, a computer-aided analysis module and a result display module, wherein the data acquisition module is electrically connected with the computer-aided analysis module and the computer-aided analysis module is electrically connected with the result display module; the data acquisition module is used for collecting the class-state information of students in front of a computer screen during class, the computer-aided analysis module is used for analyzing the students' expressions and actions from the collected data, and the result display module is used for displaying the analysis results.
2. The big data-based collection and analysis method according to claim 1, wherein: the data acquisition module comprises a thermal imaging module, an image acquisition module and a positioning module; the thermal imaging module is electrically connected with the image acquisition module, and the image acquisition module is electrically connected with the positioning module; the thermal imaging module is used for thermally imaging students in class, the image acquisition module is used for collecting image information of the students, and the positioning module is used for locating the eyebrow region in the obtained images.
3. The big data-based collection and analysis method according to claim 2, wherein: the computer-aided analysis module comprises an expression analysis module and an action analysis module, the expression analysis module is electrically connected with the action analysis module, the expression analysis module is used for analyzing expressions of students according to the positions of eyebrows, and the action analysis module is used for analyzing real-time actions of the students.
4. The big data-based collection and analysis method according to claim 3, wherein: the result display module comprises a state classification module, a state judgment module and a feedback module, wherein the state classification module is electrically connected with the state judgment module, the state judgment module is electrically connected with the feedback module, the state classification module is used for determining the state types according to different characteristics of students, the state judgment module is used for judging the real-time state of the students, and the feedback module is used for feeding the state of the students back to a teacher.
5. The big data-based collection and analysis method according to claim 4, wherein: the method, using the big data-based acquisition and analysis system, comprises the following steps:
step S1: the student sits in front of the computer, and once class begins the data acquisition module located at the computer camera starts working;
step S2: the data acquisition module begins to collect image information of the student attending class in front of the computer;
step S3: the computer-aided analysis module analyzes the class-attending state of the student and analyzes the student's current class expression in real time;
step S4: the result display module judges the student's class state from the computer-aided analysis result and feeds the information back to the teacher giving the lesson through the feedback module.
6. The big data-based collection and analysis method according to claim 5, wherein: the step S2 further includes the steps of:
step S21: after the student starts class, the thermal imaging module arranged at the computer camera detects infrared energy and converts it into an electric signal for thermal imaging, obtaining a thermal imaging picture of the student in class;
step S22: meanwhile, the image acquisition module acquires an image of the area in front of the computer and sends the acquired picture information, as an electric signal, to the positioning module;
step S23: the positioning module positions an eyebrow area in the picture according to the collected picture information.
7. The big data-based collection and analysis method according to claim 6, wherein: the step S3 further includes the steps of:
step S31: the action analysis module establishes a standard lecture listening posture area according to the sitting position of a student in front of a computer, determines the optimal distance between the body of the student and a screen, and gives a prompt when the distance is too close or too far;
step S32: fitting a real-time class-attending posture contour region of the student from the thermal imaging picture, and comparing it with the established standard class-attending posture region;
step S33: judging the standard degree of sitting posture of the current student in class through the coincidence degree of the real-time class-attending posture contour region and the standard class-attending posture region;
step S34: when class begins, the positioning module of the expression analysis module locates the student's eyebrow picture information in the picture acquired by the image acquisition module, establishes a plane rectangular coordinate system with the nose tip at that moment as the coordinate origin and the horizontal direction as the x axis, and records the multipoint coordinates of the eyebrows at that moment as the standard positions of the eyebrows;
step S35: as the lesson proceeds, the expression analysis module updates the coordinates of the eyebrows' current positions every few seconds;
step S36: judging the student's expression according to the change in distance between points symmetric about the coordinate axis: when the distance between the student's bilaterally symmetric eyebrow points is smaller than at the standard position, the student is frowning, and the doubt value of the expression is calculated.
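Steps S34–S36 can be illustrated with a minimal sketch, assuming eyebrow points are supplied as left/right pairs in a nose-tip-centered coordinate system; the pair representation and the tolerance parameter are assumptions, not part of the claims.

```python
# Frown detection per S34–S36: a frown is flagged when bilaterally symmetric
# eyebrow points sit closer together than in their recorded standard positions.

def symmetric_distance(left, right):
    """Horizontal distance between a bilaterally symmetric eyebrow point pair
    (coordinates are relative to the nose-tip origin of S34)."""
    return abs(left[0] - right[0])

def is_frowning(standard_pairs, realtime_pairs, tolerance=0.0):
    """True when every symmetric pair is closer together than at the standard
    position (the inward pull of the eyebrows when frowning)."""
    return all(
        symmetric_distance(l, r) + tolerance < symmetric_distance(sl, sr)
        for (sl, sr), (l, r) in zip(standard_pairs, realtime_pairs)
    )
```

For example, a standard pair at x = ±2 that moves to x = ±1 during the lesson would be classified as a frown.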
8. The big data-based collection and analysis method according to claim 7, wherein: in step S33, the calculation formula of the sitting posture standard degree is:
Q = λ · (s / S)
wherein s is the area of the real-time student class-listening posture region that falls within the standard class-listening posture region, S is the area of the established standard class-listening posture region, λ is the standard-degree conversion coefficient, and Q is the sitting posture standard value; Q is in direct proportion to the area of the real-time class-listening posture region within the standard region: the larger that area, the more correct the sitting posture and the more concentrated the student is in class, while the smaller it is, the more the sitting posture has changed, indicating a possibility of dozing in class or even leaving class;
in step S36, the calculation formula of the student's in-class doubt value is:
Y = Σᵢ √((xᵢ − x₀ᵢ)² + (yᵢ − y₀ᵢ)²)
wherein (xᵢ, yᵢ) is the real-time coordinate of the i-th point on the student's eyebrows and (x₀ᵢ, y₀ᵢ) is the coordinate of that point in the eyebrows' standard state; the larger the distance between a real-time eyebrow point and its standard-state position, the deeper the student's doubt, and when Y is 0 the student's expression is unchanged and the probability of doubt is low.
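The two formulas of claim 8, reconstructed here from the textual description (the original formula images are unavailable, so the exact form and symbol names are assumptions), can be computed as:

```python
import math

def sitting_posture_standard(s: float, S: float, lam: float = 1.0) -> float:
    """Sitting-posture standard degree Q = lam * s / S, where s is the area of
    the real-time posture region that falls inside the standard region and S
    is the area of the standard region itself."""
    return lam * s / S

def doubt_value(realtime_pts, standard_pts) -> float:
    """Doubt value Y: summed displacement of eyebrow points from their
    standard-state coordinates. Y == 0 means the expression is unchanged."""
    return sum(
        math.hypot(x - x0, y - y0)
        for (x, y), (x0, y0) in zip(realtime_pts, standard_pts)
    )
```

With λ = 1, a real-time region covering 75 of the standard region's 100 area units gives Q = 0.75, the threshold used in claim 9.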
9. The big data-based collection and analysis method according to claim 8, wherein: in step S4, the operating method of state classification includes:
when the action analysis module detects in real time that the student's standard degree Q is greater than or equal to 75%, the student's sitting posture is relatively standard and the probability of dozing or leaving class is low; when the measured standard degree Q is less than 75%, the sitting posture is askew and the student may be dozing or even absent; when the sitting posture is standard but the measured doubt value is large, the current lesson progress point and the student's name are recorded.
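The classification rule of claim 9 can be sketched as follows, assuming Q is expressed as a fraction and using a hypothetical doubt threshold; the label strings are illustrative.

```python
# State classification per claim 9: Q >= 75% counts as a standard sitting
# posture; below that the student may be dozing or absent; an upright but
# visibly puzzled student is flagged separately.

def classify_state(q: float, doubt: float, doubt_threshold: float = 10.0) -> str:
    if q < 0.75:
        return "askew"        # possible dozing or absence
    if doubt > doubt_threshold:
        return "confused"     # record lesson progress point and student name
    return "attentive"
```

The `doubt_threshold` value is an assumption; the patent only says the doubt value is "larger" without fixing a cutoff.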
10. The big data-based collection and analysis method according to claim 9, wherein: in step S4, the feedback module operates as follows:
when the student's sitting posture is relatively standard, no feedback to the teacher is needed; when the sitting posture is askew, the student's name and sitting-posture standard degree are sent to the lecturing teacher's terminal, and on receiving the information the teacher opens the corresponding student's camera for secondary confirmation; the students' doubt values are monitored throughout the lesson, and after class a two-dimensional graph of time points against doubt values is generated for the lecturing teacher to summarize the teaching situation.
CN202210228837.6A 2022-03-10 2022-03-10 Big data based acquisition and analysis method Pending CN114612977A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210228837.6A CN114612977A (en) 2022-03-10 2022-03-10 Big data based acquisition and analysis method

Publications (1)

Publication Number Publication Date
CN114612977A true CN114612977A (en) 2022-06-10

Family

ID=81861435

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210228837.6A Pending CN114612977A (en) 2022-03-10 2022-03-10 Big data based acquisition and analysis method

Country Status (1)

Country Link
CN (1) CN114612977A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017156835A1 (en) * 2016-03-18 2017-09-21 深圳大学 Smart method and system for body building posture identification, assessment, warning and intensity estimation
CN109165633A (en) * 2018-09-21 2019-01-08 上海健坤教育科技有限公司 A kind of intelligent interactive learning system based on camera perception
WO2020024400A1 (en) * 2018-08-02 2020-02-06 平安科技(深圳)有限公司 Class monitoring method and apparatus, computer device, and storage medium
CN111178263A (en) * 2019-12-30 2020-05-19 湖北美和易思教育科技有限公司 Real-time expression analysis method and device
CN112132095A (en) * 2020-09-30 2020-12-25 Oppo广东移动通信有限公司 Dangerous state identification method and device, electronic equipment and storage medium
CN112487928A (en) * 2020-11-26 2021-03-12 重庆邮电大学 Classroom learning condition real-time monitoring method and system based on feature model
CN113703335A (en) * 2021-10-27 2021-11-26 江苏博子岛智能产业技术研究院有限公司 Intelligent home brain control system based on internet of things and provided with brain-computer interface
CN113947797A (en) * 2021-10-28 2022-01-18 深圳市中悦科技有限公司 State expression analysis method, device and equipment for students in class
CN114098284A (en) * 2021-11-23 2022-03-01 苏州爱果乐智能家居有限公司 Height adjusting method for infrared induction height and learning table

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHENG Lili et al.: "Research on Constructing a Teaching-Behavior Analysis Model for the Flipped Classroom Model", China Educational Technology & Equipment *

Similar Documents

Publication Publication Date Title
CN109284737A (en) A kind of students ' behavior analysis and identifying system for wisdom classroom
CN110197169B (en) Non-contact learning state monitoring system and learning state detection method
CN111652189A (en) Student management system for intelligent teaching
CN109359521A (en) The two-way assessment system of Classroom instruction quality based on deep learning
CN109727167B (en) Teaching auxiliary system
CN112183238B (en) Remote education attention detection method and system
CN106373444B (en) A kind of Multifunctional English classroom with English teaching aid
CN109345156A (en) A kind of Classroom Teaching system based on machine vision
CN111796752A (en) Interactive teaching system based on PC
CN111507592B (en) Evaluation method for active modification behaviors of prisoners
CN112613440A (en) Attitude detection method and apparatus, electronic device and storage medium
CN111626628A (en) Network teaching system for extraclass tutoring
CN206557851U (en) A kind of situation harvester of listening to the teacher of imparting knowledge to students
TW202008293A (en) System and method for monitoring qualities of teaching and learning
WO2023041940A1 (en) Gaze-based behavioural monitoring system
CN111402096A (en) Online teaching quality management method, system, equipment and medium
CN111931608A (en) Operation management method and system based on student posture and student face recognition
CN113034322B (en) Internet-based online education supervision system and method
CN114187640A (en) Learning situation observation method, system, equipment and medium based on online classroom
CN113095259A (en) Remote online course teaching management method
CN114612977A (en) Big data based acquisition and analysis method
CN107958500A (en) A kind of monitoring system for real border real time information sampling of imparting knowledge to students
CN115690867A (en) Classroom concentration detection method, device, equipment and storage medium
JP6905775B1 (en) Programs, information processing equipment and methods
Wang et al. Research and design of an attention monitoring system based on head posture estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220610