CN112132087A - Online learning quality evaluation method and system - Google Patents
- Publication number
- CN112132087A (application CN202011054084.9A)
- Authority
- CN
- China
- Prior art keywords
- eye
- student
- face image
- determining
- online learning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0639—Performance analysis of employees; Performance analysis of enterprise or organisation operations
- G06Q10/06395—Quality analysis or management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/20—Education
- G06Q50/205—Education administration or guidance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
Abstract
The invention provides an online learning quality evaluation method and system. The face of a student is photographed to obtain a corresponding face image, and the student's eye opening-closing state and gaze direction state are determined from the face image so as to determine the student's concentration state during online learning. Through a series of image-processing computations, a quantitative evaluation result regarding the student's learning concentration state is obtained directly from the captured face image, so that the quality state of the student's online learning is judged reliably and quantitatively, which helps improve the efficiency and quality of online learning.
Description
Technical Field
The invention relates to the technical field of intelligent education, in particular to an online learning quality evaluation method and system.
Background
At present, online learning means that students watch courses on the screen of a terminal device and interact with teachers through it. Because it lets students study courses anywhere and at any time, online learning makes course study more convenient and flexible. However, since the whole process takes place in front of a screen, a student cannot be guaranteed to stay focused throughout and is prone to distraction, which affects the efficiency and quality of the student's online learning. The prior art cannot effectively and accurately judge whether a student is in a concentrated or a distracted state, and therefore cannot reliably and quantitatively assess the quality state of the student's online learning, which hinders improving the efficiency and quality of online learning.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides an online learning quality evaluation method and system that photograph a student to obtain a face image, determine the student's eye coordinate information from the face image, determine the student's eye opening-closing state information and on-screen gaze direction information from the eye coordinate information, and determine from these the student's concentration state during the current online learning session. The method and system thus obtain a corresponding face image by photographing the student's face, determine the student's eye opening-closing state and gaze direction state from it so as to establish the student's concentration state during online learning, and obtain a quantitative evaluation result regarding the student's learning concentration state directly from the captured face image through a series of image-processing computations, thereby judging the quality state of the student's online learning reliably and quantitatively and helping improve the efficiency and quality of online learning.
The invention provides an online learning quality evaluation method, which is characterized by comprising the following steps:
step S1, shooting a student to obtain a face image of the student, and determining eye coordinate information of the student according to the face image;
step S2, determining the eye opening and closing state information and the eye watching screen sight direction information of the student according to the eye coordinate information;
step S3, determining the concentration state of the student in the current online learning process according to the eye opening and closing state information and the eye watching screen sight direction information;
further, in step S1, the photographing a student to obtain a face image of the student, and determining the eye coordinate information of the student according to the face image specifically includes:
step S101, adjusting a shooting angle of view and/or a shooting focal length for shooting the student to enable the face area of the student to be imaged in a central area of a current shooting view picture, so as to obtain a face image of the student;
step S102, sequentially carrying out background noise reduction filtering processing, pixel sharpening processing and pixel binarization processing on the face image so as to obtain a preprocessed face image;
step S103, acquiring the preprocessed face image, and determining eye coordinate information of two eyes of the student;
further, in step S2, the determining, according to the eye coordinate information, the eye opening and closing state information and the eye viewing screen sight line direction information of the student specifically includes:
step S201, determining an average value ER of the eye opening and closing amplitude of the student according to the eye coordinate information and the following formula (1):
in the above formula (1), the following six points of the student's i-th eye in the face image are used: the canthus on the side near the ear; the intersection of the upper eyelid with the tangent to the eyeball contour at the contour point nearest the ear; the intersection of the upper eyelid with the tangent to the eyeball contour at the contour point nearest the nose bridge; the canthus on the side near the nose bridge; the intersection of the lower eyelid with the tangent to the eyeball contour at the contour point nearest the ear; and the intersection of the lower eyelid with the tangent to the eyeball contour at the contour point nearest the nose bridge; when i is 1, the 1st eye is the student's left eye, and when i is 2, the 2nd eye is the student's right eye;
step S202, determining an included angle alpha between the sight line of the screen watched by the eyes of the student and a straight line perpendicular to the surface where the screen is located according to the eye coordinate information and the following formula (2):
in the above formula (2), the coordinates of the eyeball center point of the student's i-th eye in the face image are used, (X, Y) denotes the preset origin coordinate value, L denotes the shooting depth of field corresponding to the shooting, and f denotes the shooting focal length corresponding to the shooting; when i is 1, the 1st eye is the student's left eye, and when i is 2, the 2nd eye is the student's right eye;
further, in step S3, determining, according to the eye opening and closing state information and the eye viewing screen sight line direction information, a concentration state of the student in the current online learning process specifically includes:
step S301, determining the current online learning concentration degree evaluation value F of the student according to the eye opening and closing state information, the eye watching screen sight line direction information and the following formula (3):
in the formula (3), ER denotes the average eye opening-closing amplitude of the student during shooting, α denotes the angle between the line of sight of the student viewing the screen and the line perpendicular to the plane of the screen, and ER_max denotes the maximum eye opening-closing amplitude of the student;
step S302, comparing the online learning concentration evaluation value F with a preset concentration threshold, if the online learning concentration evaluation value F is greater than or equal to the preset concentration threshold, determining that the student is currently in a learning concentration state, otherwise, determining that the student is not currently in the learning concentration state.
The invention also provides an online learning quality evaluation system, which is characterized by comprising an image shooting and processing module, an eye coordinate information determining module, an eye state determining module and a learning concentration state determining module; wherein:
the image shooting and processing module is used for shooting students so as to obtain face images of the students and preprocessing the face images;
the eye coordinate information determining module is used for determining eye coordinate information of the student according to the preprocessed face image;
the eye state determining module is used for determining the eye opening and closing state information of the student and the sight line direction information of the student when the student watches the screen according to the eye coordinate information;
the learning concentration state determination module is used for determining the concentration state of the student in the current online learning process according to the eye opening and closing state information and the eye watching screen sight direction information;
further, the image shooting and processing module shoots students to obtain face images of the students, and the preprocessing of the face images specifically comprises:
adjusting a shooting visual angle and/or a shooting focal length for shooting the student so as to enable the face area of the student to be imaged in the central area of the current shooting visual field picture, and thus obtaining the face image of the student;
then, sequentially carrying out background noise reduction filtering processing, pixel sharpening processing and pixel binarization processing on the face image so as to obtain a preprocessed face image;
and,
the eye coordinate information determining module determines the eye coordinate information of the student according to the preprocessed face image, and specifically includes:
determining eye coordinate information of two eyes of the student according to the preprocessed face image;
further, the determining, by the eye state determining module, the eye opening and closing state information of the student and the information of the sight line direction of the screen viewed by the eyes according to the eye coordinate information specifically includes:
determining an eye opening and closing amplitude average value ER of the student according to the eye coordinate information and the following formula (1):
in the above formula (1), the following six points of the student's i-th eye in the face image are used: the canthus on the side near the ear; the intersection of the upper eyelid with the tangent to the eyeball contour at the contour point nearest the ear; the intersection of the upper eyelid with the tangent to the eyeball contour at the contour point nearest the nose bridge; the canthus on the side near the nose bridge; the intersection of the lower eyelid with the tangent to the eyeball contour at the contour point nearest the ear; and the intersection of the lower eyelid with the tangent to the eyeball contour at the contour point nearest the nose bridge; when i is 1, the 1st eye is the student's left eye, and when i is 2, the 2nd eye is the student's right eye;
and,
according to the eye coordinate information and the following formula (2), determining an included angle alpha between the sight line of the screen watched by the eyes of the student and a straight line perpendicular to the surface where the screen is located:
in the above formula (2), the coordinates of the eyeball center point of the student's i-th eye in the face image are used, (X, Y) denotes the preset origin coordinate value, L denotes the shooting depth of field corresponding to the shooting, and f denotes the shooting focal length corresponding to the shooting; when i is 1, the 1st eye is the student's left eye, and when i is 2, the 2nd eye is the student's right eye;
further, the learning concentration state determination module determines the concentration state of the student in the current online learning process according to the eye opening and closing state information and the eye viewing screen sight line direction information, which specifically includes:
determining the current online learning concentration degree evaluation value F of the student according to the eye opening and closing state information, the eye watching screen sight direction information and the following formula (3):
in the formula (3), ER denotes the average eye opening-closing amplitude of the student during shooting, α denotes the angle between the line of sight of the student viewing the screen and the line perpendicular to the plane of the screen, and ER_max denotes the maximum eye opening-closing amplitude of the student;
and comparing the online learning concentration degree evaluation value F with a preset concentration degree threshold, if the online learning concentration degree evaluation value F is larger than or equal to the preset concentration degree threshold, determining that the student is currently in a learning concentration state, otherwise, determining that the student is not currently in the learning concentration state.
Compared with the prior art, the online learning quality evaluation method and system photograph a student to obtain a face image, determine the student's eye coordinate information from the face image, determine the student's eye opening-closing state information and on-screen gaze direction information from the eye coordinate information, and determine from these the student's concentration state during the current online learning session. They can therefore obtain a corresponding face image by photographing the student's face, determine the student's eye opening-closing state and gaze direction state from it so as to establish the student's concentration state during online learning, and obtain a quantitative evaluation result regarding the student's learning concentration state directly from the captured face image through a series of image-processing computations, thereby judging the quality state of the student's online learning reliably and quantitatively and helping improve the efficiency and quality of online learning.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of the online learning quality evaluation method provided by the present invention.
Fig. 2 is a schematic structural diagram of the online learning quality evaluation system provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic flow chart of an online learning quality evaluation method according to an embodiment of the present invention. The online learning quality evaluation method comprises the following steps:
step S1, shooting a student to obtain a face image of the student, and determining the eye coordinate information of the student according to the face image;
step S2, determining the eye opening and closing state information and the eye watching screen sight direction information of the student according to the eye coordinate information;
and step S3, determining the concentration state of the student in the current online learning process according to the eye opening and closing state information and the eye watching screen sight direction information.
The beneficial effects of the above technical scheme are: the online learning quality evaluation method photographs the student's face to obtain a corresponding face image, determines the student's eye opening-closing state and gaze direction state from the face image so as to determine the student's concentration state during online learning, and obtains a quantitative evaluation result regarding the student's learning concentration state directly from the captured face image through a series of image-processing computations, so that the quality state of the student's online learning is judged reliably and quantitatively, which helps improve the efficiency and quality of online learning.
Preferably, in step S1, the photographing a student to obtain a face image of the student, and the determining the eye coordinate information of the student based on the face image specifically includes:
step S101, adjusting a shooting angle of view and/or a shooting focal length for shooting the student to enable the face area of the student to be imaged in the central area of the current shooting view picture, so as to obtain the face image of the student;
step S102, sequentially carrying out background noise reduction filtering processing, pixel sharpening processing and pixel binarization processing on the face image so as to obtain a preprocessed face image;
step S103, acquiring the preprocessed face image, and determining eye coordinate information of two eyes of the student.
The beneficial effects of the above technical scheme are: by adjusting the shooting visual angle and/or the shooting focal length, the face area of the student can be completely imaged in the corresponding shooting visual field, so that the situations of incomplete shooting and distorted shooting are avoided; in addition, the background noise reduction filtering processing, the pixel sharpening processing and the pixel binarization processing are sequentially carried out on the face image, so that the noise component in the image can be effectively reduced, the resolution of the image pixel can be improved, and the calculation accuracy and reliability of the eye coordinate information can be ensured.
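The preprocessing chain of step S102 can be sketched as follows. This is a minimal illustration using plain Python lists of grayscale pixel rows; the 3x3 box blur, the unsharp amount of 1.5, and the threshold of 128 are illustrative assumptions, since the patent does not specify filter kernels or threshold values.

```python
def preprocess(gray, threshold=128):
    """Step S102 chain -- noise-reduction filtering, pixel sharpening,
    pixel binarization -- on a grayscale image given as a list of rows.
    Kernel size, unsharp amount, and threshold are illustrative choices."""
    h, w = len(gray), len(gray[0])

    def px(y, x):  # clamp coordinates at the image borders
        return gray[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    # 1. background noise-reduction filtering: 3x3 box blur
    blur = [[sum(px(y + dy, x + dx)
                 for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
             for x in range(w)] for y in range(h)]
    # 2. pixel sharpening: unsharp mask (add back 1.5x of the removed detail)
    sharp = [[blur[y][x] + 1.5 * (gray[y][x] - blur[y][x])
              for x in range(w)] for y in range(h)]
    # 3. pixel binarization against the threshold
    return [[255 if v >= threshold else 0 for v in row] for row in sharp]
```

In a real implementation these three stages would typically be library calls (e.g. a denoising filter, a sharpening convolution, and Otsu thresholding), but the order of operations is what step S102 fixes.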
Preferably, in step S2, the determining the eye opening and closing state information and the eye viewing screen sight line direction information of the student according to the eye coordinate information specifically includes:
step S201, determining an average value ER of the eye opening and closing amplitude of the student according to the eye coordinate information and the following formula (1):
in the above formula (1), the following six points of the student's i-th eye in the face image are used: the canthus on the side near the ear; the intersection of the upper eyelid with the tangent to the eyeball contour at the contour point nearest the ear; the intersection of the upper eyelid with the tangent to the eyeball contour at the contour point nearest the nose bridge; the canthus on the side near the nose bridge; the intersection of the lower eyelid with the tangent to the eyeball contour at the contour point nearest the ear; and the intersection of the lower eyelid with the tangent to the eyeball contour at the contour point nearest the nose bridge; when i is 1, the 1st eye is the student's left eye, and when i is 2, the 2nd eye is the student's right eye;
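Formula (1) itself appears only as an image in the source, so the combination below is an assumption: it averages, over both eyes, the two vertical eyelid gaps normalized by the canthus-to-canthus width, in the spirit of the widely used eye-aspect-ratio. The six dictionary keys name the six points defined above.

```python
import math

def eye_opening_average(eyes):
    """Average eye opening-closing amplitude ER over both eyes.

    `eyes` is a list of two dicts (i = 1: left eye, i = 2: right eye),
    each mapping the six labelled points to (x, y) tuples:
      'corner_ear'  : canthus on the ear side
      'upper_ear'   : ear-side tangent intersection with the upper eyelid
      'upper_nose'  : nose-side tangent intersection with the upper eyelid
      'corner_nose' : canthus on the nose-bridge side
      'lower_ear'   : ear-side tangent intersection with the lower eyelid
      'lower_nose'  : nose-side tangent intersection with the lower eyelid

    The eye-aspect-ratio-style combination is an assumed stand-in for
    formula (1), which is not reproduced in the text."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    ratios = []
    for e in eyes:
        vertical = (dist(e['upper_ear'], e['lower_ear'])
                    + dist(e['upper_nose'], e['lower_nose']))
        horizontal = dist(e['corner_ear'], e['corner_nose'])
        ratios.append(vertical / (2.0 * horizontal))
    return sum(ratios) / len(ratios)
```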
step S202, determining an included angle alpha between the sight line of the screen watched by the eyes of the student and a straight line perpendicular to the surface where the screen is located according to the eye coordinate information and the following formula (2):
in the above formula (2), the coordinates of the eyeball center point of the student's i-th eye in the face image are used, (X, Y) denotes the preset origin coordinate value, L denotes the shooting depth of field corresponding to the shooting, and f denotes the shooting focal length corresponding to the shooting; when i is 1, the 1st eye is the student's left eye, and when i is 2, the 2nd eye is the student's right eye.
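Formula (2) likewise appears only as an image in the source. A plausible pinhole-camera reading of its inputs can be sketched as follows: the eyeball-center pixel offset from the preset origin is back-projected to world units via the depth of field L over the focal length f, converted to an angle against the screen normal, and averaged over both eyes. The whole construction is an assumption, not the patent's formula.

```python
import math

def gaze_angle(eye_centers, origin, L, f):
    """Angle alpha between the gaze line and the normal to the screen.

    eye_centers: [(x1, y1), (x2, y2)] eyeball centre points in the face
    image; origin: preset origin (X, Y); L: shooting depth of field;
    f: shooting focal length -- the same inputs formula (2) names.
    The pinhole back-projection below is an assumed interpretation."""
    X, Y = origin
    angles = []
    for (x, y) in eye_centers:
        pixel_offset = math.hypot(x - X, y - Y)
        # back-project the pixel offset to the subject plane at depth L
        world_offset = pixel_offset * L / f
        angles.append(math.atan2(world_offset, L))
    return sum(angles) / len(angles)
```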
The beneficial effects of the above technical scheme are: the average eye opening-closing amplitude of the student and the angle between the student's on-screen line of sight and the line perpendicular to the plane of the screen are calculated by formulas (1) and (2) respectively. These two quantities reflect, to the greatest extent, the real-time state of the student's eyes during online learning, and so provide accurate and reliable data support for subsequently determining the student's learning concentration state. The average eye opening-closing amplitude can be obtained by randomly extracting several eye-region images, computing for each the area of the region enclosed by the upper and lower eyelids (this area is the eye opening-closing amplitude for that image), and averaging the amplitudes over all of the randomly extracted images.
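The area-based averaging just described can be sketched with the shoelace formula, assuming the eyelid boundary of each extracted eye image is supplied as an ordered polygon of (x, y) points (upper lid traversed one way, lower lid back the other); treating the boundary as such a polygon is an illustrative assumption.

```python
def eyelid_region_area(contour):
    """Area enclosed by the upper and lower eyelids, given the boundary
    as an ordered list of (x, y) points. Uses the shoelace formula."""
    area2 = 0.0
    n = len(contour)
    for k in range(n):
        x1, y1 = contour[k]
        x2, y2 = contour[(k + 1) % n]
        area2 += x1 * y2 - x2 * y1
    return abs(area2) / 2.0

def average_opening_amplitude(eye_contours):
    """ER as the mean of per-image eyelid areas, matching the
    randomly-extracted-frames averaging described above."""
    areas = [eyelid_region_area(c) for c in eye_contours]
    return sum(areas) / len(areas)
```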
Preferably, in step S3, the determining, according to the eye opening and closing state information and the eye viewing screen sight line direction information, a concentration state of the student currently performing online learning specifically includes:
step S301, determining the current online learning concentration evaluation value F of the student according to the eye opening and closing state information, the eye viewing screen sight line direction information, and the following formula (3):
in the formula (3), ER denotes the average eye opening-closing amplitude of the student during shooting, α denotes the angle between the line of sight of the student viewing the screen and the line perpendicular to the plane of the screen, and ER_max denotes the maximum eye opening-closing amplitude of the student;
step S302, comparing the online learning concentration evaluation value F with a preset concentration threshold, if the online learning concentration evaluation value F is greater than or equal to the preset concentration threshold, determining that the student is currently in the learning concentration state, otherwise, determining that the student is not currently in the learning concentration state.
The beneficial effects of the above technical scheme are: the current online learning concentration evaluation value of the student is obtained through the formula (3), and the eye watching real-time state of the student in the online learning process can be objectively and comprehensively converted into the online learning concentration evaluation value in a quantitative form, so that the quality state of the online learning of the student can be reliably and quantitatively judged, and the efficiency and the quality of the online learning can be improved.
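Formula (3) also appears only as an image in the source; as a stand-in, the sketch below uses one monotone combination consistent with the surrounding text — the eye opening normalized by ER_max, scaled down as the gaze angle α grows — together with the threshold comparison of step S302. Both the combination and the threshold value 0.6 are assumptions, not values from the patent.

```python
import math

def concentration_score(er, er_max, alpha):
    """Online-learning concentration evaluation value F.

    Assumed monotone stand-in for formula (3): F grows with wider-open
    eyes (er -> er_max) and shrinks as the gaze drifts away from the
    screen normal (alpha -> pi/2)."""
    return (er / er_max) * math.cos(alpha)

def is_concentrating(er, er_max, alpha, threshold=0.6):
    """Step S302: compare F with a preset concentration threshold
    (0.6 is an illustrative value)."""
    return concentration_score(er, er_max, alpha) >= threshold
```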
Fig. 2 is a schematic structural diagram of an online learning quality evaluation system according to an embodiment of the present invention. The online learning quality evaluation system comprises an image shooting and processing module, an eye coordinate information determining module, an eye state determining module and a learning concentration state determining module; wherein:
the image shooting and processing module is used for shooting students so as to obtain face images of the students and preprocessing the face images;
the eye coordinate information determining module is used for determining eye coordinate information of the student according to the preprocessed face image;
the eye state determining module is used for determining the eye opening and closing state information and the sight line direction information of the screen watched by the eyes of the student according to the eye coordinate information;
the learning concentration state determination module is used for determining the concentration state of the student in the current online learning process according to the eye opening and closing state information and the eye watching screen sight direction information.
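The four-module structure above can be sketched as a thin pipeline. The class name and the idea of injecting each module as a callable are illustrative assumptions so the sketch stays self-contained:

```python
class OnlineLearningQualityEvaluator:
    """Wires the four modules of Fig. 2 in order: image shooting and
    processing -> eye coordinate information -> eye state -> learning
    concentration state."""

    def __init__(self, capture, locate_eyes, eye_state, judge):
        self.capture = capture          # image shooting and processing module
        self.locate_eyes = locate_eyes  # eye coordinate information determining module
        self.eye_state = eye_state      # eye state determining module
        self.judge = judge              # learning concentration state determining module

    def evaluate(self):
        face_image = self.capture()                 # preprocessed face image
        eye_coords = self.locate_eyes(face_image)   # per-eye coordinate info
        opening, gaze = self.eye_state(eye_coords)  # ER and angle alpha
        return self.judge(opening, gaze)            # concentration state
```

Keeping each module behind a callable mirrors the patent's module boundaries and lets any stage be swapped independently.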
The beneficial effects of the above technical scheme are: the online learning quality evaluation system obtains a face image by shooting the student's face and determines the student's eye opening and closing state and eye sight direction state from that image, thereby determining the student's concentration state during online learning. Through a series of image processing calculations, a quantitative evaluation result about the student's learning concentration state can be obtained directly from the captured face image, so the quality state of the student's online learning can be judged reliably and quantitatively, which is beneficial to improving the efficiency and quality of online learning.
Preferably, the image capturing and processing module captures a student to obtain a face image of the student, and the preprocessing the face image specifically includes:
adjusting a shooting visual angle and/or a shooting focal length for shooting the student so as to enable the face area of the student to be imaged in the central area of the current shooting visual field picture, and thus obtaining the face image of the student;
then, sequentially carrying out background noise reduction filtering processing, pixel sharpening processing and pixel binarization processing on the face image so as to obtain a preprocessed face image;
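A minimal numpy stand-in for the three-stage preprocessing (noise-reduction filtering, sharpening, binarization) follows. The concrete filters — a 3×3 mean filter, an unsharp mask, and a fixed threshold — are assumptions of this sketch, since the patent does not fix particular kernels:

```python
import numpy as np

def preprocess_face(img, thresh=128):
    """Background noise-reduction, pixel sharpening, then pixel
    binarization, applied in sequence as in the patent."""
    img = img.astype(float)
    # 1) noise-reduction: 3x3 mean filter via padded neighborhood average
    p = np.pad(img, 1, mode="edge")
    blur = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    # 2) sharpening: unsharp mask (original plus high-frequency residue)
    sharp = np.clip(img + (img - blur), 0, 255)
    # 3) binarization against a fixed threshold
    return np.where(sharp >= thresh, 255, 0).astype(np.uint8)
```

In practice an adaptive threshold (e.g. Otsu's method) would replace the fixed one, but the stage ordering is what matters here.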
and,
the eye coordinate information determining module determines the eye coordinate information of the student according to the preprocessed face image, and specifically comprises the following steps:
and determining eye coordinate information of two eyes of the student according to the preprocessed face image.
The beneficial effects of the above technical scheme are: by adjusting the shooting visual angle and/or shooting focal length, the student's face area can be imaged completely within the shooting field of view, avoiding incomplete or distorted shots. In addition, sequentially applying background noise-reduction filtering, pixel sharpening and pixel binarization to the face image effectively reduces the noise component, improves the pixel resolution, and ensures the accuracy and reliability of the eye coordinate calculation.
Preferably, the eye state determining module determines the student's eye opening and closing state information and eye-watching-screen sight direction information according to the eye coordinate information, which specifically includes:
determining the average value ER of the eye opening and closing amplitude of the student according to the eye coordinate information and the following formula (1):
in the above-mentioned formula (1), the eye coordinate information of the i-th eye of the student in the face image comprises six coordinate values: the canthus on the side of the eye nearest the ear; the point where the tangent to the eyeball contour at its position nearest the ear intersects the upper eyelid; the point where the tangent to the eyeball contour at its position nearest the nose bridge intersects the upper eyelid; the canthus on the side of the eye nearest the nose bridge; the point where the tangent to the eyeball contour at its position nearest the ear intersects the lower eyelid; and the point where the tangent to the eyeball contour at its position nearest the nose bridge intersects the lower eyelid; when i is 1, the 1st eye represents the student's left eye, and when i is 2, the 2nd eye represents the student's right eye;
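The six landmarks per eye enumerated above match the inputs of the standard eye-aspect-ratio construction, so a plausible stand-in for formula (1) — whose exact expression is not reproduced in the text — is the EAR averaged over both eyes. Treating formula (1) as an EAR is an assumption of this sketch:

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(ear_canthus, up_ear, up_nose, nose_canthus, low_ear, low_nose):
    """Six-landmark opening measure over the same points as formula (1):
    vertical eyelid gaps normalized by the canthus-to-canthus width."""
    vertical = dist(up_ear, low_ear) + dist(up_nose, low_nose)
    horizontal = 2.0 * dist(ear_canthus, nose_canthus)
    return vertical / horizontal

def mean_er(left_pts, right_pts):
    """Average the per-eye measure over the left (i=1) and right (i=2) eyes."""
    return 0.5 * (eye_aspect_ratio(*left_pts) + eye_aspect_ratio(*right_pts))
```

Normalizing by eye width makes the measure insensitive to how large the face appears in the frame, which is why this construction is a common choice for open/closed-eye detection.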
and,
according to the eye coordinate information and the following formula (2), determining an included angle alpha between the sight line of the student watching the screen and a straight line perpendicular to the surface where the screen is located:
in the above-mentioned formula (2), the coordinate values of the eyeball center point of the i-th eye of the student in the face image are used together with (X, Y), which represents the preset origin coordinate values, L, which represents the shooting depth of field corresponding to the shooting, and f, which represents the shooting focal length corresponding to the shooting; when i is 1, the 1st eye represents the student's left eye, and when i is 2, the 2nd eye represents the student's right eye.
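Formula (2) itself is not reproduced in the text, so the angle computation can only be sketched under an assumed pinhole-camera model: the image-plane offset of the mean eyeball center from the preset origin (X, Y) is back-projected using the depth of field L and focal length f. The model and all names here are this sketch's assumptions:

```python
import math

def gaze_angle(eye_centers, origin, L, f):
    """Sketch of the angle alpha between the gaze line and the line
    perpendicular to the screen surface, under a pinhole model."""
    cx = sum(p[0] for p in eye_centers) / len(eye_centers)
    cy = sum(p[1] for p in eye_centers) / len(eye_centers)
    dx, dy = cx - origin[0], cy - origin[1]
    offset_world = math.hypot(dx, dy) * L / f  # image offset -> scene offset
    return math.atan2(offset_world, L)         # alpha in radians
```

An eyeball center sitting exactly on the preset origin yields alpha = 0, i.e. a gaze straight along the screen normal.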
The beneficial effects of the above technical scheme are: formulas (1) and (2) respectively calculate the average value of the student's eye opening and closing amplitude and the included angle between the student's line of sight to the screen and the straight line perpendicular to the surface where the screen is located. This reflects, to the maximum extent, the real-time eye-watching state of the student during online learning, thereby providing accurate and reliable data support for subsequently determining the student's learning concentration state. The average eye opening and closing amplitude can be obtained as follows: randomly extract a plurality of eye part images; for each eye part image, calculate the area of the region enclosed by the upper and lower eyelids, this area being the eye opening and closing amplitude value corresponding to that image; then take the mean of the amplitude values over all the randomly extracted eye part images.
Preferably, the learning concentration state determination module determines the concentration state of the student currently performing online learning according to the eye opening and closing state information and the eye-watching-screen sight direction information, which specifically includes:
determining the current online learning concentration evaluation value F of the student according to the eye opening and closing state information, the eye watching screen sight direction information and the following formula (3):
in the above formula (3), ER represents the average value of the student's eye opening and closing amplitude during the current shooting process, α represents the included angle between the student's line of sight to the screen and the straight line perpendicular to the surface where the screen is located, and ER_max represents the maximum value of the student's eye opening and closing amplitude;
and comparing the online learning concentration evaluation value F with a preset concentration threshold, if the online learning concentration evaluation value F is greater than or equal to the preset concentration threshold, determining that the student is currently in the learning concentration state, otherwise, determining that the student is not currently in the learning concentration state.
The beneficial effects of the above technical scheme are: formula (3) yields the student's current online learning concentration evaluation value, objectively and comprehensively converting the student's real-time eye-watching state during online learning into a quantitative concentration evaluation value, so the quality state of the student's online learning can be judged reliably and quantitatively, which is beneficial to improving the efficiency and quality of online learning. As can be seen from the above embodiments, the online learning quality evaluation method and system obtain the student's face image by photographing the student, determine the student's eye coordinate information from the face image, determine the eye opening and closing state information and the eye-watching-screen sight direction information from the eye coordinate information, and determine the student's concentration state during online learning accordingly. In this way, the method and system obtain a face image by shooting the student's face, determine the eye opening and closing state and eye sight direction state from that image to determine the concentration state during online learning, and can obtain a quantitative evaluation result about the student's learning concentration state directly from the captured face image through a series of image processing calculations, thereby judging the quality state of the student's online learning reliably and quantitatively and helping to improve the efficiency and quality of online learning.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
Claims (8)
1. The online learning quality evaluation method is characterized by comprising the following steps of:
step S1, shooting a student to obtain a face image of the student, and determining eye coordinate information of the student according to the face image;
step S2, determining the eye opening and closing state information and the eye watching screen sight direction information of the student according to the eye coordinate information;
and step S3, determining the concentration state of the student in the current online learning process according to the eye opening and closing state information and the eye watching screen sight direction information.
2. The online learning quality evaluation method according to claim 1, wherein:
in step S1, the step of capturing a picture of a student to obtain a face image of the student, and determining eye coordinate information of the student based on the face image specifically includes:
step S101, adjusting a shooting angle of view and/or a shooting focal length for shooting the student to enable the face area of the student to be imaged in a central area of a current shooting view picture, so as to obtain a face image of the student;
step S102, sequentially carrying out background noise reduction filtering processing, pixel sharpening processing and pixel binarization processing on the face image so as to obtain a preprocessed face image;
step S103, acquiring the preprocessed face image, and determining eye coordinate information of two eyes of the student.
3. The online learning quality evaluation method according to claim 2, wherein:
in step S2, the determining, according to the eye coordinate information, the eye opening and closing state information and the eye viewing screen sight line direction information of the student specifically include:
step S201, determining an average value ER of the eye opening and closing amplitude of the student according to the eye coordinate information and the following formula (1):
in the above-mentioned formula (1), the eye coordinate information of the i-th eye of the student in the face image comprises six coordinate values: the canthus on the ear side of the eye; the point where the tangent to the eyeball contour at its position nearest the ear intersects the upper eyelid; the point where the tangent to the eyeball contour at its position nearest the nose bridge intersects the upper eyelid; the canthus on the side of the eye close to the nose bridge; the point where the tangent to the eyeball contour at its position nearest the ear intersects the lower eyelid; and the point where the tangent to the eyeball contour at its position nearest the nose bridge intersects the lower eyelid; when i is 1, the 1st eye represents the student's left eye, and when i is 2, the 2nd eye represents the student's right eye;
step S202, determining an included angle alpha between the sight line of the screen watched by the eyes of the student and a straight line perpendicular to the surface where the screen is located according to the eye coordinate information and the following formula (2):
in the above-mentioned formula (2), the coordinate values of the eyeball center point of the i-th eye of the student in the face image are used together with (X, Y), which represents the preset origin coordinate values, L, which represents the shooting depth of field corresponding to the shooting, and f, which represents the shooting focal length corresponding to the shooting; when i is 1, the 1st eye represents the student's left eye, and when i is 2, the 2nd eye represents the student's right eye.
4. The online learning quality evaluation method according to claim 3, wherein:
in step S3, determining, according to the eye opening and closing state information and the eye viewing screen sight line direction information, a concentration state of the student in the current online learning process specifically includes:
step S301, determining the current online learning concentration degree evaluation value F of the student according to the eye opening and closing state information, the eye watching screen sight line direction information and the following formula (3):
in the formula (3), ER represents the average value of the student's eye opening and closing amplitude during the shooting process, α represents the included angle between the student's line of sight to the screen and the straight line perpendicular to the surface where the screen is located, and ER_max represents the maximum value of the student's eye opening and closing amplitude;
step S302, comparing the online learning concentration evaluation value F with a preset concentration threshold, if the online learning concentration evaluation value F is greater than or equal to the preset concentration threshold, determining that the student is currently in a learning concentration state, otherwise, determining that the student is not currently in the learning concentration state.
5. The online learning quality evaluation system is characterized by comprising an image shooting and processing module, an eye coordinate information determining module, an eye state determining module and a learning concentration state determining module; wherein:
the image shooting and processing module is used for shooting students so as to obtain face images of the students and preprocessing the face images;
the eye coordinate information determining module is used for determining eye coordinate information of the student according to the preprocessed face image;
the eye state determining module is used for determining the eye opening and closing state information of the student and the sight line direction information of the student when the student watches the screen according to the eye coordinate information;
the learning concentration state determination module is used for determining the concentration state of the student in the current online learning process according to the eye opening and closing state information and the eye watching screen sight direction information.
6. The online learning quality evaluation system according to claim 5, wherein:
the image shooting and processing module shoots students so as to obtain face images of the students, and the preprocessing of the face images specifically comprises the following steps:
adjusting a shooting visual angle and/or a shooting focal length for shooting the student so as to enable the face area of the student to be imaged in the central area of the current shooting visual field picture, and thus obtaining the face image of the student;
then, sequentially carrying out background noise reduction filtering processing, pixel sharpening processing and pixel binarization processing on the face image so as to obtain a preprocessed face image;
and,
the eye coordinate information determining module determines the eye coordinate information of the student according to the preprocessed face image, and specifically includes:
and determining eye coordinate information of two eyes of the student according to the preprocessed face image.
7. The online learning quality evaluation system according to claim 6, wherein:
the eye state determining module determines the eye opening and closing state information of the student and the eye watching screen sight direction information according to the eye coordinate information, and specifically comprises the following steps:
determining an eye opening and closing amplitude average value ER of the student according to the eye coordinate information and the following formula (1):
in the above-mentioned formula (1), the eye coordinate information of the i-th eye of the student in the face image comprises six coordinate values: the canthus on the ear side of the eye; the point where the tangent to the eyeball contour at its position nearest the ear intersects the upper eyelid; the point where the tangent to the eyeball contour at its position nearest the nose bridge intersects the upper eyelid; the canthus on the side of the eye close to the nose bridge; the point where the tangent to the eyeball contour at its position nearest the ear intersects the lower eyelid; and the point where the tangent to the eyeball contour at its position nearest the nose bridge intersects the lower eyelid; when i is 1, the 1st eye represents the student's left eye, and when i is 2, the 2nd eye represents the student's right eye;
and,
according to the eye coordinate information and the following formula (2), determining an included angle alpha between the sight line of the screen watched by the eyes of the student and a straight line perpendicular to the surface where the screen is located:
in the above-mentioned formula (2), the coordinate values of the eyeball center point of the i-th eye of the student in the face image are used together with (X, Y), which indicates the preset origin coordinate values, L, which indicates the shooting depth of field corresponding to the shooting, and f, which indicates the shooting focal length corresponding to the shooting; when i is 1, the 1st eye indicates the student's left eye, and when i is 2, the 2nd eye indicates the student's right eye.
8. The online learning quality evaluation system according to claim 7, wherein:
the learning concentration state determination module determines the concentration state of the student in the current online learning process according to the eye opening and closing state information and the eye watching screen sight direction information, and specifically comprises:
determining the current online learning concentration degree evaluation value F of the student according to the eye opening and closing state information, the eye watching screen sight direction information and the following formula (3):
in the formula (3), ER represents the average value of the student's eye opening and closing amplitude during the shooting process, α represents the included angle between the student's line of sight to the screen and the straight line perpendicular to the surface where the screen is located, and ER_max represents the maximum value of the student's eye opening and closing amplitude; and the online learning concentration evaluation value F is compared with a preset concentration threshold: if the online learning concentration evaluation value F is greater than or equal to the preset concentration threshold, it is determined that the student is currently in a learning concentration state; otherwise, it is determined that the student is not currently in the learning concentration state.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011054084.9A CN112132087A (en) | 2020-09-29 | 2020-09-29 | Online learning quality evaluation method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112132087A true CN112132087A (en) | 2020-12-25 |
Family
ID=73844828
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011054084.9A Pending CN112132087A (en) | 2020-09-29 | 2020-09-29 | Online learning quality evaluation method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112132087A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113792577A (en) * | 2021-07-29 | 2021-12-14 | 何泽仪 | Method and system for detecting attention state of students in online class and storage medium |
CN113869241A (en) * | 2021-09-30 | 2021-12-31 | 西安理工大学 | Online learning state analysis and alarm method integrating human face multiple attributes |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |