CN116453027A - AI identification management method for educational robot - Google Patents

AI identification management method for educational robot

Info

Publication number
CN116453027A
Authority
CN
China
Prior art keywords
period
student
current monitoring
emotion
students
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310689417.2A
Other languages
Chinese (zh)
Other versions
CN116453027B (en)
Inventor
潘鑫
黄勇
朱松
武庆三
潘若木
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Wantong Technology Co ltd
Original Assignee
Shenzhen Wantong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Wantong Technology Co ltd filed Critical Shenzhen Wantong Technology Co ltd
Priority to CN202310689417.2A priority Critical patent/CN116453027B/en
Publication of CN116453027A publication Critical patent/CN116453027A/en
Application granted granted Critical
Publication of CN116453027B publication Critical patent/CN116453027B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/163 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/168 Evaluating attention deficit, hyperactivity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/72 Data preparation, e.g. statistical preprocessing of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • G06V40/176 Dynamic expression
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Psychiatry (AREA)
  • Developmental Disabilities (AREA)
  • Human Computer Interaction (AREA)
  • Psychology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Child & Adolescent Psychology (AREA)
  • Molecular Biology (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Social Psychology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computing Systems (AREA)
  • Ophthalmology & Optometry (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to the technical field of educational robot management, and in particular to an AI identification management method for an educational robot. The method analyzes the student's eye concentration in each sub-period of the current monitoring period together with the student's body concentration. On the one hand, this realizes a multidimensional analysis of the student's eye concentration, greatly improves the comprehensiveness and accuracy of the educational robot's eye-concentration analysis, and allows the robot to adjust the student's teaching content accordingly; on the other hand, it makes changes in the student's body movement during the current monitoring period easy to judge intuitively, allows the student's concentration in the current monitoring period to be analyzed accurately, and provides reliable data support for the subsequent analysis of the educational robot's adjustment mode for the next stage.

Description

AI identification management method for educational robot
Technical Field
The invention relates to the technical field of educational robot management, and in particular to an AI identification management method for an educational robot.
Background
With the continuous development of artificial intelligence technology, educational robots are being applied ever more widely. An educational robot is an intelligent robot that can simulate a human teaching process, interact with students, and provide personalized teaching services. AI identification management is one of the key technologies for realizing personalized teaching with an educational robot.
Because of their particular characteristics, educational robots cannot yet be applied widely across all fields. Although they save human resources to a certain extent, their teaching mode still has notable defects, specifically the following:
The teaching process of an educational robot is usually fixed and cannot be personalized according to the student's concentration and emotional state during learning. This dampens the student's enthusiasm to some extent, so learning efficiency is low and no significant learning results are obtained; it not only wastes teaching resources but also makes it impossible to formulate targeted learning content for the student.
The educational robot cannot analyze the students' concentration and emotional states and therefore cannot judge how attentively the students are learning; as a result, it lacks intelligence and flexibility in the teaching process, which is unfavorable for guaranteeing the students' learning efficiency.
Disclosure of Invention
The invention aims to provide an AI identification management method for an educational robot.
The aim of the invention can be achieved by the following technical scheme: an AI identification management method for an educational robot, comprising the steps of:
F1, concentration monitoring and identification: the learning video of the student for the current monitoring period is monitored and analyzed to obtain a primary parameter set for the student in the current monitoring period, and the student's concentration evaluation coefficient for the current monitoring period is then analyzed and obtained.
As a further improvement of the invention, the learning video of the student corresponding to the current monitoring period is monitored and analyzed, and the specific monitoring and analyzing steps are as follows:
f101: the learning video of the student for the current monitoring period is captured by a high-definition camera.
f102: the student's eyes in the learning video for the current monitoring period are focused on and magnified to obtain an eye-magnified video of the student for the current monitoring period.
f103: the learning video for the current monitoring period is decomposed into frames at the monitoring time points to obtain the student's learning frame at each monitoring time point in the current monitoring period, and the student's body contour image at each monitoring time point is extracted from these frames.
f104: the eye-magnified video for the current monitoring period and the body contour images at the monitoring time points form the student's primary parameter set for the current monitoring period.
As a further improvement of the invention, the concentration evaluation coefficient of the student corresponding to the current monitoring period is analyzed, and the specific analysis steps are as follows:
f201: the eye-magnified video of the student for the current monitoring period is divided evenly into sub-periods of a set video duration to obtain the student's eye-magnified video for each sub-period, and the student's blink count and eyeball rotation count for each sub-period are counted from it, where i = 1, 2, …, n denotes the sub-period number.
f202: from the eye-magnified video of each sub-period, the interval between each blink and the next blink is obtained and recorded as the interval duration of a marked period; the interval duration of each marked period in each sub-period is compared with a set reference interval duration, and if the interval duration of a marked period is longer than the reference interval duration, that marked period is recorded as an abnormal period.
f203: based on the eye-magnified video of each sub-period, the student's eyeball rotation count in each abnormal period of each sub-period is obtained and compared with a set reference eyeball rotation count; if the eyeball rotation count of an abnormal period is smaller than the reference eyeball rotation count, that abnormal period is judged to be a daze period. Each daze period of the student in each sub-period is thus obtained, the student's daze count for each sub-period is counted, and the duration of each daze period in each sub-period is also counted, where j = 1, 2, …, m denotes the daze period number.
f204: the student's blink count, eyeball rotation count, daze count and daze durations for each sub-period are analyzed comprehensively to obtain the student's eye concentration for each sub-period.
f205: the student's body contour image at each monitoring time point in the current monitoring period is compared, by overlapping, with the body contour image at the next monitoring time point to obtain the body overlap area between each monitoring time point and the next one; the period between each monitoring time point and the next is recorded as a monitoring segment, giving the student's body overlap area for each monitoring segment in the current monitoring period, where g = 1, 2, …, p denotes the monitoring segment number.
f206: the student's body concentration for the current monitoring period is analyzed based on the body overlap area of each monitoring segment in the current monitoring period.
f207: the student's concentration evaluation coefficient for the current monitoring period is analyzed based on the eye concentration for each sub-period and the body concentration for the current monitoring period.
F2, emotion monitoring and recognition: the facial images of the student for the current monitoring period are monitored to obtain a secondary parameter set for the student in the current monitoring period, and the student's emotional state in the current monitoring period is analyzed to obtain the student's learning emotion coincidence coefficient for the current monitoring period.
As a further improvement of the invention, the face image of the student corresponding to the current monitoring period is monitored, and the specific monitoring mode is as follows:
and acquiring facial images of the students corresponding to all monitoring time points in the current monitoring period through the intelligent cameras to obtain facial images of the students corresponding to all monitoring time points in the current monitoring period.
And extracting the lip contour and the eyebrow contour of each monitoring time point corresponding to the student from the facial image of each monitoring time point in the current monitoring time period corresponding to the student, and obtaining the lip contour and the eyebrow contour of each monitoring time point corresponding to the student.
And forming a secondary parameter set of the student corresponding to the current monitoring period by the lip outline and the eyebrow outline of the student corresponding to each monitoring time point.
As a further improvement of the invention, the student's emotional state in the current monitoring period is analyzed, and the specific analysis process is as follows:
f301: the student's lip contour at each monitoring time point is matched against the lip contour sets for each lip emotional state of the student stored in the database to obtain the lip emotional state at each monitoring time point, where the lip emotional states include: happy, tense, irritable, angry and sad.
f302: the student's eyebrow contour at each monitoring time point is matched against the eyebrow contour sets for each eyebrow emotional state of the student stored in the database to obtain the eyebrow emotional state at each monitoring time point, where the eyebrow emotional states include: happy, tense, irritable, angry and sad.
f303: the student's lip emotional state and eyebrow emotional state at each monitoring time point are matched against the set parameter sets corresponding to the facial emotional states to obtain the student's facial emotional state at each monitoring time point, where the parameter set corresponding to each facial emotional state consists of the lip emotional state and eyebrow emotional state associated with it, and the facial emotional states are specifically a fluctuating state and a stable state.
f304: if the student's facial emotional state at a monitoring time point is the stable state, that monitoring time point is marked as a stable time point; if it is the fluctuating state, that monitoring time point is marked as a fluctuation time point. Each stable time point and each fluctuation time point of the student in the current monitoring period is thus counted.
f305: adjacent stable time points are merged and recorded as emotion-stable periods, giving each emotion-stable period of the student in the current monitoring period; the duration of each emotion-stable period is counted and the durations are summed to obtain the student's total emotion-stable duration for the current monitoring period.
f306: adjacent fluctuation time points are merged and recorded as emotion-fluctuation periods, giving each emotion-fluctuation period of the student in the current monitoring period; the duration of each emotion-fluctuation period is counted and the durations are summed to obtain the student's total emotion-fluctuation duration for the current monitoring period.
f307: the student's total emotion-stable duration and total emotion-fluctuation duration in the current monitoring period are analyzed comprehensively to obtain the student's emotional state evaluation index for the current monitoring period.
f308: the duration of the longest emotion-stable period and the duration of the shortest emotion-stable period are selected from the durations of the student's emotion-stable periods in the current monitoring period, and the duration of the longest emotion-fluctuation period and the duration of the shortest emotion-fluctuation period are selected from the durations of the emotion-fluctuation periods; at the same time, the number of emotion-stable periods and the number of emotion-fluctuation periods of the student in the current monitoring period are counted.
f309: the durations of the longest and shortest emotion-stable periods, the durations of the longest and shortest emotion-fluctuation periods, the number of emotion-stable periods and the number of emotion-fluctuation periods of the student in the current monitoring period are analyzed comprehensively to obtain the student's emotion change frequency for the current monitoring period.
f310: the student's emotional state evaluation index and emotion change frequency for the current monitoring period are analyzed comprehensively to obtain the student's learning emotion coincidence coefficient for the current monitoring period.
F3, robot management analysis: the student's learning stage for the current monitoring period is identified and analyzed, and the educational robot's adjustment mode for the next stage is then analyzed and obtained.
As a further improvement of the invention, the student's learning stage for the current monitoring period is identified and analyzed, and the specific analysis process is as follows:
f401: the student's concentration evaluation coefficient and learning emotion coincidence coefficient for the current monitoring period are analyzed comprehensively to obtain the student's learning state evaluation coefficient for the current monitoring period.
f402: the student's learning state evaluation coefficient for the current monitoring period is matched against the set learning state evaluation coefficient threshold for each learning stage to obtain the student's learning stage for the current monitoring period.
As a further improvement of the invention, the educational robot's adjustment mode for the next stage is analyzed, and the specific analysis mode is as follows:
The student's learning stage for the current monitoring period is matched with the next-stage adjustment mode set for each learning stage to obtain the student's adjustment mode for the stage following the current monitoring period, which is taken as the educational robot's adjustment mode for the next stage.
F4, robot adjustment execution: the student's learning content for the next stage is adjusted accordingly based on the educational robot's adjustment mode for the next stage.
As a further improvement of the invention, the student's learning content for the next stage is adjusted accordingly, and the specific adjustment mode is as follows:
The educational robot's adjustment mode for the next stage is matched with the adjusted learning content set for each adjustment mode to obtain the educational robot's adjusted learning content for the next stage, which is taken as the student's learning content for the next stage.
The invention has the beneficial effects that:
according to the invention, eye-concentration parameters such as the blink times, the eyeball rotation times and the like of the students corresponding to each sub-period in the current monitoring period are obtained by monitoring and analyzing the learning video of the students corresponding to the current monitoring period, and therefore, the eye concentration of the students corresponding to each sub-period in the current monitoring period is analyzed, so that the multidimensional analysis of the eye concentration of the students is realized, the comprehensiveness and the accuracy of the analysis result of the eye concentration of the students by the education robot are improved to a great extent, and the teaching content of the students can be adjusted correspondingly for the education robot.
According to the invention, the body concentration degree of the student corresponding to the current monitoring period is obtained by analyzing the body change degree of the student corresponding to the current monitoring period, so that the body movement change of the student corresponding to the current monitoring period can be intuitively judged, meanwhile, the concentration degree of the student corresponding to the current monitoring period can be accurately analyzed, and reliable data support is provided for the analysis of the follow-up education robot corresponding to the next-stage adjustment mode.
According to the invention, the emotion stabilization period and the emotion fluctuation period of the student corresponding to the current monitoring period are analyzed, and the emotion state evaluation index and the emotion change frequency of the student corresponding to the current monitoring period are analyzed, so that the learning emotion coincidence coefficient of the student corresponding to the current monitoring period is obtained through comprehensive analysis, the defect of emotion analysis of the student in the current technology is overcome, and the emotion state of the student can be more intuitively focused.
According to the invention, the learning content of the student corresponding to the next stage is analyzed based on the learning stage of the student corresponding to the current monitoring period, and the education robot is enabled to execute corresponding adjustment, so that not only is the concentration of the student considered, but also the emotion state of the student is concerned, the teaching effect of the education robot is improved to a great extent, the learning efficiency of the student is further ensured, and personalized teaching is realized.
Drawings
The invention is further described below with reference to the accompanying drawings.
Fig. 1 is a flow chart of the method steps of the present invention.
Description of the embodiments
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, the present invention is an AI identification management method for an educational robot, comprising the steps of:
F1, concentration monitoring and identification: the learning video of the student for the current monitoring period is monitored and analyzed, and the specific monitoring and analysis process is as follows:
f101: the learning video of the student for the current monitoring period is captured by a high-definition camera.
f102: the student's eyes in the learning video for the current monitoring period are focused on and magnified to obtain an eye-magnified video of the student for the current monitoring period.
f103: the learning video for the current monitoring period is decomposed into frames at the monitoring time points to obtain the student's learning frame at each monitoring time point in the current monitoring period, and the student's body contour image at each monitoring time point is extracted from these frames.
f104: the eye-magnified video for the current monitoring period and the body contour images at the monitoring time points form the student's primary parameter set for the current monitoring period.
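For concreteness, the following is a minimal sketch of steps f101-f104, assuming OpenCV is available, that monitoring time points are taken at a fixed interval, and that body-contour extraction is represented by a crude placeholder; none of the function names or parameter values below come from the patent itself.

```python
# Minimal sketch of f101-f104 (assumed interval-based monitoring time points).
import cv2

def decompose_learning_video(video_path: str, interval_s: float = 1.0):
    """Return one frame per monitoring time point from the learning video."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    step = max(int(round(fps * interval_s)), 1)   # frames between monitoring time points
    frames, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:                       # a monitoring time point
            frames.append(frame)
        idx += 1
    cap.release()
    return frames

def extract_body_contour(frame):
    """Crude stand-in for body-contour extraction; a real system would use
    person segmentation or background subtraction instead of a plain threshold."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask                                   # binary mask used later for overlap comparison

def build_primary_parameter_set(video_path: str):
    frames = decompose_learning_video(video_path)
    return {
        "eye_magnified_video": video_path,        # in practice: the eye-focused, magnified clip
        "body_contours": [extract_body_contour(f) for f in frames],
    }
```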
The concentration evaluation coefficient of the student corresponding to the current monitoring period is analyzed, and the specific analysis steps are as follows:
f201: the eye-magnified video of the student for the current monitoring period is divided evenly into sub-periods of a set video duration to obtain the student's eye-magnified video for each sub-period, and the student's blink count and eyeball rotation count for each sub-period are counted from it, where i = 1, 2, …, n denotes the sub-period number.
f202: from the eye-magnified video of each sub-period, the interval between each blink and the next blink is obtained and recorded as the interval duration of a marked period; the interval duration of each marked period in each sub-period is compared with a set reference interval duration, and if the interval duration of a marked period is longer than the reference interval duration, that marked period is recorded as an abnormal period.
f203: based on the eye-magnified video of each sub-period, the student's eyeball rotation count in each abnormal period of each sub-period is obtained and compared with a set reference eyeball rotation count; if the eyeball rotation count of an abnormal period is smaller than the reference eyeball rotation count, that abnormal period is judged to be a daze period. Each daze period of the student in each sub-period is thus obtained, the student's daze count for each sub-period is counted, and the duration of each daze period in each sub-period is also counted, where j = 1, 2, …, m denotes the daze period number.
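The daze-period logic of f202-f203 can be illustrated with the short sketch below. It assumes blinks are available as sorted timestamps (in seconds) for one sub-period and that a hypothetical helper `eye_rotations(start, end)` returns the rotation count observed in a window; the thresholds are purely illustrative.

```python
# Sketch of f202-f203: long blink gaps with little eye movement count as daze periods.
def find_daze_periods(blink_times, eye_rotations, ref_interval=4.0, ref_rotations=2):
    daze_periods = []
    for start, end in zip(blink_times, blink_times[1:]):
        if end - start > ref_interval:              # marked period longer than reference -> abnormal
            if eye_rotations(start, end) < ref_rotations:
                daze_periods.append((start, end))   # abnormal period with few rotations -> daze
    daze_count = len(daze_periods)
    daze_durations = [end - start for start, end in daze_periods]
    return daze_count, daze_durations
```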
f204: the student's eye concentration for each sub-period is calculated according to a set formula in which e is a natural constant, a1, a2, a3 and a4 are set weight factors, and the remaining quantities are the set reference blink count, reference eyeball rotation count and reference daze count, the set allowable blink-count difference and allowable eyeball-rotation-count difference, a set reference duration, and the set total allowable daze duration.
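Since the formula itself is reproduced only as an image in the source, the sketch below shows one plausible form consistent with the description: deviations from the reference blink, rotation and daze values are weighted by a1-a4 and mapped through the natural constant e, so the score falls toward 0 as concentration drops. All default values are assumptions, not the patent's.

```python
import math

# Illustrative stand-in for the f204 eye-concentration formula (not the patent's exact form).
def eye_concentration(blinks, rotations, daze_count, daze_total,
                      ref_blinks=15, ref_rotations=20, ref_daze=1,
                      allow_blink_diff=5, allow_rot_diff=5, allow_daze_total=10.0,
                      a1=0.3, a2=0.3, a3=0.2, a4=0.2):
    penalty = (a1 * abs(blinks - ref_blinks) / allow_blink_diff
               + a2 * abs(rotations - ref_rotations) / allow_rot_diff
               + a3 * max(daze_count - ref_daze, 0)
               + a4 * daze_total / allow_daze_total)
    return math.exp(-penalty)        # e enters as the natural constant; score lies in (0, 1]
```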
f205: the student's body contour image at each monitoring time point in the current monitoring period is compared, by overlapping, with the body contour image at the next monitoring time point to obtain the body overlap area between each monitoring time point and the next one; the period between each monitoring time point and the next is recorded as a monitoring segment, giving the student's body overlap area for each monitoring segment in the current monitoring period, where g = 1, 2, …, p denotes the monitoring segment number.
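Assuming each body contour is stored as a binary mask of identical resolution, the overlap comparison of f205 reduces to a pixel-wise AND, as in this sketch.

```python
import numpy as np

# Sketch of f205: overlap area (in pixels) of each monitoring segment g = 1..p.
def overlap_areas(contour_masks):
    return [int(np.logical_and(a > 0, b > 0).sum())
            for a, b in zip(contour_masks, contour_masks[1:])]
```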
f206: the student's body concentration for the current monitoring period is calculated according to a set formula in which a5 and a6 are set weight factors, and the remaining quantities are the set reference body overlap area, the student's body overlap area in the (g-1)-th monitoring segment of the current monitoring period, and the set allowable body-overlap-area difference.
f207: the student's concentration evaluation coefficient for the current monitoring period is calculated according to a set formula in which a7 and a8 are set weight factors.
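The formulas behind f206-f207 are likewise image-only in the source, so the following is an illustrative stand-in: posture that stays close to a reference overlap area and changes little between segments yields a high body concentration, and the final coefficient is a weighted blend of the mean eye concentration and the body concentration. The weights a5-a8 and the reference areas are assumed placeholders.

```python
# Illustrative stand-ins for f206 (body concentration) and f207 (concentration coefficient).
def body_concentration(overlaps, ref_area=50_000, allow_diff=5_000, a5=0.6, a6=0.4):
    deviation = sum(abs(s - ref_area) for s in overlaps) / (len(overlaps) * allow_diff)
    change = (sum(abs(b - a) for a, b in zip(overlaps, overlaps[1:]))
              / (max(len(overlaps) - 1, 1) * allow_diff))
    return 1.0 / (1.0 + a5 * deviation + a6 * change)   # stable posture -> value near 1

def concentration_coefficient(eye_scores, body_score, a7=0.5, a8=0.5):
    return a7 * (sum(eye_scores) / len(eye_scores)) + a8 * body_score
```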
F2, emotion monitoring and recognition: the face images of students corresponding to the current monitoring period are monitored, and the specific monitoring mode is as follows:
and acquiring facial images of the students corresponding to all monitoring time points in the current monitoring period through the intelligent cameras to obtain facial images of the students corresponding to all monitoring time points in the current monitoring period.
And extracting the lip contour and the eyebrow contour of each monitoring time point corresponding to the student from the facial image of each monitoring time point in the current monitoring time period corresponding to the student, and obtaining the lip contour and the eyebrow contour of each monitoring time point corresponding to the student.
And forming a secondary parameter set of the student corresponding to the current monitoring period by the lip outline and the eyebrow outline of the student corresponding to each monitoring time point.
The emotion state of the student corresponding to the current monitoring period is analyzed, and the specific analysis process is as follows:
f301: the student's lip contour at each monitoring time point is matched against the lip contour sets for each lip emotional state of the student stored in the database to obtain the lip emotional state at each monitoring time point, where the lip emotional states include: happy, tense, irritable, angry and sad.
The lip contour sets corresponding to each lip emotional state of the student stored in the database are collected as follows: the student's lip contours are collected in a relaxed state and in each lip emotional state, forming the lip contour set corresponding to each lip emotional state of the student, and these sets are stored in the database.
f302: the student's eyebrow contour at each monitoring time point is matched against the eyebrow contour sets for each eyebrow emotional state of the student stored in the database to obtain the eyebrow emotional state at each monitoring time point, where the eyebrow emotional states include: happy, tense, irritable, angry and sad.
The eyebrow contour sets corresponding to each eyebrow emotional state of the student stored in the database are collected in the same way: the student's eyebrow contours are collected in a relaxed state and in each eyebrow emotional state, forming the eyebrow contour set corresponding to each eyebrow emotional state of the student, and these sets are stored in the database.
f303: the student's lip emotional state and eyebrow emotional state at each monitoring time point are matched against the set parameter sets corresponding to the facial emotional states to obtain the student's facial emotional state at each monitoring time point, where the parameter set corresponding to each facial emotional state consists of the lip emotional state and eyebrow emotional state associated with it. The facial emotional states are specifically a fluctuating state and a stable state: the fluctuating state includes irritability, anger, sadness, tension and the like, and the stable state includes happiness, calm and the like.
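One way to realize the matching in f301-f303, assuming every contour is resampled to the same number of (x, y) points, is a nearest-set comparison; the distance metric, the either-feature rule for deciding the facial state, and the state grouping constant below are assumptions for illustration, not the patent's stated method.

```python
import numpy as np

# Sketch of f301-f303 under an assumed fixed-length contour representation.
FLUCTUATING = {"irritable", "angry", "sad", "tense"}

def match_state(contour, stored_sets):
    """stored_sets: {state: [contour, ...]} collected for this student in advance."""
    def dist(a, b):
        return float(np.linalg.norm(np.asarray(a) - np.asarray(b)))
    return min(stored_sets,
               key=lambda s: min(dist(contour, c) for c in stored_sets[s]))

def facial_state(lip_state, brow_state):
    # Assumed rule: fluctuating if either feature shows a fluctuating emotion, otherwise stable.
    return "fluctuating" if {lip_state, brow_state} & FLUCTUATING else "stable"
```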
f304: if the facial emotion state of the student corresponding to a certain monitoring time point is a stable state, the monitoring time point is marked as a stable time point, and if the facial emotion state of the student corresponding to a certain monitoring time point is a fluctuation state, the monitoring time point is marked as a fluctuation time point, so that each stable time point and each fluctuation time point in the current monitoring period of the student are counted.
f305: adjacent stable time points are merged and recorded as emotion-stable periods, giving each emotion-stable period of the student in the current monitoring period; the duration of each emotion-stable period is counted and the durations are summed to obtain the student's total emotion-stable duration for the current monitoring period.
f306: adjacent fluctuation time points are merged and recorded as emotion-fluctuation periods, giving each emotion-fluctuation period of the student in the current monitoring period; the duration of each emotion-fluctuation period is counted and the durations are summed to obtain the student's total emotion-fluctuation duration for the current monitoring period.
f307: the student's emotional state evaluation index for the current monitoring period is calculated according to a set formula in which b1 and b2 are set scale factors and the duration of the current monitoring period stored in the database also enters the calculation.
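A compact sketch of f304-f307: time points with the same facial state are merged into runs, durations are totalled, and an illustrative index rewards stable time and penalizes fluctuation time. The factors b1 and b2 and the stored monitoring-period duration T are placeholders; the index form is an assumption, not the patent's exact formula.

```python
# Sketch of f304-f307: merge same-state time points into periods, then score the period.
def merge_periods(time_points, states):
    """time_points: sorted timestamps; states: 'stable' or 'fluctuating' for each point."""
    periods = []
    for t, s in zip(time_points, states):
        if periods and periods[-1][2] == s:
            periods[-1][1] = t                      # extend the current run
        else:
            periods.append([t, t, s])               # [start, end, state]
    return periods

def emotion_state_index(periods, T, b1=1.0, b2=1.0):
    stable = sum(end - start for start, end, s in periods if s == "stable")
    fluct = sum(end - start for start, end, s in periods if s == "fluctuating")
    return (b1 * stable - b2 * fluct) / T           # more stable time -> higher index
```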
f308: the duration of the longest emotion-stable period and the duration of the shortest emotion-stable period are selected from the durations of the student's emotion-stable periods in the current monitoring period, and the duration of the longest emotion-fluctuation period and the duration of the shortest emotion-fluctuation period are selected from the durations of the emotion-fluctuation periods; at the same time, the number of emotion-stable periods and the number of emotion-fluctuation periods of the student in the current monitoring period are counted.
f309: the student's emotion change frequency for the current monitoring period is calculated according to a set formula in which b3, b4, b5, b6, b7 and b8 are set scale factors.
f310: the student's learning emotion coincidence coefficient for the current monitoring period is calculated according to a set formula in which c1 and c2 are set coefficient factors.
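The quantities of f308 feed the emotion change frequency of f309 and then the learning emotion coincidence coefficient of f310. Both formulas are image-only in the source, so the sketch below uses one plausible interpretation in which many short swings raise the change frequency and a calm, steady record raises the final coefficient; the factors b3-b8, c1 and c2 are assumed placeholders.

```python
# Illustrative stand-ins for f309 (emotion change frequency) and f310 (learning emotion coefficient).
def emotion_change_frequency(stable_durs, fluct_durs,
                             b3=1.0, b4=1.0, b5=1.0, b6=1.0, b7=0.5, b8=0.5):
    spread = (b3 * max(stable_durs) - b4 * min(stable_durs)
              + b5 * max(fluct_durs) - b6 * min(fluct_durs))
    counts = b7 * len(stable_durs) + b8 * len(fluct_durs)
    return counts / (1.0 + abs(spread))             # many short periods -> higher change frequency

def learning_emotion_coefficient(state_index, change_freq, c1=0.7, c2=0.3):
    return c1 * state_index - c2 * change_freq      # calmer, steadier emotion scores higher
```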
F3, robot management analysis: the student's learning stage for the current monitoring period is identified and analyzed, and the specific analysis steps are as follows:
f401: the student's learning state evaluation coefficient for the current monitoring period is calculated according to a set formula in which c3 and c4 are set coefficient factors.
f402: the student's learning state evaluation coefficient for the current monitoring period is matched against the set learning state evaluation coefficient threshold for each learning stage to obtain the student's learning stage for the current monitoring period.
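A minimal sketch of f401-f402 follows; the coefficient factors c3 and c4, the per-stage thresholds and the stage names are all assumed values used only for illustration.

```python
# Sketch of f401-f402: combine the two coefficients, then match against stage thresholds.
STAGE_THRESHOLDS = [("advanced", 0.8), ("consolidation", 0.5), ("foundation", 0.0)]

def learning_state_coefficient(concentration, emotion_coeff, c3=0.6, c4=0.4):
    return c3 * concentration + c4 * emotion_coeff

def match_learning_stage(coefficient):
    for stage, threshold in STAGE_THRESHOLDS:       # thresholds listed from highest to lowest
        if coefficient >= threshold:
            return stage
    return STAGE_THRESHOLDS[-1][0]
```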
The educational robot's adjustment mode for the next stage is analyzed, and the specific analysis mode is as follows:
The student's learning stage for the current monitoring period is matched with the next-stage adjustment mode set for each learning stage to obtain the student's adjustment mode for the stage following the current monitoring period, which is taken as the educational robot's adjustment mode for the next stage.
F4, robot adjustment execution: based on the educational robot's adjustment mode for the next stage, the student's learning content for the next stage is adjusted accordingly, and the specific adjustment mode is as follows:
The educational robot's adjustment mode for the next stage is matched with the adjusted learning content set for each adjustment mode to obtain the educational robot's adjusted learning content for the next stage, which is taken as the student's learning content for the next stage.
It should be noted that, in accordance with relevant laws and regulations and in order to protect students' personal privacy, the applicant may need to obtain and process students' personal information in connection with the present patent application. The applicant solemnly promises to strictly abide by relevant laws and regulations, to be responsible for keeping students' personal information confidential, and not to use it for any other purpose. The applicant will also take the necessary technical and organizational measures to ensure the security and confidentiality of students' personal information. In agreeing to provide personal information, the student also understands and agrees to the above authorizations and commitments.
The foregoing is merely illustrative of the invention, and those skilled in the art can make various modifications, additions and substitutions to the described embodiments without departing from the scope of the invention as defined in the accompanying claims.

Claims (8)

1. An AI-recognition management method for an educational robot, characterized by comprising the steps of:
F1, concentration monitoring and identification: monitoring and analyzing the learning video of the student for a current monitoring period to obtain a primary parameter set for the student in the current monitoring period; analyzing the eye concentration of the student for each sub-period based on the blink times, eyeball rotation times, daze times and the duration of each daze period; obtaining the body concentration of the student for the current monitoring period based on analysis of the body overlapping area of each monitoring segment in the current monitoring period; and analyzing the concentration evaluation coefficient of the student for the current monitoring period based on the eye concentration for each sub-period and the body concentration for the current monitoring period, to obtain the concentration evaluation coefficient of the student for the current monitoring period;
F2, emotion monitoring and recognition: monitoring the facial images of the student for the current monitoring period to obtain a secondary parameter set for the student in the current monitoring period; identifying and analyzing each emotion stabilization period and each emotion fluctuation period of the student in the current monitoring period; analyzing the total emotion stabilization duration and the total emotion fluctuation duration in the current monitoring period to obtain the emotional state evaluation index of the student for the current monitoring period; analyzing the emotion change parameters in the current monitoring period to obtain the emotion change frequency of the student for the current monitoring period; and analyzing the emotional state of the student in the current monitoring period based on the emotional state evaluation index and the emotion change frequency to obtain the learning emotion coincidence coefficient of the student for the current monitoring period;
F3, robot management analysis: identifying and analyzing the learning stage of the student for the current monitoring period, and analyzing the adjustment mode of the educational robot for the next stage to obtain the adjustment mode of the educational robot for the next stage;
F4, robot adjustment execution: correspondingly adjusting the learning content of the student for the next stage based on the adjustment mode of the educational robot for the next stage.
2. The AI-recognition management method for an educational robot of claim 1, wherein the monitoring and analysis of the learning video of the student corresponding to the current monitoring period comprises the following steps:
f101: acquiring a learning video of the student corresponding to the current monitoring period through a high-definition camera to obtain the learning video of the student corresponding to the current monitoring period;
f102: focusing and amplifying eyes of the student corresponding to the current monitoring period in the learning video of the student corresponding to the current monitoring period to obtain an eye amplifying video of the student corresponding to the current monitoring period;
f103: picture decomposition is carried out on the learning video of the student corresponding to the current monitoring time period according to the monitoring time points, learning pictures of the student corresponding to each monitoring time point in the current monitoring time period are obtained, and body contour images of the student corresponding to each monitoring time point in the current monitoring time period are extracted from the learning pictures;
f104: the eye enlarged video of the student corresponding to the current monitoring period and the body contour image of each monitoring time point form a primary parameter set of the student corresponding to the current monitoring period.
3. The AI-recognition management method for an educational robot according to claim 1, wherein the analyzing of the concentration evaluation coefficient of the student corresponding to the current monitoring period comprises the following steps:
f201: dividing the eye amplified video of the student corresponding to the current monitoring period into sub-periods according to a set dividing mode to obtain eye amplified video of the student corresponding to each sub-period, and counting the blink times and eyeball rotation times of the student corresponding to each sub-period from the eye amplified video of the student corresponding to each sub-period;
f202: acquiring interval duration of each blink and the next blink in each sub-period corresponding to the student based on the eye enlarged video of each sub-period corresponding to the student, marking the interval duration as interval duration of a marking period, obtaining interval duration of each marking period in each sub-period corresponding to the student, comparing the interval duration with a set reference interval duration, and marking the marking period as an abnormal period if the interval duration of a marking period is longer than the set reference interval duration;
f203: acquiring the eyeball rotation times of the student in each abnormal period of each sub-period based on the eye-magnified video of each sub-period, comparing them with the set reference eyeball rotation times, and judging an abnormal period to be a daze period if its eyeball rotation times are smaller than the reference eyeball rotation times, thereby obtaining each daze period of the student in each sub-period, counting the daze times of the student for each sub-period, and counting the daze duration of the student in each daze period of each sub-period;
f204: comprehensively analyzing the blink times and eyeball rotation times of the student for each sub-period and the daze times and daze durations of the student for each sub-period to obtain the eye concentration of the student for each sub-period;
f205: overlapping and comparing the body contour image of each monitoring time point in the current monitoring time period corresponding to the student with the body contour image of the next monitoring time point to obtain the body overlapping area of each monitoring time point in the current monitoring time period corresponding to the student and the next monitoring time point, and recording the time period between each monitoring time point and the next monitoring time point as a monitoring segment to obtain the body overlapping area of each monitoring segment in the current monitoring time period corresponding to the student;
f206: analyzing the body concentration of the student corresponding to the current monitoring period based on the body overlapping area of each monitoring segment in the current monitoring period, so as to obtain the body concentration of the student corresponding to the current monitoring period;
f207: and analyzing the concentration evaluation coefficient of the student corresponding to the current monitoring period based on the eye concentration of the student corresponding to each sub period and the body concentration of the student corresponding to the current monitoring period to obtain the concentration evaluation coefficient of the student corresponding to the current monitoring period.
4. The AI-recognition management method for an educational robot according to claim 1, wherein the monitoring of the face image of the student corresponding to the current monitoring period is performed in the following manner:
acquiring face images of the students corresponding to all monitoring time points in the current monitoring period through the intelligent cameras to obtain face images of the students corresponding to all monitoring time points in the current monitoring period;
extracting the lip contour and the eyebrow contour of each monitoring time point corresponding to the student from the facial image of each monitoring time point in the current monitoring time period corresponding to the student to obtain the lip contour and the eyebrow contour of each monitoring time point corresponding to the student;
and forming a secondary parameter set of the student corresponding to the current monitoring period by the lip outline and the eyebrow outline of the student corresponding to each monitoring time point.
5. The AI-recognition management method for educational robots according to claim 1, wherein the analysis of the emotional state of the student corresponding to the current monitoring period is performed as follows:
f301: matching the lip outline of each monitoring time point corresponding to the student with the lip outline set of each lip emotion state corresponding to the student stored in the database to obtain the lip emotion state corresponding to each monitoring time point;
f302: matching the eyebrow outline of each monitoring time point corresponding to the student with the eyebrow outline set of each eyebrow emotion state corresponding to the student stored in the database to obtain the eyebrow emotion state corresponding to each monitoring time point;
f303: matching the lip emotional states and eyebrow emotional states of the student at each monitoring time point with the set parameter sets corresponding to the facial emotional states to obtain the facial emotional state of the student at each monitoring time point, wherein the parameter set corresponding to each facial emotional state is the lip emotional state and eyebrow emotional state associated with that facial emotional state, and the facial emotional states are specifically a fluctuation state and a stable state;
f304: if the facial emotion state of the student corresponding to a certain monitoring time point is a stable state, the monitoring time point is marked as a stable time point, and if the facial emotion state of the student corresponding to a certain monitoring time point is a fluctuation state, the monitoring time point is marked as a fluctuation time point, so that each stable time point and each fluctuation time point in the current monitoring period of the student are counted;
f305: integrating all adjacent stabilization time points, recording the integrated stabilization time points as emotion stabilization time periods, obtaining all emotion stabilization time periods of the students corresponding to the current monitoring time period, counting the duration of the students corresponding to all emotion stabilization time periods in the current monitoring time period, and further summing the duration to obtain the total duration of emotion stabilization of the students corresponding to the current monitoring time period;
f306: integrating each adjacent fluctuation time point, recording as emotion fluctuation time periods, obtaining each emotion fluctuation time period of the student corresponding to the current monitoring time period, counting the duration of each emotion fluctuation time period of the student corresponding to the current monitoring time period, and further summing the duration to obtain the total duration of emotion fluctuation of the student corresponding to the current monitoring time period;
f307: comprehensively analyzing the total duration of emotion stabilization and the total duration of emotion fluctuation of the student in the current monitoring period to obtain an emotion state evaluation index of the student in the current monitoring period;
f308: the method comprises the steps of screening out the duration of a maximum emotion stabilization period and the duration of a minimum emotion stabilization period from the durations of the emotion stabilization periods of the students in the current monitoring period, screening out the duration of the maximum emotion fluctuation period and the duration of the minimum emotion fluctuation period from the durations of the emotion fluctuation periods of the students in the current monitoring period, and simultaneously counting the number of emotion stabilization periods and the number of emotion fluctuation periods of the students in the current monitoring period;
f309: comprehensively analyzing the duration of the student corresponding to the maximum emotion stabilization period, the duration of the minimum emotion stabilization period, the duration of the maximum emotion fluctuation period, the duration of the minimum emotion fluctuation period, the number of emotion stabilization periods and the number of emotion fluctuation periods in the student corresponding to the current monitoring period, and obtaining the emotion change frequency of the student corresponding to the current monitoring period;
f310: comprehensively analyzing the emotion state evaluation index and the emotion change frequency of the student for the current monitoring period to obtain the learning emotion coincidence coefficient of the student for the current monitoring period (an illustrative sketch of steps f303 to f310 follows this claim).
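A minimal, non-authoritative Python sketch of steps f303 to f310 is given below for illustration only. The lip/eyebrow-to-facial-state table, the one-second sampling interval (dt), the default treatment of unlisted state combinations, and the aggregation formulas inside analyze_emotion are assumptions of this sketch and are not specified by the claims.

# Illustrative sketch of claim steps f303-f310 (assumed 1 s sampling interval).
from itertools import groupby

# f303: assumed parameter sets mapping (lip emotion state, eyebrow emotion state)
# pairs to a facial emotion state ("stable" or "fluctuation").
FACIAL_STATE_TABLE = {
    ("relaxed", "relaxed"): "stable",
    ("relaxed", "raised"): "fluctuation",
    ("tight", "relaxed"): "fluctuation",
    ("tight", "furrowed"): "fluctuation",
}

def facial_state(lip_state, eyebrow_state):
    # Combinations not listed in the table are treated as fluctuation here;
    # this default is an assumption of the sketch.
    return FACIAL_STATE_TABLE.get((lip_state, eyebrow_state), "fluctuation")

def merge_adjacent_points(states, target, dt=1.0):
    # f305/f306: merge each run of adjacent time points sharing the target state
    # into one period and return the duration of every such period.
    return [sum(1 for _ in run) * dt
            for state, run in groupby(states) if state == target]

def analyze_emotion(lip_states, eyebrow_states, dt=1.0):
    # f303/f304: facial emotion state at every monitoring time point.
    states = [facial_state(l, b) for l, b in zip(lip_states, eyebrow_states)]

    # f305/f306: stabilization and fluctuation periods with their durations.
    stable_periods = merge_adjacent_points(states, "stable", dt)
    fluct_periods = merge_adjacent_points(states, "fluctuation", dt)
    total_stable, total_fluct = sum(stable_periods), sum(fluct_periods)

    # f307: emotion state evaluation index; the share of stable time is one
    # possible "comprehensive analysis", not a formula fixed by the patent.
    denom = total_stable + total_fluct
    emotion_index = total_stable / denom if denom else 0.0

    # f308: extreme period durations and period counts.
    max_stable = max(stable_periods, default=0.0)
    min_stable = min(stable_periods, default=0.0)
    max_fluct = max(fluct_periods, default=0.0)
    min_fluct = min(fluct_periods, default=0.0)
    n_stable, n_fluct = len(stable_periods), len(fluct_periods)

    # f309: emotion change frequency; an assumed aggregation that normalises the
    # number of periods by the spread of their durations.
    spread = (max_stable - min_stable) + (max_fluct - min_fluct) + 1.0
    change_frequency = (n_stable + n_fluct) / spread

    # f310: learning emotion coincidence coefficient from the two intermediate
    # quantities; the combination used here is likewise an assumption.
    coincidence = emotion_index / (1.0 + change_frequency)
    return emotion_index, change_frequency, coincidence

As a usage example, lip states ["relaxed", "relaxed", "tight"] with eyebrow states ["relaxed", "relaxed", "furrowed"] produce one 2-second stabilization period and one 1-second fluctuation period, from which the three quantities of steps f307, f309 and f310 are then derived.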
6. The AI identification management method for an educational robot according to claim 1, wherein the learning stage of the student for the current monitoring period is identified and analyzed, and the specific analysis process is as follows:
f401: comprehensively analyzing the concentration evaluation coefficient and the learning emotion coincidence coefficient of the student for the current monitoring period to obtain a learning state evaluation coefficient of the student for the current monitoring period;
f402: matching the learning state evaluation coefficient of the student for the current monitoring period with the set learning state evaluation coefficient threshold corresponding to each learning stage to obtain the learning stage of the student for the current monitoring period, as sketched below.
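A brief, hedged Python sketch of steps f401 and f402 follows. The weighting of the two coefficients and the stage names and thresholds (LEARNING_STAGE_THRESHOLDS) are illustrative assumptions; the claims only require that a comprehensive analysis and a threshold matching of this kind be performed.

# Illustrative sketch of claim steps f401-f402.

# f402: assumed learning stages with their evaluation coefficient thresholds,
# checked from the highest stage downwards.
LEARNING_STAGE_THRESHOLDS = [
    ("advanced", 0.8),
    ("consolidation", 0.6),
    ("foundation", 0.0),
]

def learning_state_coefficient(concentration_coeff, emotion_coincidence_coeff,
                               w_concentration=0.6, w_emotion=0.4):
    # f401: one possible "comprehensive analysis" - a weighted sum of the
    # concentration evaluation coefficient and the learning emotion
    # coincidence coefficient (the weights are assumptions).
    return w_concentration * concentration_coeff + w_emotion * emotion_coincidence_coeff

def match_learning_stage(state_coefficient):
    # f402: match the learning state evaluation coefficient against the
    # per-stage thresholds and return the first stage that is reached.
    for stage, threshold in LEARNING_STAGE_THRESHOLDS:
        if state_coefficient >= threshold:
            return stage
    return LEARNING_STAGE_THRESHOLDS[-1][0]

# Example: coefficients of 0.9 and 0.7 combine to 0.82, which falls into the
# assumed "advanced" stage under these thresholds.
stage = match_learning_stage(learning_state_coefficient(0.9, 0.7))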
7. The AI identification management method for an educational robot according to claim 1, wherein the adjustment mode of the educational robot for the next stage is analyzed by the following process:
matching the learning stage of the student for the current monitoring period with the set adjustment mode corresponding to each learning stage for the next stage to obtain the adjustment mode of the student for the next stage after the current monitoring period, and taking it as the adjustment mode of the educational robot for the next stage.
8. The AI identification management method for an educational robot according to claim 1, wherein the learning content of the student for the next stage is correspondingly adjusted by the following process:
matching the adjustment mode of the educational robot for the next stage with the adjusted learning content corresponding to each set adjustment mode to obtain the adjusted learning content of the educational robot for the next stage, and taking it as the learning content of the student for the next stage (a combined sketch of claims 7 and 8 follows).
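Claims 7 and 8 reduce to two chained table lookups, sketched below in Python. The stage names, adjustment mode names, and adjusted learning contents in the two tables are hypothetical placeholders; the claims only require that such set correspondences exist.

# Illustrative sketch of claims 7 and 8: learning stage -> adjustment mode -> learning content.

# Claim 7: assumed correspondence between each learning stage and the
# adjustment mode for the next stage.
ADJUSTMENT_MODE_BY_STAGE = {
    "foundation": "reinforce-basics",
    "consolidation": "maintain-pace",
    "advanced": "extend-difficulty",
}

# Claim 8: assumed correspondence between each adjustment mode and the
# adjusted learning content pushed by the educational robot.
LEARNING_CONTENT_BY_MODE = {
    "reinforce-basics": "review exercises for the current unit",
    "maintain-pace": "scheduled content for the next unit",
    "extend-difficulty": "enrichment tasks beyond the next unit",
}

def next_stage_plan(learning_stage):
    # Claim 7: look up the adjustment mode for the next stage.
    mode = ADJUSTMENT_MODE_BY_STAGE[learning_stage]
    # Claim 8: look up the adjusted learning content for that mode.
    content = LEARNING_CONTENT_BY_MODE[mode]
    return mode, content

# Example: a student classified in the assumed "foundation" stage would be
# assigned the "reinforce-basics" mode and the corresponding review content.
mode, content = next_stage_plan("foundation")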
CN202310689417.2A 2023-06-12 2023-06-12 AI identification management method for educational robot Active CN116453027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310689417.2A CN116453027B (en) 2023-06-12 2023-06-12 AI identification management method for educational robot

Publications (2)

Publication Number Publication Date
CN116453027A 2023-07-18
CN116453027B 2023-08-22

Family

ID=87132328

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310689417.2A Active CN116453027B (en) 2023-06-12 2023-06-12 AI identification management method for educational robot

Country Status (1)

Country Link
CN (1) CN116453027B (en)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170270411A1 (en) * 2016-03-18 2017-09-21 Robert Kocher Brain Matching
CN106599881A (en) * 2016-12-30 2017-04-26 首都师范大学 Student state determination method, device and system
US20190206218A1 (en) * 2017-12-28 2019-07-04 Cerner Innovation, Inc. Utilizing artificial intelligence to detect objects or patient safety events in a patient room
US20190304157A1 (en) * 2018-04-03 2019-10-03 Sri International Artificial intelligence in interactive storytelling
CN112189192A (en) * 2018-06-02 2021-01-05 北京嘀嘀无限科技发展有限公司 System and method for training and using chat robots
US20200075007A1 (en) * 2018-08-31 2020-03-05 Kyoto University Voice interaction system, voice interaction method, program, learning model generation apparatus, and learning model generation method
CN112101074A (en) * 2019-06-18 2020-12-18 深圳市优乐学科技有限公司 Online education auxiliary scoring method and system
CN110298569A (en) * 2019-06-19 2019-10-01 重庆工商职业学院 Learning evaluation method and device based on eye movement identification
CN111796752A (en) * 2020-05-15 2020-10-20 四川科华天府科技有限公司 Interactive teaching system based on PC
US20220022790A1 (en) * 2020-07-22 2022-01-27 Actibrain Bio, Inc. Ai (artificial intelligence) based device for providing brain information
CN112990677A (en) * 2021-03-04 2021-06-18 青岛海科创新科技有限公司 Teaching system, computer equipment and storage medium based on artificial intelligence
US20220319181A1 (en) * 2021-04-06 2022-10-06 AspectO Technologies Pvt Ltd Artificial intelligence (ai)-based system and method for managing education of students in real-time
CN114495217A (en) * 2022-01-14 2022-05-13 建信金融科技有限责任公司 Scene analysis method, device and system based on natural language and expression analysis
CN115240127A (en) * 2022-06-01 2022-10-25 东莞理工学院 Smart television-oriented child monitoring method
CN115731596A (en) * 2022-11-21 2023-03-03 广西师范大学 Spontaneous expression recognition method based on progressive label distribution and depth network
CN116055806A (en) * 2022-12-30 2023-05-02 深圳创维-Rgb电子有限公司 Mode switching processing method and device of intelligent terminal, terminal and storage medium
CN115937961A (en) * 2023-03-02 2023-04-07 济南丽阳神州智能科技有限公司 Online learning identification method and equipment
CN116109455A (en) * 2023-03-09 2023-05-12 电子科技大学成都学院 Language teaching auxiliary system based on artificial intelligence

Also Published As

Publication number Publication date
CN116453027B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
Elias et al. Face recognition attendance system using Local Binary Pattern (LBP)
CN109783642A (en) Structured content processing method, device, equipment and the medium of multi-person conference scene
US11436714B2 (en) Method and apparatus for estimating emotional quality using machine learning
US20160026872A1 (en) Identifying presentation styles of educational videos
US20200193661A1 (en) Signal change apparatus, method, and program
CN110807585A (en) Student classroom learning state online evaluation method and system
US10592733B1 (en) Computer-implemented systems and methods for evaluating speech dialog system engagement via video
Sidorov Changing the image memorability: From basic photo editing to gans
CN113920534A (en) Method, system and storage medium for extracting video highlight
CN116453027B (en) AI identification management method for educational robot
Serwadda et al. Scan-based evaluation of continuous keystroke authentication systems
Muhammad et al. A video summarization framework based on activity attention modeling using deep features for smart campus surveillance system
Pavlopoulou et al. Indoor-outdoor classification with human accuracies: Image or edge gist?
CN113723093B (en) Personnel management policy recommendation method and device, computer equipment and storage medium
Ratsamee et al. Keyframe selection framework based on visual and excitement features for lifelog image sequences
Kasahara et al. Weakly supervised acoustic defect detection in concrete structures using clustering-based augmentation
Gadaley et al. Classroom Engagement Evaluation Using Multi-sensor Fusion with a 180∘ Camera View
Hubens et al. Fake-buster: A lightweight solution for deepfake detection
Mudgal et al. Using saliency and cropping to improve video memorability
CN115689864A (en) Method for training smile image generation model and method for generating smile image
Sun et al. Visual realism assessment for face-swap videos
Lee et al. Combining voice and image recognition for smart home security system
Mai et al. Investigation of art abstraction Using AI and psychological experiment
Vatamaniuk et al. Methods and Algorithms of Audio-Video Signal Processing for Analysis of Indoor Human Activity
Boumiza et al. Development of model for automatic tutor in e-learning environment based on student reactions extraction using facial recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant