CN114792553A - Method and system for screening psychological health group of students - Google Patents
- Publication number
- CN114792553A (application CN202111623754.9A)
- Authority
- CN
- China
- Prior art keywords
- students
- student
- screening
- frequency
- vibration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/20—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/70—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Epidemiology (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Child & Adolescent Psychology (AREA)
- Developmental Disabilities (AREA)
- Hospice & Palliative Care (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The invention discloses a method for screening the mental health of student groups, in the technical field of mental health prevention and treatment, comprising the following steps: a front-end high-frequency camera collects video of the students' facial and neck muscle groups; micro-vibration frequency and amplitude parameters produced by fine head-muscle movement are extracted from the video; mapping parameters and brain-wave information are obtained through a psychophysiological parameter extraction algorithm; these are compared with emotion algorithm models to rapidly assess each student's emotional state; the emotional states are evaluated and files are created; high-risk students with abnormal emotions are screened out and early warnings are issued automatically; and students with abnormal detections are recorded, tracked, re-detected, and offered counseling. The invention also provides a system for screening the mental health of student groups. The method realizes non-contact AI emotion recognition, rapid detection, and large-scale group screening; it can grasp group cognition and psychological changes in time, improve treatment capacity and level, and effectively maintain psychological stability on campus.
Description
Technical Field
The invention belongs to the technical field of online mental cloud platforms, and particularly relates to a student mental health group screening method and system.
Background
At present, the psychological scales and brain-wave detectors commonly used to screen the mental health of college students have the following defects. Psychological scales: 1. most scales in current use (SCL-90, UPI, EPQ, 16PF, and the like) are translations of foreign instruments that cannot fully reflect the characteristics of Chinese college students; the screening standards of these imported questionnaires lack a domestic basis and validation, some scales are outdated, test results vary widely in practice, and testing is performed only once every three months. 2. The tools used to screen college students' psychological problems differ across institutions, which prevents education authorities and related departments from obtaining a unified picture of college students' mental health nationwide. 3. Group testing is time-consuming, labor-intensive, and requires dedicated supervision. Brain-wave detectors can perform accurate small-scale testing but cannot scale to large groups. The invention therefore provides a student mental-health group screening system to meet these needs.
Disclosure of Invention
The invention aims to solve the defects in the prior art and provides a method and a system for screening a psychological health group of students. The method can realize non-contact AI emotion recognition, 30-60 s rapid detection and large-scale population screening.
In order to achieve the purpose, the invention adopts the following technical scheme:
A method for screening the mental health of student groups is designed, specifically comprising the following steps:
Step 1: collect 30-60 seconds of video of the students' facial and neck muscle groups with a front-end high-frequency camera;
Step 2: obtain the micro-vibration frequency and amplitude parameters produced by the fine head-muscle movements controlled by the vestibular organ;
Step 3: according to the mapping relation between brain waves and psychophysiological parameters, convert the vibration frequency and amplitude parameters obtained in step 2 (video to electroencephalogram, and video image to HRV) through a psychophysiological parameter extraction algorithm, obtaining mapping parameters and brain-wave information;
Step 4: compare the mapping parameters and brain-wave information obtained in step 3 with the emotion algorithm models in a basic feature database and rapidly assess each student's emotional state;
Step 5: evaluate the emotional states obtained in step 4 and create files;
Step 6: screen out high-risk suspicious students with abnormal emotions and re-detect them once a week; students whose emotions are abnormal in three consecutive detections are marked by the system, which automatically issues an early warning;
Step 7: record and track the students found abnormal in step 6, communicating online and/or offline face-to-face until their emotions stabilize and tracking stops.
Further, in step 1, before video acquisition, the camera needs to set a video acquisition filter and a video capture format to debug the image.
Further, in step 2, the micro-vibration frequency and amplitude parameters are motion parameters of points on the head; differences in optical contrast exist between points, so a vibration image is formed point by point.
Further, the optical contrast difference includes the following:
the variation of the signal amplitude at each point in space:
A_{x,y} = ( Σ_{i=1}^{N} |U_{x,y,i} − U_{x,y,(i+1)}| ) / N, (formula 1)
wherein U_{x,y,i} is the signal value at point (x, y) in the i-th frame, U_{x,y,(i+1)} is the signal value at point (x, y) in the (i+1)-th frame, and N is the number of frames over which the vibration-image amplitude component is accumulated;
the variation of the signal frequency at each point in space:
F_{x,y} = ( Σ_{i=1}^{N} Δ_i ) / N, (formula 2)
wherein Δ_i is the inter-frame difference of the image at the i-th point (1 if the frames differ, 0 if not), and N is the number of frames over which the vibration-image amplitude component is accumulated;
deriving an emotion algorithm model from formulas (1) and (2):
E_i = f_i(F_{x,y}, A_{x,y}), (formula 3)
wherein E_i is a certain emotional state, F_{x,y} is the change over time (frequency) of the work done by a person's head in the form of micro-vibration, A_{x,y} is the spatial distribution (displacement) of that work, and f_i is the corresponding mapping.
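Formulas (1) and (2) above can be sketched directly in NumPy. This is a minimal, illustrative implementation: the function name and the binarization threshold used to decide whether two frames "differ" are assumptions, since the patent does not specify them.

```python
import numpy as np

def vibration_image(frames, threshold=1.0):
    """Per-pixel amplitude and frequency components of a vibration image.

    frames: stack of N+1 grayscale frames, shape (N+1, H, W).
    threshold: assumed cutoff for the binary inter-frame difference Δ_i.
    """
    frames = np.asarray(frames, dtype=np.float64)
    diffs = np.abs(frames[1:] - frames[:-1])       # |U_{x,y,i} - U_{x,y,(i+1)}|
    n = diffs.shape[0]                             # N accumulated frame pairs
    amplitude = diffs.sum(axis=0) / n              # formula (1)
    delta = (diffs > threshold).astype(np.float64) # Δ_i: 1 if frames differ, else 0
    frequency = delta.sum(axis=0) / n              # formula (2), a rate in [0, 1]
    return amplitude, frequency

# toy example: 11 random 4x4 frames -> 10 inter-frame differences
rng = np.random.default_rng(0)
stack = rng.integers(0, 256, size=(11, 4, 4))
a, f = vibration_image(stack)
```

The two maps A_{x,y} and F_{x,y} are what the emotion models f_i then consume.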
Further, in step 4, the emotion model algorithms include an aggression model algorithm, a stress model algorithm, an anxiety model algorithm, a suspicion model algorithm, a balance model algorithm, a charm model algorithm, a vitality model algorithm, a self-regulation model algorithm, an inhibition model algorithm, a neuroticism model algorithm, a depression model algorithm, a happiness model algorithm, an extroversion model algorithm, and a stability model algorithm.
The model algorithm equation for the aggression value is as follows:
wherein m is the maximum frequency in the frequency-distribution density histogram; F_i is the i-th frequency in the frequency-distribution density histogram; F_in is the input processed vibration frequency; n is the count of inter-frame differences greater than the threshold within N frames;
the model algorithm equation of the pressure value is as follows:
wherein the content of the first and second substances,the total amplitude of the vibration frequency component of the ith line on the left side of the object is obtained;the total amplitude of the vibration frequency component of the ith line on the right side of the object is obtained;is composed ofA maximum value of;the maximum frequency of the ith row vibration frequency component on the left side of the object;the maximum frequency of the ith row vibration frequency component on the right side of the objectIs composed of A maximum value of; n is the number of lines occupied by the image;
the model algorithm equation for the anxiety value is:
wherein, P i (f) Spectral power distributed for vibration frequency; f. of max Is the maximum frequency in the vibration frequency distribution spectrum;
the model algorithm equation for the suspect value is:
wherein E1 is an offensiveness rating; e2 is pressure rating; e3 is anxiety grade;
the equation for the equilibrium value is:
e5 ═ Bl ═ (100-2 Va)%, (equation 8)
Wherein Va is the variability calculation sum of the emotion parameters;
the charm value model algorithm equation is:
wherein, W li -W ri A difference of left and right amplitude averages of the vibration image amplitude component of each line; c li -C ri A difference between maximum frequency values on the left and right sides of the vibration image amplitude component for each line; n is the frame number of the processing process;
the model algorithm equation of the activity value is as follows:
wherein M is the maximum value of the count on the frequency histogram; σ is the standard deviation of the vibration frequency byCalculating a frequency histogram; f ps The maximum value of the input frequency of the vibration image;
the model algorithm equation for the self-regulation value is:
wherein E5 is the average value of the balance parameter during the measurement process; dE5 is the variation range of the balance parameter; e6 is the average value of charm value parameter in the measuring process; dE6 is the variation range of charm parameter;
the model algorithm equation of the inhibition value is as follows:
wherein, F 1 Changing the frequency for the vibration; t is a unit of m Is the average period of the vibration frequency variation; t: measuring the vibration period;
the model algorithm equation of the nerve quality value is as follows:
e10 ═ Nr ═ 10 σ (E9), (formula 13)
Wherein σ (E9) is the standard deviation of the E9 (inhibition) value;
the model algorithm equation for depression values is:
wherein σ is a standard deviation of the vibration frequency in the frequency histogram; m is the average value of the vibration frequency in the frequency histogram;
the happiness value model algorithm equation is:
wherein, I is the information effectiveness of the psychophysiological state; e is the energy reduction characteristic of the psychophysiological state; dI is the change of effectiveness of the psychophysiological state information; dE is the change in the energy reducing characteristic of the psychophysiological state;
the model algorithm equation of the extroversion value is as follows:
wherein R is IE Is the Pearson correlation coefficient between information efficiency (I) and energy consumption (E).
The model algorithm equation for the stability value is:
wherein K is the normalization coefficient of the frequency histogram and y′ is the normal distribution density.
Furthermore, the student's emotional state is obtained as a parameter computed from the calculated values together with their standard deviations and variability.
The invention also provides a system for screening the mental health of student groups, comprising:
the high-frequency camera is used for acquiring videos of face and neck muscle groups of front-end students;
the AI emotion recognition module is used for rapidly analyzing and processing the group data and the picture signals to obtain the motion parameters of the student head vibration points;
the psychophysiological parameter extraction module is used for acquiring mapping parameters and brain wave information according to the mapping relation between brain waves and psychophysiological parameters and the mapping relation between brain wave detection and video detection;
the vibration imaging module is used for optical comparison: it obtains the variation of the signal amplitude at each point in space and the variation of the signal frequency at each point in space, forming a vibration image from every point;
the algorithm model comparison module is used for comparing the extracted parameters with the emotion algorithm models in the basic feature database and rapidly assessing each student's emotional state;
the evaluation module is used for comprehensively evaluating the mental health state of the students and quantifying whether each index is normal or not by using data;
the screening module is used for screening high-risk suspicious students with abnormal emotions;
the early warning module is used for automatically early warning students who continuously detect abnormal emotion for three times;
the file-creating module is used for file management of students taking part in detection and for focused monitoring of the screened key students;
the tracking service module is used for carrying out on-line and/or off-line face-to-face tracking on the screened key students and carrying out psychological communication consultation;
and the recording module is used for making a historical record for each detection of the screened key student and finding the emotion change trend in time.
Further, the comprehensive data assessed by the evaluation module include the aggression, stress, suspicion, balance, charm, vitality, self-regulation, inhibition, neuroticism, depression, happiness, extroversion, and stability of the student's emotional state, as well as the proportions of negative, positive, and physiological emotions in the mental state;
the assessment also includes activity, concentration, and fatigue.
Further, the file-creating module includes a detection-time unit, a name unit, a number unit, an age unit, a department unit, a class unit, a comprehensive-evaluation unit, and an operation unit.
Further, the recording module includes a brain fatigue history and a concentration history.
Compared with the prior art, the method and system for screening the mental health of student groups provided by the invention have the following beneficial effects: by means of artificial intelligence and AI emotion-recognition technology, the group data and picture signals collected from the sensors are rapidly analyzed, processed, and perceived accurately without contact; group cognition and psychological changes are grasped in time, enabling prediction, early warning, and proactive decision-making; governance capacity and level are improved; and psychological stability on campus is effectively maintained. Specifically:
(1) The method is simple and easy to use: detection takes only 1 minute, the psychological state can be visualized, and no inquiry or form-filling is needed.
(2) It is non-contact: the user simply sits in front of the camera for measurement, or is captured while moving under several cameras, with no contact and no additional burden on the body.
(3) Because the psychological state is detected without the subject's awareness, it can be grasped more objectively; unlike a questionnaire, the examinee cannot consciously control the result.
(4) Through regular testing, psychological changes can be discovered in advance so that corresponding interventions can be taken as early as possible.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a block diagram of the present invention relating to the screening of mental health groups of students.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention; it is understood that the described embodiments are only some embodiments of the invention, rather than all embodiments, and that all other embodiments that can be derived by a person of ordinary skill in the art based on the embodiments of the invention without inventive faculty are intended to be within the scope of the invention.
The structural features of the present invention will now be described in detail with reference to the accompanying drawings.
Referring to fig. 1, a screening method for mental health groups of students specifically includes the following steps:
1) Video acquisition: collect 30-60 seconds of video of the student's facial and neck muscle groups with a front-end high-frequency camera. To ensure detection accuracy, the camera must be configured before acquisition: set the video capture filter and the video capture format, and debug the image. If the illumination is constant, use manual configuration; if it is not (e.g., under natural light), use automatic configuration. Because electronic zoom reduces the camera's frame rate, set the "zoom" and "focus" parameters manually, and after adjusting them confirm that the input frame rate remains above 25 frames/second. Setting the capture format's frame rate to 30 and the frame size to 640 x 480 ensures screenshot quality.
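The camera settings above can be sketched with OpenCV. This is an illustrative configuration fragment, not the patent's implementation: property support varies by camera and driver, so the `set()` calls are best-effort and the achieved frame rate must be checked as the text instructs.

```python
import cv2

def open_capture(index=0):
    """Open and configure the front-end camera: 30 fps, 640x480 frames,
    and verify the achieved input frame rate stays above 25 fps."""
    cap = cv2.VideoCapture(index)
    cap.set(cv2.CAP_PROP_FPS, 30)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
    fps = cap.get(cv2.CAP_PROP_FPS)
    if 0 < fps < 25:
        # electronic zoom or driver defaults can drag the rate down
        raise RuntimeError(f"input frame rate {fps:.1f} < 25 fps; "
                           "set zoom/focus manually and re-check")
    return cap
```

In a live deployment the frame rate should also be measured empirically (timing `cap.read()` calls), since some drivers report a nominal rather than actual fps.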
2) Obtaining the head micro-vibration frequency and amplitude parameters: the micro-vibration frequency and amplitude parameters produced by the vestibular organ's control of fine head-muscle movement are obtained from the video of the student's facial and neck muscles. These are motion parameters of points on the head; differences in optical contrast exist between points, and a vibration image is formed point by point.
The optical contrast difference includes the following:
the variation of the signal amplitude at each point in space:
A_{x,y} = ( Σ_{i=1}^{N} |U_{x,y,i} − U_{x,y,(i+1)}| ) / N, (formula 1)
wherein U_{x,y,i} is the signal value at point (x, y) in the i-th frame, U_{x,y,(i+1)} is the signal value at point (x, y) in the (i+1)-th frame, and N is the number of frames over which the vibration-image amplitude component is accumulated;
the variation of the signal frequency at each point in space:
F_{x,y} = ( Σ_{i=1}^{N} Δ_i ) / N, (formula 2)
wherein Δ_i is the inter-frame difference of the image at the i-th point (1 if the frames differ, 0 if not), and N is the number of frames over which the vibration-image amplitude component is accumulated;
The emotion model algorithm is derived from formulas (1) and (2):
E_i = f_i(F_{x,y}, A_{x,y}), (formula 3)
wherein E_i is a certain emotional state, F_{x,y} is the change over time (frequency) of the work done by a person's head in the form of micro-vibration, A_{x,y} is the spatial distribution (displacement) of that work, and f_i is the corresponding mapping.
The emotion model algorithms include an aggression model algorithm, a stress model algorithm, an anxiety model algorithm, a suspicion model algorithm, a balance model algorithm, a charm model algorithm, a vitality model algorithm, a self-regulation model algorithm, an inhibition model algorithm, a neuroticism model algorithm, a depression model algorithm, a happiness model algorithm, an extroversion model algorithm, and a stability model algorithm.
The model algorithm equation for the aggression value is as follows:
wherein m is the maximum frequency in the frequency-distribution density histogram; F_i is the i-th frequency in the frequency-distribution density histogram; F_in is the input processed vibration frequency; n is the count of inter-frame differences greater than the threshold within N frames. Aggression is determined by the maximum of the frequency-distribution density reflected in the vibration-frequency histogram and the standard deviation of the person's vibration.
The model algorithm equation for the stress value is as follows:
wherein A_i^L is the total amplitude of the vibration-frequency component of the i-th line on the left side of the object; A_i^R is the total amplitude of the vibration-frequency component of the i-th line on the right side; A_max is the maximum of these amplitudes; F_i^L is the maximum frequency of the vibration-frequency component of the i-th line on the left side; F_i^R is the maximum frequency on the right side; F_max is the maximum of these frequencies; n is the number of lines occupied by the image. The degree of asymmetry of the external frequency vibrations reflects the stress level.
The model algorithm equation for the anxiety value is:
wherein P_i(f) is the spectral power of the vibration-frequency distribution; f_max is the maximum frequency in the vibration-frequency distribution spectrum. A high density of high-frequency vibrations reflects a high anxiety level.
The model algorithm equation for the suspicion value is:
wherein E1 is the aggression rating; E2 is the stress rating; E3 is the anxiety rating. Aggression, stress, and anxiety reflect the suspicion level.
The equation for the balance value is:
E5 = Bl = (100 − 2Va)%,
wherein Va is the summed variability of the emotion parameters; the ratio of the standard deviation to the mathematical expectation reflects an emotion parameter's variability.
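Formula (8) is simple enough to state directly. The sketch below assumes Va is already expressed in percent, as the formula's percentage form suggests; the function name is illustrative.

```python
def balance_value(va):
    """E5 = Bl = (100 - 2*Va)%, per formula (8).

    va: summed variability of the emotion parameters, in percent.
    Returns the balance value E5 in percent.
    """
    return 100.0 - 2.0 * va

e5 = balance_value(10.0)  # Va = 10 -> E5 = 80
```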
The charm value model algorithm equation is:
wherein W_li − W_ri is the difference of the left and right amplitude averages of each line's vibration-image amplitude component; C_li − C_ri is the difference between the maximum frequency values on the left and right sides of each line's vibration-image amplitude component; N is the number of frames processed. The symmetry of the head's micro-movements reflects the charm value.
The model algorithm equation for the vitality value is:
wherein M is the maximum count in the frequency histogram; σ is the standard deviation of the vibration frequency, calculated from the frequency histogram; F_ps is the maximum input frequency of the vibration image. Vitality is the difference between the maximum density and the standard deviation of the vibration frequency.
The model algorithm equation for the self-regulation value is:
wherein E5 is the average of the balance parameter over the measurement; dE5 is its range of variation; E6 is the average of the charm parameter over the measurement; dE6 is its range of variation. The self-regulation rating reflects the stability of positive emotions (vitality and charm).
The model algorithm equation for the inhibition value is:
wherein F_1 is the vibration change frequency; T_m is the average period of the vibration-frequency variation; T is the measured vibration period. The inhibition level is derived from the person's response time to a stimulus.
The model algorithm equation for the neuroticism value is:
E10 = Nr = 10σ(E9),
wherein σ(E9) is the standard deviation of the E9 (inhibition) value; the standard deviation of the inhibition parameter reflects the neuroticism grade.
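Formula (13) can likewise be computed over the series of inhibition readings collected during a measurement. One assumption is made: the patent does not say whether the population or sample standard deviation is intended, so the population form is used here.

```python
import statistics

def neuroticism(inhibition_series):
    """E10 = Nr = 10 * sigma(E9): ten times the standard deviation of
    the inhibition (E9) values recorded during the measurement."""
    return 10.0 * statistics.pstdev(inhibition_series)

e10_flat = neuroticism([4.0, 4.0, 4.0, 4.0])  # constant inhibition -> 0
e10_var = neuroticism([2.0, 4.0, 2.0, 4.0])   # fluctuating inhibition -> 10
```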
The model algorithm equation for depression values is:
wherein σ is a standard deviation of the vibration frequency in the frequency histogram; m is the average value of the vibration frequency in the frequency histogram; the standard deviation and mean of the vibration frequencies of the frequency histogram reflect the depression level.
The happiness value model algorithm equation is as follows:
wherein, I is the information effectiveness of the psychophysiological state; e is the energy reduction characteristic of the psychophysiological state; dI is the change of the effectiveness of the psychophysiological state information; dE is the change in the energy reduction characteristic of the psychophysiological state; information efficiency and energy consumption reflect the happiness level.
The model algorithm equation for the extroversion value is:
wherein R_IE is the Pearson correlation coefficient between information efficiency (I) and energy consumption (E); the correlation between information efficiency and energy consumption reflects the extroversion level.
The model algorithm equation for the stability value is:
wherein K is a normalization coefficient of the frequency histogram, and y' is a normal distribution density;
the degree of similarity of the frequency histogram and the normal distribution rule reflects the level of stability.
3) Acquiring mapping parameters and brain-wave information: according to the mapping relation between brain waves and psychophysiological parameters, the obtained vibration frequency and amplitude parameters are converted (video to electroencephalogram, and video image to HRV) through a psychophysiological parameter extraction algorithm to obtain mapping parameters and brain-wave information. From the generated vibration image, a real-time haloed image can be produced and displayed: the length of a halo line depends on the centerline of the vibration amplitude, and its color depends on the maximum vibration frequency within the vibration-image range.
4) Model comparison: the mapping parameters and brain-wave information obtained above are compared with the emotion algorithm models in the basic feature database to rapidly assess each student's emotional state. Emotional states include aggression, stress, anxiety, suspicion, balance, charm, vitality, self-regulation, inhibition, neuroticism, depression, happiness, extroversion, and stability. Each state parameter is obtained from its average over the detection time, its root-mean-square error, and its variability. Once data acquisition completes, a table can be opened on the system's main interface containing a vibration-frequency analysis histogram and a psychophysiological state analysis chart; after a certain test time, a person's psychological energy state can be displayed visually as a graph.
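The per-parameter statistics described above (average over the detection time, root-mean-square deviation, variability) can be sketched as follows. The definition of "variability" as standard deviation over mean, in percent, is an assumption consistent with the balance-value discussion earlier ("the ratio of standard deviation and mathematical expectation reflects the emotion parameter variability"); the function and key names are illustrative.

```python
import statistics

def summarize_parameter(samples):
    """Statistics for one emotion parameter over a detection window:
    mean, population standard deviation, and variability (std/mean, %)."""
    m = statistics.fmean(samples)
    sd = statistics.pstdev(samples)
    variability = 100.0 * sd / m if m else 0.0
    return {"mean": m, "std": sd, "variability_pct": variability}

# e.g. three readings of one parameter during a 30-60 s detection
stats = summarize_parameter([40.0, 50.0, 60.0])
```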
5) Emotional state evaluation: evaluate the emotional states assessed in step 4 and create files. Students taking part in detection are placed under file management, and key personnel are monitored.
6) Screening and early warning: high-risk suspicious students with abnormal emotions are screened out and re-detected once a week; students whose emotions are abnormal in three consecutive detections are marked by the system, which automatically issues an early warning.
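The three-consecutive-abnormal trigger in step 6 reduces to a small state machine: the counter advances on each abnormal weekly detection and resets on a normal one. This is a minimal sketch of that logic; the names are illustrative.

```python
def update_warning(consecutive_abnormal, abnormal_now):
    """Advance the consecutive-abnormal counter for one weekly detection.

    Returns (new_count, warn): warn is True once three abnormal
    detections have occurred in a row, triggering the early warning."""
    count = consecutive_abnormal + 1 if abnormal_now else 0
    return count, count >= 3

# one normal result resets the streak; three abnormal in a row warn
count, warn = 0, False
for result in [True, True, False, True, True, True]:
    count, warn = update_warning(count, result)
```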
7) Tracking and consultation: students with abnormal detections are recorded, tracked, and re-detected, with online and/or offline face-to-face communication until their emotions stabilize and tracking stops. A history of each detection of key personnel is kept so that emotional trends are found in time.
For further illustration, the invention further provides a system for screening the mental health group of students, comprising:
the high-frequency camera is used for collecting videos of face and neck muscle groups of front-end students, and the high-frequency camera needs to be over against the faces of the students during collection.
And the AI emotion recognition module is used for rapidly analyzing and processing the group data and the picture signals to obtain the motion parameters of the student head vibration points.
And the psychophysiological parameter extraction module is used for acquiring mapping parameters and brain wave information according to the mapping relation between brain waves and psychophysiological parameters and the mapping relation between brain wave detection and video detection.
And the vibration imaging module is used for optical contrast, the change of the amplitude of the signal at each point in space and the change of the frequency of the signal at each point in space are obtained, and each point is obtained to form vibration imaging.
And the algorithm model comparison module, used for comparing the extracted parameters with the emotion algorithm models in the basic feature database and rapidly assessing each student's emotional state. Student emotional states include aggression, stress, suspicion, balance, charm, vitality, self-regulation, inhibition, neuroticism, depression, happiness, extroversion, and stability.
The evaluation module, used for comprehensively evaluating the students' mental-health state and quantifying with data whether each index is normal; the comprehensive data it assesses include the aggression, stress, suspicion, balance, charm, vitality, self-regulation, inhibition, neuroticism, depression, happiness, extroversion, and stability of the student's emotional state, as well as the proportions of negative, positive, and physiological emotions in the mental state.
And the screening module is used for screening high-risk suspicious students with abnormal emotion.
And the early warning module is used for automatically issuing an early warning for students whose emotions are detected as abnormal three times in a row.
The file creating module is used for performing archive management for students participating in detection and for focusing monitoring on the screened key students; the file creating module comprises a detection time unit, a name unit, a number unit, an age unit, a department unit, a class unit, a comprehensive evaluation unit and an operation unit.
And the tracking service module is used for tracking the screened key students online and/or offline face to face and conducting psychological communication and counseling.
And the recording module is used for keeping a historical record of each detection of the screened key students so that trends in emotional change are found in time. The recording module comprises a brain fatigue history record and a concentration history record.
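The per-point computation that the vibration imaging module performs — the change in signal amplitude and the change in signal frequency at each point in space, matching formulas (1) and (2) of the claims — can be sketched as follows. This is a minimal illustration assuming grayscale frames stacked in a NumPy array; the function name and array layout are our own choices, not the patented implementation.

```python
import numpy as np

def vibration_image(frames):
    """Compute the per-point amplitude and frequency components of a
    vibration image from a stack of N+1 grayscale frames.

    frames: array of shape (N+1, H, W), pixel intensities.
    Returns (A, F), each of shape (H, W):
      A = sum_i |U_i - U_(i+1)| / N   (amplitude component, formula (1))
      F = sum_i delta_i / N           (frequency component, formula (2)),
    where delta_i is 1 if frames i and i+1 differ at that point, else 0.
    """
    frames = np.asarray(frames, dtype=np.float64)
    diffs = np.abs(frames[1:] - frames[:-1])   # inter-frame differences
    n = diffs.shape[0]                         # N accumulated frames
    amplitude = diffs.sum(axis=0) / n
    frequency = (diffs > 0).sum(axis=0) / n    # fraction of frames with a change
    return amplitude, frequency
```

Points with strong micro-movement (e.g. head vibration driven by the vestibular system) yield high amplitude and frequency values, while static background points yield zeros — which is what lets the optical-contrast image separate vibrating regions from still ones.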
Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will recognize that changes may be made in the form and details of the embodiments described herein without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (10)
1. A screening method for a psychological health group of students is characterized by comprising the following steps:
S1: acquiring a 30-60 second video of the face and neck muscle groups of a student with a front-end high-frequency camera;
S2: acquiring the micro-vibration frequency and amplitude parameters generated by the micro-movements of the head muscles controlled by the vestibular organ;
S3: according to the mapping relation between brain waves and psychophysiological parameters, converting the vibration frequency and amplitude parameters obtained in step S2 between video and electroencephalogram and between video image and HRV (heart rate variability) through a psychophysiological parameter extraction algorithm, so as to obtain mapping parameters and brain wave information;
S4: comparing the mapping parameters and the brain wave information obtained in step S3 with the emotion algorithm models in the basic feature database, and rapidly counting the emotional state of the student;
S5: evaluating the emotional states of the students counted in S4, and creating files;
S6: screening out high-risk students with abnormal emotions and detecting the screened students once a week; when a screened student's emotions are abnormal three times in a row, the system marks the student and automatically issues an early warning;
S7: performing recorded tracking detection on the abnormal students from S6, with online and/or offline face-to-face communication, and stopping the tracking detection once the emotions are stable.
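The screening rule of step S6 — weekly re-detection, with an automatic early warning once a student's emotions are abnormal three times in a row — can be sketched as a small stateful check. The class name, threshold constant, and return convention below are illustrative assumptions; the patent specifies only the "three consecutive abnormal detections" trigger.

```python
from collections import defaultdict

CONSECUTIVE_ABNORMAL_LIMIT = 3  # threshold stated in step S6

class ScreeningTracker:
    """Track weekly detection results and flag students whose
    emotions are abnormal three times in a row."""

    def __init__(self):
        # student id -> current count of consecutive abnormal results
        self._streak = defaultdict(int)

    def record(self, student_id, abnormal):
        """Record one weekly detection result; return True when an
        early warning should be issued for this student."""
        if abnormal:
            self._streak[student_id] += 1
        else:
            self._streak[student_id] = 0  # a normal result breaks the streak
        return self._streak[student_id] >= CONSECUTIVE_ABNORMAL_LIMIT
```

A normal result at any week resets the streak, so only genuinely consecutive abnormal detections trigger the warning.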
2. The method for screening psychological health groups of students according to claim 1, wherein in S1, before video collection the camera requires setting of the video capture filter and the video capture format for image adjustment.
3. The method for screening a student mental health group as claimed in claim 1, wherein in step S2, the micro-vibration frequency and amplitude parameters are motion parameters of points on the head; there is a difference in optical contrast between the points, and the points form a vibration image.
4. The method for screening a mental health population of a student according to claim 3, wherein the optical contrast difference comprises the following:
the change of the amplitude of the signal at each point in space:

$$A_{x,y} = \frac{\sum_{i=1}^{N}\left|U_{x,y,i} - U_{x,y,(i+1)}\right|}{N} \quad (1)$$

wherein: $U_{x,y,i}$ is the signal value at point $(x, y)$ in the $i$-th frame, $U_{x,y,(i+1)}$ is the signal value at point $(x, y)$ in the $(i+1)$-th frame, and $N$ is the number of accumulated frames of the vibration image amplitude component;
the change of the frequency of the signal at each point in space:

$$F_{x,y} = \frac{\sum_{i=1}^{N}\Delta_i}{N} \quad (2)$$

wherein: $\Delta_i$ is the inter-frame difference of the image at the $i$-th point (1 if there is a difference, 0 if there is no difference), and $N$ is the number of accumulated frames of the vibration image amplitude component;
an emotion model algorithm is derived from formula (1) and formula (2).
5. The student mental health population screening method of claim 4, wherein the emotion model algorithms include aggression model algorithms, stress model algorithms, suspicion model algorithms, balance model algorithms, charm model algorithms, vitality model algorithms, self-regulation model algorithms, suppression model algorithms, nervousness model algorithms, depression model algorithms, happiness model algorithms, extroversion model algorithms, and stability model algorithms.
6. The student mental health group screening method of claim 5, wherein the emotional state of the student is a parameter computed from a calculated value, a standard deviation, and a variance.
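Claim 6 says only that the emotional-state parameter is computed from a calculated value, a standard deviation, and a variance. One plausible reading — offered purely as an illustrative assumption, not the patented formula — is a z-score-style normalization of a raw emotion measure against baseline statistics:

```python
import statistics

def emotion_parameter(raw_value, baseline_samples):
    """Normalize a raw emotion measure against baseline statistics.

    Returns (mean, stdev, variance, z), where z is the standardized
    deviation of raw_value from the baseline mean. This is an
    illustrative reading of claim 6, not the patented computation.
    """
    mean = statistics.mean(baseline_samples)
    stdev = statistics.stdev(baseline_samples)
    variance = statistics.variance(baseline_samples)
    z = (raw_value - mean) / stdev if stdev else 0.0
    return mean, stdev, variance, z
```

Under this reading, a student's index would be flagged abnormal when its standardized deviation from the population baseline exceeds a chosen threshold.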
7. A system for screening a mental health group of students, comprising:
the high-frequency camera is used for acquiring the videos of the face and neck muscle groups of the front-end students;
the AI emotion recognition module is used for rapidly analyzing and processing the collected data and image signals to obtain the motion parameters of the vibration points on the student's head;
the psychophysiological parameter extraction module is used for acquiring mapping parameters and brain wave information according to the mapping relation between brain waves and psychophysiological parameters and the mapping relation between brain wave detection and video detection;
the vibration imaging module is used for obtaining, by optical contrast, the change of the signal amplitude at each point in space and the change of the signal frequency at each point in space, the points together forming a vibration image;
the algorithm model comparison module is used for comparing the mapping parameters and brain wave information with the emotion algorithm models in the basic feature database and rapidly counting the emotional state of the student;
the evaluation module is used for comprehensively evaluating the mental health state of the students and quantifying whether each index is normal or not by using data;
the screening module is used for screening out high-risk students with abnormal emotions;
the early warning module is used for automatically issuing an early warning for students whose emotions are detected as abnormal three times in a row;
the file creating module is used for performing archive management for students participating in detection and for focusing monitoring on the screened key students;
the tracking service module is used for tracking the screened key students online and/or offline face to face and conducting psychological communication and counseling;
and the recording module is used for keeping a historical record of each detection of the screened key students so that trends in emotional change are found in time.
8. The student mental health population screening system of claim 7, wherein the evaluation module assesses aggregate data including the aggression, stress, suspicion, balance, charm, vitality, self-regulation, suppression, nervousness, depression, happiness, extroversion and stability of the student's emotional state, as well as the proportions of negative emotions, positive emotions and physiological emotions within the aggregate data of the mental state;
the assessment also includes activity, concentration, and fatigue.
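The proportion computation mentioned in claim 8 — the share of negative, positive, and physiological emotions within the aggregate data — can be sketched as below. The grouping of the thirteen indices into the three categories is entirely an illustrative assumption, since the patent does not specify the mapping:

```python
# Assumed grouping of the thirteen emotion indices into three
# categories; the patent does not spell this mapping out.
GROUPS = {
    "negative": ["aggression", "stress", "suspicion", "nervousness", "depression"],
    "positive": ["charm", "happiness", "extroversion", "balance"],
    "physiological": ["vitality", "self-regulation", "suppression", "stability"],
}

def emotion_proportions(scores):
    """Return each category's share of the summed emotion scores."""
    totals = {g: sum(scores.get(k, 0.0) for k in keys)
              for g, keys in GROUPS.items()}
    overall = sum(totals.values())
    if overall == 0:
        return {g: 0.0 for g in totals}
    return {g: t / overall for g, t in totals.items()}
```

The three shares sum to one, so a rising negative-emotion proportion across weekly detections is directly comparable between students.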
9. The student mental health group screening system of claim 7, wherein the profile creation module includes a detection time unit, a name unit, a number unit, an age unit, a department unit, a class unit, a comprehensive evaluation unit, and an operation unit.
10. The student mental health population screening system of claim 7, wherein the recording module includes a brain fatigue history and a concentration history.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111623754.9A CN114792553A (en) | 2021-12-28 | 2021-12-28 | Method and system for screening psychological health group of students |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111623754.9A CN114792553A (en) | 2021-12-28 | 2021-12-28 | Method and system for screening psychological health group of students |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114792553A true CN114792553A (en) | 2022-07-26 |
Family
ID=82459428
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111623754.9A Pending CN114792553A (en) | 2021-12-28 | 2021-12-28 | Method and system for screening psychological health group of students |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114792553A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116311510A (en) * | 2023-03-08 | 2023-06-23 | 广东兆邦智能科技股份有限公司 | Emotion detection method and system based on image acquisition |
CN116741344A (en) * | 2023-06-05 | 2023-09-12 | 厦门纳智壳生物科技有限公司 | Student physiological and psychological health screening system based on computer vision and algorithm technology |
CN117414135A (en) * | 2023-10-20 | 2024-01-19 | 郑州师范学院 | Behavioral and psychological abnormality detection method, system and storage medium |
CN116311510B (en) * | 2023-03-08 | 2024-05-31 | 广东兆邦智能科技股份有限公司 | Emotion detection method and system based on image acquisition |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107809951B (en) | Psychophysiological detection (lie detection) method and apparatus for distortion using video-based detection of physiological signals | |
US7344251B2 (en) | Mental alertness level determination | |
RU2292839C2 (en) | Method and device for analyzing human behavior | |
EP2800507B1 (en) | Apparatus for psychiatric evaluation | |
CN110507335A (en) | Inmate's psychological health states appraisal procedure and system based on multi-modal information | |
CN114792553A (en) | Method and system for screening psychological health group of students | |
EP3030151A1 (en) | System and method for detecting invisible human emotion | |
CN112957042B (en) | Non-contact target emotion recognition method and system | |
KR101689021B1 (en) | System for determining psychological state using sensing device and method thereof | |
JP6863563B2 (en) | Stress evaluation system | |
US20150305662A1 (en) | Remote assessment of emotional status | |
WO2014150684A1 (en) | Artifact as a feature in neuro diagnostics | |
KR20140041382A (en) | Method for obtaining information about the psychophysiological state of a living being | |
CN115517681A (en) | Method and system for monitoring mood fluctuation and evaluating emotional disorder state of MD (MD) patient | |
WO2023012818A1 (en) | A non-invasive multimodal screening and assessment system for human health monitoring and a method thereof | |
CN115299947A (en) | Psychological scale confidence evaluation method and system based on multi-modal physiological data | |
Sabry et al. | Automatic estimation of laryngeal vestibule closure duration using high-resolution cervical auscultation signals | |
US11670423B2 (en) | Method and system for early detection of neurodegeneration using progressive tracking of eye-markers | |
CN113143274A (en) | Emotion early warning method based on camera | |
EP3529764A1 (en) | Device for determining features of a person | |
CN111613338A (en) | Method and system for constructing spike-slow complex wave detection model | |
CN111723869A (en) | Special personnel-oriented intelligent behavior risk early warning method and system | |
Mantri et al. | Real time multimodal depression analysis | |
KR101940673B1 (en) | Evaluation Method of Empathy based on micro-movement and system adopting the method | |
Pecundo et al. | Amyotrophic lateral sclerosis and post-stroke orofacial impairment video-based multi-class classification |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||