CN117438048A - Method and system for assessing psychological disorder of psychiatric patient - Google Patents


Info

Publication number
CN117438048A
CN117438048A (application number CN202311759887.8A)
Authority
CN
China
Prior art keywords
coefficient, patient, psychological, jcx, interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311759887.8A
Other languages
Chinese (zh)
Other versions
CN117438048B (en)
Inventor
张超胜
李军涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Longgang Third People's Hospital
Original Assignee
Shenzhen Longgang Third People's Hospital
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Longgang Third People's Hospital filed Critical Shenzhen Longgang Third People's Hospital
Priority to CN202311759887.8A priority Critical patent/CN117438048B/en
Publication of CN117438048A publication Critical patent/CN117438048A/en
Application granted granted Critical
Publication of CN117438048B publication Critical patent/CN117438048B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/021: Measuring pressure in heart or blood vessels
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316: Modalities, i.e. specific diagnostic methods
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/48: Other medical applications
    • A61B5/4806: Sleep evaluation
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235: Details of waveform analysis
    • A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/0464: Convolutional networks [CNN, ConvNet]
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/10: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to drugs or medications, e.g. for ensuring correct administration to patients
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Abstract

The invention discloses a psychological disorder assessment method and system for psychiatric patients, in the technical field of psychological disorder assessment. A psychological disorder detection model is established through steps S1-S4, combining physiological data, whole-course video and virtual reality technology; a basic scoring coefficient Jcx1, an abnormal physiological coefficient Jcx2, an interaction positive coefficient Jcx3 and an interaction scenario coefficient Jcx4 are calculated from the multi-channel acquisition data. In particular, through real-time data acquisition and the creation of virtual environments, the method can capture changes in patients under different situations. By introducing virtual reality technology to simulate social scenes, public places and home environments, the patient's behavioral and emotional responses in daily life are displayed more truthfully in the evaluation, improving how faithfully the evaluation reproduces real situations. By analyzing the comprehensive evaluation coefficient Zh against different symptom thresholds, a personalized counseling scheme is generated, providing customized treatment suggestions for different patients and raising the individualization level of treatment.

Description

Method and system for assessing psychological disorder of psychiatric patient
Technical Field
The invention relates to the technical field of psychological disorder assessment, in particular to a psychological disorder assessment method and system for psychiatric patients.
Background
Traditional methods for assessing mental disorders in psychiatric patients rely mainly on professional observation, patients' verbal expression and standardized psychological assessment tools, all of which have limitations. In a clinical setting, patients may not truly exhibit the behavioral and emotional responses they show in daily life, so the assessment may be neither comprehensive nor objective. Furthermore, traditional psychological assessment is often intermittent, making it difficult to capture changes in patients across different circumstances.
Traditional methods rely primarily on patient self-reporting and expert observation and lack comprehensive multidimensional information. The evaluation result may therefore be limited to one-sided observation and subjective judgment, and the patient's psychological state cannot be fully understood.
Traditional methods also find it difficult to reproduce, during evaluation, the real situations of the patient's daily life, so the patient's behavioral responses in different scenes cannot be comprehensively observed.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a psychological disorder assessment method and system for psychiatric patients, which are used to solve the problems mentioned in the background art.
In order to achieve the above purpose, the invention is realized by the following technical scheme: a method for assessing psychological disorders of a psychiatric patient comprises the following steps:
S1, establishing a psychological disorder detection model, wherein the psychological disorder detection model is generated by training a plurality of psychological disorder characteristic samples acquired in advance through a convolutional neural network;
S2, diagnosing a plurality of target patients by a psychiatric specialist using a professional psychological disorder diagnosis table, obtaining diagnosis result data and establishing a first quantitative data set; fitting a smart bracelet to each target patient, collecting physiological data, and establishing a second target database; during the psychologist's interactive diagnosis, recording whole-course video of the target patient with camera equipment and establishing a third target database; creating various virtual environments, including social scenes, public places and home environments, using virtual reality technology; after the target patient puts on a VR headset, simulating interaction scenes with virtual characters in those environments, collecting virtual scene behavior data, and establishing a fourth target database;
s3, analyzing the first quantized data set, the second target database, the third target database and the fourth target database, extracting psychological barrier characteristics, inputting the psychological barrier characteristics into a psychological barrier detection model, and obtaining a basic scoring coefficient Jcx, an abnormal physiological coefficient Jcx2, an interaction positive coefficient Jcx and an interaction scenario coefficient Jcx;
S4, establishing an evaluation model, associating the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 to obtain a comprehensive evaluation coefficient Zh, inputting the comprehensive evaluation coefficient Zh into the evaluation model, and comparing it with a typical symptom threshold ZZ to obtain a symptom evaluation result; and generating a corresponding counseling scheme according to the symptom evaluation result.
Preferably, the S1 includes: s11, collecting sample characteristics related to psychological disorders, including data from psychological assessment, physiological detection, speech analysis and behavioral analysis; processing the acquired sample data to obtain sample characteristic data after processing missing values, abnormal values and standardization;
s12, extracting, training and verifying sample feature data by using a convolutional neural network CNN to obtain a psychological barrier detection model, deriving the trained psychological barrier detection model, and deploying the psychological barrier detection model into an actual application scene.
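The data cleaning named in S11 (missing values, abnormal values, standardization) can be sketched as below. This is a minimal illustration, not the patent's actual pipeline: the mean-imputation and 3-sigma clipping rules, and the sample values, are assumptions chosen for demonstration.

```python
import numpy as np

def preprocess(samples: np.ndarray) -> np.ndarray:
    """Fill missing values with the column mean, clip outliers to
    mean +/- 3 sigma, then z-score standardize each column."""
    x = samples.astype(float).copy()
    # missing values -> column mean
    col_mean = np.nanmean(x, axis=0)
    rows, cols = np.where(np.isnan(x))
    x[rows, cols] = col_mean[cols]
    # abnormal values -> clip to mean +/- 3 std
    mu, sd = x.mean(axis=0), x.std(axis=0)
    x = np.clip(x, mu - 3 * sd, mu + 3 * sd)
    # standardize (guard against zero-variance columns)
    sd = np.where(x.std(axis=0) == 0, 1.0, x.std(axis=0))
    return (x - x.mean(axis=0)) / sd

# Illustrative rows: blood pressure, heart rate, sleep hours (one value missing)
raw = np.array([[120.0, np.nan, 7.5],
                [135.0, 88.0, 6.0],
                [150.0, 72.0, 8.0]])
clean = preprocess(raw)
```

The cleaned matrix then feeds the convolutional neural network training of S12; each column has zero mean and unit variance, which helps the network converge.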
Preferably, the first quantitative data set is acquired by a psychiatric specialist using a validated and standardized mental disorder diagnostic tool, including a DSM-5 standard tool and a mental disorder diagnostic table; the psychological disorder diagnosis form comprises a symptom self-rating form, a depression symptom questionnaire and an anxiety symptom questionnaire;
selecting a smart bracelet capable of collecting blood pressure, heart rate, sleep information and galvanic skin response; fitting the smart bracelet to the target patient and collecting physiological data in real time; the physiological data include blood pressure, heart rate, sleep information, and galvanic skin response information;
deploying camera equipment to record high-definition video, capturing the target patient's facial expressions, posture and speech; recording video throughout the patient's interaction with the psychologist, storing it along a time axis, and establishing a third target database.
Preferably, video clips under different conditions are recorded, including supermarket shopping, social activities and home environments; medical staff play roles in different scenes, interact with a patient in a virtual environment through a virtual reality VR technology, record and collect virtual scene behavior data, and establish a fourth target database.
Preferably, the first quantized data set is analyzed by the psychological disorder detection model to obtain the basic scoring coefficient Jcx1; the second target database is analyzed to obtain the abnormal physiological coefficient Jcx2; the third target database is processed, analyzed and quantitatively calculated to obtain the interaction positive coefficient Jcx3; and the fourth target database is processed, analyzed and quantitatively calculated to obtain the interaction scenario coefficient Jcx4.
Preferably, a plurality of psychological barrier features are extracted from the first quantized data set and marked X1, X2, X3, ..., Xn; the basic scoring coefficient Jcx1 is calculated by the following formula:
Jcx1 = F(X1, X2, X3, ..., Xn)
wherein X1, X2, X3, ..., Xn denote the psychological barrier features from the first quantized data set, and F represents an analysis function trained based on the psychological disorder detection model;
extracting from the second target database the number of blood pressure abnormalities Y1 (times of hypertension and hypotension), the number of heart rate abnormalities Y2, the number of daily sleep-onset time abnormalities Y3, and the number of sweat-gland activity increases in the galvanic skin response Y4; after dimensionless treatment, the abnormal physiological coefficient Jcx2 is calculated by the following formula:
Jcx2 = f1×Y1 + f2×Y2 + f3×Y3 + f4×Y4 + C1
wherein the number of blood pressure abnormalities Y1 is the number of times the recorded blood pressure lies above or below the preset blood pressure threshold range; the number of heart rate abnormalities Y2 is the number of times the recorded heart rate lies above or below the preset heart rate threshold range; the number of daily sleep-onset time abnormalities Y3 is the number of times the recorded sleep-onset time falls below the preset sleep-time range; the number of sweat-gland activity increases Y4 is obtained from galvanic skin response statistics reflecting increased sweat-gland activity; f1, f2, f3 and f4 are weights, and C1 is a first constant correction coefficient.
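The weighted-sum form of Jcx2 can be sketched as below. The weight values, the correction constant and the input counts are illustrative assumptions; the patent specifies only that the weights and a first constant correction coefficient exist.

```python
def jcx2(y1: float, y2: float, y3: float, y4: float,
         weights=(0.3, 0.3, 0.2, 0.2), c1: float = 0.05) -> float:
    """Abnormal physiological coefficient: a weighted sum of the four
    dimensionless abnormality counts plus a constant correction term.
    Weight and c1 values here are illustrative, not from the patent."""
    f1, f2, f3, f4 = weights
    return f1 * y1 + f2 * y2 + f3 * y3 + f4 * y4 + c1

# counts already normalised to [0, 1] by the dimensionless treatment
score = jcx2(0.4, 0.2, 0.5, 0.1)
```

With these assumed weights the example evaluates to 0.3×0.4 + 0.3×0.2 + 0.2×0.5 + 0.2×0.1 + 0.05 = 0.35.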
Preferably, the whole-course video of the target patient in the third target database is divided into a plurality of segments along the time axis, with segment boundaries triggered dynamically by changes in speech, action and facial expression in each segment; using facial expression analysis technology, including a deep-learning-based expression recognition model, facial emotion features are extracted and their occurrence counts recorded, including:
anger b1: facial muscles tense, eyebrows drawn together, corners of the mouth turned down;
aversion b2: nose wrinkled, corners of the mouth turned down;
fear b3: eyebrows raised, eyes opened wide;
happiness b4: mouth smiling, eyes narrowed to a line;
sadness b5: eyebrows drooping, corners of the mouth drooping;
surprise b6: eyebrows raised, eyes opened wide;
contempt b7: one corner of the mouth raised;
tiredness b8: pronounced eye bags, dull eyes, accompanied by yawning;
concentration b9: eyebrows slightly drawn together, mouth tightly closed;
the emotion positivity JJD is calculated by quantifying the occurrence counts of anger b1, aversion b2, fear b3, happiness b4, sadness b5, surprise b6, contempt b7, tiredness b8 and concentration b9, using the following formula:
JJD = a×b1 + c×b2 + d×b3 + e×b4 + f×b5 + g×b6 + h×b7 + i×b8 + j×b9 + C2
wherein a, c, d, e, f, g, h, i and j are the proportional coefficients of anger b1, aversion b2, fear b3, happiness b4, sadness b5, surprise b6, contempt b7, tiredness b8 and concentration b9 respectively; a, c, d, e, f, g, h, i and j are each greater than 0 with a+c+d+e+f+g+h+i+j=1.0, and C2 is a second constant correction coefficient;
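The emotion positivity computation can be sketched as below; the coefficient values (which must sum to 1.0) and the emotion counts are illustrative assumptions. The same weighted-sum pattern applies to the posture-based confidence degree ZXD.

```python
# Illustrative proportional coefficients for the nine emotion features;
# the patent only requires that they are positive and sum to 1.0.
COEFFS = {"anger": 0.05, "aversion": 0.05, "fear": 0.05, "happiness": 0.30,
          "sadness": 0.05, "surprise": 0.10, "contempt": 0.05,
          "tiredness": 0.05, "concentration": 0.30}
assert abs(sum(COEFFS.values()) - 1.0) < 1e-9  # sanity check on the weights

def jjd(counts: dict, c2: float = 0.0) -> float:
    """Emotion positivity: weighted occurrence counts of the nine facial
    emotion features plus a constant correction term (c2 assumed 0 here)."""
    return sum(COEFFS[k] * counts.get(k, 0) for k in COEFFS) + c2

score = jjd({"happiness": 4, "concentration": 3, "anger": 1})
```

With these assumed coefficients the example gives 0.30×4 + 0.30×3 + 0.05×1 = 2.15.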
extracting action features in each segment video using pose-estimation algorithms, including a deep-learning-based gesture recognition model, extracting posture features and counting their occurrences, including:
standing posture P1: standing straight, feet parallel, shoulders relaxed;
sitting posture P2: upright sitting posture, back straight, legs crossed or flat;
gesture P3: hands in pockets, arms crossed at the chest, or hands resting naturally on the table;
arm crossing P4: arms crossed in front of the chest, accompanied by a smiling expression;
concentration posture P5: body leaning slightly forward, eyes focused, hands placed on the table;
the confidence degree ZXD is calculated from the posture feature counts by the following formula:
ZXD = k×P1 + m×P2 + n×P3 + o×P4 + q×P5 + C3
wherein k, m, n, o and q are the proportional coefficients of the feature counts of standing posture P1, sitting posture P2, gesture P3, arm crossing P4 and concentration posture P5 respectively; k, m, n, o and q are each greater than 0 with k+m+n+o+q=1.0, and C3 is a third constant correction coefficient;
extracting voice dialogue features in each segment video using language processing technology, including a deep-learning-based speech recognition model, extracting speech-rate, intonation and interaction-frequency features, and calculating a language expression capacity coefficient YYX from the interaction frequency HDPL of the target patient with the psychologist in the video, a standard interaction frequency BZ, a score value of intonation YDPF, and an average speech rate value YSPF;
and generating the interaction positive coefficient Jcx3 by curve-fitting the target patient's emotion positivity JJD, confidence degree ZXD and language expression capacity coefficient YYX, using methods including exponential, logarithmic and power functions.
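One of the named fitting options, a power function, can be sketched as below via a log-log linear least-squares fit. Everything here is an illustrative assumption: the single composite predictor (standing in for JJD, ZXD and YYX), the synthetic calibration data, and the choice of the power-function form.

```python
import numpy as np

# Synthetic calibration data following an exact power law y = 2 * x**1.5;
# x might be a composite of JJD, ZXD and YYX, y a reference positivity score.
x = np.array([1.0, 2.0, 4.0, 8.0])
y = 2.0 * x ** 1.5

# Fit y = A * x**B by linear regression in log-log space:
# log y = B * log x + log A
B, logA = np.polyfit(np.log(x), np.log(y), 1)
A = np.exp(logA)

def jcx3(composite: float) -> float:
    """Interaction positive coefficient from the fitted power function."""
    return A * composite ** B

pred = jcx3(3.0)
```

Because the synthetic data follow the power law exactly, the fit recovers A = 2 and B = 1.5; on real data one would compare the exponential, logarithmic and power fits and keep the best one.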
Preferably, in the virtual environment of the virtual reality VR technology, the actual communication count JLcs of the target patient with the virtual characters is collected, and the interaction scenario coefficient Jcx4 is generated by the following formula:
Jcx4 = (JLcs / BZVR) × 100%
wherein BZVR represents the preset standard communication count for the virtual environment of the virtual reality VR technology; the formula evaluates the target patient's performance when interacting with the virtual characters: the closer the interaction scenario coefficient Jcx4 is to 100%, the more closely the target patient's actual communication count matches the preset standard.
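Read literally, this ratio can be computed as below; the function name and sample counts are illustrative, with JLcs and BZVR as defined above.

```python
def jcx4(jlcs: int, bzvr: int) -> float:
    """Interaction scenario coefficient: the patient's actual VR
    communication count as a percentage of the preset standard count."""
    return jlcs / bzvr * 100.0

# A patient who completed 8 of the 10 expected exchanges scores 80%.
score = jcx4(8, 10)
```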
Preferably, the comprehensive evaluation coefficient Zh is generated by the following formula:
Zh = w1×Jcx1 + w2×Jcx2 + w3×Jcx3 + w4×Jcx4
wherein w1, w2, w3 and w4 are the proportional coefficients of the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 respectively, with 0.25 ≤ w1 ≤ 0.65, 0.15 ≤ w2 ≤ 0.55, 0.25 ≤ w3 ≤ 0.55 and 0.15 ≤ w4 ≤ 0.66.
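The weighted combination can be sketched as below; the particular weight values are an assumption chosen to satisfy the stated ranges, and the sample coefficient values are illustrative.

```python
def zh(jcx1: float, jcx2: float, jcx3: float, jcx4: float,
       w=(0.35, 0.20, 0.30, 0.15)) -> float:
    """Comprehensive evaluation coefficient: weighted sum of the four
    coefficients, with the weights checked against the stated ranges."""
    w1, w2, w3, w4 = w
    assert 0.25 <= w1 <= 0.65 and 0.15 <= w2 <= 0.55
    assert 0.25 <= w3 <= 0.55 and 0.15 <= w4 <= 0.66
    return w1 * jcx1 + w2 * jcx2 + w3 * jcx3 + w4 * jcx4

score = zh(0.6, 0.4, 0.7, 0.5)
```

With these assumed weights the example gives 0.35×0.6 + 0.20×0.4 + 0.30×0.7 + 0.15×0.5 = 0.575. Note the patent does not require the four weights to sum to 1.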
setting a typical symptom threshold ZZ in the assessment model; comparing the comprehensive evaluation coefficient Zh of the target patient with the typical symptom threshold ZZ; when the comprehensive evaluation coefficient Zh is greater than or equal to the typical symptom threshold, the symptom evaluation result is a suspected-abnormal result;
when the comprehensive evaluation coefficient Zh is smaller than the typical symptom threshold, the result is normal;
the comprehensive evaluation coefficient Zh in a suspected-abnormal symptom evaluation result is then compared against the depression threshold, the attention deficit threshold and the anxiety threshold respectively;
if the comprehensive evaluation coefficient Zh is within the depression threshold range, generating a first counseling scheme, comprising: administering antidepressant drug treatment together with concurrent psychological therapy, and establishing a social support system matched to the target patient, including family members, friends, classmates or community support groups;
if the comprehensive evaluation coefficient Zh is within the attention deficit threshold range, generating a second counseling scheme, comprising: providing cognitive behavioral therapy to help the target patient change negative thought patterns, planning schedules and rewards through concentration-behavior intervention, and setting goals for further counseling;
if the comprehensive evaluation coefficient Zh is within the anxiety threshold range, generating a third counseling scheme, comprising: teaching the patient deep breathing and progressive muscle relaxation techniques, together with drug treatment and behavioral exposure counseling.
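The two-stage comparison above (typical-symptom threshold first, then per-symptom ranges) can be sketched as below. All threshold values and range boundaries are illustrative assumptions; the patent does not publish concrete numbers.

```python
ZZ = 0.50  # illustrative typical-symptom threshold

# Illustrative, non-overlapping Zh ranges for the three symptom categories.
RANGES = {
    "first scheme (depression)": (0.50, 0.65),
    "second scheme (attention deficit)": (0.65, 0.80),
    "third scheme (anxiety)": (0.80, 1.01),
}

def select_scheme(zh_value: float) -> str:
    """Map a comprehensive evaluation coefficient to a counseling scheme."""
    if zh_value < ZZ:
        return "normal"
    for scheme, (lo, hi) in RANGES.items():
        if lo <= zh_value < hi:
            return scheme
    return "suspected abnormal (unclassified)"

result = select_scheme(0.7)
```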
A psychological disorder assessment system for psychiatric patients comprises a psychological disorder detection model building module, a multi-channel acquisition module, a psychological disorder feature extraction module, an evaluation model building module and a counseling scheme generation module;
the module for establishing the psychological barrier detection model is used for generating the psychological barrier detection model by using a psychological barrier characteristic sample acquired in advance through convolutional neural network CNN training;
the multichannel acquisition module is used for establishing a first quantitative data set comprising psychological disorder diagnosis results through diagnosis of psychiatric specialists, acquiring physiological data of a target patient by using an intelligent bracelet and establishing a second target database; recording the whole-course video of the target patient through the camera equipment, and establishing a third target database; creating a virtual environment by using a virtual reality technology, interacting with the virtual character, collecting virtual scene behavior data, and establishing a fourth target database;
the psychological disorder feature extraction module is used for extracting psychological barrier features from the first quantized data set, the second target database, the third target database and the fourth target database, and analyzing the features with the psychological disorder detection model to obtain the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4;
the evaluation model building module is used for associating the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 to obtain the comprehensive evaluation coefficient Zh, and comparing the comprehensive evaluation coefficient Zh with the typical symptom threshold ZZ to obtain a symptom evaluation result;
the counseling scheme generation module is used for generating the corresponding counseling scheme according to the symptom evaluation result, including different schemes for depression, attention deficit and anxiety disorder.
The invention provides a method and a system for assessing psychological disorders of a psychiatric patient. The beneficial effects are as follows:
(1) According to the psychological disorder assessment method for psychiatric patients, the psychological disorder detection model is established in S1-S4, and by combining physiological data, whole-course video and virtual reality technology the patient's psychological state can be assessed comprehensively and objectively, overcoming the shortcomings of traditional methods in information comprehensiveness and objectivity. Through real-time data acquisition and the creation of virtual environments, the method can capture changes in patients under different situations and continuously monitor the patient's psychological state, improving the timeliness and continuity of the evaluation. By introducing virtual reality technology to simulate social scenes, public places and home environments, the patient's behavioral and emotional responses in daily life are displayed more truthfully in the evaluation, improving how faithfully the evaluation reproduces real situations. By analyzing the comprehensive evaluation coefficient Zh against different symptom thresholds, a personalized counseling scheme is generated, providing customized treatment suggestions for different patients and raising the individualization level of treatment.
(2) Traditional assessment is generally periodic and discrete, making it difficult to capture the patient's instantaneous state and its changes in daily life. The method and system for assessing the psychological disorder of psychiatric patients make the evaluation timelier through real-time recording and the creation of virtual environments.
(3) Traditional methods lack sufficient consideration of individual differences between patients and generally adopt relatively uniform assessment standards. The method and system realize personalized evaluation through quantitative analysis of multiple data sources and provide a corresponding personalized counseling scheme for each patient.
(4) In the psychological disorder assessment system for psychiatric patients, the multi-channel acquisition module covers the first quantitative data set established through diagnosis by a psychiatric specialist, the second target database formed from physiological data acquired by the smart bracelet, the third target database formed from whole-course video recorded by camera equipment, and the fourth target database formed from virtual scene behavior data acquired through interaction with virtual characters in a virtual-reality environment. This multi-channel data acquisition ensures that the system can fully understand the patient's behavioral and physiological responses in different situations. The psychological disorder feature extraction module effectively extracts key information from multiple data sources and, through analysis by the psychological disorder detection model, obtains the basic scoring coefficient Jcx1, abnormal physiological coefficient Jcx2, interaction positive coefficient Jcx3 and interaction scenario coefficient Jcx4. This integrated feature extraction and analysis process provides a solid basis for subsequent evaluation. The evaluation model building module associates the different coefficients to obtain the comprehensive evaluation coefficient Zh, which is compared with the typical symptom threshold ZZ to form a symptom evaluation result, so that medical staff can quickly understand the patient's psychological state with a scientifically grounded preliminary judgment. By introducing modern technical means such as virtual reality, physiological data acquisition and deep learning, the shortcomings of traditional methods are overcome, and the comprehensiveness, objectivity, timeliness and individualization of the evaluation are improved.
Drawings
FIG. 1 is a schematic diagram showing the steps of a method for assessing a mental disorder in a mental patient according to the present invention;
FIG. 2 is a schematic block diagram of the psychological disorder assessment system for psychiatric patients according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Embodiment 1: The present invention provides a method for assessing psychological disorders of psychiatric patients; referring to fig. 1, the method comprises the following steps:
s1, establishing a psychological disorder detection model, wherein the psychological disorder detection model is generated by training a plurality of psychological disorder characteristic samples acquired in advance through a convolutional neural network;
s2, diagnosing a plurality of target patients by a psychiatric specialist using a professional psychological disorder diagnosis table, obtaining diagnosis result data and establishing a first quantitative data set; pairing an intelligent bracelet with each target patient, collecting the target patient's physiological data and establishing a second target database; during interactive diagnosis with the psychologist, recording whole-course video of the target patient with camera equipment and establishing a third target database; creating various virtual environments, including social scenes, public places and home environments, using virtual reality technology; after the target patient wears a VR headset, simulating interaction scenes with virtual characters in those environments, collecting virtual-scene behavior data and establishing a fourth target database;
S3, analyzing the first quantitative data set, the second target database, the third target database and the fourth target database, extracting psychological disorder features, and inputting the psychological disorder features into the psychological disorder detection model to obtain a basic scoring coefficient Jcx1, an abnormal physiological coefficient Jcx2, an interaction positive coefficient Jcx3 and an interaction scenario coefficient Jcx4;
s4, establishing an evaluation model, associating the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 to obtain a comprehensive evaluation coefficient Zh, inputting the comprehensive evaluation coefficient Zh into the evaluation model, and comparing it with a typical symptom threshold ZZ to obtain a symptom evaluation result; and generating a corresponding counseling scheme according to the symptom evaluation result.
In this embodiment, by establishing the psychological disorder detection model in S1-S4 and combining physiological data, whole-course video and virtual reality technology, the method can comprehensively and objectively evaluate the patient's psychological state, overcoming the shortcomings of traditional methods in information comprehensiveness and objectivity. Real-time data acquisition and the created virtual environments allow the method to capture changes in the patient under different situations and to continuously monitor the patient's psychological state, improving the timeliness and continuity of the evaluation. By introducing virtual reality technology to simulate social scenes, public places and home environments, the patient's behavior and emotional responses in daily life are displayed more truly in the evaluation, improving how faithfully the evaluation restores real situations. By analyzing the comprehensive evaluation coefficient Zh against different symptom thresholds, a personalized counseling scheme is generated, providing customized treatment suggestions for different patients and improving the personalization of treatment.
Embodiment 2, this embodiment is an explanation performed in embodiment 1, specifically, the S1 includes: s11, collecting sample characteristics related to psychological disorders, including data from psychological assessment, physiological detection, speech analysis and behavioral analysis; processing the acquired sample data to obtain sample characteristic data after processing missing values, abnormal values and standardization;
s12, extracting features from, training on and validating the sample feature data using a convolutional neural network CNN to obtain the psychological disorder detection model, exporting the trained psychological disorder detection model, and deploying it into an actual application scene.
In this embodiment, the trained psychological disorder detection model is deployed to the actual application scene, so that the model can be applied in the actual clinical environment, and a powerful auxiliary tool is provided for doctors.
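As a concrete illustration of the preprocessing in S11 (handling missing values, outliers and standardization before CNN training), a minimal sketch follows; the mean imputation rule, the 3-sigma clipping bound and the example matrix are illustrative assumptions, not specifics taken from the patent:

```python
import numpy as np

def preprocess_samples(x: np.ndarray, z_clip: float = 3.0) -> np.ndarray:
    """Clean a (samples, features) matrix: impute missing values with the
    column mean, clip outliers beyond z_clip standard deviations, and
    standardize each feature to zero mean / unit variance."""
    x = x.astype(float).copy()
    # 1) impute NaNs with per-column means
    col_mean = np.nanmean(x, axis=0)
    nan_rows, nan_cols = np.where(np.isnan(x))
    x[nan_rows, nan_cols] = col_mean[nan_cols]
    # 2) clip outliers to +/- z_clip standard deviations
    mu, sigma = x.mean(axis=0), x.std(axis=0)
    sigma[sigma == 0] = 1.0
    x = np.clip(x, mu - z_clip * sigma, mu + z_clip * sigma)
    # 3) standardize to zero mean / unit variance
    mu2, sigma2 = x.mean(axis=0), x.std(axis=0)
    sigma2[sigma2 == 0] = 1.0
    return (x - mu2) / sigma2

raw = np.array([[1.0, 200.0], [2.0, np.nan], [3.0, 4.0], [100.0, 5.0]])
clean = preprocess_samples(raw)
```

The cleaned matrix can then be fed to the CNN feature extractor of S12.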
Example 3, which is an illustration of example 1: specifically, a psychiatric specialist selects validated and standardized psychological disorder diagnostic tools, including DSM-5 standard tools and psychological disorder diagnosis tables, for collection to obtain the first quantitative data set; the psychological disorder diagnosis tables comprise a symptom self-rating scale, a depression symptom questionnaire and an anxiety symptom questionnaire;
selecting an intelligent bracelet capable of collecting blood pressure, heart rate, sleep information and galvanic skin response; pairing the intelligent bracelet with the target patient and collecting physiological data in real time; the physiological data include blood pressure, heart rate, sleep information and galvanic skin response information;
deploying an image pickup device to record a high-definition video, capturing a facial expression, a posture and a language of a target patient; and recording videos in the whole process of interaction between the patient and the psychological doctor, storing the videos in a time axis mode, and establishing a third target database.
In this example, validated and standardized psychological disorder diagnostic tools, including DSM-5 standard tools and psychological disorder diagnostic tables, are used to help improve the objectivity and credibility of the diagnosis. Such tool selection makes the acquired first quantitative data set more reliable, providing a high quality basis for subsequent evaluation. The intelligent bracelet is selected as a physiological data acquisition tool, so that multiple physiological information including blood pressure, heart rate, sleep information and galvanic skin response can be acquired simultaneously. This helps to fully understand the patient's physiological condition, providing more comprehensive data support. The physiological data is acquired in real time through the intelligent bracelet, and the interaction process of the patient and the psychological doctor is recorded through the whole process of the camera equipment, so that more real and real-time information can be captured, and the physiological and psychological states of the patient in the interaction process can be reflected more accurately. High-definition camera equipment is deployed, so that high-quality video data can be recorded, facial expressions, postures and languages of patients can be observed and analyzed more clearly, and more detailed emotion and behavior information can be provided. And storing the collected video in a time axis mode, and constructing a third target database, thereby providing convenience for subsequent analysis and mining. The database structure is beneficial to integrating multidimensional information of patients in the interaction process, and improves the comprehensiveness and depth of evaluation.
Embodiment 4, which is an explanation of embodiment 1: specifically, video clips are recorded under different situations, including supermarket shopping, social activities and home environments; medical staff play roles in the different scenes and interact with the patient in the virtual environment through virtual reality VR technology, and the virtual-scene behavior data are recorded, collected and used to establish the fourth target database.
In the embodiment, diversified virtual environments such as supermarket shopping, social activities and home environments are created, so that a patient can show own behavior and emotional response in more real and diverse situations. Such diversity helps to more fully understand the patient's performance in different scenarios. Please medical personnel play roles in different situations, increasing the realism of the environment. Medical staff can simulate the situation in real life and interact with patients naturally and truly, so that collected data is more representative. Through virtual reality VR technique, put the patient in virtual environment, interact with virtual personage. The method not only improves the sense of reality of the situation, but also provides a relatively safe and controllable environment for the patient, and is helpful for the patient to more truly exhibit own behaviors and emotions. The interaction of the patient with the virtual character is recorded in the virtual environment, and the interaction comprises speech, actions, facial expressions and other behavior data. This helps to obtain more detailed and comprehensive information, providing more data support for subsequent extraction of psychological barrier features. And integrating and storing the data acquired in the virtual environment, and establishing a fourth target database. Such a database would contain behavioral and emotional data of the patient in different virtual contexts, facilitating a comprehensive assessment of the target patient.
Embodiment 5, which is an explanation of embodiment 1: specifically, the first quantitative data set is analyzed by the psychological disorder detection model to obtain the basic scoring coefficient Jcx1; the second target database is analyzed to obtain the abnormal physiological coefficient Jcx2; the third target database is processed, analyzed and quantitatively calculated to obtain the interaction positive coefficient Jcx3; and the fourth target database is processed, analyzed and quantitatively calculated to obtain the interaction scenario coefficient Jcx4.
Specifically, a plurality of psychological disorder features are extracted from the first quantitative data set and denoted X1, X2, X3, ..., Xn; the basic scoring coefficient Jcx1 is then calculated as Jcx1 = F(X1, X2, X3, ..., Xn), where X1, X2, X3, ..., Xn each represent a psychological disorder feature from the first quantitative data set and F(·) represents an analysis function trained on the psychological disorder detection model;
extracting from the second target database the blood pressure abnormality count Y1, i.e. the number of times the recorded blood pressure is above or below the preset blood pressure threshold range (hypertension and hypotension events); the heart rate abnormality count Y2, i.e. the number of times the recorded heart rate is above or below the preset heart rate threshold range; the daily sleep-onset abnormality count Y3, i.e. the number of times the sleep-onset time falls outside the preset sleep-time range; and the galvanic-skin-response sweat-gland activity count Y4, which reflects increased sweat gland activity and is counted from the skin-conductance measurements. After dimensionless processing, the abnormal physiological coefficient Jcx2 is calculated as a weighted combination of Y1, Y2, Y3 and Y4, where each count carries its own weight and a first constant correction coefficient is added.
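Since the published Jcx2 formula appears only as an image, the sketch below assumes a plain weighted sum of the four dimensionless counts plus the first constant correction coefficient; the specific weights, the scaling cap and T1 are hypothetical:

```python
def abnormal_physiological_coefficient(y1, y2, y3, y4,
                                       weights=(0.30, 0.30, 0.20, 0.20),
                                       t1=0.0):
    """Jcx2 as an assumed weighted combination of the four abnormality counts.

    y1: blood-pressure abnormality count (outside preset BP range)
    y2: heart-rate abnormality count (outside preset HR range)
    y3: daily sleep-onset-time abnormality count
    y4: increased sweat-gland activity count (galvanic skin response)
    t1: first constant correction coefficient (assumed 0 here)
    Counts are first made dimensionless against an assumed daily cap.
    """
    cap = 10.0  # assumed maximum daily count used for dimensionless scaling
    ys = [min(y, cap) / cap for y in (y1, y2, y3, y4)]
    return sum(w * y for w, y in zip(weights, ys)) + t1

jcx2 = abnormal_physiological_coefficient(y1=4, y2=2, y3=1, y4=6)
```

With the illustrative weights, larger abnormality counts drive Jcx2 toward 1, matching the intent of the coefficient.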
Specifically, the whole-course video of the target patient in the third target database is divided into a plurality of segments along the time axis, and each segment is dynamically partitioned; changes in speech, action and facial expression in each segment trigger the partitioning. Using facial expression analysis techniques, including a deep-learning-based expression recognition model, facial emotion features are extracted and their occurrence counts tallied, including:
anger b1: facial muscles contract, eyebrows are gathered together, and corners of the mouth are downward;
aversion b2: nose wrinkling with mouth corners facing downwards;
fear b3: the eyebrows are raised, and the eyes are opened greatly;
happy b4: smiling the mouth and squinting eyes into a line;
Sadness b5: eyebrow sagging, mouth corner sagging;
surprise b6: the eyebrows are raised, and the eyes are opened greatly;
contempt b7: one mouth corner is raised;
tired b8: pronounced eye bags and dull eyes, accompanied by yawning or dozing;
concentration b9: the eyebrows are slightly drawn together and the mouth is tightly closed;
the emotion positivity degree JJD is calculated by quantifying the counts of anger b1, aversion b2, fear b3, happiness b4, sadness b5, surprise b6, contempt b7, tiredness b8 and concentration b9, via the following weighted form:
a, c, d, e, f, g, h, i and j represent the proportion coefficients of the expression counts of anger b1, aversion b2, fear b3, happiness b4, sadness b5, surprise b6, contempt b7, tiredness b8 and concentration b9 respectively; a, c, d, e, f, g, h, i and j are each greater than 0 and a+c+d+e+f+g+h+i+j = 1.0, and a second constant correction coefficient is added;
By quantifying the counts of the different emotion expressions and computing the emotion positivity degree JJD, the system can comprehensively consider the patient's emotional changes over different time periods. Introducing the proportion coefficients and the constant correction coefficient better balances the influence of different emotions on the overall positivity, improving the precision of the emotion evaluation. Through analysis of multiple emotions (anger, aversion, fear, happiness, sadness, surprise, contempt, tiredness and concentration), the emotion positivity degree JJD captures the patient's emotional states under different situations more completely, improving overall cognition of the patient's psychological state and providing more information for the comprehensive assessment.
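The JJD formula is likewise an image in the published text; the sketch below assumes a weighted sum of the nine normalized expression counts with the stated constraint that the coefficients are positive and sum to 1.0, plus a second constant correction T2. All numeric values are illustrative:

```python
def emotion_positivity(counts, coeffs, t2=0.0):
    """JJD as an assumed weighted sum of the nine expression counts b1..b9.

    counts: dict of expression counts, keys "b1".."b9"
            (anger, aversion, fear, happy, sadness, surprise,
             contempt, tired, concentration)
    coeffs: proportion coefficients a, c, d, e, f, g, h, i, j
            (each > 0, summing to 1.0, as the patent requires)
    t2:     second constant correction coefficient (assumed 0 here)
    """
    keys = ["b1", "b2", "b3", "b4", "b5", "b6", "b7", "b8", "b9"]
    assert abs(sum(coeffs) - 1.0) < 1e-9 and all(c > 0 for c in coeffs)
    total = max(sum(counts[k] for k in keys), 1)  # normalize by total count
    return sum(c * counts[k] / total for c, k in zip(coeffs, keys)) + t2

counts = {"b1": 1, "b2": 0, "b3": 0, "b4": 6, "b5": 1,
          "b6": 1, "b7": 0, "b8": 0, "b9": 1}
coeffs = (0.05, 0.05, 0.05, 0.40, 0.05, 0.10, 0.05, 0.05, 0.20)
jjd = emotion_positivity(counts, coeffs)
```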
Extracting action features in each segment video using pose estimation algorithms, including a deep-learning-based posture recognition model, posture features are extracted and their occurrence counts tallied, including:
standing posture P1: standing straight, parallel feet and relaxed shoulders;
sitting position P2: upright sitting posture, straight back, crossed legs or flat;
gesture P3: including hands in pockets, crossing in the chest or hands naturally riding on a table;
arm crossover P4: the arms cross in front of the chest, accompanied by smiling expressions;
focus on gesture P5: the body is slightly tilted forward, the eyes are concentrated, and the hands are placed on the table;
the confidence degree ZXD is calculated from the posture feature counts via the following weighted form:
k, m, n, o and q represent the proportion coefficients of the feature counts of the standing posture P1, sitting posture P2, gesture P3, arm crossing P4 and concentration posture P5 respectively; k, m, n, o and q are each greater than 0 and k+m+n+o+q = 1.0, and a third constant correction coefficient is added;
by extracting the motion features in each video clip, the system is able to learn more about the physical actions and gestures of the patient in different contexts. This helps enrich the analysis of patient behavior and improves the grasp of the overall state of the patient. A gesture recognition model based on deep learning is introduced to analyze multiple aspects of standing, sitting, gesture, arm crossing, concentration gesture and the like. Such a pose estimation algorithm may more accurately capture the pose characteristics of the patient, providing reliable data support for subsequent confidence calculations. The number of different gesture features including standing, sitting, gestures, arm crossings, and concentration gestures are counted. By introducing the proportional coefficient and the constant correction coefficient, the influence of different postures on the whole confidence level can be better considered, and the accuracy of the confidence level calculation is improved. The confidence ZXD is generated by calculating the number of gesture features. This confidence reflects the patient's state of confidence exhibited over different time periods. The introduction of the proportional coefficient and the constant correction coefficient is beneficial to balancing the contribution of different posture characteristics to the confidence level, and the reliability and the comprehensiveness of the confidence level are improved.
Extracting speech dialogue features in each segment video using language processing techniques, including a deep-learning-based speech recognition model, speech rate, intonation and interaction-frequency features are extracted, and the language expression capacity coefficient YYX is calculated from them as follows:
HDPL denotes the interaction frequency between the target patient and the psychologist in the video, BZ denotes the standard interaction frequency, YDPF denotes the intonation score value, and YSPF denotes the average speech rate value;
and the interaction positive coefficient Jcx3 is generated by curve-fitting the target patient's emotion positivity degree JJD, confidence degree ZXD and language expression capacity coefficient YYX, using fitting methods that include exponential, logarithmic and power functions.
The language expression capacity coefficient YYX combines the interaction frequency, the intonation score value and the average speech rate value to reflect the patient's overall level of language expression. This enables the system to evaluate the patient's performance in verbal communication more completely.
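Both formula bodies here are images in the published patent; the multiplicative form for YYX and the logarithmic/square-root blend standing in for the exponential-logarithmic-power curve fit of Jcx3 below are assumed, as are all numeric inputs:

```python
import math

def language_expression_coefficient(hdpl, bz, ydpf, yspf):
    """YYX from interaction frequency HDPL vs. standard frequency BZ,
    intonation score YDPF and average speech-rate value YSPF.
    The multiplicative combination is an assumed form."""
    return (hdpl / bz) * ydpf * yspf

def interaction_positivity(jjd, zxd, yyx, alpha=0.4, beta=0.3, gamma=0.3):
    """Jcx3 via a fitted combination of JJD, ZXD and YYX; this
    logarithmic/power blend is a placeholder for the patent's
    curve-fitting step."""
    return (alpha * math.log1p(jjd)
            + beta * math.sqrt(max(zxd, 0.0))
            + gamma * yyx ** 0.5)

yyx = language_expression_coefficient(hdpl=8, bz=10, ydpf=0.9, yspf=0.8)
jcx3 = interaction_positivity(jjd=0.28, zxd=0.205, yyx=yyx)
```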
Specifically, in the virtual environment of the virtual reality VR technology, the actual communication count JLcs between the target patient and the virtual characters is collected, and the interaction scenario coefficient Jcx4 is generated as the ratio of JLcs to BZVR expressed as a percentage, where BZVR denotes the preset standard communication count for the virtual environment; the formula evaluates the target patient's performance when interacting with the virtual characters; if the interaction scenario coefficient Jcx4 approaches 100%, the target patient's actual communication count matches the preset standard.
In the virtual environment, the system can accurately collect the actual communication count JLcs between the target patient and the virtual characters. This helps quantify the patient's social interaction performance in the virtual environment and provides a reliable data basis for subsequent analysis and evaluation. The interaction scenario coefficient Jcx4, generated by comparing the actual communication count with the preset standard count, reflects the target patient's interaction performance in the virtual environment, enabling the system to assess the patient's social ability and level of interaction under different circumstances more objectively. The closer Jcx4 is to 100%, the better the target patient's actual communication count matches the preset standard. This intuitive presentation makes the evaluation result easier to understand and helps medical staff quickly grasp the patient's social interaction in the virtual environment.
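The ratio reading of Jcx4 can be sketched directly; the 100% cap for patients who exceed the preset count is an assumption, since the published formula is an image:

```python
def interaction_scenario_coefficient(jlcs: int, bzvr: int) -> float:
    """Jcx4 as the ratio of the actual VR communication count JLcs to the
    preset standard count BZVR, expressed as a percentage and capped at
    100% (the cap is an assumption)."""
    if bzvr <= 0:
        raise ValueError("BZVR must be a positive preset standard count")
    return min(jlcs / bzvr, 1.0) * 100.0

jcx4 = interaction_scenario_coefficient(jlcs=18, bzvr=20)
```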
Example 6, which is an explanation made in example 1: specifically, the comprehensive evaluation coefficient Zh is generated by weighting the four coefficients as Zh = w1·Jcx1 + w2·Jcx2 + w3·Jcx3 + w4·Jcx4,
wherein w1, w2, w3 and w4 are the weights of the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 respectively, with 0.25 ≤ w1 ≤ 0.65, 0.15 ≤ w2 ≤ 0.55, 0.25 ≤ w3 ≤ 0.55, 0.15 ≤ w4 ≤ 0.66, and w1 + w2 + w3 + w4 = 1.0;
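A weighted-sum sketch of Zh follows; the particular weight values and the assumption that all four coefficients have been normalized to [0, 1] (Jcx4 taken as a fraction rather than a percentage) are illustrative:

```python
def comprehensive_coefficient(jcx1, jcx2, jcx3, jcx4,
                              w=(0.35, 0.20, 0.25, 0.20)):
    """Zh = w1*Jcx1 + w2*Jcx2 + w3*Jcx3 + w4*Jcx4, enforcing the stated
    ranges 0.25<=w1<=0.65, 0.15<=w2<=0.55, 0.25<=w3<=0.55,
    0.15<=w4<=0.66 and w1+w2+w3+w4 = 1.0."""
    w1, w2, w3, w4 = w
    assert 0.25 <= w1 <= 0.65 and 0.15 <= w2 <= 0.55
    assert 0.25 <= w3 <= 0.55 and 0.15 <= w4 <= 0.66
    assert abs(w1 + w2 + w3 + w4 - 1.0) < 1e-9
    return w1 * jcx1 + w2 * jcx2 + w3 * jcx3 + w4 * jcx4

# illustrative coefficient values, all assumed normalized to [0, 1]
zh = comprehensive_coefficient(0.6, 0.32, 0.46, 0.9)
```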
setting a typical symptom threshold ZZ in the assessment model; the comprehensive evaluation coefficient Zh of the target patient is compared with the typical symptom threshold ZZ, and when Zh is greater than or equal to the typical symptom threshold, the symptom evaluation result is a suspected abnormality;
when the comprehensive evaluation coefficient Zh is smaller than the typical symptom threshold, the result indicates normal;
for a suspected abnormality, the comprehensive evaluation coefficient Zh is further compared with the depression threshold, the attention deficit threshold and the anxiety threshold respectively;
if the comprehensive evaluation coefficient Zh falls within the depression threshold range, a first counseling scheme is generated, comprising: administering depression medication together with synchronized psychological treatment, and establishing a social support system matched to the target patient, including family members, friends, classmates or community support groups;
if the comprehensive evaluation coefficient Zh falls within the attention deficit threshold range, a second counseling scheme is generated, comprising: providing cognitive behavioral therapy to help the target patient alter negative thinking patterns, and planning schedules and rewards through concentration-focused behavioral intervention, setting goals for further counseling;
if the comprehensive evaluation coefficient Zh falls within the anxiety threshold range, a third counseling scheme is generated, comprising: teaching the patient deep-breathing and progressive muscle relaxation techniques, combined with medication and a behavioral-exposure counseling scheme.
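The threshold comparison and scheme selection can be sketched as a simple banded classifier; the value of ZZ and the band edges for the three disorder ranges are illustrative assumptions, not values given in the patent:

```python
def symptom_evaluation(zh, zz=0.5, bands=None):
    """Compare Zh with the typical-symptom threshold ZZ and, for a
    suspected-abnormal result, map it onto one of three counseling
    schemes. Threshold and band values are illustrative."""
    if bands is None:
        # assumed [low, high) Zh bands for the three disorder ranges
        bands = {"depression": (0.50, 0.65),
                 "attention_deficit": (0.65, 0.80),
                 "anxiety": (0.80, 1.01)}
    if zh < zz:
        return "normal", None
    for label, (lo, hi) in bands.items():
        if lo <= zh < hi:
            return "suspected_abnormal", label
    return "suspected_abnormal", "unclassified"

result = symptom_evaluation(0.72)
```

Each label would then index the corresponding first, second or third counseling scheme described above.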
In this embodiment, the system generates the comprehensive evaluation coefficient Zh by weighting the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4; the coefficient integrates information across multiple dimensions, including psychological characteristics, physiological state, interaction ability and scenario response. This helps evaluate the psychiatric patient's psychological state comprehensively and objectively. Based on comparison of Zh with the typical symptom threshold ZZ, the system classifies the result as normal or suspected abnormal, which simplifies the medical staff's preliminary understanding of the patient's condition and provides a clear judgment standard. For each suspected-abnormal result, the system generates a corresponding counseling scheme, including specific treatment suggestions for depression, attention deficit and anxiety disorders, such as medication, synchronized psychological treatment and cognitive behavioral therapy. Generating personalized counseling schemes helps provide more targeted treatment and support.
Referring to fig. 2, the system for assessing psychological disorders of psychiatric patients comprises a psychological disorder detection model building module, a multi-channel acquisition module, a psychological disorder feature extraction module, an evaluation model building module and a counseling scheme generation module;
the module for establishing the psychological barrier detection model is used for generating the psychological barrier detection model by using a psychological barrier characteristic sample acquired in advance through convolutional neural network CNN training;
the multichannel acquisition module is used for establishing a first quantitative data set comprising psychological disorder diagnosis results through diagnosis of psychiatric specialists, acquiring physiological data of a target patient by using an intelligent bracelet and establishing a second target database; recording the whole-course video of the target patient through the camera equipment, and establishing a third target database; creating a virtual environment by using a virtual reality technology, interacting with the virtual character, collecting virtual scene behavior data, and establishing a fourth target database;
the psychological disorder feature extraction module is used for extracting psychological disorder features from the first quantitative data set, the second target database, the third target database and the fourth target database, and analyzing the features with the psychological disorder detection model to obtain the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4;
The evaluation model building module is used for associating the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 to obtain the comprehensive evaluation coefficient Zh, and comparing the comprehensive evaluation coefficient Zh with the typical symptom threshold ZZ to obtain the symptom evaluation result;
the counseling scheme generation module is used for generating corresponding counseling schemes according to the symptom evaluation result, including different counseling schemes for depression, attention deficit and anxiety disorder.
The system for assessing psychological disorders of psychiatric patients realizes comprehensive, multi-channel patient evaluation and counseling scheme generation through the synergy of a series of modules. First, the core of the system is the psychological disorder detection model building module, which trains pre-collected psychological disorder feature samples with a convolutional neural network CNN to generate an efficient psychological disorder detection model. The model can deeply analyze the patient's psychological state and improve the accuracy of the evaluation.
The multi-channel acquisition module acquires abundant patient information in various modes, wherein the multi-channel acquisition module comprises a first quantitative data set established by diagnosis of a psychiatric specialist, a second target database formed by physiological data acquired by an intelligent bracelet, a third target database formed by whole-course video recorded by camera equipment, and a fourth target database formed by virtual scene behavior data acquired by interaction with a virtual character in a virtual environment through a virtual reality technology. This multi-channel data acquisition ensures that the system is able to fully understand the patient's behavioral and physiological response in different situations.
The psychological disorder feature extraction module effectively extracts key information from the multiple data sources and, through analysis by the psychological disorder detection model, obtains the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4. This integrated feature extraction and analysis process provides a solid basis for subsequent evaluation.
The evaluation model building module obtains the comprehensive evaluation coefficient Zh by associating the different coefficients. Comparing Zh with the typical symptom threshold ZZ forms the symptom evaluation result, so that medical staff can quickly understand the patient's psychological state, with a scientific preliminary judgment.
Finally, the counseling scheme generation module generates a personalized counseling scheme for medical staff according to the symptom evaluation result, covering treatment suggestions for the different psychological disorders, including medication, synchronized psychological treatment, cognitive behavioral therapy and the like. This provides specific guidance to the medical team and helps better support patient recovery.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (10)

1. A method for assessing psychological disorders of a psychiatric patient is characterized in that: the method comprises the following steps:
s1, establishing a psychological disorder detection model, wherein the psychological disorder detection model is generated by training a plurality of psychological disorder characteristic samples acquired in advance through a convolutional neural network;
s2, diagnosing a plurality of target patients by a psychiatric specialist using a professional psychological disorder diagnosis table, obtaining diagnosis result data and establishing a first quantitative data set; pairing an intelligent bracelet with each target patient, collecting the target patient's physiological data and establishing a second target database; during interactive diagnosis with the psychologist, recording whole-course video of the target patient with camera equipment and establishing a third target database; creating various virtual environments, including social scenes, public places and home environments, using virtual reality technology; after the target patient wears a VR headset, simulating interaction scenes with virtual characters in those environments, collecting virtual-scene behavior data and establishing a fourth target database;
s3, analyzing the first quantitative data set, the second target database, the third target database and the fourth target database, extracting psychological disorder features, and inputting the psychological disorder features into the psychological disorder detection model to obtain a basic scoring coefficient Jcx1, an abnormal physiological coefficient Jcx2, an interaction positive coefficient Jcx3 and an interaction scenario coefficient Jcx4;
S4, establishing an evaluation model, associating the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 to obtain a comprehensive evaluation coefficient Zh, inputting the comprehensive evaluation coefficient Zh into the evaluation model, and comparing it with a typical symptom threshold ZZ to obtain a symptom evaluation result; and generating a corresponding counseling scheme according to the symptom evaluation result.
2. The method for assessing a mental disorder of a patient in a psychiatric department according to claim 1, wherein: the S1 comprises the following steps: s11, collecting sample characteristics related to psychological disorders, including data from psychological assessment, physiological detection, speech analysis and behavioral analysis; processing the acquired sample data to obtain sample characteristic data after processing missing values, abnormal values and standardization;
S12, using a convolutional neural network (CNN) to extract features from, train on and validate the sample feature data to obtain a psychological disorder detection model, exporting the trained psychological disorder detection model, and deploying it into the actual application scenario.
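The convolution-based feature extraction that S12 delegates to a CNN can be illustrated with a minimal numerical sketch. The kernels, signal values and layer sizes below are invented for illustration only; the claim does not disclose the actual architecture, training procedure or validation split.

```python
import numpy as np

def conv1d(signal, kernel):
    """Valid-mode 1-D cross-correlation, the core operation of a CNN layer."""
    n = len(signal) - len(kernel) + 1
    return np.array([np.dot(signal[i:i + len(kernel)], kernel) for i in range(n)])

def relu(x):
    """Standard rectified-linear activation."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling; any remainder is truncated."""
    n = len(x) // size
    return x[:n * size].reshape(n, size).max(axis=1)

def extract_features(signal, kernels):
    """One conv -> ReLU -> pool stage per kernel; pooled maps are concatenated
    into a feature vector such as the one fed to the detection model."""
    return np.concatenate([max_pool(relu(conv1d(signal, k))) for k in kernels])

# toy physiological-style signal and two hypothetical learned kernels
sig = np.array([0.1, 0.5, 0.9, 0.4, 0.2, 0.8, 0.7, 0.3])
kernels = [np.array([1.0, -1.0]), np.array([0.5, 0.5])]
feats = extract_features(sig, kernels)  # length 3 per kernel -> 6 features
```

In a trained CNN the kernels are learned from the labeled psychological disorder samples rather than fixed by hand as here.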
3. The method for assessing the psychological disorder of a psychiatric patient according to claim 1, wherein: a psychiatric expert selects validated and standardized psychological disorder diagnostic tools, including DSM-5-based instruments and psychological disorder diagnostic scales, for acquisition to obtain the first quantitative data set; the psychological disorder diagnostic scales comprise a symptom self-rating scale, a depression symptom questionnaire and an anxiety symptom questionnaire;
selecting a smart bracelet capable of collecting blood pressure, heart rate, sleep information and galvanic skin response; pairing the smart bracelet with the target patient and collecting physiological data in real time, the physiological data including blood pressure, heart rate, sleep information and galvanic skin response information;
deploying camera equipment to record high-definition video capturing the target patient's facial expressions, posture and speech; recording video of the entire interaction between the patient and the psychologist, storing it along a time axis, and establishing the third target database.
4. The method for assessing the psychological disorder of a psychiatric patient according to claim 1, wherein: video clips are recorded for different scenarios, including supermarket shopping, social activities and home environments; medical staff play roles in the different scenes and interact with the patient in the virtual environment through virtual reality (VR) technology; the virtual scene behavior data are recorded and collected, and the fourth target database is established.
5. The method for assessing the psychological disorder of a psychiatric patient according to claim 1, wherein: the first quantized data set is analyzed through the psychological disorder detection model to obtain the basic scoring coefficient Jcx1; the second target database is analyzed to obtain the abnormal physiological coefficient Jcx2; the third target database is processed, analyzed and quantitatively calculated to obtain the interaction positive coefficient Jcx3; and the fourth target database is processed, analyzed and quantitatively calculated to obtain the interaction scenario coefficient Jcx4.
6. The method for assessing the psychological disorder of a psychiatric patient according to claim 5, wherein: a plurality of psychological disorder features are extracted from the first quantized data set and marked X1, X2, X3, ..., Xn; the basic scoring coefficient Jcx1 is calculated by the following formula:

Jcx1 = F(X1, X2, X3, ..., Xn)

wherein X1, X2, X3, ..., Xn denote the psychological disorder features from the first quantized data set, and F denotes the analysis function trained on the psychological disorder detection model;
extracting from the second target database the number of blood pressure anomalies Y1, including hypertension and hypotension events, the number of heart rate anomalies Y2, the number of daily sleep-onset anomalies Y3, and the number of sweat gland activity increases Y4 from the galvanic skin response; after dimensionless treatment, the abnormal physiological coefficient Jcx2 is calculated by the following formula:

Jcx2 = e1·Y1 + e2·Y2 + e3·Y3 + e4·Y4 + C1

wherein the number of blood pressure anomalies Y1 is the number of recorded blood pressure readings above or below the preset blood pressure threshold range; the number of heart rate anomalies Y2 is the number of recorded heart rate readings above or below the preset heart rate threshold range; the number of daily sleep-onset anomalies Y3 is the number of times the recorded sleep-onset time falls outside the preset sleep time range; the number of sweat gland activity increases Y4 reflects increased sweat gland activity derived from the galvanic skin response statistics; e1, e2, e3 and e4 are weights, and C1 is a first constant correction coefficient.
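A minimal sketch of how the four anomaly counts could be turned into an abnormal physiological coefficient. The threshold range, the weights, the correction constant and the min-max style dimensionless treatment are all illustrative assumptions; the claim does not disclose their values.

```python
def count_out_of_range(values, low, high):
    """Count readings outside the preset [low, high] threshold range."""
    return sum(1 for v in values if v < low or v > high)

def normalize(count, max_count):
    """Simple dimensionless treatment: scale a count into [0, 1]."""
    return min(count / max_count, 1.0) if max_count > 0 else 0.0

def jcx2(y1, y2, y3, y4, weights=(0.3, 0.3, 0.2, 0.2), c1=0.05, max_count=20):
    """Abnormal physiological coefficient: weighted sum of normalized anomaly
    counts plus a constant correction term (all numeric values illustrative)."""
    ys = [normalize(y, max_count) for y in (y1, y2, y3, y4)]
    e1, e2, e3, e4 = weights
    return e1 * ys[0] + e2 * ys[1] + e3 * ys[2] + e4 * ys[3] + c1

# hypothetical systolic readings in mmHg against an assumed 100-140 range
bp = [118, 145, 92, 160, 121]
y1 = count_out_of_range(bp, 100, 140)  # hypertension/hypotension events
```

With these assumptions, three of the five readings fall outside the range, and the coefficient grows with each additional anomaly count.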
7. The method for assessing the psychological disorder of a psychiatric patient according to claim 5, wherein: the whole-course video of the target patient in the third target database is divided into a plurality of segments along the time axis, and each segment is dynamically partitioned; changes in speech, movement and facial expression within each segment video trigger the partitioning; facial expression analysis techniques, including a deep-learning-based expression recognition model, are used to extract facial emotion features and count their occurrences, including:
anger b1: facial muscles contracted, eyebrows drawn together, mouth corners turned down;
aversion b2: nose wrinkled, mouth corners turned down;
fear b3: eyebrows raised, eyes wide open;
happiness b4: smiling mouth, eyes narrowed to a line;
sadness b5: drooping eyebrows, drooping mouth corners;
surprise b6: eyebrows raised, eyes wide open;
contempt b7: one corner of the mouth raised;
tiredness b8: pronounced bags under the eyes, dull gaze, accompanied by yawning;
concentration b9: eyebrows slightly drawn together, mouth tightly closed;
the emotional positivity JJD is calculated by quantifying the occurrence counts of anger b1, aversion b2, fear b3, happiness b4, sadness b5, surprise b6, contempt b7, tiredness b8 and concentration b9, using the following formula:

JJD = a·b1 + c·b2 + d·b3 + e·b4 + f·b5 + g·b6 + h·b7 + i·b8 + j·b9 + C2

wherein a, c, d, e, f, g, h, i and j represent the proportionality coefficients of the expression counts of anger b1, aversion b2, fear b3, happiness b4, sadness b5, surprise b6, contempt b7, tiredness b8 and concentration b9, each of a, c, d, e, f, g, h, i and j is greater than 0, a + c + d + e + f + g + h + i + j = 1.0, and C2 is a second constant correction coefficient;
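The weighted combination of the nine expression counts can be sketched as follows. The coefficient values and the correction constant C2 are invented to satisfy the stated constraints (each coefficient positive, sum equal to 1.0); the claim does not disclose the actual values.

```python
# Illustrative proportionality coefficients for the counts b1..b9
# (the claim only requires each > 0 and a sum of exactly 1.0).
COEFFS = {"anger": 0.15, "aversion": 0.10, "fear": 0.10, "happiness": 0.15,
          "sadness": 0.10, "surprise": 0.10, "contempt": 0.10,
          "tiredness": 0.10, "concentration": 0.10}
C2 = 0.02  # second constant correction coefficient (assumed value)

def jjd(counts):
    """Emotional positivity JJD as a weighted combination of expression
    occurrence counts, plus the constant correction term."""
    assert abs(sum(COEFFS.values()) - 1.0) < 1e-9  # coefficients sum to 1.0
    return sum(COEFFS[k] * counts.get(k, 0) for k in COEFFS) + C2
```

A session with two happiness detections and one anger detection would score 0.15·1 + 0.15·2 + 0.02 under these assumed coefficients.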
movement features are extracted from each segment video using pose estimation techniques, including a deep-learning-based posture recognition model, to extract posture features and count their occurrences, including:
standing posture P1: standing straight, feet parallel, shoulders relaxed;
sitting posture P2: upright sitting position, straight back, legs crossed or flat;
gesture P3: hands in pockets, arms crossed over the chest, or hands resting naturally on the table;
arm crossing P4: arms crossed in front of the chest, accompanied by a smiling expression;
concentration posture P5: body leaning slightly forward, gaze focused, hands placed on the table;
the confidence level ZXD is calculated from the posture feature counts by the following formula:

ZXD = k·P1 + m·P2 + n·P3 + o·P4 + q·P5 + C3

wherein k, m, n, o and q represent the proportionality coefficients of the occurrence counts of the standing posture P1, sitting posture P2, gesture P3, arm crossing P4 and concentration posture P5, each of k, m, n, o and q is greater than 0, k + m + n + o + q = 1.0, and C3 is a third constant correction coefficient;
speech dialogue features are extracted from each segment video using language processing techniques, including a deep-learning-based speech recognition model, to obtain speech rate, intonation and interaction frequency features, from which the language expression capacity coefficient YYX is calculated;

wherein HDPL denotes the interaction frequency between the target patient and the psychologist in the videos, BZ denotes a standard interaction frequency, YDPF denotes the intonation score, and YSPF denotes the average speech rate;
and the interaction positive coefficient Jcx3 is generated by fitting the target patient's emotional positivity JJD, confidence level ZXD and language expression capacity coefficient YYX with curve-fitting methods, including exponential, logarithmic and power functions.
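A power-function fit of the kind named above can be obtained with ordinary least squares in log-log space. The calibration points and the use of the mean of the three sub-scores are illustrative assumptions; the claim does not disclose how JJD, ZXD and YYX are combined.

```python
import numpy as np

def fit_power(x, y):
    """Least-squares fit of y = a * x**b via log-log linear regression."""
    X = np.vstack([np.ones_like(x), np.log(x)]).T
    coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
    return np.exp(coef[0]), coef[1]  # a, b

def jcx3(jjd_v, zxd_v, yyx_v, a, b):
    """Map the mean of the three sub-scores through the fitted power curve
    (combining by the mean is an assumption, not the claimed method)."""
    return a * np.mean([jjd_v, zxd_v, yyx_v]) ** b

# calibration points generated from an exact power law y = 2 * x**0.5,
# so the fit should recover a = 2 and b = 0.5
xs = np.array([1.0, 4.0, 9.0, 16.0])
ys = 2.0 * xs ** 0.5
a, b = fit_power(xs, ys)
```

Exponential (y = a·e^(b·x)) and logarithmic (y = a + b·ln x) forms fit the same way after the appropriate variable transform.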
8. The method for assessing the psychological disorder of a psychiatric patient according to claim 5, wherein: in the virtual reality (VR) environment, the actual number of exchanges JLcs between the target patient and the virtual character is collected, and the interaction scenario coefficient Jcx4 is generated by the following formula:

Jcx4 = (JLcs / BZVR) × 100%

wherein BZVR denotes the preset standard number of exchanges for the VR virtual environment; the formula evaluates the target patient's performance when interacting with the virtual character; an interaction scenario coefficient Jcx4 approaching 100% indicates that the target patient's actual number of exchanges matches the preset standard.
9. The method for assessing the psychological disorder of a psychiatric patient according to claim 8, wherein: the comprehensive evaluation coefficient Zh is generated by the following formula:

Zh = w1·Jcx1 + w2·Jcx2 + w3·Jcx3 + w4·Jcx4

wherein w1, w2, w3 and w4 are the proportionality coefficients of the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 respectively, with 0.25 ≤ w1 ≤ 0.65, 0.15 ≤ w2 ≤ 0.55, 0.25 ≤ w3 ≤ 0.55 and 0.15 ≤ w4 ≤ 0.66;
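The weighted combination with the claimed coefficient ranges can be sketched as follows. The specific weight values are illustrative picks within those ranges (chosen here to sum to 1.0, which the claim does not require).

```python
def zh(jcx1, jcx2, jcx3, jcx4, w=(0.4, 0.2, 0.25, 0.15)):
    """Comprehensive evaluation coefficient Zh as a weighted sum of the four
    sub-coefficients; default weights are illustrative values within the
    claimed ranges 0.25<=w1<=0.65, 0.15<=w2<=0.55, 0.25<=w3<=0.55,
    0.15<=w4<=0.66."""
    w1, w2, w3, w4 = w
    assert 0.25 <= w1 <= 0.65 and 0.15 <= w2 <= 0.55
    assert 0.25 <= w3 <= 0.55 and 0.15 <= w4 <= 0.66
    return w1 * jcx1 + w2 * jcx2 + w3 * jcx3 + w4 * jcx4
```

With sub-coefficients normalized to the same scale, Zh stays on that scale whenever the weights sum to 1.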
a typical symptom threshold ZZ is set in the evaluation model; the target patient's comprehensive evaluation coefficient Zh is compared with the typical symptom threshold ZZ; when the comprehensive evaluation coefficient Zh is greater than or equal to the typical symptom threshold, the symptom evaluation result is a suspected abnormality;

when the comprehensive evaluation coefficient Zh is less than the typical symptom threshold, the result is normal;
the comprehensive evaluation coefficient Zh in the symptom evaluation result is further compared with a depression threshold, an attention deficit threshold and an anxiety threshold;

if the comprehensive evaluation coefficient Zh is within the depression threshold range, a first counseling scheme is generated, comprising: administering antidepressant drug treatment together with psychological counseling, and establishing a social support system matched to the target patient, the social support system including family members, friends, classmates or community support groups;

if the comprehensive evaluation coefficient Zh is within the attention deficit threshold range, a second counseling scheme is generated, comprising: providing cognitive behavioral therapy to help the target patient change negative thought patterns; planning schedules and rewards through concentration behavior intervention, and setting goals for further counseling;

if the comprehensive evaluation coefficient Zh is within the anxiety threshold range, a third counseling scheme is generated, comprising: teaching the patient deep breathing and progressive muscle relaxation techniques, combined with drug treatment and behavioral exposure counseling.
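The threshold comparison and scheme selection above amount to a simple dispatch. All numeric thresholds below are invented, since the patent discloses no values; the half-open ranges are likewise an assumption about how overlapping thresholds would be resolved.

```python
# Illustrative, assumed threshold values on a 0-1 Zh scale.
TYPICAL_ZZ = 0.60                 # typical symptom threshold ZZ
RANGES = {                        # per-disorder [low, high) ranges
    "depression": (0.60, 0.75),
    "attention_deficit": (0.75, 0.85),
    "anxiety": (0.85, 1.01),
}

def route(zh_value):
    """Compare Zh against the typical symptom threshold, then against the
    per-disorder ranges, returning which counseling scheme applies."""
    if zh_value < TYPICAL_ZZ:
        return "normal"
    for name, (lo, hi) in RANGES.items():
        if lo <= zh_value < hi:
            return name
    return "suspected_abnormal"  # above ZZ but outside all disorder ranges
```

Each returned label would select the corresponding first, second or third counseling scheme described in the claim.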
10. A system for assessing the psychological disorder of a psychiatric patient, applying the method for assessing the psychological disorder of a psychiatric patient according to any one of claims 1 to 9, characterized in that: the system comprises a psychological disorder detection model building module, a multichannel acquisition module, a psychological disorder feature extraction module, an evaluation model building module and a counseling scheme generation module;
the module for establishing the psychological barrier detection model is used for generating the psychological barrier detection model by using a psychological barrier characteristic sample acquired in advance through convolutional neural network CNN training;
the multichannel acquisition module is used for establishing a first quantitative data set comprising psychological disorder diagnosis results through diagnosis of psychiatric specialists, acquiring physiological data of a target patient by using an intelligent bracelet and establishing a second target database; recording the whole-course video of the target patient through the camera equipment, and establishing a third target database; creating a virtual environment by using a virtual reality technology, interacting with the virtual character, collecting virtual scene behavior data, and establishing a fourth target database;
the psychological disorder feature extraction module is used for extracting psychological disorder features from the first quantized data set and the second, third and fourth target databases, and analyzing the features with the psychological disorder detection model to obtain the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4;
the evaluation model building module is used for correlating the basic scoring coefficient Jcx1, the abnormal physiological coefficient Jcx2, the interaction positive coefficient Jcx3 and the interaction scenario coefficient Jcx4 to obtain the comprehensive evaluation coefficient Zh, and comparing the comprehensive evaluation coefficient Zh with the typical symptom threshold ZZ to obtain the symptom evaluation result;
the counseling scheme generation module is used for generating corresponding counseling schemes according to the symptom evaluation result, including different counseling schemes for depression, attention deficit and anxiety disorder.
CN202311759887.8A 2023-12-20 2023-12-20 Method and system for assessing psychological disorder of psychiatric patient Active CN117438048B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311759887.8A CN117438048B (en) 2023-12-20 2023-12-20 Method and system for assessing psychological disorder of psychiatric patient

Publications (2)

Publication Number Publication Date
CN117438048A true CN117438048A (en) 2024-01-23
CN117438048B CN117438048B (en) 2024-02-23

Family

ID=89556913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311759887.8A Active CN117438048B (en) 2023-12-20 2023-12-20 Method and system for assessing psychological disorder of psychiatric patient

Country Status (1)

Country Link
CN (1) CN117438048B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117637118A (en) * 2024-01-27 2024-03-01 南京元域绿洲科技有限公司 Anxiety disorder virtual reality training system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109298779A (en) * 2018-08-10 2019-02-01 济南奥维信息科技有限公司济宁分公司 Virtual training System and method for based on virtual protocol interaction
US20190216392A1 (en) * 2016-08-26 2019-07-18 Akili Interactive Labs, Inc. Cognitive platform coupled with a physiological component
CN113539430A (en) * 2021-07-02 2021-10-22 广东省人民医院 Immersive VR-based Parkinson's disease depression cognitive behavior treatment system
US20220398411A1 (en) * 2021-06-11 2022-12-15 Hume AI Inc. Empathic artificial intelligence systems
WO2023272935A1 (en) * 2021-07-01 2023-01-05 曾迎春 Virtual reality-based cancer-related cognitive impairment evaluation and rehabilitation training system
WO2023102125A1 (en) * 2021-12-01 2023-06-08 Behavr, Inc. Management of psychiatric or mental conditions using digital or augmented reality with personalized exposure progression
CN116469557A (en) * 2023-03-22 2023-07-21 深圳大学 Intervention effect evaluation method, device, terminal equipment and storage medium
CN116994718A (en) * 2023-09-28 2023-11-03 南京元域绿洲科技有限公司 VR technology-based mental disorder auxiliary treatment method
CN117133409A (en) * 2023-08-11 2023-11-28 贵州省人民医院 Auxiliary child autism spectrum disorder rehabilitation system based on VR interaction technology

Also Published As

Publication number Publication date
CN117438048B (en) 2024-02-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant