CN111341444A - Intelligent drawing scoring method and system - Google Patents


Info

Publication number
CN111341444A
Authority
CN
China
Prior art keywords
user
score
eye movement
client
emotion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911406860.4A
Other languages
Chinese (zh)
Other versions
CN111341444B (en)
Inventor
Wang Quan (王荃)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Dongquan Intelligent Technology Co ltd
Original Assignee
Dongguan Dongquan Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Dongquan Intelligent Technology Co., Ltd.
Priority to CN201911406860.4A
Publication of CN111341444A
Application granted
Publication of CN111341444B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30: ICT specially adapted for calculating health indices; for individual health risk assessment
    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70: ICT specially adapted for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Data Mining & Analysis (AREA)
  • Biomedical Technology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses an intelligent drawing scoring method and system. In the method, the client provides reference pictures for a user to select; receives and stores the images and lines the user draws in a drawing area; collects the user's eye movement information and facial features; and sends the drawn images and lines, the eye movement information, and the facial features to a processing center. After receiving the data, the processing center analyzes it to obtain a drawing score, an eye-tracking score, and an emotion score, and combines the three to obtain a risk assessment result. The processing center sends the drawing score, eye-tracking score, emotion score, and risk assessment result to the client, which receives and displays them. The invention frees the user from limitations of time and place, so that test results can be obtained with the user in a relaxed state, and provides a quantitative reference for the various disease-related behaviors that may appear during the user's drawing test.

Description

Intelligent drawing scoring method and system
Technical Field
The invention relates to the technical field of computer image processing, in particular to an intelligent drawing scoring method and system.
Background
Three major brain-science diseases currently faced worldwide are childhood autism, adult depression, and Alzheimer's disease in the elderly (commonly known as senile dementia). These problems also reflect society's concern about mental illness, child development, and population aging; abnormalities of the human brain are often reflected in a person's behavior.
Drawing scoring is already used in clinical medicine to help doctors judge children's developmental level, or to assist in diagnosing degenerative and mental diseases of the elderly, such as vision problems, diseases related to muscle control (e.g., Parkinson's disease, cerebral palsy), and certain psychological states.
Eye-tracking technology is used for psychological and behavioral analysis in scientific research and medicine. Eye tracking for assessing a user's attention and degree of engagement with the drawing test may be applied to nystagmus-related diseases (e.g., epilepsy, schizophrenia, ocular tremor), attention-related conditions (e.g., attention deficit hyperactivity disorder (ADHD), depression), and developmental or psychological conditions involving reduced social attention (e.g., autism spectrum disorder (ASD)).
Expression analysis is likewise useful for expression recognition and psychological analysis in clinical psychology. For example, ADHD is often accompanied by impatience, boredom, and irritability, while facial paralysis or facial neuritis tends to cause problems with facial muscle control. Mood disorders such as depression, mania, and bipolar disorder may also show more negative expressions, and negative emotions may in turn affect a child's attention.
However, whether for drawing scoring, expression analysis, or gaze-focus analysis, a judgment made by only one or two clinicians at a time is often highly subjective and subject to significant external influence. Meanwhile, limited by time, location, and other factors, the user or patient may not be diagnosed and treated promptly.
At present in China, developmental assessments and mental-disease assessments are all carried out by offline institutions and organizations. A user who wants a drawing test must therefore set aside time to travel to a designated place to take part in drawing or other behavioral tests, which is inconvenient and removes the user from a familiar environment, potentially affecting attention and harming the test result. Offline evaluation also requires professional doctors to be present, which is difficult when qualified psychologists are in short supply, pediatricians are overloaded, and the medical needs of the elderly exceed capacity. Moreover, existing evaluation schemes cannot provide a quantitative reference for the various disease-related behaviors that may appear during a drawing test.
Disclosure of Invention
The invention provides an intelligent drawing scoring method and system to solve the technical problems that users are limited by time and space during psychological evaluation and are tested in a nervous mental state.
In order to solve the technical problems, the technical scheme provided by the invention is as follows:
An intelligent drawing scoring method comprises the following steps:
the client provides a reference picture for a user to select;
the client displays the reference image selected by the user, and receives and stores the image and the line drawn by the user in the drawing area; meanwhile, the client collects the eye movement information of the user and the facial features of the user through a camera device; sending the images and lines drawn by the user in the drawing area, the eye movement information of the user and the facial features of the user to a processing center; the processing center is arranged on the server or the client;
the processing center receives, from the client, the images and lines drawn by the user in the drawing area, the user's eye movement information, and the user's facial features; it analyzes the images and lines to obtain a drawing score, analyzes the eye movement information to obtain an eye-tracking score, and obtains the user's facial feature points and their changes during drawing, from which the user's expression is derived to judge the emotional state and produce an emotion score; the drawing score, eye-tracking score, and emotion score are then combined to obtain a risk assessment result;
the processing center sends the drawing score, the eye movement tracking score, the emotion score and the risk assessment result to the client;
the client receives and displays the drawing score, the eye movement tracking score, the emotion score and the risk assessment result from the processing center.
Preferably, the drawing score comprises: accuracy, continuity, smoothness, and line size and proportion;
the accuracy judges whether the user has drawn the reference figure accurately, according to the similarity between the user's image and lines and the reference figure;
the continuity judges whether the user has drawn the reference figure continuously, according to whether the drawn lines are continuous or broken;
the smoothness judges whether the user has drawn the reference figure smoothly, according to whether the drawn lines are smooth or show obvious tremor;
and the line size and proportion judge the accuracy of the user's hand-muscle control, according to the thickness of the drawn lines and the overall drawing size.
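Two of the criteria above, continuity and smoothness, can be approximated directly from stroke geometry. A minimal sketch, where a drawing is assumed to be a list of strokes and a stroke a list of (x, y) points; the penalty constants are illustrative, not from the patent:

```python
import math

def continuity_score(strokes):
    """Fewer separate strokes (pen lifts / breaks) -> higher continuity."""
    return max(0.0, 100.0 - 10.0 * (len(strokes) - 1))

def smoothness_score(stroke):
    """Penalise direction jitter between consecutive segments of one stroke:
    a straight line scores 100, constant sharp turning scores toward 0."""
    if len(stroke) < 3:
        return 100.0
    total_turn = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(stroke, stroke[1:], stroke[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        turn = abs(a2 - a1)
        total_turn += min(turn, 2 * math.pi - turn)  # wrap into [0, pi]
    avg_turn = total_turn / (len(stroke) - 2)
    return max(0.0, 100.0 * (1.0 - avg_turn / math.pi))
```

Accuracy (similarity to the reference figure) and line size would need image comparison and stroke-width data, which the patent leaves to its image-analysis stage.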
The invention also provides an intelligent drawing scoring method, which is applied to the client and comprises the following steps:
the client provides a reference picture for a user to select;
the client displays the reference image selected by the user, and receives and stores the image and the line drawn by the user in the drawing area; meanwhile, the client collects the eye movement information of the user and the facial features of the user through a camera device; sending the images and lines drawn by the user in the drawing area, the eye movement information of the user and the facial features of the user to a server;
the client receives and displays the drawing score, the eye movement tracking score, the emotion score and the risk assessment result from the server.
Preferably, collecting the user's eye movement information comprises recording the on-screen positions observed by the user's eyes during drawing, to obtain the eyes' points of attention and gaze tracks; collecting the user's facial features comprises the user's facial feature points and their changes during drawing;
the client also receives and displays an eye tracking result from the server, wherein the eye tracking result is an eye tracking heat map.
The invention also provides an intelligent drawing scoring method, which is applied to a server and comprises the following steps:
the server receives, from the client, the images and lines drawn by the user in the drawing area, the user's eye movement information, and the user's facial features; it analyzes the images and lines to obtain a drawing score, analyzes the eye movement information to obtain an eye-tracking score, and obtains the user's facial feature points and their changes during drawing, from which the user's expression is derived to judge the emotional state and produce an emotion score; the drawing score, eye-tracking score, and emotion score are then combined to obtain a risk assessment result;
and the server sends the drawing score, the eye movement tracking score, the emotion score and the risk assessment result to the client.
Preferably, the drawing score comprises: accuracy, continuity, smoothness, and line size and proportion.
Preferably, the accuracy judges whether the user has drawn the reference figure accurately, according to the similarity between the user's image and lines and the reference figure;
the continuity judges whether the user has drawn the reference figure continuously, according to whether the drawn lines are continuous or broken;
the smoothness judges whether the user has drawn the reference figure smoothly, according to whether the drawn lines are smooth or show obvious tremor;
and the line size and proportion judge the accuracy of the user's hand-muscle control, according to the thickness of the drawn lines and the overall drawing size.
Preferably, the eye movement information includes the eyes' points of attention and gaze tracks, obtained from the recorded on-screen positions observed by the user's eyes during drawing;
the server also evaluates the user's degree of fit and the eye-tracking score according to the eyes' points of attention and gaze tracks.
The present invention also provides a computer system comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of any of the methods described above when executing the computer program.
The invention has the following beneficial effects:
according to the intelligent drawing scoring method, the limitation of time and space of the user can be avoided through local testing of the client or online testing between the client and the server (the method can be applied to diversified platforms such as a webpage version, a computer version and a mobile version), so that a testing result of the user in a relaxed state can be obtained. The method comprises the steps of collecting the actions of eyes, faces and hands of a user, automatically evaluating the painting and the eye movement and emotion of the user in the period by means of artificial intelligence and video analysis, and detecting and evaluating the behavior characteristics, attention distribution and emotion states of the user in the painting process to judge the psychological state of the user. Provides quantitative reference for the research, evaluation and prevention of children dysplasia, senile degenerative diseases, psychological problems, mental diseases and the like, and can perform online psychological health evaluation at any time and any place.
In addition to the objects, features, and advantages described above, the present invention has other objects, features, and advantages. The present invention will be described in further detail below with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a flow chart diagram of an intelligent drawing scoring method of a client according to a preferred embodiment of the invention;
FIG. 2 is a flow chart diagram of an intelligent drawing scoring method of the server in accordance with the preferred embodiment of the present invention;
FIG. 3 is a schematic diagram of a pictorial scoring of a preferred embodiment of the present invention;
FIG. 4 is a schematic diagram of an eye tracking heat map of a preferred embodiment of the present invention;
FIG. 5 is a flow chart of an intelligent drawing scoring method in cooperation with a client and a processing center according to a preferred embodiment of the present invention;
FIG. 6 is a graphical representation of the results of mood scoring in accordance with a preferred embodiment of the present invention;
FIG. 7 is a schematic illustration of neutral emotion recognition in accordance with a preferred embodiment of the present invention;
FIG. 8 is a schematic illustration of positive emotion recognition in a preferred embodiment of the present invention;
FIG. 9 is a schematic illustration of negative emotion recognition in a preferred embodiment of the present invention;
fig. 10 is a graphical representation of composite scores in accordance with a preferred embodiment of the present invention.
Detailed Description
The embodiments of the invention will be described in detail below with reference to the drawings, but the invention can be implemented in many different ways as defined and covered by the claims.
Referring to fig. 1, the intelligent drawing scoring method of the present invention, applied to a client, comprises the following steps:
the client provides a reference picture (uploaded by the user or randomly selected from a picture library by the system) for the user to select; the client includes, but is not limited to, a web page, a computer client, a tablet computer client or a tablet computer application, a mobile phone client or a mobile phone application, a wechat applet, and the like.
The client displays the reference image selected by the user, and receives and stores the image and the line drawn by the user in the drawing area; meanwhile, the client collects the eye movement information of the user and the facial features of the user through a camera device; sending the images and lines drawn by the user in the drawing area, the eye movement information of the user and the facial features of the user to a server;
the client receives and displays the drawing score, the eye movement tracking score, the emotion score and the risk assessment result from the server.
Through online testing between client and server, the limitations of time and space on the user can be avoided, and test results can be obtained with the user in a relaxed state. Mental-health evaluation can be carried out online anytime, anywhere, providing references for the research, evaluation, and prevention of childhood developmental disorders, senile degenerative diseases, and mental diseases.
In practice, collecting the user's eye movement information includes recording the on-screen positions observed by the user's eyes during drawing, to obtain the eyes' points of attention and gaze tracks. The eye-tracking component can be built, for example, with the RecordRTC recording framework based on WebRTC (Web Real-Time Communication) and the face-api.js face-recognition framework (a JavaScript API for face detection and recognition in the browser), calling the computer camera or the front camera of a mobile phone or tablet to record the on-screen positions the user's eyes observe during drawing. When presenting results to the user, referring to fig. 4, the client also receives and displays the eye-tracking result from the server; in this embodiment, the eye-tracking result is an eye-tracking heat map.
In practice, collecting the user's facial features includes the facial feature points and their changes while the user draws. Facial-feature recognition can be built, for example, with the open-source face-api.js, calling the computer camera or the front camera of a mobile phone or tablet to record changes in the user's facial features during drawing. Facial-feature recognition identifies features of the face, such as the eye corners and mouth corners, and records and analyzes changes in facial expression. It can also provide a reference for the discovery, evaluation, and rehabilitation of nervous-system diseases or muscle-control-related diseases.
Correspondingly, the invention also provides an intelligent drawing scoring method, applied to a server, comprising the following steps: the server receives, from the client, the images and lines drawn by the user in the drawing area, the user's eye movement information, and the user's facial features; it analyzes the images and lines to obtain a drawing score, analyzes the eye movement information to obtain an eye-tracking score, and obtains the user's facial feature points and their changes during drawing, from which the user's expression is derived to judge the emotional state and produce an emotion score; the drawing score, eye-tracking score, and emotion score are combined to obtain a risk assessment result; the server then sends the drawing score, eye-tracking score, emotion score, and risk assessment result to the client.
The invention also provides an intelligent drawing scoring method, applied to the client and the processing center, comprising the following steps: the client provides a reference picture for the user to select; the client displays the reference picture selected by the user, and receives and stores the images and lines the user draws in the drawing area; meanwhile, the client collects the user's eye movement information and facial features through a camera device, and sends the drawn images and lines, the eye movement information, and the facial features to the processing center, which is arranged on the server or on the client. The processing center receives these data from the client; analyzes the images and lines to obtain a drawing score, analyzes the eye movement information to obtain an eye-tracking score, and obtains the user's facial feature points and their changes during drawing, from which the user's expression is derived to judge the emotional state and produce an emotion score; and combines the drawing score, eye-tracking score, and emotion score to obtain a risk assessment result. The processing center sends the drawing score, eye-tracking score, emotion score, and risk assessment result to the client, which receives and displays them.
In this embodiment, referring to fig. 3, the drawing score comprises: accuracy, continuity, smoothness, and line size and proportion. The accuracy judges whether the user has drawn the reference figure accurately, according to the similarity between the user's image and lines and the reference figure, providing a reference for the user's recognition of geometric figures. The continuity judges whether the user has drawn the reference figure continuously, according to whether the drawn lines are continuous or broken, providing a reference for the discovery, evaluation, and rehabilitation of brain diseases, nervous-system diseases, or muscle-control-related diseases. The smoothness judges whether the user has drawn the reference figure smoothly, according to whether the drawn lines are smooth or show obvious tremor, providing a reference for the discovery, evaluation, and rehabilitation of brain diseases, nervous-system diseases (especially Parkinson's disease), or muscle-control-related diseases. The line size and proportion judge the accuracy of the user's hand-muscle control, according to the thickness of the drawn lines and the overall drawing size, providing a quantitative reference for the user's manual dexterity as well as for the discovery, evaluation, and rehabilitation of brain diseases, nervous-system diseases, or muscle-control-related diseases. In practical application, the reference field can also be used for the user to write, and scoring can follow the drawing scoring to obtain a writing score.
In this embodiment, the eye movement information includes the eyes' points of attention and gaze tracks, obtained from the recorded on-screen positions observed by the user's eyes during drawing; the server also evaluates the user's degree of fit and the eye-tracking score from these points of attention and gaze tracks. Emotion scoring is performed according to the user's emotional state, and the drawing score, eye-tracking score, and emotion score are combined to obtain the risk assessment result. The user's emotional states include: positive emotion, negative emotion, neutral emotion, and expression symmetry. By acquiring the positions of the facial feature points and the movements of the lines connecting them, the movements of the user's facial muscles can be known (such as the rising arc of the mouth corners and eye corners, or frowning of the eyebrows), from which positive emotion (happy, pleased, relaxed), negative emotion (sad, tired, irritated), or neutral emotion (no obvious emotional change in the facial expression) is computed, together with the left-right symmetry of the user's expression.
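The landmark-based reading described above (mouth-corner arcs, expression symmetry) could be approximated as follows. The landmark names, the screen coordinate convention (y grows downward), and the threshold are assumptions for illustration, not the patent's actual feature model:

```python
def mouth_corner_lift(landmarks):
    """Positive when both mouth corners sit above the mouth centre
    (screen y grows downward), suggesting a smile."""
    left = landmarks["mouth_left"]
    right = landmarks["mouth_right"]
    centre = landmarks["mouth_centre"]
    return ((centre[1] - left[1]) + (centre[1] - right[1])) / 2.0

def expression_symmetry(landmarks):
    """1.0 = mouth-corner heights perfectly symmetric; lower = asymmetric,
    which the text links to facial muscle-control problems."""
    left, right = landmarks["mouth_left"], landmarks["mouth_right"]
    return 1.0 / (1.0 + abs(left[1] - right[1]))

def classify(landmarks, threshold=2.0):
    """Map corner lift to the three emotion classes named in the text."""
    lift = mouth_corner_lift(landmarks)
    if lift > threshold:
        return "positive"
    if lift < -threshold:
        return "negative"
    return "neutral"
```

A real implementation would track many more landmarks (eye corners, eyebrows) over time, as the embodiment describes, rather than a single mouth snapshot.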
The invention can be implemented on multiple platforms, such as a web application, a computer client, a mobile app, and a WeChat applet, so that users can choose according to their own circumstances. With the web-application scheme, for example, a user needs only a browser, avoiding the installation of additional software or plug-ins, lowering the difficulty of use, and improving compatibility and usability. The web side may adopt a resolution-responsive design that automatically adapts to displays of different resolutions, breaking platform limitations: users can use the product on a computer as well as on mobile devices such as phones and tablets, giving them a variety of choices across platforms. Wherever a network is available, the invention can be used in fragments of spare time through a browser or app; it can also be provided as a stand-alone version independent of the network, for users without network access.
In addition, in this embodiment, data of the client and the processing center is stored and transmitted using blockchain technology. Blockchain storage and tracing can protect the key data of critical information-system infrastructure during data sharing, so that maliciously tampered key data is discovered in time, and false or illegally tampered data found during use can be repaired promptly.
Drawing techniques provide a relatively simple way to collect social information from or about a child. Drawing is a good assessment tool because most children like drawing and show no signs of stress while doing it. Although many children do not like answering questions, they can complete a drawing test quickly, easily, and pleasantly. In addition, stand-alone testing on the client or online testing between client and server (applicable to diversified platforms such as web, desktop, mobile, and stand-alone versions) frees the user from limitations of time and space, so that test results can be obtained with the user in a relaxed state.
According to the invention, the imaging device of the computing platform is used to simultaneously collect and analyze the fine movements of the hands and eyes, the movement track of the eyes, the attention distribution, and the emotional state, and to perform drawing scoring, eye-tracking scoring, and emotion scoring (each can be tested multiple times and the results combined to ensure accuracy). The three scores are finally combined into a risk assessment result (composite score), with related analysis and auxiliary diagnosis performed by means of artificial intelligence. In implementation, the three can be combined to obtain risk assessment results such as the following:
TABLE 1 Correspondence of drawing score, eye-movement tracking score, expression score, and risk assessment
(table content is provided as images in the original filing)
Note: 1. The risk assessment results are for reference only and can be confirmed by further testing. 2. The risk assessments shown in Table 1 are exemplary only and not exhaustive. 3. The high/normal/medium thresholds for the score values may be set and fine-tuned empirically and experimentally.
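The score-integration step described above can be sketched as follows. This is a minimal illustration only: the threshold values, band names, and risk labels are assumptions for demonstration, since Table 1 is provided only as an image in the original filing.

```python
# Hypothetical sketch of integrating the three sub-scores into a
# composite score and an exemplary risk-assessment label.
# Thresholds and labels are illustrative assumptions, not from the patent.

def band(score, high=80, low=60):
    """Map a 0-100 score to a coarse band."""
    if score >= high:
        return "high"
    if score >= low:
        return "normal"
    return "low"

def assess_risk(drawing, eye_tracking, emotion):
    """Combine drawing, eye-tracking and emotion scores (each 0-100)."""
    composite = round((drawing + eye_tracking + emotion) / 3, 1)
    bands = (band(drawing), band(eye_tracking), band(emotion))
    if bands.count("low") >= 2:
        label = "elevated risk - further testing recommended"
    elif "low" in bands:
        label = "possible risk - retest advised"
    else:
        label = "no obvious risk"
    return composite, label

print(assess_risk(85, 90, 82))  # -> (85.7, 'no obvious risk')
print(assess_risk(55, 50, 85))  # two low bands -> elevated risk
```

In practice the mapping from score bands to risk labels would be tuned empirically, as the note above indicates, and could also be learned from labeled assessment data.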
Fig. 10 is a schematic diagram of the output of a composite score. In this embodiment, the risk assessment result can be obtained from the composite score of Fig. 10; see Table 2:
TABLE 2 Composite score evaluation example
(table content is provided as an image in the original filing)
The scoring details employed by the example of Fig. 10 are given below:
(1) Drawing score:
Drawing accuracy (out of 100): the drawing submitted by the user is compared with the reference drawing provided by the system; the closer the shapes match, the higher the score.
Drawing smoothness (out of 100): the smoothness of the user's drawn lines is analyzed; the less the lines jitter, the higher the score.
Drawing continuity (out of 100): whether the user draws lines continuously is analyzed; the fewer the line breaks, the higher the score.
Drawing line size (out of 100): the thickness of the user's drawn lines is analyzed; the thinner the lines, the lower the score, and the smaller the overall size of the drawing, the lower the score.
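Two of the criteria above, smoothness and continuity, can be sketched from recorded stroke points. The scoring formulas below (angular jitter for smoothness, a fixed penalty per pen lift for continuity) are illustrative assumptions; the patent does not specify the exact computation.

```python
# Minimal sketch of smoothness and continuity scoring from stroke points.
# A stroke is a list of (x, y) points; a drawing is a list of strokes.
import math

def smoothness_score(points):
    """Score 0-100 from angular jitter: the smaller the direction
    changes between consecutive segments, the higher the score."""
    if len(points) < 3:
        return 100.0
    turns = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        d = abs(a2 - a1)
        turns.append(min(d, 2 * math.pi - d))  # wrap angle to [0, pi]
    mean_turn = sum(turns) / len(turns)
    return max(0.0, 100.0 * (1 - mean_turn / math.pi))

def continuity_score(strokes, expected_strokes=1):
    """Score 0-100: each pen lift beyond the expected stroke count
    (an assumed penalty of 20 points) lowers the score."""
    breaks = max(0, len(strokes) - expected_strokes)
    return max(0.0, 100.0 - 20.0 * breaks)

line = [(x, 0.0) for x in range(10)]           # a perfectly straight stroke
print(smoothness_score(line))                   # -> 100.0
print(continuity_score([line]))                 # -> 100.0
print(continuity_score([line[:5], line[5:]]))   # one break -> 80.0
```

Accuracy would additionally require shape comparison against the reference drawing (e.g. contour matching), which is omitted here.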
(2) Eye-movement tracking score:
Eye movement score (out of 100): the more the hotspots of the eye-tracking result concentrate on the drawing region and the reference drawing region, the higher the score.
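A simple way to realize the eye-movement score above is the fraction of gaze samples falling inside the regions of interest. The region coordinates and the proportional scoring rule below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch: eye-movement score as the share of gaze samples
# inside the drawing region or the reference-drawing region.

def in_rect(point, rect):
    """rect = (x, y, width, height) in screen pixels."""
    x, y = point
    rx, ry, rw, rh = rect
    return rx <= x <= rx + rw and ry <= y <= ry + rh

def eye_movement_score(gaze_points, regions):
    """Score 0-100: percentage of gaze samples inside any region."""
    if not gaze_points:
        return 0.0
    hits = sum(any(in_rect(p, r) for r in regions) for p in gaze_points)
    return 100.0 * hits / len(gaze_points)

drawing_area = (0, 0, 500, 500)      # assumed screen layout
reference_area = (520, 0, 300, 300)
gaze = [(100, 100), (250, 400), (600, 150), (900, 900)]
print(eye_movement_score(gaze, [drawing_area, reference_area]))  # -> 75.0
```

A fuller implementation would weight samples by fixation duration rather than counting raw samples, since dwell time is what eye-tracking heatmaps visualize.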
(3) Emotion score:
Emotion score (out of 100) = neutral emotion score + positive emotion score + negative emotion score. The emotion score represents which emotion the user's expression tends toward. Fig. 6 is an example of an emotion score, where the scores for the three expressions represent the components of the expression exhibited by the user's face; a higher score for an expression means a greater probability that the user's face shows that expression at that moment. See the table below:
TABLE 3 Emotion score content
(table content is provided as an image in the original filing)
Expression symmetry (out of 100): using the line through the user's nose feature points as the central axis, the symmetry of the user's left and right facial feature points is analyzed; the higher the symmetry, the higher the score.
The results of the emotion score are: positive emotion (see Fig. 8), negative emotion (see Fig. 9), neutral emotion (see Fig. 7), and expression symmetry. By acquiring the positions of the facial feature points and the motion of the lines connecting them, the movements of the user's facial muscles (such as the upward arc of the mouth corners and eye corners, or frowning of the eyebrows) can be determined, from which the user's positive emotion (happy, pleased, relaxed), negative emotion (sad, tired, irritated), or neutral emotion (no obvious emotional change in the facial expression) and the left-right symmetry of the user's expression are calculated.
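The expression-symmetry measure described above can be sketched as follows, assuming 2D facial landmarks given as left/right point pairs and a vertical nose axis. The landmark layout, the normalization by face width, and the linear score formula are all assumptions for illustration.

```python
# Minimal sketch of expression symmetry: mirror each left landmark
# across the vertical axis through the nose feature points and compare
# it with its right counterpart; less mismatch -> higher score.
import math

def symmetry_score(nose_points, lr_pairs):
    """nose_points: (x, y) points defining the central axis.
    lr_pairs: list of ((left_x, left_y), (right_x, right_y)) landmark pairs.
    Returns a 0-100 symmetry score."""
    axis_x = sum(x for x, _ in nose_points) / len(nose_points)
    errors = []
    for (lx, ly), (rx, ry) in lr_pairs:
        mirrored_x = 2 * axis_x - lx        # reflect left point across axis
        errors.append(math.hypot(mirrored_x - rx, ly - ry))
    # normalize mismatch by face width so the score is scale-invariant
    face_width = max(abs(rx - lx) for (lx, _), (rx, _) in lr_pairs) or 1.0
    mean_err = sum(errors) / len(errors)
    return max(0.0, 100.0 * (1 - mean_err / face_width))

nose = [(50.0, 40.0), (50.0, 60.0)]
pairs = [((30.0, 30.0), (70.0, 30.0)),   # eye corners, perfectly mirrored
         ((35.0, 70.0), (65.0, 70.0))]   # mouth corners
print(symmetry_score(nose, pairs))       # -> 100.0
```

In a real system the landmark pairs would come from a face-landmark detector, and the emotion proportions from an expression classifier; both are outside the scope of this sketch.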
The invention uses artificial intelligence to score the user's drawing: a computer algorithm analyzes the drawing and automatically gives the score, saving the cost of manual scoring and improving the efficiency and objectivity of the drawing test. The drawing process can simultaneously be used to measure the user's fine motor ability and control of the hand muscle groups. The method can thus be used comprehensively to assist in evaluating diseases related to abnormal discharge of brain regions (such as epilepsy and spasm), diseases related to muscle control (such as Parkinson's disease, cerebral palsy, and nystagmus), diseases related to attention (such as ADHD and depression), developmental and social disorders (such as ASD autism), or other mental diseases (such as schizophrenia), providing a quantitative reference for early intervention and improving practicability. The invention provides a quantitative reference for the discovery, evaluation, and rehabilitation of diseases related to muscle control, attention, development and social function, and other mental diseases.
The above description covers only preferred embodiments of the present invention and is not intended to limit it; those skilled in the art may make various modifications and changes. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (9)

1. An intelligent drawing scoring method is characterized by comprising the following steps:
the client provides a reference picture for a user to select;
the client displays the reference image selected by the user, and receives and stores the image and the line drawn by the user in the drawing area; meanwhile, the client collects the eye movement information of the user and the facial features of the user through a camera device; sending the images and lines drawn by the user in the drawing area, the eye movement information of the user and the facial features of the user to a processing center; the processing center is arranged on the server or the client;
the processing center receives, from the client, the images and lines drawn by the user in the drawing area, the eye movement information of the user and the facial features of the user; analyzes the images and lines to obtain a drawing score; analyzes the eye movement information of the user to obtain an eye movement tracking score of the user; obtains, from the facial features, the facial feature points and their changes while the user draws, thereby obtaining the user's expression to judge the user's emotional state; performs emotion scoring according to the emotional state; and integrates the drawing score, the eye movement tracking score and the emotion score to obtain a risk assessment result;
the processing center sends the drawing score, the eye movement tracking score, the emotion score and the risk assessment result to the client;
the client receives and displays the drawing score, the eye movement tracking score, the emotion score and the risk assessment result from the processing center.
2. The intelligent drawing scoring method of claim 1, wherein the drawing score comprises: accuracy, continuity, smoothness, and line segment size and proportion;
the accuracy is used for judging whether the user accurately draws the reference graph, according to the degree of similarity between the user's images and lines and the reference graph;
the continuity is used for judging whether the user draws the reference graph continuously, according to whether the lines drawn by the user are continuous and free of breaks;
the smoothness is used for judging whether the user draws the reference graph smoothly, according to whether the lines drawn by the user are smooth and free of obvious trembling;
and the line segment size and proportion are used for judging the accuracy of the user's hand muscle control, according to the thickness of the lines drawn by the user and the overall drawing size.
3. An intelligent drawing scoring method is applied to a client side and is characterized by comprising the following steps:
the client provides a reference picture for a user to select;
the client displays the reference image selected by the user, and receives and stores the image and the line drawn by the user in the drawing area; meanwhile, the client collects the eye movement information of the user and the facial features of the user through a camera device; sending the images and lines drawn by the user in the drawing area, the eye movement information of the user and the facial features of the user to a server;
the client receives and displays the drawing score, the eye movement tracking score, the emotion score and the risk assessment result from the server.
4. The intelligent drawing scoring method according to claim 3, wherein collecting the eye movement information of the user comprises recording the positions on the screen observed by the user's eyes during drawing, to obtain the attention points and gaze tracks of the eyes; the collected facial features of the user comprise the facial feature points and their changes while the user draws;
the client also receives and displays the eye tracking result from the server.
5. An intelligent drawing scoring method is applied to a server and is characterized by comprising the following steps:
the server receives, from the client, the images and lines drawn by the user in the drawing area, the eye movement information of the user and the facial features of the user; analyzes the images and lines to obtain a drawing score; analyzes the eye movement information of the user to obtain an eye movement tracking score of the user; obtains, from the facial features, the facial feature points and their changes while the user draws, thereby obtaining the user's expression to judge the user's emotional state; performs emotion scoring according to the emotional state; and integrates the drawing score, the eye movement tracking score and the emotion score to obtain a risk assessment result;
and the server sends the drawing score, the eye movement tracking score, the emotion score and the risk assessment result to the client.
6. The intelligent drawing scoring method according to claim 5, wherein the drawing score comprises: accuracy, continuity, smoothness, and line segment size and proportion.
7. The intelligent painting scoring method according to claim 6,
the accuracy is used for judging whether the user accurately draws the reference graph, according to the degree of similarity between the user's images and lines and the reference graph;
the continuity is used for judging whether the user draws the reference graph continuously, according to whether the lines drawn by the user are continuous and free of breaks;
the smoothness is used for judging whether the user draws the reference graph smoothly, according to whether the lines drawn by the user are smooth and free of obvious trembling;
and the line segment size and proportion are used for judging the accuracy of the user's hand muscle control, according to the thickness of the lines drawn by the user and the overall drawing size.
8. The intelligent drawing scoring method according to claim 5, wherein the eye movement information comprises the attention points and gaze tracks of the eyes, obtained from the recorded positions on the screen observed by the user's eyes during drawing;
and the server also evaluates the user's degree of fit and the eye movement tracking score according to the attention points and gaze tracks of the eyes.
9. A computer system comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method of any of the preceding claims 1 to 8 are performed when the computer program is executed by the processor.
CN201911406860.4A 2019-12-31 2019-12-31 Intelligent painting scoring method and system Active CN111341444B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911406860.4A CN111341444B (en) 2019-12-31 2019-12-31 Intelligent painting scoring method and system


Publications (2)

Publication Number Publication Date
CN111341444A true CN111341444A (en) 2020-06-26
CN111341444B CN111341444B (en) 2023-05-05

Family

ID=71183520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911406860.4A Active CN111341444B (en) 2019-12-31 2019-12-31 Intelligent painting scoring method and system

Country Status (1)

Country Link
CN (1) CN111341444B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115394395A (en) * 2022-10-27 2022-11-25 安徽星辰智跃科技有限责任公司 Method, system and device for psychology evaluation and intervention based on painting creation
WO2023219489A1 (en) * 2022-05-13 2023-11-16 Toybox Creations And Technology Sdn Bhd Automated tool to assess child development

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106339460A (en) * 2016-08-26 2017-01-18 厦门优莱柏网络科技有限公司 Online drawing processing system and online drawing processing method
CN107783945A (en) * 2017-11-13 2018-03-09 山东师范大学 A kind of search result web page notice assessment method and device based on the dynamic tracking of eye
CN109448848A (en) * 2018-09-26 2019-03-08 长沙师范学院 A kind of infantile psychology state evaluating method based on fuzzy evaluation

Also Published As

Publication number Publication date
CN111341444B (en) 2023-05-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant