CN111627526B - Eye movement and attention feature vector determination method for children ADHD screening and evaluation system - Google Patents


Info

Publication number
CN111627526B
CN111627526B
Authority
CN
China
Prior art keywords
eye
vector
screen
sight line
eye movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010402290.8A
Other languages
Chinese (zh)
Other versions
CN111627526A (en)
Inventor
朱强
张雁翼
孔鸣
洪文琛
赵天琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Zhihai Zhiguang Technology Co ltd
Original Assignee
Huzhou Weizhi Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huzhou Weizhi Information Technology Co ltd filed Critical Huzhou Weizhi Information Technology Co ltd
Priority to CN202010402290.8A priority Critical patent/CN111627526B/en
Publication of CN111627526A publication Critical patent/CN111627526A/en
Application granted granted Critical
Publication of CN111627526B publication Critical patent/CN111627526B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70 ICT specially adapted for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements

Abstract

An eye movement attention feature vector determination method for a children ADHD screening and evaluation system. The input is the collected frontal video V_front and the camera calibration matrix C_r; the output is the generated eye movement attention feature vector F_g. Step 1: pupil position calculation. Step 2: line-of-sight direction calculation, where the line-of-sight direction feature is expressed as a two-dimensional vector g containing a yaw angle and a pitch angle. Step 3: the spatial position of the screen plane is determined by external calibration, and from the line-of-sight direction feature vector g and the eye position e_r the intersection of the line of sight with the screen plane is calculated, i.e. the falling point p_s of the line of sight on the screen; the eye attention region r is then obtained from the screen layout. Step 4: obtain the final eye movement attention feature vector, expressed as F_g = [e_r, g, r]. The invention is convenient to operate and accurate, and can measure eye movement attention without restricting or disturbing the activities of children with ADHD.

Description

Eye movement and attention feature vector determination method for children ADHD screening and evaluation system
Technical Field
The invention relates to a children ADHD screening and evaluation system, and in particular to an eye movement attention feature vector determination method.
Background
Attention deficit hyperactivity disorder (ADHD), commonly known as hyperactivity, is the most common neurodevelopmental disorder of childhood and adolescence; its clinical manifestations include difficulty sustaining attention, excessive activity, impulsivity, emotional instability, and learning difficulties. Existing intelligent recognition technologies for ADHD are based mainly either on pathological signals such as functional magnetic resonance imaging of the brain and electroencephalography, or on observing a patient's behavioral characteristics solely through eye movement or facial expression.
At present there are two main research and technical routes for studying the eye movement characteristics of children with ADHD: one tracks the eye movement and gaze point of the child with a head-mounted eye tracker; the other restrains the position of the child's body and head and then calculates the gaze point with a fixed-parameter formula. The disadvantage of the former is that the child must wear an eye tracker, which is somewhat inconvenient, and the tracker is easily removed or displaced by the child, producing inaccurate results. The disadvantage of the latter is that the child's movement must be restricted, in which case some of the child's behavioral characteristics are difficult to capture, and large movements by the child still introduce errors into the eye movement calculation.
Disclosure of Invention
To overcome the inconvenience and poor accuracy of existing methods for determining the eye movement characteristics of children with ADHD, the invention provides an eye movement attention feature vector determination method for a children ADHD screening and evaluation system that is convenient to operate and accurate, and that measures eye movement attention without restricting or disturbing the activities of children with ADHD.
The technical scheme adopted for solving the technical problems is as follows:
An eye movement attention feature vector determination method for a children ADHD screening and evaluation system, wherein the input is the collected frontal video V_front and the camera calibration matrix C_r, and the output is the generated eye movement attention feature vector F_g, comprising the following steps:
step 1, pupil position calculation: first, the face position is confirmed by an HOG-based algorithm, and facial key points are detected with a continuous conditional neural field model framework, from which the in-plane pupil position e_h is calculated; then, the EPnP algorithm is used to align the detected face with the average standard 3D face model F, and the rotation matrix R_r and translation vector t_r of the head in the camera coordinate system are calculated; the output is the eye spatial position e_r = t_r + e_h;
Step 2, calculating the sight line direction, wherein the sight line direction characteristic is expressed as a two-dimensional vector g comprising a yaw angle and a pitch angle, the yaw angle is an included angle between the sight line direction and a vertical plane, and the pitch angle is an included angle between the sight line direction and a horizontal plane;
step 3, screen position conversion: the spatial position of the screen plane is determined by external calibration, and from the line-of-sight direction feature vector g and the eye position e_r the intersection of the line of sight with the screen plane can be calculated, i.e. the falling point p_s of the line of sight on the screen; the eye attention region r is obtained from the screen layout;
step 4, the pupil position, line-of-sight direction, and screen attention are combined to obtain the final eye movement attention feature vector, expressed as F_g = [e_r, g, r].
Preferably, in step 2, the eye image is normalized: the original image is multiplied by the inverse of the camera projection matrix, C_r^{-1}, to convert the head pose into three-dimensional space; the original head pose is then multiplied by a transformation matrix M to fix the eye position, and finally the normalized eye pose is multiplied by the standard camera projection matrix C_n to obtain a normalized two-dimensional eye image e;
to calculate the line-of-sight vector, the normalized rotation matrix R_n = M R_r is converted into a two-dimensional rotation vector h; the resulting 2D head pose information h and the single-channel grayscale eye image e are input into a convolutional neural network model, and this step outputs the line-of-sight direction feature vector, a two-dimensional vector g containing a yaw angle and a pitch angle.
The beneficial effects of the invention are mainly as follows: no wearable equipment is required, so the child does not feel anything unusual; the child's activities need not be restricted, so behavior is more natural; and measurement accuracy is high, with the angular error controlled within 5 degrees.
Drawings
Fig. 1 is a flow chart of an eye movement attention feature vector determination method for a child ADHD screening assessment system.
Fig. 2 is a schematic diagram of a child ADHD screening assessment system.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
Referring to FIG. 1, an eye movement attention feature vector determination method for a children ADHD screening and evaluation system, wherein the input is the collected frontal video V_front and the camera calibration matrix C_r, and the output is the generated eye movement attention feature vector F_g, comprises the following steps:
step 1, pupil position calculation: first, the face position is confirmed by an HOG-based algorithm, and facial key points are detected with a Continuous Conditional Neural Field (CCNF) model framework, from which the in-plane pupil position e_h is calculated; then, the EPnP algorithm is used to align the detected face with the average standard 3D face model F, and the rotation matrix R_r and translation vector t_r of the head in the camera coordinate system are calculated; the output of this step is the eye spatial position e_r = t_r + e_h;
Step 2, calculating the sight line direction, wherein the sight line direction characteristic can be expressed as a two-dimensional vector g comprising a yaw angle (yaw) and a pitch angle (pitch), the yaw angle is an included angle between the sight line direction and a vertical plane, and the pitch angle is an included angle between the sight line direction and a horizontal plane;
to obtain the line-of-sight vector, the eye image is normalized: the original image is multiplied by the inverse of the camera projection matrix, C_r^{-1}, to convert the head pose into three-dimensional space; the original head pose is then multiplied by a transformation matrix M to fix the eye position, and finally the normalized eye pose is multiplied by the standard camera projection matrix C_n to obtain a normalized two-dimensional eye image e; to calculate the line-of-sight vector, the normalized rotation matrix R_n = M R_r is converted into a two-dimensional rotation vector h, the resulting 2D head pose information h and the single-channel grayscale eye image e are input into a convolutional neural network model, and the line-of-sight direction feature vector is output, a two-dimensional vector g containing a yaw angle and a pitch angle;
step 3, screen position conversion: the spatial position of the screen plane can be determined by external calibration, and from the line-of-sight direction feature vector g and the eye position e_r the intersection of the line of sight with the screen plane can be calculated, i.e. the falling point p_s of the line of sight on the screen; the eye attention region r is obtained from the screen layout;
step 4, the pupil position, line-of-sight direction, and screen attention are combined to obtain the final eye movement attention feature vector, expressed as F_g = [e_r, g, r].
Fig. 2 illustrates the arrangement of the cameras and the subject in the children ADHD screening and evaluation system. Camera No. 1, behind the computer display screen, captures the frontal image of the subject and collects eye movement and expression change information. The binocular depth camera modules Nos. 2 and 3, located to the side of the subject's seat, capture the subject's whole body and collect three-dimensional body posture information. In this embodiment, the camera placed in front of the child's face (camera No. 1) captures the child's eye movement, yielding the video V_front.
In this embodiment, the camera matrix C_r of the camera is calibrated using the plane mirror-display orthogonal constraint method.
The input of the eye movement attention feature vector calculation is the collected frontal video V_front and the camera calibration matrix C_r; the output is the generated eye movement attention feature vector F_g. The calculation comprises three main steps:
Pupil position calculation: the face position is first confirmed by an HOG-based algorithm. Facial key point detection is achieved with a Continuous Conditional Neural Field (CCNF) model framework, from which we can calculate the in-plane pupil position e_h. Then, the EPnP algorithm is used to align the detected face with the average standard 3D face model F, and the rotation matrix R_r and translation vector t_r of the head in the camera coordinate system are calculated; the output of this step is the eye spatial position e_r = t_r + e_h.
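The output of this step can be sketched as a plain vector sum. The head-pose estimation itself (HOG face detection, CCNF landmarks, EPnP alignment) is assumed to come from an upstream library, so `t_r` and `e_h` below are hypothetical inputs in camera coordinates:

```python
def eye_spatial_position(t_r, e_h):
    """Step 1 output: e_r = t_r + e_h.

    t_r: head translation vector in the camera coordinate system
         (from EPnP alignment of the detected face to the 3D model F).
    e_h: pupil position from the facial-landmark detector, lifted
         into the same coordinate system (assumption).
    """
    return [t + e for t, e in zip(t_r, e_h)]

# Example: head 60 cm from the camera, pupil offset 3 cm right and up.
e_r = eye_spatial_position([0.0, 0.0, 0.6], [0.03, 0.03, 0.0])
```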
To obtain the line-of-sight vector, the eye image is normalized: the original image is multiplied by the inverse of the camera projection matrix, C_r^{-1}, to convert the head pose into three-dimensional space; the original head pose is then multiplied by a transformation matrix M to fix the eye position, and finally the normalized eye pose is multiplied by the standard camera projection matrix C_n to obtain a normalized two-dimensional eye image e. To calculate the line-of-sight vector, we convert the normalized rotation matrix R_n = M R_r into a two-dimensional rotation vector h. The 2D head pose information h and the single-channel grayscale eye image e obtained in the previous step are input into a convolutional neural network model, and this step outputs the line-of-sight direction feature vector, a two-dimensional vector g containing a yaw angle and a pitch angle.
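The two angles in g determine a 3D line-of-sight ray. A minimal sketch of the conversion, assuming an MPIIGaze-style convention in which (yaw, pitch) = (0, 0) looks straight along the negative camera z-axis — the patent does not fix a sign convention, so this choice is an assumption:

```python
import math

def gaze_vector(yaw, pitch):
    """Convert g = (yaw, pitch), in radians, to a 3D unit direction.

    Convention (assumed): (0, 0) looks straight into the camera,
    i.e. along -z in the camera coordinate system.
    """
    x = -math.cos(pitch) * math.sin(yaw)
    y = -math.sin(pitch)
    z = -math.cos(pitch) * math.cos(yaw)
    return (x, y, z)
```

The result is always a unit vector, so it can be used directly as the ray direction in the screen-intersection step below.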
Screen position conversion: by means of external calibration, we can determine the spatial position of the plane in which the screen is located according to the line of sight angle g and eye position e r The intersection point of the sight line and the screen plane can be calculated, namely the falling point p of the sight line on the screen s And obtaining an eye attention area r according to the screen structure.
The pupil position, line-of-sight direction, and screen attention are combined to obtain the final eye movement attention feature vector, expressed as F_g = [e_r, g, r].
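The patent does not specify how the screen structure is partitioned into attention regions; a common choice is a uniform grid over the display. A sketch assuming a grid partition and a flat concatenation for F_g (the grid size and the row-major ordering are illustrative assumptions):

```python
def attention_region(p, screen_w, screen_h, cols=3, rows=3):
    """Map an on-screen point p = (x, y) in pixels to a grid-cell
    index r in [0, rows*cols), row-major from the top-left."""
    x = min(max(p[0], 0), screen_w - 1)
    y = min(max(p[1], 0), screen_h - 1)
    col = int(x * cols / screen_w)
    row = int(y * rows / screen_h)
    return row * cols + col

def feature_vector(e_r, g, r):
    """F_g = [e_r, g, r] as one flat list."""
    return list(e_r) + list(g) + [r]

# Gaze at the center of a 1920x1080 display lands in the middle cell.
F_g = feature_vector([0.03, 0.03, 0.6], [0.1, -0.05],
                     attention_region((960, 540), 1920, 1080))
```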
The embodiments described in this specification merely illustrate ways in which the inventive concept may be implemented. The scope of the invention should not be construed as limited to the specific forms set forth in the embodiments; it also covers equivalents that would occur to one skilled in the art based on the inventive concept.

Claims (1)

1. An eye movement attention feature vector determination method for a children ADHD screening and evaluation system, wherein the input is the collected frontal video V_front and the camera calibration matrix C_r, and the output is the generated eye movement attention feature vector F_g, comprising the following steps:
step 1, eye spatial position calculation: first, the face position is confirmed by an HOG-based algorithm, and facial key point detection is achieved with a continuous conditional neural field model framework, from which the in-plane pupil position e_h is calculated; then, the EPnP algorithm is used to align the detected face with the average standard 3D face model F, and the rotation matrix R_r and translation vector t_r of the head in the camera coordinate system are calculated; the output is the eye spatial position e_r = t_r + e_h;
Step 2, calculating the sight line direction, wherein the sight line direction characteristic is expressed as a two-dimensional vector g comprising a yaw angle and a pitch angle, the yaw angle is an included angle between the sight line direction and a vertical plane, and the pitch angle is an included angle between the sight line direction and a horizontal plane;
step 3, screen position conversion: the spatial position of the screen plane is determined by external calibration, and from the line-of-sight direction feature vector g and the eye spatial position e_r the intersection of the line of sight with the screen plane is calculated, i.e. the falling point p_s of the line of sight on the screen; the eye attention region r is obtained from the screen layout;
step 4, the eye spatial position, line-of-sight direction, and screen attention are combined to obtain the final eye movement attention feature vector, expressed as F_g = [e_r, g, r];
in step 2, the eye image is normalized: the original image is multiplied by the inverse of the camera projection matrix, C_r^{-1}, to convert the head pose into three-dimensional space; the original head pose is then multiplied by a transformation matrix M to fix the eye position, and finally the normalized eye pose is multiplied by the standard camera projection matrix C_n to obtain a normalized two-dimensional eye image e;
to calculate the line-of-sight vector, the normalized rotation matrix R_n = M R_r is converted into 2D head pose information h; the resulting 2D head pose information h and the two-dimensional eye image e are input into a convolutional neural network model, which outputs the line-of-sight direction feature vector, a two-dimensional vector g containing a yaw angle and a pitch angle.
CN202010402290.8A 2020-05-13 2020-05-13 Eye movement and attention feature vector determination method for children ADHD screening and evaluation system Active CN111627526B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010402290.8A CN111627526B (en) 2020-05-13 2020-05-13 Eye movement and attention feature vector determination method for children ADHD screening and evaluation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010402290.8A CN111627526B (en) 2020-05-13 2020-05-13 Eye movement and attention feature vector determination method for children ADHD screening and evaluation system

Publications (2)

Publication Number Publication Date
CN111627526A CN111627526A (en) 2020-09-04
CN111627526B true CN111627526B (en) 2023-05-23

Family

ID=72271945

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010402290.8A Active CN111627526B (en) 2020-05-13 2020-05-13 Eye movement and attention feature vector determination method for children ADHD screening and evaluation system

Country Status (1)

Country Link
CN (1) CN111627526B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113158879B (en) * 2021-04-19 2022-06-10 天津大学 Three-dimensional fixation point estimation and three-dimensional eye movement model establishment method based on matching characteristics
CN113705349B (en) * 2021-07-26 2023-06-06 电子科技大学 Attention quantitative analysis method and system based on line-of-sight estimation neural network

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10482333B1 (en) * 2017-01-04 2019-11-19 Affectiva, Inc. Mental state analysis using blink rate within vehicles
CN107861625A (en) * 2017-12-04 2018-03-30 北京易真学思教育科技有限公司 Gaze tracking system and method based on 3d space model
CN109902630B (en) * 2019-03-01 2022-12-13 上海像我信息科技有限公司 Attention judging method, device, system, equipment and storage medium
CN110991324B (en) * 2019-11-29 2023-06-02 中通服咨询设计研究院有限公司 Fatigue driving detection method based on various dynamic characteristics and Internet of things technology

Also Published As

Publication number Publication date
CN111627526A (en) 2020-09-04

Similar Documents

Publication Publication Date Title
US8371693B2 (en) Autism diagnosis support apparatus
CN109008944B (en) Sight line measuring device, ROM, and sight line measuring method
JP6014931B2 (en) Gaze measurement method
CN111627526B (en) Eye movement and attention feature vector determination method for children ADHD screening and evaluation system
TWI520576B (en) Method and system for converting 2d images to 3d images and computer-readable medium
Wang et al. Screening early children with autism spectrum disorder via response-to-name protocol
JP5995408B2 (en) Information processing apparatus, photographing system, information processing method, and program for causing computer to execute information processing
CN111528859A (en) Child ADHD screening and evaluating system based on multi-modal deep learning technology
CN111933275A (en) Depression evaluation system based on eye movement and facial expression
JP2013252301A (en) Device and program for estimating eyeball center position
Auvinet et al. Lower limb movement asymmetry measurement with a depth camera
Lu et al. Neural 3D gaze: 3D pupil localization and gaze tracking based on anatomical eye model and neural refraction correction
CN111134693B (en) Virtual reality technology-based autism child auxiliary detection method, system and terminal
CN110881981A (en) Alzheimer's disease auxiliary detection system based on virtual reality technology
CN117392644A (en) Fatigue detection method and system based on machine vision
JP6652263B2 (en) Mouth region detection device and mouth region detection method
KR102565852B1 (en) Autism spectrum disorder evaluation method based on facial expression analysis
CN113197542B (en) Online self-service vision detection system, mobile terminal and storage medium
KR20100104330A (en) A system and method measuring objective 3d display-induced visual fatigue using 3d oddball paradigm
CN115294018A (en) Neck dystonia identification system based on RGB-D image
WO2022024272A1 (en) Information processing system, data accumulation device, data generation device, information processing method, data accumulation method, data generation method, recording medium, and database
Mishima et al. Analysis methods for facial motion
JP7076154B1 (en) Spinal sequence estimation device, method and program
Wang et al. Facial Landmark based BMI Analysis for Pervasive Health Informatics
Panev et al. Improved multi-camera 3D eye tracking for human-computer interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240312

Address after: Room 617, Building 4, No. 188 Zijinghua North Road, Xihu District, Hangzhou City, Zhejiang Province, 310012

Patentee after: Hangzhou Zhihai Zhiguang Technology Co.,Ltd.

Country or region after: China

Address before: No. 1-143, Building 4, No. 10 Keyuan Road, Wuyang Street, Deqing County, Huzhou City, Zhejiang Province, 313200 (Moganshan National High tech Zone)

Patentee before: Huzhou Weizhi Information Technology Co.,Ltd.

Country or region before: China