CA3031251A1 - Platforms to implement signal detection metrics in adaptive response-deadline procedures - Google Patents
Platforms to implement signal detection metrics in adaptive response-deadline procedures
- Publication number
- CA3031251A1
- Authority
- CA
- Canada
- Prior art keywords
- response
- individual
- task
- interference
- user interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61P—SPECIFIC THERAPEUTIC ACTIVITY OF CHEMICAL COMPOUNDS OR MEDICINAL PREPARATIONS
- A61P25/00—Drugs for disorders of the nervous system
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/389—Electromyography [EMG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4848—Monitoring or testing the effects of treatment, e.g. of medication
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/70—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02405—Determining heart rate variability
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
- A61B5/0531—Measuring skin impedance
- A61B5/0533—Measuring galvanic skin response
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1122—Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1124—Determining motor skills
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
- A61B5/14532—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue for measuring glucose, e.g. by tissue impedance measurement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
- A61B5/14542—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue for measuring blood gases
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/168—Evaluating attention deficit, hyperactivity
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4082—Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4088—Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
Abstract
Example systems, methods, and apparatus, including cognitive platforms, are provided for applying signal detection metrics in computer-implemented adaptive response-deadline procedures to data collected based at least in part on user interaction(s) with computerized tasks and/or interferences. The apparatus can include a response classifier for generating a quantifier of the cognitive abilities of an individual. The apparatus also can be configured to adapt the tasks and/or interferences to enhance the individual's cognitive abilities.
Description
PLATFORMS TO IMPLEMENT SIGNAL DETECTION METRICS IN ADAPTIVE RESPONSE-DEADLINE PROCEDURES
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority benefit of U.S. provisional application no. 62/364,297, entitled "SIGNAL DETECTION METRICS IN ADAPTIVE RESPONSE-DEADLINE PROCEDURES," filed on July 19, 2016, and is a continuation-in-part of U.S. design application no. 29/579,480, entitled "GRAPHICAL USER INTERFACE FOR A DISPLAY SCREEN OR PORTION THEREOF," filed on September 30, 2016, each of which is incorporated herein by reference in its entirety, including drawings.
BACKGROUND OF THE DISCLOSURE
[0002] In the normal course of aging, individuals can experience a certain amount of cognitive decline. This can cause an individual to experience increased difficulty in challenging situations, such as time-limited, attention-demanding conditions. In both older and younger individuals, certain cognitive conditions, diseases, or executive function disorders can result in compromised performance at tasks that require attention, memory, motor function, reaction, executive function, decision-making skills, problem-solving skills, language processing, or comprehension.
SUMMARY OF THE DISCLOSURE
[0003] In view of the foregoing, apparatus, systems and methods are provided for quantifying aspects of cognition (including cognitive abilities). In certain configurations, the apparatus, systems and methods can be implemented for enhancing certain cognitive abilities.
[0004] Example apparatus, systems and methods are configured for applying signal detection metrics in computer-implemented adaptive response-deadline procedures to data collected based at least in part on user interaction(s) with computerized tasks and/or interferences. For example, the apparatus can include a response classifier for generating a quantifier of the cognitive abilities of an individual. As another example, the apparatus also can be configured to adapt the tasks and/or interferences to enhance the individual's cognitive abilities.
[0005] In a general aspect, an apparatus for generating a quantifier of cognitive skills in an individual using a response classifier is provided. The apparatus includes a user interface; a memory to store processor-executable instructions; and a processing unit communicatively coupled to the user interface and the memory, in which upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to render a task with an interference at the user interface, one or more of the task and the interference being time-varying and having a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual; and the user interface being configured to measure data indicative of two or more differing types of responses to the task or to the interference. The processing unit is further configured to receive data indicative of a first response of an individual to the task and a second response of the individual to the interference; analyze the data indicative of the first response and the second response to compute at least one response profile representative of a performance of the individual; determine a decision boundary metric from the response profile, the decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the task or the interference; and execute a response classifier based at least in part on the computed value of the decision boundary metric, to generate a classifier output indicative of the cognitive response capabilities of the individual.
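To make the flow of [0005] concrete, the following is a minimal sketch in Python, assuming the measured responses to the task and the interference have already been tallied as hits, misses, false alarms, and correct rejections. The function names, the use of the standard signal-detection criterion c as the decision boundary metric, and the cutoff values are illustrative assumptions, not elements of the disclosure.

```python
# Minimal sketch (illustrative only): computing standard signal-detection
# quantities -- sensitivity (d') and response criterion (c) -- from tallied
# responses, and using the criterion as a simple "decision boundary metric"
# fed to a threshold-based response classifier. Names and cutoffs are
# hypothetical, not taken from the disclosure.
from statistics import NormalDist

def rates(hits, misses, false_alarms, correct_rejections):
    """Hit and false-alarm rates with a small correction to avoid 0 or 1."""
    def corrected(k, n):
        return (k + 0.5) / (n + 1.0)
    hit_rate = corrected(hits, hits + misses)
    fa_rate = corrected(false_alarms, false_alarms + correct_rejections)
    return hit_rate, fa_rate

def decision_boundary_metric(hits, misses, false_alarms, correct_rejections):
    """Return (d_prime, criterion_c) computed from response counts."""
    hit_rate, fa_rate = rates(hits, misses, false_alarms, correct_rejections)
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)             # perceptual sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias / tendency
    return d_prime, criterion

def classify_response_style(criterion, conservative_cutoff=0.25, impulsive_cutoff=-0.25):
    """Toy response classifier: labels the individual's response tendency."""
    if criterion >= conservative_cutoff:
        return "conservative"
    if criterion <= impulsive_cutoff:
        return "impulsive"
    return "neutral"

# Example: 40 hits, 10 misses, 5 false alarms, 45 correct rejections
d, c = decision_boundary_metric(40, 10, 5, 45)
print(round(d, 2), round(c, 2), classify_response_style(c))
```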
[0006] In another general aspect, a computer-implemented method for generating a quantifier of cognitive skills in an individual using a response classifier is provided. The method includes rendering a task with an interference at a user interface;
measuring data indicative of two or more differing types of responses to the task or to the interference; receiving data indicative of a first response of an individual to the task and a second response of the individual to the interference. The method includes analyzing the data indicative of the first response and the second response to compute at least
one response profile representative of the performance of the individual. The method includes determining a decision boundary metric from the response profile, the decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference. The method includes executing a response classifier based at least in part on the decision boundary metric, to generate a classifier output indicative of the individual's cognitive response capabilities.
[0007] In another general aspect, an apparatus for enhancing cognitive skills in an individual is provided. The apparatus includes a user interface; a memory to store processor-executable instructions; and a processing unit communicatively coupled to the user interface and the memory, in which upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to render a primary task with an interference at the user interface, one or more of the task and the interference being time-varying and having a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual; and the user interface being configured to measure data indicative of two or more differing types of responses to the task or to the interference.
The processing unit is configured to receive data indicative of a first response of an individual to the task and a second response of the individual to the interference; and analyze the data indicative of the first response and the second response to compute at least one response profile representative of a performance of the individual.
The processing unit is configured to determine a first decision boundary metric based at least in part on the at least one response profile, the first decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference. The processing unit is configured to, based at least in part on the computed first decision boundary metric, adjust the task and/or the interference to derive a modification in the computed at least one decision boundary metric such that a further response to the task and/or a further response to the interference is modified as compared to an earlier response to the task and/or an earlier response to the interference, thereby indicating a modification of the cognitive response capabilities of the individual.
[0008] In another general aspect, a computer-implemented method for enhancing cognitive skills in an individual is provided. The method includes rendering a task with an interference at a user interface; measuring data indicative of two or more differing types of responses to the task or to the interference; receiving data indicative of a first response of an individual to the task and a second response of the individual to the interference; and analyzing the data indicative of the first response and the second response to compute at least one response profile representative of the performance of the individual. The method includes determining a first decision boundary metric based at least in part on the at least one response profile, the first decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference. The method includes, based at least in part on the computed first decision boundary metric, adapting the task and/or the interference to derive a modification in the computed first decision boundary metric such that the first response and/or the second response is modified, thereby indicating a modification of the cognitive response capabilities of the individual.
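As a rough illustration of the adaptation step described in [0007] and [0008], the sketch below adjusts the response deadline between blocks of trials based on the most recently computed decision boundary metric. The staircase rule, step size, cutoffs, and limits are hypothetical choices, not values taken from the disclosure.

```python
# Minimal sketch (illustrative, not the disclosed algorithm): a simple
# staircase that adapts the response deadline between trial blocks based on
# the decision boundary metric, tightening the deadline when the individual
# responds conservatively and relaxing it when responses become impulsive.
def adapt_response_deadline(deadline_ms, criterion,
                            step_ms=50, min_ms=300, max_ms=1200):
    """Return an updated response deadline for the next block of trials."""
    if criterion > 0.25:        # overly conservative: make the task harder
        deadline_ms -= step_ms
    elif criterion < -0.25:     # overly impulsive: allow more time
        deadline_ms += step_ms
    return max(min_ms, min(max_ms, deadline_ms))

deadline = 800
for block_criterion in (0.4, 0.3, 0.1, -0.3):   # metrics from successive blocks
    deadline = adapt_response_deadline(deadline, block_criterion)
    print(deadline)   # prints 750, 700, 700, 750
```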
[0009] In another general aspect, an apparatus for enhancing cognitive skills in an individual is provided. The apparatus includes a user interface; a memory to store processor-executable instructions; and a processing unit communicatively coupled to the user interface and the memory, in which upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to receive data indicative of one or more of an amount, concentration, or dose titration of a pharmaceutical agent, drug, or biologic being or to be administered to an individual. The processing unit is configured to render a primary task with an interference at the user interface, one or more of the task and the interference being time-varying and having a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual; and the user interface being configured to measure data indicative of two or more differing types of responses to the task or to the interference. The processing unit is configured to receive data indicative of a first response of an individual to the task and a second response of the individual to the interference, from a first session; analyze the data indicative of the first response and the second response to compute a first response profile representative of a first performance of the individual; and determine a first decision boundary metric based at least in part on the at least one response profile, the first decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference. The processing unit is configured to, based at least in part on the computed first decision boundary metric and the amount or concentration of a pharmaceutical agent, drug, or biologic, adapt the task and/or the interference to generate a second session.
The processing unit is configured to analyze collected data indicative of the first response and the second response from the second session, to compute a second response profile and a second decision boundary metric representative of a second performance of the individual. The processing unit is configured to, based at least in part on the first decision boundary metric and second decision boundary metric, generate an output to the user interface indicative of at least one of: (i) a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (ii) a recommended change in one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, or biologic, (iii) a change in the individual's cognitive response capabilities, (iv) a recommended treatment regimen, (v) a recommendation of at least one of a behavioral therapy, counseling, or physical exercise, or (vi) a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
[0010] In another general aspect, a computer-implemented method for enhancing cognitive skills in an individual is provided. The method includes receiving data indicative of one or more of an amount, concentration, or dose titration of a pharmaceutical agent, drug, or biologic being or to be administered to an individual. The method includes rendering a task with an interference at a user interface;
measuring data indicative of two or more differing types of responses to the task or to the interference; receiving data indicative of a first response of an individual to the task and a second response of the individual to the interference; and analyzing the data indicative of the first response and the second response to compute a first response profile representative of the performance of the individual. The method includes determining a first decision boundary metric based at least in part on the at least one response profile, the first decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference. The method includes, based at least in part on the computed first decision boundary metric and the amount or concentration of a pharmaceutical agent, drug, or biologic, adapting the task and/or the interference such that the at least one response profile is modified. The method includes analyzing the collected data indicative of the first response and the second response to compute a second decision boundary metric representative of a second performance of the individual. The method includes, based at least in part on the first decision boundary metric and second decision boundary metric, generating an output to the user interface indicative of at least one of (i) a change in one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, or biologic, (ii) a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (iii) a change in the individual's cognitive response capabilities, (iv) a recommended treatment regimen, (v) a recommendation of at least one of a behavioral therapy, counseling, or physical exercise, or (vi) a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
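One way the between-session comparison described in [0009] and [0010] could be summarized is sketched below. The change threshold and the report wording are assumptions made for illustration only; any clinical interpretation would rest with the methods of the disclosure rather than this toy function.

```python
# Minimal sketch (illustrative): comparing decision boundary metrics from a
# first and second session to produce a simple summary of the kinds listed
# above. The threshold value and report text are hypothetical assumptions.
def session_report(first_metric, second_metric, change_threshold=0.2):
    """Summarize the between-session change in the decision boundary metric."""
    delta = second_metric - first_metric
    if abs(delta) < change_threshold:
        return {"change": delta, "note": "no meaningful change in response tendency"}
    direction = "more conservative" if delta > 0 else "more impulsive"
    return {
        "change": delta,
        "note": f"response tendency shifted {direction}; "
                "consider reviewing dose titration or treatment regimen",
    }

print(session_report(first_metric=0.10, second_metric=0.45))
```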
[0011] The details of one or more of the above aspects and implementations are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims.
BRIEF DESCRIPTION OF DRAWINGS
[0012] The skilled artisan will understand that the figures described herein are for illustration purposes only. It is to be understood that in some instances various aspects of the described implementations may be shown exaggerated or enlarged to facilitate an understanding of the described implementations. In the drawings, like reference characters generally refer to like features (e.g., functionally similar and/or structurally similar elements) throughout the various drawings. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the teachings. The drawings are not intended to limit the scope of the present teachings in any way. The system and method may be better understood from the following illustrative description with reference to the following drawings in which:
[0013] FIG. 1 shows a block diagram of an example system, according to the principles herein.
[0014] FIG. 2 shows a block diagram of an example computing device, according to the principles herein.
[0015] FIG. 3 shows an example plot of the signal and noise distribution curves computed based on an example cognitive test, according to the principles herein.
[0016] FIG. 4A shows an example graphical depiction of a drift-diffusion model for linear belief accumulation, according to the principles herein.
[0017] FIG. 4B shows an example graphical depiction of a drift-diffusion model for non-linear belief accumulation, according to the principles herein.
[0018] FIG. 5 shows an example plot of the signal (right curve) and noise distribution curves based on an example cognitive platform, according to the principles herein.
[0019] FIG. 6 shows an example plot of the conditional probability of a quantifier of belief given a signal, according to the principles herein.
[0020] FIGs. 7A – 7B show plots of the curves for values of conservative and impulsive measures, according to the principles herein.
[0021] FIGs. 7C – 7D show example plots of the formation of belief for linear belief accumulation and non-linear belief accumulation, respectively, according to the principles herein.
[0022] FIGs. 8A – 8D show example plots of the probability curves for signal distribution and noise distribution at the time points shown in FIGs. 7A – 7D, according to the principles herein.
[0023] FIG. 9 shows an example projected two-dimensional (2D) representation of a three-dimensional (3D) joint distribution, according to the principles herein.
[0024] FIGs. 10A – 10D show example user interfaces with instructions to a user that can be rendered to an example user interface, according to the principles herein.
[0025] FIGs. 11A – 11D show examples of the time-varying features of example objects (targets or non-targets) that can be rendered to an example user interface, according to the principles herein.
[0026] FIGs. 12A – 12T show the dynamics of tasks and interferences that can be rendered at user interfaces, according to the principles herein.
[0027] FIGs. 13A – 13D show examples of the dynamics of multi-tasking involving user interaction with an implementation of a navigation task and with an interference rendered to a user interface of an example user interface, according to the principles herein.
[0028] FIGs. 14A – 14D show examples of the dynamics of an instructions panel rendered to a user interface of an example user interface, according to the principles herein.
[0029] FIGs. 15A – 15V show examples of the dynamics of multi-tasking involving user interaction with an implementation of a navigation task and with an interference.
[0030] FIGs. 16A – 16C show flowcharts of example methods, according to the principles herein.
[0031] FIG. 17 shows the architecture of an example computer system, according to the principles herein.
DETAILED DESCRIPTION
[0032] It should be appreciated that all combinations of the concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein.
It also should be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.
[0033] Following below are more detailed descriptions of various concepts related to, and embodiments of, inventive methods, apparatus and systems comprising a cognitive platform configured for applying signal detection metrics in computer-implemented adaptive response-deadline procedures.
[0034] It should be appreciated that various concepts introduced above and discussed in greater detail below may be implemented in any of numerous ways, as the disclosed concepts are not limited to any particular manner of implementation.
Examples of specific implementations and applications are provided primarily for illustrative purposes.
[0035] As used herein, the term "includes" means includes but is not limited to; the term "including" means including but not limited to. The term "based on" means based at least in part on.
[0036] As used herein, the term "target" refers to a type of stimulus that is specified to an individual (e.g., in instructions) to be the focus for an interaction. A
target differs from a non-target in at least one characteristic or feature. Two targets may differ from each other by at least one characteristic or feature, but overall are still instructed to an individual as a target, in an example where the individual is instructed/required to make a choice (e.g., between two different degrees of a facial expression or other characteristic/feature difference, such as but not limited to between a happy face and a happier face or between an angry face and an angrier face).
[0037] As used herein, the term "non-target" refers to a type of stimulus that is not to be the focus for an interaction, whether indicated explicitly or implicitly to the individual.
[0038] As used herein, the term "task" refers to a goal and/or objective to be accomplished by an individual. The task may require the individual to provide or withhold a response to a particular stimulus. The "task" can be configured as a baseline cognitive function that is being measured.
[0039] As used herein, the term "interference" refers to a stimulus presented to the individual such that it interferes with the individual's performance of a primary task. In any example herein, an interference is a type of task that is presented/rendered in such a manner that it diverts or interferes with an individual's attention in performing another task. In some examples herein, the interference is configured as a secondary task that is presented simultaneously with a primary task, either over a short, discrete time period or over an extended time period (less than the time frame over which the primary task is presented), or over the entire period of time of the primary task. In any example herein, the interference can be presented/rendered continuously, or continually (i.e., repeated in a certain frequency, irregularly, or somewhat randomly). For example, the interference can be presented at the end of the primary task or at discrete, interim periods during presentation of the primary task. The degree of interference can be modulated based on the type, amount, and/or temporal length of presentation of the interference relative to the primary task.
[0040] As used herein, the term "stimulus" refers to a sensory event configured to evoke a specified functional response from an individual. The degree and type of response can be quantified based on the individual's interactions with a measuring component (including using sensor devices or other measuring components). Non-limiting examples of a stimulus include a navigation path (with an individual being instructed to control an avatar or other processor-rendered guide to navigate the path), or a discrete object, whether a target or a non-target, rendered to a user interface (with an individual being instructed to control a computing component to provide input or other indication relative to the discrete object). In any example herein, the task and/or interference includes a stimulus.
[0041] As used herein, a "trial" includes at least one iteration of rendering of a task and/or interference and at least one receiving of the individual's response(s) to the task and/or interference. As non-limiting examples, a trial can include at least a portion of a single-tasking task and/or at least a portion of a multi-tasking task. For example, a trial can be a period of time during a navigation task (including a visuo-motor navigation task) in which the individual's performance is assessed, such as but not limited to, assessing whether, or the degree of success with which, an individual's actions in interacting with the platform result in a guide (including a computerized avatar) navigating along at least a portion of a certain path or in an environment for a time interval (such as but not limited to, fractions of a second, a second, several seconds, or more) and/or cause the guide (including a computerized avatar) to cross (or avoid crossing) performance milestones along the path or in the environment. In another example, a trial can be a period of time during a targeting task in which the individual's performance is assessed, such as but not limited to, assessing whether, or the degree of success with which, an individual's actions in interacting with the platform result in identification/selection of a target versus a non-target (e.g., red object versus yellow object), or in discrimination between two different types of targets (e.g., a happy face versus a happier face). In these examples, the segment of the individual's performance that is designated as a trial for the navigation task does not need to be co-extensive or aligned with the segment of the individual's performance that is designated as a trial for the targeting task.
[0042] As used herein, a "session" refers to at least one trial or can include at least one trial and at least one other type of measurement and/or other user interaction. As a non-limiting example, a session can include at least one trial and one or more of a measurement using a physiological or monitoring component and/or a cognitive testing component. As another non-limiting example, a session can include at least one trial and receipt of data indicative of one or more measures of an individual's condition, including physiological condition and/or cognitive condition.
[0043] In any example herein, an object may be rendered as a depiction of a physical object (including a polygonal or other object), a face (human or non-human), a caricature, or another type of object.
[0044] In any of the examples herein, instructions can be provided to the individual to specify how the individual is expected to perform the task and/or interference in a trial and/or a session. In non-limiting examples, the instructions can inform the individual of the expected performance of a navigation task (e.g., stay on this path, go to these parts of the environment, cross or avoid certain milestone objects in the path or environment), a targeting task (e.g., describe or show the type of object that is the target object versus the non-target object, or the two different types of target object that the individual is expected to choose between (e.g., happy face versus happier face)), and/or describe how the individual's performance is to be scored. In examples, the instructions may be provided visually (e.g., based on a rendered user interface) or via sound. In various examples, the instructions may be provided once prior to the performance of two or more trials or sessions, or repeated each time prior to the performance of a trial or a session, or some combination thereof.
[0045] While some example systems, methods, and apparatus described herein are based on an individual being instructed/required to decide/select between a target versus a non-target, in other example implementations, the example systems, methods, and apparatus can be configured such that the individual is instructed/required to decide/choose between two different types of targets (such as but not limited to between two different degrees of a facial expression or other characteristic/feature difference).
[0046] In addition, while example systems, methods, and apparatus may be described herein relative to an individual, in other example implementations, the example systems, methods, and apparatus can be configured such that two or more individuals, or members of a group (including a clinical population), perform the tasks and/or interferences, either individually or concurrently.
[0047] The instant disclosure is directed to the application of signal detection metrics such as criterion, bias, and sensitivity indices to computer-implemented adaptive time-deadline procedures.
[0048] The example systems, methods, and apparatus according to the principles herein can be implemented, using at least one processing unit of a programmed computing device, to characterize the response profiles of individuals and groups on the spectrum between impulsive (tend to respond with limited information) or conservative (tend to withhold response until maximum information is acquired) in psychophysical computer-implemented adaptive testing procedures.
[0049] As described in greater detail below, the computing device can include an application (an "App") to perform such functionalities as analyzing the data.
For example, the data from the at least one sensor component can be analyzed as described herein by a processor executing the App on an example computing device to provide the computed response profile, decision boundary metric (such as but not limited to response criteria), response classifier, and other metrics and analyses described herein.
[0050] An example system according to the principles herein provides for generating a quantifier of cognitive skills in an individual (using a machine learning response classifier) and/or enhancing cognitive skills in an individual. In an example implementation, the example system employs an App running on a mobile communication device or other hand-held device. Non-limiting examples of such mobile communication or hand-held devices include a smartphone (such as but not limited to an iPhone, a BlackBerry, or an Android-based smartphone), a tablet, a slate, an electronic reader (e-reader), a digital assistant, or other hand-held, portable, or wearable computing device, or any other equivalent device, including an Xbox, a Wii, or other computing system that can be used to render game-like elements. In some example implementations, the example system can include a head-mounted device, such as smart eyeglasses with built-in displays, smart goggles with built-in displays, or a smart helmet with built-in displays, and the user can hold a controller or an input device having one or more sensors, in which the controller or the input device communicates wirelessly with the head-mounted device. In some example implementations, the computing system may be stationary, such as a desktop computing system that includes a main computer and a desktop display (or a projector display), in which the user provides inputs to the App using a keyboard, a computer mouse, a joystick, handheld consoles, wristbands, or other wearable devices having sensors that communicate with the main computer using wired or wireless communication. In examples herein, the sensors can be configured to measure movements of the user's hands, feet, and/or any other part of the body. In some example implementations, the example system can be formed as a virtual reality (VR) system (a simulated environment presented as an immersive, interactive 3-D experience for a user), an augmented reality (AR) system (including a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as but not limited to sound, video, graphics, and/or GPS data), or a mixed reality (MR) system (also referred to as a hybrid reality, which merges the real and virtual worlds to produce new environments and visualizations where physical and digital objects co-exist and interact substantially in real time).
[0051] FIG.
1 shows an example apparatus 100 according to the principles herein that can be used to implement the cognitive platform according to the principles herein.
The example apparatus 100 includes at least one memory 102 and at least one processing unit 104. The at least one processing unit 104 is communicatively coupled to the at least one memory 102.
[0052]
Example memory 102 can include, but is not limited to, hardware memory, non-transitory tangible media, magnetic storage disks, optical disks, flash drives, computational device memory, random access memory, such as but not limited to DRAM, SRAM, EDO RAM, any other type of memory, or combinations thereof.
Example processing unit 104 can include, but is not limited to, a microchip, a processor, a microprocessor, a special purpose processor, an application specific integrated circuit, a microcontroller, a field programmable gate array, any other suitable processor, or combinations thereof.
[0053] The at least one memory 102 is configured to store processor-executable instructions 106 and a computing component 108. In a non-limiting example, the computing component 108 can be used to compute signal detection metrics in computer-implemented adaptive response-deadline procedures. As shown in FIG.
1, the memory 102 also can be used to store data 110, such as but not limited to measurement data 112. In various examples, the measurement data 112 can include physiological measurement data of an individual received from a physiological component (not shown) and/or data indicative of the response of an individual to a task and/or an interference rendered at a user interface of the apparatus 100 (as described in greater detail below), and/or data indicative of one or more of an amount, concentration, or dose titration, or other treatment regimen of a drug, pharmaceutical agent, biologic, or other medication being or to be administered to an individual.
[0054] In a non-limiting example, the at least one processing unit 104 executes the processor-executable instructions 106 stored in the memory 102 at least to compute signal detection metrics in computer-implemented adaptive response-deadline procedures using the computing component 108. The at least one processing unit also executes processor-executable instructions 106 to control a transmission unit to transmit values indicative of the computed signal detection metrics and/or controls the memory 102 to store values indicative of the signal detection metrics.
[0055] In another non-limiting example, the at least one processing unit executes the processor-executable instructions 106 stored in the memory 102 at least to apply signal detection metrics in computer-implemented adaptive response-deadline procedures.
[0056] In any example herein, the user interface may be a graphical user interface.
[0057] In another non-limiting example, the measurement data 112 can be collected from measurements using one or more physiological or monitoring components and/or cognitive testing components. In any example herein, the one or more physiological components are configured for performing physiological measurements. The physiological measurements provide quantitative measurement data of physiological parameters and/or data that can be used for visualization of physiological structure and/or functions.
[0058] In any example herein, the measurement data 112 can include reaction time, response variance, correct hits, omission errors, number of false alarms (such as but not limited to a response to a non-target), learning rate, spatial deviance, subjective ratings, and/or performance threshold, or data from an analysis, including percent accuracy, hits, and/or misses in the latest completed trial or session. Other non-limiting examples of measurement data 112 include response time, task completion time, number of tasks completed in a set amount of time, preparation time for task, accuracy of responses, accuracy of responses under set conditions (e.g., stimulus difficulty or magnitude level and association of multiple stimuli), number of responses a participant can register in a set time limit, number of responses a participant can make with no time limit, number of attempts at a task needed to complete a task, movement stability, accelerometer and gyroscope data, and/or self-rating.
[0059] For a target discrimination task, the cognitive platform may require a temporally-specific and/or a position-specific response from an individual, including to select between a target and a non-target (e.g., in a GO/NO-GO task) or to select between two differing types of targets, e.g., in a two-alternative forced choice (2AFC) task (including choosing between two differing degrees of a facial expression or other characteristic/feature difference). For a navigation task, the cognitive platform may require a position-specific and/or a motion-specific response from the user.
For a facial expression recognition or object recognition task, the cognitive platform may require temporally-specific and/or position-specific responses from the user. In non-limiting examples, the user response to tasks, such as but not limited to targeting and/or navigation and/or facial expression recognition or object recognition task(s), can be recorded using an input device of the cognitive platform. Non-limiting examples of such input devices can include a device for capturing a touch, swipe or other gesture relative to a user interface, an audio capture device (e.g., a microphone input), or an image capture device (such as but not limited to a touch-screen or other pressure-sensitive or touch-sensitive surface, or a camera), including any form of graphical user interface configured for recording a user interaction. In other non-limiting examples, the user response recorded using the cognitive platform for tasks, such as but not limited to targeting and/or navigation and/or facial expression recognition or object recognition task(s), can include user actions that cause changes in a position, orientation, or movement of a computing device including the cognitive platform. Such changes in a position, orientation, or movement of a computing device can be recorded using an input device disposed in or otherwise coupled to the computing device, such as but not limited to a sensor. Non-limiting examples of sensors include a motion sensor, position sensor, and/or an image capture device (such as but not limited to a camera).
For a facial expression recognition or object recognition task, the cognitive platform may require temporally-specific and/or position-specific responses from the user. In non-limiting examples, the user response to tasks, such as but not limited to targeting and/or navigation and/or facial expression recognition or object recognition task(s), can be recorded using an input device of the cognitive platform. Non-limiting examples of such input devices can include a device for capturing a touch, swipe or other gesture relative to a user interface, an audio capture device (e.g., a microphone input), or an image capture device (such as but not limited to a touch-screen or other pressure-sensitive or touch-sensitive surface, or a camera), including any form of graphical user interface configured for recording a user interaction. In other non-limiting examples, the user response recorded using the cognitive platform for tasks, such as but not limited to targeting and/or navigation and/or facial expression recognition or object recognition task(s), can include user actions that cause changes in a position, orientation, or movement of a computing device including the cognitive platform. Such changes in a position, orientation, or movement of a computing device can be recorded using an input device disposed in or otherwise coupled to the computing device, such as but not limited to a sensor. Non-limiting examples of sensors include a motion sensor, position sensor, and/or an image capture device (such as but not limited to a camera).
[0060] In any example herein, the multi-tasking tasks can include any combination of two or more tasks. The multi-task interactive elements of an implementation include interactive mechanics configured to engage the individual in multiple temporally-overlapping tasks, i.e., tasks that may require multiple, substantially simultaneous responses from an individual. In non-limiting examples herein, in an individual's performance of at least a portion of a multi-tasking task, the system, method, and apparatus are configured to measure data indicative of the individual's multiple responses in real-time, and also to measure a first response from the individual to a task (as a primary task) substantially simultaneously with measuring a second response from the individual to an interference (as a secondary task).
[0061] In an example implementation involving multi-tasking tasks, the computer device is configured (such as using at least one specially-programmed processing unit) to cause the cognitive platform to present to a user two or more different types of tasks, such as but not limited to, target discrimination and/or navigation and/or facial expression recognition or object recognition tasks, during a short time frame (including in real-time and/or substantially simultaneously). The computer device is also configured (such as using at least one specially-programmed processing unit) to collect data indicative of the type of user response received for the multi-tasking tasks, within the short time frame (including in real-time and/or substantially simultaneously). In these examples, the two or more different types of tasks can be presented to the individual within the short time frame (including in real-time and/or substantially simultaneously), and the computing device can be configured to receive data indicative of the user response(s) relative to the two or more different types of tasks within the short time frame (including in real-time and/or substantially simultaneously).
[0062] In some examples, the short time frame can be of any time interval at a resolution of up to about 1.0 millisecond or greater. The time intervals can be, but are not limited to, durations of time of any division of a periodicity of about 2.0 milliseconds or greater, up to any reasonable end time. The time intervals can be, but are not limited to, about 3.0 milliseconds, about 5.0 milliseconds, about 10 milliseconds, about milliseconds, about 40 milliseconds, about 50 milliseconds, about 60 milliseconds, about 70 milliseconds, about 100 milliseconds, or greater. In other examples, the short time frame can be, but is not limited to, fractions of a second, about a second, between about 1.0 and about 2.0 seconds, or up to about 2.0 seconds, or more.
[0063] In any example herein, the cognitive platform can be configured to collect data indicative of a reaction time of a user's response relative to the time of presentation of the tasks (including an interference with a task). For example, the computing device can be configured to cause the platform product or cognitive platform to provide a smaller or larger reaction time window for a user to provide a response to the tasks, as an example way of adjusting the difficulty level.
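As a minimal, non-limiting illustration of this kind of adjustment, the following Python sketch tightens or relaxes a response-deadline window based on the individual's recent hit rate; the function name, the 80% target rate, and the step and bounds are illustrative assumptions of this sketch rather than parameters disclosed herein.

```python
# A minimal sketch (not the platform's implementation) of adjusting difficulty
# by tightening or relaxing the response-deadline window based on the
# individual's recent hit rate. The target rate, step, and bounds are
# illustrative assumptions.

def adjust_response_window(current_window_ms, recent_hit_rate,
                           target_rate=0.8, step_ms=50,
                           min_ms=250, max_ms=2000):
    """Return a new response-deadline window in milliseconds."""
    if recent_hit_rate > target_rate:
        # The individual is beating the deadline often: tighten it (harder).
        new_window = current_window_ms - step_ms
    else:
        # The individual is missing the deadline often: relax it (easier).
        new_window = current_window_ms + step_ms
    return max(min_ms, min(max_ms, new_window))


# Example: a user hitting 90% of recent targets gets a tighter 450 ms window.
print(adjust_response_window(500, 0.9))  # -> 450
```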
[0064] In any example herein, the one or more physiological components can include any means of measuring physical characteristics of the body and nervous system, including electrical activity, heart rate, blood flow, and oxygenation levels, to provide the measurement data 112. This can include camera-based heart rate detection, measurement of galvanic skin response, blood pressure measurement, electroencephalogram, electrocardiogram, magnetic resonance imaging, near-infrared spectroscopy, and/or pupil dilation measures, to provide the measurement data 112.
The one or more physiological components can include one or more sensors for measuring parameter values of the physical characteristics of the body and nervous system, and one or more signal processors for processing signals detected by the one or more sensors.
[0065] Other examples of physiological measurements to provide measurement data 112 include, but are not limited to, the measurement of body temperature, heart or other cardiac-related functioning using an electrocardiograph (ECG), electrical activity using an electroencephalogram (EEG), event-related potentials (ERPs), functional magnetic resonance imaging (fMRI), blood pressure, electrical potential at a portion of the skin, galvanic skin response (GSR), magneto-encephalogram (MEG), eye-tracking device or other optical detection device including processing units programmed to determine degree of pupillary dilation, functional near-infrared spectroscopy (fNIRS), and/or a positron emission tomography (PET) scanner. An EEG-fMRI or MEG-fMRI
measurement allows for simultaneous acquisition of electrophysiology (EEG/MEG) data and hemodynamic (fMRI) data.
[0066] The example apparatus of FIG. 1 can be configured as a computing device for performing any of the example methods described herein. The computing device can include an App for performing some of the functionality of the example methods described herein.
[0067] FIG. 2 shows another example apparatus according to the principles herein, configured as a computing device 200 that can be used to implement the cognitive platform according to the principles herein. The example computing device 200 can include a communication module 210 and an analysis engine 212. The communication module 210 can be implemented to receive data indicative of a response of an individual to the task and/or a response of the individual to the interference. The analysis engine 212 can be implemented to analyze the data to generate a response profile, decision boundary metric (such as but not limited to response criteria), a response classifier, and/or other metrics and analyses described herein. As shown in the example of FIG. 2, the computing device 200 can include processor-executable instructions such that a processor unit can execute an application (an App) 214 that a user can implement to initiate the analysis engine 212. In an example, the processor-executable instructions can include software, firmware, or other instructions.
[0068] The example communication module 210 can be configured to implement any wired and/or wireless communication interface by which information may be exchanged between the computing device 200 and another computing device or computing system.
Non-limiting examples of wired communication interfaces include, but are not limited to, USB ports, RS232 connectors, RJ45 connectors, and Ethernet connectors, and any appropriate circuitry associated therewith. Non-limiting examples of wireless communication interfaces may include, but are not limited to, interfaces implementing Bluetooth technology, Wi-Fi, Wi-Max, IEEE 802.11 technology, radio frequency (RF) communications, Infrared Data Association (IrDA) compatible protocols, Local Area Networks (LAN), Wide Area Networks (WAN), and Shared Wireless Access Protocol (SWAP).
[0069] In an example implementation, the example computing device 200 includes at least one other component that is configured to transmit a signal from the apparatus to a second computing device. For example, the at least one component can include a transmitter or a transceiver configured to transmit a signal including data indicative of a measurement by at least one sensor component to the second computing device.
[0070] In any example herein, the App 214 on the computing device 200 can include processor-executable instructions such that a processor unit of the computing device implements an analysis engine to analyze data indicative of the individual's response to the rendered tasks and/or interference to provide a response profile, decision boundary metric (such as but not limited to response criteria), a response classifier, and other metrics and analyses described herein. In some examples, the App 214 can include processor-executable instructions to provide one or more of: (i) a classifier output indicative of the cognitive response capabilities of the individual, (ii) a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (iii) a change in one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, or biologic, and (iv) a change in the individual's cognitive response capabilities, a recommended treatment regimen, or a recommendation or determination of a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
[0071] In any example herein, the App 214 can be configured to receive measurement data including physiological measurement data of an individual received from a physiological component, and/or data indicative of the response of an individual to a task and/or an interference rendered at a user interface of the apparatus 100 (as described in greater detail below), and/or data indicative of one or more of an amount, concentration, or dose titration, or other treatment regimen of a drug, pharmaceutical agent, biologic, or other medication being or to be administered to an individual.
[0072] Non-limiting examples of the computing device include a smartphone, a tablet, a slate, an e-reader, a digital assistant, or any other equivalent device, including any of the mobile communication devices described hereinabove. As an example, the computing device can include a processor unit that is configured to execute an application that includes an analysis module for analyzing the data indicative of the individual's response to the rendered tasks and/or interference.
[0073] The example systems, methods, and apparatus can be implemented as a component in a product comprising a computing device that uses computer-implemented adaptive psychophysical procedures to assess human performance or delivers psychological/perceptual therapy.
[0074] A non-limiting example characteristic of a type of decision boundary metric that can be computed based on the response profile is the response criterion (a time-point measure), calculated using the standard procedure to calculate response criterion for a signal detection psychophysics assessment. See, e.g., Macmillan and Creelman (2004), "Signal Detection: A User's Guide," 2nd edition, Lawrence Erlbaum USA.
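As a non-limiting illustration, the standard textbook computation referenced above can be sketched in Python as follows; the log-linear correction and the variable names are assumptions of this sketch rather than requirements of the platform.

```python
# Sketch of the standard signal detection computation referenced above:
# sensitivity (d') and response criterion (c) from hit and false-alarm counts.
from statistics import NormalDist

def sdt_metrics(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF (Z units)

    # Log-linear correction keeps rates of exactly 0 or 1 finite in Z units.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)

    d_prime = z(hit_rate) - z(fa_rate)             # sensitivity index
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response criterion c
    return d_prime, criterion

# Negative criterion values indicate a tendency to respond "yes" (a more
# impulsive strategy); positive values indicate a more conservative strategy;
# values near zero indicate a balanced strategy.
d_prime, c = sdt_metrics(hits=40, misses=10, false_alarms=15, correct_rejections=35)
```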
[0075] In other non-limiting examples, the decision boundary metric may be more than a single quantitative measure, being instead a curve defined by quantitative parameters from which decision boundary metrics can be computed, such as but not limited to an area to one side or the other of the response profile curve. Other non-limiting example types of decision boundary metrics that can be computed to characterize the decision boundary curves for evaluating the time-varying characteristics of the decision process include a distance between the initial bias point (the starting point of the belief accumulation trajectory) and the criterion, a distance to the decision boundary, a "waiting cost" (e.g., the distance between the initial decision boundary and the maximum decision boundary, or the total area of the curve to that point), or the area between the decision boundary and the criterion line (including the area normalized to the response deadline to yield a measure of an "average decision boundary" or an "average criterion").
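The following is a hypothetical sketch of how some of these curve-based metrics could be computed from a decision boundary sampled at discrete time points over a trial; the sampling, the function names, and the particular readings of "waiting cost" and "average criterion" are illustrative assumptions.

```python
# Hypothetical sketch of some curve-based decision boundary metrics described
# above, computed from a decision boundary sampled over a trial.
import numpy as np

def boundary_metrics(times, boundary, criterion, initial_bias, response_deadline):
    times = np.asarray(times, dtype=float)
    boundary = np.asarray(boundary, dtype=float)

    bias_to_criterion = abs(initial_bias - criterion)  # initial bias point vs. criterion
    waiting_cost = abs(boundary.max() - boundary[0])   # initial vs. maximum boundary
    area_above_criterion = np.trapz(np.clip(boundary - criterion, 0.0, None), times)
    average_criterion = area_above_criterion / response_deadline  # area normalized to deadline
    return {
        "bias_to_criterion": bias_to_criterion,
        "waiting_cost": waiting_cost,
        "area_above_criterion": area_above_criterion,
        "average_criterion": average_criterion,
    }

# Example with an arbitrary boundary that collapses toward the criterion.
t = np.linspace(0.0, 1.0, 101)
b = 2.0 - 1.5 * t
metrics = boundary_metrics(t, b, criterion=0.5, initial_bias=0.0, response_deadline=1.0)
```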
[0076] While examples herein may be described based on computation of a response criterion, other types of decision boundary metrics are applicable.
[0077] FIG. 3 shows an example plot of the signal (right curve 302) and noise (left curve 304) distributions of an individual or group psychophysical data, and the computed response criterion, based on data collected from individuals that performed a Test of Variables of Attention (TOVA®) test (The TOVA Company, Los Alamitos, CA).
The TOVA test is an example of a computerized test that can be used by a healthcare professional as an aid in an assessment of an individual's attention deficits and impulsivity, including attention-deficit/hyperactivity disorder (ADHD).
[0078] In FIG. 3, the vertical line represents the response criterion 300.
The intercept of the criterion line on the X axis (in Z units) can be used to provide an indication of the tendency of an individual to respond 'yes' (further right) or 'no' (further left) from a point of zero bias (p). As indicated in FIG. 3, p is located on the x-axis where the signal distribution (right curve 302) and the noise distribution (left curve 304) intersect. Response criterion intercepts left of p may indicate an individual's overall tendency to a more impulsive strategy, and intercepts right of p may indicate an individual's overall tendency to a more conservative strategy. Response criterion intercepts at p indicate a balanced strategy.
[0079] The example systems, methods, and apparatus can be configured to implement a further extension of signal detection theory to a time-limited task (as described in greater detail below). The example systems, methods, and apparatus can be configured to extend accumulation of belief information, modeled using a computational model of human decision-making (such as but not limited to a drift-diffusion model (DDM) and/or a Bayesian model), and decision boundaries that reflect different strategies.
[0080] Following is a description of a non-limiting example use of a computational model of human decision-making (based on a drift diffusion model). While the drift diffusion model is used as the example, other types of models apply, including a Bayesian model. The drift-diffusion model (DDM) can be applied for systems with two-choice decision making. See, e.g., Ratcliff, R. (1978), "A theory of memory retrieval."
Psychological Review, 85, 59-108; Ratcliff, R., & Tuerlinckx, F. (2002), "Estimating parameters of the diffusion model: Approaches to dealing with contaminant reaction times and parameter variability," Psychonomic Bulletin & Review, 9, 438-481.
The diffusion model is based on an assumption that binary decision processes are driven by systematic and random influences.
[0081] FIG. 4A shows an example plot of the diffusion model with a stimulus that results in a linear drift rate, showing example paths of the accumulation of belief from a stimulus. It shows the distributions of drift rates across trials for targets (signal) and non-targets (noise). The vertical line is the response criterion. The drift rate on each trial is determined by the distance between the drift criterion and a sample from the drift distribution. The process starts at point x, and moves over time until it reaches the lower threshold at "A" or the upper threshold at "B". The DDM assumes that an individual is accumulating evidence for one or other of the alternative thresholds at each time step, and integrating that evidence to develop a belief, until a decision threshold is reached. Depending on which threshold is reached, different responses (i.e., Response A or Response B) are initiated by the individual. In a psychological application, this means that the decision process is finished and the response system is being activated, in which the individual initiates the corresponding response. As described in non-limiting examples below, this can require a physical action of the individual to actuate a component of the system or apparatus to provide the response (such as but not limited to tapping on the user interface in response to a target). The systematic influences are called the drift rate, and they drive the process in a given direction. The random influences add an erratic fluctuation to the constant path. With a given set of parameters, the model predicts distributions of process durations (i.e., response times) for the two possible outcomes of the process.
[0082] FIG. 4A also shows an example drift-diffusion path of the process, illustrating that the path is not straight but rather oscillates between the two boundaries, due to random influences. In a situation in which individuals are required to categorize stimuli, the process describes the ratio of information gathered over time that leads an individual to favor one or the other of the two possible stimulus interpretations. Once a belief point of sufficient clarity is reached, the individual initiates a response. In the example of FIG. 4A, processes reaching the upper threshold are indicative of a positive drift rate. In some trials, the random influences can outweigh the drift, and the process terminates at the lower threshold.
[0083] Example parameters of the drift diffusion model include quantifiers of the thresholds ("A" or "B"), the starting point (x), the drift rate, and a response time constant (to). The DDM can provide a measure of conservatism, an indication that the process takes more time to reach one threshold and that it will reach the other threshold (opposite to the drift) less frequently. The starting point (x) provides an indicator of bias (reflecting differences in the amount of information that is required before the alternative responses are initiated). If x is closer to "A", an individual requires a smaller (relative) amount of information to develop a belief to execute Response A, as compared with a larger (relative) amount of information that the individual would need to execute Response B. The smaller the distance between the starting point (x) and a threshold, the shorter the process durations would be for the individual to execute the corresponding response. A positive value of drift rate (v) serves as a measure of the mean rate of approach to the upper threshold ("A"). The drift rate indicates the relative amount of information per time unit that the individual absorbs from a stimulus to develop a belief in order to initiate and execute a response. In an example, comparison of the drift rates computed from data of one individual to data from another can provide a measure of relative perceptual sensitivity of the individuals.
In another example, comparison of the drift rates can provide a relative measure of task difficulty.
For computation of the response time, the DDM allows for estimating their total duration, and the response time constant (to) indicates the duration of extra-decisional processes.
The DDM has been shown to describe accuracy and reaction times in human data for such tasks. In the non-limiting example of FIG. 4A, the total response time is computed as a sum of the magnitude of time for stimulus encoding (ts), the time the individual takes for the decision, and the time for response execution.
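A minimal simulation of this two-boundary diffusion process is sketched below using generic parameter names, a Gaussian noise term, and arbitrary default values; it is a generic textbook sketch of the drift-diffusion process, not the disclosed implementation.

```python
# Minimal simulation of the two-boundary diffusion process described above:
# belief starts at x and drifts with Gaussian noise until it crosses the
# lower or the upper threshold. Parameter values are arbitrary.
import random

def simulate_ddm_trial(x=0.0, lower=-1.0, upper=1.0, drift=0.3, noise_sd=1.0,
                       dt=0.001, non_decision_time=0.3, max_time=5.0):
    """Return (response, response_time) for one simulated trial."""
    belief, t = x, 0.0
    while t < max_time:
        belief += drift * dt + random.gauss(0.0, noise_sd) * (dt ** 0.5)
        t += dt
        if belief >= upper:
            return "upper", t + non_decision_time
        if belief <= lower:
            return "lower", t + non_decision_time
    return None, max_time + non_decision_time  # no boundary reached in time

# With a positive drift rate, most trials terminate at the upper boundary.
print(simulate_ddm_trial())
```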
[0084] As compared to the traditional drift diffusion model that is based on stimuli that result in linear drift rates, the example systems, methods, and apparatus according to the principles herein are configured to render stimuli that result in non-linear drift rates, which stimuli are based on tasks and/or interferences that are time-varying and have specified response deadlines. As a result, the example systems, methods, and apparatus according to the principles herein are configured to apply a modified diffusion model (modified DDM) based on these stimuli that result in non-linear drift rates.
[0085] FIG. 4B shows an example plot of a non-linear drift rate in a drift diffusion computation. Example parameters of the modified DDM also include quantifiers of the thresholds ("A" or "B"), the starting point (x), the drift rate, and a response time constant (to). Based on data collected from user interaction with the example systems, methods, and apparatus herein, the systems, methods, and apparatus are configured to apply the modified DDM with the non-linear drift rates to provide a measure of the conservatism or impulsivity of the strategy employed in the user interaction with the example platforms herein. The example systems, methods, and apparatus are configured to compute a measure of the conservatism or impulsivity of the strategy used by an individual based on the modified DDM model, to provide an indication of the time the process takes for a given individual to reach one threshold and as compared to reaching the other threshold (opposite to the drift). The starting point (x) in FIG. 4B also provides an indicator of bias (reflecting differences in the amount of information that is required before the alternative responses are initiated). For computation of the response time, the DDM allows for estimating their total duration, and the response time constant (to) indicates the duration of extra-decisional processes.
[0086] In the example systems, methods, and apparatus according to the principles herein, the non-linear drift rate results from the time-varying nature of the stimuli, including (i) the time-varying feature of portions of the task and/or interference rendered to the user interface for user response (as a result of which the amount of information available for an individual to develop a belief is presented in a temporally non-linear manner), and (ii) the time limit of the response deadlines of the task and/or interference, which can influence an individual's sense of timing to develop a belief in order to initiate a response. In this example as well, a positive value of drift rate (v) serves as a measure of the mean rate of approach to the upper threshold ("A"). The non-linear drift rate indicates the relative amount of information per time unit that the individual absorbs to develop a belief in order to initiate and execute a response. In an example, comparison of the drift rate computed from response data collected from one individual to the drift rate computed from response data collected from another individual can be used to provide a measure of relative perceptual sensitivity of the individuals. In another example, comparison of the drift rate computed from response data collected from a given individual from two or more different interaction sessions can be used to provide a relative measure of task difficulty. For computation of the response time of the individual's responses, the modified DDM also allows for estimating the total duration of the response time, and the response time constant (to) indicates the duration of extra-decisional processes. In the non-limiting example of FIG. 4A, the total response time is computed as a sum of the magnitude of time for stimulus encoding (ts), the time the individual takes for the decision, and the time for response execution.
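For comparison, a sketch of the modified variant could let the drift rate vary as a function of time within the trial and impose an explicit response deadline; the particular drift schedule below is an arbitrary illustration, not the disclosed model.

```python
# Sketch of the modified (non-linear drift) variant discussed above: the drift
# rate varies over the trial, reflecting a time-varying stimulus, and the trial
# ends at a fixed response deadline. The drift schedule is an assumption.
import math
import random

def simulate_modified_ddm_trial(x=0.0, lower=-1.0, upper=1.0,
                                drift_fn=lambda t: 0.6 * (1.0 - math.exp(-3.0 * t)),
                                noise_sd=1.0, dt=0.001,
                                non_decision_time=0.3, response_deadline=2.0):
    """Return (response, response_time); response is None if the deadline passes."""
    belief, t = x, 0.0
    while t < response_deadline:
        belief += drift_fn(t) * dt + random.gauss(0.0, noise_sd) * (dt ** 0.5)
        t += dt
        if belief >= upper:
            return "upper", t + non_decision_time
        if belief <= lower:
            return "lower", t + non_decision_time
    return None, response_deadline + non_decision_time

print(simulate_modified_ddm_trial())
```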
[0087] For the modified DDM, the distance between the thresholds (i.e., between "A" and "B") provides a measure of conservatism: the larger the separation, the more information is collected prior to an individual executing a response. The starting point (x) also provides an estimate of relative conservatism: if the process starts above or below the midpoint between the two thresholds, different amounts of information are required for the two responses; that is, a more conservative decision criterion is applied for one response, and a more liberal (i.e., impulsive) criterion for the opposite response.
The drift rate (v) indicates the (relative) amount of information gathered per time, denoting either perceptual sensitivity or task difficulty.
and "B") provides a measure of conservatism¨that is, the larger the separation, the more information is collected prior to an individual executing a response. The starting point (x) also provides an estimate of relative conservatism: if the process starts above or below the midpoint between the two thresholds, different amounts of information are required for both responses; that is, a more conservative decision criterion is applied for one response, and a more liberal criterion (i.e., impulsive) for the opposite response.
The drift rate (v) indicates the (relative) amount of information gathered per time, denoting either perceptual sensitivity or task difficulty.
[0088] FIG. 5 shows an example plot of the signal (right curve 502) and noise (left curve 504) distributions of an individual or group psychophysical data, and the computed response criterion 500, based on data collected from an individual's responses to the tasks and/or interference rendered at a user interface of a computing device according to the principles herein (as described in greater detail hereinbelow). The intercept of the criterion line on the X axis (in Z units) can be used to provide an indication of the tendency of an individual to respond 'yes' (further right) or 'no' (further left). The response criterion 500 is left of the zero-bias decision point (p), which is where the signal and noise distributions intersect. In the non-limiting example of FIG. 5, p is the location of the zero-bias decision on the decision axis in Z-units; response criterion values to the left of p indicate an impulsive strategy and response criterion values to the right of p indicate a conservative strategy, with intercepts at the zero-bias point indicating a balanced strategy.
[0089] The example systems, methods, and apparatus according to the principles herein can be configured to compute a response criterion based on the detection or classification task(s) described herein that are composed of signal and non-signal response targets (as stimuli), in which a user indicates a response that indicates a feature, or multiple features, are present in a series of sequential presentations of stimuli or simultaneous presentation of stimuli.
[0090] The data indicative of the results of the classification of an individual according to the principles herein (including a classifier output) can be transmitted (with the pertinent consent) as a signal to one or more of a medical device, healthcare computing system, or other device, and/or to a medical practitioner, a health practitioner, a physical therapist, a behavioral therapist, a sports medicine practitioner, a pharmacist, or other practitioner, to allow formulation of a course of treatment for the individual or to modify an existing course of treatment, including to determine a change in one or more of an amount, concentration, or dose titration of a drug, biologic or other pharmaceutical agent being or to be administered to the individual and/or to determine an optimal type or combination of drug, biologic or other pharmaceutical agent to be administered to the individual.
[0091] The example systems, methods, and apparatus herein provide computerized classifiers, treatment tools, and other tools that can be used by a medical, behavioral, healthcare, or other professional as an aid in an assessment and/or enhancement of an individual's attention, working memory, and goal management. In an example implementation, the example systems, methods, and apparatus herein apply the modified DDM to the collected data to provide measures of conservatism or impulsivity.
The example analysis performed using the example systems, methods, and apparatus according to the principles herein can be used to provide measures of attention deficits and impulsivity (including ADHD). The example systems, methods, and apparatus herein provide computerized classifiers, treatment tools, and other tools that can be used as aids in assessment and/or enhancement in other cognitive domains, such as but not limited to attention, memory, motor, reaction, executive function, decision-making, problem-solving, language processing, and comprehension. In some examples, the systems, methods, and apparatus can be used to compute measures for use for cognitive monitoring and/or disease monitoring. In some examples, the systems, methods, and apparatus can be used to compute measures for use for cognitive monitoring and/or disease monitoring during treatment of one or more cognitive conditions and/or diseases and/or executive function disorders.
[0092] FIG. 6 shows an example plot of the conditional probability of a quantifier of belief given a signal (P(Belief | Signal)) along the z-axis, with time along the x-axis and the quantifier of belief along the y-axis. The curve labeled Valid Target and the curve labeled Invalid Target (each lying in the x-y plane) indicate data values quantifying belief trajectories of accumulated (noisy) information over time for a user to develop a strong belief one way or another as to the appropriate response. The four curves labeled Signal and the four curves labeled Noise each have a magnitude in the z-direction and are data values of the "signal" distribution and "noise" distribution at different points in time.
Each signal curve is paired with a noise curve, and each pair is time-displaced along the x-axis (at times t = t0, t1, t2, t3). As shown in FIG. 6, each signal-noise curve pair spreads out (i.e., becomes a wider curve as time increases from t0 to t3) to represent the probability of a given degree of belief at a given point in time given a type of signal.
In this time-evolving model, the decision is made when the belief trajectory crosses a decision boundary. FIG. 6 also shows example curves that serve as projected decision boundaries for response data values indicative of an impulsive strategy (narrower curve in the x-y plane) and response data values indicative of a conservative strategy (wider curve in the x-y plane). As described herein, the impulsive strategy requires much less extreme belief (i.e., less extreme values of the quantifier of belief) in order to arrive at a decision. As also described herein, the conservative strategy requires much more extreme belief (i.e., more extreme values of the quantifier of belief) in order to arrive at a decision. As the perceived response deadline approaches, these decision boundaries converge on the criterion value described in signal detection theory.
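To make the time-evolving model concrete, the following is a minimal simulation sketch (not the platform's actual modeling code): a belief value accumulates noisy evidence at each time step, and a decision is registered when the trajectory crosses one of two decision boundaries that collapse toward the criterion as the response deadline approaches. The drift, noise, boundary, and deadline values are illustrative assumptions.

```python
import random

def simulate_trial(drift=0.02, noise=0.1, deadline=200, criterion=0.0,
                   start_bound=1.0, seed=None):
    """Noisy belief accumulation with decision boundaries that collapse
    toward the criterion as the response deadline approaches."""
    rng = random.Random(seed)
    belief = 0.0
    for t in range(1, deadline + 1):
        belief += drift + rng.gauss(0.0, noise)          # accumulate noisy evidence
        half_width = start_bound * (1.0 - t / deadline)  # boundaries narrow over time
        if belief >= criterion + half_width:
            return "Response A", t                       # upper boundary crossed
        if belief <= criterion - half_width:
            return "Response B", t                       # lower boundary crossed
    return "no response", deadline                       # deadline reached without a decision

print(simulate_trial(seed=1))
```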
[0093] An example system, method, or apparatus according to the principles herein can be applied to data values such as those indicated in FIG. 6 to compute a classifier that is applied to data indicative of a user's responses to the tasks and/or interference rendered at a user interface, in order to determine a measure of whether an individual is employing a more conservative strategy or a more impulsive strategy.
[0094] Such an example model, e.g., as described in connection with FIG. 6, enables Bayesian inference of the shape of an individual's decision boundary based on the response times and correctness of a sequence of decisions. In a non-limiting example, a metric can be derived characterizing a degree of impulsiveness of the individual's response strategy based on the area of this decision boundary compared with the area of the "ideal" decision boundary (the response deadline times the full width of the belief axis).
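As a sketch of one way such a metric could be computed (the exact formulation used by the platform is not specified here), the area under an inferred decision boundary can be compared with the "ideal" area, i.e., the response deadline multiplied by the full width of the belief axis. The sampling interval, boundary shape, and mapping of the ratio onto an impulsiveness score below are illustrative assumptions.

```python
def boundary_area_ratio(boundary_widths, dt, response_deadline, belief_axis_width):
    """Ratio of the area under an inferred decision boundary to the 'ideal'
    area (response deadline x full width of the belief axis); smaller values
    suggest a narrower, more impulsive boundary under this illustrative definition.

    boundary_widths: boundary widths (in belief units) sampled every `dt`
    time units from trial start to the response deadline."""
    area = sum((boundary_widths[i] + boundary_widths[i + 1]) / 2.0 * dt   # trapezoidal rule
               for i in range(len(boundary_widths) - 1))
    return area / (response_deadline * belief_axis_width)

# Example: a boundary sampled every 10 ms that collapses linearly from the
# full belief-axis width (2.0) to zero at an 800 ms response deadline.
samples = [2.0 * (1 - t / 800) for t in range(0, 801, 10)]
print(boundary_area_ratio(samples, dt=10, response_deadline=800, belief_axis_width=2.0))
```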
[0095] FIGs. 7A–7B show example plots of the curves for values of conservative and impulsive measures from the trial start (t = 0) to the perceived response deadline (R-Dp). FIG. 7A shows example curves for a two-alternative forced choice (2AFC) task, where an individual is instructed/required to discriminate between two types of stimulus (such as but not limited to targets with differing degrees of a facial expression or other characteristic/feature difference); hence both are ultimately targets, as each requires an action/response from the individual. FIG. 7B shows example curves for a GO/NO-GO task, where the individual is instructed/required to decide whether a stimulus is a target requiring response/action (based on the instructions) or a non-target requiring inaction/no response (based on the instructions). In some examples herein, the stimuli are designated for a GO/NO-GO task (i.e., with instructions to act/give a response for a target or to not act/give no response). In FIG. 7A, the plot shows the curves versus the development of belief for two types of target stimuli at various time points (t = 0, a, b, c, d), as well as the decision boundaries relative to the value of the response criterion for the time-varying stimuli described herein. FIG. 7B shows the differing types of values and shapes of conservative and impulsive measures from the trial start to a response deadline for the traditional GO/NO-GO task (target vs. non-target), a pass/fail or yes/no type of test that has two boundary conditions or a binary classification. As shown in FIG. 7B, the curves for the values of the conservative and impulsive measures for the GO/NO-GO task do not have a right-side decision boundary, because waiting to act/respond is not a momentary decision that an individual arrives at; rather, it is a process that continues until the end of the trial (or at least until the attention of the individual is allocated elsewhere).
[0096] FIGs. 7C–7D show example plots of the formation of belief for linear belief accumulation and non-linear belief accumulation, respectively. In a system with linear belief accumulation, FIG. 7C shows the values of the mean belief for targets (MB(targets)) and mean belief for non-targets (MB(non-targets)) versus the development of belief (for target versus non-target) at various time points (t = 0, a, b, c, d) relative to the value of the response criterion. FIG. 7C also shows the target confidence interval and non-target confidence interval for the linear belief accumulation. In a system with non-linear belief accumulation, FIG. 7D shows the values of the mean belief for targets (MB(targets)) and mean belief for non-targets (MB(non-targets)) versus the development of belief (for target versus non-target) at various time points (t = 0, a, b, c, d) relative to the value of the response criterion for the nonlinear belief accumulation. FIG. 7D also shows the target confidence interval and non-target confidence interval. A traditional GO/NO-GO task involves presentation to an individual, for a specific period of time, of a stimulus without a time-varying aspect, and supports linear accumulation of belief from the information available to the individual for developing belief. By contrast, the example tasks and/or interferences according to the principles herein have at least one time-varying feature (based on their feature dynamics), resulting in nonlinear belief accumulation.
[0097] FIGs. 8A–8D show plots of the probability curves for the "signal" distribution and the "noise" distribution at the different points in time (t = a, b, c, d) shown in FIGs. 7A–7D. Each of FIGs. 8A–8D shows a signal curve and a noise curve at differing time-points displaced along the x-axis (similar to the signal and noise curves shown at time-points t = t0, t1, t2, t3 in FIG. 6). As shown in FIGs. 8A–8D, the signal-noise curve pair spreads out (i.e., becomes a wider curve) as time increases from t = a to t = d, representing the probability of a given degree of belief at a given point in time for a given type of signal. In this time-evolving model, the decision is made when the belief trajectory crosses a decision boundary. FIGs. 8A–8D also show the values of the mean belief for targets (MB(targets)) and mean belief for non-targets (MB(non-targets)) versus the development of belief. In FIG. 8D, the decision boundaries (conservative and impulsive) have converged at the criterion.
[0098] An example system, method, and apparatus according to the principles herein can be configured to execute an example response classifier to generate a quantifier of the cognitive skills of an individual. The example response classifier can be built using a machine learning tool, such as but not limited to linear/logistic regression, principal component analysis, generalized linear mixed models, random decision forests, support vector machines, and/or artificial neural networks. In a non-limiting example, classification techniques may be used to train a classifier using the performance measures of a labeled population of individuals (e.g., individuals with known cognitive disorders, executive function disorders, diseases, or other cognitive conditions). The trained classifier can be applied to measures of the responses of the individual to the tasks and/or interference to classify the individual as to a population label (e.g., cognitive disorder, executive function disorder, disease, or other cognitive condition). In an example, machine learning may be implemented using cluster analysis. Each measurement of the cognitive response capabilities of participating individuals can be used as a parameter that groups the individuals into subsets or clusters. For example, the subset or cluster labels may be a diagnosis of a cognitive disorder, executive function disorder, disease, or other cognitive condition. Using cluster analysis, a similarity metric for each subset and the separation between different subsets can be computed, and these similarity metrics may be applied to data indicative of an individual's responses to a task and/or interference to classify that individual to a subset. In another example, the classifier may be a supervised machine learning tool based on artificial neural networks. In such a case, the performance measures of individuals with known cognitive abilities may be used to train the neural network algorithm to model the complex relationships among the different performance measures. A trained classifier can be applied to the performance/response measures of a given individual to generate a classifier output indicative of the cognitive response capabilities of the individual. Other applicable techniques for generating a classifier include a regression or Monte Carlo technique for projecting an individual's cognitive abilities based on his/her cognitive performance. The classifier may also be built using other data, including a physiological measure (e.g., EEG) and demographic measures.
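As a minimal sketch of the supervised case (assuming scikit-learn is available; the feature set, labels, and values below are hypothetical placeholders rather than the platform's actual schema), a classifier can be fit to performance measures of a labeled population and then applied to a new individual's measures:

```python
from sklearn.linear_model import LogisticRegression

# Each row: performance measures for one previously labeled individual,
# e.g. [response criterion, hit rate, false alarm rate, mean reaction time].
X_train = [
    [0.15, 0.90, 0.30, 0.42],
    [0.60, 0.80, 0.05, 0.65],
    [0.10, 0.92, 0.35, 0.40],
    [0.55, 0.78, 0.08, 0.70],
]
y_train = ["impulsive", "conservative", "impulsive", "conservative"]

clf = LogisticRegression().fit(X_train, y_train)

# Apply the trained classifier to a new individual's response measures.
new_individual = [[0.20, 0.88, 0.25, 0.45]]
print(clf.predict(new_individual))          # predicted population label
print(clf.predict_proba(new_individual))    # classifier output as class probabilities
```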
[0099] In an example implementation, a programmed processing unit is configured to execute processor-executable instructions to render a task with an interference at a user interface. As described in greater detail herein, one or more of the task and the interference can be time-varying and have a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from the individual interacting with the apparatus or system. The processing unit is configured to control the user interface to measure data indicative of two or more differing types of responses to the task or to the interference. The programmed processing unit is further configured to execute processor-executable instructions to cause the example system or apparatus to receive data indicative of a first response of the individual to the task and a second response of the individual to the interference, analyze at least some portion of the data to compute at least one response profile representative of the performance of the individual, and determine a decision boundary metric (such as but not limited to the response criterion) from the response profile. As described herein, including in connection with FIGs. 4A and 4B, the decision boundary metric (such as but not limited to the response criterion) gives a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses (Response A vs. Response B) to the task or the interference. The programmed processing unit is further configured to execute processor-executable instructions to execute a response classifier based on the computed values of the decision boundary metric (such as but not limited to the response criterion), to generate a classifier output indicative of the cognitive response capabilities of the individual.
[0100] In an example, the processing unit further uses the classifier output for one or more of changing one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, biologic or other medication, identifying a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, biologic or other medication, identifying a change in the individual's cognitive response capabilities, recommending a treatment regimen, or recommending or determining a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
[0101] In any example herein, the example response classifier can be used as an intelligent proxy for quantifiable assessments of an individual's cognitive abilities. That is, once a response classifier is trained, the classifier output can be used to provide the indication of the cognitive response capabilities of multiple individuals without use of other cognitive or behavioral assessment tests.
[0102] Monitoring cognitive deficits allows individuals, and/or medical, healthcare, behavioral, or other professionals (with consent), to monitor the status or progression of a cognitive condition, a disease, or an executive function disorder. For example, some individuals with Alzheimer's disease may show mild symptoms initially, while others have more debilitating symptoms. If the status or progression of the cognitive symptoms can be regularly or periodically quantified, it can provide an indication of when a form of pharmaceutical agent or other drug may be administered, or of when quality of life might be compromised (such as a need for assisted living). Monitoring cognitive deficits also allows individuals, and/or medical, healthcare, behavioral, or other professionals (with consent), to monitor the response of the individual to any treatment or intervention, particularly in cases where the intervention is known to be selectively effective for certain individuals. In an example, a cognitive assessment tool based on the classifiers herein can be used for an individual patient with attention deficit hyperactivity disorder (ADHD). In another example, the classifiers and other tools herein can be used as a monitor of the presence and/or severity of any cognitive side effects from therapies with known cognitive impact, such as but not limited to chemotherapy, or that involve uncharacterized or poorly characterized pharmacodynamics. In any example herein, the cognitive performance measurements and/or classifier analysis of the data may be performed every 30 minutes, every few hours, daily, two or more times per week, weekly, bi-weekly, monthly, or once per year.
[0103] In an example, the response classifier can be used as an intelligent proxy for quantifiable measures of the degree of conservativeness or impulsivity of the individual.
[0104] In an example, the analysis of the data indicative of the first response and/or the second response generates a first response profile that is an impulsive response profile or a conservative response profile.
[0105] In a non-limiting example, the task and the interference can be rendered at the user interface such that the individual is required to provide the first response and the second response within a limited period of time. In an example, the individual is required to provide the first response and the second response substantially simultaneously.
[0106] In the examples herein, "substantially simultaneously" means tasks are rendered, or response measurements are performed, within less than about 5 milliseconds of each other, or within about 10 milliseconds, about 20 milliseconds, about 50 milliseconds, about 75 milliseconds, about 100 milliseconds, about 150 milliseconds or less, about 200 milliseconds or less, or about 250 milliseconds or less, of each other. In any example herein, "substantially simultaneously" is a period of time less than the average human reaction time. In another example, two tasks may be substantially simultaneous if the individual switches between the two tasks within a pre-set amount of time. The set amount of time for switching considered "substantially simultaneously" can be about one tenth of a second, about 1 second, about 5 seconds, about 10 seconds, about 30 seconds, or greater.
[0107] In a non-limiting example, the classifier output can be indicative of the degree of impulsiveness or conservativeness of the individual's cognitive response capabilities.
[0108] In an example, the processing unit executes further instructions including applying at least one adaptive procedure to modify the task and/or the interference, such that analysis of the data indicative of the first response and/or the second response indicates a modification of the first response profile.
[0109] In an example, the at least one response profile changes from an impulsive response profile to a conservative response profile based on received data collected from measurement of the first response and/or the second response to the modified task and/or the modified interference.
[0110] In an example, the task or the interference includes a response-deadline procedure having the response-deadline; and wherein the at least one adaptive procedure modifies the response-deadline to modify a performance characteristic of the individual toward an impulsive response profile or a conservative response profile.
[0111] In an example, the processing unit controls the user interface to modify a temporal length of the response window associated with the response-deadline procedure.
[0112] In an example, the processing unit controls the user interface to modify a time-varying characteristic of an aspect of the task or the interference rendered to the user interface.
[0113] As described in connection with FIGs. 4A and 4B, the time-varying characteristics of the task and/or interference result in time-varying availability of information about the target, such that a linear drift rate is no longer sufficient to capture the development of belief over time (rather, a nonlinear drift rate is required). A time-varying characteristic can be a feature such as, but not limited to, color, shape, type of creature, facial expression, or other feature that an individual requires in order to discriminate between a target and a non-target, resulting in differing time-characteristics of availability. The trial-by-trial adjustment of the response window length also can be a time-varying characteristic that alters the individual's perception of where the decision criterion needs to be in order to respond successfully to a task and/or an interference.
Another time-varying characteristic that can be modified is the degree to which an interference interferes with a parallel task, which can introduce interruptions in belief accumulation and/or response selection and execution.
[0114] In an example, modifying the time-varying characteristics of an aspect of the task or the interference includes adjusting a temporal length of the rendering of the task or interference at the user interface between two or more sessions of interactions of the individual.
[0115] In an example, the time-varying characteristic is one or more of a speed of an object, a rate of change of a facial expression, a direction of trajectory of an object, a change of orientation of an object, at least one color of an object, a type of an object, or a size of an object.
[0116] In an example, the change in type of object is effected using morphing from a first type of object to a second type of object or rendering a blendshape as a proportionate combination of the first type of object and the second type of object.
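A minimal sketch of such a proportionate combination is shown below; the feature names and values are hypothetical, and a real implementation would drive rendering parameters (e.g., mesh blendshape weights) rather than a plain dictionary.

```python
def blend_features(first_object, second_object, proportion):
    """Combine two object types in a given proportion (0.0 = entirely the
    first object, 1.0 = entirely the second object)."""
    return {name: (1.0 - proportion) * first_object[name] + proportion * second_object[name]
            for name in first_object}

# Hypothetical feature vectors for two object types (e.g., two facial expressions).
object_a = {"mouth_curve": 0.0, "brow_raise": 1.0, "eye_open": 0.8}
object_b = {"mouth_curve": 1.0, "brow_raise": 0.2, "eye_open": 0.5}

# The proportion could itself be varied over time to effect morphing within a trial.
print(blend_features(object_a, object_b, proportion=0.25))
```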
[0117] In a non-limiting example, the processing unit can be configured to render a user interface, or cause another component to execute at least one element, for indicating a reward to the individual for a degree of success in interacting with a task and/or interference, or another feature or other element of a system or apparatus. A reward computer element can be a computer-generated feature that is delivered to a user to promote user satisfaction with the example system, method, or apparatus, and as a result, increase positive user interaction and hence enjoyment of the experience of the individual.
[0118] In an example, the processing unit further computes as the classifier output parameters indicative of one or more of a bias sensitivity derived from the data indicative of the first response and the second response, a non-decision time sensitivity to parallel tasks, a belief accumulation sensitivity to parallel task demands, a reward rate sensitivity, or a response window estimation efficiency. Bias sensitivity can be a measure of how sensitive an individual is to certain of the tasks based on their bias (a tendency toward one type of response versus another, e.g., Response A vs. Response B).
Non-decision time sensitivity to parallel tasks can be a measure of how much the interference interferes with the individual's performance of the primary task.
Belief accumulation sensitivity to parallel task demands can be a measure of the rate at which the individual develops/accumulates belief for responding to the interference during the individual's performance of the primary task. Reward rate sensitivity can be used to measure how an individual's responses change based on the temporal length of the response deadline window. Near the end of a response deadline window (e.g., as the individual sees the interference about to move off the field of view), the individual realizes that he or she is running out of time to make a decision; this measure captures how the individual's responses change accordingly. Response window estimation efficiency is explained as follows. When the individual is making a decision to act/respond or not act/not respond, the decision needs to be based on when the individual thinks his or her time to respond is running out. For a varying window, the individual will not be able to measure that window perfectly, but with enough trials per session, based on the response data, it may be possible to infer how good the individual is at making that estimation based on the time-varying aspect (e.g., trajectory) of the objects in the task or interference.
[0119] An example system, method, and apparatus according to the principles herein can be configured to train a classifier model of a measure of the cognitive capabilities of individuals based on feedback data from the output of the computational model of human decision-making for individuals that were previously classified as to the measure of cognitive abilities of interest. For example, the response classifier can be trained using a plurality of training datasets, where each training dataset is associated with a previously classified individual from a group of individuals. Each of the training datasets includes data indicative of the first response of the classified individual to the task and data indicative of the second response of the classified individual to the interference, based on the classified individual's interaction with an example apparatus, system, or computing device described herein. The example response classifier also can take as input data indicative of the performance of the classified individual at a cognitive test and/or a behavioral test, and/or data indicative of a diagnosis of a status or progression of a cognitive condition, a disease, or a disorder (including an executive function disorder) of the classified individual.
[0120] In any example herein, the at least one processing unit can be programmed to cause an actuating component of the apparatus (including the cognitive platform) to effect auditory, tactile, or vibrational computerized elements to effect the stimulus or other interaction with the individual. In a non-limiting example, the at least one processing unit can be programmed to cause a component of the cognitive platform to receive data indicative of at least one response from the individual based on the user interaction with the task and/or interference, including responses provided using an input device. In an example where at least one graphical user interface is rendered to present the computerized stimulus to the individual, the at least one processing unit can be programmed to cause the graphical user interface to receive the data indicative of at least one response from the individual.
[0121] In any example herein, the data indicative of the response of the individual to a task and/or an interference can be measured using at least one sensor device contained in and/or coupled to an example system or apparatus herein, such as but not limited to a gyroscope, an accelerometer, a motion sensor, a position sensor, a pressure sensor, an optical sensor, an auditory sensor, a vibrational sensor, a video camera, a pressure-sensitive surface, a touch-sensitive surface, or other type of sensor.
In other examples, the data indicative of the response of the individual to the task and/or an interference can be measured using other types of sensor devices, including a video camera, a microphone, a joystick, a keyboard, a mouse, a treadmill, an elliptical, a bicycle, steppers, or a gaming system (including a Wii, a PlayStation, or an Xbox, or other gaming system). The data can be generated based on physical actions of the individual that are detected and/or measured using the at least one sensor device as the individual executes a response to the stimuli presented with the task and/or interference.
[0122] The user may respond to tasks by interacting with the computer device. In an example, the user may execute a response using a keyboard for alpha-numeric or directional inputs; a mouse for go/no go clicking, screen location inputs, and movement inputs; a joystick for movement inputs, screen location inputs, and clicking inputs; a microphone for audio inputs; a camera for still or motion optical inputs;
sensors such as accelerometers and gyroscopes for device movement inputs; among others. Non-limiting example inputs for a game system include but are not limited to a game controller for navigation and clicking inputs, a game controller with accelerometer and gyroscope inputs, and a camera for motion optical inputs. Example inputs for a mobile device or tablet include a touch screen for screen location information inputs, virtual keyboard alpha-numeric inputs, go/no go tapping inputs, and touch screen movement inputs;
accelerometer and gyroscope motion inputs; a microphone for audio inputs; and a camera for still or motion optical inputs, among others. In other examples, data indicative of the individual's response can include physiological sensors/measures to incorporate inputs from the user's physical state, such as but not limited to electroencephalogram (EEG), magnetoencephalography (MEG), heart rate, heart rate variability, blood pressure, weight, eye movements, pupil dilation, electrodermal responses such as the galvanic skin response, blood glucose level, respiratory rate, and blood oxygenation.
[0123] In any example herein, the individual may be instructed to provide a response via a physical action of clicking a button and/or moving a cursor to a correct location on a screen, head movement, finger or hand movement, vocal response, eye movement, or other action of the individual.
[0124] As a non-limiting example, an individual's response to a task or interference rendered at the user interface that requires a user to navigate a course or environment or perform other visuo-motor activity may require the individual to make movements (such as but not limited to steering) that are detected and/or measured using at least one type of the sensor device. The data from the detection or measurement provides the data indicative of the response.
[0125] As a non-limiting example, an individual's response to a task or interference rendered at the user interface that requires a user to discriminate between a target and a non-target may require the individual to make movements (such as but not limited to tapping or other spatially or temporally discriminating indication) that are detected and/or measured using at least one type of the sensor device. The data that is collected by a component of the system or apparatus based on the detection or other measurement of the individual's movements (such as but not limited to at least one sensor or other device or component described herein) provides the data indicative of the individual's responses.
[0126] The example system, method, and apparatus can be configured to apply the classifier model, using computational techniques and machine learning tools, such as but not limited to linear/logistic regression, principal component analysis, generalized linear mixed models, random decision forests, support vector machines, or artificial neural networks, to the data indicative of the individual's response to the tasks and/or interference, and/or data from one or more physiological measures, to create composite variables or profiles that are more sensitive than each measurement alone for generating a classifier output indicative of the cognitive response capabilities of the individual. In an example, the classifier output can be configured for other indications such as but not limited to detecting an indication of a disease, disorder or cognitive condition, or assessing cognitive health.
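As a minimal sketch of forming such composite variables (assuming scikit-learn is available; the measures and their values are hypothetical), principal component analysis can project several per-session measurements onto a smaller number of composite scores:

```python
from sklearn.decomposition import PCA

# Each row: one session's measurements for an individual, combining response
# measures with a physiological measure (e.g., an EEG-derived value).
measurements = [
    [0.42, 0.88, 0.25, 11.2],
    [0.65, 0.79, 0.07, 9.8],
    [0.40, 0.91, 0.30, 11.5],
    [0.70, 0.76, 0.05, 9.4],
]

# Project the measurements onto two composite variables that capture most of
# the joint variance across the individual measures.
composites = PCA(n_components=2).fit_transform(measurements)
print(composites)
```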
[0127] The example response classifiers herein can be trained to be applied to data collected from interaction sessions of individuals with the cognitive platform to provide the output. In a non-limiting example, the classifier model can be used to generate a standards table, which can be applied to the data collected from the individual's response to task and/or interference to classify the individual's cognitive response capabilities.
[0128] Non-limiting examples of assessment of cognitive abilities include assessment scales or surveys such as the Mini Mental State Exam, the CANTAB cognitive battery, Test of Variables of Attention (TOVA), Repeatable Battery for the Assessment of Neuropsychological Status, Clinical Global Impression scales relevant to specific conditions, Clinician's Interview-Based Impression of Change, Severe Impairment Battery, Alzheimer's Disease Assessment Scale, Positive and Negative Syndrome Scale, Schizophrenia Cognition Rating Scale, Conners Adult ADHD Rating Scales, Hamilton Rating Scale for Depression, Hamilton Anxiety Scale, Montgomery-Asberg Depression Rating Scale, Young Mania Rating Scale, Children's Depression Rating Scale, Penn State Worry Questionnaire, Hospital Anxiety and Depression Scale, Aberrant Behavior Checklist, Activities of Daily Living scales, ADHD self-report scale, Positive and Negative Affect Schedule, Depression Anxiety Stress Scales, Quick Inventory of Depressive Symptomatology, and PTSD Checklist.
[0129] In other examples, the assessment may test specific functions of a range of cognitions in cognitive or behavioral studies, including tests for perceptive abilities, reaction and other motor functions, visual acuity, long-term memory, working memory, short-term memory, logic, and decision-making, as well as other specific example measurements, including but not limited to TOVA, MOT (motion-object tracking), SART, CDT (change detection task), UFOV (useful field of view), Filter task, WAIS digit symbol, Stroop, Simon task, Attentional Blink, N-back task, PRP task, task-switching test, and Flanker task.
[0130] In non-limiting examples, the example systems, methods, and apparatus according to the principles described herein can be applicable to many different types of neuropsychological conditions, such as but not limited to dementia, Parkinson's disease, cerebral amyloid angiopathy, familial amyloid neuropathy, Huntington's disease, or other neurodegenerative condition, autism spectrum disorder (ASD), presence of the 16p11.2 duplication, and/or an executive function disorder, such as but not limited to attention deficit hyperactivity disorder (ADHD), sensory-processing disorder (SPD), mild cognitive impairment (MCI), Alzheimer's disease, multiple sclerosis, schizophrenia, major depressive disorder (MDD), or anxiety.
[0131] The instant disclosure is directed to computer-implemented devices formed as example cognitive platforms configured to implement software and/or other processor-executable instructions for the purpose of measuring data indicative of a user's performance at one or more tasks, to provide a user performance metric.
The example performance metric can be used to derive an assessment of a user's cognitive abilities and/or to measure a user's response to a cognitive treatment, and/or to provide data or other quantitative indicia of a user's condition (including physiological condition and/or cognitive condition). Non-limiting example cognitive platforms according to the principles herein can be configured to classify an individual as to a neuropsychological condition, autism spectrum disorder (ASD), presence of the 16p11.2 duplication, and/or an executive function disorder, and/or potential efficacy of use of the cognitive platform when the individual is being administered (or about to be administered) a drug, biologic or other pharmaceutical agent, based on the data collected from the individual's interaction with the cognitive platform and/or metrics computed based on the analysis (and associated computations) of that data. Yet other non-limiting example cognitive platforms according to the principles herein can be configured to classify an individual as to likelihood of onset and/or stage of progression of a neuropsychological condition, including as to a neurodegenerative condition, based on the data collected from the individual's interaction with the cognitive platform and/or metrics computed based on the analysis (and associated computations) of that data. The neurodegenerative condition can be, but is not limited to, Alzheimer's disease, dementia, Parkinson's disease, cerebral amyloid angiopathy, familial amyloid neuropathy, or Huntington's disease.
[0132] Any classification of an individual as to likelihood of onset and/or stage of progression of a neurodegenerative condition according to the principles herein can be transmitted as a signal to a medical device, healthcare computing system, or other device, and/or to a medical practitioner, a health practitioner, a physical therapist, a behavioral therapist, a sports medicine practitioner, a pharmacist, or other practitioner, to allow formulation of a course of treatment for the individual or to modify an existing course of treatment, including to determine a change in dosage of a drug, biologic or other pharmaceutical agent to the individual or to determine an optimal type or combination of drug, biologic or other pharmaceutical agent to the individual.
[0133] In any example herein, the cognitive platform can be configured as any combination of a medical device platform, a monitoring device platform, a screening device platform, or other device platform.
[0134] The instant disclosure is also directed to example systems that include cognitive platforms that are configured for coupling with one or more physiological or monitoring component and/or cognitive testing component. In some examples, the systems include cognitive platforms that are integrated with the one or more other physiological or monitoring component and/or cognitive testing component. In other examples, the systems include cognitive platforms that are separately housed from and configured for communicating with the one or more physiological or monitoring component and/or cognitive testing component, to receive data indicative of measurements made using such one or more components.
[0135] In an example system, method, and apparatus herein, the task or the interference can include a response-deadline procedure having the response-deadline;
where the at least one adaptive procedure modifies the response-deadline to modify a performance characteristic of the individual toward an impulsive response profile or a conservative response profile.
[0136] In an example system, method, and apparatus herein, the processing unit can be programmed to control the user interface to modify a temporal length of the response window associated with the response-deadline procedure.
[0137] In an example system, method, and apparatus herein, the processing unit can be configured to control the user interface to modify a time-varying characteristic of an aspect of the task or the interference rendered to the user interface. For example, modifying the time-varying characteristic of an aspect of the task or the interference can include adjusting a temporal length of the rendering of the task or interference at the user interface between two or more sessions of interactions of the individual. As another example, the time-varying characteristic can be one or more of a speed of an object, a rate of change of a facial expression, a direction of trajectory of an object, a change of orientation of an object, at least one color of an object, a type of an object, or a size of an object.
[0138] In an example system, method, and apparatus herein, the change in type of object is effected using morphing from a first type of object to a second type of object or rendering a blendshape as a proportionate combination of the first type of object and the second type of object.
[0139] In an example system, method, and apparatus herein, the processing unit can be further programmed to compute as the classifier output parameters indicative of one or more of a bias sensitivity derived from the data indicative of the first response and the second response, a non-decision time sensitivity to parallel tasks, a belief accumulation sensitivity to parallel task demands, a reward rate sensitivity, or a response window estimation efficiency.
[0140] In an example system, method, and apparatus herein, the processing unit can be further programmed to control the user interface to render the task as a continuous visuo-motor tracking task.
[0141] In an example system, method, and apparatus herein, the processing unit controls the user interface to render the interference as a target discrimination task.
[0142] As used herein, a target discrimination task may also be referred to as a perceptual reaction task, in which the individual is instructed to perform a two-feature reaction task including target stimuli and non-target stimuli through a specified form of response. As a non-limiting example, that specified type of response can be for the individual to make a specified physical action in response to a target stimulus (e.g., move or change the orientation of a device, tap on a sensor-coupled surface such as a screen, move relative to an optical sensor, make a sound, or other physical action that activates a sensor device) and refrain from making such specified physical action in response to a non-target stimulus.
[0143] In a non-limiting example, the individual is required to perform a visuomotor task (as a primary task) with a target discrimination task as an interference (secondary task). To effect the visuomotor task, a programmed processing unit renders visual stimuli that require fine motor movement as a reaction of the individual to the stimuli. In some examples, the visuomotor task is a continuous visuomotor task. The processing unit is programmed to alter the visual stimuli and to record data indicative of the motor movements of the individual over time (e.g., at regular intervals including 1, 5, 10, or 30 times per second). Example stimuli rendered using the programmed processing unit for a visuomotor task requiring fine motor movement may be a visual presentation of a path that an avatar is required to remain within. The programmed processing unit may render the path with certain types of obstacles that the individual is either required to avoid or to navigate towards. In an example, the fine motor movements effected by the individual, such as but not limited to tilting or rotating a device, are measured using an accelerometer and/or a gyroscope (e.g., to steer or otherwise guide the avatar on the path while avoiding or crossing the obstacles as specified). The target discrimination task (serving as the interference) can be based on targets and non-targets that differ in shape and/or color.
[0144] In some examples, the task and/or interference can be a visuomotor task, a target discrimination task, and/or a memory task.
[0145] Within the context of a computer-implemented adaptive response-deadline procedure, the response-deadline can be adjusted between trials or blocks of trials to manipulate the individual's performance characteristics towards certain goals. A common goal is driving the individual's average response accuracy towards a certain value by controlling the response deadline.
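A minimal sketch of such a goal-directed adjustment is shown below; the target accuracy, step size, and bounds are illustrative assumptions rather than values prescribed by the procedure described herein.

```python
def adapt_response_deadline(deadline_ms, recent_accuracy, target_accuracy=0.80,
                            step_ms=25, min_ms=200, max_ms=2000):
    """Adjust the response deadline between blocks of trials to drive the
    individual's average accuracy toward a target value."""
    if recent_accuracy > target_accuracy:
        deadline_ms -= step_ms   # performing above target: tighten the deadline
    elif recent_accuracy < target_accuracy:
        deadline_ms += step_ms   # performing below target: relax the deadline
    return max(min_ms, min(max_ms, deadline_ms))

# Example: 90% accuracy in the last block shortens an 800 ms deadline to 775 ms.
print(adapt_response_deadline(800, recent_accuracy=0.90))
```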
[0146] Measurements at different response deadlines can provide different data as to the shape and/or area of the individual's decision boundary, so the computer-implemented adaptive procedure can inform the calculation of the impulsiveness strategy metric.
[0147] In a non-limiting example, the metric from signal detection theory representing cognitive function may be the hit rate from a target discrimination task. In that context, the hit rate may be defined as the number of correct responses to target stimuli divided by the total number of target stimuli presented. Other such metrics include the false alarm rate (e.g., the number of responses to distractor stimuli divided by the number of distractor stimuli presented), the miss rate (e.g., the number of non-responses to target stimuli divided by the total number of incorrect responses, i.e., the non-responses to target stimuli added to the number of responses to distractor stimuli), and the correct response rate (the proportion of correct responses on trials not containing a signal). In an example, the correct response rate may be calculated as the number of non-responses to the distractor stimuli divided by the number of non-responses to the distractor stimuli plus the number of responses to the target stimuli.
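The hit rate and false alarm rate described above, together with the standard equal-variance signal detection quantities d' and the criterion c, can be computed from trial counts as in the following sketch. The d' and c formulas shown are one common choice rather than the platform's prescribed computation, extreme rates of 0 or 1 would need a correction in practice, and the counts are hypothetical.

```python
from statistics import NormalDist

def detection_metrics(hits, misses, false_alarms, correct_rejections):
    """Hit rate, false alarm rate, and the standard equal-variance d' and
    criterion c from raw trial counts."""
    hit_rate = hits / (hits + misses)                              # responses to targets / targets shown
    false_alarm_rate = false_alarms / (false_alarms + correct_rejections)
    z = NormalDist().inv_cdf                                       # inverse standard normal CDF
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return {"hit_rate": hit_rate, "false_alarm_rate": false_alarm_rate,
            "d_prime": d_prime, "criterion": criterion}

# Example counts from a block of target/non-target trials.
print(detection_metrics(hits=45, misses=5, false_alarms=10, correct_rejections=40))
```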
[0148] An example system, method, and apparatus according to the principles herein can be configured to apply adaptive performance procedures to modify measures of performance to a specific stimulus intensity. The procedure can be adapted based on a percent correct (PC) or a D-Prime (d') signal detection metric of sensitivity to a target. In an example system, the value of percent correct (i.e., percent of correct responses of the individual to a task) or D-prime may be used in the adaptive algorithms as the basis for adapting the stimulus level of tasks and/or interferences rendered at the user interface for user interaction from one trial to another.
However, the inventors have unexpectedly found that an adaptive procedure based on a computational model of human decision-making (such as but not limited to the modified DDM), classifiers built from the outputs of such models, and the analysis described herein based on the output of the computational model can be more quantitatively informative on individual differences or on changes in sensitivity to a specific stimulus level. The decision boundary metric (such as but not limited to the response criterion) provides a flexible tool for determining a tendency of an individual to provide a particular type of response. Accordingly, an adaptation procedure based on decision boundary metric (such as but not limited to the response criterion) measurements at the individual or group level becomes a desirable source of information about impulsive or conservative response strategies at the time of measurement, and also serves as a quantifier of the changes in performance at the individual or group level over time with repeated measurements.
[0149] Executive function training, such as that delivered by the example systems, methods, and apparatus described herein can be configured to apply an adaptive algorithm to modify the stimulus levels between trials, to move a user's response strategy as indicated by their measured criterion to a more conservative or impulsive strategy, depending on the needs or preference of the individual or based on the clinical population receiving the treatment.
[0150] The example systems, methods, and apparatus described herein can be configured to apply an adaptive algorithm that is adapted based on the computed decision boundary metric (such as but not limited to the response criterion) as described herein to modify the difficulty levels of the tasks and/or interference rendered at the user interface for user interaction from one trial to another.
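A minimal sketch of such an adaptation rule is shown below; the proportional update, gain, bounds, and direction of the adjustment are illustrative assumptions, since the appropriate mapping between the measured response criterion and difficulty depends on the treatment goal.

```python
def adapt_difficulty(difficulty, measured_criterion, target_criterion,
                     gain=0.5, min_level=0.0, max_level=1.0):
    """Adjust the next trial's difficulty level in proportion to the gap between
    the measured decision boundary metric (here, the response criterion) and a
    target value."""
    difficulty += gain * (target_criterion - measured_criterion)
    return max(min_level, min(max_level, difficulty))

# Example: a measured criterion below the target value nudges the difficulty upward.
print(adapt_difficulty(difficulty=0.5, measured_criterion=0.1, target_criterion=0.3))
```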
[0151] FIG. 9 shows an example plot representing a stimulus that is adapted on a single property that has a range of possible intensities. FIG. 9 shows a projected two-dimensional (2D) representation of a three-dimensional (3D) joint distribution composed of stimuli in which the observer attends to multiple features at a time. FIG. 9 shows one of several techniques to measure the criterion of a multi-dimensional stimulus. In this example, a combined PC of 80% or d' of 1.81 for multi-dimensional stimuli is located at the point labeled 900. The band 902 represents the possible d' resulting from the range of possible hit and false-alarm rates in a system or apparatus that adapts the tasks and/or interference based on an adaptive performance procedure where performance is directed to PC = 80% correct. In FIG. 9, the noise distribution is centered at (0,0), which is a simplification to constrain the band 902 of possible d' locations, but in practice the noise distribution center can be located anywhere on the axes, as long as the noise and signal distributions are connected by a vector whose length is the d' value. Multi-dimensional criteria can be estimated for individuals or groups of individuals, and produce an estimate of conservative or impulsive response strategies at the time of measurement or as a response to training using the computing device. Adapting the tasks and/or interference based on the output from the response classifiers herein can provide for greater flexibility than adaptation based on the percent correct.
[0152] In an example, the task and/or interference can be modified based on an iterative estimation of metrics by tracking current estimates and selecting the features, trajectory, and response window of the targeting task, and level/type of parallel task interference for the next trial in order to maximize information the trial can provide.
[0153] In some examples, the task and/or interference are adaptive tasks.
The task and/or interference can be adapted or modified in difficulty level based on the decision boundary metric (such as but not limited to the response criterion), as described hereinabove. Such difficulty adaptation may be used to determine the ability of the participant.
[0154] In an example, the difficulty of the task adapts with every stimulus that is presented, which can occur more frequently than adaptation on a regular schedule (e.g., every few seconds, every 10 seconds, every 20 seconds, or another regular schedule).
[0155] In another example, the difficulty of a continuous task can be adapted on a set schedule, such as but not limited to every 30 seconds, 10 seconds, 1 second, 2 times per second, or 30 times per second.
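The following sketch illustrates one way a continuous task might adapt difficulty on a fixed schedule, as described above. The update rule (a simple up/down step driven by recent accuracy), the interval values, and the `render_trial` callback are assumptions for illustration only; the adaptation in any example herein can instead be driven by the decision boundary metric.

```python
import time

def run_adaptive_block(render_trial, duration_s=60.0, update_interval_s=10.0,
                       difficulty=0.5, step=0.05, target_accuracy=0.8):
    """Adapt difficulty every `update_interval_s` seconds based on the
    accuracy observed since the last update (illustrative rule only).
    `render_trial(difficulty)` is assumed to present one stimulus and
    return True for a correct response."""
    start = last_update = time.monotonic()
    correct = total = 0
    while time.monotonic() - start < duration_s:
        correct += bool(render_trial(difficulty))
        total += 1
        now = time.monotonic()
        if now - last_update >= update_interval_s:
            accuracy = correct / total
            # Raise difficulty when the individual exceeds the target
            # accuracy, lower it otherwise; clamp to the valid range.
            difficulty += step if accuracy > target_accuracy else -step
            difficulty = min(max(difficulty, 0.0), 1.0)
            correct = total = 0
            last_update = now
    return difficulty
```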
[0156] In an example, the length of time of a trial depends on the number of iterations of rendering (of the tasks/interference) and receiving (of the individual's responses) and can vary in time. In an example, a trial can be on the order of about 500 milliseconds, about 1 second (s), about 10 s, about 20 s, about 25 s, about 30 s, about 45 s, about 60 s, about 2 minutes, about 3 minutes, about 4 minutes, about 5 minutes, or greater. Each trial may have a pre-set length or may be dynamically set by the processing unit (e.g., dependent on an individual's performance level or a requirement of the adapting from one level to another).
[0157] In an example, the task and/or interference can be modified based on targeting changes in one or more specific metrics by selecting features, trajectory, and response window of the targeting task, and level/type of parallel task interference to progressively require improvements in those metrics in order for the apparatus to indicate to an individual that they have successfully performed the task. This could include specific reinforcement, including explicit messaging, to guide the individual to modify performance according to the desired goals.
[0158] In an example, the task and/or interference can be modified based on a comparison of an individual's performance with normative data or a computer model or taking user input (the individual performing the task/interference or another individual such as a clinician) to select a set of metrics to target for changing in a specific order, and iteratively modifying this procedure based on the subject's response to treatment.
This could include feedback to the individual performing the task/interference or another individual to serve as notification of changes to the procedure, potentially enabling them to approve or modify these changes before they take effect.
[0159] In various examples, the difficulty level may be kept constant or may be varied over at least a portion of a session in an adaptive implementation, where the adaptive task (primary task or secondary task) increases or decreases in difficulty based on the decision boundary metric (such as but not limited to the response criterion).
[0160] An example system, method, and apparatus according to the principles herein can be configured to enhance the cognitive skills in an individual. In an example implementation, a programmed processing unit is configured to execute processor-executable instructions to render a task with an interference at a user interface. As described in greater detail herein, one or more of the task and the interference can be time-varying and have a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from the individual interacting with the apparatus or system. The processing unit is configured to control the user interface to measure data indicative of two or more differing types of responses to the task or to the interference. The programmed processing unit is further configured to execute processor-executable instructions to cause the example system or apparatus to receive data indicative of a first response of the individual to the task and a second response of the individual to the interference, analyze at least some portion of the data to compute at least one response profile representative of the performance of the individual, and determine a decision boundary metric (such as but not limited to the response criterion) from the response profile. As described herein, including in connection with FIGs. 4A and 4B, the decision boundary metric (such as but not limited to the response criterion) gives a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses (Response A vs. Response B) to the task or the interference. The programmed processing unit is further configured to execute processor-executable instructions to adapt the task and/or the interference to derive a modification in the computed decision boundary metric (such as but not limited to the response criterion) such that the first response and/or the second response is modified, thereby indicating a modification of the cognitive response capabilities of the individual.
[0161] In an example, the indication of the modification of the cognitive response capabilities can be based on observation of a change in a measure of a degree of impulsiveness or conservativeness of the individual's cognitive response capabilities.
[0162] In an example, the indication of the modification of the cognitive response capabilities can include a change in a measure of one or more of sustained attention, selective attention, attention deficit, impulsivity, inhibition, perceptive abilities, reaction and other motor functions, visual acuity, long-term memory, working memory, short-term memory, logic, and decision-making.
[0163] In an example, adapting the task and/or interference based on the first decision boundary metric (such as but not limited to the response criterion) includes one or more of modifying the temporal length of the response window, modifying a type of reward or rate of presentation of rewards to the individual, and modifying a time-varying characteristic of the task and/or interference.
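As a non-limiting sketch of the adaptation options listed above, the function below maps a computed response criterion to adjustments of the response-window length and the reward presentation rate. The cutoffs, step sizes, adjustment directions, and parameter names are illustrative assumptions rather than values prescribed by the examples herein.

```python
def adapt_from_criterion(criterion, response_window_ms, reward_rate,
                         impulsive_cutoff=-0.25, conservative_cutoff=0.25,
                         window_step_ms=50, reward_step=0.1):
    """Illustrative mapping from a computed response criterion to two of the
    adaptable parameters named above: the temporal length of the response
    window and the rate of reward presentation."""
    if criterion < impulsive_cutoff:
        # One possible choice for an impulsive (liberal) strategy:
        # lengthen the response window to reduce time pressure and
        # present rewards more often for correctly withheld responses.
        response_window_ms += window_step_ms
        reward_rate += reward_step
    elif criterion > conservative_cutoff:
        # For a conservative strategy: shorten the response window and
        # reduce the reward rate to encourage more frequent responding.
        response_window_ms -= window_step_ms
        reward_rate -= reward_step
    # Clamp to illustrative bounds.
    return max(response_window_ms, 100), min(max(reward_rate, 0.0), 1.0)
```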
[0164] In an example, modifying the time-varying characteristics of an aspect of the task or the interference can include adjusting a temporal length of the rendering of the task or interference at the user interface between two or more sessions of interactions of the individual.
[0165] In an example, the time-varying characteristics can include one or more of a speed of an object, a rate of change of a facial expression, a direction of trajectory of an object, a change of orientation of an object, at least one color of an object, a type of an object, or a size of an object, or modifying a sequence or balance of rendering of targets versus non-targets at the user interface.
[0166] In an example, the change in type of object is effected using morphing from a first type of object to a second type of object or rendering a blendshape as a proportionate combination of the first type of object and the second type of object.
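A minimal sketch of the blendshape approach described in the preceding paragraph: each object type is represented by a vector of shape parameters, and the rendered object is a proportionate (linear) combination of the two. Representing shapes as plain parameter lists is an assumption for illustration; a production renderer would blend mesh vertices or texture parameters.

```python
def blendshape(shape_a, shape_b, proportion):
    """Return a proportionate combination of two object types.
    `proportion` = 0.0 renders pure type A, 1.0 renders pure type B,
    and intermediate values render a morph between the two."""
    if not 0.0 <= proportion <= 1.0:
        raise ValueError("proportion must be between 0 and 1")
    return [(1.0 - proportion) * a + proportion * b
            for a, b in zip(shape_a, shape_b)]

# Example: morph 30% of the way from a "star" parameter vector toward a
# "round" parameter vector (the parameter values are placeholders).
star_params = [1.0, 0.2, 5.0]
round_params = [0.0, 1.0, 0.0]
print(blendshape(star_params, round_params, 0.3))
```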
[0167] When the computer-implemented adaptive procedure is designed with the goal of explicitly measuring the shape and/or area of the decision boundary, the response deadlines can be adjusted to points where measurements produce maximal information for defining this boundary. These optimal deadlines may be determined using an information-theoretic approach that minimizes the expected information entropy.
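The sketch below illustrates one information-theoretic way to pick the next response deadline, consistent with the paragraph above: maintain a discrete prior over candidate values of a psychometric parameter (here, a deadline "threshold"), compute the expected posterior entropy for each candidate deadline under an assumed logistic response model, and choose the deadline with the lowest expected entropy. The logistic form, slope value, and grids are assumptions for illustration only.

```python
import numpy as np

def expected_posterior_entropy(prior, thresholds, deadline, slope=5.0):
    """Expected Shannon entropy (bits) over `thresholds` after observing one
    timely/late response at `deadline`, under a logistic response model."""
    # Probability of a response before the deadline for each hypothesized
    # threshold (logistic in the deadline; an assumed model).
    p_resp = 1.0 / (1.0 + np.exp(-slope * (deadline - thresholds)))
    p_yes = float(np.sum(prior * p_resp))
    p_no = 1.0 - p_yes

    def entropy(p):
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    h = 0.0
    if p_yes > 0:
        h += p_yes * entropy(prior * p_resp / p_yes)
    if p_no > 0:
        h += p_no * entropy(prior * (1.0 - p_resp) / p_no)
    return h

def next_deadline(prior, thresholds, candidate_deadlines):
    """Choose the candidate deadline expected to be most informative."""
    scores = [expected_posterior_entropy(prior, thresholds, d)
              for d in candidate_deadlines]
    return candidate_deadlines[int(np.argmin(scores))]

# Example: a uniform prior over thresholds from 0.2 s to 1.2 s.
thresholds = np.linspace(0.2, 1.2, 101)
prior = np.full_like(thresholds, 1.0 / len(thresholds))
print(next_deadline(prior, thresholds, np.linspace(0.3, 1.1, 9)))
```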
[0168] Example systems, methods and apparatus according to the principles herein can be implemented using a programmed computing device including at least one processing unit, to determine a potential biomarker for clinical populations.
[0169] Example systems, methods and apparatus according to the principles herein can be implemented using a programmed computing device including at least one processing unit as a metric for individuals and groups to assess tendency for impulsive and/or conservative response strategies.
[0170] Example systems, methods and apparatus according to the principles herein can be implemented using a programmed computing device including at least one processing unit to improve computer-implemented adaptive procedures to compensate for impulsive or conservative response profiles.
[0171] Example systems, methods and apparatus according to the principles herein can be implemented using a programmed computing device including at least one processing unit to measure change in response profile in individuals or groups after use of an intervention.
[0172] Example systems, methods and apparatus according to the principles herein can be implemented using a programmed computing device including at least one processing unit to apply the example metrics herein, adding another measurable characteristic of individual or group data that enables more accurate measurement of psychophysical thresholds and assessment of response profiles in computer-implemented adaptive psychophysical procedures.
[0173] Example systems, methods and apparatus according to the principles herein can be implemented using a programmed computing device including at least one processing unit to apply the example metrics herein to add a new dimension to available data that can be used to increase the amount of information harvested from psychophysical testing.
[0174] An example system, method, and apparatus according to the principles herein can be configured to enhance the cognitive skills in an individual. In an example implementation, a programmed processing unit is configured to execute processor-executable instructions to render a task with an interference at a user interface. As described in greater detail herein, one or more of the task and the interference can be time-varying and have a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from the individual interacting with the apparatus or system. The processing unit is configured to control the user interface to measure data indicative of two or more differing types of responses to the task or to the interference. The programmed processing unit is further configured to execute processor-executable instructions to cause the example system or apparatus to receive data indicative of a first response of the individual to the task and a second response of the individual to the interference (from a first session), analyze at least some portion of the data to compute a first response profile representative of the first performance of the individual, and determine a first decision boundary metric (such as but not limited to the response criterion) from the response profile. As described herein, including in connection with FIGs. 4A and 4B, the decision boundary metric (such as but not limited to the response criterion) gives a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses (Response A vs. Response B) to the task or the interference. The programmed processing unit is further configured to execute processor-executable instructions to adapt the task and/or the interference based on the computed first decision boundary metric (such as but not limited to the response criterion) (to generate a second session), receive data indicative of the first response of the individual to the task and the second response of the individual to the interference, analyze at least some portion of the data to compute a second response profile and a second decision boundary metric (such as but not limited to the response criterion) representative of the second performance of the individual. The programmed processing unit is further configured to execute processor-executable instructions, based on the first decision boundary metric (such as but not limited to the response criterion) and second decision boundary metric (such as but not limited to the response criterion), to generate an output to the user interface indicative of one or more of: (i) a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (ii) a change in one or more of the amount, concentration, or dose titration of a pharmaceutical agent, drug, biologic or other medication being or to be administered to an individual, and (iii) a change in the individual's cognitive response capabilities, a recommended treatment regimen, or recommending or determining a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
[0175] In a non-limiting example, based on the results of the analysis of the first decision boundary metric (such as but not limited to the response criterion) and the second decision boundary metric (such as but not limited to the response criterion), a medical, healthcare, or other professional (with consent of the individual) can gain a better understanding of an individual's cognitive response capabilities, and potentially more specifically identify the type of cognitive condition, executive function disorder, or disease that could be affecting an individual's cognitive abilities (including by reviewing the results of the analysis in conjunction with other physiological, behavioral, and/or diagnostic measures). For example, the results may be used to identify individuals of a group who may better benefit from a first type of pharmaceutical agent, drug, biologic, or other medication, while other individuals in the group could benefit from a second type.
[0176] In a non-limiting example, based on the results of the analysis of the first decision boundary metric (such as but not limited to the response criterion) and the second decision boundary metric (such as but not limited to the response criterion), a medical, healthcare, or other professional (with consent of the individual) can gain a better understanding of potential adverse events which may occur (or potentially are occurring) if the individual is administered a particular type of, amount, concentration, or dose titration of a pharmaceutical agent, drug, biologic, or other medication, including potentially affecting cognition.
[0177] In a non-limiting example, a searchable database is provided herein that includes data indicative of the results of the analysis of the first decision boundary metric (such as but not limited to the response criterion) and the second decision boundary metric (such as but not limited to the response criterion) for particular individuals, along with known levels of efficacy of at least one type of pharmaceutical agent, drug, biologic, or other medication experienced by the individuals, and/or quantifiable information on one or more adverse events experienced by the individual with administration of the at least one type of pharmaceutical agent, drug, biologic, or other medication. The searchable database can be configured to provide metrics for use to determine whether a given individual is a candidate for benefiting from a particular type of pharmaceutical agent, drug, biologic, or other medication based on the response measures, response profiles, and/or decision boundary metric (such as but not limited to response criteria) obtained for the individual in interacting with the task and/or interference rendered at the computing device.
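A minimal sketch of how such a searchable database might be organized and queried, using SQLite for illustration. The table layout, column names, placeholder rows, and threshold values are assumptions; they are not a schema defined by the examples herein.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE response_metrics (
        individual_id     TEXT,
        medication        TEXT,
        first_criterion   REAL,    -- decision boundary metric from first session
        second_criterion  REAL,    -- decision boundary metric from second session
        efficacy_score    REAL,    -- known level of efficacy for the medication
        adverse_events    INTEGER  -- count of reported adverse events
    )
""")
conn.executemany(
    "INSERT INTO response_metrics VALUES (?, ?, ?, ?, ?, ?)",
    [("p001", "MPH", -0.6, -0.2, 0.8, 0),
     ("p002", "MPH", 0.1, 0.1, 0.2, 1)],  # placeholder rows
)

# Find individuals whose criterion shifted toward a less impulsive strategy
# and whose records show acceptable efficacy with no adverse events.
rows = conn.execute("""
    SELECT individual_id, medication
    FROM response_metrics
    WHERE second_criterion > first_criterion
      AND efficacy_score >= 0.5
      AND adverse_events = 0
""").fetchall()
print(rows)
```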
[0178] As a non-limiting example, based on data indicative of a user interaction with the tasks and/or interference rendered at a user interface of a computing device, the decision boundary metric (such as but not limited to the response criterion) could provide information on the tendency of an individual to a particular type of response, such as but not limited to the degree of impulsiveness or conservativeness of the individual's cognitive response strategy. This data can assist with identifying a treatment regimen, or a degree of effectiveness of a behavioral therapy, counseling, and/or physical exercise.
[0179] In a non-limiting example, based on data indicative of a user interaction with the tasks and/or interference rendered at a user interface of a computing device, the decision boundary metric (such as but not limited to the response criterion) could provide information on the individual, based on the degree of impulsiveness or conservativeness of the individual's cognitive response strategy. This data can assist with identifying whether the individual is a candidate for a particular type of drug (such as but not limited to a stimulant, e.g., methylphenidate or amphetamine) or whether it might be beneficial for the individual to have the drug administered in conjunction with a regimen of specified repeated interactions with the tasks and/or interference rendered at the computing device. Other non-limiting examples of a biologic, drug, or other pharmaceutical agent applicable to any example described herein include methylphenidate (MPH), scopolamine, donepezil hydrochloride, rivastigmine tartrate, memantine HCl, solanezumab, aducanumab, and crenezumab.
[0180] Another example system, method, and apparatus according to the principles herein can be configured to enhance the cognitive skills in an individual. In an example implementation, a programmed processing unit is configured to execute processor-executable instructions to render a task with an interference at a user interface. As described in greater detail herein, one or more of the task and the interference can be time-varying and have a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from the individual interacting with the apparatus or system. The processing unit is configured to control the user interface to receive data indicative of one or more of an amount, concentration, or dose titration of a pharmaceutical agent, drug, or biologic being or to be administered to an individual, and to measure data indicative of two or more differing types of responses to the task or to the interference. The programmed processing unit is further configured to execute processor-executable instructions to cause the example system or apparatus to receive data indicative of a first response of the individual to the task and a second response of the individual to the interference (from a first session), analyze at least some portion of the data to compute a first response profile representative of the first performance of the individual, and determine a first decision boundary metric (such as but not limited to the response criterion) from the response profile. As described herein, including in connection with FIGs. 4A and 4B, the decision boundary metric (such as but not limited to the response criterion) gives a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses (Response A vs. Response B) to the task or the interference. The programmed processing unit is further configured to execute processor-executable instructions to adapt the task and/or the interference based on the first decision boundary metric (such as but not limited to the response criterion) and the amount or concentration of a pharmaceutical agent, drug, or biologic (to generate a second session), receive data indicative of the first response of the individual to the task and the second response of the individual to the interference, analyze at least some portion of the data to compute a second response profile and a second decision boundary metric (such as but not limited to the response criterion) representative of the second performance of the individual. The programmed processing unit is further configured to execute processor-executable instructions, based on the first decision boundary metric (such as but not limited to the response criterion) and second decision boundary metric (such as but not limited to the response criterion), to generate an output to the user interface indicative of one or more of: (i) a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (ii) a recommended change in one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, biologic or other medication, and (iii) a change in the individual's cognitive response capabilities, a recommended treatment regimen, or recommending or determining a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
[0181] In a non-limiting example, based on the results of the analysis of the first decision boundary metric (such as but not limited to the response criterion) and the second decision boundary metric (such as but not limited to the response criterion), a medical, healthcare, or other professional (with consent of the individual) can gain a better understanding of potential adverse events which may occur (or potentially are occurring) if the individual is administered a different amount, concentration, or dose titration of a pharmaceutical agent, drug, biologic, or other medication, including potentially affecting cognition.
[0182] In a non-limiting example, a searchable database is provided herein that includes data indicative of the results of the analysis of the first decision boundary metric (such as but not limited to the response criterion) and the second decision boundary metric (such as but not limited to the response criterion) for particular individuals, along with known levels of efficacy of at least one type of pharmaceutical agent, drug, biologic, or other medication experienced by the individuals, and/or quantifiable information on one or more adverse events experienced by the individual with administration of the at least one type of pharmaceutical agent, drug, biologic, or other medication. The searchable database can be configured to provide metrics for use to determine whether a given individual is a candidate for benefiting from a particular type of pharmaceutical agent, drug, biologic, or other medication based on the response measures, response profiles, and/or decision boundary metric (such as but not limited to response criteria) obtained for the individual in interacting with the task and/or interference rendered at the computing device. As a non-limiting example, based on data indicative of a user interaction with the tasks and/or interference rendered at a user interface of a computing device, the decision boundary metric (such as but not limited to the response criterion) could provide information on the individual, based on the degree of impulsiveness or conservativeness of the individual's cognitive response strategy. This data can assist with identifying whether the individual is a candidate for a particular type of drug (such as but not limited to a stimulant, e.g., methylphenidate or amphetamine) or whether it might be beneficial for the individual to have the drug administered in conjunction with a regimen of specified repeated interactions with the tasks and/or interference rendered at the computing device. Other non-limiting examples of a biologic, drug, or other pharmaceutical agent applicable to any example described herein include methylphenidate (MPH), scopolamine, donepezil hydrochloride, rivastigmine tartrate, memantine HCl, solanezumab, aducanumab, and crenezumab.
[0183] In an example, the change in the individual's cognitive response capabilities comprises an indication of a change in degree of impulsiveness or conservativeness of the individual's cognitive response strategy.
[0184] As a non-limiting example, given that impulsive behavior is attendant with ADHD, an example cognitive platform that is configured for delivering treatment (including of executive function) may promote less impulsive behavior in a regimen.
This may target dopamine systems in the brain, increasing normal regulation, which may result in a transfer of benefits of the reduction of impulsive behavior to the everyday life of an individual.
[0185] Stimulants such as methylphenidate and amphetamine are also administered to individuals with ADHD to increase levels of norepinephrine and dopamine in the brain. Their cognitive effects may be attributed to their actions at the prefrontal cortex; however, there may not be remediation of cognitive control deficits or of other cognitive abilities. An example cognitive platform herein can be configured for delivering treatment (including of executive function) to remediate an individual's cognitive control deficit.
[0186] The use of the example systems, methods, and apparatus according to the principles described herein can be applicable to many different types of neuropsychological conditions, such as but not limited to dementia, Parkinson's disease, cerebral amyloid angiopathy, familial amyloid neuropathy, Huntington's disease, or other neurodegenerative condition, autism spectrum disorder (ASD), presence of the 16p11.2 duplication, and/or an executive function disorder, such as but not limited to attention deficit hyperactivity disorder (ADHD), sensory-processing disorder (SPD), mild cognitive impairment (MCI), Alzheimer's disease, multiple sclerosis, schizophrenia, major depressive disorder (MDD), or anxiety.
[0187] In any example implementation, data and other information from an individual is collected, transmitted, and analyzed with their consent.
[0188] As a non-limiting example, the cognitive platform described in connection with any example system, method and apparatus herein, including a cognitive platform based on interference processing, can be based on or include the Project: EVO™ platform by Akili Interactive Labs, Inc., Boston, MA.
Non-limiting Example Tasks and Interference
[0189] The effects of interference processing on the cognitive control abilities of individuals have been reported. See, e.g., A. Anguera, Nature vol. 501, p. 97 (September 5, 2013) (the "Nature article"). See also U.S. Publication No.
20140370479A1 (U.S. Application 13/879,589), filed on Nov. 10, 2011, which is incorporated herein by reference. Some of those cognitive abilities include cognitive control abilities in the areas of attention (selectivity, sustainability, etc.), working memory (capacity and the quality of information maintenance in working memory) and goal management (ability to effectively parallel process two attention-demanding tasks or to switch tasks). As an example, children diagnosed with ADHD (attention deficit hyperactivity disorder) exhibit difficulties in sustaining attention.
Attention selectivity was found to depend on neural processes involved in ignoring goal-irrelevant information and on processes that facilitate the focus on goal-relevant information. The publications report neural data showing that when two objects are simultaneously placed in view, focusing attention on one can pull visual processing resources away from the other. Studies were also reported showing that memory depended more on effectively ignoring distractions, and the ability to maintain information in mind is vulnerable to interference by both distraction and interruption. Interference by distraction can be, e.g., an interference that is a non-target, that distracts the individual's attention from the primary task, but that the instructions indicate the individual is not to respond to. Interference by interruption/interruptor can be, e.g., an interference that is a target or two or more targets, that also distracts the individual's attention from the primary task, but that the instructions indicate the individual is to respond to (e.g., for a single target) or choose between/among (e.g., a forced-choice situation where the individual decides between differing degrees of a feature).
[0190] There were also fMRI results reported showing that diminished memory recall in the presence of a distraction can be associated with a disruption of a neural network involving the prefrontal cortex, the visual cortex, and the hippocampus (involved in memory consolidation). Prefrontal cortex networks (which play a role in selective attention) can be vulnerable to disruption by distraction. The publications also report that goal management, which requires cognitive control in the areas of working memory or selective attention, can be impacted by a secondary goal that also demands cognitive control. The publications also reported data indicating beneficial effects of interference processing as an intervention with effects on an individual's cognitive abilities, including to diminish the detrimental effects of distractions and interruptions. The publications described cost measures that can be computed (including an interference cost) to quantify the individual's performance, including to assess single-tasking or multitasking performance.
[0191] An example cost measure disclosed in the publications is the percentage change in an individual's performance at a single-tasking task as compared to a multi-tasking task, such that greater cost (that is, a more negative percentage cost) indicates increased interference when an individual is engaged in multi-tasking as compared to single-tasking.
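A short sketch of the cost measure described in the preceding paragraph, expressing multi-tasking performance relative to single-tasking performance as a percentage change. The function name is an assumption; the sign convention follows the text, with a more negative value indicating a greater interference cost.

```python
def interference_cost(single_task_score, multi_task_score):
    """Percentage change in performance from single-tasking to multi-tasking.
    Returns a negative percentage when multi-tasking performance is worse."""
    if single_task_score == 0:
        raise ValueError("single-task score must be nonzero")
    return 100.0 * (multi_task_score - single_task_score) / single_task_score

# Example: a score of 90 when single-tasking and 72 when multi-tasking
# corresponds to an interference cost of -20%.
print(interference_cost(90, 72))
```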
[0192] The tangible benefits of computer-implemented interference processing are also reported. For example, the Nature paper states that multi-tasking performance assessed using computer-implemented interference processing was able to quantify a linear age-related decline in performance in adults from 20 to 79 years of age. The Nature paper also reports that older adults (60 to 85 years old) who interacted with an adaptive form of the computer-implemented interference processing exhibited reduced multitasking costs, with the gains persisting for six (6) months. The Nature paper also reported that age-related deficits in neural signatures of cognitive control, as measured with electroencephalography, were remediated by the multitasking training (using the computer-implemented interference processing), with enhanced midline frontal theta power and frontal–posterior theta coherence. Interacting with the computer-implemented interference processing resulted in performance benefits that extended to untrained cognitive control abilities (enhanced sustained attention and working memory), with an increase in midline frontal theta power predicting a boost in sustained attention and preservation of multitasking improvement six (6) months later.
[0193] The example systems, methods, and apparatus according to the principles herein are configured to classify an individual as to cognitive abilities and/or to enhance those cognitive abilities based on implementation of interference processing using a computerized cognitive platform. The example systems, methods, and apparatus are configured to implement a form of multi-tasking using the capabilities of a programmed computing device, where an individual is required to perform a task and an interference substantially simultaneously, and the sensing and measurement capabilities of the computing device are configured to collect data indicative of the physical actions taken by the individual during the response execution time to respond to the task at substantially the same time as the computing device collects the data indicative of the physical actions taken by the individual to respond to the interference. The capabilities of the computing devices and programmed processing units to render the task and/or the interference in real time to a user interface, and to measure the data indicative of the individual's responses to the task and/or the interference in real time and substantially simultaneously can provide quantifiable measures of an individual's cognitive capabilities to rapidly switch to and from different tasks and interferences or to perform multiple, different, tasks or interferences in a row (including for single-tasking, where the individual is required to perform a single type of task for a set period of time).
[0194] In any example herein, the task and/or interference includes a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from the individual interacting with the apparatus or computing device. For example, the period of time that an individual is required to interact with a computing device or other apparatus to perform a task and/or an interference can be a predetermined amount of time, such as but not limited to about 30 seconds, about 1 minute, about 4 minutes, about 7 minutes, about 10 minutes, or greater than 10 minutes.
[0195] The example systems, methods, and apparatus can be configured to implement a form of multi-tasking to provide measures of the individual's capabilities in deciding whether to perform one action instead of another and to activate the rules of the current task in the presence of an interference such that the interference diverts the individual's attention from the task, as a measure of an individual's cognitive abilities in executive function control.
[0196] The example systems, methods, and apparatus can be configured to implement a form of single-tasking, where measures of the individual's performance at interacting with a single type of task (i.e., with no interference) for a set period of time (such as but not limited to a navigation task only or a target discrimination task only) can also be used to provide a measure of an individual's cognitive abilities.
[0197] The example systems, methods, and apparatus can be configured to implement sessions that involve differing sequences and combinations of single-tasking and multi-tasking trials. In a first example implementation, a session can include a first single-tasking trial (with a first type of task), a second single-tasking trial (with a second type of task), and a multi-tasking trial (a primary task rendered with an interference). In a second example implementation, a session can include two or more multi-tasking trials (a primary task rendered with an interference). In a third example implementation, a session can include two or more single-tasking trials (all based on the same type of tasks or at least one being based on a different type of task).
[0198] The performance can be further analyzed to compare the effects of two different types of interference (e.g. distraction or interruptor) on the performances of the various tasks. Some comparisons can include performance without interference, performance with distraction, and performance with interruption. The cost of each type of interference (e.g. distraction cost and interruptor/multi-tasking cost) on the performance level of a task is analyzed and reported to the individual.
[0199] In any example herein, the interference can be a secondary task that includes a stimulus that is either a non-target (as a distraction) or a target (as an interruptor), or a stimulus that is differing types of targets (e.g., differing degrees of a facial expression or other characteristic/feature difference).
[0200] Based on the capability of a programmed processing unit to control the effecting of multiple separate sources (including sensors and other measurement components) and the receiving of data selectively from these multiple different sources substantially simultaneously (i.e., at roughly the same time or within a short time interval) and in real time, the example systems, methods, and apparatus herein can be used to collect quantitative measures of the responses from an individual to the task and/or interference that could not be achieved using normal human capabilities. As a result, the example systems, methods, and apparatus herein can be configured to implement a programmed processing unit to render the interference substantially simultaneously with the task over certain time periods.
[0201] In some example implementations, the example systems, methods, and apparatus herein also can be configured to receive the data indicative of the measure of the degree and type of the individual's response to the task substantially simultaneously as the data indicative of the measure of the degree and type of the individual's response to the interference is collected (whether the interference includes a target or a non-target). In some examples, the example systems, methods, and apparatus are configured to perform the analysis by applying scoring or weighting factors to the measured data indicative of the individual's response to a non-target that differ from the scoring or weighting factors applied to the measured data indicative of the individual's response to a target, in order to compute an interference cost.
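A minimal sketch of applying different scoring weights to responses to targets versus non-targets when computing an aggregate interference measure, as described above. The weight values and the response-record format (a list of (is_target, correct) tuples) are illustrative assumptions.

```python
def weighted_interference_score(responses, target_weight=1.0,
                                nontarget_weight=1.5):
    """Aggregate score over interference responses, weighting errors on
    non-targets (false alarms) more heavily than errors on targets.
    `responses` is a list of (is_target, correct) tuples (assumed format)."""
    score = 0.0
    for is_target, correct in responses:
        weight = target_weight if is_target else nontarget_weight
        score += weight if correct else -weight
    return score

# Example: two correct target responses and one false alarm to a non-target.
print(weighted_interference_score([(True, True), (True, True), (False, False)]))
```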
[0202] In some example implementations, the example systems, methods, and apparatus herein also can be configured to selectively receive data indicative of the measure of the degree and type of the individual's response to an interference that includes a target stimulus (i.e., an interruptor) substantially simultaneously (i.e., at substantially the same time) as the data indicative of the measure of the degree and type of the individual's response to the task is collected and to selectively not collect the measure of the degree and type of the individual's response to an interference that includes a non-target stimulus (i.e., a distraction) substantially simultaneously (i.e., at substantially the same time) as the data indicative of the measure of the degree and type of the individual's response to the task is collected. That is, the example systems, methods, and apparatus are configured to discriminate between the windows of response of the individual to the target versus non-target by selectively controlling the state of the sensing/measurement components for measuring the response either temporally and/or spatially. This can be achieved by selectively activating or de-activating sensing/measurement components based on the presentation of a target or non-target, or by receiving the data measured for the individual's response to a target and selectively not receiving (e.g., disregarding, denying, or rejecting) the data measured for the individual's response to a non-target.
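The sketch below illustrates the selective-collection idea in the preceding paragraph: measurement samples for the interference are accepted only while a target (interruptor) is being rendered and are disregarded while a non-target (distraction) is on screen. The class and method names are hypothetical.

```python
class GatedResponseCollector:
    """Accept interference-response samples only while a target is shown,
    mirroring the selective measurement described above (illustrative)."""

    def __init__(self):
        self.current_stimulus_is_target = False
        self.samples = []

    def on_stimulus(self, is_target):
        # Called when the interference stimulus changes; effectively
        # activates or de-activates collection for this stimulus.
        self.current_stimulus_is_target = is_target

    def on_response_sample(self, sample):
        # Keep the sample for targets; disregard it for non-targets.
        if self.current_stimulus_is_target:
            self.samples.append(sample)

collector = GatedResponseCollector()
collector.on_stimulus(is_target=False)
collector.on_response_sample({"rt_ms": 310})   # disregarded (distraction)
collector.on_stimulus(is_target=True)
collector.on_response_sample({"rt_ms": 275})   # collected (interruptor)
print(collector.samples)
```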
[0203] As described herein, the example systems, methods, and apparatus herein can be implemented to provide a measure of the cognitive abilities of an individual in the area of attention, including based on capabilities for sustaining attention over time, selectivity of attention, and reduction of attention deficit. Other areas of an individual's cognitive abilities that can be measured using the example systems, methods, and apparatus herein include impulsivity, inhibition, perceptive abilities, reaction and other motor functions, visual acuity, long-term memory, working memory, short-term memory, logic, and decision-making.
[0204] As described herein, the example systems, methods, and apparatus herein can be implemented to adapt the tasks and/or interference, which is a critical design element for an effective plasticity-harnessing tool. Controlling the game elements (such as the timing, positioning, and nature of the stimuli) also makes it possible to record neural activity during game play and to understand what is changing in the brain in response to training.
[0205] FIGs. 10A – 15V show non-limiting example user interfaces that can be rendered using example systems, methods, and apparatus herein to render the tasks and/or interferences for user interactions. The non-limiting example user interfaces of FIGs. 10A – 15V also can be used for one or more of: to display instructions to the individual for performing the tasks and/or interferences, to collect the data indicative of the individual's responses to the tasks and/or the interferences, to show progress metrics, and to provide the analysis metrics.
[0206] FIGs. 10A – 10D show non-limiting example user interfaces rendered using example systems, methods, and apparatus herein. As shown in FIGs. 10A – 10B, an example programmed processing unit can be used to render to the user interfaces (including graphical user interfaces) display features 1000 for displaying instructions to the individual for performing the tasks and/or interferences, and metric features 1002 to show status indicators from progress metrics and/or results from application of analytics to the data collected from the individual's interactions (including the responses to tasks/interferences) to provide the analysis metrics. In any example systems, methods, and apparatus herein, the response classifier can be used to provide the analysis metrics provided as a response output. In any example systems, methods, and apparatus herein, the data collected from the user interactions can be used as input to train the response classifier. As shown in FIGs. 10A – 10B, an example programmed processing unit also may be used to render to the user interfaces (including graphical user interfaces) an avatar or other processor-rendered guide 1004 that an individual is required to control (such as but not limited to navigate a path or other environment in a visuo-motor task, and/or to select an object in a target discrimination task).
As shown in FIG. 10B, the display features 1000 can be used to instruct the individual what is expected to perform a navigation task while the user interface depicts (using the dashed line) the type of movement of the avatar or other processor-rendered guide required for performing the navigation task. As shown in FIG. 10C, the display features 1000 can be used to instruct the individual what is expected to perform a target discrimination task while the user interface depicts the type of object(s) 1006 and 1008 that may be rendered to the user interface, with one type of object 1006 designated as a target while the other type of object 1008 that may be rendered to the user interface is designated as a non-target (e.g., by being crossed out in this example). As shown in FIG. 10D, the display features 1000 can be used to instruct the individual what is expected to perform both a navigation task as a primary task and a target discrimination as a secondary task (i.e., an interference) while the user interface depicts (using the dashed line) the type of movement of the avatar or other processor-rendered guide 1004 required for performing the navigation task, and the user interface renders the object type designated as a target object 1006 and the object type designated as a non-target object 1008.
[0207] FIGs. 11A – 11D show examples of the features of object(s) (targets or non-targets) that can be rendered as time-varying characteristics to an example user interface, according to the principles herein. FIG. 11A shows an example where the modification to the time-varying characteristics of an aspect of the object 1100 rendered to the user interface is a dynamic change in position and/or speed of the object 1100 relative to the environment rendered in the graphical user interface. FIG. 11B shows an example where the modification to the time-varying characteristics of an aspect of the object 1102 rendered to the user interface is a dynamic change in size and/or direction of trajectory/motion, and/or orientation of the object 1102 relative to the environment rendered in the graphical user interface. FIG. 11C shows an example where the modification to the time-varying characteristics of an aspect of the object 1104 rendered to the user interface is a dynamic change in shape or other type of the object 1104 relative to the environment rendered in the graphical user interface (in this non-limiting example, from a star object to a round object). In this non-limiting example, the time-varying characteristic of object 1104 is effected using morphing from a first type of object (a star object) to a second type of object (a round object). In another non-limiting example, the time-varying characteristic of object 1104 is effected by rendering a blendshape as a proportionate combination of a first type of object and a second type of object. FIG. 11D shows an example where the modification to the time-varying characteristics of an aspect of the object 1106 rendered to the user interface is a dynamic change in pattern, or color, or visual feature of the object 1106 relative to the environment rendered in the graphical user interface (in this non-limiting example, from a star object having a first pattern to a round object having a second pattern). In another non-limiting example, the time-varying characteristic of an object can be a rate of change of a facial expression depicted on or relative to the object.
[0208] FIGs. 12A – 12T show a non-limiting example of the dynamics of tasks and interferences that can be rendered at user interfaces, according to the principles herein. In this example, the task is a visuo-motor navigation task, and the interference is target discrimination (as a secondary task). As shown in FIGs. 12D, 12I – 12K, and 12O – 12Q, the individual is required to perform the navigation task by controlling the motion of the avatar 1202 along a path that coincides with the milestone objects 1204. FIGs. 12A – 12T show a non-limiting example implementation where the individual is expected to actuate an apparatus or computing device (or other sensing device) to cause the avatar 1202 to coincide with the milestone object 1204 as the response in the navigation task, with scoring based on the success of the individual at crossing paths with (e.g., hitting) the milestone objects 1204. In another example, the individual is expected to actuate an apparatus or computing device (or other sensing device) to cause the avatar 1202 to miss the milestone object 1204, with scoring based on the success of the individual at avoiding the milestone objects 1204. FIGs. 12A – 12C show the dynamics of a target object 1206 (a star having a first type of pattern), where the time-varying characteristic is the trajectory of motion of the object. FIGs. 12E – 12H show the dynamics of a non-target object 1208 (a star having a second type of pattern), where the time-varying characteristic is the trajectory of motion of the object. FIGs. 12I – 12T show the dynamics of other portions of the navigation task, where the individual is expected to guide the avatar 1202 to cross paths with the milestone object 1204 in the absence of an interference (a secondary task).
[0209] In the example of FIGs. 12A – 12T, the processing unit of the example system, method, and apparatus is configured to receive data indicative of the individual's physical actions to cause the avatar 1202 to navigate the path. For example, the individual may be required to perform physical actions to "steer" the avatar, e.g., by changing the rotational orientation or otherwise moving a computing device. Such action can cause a gyroscope or accelerometer or other motion or position sensor device to detect the movement, thereby providing measurement data indicative of the individual's degree of success in performing the navigation task.
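A minimal sketch of turning the motion-sensor readings mentioned above into a steering input for the avatar in the navigation task. The roll-angle input and the update function are hypothetical; a real implementation would read the platform's gyroscope/accelerometer API and drive the rendered avatar.

```python
def steering_from_roll(roll_degrees, max_roll=45.0):
    """Map the device roll angle to a steering value in [-1, 1].
    Tilting left steers left, tilting right steers right (assumed mapping)."""
    clamped = max(-max_roll, min(max_roll, roll_degrees))
    return clamped / max_roll

def update_avatar(avatar_x, roll_degrees, speed=4.0, dt=1 / 60):
    """Advance the avatar's lateral position for one frame of the
    navigation task based on the current steering input."""
    return avatar_x + steering_from_roll(roll_degrees) * speed * dt

# Example: with the device tilted 22.5 degrees to the right, the avatar
# drifts right at half the maximum lateral speed.
x = 0.0
for _ in range(60):          # one simulated second at 60 frames per second
    x = update_avatar(x, 22.5)
print(round(x, 3))
```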
[0210] In the example of FIGs. 12A – 12C and 12E – 12H, the processing unit of the example system, method, and apparatus is configured to receive data indicative of the individual's physical actions to perform the target discrimination task. For example, the individual may be instructed prior to a trial or other session to tap, or make other physical indication, in response to display of a target object 1206, and not to tap or make the physical indication in response to display of a non-target object 1208. In FIGs. 12A – 12C and 12E – 12H, the target discrimination task acts as an interference (i.e., a secondary task) to the primary navigation task, in an interference processing multi-tasking implementation. As described hereinabove, the example systems, methods, and apparatus can cause the processing unit to render a display feature (e.g., display feature 1000) to display the instructions to the individual as to the expected performance. As also described hereinabove, the processing unit of the example system, method, and apparatus can be configured to (i) receive the data indicative of the measure of the degree and type of the individual's response to the primary task substantially simultaneously as the data indicative of the measure of the degree and type of the individual's response to the interference is collected (whether the interference includes a target or a non-target), or (ii) selectively receive data indicative of the measure of the degree and type of the individual's response to an interference that includes a target stimulus (i.e., an interruptor) substantially simultaneously (i.e., at substantially the same time) as the data indicative of the measure of the degree and type of the individual's response to the task is collected, and selectively not collect the measure of the degree and type of the individual's response to an interference that includes a non-target stimulus (i.e., a distraction) substantially simultaneously (i.e., at substantially the same time) as the data indicative of the measure of the degree and type of the individual's response to the task is collected.
[0211] FIGs. 13A – 13D show another non-limiting example of the dynamics of tasks and interferences that can be rendered at user interfaces, according to the principles herein. In this example, the task is a visuo-motor navigation task, and the interference is target discrimination (as a secondary task), where an individual is required to perform physical actions to cause an avatar 1302 to navigate to cross paths with the milestone object 1304 as the primary task and interact with an object 1306 as target discrimination (interference as a secondary task). FIGs. 13A – 13D show an example of the type of reward 1308 that can be shown on the graphical user interface responsive to the individual's indication of selecting a target object. In this non-limiting example, the reward 1308 is a set of rings that are rendered near the target 1306 at substantially the time the individual makes the second response selecting the target. In a non-limiting example, the second response is made by a tap, or other physical action to a portion of the user interface based on the individual's decision to enter a response.
[0212] FIGs. 14A – 14D show examples of the dynamics of an instructions panel rendered at an example user interface. In this example, the processing unit causes the user interface to show the dynamics of movement of the instructions panel 1402 into view from the right side of the user interface. FIGs. 14A – 14D also show non-limiting examples of target objects 1404 and non-target objects 1406. In this non-limiting example, the target objects 1404 and non-target objects 1406 differ in color. As shown in FIG. 14D, the instructions panel 1402 can include a visual representation of the target object in addition to the written instructions to the individual.
[0213] FIGs. 15A – 15V show other examples of the dynamics of multi-tasking involving user interaction with an implementation of a navigation task, and with an interference. In this example, the task is a visuo-motor navigation task, and the interference is target discrimination (as a secondary task). The individual is instructed to perform the navigation task by controlling the motion of the avatar 1502 along a path that coincides with the milestone objects 1504. FIGs. 15A – 15V show a non-limiting example implementation where the individual is expected to actuate an apparatus or computing device (or other sensing device) to cause the avatar 1502 to coincide with the milestone object 1504 as the response in the navigation task, with scoring based on the success of the individual at hitting or otherwise crossing paths with the milestone objects 1504. In another example, the individual is expected to actuate an apparatus or computing device (or other sensing device) to cause the avatar 1502 to miss (i.e., not cross paths with) the milestone object 1504, with scoring based on the success of the individual at avoiding the milestone objects 1504. FIGs. 15A – 15V also show the dynamics of a target object 1506, where the time-varying characteristic is the trajectory of motion of the target object 1506. FIGs. 15A – 15V also show the dynamics of a non-target object 1508, where the time-varying characteristic is the trajectory of motion of the non-target object 1508.
[0214] FIGs. 15K – 15V show non-limiting examples of the types of rewards that may be rendered to an individual to signal a degree of success in interacting with the tasks and/or interference. In FIGs. 15K – 15R, a feature 1510 including the word "GOOD" is rendered near the avatar 1502 to signal to the individual that analysis of the data indicative of the individual's responses to the navigation task and target discrimination interference indicate satisfactory performance. FIG. 15V shows an example of a change in the type of rewards presented to the individual as another indication of satisfactory performance, including a change in the rendering of feature 1500 to display the word "GREAT", at least one modification to the avatar 1502 to symbolize excitement, such as but not limited to the rings 1504 or other active element and/or showing jet booster elements 1506 that become star-shaped. Many other types of reward elements can be used, and the rate and type of reward elements displayed can be changed and modulated as a time-varying element.
[0215] As described hereinabove, the example systems, methods, and apparatus herein are configured to apply a computational model of human decision-making to the received response data, based on the time-varying characteristics of the task and/or interference. The time-varying characteristics of the task and/or interference result in nonlinear accumulation of belief for an individual's decision making.
Accordingly, based on the response data from the individual's interaction with the task and/or interference, the processing unit can be configured to compute at least one response profile representative of the performance of the individual and to determine a decision boundary metric (such as but not limited to the response criterion) from the response profile. As also described hereinabove, the response classifier can be executed based on the computed values of the decision boundary metric (such as but not limited to the response criterion), to generate a classifier output indicative of the cognitive response capabilities of the individual.
[0216] In various examples, the degree of non-linearity of the accumulation of belief for an individual's decision making (i.e., as to whether to execute a response) can be modulated based on adjusting the time-varying characteristics of the task and/or interference. As a non-limiting example, where the time-varying characteristic is a trajectory, speed, orientation, or size of the object (target or non-target), the amount of information available to an individual to develop a belief (in order to make a decision as to whether to execute a response) can be made smaller initially, e.g., where the object is made more difficult to discriminate by being rendered as farther away or smaller, and can be made to increase at differing rates (nonlinearly) depending on how quickly more information is made available to the individual to develop belief (e.g., as the object is rendered to appear to get larger, change orientation, move slower, or move closer in the environment). Other non-limiting example time-varying characteristics of the task and/or interference that can be adjusted to modulate the degree of non-linearity of the accumulation of belief include one or more of a rate of change of a facial expression, at least one color of an object, the type of the object, a rate of morphing of a first type of object to change to a second type of object, and a proportionate amount of a first type of object and a second type of object that form a blendshape (e.g., where the second type of object is the target and the first type of object is a non-target).
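As one way to picture this, the following sketch simulates a drift-diffusion-style accumulator (one of the computational models of human decision-making referenced herein) whose drift rate ramps up over the trial, so that belief accumulates nonlinearly as the stimulus becomes easier to discriminate before the response deadline. All parameter values below are illustrative assumptions, not values taken from the specification.

    # Illustrative drift-diffusion simulation with a time-varying drift rate, intended
    # only to show how a stimulus that becomes easier to discriminate over time can
    # produce nonlinear belief accumulation under a response deadline.
    import random

    def simulate_trial(deadline=1.5, dt=0.005, bound=1.0, noise=1.0,
                       drift_start=0.2, drift_end=2.5, seed=None):
        """Accumulate evidence toward +bound (respond) or -bound (withhold).

        The drift rate ramps from drift_start to drift_end across the trial,
        mimicking an object that is rendered closer/larger as time passes.
        Returns (choice, reaction_time); choice is None if the deadline lapses.
        """
        rng = random.Random(seed)
        evidence, t = 0.0, 0.0
        while t < deadline:
            drift = drift_start + (drift_end - drift_start) * (t / deadline)
            evidence += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
            t += dt
            if evidence >= bound:
                return "respond", t
            if evidence <= -bound:
                return "withhold", t
        return None, deadline   # no decision before the response deadline

    if __name__ == "__main__":
        for i in range(5):
            choice, rt = simulate_trial(seed=i)
            print(f"choice={choice!r} rt={rt:.3f}s")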
[0217] As also described hereinabove, the programmed processing unit can be configured to execute processor-executable instructions to adapt the task and/or the interference to derive a modification in the computed response profile. Given that the response profile is computed based on the individual's response data (e.g., data based on a first response to the task and/or data based on a second response to the interference), a change in the computed response profile can be used as an indication of a change in the responses and performance measures of the individual. This in turn can provide an indication of a modification of the cognitive response capabilities of the individual.
[0218] As described hereinabove, adapting the tasks and/or interference based on the output from the response classifiers can provide for greater flexibility than adaptation based on the percent correct or D-Prime (d') signal detection metric of sensitivity to a target. That is, the interaction is adapted by modifying parameters of the tasks and/or interference to be rendered to the user interface in a subsequent trial or session of the individual's interaction based on the computed decision boundary metric (such as but not limited to the response criterion) and/or the output from a trained response classifier using response data from a previous trial or session. For example, if the decision boundary metric (such as but not limited to the response criterion) indicates a tendency of the individual to provide a first type of response of the two or more differing types of responses (Response A vs. Response B) to the task or the interference, the difficulty levels of the tasks and/or interference rendered at the user interface for user interaction for a subsequent level can be modified. The methodology for adapting the difficulty levels of the task and/or interference of a subsequent session based on the decision boundary metric (such as but not limited to the response criterion) computed for the individual's performance from a previous session can be optimized to modify an individual's first decision boundary metric (such as but not limited to the response criterion) (and first performance) indicative of first type of response strategy towards a second decision boundary metric (such as but not limited to the response criterion) (and second performance) indicative of a second type of response strategy.
[0219] As a non-limiting example, the difficulty level of a subsequent session of an implementation of an example system, method, and apparatus herein can be adapted to modify an individual's first decision boundary metric (such as but not limited to the response criterion) (and first performance) indicative of a more impulsive response strategy to a second decision boundary metric (such as but not limited to the response criterion) (and second performance) indicative of a more conservative response strategy, thereby seeking to promote less impulsive behavior in the individual.
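For illustration only, the sketch below computes d' and the response criterion c from a session's hit and false-alarm counts and then nudges a generic difficulty parameter when the criterion indicates a liberal (more impulsive) strategy. The rate correction, thresholds, and adaptation rule are assumptions, not the patented procedure.

    # Sketch (assumed rules): derive d' and the response criterion c from hit and
    # false-alarm counts, then raise difficulty when responding looks impulsive.
    from statistics import NormalDist

    def detection_metrics(hits, misses, false_alarms, correct_rejections):
        """Return (d_prime, criterion_c) with a simple correction for 0/1 rates."""
        z = NormalDist().inv_cdf
        n_signal = hits + misses
        n_noise = false_alarms + correct_rejections
        hit_rate = (hits + 0.5) / (n_signal + 1.0)          # log-linear correction
        fa_rate = (false_alarms + 0.5) / (n_noise + 1.0)
        d_prime = z(hit_rate) - z(fa_rate)
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))       # c < 0: liberal/impulsive
        return d_prime, criterion

    def adapt_difficulty(difficulty, criterion, target_c=0.0, step=0.1):
        """Raise difficulty when responding is impulsive (c below target), else ease it."""
        return difficulty + step if criterion < target_c else max(0.0, difficulty - step)

    d, c = detection_metrics(hits=42, misses=8, false_alarms=20, correct_rejections=30)
    print(f"d'={d:.2f}, c={c:.2f}, next difficulty={adapt_difficulty(1.0, c):.2f}")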
[0220] In a non-limiting example, the difficulty of a task and/or interference may be adapted with each different stimulus that is presented.
[0221] In another non-limiting example, the example system, method, and apparatus herein can be configured to adapt a difficulty level of a task and/or interference one or more times at fixed time intervals or on another set schedule, such as but not limited to every second, in 10-second intervals, every 30 seconds, or at frequencies of once per second, two times per second, or more (such as but not limited to 30 times per second).
[0222] In an example, the difficulty level of a task or interference can be adapted by changing the time-varying characteristics, such as but not limited to a speed of an object, a rate of change of a facial expression, a direction of trajectory of an object, a change of orientation of an object, at least one color of an object, a type of an object, or a size of an object, or changing a sequence or balance of presentation of a target stimulus versus a non-target stimulus.
[0223] In a non-limiting example of a visuo-motor task (a type of navigation task), one or more of navigation speed, shape of the course (changing frequency of turns, changing turning radius), and number or size of obstacles can be changed to modify the difficulty of a navigation game level, with the difficulty level increasing with increasing speed and/or increasing numbers and/or sizes of obstacles (milestone objects).
[0224] In a non-limiting example, the difficulty level of a task and/or interference of a subsequent level can also be changed in real-time as feedback, e.g., the difficulty of a subsequent level can be increased or decreased in relation to the data indicative of the performance of the task.
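A minimal sketch of such an adaptation loop follows; the 30-second cadence echoes one of the intervals mentioned above, while the specific parameters (object speed, target proportion) and update rules are illustrative assumptions.

    # Sketch (assumed update rules): adapt time-varying stimulus characteristics on a
    # fixed schedule based on recent performance.
    from dataclasses import dataclass

    @dataclass
    class StimulusParams:
        object_speed: float = 1.0       # relative speed of targets/non-targets
        target_proportion: float = 0.5  # fraction of stimuli that are targets

    def adapt_parameters(params: StimulusParams, recent_accuracy: float) -> StimulusParams:
        """Speed up and rebalance stimuli when performance is high; ease off when low."""
        if recent_accuracy > 0.85:
            return StimulusParams(min(3.0, params.object_speed * 1.1),
                                  max(0.2, params.target_proportion - 0.05))
        if recent_accuracy < 0.60:
            return StimulusParams(max(0.5, params.object_speed * 0.9),
                                  min(0.8, params.target_proportion + 0.05))
        return params

    params = StimulusParams()
    for accuracy in [0.9, 0.9, 0.55, 0.7]:      # one accuracy estimate per 30 s interval
        params = adapt_parameters(params, accuracy)
        print(params)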
[0225] FIGs. 16A – 16C show flowcharts of example methods, according to the principles herein. In any example, the method is executed based on execution of processor-executable instructions using a programmed processing unit.
[0226] FIG. 16A shows a method for generating a quantifier of cognitive skills in an individual using a machine learning response classifier, using a programmed processing unit. In block 1602, a task with an interference is rendered at a user interface, where the task and/or the interference is time-varying and has a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual. In block 1604, data indicative of a first response of an individual to the task and a second response of the individual to the interference is received. In block 1606, the data indicative of the first response and the second response are analyzed to compute at least one response profile representative of a performance of the individual. In block 1608, a decision boundary metric is determined from the response profile, where the decision boundary metric includes a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the task or the interference. In block 1610, a response classifier is executed based on the computed values of decision boundary metric, to generate a classifier output indicative of the cognitive response capabilities of the individual.
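The sketch below walks through blocks 1606 to 1610 in simplified form: a per-session response profile is reduced to a scalar decision boundary metric, which is then scored by a trained classifier (logistic regression, one of the classifier families contemplated herein). The feature definitions, the metric combination, and the training data are illustrative assumptions, not the patented analysis.

    # Sketch (assumed features and training data) of the FIG. 16A flow:
    # response profile -> decision boundary metric -> trained response classifier.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def response_profile(first_responses, second_responses):
        """Summarize a session: reaction time, response rate, and tracking error."""
        rt = np.mean([r["rt"] for r in second_responses if r["responded"]])
        response_rate = np.mean([r["responded"] for r in second_responses])
        tracking_error = np.mean([abs(r["error"]) for r in first_responses])
        return np.array([rt, response_rate, tracking_error])

    def decision_boundary_metric(profile):
        """Scalar tendency measure: higher = more liberal (quick, frequent responding)."""
        rt, response_rate, _ = profile
        return response_rate - rt          # illustrative combination, not the patented metric

    # Hypothetical training data: one metric per previously classified individual.
    train_metrics = np.array([[0.9], [0.7], [0.1], [-0.2], [0.8], [0.0]])
    train_labels = np.array([1, 1, 0, 0, 1, 0])   # 1 = impulsive profile, 0 = conservative
    clf = LogisticRegression().fit(train_metrics, train_labels)

    profile = response_profile(
        first_responses=[{"error": 0.05}, {"error": 0.12}],
        second_responses=[{"responded": True, "rt": 0.42}, {"responded": False, "rt": 0.0}],
    )
    metric = decision_boundary_metric(profile)
    print("P(impulsive) =", clf.predict_proba([[metric]])[0, 1])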
[0227] FIG. 16B shows a method for enhancing cognitive skills in an individual, using a programmed processing unit. In block 1612, a task with an interference is rendered at a user interface, where the task and/or the interference is time-varying and has a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual. In block 1614, data indicative of a first response of an individual to the task and a second response of the individual to the interference is received. In block 1616, the data indicative of the first response and the second response are analyzed to compute at least one response profile representative of a performance of the individual. In block 1618, a decision boundary metric is determined from the response profile, where the decision boundary metric includes a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the task or the interference. In block 1620, based on the computed first decision boundary metric, the task and/or the interference is adapted to derive a modification in the computed at least one decision boundary metric (such as but not limited to the response criterion) such that the first response and/or the second response is modified, thereby indicating a modification of the cognitive response capabilities of the individual.
[0228] FIG. 16C shows a method for enhancing cognitive skills in an individual, using a programmed processing unit. In block 1622, data indicative of one or more of an amount, concentration, or dose titration of a pharmaceutical agent, drug, or biologic being or to be administered to an individual is received. In block 1624, a task with an interference is rendered at a user interface, where the task and/or the interference is time-varying and has a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual. In block 1626, data indicative of a first response of an individual to the task and a second response of the individual to the interference, from a first session, is received. In block 1628, the data indicative of the first response and the second response is analyzed to compute a first response profile representative of a first performance of the individual.
In block 1630, a first decision boundary metric is determined based on the first response profile, where the first decision boundary metric includes a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference. In block 1632, based on the computed first decision boundary metric and the amount or concentration of a pharmaceutical agent, drug, or biologic, the task and/or the interference is adapted to generate a second session. In block 1634, the collected data indicative of the first response and the second response from the second session is analyzed to compute a second response profile and a second decision boundary metric representative of a second performance of the individual. In block 1636, based on the first decision boundary metric and second decision boundary metric, an output is generated to the user interface indicative of one or more of: (i) a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (ii) a recommended change in one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, or biologic, and (iii) a change in the individual's cognitive response capabilities, a recommended treatment regimen, or recommending or determining a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
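As an illustration of the comparison step in block 1636, the sketch below takes the two decision boundary metrics and the dose information and emits one of the indicated output types. The thresholds and messages are placeholder assumptions and are not clinical or regulatory guidance.

    # Sketch (assumed thresholds) of comparing two sessions' decision boundary metrics
    # together with dose data and building an output report.
    def session_comparison_output(first_metric, second_metric, dose_mg,
                                  adverse_shift=0.5, improvement=0.2):
        """Compare two sessions' decision boundary metrics and build a report dict."""
        change = second_metric - first_metric
        report = {"dose_mg": dose_mg, "metric_change": change}
        if abs(change) >= adverse_shift:
            report["flag"] = "possible adverse-event risk; review with clinician"
        elif change <= -improvement:
            report["flag"] = "shift toward a more conservative response strategy"
            report["suggestion"] = "consider maintaining the current dose titration"
        else:
            report["flag"] = "no clear change in cognitive response capabilities"
            report["suggestion"] = "consider adjunct behavioral therapy or a follow-up session"
        return report

    print(session_comparison_output(first_metric=0.35, second_metric=0.10, dose_mg=18))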
[0229] FIG. 17 is a block diagram of an example computing device 1710 that can be used as a computing component according to the principles herein. In any example herein, computing device 1710 can be configured as a console that receives user input to implement the computing component, including to apply the signal detection metrics in computer-implemented adaptive response-deadline procedures. For clarity, FIG. 17 also refers back to and provides greater detail regarding various elements of the example system of FIG. 1 and the example computing device of FIG. 2. The computing device 1710 can include one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing examples. The non-transitory computer-readable media can include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more flash drives), and the like. For example, memory 302 included in the computing device 1710 can store computer-readable and computer-executable instructions or software for performing the operations disclosed herein. For example, the memory 302 can store a software application 1740 which is configured to perform various of the disclosed operations (e.g., analyze cognitive platform measurement data and response data, apply signal detection metrics in adaptive response-deadline procedures, or perform a computation). The computing device 1710 also includes configurable and/or programmable processor 304 and an associated core 1714, and optionally, one or more additional configurable and/or programmable processing devices, e.g., processor(s) 1712' and associated core(s) 1714' (for example, in the case of computational devices having multiple processors/cores), for executing computer-readable and computer-executable instructions or software stored in the memory and other programs for controlling system hardware. Processor 304 and processor(s) 1712' can each be a single core processor or multiple core (1714 and 1714') processor.
[0230] Virtualization can be employed in the computing device 1710 so that infrastructure and resources in the console can be shared dynamically. A
virtual machine 1724 can be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines can also be used with one processor.
[0231] Memory 302 can include a computational device memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 302 can include other types of memory as well, or combinations thereof.
[0232] A user can interact with the computing device 1710 through a visual display unit 1728, such as a computer monitor, which can display one or more user interfaces (UI) 1730 that can be provided in accordance with example systems and methods.
The computing device 1710 can include other I/O devices for receiving input from a user, for example, a keyboard or any suitable multi-point touch interface 1718, a pointing device 1720 (e.g., a mouse). The keyboard 1718 and the pointing device 1720 can be coupled to the visual display unit 1728. The computing device 1710 can include other suitable conventional I/O peripherals.
[0233] The computing device 1710 can also include one or more storage devices 1734, such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that perform operations disclosed herein. Example storage device 1734 can also store one or more databases for storing any suitable information required to implement example systems and methods. The databases can be updated manually or automatically at any suitable time to add, delete, and/or update one or more items in the databases.
[0234] The computing device 1710 can include a network interface 1722 configured to interface via one or more network devices 1732 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above. The network interface 1722 can include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing device 1710 to any type of network capable of communication and performing the operations described herein.
Moreover, the computing device 1710 can be any computational device, such as a workstation, desktop computer, server, laptop, handheld computer, tablet computer, or other form of computing or telecommunications device that is capable of communication and that has sufficient processor power and memory capacity to perform the operations described herein.
[0235] The computing device 1710 can run any operating system 1726, such as any of the versions of the Microsoft Windows operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, or any other operating system capable of running on the console and performing the operations described herein. In some examples, the operating system 1726 can be run in native mode or emulated mode. In an example, the operating system 1726 can be run on one or more cloud machine instances.
[0236] Examples of the systems, methods and operations described herein can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more thereof. Examples of the systems, methods and operations described herein can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage medium for execution by, or to control the operation of, data processing apparatus. The program instructions can be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. A
computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
[0237] The operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
[0238] The term "data processing apparatus" or "computing device"
encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC
(application specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
[0239] A computer program (also known as a program, software, software application, script, application or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
[0240] The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatuses can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
[0241] Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both.
The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), for example. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
[0242] To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having a display device for displaying information to the user, and a keyboard and a pointing device, e.g., a mouse, a stylus, a touch screen, or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well. For example, feedback (i.e., output) provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
[0243] In some examples, a system, method or operation herein can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network ("LAN") and a wide area network ("WAN"), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
[0244] Example computing system 400 can include clients and servers. A
client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some embodiments, a server transmits data to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
Conclusion
[0245] While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular embodiments of the systems and methods described herein. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
[0246] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results.
[0247] In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
[0248] The above-described embodiments can be implemented in any of numerous ways. For example, some embodiments may be implemented using hardware, software or a combination thereof. When any aspect of an embodiment is implemented at least in part in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
[0250] In this respect, various aspects of the invention may be embodied at least in part as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, compact disks, optical disks, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium or non-transitory medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the technology discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present technology as discussed above.
[0251] The terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of the present technology as discussed above. Additionally, it should be appreciated that according to one aspect of this embodiment, one or more computer programs that when executed perform methods of the present technology need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present technology.
[0252] Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc.
that perform particular tasks or implement particular abstract data types.
Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.
[0253] Also, the technology described herein may be embodied as a method, of which at least one example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
[0254] All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
[0255] The indefinite articles "a" and "an," as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean "at least one."
[0256] The phrase "and/or," as used herein in the specification and in the claims, should be understood to mean "either or both" of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with "and/or" should be construed in the same fashion, i.e., "one or more" of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the "and/or"
clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to "A and/or B", when used in conjunction with open-ended language such as "comprising" can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
[0257] As used herein in the specification and in the claims, "or" should be understood to have the same meaning as "and/or" as defined above. For example, when separating items in a list, "or" or "and/or" shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as "only one of" or "exactly one of," or, when used in the claims, "consisting of," will refer to the inclusion of exactly one element of a number or list of elements. In general, the term "or" as used herein shall only be interpreted as indicating exclusive alternatives (i.e., "one or the other but not both") when preceded by terms of exclusivity, such as "either," "one of," "only one of," or "exactly one of."
"Consisting essentially of," when used in the claims, shall have its ordinary meaning as used in the field of patent law.
"Consisting essentially of," when used in the claims, shall have its ordinary meaning as used in the field of patent law.
[0258] As used herein in the specification and in the claims, the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified.
Thus, as a non-limiting example, "at least one of A and B" (or, equivalently, "at least one of A or B," or, equivalently, "at least one of A and/or B") can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
[0259] In the claims, as well as in the specification above, all transitional phrases such as "comprising," "including," "carrying," "having," "containing,"
"involving," "holding,"
"composed of," and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases "consisting of"
and "consisting essentially of" shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.
"involving," "holding,"
"composed of," and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases "consisting of"
and "consisting essentially of" shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.
Claims (108)
1. An apparatus for generating a quantifier of cognitive skills in an individual using a response classifier, said apparatus comprising:
a user interface;
a memory to store processor-executable instructions; and a processing unit communicatively coupled to the user interface and the memory, wherein upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to:
render a task with an interference at the user interface, one or more of the task and the interference being time-varying and having a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual; and the user interface being configured to measure data indicative of two or more differing types of responses to the task or to the interference;
receive data indicative of a first response of an individual to the task and a second response of the individual to the interference;
analyze the data indicative of the first response and the second response to compute at least one response profile representative of a performance of the individual;
determine a decision boundary metric from the response profile, the decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the task or the interference; and execute a response classifier based at least in part on the computed values of decision boundary metric, to generate a classifier output indicative of the cognitive response capabilities of the individual.
2. The apparatus of claim 1, further comprising at least one actuating component, wherein the processing unit further controls the actuating component to effect one or more of an auditory stimulus, a tactile stimulus, and a vibrational stimulus, and wherein the task and/or the interference comprises one or more of the auditory stimulus, the tactile stimulus, and the vibrational stimulus.
3. The apparatus of claim 1, wherein the response classifier comprises one or more of a linear/logistic regression, principal component analysis, a generalized linear mixed model, a random decision forest, a support vector machine, or an artificial neural network.
4. The apparatus of claim 1, wherein computing at least one response profile comprises generating at least one response profile that is an impulsive response profile or a conservative response profile.
5. The apparatus of claim 4, wherein the processing unit is further configured to execute instructions and perform: applying at least one adaptive procedure to modify the task and/or the interference, such that analysis of the data indicative of the first response and/or the second response indicates a modification of the first response profile.
6. The apparatus of claim 5, wherein the analysis of the data indicative of the first response and/or the second response indicates that the at least one response profile changes from an impulsive response profile to a conservative response profile.
7. The apparatus of claim 5, wherein the task or the interference comprises a response-deadline procedure having the response deadline; and wherein the at least one adaptive procedure modifies the response deadline to modify a performance characteristic of the individual to an impulsive response profile or a conservative response profile.
8. The apparatus of claim 1, wherein the processing unit is configured to control the user interface to modify a temporal length of the response window associated with the response-deadline procedure.
9. The apparatus of claim 1, wherein the processing unit is configured to control the user interface to modify a time-varying characteristic of an aspect of the task or the interference rendered to the user interface.
10. The apparatus of claim 9, wherein modifying the time-varying characteristic of an aspect of the task or the interference comprises adjusting a temporal length of the rendering of the task or interference at the user interface between two or more sessions of interactions of the individual.
11. The apparatus of claim 9, wherein modifying the time-varying characteristic comprises at least one of a change of a speed of an object, a change in a rate of change of a facial expression, a change in a direction of trajectory of an object, a change of orientation of an object, a change of at least one color of an object, a change of a type of an object, or a change of a size of an object.
12. The apparatus of claim 11, wherein the change in type of object is effected using morphing from a first type of object to a second type of object or rendering a blendshape as a proportionate combination of the first type of object and the second type of object.
13. The apparatus of claim 1, wherein the processing unit is further configured to compute as the classifier output parameters indicative of one or more of a bias sensitivity derived from the data indicative of the first response and the second response, a non-decision time sensitivity to parallel tasks, a belief accumulation sensitivity to parallel task demands, a reward rate sensitivity, or a response window estimation efficiency.
14. The apparatus of claim 1, wherein the processing unit is configured to control the user interface to render the task as a continuous visuo-motor tracking task.
15. The apparatus of claim 1, wherein the processing unit is configured to control the user interface to render the interference as a target discrimination interference.
16. The apparatus of claim 1, wherein the processing unit is configured to render the task with the interference by configuring the user interface to:
render the task in the presence of the interference such that the interference diverts the individual's attention from the task, in which the interference is selected from a group consisting of a distraction and an interruptor.
17. The apparatus of claim 16, wherein the processing unit is configured to receive data indicative of the first response and the second response by configuring the user interface to:
(i) receive the second response to the interference at substantially the same time as the user interface receives the first response to the task; or (ii) receive the second response to the interference that is an interruptor at substantially the same time as the user interface receives the first response to the task and not receive the second response to the interference that is a distraction at substantially the same time that the user interface receives the first response to the task.
18. The apparatus of claim 1, wherein the response classifier is trained based at least in part on feedback data from a computational model of human decision-making.
19. The apparatus of claim 18, wherein the computational model of human decision-making is a drift-diffusion model.
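Claims 18 and 19 recite training feedback from a drift-diffusion model of human decision-making. The sketch below simulates that model in its textbook form (noisy evidence accumulation toward one of two boundaries); the drift rate, boundary separation, non-decision time, and time step are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np

def simulate_ddm(drift=0.3, boundary=1.0, non_decision_s=0.3,
                 dt=0.001, noise_sd=1.0, rng=None):
    """Simulate one drift-diffusion trial.

    Returns (choice, reaction_time_s): choice is +1 if the upper boundary is
    reached first (e.g. a "respond" decision), -1 for the lower boundary.
    All default parameter values are hypothetical.
    """
    rng = rng or np.random.default_rng()
    evidence, t = 0.0, 0.0
    while abs(evidence) < boundary:
        evidence += drift * dt + noise_sd * np.sqrt(dt) * rng.normal()
        t += dt
    return (1 if evidence >= boundary else -1), t + non_decision_s

choices, rts = zip(*(simulate_ddm(rng=np.random.default_rng(i)) for i in range(200)))
print(f"P(upper boundary) = {choices.count(1) / len(choices):.2f}, "
      f"mean RT = {np.mean(rts):.2f} s")
```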
20. The apparatus of claim 1, wherein the response classifier is trained using a plurality of training datasets, each training dataset corresponding to a previously classified individual of a plurality of individuals, and each training dataset comprising data indicative of the first response of the classified individual to the task, data indicative of the second response of the classified individual to the interference, and one or more of (i) data indicative of a performance of the classified individual at one or more of a cognitive test or a behavioral test, and (ii) data indicative of a diagnosis of a status or progression of a cognitive condition, a disease or an executive function disorder of the classified individual.
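A sketch of how one training dataset of claim 20 might be organized in code, grouping the first-response data, second-response data, and the optional test-performance and diagnosis labels for a previously classified individual; the field names, types, and example values are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TrainingRecord:
    """One training dataset for a previously classified individual (hypothetical schema)."""
    first_response: List[float]               # e.g. per-trial measures of the response to the task
    second_response: List[float]              # e.g. per-trial measures of the response to the interference
    test_performance: Optional[dict] = None   # cognitive or behavioral test scores, if available
    diagnosis: Optional[str] = None           # status or progression label, if available

record = TrainingRecord(
    first_response=[0.62, 0.55, 0.71],
    second_response=[0.48, 0.52, 0.50],
    test_performance={"sustained_attention": 0.8},
    diagnosis="ADHD",
)
print(record.diagnosis)
```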
21. The apparatus of claim 20, wherein the cognitive test or behavioral test comprises at least one of a cognitive assessment test, a cognitive development test, a test for sustained attention, a test for selective attention, a test for impulsivity, a test for perceptive abilities, reaction and other motor functions, a test for visual acuity, a test for long-term memory, a test for working memory, a test for short-term memory, a test for logic, or a test for decision-making.
22. The apparatus of claim 1, wherein the classifier output comprises an indication of a degree of impulsiveness or conservativeness of the individual's cognitive response capabilities.
23. The apparatus of claim 1, wherein the processing unit is configured to transmit the classifier output to the user and/or display the classifier output on the user interface.
24. The apparatus of claim 1, wherein the response classifier serves as an intelligent proxy for subsequent measures of cognitive capabilities.
25. The apparatus of claim 1, wherein the classifier output comprises a measure of attention deficit or impulsivity of the individual.
26. The apparatus of claim 1, wherein the processing unit is further configured to use the classifier output for cognitive monitoring of one or more of a cognitive condition, a disease, or an executive function disorder.
27. The apparatus of claim 1, wherein the processing unit is further configured to use the classifier output for monitoring of the individual's treatment regimen for one or more of the cognitive condition, the disease, or the executive function disorder.
28. The apparatus of claim 26 or 27, wherein the cognitive condition, disease, or executive function disorder is selected from the group consisting of dementia, Parkinson's disease, cerebral amyloid angiopathy, familial amyloid neuropathy, Huntington's disease or other neurodegenerative condition, autism spectrum disorder (ASD), presence of the 16p11.2 duplication, attention deficit hyperactivity disorder (ADHD), sensory-processing disorder (SPD), mild cognitive impairment (MCI), Alzheimer's disease, multiple sclerosis, schizophrenia, major depressive disorder (MDD), and anxiety.
29. The apparatus of claim 1, wherein the apparatus comprises one or more sensor components, and wherein the processing unit is configured to control the one or more sensor components to receive the data indicative of the first response and the second response.
30. The apparatus of claim 29, wherein the one or more sensor components comprises at least one of a gyroscope, an accelerometer, a motion sensor, a position sensor, a pressure sensor, an optical sensor, a video camera, an auditory sensor, or a vibrational sensor.
31. The apparatus of claim 1, wherein the processing unit is further configured to use the classifier output for one or more of (i) changing one or more of a recommended amount, concentration, or dose titration of the pharmaceutical agent, drug, or biologic, (ii) identifying a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (iii) identifying a change in the individual's cognitive response capabilities, (iv) recommending a treatment regimen, (v) recommending at least one of a behavioral therapy, counseling, or physical exercise, or (vi) determining a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
32. The apparatus of any one of claims 1 – 30, wherein the decision boundary metric is a response criterion.
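Claim 32 characterizes the decision boundary metric as a response criterion. A common signal detection estimate of that criterion (together with sensitivity d') from hit and false-alarm counts is sketched below; the trial counts are invented, and the log-linear correction is one conventional choice among several, not a method taken from the disclosure.

```python
from scipy.stats import norm

def signal_detection_metrics(hits, misses, false_alarms, correct_rejections):
    """Estimate d' and response criterion c from trial counts.

    A log-linear correction (add 0.5 to each cell) keeps the z-transform
    finite when a rate is 0 or 1. All counts here are illustrative only.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -0.5 * (z_hit + z_fa)   # negative = liberal (impulsive), positive = conservative
    return d_prime, criterion

d, c = signal_detection_metrics(hits=40, misses=10, false_alarms=15, correct_rejections=35)
print(f"d' = {d:.2f}, criterion c = {c:.2f}")
```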
33. A system comprising one or more physiological components and the apparatus of any one of claims 1 – 32, wherein upon execution of the processor-executable instructions by the processing unit, the processing unit:
receives data indicative of one or more measurements of the physiological component; and executes the response classifier based at least in part on the computed values of the decision boundary metric and the data indicative of one or more measurements of the physiological component, to generate the classifier output.
34. A computer-implemented method for generating a quantifier of cognitive skills in an individual using a response classifier, said method comprising:
rendering, using at least one processing unit, a task with an interference at a user interface;
measuring data indicative of two or more differing types of responses to the task or to the interference;
receiving data indicative of a first response of an individual to the task and a second response of the individual to the interference;
analyzing, using the at least one processing unit, the data indicative of the first response and the second response to compute at least one response profile representative of the performance of the individual;
determining, using the at least one processing unit, a decision boundary metric from the response profile, the decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference; and executing a response classifier based at least in part on the decision boundary metric, to generate a classifier output indicative of the individual's cognitive response capabilities.
35. The method of claim 34, wherein the processing unit is further configured to control at least one actuating component to effect one or more of an auditory stimulus, a tactile stimulus, and a vibrational stimulus, and wherein the task and/or the interference comprises one or more of the auditory stimulus, the tactile stimulus, and the vibrational stimulus.
36. The method of claim 34, wherein the response classifier comprises one or more of a linear/logistic regression, principal component analysis, a generalized linear mixed model, a random decision forest, a support vector machine, or an artificial neural network.
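Claim 36 lists several classifier families. As one illustrative sketch, a logistic regression response classifier could be trained on per-individual feature vectors (for example a decision boundary metric plus summary response statistics); the feature layout, labels, and values below are invented for illustration and are not drawn from the specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per individual:
# [decision boundary metric, mean reaction time (s), interference accuracy]
X_train = np.array([
    [-0.40, 0.52, 0.61],
    [-0.25, 0.58, 0.66],
    [ 0.30, 0.74, 0.82],
    [ 0.45, 0.80, 0.88],
])
# Hypothetical labels from previously classified individuals
# (1 = impulsive response profile, 0 = conservative response profile).
y_train = np.array([1, 1, 0, 0])

classifier = LogisticRegression().fit(X_train, y_train)

new_individual = np.array([[-0.10, 0.60, 0.70]])
print(classifier.predict(new_individual))        # predicted profile label
print(classifier.predict_proba(new_individual))  # classifier output as class probabilities
```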
37. The method of claim 34, wherein computing at least one response profile comprises generating at least one response profile that is an impulsive response profile or a conservative response profile.
38. The method of claim 37, wherein the processing unit is configured to execute further instructions and perform: applying at least one adaptive procedure to modify the task and/or the interference, such that analysis of the data indicative of the first response and/or the second response indicates a modification of the first response profile.
39. The method of claim 38, wherein an analysis of received data collected from measurement of the first response and/or the second response to the modified task and/or the modified interference indicates that the at least one response profile changes from an impulsive response profile to a conservative response profile.
40. The method of claim 34, wherein the task or the interference comprises a response-deadline procedure having the response deadline; and wherein the at least one adaptive procedure modifies the response deadline to modify a performance characteristic of the individual to an impulsive response profile or a conservative response profile.
41. The method of claim 34, wherein the processing unit is configured to control the user interface to modify a temporal length of the response window associated with the response-deadline procedure.
42. The method of claim 34, wherein the processing unit is configured to control the user interface to modify a time-varying characteristic of an aspect of the task or the interference rendered to the user interface.
43. The method of claim 42, wherein modifying the time-varying characteristic of an aspect of the task or the interference comprises adjusting a temporal length of the rendering of the task or interference at the user interface between two or more sessions of interactions of the individual.
44. The method of claim 42, wherein modifying the time-varying characteristic comprises at least one of a change of a speed of an object, a change in a rate of change of a facial expression, a change in a direction of trajectory of an object, a change of orientation of an object, a change of at least one color of an object, a change of a type of an object, or a change of a size of an object.
45. The method of claim 44, wherein the change in type of object is effected using morphing from a first type of object to a second type of object or rendering a blendshape as a proportionate combination of the first type of object and the second type of object.
46. The method of claim 34, wherein the processing unit is further configured to compute as the classifier output parameters indicative of one or more of a bias sensitivity derived from the data indicative of the first response and the second response, a non-decision time sensitivity to parallel tasks, a belief accumulation sensitivity to parallel task demands, a reward rate sensitivity, or a response window estimation efficiency.
47. The method of claim 34, wherein the processing unit is configured to control the user interface to render the task as a continuous visuo-motor tracking task.
48. The method of claim 34, wherein the processing unit is configured to control the user interface to render the interference as a target discrimination interference.
49. The method of claim 34, wherein rendering the task with the interference comprises:
rendering the task in the presence of the interference such that the interference diverts the individual's attention from the task, the interference selected from the group consisting of a distraction and an interruptor.
50. The method of claim 49, wherein receiving data indicative of the first response and the second response comprises at least one of:
(i) receiving the second response to the interference at substantially the same time as receiving the first response to the task; or (ii) receiving the second response to the interference that is an interruptor at substantially the same time as receiving the first response to the task and not receiving the second response to the interference that is a distraction at substantially the same time as receiving the first response to the task.
51. The method of claim 34, wherein the response classifier is trained based at least in part on feedback data from a computational model of human decision-making.
52. The method of claim 51, wherein the computational model of human decision-making is a drift-diffusion model.
53. The method of claim 34, wherein the response classifier is trained using a plurality of training datasets, each training dataset corresponding to a previously classified individual of a plurality of individuals, and each training dataset comprising data indicative of the first response of the classified individual to the task, data indicative of the second response of the classified individual to the interference, and one or more of (i) data indicative of a performance of the classified individual at one or more of a cognitive test or a behavioral test, and (ii) data indicative of a diagnosis of a status or progression of a cognitive condition, a disease or an executive function disorder of the classified individual.
54. The method of claim 53, wherein the cognitive test or behavioral test comprises at least one of a cognitive assessment test, a cognitive development test, a test for sustained attention, a test for selective attention, a test for impulsivity, a test for perceptive abilities, reaction and other motor functions, a test for visual acuity, a test for long-term memory, a test for working memory, a test for short-term memory, a test for logic, or a test for decision-making.
55. The method of claim 34, wherein the classifier output comprises an indication of a degree of impulsiveness or conservativeness of the individual's cognitive response capabilities.
56. The method of claim 34, wherein the classifier output is transmitted to the user and/or displayed on the user interface.
57. The method of claim 34, wherein the response classifier serves as an intelligent proxy for subsequent measures of cognitive capabilities.
58. The method of claim 34, wherein the classifier output comprises a measure of attention deficit or impulsivity of the individual.
59. The method of claim 34, further comprising using the classifier output for cognitive monitoring of one or more of a cognitive condition, a disease, or an executive function disorder.
60. The method of claim 34, further comprising using the classifier output for monitoring of the individual's treatment regimen for one or more of the cognitive condition, the disease, or the executive function disorder.
61. The method of claim 59 or 60, wherein the cognitive condition, disease, or executive function disorder is selected from the group consisting of dementia, Parkinson's disease, cerebral amyloid angiopathy, familial amyloid neuropathy, Huntington's disease or other neurodegenerative condition, autism spectrum disorder (ASD), presence of the 16p11.2 duplication, attention deficit hyperactivity disorder (ADHD), sensory-processing disorder (SPD), mild cognitive impairment (MCI), Alzheimer's disease, multiple sclerosis, schizophrenia, major depressive disorder (MDD), and anxiety.
62. The method of claim 34, wherein receiving the data indicative of the first response and the second response comprises using one or more sensor components to receive the data indicative of the first response and the second response.
63. The method of claim 62, wherein the one or more sensor components comprises at least one of a gyroscope, an accelerometer, a motion sensor, a position sensor, a pressure sensor, an optical sensor, a video camera, an auditory sensor, or a vibrational sensor.
64. The method of claim 34, further comprising using the classifier output for one or more of changing one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, or biologic, identifying a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, identifying a change in the individual's cognitive response capabilities, recommending a treatment regimen, or recommending or determining a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
65. The method of any one of claims 34 – 64, wherein rendering the task with the interference comprises:
rendering the task in the presence of the interference such that the interference diverts the individual's attention from the task and is selected from the group consisting of a distraction and an interruptor.
66. The method of claim 65, wherein receiving data indicative of the first response and the second response comprises:
(i) receiving the second response to the interference at substantially the same time as receiving the first response to the task; or (ii) receiving the second response to the interference that is an interruptor at substantially the same time as receiving the first response to the task and not receiving the second response to the interference that is a distraction at substantially the same time as receiving the first response to the task.
67. The method of any one of claims 34 – 66, wherein the decision boundary metric comprises a response criterion.
68. A system comprising one or more physiological components and an apparatus configured to execute the method of any one of claims 34 – 66, wherein upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to:
receive data indicative of one or more measurements of the physiological component; and execute the response classifier based at least in part on the computed values of the decision boundary metric and the data indicative of one or more measurements of the physiological component, to generate the classifier output.
69. At least one non-transitory computer-readable medium for storing one or more computer-executable instructions which, when executed, cause a processing unit to execute the method of any one of claims 34 – 66.
70. An apparatus for enhancing cognitive skills in an individual, said apparatus comprising:
a user interface;
a memory to store processor-executable instructions; and a processing unit communicatively coupled to the user interface and the memory, wherein upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to:
render a primary task with an interference at the user interface, one or more of the task and the interference being time-varying and having a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual; and the user interface being configured to measure data indicative of two or more differing types of responses to the task or to the interference;
receive data indicative of a first response of an individual to the task and a second response of the individual to the interference;
analyze the data indicative of the first response and the second response to compute at least one response profile representative of a performance of the individual;
determine a first decision boundary metric based at least in part on the at least one response profile, the first decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference;
and based at least in part on the computed first decision boundary metric, adjust the task and/or the interference to derive a modification in the computed first decision boundary metric such that a further response to the task and/or a further response to the interference is modified as compared to an earlier response to the task and/or an earlier response to the interference, thereby indicating a modification of the cognitive response capabilities of the individual.
71. The apparatus of claim 70, wherein the indication of the modification of the cognitive response capabilities comprises a change in a measure of a degree of impulsiveness or conservativeness of the individual's cognitive response capabilities.
72. The apparatus of claim 70, wherein the indication of the modification of the cognitive response capabilities comprises a change in a measure of one or more of sustained attention, selective attention, attention deficit, impulsivity, inhibition, perceptive abilities, reaction and other motor functions, visual acuity, long-term memory, working memory, short-term memory, logic, and decision-making.
73. The apparatus of claim 70, wherein adapting the task and/or interference based at least in part on the first decision boundary metric comprises one or more of modifying the temporal length of the response window, modifying a type of reward or rate of presentation of rewards to the individual, and modifying a time-varying characteristic of the task and/or interference.
74. The apparatus of claim 73, wherein modifying the time-varying characteristic of an aspect of the task or the interference comprises adjusting a temporal length of the rendering of the task or interference at the user interface between two or more sessions of interactions of the individual.
75. The apparatus of claim 73, wherein modifying the time-varying characteristic comprises at least one of a change of a speed of an object, a change of a rate of change of a facial expression, a change in a direction of trajectory of an object, a change of orientation of an object, a change of at least one color of an object, a change of a type of an object, a change of a size of an object, or a change of a sequence or balance of presentation of a target stimulus versus a non-target stimulus.
76. The apparatus of claim 75, wherein the change in type of object is effected using morphing from a first type of object to a second type of object or rendering a blendshape as a proportionate combination of the first type of object and the second type of object.
77. The apparatus of claim 70, wherein the processing unit is configured to render the task with the interference by configuring the user interface to:
render the task in the presence of the interference such that the interference diverts the individual's attention from the task and is selected from the group consisting of a distraction and an interruptor.
78. The apparatus of claim 77, wherein the processing unit is configured to receive data indicative of the first response and the second response by configuring the user interface to:
(i) receive the second response to the interference at substantially the same time as the user interface receives the first response to the task; or (ii) receive the second response to the interference that is an interruptor at substantially the same time as the user interface receives the first response to the task and not receive the second response to the interference that is a distraction at substantially the same time that the user interface receives the first response to the task.
79. The apparatus of claim 70, wherein computing at least one response profile comprises generating at least one response profile that is an impulsive response profile or a conservative response profile.
80. The apparatus of claim 79, wherein the processing unit is configured to execute further instructions and perform: applying at least one adaptive procedure to modify the task and/or the interference, such that analysis of the data indicative of the first response and/or the second response indicates a modification of the first response profile.
81. The apparatus of claim 80, wherein an analysis of received data collected from measurement of the first response and/or the second response to the modified task and/or the modified interference indicates that the at least one response profile changes from an impulsive response profile to a conservative response profile.
82. The apparatus of claim 70, wherein the task or the interference comprises a response-deadline procedure having the response deadline; and wherein the at least one adaptive procedure modifies the response deadline to modify a performance characteristic of the individual to an impulsive response profile or a conservative response profile.
83. The apparatus of claim 70, wherein the processing unit is configured to control the user interface to modify a temporal length of the response window associated with the response-deadline procedure.
84. The apparatus of claim 70, wherein the processing unit is configured to control the user interface to modify a time-varying characteristic of an aspect of the task or the interference rendered to the user interface.
85. The apparatus of claim 84, wherein modifying the time-varying characteristic of an aspect of the task or the interference comprises adjusting a temporal length of the rendering of the task or interference at the user interface between two or more sessions of interactions of the individual.
86. The apparatus of claim 84, wherein modifying the time-varying characteristic comprises at least one of a change in a speed of an object, a change in a rate of change of a facial expression, a change in a direction of trajectory of an object, a change of orientation of an object, a change in at least one color of an object, a change in a type of an object, or a change in a size of an object.
87. The apparatus of claim 86, wherein the change in type of object is effected using morphing from a first type of object to a second type of object or rendering a blendshape as a proportionate combination of the first type of object and the second type of object.
88. The apparatus of any one of claims 70 – 87, wherein the decision boundary metric comprises a response criterion.
89. A computer-implemented method for enhancing cognitive skills in an individual, said method comprising:
rendering a task with an interference at a user interface;
measuring data indicative of two or more differing types of responses to the task or to the interference;
receiving data indicative of a first response of an individual to the task and a second response of the individual to the interference;
analyzing, using at least one processing unit, the data indicative of the first response and the second response to compute at least one response profile representative of the performance of the individual;
determining a first decision boundary metric based at least in part on the at least one response profile, the first decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference; and based at least in part on the computed first decision boundary metric, adapting the task and/or the interference to derive a modification in the computed first decision boundary metric such that the first response and/or the second response is modified, thereby indicating a modification of the cognitive response capabilities of the individual.
90. The method of claim 89, wherein computing at least one response profile comprises generating at least one response profile that is an impulsive response profile or a conservative response profile.
91. The method of claim 90, further comprising applying at least one adaptive procedure to modify the task and/or the interference, such that analysis of the data indicative of the first response and/or the second response indicates a modification of the first response profile.
92. The method of claim 91, wherein an analysis of received data collected from measurement of the first response and/or the second response to the modified task and/or the modified interference indicates that the at least one response profile changes from an impulsive response profile to a conservative response profile.
93. The method of claim 89, wherein the task or the interference comprises a response-deadline procedure having the response deadline; and wherein the at least one adaptive procedure modifies the response deadline to modify a performance characteristic of the individual to an impulsive response profile or a conservative response profile.
94. The method of claim 89, further comprising controlling the user interface to modify a temporal length of the response window associated with the response-deadline procedure.
95. The method of claim 89, further comprising controlling the user interface to modify a time-varying characteristic of an aspect of the task or the interference rendered to the user interface.
96. The method of claim 95, wherein modifying the time-varying characteristic of an aspect of the task or the interference comprises adjusting a temporal length of the rendering of the task or interference at the user interface between two or more sessions of interactions of the individual.
97. The method of claim 95, wherein modifying the time-varying characteristic comprises at least one of a change of a speed of an object, a change of a rate of change of a facial expression, a change of a direction of trajectory of an object, a change of orientation of an object, a change of at least one color of an object, a change of a type of an object, or a change of a size of an object.
98. The method of claim 97, wherein the change in type of object is effected using morphing from a first type of object to a second type of object or rendering a blendshape as a proportionate combination of the first type of object and the second type of object.
99. The method of any one of claims 89 – 98, wherein the decision boundary metric comprises a response criterion.
100. A system comprising one or more physiological components and an apparatus configured to execute the method of any one of claims 89 – 98, wherein upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to:
receive data indicative of one or more measurements of the physiological component; and adapt the task and/or the interference based at least in part on the computed first decision boundary metric and the data indicative of one or more measurements of the physiological component.
101. At least one non-transitory computer-readable medium for storing one or more computer-executable instructions which, when executed, cause a processing unit to execute the method of any one of claims 89 – 98.
102. An apparatus for enhancing cognitive skills in an individual, said apparatus comprising:
a user interface;
a memory to store processor-executable instructions; and a processing unit communicatively coupled to the user interface and the memory, wherein upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to:
receive data indicative of one or more of an amount, concentration, or dose titration of a pharmaceutical agent, drug, or biologic being or to be administered to an individual;
render a primary task with an interference at the user interface, one or more of the task and the interference being time-varying and having a response deadline, such that the user interface imposes a limited time period for receiving at least one type of response from an individual; and the user interface being configured to measure data indicative of two or more differing types of responses to the task or to the interference;
receive data indicative of a first response of an individual to the task and a second response of the individual to the interference, from a first session;
analyze the data indicative of the first response and the second response to compute a first response profile representative of a first performance of the individual;
determine a first decision boundary metric based at least in part on the first response profile, the first decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference;
based at least in part on the computed first decision boundary metric and the amount or concentration of a pharmaceutical agent, drug, or biologic, adapt the task and/or the interference to generate a second session;
analyze collected data indicative of the first response and the second response from the second session, to compute a second response profile and a second decision boundary metric representative of a second performance of the individual; and based at least in part on the first decision boundary metric and second decision boundary metric, generate an output to the user interface indicative of at least one of: (i) a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (ii) a recommended change in one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, or biologic, (iii) a change in the individual's cognitive response capabilities, (iv) a recommended treatment regimen, (v) a recommendation of at least one of a behavioral therapy, counseling, or physical exercise, or (vi) a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
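A minimal sketch of the final step of claim 102 (and of method claim 105 below), assuming the comparison of the first and second decision boundary metrics is reduced to a simple shift test whose outcome selects one of the enumerated outputs; the threshold value and the wording of the returned recommendations are illustrative assumptions, not values from the disclosure.

```python
def session_comparison_output(first_metric, second_metric, shift_threshold=0.2):
    """Compare decision boundary metrics from two sessions (hypothetical logic).

    A more negative criterion is read here as a shift toward impulsive
    responding; a more positive criterion as a shift toward conservative
    responding. Returns one illustrative output in the spirit of the
    enumerated items (i)-(vi) of claim 102.
    """
    shift = second_metric - first_metric
    if shift <= -shift_threshold:
        return ("Change in cognitive response capabilities: shift toward an "
                "impulsive profile; consider reviewing the dose titration.")
    if shift >= shift_threshold:
        return ("Change in cognitive response capabilities: shift toward a "
                "conservative profile; current regimen appears tolerated.")
    return "No material change between sessions; continue monitoring."

print(session_comparison_output(first_metric=-0.05, second_metric=-0.40))
```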
103. The apparatus of claim 102, wherein the change in the individual's cognitive response capabilities comprises a change in degree of impulsiveness or conservativeness of the individual's cognitive response strategy.
104. The apparatus of claim 102, wherein the decision boundary metric comprises a response criterion.
105. A computer-implemented method for enhancing cognitive skills in an individual, said method comprising:
receiving data indicative of one or more of an amount, concentration, or dose titration of a pharmaceutical agent, drug, or biologic being or to be administered to an individual;
rendering a task with an interference at a user interface;
measuring data indicative of two or more differing types of responses to the task or to the interference;
receiving data indicative of a first response of an individual to the task and a second response of the individual to the interference;
analyzing, using at least one processing unit, the data indicative of the first response and the second response to compute a first response profile representative of the performance of the individual;
determining a first decision boundary metric based at least in part on the first response profile, the first decision boundary metric comprising a quantitative measure of a tendency of the individual to provide at least one type of response of the two or more differing types of responses to the interference;
based at least in part on the computed first decision boundary metric and the amount or concentration of a pharmaceutical agent, drug, or biologic, adapting the task and/or the interference such that the first response profile is modified;
analyzing the collected data indicative of the first response and the second response to compute a second decision boundary metric representative of a second performance of the individual; and based at least in part on the first decision boundary metric and second decision boundary metric, generating an output to the user interface indicative of at least one of (i) a change in one or more of the amount, concentration, or dose titration of the pharmaceutical agent, drug, or biologic, (ii) a likelihood of the individual experiencing an adverse event in response to administration of the pharmaceutical agent, drug, or biologic, (iii) a change in the individual's cognitive response capabilities, (iv) a recommended treatment regimen, (v) a recommendation of at least one of a behavioral therapy, counseling, or physical exercise, or (vi) a degree of effectiveness of at least one of a behavioral therapy, counseling, or physical exercise.
106. The method of claim 105, wherein the decision boundary metric comprises a response criterion.
107. A system comprising one or more physiological components and an apparatus configured to execute the method of claim 105, wherein upon execution of the processor-executable instructions by the processing unit, the processing unit is configured to:
receive data indicative of one or more measurements of the physiological component; and adapt the task and/or the interference based at least in part on the computed first decision boundary metric and the data indicative of one or more measurements of the physiological component.
108. At least one non-transitory computer-readable medium for storing one or more computer-executable instructions which, when executed, cause a processing unit to execute the method of claim 105 or 106.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662364297P | 2016-07-19 | 2016-07-19 | |
US62/364,297 | 2016-07-19 | ||
US29/579,480 USD879133S1 (en) | 2016-09-30 | 2016-09-30 | Display screen or portion thereof with an animated graphical user interface |
US29/579,480 | 2016-09-30 | ||
PCT/US2017/042938 WO2018017767A1 (en) | 2016-07-19 | 2017-07-19 | Platforms to implement signal detection metrics in adaptive response-deadline procedures |
Publications (1)
Publication Number | Publication Date |
---|---|
CA3031251A1 true CA3031251A1 (en) | 2018-01-25 |
Family
ID=60992822
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA3031251A Pending CA3031251A1 (en) | 2016-07-19 | 2017-07-19 | Platforms to implement signal detection metrics in adaptive response-deadline procedures |
Country Status (6)
Country | Link |
---|---|
JP (2) | JP7267910B2 (en) |
KR (1) | KR102449377B1 (en) |
CN (1) | CN109996485B (en) |
AU (1) | AU2017299614A1 (en) |
CA (1) | CA3031251A1 (en) |
WO (1) | WO2018017767A1 (en) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11120158B2 (en) | 2018-04-13 | 2021-09-14 | Plaid Inc. | Secure permissioning of access to user accounts, including secure distribution of aggregated user account data |
CA3115994A1 (en) * | 2018-10-15 | 2020-04-23 | Akili Interactive Labs, Inc. | Cognitive platform for deriving effort metric for optimizing cognitive treatment |
CN109635917B (en) * | 2018-10-17 | 2020-08-25 | 北京大学 | Multi-agent cooperation decision and training method |
KR102248732B1 (en) * | 2019-06-27 | 2021-05-06 | (주)해피마인드 | System and method for classifying attention deficit hyperactivity and predicting therapeutic response and based on comprehensive attention test data |
CN110313924B (en) * | 2019-07-12 | 2022-05-17 | 中国科学院心理研究所 | Diamagnetization touch-free time estimation recording trigger |
WO2021033827A1 (en) * | 2019-08-22 | 2021-02-25 | 주식회사 프로젝트레인보우 | Developmental disability improvement system and method using deep learning module |
US11869005B2 (en) | 2019-09-17 | 2024-01-09 | Plaid Inc. | System and method linking to accounts using credential-less authentication |
US10722165B1 (en) * | 2019-09-30 | 2020-07-28 | BioMech Sensor LLC | Systems and methods for reaction measurement |
WO2021064726A1 (en) * | 2019-10-02 | 2021-04-08 | Feuerstein Learning And Thinking, Ltd. | Profile oriented cognitive improvement system and method |
US12026704B2 (en) | 2019-12-17 | 2024-07-02 | Plaid Inc. | System and method for assessing a digital interaction with a digital third party account service |
CN111260984B (en) * | 2020-01-20 | 2022-03-01 | 北京津发科技股份有限公司 | Multi-person cooperative cognitive ability training method and device and storage medium |
CA3189855A1 (en) | 2020-08-18 | 2022-02-24 | William Frederick Kiefer | System and method for managing user interaction flows within third party applications |
CN112137628B (en) * | 2020-09-10 | 2021-08-03 | 北京津发科技股份有限公司 | Three-dimensional space cognition evaluation and training method and system |
CN112241971A (en) * | 2020-09-30 | 2021-01-19 | 天津大学 | Method for measuring motion prediction capability by using entropy and eye movement data |
WO2022085327A1 (en) * | 2020-10-23 | 2022-04-28 | 株式会社島津製作所 | Brain function analysis method and brain function analysis system |
CN115120240B (en) * | 2022-08-30 | 2022-12-02 | 山东心法科技有限公司 | Sensitivity evaluation method, equipment and medium for special industry target perception skills |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2245568A4 (en) * | 2008-02-20 | 2012-12-05 | Univ Mcmaster | Expert system for determining patient treatment response |
WO2010045356A1 (en) * | 2008-10-14 | 2010-04-22 | Ohio University | Cognitive and linguistic assessment using eye tracking |
US20100292545A1 (en) * | 2009-05-14 | 2010-11-18 | Advanced Brain Monitoring, Inc. | Interactive psychophysiological profiler method and system |
JP5476137B2 (en) * | 2010-01-19 | 2014-04-23 | 株式会社日立製作所 | Human interface based on biological and brain function measurement |
CA2720892A1 (en) * | 2010-11-12 | 2012-05-12 | The Regents Of The University Of California | Enhancing cognition in the presence of distraction and/or interruption |
ES2831648T3 (en) * | 2010-11-24 | 2021-06-09 | Digital Artefacts Llc | Systems and methods for assessing cognitive function |
WO2013111746A1 (en) * | 2012-01-26 | 2013-08-01 | 独立行政法人国立精神・神経医療研究センター | Cognitive function testing system, cognitive function estimation system, cognitive function testing method, and cognitive function estimation method |
US9265458B2 (en) * | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
CA2949431C (en) * | 2014-05-21 | 2023-09-26 | Akili Interactive Labs, Inc. | Processor-implemented systems and methods for enhancing cognitive abilities by personalizing cognitive training regimens |
JP6234563B2 (en) * | 2014-05-22 | 2017-11-22 | 株式会社日立製作所 | Training system |
US20160125758A1 (en) * | 2014-10-29 | 2016-05-05 | Ohio University | Assessing cognitive function using a multi-touch device |
JP6013438B2 (en) | 2014-12-09 | 2016-10-25 | 株式会社Nttデータ・アイ | Brain disease diagnosis support system, brain disease diagnosis support method and program |
2017
- 2017-07-19 AU AU2017299614A patent/AU2017299614A1/en not_active Abandoned
- 2017-07-19 KR KR1020197004637A patent/KR102449377B1/en active IP Right Grant
- 2017-07-19 JP JP2019502690A patent/JP7267910B2/en active Active
- 2017-07-19 CN CN201780057404.6A patent/CN109996485B/en active Active
- 2017-07-19 WO PCT/US2017/042938 patent/WO2018017767A1/en unknown
- 2017-07-19 CA CA3031251A patent/CA3031251A1/en active Pending
2022
- 2022-06-14 JP JP2022095789A patent/JP2022153354A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
KR20190077305A (en) | 2019-07-03 |
CN109996485B (en) | 2022-06-21 |
CN109996485A (en) | 2019-07-09 |
AU2017299614A1 (en) | 2019-01-31 |
WO2018017767A1 (en) | 2018-01-25 |
JP2019528812A (en) | 2019-10-17 |
JP2022153354A (en) | 2022-10-12 |
KR102449377B1 (en) | 2022-09-30 |
JP7267910B2 (en) | 2023-05-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12016700B2 (en) | | Cognitive platform coupled with a physiological component |
JP7473338B2 (en) | | A cognitive platform that includes computerized evocative elements |
KR102449377B1 (en) | | Platforms for implementing signal detection metrics in adaptive response deadline procedures |
US11846964B2 (en) | | Cognitive platform including computerized elements |
US11839472B2 (en) | | Platforms to implement signal detection metrics in adaptive response-deadline procedures |
US20200380882A1 (en) | | Cognitive platform including computerized evocative elements in modes |
US20240081706A1 (en) | | Platforms to implement signal detection metrics in adaptive response-deadline procedures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | Effective date: 20220228 |