WO2023275975A1 - Cognitive function estimation device, cognitive function estimation method, and storage medium - Google Patents
Cognitive function estimation device, cognitive function estimation method, and storage medium
- Publication number
- WO2023275975A1 (PCT application PCT/JP2021/024506)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cognitive function
- subject
- state
- state information
- information
Classifications
- A61B5/4088—Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/112—Gait analysis
- A61B5/1128—Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- A61B5/167—Personality evaluation
- A61B5/4064—Evaluating the brain
- A61B5/4803—Speech analysis specially adapted for diagnostic purposes
- A61B5/486—Bio-feedback
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device
- A61B5/7278—Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
- A61B5/7282—Event detection, e.g. detecting unique waveforms indicative of a medical condition
- A61B5/746—Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
- G06V10/454—Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
- G06V40/174—Facial expression recognition
- G10L17/26—Recognition of special voice characteristics, e.g. for use in lie detectors; Recognition of animal voices
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
- G16H20/30—ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices; for individual health risk assessment
- A61B2560/0266—Operational features for monitoring or limiting apparatus function
- A61B2560/045—Modular apparatus with a separable interface unit, e.g. for communication
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
- A61B2576/02—Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
- A61B5/0022—Monitoring a patient using a global network, e.g. telephone networks, internet
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, specially adapted to be attached to or worn on the body surface
Definitions
- The present disclosure relates to the technical field of cognitive function estimation devices, cognitive function estimation methods, and storage media that perform processing related to the estimation of cognitive function.
- Patent Literature 1 discloses a cognitive function measuring device that calculates an evaluation value related to cognitive function based on gait data of a subject.
- Non-Patent Document 1 discloses a technique of testing a subject's cognitive function based on the subject's face data (in particular, gaze measurement information).
- Non-Patent Document 2 discloses a technique for determining whether or not a subject has dementia from a face image of the subject using a model based on deep learning.
- Non-Patent Document 3 compares gait in Alzheimer's disease and dementia with Lewy bodies, reporting that in dementia with Lewy bodies, asymmetry of step time and swing phase is more conspicuous than in Alzheimer's disease, and that step time and step length show larger variance. It is also generally known that in late-stage Alzheimer's disease, walking becomes slow and gait tendencies such as a forward-bent posture and lateral swaying appear. In dementia with Lewy bodies, gait tendencies such as shuffling, short strides, a forward-bent posture, and small arm swings are observed. In cerebrovascular dementia, gait tendencies such as short steps, a wide-based gait, and shuffling are observed.
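The gait indicators the cited study discusses (step-time asymmetry, and variance of step time and step length) can be computed from per-step measurements. The following is a minimal illustrative sketch, not taken from the patent or the cited literature; the function name, inputs, and formulas are assumptions for illustration.

```python
# Illustrative sketch: gait regularity features from per-step timings (seconds)
# and step lengths (meters). Not the patent's actual processing.
from statistics import mean, pvariance

def gait_features(left_step_times, right_step_times, step_lengths):
    """Summarize gait from per-step measurements of one walking bout."""
    # Asymmetry: relative difference between mean left and right step times.
    lm, rm = mean(left_step_times), mean(right_step_times)
    asymmetry = abs(lm - rm) / ((lm + rm) / 2)
    # Variability: variance of all step times and of step lengths.
    step_time_var = pvariance(left_step_times + right_step_times)
    step_length_var = pvariance(step_lengths)
    return {"asymmetry": asymmetry,
            "step_time_var": step_time_var,
            "step_length_var": step_length_var}

# Example: right steps consistently slower than left ones.
feats = gait_features([0.52, 0.55, 0.53], [0.61, 0.63, 0.60], [0.58, 0.55, 0.60])
```

Larger asymmetry and variance values would correspond to the gait tendencies the document associates with dementia with Lewy bodies.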
- One object of the present disclosure is to provide a cognitive function estimation device, a cognitive function estimation method, and a storage medium capable of accurately estimating cognitive function.
- One aspect of the cognitive function estimation device is a cognitive function estimation device comprising: first state information acquisition means for acquiring first state information representing a first state of a subject related to the subject's cognitive function; second state information acquisition means for acquiring second state information representing a second state of the subject whose state changes at longer intervals than the first state; and cognitive function estimation means for estimating the subject's cognitive function based on the first state information and the second state information.
- Another aspect of the cognitive function estimation device is a cognitive function estimation device comprising: acquisition means for acquiring face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and cognitive function estimation means for estimating the subject's cognitive function based on the face data and the gait data.
- One aspect of the cognitive function estimation method is a cognitive function estimation method in which a computer acquires first state information representing a first state of a subject related to the subject's cognitive function, acquires second state information representing a second state of the subject whose state changes at longer intervals than the first state, and estimates the subject's cognitive function based on the first state information and the second state information.
- Here, the "computer" includes any electronic device (and may be a processor included in the electronic device), and may be composed of a plurality of electronic devices.
- Another aspect of the cognitive function estimation method is a cognitive function estimation method in which a computer acquires face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state, and estimates the subject's cognitive function based on the face data and the gait data.
- One aspect of the storage medium is a storage medium storing a program for causing a computer to execute processing of acquiring first state information representing a first state of a subject related to the subject's cognitive function, acquiring second state information representing a second state of the subject whose state changes at longer intervals than the first state, and estimating the subject's cognitive function based on the first state information and the second state information.
- Another aspect of the storage medium is a storage medium storing a program for causing a computer to execute processing of acquiring face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state, and estimating the subject's cognitive function based on the face data and the gait data.
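The claimed flow (acquire first-state information with short change intervals, acquire second-state information with longer change intervals, and estimate a cognitive-function score) can be sketched as below. This is a hypothetical stand-in, assuming normalized 0-to-1 risk inputs and hand-picked linear weights; the patent's actual inference model is not specified here.

```python
# Hypothetical sketch of the claimed estimation flow. The feature names
# ("gait_slowness", "gaze_abnormality", "age_risk", "disease_risk") and the
# linear weights are assumptions, not the patent's model.
def estimate_cognitive_score(first_state, second_state):
    """Return an MMSE-like score (0-30) from the two feature dictionaries."""
    base = 30.0
    # First state: temporary, short-interval measurements (each scaled 0..1).
    base -= 10.0 * first_state.get("gait_slowness", 0.0)
    base -= 5.0 * first_state.get("gaze_abnormality", 0.0)
    # Second state: slowly changing factors (each scaled 0..1).
    base -= 3.0 * second_state.get("age_risk", 0.0)
    base -= 7.0 * second_state.get("disease_risk", 0.0)
    return max(0.0, min(30.0, base))

score = estimate_cognitive_score({"gait_slowness": 0.2, "gaze_abnormality": 0.1},
                                 {"age_risk": 0.5, "disease_risk": 0.0})
```

In the document's terms, the second-state features condition the interpretation of the temporary first-state measurements, which a learned model (e.g., a neural network) would do with trained rather than hand-picked weights.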
- FIG. 1 shows a schematic configuration of a cognitive function estimation system according to the first embodiment.
- FIG. 2 shows a hardware configuration of an information processing device.
- A schematic representation of factors affecting cognitive function.
- An example of functional blocks of the information processing device.
- A diagram showing a specific example of cognitive function estimation.
- An example of functional blocks of the cognitive function estimation device for learning of an inference model.
- An example of a flowchart showing a processing procedure for cognitive function estimation.
- A schematic configuration of a cognitive function estimation system according to the second embodiment.
- A block diagram of a cognitive function estimation device according to the third embodiment.
- FIG. 1 shows a schematic configuration of a cognitive function estimation system 100 according to the first embodiment.
- The cognitive function estimation system 100 estimates the subject's cognitive function with high accuracy without imposing an excessive measurement load on the subject, and presents the estimation result.
- The "subject" may be a person whose cognitive function is managed by an organization, or an individual user.
- The cognitive function estimation system 100 mainly includes a cognitive function estimation device 1, an input device 2, an output device 3, a storage device 4, and a sensor 5.
- The cognitive function estimation device 1 performs data communication with the input device 2, the output device 3, and the sensor 5 via a communication network or by direct wireless or wired communication. The cognitive function estimation device 1 then estimates the subject's cognitive function based on the input signal "S1" supplied from the input device 2, the sensor (detection) signal "S3" supplied from the sensor 5, and the information stored in the storage device 4. In doing so, the cognitive function estimation device 1 estimates the subject's cognitive function with high accuracy by considering not only the subject's temporary state (that is, a state with short-term change; also referred to as the "first state") but also a state that changes at longer intervals than the first state (also referred to as the "second state").
- As the estimation result, the cognitive function estimation device 1 uses a cognitive function score adopted in any neuropsychological test such as the MMSE (Mini-Mental State Examination), whose full score is 30 points. The higher the score, the higher (more normal) the cognitive function.
- The cognitive function estimation device 1 generates an output signal "S2" regarding the estimation result of the subject's cognitive function, and supplies the generated output signal S2 to the output device 3.
- The input device 2 is an interface that accepts manual input (external input) of information about each subject.
- The user who inputs information using the input device 2 may be the subject himself/herself, or a person who manages or supervises the subject's activities.
- The input device 2 may be, for example, any of various user input interfaces such as a touch panel, buttons, a keyboard, a mouse, and a voice input device.
- The input device 2 supplies the generated input signal S1 to the cognitive function estimation device 1.
- The output device 3 displays or outputs predetermined information based on the output signal S2 supplied from the cognitive function estimation device 1.
- The output device 3 is, for example, a display, a projector, a speaker, or the like.
- The sensor 5 measures the subject's biological signals and the like, and supplies the measured signals to the cognitive function estimation device 1 as a sensor signal S3.
- The sensor signal S3 may be any biological signal (including vital information) such as the subject's heartbeat, brain waves, pulse wave, perspiration (electrodermal activity), hormone secretion amount, cerebral blood flow, blood pressure, body temperature, myoelectric signal, respiration rate, or acceleration.
- The sensor 5 may be a device that analyzes blood collected from the subject and outputs a sensor signal S3 indicating the analysis result.
- The sensor 5 may also be a wearable terminal worn by the subject, a camera that photographs the subject, a microphone that generates an audio signal of the subject's speech, or a terminal such as a personal computer or a smartphone.
- The wearable terminal described above includes, for example, a GNSS (Global Navigation Satellite System) receiver, an acceleration sensor, and other sensors that detect biological signals, and outputs the output signal of each of these sensors as the sensor signal S3.
- The sensor 5 may supply the cognitive function estimation device 1 with information corresponding to the amount of operation of a personal computer, a smartphone, or the like as the sensor signal S3.
- The sensor 5 may output a sensor signal S3 representing biometric data of the subject (including sleep time) measured while the subject is sleeping.
- The storage device 4 is a memory that stores various information necessary for the processing executed by the cognitive function estimation device 1.
- The storage device 4 may be an external storage device such as a hard disk connected to or built into the cognitive function estimation device 1, or may be a storage medium such as a flash memory. The storage device 4 may also be a server device that performs data communication with the cognitive function estimation device 1, and may be composed of a plurality of devices.
- The storage device 4 functionally has a second state information storage unit 41 and a calculation information storage unit 42.
- The second state information storage unit 41 stores second state information, which is information about the subject's second state.
- The second state information includes, for example, disease information about the subject's diseases (including results of diagnosis by a doctor), lifestyle information about lifestyle habits, genetic information, and attribute information about various attributes of the subject (age, race, gender, occupation, interests, tastes, and/or personality).
- the second state information may be data converted into a data format that matches the input format of the model used by the cognitive function estimation device 1 in cognitive function estimation, which will be described later.
- the second state information is data obtained by performing feature extraction processing on the above-described disease information, lifestyle information, attribute information, etc., and is represented by a tensor in a predetermined format (for example, a feature vector).
- This feature extraction processing may be processing based on any feature extraction technology (including feature extraction technology based on learning using a neural network or the like).
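As an illustrative sketch only (the field names, category lists, and normalization constants below are assumptions, not part of the disclosure), such a feature extraction process could flatten a subject's attribute and lifestyle data into a fixed-format feature vector:

```python
# Hypothetical sketch: convert second state information (attributes,
# lifestyle answers) into a fixed-format feature vector. Category lists,
# field names, and normalization constants are assumptions.

AGE_MAX = 100.0
GENDERS = ["female", "male", "other"]
OCCUPATIONS = ["office", "manual", "retired", "other"]

def one_hot(value, categories):
    """Return a one-hot list for a categorical value."""
    return [1.0 if value == c else 0.0 for c in categories]

def second_state_vector(record):
    """Flatten a subject record into a feature vector of fixed format."""
    features = [min(record["age"], AGE_MAX) / AGE_MAX]   # normalized age
    features += one_hot(record["gender"], GENDERS)
    features += one_hot(record["occupation"], OCCUPATIONS)
    features += [record["sleep_hours"] / 12.0]           # lifestyle habit
    return features

vec = second_state_vector(
    {"age": 72, "gender": "female", "occupation": "retired", "sleep_hours": 6.0}
)
print(len(vec), vec[0])  # fixed dimensionality regardless of subject
```

The fixed dimensionality is what lets the resulting tensor match the input format of a downstream model.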
- Generation of the second state information may be performed before estimation of cognitive function and may be performed by the cognitive function estimation device 1 or may be performed by a device other than the cognitive function estimation device 1 .
- In the first example, the second state information is generated based on questionnaire results. For example, the Big Five questionnaire exists as a questionnaire for determining personality, and questionnaires regarding lifestyle habits also exist. Attribute information such as age, gender, occupation, and race may likewise be generated based on the questionnaire responses.
- In the second example, the second state information is generated by an image recognition technique using an image of the subject (for example, a technique for generating age information or race information of a person included in the image).
- the second state information may be information based on a measurement result obtained by continuously measuring the subject's first state, which is a temporary state, for a predetermined period (for example, one month or more).
- For example, statistical data obtained by applying arbitrary statistical analysis processing to the measurement results of the subject's first state, continuously measured over the predetermined period, is used as the second state information and stored in the second state information storage unit 41.
- the second state information generated in the third example corresponds to the subject's lifestyle information.
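The third example can be sketched as follows using the standard-library `statistics` module; the use of a daily drowsiness score and the particular statistics chosen are illustrative assumptions:

```python
# Hedged sketch of the third example: reduce one month of daily first-state
# measurements (e.g. a daily drowsiness score) to statistical second state
# information. The statistics chosen here are illustrative.
import statistics

def summarize_first_state(daily_scores):
    """Aggregate continuously measured first-state values into
    statistics usable as lifestyle-like second state information."""
    return {
        "mean": statistics.mean(daily_scores),
        "median": statistics.median(daily_scores),
        "stdev": statistics.pstdev(daily_scores),
    }

month = [0.2, 0.4, 0.3, 0.5, 0.1] * 6   # 30 days of drowsiness scores
summary = summarize_first_state(month)
print(round(summary["mean"], 2), summary["median"])
```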
- the calculation information storage unit 42 stores calculation information that is information used to calculate the estimation result (score) of cognitive function.
- the calculation information is information about a model for calculating the cognitive function score of the subject from the first state information and the second state information, which are information about the first state of the subject.
- the calculation information includes inference model information about an inference model that calculates a provisional cognitive function score for the subject from the first state information, and correction model information about a correction model that corrects the provisional score based on the second state information.
- the score obtained by correcting the provisional score calculated by the inference model by the correction model is the final cognitive function estimation result (score).
- the correction model in the first example may be a model in which the correction amount for correcting the provisional score changes continuously or stepwise according to the second state information.
- the correction model may be a lookup table showing combinations of assumed second state information and correction amounts to be applied, or it may be another computational model.
- the correction model may be a model that calculates a cognitive function score from the second status information and the provisional score.
- For example, the correction model increases the provisional score by a predetermined value or a predetermined rate when the second state information is classified as having a positive effect, and decreases the provisional score by a predetermined value or rate when it is classified as having a negative effect.
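A minimal sketch of such a table-driven correction model follows; the class labels and correction rates are assumptions for illustration:

```python
# Hypothetical lookup-table correction model: maps a classification of the
# second state information to a rate adjustment of the provisional score.
# Labels and rates below are assumed, not taken from the disclosure.

CORRECTION_TABLE = {
    "positive": +0.05,   # e.g. regular exercise habit -> raise score 5%
    "neutral":   0.00,
    "negative": -0.10,   # e.g. disease affecting gait -> lower score 10%
}

def correct_score(provisional_score, second_state_class):
    """Apply the table-driven correction to the provisional score."""
    rate = CORRECTION_TABLE[second_state_class]
    return provisional_score * (1.0 + rate)

print(correct_score(80.0, "negative"))  # 72.0
```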
- In the second example, the calculation information may include inference model information of an inference model trained to output an estimated cognitive function score with both the first state information and the second state information as input data.
- the inference model in the first example or the second example is, for example, a regression model (statistical model) or a machine learning model; in this case, the calculation information contains information about the parameters necessary to construct the model.
- When the inference model is a neural network, the calculation information contains information on various parameters such as the layer structure, the neuron structure of each layer, the number and size of filters in each layer, and the weight of each element of each filter.
- the inference model in the second example may be a formula or a lookup table for directly calculating an estimated cognitive function score from the first state information and the second state information.
- Similarly, the inference model in the first example (i.e., the model that outputs the provisional score from the first state information) may be a formula or a lookup table that directly calculates the provisional score from the first state information.
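A hedged sketch of a formula-based inference model of this kind; the weights and bias below are placeholders, not values from the disclosure:

```python
# Sketch, under assumed weights, of a formula-style inference model:
# provisional score = bias + w . x over first state features
# (e.g. gait speed, blink rate). All numeric values are placeholders.

WEIGHTS = [12.0, -8.0]   # per-feature coefficients (assumed)
BIAS = 70.0

def provisional_score(first_state_features):
    """Linear formula mapping first state features to a provisional score."""
    return BIAS + sum(w * x for w, x in zip(WEIGHTS, first_state_features))

print(provisional_score([1.0, 0.5]))  # 70 + 12 - 4 = 78.0
```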
- the configuration of the cognitive function estimation system 100 shown in FIG. 1 is an example, and various modifications may be made to the configuration.
- the input device 2 and the output device 3 may be integrally configured.
- the input device 2 and the output device 3 may be configured as a tablet terminal integrated with or separate from the cognitive function estimation device 1 .
- the input device 2 and the sensor 5 may be configured integrally.
- the cognitive function estimation device 1 may be composed of a plurality of devices. In this case, the plurality of devices constituting the cognitive function estimation device 1 exchange among themselves the information necessary for executing their pre-assigned processing; the cognitive function estimation device 1 then functions as a system.
- FIG. 2 shows the hardware configuration of the cognitive function estimation device 1.
- the cognitive function estimation device 1 includes a processor 11, a memory 12, and an interface 13 as hardware.
- Processor 11 , memory 12 and interface 13 are connected via data bus 10 .
- the processor 11 functions as a controller (arithmetic device) that controls the entire cognitive function estimation device 1 by executing a program stored in the memory 12 .
- the processor 11 is, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit).
- Processor 11 may be composed of a plurality of processors.
- Processor 11 is an example of a computer.
- the memory 12 is composed of various volatile and nonvolatile memories such as RAM (Random Access Memory), ROM (Read Only Memory), and flash memory.
- the memory 12 stores a program for executing the process executed by the cognitive function estimation device 1 .
- Part of the information stored in the memory 12 may be stored in one or more external storage devices that can communicate with the cognitive function estimation device 1, or in a storage medium that is detachable from the cognitive function estimation device 1.
- the interface 13 is an interface for electrically connecting the cognitive function estimation device 1 and other devices.
- The interface 13 may be a wireless interface, such as a network adapter, for wirelessly transmitting and receiving data to and from other devices, or a hardware interface for connecting to other devices via cables or the like.
- the hardware configuration of the cognitive function estimation device 1 is not limited to the configuration shown in FIG.
- the cognitive function estimation device 1 may include at least one of the input device 2 and the output device 3 .
- the cognitive function estimation device 1 may be connected to or built in a sound output device such as a speaker.
- FIG. 3 is a diagram schematically showing factors affecting cognitive function.
- the subject's cognitive function is affected by: a) the subject's temporary state; b) the subject's characteristics; c) the subject's personality; d) biological changes in the subject due to disease; and e) biological changes in the subject due to aging (secular change).
- The subject's temporary state represents a temporary, short-term state, such as the subject's stress state or drowsiness.
- The subject's characteristics represent, for example, the subject's occupation, lifestyle, hobbies, preferences, and the like.
- Biological changes in the subject due to disease refer to biological changes caused by a disease that affects cognitive function, e.g., dementia.
- Biological changes in the subject due to aging refer to age-related changes.
- Each of the elements a) to e) above changes over a different interval.
- The subject's temporary state changes with a period of about one day or less.
- The subject's characteristics change with a period longer than that of "a) the subject's temporary state" and generally less than three years.
- The subject's personality changes with a period longer than that of "b) the subject's characteristics" and less than about five years.
- Biological changes in the subject due to disease occur with a period longer than that of "c) the subject's personality" and about ten years or less.
- Biological changes in the subject due to aging are an element whose degree of change does not vary depending on the subject's living environment and that basically progresses according to age.
- the first state information is information about "a) the subject's temporary state".
- In other words, the subject's stress state, drowsiness, etc., exemplified as "a) the subject's temporary state", are reflected in the subject's first state information (for example, the subject's face data, gait data, voice data, or subjective questionnaire results).
- the second state information includes information about "b) the subject's characteristics", "c) the subject's personality", "d) biological changes in the subject due to disease", and "e) biological changes in the subject due to aging".
- Here, the information representing "b) the subject's characteristics" and "c) the subject's personality" is information related to the subject's inner state (also referred to as "inner-related information"), that is, information that affects how the subject perceives things.
- The information representing "d) biological changes in the subject due to disease" and "e) biological changes in the subject due to aging" is information regarding the underlying health level of the living body (in other words, the degree of deterioration of cells; also referred to as "cell deterioration information").
- the cell deterioration information includes information on sex, race, etc., in addition to information on age and disease.
- the cognitive function estimation device 1 estimates the subject's cognitive function based on the first state information, which is based on the subject's measurement results, and the second state information, thereby estimating the subject's cognitive function with high accuracy.
- Cognitive functions are subdivided into, for example, intelligence functions (including language comprehension, perceptual integration, working memory, processing speed), attention functions, frontal lobe functions, language functions, memory functions, visuospatial cognitive functions, and directional attention functions.
- For example, there exist the PVT task, WAIS-III, and the like as test methods for the intelligence function, the standard attention test method and the like for the attention function, and the Trail Making Test and the like for the frontal lobe function.
- There also exist the WAB aphasia test, the Category Fluency test, and the like as test methods for the language function, the WMS-R and the like for the memory function, the Rey complex figure test for the visuospatial cognitive function, and the BIT (Behavioural Inattention Test) for the directional attention function.
- these tests are examples, and any other neuropsychological test can be used to measure cognitive function.
- Simple cognitive function testing methods that can be administered outside medical institutions, such as the N-back task and tests based on calculation problems, may also be used.
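For illustration, the scoring of an N-back task could be sketched as below; the scoring rule (fraction of correct match/no-match judgments) is an assumption:

```python
# Illustrative sketch of a simple test administered outside medical
# institutions: scoring an N-back task. A trial is correct when the
# subject's match judgment agrees with whether the stimulus matches the
# one shown n steps earlier. The scoring rule is assumed.

def n_back_score(stimuli, responses, n=2):
    """Fraction of trials (from index n) judged correctly.
    responses[i] is True if the subject reported a match at trial i."""
    trials = range(n, len(stimuli))
    correct = sum(
        1 for i in trials if (stimuli[i] == stimuli[i - n]) == responses[i]
    )
    return correct / len(trials)

stimuli = ["A", "B", "A", "C", "A", "C"]
responses = [False, False, True, False, True, False]
print(n_back_score(stimuli, responses))  # 0.75
```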
- FIG. 4 is an example of functional blocks of the cognitive function estimation device 1 .
- the processor 11 of the cognitive function estimation device 1 functionally includes a first state information acquisition unit 15 , a second state information acquisition unit 16 , a cognitive function estimation unit 17 , and an output control unit 18 .
- the blocks that exchange data are connected by solid lines, but the combinations of blocks that exchange data are not limited to those shown in FIG. The same applies to other functional block diagrams to be described later.
- the first state information acquisition unit 15 receives the input signal S1 supplied from the input device 2 and/or the sensor signal S3 supplied from the sensor 5 via the interface 13, and generates the subject's first state information based on these signals.
- Here, the input signal S1 used to generate the first state information corresponds to measurement information obtained by subjectively measuring the subject's temporary state, and similarly the sensor signal S3 used to generate the first state information corresponds to measurement information obtained by objectively measuring the subject's temporary state.
- For example, the first state information acquisition unit 15 generates, as the first state information, face data that is measurement information about the subject's face (for example, video data showing the subject's face), gait data that is measurement information about the subject's walking state (for example, video data showing the subject walking), voice data representing the voice uttered by the subject, or subjective questionnaire results for measuring the subject's arousal, concentration, tension, or the like.
- The first state information acquisition unit 15 generates first state information that matches the input format of the inference model used by the cognitive function estimation unit 17.
- For example, the first state information acquisition unit 15 performs feature extraction processing on the above-described face data, gait data, voice data, and/or subjective questionnaire results, and regards a tensor in a predetermined format (for example, a feature vector) obtained by the feature extraction processing as the first state information.
- the feature extraction process described above may be a process based on any feature extraction technique (including feature extraction technique using a neural network).
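As one hypothetical instance of such feature extraction, gait features such as mean stride time and cadence could be derived from heel-strike timestamps (in practice these would be estimated from the video data; the timestamps below are illustrative):

```python
# Hedged sketch of one possible feature extraction for gait data: derive
# mean stride time and cadence from heel-strike timestamps. Real systems
# would estimate the timestamps from video; these values are illustrative.

def gait_features(heel_strike_times):
    """Return (mean stride time in seconds, cadence in steps/minute)."""
    intervals = [
        t1 - t0 for t0, t1 in zip(heel_strike_times, heel_strike_times[1:])
    ]
    mean_stride = sum(intervals) / len(intervals)
    cadence = 60.0 / mean_stride
    return mean_stride, cadence

stride, cadence = gait_features([0.0, 0.5, 1.0, 1.5, 2.0])
print(stride, cadence)  # 0.5 120.0
```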
- the first state information acquisition unit 15 supplies the generated first state information to the cognitive function estimation unit 17 .
- The first state information acquisition unit 15 transmits the output signal S2, which is a display signal for displaying the questionnaire answer screen, to the output device 3 via the interface 13, causing the output device 3 to display the questionnaire answer screen. The first state information acquisition unit 15 then receives an input signal S1 representing the answer results on the questionnaire answer screen from the input device 2 via the interface 13.
- the second state information acquisition unit 16 extracts the subject's second state information from the second state information storage unit 41 and supplies the extracted second state information to the cognitive function estimation unit 17 .
- the second state information acquisition unit 16 may convert the second state information extracted from the second state information storage unit 41 so as to match the input format of the model used by the cognitive function estimation unit 17. .
- the second state information acquisition unit 16 performs feature extraction processing to convert the second state information extracted from the second state information storage unit 41 into a tensor of a predetermined format (for example, a feature vector with a predetermined number of dimensions). Convert to Note that the second state information after being converted into the tensor described above may be stored in the second state information storage unit 41 in advance.
- the cognitive function estimation unit 17 estimates the subject's cognitive function based on the first state information supplied from the first state information acquisition unit 15, the second state information supplied from the second state information acquisition unit 16, and the calculation information stored in the calculation information storage unit 42. In this case, for example, the cognitive function estimation unit 17 calculates the estimated cognitive function score by correcting, based on the second state information, the provisional cognitive function score calculated from the first state information. In another example, the cognitive function estimation unit 17 determines the estimated cognitive function score from the information output by the inference model based on the calculation information when the first state information and the second state information are input to the inference model. The cognitive function estimation unit 17 supplies the estimation result of the subject's cognitive function to the output control unit 18.
- the output control unit 18 outputs information regarding the result of estimation of the subject's cognitive function.
- the output control unit 18 displays the estimation result of the cognitive function by the cognitive function estimation unit 17 on the display unit of the output device 3 or outputs the result by the sound output unit of the output device 3 .
- The output control unit 18 may, for example, compare the cognitive function estimation result with a reference value for determining the presence or absence of cognitive function impairment, and issue a predetermined notification to the subject or his/her administrator based on the comparison result.
- In this case, the output control unit 18 outputs information (warning information) prompting the subject to go to the hospital, or outputs information regarding advice such as increasing sleep time.
- Further, when the cognitive function estimation result is below the above-described reference value, the output control unit 18 may acquire the contact information of the subject's family from the storage device 4 or the like and notify the subject's family of information regarding the cognitive function estimation result.
- The above-described reference value may be a reference value determined based on the cognitive function estimation results obtained in the past in chronological order for the subject, or it may be a general reference value.
- In the former case, the cognitive function estimation unit 17 associates each cognitive function estimation result with the subject's identification information and stores it in the storage device 4, and the output control unit 18 sets the reference value based on statistical values (that is, representative values such as the average or median) of the subject's time-series estimation results stored in the storage device 4.
- the output control unit 18 may set the above-described statistical value as the reference value, or may set a value lower than the above-described statistical value by a predetermined value or a predetermined rate as the reference value.
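A sketch of this personalized reference value, assuming the median as the representative value and a 10% margin rate (both assumptions):

```python
# Sketch of the personalized reference value: the median of a subject's
# past estimation results, lowered by an assumed 10% margin, plus a
# comparison deciding whether a notification should be issued.
import statistics

def reference_value(past_scores, margin_rate=0.10):
    """Representative value of past scores minus a fixed-rate margin."""
    return statistics.median(past_scores) * (1.0 - margin_rate)

def needs_notification(current_score, past_scores):
    return current_score < reference_value(past_scores)

history = [82.0, 78.0, 80.0, 84.0, 79.0]
print(reference_value(history))          # 80.0 * 0.9 = 72.0
print(needs_notification(70.0, history)) # True
```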
- In the latter case, a general reference value for determining the presence or absence of cognitive impairment is stored in advance in the storage device 4 or the like, and the output control unit 18 acquires that reference value and compares it with the cognitive function estimation result generated by the cognitive function estimation unit 17.
- In this way, the cognitive function estimation device 1 can estimate the subject's cognitive function easily (that is, without a measurement load) and with high accuracy based on measurement by the sensor 5 or simple input to the input device 2. By outputting the estimation result of the cognitive function estimated in this simple and highly accurate manner, the cognitive function estimation device 1 preferably encourages the subject to take self-care and promotes early detection or prevention of cognitive function decline.
- Each component of the first state information acquisition unit 15, the second state information acquisition unit 16, the cognitive function estimation unit 17, and the output control unit 18 described with reference to FIG. 4 can be realized, for example, by the processor 11 executing a program. Further, each component may be realized by recording the necessary programs in an arbitrary nonvolatile storage medium and installing them as necessary. Note that at least part of each of these components is not limited to being implemented by program software, and may be realized by any combination of hardware, firmware, and software. At least part of each of these components may also be implemented using a user-programmable integrated circuit, such as an FPGA (Field-Programmable Gate Array) or a microcontroller. In this case, this integrated circuit may be used to realize a program made up of the above components.
- Further, each component may be configured by an ASSP (Application Specific Standard Product), an ASIC (Application Specific Integrated Circuit), or a quantum processor (quantum computer control chip).
- FIG. 5 is a diagram showing a concrete example of estimation of cognitive function.
- FIG. 5 shows an example of estimating cognitive function using gait data and face data as first state information, and questionnaire results regarding lifestyle habits, diseases, personality, and race as second state information.
- The first state information acquisition unit 15 acquires the subject's gait data and face data based on, for example, video data output by a camera included in the sensor 5, and supplies these acquired data to the cognitive function estimation unit 17.
- The camera is provided at a position where the subject can be photographed (including, for example, the subject's residence or workplace), and the first state information acquisition unit 15 extracts, based on image recognition technology, images showing the subject's walking state as gait data and images showing the subject's face as face data from the video (time-series images) output by the camera.
- The questionnaire results, which are the second state information, are generated based on a questionnaire conducted in advance and stored in advance in the second state information storage unit 41. The questionnaire results stored in the second state information storage unit 41 are then supplied to the cognitive function estimation unit 17.
- The first state information acquisition unit 15 and the second state information acquisition unit 16, for example, perform predetermined feature extraction processing to convert each of the above-described pieces of information into a tensor in a predetermined format, and supply the first state information and the second state information represented by such tensors to the cognitive function estimation unit 17.
- The cognitive function estimation unit 17 then estimates the subject's cognitive function by referring to the calculation information, based on the subject's gait data and face data acquired by the first state information acquisition unit 15 and the subject's questionnaire results.
- the cognitive function estimating device 1 acquires a sensor signal S3 output by a non-contact sensor (here, a camera or the like), and refers to information pre-stored in the storage device 4. Therefore, it is possible to estimate the subject's cognitive function without imposing an excessive measurement burden on the subject. Then, the subject or the manager thereof can easily grasp the estimation result of the cognitive function based on the output of the information on the estimation result of the cognitive function by the cognitive function estimation device 1 .
- the cognitive function estimating apparatus 1 uses, as the first state information, gait data related to the directional attention function, which is one element of the cognitive function, and face data related to the attention function, which is another element of the cognitive function.
- In this way, the cognitive function estimation device 1 can estimate a wide range of functions with high accuracy by estimating cognitive function multilaterally, taking into consideration multiple elements of cognitive function such as the attention function and the directional attention function.
- Furthermore, the cognitive function estimation device 1 estimates cognitive function using second state information representing lifestyle habits that affect gait, such as lack of exercise ("b) the subject's characteristics" in FIG. 3), diseases such as foot injuries, and personality, race, and the like that affect facial expressions. In this way, the cognitive function estimation device 1 can obtain an accurate cognitive function estimation result by appropriately taking into account the second state that is related to (influences) the first state.
- the cognitive function estimation device 1 may estimate the cognitive function using the subject's voice data as the first state information in addition to the gait data and the face data.
- In this case, the sensor 5 includes a voice input device and supplies voice data generated when the subject speaks to the cognitive function estimation device 1, and the first state information acquisition unit 15 of the cognitive function estimation device 1 acquires the voice data as part of the first state information.
- Thereby, the cognitive function estimation device 1 can estimate cognitive function even more multilaterally by using voice data related to the language function, a cognitive function element different from those related to the gait data and face data. In this case as well, the cognitive function estimation device 1 can easily estimate the subject's cognitive function without a measurement load based on the output of a non-contact sensor (the voice input device).
- FIG. 6 is an example of functional blocks of the processor 11 of the cognitive function estimation device 1 regarding learning of an inference model.
- the processor 11 functionally has a learning unit 19 .
- the storage device 4 further has a learning data storage unit 43 .
- the learning data storage unit 43 stores learning data including input data and correct answer data.
- The input data is the data input to the inference model during training of the inference model, and the correct answer data is the correct cognitive function estimation result (that is, the correct score) that the inference model should output when the corresponding input data is input.
- the input data includes first state information and second state information.
- The first state information included in the input data is generated by performing, on data subjectively or objectively measured for learning from the subject or a person other than the subject (that is, data corresponding to the input signal S1 and the sensor signal S3 in FIGS. 1 and 4), the same processing as that performed by the first state information acquisition unit 15.
- The second state information included in the input data may be the same data as the second state information stored in the second state information storage unit 41, or may be data separately generated for learning.
- the input data is represented by a tensor in a predetermined format so as to match the input format of the inference model, for example, by performing the feature extraction processing already mentioned in the description of FIG. Note that such a feature extraction process may be executed by a learning device (cognitive function estimation device 1 in FIG. 6).
- the correct data is, for example, the result of a diagnosis of the cognitive function of the subject or a person other than the subject, or the result of a neuropsychological examination of the cognitive function.
- Test results based on the various cognitive function test methods described in the section "(3) Specific examples of first state and second state" are employed as correct answer data.
- the learning unit 19 refers to the learning data storage unit 43 and performs learning to generate calculation information, which is the parameters of the inference model stored in the calculation information storage unit 42, in the pre-stage of the cognitive function estimation process.
- For example, when the input data is input to the inference model, the learning unit 19 determines the parameters of the inference model so that the error (loss) between the information output by the inference model and the correct answer data corresponding to that input data is minimized.
- The algorithm for determining the parameters so as to minimize the loss may be any learning algorithm used in machine learning, such as gradient descent or error backpropagation. The learning unit 19 then stores the parameters of the trained inference model in the calculation information storage unit 42 as calculation information.
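A minimal sketch of this parameter fitting, using plain gradient descent on a one-weight linear model that stands in for the real inference model (all numbers are illustrative):

```python
# Minimal sketch of the learning unit's parameter fitting: gradient descent
# minimizing the squared error between the model's output and the correct
# scores. A one-weight linear model stands in for the real inference model.

def train(inputs, targets, lr=0.01, epochs=500):
    """Fit score = w * x by gradient descent on mean squared error."""
    w = 0.0
    for _ in range(epochs):
        # gradient of sum((w*x - y)^2) with respect to w, averaged
        grad = sum(2 * (w * x - y) * x for x, y in zip(inputs, targets))
        w -= lr * grad / len(inputs)
    return w

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]   # true relation: score = 2 * x
w = train(xs, ys)
print(round(w, 3))     # converges close to 2.0
```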
- FIG. 7 is an example of a flow chart showing a processing procedure of the cognitive function estimation device 1 regarding estimation of cognitive function.
- When the cognitive function estimation device 1 determines that it is time to estimate cognitive function, that is, that the estimation execution condition is satisfied, it executes the processing of the flowchart of FIG. 7.
- For example, the cognitive function estimation device 1 may refer to estimation execution conditions stored in the storage device 4 or the like to determine whether the estimation execution condition is satisfied, or may determine that the estimation execution condition is satisfied when a predetermined timing (predetermined time) arrives.
- Alternatively, the cognitive function estimation device 1 may determine that the estimation execution condition is satisfied when it acquires the sensor signal S3 and/or the input signal S1 for generating the first state information necessary for estimating cognitive function.
- First, the first state information acquisition unit 15 of the cognitive function estimation device 1 generates first state information based on the sensor signal S3 and/or the input signal S1, which are measurement information of the subject at the above-described timing of estimating cognitive function (step S11).
- In this case, the first state information acquisition unit 15 obtains, via the interface 13, the sensor signal S3 indicating objective measurement information of the subject from the sensor 5 and/or the input signal S1 indicating subjective measurement information of the subject from the input device 2, and generates the first state information based on the obtained signals.
- the first state information acquisition unit 15 may generate first state information that matches the input format of the model used by the cognitive function estimation unit 17 by performing a predetermined feature extraction process on the acquired sensor signal S3 and/or input signal S1.
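As a hedged illustration of such a feature extraction process (the statistics and names below are assumptions, not the patent's actual processing), a variable-length sensor signal can be reduced to a fixed-length vector that matches a model's expected input format:

```python
# Illustrative sketch only: reduce a variable-length gait signal (e.g.,
# hypothetical stride intervals extracted from camera footage) to a
# fixed-length feature vector matching a model's expected input format.
def extract_features(signal):
    n = len(signal)
    mean = sum(signal) / n
    var = sum((v - mean) ** 2 for v in signal) / n
    # Fixed-length output regardless of how long the recording was.
    return [mean, var, min(signal), max(signal)]

strides = [1.02, 0.98, 1.05, 1.01, 0.94]  # hypothetical stride times (s)
features = extract_features(strides)
```

The same idea applies to face data or questionnaire answers: whatever the raw measurement, the extraction step emits a vector of the shape the estimation model expects.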
- the second state information acquisition unit 16 of the cognitive function estimation device 1 acquires the second state information of the subject (step S12).
- the second state information acquisition unit 16 acquires the subject's second state information from the second state information storage unit 41 via the interface 13.
- similarly, the second state information acquisition unit 16 may generate second state information that matches the input format of the model used by the cognitive function estimation unit 17 by performing, for example, a predetermined feature extraction process on the information extracted from the second state information storage unit 41.
- the cognitive function estimation unit 17 of the cognitive function estimation device 1 estimates the subject's cognitive function based on the first state information acquired in step S11 and the second state information acquired in step S12 (step S13).
- for example, the cognitive function estimation unit 17 obtains the cognitive function estimation result by inputting the first state information and the second state information into the inference model configured from the calculation information stored in the calculation information storage unit 42.
- as described above, the inference model may be a learning model, a formula, a lookup table, or the like.
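Since the inference model may be as simple as a formula, one hedged sketch (the weights, normalization convention, and function name are assumptions for illustration, not the patent's model) is a weighted combination of the two kinds of state information:

```python
# Illustrative sketch only: an "inference model" need not be a neural
# network; a simple formula over first-state features (measured at
# estimation time) and second-state features (slowly changing
# attributes) can fill the same role.
def estimate_cognitive_score(first_state, second_state, w1=0.7, w2=0.3):
    # Hypothetical convention: each feature is pre-normalized to [0, 1].
    s1 = sum(first_state) / len(first_state)
    s2 = sum(second_state) / len(second_state)
    return w1 * s1 + w2 * s2  # weighted combination of the two states

score = estimate_cognitive_score([0.8, 0.6], [0.5])
```

A lookup table would work the same way: the concatenated state features index a precomputed estimation result instead of feeding a formula.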
- the output control unit 18 of the cognitive function estimating device 1 outputs the cognitive function estimation result calculated in step S13 (step S14).
- the output control unit 18 supplies the output signal S2 to the output device 3 so that the output device 3 performs display or audio output representing the estimation result of the cognitive function.
- for example, the output control unit 18 compares the cognitive function estimation result with a predetermined reference value and, based on the comparison result, notifies the subject or the administrator of the cognitive function estimation result.
- in this way, the cognitive function estimation device 1 can suitably present information on the estimation result of the subject's cognitive function to the subject or the subject's manager.
- the cognitive function estimation device 1 may estimate the subject's cognitive function based on the first state information without using the second state information.
- the cognitive function estimation device 1 estimates the subject's cognitive function based on the gait data and the face data.
- in this modification, the calculation information stored in the calculation information storage unit 42 includes the parameters of an inference model that outputs a cognitive function estimation result when the first state information is input, and the cognitive function estimation unit 17 estimates the subject's cognitive function from the first state information using the inference model based on the calculation information. Note that the storage device 4 need not have the second state information storage unit 41 in this modification.
- according to this modification, the cognitive function estimation device 1 acquires gait data, which relates to the directional attention function, and face data, which relates to the attention function, based on the output of a non-contact sensor (here, a camera or the like). It can therefore estimate cognitive function with high accuracy, and over a wide range, without imposing a measurement burden on the subject.
- in addition, by estimating cognitive function from multiple angles in consideration of multiple factors, such as the attention function and the directional attention function, the cognitive function estimation device 1 can estimate a wide range of cognitive functions with high accuracy.
- FIG. 8 shows a schematic configuration of a cognitive function estimation system 100A in the second embodiment.
- a cognitive function estimation system 100A according to the second embodiment is a server-client model system, and a cognitive function estimation device 1A functioning as a server device performs the processing of the cognitive function estimation device 1 in the first embodiment.
- in the following, the same components as in the first embodiment are denoted by the same reference symbols as appropriate, and their description is omitted.
- the cognitive function estimation system 100A mainly includes the cognitive function estimation device 1A functioning as a server, the storage device 4 storing data similar to that of the first embodiment, and the terminal device 8 functioning as a client.
- the cognitive function estimation device 1A and the terminal device 8 perform data communication via the network 7.
- the terminal device 8 is a terminal having an input function, a display function, and a communication function, and functions as the input device 2 and the output device 3 shown in FIG.
- the terminal device 8 may be, for example, a personal computer, a tablet terminal, a PDA (Personal Digital Assistant), or the like.
- the terminal device 8 transmits a biological signal output by a sensor (not shown) or an input signal based on a user's input to the cognitive function estimation device 1A.
- the cognitive function estimation device 1A has, for example, the same configuration as the cognitive function estimation device 1 shown in FIGS. 1, 2, and 4. The cognitive function estimation device 1A receives from the terminal device 8, via the network 7, information equivalent to what the cognitive function estimation device 1 obtains from the input device 2 and the sensor 5 shown in FIG. 1, and estimates the subject's cognitive function based on that information. The cognitive function estimation device 1A then transmits an output signal indicating information on the estimation result to the terminal device 8 via the network 7 in response to a request from the terminal device 8. That is, in this case, the terminal device 8 functions as the output device 3 in the first embodiment. As a result, the cognitive function estimation device 1A can suitably present information about the cognitive function estimation result to the user of the terminal device 8.
- FIG. 9 is a block diagram of a cognitive function estimation device 1X according to the third embodiment.
- the cognitive function estimation device 1X mainly includes first state information acquisition means 15X, second state information acquisition means 16X, and cognitive function estimation means 17X. Note that the cognitive function estimation device 1X may be configured by a plurality of devices.
- the first state information acquisition means 15X acquires first state information representing the subject's first state related to the subject's cognitive function.
- the first state information acquisition means 15X can be, for example, the first state information acquisition section 15 in the first embodiment or the second embodiment.
- the second state information acquisition means 16X acquires second state information representing a second state of the subject whose state-change interval is longer than that of the first state (the interval need not be a constant cycle; the same applies hereinafter).
- the second state information acquisition means 16X can be, for example, the second state information acquisition unit 16 in the first embodiment (excluding modifications; the same applies throughout the third embodiment) or in the second embodiment.
- the cognitive function estimation means 17X estimates the subject's cognitive function based on the first state information and the second state information.
- the cognitive function estimation means 17X can be, for example, the cognitive function estimation unit 17 in the first embodiment or the second embodiment.
- FIG. 10 is an example of a flowchart executed by the cognitive function estimation device 1X in the third embodiment.
- the first state information acquisition means 15X acquires first state information representing the subject's first state related to the subject's cognitive function (step S21).
- the second state information acquiring means 16X acquires second state information representing a second state of the subject having a longer state change interval than the first state (step S22).
- the cognitive function estimation means 17X estimates the subject's cognitive function based on the first state information and the second state information (step S23).
- the cognitive function estimation device 1X can accurately estimate the subject's cognitive function.
- FIG. 11 is a block diagram of a cognitive function estimation device 1Y in the fourth embodiment.
- Cognitive function estimation device 1Y mainly has acquisition means 15Y and cognitive function estimation means 17Y. Note that the cognitive function estimation device 1Y may be configured by a plurality of devices.
- the acquisition means 15Y acquires face data, which is measurement information about the subject's face, and gait data, which is measurement information about the subject's walking state.
- the acquisition unit 15Y can be, for example, the first state information acquisition unit 15 in the first embodiment (including modifications) or the second embodiment.
- the cognitive function estimation means 17Y estimates the subject's cognitive function based on the face data and the gait data.
- the cognitive function estimating means 17Y can be, for example, the cognitive function estimating section 17 in the first embodiment (including modifications) or the second embodiment.
- FIG. 12 is an example of a flowchart executed by the cognitive function estimation device 1Y in the fourth embodiment.
- the acquiring unit 15Y acquires face data, which is measurement information about the subject's face, and gait data, which is measurement information about the subject's walking state (step S31).
- the cognitive function estimation means 17Y estimates the subject's cognitive function based on the face data and the gait data (step S32).
- the cognitive function estimation device 1Y makes it possible to estimate the subject's cognitive function with high accuracy without imposing an excessive measurement load on the subject.
- Non-transitory computer readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic storage media (e.g., floppy disks, magnetic tapes, hard disk drives), magneto-optical storage media (e.g., magneto-optical discs), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memory (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (Random Access Memory)).
- the program may also be delivered to the computer on various types of transitory computer-readable media.
- Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves.
- Transitory computer-readable media can deliver the program to the computer via wired channels, such as wires and optical fibers, or wireless channels.
- the cognitive function estimation device according to Appendix 1 or 2, wherein the second state information acquisition means acquires the second state information including inner-state-related information, which is information related to the subject's inner state.
- the cognitive function estimation device according to Appendix 3, wherein the inner-state-related information is information on at least one of the subject's personality, occupation, interests, tastes, and lifestyle.
- the cognitive function estimation device according to any one of Appendices 1 to 4, wherein the second state information acquisition means acquires the second state information including cell deterioration information, which is information on the degree of deterioration of the subject's cells.
- the cognitive function estimation device according to any one of Appendices 1 to 5, wherein the first state information acquisition means acquires the first state information including face data, which is measurement information about the subject's face, and gait data, which is measurement information about the subject's walking state.
- the cognitive function estimating device according to appendix 6, wherein the first state information acquisition means acquires the first state information further including voice data that is measurement information about voice of the subject.
- the cognitive function estimation device according to any one of Appendices 1 to 7, wherein the first state information acquisition means generates the first state information based on information subjectively or objectively measured from the subject at the timing of estimating the cognitive function, and the second state information acquisition means acquires the second state information from a storage device that stores the second state information.
- the cognitive function estimation device according to any one of Appendices 1 to 8, further comprising output control means for outputting information on the cognitive function estimation result.
- a storage medium storing a program that causes a computer to execute a process of estimating the subject's cognitive function based on the first state information and the second state information.
- [Appendix 14] Acquiring face data, which is measurement information about the subject's face, and gait data, which is measurement information about the subject's walking state;
- a storage medium storing a program for causing a computer to execute a process of estimating the subject's cognitive function based on the face data and the gait data.
Abstract
Description
One aspect is a cognitive function estimation device comprising:
first state information acquisition means for acquiring first state information representing a first state of a subject related to the subject's cognitive function;
second state information acquisition means for acquiring second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
cognitive function estimation means for estimating the subject's cognitive function based on the first state information and the second state information.
Another aspect is a cognitive function estimation device comprising:
acquisition means for acquiring face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
cognitive function estimation means for estimating the subject's cognitive function based on the face data and the gait data.
Another aspect is a cognitive function estimation method in which a computer:
acquires first state information representing a first state of a subject related to the subject's cognitive function;
acquires second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
estimates the subject's cognitive function based on the first state information and the second state information.
Note that "computer" includes any electronic device (and may be a processor included in an electronic device), and may be composed of a plurality of electronic devices.
Another aspect is a cognitive function estimation method in which a computer:
acquires face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
estimates the subject's cognitive function based on the face data and the gait data.
Another aspect is a storage medium storing a program that causes a computer to:
acquire first state information representing a first state of a subject related to the subject's cognitive function;
acquire second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
execute a process of estimating the subject's cognitive function based on the first state information and the second state information.
Another aspect is a storage medium storing a program that causes a computer to:
acquire face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
execute a process of estimating the subject's cognitive function based on the face data and the gait data.
(1) System Configuration
FIG. 1 shows a schematic configuration of the cognitive function estimation system 100 according to the first embodiment. The cognitive function estimation system 100 estimates a subject's cognitive function with high accuracy without imposing an excessive measurement load on the subject, and presents the estimation result. Here, the "subject" may be a person whose cognitive function is managed by an organization, or an individual user.
FIG. 2 shows the hardware configuration of the cognitive function estimation device 1. The cognitive function estimation device 1 includes, as hardware, a processor 11, a memory 12, and an interface 13. The processor 11, the memory 12, and the interface 13 are connected via a data bus 10.
FIG. 3 is a diagram schematically showing factors that affect cognitive function. As shown in FIG. 3, a subject's cognitive function is affected by:
a) the subject's temporary state,
b) the subject's characteristics,
c) the subject's personality,
d) biological changes in the subject due to disease, and
e) biological changes in the subject due to aging.
FIG. 4 is an example of functional blocks of the cognitive function estimation device 1. The processor 11 of the cognitive function estimation device 1 functionally has a first state information acquisition unit 15, a second state information acquisition unit 16, a cognitive function estimation unit 17, and an output control unit 18. In FIG. 4, blocks that exchange data are connected by solid lines, but the combinations of blocks that exchange data are not limited to those shown in FIG. 4. The same applies to the diagrams of other functional blocks described later.
FIG. 5 is a diagram showing a specific example of cognitive function estimation. FIG. 5 shows an example in which gait data and face data are used as the first state information, and questionnaire results on lifestyle, disease, personality, and race are used as the second state information.
Next, as an example, a method of training the inference model (i.e., a method of generating the calculation information) used for cognitive function estimation will be described. In the following, as an example, the cognitive function estimation device 1 trains the inference model, but a device other than the cognitive function estimation device 1 may train the inference model.
FIG. 7 is an example of a flowchart showing the processing procedure of the cognitive function estimation device 1 for cognitive function estimation. The cognitive function estimation device 1 determines, for example, that it is time to estimate the cognitive function when a predetermined estimation execution condition is satisfied, and executes the process of the flowchart of FIG. 7. For example, the cognitive function estimation device 1 determines that the estimation execution condition is satisfied when it receives, from the input device 2, the input signal S1 instructing execution of the cognitive function estimation process. Alternatively, the cognitive function estimation device 1 may refer to an estimation execution condition stored in advance in the storage device 4 or the like to determine whether the condition is satisfied, or may determine that the condition is satisfied when a preset date and time (e.g., a predetermined time each day) arrives. In yet another example, the cognitive function estimation device 1 may determine that the estimation execution condition is satisfied when it acquires the sensor signal S3 and/or the input signal S1 for generating the first state information necessary for estimating the cognitive function.
The cognitive function estimation device 1 may estimate the subject's cognitive function based on the first state information without using the second state information.
FIG. 8 shows a schematic configuration of a cognitive function estimation system 100A according to the second embodiment. The cognitive function estimation system 100A according to the second embodiment is a server-client system, in which the cognitive function estimation device 1A functioning as a server device performs the processing of the cognitive function estimation device 1 in the first embodiment. In the following, the same components as in the first embodiment are denoted by the same reference symbols as appropriate, and their description is omitted.
FIG. 9 is a block diagram of the cognitive function estimation device 1X according to the third embodiment. The cognitive function estimation device 1X mainly has first state information acquisition means 15X, second state information acquisition means 16X, and cognitive function estimation means 17X. The cognitive function estimation device 1X may be composed of a plurality of devices.
FIG. 11 is a block diagram of the cognitive function estimation device 1Y according to the fourth embodiment. The cognitive function estimation device 1Y mainly has acquisition means 15Y and cognitive function estimation means 17Y. The cognitive function estimation device 1Y may be composed of a plurality of devices.
[Appendix 1]
A cognitive function estimation device comprising:
first state information acquisition means for acquiring first state information representing a first state of a subject related to the subject's cognitive function;
second state information acquisition means for acquiring second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
cognitive function estimation means for estimating the subject's cognitive function based on the first state information and the second state information.
[Appendix 2]
The cognitive function estimation device according to Appendix 1, wherein the second state information acquisition means acquires the second state information representing the second state related to the first state.
[Appendix 3]
The cognitive function estimation device according to Appendix 1 or 2, wherein the second state information acquisition means acquires the second state information including inner-state-related information, which is information related to the subject's inner state.
[Appendix 4]
The cognitive function estimation device according to Appendix 3, wherein the inner-state-related information is information on at least one of the subject's personality, occupation, interests, tastes, and lifestyle.
[Appendix 5]
The cognitive function estimation device according to any one of Appendices 1 to 4, wherein the second state information acquisition means acquires the second state information including cell deterioration information, which is information on the degree of deterioration of the subject's cells.
[Appendix 6]
The cognitive function estimation device according to any one of Appendices 1 to 5, wherein the first state information acquisition means acquires the first state information including face data, which is measurement information about the subject's face, and gait data, which is measurement information about the subject's walking state.
[Appendix 7]
The cognitive function estimation device according to Appendix 6, wherein the first state information acquisition means acquires the first state information further including voice data, which is measurement information about the subject's voice.
[Appendix 8]
The cognitive function estimation device according to any one of Appendices 1 to 7, wherein the first state information acquisition means generates the first state information based on information subjectively or objectively measured from the subject at the timing of estimating the cognitive function, and
the second state information acquisition means acquires the second state information from a storage device that stores the second state information.
[Appendix 9]
The cognitive function estimation device according to any one of Appendices 1 to 8, further comprising output control means for outputting information on the cognitive function estimation result.
[Appendix 10]
A cognitive function estimation device comprising:
acquisition means for acquiring face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
cognitive function estimation means for estimating the subject's cognitive function based on the face data and the gait data.
[Appendix 11]
A cognitive function estimation method in which a computer:
acquires first state information representing a first state of a subject related to the subject's cognitive function;
acquires second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
estimates the subject's cognitive function based on the first state information and the second state information.
[Appendix 12]
A cognitive function estimation method in which a computer:
acquires face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
estimates the subject's cognitive function based on the face data and the gait data.
[Appendix 13]
A storage medium storing a program that causes a computer to:
acquire first state information representing a first state of a subject related to the subject's cognitive function;
acquire second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
execute a process of estimating the subject's cognitive function based on the first state information and the second state information.
[Appendix 14]
A storage medium storing a program that causes a computer to:
acquire face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
execute a process of estimating the subject's cognitive function based on the face data and the gait data.
2 Input device
3 Output device
4 Storage device
5 Sensor
8 Terminal device
100, 100A Cognitive function estimation system
Claims (14)
- A cognitive function estimation device comprising:
first state information acquisition means for acquiring first state information representing a first state of a subject related to the subject's cognitive function;
second state information acquisition means for acquiring second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
cognitive function estimation means for estimating the subject's cognitive function based on the first state information and the second state information.
- The cognitive function estimation device according to claim 1, wherein the second state information acquisition means acquires the second state information representing the second state related to the first state.
- The cognitive function estimation device according to claim 1 or 2, wherein the second state information acquisition means acquires the second state information including inner-state-related information, which is information related to the subject's inner state.
- The cognitive function estimation device according to claim 3, wherein the inner-state-related information is information on at least one of the subject's personality, occupation, interests, tastes, and lifestyle.
- The cognitive function estimation device according to any one of claims 1 to 4, wherein the second state information acquisition means acquires the second state information including cell deterioration information, which is information on the degree of deterioration of the subject's cells.
- The cognitive function estimation device according to any one of claims 1 to 5, wherein the first state information acquisition means acquires the first state information including face data, which is measurement information about the subject's face, and gait data, which is measurement information about the subject's walking state.
- The cognitive function estimation device according to claim 6, wherein the first state information acquisition means acquires the first state information further including voice data, which is measurement information about the subject's voice.
- The cognitive function estimation device according to any one of claims 1 to 7, wherein the first state information acquisition means generates the first state information based on information subjectively or objectively measured from the subject at the timing of estimating the cognitive function, and
the second state information acquisition means acquires the second state information from a storage device that stores the second state information.
- The cognitive function estimation device according to any one of claims 1 to 8, further comprising output control means for outputting information on the cognitive function estimation result.
- A cognitive function estimation device comprising:
acquisition means for acquiring face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
cognitive function estimation means for estimating the subject's cognitive function based on the face data and the gait data.
- A cognitive function estimation method in which a computer:
acquires first state information representing a first state of a subject related to the subject's cognitive function;
acquires second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
estimates the subject's cognitive function based on the first state information and the second state information.
- A cognitive function estimation method in which a computer:
acquires face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
estimates the subject's cognitive function based on the face data and the gait data.
- A storage medium storing a program that causes a computer to:
acquire first state information representing a first state of a subject related to the subject's cognitive function;
acquire second state information representing a second state of the subject whose state-change interval is longer than that of the first state; and
execute a process of estimating the subject's cognitive function based on the first state information and the second state information.
- A storage medium storing a program that causes a computer to:
acquire face data, which is measurement information about a subject's face, and gait data, which is measurement information about the subject's walking state; and
execute a process of estimating the subject's cognitive function based on the face data and the gait data.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/279,135 US20240138750A1 (en) | 2021-06-29 | 2021-06-29 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
JP2023531185A JPWO2023275975A5 (ja) | 2021-06-29 | Cognitive function estimation device, cognitive function estimation method, and program | |
PCT/JP2021/024506 WO2023275975A1 (ja) | 2021-06-29 | 2021-06-29 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
US18/484,817 US20240032852A1 (en) | 2021-06-29 | 2023-10-11 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
US18/379,326 US20240032851A1 (en) | 2021-06-29 | 2023-10-12 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
US18/379,317 US20240065599A1 (en) | 2021-06-29 | 2023-10-12 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/024506 WO2023275975A1 (ja) | 2021-06-29 | 2021-06-29 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
Related Child Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/279,135 A-371-Of-International US20240138750A1 (en) | 2021-06-29 | 2021-06-29 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
US18/484,817 Continuation US20240032852A1 (en) | 2021-06-29 | 2023-10-11 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
US18/379,317 Continuation US20240065599A1 (en) | 2021-06-29 | 2023-10-12 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
US18/379,326 Continuation US20240032851A1 (en) | 2021-06-29 | 2023-10-12 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023275975A1 true WO2023275975A1 (ja) | 2023-01-05 |
Family
ID=84691594
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/024506 WO2023275975A1 (ja) | 2021-06-29 | 2021-06-29 | Cognitive function estimation device, cognitive function estimation method, and storage medium |
Country Status (2)
Country | Link |
---|---|
US (4) | US20240138750A1 (ja) |
WO (1) | WO2023275975A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7274253B1 (ja) * | 2023-03-14 | 2023-05-16 | Logos Science Co., Ltd. | Healthcare system and method therefor |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009515568A (ja) * | 2005-09-13 | 2009-04-16 | Welch Allyn, Inc. | Apparatus and method for diagnosis of optically identifiable ophthalmic conditions |
JP2018015139A (ja) * | 2016-07-26 | 2018-02-01 | Yanmar Co., Ltd. | Dementia testing system |
US20200060566A1 (en) * | 2018-08-24 | 2020-02-27 | Newton Howard | Automated detection of brain disorders |
US20200261013A1 (en) * | 2017-09-27 | 2020-08-20 | Ilan Ben-Oren | Cognitive and physiological monitoring and analysis for correlation for management of cognitive impairment related conditions |
WO2021014938A1 (ja) * | 2019-07-22 | 2021-01-28 | Panasonic Intellectual Property Management Co., Ltd. | Walking function evaluation device, walking function evaluation system, walking function evaluation method, program, and cognitive function evaluation device |
WO2021075061A1 (ja) * | 2019-10-18 | 2021-04-22 | Eisai R&D Management Co., Ltd. | Cognitive function measurement device, cognitive function measurement system, cognitive function measurement method, and cognitive function measurement program |
JP2021087503A (ja) * | 2019-12-02 | 2021-06-10 | Tokyo Metropolitan Geriatric Hospital and Institute of Gerontology | Dementia determination program and dementia determination device |
-
2021
- 2021-06-29 US US18/279,135 patent/US20240138750A1/en active Pending
- 2021-06-29 WO PCT/JP2021/024506 patent/WO2023275975A1/ja active Application Filing
-
2023
- 2023-10-11 US US18/484,817 patent/US20240032852A1/en active Pending
- 2023-10-12 US US18/379,326 patent/US20240032851A1/en active Pending
- 2023-10-12 US US18/379,317 patent/US20240065599A1/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009515568A (ja) * | 2005-09-13 | 2009-04-16 | Welch Allyn, Inc. | Apparatus and method for diagnosis of optically identifiable ophthalmic conditions |
JP2018015139A (ja) * | 2016-07-26 | 2018-02-01 | Yanmar Co., Ltd. | Dementia testing system |
US20200261013A1 (en) * | 2017-09-27 | 2020-08-20 | Ilan Ben-Oren | Cognitive and physiological monitoring and analysis for correlation for management of cognitive impairment related conditions |
US20200060566A1 (en) * | 2018-08-24 | 2020-02-27 | Newton Howard | Automated detection of brain disorders |
WO2021014938A1 (ja) * | 2019-07-22 | 2021-01-28 | Panasonic Intellectual Property Management Co., Ltd. | Walking function evaluation device, walking function evaluation system, walking function evaluation method, program, and cognitive function evaluation device |
WO2021075061A1 (ja) * | 2019-10-18 | 2021-04-22 | Eisai R&D Management Co., Ltd. | Cognitive function measurement device, cognitive function measurement system, cognitive function measurement method, and cognitive function measurement program |
JP2021087503A (ja) * | 2019-12-02 | 2021-06-10 | Tokyo Metropolitan Geriatric Hospital and Institute of Gerontology | Dementia determination program and dementia determination device |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7274253B1 (ja) * | 2023-03-14 | 2023-05-16 | Logos Science Co., Ltd. | Healthcare system and method therefor |
Also Published As
Publication number | Publication date |
---|---|
US20240138750A1 (en) | 2024-05-02 |
JPWO2023275975A1 (ja) | 2023-01-05 |
US20240032852A1 (en) | 2024-02-01 |
US20240032851A1 (en) | 2024-02-01 |
US20240065599A1 (en) | 2024-02-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3403235B1 (en) | Sensor assisted evaluation of health and rehabilitation | |
US11904224B2 (en) | System and method for client-side physiological condition estimations based on a video of an individual | |
KR20200074951A (ko) | 신경계 장애의 식별 및 모니터링을 위한 머신 러닝 기반 시스템 | |
US10595776B1 (en) | Determining energy expenditure using a wearable device | |
US20190313966A1 (en) | Pain level determination method, apparatus, and system | |
EP4251048A1 (en) | Method and system for detecting mood | |
US11998321B2 (en) | System and a method for automatically managing continuous glucose monitoring measurements indicative of glucose level in a bodily fluid | |
WO2020091053A1 (ja) | 健康ポジショニングマップおよび健康関数を作成する方法、システム、およびプログラム、ならびにそれらの使用方法 | |
Mahesh et al. | Requirements for a reference dataset for multimodal human stress detection | |
US20240065599A1 (en) | Cognitive function estimation device, cognitive function estimation method, and storage medium | |
US20230394124A1 (en) | Method for configuring data acquisition settings of a computing device | |
US20210000405A1 (en) | System for estimating a stress condition of an individual | |
WO2022208873A1 (ja) | ストレス推定装置、ストレス推定方法及び記憶媒体 | |
US10079074B1 (en) | System for monitoring disease progression | |
WO2023199839A1 (ja) | 内面状態推定装置、内面状態推定方法及び記憶媒体 | |
WO2022113276A1 (ja) | 情報処理装置、制御方法及び記憶媒体 | |
WO2022144978A1 (ja) | 情報処理装置、制御方法及び記憶媒体 | |
WO2022014538A1 (ja) | 被験者の健康度を計測するための方法、装置、プログラム、およびシステム | |
US20240008813A1 (en) | Smart wearable device and method for estimating traditional medicine system parameters | |
WO2022254574A1 (ja) | 疲労推定装置、疲労推定方法及び記憶媒体 | |
US20240233913A9 (en) | User analysis and predictive techniques for digital therapeutic systems | |
US20240136051A1 (en) | User analysis and predictive techniques for digital therapeutic systems | |
WO2023058200A1 (ja) | 疲労度算出装置、疲労度算出方法及び記憶媒体 | |
JP7419904B2 (ja) | 生体モニタ装置、生体モニタ方法及びプログラム | |
US20240180434A1 (en) | System and method for blood pressure measurement, computer program product using the method, and computer-readable recording medium thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21948284 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18279135 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023531185 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21948284 Country of ref document: EP Kind code of ref document: A1 |