WO2022114143A1 - Evaluation device, method, and program for evaluating object recognition ability - Google Patents
Evaluation device, method, and program for evaluating object recognition ability
- Publication number
- WO2022114143A1 (PCT application PCT/JP2021/043460)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- input
- target object
- evaluation
- unit
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4058—Detecting, measuring or recording for evaluating the nervous system for evaluating the central nervous system
- A61B5/4064—Evaluating the brain
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H1/00—Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/162—Testing reaction times
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; user input means
- A61B5/742—Details of notification to user or communication with user or patient; user input means using visual displays
- A61B5/7425—Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
Definitions
- The present invention relates to an evaluation device, a method, and a program, and more particularly to an evaluation device, a method, and a program for evaluating the object recognition ability of a subject.
- The processing pathways for visual information in the brain are mainly divided into the dorsal (parietal lobe) pathway and the ventral (temporal lobe) pathway. The dorsal (parietal lobe) pathway is known to handle spatial information such as motion and the three-dimensional structure of space, while the ventral (temporal lobe) pathway is known to be responsible for object recognition based on color and shape. Regarding visual recognition characteristics, it is also known that, when visually recognizing an object or a person, some people show stronger activation in the parietal lobe and others show stronger activation in the temporal lobe.
- Non-Patent Document 1 discloses the relationship between brain activity and reaction time in a mental rotation task: it divides subjects into a group with fast reaction times and a group with slow reaction times, and discloses that the brain sites activated differ between the two groups. Since brain activity can capture a disease state more quantitatively than diagnostic indices centered on interviews with doctors, its application to evaluating the state of mental illness is also being studied.
- The inventors of the present application considered that it would be extremely useful to realize a society in which the function of the temporal lobe when visually recognizing an object or a person can be objectively evaluated as an individual characteristic, rather than as superiority or inferiority of the individual. Until now, such techniques and technical ideas have not been considered. On the other hand, electroencephalography and fMRI, which quantitatively capture the characteristics of the brain, require large devices, take a long time to measure, and involve physical restraint, imposing a large burden on the subject.
- The present invention has been made to solve such problems, and its main purpose is to provide an evaluation device and the like capable of objectively evaluating object recognition ability while further reducing the burden on the subject.
- The evaluation device of one embodiment of the present invention is an evaluation device for evaluating the object recognition ability of a subject, comprising:
- an input unit that accepts the subject's input;
- a display control unit that displays, on a display device, a test screen including target objects that the subject should select and non-target objects that the subject should not select;
- an input processing unit that determines whether or not a target object has been selected based on the subject's input received by the input unit; and
- an evaluation unit that evaluates the object recognition ability of the subject based on the response time required for the subject's input when the input processing unit determines that a target object has been selected.
- the input processing unit accepts the subject's input as an object selection input when the subject's input received by the input unit corresponds to the selection of a target object or a non-target object.
- When the input processing unit determines that a target object has been selected, the evaluation unit determines, as the response time, the time from receiving the immediately preceding object selection input to receiving the current object selection input, and evaluates the object recognition ability of the subject based on the determined response time.
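As a minimal sketch of the response-time determination described above (all names are hypothetical and illustrative, not the claimed implementation), the response time can be computed as the interval between consecutive object selection inputs, starting from the time the test screen was displayed:

```python
# Illustrative sketch: response time measured as the interval between
# consecutive object-selection inputs. A response time is recorded only
# when the current selection hits a target object.

def response_times(display_time, selection_events):
    """display_time: time at which the test screen appeared.
    selection_events: time-ordered list of (timestamp, is_target) tuples,
    one per accepted object selection input."""
    times = []
    prev_t = display_time
    for t, is_target in selection_events:
        if is_target:
            times.append(t - prev_t)  # interval since the previous selection input
        prev_t = t  # non-target selections still reset the reference time
    return times
```

For example, with a screen shown at t=0 and selections at t=1.2 (target), t=2.0 (non-target), and t=3.5 (target), the recorded response times would be 1.2 and 1.5 seconds.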
- the display control unit displays a test screen including a plurality of target objects on a display device.
- The evaluation unit determines the response time in each case where the input processing unit determines that a target object has been selected, and evaluates the object recognition ability of the subject by determining an index of object recognition ability based on the determined response times.
- The display control unit displays a test screen including non-target objects whose graphic features, colors, or patterns have a predetermined degree of similarity to, but differ from, those of the target object.
- the display control unit displays a test screen including a target object and a non-target object rotated at random angles about a predetermined position in each object on the display device.
- The display control unit displays, on a display device, a test screen including only one target object, and the target object is a UI object that includes a message for the subject and can be selected by the subject.
- The display control unit sequentially displays different test screens on the display device while a predetermined condition is satisfied.
- the evaluation unit determines the response time on each of the test screens, and evaluates the object recognition ability of the subject by determining the index of the object recognition ability based on the determined response time.
- the input processing unit determines whether or not the target object or the non-target object is selected based on the input of the subject received by the input unit.
- The evaluation unit does not evaluate the object recognition ability, and outputs information indicating that the object recognition ability cannot be evaluated, when the ratio between the number of times the input processing unit determines that a target object has been selected and the number of times it determines that a non-target object has been selected is within a predetermined range.
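The gating rule above can be sketched as follows. This is an illustrative interpretation only: the function name and the particular threshold are assumptions, since the patent leaves the "predetermined range" unspecified; here the range is expressed as a minimum correct-answer rate:

```python
# Illustrative sketch: evaluation is skipped when too few of the subject's
# selections hit target objects. `min_accuracy` is a hypothetical threshold
# standing in for the unspecified "predetermined range".

def can_evaluate(target_hits, nontarget_hits, min_accuracy=0.8):
    total = target_hits + nontarget_hits
    if total == 0:
        return False  # no selection inputs at all: nothing to evaluate
    return target_hits / total >= min_accuracy
```

When this returns False, the device would output information that the ability cannot be evaluated instead of an index.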
- a line-of-sight detection unit for detecting the line of sight of the subject is further provided.
- the input processing unit does not determine that the target object has been selected if the detected line of sight does not exist in the predetermined area including the object in the test screen displayed by the display control unit.
- The evaluation unit evaluates the object recognition ability of the subject based on the ratio between the number of times the input processing unit determines that a target object has been selected and the number of times it determines that a non-target object has been selected, and on the response time required for the subject's input when the input processing unit determines that a target object has been selected.
- The method of one embodiment of the present invention is a method for evaluating a subject's object recognition ability, including: a step of displaying, on a display device, a test screen including target objects that the subject should select and non-target objects that the subject should not select; a step of determining whether or not a target object has been selected based on the received input of the subject; and a step of evaluating the object recognition ability based on the response time required for the subject's input when it is determined that a target object has been selected.
- The program of one embodiment of the present invention causes a computer to execute each step of the above method.
- The evaluation device 1 is a device that presents a test screen on which a subject visually recognizes objects, and evaluates the object recognition ability of the subject. For example, the object recognition ability is evaluated by assessing whether the parietal lobe or the temporal lobe is more strongly activated when visually recognizing a target such as an object or a person. Stronger activation of the parietal lobe indicates a better ability to grasp spatial information, while stronger activation of the temporal lobe indicates better image memory. For example, object recognition ability can be evaluated by determining an index of object recognition ability.
- Evaluating the object recognition ability means evaluating the function of the temporal lobe with respect to the visual ability of the subject, for example, evaluating the degree to which the temporal lobe is activated or used when the subject visually recognizes an object.
- the evaluation device 1 evaluates the function of the temporal lobe by determining an index of the function of the temporal lobe regarding the visual ability of the subject. Determining an index of temporal lobe function involves calculating the index.
- An object is a virtual object displayed on a display device. The subject is a person who is evaluated by taking a test presented by the evaluation device 1, and can also be referred to as a person to be evaluated or a person to be measured.
- FIG. 1 is a block diagram showing a hardware configuration of the evaluation device 1 according to the embodiment of the present invention.
- the evaluation device 1 includes a processor 11, an input device 12, a display device 13, a line-of-sight detection device 14, a storage device 15, and a communication device 16. Each of these components is connected by a bus 17. It is assumed that an interface is interposed between the bus 17 and each component as necessary.
- the evaluation device 1 can be a computer, a tablet terminal, a smartphone, or the like.
- the evaluation device 1 may be composed of one device or a plurality of devices.
- the processor 11 controls the operation of the entire evaluation device 1.
- the processor 11 is a CPU.
- the processor 11 executes various processes by reading and executing a program or data stored in the storage device 15.
- the processor 11 may be composed of a plurality of processors.
- the input device 12 is a user interface that receives input from the user to the evaluation device 1, and is, for example, a touch panel, a touch pad, a mouse, a keyboard, or a sensor.
- the display device 13 is a display that displays an application screen or the like to the user of the evaluation device 1 under the control of the processor 11.
- the input device 12 is a touch panel 18, and has a structure integrated with the display device 13 (display).
- the line-of-sight detection device 14 is a known eye tracking device or line-of-sight measurement device.
- the line-of-sight detection device 14 includes an image pickup device for detecting the line of sight.
- the line-of-sight detector 14 comprises an infrared camera and an infrared LED.
- the line-of-sight detection device 14 may be a line-of-sight detection device module or the like, and may be built in the evaluation device 1. Alternatively, the line-of-sight detection device 14 may be composed of a plurality of devices.
- the storage device 15 includes a main storage device and an auxiliary storage device.
- the main storage device is a semiconductor memory such as RAM.
- the RAM is a volatile storage medium capable of high-speed reading and writing of information, and is used as a storage area and a work area when the processor 11 processes information.
- the main storage device may include a ROM, which is a read-only non-volatile storage medium.
- the auxiliary storage device stores various programs and data used by the processor 11 when executing each program.
- the auxiliary storage device may be any non-volatile storage or non-volatile memory as long as it can store information, and may be removable.
- the storage device 15 stores an evaluation program for evaluating the function of the temporal lobe regarding the visual ability of the subject, image data of an object referred to by the program, and the like.
- the communication device 16 exchanges data with another computer such as a user terminal or a server via a network, and is, for example, a wireless LAN module.
- The communication device 16 may be another wireless communication device or module, such as a Bluetooth (registered trademark) module, or may be a wired communication device or module, such as an Ethernet (registered trademark) module or a USB interface.
- FIG. 2 is a functional block diagram of the evaluation device 1 according to the embodiment of the present invention.
- the evaluation device 1 includes an input unit 21, a control unit 22, and a line-of-sight detection unit 26.
- the control unit 22 includes a display control unit 23, an input processing unit 24, and an evaluation unit 25.
- these functions are realized by executing the program stored in the storage device 15 by the processor 11.
- Some or all of the functions of one functional block may be provided by other functional blocks.
- these functions may also be realized by hardware by configuring an electronic circuit or the like for realizing a part or all of each function.
- The functions of the display control unit 23 and the input processing unit 24 may be realized by one functional block, may be realized by a larger number of functional blocks, or some of their functions may be realized by other functional blocks.
- the input unit 21 accepts the subject's input.
- the input unit 21 receives, for example, the subject's input for selecting the object 40 with respect to the test screen 30 displayed by the display control unit 23.
- the input unit 21 is configured by using the touch panel 18, and acquires the touch position by receiving the user's touch input to the touch panel 18.
- the input unit 21 is a function generally possessed by a tablet terminal or a smartphone.
- the input unit 21 stores the acquired touch position in a predetermined memory area in the storage device 15 and delivers it to the input processing unit 24.
- The input unit 21 acquires the touch position and the time (information about the time) at which the touch position was acquired, associates the time with the touch position, stores them in a predetermined memory area in the storage device 15, and delivers them to the input processing unit 24.
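A minimal sketch of this timestamped input record (class and method names are hypothetical, not from the specification) might look like this:

```python
import time

class InputUnit:
    """Illustrative sketch of the input unit: each touch position is
    recorded together with the time it was received, then handed on
    for input processing."""
    def __init__(self, clock=time.monotonic):
        self._clock = clock   # injectable clock, e.g. for testing
        self.records = []     # stands in for the memory area in storage
    def on_touch(self, x, y):
        record = ((x, y), self._clock())
        self.records.append(record)
        return record  # in the device, this is delivered to input processing
```

A monotonic clock is a natural choice here, since response times are intervals and should not be affected by wall-clock adjustments.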
- the time when the input unit 21 acquires the touch position can mean the time when the input unit 21 receives the input.
- the input unit 21 and the control unit 22 can acquire information about the time from, for example, an OS.
- the control unit 22 conducts a test on the subject via the test screen 30 and evaluates the function of the temporal lobe regarding the visual ability of the subject.
- the display control unit 23 displays a test screen 30 including a plurality of objects 40 on the display device 13.
- the plurality of objects 40 include a target object 41 that the subject should select and a non-target object 42 that the subject should not select.
- the target object 41 and the non-target object 42 differ in at least one of graphic features, colors, and patterns.
- The display control unit 23 displays a preset number of target objects 41 and non-target objects 42, each at a preset position, according to the test screen 30.
- each of the objects 40 displayed by the display control unit 23 is a two-dimensional object having no information in the depth direction of the test screen 30.
- each of the objects 40 displayed by the display control unit 23 is a picture such as an illustration.
- each of the objects 40 displayed by the display control unit 23 may be a three-dimensional object, or may be an image instead of a picture.
- the display control unit 23 displays various patterns of test screens 30 on the display device 13 according to, for example, the contents of the evaluation program.
- the test screen 30 includes a reference object 43 indicating a target object 41 to be selected by the subject.
- the test screen 30 includes a user interface (UI) object 44.
- the UI object 44 is an object that includes a message to the subject and can be selected by the subject.
- the display control unit 23 displays a test screen 30 including the target object 41 and the non-target object 42 whose shape, color, or pattern has a predetermined degree of similarity with the target object 41 on the display device 13.
- the non-target object 42 is an object in which at least one of the graphic features, colors, and patterns has a predetermined degree of similarity to the target object 41.
- The non-target object 42 is an object in which at least one of the graphic features, colors, and patterns is the same as in the target object 41, but at least one other of them is different.
- a graphic feature means a graphic feature of at least a part of an object.
- the target object 41 and the non-target object 42 are objects that may give a similar impression at first glance.
- the display control unit 23 displays a test screen 30 including a target object 41 and a non-target object 42 rotated at random angles about a predetermined position in each object 40 on the display device 13.
- each of the target object 41 and the non-target object 42 contains information about a preset orientation.
- the display control unit 23 displays a test screen 30 including each object 40 rotated by a random angle about a predetermined position in each object 40 with respect to a preset direction of each object 40.
- the predetermined position in each object 40 is, for example, the center position or the center of gravity position of each object 40.
- the display control unit 23 can also display each object 40 by rotating the object 40 by a preset angle.
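The rotation described above, about a predetermined position such as the object's center, is standard 2D geometry. As an illustrative sketch (the function name is hypothetical), rotating an object's outline points by a given angle around a center can be written as:

```python
import math

def rotate_about(points, center, angle_deg):
    """Rotate outline points by angle_deg (counterclockwise) around a
    given center, e.g. the object's center position."""
    cx, cy = center
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [(cx + (x - cx) * cos_a - (y - cy) * sin_a,
             cy + (x - cx) * sin_a + (y - cy) * cos_a)
            for x, y in points]
```

A random angle per object (e.g. `random.uniform(0, 360)`) then yields the randomly rotated objects of the mental rotation task.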
- the test screen 30 displayed by the display control unit 23 can include a mental rotation task.
- The target object 41 is a simple person illustration of one pattern having one characteristic, and the non-target objects 42 are simple person illustrations of a plurality of patterns having other characteristics different from that one characteristic.
- The characteristics of the person illustrations are at least one of body shape, hairstyle, presence or absence of a hat, posture, and clothes, and at least one of these differs between the target object 41 and the non-target object 42.
- The display control unit 23 displays a plurality of target objects 41 of one pattern having the same characteristics, rotating each of the plurality of target objects 41 by a random angle or a preset angle.
- For each pattern of the plurality of patterns of non-target objects 42, the display control unit 23 displays the one or more non-target objects 42 of that pattern, which share the same characteristics, rotating each by a random angle or a preset angle.
- FIG. 3 is a diagram showing an example (test screen 30a) of the test screen 30 displayed by the display control unit 23.
- the display control unit 23 displays a test screen 30a including a plurality of target objects 41a and a plurality of non-target objects 42a.
- the test screen 30a includes a reference object 43a indicating a target object 41a to be selected by the subject.
- The test screen 30a includes a UI object 44a to be selected when interrupting the test, and a UI object 44b to be selected when no target object identical to the reference object 43a exists.
- the test screen 30a includes a target object 41a and a non-target object 42a, each of which is arranged in a state of being rotated by a random angle.
- The test screen 30a shown in FIG. 3 constitutes a mental rotation task.
- the input processing unit 24 determines whether or not the target object 41 is selected based on the input of the subject received by the input unit 21.
- the input processing unit 24 determines whether or not the target object 41 or the non-target object 42 is selected based on the input of the subject received by the input unit 21. By this determination process, the input processing unit 24 can also determine whether or not the subject's input received by the input unit 21 is an input corresponding to the selection of the target object 41 or the non-target object 42. When the input of the subject received by the input unit 21 is the input corresponding to the selection of the target object 41 or the non-target object 42, the input processing unit 24 accepts the input of the subject as the object selection input.
- When the target object 41 is selected, the input processing unit 24 associates the information indicating the target object 41 with the time at which the input unit 21 received the input corresponding to the selection, and stores them in a predetermined memory area in the storage device 15.
- Likewise, when the non-target object 42 is selected, the input processing unit 24 associates the information indicating the non-target object 42 with the time at which the input unit 21 received the input corresponding to the selection, and stores them in a predetermined memory area in the storage device 15.
- the control unit 22 stores the time when the display control unit 23 starts displaying the test screen 30 in a predetermined memory area in the storage device 15.
- the input processing unit 24 compares the touch position acquired from the input unit 21 with the positions of the target object 41 and the non-target object 42. When the acquired touch position is within the range of the target object 41, the input processing unit 24 determines that the target object 41 has been selected and accepts it as the selection input of the target object 41. When the acquired touch position is within the range of the non-target object 42, the input processing unit 24 determines that the non-target object 42 has been selected and accepts it as the selection input of the non-target object 42. When the acquired touch position is not within the range of the target object 41 or the non-target object 42, the input processing unit 24 does not accept the subject's input accepted by the input unit 21 as the selection input.
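The hit test above can be sketched as follows. This is an assumption-laden illustration: the specification does not define object "ranges", so axis-aligned bounding boxes are used here as the simplest stand-in, and the function name is invented:

```python
# Illustrative sketch of the touch-position comparison: each object's
# range is assumed to be an axis-aligned bounding box (x0, y0, x1, y1).

def classify_touch(touch, target_boxes, non_target_boxes):
    """Returns 'target', 'non_target', or None (input not accepted
    as a selection input)."""
    def hit(box):
        x0, y0, x1, y1 = box
        return x0 <= touch[0] <= x1 and y0 <= touch[1] <= y1
    if any(hit(b) for b in target_boxes):
        return "target"
    if any(hit(b) for b in non_target_boxes):
        return "non_target"
    return None
```

A touch outside every object's range yields None, matching the rule that such input is not accepted as a selection input.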
- The display control unit 23 sequentially displays different test screens 30 on the display device 13 while the predetermined condition is satisfied, and the input processing unit 24 determines, based on the subject's input received by the input unit 21, whether or not the target object 41 has been selected.
- the predetermined condition is a predetermined condition for the test performed by the evaluation device 1.
- The predetermined condition is, for example, that the time since the display control unit 23 displayed the first test screen 30 is within a predetermined time, or that the number of test screens 30 sequentially displayed by the display control unit 23 is within a predetermined number.
- the evaluation unit 25 evaluates the function of the temporal lobe regarding the visual ability of the subject based on the response time required for the subject to input when it is determined by the input processing unit 24 that the target object 41 is selected.
- The evaluation unit 25 evaluates the function of the temporal lobe by determining an index of the function of the temporal lobe regarding the visual ability of the subject. For example, a higher index indicates that the temporal lobe is more activated or used; the index is determined to be higher the shorter the response time is, and lower the longer the response time is.
- The evaluation unit 25 determines, as the response time, either the time from when the display control unit 23 displays the test screen 30 until the input processing unit 24 accepts an object selection input, or the time from receiving the immediately preceding object selection input to receiving the current object selection input, and determines the index of the function of the temporal lobe based on the determined response time.
- The evaluation unit 25 determines the response time in each case where the input processing unit 24 determines that the target object 41 has been selected, and determines the index of the function of the temporal lobe based on the determined response times.
- The evaluation unit 25 determines the index of the function of the temporal lobe by calculating the average of the response times (average response time) determined in each case where the input processing unit 24 determines that the target object 41 has been selected.
- The evaluation unit 25 determines the response time on each of the test screens 30 and, based on the determined response times, for example by calculating their average (average response time), evaluates the function of the temporal lobe with respect to the subject's visual ability. For example, the evaluation unit 25 determines the index of the function of the temporal lobe by calculation, using a correspondence table or correspondence relationship, predetermined for the test performed by the evaluation device 1 on the subject, between the response time and the index of the function of the temporal lobe.
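The correspondence-table step can be sketched as below. The table contents and the 1-to-5 index scale are purely hypothetical, since the specification does not give concrete values; only the shape of the mapping (shorter average response time, higher index) comes from the text:

```python
# Illustrative sketch: average response time mapped to a temporal-lobe
# function index via a correspondence table. Thresholds (seconds) and
# the 1-5 scale are invented for illustration.

def average(values):
    return sum(values) / len(values)

def temporal_lobe_index(avg_response_time,
                        table=((1.0, 5), (2.0, 4), (3.0, 3), (4.0, 2))):
    for threshold, index in table:
        if avg_response_time <= threshold:
            return index
    return 1  # slowest band of the hypothetical table
```

For example, response times of 0.8 s and 1.0 s average to 0.9 s, which this sketch's table maps to the highest index.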
- a plurality of patterns are set, and the evaluation device 1 displays a different set of test screens 30 for each pattern.
- A plurality of humans are made to take the tests of the plurality of patterns presented by the evaluation device 1 in advance, an average response time is determined for each test pattern displayed by the evaluation device 1, and the determined average response times are stored in the storage device 15 as sample data.
- The evaluation unit 25 can determine the function of the temporal lobe regarding the visual ability of the subject as a relative evaluation by comparing it with the sample data.
- The evaluation unit 25 determines an index of the function of the temporal lobe on each of the test screens 30 and, based on the determined indices, for example, calculates their average (average index).
- The evaluation unit 25 determines the index of the function of the temporal lobe by calculation, using a correspondence table or correspondence relationship between the response time and the index of the function of the temporal lobe predetermined for each test screen 30 displayed by the evaluation device 1.
- a plurality of humans are made to undergo a test presented by the evaluation device 1 in advance, an average index is determined for each test screen 30 displayed by the evaluation device 1, and the storage device 15 samples the determined average index for each person.
- the evaluation unit 25 can determine the function of the temporal lobe regarding the visual ability of the subject and can determine the relative evaluation by comparing with the sample data.
- when the ratio between the number of times the input processing unit 24 determines that the target object 41 has been selected and the number of times it determines that the non-target object 42 has been selected falls within a predetermined range, the evaluation unit 25 outputs information indicating that the function of the temporal lobe is not evaluated or cannot be evaluated.
- the ratio of the number of times the target object 41 is determined to have been selected to the number of times the non-target object 42 is determined to have been selected corresponds to the correct answer rate for selecting the target object 41.
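The correct answer rate and the gating of evaluation can be sketched as follows. The function names and the 0.8 threshold are assumptions for illustration; the disclosure states only that evaluation is withheld when the ratio lies within a predetermined range.

```python
def correct_answer_rate(target_hits, non_target_hits):
    """Ratio of target selections to all selections: the correct answer rate."""
    total = target_hits + non_target_hits
    return target_hits / total if total else 0.0

def can_evaluate(target_hits, non_target_hits, threshold=0.8):
    """Evaluate temporal lobe function only when the correct answer
    rate reaches the (hypothetical) threshold; otherwise the device
    would report that evaluation cannot be performed."""
    return correct_answer_rate(target_hits, non_target_hits) >= threshold

print(can_evaluate(9, 1))  # True  (rate 0.9)
print(can_evaluate(2, 3))  # False (rate 0.4)
```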
- the line-of-sight detection unit 26 detects the subject's line-of-sight information by analyzing images of the subject captured by the imaging device included in the line-of-sight detection device 14, for example based on the subject's corneal reflection and pupil position.
- the line-of-sight detection unit 26 stores the detected line-of-sight information of the subject in a predetermined memory area in the storage device 15 and passes it to the input processing unit 24.
- the input processing unit 24 does not determine that the target object has been selected if the line of sight detected by the line-of-sight detection unit 26 is not within the predetermined area containing the object on the test screen 30 displayed by the display control unit 23.
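A minimal sketch of this gaze gating follows. The coordinate system, region format, and function names are assumptions for illustration.

```python
def gaze_in_region(gaze_xy, region):
    """region = (x, y, width, height): the predetermined area around an object."""
    gx, gy = gaze_xy
    x, y, w, h = region
    return x <= gx <= x + w and y <= gy <= y + h

def accept_selection(gaze_xy, object_region):
    """Only accept a touch/click as a selection of the object if the
    detected line of sight lies inside the object's predetermined area."""
    return gaze_in_region(gaze_xy, object_region)

print(accept_selection((120, 80), (100, 60, 50, 50)))  # True
print(accept_selection((10, 10), (100, 60, 50, 50)))   # False
```

Gating on gaze in this way ensures a selection is counted only when the subject was actually looking at (a region around) the object being selected.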
- the evaluation device 1 (control unit 22) starts a test on the subject via the test screen 30 when the evaluation program stored in the storage device 15 is executed.
- the display control unit 23 displays the test mode selection screen 31 and the test start screen 32 before the start of the test, and displays the evaluation screen 33 after the end of the test.
- FIG. 4 is a diagram showing an example of the mode selection screen 31 displayed by the display control unit 23.
- the mode selection screen 31 is a screen for setting a time limit or a designated number of times as a predetermined condition for the test performed by the evaluation device 1.
- the user of the evaluation device 1 can determine the mode of the test, for example, by inputting a numerical value of time (minutes) or number of times and pressing the "start" button of time or number of times.
- the input processing unit 24 determines the test mode based on the input of the subject received by the input unit 21.
- the display control unit 23 displays the start screen 32.
- FIG. 5 is a diagram showing an example of the start screen 32 displayed by the display control unit 23.
- the input processing unit 24 determines the start of the test when the input of the subject received by the input unit 21 is the input corresponding to the "start".
- the display control unit 23 displays the test screen 30 when the start of the test is determined.
- FIG. 6 is a diagram showing an example of the evaluation screen 33 displayed by the display control unit 23.
- the evaluation screen 33 shown in FIG. 6 presents answer information indicating whether or not each of the subject's answers is correct (that is, whether or not the selected object is the target object 41), together with the response time the subject required to input each answer.
- in the example shown, the first selection is the non-target object 42 with a response time of 16.079 seconds, the second selection is the target object 41 with a response time of 2.321 seconds, and the third selection is the target object 41 with a response time of 2.543 seconds.
- the evaluation screen 33 includes an index of temporal lobe function with respect to the visual ability of the subject (not shown).
- FIG. 7 is a diagram showing an example of a flowchart illustrating information processing executed in the evaluation device 1 according to the embodiment of the present invention. This flowchart applies when a time limit is set as the predetermined condition for the test performed by the evaluation device 1.
- in step 101, the display control unit 23 displays the test screen 30.
- in step 102, the control unit 22 determines whether the current time is within the time limit measured from when step 101 was first executed; as long as it is within the time limit, the flowchart proceeds to step 103.
- in step 103, the input processing unit 24 determines, based on the subject's input received by the input unit 21, whether the target object 41 or the non-target object 42 has been selected. If neither has been selected, the flowchart proceeds to step 102. If the target object 41 has been selected, the flowchart proceeds to step 104; if the non-target object 42 has been selected, the flowchart proceeds to step 106.
- in step 104, the input processing unit 24 stores, in a predetermined memory area of the storage device 15, information indicating the target object 41 and the time at which the input unit 21 received the input corresponding to the selection.
- in step 106, the input processing unit 24 stores, in a predetermined memory area of the storage device 15, information indicating the non-target object 42 and the time at which the input unit 21 received the input corresponding to the selection.
- in step 105, the control unit 22 determines whether or not to end the displayed test screen 30. For example, the control unit 22 determines whether the selection of all the target objects 41 included in the displayed test screen 30 has been completed, and if so, determines that the test screen 30 is finished. When the control unit 22 determines that the test screen 30 is finished, the flowchart returns to step 101, where the display control unit 23 displays a new preset test screen 30. If the control unit 22 determines that the test screen 30 is not finished, the flowchart proceeds to step 102.
- in step 107, the evaluation unit 25 determines the index of temporal lobe function with respect to the subject's visual ability based on the response time required for the subject to input when the input processing unit 24 determined that the target object 41 was selected. For example, when the subject's answers are the answer information shown in FIG. 6, the second and third selections are the target object 41, so the evaluation unit 25 determines the index of temporal lobe function with respect to the subject's visual ability based on the second and third response times; the determination in this case is not based on the first response time.
- in step 108, the display control unit 23 displays, on the display device 13, the evaluation screen 33 including the index determined in step 107.
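The loop of steps 101 to 107 can be sketched against a pre-recorded input stream. This is an illustrative simplification only: the event format, names, and per-selection timing rule (each response timed from the previous input, the first from screen display) are assumptions.

```python
def run_timed_test(events, time_limit_sec):
    """Sketch of the FIG. 7 flow. `events` is a pre-recorded stream of
    (timestamp_sec, is_target) selection inputs, with the screen shown
    at t = 0. The loop stops at the time limit (step 102), stores each
    result with its response time (steps 104/106), and returns the
    average response time of correct selections (step 107 input)."""
    log = []
    last = 0.0                              # screen displayed at t = 0 (step 101)
    for t, is_target in events:
        if t > time_limit_sec:              # step 102: outside the time limit
            break
        log.append((is_target, t - last))   # steps 104/106: store result + time
        last = t                            # next response timed from here
    correct = [rt for ok, rt in log if ok]
    return sum(correct) / len(correct) if correct else None

# The FIG. 6 example: a wrong selection at 16.079 s, then correct
# selections 2.321 s and 2.543 s after the preceding inputs.
events = [(16.079, False), (18.400, True), (20.943, True)]
print(round(run_timed_test(events, time_limit_sec=60), 3))  # 2.432
```

Note that, as in step 107 above, the wrong first selection contributes nothing to the returned average.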
- the control unit 22 transmits data related to the evaluation screen 33 to another device via the communication device 16.
- FIG. 8 is a diagram showing an example of a flowchart illustrating information processing executed in the evaluation device 1 according to the embodiment of the present invention.
- this flowchart applies when a designated number of times is set as the predetermined condition for the test performed by the evaluation device 1.
- the differences from the flowchart shown in FIG. 7 will be mainly described.
- in step 110, the control unit 22 determines whether the number of test screens 30 displayed by the display control unit 23 equals the set designated number. If it does, the flowchart proceeds to step 111; if not, that is, if the number of test screens 30 displayed by the display control unit 23 is less than the set designated number, the flowchart proceeds to step 101. Steps 111 and 112 are the same as steps 107 and 108, respectively.
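The FIG. 8 variant, which ends after a designated number of test screens rather than at a time limit, can be sketched as follows. The per-screen log format and names are assumptions for illustration.

```python
def run_counted_test(screen_logs, designated_count):
    """Sketch of the FIG. 8 flow: the test ends once `designated_count`
    test screens have been completed (step 110). `screen_logs` holds one
    list of (is_target, response_time) records per completed screen.
    Returns the average response time of correct selections (steps 111/112)."""
    used = screen_logs[:designated_count]        # stop at the designated number
    correct = [rt for log in used for ok, rt in log if ok]
    return sum(correct) / len(correct) if correct else None

logs = [[(True, 2.0), (False, 5.0)], [(True, 3.0)], [(True, 9.0)]]
print(run_counted_test(logs, designated_count=2))  # 2.5
```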
- Non-Patent Document 1 discloses that a subject whose temporal lobe is activated in a mental rotation task and another subject have a difference in response time.
- Patent Document 1 discloses a higher brain dysfunction rehabilitation device that presents task data to a patient and provides predetermined teaching data according to the patient's error state; however, it does not disclose the technical idea or technical content of the evaluation device 1 of the present embodiment.
- the display control unit 23 displays the test screen 30 including the target object 41 that the subject should select and the non-target object 42 that the subject should not select on the display device 13.
- the display control unit 23 can display the test screen 30 including the non-target object 42 whose shape, color, or pattern has a predetermined degree of similarity with the target object 41 on the display device 13.
- the input processing unit 24 determines whether or not the target object 41 is selected based on the input of the subject received by the input unit 21.
- the display control unit 23 sequentially displays different test screens 30 on the display device 13 while the predetermined condition is satisfied, and the input processing unit 24 determines, based on the subject's input received by the input unit 21, whether the target object 41 has been selected.
- the evaluation unit 25 evaluates the function of the temporal lobe regarding the visual ability of the subject based on the response time required for the subject to input when it is determined by the input processing unit 24 that the target object 41 is selected.
- when the ratio between the number of times the input processing unit 24 determines that the target object 41 has been selected and the number of times it determines that the non-target object 42 has been selected falls within a predetermined range, the evaluation unit 25 outputs information indicating that the function of the temporal lobe is not evaluated or cannot be evaluated.
- this allows the evaluation device 1 to perform evaluation only when the correct answer rate is at or above a predetermined level. Evaluation is thus limited to subjects who have properly viewed the test screen 30, enabling a more appropriate evaluation of temporal lobe function with respect to the subject's visual ability.
- the evaluation unit 25 can also be configured to evaluate or skip each test screen 30 individually according to its correct answer rate. As a result, only the test screens 30 on which the subject responded with concentration are evaluated, again enabling a more appropriate evaluation of temporal lobe function with respect to the subject's visual ability. Further, for example, the evaluation device 1 can be made to evaluate only when all answers are correct; in that case evaluation is performed only when the subject concentrated on the test and answered appropriately, which enables an even more appropriate evaluation of temporal lobe function with respect to the subject's visual ability.
- the input processing unit 24 does not determine that the target object 41 has been selected if there is no line of sight detected in the predetermined area including the object 40 in the test screen 30 displayed by the display control unit 23.
- the inventors of the present application have proposed in Japanese Patent Application No. 2020-082634 an evaluation system for quantifying the function of the parietal lobe when visually recognizing an object or a person.
- by quantifying the functions of the parietal lobe and the temporal lobe when visually recognizing an object or a person, an objective evaluation is performed, and it becomes possible to grasp each individual's characteristics more appropriately from the aspect of brain function.
- a program that realizes the functions of the embodiment of the present invention described above or the information processing shown in the flowcharts, or a computer-readable storage medium storing such a program, can also be used.
- a method that realizes the functions of the embodiment of the present invention described above or the information processing shown in the flowcharts can also be used.
- a server capable of supplying a computer with a program that realizes the functions of the embodiment of the present invention described above and the information processing shown in the flowcharts can also be used.
- a virtual machine that realizes the functions of the embodiment of the present invention described above and the information processing shown in the flowcharts can also be used.
- the input device 12 can be a mouse instead of the touch panel 18, and the display device 13 can be a display that is not the touch panel 18. In this case, clicking the mouse pointer corresponds to touch input to the touch panel 18.
- the display device 13 can be a head-mounted display (HMD) instead of the touch panel 18.
- the HMD has an electronic display built in a housing shaped like goggles, and is configured to display an image in the line-of-sight direction of the wearing user.
- the input device 12 can be any device, sensor, or the like attached to the HMD.
- the evaluation unit 25 can also be configured to evaluate temporal lobe function regardless of the ratio between the number of times the input processing unit 24 determines that the target object 41 has been selected and the number of times it determines that the non-target object 42 has been selected.
- the evaluation device 1 may not include the line-of-sight detection device 14. In this case, the evaluation device 1 does not include the line-of-sight detection unit 26 as a functional block.
- when the display control unit 23 displays the test screen 30 and presents a mental rotation task, it displays a test screen 30 whose objects 40 (the target object 41 and the non-target object 42) have stimulus elements (color, shape, pattern) of at least a certain complexity.
- the object 40 can be a three-dimensional figure having stimulus elements of at least a certain complexity.
- Non-Patent Document 2 reports that when the figure presented in a mental rotation task has a certain level of difficulty, object recognition by the temporal lobe system is strengthened during the task without mental rotation. Therefore, if the temporal lobe system is strengthened while visual characteristics are being measured through the mental rotation task, the visual characteristics may not be measured accurately.
- because the display control unit 23 is configured to display, when presenting a mental rotation task, a test screen 30 including objects 40 having stimulus elements of at least a certain complexity, temporal lobe function with respect to the subject's visual ability can be evaluated more appropriately even when the subject takes the test a plurality of times.
- the non-target object 42 is an image of one person, and the target object 41 is a face image obtained by applying predetermined processing to that person image; for example, the target object 41 is a face image obtained by replacing only the face portion of the person in the non-target object 42 with the face portion of another person.
- FIG. 9 is a diagram showing an example (test screen 30b) of the test screen 30 displayed by the display control unit 23.
- the display control unit 23 displays the test screen 30b including the target objects 41b-1 and 41b-2 and the non-target objects 42b-1 and 42b-2.
- the non-target object 42b-1 is an ordinary person image of Mr. A, and the target object 41b-1 is a dummy image in which the main part of the face in Mr. A's person image (the part including the eyes, nose, mouth, and eyebrows) has been replaced with the corresponding part of Mr. B.
- the non-target object 42b-2 is an ordinary person image of Mr. B, and the target object 41b-2 is a dummy image in which the main part of the face in Mr. B's person image (the part including the eyes, nose, mouth, and eyebrows) has been replaced with the corresponding part of Mr. A.
- the test screen 30b is a screen that displays a plurality of ordinary person images of Mr. A and of Mr. B together with one dummy image derived from Mr. A's person image and one derived from Mr. B's person image, and asks the subject to select each dummy image.
- the test screen 30b displays a relatively large number, for example 10 or more, of ordinary person images of Mr. A and of Mr. B.
- the test screen 30b can include an area displaying reference objects 43b, which show the ordinary person images and the dummy images of Mr. A and Mr. B for reference.
- the target object 41 is an illustration of the palm of the right hand
- the non-target object 42 is an illustration of the palm of the left hand.
- each of the target object 41 and the non-target object 42 is displayed in a state of being rotated by a random angle.
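The random rotation of the displayed objects can be sketched as follows. The names and the kind/angle representation are assumptions for illustration; the disclosure specifies only that each object is displayed rotated by a random angle.

```python
import random

def rotated_objects(n_targets, n_non_targets, seed=None):
    """Sketch: assign each object on the test screen a random rotation
    angle (degrees) about a predetermined point within the object."""
    rng = random.Random(seed)  # seedable for reproducible test screens
    objects = (["target"] * n_targets) + (["non-target"] * n_non_targets)
    return [(kind, rng.uniform(0, 360)) for kind in objects]

for kind, angle in rotated_objects(2, 2, seed=1):
    print(f"{kind}: {angle:.1f} deg")
```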
- there is a single target object 41, and that target object 41 is a UI object 44, selectable by the subject, that includes a message to the subject.
- the display control unit 23 displays the test screen 30 including only the one target object 41 on the display device 13.
- when the input processing unit 24 determines that the target object 41 has been selected, the evaluation unit 25 evaluates the function with respect to the subject's visual ability based on the response time required for the subject to input.
- the evaluation unit 25 evaluates temporal lobe function with respect to the subject's visual ability based on the ratio (γ) between the number of times (α) the input processing unit 24 determines that the target object 41 has been selected and the number of times (β) it determines that the non-target object 42 has been selected, and on the response time (δ) required for the subject to input when the input processing unit 24 determines that the target object 41 has been selected.
- the evaluation unit 25 calculates the ratio (γ) corresponding to the correct answer rate (α / (α + β)), calculates the average response time (δ), and calculates an index of temporal lobe function as the ratio (γ) divided by the average response time (δ).
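Writing α for the number of target selections, β for the number of non-target selections, and δ for the average response time of correct selections, this combined index (correct answer rate divided by average response time) can be sketched as follows. The function name and units are assumptions for illustration.

```python
def temporal_lobe_index(target_hits, non_target_hits, response_times_sec):
    """Sketch of the combined index: the ratio gamma = alpha / (alpha + beta),
    i.e. the correct answer rate, divided by the average response time delta
    of the correct selections. A higher value means more correct answers
    delivered faster."""
    alpha, beta = target_hits, non_target_hits
    gamma = alpha / (alpha + beta)                            # correct answer rate
    delta = sum(response_times_sec) / len(response_times_sec) # average response time
    return gamma / delta

# The FIG. 6 example: 2 correct, 1 incorrect, correct times 2.321 s and 2.543 s.
print(round(temporal_lobe_index(2, 1, [2.321, 2.543]), 4))  # 0.2741
```

Dividing the rate by the time rewards both accuracy and speed in a single number, which is one natural way to combine the two quantities described above.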
- the display control unit 23 displays only one test screen 30 on the display device 13.
- the evaluation unit 25 evaluates the function of the temporal lobe with respect to the subject's visual ability based on the response time required for the subject to input when the input processing unit 24 determines that the target object 41 has been selected on that test screen 30.
- the processes and operations described above can be freely modified as long as no contradiction arises, such as a step using data that should not yet be available at that step. Further, each of the examples described above is an example for explaining the present invention, and the present invention is not limited to these examples. The present invention can be carried out in various forms without departing from its gist.
- Evaluation device 11 Processor 12 Input device 13 Display device 14 Line-of-sight detection device 15 Storage device 16 Communication device 17 Bus 18 Touch panel 21 Input unit 22 Control unit 30 Test screen 31 Mode selection screen 32 Start screen 33 Evaluation screen 40 Object 41 Target object 42 Non-target object 43 Reference object 44 UI object
Abstract
Description
An evaluation device for evaluating a subject's object recognition ability, the device comprising:
an input unit that receives input from the subject;
a display control unit that displays, on a display device, a test screen including a target object that the subject should select and a non-target object that the subject should not select;
an input processing unit that determines, based on the subject's input received by the input unit, whether the target object has been selected; and
an evaluation unit that evaluates the subject's object recognition ability based on the response time required for the subject to input when the input processing unit determines that the target object has been selected.
When the input processing unit determines that the target object has been selected, the evaluation unit determines, as the response time, the time from when the display control unit displays the test screen until the input processing unit receives the object selection input, or the time from receipt of the immediately preceding object selection input until receipt of that object selection input, and evaluates the subject's object recognition ability by determining an index of object recognition ability based on the determined response time.
The evaluation unit determines a response time each time the input processing unit determines that the target object has been selected, and evaluates the subject's object recognition ability by determining an index of object recognition ability based on the determined response times.
The evaluation unit determines a response time on each test screen, and evaluates the subject's object recognition ability by determining an index of object recognition ability based on the determined response times.
When the ratio between the number of times the input processing unit determines that the target object has been selected and the number of times it determines that the non-target object has been selected falls within a predetermined range, the evaluation unit does not evaluate object recognition ability or outputs information indicating that object recognition ability cannot be evaluated.
The input processing unit does not determine that the target object has been selected if the detected line of sight is not within a predetermined area containing an object on the test screen displayed by the display control unit.
A method for evaluating a subject's object recognition ability, the method comprising:
displaying, on a display device, a test screen including a target object that the subject should select and a non-target object that the subject should not select;
determining, based on received input from the subject, whether the target object has been selected; and
evaluating object recognition ability based on the response time required for the subject to input when it is determined that the target object has been selected.
Claims (12)
- An evaluation device for evaluating a subject's object recognition ability, the device comprising: an input unit that receives input from the subject; a display control unit that displays, on a display device, a test screen including a target object that the subject should select and a non-target object that the subject should not select; an input processing unit that determines, based on the subject's input received by the input unit, whether the target object has been selected; and an evaluation unit that evaluates the subject's object recognition ability based on the response time required for the subject to input when the input processing unit determines that the target object has been selected.
- The evaluation device according to claim 1, wherein the input processing unit accepts the subject's input as an object selection input when the subject's input received by the input unit corresponds to selection of a target object or a non-target object, and wherein, when the input processing unit determines that the target object has been selected, the evaluation unit determines, as the response time, the time from when the display control unit displays the test screen until the input processing unit accepts the object selection input, or the time from acceptance of the immediately preceding object selection input until acceptance of that object selection input, and evaluates the subject's object recognition ability by determining an index of object recognition ability based on the determined response time.
- The evaluation device according to claim 1 or 2, wherein the display control unit displays, on the display device, a test screen including a plurality of target objects, and the evaluation unit determines a response time each time the input processing unit determines that a target object has been selected, and evaluates the subject's object recognition ability by determining an index of object recognition ability based on the determined response times.
- The evaluation device according to any one of claims 1 to 3, wherein the display control unit displays a test screen including a non-target object that differs from the target object while having a predetermined degree of similarity to it in graphical features, color, or pattern.
- The evaluation device according to any one of claims 1 to 4, wherein the display control unit displays, on the display device, a test screen including target objects and non-target objects each rotated by a random angle about a predetermined position within the object.
- The evaluation device according to any one of claims 1 to 3, wherein the display control unit displays, on the display device, a test screen including only one target object, and the target object is a UI object that includes a message to the subject and is selectable by the subject.
- The evaluation device according to any one of claims 1 to 6, wherein the display control unit sequentially displays different test screens on the display device while a predetermined condition is satisfied, and the evaluation unit determines a response time on each test screen and evaluates the subject's object recognition ability by determining an index of object recognition ability based on the determined response times.
- The evaluation device according to any one of claims 1 to 7, wherein the input processing unit determines, based on the subject's input received by the input unit, whether a target object or a non-target object has been selected, and the evaluation unit, when the ratio between the number of times the target object is determined to have been selected and the number of times the non-target object is determined to have been selected falls within a predetermined range, does not evaluate object recognition ability or outputs information indicating that object recognition ability cannot be evaluated.
- The evaluation device according to any one of claims 1 to 8, further comprising a line-of-sight detection unit that detects the subject's line of sight, wherein the input processing unit does not determine that the target object has been selected if the detected line of sight is not within a predetermined area containing an object on the test screen displayed by the display control unit.
- The evaluation device according to any one of claims 1 to 9, wherein the evaluation unit evaluates the subject's object recognition ability based on the ratio between the number of times the input processing unit determines that the target object has been selected and the number of times it determines that the non-target object has been selected, and on the response time required for the subject to input when the input processing unit determines that the target object has been selected.
- A method for evaluating a subject's object recognition ability, the method comprising the steps of: displaying, on a display device, a test screen including a target object that the subject should select and a non-target object that the subject should not select; determining, based on received input from the subject, whether the target object has been selected; and evaluating object recognition ability based on the response time required for the subject to input when it is determined that the target object has been selected.
- A program that causes a computer to execute each step of the method according to claim 11.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022565461A JPWO2022114143A1 (ja) | 2020-11-26 | 2021-11-26 | |
EP21898115.7A EP4252733A1 (en) | 2020-11-26 | 2021-11-26 | Evaluation device, method, and program for evaluating ability to identify object |
US18/038,646 US20240032850A1 (en) | 2020-11-26 | 2021-11-26 | Evaluation device, method, and program for evaluating ability to identify object |
CN202180079656.5A CN116507306A (zh) | 2020-11-26 | 2021-11-26 | 用于评价物体认知能力的评价装置、方法以及程序 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020196171 | 2020-11-26 | ||
JP2020-196171 | 2020-11-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022114143A1 true WO2022114143A1 (ja) | 2022-06-02 |
Family
ID=81754444
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/043460 WO2022114143A1 (ja) | 2020-11-26 | 2021-11-26 | 物体認知能力を評価するための評価装置、方法、及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240032850A1 (ja) |
EP (1) | EP4252733A1 (ja) |
JP (1) | JPWO2022114143A1 (ja) |
CN (1) | CN116507306A (ja) |
WO (1) | WO2022114143A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001079050A (ja) | 1999-09-16 | 2001-03-27 | Japan Science & Technology Corp | 高次脳機能障害リハビリテーション装置 |
US20160367180A1 (en) * | 2015-06-17 | 2016-12-22 | Obsevera, Inc. | Apparatus and method of conducting medical evaluation of add/adhd |
JP2017205191A (ja) * | 2016-05-17 | 2017-11-24 | 公立大学法人会津大学 | 被験者の識別・反応機能を計測するための識別・反応計測装置、及び被験者の識別・反応機能の計測を実行制御するプログラム |
JP2018175052A (ja) * | 2017-04-05 | 2018-11-15 | 北海道公立大学法人 札幌医科大学 | 診断支援システム、診断支援システムの作動方法、及びプログラム |
JP2019208758A (ja) * | 2018-06-01 | 2019-12-12 | レデックス株式会社 | 認知機能測定システム、認知機能測定通信システム及びプログラム |
JP2020082634A (ja) | 2018-11-29 | 2020-06-04 | 株式会社パイロットコーポレーション | 筆記具 |
JP2020127687A (ja) * | 2019-02-12 | 2020-08-27 | 株式会社Jvcケンウッド | 評価装置、評価方法、及び評価プログラム |
Non-Patent Citations (4)
Title |
---|
ALEXANDER PROVOSTBLAKE JOHNSONFRINI KARAYANIDISSCOTT D. BROWNANDREW HEATHCOTE: "Two Routes to Expertise in Mental Rotation", COGNITIVE SCIENCE, vol. 37, 2013, pages 1321 - 1342 |
FUMI KUSUMOTORYOTA IMAITAKAYUKI KODAMASHU MORIOKA: "Relation between Brain Activity and Reaction Time in a Mental Rotation Task", PHYSICAL THERAPY SCIENCE, SOCIETY OF PHYSICAL THERAPY SCIENCE, vol. 29, no. 4, 2014, pages 479 - 483, XP055933216 |
KUSUMOTO, AYA,, IMAI, RYOTA, KODAMA, TAKAYUKI, MORIOKA: "Relation between Brain Activity and Reaction Time in a Mental Rotation Task", RIGAKURYOHO KAGAKU, vol. 29, no. 4, 1 September 2014 (2014-09-01), pages 479 - 483, XP055933216 * |
TARAGIN, DORIT,, TZURIEL, DAVID, VAKIL, EL: "Mental Rotation: The Effects of Processing Strategy, Gender and Task Characteristics on Children’s Accuracy, Reaction Time and Eye Movements' Pattern. ", JOURNAL OF EYE MOVEMENT, vol. 12, no. 8, 1 November 2019 (2019-11-01), pages 1 - 19, XP055933221 * |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022114143A1 (ja) | 2022-06-02 |
US20240032850A1 (en) | 2024-02-01 |
CN116507306A (zh) | 2023-07-28 |
EP4252733A1 (en) | 2023-10-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
RU2754195C2 (ru) | Система для измерения совокупности клинических параметров функции зрения | |
US10849491B2 (en) | Method and device for determining the visual acuity of a user | |
US9844317B2 (en) | Method and system for automatic eyesight diagnosis | |
JP2018520820A (ja) | 視覚の様相を検査する方法及びシステム | |
US10709328B2 (en) | Main module, system and method for self-examination of a user's eye | |
CN106256312B (zh) | 认知功能障碍评价装置 | |
US9386949B2 (en) | Device to determine visuo-spatial ability | |
CN109152559A (zh) | 用于定量评估视觉运动神经响应的方法和系统 | |
JP2022502789A (ja) | 認知処置を最適化するための努力メトリックを導出するための認知プラットフォーム | |
CN110603550A (zh) | 利用导航任务识别生物标志物和利用导航任务进行治疗的平台 | |
WO2022114143A1 (ja) | 物体認知能力を評価するための評価装置、方法、及びプログラム | |
Han et al. | Is the Human Visual System Invariant to Translation and Scale? | |
CN115471903A (zh) | 认知评估系统 | |
US20220183546A1 (en) | Automated vision tests and associated systems and methods | |
EP4011273A1 (en) | Method and device for determining at least one astigmatic effect of at least one eye | |
Flavia et al. | A Wearable Brain-Computer Interface Instrument with Aug-Mented Reality-Based Interface for General Applications | |
Cidade | From Basic Research to the Clinic: Creation of an Interface for Analysis of Eye Movements | |
Tolmachev et al. | Sensory Dissociation in Vestibular Function Assessment | |
WO2023192470A1 (en) | Eeg-guided spatial neglect detection system and detection method employing same | |
KR20240015687A (ko) | 시각 능력을 특성화하기 위한 가상 현실 기법 | |
WO2024095261A1 (en) | System and method for diagnosis and treatment of various movement disorders and diseases of the eye | |
EP4271275A1 (de) | Computerimplementiertes verfahren und vorrichtung zur bestimmung von reaktionszeitverläufen | |
Al-Aidroos et al. | Saturday Sessions |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21898115; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 18038646; Country of ref document: US
| ENP | Entry into the national phase | Ref document number: 2022565461; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 202180079656.5; Country of ref document: CN
| NENP | Non-entry into the national phase | Ref country code: DE
| ENP | Entry into the national phase | Ref document number: 2021898115; Country of ref document: EP; Effective date: 20230626