CN116077060B - Display device, evaluation system and evaluation method - Google Patents


Info

Publication number: CN116077060B
Application number: CN202310093997.9A
Authority: CN (China)
Prior art keywords: screen, user, eye movement, guide information, target display
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN116077060A
Inventors: 陈涛, 刘海春, 李岚臻, 马思悦
Current and original assignee: Shanghai Naixin Technology Co., Ltd. (the listed assignee may be inaccurate)
Events: application filed by Shanghai Naixin Technology Co., Ltd.; priority to CN202310093997.9A; publication of CN116077060A; application granted; publication of CN116077060B; anticipated expiration


Classifications

    • A61B 3/0016: Operational features of apparatus for testing or examining the eyes
    • A61B 3/005: Constructional features of the display
    • A61B 3/0058: Display arrangements for multiple images
    • A61B 3/0075: Eye-testing apparatus provided with adjusting devices, e.g. operated by control lever
    • A61B 3/0091: Fixation targets for viewing direction
    • A61B 3/113: Objective instruments for determining or recording eye movement
    • A61B 3/14: Arrangements specially adapted for eye photography
    • A61B 5/163: Evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B 5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A61B 5/4088: Diagnosing or monitoring cognitive diseases, e.g. Alzheimer's, prion diseases or dementia
    • A61B 5/7445: Display arrangements for notifying the user or patient, e.g. multiple display units
    • G06V 40/18: Eye characteristics, e.g. of the iris
    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems

Abstract

The specification discloses a display device, an evaluation system, and an evaluation method. The display device comprises a first screen and a second screen arranged at an included angle. The first screen displays first guide information that guides a first user to select the target display content to be shown on the second screen; a display signal is generated from the selected content, and the second screen, on receiving the signal, displays the target display content together with second guide information that guides a second user to observe it. Because the two users face separate screens set at an angle, the first user can assist the second user with the eye movement test from the first screen while remaining outside the acquisition range of the eye movement images. The acquisition device therefore cannot capture eye movement images of the first user, which improves the accuracy of the diagnostic index.

Description

Display device, evaluation system and evaluation method
Technical Field
The present disclosure relates to the field of computers, and more particularly, to a display device, an evaluation system, and an evaluation method.
Background
For diseases involving cognitive dysfunction, such as schizophrenia, diagnosis currently relies mainly on clinical symptoms; objective and effective biological diagnostic indexes are lacking. Because eye movements are regulated by the brain's cognitive functions, eye movement data can reflect the state of a user's cognitive function. Auxiliary diagnostic indexes can therefore be obtained by analyzing eye movement data, helping an evaluator diagnose such diseases.
Currently, in screen-based eye movement tests, a screen guides a user to observe the display content of the test. Meanwhile, an acquisition device captures eye movement images while the user observes that content. The user's eye movement data are obtained by analyzing the eye movement images acquired during observation, and a reference index reflecting the health of the user's mental activities is then derived from the eye movement data.
However, to avoid mistakes by a user who has never taken an eye movement test, such as failing to cooperate with device calibration or the test itself, or selecting the wrong test content, an evaluator typically assists the user with test selection and device calibration before the test, or controls the screen's display content during the test. The evaluator is then also within the range in which the acquisition device can capture eye movement images. As a result, the acquisition device may capture eye movement images of the evaluator, contaminating the eye movement data derived for the user and reducing the accuracy of the reference index.
Disclosure of Invention
The present specification provides a display apparatus, an evaluation system, and an evaluation method to partially solve the above-described problems existing in the prior art.
The technical scheme adopted in the specification is as follows:
the present specification provides a display device including a first screen 1 and a second screen 2;
an included angle is formed between the first screen 1 and the second screen 2;
the first screen 1 is configured to display first guiding information, so that a first user selects target display content displayed on the second screen 2 according to the first guiding information, so as to generate a display signal according to the target display content, and send the display signal to the second screen 2;
the second screen 2 is configured to display, when receiving the display signal, second guiding information and target display content corresponding to the display signal, where the second guiding information is used to guide a second user to observe the target display content.
Optionally, the magnitude of the included angle is positively correlated with the acquisition range of the eye movement image.
Optionally, the display device further includes a first bracket 11 and a second bracket 21, the first bracket 11 is used for supporting the first screen 1, and the second bracket 21 is used for supporting the second screen 2.
Optionally, the first end of the first bracket 11 is used for fixing the first screen 1, and the first end of the second bracket 21 is used for fixing the second screen 2; the second end of the first bracket 11 is connected to the second end of the second bracket 21.
Optionally, the second end of the first bracket 11 is rotatably connected to the second end of the second bracket 21;
the magnitude of the angle between the first bracket 11 and the second bracket 21 is adjusted by rotating the second end of the first bracket 11 and the second end of the second bracket 21.
The present specification provides an assessment system, the system comprising: the acquisition device 3, the control device 4 and any one of the display devices described above;
the first screen 1 is used for displaying first guide information, and the first guide information is used for guiding a first user to determine target display content displayed on the second screen 2;
the second screen 2 is configured to display second guide information and target display content corresponding to the display signal when receiving the display signal, where the second guide information is used to guide a second user to observe the target display content;
the acquisition device 3 is configured to acquire an eye movement image of the second user during the process of displaying the second guide information and the target display content on the second screen 2, and send the eye movement image to the control device 4;
The control device 4 is used for responding to the target display content input by the first user according to the first guide information to generate a display signal and sending the display signal to the second screen 2; and evaluating the mental activity health of the second user according to the received eye movement image.
Optionally, the control device 4 is specifically configured to determine, according to the received eye movement image, a gaze point of the second user on the second screen 2; according to the fixation point, determining eye movement data of the second user corresponding to the second guide information; and determining an evaluation result for indicating the mental activity health condition of the second user according to the eye movement data of the second user corresponding to the second guide information.
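As a rough illustration of this gaze-point-to-evaluation pipeline, the sketch below derives a simple eye movement statistic (the fraction of gaze samples near a fixation target) and maps it to a coarse label. The 50 px radius, the 0.8 threshold, and the labels are illustrative assumptions, not the control device's actual algorithm.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class EyeFrame:
    timestamp: float           # acquisition time in seconds
    gaze: Tuple[float, float]  # estimated gaze point on the second screen, in pixels

def fixation_ratio(frames: List[EyeFrame],
                   target: Tuple[float, float],
                   radius: float = 50.0) -> float:
    """Fraction of frames whose gaze point lies within `radius` px of the target.

    A crude stand-in for the eye movement data the control device derives
    from the gaze points; real systems run fixation/saccade detection.
    """
    if not frames:
        return 0.0
    hits = sum(
        1 for f in frames
        if (f.gaze[0] - target[0]) ** 2 + (f.gaze[1] - target[1]) ** 2 <= radius ** 2
    )
    return hits / len(frames)

def evaluate(frames: List[EyeFrame], target: Tuple[float, float]) -> str:
    """Map the eye movement statistic to a coarse reference label (illustrative only)."""
    return "stable fixation" if fixation_ratio(frames, target) >= 0.8 else "unstable fixation"
```

As in the specification, the output is only a reference for the evaluator, not a diagnosis.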
Optionally, the system further comprises a head stabilization device 5 and a fixed base 6; the head stabilization device 5 comprises a first support rod 51, a second support rod 52, a flexible piece 53, and a chin rest 54;
the first support rod 51 and the second support rod 52 are respectively fixed on the fixed base 6, and the first support rod 51 and the second support rod 52 are parallel to each other;
two ends of the flexible piece 53 are connected to the first support rod 51 and the second support rod 52, respectively;
The two ends of the chin rest 54 are connected to the first support rod 51 and the second support rod 52, respectively; the chin rest 54 is perpendicular to the first support rod 51 and the second support rod 52, and the chin rest 54 and the flexible piece 53 are parallel to each other;
the chin rest 54 is used for supporting the chin of the second user to stabilize the head of the second user when the second user views the target presentation content displayed on the second screen 2.
Optionally, a first end of the chin rest 54 is mounted on the first support rod 51 through a clamping groove 541, and a second end of the chin rest 54 is slidably connected to the second support rod 52 through a sliding component 542;
when the clamping groove 541 is separated from the first support rod 51, the sliding component 542 is slid to drive the chin rest 54 to slide along the second support rod 52, so as to adjust the distance between the chin rest 54 and the flexible member 53.
Optionally, the acquisition device 3 is arranged on the second screen 2.
The present specification provides an evaluation method applied to a control apparatus, the method comprising:
generating a display signal in response to target display content input by a first user according to first guide information, and sending the display signal to a second screen, so that the second screen displays second guide information and the target display content corresponding to the display signal when receiving the display signal, wherein the first guide information is displayed by the first screen, and the second guide information is used for guiding the second user to observe the target display content; wherein an included angle is formed between the first screen and the second screen;
Receiving an eye movement image sent by an acquisition device, wherein the eye movement image is acquired by the acquisition device in the process of displaying the second guide information and the target display content on the second screen;
and evaluating the mental activity health of the second user according to the received eye movement image.
Optionally, the method further comprises:
determining the identification of the second user according to the identity information of the second user input by the first user;
and establishing a corresponding relation between the identification of the second user and the evaluation result of the second user, and storing the corresponding relation.
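A minimal sketch of storing the correspondence between the second user's identification and evaluation results might look as follows; deriving the identifier by hashing the identity information entered by the first user is an assumed implementation detail, not something the method prescribes.

```python
import hashlib
from typing import Dict, List

class ResultStore:
    """Keeps the correspondence between a subject identifier and evaluation results."""

    def __init__(self) -> None:
        self._results: Dict[str, List[str]] = {}

    @staticmethod
    def make_id(identity_info: str) -> str:
        # Derive a stable identifier from the identity information entered by
        # the first user. Hashing is an assumption made for this sketch.
        return hashlib.sha256(identity_info.encode("utf-8")).hexdigest()[:12]

    def save(self, identity_info: str, evaluation_result: str) -> str:
        uid = self.make_id(identity_info)
        self._results.setdefault(uid, []).append(evaluation_result)
        return uid

    def lookup(self, identity_info: str) -> List[str]:
        return self._results.get(self.make_id(identity_info), [])
```

Repeated tests for the same subject accumulate under one identifier, so earlier results remain retrievable.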
The present specification provides an evaluation apparatus, the apparatus being applied to a control device, the apparatus comprising:
the display signal generation module is used for responding to target display contents input by a first user according to first guide information to generate display signals and sending the display signals to a second screen, so that the second screen displays the second guide information and the target display contents corresponding to the display signals when receiving the display signals, the first guide information is displayed by the first screen, and the second guide information is used for guiding the second user to observe the target display contents; wherein an included angle is formed between the first screen and the second screen;
An eye moving image receiving module for receiving an eye moving image transmitted by an acquisition device, the eye moving image being acquired by the acquisition device during the process of displaying the second guide information and the target display content on the second screen;
and the evaluation module is used for evaluating the mental activity health condition of the second user according to the received eye movement image.
The present specification provides a computer readable storage medium storing a computer program which when executed by a processor implements the above-described evaluation method.
The present specification provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the above-mentioned evaluation method when executing the program.
The technical solutions adopted in this specification can achieve at least the following beneficial effects:
The display device provided in this specification includes a first screen and a second screen set at an included angle. The first screen displays first guide information so that a first user, upon viewing it, selects the target display content to be shown on the second screen; a display signal is generated from the selected content and sent to the second screen, which, on receiving it, displays the target display content together with second guide information that guides a second user to observe that content. Because the two users view separate screens set at an angle, the first user can assist the second user with the eye movement test from the first screen while staying outside the acquisition range of the eye movement images. The acquisition device therefore cannot capture eye movement images of the first user, improving the accuracy of the diagnostic index.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification, illustrate and explain the exemplary embodiments of the present specification and their description, are not intended to limit the specification unduly. In the drawings:
Fig. 1 is a schematic diagram of a display device according to the present disclosure;
FIG. 2 is a schematic diagram of a display device according to the present disclosure;
FIG. 3 is a schematic diagram of a display device according to the present disclosure;
FIG. 4 is a schematic diagram of a display device according to the present disclosure;
FIG. 5 is a schematic diagram of a display device according to the present disclosure;
FIG. 6 is a schematic diagram of a display device according to the present disclosure;
FIG. 7 is a schematic diagram of an evaluation system according to the present disclosure;
FIG. 8 is a schematic view of a head stabilization device of the present disclosure;
FIG. 9 is a schematic view of a head stabilization device of the present disclosure;
FIG. 10 is a flow chart of an evaluation method according to the present disclosure;
FIG. 11 is a flow chart of an evaluation method in the present specification;
FIG. 12 is a schematic view of an evaluation device provided herein;
Fig. 13 is a schematic diagram of the electronic device corresponding to Fig. 10 provided in the present specification.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present specification more apparent, the technical solutions of the present specification will be clearly and completely described below with reference to specific embodiments of the present specification and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present specification. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are intended to be within the scope of the present disclosure.
In addition, it should be noted that, all actions of acquiring signals, information or data are performed under the condition of conforming to the corresponding data protection rule policy of the country of the location and obtaining the authorization given by the owner of the corresponding device.
In this specification, the evaluation result is used only to indicate the health of the second user's mental activities. It can serve as reference information for assessing the second user's mental health and can be provided to an evaluator (for example, a medical professional) so that, when evaluating the second user, the evaluator has a biological index to refer to and can reach a more accurate conclusion. The manner in which the evaluator uses the evaluation result as a reference is not limited in the embodiments of this specification.
With the development of computer vision technology, analyzing a user's eye movement indexes from eye movement images has become widely used in fields such as visual communication, advertisement recommendation, and medical technology. In the medical field, eye movement indexes can serve as behavioral measures for studying higher cognitive processes of the human cerebral cortex and subcortex. Because cognitive function reflects mental condition, eye movement data are now commonly used clinically to aid the diagnosis of mental disorders such as schizophrenia and depression. For schizophrenia patients with cognitive dysfunction, deviations of their eye movements from normal values can serve as a biological indicator of their mental state. Thus, a user can take an eye movement test to obtain a reference index reflecting the health of the user's mental activities.
Currently, to help an evaluator judge whether a user has a cognitive dysfunction disease such as schizophrenia, the user can be guided through an eye movement test to obtain eye movement data, and a reference index reflecting the health of the user's mental activities can then be provided to the evaluator based on those data. Whether the reference index is derived from the eye movement data by a statistical method, a machine learning method, or any other existing method, obtaining an accurate reference index presupposes obtaining accurate eye movement images of the user.
However, current eye movement test setups include only one display device for presenting the test to the user, as shown in Fig. 1. The display device 101 presents all content required for the test, including but not limited to device calibration content, test options, and the actual test content; the user 103 must face the display device 101 and keep a relatively fixed distance from it. If the user 103 is unfamiliar with the test flow, mistakes may occur during device calibration or the test itself, or the wrong test option may be selected, greatly interfering with the test process.
To handle this, the evaluator 104 usually assists the user 103 with test selection and device calibration before the test, or controls the display device 101 to show different content during the test; the evaluator 104 faces the display device and, after adjusting the test content, gives up the seat to the user 103. At that point the evaluator 104 is also within the acquisition range of the capture device 102, shown as the dashed sector in Fig. 1. The capture device 102 may therefore capture eye movement images of the evaluator 104. Even if the captured images are ordered by acquisition time into a sequence, simply discarding the first few images in the sequence cannot reliably remove the evaluator's images and may wrongly remove images of the user 103, causing loss of eye movement data. This corrupts the eye movement data derived for the user 103 and reduces the accuracy of the reference index.
Based on this, the present disclosure provides a display device in which a first screen and a second screen are configured so that the screens observed by the first user and the second user are separate. Because the two screens form an included angle, the first user can assist the second user with the eye movement test from the first screen while staying outside the acquisition range of the eye movement images, so the acquisition device cannot capture the first user's eye movement images, improving the accuracy of the diagnostic index.
In one or more embodiments of this specification, the first user is an evaluator: a medical technician who assists the second user in performing the eye movement test and who is generally familiar with the test procedure and with the use of the display device provided here. The second user is a subject: a person to be evaluated for cognitive dysfunction diseases such as schizophrenia.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
Fig. 2 is a schematic diagram of a display device in this specification. The display device includes a first screen 1 and a second screen 2 with an included angle between them. In the embodiments of this specification, the included angle means that, in the eye movement test environment where the display device is located, the positions of the first screen 1 and the second screen 2 form a spatial angle; that is, the two screens are spatially separated. Whether the first screen 1 and the second screen 2 are mounted on the same support is not limited here. The angle between them can be adjusted for the specific application scenario, and its size is not limited in this specification. Taking Fig. 2 as an example, rotating (clockwise or counterclockwise) from the display side of the second screen 2 to the display side of the first screen 1 sweeps out the angle θ shown in Fig. 2. Because the first user 7 faces the first screen 1 and the second user 8 faces the second screen 2, there is likewise an angle between the orientations of the two users.
In general, the acquisition device 3 that captures eye movement images is arranged near the second screen 2; in Fig. 2 it is placed on the left side of the second screen 2. Its acquisition range typically covers the area facing the second screen 2 and may be a sector. Because of the angle between the orientations of the first user 7 and the second user 8, the first user 7 is outside the acquisition range while the second user 8 is inside it, so only the second user's eye movement images can be captured, and not those of the first user 7.
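The claim that the angled layout keeps the first user outside a sector-shaped acquisition range can be checked with elementary trigonometry. In the sketch below, the camera position, its 30 degree half-angle, its 1.5 m range, and both user positions are made-up example values, not dimensions from this specification.

```python
import math

def in_capture_sector(camera, axis_deg, half_angle_deg, max_range, point):
    """Return True if `point` lies inside the sector-shaped acquisition range.

    camera: (x, y) position of the acquisition device
    axis_deg: direction the camera faces, in degrees
    half_angle_deg: half of the sector's opening angle, in degrees
    max_range: sector radius (same length unit as the coordinates)
    """
    dx, dy = point[0] - camera[0], point[1] - camera[1]
    if math.hypot(dx, dy) > max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    # smallest signed difference between the bearing and the camera axis
    diff = (bearing - axis_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= half_angle_deg

# Camera on the second screen faces +x with a 30 degree half-angle and 1.5 m range.
camera = (0.0, 0.0)
second_user = (1.0, 0.2)  # directly in front of the second screen
first_user = (0.3, 1.2)   # off to the side, facing the angled first screen
```

With these example values the second user falls inside the sector and the first user outside it, matching the arrangement described above.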
In this embodiment, the first user 7 and the second user 8 may be in the same test environment, but they face different screens.
Specifically, in the present embodiment, the first user 7 is guided to view the first screen 1, and the second user 8 is guided to view the second screen 2. The first screen 1 is used for displaying first guide information, and the second screen 2 is used for displaying second guide information and target display content.
The first guide information guides the first user 7 to select the target display content to be shown on the second screen 2; this content may correspond to the test task of at least one type of eye movement test. Generally, before the second user 8 takes the test, the first user 7 selects which type of eye movement test the second user 8 will perform, so that the second user's eye movement images are captured in a targeted way. The first guide information may be presented as text or sound, which is not limited in this specification. For example, the second screen 2 may support display content for four test types (fixation stability, eye jump, smooth pursuit, and free viewing), and the first guide information may be the text "Please select an eye movement test type", as shown in Fig. 3.
Further, after the first screen 1 displays the first guide information and before the second screen 2 displays the second guide information, a display signal may be generated according to the target display content selected by the first user 7, and the generated display signal may be transmitted to the second screen 2. The execution subject that generates and transmits the display signal may be the control device 4 in the evaluation system shown in fig. 7 described below, or another input/output device, which is not limited in this specification.
Then, when the second screen 2 receives the display signal, it displays the second guide information and the target display content corresponding to the display signal, where the second guide information is used to guide the second user 8 to observe the target display content. In connection with the foregoing, the target display content corresponds to a test task of at least one type of eye movement test, where the different types of eye movement tests may include: a gaze stability test, an eye jump test, a follow-up test, and a free view test. Different types of eye movement tests correspond to different displayed content and, at the same time, to different displayed second guide information. Taking target display content corresponding to a gaze-stability-type eye movement test as an example, as shown in fig. 4, the target display content includes a solid dot displayed stationary within the display range of the second screen 2, and the second guide information is displayed in text form, such as "please look at the dot appearing in the center of the screen".
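The correspondence described above between test types, displayed content, and second guide information can be sketched as a lookup table; all keys, content descriptions, and prompt strings below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical mapping from eye movement test type to target display
# content and second guide information; every key and string here is
# an illustrative assumption.
TEST_CONFIG = {
    "gaze_stability": {
        "content": "solid dot displayed stationary at the screen center",
        "guide": "please look at the dot appearing in the center of the screen",
    },
    "eye_jump": {
        "content": "dot jumping between positions on the screen",
        "guide": "please look toward each dot as soon as it appears",
    },
    "follow_up": {
        "content": "dot moving smoothly across the screen",
        "guide": "please follow the moving dot with your eyes",
    },
    "free_view": {
        "content": "static picture",
        "guide": "please view the picture freely",
    },
}

def display_signal(test_type: str) -> dict:
    """Build the signal sent to the second screen for the chosen test."""
    cfg = TEST_CONFIG[test_type]
    return {"type": test_type, "content": cfg["content"], "guide": cfg["guide"]}
```

In this sketch the first user's selection picks a key, and the resulting dictionary plays the role of the display signal delivered to the second screen.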
Optionally, within each type of eye movement test, different guide information may be displayed to guide the second user 8 to perform different test tasks. For example, the toward (pro-saccade) eye jump test and the reverse (anti-saccade) eye jump test are both eye-jump-type eye movement tests, but they are different test tasks.
In the display device provided in the present specification, configuring the first screen 1 and the second screen 2 separates the screens observed by the first user 7 and the second user 8. Because an included angle exists between the first screen 1 and the second screen 2, the first user 7 can assist the second user 8 in performing an eye movement test based on the first screen 1 while remaining outside the acquisition range of the eye movement image, so that the acquisition device 3 cannot acquire the eye movement image of the first user 7, which further improves the accuracy of the diagnostic index.
In an alternative embodiment of the present disclosure, the acquisition range of the acquisition device 3 may be adjusted according to the specific application scenario; the larger the acquisition range of the eye movement image, the easier it is to cover the first user 7, who shares the test environment with the second user 8. As shown in the left graph of fig. 5, the acquisition range of the eye movement image is a sector area and the first user 7 falls into it; the included angle between the first screen 1 and the second screen 2 may then be increased, as shown in the right graph of fig. 5, so that the first user 7 leaves the acquisition range of the eye movement image. Thus, the larger the acquisition range of the eye movement image, the larger the included angle between the first screen 1 and the second screen 2 should be, to prevent the first user 7 from falling into the acquisition range and interfering with the acquisition of the eye movement image of the second user 8. In other words, the included angle between the first screen 1 and the second screen 2 is positively correlated with the acquisition range of the eye movement image.
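Under a simplified geometric model (an assumption of this sketch, not stated in the patent), the positive correlation can be made concrete: if the camera's sector is centered on the second screen's normal and the first user sits roughly on the first screen's normal, the user stays outside the sector whenever the screen angle exceeds the sector's half-angle.

```python
def min_screen_angle(sector_half_angle_deg: float, margin_deg: float = 5.0) -> float:
    """Smallest included angle between the two screens that keeps the
    first user outside a sector-shaped acquisition range.

    Simplified model (an assumption): the camera axis coincides with the
    second screen's normal and the first user sits on the first screen's
    normal, so the user's bearing from the camera equals the angle
    between the screens; a safety margin is added on top.
    """
    return sector_half_angle_deg + margin_deg
```

A wider acquisition sector thus demands a wider screen angle, matching the positive correlation described above.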
In one or more embodiments of the present specification, the first screen 1 and the second screen 2 may be supported by a screen bracket, respectively, to ensure stability of the first screen 1 and the second screen 2 during an eye movement test. Specifically, as shown in fig. 6, the display apparatus further includes a first bracket 11 and a second bracket 21, the first bracket 11 is used for supporting the first screen 1, and the second bracket 21 is used for supporting the second screen 2. In order to ensure the stability of the first screen 1 and the second screen 2, the first bracket 11 and the second bracket 21 may be mounted on a reliable object in the test environment, such as a wall of a test room, a floor, or a fixed base fixed on the wall (or floor) of the test room, while the first bracket 11 supports the first screen 1 and the second bracket 21 supports the second screen 2, which is not particularly limited in this specification.
Further, in one or more embodiments of the present disclosure, the specific manner in which the first bracket 11 supports the first screen 1 and the specific manner in which the second bracket 21 supports the second screen 2 may be: the first end of the first bracket 11 is used for fixing the first screen 1, the first end of the second bracket 21 is used for fixing the second screen 2, and the second end of the first bracket 11 is connected with the second end of the second bracket 21. The second end of the first bracket 11 and the second end of the second bracket 21 may be fixedly connected or rotatably connected, which is not limited in this specification.
In an alternative embodiment of the present disclosure, the second end of the first bracket 11 is rotatably connected to the second end of the second bracket 21. The second end of the first bracket 11 and the second end of the second bracket 21 may then be rotated to adjust the angle between the first bracket 11 and the second bracket 21, and thereby the angle between the first screen 1 and the second screen 2, so as to adapt to different acquisition ranges of the eye movement image.
Based on the display device provided in one or more embodiments of the present disclosure, the present disclosure further provides an evaluation system, which includes the acquisition device 3, the control device 4, and the display device shown in fig. 2, and is shown in fig. 7 as a schematic diagram of an evaluation system provided in the present disclosure.
As for the display device, as described above, it includes the first screen 1 and the second screen 2, which are disposed with an included angle therebetween. The first screen 1 displays first guide information for guiding the first user 7 to determine target display content displayed on the second screen 2. The second screen 2 is configured to display, upon receiving a display signal, second guide information and target display content corresponding to the display signal, where the second guide information is used to guide the second user 8 to observe the target display content. For the specific technical solution of the display device, refer to the description of fig. 2, which will not be repeated here.
For the acquisition device 3: screen-based acquisition requires the second user 8 to face the second screen 2 and interact with the target display content and the second guide information presented by the second screen 2. Specifically, the second screen 2 displays the second guide information and the target display content, and the second user 8 is guided by the second guide information to observe the target display content. During the eye movement test, at least one acquisition device 3 is present in the test environment for acquiring an eye movement image of the second user 8. The acquisition device 3 may be installed at any position in the test environment from which the eye movement image of the second user 8 can be accurately acquired, such as under or near the second screen 2, which is not limited in the embodiments of this specification. As shown in fig. 7, the acquisition device 3 is disposed on the left side of the second screen 2. In general, the acquisition device 3 has an eye movement image acquisition range, and the second user 8 needs to be located in this range while facing the second screen 2, so that the eye movement image of the second user 8 observing the target display content displayed on the second screen 2 can be acquired by the acquisition device 3. Of course, the time at which the acquisition device 3 starts to acquire the eye movement image of the second user 8 may also be determined according to the specific application scenario, which is not limited in this specification.
For the control device 4, the control device 4 is in communication with the first screen 1, the second screen 2 and the acquisition device 3 by wireless and/or wired means. The control device 4 may determine the first guide information and transmit the first guide information to the first screen 1 such that the first screen 1 displays the first guide information. After the first user 7 selects the target presentation content according to the first guide information, the control device 4 may generate a presentation signal according to the target presentation content transmitted from the first screen 1 and transmit the presentation signal to the second screen 2. During the process of displaying the target display content on the second screen 2, the control device 4 receives the eye movement images sent by the acquisition device 3, and evaluates the mental activity health condition of the second user 8 according to all the received eye movement images after the display is finished.
In one or more embodiments of the present disclosure, the specific manner in which the control device 4 as shown in fig. 7 evaluates the mental activity health of the second user 8 based on the received eye movement image may be: and determining the gaze point of the second user 8 on the second screen 2 according to the received eye movement image, determining the eye movement data of the second user 8 corresponding to the second guiding information according to the gaze point, and determining the evaluation result for indicating the mental activity health condition of the second user 8 according to the eye movement data of the second user 8 corresponding to the second guiding information.
In practical applications, the screen-based eye moving image capturing device 3 is typically configured with an infrared light source directed to the pupil center of the second user 8, and an infrared light source reflection point may exist on the pupil of the second user 8. In general, when the difference between the emission position of the infrared light source and the position where the second user 8 is located is smaller than a preset threshold value, the position of the infrared light source reflection point on the pupil of the second user 8 is unchanged. The eye movement images at each moment acquired by the acquisition device 3 may present the position of the pupil center of the second user 8 at each moment and the position of the infrared light source reflection point, whereby the relative position between the pupil center and the infrared light source reflection point at each moment is tracked by the control device 4. Since the position of the pupil center of the second user 8 may indicate the gaze of the second user 8, the falling point of the gaze of the second user 8 on the second screen 2 may be obtained as the gaze point of the second user 8 on said second screen 2 by the relative position between the pupil center and the infrared light source reflection point at each moment.
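The pupil-center/reflection-point mapping described above can be sketched as follows; the linear screen mapping and its calibration coefficients are assumptions for illustration (a real system would obtain them from a calibration step that the patent does not detail).

```python
def gaze_point(pupil_center, glint, calib):
    """Estimate the gaze point on the second screen from one eye image.

    pupil_center, glint: (x, y) pixel positions of the pupil center and
    of the infrared light source reflection point in the eye image.
    calib: (ax, bx, ay, by) assumed linear mapping coefficients from the
    pupil-glint vector to screen coordinates.
    """
    dx = pupil_center[0] - glint[0]  # relative position, x component
    dy = pupil_center[1] - glint[1]  # relative position, y component
    ax, bx, ay, by = calib
    return (ax * dx + bx, ay * dy + by)
```

Because only the relative position of pupil center and reflection point enters the mapping, small head translations that shift both features together leave the estimated gaze point unchanged, which is the property the passage relies on.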
Then, following the above scheme, the gaze point at each moment is determined from each eye movement image acquired while the second user 8 observes the target display content. From the gaze points at the respective moments, eye movement data of the second user 8 corresponding to observing the target display content is obtained through several different types of statistical schemes. The statistical scheme may determine, for example, the saccade path, the stable fixation duration, the saccade speed, or the saccade latency of the second user 8 within a specified period; the specific statistical scheme may be determined according to the specific target display content and application scenario, which is not limited in this specification.
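A minimal sketch of such a statistical scheme, under assumed metric definitions (scan path length and mean speed only; the patent names further metrics but gives no formulas):

```python
import math

def eye_movement_metrics(gaze_points, dt):
    """Compute simple eye movement statistics from gaze points sampled
    every `dt` seconds. Metric definitions are illustrative assumptions.
    """
    # Scan path: total distance the gaze travels across the screen.
    path = sum(
        math.dist(gaze_points[i], gaze_points[i + 1])
        for i in range(len(gaze_points) - 1)
    )
    duration = dt * (len(gaze_points) - 1)
    mean_speed = path / duration if duration > 0 else 0.0
    return {"scan_path": path, "mean_speed": mean_speed, "duration": duration}
```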
Further, an evaluation result for indicating the mental activity health of the second user 8 is obtained by using any existing eye movement data analysis scheme based on the eye movement data. In the embodiment of the present specification, the eye movement data analysis scheme may be an eye movement data analysis scheme based on machine learning, or may be an eye movement data analysis scheme based on statistics, which is not limited in the present specification.
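As one hedged example of a statistics-based analysis scheme, the eye movement data could be compared against normative (mean, standard deviation) pairs; the norms and the 2-sigma cutoff below are illustrative assumptions, not clinical values.

```python
def evaluate(metrics, norms, z_cutoff=2.0):
    """Flag each metric whose z-score against normative data exceeds the
    cutoff, and summarize the result. Purely illustrative; a real system
    would use validated norms or a trained machine-learning model.
    """
    flags = {}
    for name, value in metrics.items():
        mean, std = norms[name]
        flags[name] = abs((value - mean) / std) > z_cutoff
    return ("atypical" if any(flags.values()) else "typical"), flags
```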
In an alternative embodiment of the present specification, in the evaluation system as shown in fig. 7, the capturing device 3 may be disposed in the vicinity of the second screen 2 as a separate electronic device, and may also be a component integrated on the second screen 2, and the position of the capturing device 3 on the second screen 2 may be any position on the second screen 2 where an eye movement image of the second user 8 facing the second screen 2 may be captured, which is not limited in this specification.
In an alternative embodiment of the present description, the head stabilization device 5 and the fixation base 6 may also be deployed in an evaluation system based on the one shown in fig. 7, as shown in fig. 8. The head stabilizing device 5 includes a first support bar 51, a second support bar 52, a flexible member 53, and a chin support bar 54. The connection relationship between the four components included in the head stabilization device 5 is as follows:
The first support rod 51 and the second support rod 52 are respectively fixed on the fixed base 6, and the first support rod 51 and the second support rod 52 are parallel to each other. Both ends of the flexible member 53 are respectively connected to the first support bar 51 and the second support bar 52. The two ends of the chin support bracket 54 are respectively connected with the first support bar 51 and the second support bar 52, the chin support bracket 54 is perpendicular to the first support bar 51 and the second support bar 52, and the chin support bracket 54 and the flexible piece 53 are parallel to each other.
As can be seen from the above-described connection structure of the head stabilizing device 5, the first support rod 51, the second support rod 52, the flexible member 53, and the chin support bracket 54 together form a rectangular structure. During the eye movement test, the support provided to the second user 8 by the head stabilizing device 5 ensures that the second user 8 maintains a posture facing the second screen 2 and a relatively fixed position throughout the test. The head stabilizing device 5 supports the second user 8 as follows: the chin of the second user 8 is placed on the chin support bracket 54 while the flexible member 53 contacts the forehead of the second user 8 and is elastically deformed. The chin support bracket 54 supports the chin of the second user 8 when the second user 8 observes the target display content displayed on the second screen 2. The flexible member 53 contacts the forehead of the second user 8 and applies an elastic force to it, so that the second user 8 maintains a posture facing the second screen 2. It can be seen that, by mounting the head stabilizing device 5 on the fixing base 6, the second user 8 is kept facing the second screen 2 with a stabilized head while observing the target display content, which improves the definition of the eye movement image acquired by the acquisition device 3.
Further, in an alternative embodiment of the present disclosure, because the head sizes of the different second users 8 are different, the distance between the flexible member 53 and the chin rest 54 can be adjusted to accommodate the different sizes of the heads of the respective second users 8, so that each second user 8 can support the head with the chin rest 54 and the flexible member 53 contacts the forehead when performing the eye movement test with the head stabilization device 5.
Specifically, as shown in fig. 9, a first end of the chin support bracket 54 is mounted on the first support rod 51 through a clamping groove 541, and a second end of the chin support bracket 54 is slidably connected to the second support rod 52 through a sliding assembly 542. The clamping groove 541 may be an arc-shaped clamping groove fitted to the outer wall of the first support rod 51. The sliding assembly 542 may be a hollow sliding sleeve fitted over the second support rod 52; alternatively, a sliding rail may be disposed on the second support rod 52, in which case the sliding assembly 542 is a slider that, when sliding up and down along the rail, drives the chin support bracket 54 to move up and down.
When the clamping groove 541 is separated from the first support rod 51, sliding the sliding assembly 542 drives the chin support bracket 54 to slide along the second support rod 52, thereby adjusting the distance between the chin support bracket 54 and the flexible member 53. After a suitable distance is reached, the clamping groove 541 can be fixed to the first support rod 51 and the sliding assembly 542 can be locked, fixing the distance between the chin support bracket 54 and the flexible member 53 and thereby providing a supporting effect for the head of the second user 8.
Fig. 10 is a schematic diagram of an interaction flow of an evaluation system for executing an evaluation method according to the present disclosure, specifically including the following steps:
S100: The first screen displays first guide information.
In the embodiment of the present disclosure, the first screen and the second screen are fixed by the bracket, and an included angle exists between the first screen and the second screen, and since the first user faces the first screen and the second user faces the second screen, an included angle also exists between the orientation of the first user and the orientation of the second user. Thus, the first user can assist the second user to perform the eye movement test based on the first screen, and can be far away from the acquisition range of the eye movement image, so that the acquisition device cannot acquire the eye movement image of the first user.
The first guiding information is used for guiding a first user to determine target display content displayed on the second screen. The target presentation may correspond to a test task of at least one type of eye movement test. Generally, before the second user performs the eye movement test, the first user may select what type of eye movement test is specifically performed by the second user, so as to obtain the eye movement image of the second user in a targeted manner. The first guiding information may be displayed in text or sound, which is not limited in this specification. For example, the second screen may support presentation contents for presenting four types of eye movement tests, namely, a gaze stability test, an eye jump test, a follow-up test, and a free view test, and the first guide information may be the text "please select an eye movement test type" as shown in fig. 3.
S102: and the control equipment responds to the target display content input by the first user according to the first guide information, and generates a display signal.
S104: and sending the display signal to the second screen.
S106: and when the second screen receives the display signal, displaying second guide information and target display content corresponding to the display signal.
And then, when the second screen receives the display signal, displaying second guide information and target display content corresponding to the display signal, wherein the second guide information is used for guiding a second user to observe the target display content. In connection with the foregoing, the target presentation content corresponds to a test task of at least one type of eye movement test, wherein the different types of eye movement tests may include: gaze stability test, eye jump test, follow-up test, and free view test. Different types of eye movement tests correspond to different display contents being displayed, and at the same time, different types of eye movement tests correspond to different second guide information being displayed. Taking an eye movement test in which the target display content corresponds to a fixation stabilization type as an example, as shown in fig. 4, the target display content includes a solid dot that is displayed stationary within a display range of the second screen, and the second guide information is displayed in a text form, such as "please look at a dot appearing in the center of the screen".
The second guide information is used for guiding the second user to observe the target display content.
S108: and the acquisition equipment acquires an eye movement image of the second user in the process of displaying the second guide information and the target display content on the second screen.
S110: an eye moving image of the second user is transmitted to the control device.
S112: and evaluating the mental activity health of the second user according to the received eye movement image.
Based on the evaluation method shown in fig. 10, a display signal is generated in response to the target display content input by the first user according to the first guide information, so that when the second screen receives the display signal, it displays the second guide information and the target display content corresponding to the display signal. The mental activity health condition of the second user is then evaluated according to the eye movement images acquired by the acquisition device while the second screen displays the second guide information and the target display content.
According to the method, an included angle is formed between the first screen and the second screen, so that the first user can assist the second user in eye movement test based on the first screen and can be far away from the acquisition range of the eye movement images, the acquisition equipment cannot acquire the eye movement images of the first user, the acquisition accuracy of the eye movement images of the second user is improved, and the accuracy of diagnostic indexes is further improved.
In one or more embodiments of the present disclosure, in the evaluation of the mental activity health status of the second user as shown in step S112 of fig. 10, the evaluation is specifically implemented by the following scheme, as shown in fig. 11:
S200: And determining the gaze point of the second user on the second screen according to the received eye movement image.
Similarly to the above description of the acquisition device of the evaluation system as shown in fig. 7, since the screen-based eye moving image acquisition device is generally configured with an infrared light source, the position of the pupil center of the second user at each moment and the position of the infrared light source reflection point can be presented in the eye moving image, and further, by the relative position between the pupil center and the infrared light source reflection point at each moment, the landing point of the gaze line of the second user on the second screen can be obtained as the gaze point of the second user on the second screen.
S202: and determining eye movement data of the second user corresponding to the second guide information according to the fixation point.
In the actual eye movement testing process, the acquisition device acquires eye movement images at each moment, from which the gaze point of the second user on the second screen at each moment can be determined; from the gaze points at the respective moments, eye movement data of the second user corresponding to observing the target display content is obtained through several different types of statistical schemes. The statistical scheme may determine, for example, the saccade path, the stable fixation duration, the saccade speed, or the saccade latency of the second user within a specified period; the specific statistical scheme may be determined according to the specific target display content and application scenario, which is not limited in this specification.
S204: and determining an evaluation result for indicating the mental activity health condition of the second user according to the eye movement data of the second user corresponding to the second guide information.
Further, an evaluation result for indicating the mental activity health condition of the second user is obtained by adopting any existing eye movement data analysis scheme based on the eye movement data. In the embodiment of the present specification, the eye movement data analysis scheme may be an eye movement data analysis scheme based on machine learning, or may be an eye movement data analysis scheme based on statistics, which is not limited in the present specification.
In an alternative embodiment of the present disclosure, after the mental activity health status of the second user is evaluated as in step S112 of fig. 10, a correspondence relationship between the evaluation result and the second user may be further established, specifically by the following manner.
First, determining the identification of the second user according to the identity information of the second user input by the first user.
During the eye movement test, the first user may assist the second user in completing the eye movement test and the assessment of mental activity health. The first user can therefore obtain the identity information of the second user, such as the second user's name and certificate number. By processing this identity information, a unique identifier of the second user may be obtained. Here, the identifier of the second user may be generated by any existing identifier generation scheme, such as a hash function.
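A hash function can derive such a unique identifier as the passage suggests; the choice of SHA-256, the field separator, and the 16-character truncation below are assumptions for illustration.

```python
import hashlib

def user_identifier(name: str, certificate_number: str) -> str:
    """Derive a stable identifier for the second user from identity
    information. SHA-256 and the 16-character truncation are assumed."""
    payload = f"{name}|{certificate_number}".encode("utf-8")
    return hashlib.sha256(payload).hexdigest()[:16]
```

The same identity information always yields the same identifier, while different identities yield different ones (up to hash collisions), which is what the binding step below requires.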
And secondly, establishing a corresponding relation between the identification of the second user and the evaluation result of the second user, and storing the corresponding relation.
After determining the identifier of the second user, the evaluation result obtained by evaluating the mental activity health condition of the second user in step S112 may be bound with the identifier of the second user, and a corresponding relationship between the evaluation result and the identifier of the second user is established and stored in the database, so that when a doctor or other evaluator determines the mental activity health condition of the second user, the evaluation result of the second user may be searched from the database through the identifier of the second user, to assist the evaluator in diagnosing the disease for the user.
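Storing and looking up the identifier-to-result correspondence could look like the following sketch, using an in-memory SQLite database as a stand-in for the database mentioned above (the schema and column names are assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE evaluations (user_id TEXT PRIMARY KEY, result TEXT)"
)

def store_result(conn, user_id: str, result: str) -> None:
    """Bind an evaluation result to the second user's identifier."""
    conn.execute(
        "INSERT OR REPLACE INTO evaluations (user_id, result) VALUES (?, ?)",
        (user_id, result),
    )

def lookup_result(conn, user_id: str):
    """Fetch the stored result for an identifier, or None if absent."""
    row = conn.execute(
        "SELECT result FROM evaluations WHERE user_id = ?", (user_id,)
    ).fetchone()
    return row[0] if row else None
```

An evaluator could then retrieve the result by identifier alone, as the passage describes, without handling the raw identity information.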
Based on the same concept as the evaluation method provided in one or more embodiments of the present disclosure above, this specification further provides a corresponding evaluation apparatus, as shown in fig. 12.
Fig. 12 is a schematic diagram of an evaluation apparatus provided in the present specification, specifically including:
the display signal generating module 300 is configured to generate a display signal in response to target display content input by a first user according to first guide information, and send the display signal to a second screen, so that the second screen displays second guide information and target display content corresponding to the display signal when receiving the display signal, where the first guide information is displayed by the first screen, and the second guide information is used to guide the second user to observe the target display content; wherein an included angle is formed between the first screen and the second screen;
An eye moving image receiving module 302, configured to receive an eye moving image sent by an acquisition device, where the eye moving image is acquired by the acquisition device during the process of displaying the second guide information and the target display content on the second screen;
and the evaluation module 304 is used for evaluating the mental activity health condition of the second user according to the received eye movement image.
Optionally, the apparatus further comprises:
the correspondence establishing module 306 is specifically configured to determine an identifier of the second user according to the identity information of the second user input by the first user; and establishing a corresponding relation between the identification of the second user and the evaluation result of the second user, and storing the corresponding relation.
The present specification also provides a computer-readable storage medium storing a computer program operable to perform the above-described evaluation method provided in fig. 10.
The present specification also provides a schematic structural diagram of the electronic device shown in fig. 13. At the hardware level, as illustrated in fig. 13, the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile storage, and may of course also include hardware required by other services. The processor reads the corresponding computer program from the non-volatile storage into the memory and then runs it to implement the evaluation method described above with respect to fig. 10. Of course, other implementations, such as logic devices or combinations of hardware and software, are not excluded by this specification; that is, the execution subject of the processing flows is not limited to logic units, and may also be hardware or logic devices.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). With the development of technology, however, many improvements of method flows today can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (e.g., a Field Programmable Gate Array, FPGA) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs a digital system to be "integrated" onto a PLD without requiring the chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually manufacturing integrated circuit chips, such programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the source code to be compiled must likewise be written in a specific programming language, called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many kinds, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), of which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing a logical method flow can be readily obtained merely by slightly programming the method flow into an integrated circuit using one of the above hardware description languages.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer readable medium storing computer readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller; examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely as computer readable program code, it is entirely possible, by logically programming the method steps, to cause the controller to achieve the same functionality in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included therein for performing various functions may also be regarded as structures within the hardware component. Indeed, means for performing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present specification.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including memory storage devices.
In this specification, the embodiments are described in a progressive manner; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the system embodiments are described relatively simply since they are substantially similar to the method embodiments; for relevant details, see the corresponding parts of the description of the method embodiments.
The foregoing is merely an embodiment of the present specification and is not intended to limit the present specification. Various modifications and alterations of the present specification will be apparent to those skilled in the art. Any modification, equivalent substitution, improvement, or the like made within the spirit and principles of the present specification shall be included within the scope of the claims of the present specification.

Claims (14)

1. A display device, characterized in that it comprises a first screen (1) and a second screen (2), said display device further comprising an acquisition device (3) for acquiring eye movement images, said second screen (2) being associated with said acquisition device (3), said acquisition device (3) being located in an arbitrary position where eye movement images of a second user facing said second screen (2) can be acquired;
an included angle is formed between the first screen (1) and the second screen (2), and the size of the included angle is positively correlated with the acquisition range of the eye movement images, so that a first user is not within the eye movement image acquisition range of the acquisition device (3), thereby preventing the acquisition device (3) from acquiring an eye movement image of the first user and improving the accuracy of the diagnostic index;
the first screen (1) is used for displaying first guide information, so that the first user selects, according to the first guide information, target display content to be displayed on the second screen (2); a display signal is generated according to the target display content input by the first user and sent to the second screen (2);
the second screen (2) is used for displaying second guide information and target display content corresponding to the display signal when the display signal is received, and the second guide information is used for guiding the second user to observe the target display content.
2. A device as claimed in claim 1, characterized in that the display device further comprises a first support (11) and a second support (21), the first support (11) being arranged to support the first screen (1) and the second support (21) being arranged to support the second screen (2).
3. The device according to claim 2, wherein a first end of the first bracket (11) is used for fixing the first screen (1) and a first end of the second bracket (21) is used for fixing the second screen (2); the second end of the first bracket (11) is connected with the second end of the second bracket (21).
4. A device according to claim 3, characterized in that the second end of the first bracket (11) is rotatably connected to the second end of the second bracket (21);
the size of the included angle between the first bracket (11) and the second bracket (21) is adjusted by rotating the second end of the first bracket (11) and the second end of the second bracket (21).
5. An evaluation system, the system comprising: a control device (4) and a display device as claimed in any one of claims 1 to 4;
the first screen (1) is used for displaying first guide information, and the first guide information is used for guiding a first user to determine target display content displayed on the second screen (2);
the second screen (2) is used for displaying second guide information and target display content corresponding to the display signal when the display signal is received, and the second guide information is used for guiding a second user to observe the target display content;
The acquisition device (3) is used for acquiring an eye movement image of the second user and sending the eye movement image to the control device (4) in the process of displaying the second guide information and the target display content on the second screen (2);
the control device (4) is used for responding to the target display content input by the first user according to the first guide information to generate a display signal and sending the display signal to the second screen (2); and evaluating the mental activity health of the second user according to the received eye movement image.
6. The system according to claim 5, wherein the control device (4) is specifically configured to determine a gaze point of the second user on the second screen (2) from the received eye movement image; determine, according to the gaze point, the eye movement data of the second user corresponding to the second guide information; and determine, according to the eye movement data of the second user corresponding to the second guide information, an evaluation result indicating the mental activity health condition of the second user.
7. The system according to claim 5, characterized in that it further comprises a head stabilizing device (5) and a fixed base (6); the head stabilizing device (5) comprises a first support rod (51), a second support rod (52), a flexible piece (53), and a chin rest (54);
the first support rod (51) and the second support rod (52) are respectively fixed on the fixed base (6), and the first support rod (51) and the second support rod (52) are parallel to each other;
two ends of the flexible piece (53) are respectively connected with the first support rod (51) and the second support rod (52);
two ends of the chin rest (54) are respectively connected with the first support rod (51) and the second support rod (52); the chin rest (54) is perpendicular to the first support rod (51) and the second support rod (52), and the chin rest (54) and the flexible piece (53) are parallel to each other;
the chin rest (54) is used for supporting the chin of the second user, so as to stabilize the head of the second user while the second user observes the target display content displayed on the second screen (2).
8. The system of claim 7, wherein a first end of the chin rest (54) is mounted to the first support rod (51) via a clamping slot (541), and a second end of the chin rest (54) is slidably coupled to the second support rod (52) via a sliding assembly (542);
when the clamping slot (541) is separated from the first support rod (51), the sliding assembly (542) is slid to drive the chin rest (54) to slide along the second support rod (52), so as to adjust the distance between the chin rest (54) and the flexible piece (53).
9. A system according to claim 5, characterized in that the acquisition device (3) is arranged on the second screen (2).
10. An evaluation method, characterized in that the method is applied to a control device, the method comprising:
generating a display signal in response to target display content input by a first user according to first guide information, and sending the display signal to a second screen, so that the second screen, upon receiving the display signal, displays second guide information and the target display content corresponding to the display signal, wherein the first guide information is displayed by a first screen, and the second guide information is used for guiding a second user to observe the target display content; an included angle is formed between the first screen and the second screen, the second screen is associated with an acquisition device, and the acquisition device is located at any position where an eye movement image of the second user facing the second screen can be acquired, so that the first user is not within the eye movement image acquisition range of the acquisition device, thereby preventing the acquisition device from acquiring an eye movement image of the first user and improving the accuracy of the diagnostic index;
receiving the eye movement image sent by the acquisition device, wherein the eye movement image is acquired by the acquisition device during the display of the second guide information and the target display content on the second screen;
and evaluating the mental activity health condition of the second user according to the received eye movement image.
11. The method of claim 10, wherein the method further comprises:
determining the identification of the second user according to the identity information of the second user input by the first user;
and establishing a corresponding relation between the identification of the second user and the evaluation result of the second user, and storing the corresponding relation.
12. An evaluation device, characterized in that the device is applied to a control apparatus, the device comprising:
the display signal generation module is used for generating a display signal in response to target display content input by a first user according to first guide information, and sending the display signal to a second screen, so that the second screen, upon receiving the display signal, displays second guide information and the target display content corresponding to the display signal, wherein the first guide information is displayed by a first screen, and the second guide information is used for guiding a second user to observe the target display content; an included angle is formed between the first screen and the second screen, the second screen is associated with an acquisition device, and the acquisition device is located at any position where an eye movement image of the second user facing the second screen can be acquired, so that the first user is not within the eye movement image acquisition range of the acquisition device, thereby preventing the acquisition device from acquiring an eye movement image of the first user and improving the accuracy of the diagnostic index;
an eye movement image receiving module, used for receiving the eye movement image sent by the acquisition device, the eye movement image being acquired by the acquisition device during the display of the second guide information and the target display content on the second screen;
and an evaluation module, used for evaluating the mental activity health condition of the second user according to the received eye movement image.
13. A computer readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any of the preceding claims 10-11.
14. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of the preceding claims 10-11 when executing the program.
CN202310093997.9A 2023-02-03 2023-02-03 Display device, evaluation system and evaluation method Active CN116077060B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310093997.9A CN116077060B (en) 2023-02-03 2023-02-03 Display device, evaluation system and evaluation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310093997.9A CN116077060B (en) 2023-02-03 2023-02-03 Display device, evaluation system and evaluation method

Publications (2)

Publication Number Publication Date
CN116077060A CN116077060A (en) 2023-05-09
CN116077060B true CN116077060B (en) 2024-01-16

Family

ID=86213808

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310093997.9A Active CN116077060B (en) 2023-02-03 2023-02-03 Display device, evaluation system and evaluation method

Country Status (1)

Country Link
CN (1) CN116077060B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103472908A (en) * 2012-06-05 2013-12-25 由田新技股份有限公司 Bidirectional communication eye movement system
CN105609088A (en) * 2015-12-21 2016-05-25 联想(北京)有限公司 Display control method and electronic device
CN105912109A (en) * 2016-04-06 2016-08-31 众景视界(北京)科技有限公司 Screen automatic switching device of head-wearing visual device and head-wearing visual device
CN107102732A (en) * 2017-04-08 2017-08-29 闲客智能(深圳)科技有限公司 A kind of eye moves determination methods and device
CN112154498A (en) * 2018-05-31 2020-12-29 矽光学有限公司 Method for processing image data of display screen
CN113128417A (en) * 2021-04-23 2021-07-16 南开大学 Double-region eye movement tracking method based on head posture
CN113842290A (en) * 2020-06-28 2021-12-28 北京清华长庚医院 Ankle training system, method, apparatus and storage medium
CN115444423A (en) * 2022-10-18 2022-12-09 上海耐欣科技有限公司 Prediction system, prediction method, prediction device, prediction equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8649099B2 (en) * 2010-09-13 2014-02-11 Vuzix Corporation Prismatic multiple waveguide for near-eye display
TWI470477B (en) * 2012-08-29 2015-01-21 Utechzone Co Ltd Eye-controlled communicating system

Also Published As

Publication number Publication date
CN116077060A (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN113164036B (en) Methods, apparatus and systems for ophthalmic testing and measurement
KR101785255B1 (en) Shape discrimination vision assessment and tracking system
KR101711093B1 (en) Virtual reality system based on eye movement and perceptual function for self diagnosis and training of dementia
CN115444423A (en) Prediction system, prediction method, prediction device, prediction equipment and storage medium
KR102344493B1 (en) A smart inspecting system, method and program for nystagmus using artificial intelligence
US10786191B2 (en) System and method for supporting of neurological state assessment and for supporting neurological rehabilitation, especially within cognitive and/or speech dysfunction
CN116077060B (en) Display device, evaluation system and evaluation method
EP3801252A1 (en) Technology adapted to enable improved collection of involuntary eyelid movement parameters, including collection of eyelid movement parameters to support analysis of neurological factors
EP4104749A1 (en) Diagnosis assisting device, and diagnosis assisting system and program
US10888263B2 (en) Procedure of non-invasive video-oculographic measurement of eye movements as a diagnostic tool for (early) detection of neuropsychiatric diseases
US10712815B2 (en) Information processing device and display method
Hanke et al. A practical guide to functional magnetic resonance imaging with simultaneous eye tracking for cognitive neuroimaging research
CN219048455U (en) Display device and evaluation system
US10779726B2 (en) Device and method for determining eye movements by tactile interface
Kovesdi et al. Measuring human performance in simulated nuclear power plant control rooms using eye tracking
US11867984B2 (en) Methods for determining the near point, for determining the near point distance, for determining a spherical refractive power, and for producing a spectacle lens, and corresponding mobile terminals and computer programs
CN117058148B (en) Imaging quality detection method, device and equipment for nystagmus patient
KR102312359B1 (en) system for self-determination of mild cognitive impairment and prevention training using memory and response rate information, based on breathing patterns
RU2668462C1 (en) Method of investigation of field of vision and device for implementation thereof
CASTNER et al. Eye movement recording/Some approaches to the study of map perception
JP4429743B2 (en) Electronic medical record input system, program for causing computer to execute electronic medical record creation process, and computer-readable information recording medium recording the program
KR20230093116A (en) Protocol/tool for early detection of autism spectrum disorder by measuring the response to joint attention
Charlier et al. Experience with an eye tracker in visual communication evaluation
CN109285602A (en) Main module, system and method for self-examination eyes of user
KR20170090877A (en) Apparatus and Method for Interactive User Calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant