CN110327061B - Character determining device, method and equipment based on eye movement tracking technology - Google Patents


Info

Publication number
CN110327061B
CN110327061B (granted; application CN201910740172.5A)
Authority
CN
China
Prior art keywords
character
determining
test
tested person
eye movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910740172.5A
Other languages
Chinese (zh)
Other versions
CN110327061A (en)
Inventor
王凯强
秦林婵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd filed Critical Beijing 7Invensun Technology Co Ltd
Priority to CN201910740172.5A
Publication of CN110327061A
Application granted
Publication of CN110327061B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/163: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B 5/167: Personality evaluation

Abstract

The invention discloses a character determining device, method and equipment based on an eye tracking technology. The device comprises a determining unit configured to determine the tendency selection of a tested person according to first eye movement data and a first data model of the tested person, so as to determine a first character test result of the tested person; to determine a second character test result of the tested person according to second eye movement data and a second data model of the tested person; and to determine the character type of the tested person according to the first character test result and the second character test result. With this device, a character type hidden by the tested person during testing can be uncovered, and the tested person's character traits can be reflected comprehensively and truthfully from multiple angles.

Description

Character determining device, method and equipment based on eye movement tracking technology
Technical Field
The embodiment of the invention relates to an information processing technology, in particular to a character determining device, method and equipment based on an eye movement tracking technology.
Background
In a character test, the test result is typically obtained by having the tested person answer multiple-choice questions, and the character type of the tested person is then evaluated comprehensively from the test result.
During the test, however, it is inevitable that some tested persons hide their actual thoughts and select options pointing toward a target character in order to obtain a better-matching character type.
Such a test method cannot determine whether the tested person hid the real situation during the test, cannot uncover the real situation that was hidden, and cannot capture the character traits reflected by the tested person's various behaviors during the test.
Disclosure of Invention
The invention provides a character determining device, method and equipment based on an eye tracking technology, which can uncover the real character type hidden by a tested person and reflect the tested person's character traits comprehensively and truthfully.
In a first aspect, an embodiment of the present invention provides a character determining device based on an eye tracking technology, where the device includes:
the determining unit, configured to determine the tendency selection of the tested person according to the first eye movement data and the first data model of the tested person;
the determining unit is further configured to determine a first character test result of the tested person according to the tendency selection;
the determining unit is further configured to determine a second character test result of the tested person according to the second eye movement data and the second data model of the tested person;
and the determining unit is further configured to determine the character type of the tested person according to the first character test result and the second character test result.
In a second aspect, an embodiment of the present invention provides a character determining method based on an eye tracking technique, where the method includes:
determining the tendency selection of the testee according to the first eye movement data and the first data model of the testee;
according to the tendency selection, determining a first character test result of the tested person;
determining a second character test result of the tested person according to the second eye movement data and the second data model of the tested person;
and determining the character type of the tested person according to the first character test result and the second character test result.
In a third aspect, an embodiment of the present invention further provides a character determining device based on an eye tracking technology, where the device includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the functions of the character determining device according to the first aspect are implemented.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the functions of the character determining device according to the first aspect are implemented.
The embodiments of the invention provide a character determining device, method and equipment based on an eye tracking technology. The device comprises a determining unit configured to determine the tendency selection of a tested person according to first eye movement data and a first data model of the tested person, so as to determine a first character test result of the tested person; to determine a second character test result of the tested person according to second eye movement data and a second data model of the tested person; and to determine the character type of the tested person according to the first character test result and the second character test result. With this device, a character type hidden by the tested person during testing can be uncovered, and the tested person's character traits can be reflected comprehensively and truthfully from multiple angles.
Drawings
FIG. 1 is a schematic structural diagram of a character determining apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an eye movement trajectory according to a first embodiment of the present invention;
FIG. 3 is a schematic diagram of a portion of eye movement data in accordance with a first embodiment of the present invention;
FIG. 4 is a schematic structural diagram of another personality determination device in the first embodiment of the invention;
FIG. 5 is a flowchart of a character determination method according to a second embodiment of the present invention;
FIG. 6 is a flowchart of another character determination method according to the second embodiment of the present invention;
FIG. 7 is a schematic structural diagram of another personality determination device in the second embodiment of the invention;
FIG. 8 is a schematic structural diagram of a character determination device in the third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
In addition, in the embodiments of the present application, the words "optionally" or "exemplarily" are used to indicate examples, illustrations or explanations. Any embodiment or design described as "optional" or "exemplary" is not to be construed as preferred or advantageous over other embodiments or designs; rather, these words are intended to present the relevant concepts in a concrete fashion.
For ease of understanding, example descriptions of some concepts related to the present invention are given below for reference:
eye movement data: the method comprises the steps of estimating the fixation point information of a tested person by an eye tracker through an image recognition algorithm, and acquiring the eye activity data of the tested person, wherein the eye activity data can comprise the fixation point information, a fixation track, fixation times, fixation duration and other information.
Eye tracking technology: the method is a technology for estimating the fixation point information of a tested person by an eye tracker through an image recognition algorithm so as to determine the fixation point of the tested person in real time according to the fixation point information.
Selecting the tendency: and according to the eye movement data, calculating the option which is most inclined to the inner intention of the tested person by a deep learning algorithm.
And (3) actual selection: and (4) selecting the actual selection of the testee in the test process.
The difference between actual selection and tendency selection can be illustrated with the following question:
How do you feel when something unpleasant drags on?
□ I keep brooding over it and worry about it again and again.
□ When things are at their worst, there is always a turning point.
☑ I am somewhat unhappy, but I quickly get over it.
For the options above, the acquired eye movement data show that the user's gaze point dwelt on the second option longest and most often, so the second option is the tendency selection of the tested person, while the third option (the ticked one) is the actual selection.
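As an illustration of how a tendency selection could be derived from such gaze data, the sketch below picks the option with the greatest combined gaze evidence. The patent itself uses a trained deep-learning model for this step; the rule-based scoring, the `OptionGaze` type and all field names here are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class OptionGaze:
    option: str
    dwell_ms: int    # total fixation duration on this option, in ms
    fixations: int   # number of fixations on this option

def tendency_selection(gaze):
    # Score each option by normalized dwell time plus normalized fixation
    # count, and take the highest-scoring option as the tendency selection.
    max_dwell = max(g.dwell_ms for g in gaze) or 1
    max_fix = max(g.fixations for g in gaze) or 1
    return max(gaze, key=lambda g: g.dwell_ms / max_dwell + g.fixations / max_fix).option

# The three options of the sample question above: gaze dwells longest and
# most often on option B, while the tested person actually ticked option C.
gaze = [
    OptionGaze("A", dwell_ms=900,  fixations=3),
    OptionGaze("B", dwell_ms=2600, fixations=7),
    OptionGaze("C", dwell_ms=1200, fixations=4),
]
print(tendency_selection(gaze))  # prints "B": the tendency differs from the actual selection
```

A real implementation would feed far richer features (trajectory, off-target duration, selection changes) into the trained model rather than this two-feature score.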
Example one
On the basis of the above concept, an embodiment of the present invention provides a character determining apparatus based on an eye tracking technique, as shown in fig. 1, the apparatus including: a determination unit 101.
The determining unit 101 is configured to determine the trend selection of the test subject according to the first eye movement data and the first data model of the test subject.
Illustratively, the first eye movement data may include: the fixation duration on each option, the number of fixations on each option, the option fixated last, the option fixated at the moment of selection, the fixation trajectory, the off-target duration, and the like.
In an embodiment of the present invention, the first data model may be a model for determining the subject's disposition selection based on the input first eye movement data.
For example, the first eye movement data of the tested person are input into the first data model and compared against the eye movement data information in the model; the matching test result is output and taken as the tendency selection of the tested person.
The determining unit 101 is further configured to determine a first personality test result of the subject according to the trend selection.
After acquiring the tendency selection of the test subject, the determination unit 101 may perform a character test analysis on the test subject based on the tendency selection, and determine a first character test result of the test subject based on the character test analysis result.
It should be noted that, when performing the personality test analysis on the testee, the implementation method in the prior art may be adopted, for example, the type of the personality to which the testee belongs is determined based on the tendency selection of the testee, and the corresponding score is given, which is not limited by the present invention.
Optionally, in this embodiment of the present invention, the first character test result includes at least one character type and a weight corresponding to each character type. The character types and corresponding scores obtained from the character test analysis may serve as the first character test result.
Those skilled in the art will appreciate that when the tendency selection is the same as the actual selection made by the tested person (i.e., the tested person answered according to his or her real situation without concealment), this process is equivalent to determining the first character test result from the actual selection.
The determining unit 101 is further configured to determine a second character test result of the test subject according to the second eye movement data and the second data model of the test subject.
Illustratively, the second eye movement data may include: the fixation duration on each option, the number of fixations on each option, the option fixated last, the option fixated during selection, the fixation trajectory, the time taken to make the selection, the off-target duration, the number of times the selection was changed, and the like.
In the technical solution provided by the embodiment of the present invention, those skilled in the art can design the second eye movement data differently according to actual requirements, so as to distinguish them from the first eye movement data and test the tested person from different angles. Whatever the specific design of the first and second eye movement data, it is intended to fall within the inventive concept.
As shown in fig. 2, the embodiment of the present invention designs the second eye movement data from the perspective of the eye movement trajectory of the test subject, wherein the size of the circle represents the fixation time of the test subject. The larger the circle is, the longer the fixation time of the subject is.
Second eye movement data of the test subject can be acquired based on the eye movement locus of the test subject, and partial eye movement data of the test subject is listed as shown in fig. 3.
Alternatively, in the embodiment of the present invention, the second data model may be a model that outputs a result of the second character test of the subject based on the input second eye movement data.
For example, the collected second eye movement data of the tested person are input into the second data model and compared against the eye movement data information in the model; the character types corresponding to the second eye movement data and their weights are matched and output as the second character test result.
Optionally, the second character test result may include at least one character type and a weight corresponding to the at least one character type.
It should be noted that at least one character type included in the second character test result may be the same as or different from at least one character type included in the first character test result.
For example, assume that both the second character test result and the first character test result contain the character types A, B, C and D. When the weights of all four types are greater than 0 in both results, the two results contain the same character types.
Conversely, suppose the weights of types B, C and D are all 0 in the first result (so it contains only type A and its weight), while the weights of types A and B are all 0 in the second result (so it contains only types C and D and their weights). In that case, the two results differ in the character types they contain.
The determining unit 101 is further configured to determine the personality type of the subject according to the first personality test result and the second personality test result.
Optionally, one implementation provided by the embodiment of the present invention is as follows: multiply a first weight by the weight of each character type in the first character test result to obtain a first result for each character type, where the first weight is the proportion, among all test questions, of questions on which the tested person's actual selection is the same as the tendency selection;
multiply a second weight by the weight of each character type in the second character test result to obtain a second result for each character type, where the second weight is the proportion of questions on which the tested person's actual selection differs from the tendency selection;
add the first result and the second result of each character type to obtain a third result for each character type;
further, the character type(s) corresponding to the third result are determined as the character type of the tested person.
For example, suppose there are 10 test questions in total, of which the tested person's actual selection matches the tendency selection on 8 questions and differs from it on 2 questions. The first weight is then 80% and the second weight 20%.
Suppose the first character test result contains the four character types A, B, C and D listed above with weights 8, 2, 0 and 0 respectively, and the second character test result contains the same four types with weights 0, 0, 6 and 4 respectively. Following the implementation provided by the embodiment of the present invention:
the first result is A = 8 × 80% = 6.4, B = 2 × 80% = 1.6, C = 0 × 80% = 0, D = 0 × 80% = 0;
the second result is A = 0 × 20% = 0, B = 0 × 20% = 0, C = 6 × 20% = 1.2, D = 4 × 20% = 0.8;
the third result is A = 6.4 + 0 = 6.4, B = 1.6 + 0 = 1.6, C = 0 + 1.2 = 1.2, D = 0 + 0.8 = 0.8.
That is, the character of the tested person is most inclined toward type A, while also showing some features of types B, C and D; the larger the weight, the closer the tested person is to that character type, and the smaller the weight, the fewer features of that type the tested person has.
It should be noted that the above-mentioned first, second and third results are merely exemplary representations, and the embodiment of the present invention does not limit the presentation manner of the first, second and third results.
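The weighted combination described above can be sketched as follows; the function name, the dict representation of a test result, and the `match_ratio` parameter are assumptions made for illustration, not part of the patent.

```python
def combine_results(first, second, match_ratio):
    """Combine two character test results: weight the first result by the
    share of questions on which the actual selection matched the tendency
    selection, weight the second by the remaining share, then sum per type."""
    w1, w2 = match_ratio, 1.0 - match_ratio
    return {t: first.get(t, 0) * w1 + second.get(t, 0) * w2
            for t in set(first) | set(second)}

first = {"A": 8, "B": 2, "C": 0, "D": 0}   # first character test result
second = {"A": 0, "B": 0, "C": 6, "D": 4}  # second character test result
third = combine_results(first, second, match_ratio=8 / 10)
# third is approximately {"A": 6.4, "B": 1.6, "C": 1.2, "D": 0.8};
# type A carries the largest weight, matching the worked example above.
```

The values reproduce the 10-question example: 8 matching questions give weights of 80% and 20%.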
In addition, as shown in fig. 4 in conjunction with fig. 2, the character determining apparatus further includes an establishing unit 102.
The establishing unit 102 is used for establishing a first data model before determining the tendency selection of the testee.
For example, when tested persons take the test, eye movement data are collected separately for two different cases: one in which no selection behavior is concealed and one in which all selection behaviors are concealed.
Based on the collected eye movement data, a first data model capable of outputting the tendency selection of a tested person is established through a deep learning algorithm; this model is used to determine the tendency selection of the tested person.
When the eye movement data with all selection behaviors concealed are collected, the tendency selection of the tested person, i.e., the option of his or her inner intention, is labeled.
It should be noted that, a person skilled in the art may select a corresponding deep learning algorithm to establish the first data model according to actual requirements, which is not limited in the embodiment of the present invention.
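As a sketch of the data collection just described: each answered question yields one labeled training record of eye-movement features, with the annotated inner-intention option as the label when selection behavior was concealed. All field names below are illustrative assumptions; the patent only specifies that a deep learning algorithm is trained on such labeled data.

```python
def make_training_example(eye_movement, tendency_option):
    # One labeled record for the first data model: eye-movement features for
    # a single question, labeled with the annotated tendency selection.
    features = {
        "dwell_per_option": eye_movement["dwell_per_option"],        # ms per option
        "fixations_per_option": eye_movement["fixations_per_option"],
        "last_fixated": eye_movement["last_fixated"],
        "off_target_ms": eye_movement.get("off_target_ms", 0),
    }
    return {"features": features, "label": tendency_option}

example = make_training_example(
    {"dwell_per_option": {"A": 900, "B": 2600, "C": 1200},
     "fixations_per_option": {"A": 3, "B": 7, "C": 4},
     "last_fixated": "C"},
    tendency_option="B",  # annotated inner-intention option under concealment
)
# A deep-learning classifier would then be fit on many such labeled examples.
```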
Further, the determination unit 101 is further configured to determine that the first eye movement data is valid according to a first preset condition before determining the tendency selection of the subject.
When determining the tendency selection of the tested person based on the first eye movement data, whether the first eye movement data are valid may first be judged according to a first preset condition. If the first eye movement data are valid, the character test analysis continues; if they are invalid, no character test analysis is performed.
For example, the first preset condition may include a preset duration for the tested person to fixate the question, a preset duration for fixating the options, a preset number of option fixations, and the like. When the tested person answers, it is judged whether the question fixation duration, option fixation duration, number of option fixations and so on fall within valid thresholds. If the question and option fixation durations are each shorter than the corresponding preset durations (i.e., the tested person did not read the question in detail), or the number of option fixations is 0 (i.e., a selection was made without reading the options), the collected first eye movement data may be considered invalid. In this case, there is no need to determine the character type of the tested person.
Judging whether the collected first eye movement data are valid thus decides whether to continue determining the character type of the tested person. When the first eye movement data are valid, the tendency selection obtained from them and the first data model is valid information, and the character type of the tested person can then be determined more rigorously on that basis, improving the accuracy and reliability of the determination.
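The first preset condition could be implemented as in the sketch below. The threshold values and parameter names are illustrative assumptions; the patent leaves the concrete preset durations unspecified.

```python
def first_data_valid(question_dwell_ms, option_dwell_ms, option_fixations,
                     min_question_ms=500, min_option_ms=300):
    # Invalid if the question or the options were not read in detail...
    if question_dwell_ms < min_question_ms or option_dwell_ms < min_option_ms:
        return False
    # ...or if a selection was made without ever fixating an option.
    if option_fixations == 0:
        return False
    return True

print(first_data_valid(1200, 800, 3))   # True: question and options were read
print(first_data_valid(100, 800, 3))    # False: question only skimmed
print(first_data_valid(1200, 800, 0))   # False: options never fixated
```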
The establishing unit 102 is further configured to establish a second data model before determining a second personality test result of the testee.
For example, eye movement trajectory data of persons of different character types are collected, and a second data model is established through a deep learning algorithm based on each eye movement trajectory data. The second data model is used to determine a second personality test result for the subject.
It should be noted that the deep learning algorithm used in the second data model is the same as the algorithm used in the first data model, so as to ensure the consistency of the processing of the first eye movement data and the second eye movement data.
The second data model can analyze and calculate the character type of the tested person and the weight value corresponding to each character type based on the input second eye movement data of the tested person.
The determining unit 101 is further configured to determine that the second eye movement data is valid according to a second preset condition before determining the second personality test result of the test subject.
When determining the second character test result of the subject based on the second eye movement data, it may be determined whether the second eye movement data is valid in advance according to a second preset condition.
For example, the second preset condition may include a preset duration for the tested person to fixate the options, a preset duration for making a selection, and the like.
The option fixation duration and the selection time in the second eye movement data are judged. If the collected option fixation duration is shorter than the corresponding preset duration (i.e., the tested person did not read the options seriously), or the time taken to make the selection is shorter than the corresponding preset duration (i.e., the selection was not made after serious consideration), the second eye movement data of the tested person may be determined to be invalid. In this case, there is no need to continue determining the second character test result and the character type. Otherwise, if the second eye movement data are judged valid, the second character test result is determined from them.
In this way, the accuracy and validity of the second personality test result may be ensured.
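The second preset condition can be sketched analogously; again, the thresholds and names are assumptions made for the example rather than values from the patent.

```python
def second_data_valid(option_dwell_ms, selection_time_ms,
                      min_option_ms=300, min_selection_ms=700):
    # Valid only if the tested person both read the options seriously and
    # took enough time before making the selection.
    return (option_dwell_ms >= min_option_ms
            and selection_time_ms >= min_selection_ms)

print(second_data_valid(800, 1500))  # True: options read, choice considered
print(second_data_valid(100, 1500))  # False: options not read seriously
print(second_data_valid(800, 200))   # False: selection made too quickly
```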
The embodiment of the invention provides a character determining device based on an eye tracking technology. The device comprises a determining unit configured to determine the tendency selection of a tested person according to first eye movement data and a first data model of the tested person, so as to determine a first character test result of the tested person; to determine a second character test result of the tested person according to second eye movement data and a second data model of the tested person; and to determine the character type of the tested person according to the first character test result and the second character test result. With this device, a character type hidden by the tested person during testing can be uncovered, and the tested person's character traits can be reflected comprehensively and truthfully from multiple angles.
Example two
An embodiment of the present invention provides a character determining method based on an eye tracking technology, as shown in fig. 5, the method includes:
s501, determining the tendency selection of the tested person according to the first eye movement data and the first data model of the tested person.
Illustratively, the first eye movement data may include: the fixation duration on each option, the number of fixations on each option, the option fixated last, the option fixated at the moment of selection, the fixation trajectory, the off-target duration, and the like.
In an embodiment of the present invention, the first data model may be a model for determining the subject's disposition selection based on the input first eye movement data.
S502, according to the tendency selection, determining a first character test result of the tested person.
In the embodiment of the invention, the first character test result comprises at least one character type and a weight value corresponding to the at least one character type.
S503, determining a second character test result of the tested person according to the second eye movement data and the second data model of the tested person.
Illustratively, the second eye movement data may include: the fixation duration on each option, the number of fixations on each option, the option fixated last, the option fixated during selection, the fixation trajectory, the time taken to make the selection, the off-target duration, the number of times the selection was changed, and the like.
The second data model may be a model for outputting a second character test result of the test subject based on the input second eye movement data.
Optionally, the second character test result may include at least one character type and a weight corresponding to the at least one character type.
It should be noted that at least one character type included in the second character test result may be the same as or different from at least one character type included in the first character test result.
And S504, determining the character type of the tested person according to the first character test result and the second character test result.
Optionally, one implementation provided by the embodiment of the present invention is as follows: multiply a first weight by the weight of each character type in the first character test result to obtain a first result for each character type, where the first weight is the proportion, among all test questions, of questions on which the tested person's actual selection is the same as the tendency selection;
multiply a second weight by the weight of each character type in the second character test result to obtain a second result for each character type, where the second weight is the proportion of questions on which the tested person's actual selection differs from the tendency selection;
add the first result and the second result of each character type to obtain a third result for each character type;
further, the character type(s) corresponding to the third result are determined as the character type of the tested person.
Before the tendency selection of the tested person is determined, the character determining method provided by the embodiment of the present invention, as shown in fig. 6 in conjunction with fig. 5, further includes:
S5010, establishing a first data model.
While the tested person performs the test, eye movement data are collected separately under two different conditions: selection behavior with nothing hidden and selection behavior with everything hidden.
Based on the collected eye movement data, a first data model capable of outputting the tendency selection of the tested person is established through a deep learning algorithm.
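A rough sketch of the shape of such a model's input is given below. The fixation-record representation, the per-option features, and the dwell-time stand-in for the trained model are all illustrative assumptions; the embodiment itself only specifies that a deep learning algorithm is trained on the collected eye movement data:

```python
from collections import defaultdict

def extract_gaze_features(fixations):
    """Aggregate raw fixation records into per-option features.

    fixations: list of (option_id, dwell_seconds) records.  Returns
    {option_id: (total_dwell, fixation_count)} -- the kind of per-option
    summary a first data model could be trained on.
    """
    dwell = defaultdict(float)
    count = defaultdict(int)
    for option_id, seconds in fixations:
        dwell[option_id] += seconds
        count[option_id] += 1
    return {opt: (dwell[opt], count[opt]) for opt in dwell}

def tendency_from_features(features):
    """A trivial stand-in for the trained first data model: predict the
    option with the longest total dwell time as the tendency selection."""
    return max(features, key=lambda opt: features[opt][0])
```

In practice the deep learning model would replace `tendency_from_features`, consuming features like these (or the raw gaze track) and outputting the option the tested person is most inclined toward.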
S5011, determining that the first eye movement data are valid according to a first preset condition.
The first preset condition may be, for example, a preset duration for which the tested person gazes at the question stem, a preset duration of gazing at an option, a preset number of fixations on an option, and the like.
When the tendency selection of the tested person is determined based on the first eye movement data, whether the first eye movement data are valid may first be determined according to the first preset condition. If the first eye movement data are judged valid, the character test analysis continues; if they are judged invalid, the character test analysis is not performed.
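The validity check of S5011 might look like the following sketch, where the threshold values and parameter names are illustrative placeholders for the first preset condition:

```python
def first_eye_data_valid(question_gaze_s, option_gaze_s, option_fixations,
                         min_question_s=1.0, min_option_s=0.5,
                         min_fixations=2):
    """Return True when the first eye movement data meets the first preset
    condition: minimum gaze time on the question stem, minimum gaze time
    on the options, and a minimum number of fixations on the options.
    All thresholds are illustrative defaults, not values from the patent."""
    return (question_gaze_s >= min_question_s
            and option_gaze_s >= min_option_s
            and option_fixations >= min_fixations)
```

If the check fails, the character test analysis is simply skipped for that data.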
It should be noted that S5011 is not limited to be executed after S5010.
Before the second character test result of the tested person is determined, the embodiment of the present invention provides a character determining method which, as shown in fig. 7 in conjunction with fig. 6, further includes:
S5012, establishing a second data model.
Eye movement track data of people with different character types are collected, and the second data model is established through a deep learning algorithm based on the eye movement track data.
It should be noted that the deep learning algorithm used in the second data model is the same as the algorithm used in the first data model, so as to ensure the consistency of the processing of the first eye movement data and the second eye movement data.
S5013, determining that the second eye movement data is valid according to a second preset condition.
When determining the second character test result of the subject based on the second eye movement data, it may be determined whether the second eye movement data is valid in advance according to a second preset condition.
For example, the second preset condition may be a preset time duration for the testee to watch the option, a preset time duration for making a selection, and the like.
It should be noted that S5013 is not limited to being performed after S5012, and that S5013 and S5012 are not limited to being performed after S5010-S5011 and S501-S502.
The embodiment of the invention provides a character determining method based on an eye movement tracking technology, which comprises the following steps: determining the tendency selection of the tested person according to the first eye movement data and the first data model of the tested person, and further determining a first character test result of the tested person; determining a second character test result of the tested person according to the second eye movement data and the second data model of the tested person; and determining the character type of the tested person according to the first character test result and the second character test result. Compared with the prior art, this implementation can uncover character types that the tested person conceals during testing, and can reflect the character features of the tested person comprehensively and truthfully from multiple angles.
EXAMPLE III
An embodiment of the present invention provides a personality determination device based on an eye tracking technique, and as shown in fig. 8, the personality determination device includes: a processor 801, a memory 802, an input device 803, an output device 804; the number of the processors 801 in the character determination device may be one or more, and one processor 801 is taken as an example in fig. 8; the processor 801, the memory 802, the input device 803, and the output device 804 in the character determination apparatus may be connected by a bus or other means, and fig. 8 illustrates an example of connection by a bus.
The memory 802, which is a computer-readable storage medium, may be used to store software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the personality determination method in the embodiment of the present invention (e.g., the determination unit 101 in the personality determination device, etc.). The processor 801 executes various functional applications of the character determination apparatus and data processing by running software programs, instructions, and modules stored in the memory 802, that is, implements the character determination method described above.
The memory 802 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required for at least one function, and the data storage area may store data created according to the use of the terminal, and the like. Further, the memory 802 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some instances, the memory 802 may further include memory located remotely from the processor 801, which may be connected to the device/terminal/server via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 803 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the character determination device. The output device 804 may include a display device such as a display screen.
The character determining device can execute the character determining method based on the eye tracking technology provided by the first embodiment of the invention, and has corresponding functional modules and beneficial effects of the executing method.
Example four
A fourth embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a method for determining a personality based on an eye tracking technique, the method including:
determining the tendency selection of the testee according to the first eye movement data and the first data model of the testee;
according to the tendency selection, determining a first character test result of the tested person;
determining a second character test result of the tested person according to the second eye movement data and the second data model of the tested person;
and determining the character type of the tested person according to the first character test result and the second character test result.
Of course, in the storage medium containing computer-executable instructions provided by the embodiment of the present invention, the computer-executable instructions are not limited to the method operations described above, and may also perform related operations in the character determining method provided by any embodiment of the present invention.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
It should be noted that, in the embodiment of the character determining apparatus based on the eye tracking technology, the included units and modules are only divided according to the functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (8)

1. A character determining apparatus based on an eye-tracking technique, comprising:
the device comprises a determining unit, a judging unit and a judging unit, wherein the determining unit is used for determining the tendency selection of a tested person according to first eye movement data and a first data model of the tested person;
the determining unit is further used for determining a first character test result of the tested person according to the tendency selection;
the determining unit is further configured to determine a second character test result of the test subject according to the second eye movement data and the second data model of the test subject;
the determining unit is further configured to determine the personality type of the subject according to the first personality test result and the second personality test result;
the determining unit is specifically configured to:
respectively multiplying the first weight by the weight corresponding to at least one character type in the first character test result to obtain a first result corresponding to the at least one character type, wherein the first weight is the proportion, among all test questions, of test questions for which the actual selection of the tested person is the same as the tendency selection;
respectively multiplying the second weight by the weight corresponding to at least one character type in the second character test result to obtain a second result corresponding to the at least one character type, wherein the second weight is the proportion, among all test questions, of test questions for which the actual selection of the tested person differs from the tendency selection;
adding the first result and the second result corresponding to at least one character type to obtain a third result corresponding to at least one character type;
determining at least one character type corresponding to the third result as the character type of the tested person;
the tendency selection is the option, calculated by a deep learning algorithm from the eye movement data, toward which the inner selection intention of the tested person is most inclined;
the actual selection is an option actually selected by the tested person in the testing process.
2. The personality determination device of claim 1, wherein the first personality test result comprises at least one personality type and a weight corresponding to the at least one personality type.
3. The personality determination device of claim 1, wherein the second personality test result comprises at least one personality type and a weight corresponding to the at least one personality type.
4. The personality determination device of claim 1, further comprising:
the device comprises a building unit, a selection unit and a control unit, wherein the building unit is used for building a first data model before determining the tendency selection of a tested person, and the first data model is used for determining the tendency selection of the tested person;
the determining unit is further used for determining that the first eye movement data is valid according to a first preset condition before determining the tendency selection of the testee.
5. The personality determination device of claim 4, wherein the establishing unit is further configured to establish a second data model for determining a second personality test result for the test subject prior to determining the second personality test result for the test subject;
the determining unit is further configured to determine that the second eye movement data is valid according to a second preset condition before determining a second character test result of the test subject.
6. A character determining method based on an eye movement tracking technology is characterized by comprising the following steps:
determining the tendency selection of the testee according to the first eye movement data and the first data model of the testee;
determining a first character test result of the tested person according to the tendency selection;
determining a second character test result of the tested person according to the second eye movement data and the second data model of the tested person;
determining the character type of the tested person according to the first character test result and the second character test result;
respectively multiplying the first weight by the weight corresponding to at least one character type in the first character test result to obtain a first result corresponding to the at least one character type, wherein the first weight is the proportion, among all test questions, of test questions for which the actual selection of the tested person is the same as the tendency selection;
respectively multiplying the second weight by the weight corresponding to at least one character type in the second character test result to obtain a second result corresponding to the at least one character type, wherein the second weight is the proportion, among all test questions, of test questions for which the actual selection of the tested person differs from the tendency selection;
adding the first result and the second result corresponding to at least one character type to obtain a third result corresponding to at least one character type;
determining at least one character type corresponding to the third result as the character type of the tested person;
the tendency selection is the option, calculated by a deep learning algorithm from the eye movement data, toward which the inner selection intention of the tested person is most inclined;
the actual selection is an option actually selected by the tested person in the testing process.
7. A personality determination device based on eye-tracking technology comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the functionality of the personality determination apparatus as claimed in any one of claims 1-5.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the functions of the personality determination device of any one of claims 1-5.
CN201910740172.5A 2019-08-12 2019-08-12 Character determining device, method and equipment based on eye movement tracking technology Active CN110327061B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910740172.5A CN110327061B (en) 2019-08-12 2019-08-12 Character determining device, method and equipment based on eye movement tracking technology


Publications (2)

Publication Number Publication Date
CN110327061A CN110327061A (en) 2019-10-15
CN110327061B true CN110327061B (en) 2022-03-08






Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant