CN111208904A - Gaze estimation device performance evaluation method, system and device


Info

Publication number
CN111208904A
Authority
CN
China
Prior art keywords
user
gaze
module
gaze information
Prior art date
Legal status
Pending
Application number
CN202010016069.9A
Other languages
Chinese (zh)
Inventor
朱成彦
Current Assignee
Untouchtech Co ltd
Original Assignee
Untouchtech Co ltd
Priority date
Filing date
Publication date
Application filed by Untouchtech Co ltd
Priority to CN202010016069.9A
Publication of CN111208904A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 - Eye tracking input arrangements

Abstract

The invention belongs to the technical field of artificial intelligence, and discloses a performance evaluation method, system and device for gaze estimation equipment. The system captures user images simultaneously through a plurality of modules to provide a sufficiently large gaze detection range, calculates user gaze information for each module accordingly, and then selects one set of that information as the user's true gaze information.

Description

Gaze estimation device performance evaluation method, system and device
Technical Field
Embodiments of the invention relate to the technical field of artificial intelligence, and in particular to a gaze estimation device performance evaluation method, system and device.
Background
Gaze estimation technology is being applied ever more widely: in eye-control devices for disabled persons or ALS patients, in gaze trackers for psychological analysis, and in gaze estimation instruments in DMS (Driver Monitoring System) applications.
Gaze estimation products generally determine human-eye features from captured eye images and then estimate the gaze point, gaze direction and the like of the eyes from those features. Specifically, the methods can be classified into appearance-based methods and corneal reflection methods. Gaze estimation products based on appearance methods typically rely on features of the face and/or eye image (e.g. eyelid position, pupil position, iris position, inner/outer eye corners, face orientation) to estimate the gaze point and gaze direction. Gaze estimation products based on the corneal reflection method rely, in addition to some of the features used in appearance methods, on light spots (reflections that appear in the camera image on the corneal region of the eye). In general, since the light spot reflects the gaze direction of the eye well, the corneal reflection method is more accurate than the appearance method, and almost all mature commercial gaze estimation products are based on it.
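For illustration, a minimal sketch of the corneal reflection idea follows, assuming the pupil center and spot center have already been detected in the eye image; the second-order polynomial mapping and its calibrated coefficients are a common textbook formulation, not something this patent prescribes.

```python
import numpy as np

def gaze_from_pupil_glint(pupil_xy, glint_xy, coeffs_x, coeffs_y):
    # Pupil-center-to-glint vector: the feature the corneal reflection
    # method adds on top of appearance features.
    dx, dy = np.subtract(pupil_xy, glint_xy)
    features = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
    # coeffs_x / coeffs_y come from a per-user calibration in which the
    # user fixates known points while the vector is recorded.
    return features @ coeffs_x, features @ coeffs_y  # estimated gaze point
```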
At present, the gaze estimation products on the market are of many kinds and uneven quality. A solution for quantitatively evaluating gaze estimation products is therefore important.
Disclosure of Invention
Because the prior art cannot quantitatively evaluate the performance of some gaze estimation devices, the present application provides a gaze estimation device performance evaluation method, system and device.
In a first aspect of the embodiments of the present invention, there is provided a gaze estimation device performance evaluation system including: a data acquisition unit to be tested, configured to acquire the user gaze information that the gaze estimation device obtains by capturing and processing user images; at least two modules, each configured to emit infrared light from a different angle to illuminate the user and to capture user images; an initial gaze determination unit, configured to calculate user gaze information from the user images captured by each module, each module corresponding to one set of user gaze information; a true gaze determination unit, configured to receive the sets of user gaze information corresponding to the modules and select one set as the user's true gaze information; and a comparison unit, configured to compare the user gaze information calculated by the gaze estimation device with the user's true gaze information and evaluate the performance of the gaze estimation device according to the comparison result.
In a second aspect of the embodiments of the present invention, there is provided a gaze estimation device performance evaluation method including: acquiring the user gaze information that the gaze estimation device obtains by capturing and processing user images; emitting infrared light from different angles with at least two modules to illuminate the user and capturing user images; calculating user gaze information from the user images captured by each module, each module corresponding to one set of user gaze information; selecting one set of user gaze information from the sets corresponding to the modules and determining it as the user's true gaze information; and comparing the user gaze information calculated by the gaze estimation device with the user's true gaze information, and evaluating the performance of the gaze estimation device according to the comparison result.
In a third aspect of the embodiments of the present invention, there is provided a gaze estimation device performance evaluation device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, performs the gaze estimation device performance evaluation method of the second aspect.
In a fourth aspect of the embodiments of the present invention, there is provided a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the gaze estimation device performance evaluation method of the second aspect.
With the gaze estimation device performance evaluation method, system and device provided by the invention, the evaluation system captures user images simultaneously through several modules and thereby provides a sufficiently large gaze detection range. The user's true gaze information obtained by the evaluation system serves as high-precision user gaze information, and comparing it with the user gaze information calculated by the gaze estimation device achieves quantitative evaluation.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present invention will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
FIG. 1 illustrates an exemplary application scenario for gaze estimation device performance evaluation;
FIG. 2 is a first schematic structural diagram illustrating an embodiment of a system for evaluating performance of a gaze estimation device provided by the present application;
FIG. 3 is a schematic diagram illustrating an application scenario of the gaze estimation device performance evaluation system of the present application;
fig. 4 shows a second schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system provided in the present application;
fig. 5 shows a schematic diagram of an embodiment of the initial gaze determination unit in the embodiment shown in fig. 4;
fig. 6 shows a first schematic diagram of an embodiment of the initial gaze determination unit in the embodiment shown in fig. 5;
fig. 7 shows a second schematic diagram of an embodiment of the initial gaze determination unit in the embodiment shown in fig. 5;
fig. 8 shows a third schematic diagram of an embodiment of the initial gaze determination unit in the embodiment shown in fig. 5;
fig. 9 shows a fourth schematic diagram of an embodiment of the initial gaze determination unit in the embodiment shown in fig. 5;
fig. 10 shows a fifth schematic diagram of an embodiment of the initial gaze determination unit in the embodiment shown in fig. 5;
FIG. 11 is a first schematic structural diagram of an embodiment of the module 202 in the embodiment shown in FIG. 2;
FIG. 12 is a second schematic structural diagram of an embodiment of the module 202 in the embodiment shown in FIG. 2;
FIG. 13 shows a schematic diagram of an application scenario of the gaze estimation device performance evaluation system;
fig. 14 shows a third schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system provided in the present application;
FIG. 15 is a schematic diagram illustrating an application scenario in which the module control unit performs time-division control of the modules in the embodiment shown in FIG. 14;
fig. 16 shows a fourth schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system provided in the present application;
fig. 17 shows a fifth schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system provided in the present application;
fig. 18 shows a sixth schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system provided by the present application;
FIG. 19 shows a schematic diagram of an application scenario of the embodiment shown in FIG. 18;
fig. 20 shows a seventh schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system provided by the present application;
FIG. 21 shows a schematic diagram of an application scenario of the embodiment shown in FIG. 20;
FIG. 22 is a flowchart illustrating a first implementation of an embodiment of a gaze estimation device performance evaluation method provided by the present application;
FIG. 23 is a flowchart illustrating an implementation of an embodiment of step S223 in the embodiment shown in FIG. 22;
FIG. 24 is a flowchart illustrating an implementation of an embodiment of step S231 in the embodiment illustrated in FIG. 23;
FIG. 25 is a flowchart illustrating a second implementation of an embodiment of step S241 in the embodiment shown in FIG. 24;
FIG. 26 is a flow chart showing a third implementation of an embodiment of step S241 in the embodiment of FIG. 24;
FIG. 27 is a flowchart illustrating a fourth implementation of an embodiment of step S241 in the embodiment shown in FIG. 24;
FIG. 28 is a flow chart showing a fifth implementation of an embodiment of step S241 in the embodiment of FIG. 24;
fig. 29 shows a second implementation flowchart of an embodiment of the gaze estimation device performance evaluation method provided by the present application;
fig. 30 shows a third implementation flowchart of an embodiment of the gaze estimation device performance evaluation method provided by the present application;
fig. 31 shows a fourth implementation flowchart of an embodiment of the gaze estimation device performance evaluation method provided by the present application;
FIG. 32 is a flow chart illustrating implementation of an embodiment of the present application based on the embodiment shown in FIG. 30;
FIG. 33 is a flowchart illustrating an implementation of an embodiment of the present application that is based on the embodiment illustrated in FIG. 31;
fig. 34 shows a schematic structural diagram of an electronic device suitable for implementing some embodiments of the gaze estimation device performance evaluation method of the present application.
Detailed Description
The principles and spirit of the present invention will be described with reference to a number of exemplary embodiments. It is understood that these embodiments are given solely for the purpose of enabling those skilled in the art to better understand and to practice the invention, and are not intended to limit the scope of the invention in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
In addition, those skilled in the art will appreciate that embodiments of the present invention may be implemented as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
Through research, the inventor of the present application found the following:
at present, a sight line estimation product is evaluated mainly in a simple way of giving accurate or inaccurate estimation by a user according to the actual experience of the user and the actual fixation point of the user and the result given by a system. In DMS application, many sight estimation products do not give specific fixation points, but give rough fixation directions, so that a user can only visually see large directions accurately, the specific precision cannot be quantitatively analyzed, and certain difficulty is caused for the user to select products and developers to improve the precision.
The inventor further found through research that the premise of evaluating a gaze estimation product is having the user's true gaze information to compare against. High-accuracy user gaze information must therefore be provided; only a high-accuracy reference can be used to evaluate the accuracy of other gaze estimation products. After a large number of experiments, the inventor found that whether a sufficiently large gaze detection range can be provided directly affects whether high-accuracy user gaze information can be provided.
Following this observation, the inventor carried out research and development around how to provide a sufficiently large gaze detection range, and finally arrived at a method that uses several modules to emit infrared light from different angles to illuminate the user and to capture user images. By integrating the images captured by each module, high-precision user gaze information can be calculated. Using this high-precision information as the ground truth, the gaze estimation products currently on the market can be evaluated and their calculated results quantitatively analyzed.
The invention is now illustrated clearly and completely by the following specific embodiments.
Application examples
Referring to fig. 1, an exemplary application scenario 100 for gaze estimation device performance evaluation is shown.
As shown in fig. 1, the application scenario 100 includes a gaze estimation device 101, a display screen 102 and a host computer 103. The display screen 102 is connected to the host computer 103 by a wired or wireless data connection, and the host computer 103 controls a gaze point S (also called a calibration point) displayed on the screen. The gaze estimation device 101 is fixed on or near the display screen 102; it captures images of a user 104 in front of the screen, processes the captured images to obtain user gaze information, and compares that information with the actual gaze information of the user 104 gazing at the point on the screen 102, thereby evaluating how well the gaze estimation device performs gaze estimation.
Specifically, the host computer 103 may include a processor, a memory, an interface and a bus; the memory and the interface are each connected to the processor through the bus, and external devices such as a keyboard, a display and a mouse can be connected through the interface. The host computer may run various computer programs and applications, such as an operating system, a gaze estimation program and an image processing program.
In the above application scenario, the reference for the performance evaluation of the gaze estimation device is the user's real gaze information while gazing at a given calibration point. However, for gaze estimation devices that do not output an accurate gaze point but only an approximate gaze direction, performance cannot be quantitatively evaluated in this way.
System embodiment
Referring to fig. 2, a schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system of the present application is shown. This embodiment can quantitatively evaluate the performance of gaze estimation devices that cannot provide a specific gaze point; accordingly, the system may generally be deployed on the host computer 103 shown in fig. 1.
As shown in fig. 2, the gaze estimation device performance evaluation system 200 includes a data acquisition unit to be tested, at least two modules, an initial gaze determination unit, a true gaze determination unit and a comparison unit. The data acquisition unit to be tested is configured to acquire the user gaze information that the gaze estimation device obtains by capturing and processing user images; each module is configured to emit infrared light from a different angle to illuminate the user and to capture user images; the initial gaze determination unit is configured to calculate user gaze information from the user images captured by each module, each module corresponding to one set of user gaze information; the true gaze determination unit is configured to receive the sets of user gaze information corresponding to the modules and select one set as the user's true gaze information; and the comparison unit is configured to compare the user gaze information calculated by the gaze estimation device with the user's true gaze information and evaluate the performance of the gaze estimation device according to the comparison result.
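A minimal sketch of how these units could cooperate in software; estimate_gaze, select_true_gaze and compare are hypothetical stand-ins for the determination and comparison units, not names from the patent.

```python
def evaluate_device(device_gaze_info, module_images,
                    estimate_gaze, select_true_gaze, compare):
    # initial gaze determination: one set of gaze information per module image
    gaze_sets = [estimate_gaze(img) for img in module_images]
    # true gaze determination: select one set as the user's true gaze
    true_gaze = select_true_gaze(gaze_sets)
    # comparison: evaluate the device under test against the selected truth
    return compare(device_gaze_info, true_gaze)
```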
For clarity and a complete understanding of the technical principles of this embodiment, an application scenario of this embodiment is described below by way of example.
Please refer to fig. 3, which shows a schematic diagram of an application scenario of the gaze estimation device performance evaluation system of the present application. In the application scenario of fig. 3, the number of modules 202 is three. The positions of the gaze estimation device 101 and of each module are relatively fixed, and each is connected to the host computer 103. The gaze estimation device 101 captures user images and calculates user gaze information; similarly, the three modules 202 also capture images of the user. The data acquisition unit to be tested, the initial gaze determination unit, the true gaze determination unit and the comparison unit are arranged in the host computer 103, which processes the user images captured by the modules 202 to obtain several sets of user gaze information, selects one set as the user's true gaze information, and compares it with the user gaze information calculated by the gaze estimation device 101 in order to evaluate the performance of the gaze estimation device 101.
In this application scenario, user images are captured by several modules simultaneously, providing a sufficiently large gaze detection range; user gaze information is calculated for each module, and one set is selected as the user's true gaze information.
In addition, since the several modules in the example of fig. 3 provide a sufficiently large gaze detection range, user gaze information may be computable from the user images captured by two or more modules at once. When determining the user's true gaze information, the question then arises of how to choose among the several sets so that the selected set is the most faithful one. To further improve the accuracy of the user's true gaze information, the present application therefore provides the following embodiments.
In an exemplary embodiment, referring to fig. 4, a second schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system provided by the present application is shown. As shown in fig. 4, the difference from the embodiment shown in fig. 2 is that in this embodiment the gaze estimation device performance evaluation system 200 includes an initial gaze determination unit 401 and a true gaze determination unit 402, where the initial gaze determination unit 401 is further configured to determine the confidence of each set of user gaze information, and the true gaze determination unit 402 is further configured to receive the confidences of the sets of user gaze information and select the set with the highest confidence as the user's true gaze information.
In this embodiment, when user gaze information is calculated from the captured user images, the confidence of each calculated set is also determined. When one set is to be selected as the user's true gaze information from the several sets calculated from the images captured by the modules, the selection is made according to confidence, which further improves the precision of the user's true gaze information.
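A minimal sketch of this confidence-based selection, assuming each set of user gaze information arrives paired with the confidence computed by the initial gaze determination unit:

```python
def select_true_gaze(gaze_sets, confidences):
    # pick the set of user gaze information with the highest confidence
    best = max(range(len(gaze_sets)), key=lambda i: confidences[i])
    return gaze_sets[best]
```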
Specifically, since the confidence of the user gaze information is determined from the captured user image, please refer to fig. 5, which shows a schematic diagram of an embodiment of the initial gaze determination unit 401 in the embodiment shown in fig. 4. As shown in fig. 5, the difference from the embodiment shown in fig. 4 is that the initial gaze determination unit 401 is embodied as an initial gaze determination unit 501, which determines the confidence of the corresponding set of user gaze information by analyzing the user image captured by each module.
Further, please refer to fig. 6, which shows a first schematic diagram of an embodiment of the initial gaze determining unit 501 in the embodiment shown in fig. 5.
As shown in fig. 6, in the gaze estimation device performance evaluation system 200 of the present application, the initial gaze determination unit 501 is embodied as an initial gaze determination unit 601, which extracts an eye region image from the user image captured by each module, determines the ellipse corresponding to the pupil in the eye region image, analyzes the shape features of that ellipse, and determines the confidence of the corresponding set of user gaze information from those shape features.
Specifically, when the user's gaze is directed at a module's camera, the pupil in the eye region image appears more circular; otherwise it appears more elliptical. The ellipticity of the ellipse fitted to the pupil can therefore be calculated in the initial gaze determination unit 601 as a basis for the confidence of the user gaze information corresponding to that user image.
It should be understood that the ellipticity of the ellipse, i.e. the ratio of its major axis to its minor axis, is not described in detail here: the parameters of the pupil's shape can be identified from the eye region image by conventional techniques in the field, and the ellipticity then follows directly.
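As an illustration only, a sketch of such an ellipticity score using OpenCV's ellipse fitting; the fixed threshold and the use of the minor/major axis ratio as the score are assumptions, since the patent does not prescribe an implementation:

```python
import cv2

def pupil_ellipticity_confidence(eye_gray):
    # threshold the dark pupil region (the value 50 is an assumption;
    # a real system would tune or adapt it to the image)
    _, mask = cv2.threshold(eye_gray, 50, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    pupil = max(contours, key=cv2.contourArea)
    if len(pupil) < 5:  # cv2.fitEllipse needs at least 5 contour points
        return 0.0
    _, axes, _ = cv2.fitEllipse(pupil)
    minor, major = sorted(axes)
    # a pupil that appears circular (ratio near 1) suggests the gaze is
    # directed toward this module's camera, so the score is higher
    return minor / major if major > 0 else 0.0
```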
Further, please refer to fig. 7, which illustrates a second schematic diagram of an embodiment of the initial gaze determining unit 501 in the embodiment illustrated in fig. 5.
As shown in fig. 7, in the gaze estimation device performance evaluation system 200 of the present application, the initial gaze determination unit 501 is embodied as an initial gaze determination unit 701, which extracts an eye region image from the user image captured by each module, locates in it the light spot produced by infrared light reflected from the user's eye, analyzes the shape features of the spot, and determines the confidence of the corresponding set of user gaze information from those features.
Specifically, the light spot is a Purkinje image: a high-brightness reflection spot formed on the cornea when the infrared light source illuminates the eyeball. From the optics of the eyeball, when the user's gaze faces a given infrared light source, the spot in the captured eye image is smaller in area and has sharper edges. The confidence of the corresponding set of user gaze information can therefore be determined from the shape features of the spot, for example from its computed area or from the sharpness of its edge pixels: the smaller the area, or the sharper the edge, the higher the confidence of the corresponding user gaze information; conversely, the lower the confidence.
It should be understood that computing the spot area or measuring the sharpness of the spot edge can be achieved by conventional techniques in the field; this embodiment does not limit the specific method for calculating the spot area or edge definition, so it is not described in detail here.
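A hedged sketch of a spot-based score combining the two cues named above (small area, sharp edge); the brightness threshold and the way the cues are combined are assumptions for illustration:

```python
import cv2
import numpy as np

def glint_confidence(eye_gray):
    # glints are small, very bright corneal reflections; threshold near
    # white (the value 220 is an assumption)
    _, mask = cv2.threshold(eye_gray, 220, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    spot = max(contours, key=cv2.contourArea)
    area = cv2.contourArea(spot)
    # edge sharpness: mean gradient magnitude along the spot contour
    gx = cv2.Sobel(eye_gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(eye_gray, cv2.CV_32F, 0, 1)
    mag = cv2.magnitude(gx, gy)
    sharpness = float(np.mean([mag[y, x] for x, y in spot.reshape(-1, 2)]))
    # smaller area and sharper edge -> higher confidence
    return sharpness / (1.0 + area)
```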
Further, please refer to fig. 8, which shows a third schematic diagram of an embodiment of the initial gaze determining unit 501 in the embodiment shown in fig. 5.
As shown in fig. 8, in the gaze estimation device performance evaluation system 200 of the present application, the initial gaze determination unit 501 is embodied as an initial gaze determination unit 801, which extracts an eye region image from the user image captured by each module, locates the pupil and the light spot produced by infrared light reflected from the user's eye, determines the relative positions of pupil and spot in each of the user's two eyes, compares the degree of coincidence of these relative positions, and determines the confidence of the corresponding set of user gaze information from that degree of coincidence.
Specifically, from the optics of the eyeball, the pupil is aligned with the gaze direction; if the gaze is directed straight at a light source (for example, an infrared source), the spot position in the captured eye image should be close to the pupil position. Accordingly, the confidence of the corresponding set of user gaze information can be determined from the degree of coincidence by calculating the vector length between pupil and spot in each of the user's eyes: the smaller the length, the higher the degree of coincidence of spot and pupil positions, and thus the higher the confidence of the corresponding user gaze information; conversely, the lower the confidence.
It should be understood that identifying the positions of the spot and the pupil in the extracted eye region image and computing the two-point vector length can be achieved by conventional techniques in the field; this embodiment does not limit the specific method of computing the vector or distance between spot and pupil, so it is not described in detail here.
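A minimal sketch of the coincidence measure, assuming pupil and spot centers for both eyes have been detected upstream; the reciprocal scoring is an illustrative choice:

```python
import numpy as np

def pupil_glint_confidence(pupil_left, glint_left, pupil_right, glint_right):
    # (x, y) image coordinates of pupil center and spot center per eye
    d_left = np.linalg.norm(np.subtract(pupil_left, glint_left))
    d_right = np.linalg.norm(np.subtract(pupil_right, glint_right))
    # shorter pupil-to-spot vectors mean the gaze is closer to facing the
    # light source, so confidence falls as the distances grow
    return 1.0 / (1.0 + d_left + d_right)
```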
Further, please refer to fig. 9, which shows a fourth schematic diagram of an embodiment of the initial gaze determining unit 501 in the embodiment shown in fig. 5.
As shown in fig. 9, in the gaze estimation device performance evaluation system 200 of the present application, the initial gaze determination unit 501 is embodied as an initial gaze determination unit 901, which extracts an eye region image from the user image captured by each module, determines the visual axes of the user's two eyes from the eye region image, calculates the deviation between the two visual axes, and determines the confidence of the corresponding set of user gaze information from that deviation.
Specifically, since the visual axes of the eyes are related to the gaze direction, if the user is closer to the camera, the included angle between the two visual axes in the eye image is smaller. The confidence of the corresponding set of user gaze information is therefore determined from the deviation of the visual axes: obtain the visual axes of both eyes from the extracted eye region image and calculate the included angle between them; the smaller the angle, the higher the confidence of the corresponding user gaze information; conversely, the lower the confidence.
It should be understood that determining the visual axes of the user's eyes from the extracted eye region image and calculating the included angle between them can be achieved by conventional techniques in the field; this embodiment does not limit the specific method, so it is not described in detail here.
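A minimal sketch of the visual-axis criterion, assuming unit 3D direction vectors for each eye's visual axis are estimated upstream; mapping the angle linearly to a 0..1 score is an illustrative choice:

```python
import numpy as np

def visual_axis_confidence(axis_left, axis_right):
    # included angle between the two eyes' visual axes
    cos = np.clip(np.dot(axis_left, axis_right), -1.0, 1.0)
    angle = np.arccos(cos)
    # the smaller the angle, the higher the confidence
    return 1.0 - angle / np.pi
```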
Further, please refer to fig. 10, which shows a fifth schematic diagram of an embodiment of the initial gaze determining unit 501 in the embodiment shown in fig. 5.
As shown in fig. 10, in the gaze estimation device performance evaluation system 200 of the present application, the initial gaze determination unit 501 is embodied as an initial gaze determination unit 1001, which determines the orientation of the user's head from the user image captured by each module, determines the deviation between the head orientation and the angle from which that module captures the user image, and determines the confidence of the corresponding set of user gaze information from that deviation.
Specifically, as known from facial image processing, the angle of a module with respect to the user can be determined through the camera coordinate system, and the user's head orientation can be obtained by face analysis. The more squarely the user's face points at the camera, the closer the head orientation in the corresponding image is to the direction from which the module captures the image, i.e. the smaller the included angle between the two directions. The confidence of the corresponding set of user gaze information is therefore determined from this deviation: calculate the included angle between the head orientation and the module's capture direction; the smaller the angle, the higher the confidence of the corresponding set of user gaze information; conversely, the lower the confidence.
It should be understood that determining the user's head orientation and the angle at which a module captures the user image from the captured image can be achieved by conventional techniques in the field, and is therefore not described in detail here.
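A minimal sketch of the head-orientation criterion, assuming a unit head-orientation vector and a unit vector from the user's head toward the module's camera, both in the same coordinate system; the linear mapping to a score is illustrative:

```python
import numpy as np

def head_orientation_confidence(head_dir, to_camera_dir):
    # angle between where the head points and where the camera lies
    cos = np.clip(np.dot(head_dir, to_camera_dir), -1.0, 1.0)
    angle = np.degrees(np.arccos(cos))
    # the more squarely the face points at the camera, the smaller the
    # angle and the higher the confidence
    return 1.0 - angle / 180.0
```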
In addition, when several modules are used to provide a sufficiently large gaze detection range, the light sources of different modules may in some cases affect one another, producing multiple light spots in the eye region of a captured user image, which hinders the spot-based confidence determination described above. To this end, the present application also provides the following embodiments.
Referring to fig. 11, a first schematic structural diagram of an embodiment of the module 202 in the embodiment shown in fig. 2 is shown. As shown in fig. 11, a single module 202 may include a camera 111 and a light source 113, where the light source generates infrared light and directs it at the user, and the camera captures user images.
Specifically, the number of cameras in this module 202 is one, but the present application does not limit the number of cameras per module; for example, there may also be two or more.
Referring to fig. 12, a second schematic structural diagram of an embodiment of the module 202 shown in fig. 2 is shown. As shown in fig. 12, in this embodiment the module 202 includes cameras 121 and 122 and a light source 123, and further includes a controller 124 connected to the cameras 121 and 122 and to the light source 123. The controller 124 is responsible for controlling the cameras 121, 122 and the light source 123, including triggering, exposure, gain and lighting, and also for communicating data to the initial gaze determination unit (or host computer). The data may be a user image captured by a camera, a user image processed by the controller 124, or computed data output by an image processor on the controller 124.
It should be understood that the module 202 shown in fig. 12 need not include the controller 124; a controller in a device connected to the module 202 can take over its functions. For example, when the module is connected to a computer (e.g. the host computer 103 shown in fig. 1), the computer can replace the controller 124 in controlling the cameras 121 and 122 and the light source 123.
Based on the above teaching of fig. 12, in an exemplary embodiment the gaze estimation device performance evaluation system may also be integrated into the module 202. For example, referring to fig. 13, which shows a schematic diagram of an application scenario of the gaze estimation device performance evaluation system, in the scenario of fig. 13 the data acquisition unit to be tested 201, the initial gaze determination unit 203, the true gaze determination unit 204 and the comparison unit 205 are arranged in the controller 124, and the module 202 includes the cameras 121 and 122 and the light source 123. Of course, the number of modules 202 may be one, two or more; the present application does not limit this.
This embodiment can be implemented when the controller meets the computational requirements; the controller may be a microcomputer, for example an integrated computer system such as a single-chip microcomputer.
Further, when the module structure of the embodiment shown in fig. 11 or fig. 12 is adopted, the camera and light source in each module can be controlled so as to avoid multiple light spots caused by different light sources interfering with one another.
In an exemplary embodiment, see fig. 14, which shows a schematic structural diagram of a further embodiment of the gaze estimation device performance evaluation system provided by the present application. As shown in fig. 14, the difference from the embodiment shown in fig. 2 is that when the light sources of the modules 202 generate infrared light of the same wavelength, the system 200 further includes a module control unit 141, which operates the modules that have a spot interference relationship in a time-division manner, so that the user image captured by each module contains at most the spot produced by that module's own infrared light reflected from the user's eyes. Modules have a spot interference relationship if, when infrared light of the same wavelength illuminates the user simultaneously, the user image captured by at least one module shows spots produced by the infrared light of other modules reflected from the user's eyes.
For example, in one application scenario, see fig. 15, which illustrates the module control unit 141 performing time-division control of the modules in the embodiment shown in fig. 14. As shown in fig. 15, suppose there are three modules 151, 152 and 153, with the module control unit 141 controlling the triggering of the cameras and light sources in each. When one frame of the user image is captured, the light sources of modules 151 and 153 are turned on and their cameras are exposed, while the light source and camera of module 152 are not triggered; when the next frame is captured, the light source of module 152 is turned on and its camera is exposed, while the light sources and cameras of modules 151 and 153 are not triggered. Thus, of the interfering pairs 151/152 and 152/153, only one module is in operation at any time, which effectively prevents the light sources from affecting one another.
Of course, the control of the modules 151, 152 and 153 is not limited to the above time-division scheme. For example, time division may also proceed as follows: when one frame of the user image is captured, the light source of module 151 is turned on and its camera exposed, while the light sources and cameras of modules 152 and 153 are not triggered; on the next frame, module 152 is active while modules 151 and 153 are not; on the frame after that, module 153 is active while modules 151 and 152 are not; and so on in rotation. In this way, only one module operates during each image integration period, and the operating times of modules 151, 152 and 153 are staggered, effectively preventing the light sources from affecting one another. A sketch of such scheduling is shown below.
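A minimal sketch of the time-division scheduling just described, assuming hypothetical Module objects whose trigger() turns on the light source and exposes the camera for one frame (nothing here is from the patent):

```python
import itertools
import time

def timeshare_capture(modules, groups, frame_period_s=1 / 30):
    # groups: indices of modules allowed to fire in the same frame, e.g.
    # [(0, 2), (1,)] for the 151/153-together scheme above, or
    # [(0,), (1,), (2,)] for the strictly one-at-a-time scheme
    for group in itertools.cycle(groups):  # runs until externally stopped
        for i in group:
            modules[i].trigger()    # only this group's light sources are on
        time.sleep(frame_period_s)  # all other modules stay untriggered
```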
In an exemplary embodiment, see fig. 16, which shows a schematic structural diagram of another embodiment of the gaze estimation device performance evaluation system provided by the present application. As shown in fig. 16, the difference from the embodiment shown in fig. 2 is that when the light sources of the modules with a spot interference relationship generate infrared light of different wavelengths, the cameras of those modules are fitted with optical filters 161, 162 and 163 that pass the infrared light generated by their own module and block the infrared light of different wavelengths generated by the other modules, so that the user image captured by each module contains at most the spot produced by its own infrared light reflected from the user's eyes. As before, modules have a spot interference relationship if, when infrared light of the same wavelength illuminates the user simultaneously, the user image captured by at least one module shows spots produced by the infrared light of other modules reflected from the user's eyes.
For example, in one application scenario, with reference to fig. 16, the difference from the scenario shown in fig. 15 is that the cameras of modules 151, 152 and 153 are fitted with corresponding filters 161, 162 and 163, where modules 151 and 153 generate infrared light of one wavelength (e.g. 940 nm) and module 152 generates infrared light of a different wavelength. The filters 161 and 163 pass only the wavelength generated by modules 151 and 153 and block the wavelength generated by module 152; likewise, filter 162 passes only the wavelength generated by module 152 and blocks that of modules 151 and 153. In this embodiment the light sources of modules 151, 152 and 153 are controlled by wavelength division, so that the user image captured by each module contains only the spots formed by its own light sources, effectively preventing the light sources from affecting one another.
Of course, the wavelength-division control of the light sources of modules 151, 152 and 153 is not limited to the above. For example, it may also be implemented as follows: modules 151, 152 and 153 each generate infrared light of a different wavelength, and each of the filters 161, 162 and 163 passes only the wavelength generated by its own module's light source while blocking the other modules' wavelengths. Then the user image captured by each module contains only the spots formed by its own light sources, effectively preventing the light sources from affecting one another.
In addition, in consideration of convenience of evaluation with the gaze estimation device performance evaluation system, the following embodiments are also provided.
In an exemplary embodiment, please refer to fig. 17, which shows a schematic diagram of an embodiment of the gaze estimation device performance evaluation system provided by the present application; for ease of explanation only the differences from the embodiment shown in fig. 2 are shown, and identical structure is omitted. As shown in fig. 17, the difference from the embodiment shown in fig. 2 is that in this embodiment the system 200 further includes at least one screen 171 for providing an area for the user to gaze at. In this case, the data acquisition unit to be tested acquires the user gaze information that the gaze estimation device obtains by capturing and processing user images while the user gazes at the screen, and each module, besides emitting infrared light from a different angle to illuminate the user, captures the user image while the user gazes at the screen.
By adding a screen, this embodiment provides the user 104 with a specific gaze area, so that the gaze estimation device 101 and the user gaze information calculated by the present application are based on the same gaze area; comparing the deviation between the two then yields a quantitative performance evaluation of the gaze estimation device.
In view of the situation of the embodiment of fig. 1, in the embodiment of fig. 17 the area provided for the user's gaze may be either a determined calibration point or an undetermined one (e.g. a general direction or area). The two cases are illustrated clearly and completely below by means of different embodiments.
In an exemplary embodiment, building on the embodiment shown in fig. 17, when the area provided for the user's gaze is a determined calibration point, please refer to fig. 18, which shows a schematic structural diagram of an embodiment of the gaze estimation device performance evaluation system provided by the present application (for ease of explanation only the differences from the embodiment of fig. 2 are shown, and identical structure is omitted). As shown in fig. 18, the difference from the embodiment shown in fig. 2 is that the gaze estimation device performance evaluation system 18 specifically includes: a data acquisition unit to be tested 181, configured to acquire the user gaze direction that the gaze estimation device obtains by capturing and processing user images while the user gazes at a calibration point at a known position on the screen; at least two modules 182, each of which captures a user image while the user gazes at that calibration point; an initial gaze determination unit 183, configured to determine the user's gaze starting point from the user image captured by each module, take the line connecting the gaze starting point and the calibration point as the user's gaze direction in the camera coordinate system of the corresponding module, and treat it as the user gaze direction corresponding to that module, each module corresponding to one set of user gaze information; a true gaze determination unit 184, configured to select one gaze direction from the user gaze directions corresponding to the modules and determine it as the user's true gaze direction; and a comparison unit 185, configured to convert the user's true gaze direction and the gaze direction calculated by the gaze estimation device into the same coordinate system, calculate the deviation between them in that coordinate system, and evaluate the performance of the gaze estimation device according to the deviation.
To facilitate understanding of this embodiment, please refer to fig. 19, which shows a schematic diagram of an application scenario of the embodiment shown in fig. 18. As shown in fig. 19, a calibration point S is provided to the user on the screen 171; of course, this calibration point S may appear at a random position on the screen. Since the positions of the module 182 and the gaze estimation device 101 are fixed relative to the screen 171, the image coordinate systems of the module 182 and the gaze estimation device 101 can both be converted into the same screen coordinate system; alternatively, the coordinate system of one of them can be converted into the other's, and the performance of the gaze estimation device evaluated there.
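A hedged sketch of how the true gaze direction can be formed in the fig. 18/19 setup: the ray from the detected gaze starting point to the known calibration point, with the screen-to-camera extrinsics assumed known because the module is fixed relative to the screen:

```python
import numpy as np

def true_gaze_direction(eye_pos_cam, target_screen,
                        R_screen_to_cam, t_screen_to_cam):
    # eye_pos_cam: 3D gaze starting point in the module's camera frame;
    # target_screen: 3D calibration point in screen coordinates
    target_cam = R_screen_to_cam @ np.asarray(target_screen) + t_screen_to_cam
    d = target_cam - np.asarray(eye_pos_cam)
    return d / np.linalg.norm(d)  # unit gaze direction in the camera frame
```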
In another exemplary embodiment, building on the embodiment shown in fig. 17, when the area provided for the user's gaze is an undetermined calibration point or an approximate area, please refer to fig. 20, which shows a schematic structural diagram of another embodiment of the gaze estimation device performance evaluation system provided by the present application (for ease of explanation only the differences from the embodiment of fig. 2 are shown, and identical structure is omitted). As shown in fig. 20, the difference from the embodiment shown in fig. 2 is that the gaze estimation device performance evaluation system 20 includes: a data acquisition unit to be tested 2001, configured to acquire the user gaze direction that the gaze estimation device obtains by capturing and processing user images while the user gazes at an arbitrary point; at least two modules 2002, each of which captures a user image while the user gazes at that arbitrary point; an initial gaze determination unit 2003, configured to predict the user's gaze direction from the user image captured by each module; a true gaze determination unit 2004, configured to select one direction from the user gaze directions predicted for each module and determine it as the user's true gaze direction; and a comparison unit 2005, configured to convert the user's true gaze direction and the gaze direction calculated by the gaze estimation device into the same coordinate system, calculate the deviation between them in that coordinate system, and evaluate the performance of the gaze estimation device according to the deviation.
To facilitate understanding of this embodiment, please refer to fig. 21, which shows a schematic diagram of an application scenario of the embodiment shown in fig. 20. As shown in fig. 21, a general direction area Q is provided for the user on the screen 171; of course, this area Q may appear at a random position on the screen 171. Since the positions of the module 2002 and the gaze estimation device 101 are fixed relative to the screen 171, the image coordinate systems of the module 2002 and the gaze estimation device 101 can both be converted into the same screen coordinate system; alternatively, the coordinate system of one of them can be converted into the other's, and the performance of the gaze estimation device evaluated there.
Specifically, in the examples shown in fig. 18 and fig. 20, when the user gaze information calculated by the gaze estimation device is evaluated against the user's true gaze information calculated by the present application, the included angle between the true gaze direction and the device's gaze direction can be calculated in the same coordinate system, or equivalently the cosine of that angle: the larger the angle (i.e. the smaller its cosine), the worse the accuracy of the gaze estimation device is judged to be; conversely, the better.
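A minimal sketch of this deviation computation, once both directions are expressed in the same coordinate system (as described above):

```python
import numpy as np

def angular_error_deg(true_dir, device_dir):
    t = np.asarray(true_dir, dtype=float)
    d = np.asarray(device_dir, dtype=float)
    cos = np.dot(t, d) / (np.linalg.norm(t) * np.linalg.norm(d))
    # larger angle (smaller cosine) -> worse device accuracy
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
```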
Method embodiment
Based on the same inventive concept as the system embodiments, the present application correspondingly also provides gaze estimation device performance evaluation methods.
Referring to fig. 22, which shows a flowchart of an implementation of an embodiment of the gaze estimation device performance evaluation method provided by the present application; the method may be executed by, for example, the host computer 103 shown in fig. 1.
As shown in fig. 22, the sight-line estimation device performance evaluation method includes the steps of:
step S221, acquiring user sight information obtained by acquiring and calculating user images by sight estimation equipment;
step S222, utilizing at least two modules to emit infrared light from different angles to irradiate the user and collect the user image;
step S223, calculating user sight information according to the user image collected by each module, wherein each module corresponds to one group of user sight information;
step S224, selecting a group of user sight line information from the multiple groups of user sight line information corresponding to each module to determine the user sight line information as the real user sight line information;
and step S225, comparing the user sight line information obtained by calculation of the sight line estimation equipment with the real user sight line information, and evaluating the performance of the sight line estimation equipment according to the comparison result.
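By way of illustration only, the following minimal Python sketch shows one way steps S221 to S225 could be glued together; the names, the confidence tags, and the selection-by-highest-confidence strategy (detailed in the next embodiment) are illustrative assumptions, not the prescribed method:

def evaluate_gaze_device(device_gaze, module_gazes, confidences, deviation_fn):
    # Step S224 (one possible strategy, detailed in the next embodiment):
    # take the module result with the highest confidence as the real
    # user sight line information.
    best = max(range(len(module_gazes)), key=lambda i: confidences[i])
    real_gaze = module_gazes[best]
    # Step S225: compare the device output with the chosen ground truth
    # and report the deviation as the performance score.
    return deviation_fn(real_gaze, device_gaze)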
In an exemplary embodiment, please refer to fig. 23, which shows a flowchart for implementing an embodiment of step S223 in the embodiment shown in fig. 22.
As shown in fig. 23, in the present embodiment, the step S223 specifically includes:
step S231, calculating user sight line information according to the user image collected by each module, wherein each module corresponds to one group of user sight line information, and determining the confidence of each group of user sight line information;
accordingly, the step S224 specifically includes:
step S232, selecting the user sight line information with the highest confidence from the multiple groups of user sight line information corresponding to the modules, and determining it as the real user sight line information.
In an exemplary embodiment, please refer to fig. 24, which shows a flowchart for implementing an embodiment of step S231 in the embodiment shown in fig. 23.
As shown in fig. 24, in step S231, determining the confidence of each group of user sight line information specifically includes:
step S241, determining the confidence of the corresponding group of user sight line information by analyzing the user image acquired by each module.
In an exemplary embodiment, in step S241, determining the confidence of the corresponding group of user gaze information by analyzing the user image collected by each module specifically includes:
step S251, extracting an eye region image from the user image collected by each module;
step S252, determining an ellipse corresponding to the pupil from the eye region image, and analyzing the shape characteristics of the ellipse corresponding to the pupil;
step S253, determining the confidence of the corresponding group of user gaze information according to the shape feature of the ellipse corresponding to the pupil.
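A hedged Python/OpenCV sketch of this pupil-shape criterion is given below; it assumes a grayscale eye-region crop, OpenCV 4 return signatures, and an illustrative intensity threshold, and it reads the "shape features" as the minor/major axis ratio, which is one plausible choice rather than the prescribed formula:

import cv2

def pupil_shape_confidence(eye_region_gray):
    # Illustrative confidence from the pupil ellipse: a pupil viewed
    # frontally projects as a near-circle, so the minor/major axis ratio
    # serves as the score (1.0 = circular, lower = oblique or noisy).
    _, mask = cv2.threshold(eye_region_gray, 40, 255, cv2.THRESH_BINARY_INV)  # pupil is dark
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0.0
    pupil = max(contours, key=cv2.contourArea)  # assume the largest dark blob is the pupil
    if len(pupil) < 5:                          # cv2.fitEllipse needs at least 5 points
        return 0.0
    _, axes, _ = cv2.fitEllipse(pupil)          # ((cx, cy), (ax1, ax2), angle)
    minor, major = min(axes), max(axes)
    return float(minor / major) if major > 0 else 0.0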
In an exemplary embodiment, please refer to fig. 25, which shows a flowchart of a second implementation of the step S241 in the embodiment shown in fig. 24.
As shown in fig. 25, in step S241, determining the confidence of the corresponding group of user sight line information by analyzing the user image collected by each module specifically includes:
step S261, extracting an eye region image from the user image acquired by each module;
step S262, determining light spots which appear due to infrared light reflected by the eyes of the user from the eye region image, and analyzing the shape characteristics of the light spots;
and step S263, determining the confidence of the corresponding group of user sight line information according to the shape characteristics of the light spots.
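Analogously, a sketch of the glint-shape criterion under the same assumptions; the circularity measure 4πA/P² is one reasonable shape feature, not necessarily the one intended here:

import math
import cv2

def glint_shape_confidence(eye_region_gray):
    # Illustrative confidence from the corneal glint: a clean corneal
    # reflection is a small, near-circular bright spot, so the circularity
    # 4*pi*area/perimeter^2 of the brightest blob serves as the score.
    _, mask = cv2.threshold(eye_region_gray, 220, 255, cv2.THRESH_BINARY)  # glints are bright
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0.0
    glint = max(contours, key=cv2.contourArea)
    area = cv2.contourArea(glint)
    perimeter = cv2.arcLength(glint, True)
    if perimeter == 0:
        return 0.0
    return float(4.0 * math.pi * area / perimeter ** 2)  # 1.0 = perfect circle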
In an exemplary embodiment, please refer to fig. 26, which shows a third implementation flow chart of an embodiment of step S241 in the embodiment shown in fig. 24.
As shown in fig. 26, in step S241, determining the confidence of the corresponding group of user sight line information by analyzing the user image collected by each module specifically includes:
step S271, extracting eye region images from the user images collected by each module;
step S272, determining pupils and light spots appearing due to infrared light reflected by the eyes of the user from the eye region image;
step S273, determining the relative position of the pupil and the light spot in each of the user's two eyes, and comparing the degree of agreement between these relative positions across the two eyes;
in step S274, the confidence level of the corresponding set of user gaze information is determined according to the degree of agreement.
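One plausible reading of this binocular-consistency check is sketched below with hypothetical pixel-coordinate inputs; the mapping from divergence to score is an illustrative choice:

import numpy as np

def pupil_glint_consistency(left_pupil, left_glint, right_pupil, right_glint):
    # Illustrative agreement score: when both eyes fixate the same point,
    # the pupil-to-glint offset vector should be similar in both eyes, so
    # the score falls as the two offsets (in pixels) diverge.
    offset_left = np.asarray(left_glint, float) - np.asarray(left_pupil, float)
    offset_right = np.asarray(right_glint, float) - np.asarray(right_pupil, float)
    divergence = np.linalg.norm(offset_left - offset_right)
    return 1.0 / (1.0 + divergence)  # 1.0 = identical relative positions

# Example: offsets (3, 2) px and (3, 3) px differ by 1 px -> score 0.5.
print(pupil_glint_consistency((100, 50), (103, 52), (200, 50), (203, 53)))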
In an exemplary embodiment, please refer to fig. 27, which shows a flowchart of a fourth implementation of the step S241 in the embodiment shown in fig. 24.
As shown in fig. 27, in step S241, determining the confidence of the corresponding group of user sight line information by analyzing the user image collected by each module specifically includes:
step S281, extracting an eye region image from the user image collected by each module;
step S282, determining the visual axes of the user's two eyes from the eye region image and calculating the deviation between them;
and step S283, determining the confidence of the corresponding group of user gaze information according to the deviation of the visual axes of the two eyes of the user.
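A sketch of this criterion, assuming both visual axes are available as 3D direction vectors; the 10-degree normalisation cap is an illustrative parameter:

import numpy as np

def binocular_axis_confidence(left_axis, right_axis, max_expected_deg=10.0):
    # Illustrative confidence from binocular consistency: for a shared
    # fixation target the two visual axes nearly agree, so the score
    # decreases linearly with their angular deviation.
    l = np.asarray(left_axis, dtype=float)
    r = np.asarray(right_axis, dtype=float)
    cos_a = np.dot(l, r) / (np.linalg.norm(l) * np.linalg.norm(r))
    deviation_deg = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return max(0.0, 1.0 - deviation_deg / max_expected_deg)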
In an exemplary embodiment, please refer to fig. 28, which shows a flowchart of a fifth implementation of an embodiment of step S241 in the embodiment shown in fig. 24.
As shown in fig. 28, in step S241, determining the confidence of the corresponding group of user sight line information by analyzing the user image collected by each module specifically includes:
step S291, determining the head orientation of the user according to the user image collected by each module;
step S292, determining the deviation between the user's head orientation and the angle at which each module captures the user image;
step S293, determining the confidence of the corresponding group of user sight line information according to the deviation between the user's head orientation and the angle at which each module captures the user image.
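This criterion can be sketched as follows, assuming the head orientation and each module's optical axis are available as 3D vectors in a shared frame; the 60-degree cap is illustrative:

import numpy as np

def head_pose_confidence(head_dir, camera_axis, max_expected_deg=60.0):
    # Illustrative confidence from head pose: a module that the user's
    # head roughly faces captures more frontal eye images. The camera's
    # optical axis points toward the user, so a head facing the camera
    # satisfies head_dir ~= -camera_axis.
    h = np.asarray(head_dir, dtype=float)
    c = np.asarray(camera_axis, dtype=float)
    cos_a = np.dot(h, -c) / (np.linalg.norm(h) * np.linalg.norm(c))
    angle_deg = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return max(0.0, 1.0 - angle_deg / max_expected_deg)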
In an exemplary embodiment, with reference to the embodiments shown in fig. 11 and fig. 21, each module may specifically include: a light source for generating infrared light and causing the infrared light to illuminate the user; and a camera for capturing user images.
With reference to the embodiments shown in fig. 11 and fig. 21, please refer now to fig. 29, which shows a flowchart of a second implementation of an embodiment of the gaze estimation device performance evaluation method provided by the present application.
As shown in fig. 29, when the light sources in the respective modules generate infrared light of the same wavelength, the sight line estimation device performance evaluation method further includes:
step S302, controlling the modules having a light spot interference relationship to work in a time-sharing manner, so that the user image collected by each module contains at most light spots caused by the infrared light generated by that module itself being reflected by the user's eyes;
wherein modules have a light spot interference relationship if, when they illuminate the user simultaneously with infrared light of the same wavelength, the user image collected by at least one of them contains light spots caused by infrared light generated by another module being reflected by the user's eyes.
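Since the interference relationship can be viewed as a graph over the modules, time-sharing reduces to assigning interfering modules to different capture slots. Below is a minimal greedy Python sketch; the slot-assignment policy is an illustrative choice, not the prescribed control scheme:

def assign_time_slots(num_modules, interference_pairs):
    # Step S302 sketch: modules whose light spots interfere are placed in
    # different capture slots (greedy graph colouring); non-interfering
    # modules may share a slot and illuminate/capture simultaneously.
    neighbours = {m: set() for m in range(num_modules)}
    for a, b in interference_pairs:
        neighbours[a].add(b)
        neighbours[b].add(a)
    slots = {}
    for m in range(num_modules):
        taken = {slots[n] for n in neighbours[m] if n in slots}
        slot = 0
        while slot in taken:
            slot += 1
        slots[m] = slot
    return slots  # module index -> time slot index

# Example: modules 0 and 1 interfere, module 2 is independent.
print(assign_time_slots(3, [(0, 1)]))  # {0: 0, 1: 1, 2: 0}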
With reference to the embodiments shown in fig. 11 and fig. 21, and with reference to fig. 30, a third implementation flow chart of an embodiment of the method for evaluating performance of a gaze estimation device provided by the present application is shown.
As shown in fig. 30, when the light sources in the respective modules having a light spot interference relationship generate infrared light of different wavelengths, the sight line estimation device performance evaluation method further includes:
step S311, equipping the camera in each module having a light spot interference relationship with an optical filter that passes the infrared light generated by its own module and filters out the infrared light of different wavelengths generated by the other modules, so that the user image collected by each module contains at most light spots caused by the infrared light generated by that module itself being reflected by the user's eyes;
wherein modules have a light spot interference relationship if, when they illuminate the user simultaneously with infrared light of the same wavelength, the user image collected by at least one of them contains light spots caused by infrared light generated by another module being reflected by the user's eyes.
In an exemplary embodiment, please refer to fig. 31, which shows a flowchart for implementing a fifth embodiment of the gaze estimation apparatus performance evaluation method provided by the present application.
As shown in fig. 31, and with reference to the embodiment shown in fig. 22, the gaze estimation device performance evaluation method in this embodiment further includes:
step S321, providing an area at which the user gazes by using at least one screen;
the step S221 may specifically include step S322: acquiring user sight line information obtained by the sight line estimation device by collecting a user image while the user gazes at the screen and calculating from it;
the step S222 may specifically include step S323: collecting, by the at least two modules, the user image while the user gazes at the screen.
In an exemplary embodiment, please refer to fig. 32, which shows a flowchart of an implementation of an embodiment of the present application based on the embodiment shown in fig. 31.
As shown in fig. 32, and with reference to the embodiment shown in fig. 22, the gaze estimation device performance evaluation method in this embodiment further includes:
step S331, acquiring a user sight direction obtained by the sight estimation device by collecting a user image while the user gazes at a calibration point at a known position in the screen and calculating from it;
step S332, collecting, by the at least two modules, a user image while the user gazes at the calibration point at the known position in the screen;
step S333, determining a user sight line starting point according to the user image collected by each module, determining, in the camera coordinate system of the corresponding module, the line connecting the user sight line starting point and the calibration point gazed at by the user as the user gaze direction, and taking it as the user sight direction corresponding to that module;
step S334, selecting one user sight direction from the user sight directions corresponding to the modules and determining it as the real user sight direction;
step S335, converting the real user gaze direction and the user gaze direction calculated by the gaze estimation device into the same coordinate system, calculating a deviation between the real user gaze direction and the user gaze direction calculated by the gaze estimation device in the same coordinate system, and evaluating the performance of the gaze estimation device according to the deviation.
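A sketch of the geometry in steps S333 and S335, assuming the sight line starting point and the calibration point are known in a module's camera frame and that the camera-to-screen rotation is known from calibration; all names are illustrative:

import numpy as np

def real_gaze_direction(gaze_origin_cam, calib_point_cam):
    # Step S333 sketch: in the module's camera coordinate system, the real
    # gaze direction is the unit vector from the estimated sight line
    # starting point (e.g. an eye-centre estimate) to the calibration point.
    d = np.asarray(calib_point_cam, float) - np.asarray(gaze_origin_cam, float)
    return d / np.linalg.norm(d)

def direction_to_screen_frame(direction_cam, rot_cam_to_screen):
    # Step S335 sketch: gaze directions are free vectors, so converting
    # them into a common screen coordinate system only needs the rotation
    # part of the camera-to-screen extrinsics, not the translation.
    return np.asarray(rot_cam_to_screen, float) @ np.asarray(direction_cam, float)

With both directions expressed in the screen frame, the deviation of step S335 reduces to the angular comparison sketched earlier.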
In an exemplary embodiment, please refer to fig. 33, which shows a flowchart of an implementation of an embodiment of the present application based on the embodiment shown in fig. 31.
As shown in fig. 33, and with reference to the embodiment shown in fig. 22, the gaze estimation device performance evaluation method in this embodiment further includes:
step S341, acquiring a user sight direction obtained by acquiring and calculating a user image when the user gazes at any point by the sight estimation device;
step S342, collecting the user image when the user gazes at the arbitrary point by using each module;
step S343, predicting the user sight direction according to the user image collected by each module;
step S344, selecting one sight direction from the user sight directions predicted by the modules and determining it as the real user sight direction;
step S345, converting the real gaze direction of the user and the gaze direction of the user calculated by the gaze estimation device into the same coordinate system, calculating a deviation between the real gaze direction of the user and the gaze direction of the user calculated by the gaze estimation device in the same coordinate system, and evaluating the performance of the gaze estimation device according to the deviation.
Apparatus embodiment
Referring to fig. 34, which shows a schematic structural diagram of an electronic device 35 suitable for implementing some embodiments of the sight line estimation device performance evaluation method of the present application. The electronic device shown in fig. 34 is only an example and should not impose any limitation on the functions or scope of application of the embodiments of the present application. For example, the electronic device 35 shown in this embodiment may be a sight line estimation device performance evaluation device, or the host computer shown in fig. 1.
As shown in fig. 34, the electronic device 35 may include a processor 351, a memory 352, and a computer program 353 stored in the memory 352 and operable on the processor 351. When executing the computer program 353, the processor 351 implements the steps in each of the above embodiments of the sight line estimation device performance evaluation method, such as steps S221 to S225 shown in fig. 22.
The electronic device 35 may include, but is not limited to, the processor 351 and the memory 352. Those skilled in the art will appreciate that fig. 34 is merely an example of the electronic device 35 and does not constitute a limitation of it; the electronic device 35 may include more or fewer components than those shown, a combination of certain components, or differently arranged components, and may, for example, further include input-output devices, network access devices, buses, etc.
Illustratively, the computer program 353 may be divided into one or more modules/units, which are stored in the memory 352 and executed by the processor 351 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the segments being used to describe the execution of the computer program 353 in the electronic device 35.
The Processor 351 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 352 may be an internal storage unit of the electronic device 35, such as a hard disk or an internal memory of the electronic device 35. The memory 352 may also be an external storage device of the electronic device 35, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, or a Flash memory Card (Flash Card) provided on the electronic device 35. Further, the memory 352 may include both an internal storage unit and an external storage device of the electronic device 35. The memory 352 is used for storing the computer program and other programs and data required by the electronic device 35, and may also be used to temporarily store data that has been output or is to be output.
In an exemplary implementation, the present application further provides a computer-readable storage medium, on which a computer program 353 is stored, where the computer program 353, when executed by a processor, implements the line-of-sight estimation device performance evaluation method described in any one of the method embodiments.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
While the spirit and principles of the invention have been described with reference to several particular embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, nor does the division into aspects imply that features in those aspects cannot be combined to advantage; that division is for convenience of presentation only. The invention is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (17)

1. A gaze estimation device performance evaluation system, comprising:
a to-be-detected data acquisition unit, configured to acquire user sight line information obtained by the sight line estimation device by collecting user images and calculating from them;
at least two modules, each configured to emit infrared light from a different angle to illuminate the user and to collect user images;
an initial sight line determining unit, configured to calculate user sight line information according to the user image collected by each module, wherein each module corresponds to one group of user sight line information;
a real sight line determining unit, configured to receive the multiple groups of user sight line information corresponding to the modules and to select one group from them as the real user sight line information; and
a comparing unit, configured to compare the user sight line information calculated by the sight line estimation device with the real user sight line information and to evaluate the performance of the sight line estimation device according to the comparison result.
2. The gaze estimation device performance evaluation system of claim 1, characterized in that the initial sight line determining unit is further configured to determine a confidence level for each group of user sight line information; and
the real sight line determining unit is further configured to receive the confidence levels of the groups of user sight line information and to select the group with the highest confidence as the real user sight line information.
3. The gaze estimation device performance evaluation system of claim 2, wherein the initial gaze determination unit determines the confidence level for each set of user gaze information, comprising:
the initial sight line determining unit determines the confidence degree of the corresponding group of user sight line information by analyzing the user image acquired by each module.
4. The sight line estimation device performance evaluation system according to claim 3, wherein the initial sight line determination unit determines the confidence level of the corresponding set of user sight line information by analyzing the user image acquired by each module, including:
the initial sight line determining unit extracts eye region images from the user images acquired by each module, determines ellipses corresponding to pupils from the eye region images, analyzes shape features of the ellipses corresponding to the pupils, and determines confidence degrees of corresponding groups of user sight line information according to the shape features of the ellipses corresponding to the pupils.
5. The sight line estimation device performance evaluation system according to claim 3, wherein the initial sight line determination unit determines the confidence level of the corresponding set of user sight line information by analyzing the user image acquired by each module, including:
the initial sight determining unit extracts eye region images from the user images acquired by each module, determines light spots appearing due to infrared light reflected by the eyes of the user from the eye region images, analyzes the shape characteristics of the light spots, and determines the confidence degree of the corresponding group of user sight information according to the shape characteristics of the light spots.
6. The sight line estimation device performance evaluation system according to claim 3, wherein the initial sight line determination unit determines the confidence level of the corresponding set of user sight line information by analyzing the user image acquired by each module, including:
the initial sight line determining unit extracts eye region images from the user images collected by each module, determines pupils and light spots appearing due to infrared light reflected by the eyes of the user from the eye region images, respectively determines relative positions of the pupils and the light spots in the two eyes of the user, compares the degree of coincidence of the relative positions of the pupils and the light spots in the two eyes of the user, and determines the confidence degree of the corresponding group of user sight line information according to the degree of coincidence.
7. The sight line estimation device performance evaluation system according to claim 3, wherein the initial sight line determination unit determines the confidence level of the corresponding set of user sight line information by analyzing the user image acquired by each module, including:
the initial sight line determining unit extracts eye region images from the user images acquired by each module, respectively determines visual axes of two eyes of the user from the eye region images, calculates the deviation of the visual axes of the two eyes of the user, and determines the confidence coefficient of the corresponding group of user sight line information according to the deviation of the visual axes of the two eyes of the user.
8. The sight line estimation device performance evaluation system according to claim 3, wherein the initial sight line determination unit determines the confidence level of the corresponding set of user sight line information by analyzing the user image acquired by each module, including:
the initial sight line determining unit determines the head orientation of a user according to the user image acquired by each module, determines the deviation between the head orientation of the user and the angle of the user image acquired by each module, and determines the confidence degree of the corresponding group of user sight line information according to the deviation between the head orientation of the user and the angle of the user image acquired by each module.
9. The sight line estimation apparatus performance evaluation system according to claim 1, wherein each module includes:
a light source for generating infrared light and causing the infrared light to illuminate a user;
a camera for capturing a user image.
10. The sight line estimation device performance evaluation system according to claim 9, wherein the light sources in the respective modules generate infrared light of the same wavelength, and the sight line estimation device performance evaluation system further comprises: a module control unit, configured to control the modules having a light spot interference relationship to work in a time-sharing manner, so that the user image collected by each module contains at most light spots caused by the infrared light generated by that module itself being reflected by the user's eyes; wherein modules have a light spot interference relationship if, when they illuminate the user simultaneously with infrared light of the same wavelength, the user image collected by at least one of them contains light spots caused by infrared light generated by another module being reflected by the user's eyes.
11. The sight line estimation device performance evaluation system according to claim 9, wherein the light sources in the respective modules having a light spot interference relationship generate infrared light of different wavelengths, and the camera in each such module employs an optical filter that passes the infrared light generated by its own module and filters out the infrared light of different wavelengths generated by the other modules, so that the user image collected by each module contains at most light spots caused by the infrared light generated by that module itself being reflected by the user's eyes; wherein modules have a light spot interference relationship if, when they illuminate the user simultaneously with infrared light of the same wavelength, the user image collected by at least one of them contains light spots caused by infrared light generated by another module being reflected by the user's eyes.
12. The gaze estimation device performance evaluation system of claim 1, further comprising: at least one screen for providing an area at which the user gazes; wherein:
the to-be-detected data acquisition unit being configured to acquire user sight line information obtained by the sight line estimation device by collecting user images and calculating from them includes: the to-be-detected data acquisition unit acquiring user sight line information obtained by the sight line estimation device by collecting a user image while the user gazes at the screen and calculating from it;
each module being configured to emit infrared light from a different angle to illuminate the user and to collect user images includes: each module collecting the user image while the user gazes at the screen.
13. The gaze estimation device performance evaluation system of claim 12,
the to-be-detected data acquisition unit acquiring user sight line information obtained by the sight line estimation device by collecting a user image while the user gazes at the screen and calculating from it includes: the to-be-detected data acquisition unit acquiring a user sight direction obtained by the sight line estimation device by collecting a user image while the user gazes at a calibration point at a known position in the screen and calculating from it;
each module collecting the user image while the user gazes at the screen includes: each module collecting the user image while the user gazes at the calibration point at the known position in the screen;
the initial sight line determining unit being configured to calculate user sight line information according to the user images collected by the modules includes: the initial sight line determining unit determining a user sight line starting point according to the user image collected by each module, determining, in the camera coordinate system of the corresponding module, the line connecting the user sight line starting point and the calibration point gazed at by the user as the user gaze direction, and taking it as the user sight direction corresponding to that module;
the real sight line determining unit being configured to receive the multiple groups of user sight line information corresponding to the modules and to select one group as the real user sight line information includes: the real sight line determining unit selecting one user sight direction from the user sight directions corresponding to the modules and determining it as the real user sight direction;
the comparing unit being configured to compare the user sight line information calculated by the sight line estimation device with the real user sight line information and to evaluate the performance of the sight line estimation device according to the comparison result includes: the comparing unit converting the real user sight direction and the user sight direction calculated by the sight line estimation device into the same coordinate system, calculating the deviation between them in that coordinate system, and evaluating the performance of the sight line estimation device according to the deviation.
14. The sight line estimation apparatus performance evaluation system according to claim 1,
the to-be-detected data acquisition unit being configured to acquire user sight line information obtained by the sight line estimation device by collecting user images and calculating from them includes: the to-be-detected data acquisition unit acquiring a user sight direction obtained by the sight line estimation device by collecting a user image while the user gazes at an arbitrary point and calculating from it;
each module being configured to emit infrared light from a different angle to illuminate the user and to collect user images includes: each module collecting the user image while the user gazes at the arbitrary point;
the initial sight line determining unit being configured to calculate user sight line information according to the user images collected by the modules includes: the initial sight line determining unit predicting the user sight direction according to the user image collected by each module;
the real sight line determining unit being configured to receive the multiple groups of user sight line information corresponding to the modules and to select one group as the real user sight line information includes: the real sight line determining unit selecting one sight direction from the user sight directions predicted by the modules and determining it as the real user sight direction;
the comparing unit being configured to compare the user sight line information calculated by the sight line estimation device with the real user sight line information and to evaluate the performance of the sight line estimation device according to the comparison result includes: the comparing unit converting the real user sight direction and the user sight direction calculated by the sight line estimation device into the same coordinate system, calculating the deviation between them in that coordinate system, and evaluating the performance of the sight line estimation device according to the deviation.
15. A sight line estimation device performance evaluation method, comprising:
acquiring user sight information obtained by acquiring and calculating user images by sight estimation equipment;
emitting infrared light from different angles by utilizing at least two modules to irradiate a user and collecting a user image;
calculating user sight line information according to the user image acquired by each module, wherein each module corresponds to one group of user sight line information;
selecting one group of user sight line information from the multiple groups of user sight line information corresponding to the modules and determining it as the real user sight line information;
and comparing the user sight line information obtained by calculation of the sight line estimation equipment with the real user sight line information, and evaluating the performance of the sight line estimation equipment according to a comparison result.
16. A sight line estimation device performance evaluation device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the processor implements the sight line estimation device performance evaluation method according to claim 15 when executing the computer program.
17. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the sight line estimation device performance evaluation method according to claim 15.
CN202010016069.9A 2020-01-08 2020-01-08 Sight estimation equipment performance evaluation method, system and equipment Pending CN111208904A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010016069.9A CN111208904A (en) 2020-01-08 2020-01-08 Sight estimation equipment performance evaluation method, system and equipment


Publications (1)

Publication Number Publication Date
CN111208904A true CN111208904A (en) 2020-05-29

Family

ID=70787161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010016069.9A Pending CN111208904A (en) 2020-01-08 2020-01-08 Sight estimation equipment performance evaluation method, system and equipment

Country Status (1)

Country Link
CN (1) CN111208904A (en)


Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004129684A (en) * 2002-10-08 2004-04-30 Nippon Telegr & Teleph Corp <Ntt> Sight line measuring accuracy evaluation device and, method therefor, and sight line measuring accuracy evaluation program and recording medium with the program recorded therein
JP2012037934A (en) * 2010-08-03 2012-02-23 Canon Inc Visual line detection device, visual line detection method and program
US20130154918A1 (en) * 2011-12-20 2013-06-20 Benjamin Isaac Vaught Enhanced user eye gaze estimation
CN104113680A (en) * 2013-04-19 2014-10-22 北京三星通信技术研究有限公司 Sight line tracking system and method
JP2015088828A (en) * 2013-10-29 2015-05-07 ソニー株式会社 Information processing device, information processing method, and program
US20150227789A1 (en) * 2014-02-10 2015-08-13 Sony Corporation Information processing apparatus, information processing method, and program
US20160063304A1 (en) * 2014-08-29 2016-03-03 Alps Electric Co., Ltd. Line-of-sight detection apparatus
US20160063303A1 (en) * 2014-09-02 2016-03-03 Hong Kong Baptist University Method and apparatus for eye gaze tracking
US20180089508A1 (en) * 2016-04-13 2018-03-29 Panasonic Intellectual Property Management Co., Ltd. Visual line measuring device and visual line measuring method
WO2018000020A1 (en) * 2016-06-29 2018-01-04 Seeing Machines Limited Systems and methods for performing eye gaze tracking
WO2018030515A1 (en) * 2016-08-12 2018-02-15 国立大学法人静岡大学 Line-of-sight detection device
US20190172222A1 (en) * 2016-08-12 2019-06-06 National University Corporation Shizuoka University Line-of-sight detection device
DE102016013806A1 (en) * 2016-11-18 2017-05-24 Daimler Ag System and method for detecting a viewing direction of a driver in a vehicle
CN106774863A (en) * 2016-12-03 2017-05-31 西安中科创星科技孵化器有限公司 A kind of method that Eye-controlling focus are realized based on pupil feature
US9940518B1 (en) * 2017-09-11 2018-04-10 Tobii Ab Reliability of gaze tracking data for left and right eye
WO2019050543A1 (en) * 2017-09-11 2019-03-14 Tobii Ab Reliability of gaze tracking data for left and right eye
JP2019171022A (en) * 2018-03-26 2019-10-10 株式会社Jvcケンウッド Evaluation device, evaluation method, and evaluation program
WO2019187808A1 (en) * 2018-03-29 2019-10-03 ソニー株式会社 Information processing device, information processing method, and program
CN110537897A (en) * 2019-09-10 2019-12-06 北京未动科技有限公司 Sight tracking method and device, computer readable storage medium and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Dongping et al.: "Gaze direction detection of the human eye based on Purkinje spots" (in Chinese), pages 498-500 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination