CN112528713A - Method, system, processor and equipment for estimating fixation point - Google Patents

Method, system, processor and equipment for estimating fixation point

Info

Publication number
CN112528713A
Authority
CN
China
Prior art keywords
pupil
calculating
image
pcr vector
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910887940.XA
Other languages
Chinese (zh)
Inventor
王云飞 (Wang Yunfei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd filed Critical Beijing 7Invensun Technology Co Ltd
Priority to CN201910887940.XA
Publication of CN112528713A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 — Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 — Eye characteristics, e.g. of the iris
    • G06V40/193 — Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention discloses a method and a system for estimating a gaze point, applied to a gaze tracking device having a single camera and a single coaxial light source, comprising the following steps: acquiring an original image captured by the single camera; collecting human eye feature information from the original image, and calculating pupil-spot center data of both eyes based on the human eye feature information; calculating an initial PCR vector based on the pupil-spot center data, wherein the PCR vector represents the vector pointing from the spot center to the pupil center; normalizing the initial PCR vector with a preset distance factor to obtain a target PCR vector; and calculating the gaze point information according to the target PCR vector. Because only pupil-spot center data are applied when performing gaze point estimation, the requirement that two groups of light sources be present is removed, so the number of light sources in existing gaze tracking devices can be reduced, making the devices smaller and lighter.

Description

Method, system, processor and equipment for estimating fixation point
Technical Field
The present invention relates to the field of information processing technologies, and in particular, to a method, a system, a processor, and a device for estimating a gaze point.
Background
With the development of human-computer interaction technology, eyeball tracking technology is widely applied. Eye tracking, also known as gaze tracking, is a technique for estimating the gaze and/or point of regard of an eye by measuring eye movement.
Existing gaze tracking technology is generally based on a multi-light-source single-camera or multi-light-source multi-camera arrangement. Light sources generally fall into two types: in one, the light source is separated from the camera position, producing a normal pupil in the face image (a dark pupil), and such a source is called a dark-pupil light source; in the other, the light source is coaxial with the camera, so light is reflected back through the pupil and the pupil appears bright in the image (a bright pupil). The combination is generally several dark-pupil sources, several dark-pupil sources with one bright-pupil source, or one dark-pupil source with one bright-pupil source. The distance between the two light sources must be set relatively large, so gaze tracking devices produced by existing methods are bulky and cannot meet users' demands for miniaturization and light weight.
Disclosure of Invention
In view of the above problems, the present invention provides a gaze point estimation method and system that perform gaze point estimation based on a single light source, thereby reducing the number of light sources in existing gaze tracking devices and realizing miniaturization and light weight of the gaze tracking device.
To achieve the above object, the present invention provides the following technical solution:
A gaze point estimation method, applied to a gaze tracking device having a single camera and a single coaxial light source, comprising:
acquiring an original image captured by the single camera;
collecting human eye feature information from the original image, and calculating pupil-spot center data of both eyes based on the human eye feature information;
calculating an initial PCR vector based on the pupil-spot center data, wherein the PCR vector represents the vector pointing from the spot center to the pupil center;
normalizing the initial PCR vector with a preset distance factor to obtain a target PCR vector, wherein the preset distance factor represents a normalization parameter used to normalize the initial PCR vector;
and calculating the gaze point information according to the target PCR vector.
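The claimed steps after feature extraction — initial PCR vector, normalization by a distance factor, and mapping to a gaze point — can be sketched as follows. The numeric values and the toy linear mapping are illustrative assumptions standing in for the trained model; the pupil and spot centers are assumed already located:

```python
import numpy as np

def estimate_gaze_point(pupil_center, glint_center, distance_factor, mapping_fn):
    """Sketch of the claimed pipeline after feature extraction."""
    # Initial PCR vector: points from the spot (glint) center to the pupil center.
    pcr = np.asarray(pupil_center, float) - np.asarray(glint_center, float)
    # Normalize with the preset distance factor (e.g. squared inter-pupil distance).
    target_pcr = pcr / distance_factor
    # Map the target PCR vector to a gaze point; mapping_fn stands in for the model.
    return mapping_fn(target_pcr)

# Toy linear mapping for illustration only.
gaze = estimate_gaze_point((310.0, 240.0), (300.0, 236.0),
                           distance_factor=100.0,
                           mapping_fn=lambda v: 1000.0 * v)
```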
Optionally, the pupil-spot center data of both eyes includes pupil image coordinates and spot image coordinates, and the collecting of the human eye feature information of the original image and the calculating of the pupil-spot center data of both eyes based on it include:
collecting human eye characteristic information of the original image;
according to the human eye feature information, pupil image features and light spot image features of two eyes are obtained;
calculating to obtain pupil image coordinates according to the pupil image characteristics;
and calculating to obtain the coordinates of the light spot image according to the characteristics of the light spot image.
Optionally, the preset distance factor represents a function of a distance parameter, where the distance parameter includes the distance between the pupils of both eyes, the distance between the spots of both eyes, or the distance between designated feature points of both eyes.
Optionally, the acquiring the raw image captured from the single camera includes acquiring the raw image captured from the single camera according to a set exposure gain, and the method further includes:
calculating the average gray value of a pupil area according to the original image;
and judging, according to the average gray value, whether to adjust the set exposure gain, so that the obtained original image meets the spot searching condition.
Optionally, the determining whether to adjust the set exposure gain according to the average gray value includes:
and judging whether the average gray value exceeds a preset gray threshold value, and if so, adjusting the set exposure gain.
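The optional gray-threshold check above can be sketched as a small function. The threshold value of 200 is an illustrative assumption; the claim only specifies comparing the average gray value against a preset gray threshold:

```python
def should_adjust_exposure(pupil_region, gray_threshold=200.0):
    """Return True when the set exposure gain needs adjusting.

    pupil_region: 2-D list of gray values covering the pupil area.
    gray_threshold: preset gray threshold (value here is illustrative).
    """
    # Average gray value of the pupil region.
    avg_gray = sum(map(sum, pupil_region)) / (
        len(pupil_region) * len(pupil_region[0]))
    # Exceeding the preset threshold triggers a gain adjustment.
    return avg_gray > gray_threshold

# An over-exposed pupil region triggers adjustment; a dim one does not.
bright = [[250, 252], [255, 251]]
dim = [[40, 38], [35, 42]]
```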
Optionally, the method further comprises:
adjusting the set exposure gain to obtain a target exposure gain;
and controlling the single camera to acquire images according to the target exposure gain, so that the acquired original images meet the target exposure gain.
Optionally, the calculating, according to the target PCR vector, to obtain the gazing point information includes:
and calculating to obtain the fixation point information according to a preset mapping relation and the target PCR vector, wherein the preset mapping relation represents the mapping relation between the PCR vector and the fixation point and/or the fixation direction.
A gaze point estimation system for use with a gaze tracking device having a single camera and a single coaxial light source, comprising:
an acquisition unit for acquiring an original image captured from the single camera;
the first calculation unit is used for acquiring the human eye characteristic information of the original image and calculating pupil facula center data of two eyes based on the human eye characteristic information;
the second calculation unit is used for calculating and obtaining an initial PCR vector based on the pupil facula center data, and the PCR vector represents a vector of which the facula center points to the pupil center;
the normalization unit is used for performing normalization processing on the initial PCR vector by using a preset distance factor to obtain a target PCR vector, and the preset distance factor represents a normalization parameter for performing normalization processing on the initial PCR vector;
and the third calculating unit is used for calculating and obtaining the gazing point information according to the target PCR vector.
Optionally, the first computing unit includes:
the acquisition subunit is used for acquiring the human eye characteristic information of the original image;
the acquisition subunit is used for acquiring pupil image characteristics and light spot image characteristics of two eyes according to the human eye characteristic information;
the first calculating subunit is used for calculating and obtaining pupil image coordinates according to the pupil image characteristics;
the second calculating subunit is used for calculating to obtain the coordinates of the light spot image according to the characteristics of the light spot image; the pupil and light spot center data of the two eyes comprise pupil image coordinates and light spot image coordinates.
Optionally, the acquiring unit is specifically configured to acquire the raw image captured from the single camera according to a set exposure gain, and the system further includes:
the gray value calculation unit is used for calculating the average gray value of the pupil area according to the original image;
the judging unit is used for judging, according to the average gray value, whether to adjust the set exposure gain, so that the obtained original image meets the spot searching condition;
wherein the judging unit is specifically configured to:
judging whether the average gray value exceeds a preset gray threshold value, if so, adjusting the set exposure gain;
the system further comprises:
the adjusting unit is used for adjusting the set exposure gain to obtain a target exposure gain;
and the re-acquisition unit is used for controlling the single camera to acquire images according to the target exposure gain so that the acquired original images meet the target exposure gain.
The third calculating unit is specifically configured to:
and calculating to obtain the fixation point information according to a preset mapping relation and the target PCR vector, wherein the preset mapping relation represents the mapping relation between the PCR vector and the fixation point and/or the fixation direction.
A processor for running a program, wherein the program when run performs the point of regard estimation method as described above.
An apparatus comprising a processor, a memory, and a program stored on the memory and executable on the processor, the processor when executing the program at least implementing:
acquiring a raw image captured from the single camera;
collecting the human eye characteristic information of the original image, and calculating pupil facula center data of two eyes based on the human eye characteristic information;
calculating to obtain an initial PCR vector based on the pupil facula center data, wherein the PCR vector represents a vector of the facula center pointing to the pupil center;
normalizing the initial PCR vector by using a preset distance factor to obtain a target PCR vector, wherein the preset distance factor represents a normalization parameter for normalizing the initial PCR vector;
and calculating to obtain the information of the fixation point according to the target PCR vector.
Compared with the prior art, the invention provides a method, a system, a processor and a device for estimating a gaze point. The method applies only pupil-spot center data when performing gaze point estimation: the distance information between the human eye and the camera can be estimated from a single spot plus the pupil position, without needing two spots in one eye. This removes the requirement that two groups of light sources be present, so the number of light sources in existing gaze tracking devices can be reduced, making the devices smaller and lighter.
Explanation of terms:
PCR (Pupil Corneal Reflection): the pupil-corneal reflection method, one of the optical recording methods.
The method proceeds as follows: first, an eye image with a light spot (also called a Purkinje spot) is acquired, the spot being the reflection point of the light source on the cornea; as the eyeball rotates, the relative position of the pupil center and the spot changes, and this positional change is reflected in the successively acquired spot-bearing eye images; the gaze/gaze point is then estimated from the positional change.
IPD (Inter-Pupil Distance): the distance between the pupils of the left and right eyes.
IGD (Inter-Glint Distance): the distance between two spots in an eye image.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a light source assembly in the prior art;
fig. 2 is a schematic structural diagram of a module of a gaze tracking apparatus according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a method for estimating a gaze point according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a PCR vector provided in an embodiment of the present application;
fig. 5 is a schematic flowchart of a method for calculating pupil spot center data of two eyes according to the second embodiment of the present application;
fig. 6 is a schematic flowchart illustrating an exposure gain adjustment method according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a gaze point estimation system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first" and "second," and the like in the description and claims of the present invention and the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not set forth for a listed step or element but may include steps or elements not listed.
An embodiment of the present invention provides a gaze point estimation method that can be applied in the field of eye movement tracking. Eye movement tracking, also called gaze tracking, is a technology for estimating the gaze and/or gaze point of the eye by measuring eye movement, and it requires dedicated equipment, such as an eye tracking device.
The sight line may be understood as a three-dimensional vector, and the gaze point may be understood as a two-dimensional coordinate of the three-dimensional vector projected on a certain plane. At present, an optical recording method is widely used, in which a camera or a video camera is used to record the eye movement of a subject, i.e., an eye image reflecting the eye movement is obtained, and eye features are extracted according to the obtained eye image to establish a model for estimating the sight line/fixation point. Wherein the eye features may include: pupil location, pupil shape, iris location, eyelid location, canthus location, spot (also known as purkinje spot) location, and the like.
Eye movement tracking methods can be broadly classified into two types: intrusive and non-intrusive. Current gaze tracking systems mostly adopt non-intrusive methods, among which the pupil-corneal reflection method is the most widely applied. Based on the physiological characteristics of the human eye and the principle of visual imaging, image processing techniques are used to process the acquired eye image and obtain the human-eye feature parameters for gaze estimation. Taking these feature parameters as reference points, a corresponding mapping model yields the gaze landing coordinates, realizing gaze tracking. The method has high precision, does not disturb the user, and lets the user's head rotate freely. The hardware comprises a light source and an image acquisition device. The light source is generally infrared, because infrared light does not affect vision, and it may consist of several infrared sources arranged in a preset pattern, such as a triangle or a line; the image acquisition device can be an infrared camera device, an infrared image sensor, a camera or a video camera. To overcome errors caused by asymmetric light spots, the pupil-corneal reflection method usually adopts a multi-light-source single camera or a multi-light-source multi-camera setup to achieve gaze point estimation with free head movement.
For example, light sources generally fall into the following two types: first, the light source is separated from the camera position, producing a normal pupil in the face image (a dark pupil); second, the light source is coaxial with the camera, so light is reflected back through the pupil and the camera obtains a face image with a bright pupil. Referring to fig. 1, which shows a schematic diagram of light source combinations in the prior art, it can be seen that the prior-art combination is generally several dark-pupil sources, several dark-pupil sources with one bright-pupil source, or one dark-pupil source with one bright-pupil source. To implement a multi-light-source solution, the distance between two light sources is usually larger than 150 mm, which makes existing gaze tracking devices bulky.
Therefore, the embodiments of the present application provide a gaze point estimation method applied to a gaze tracking device provided with a single camera and a single coaxial light source, since a gaze tracking device with only one light source can be small in size.
Example one
Referring to fig. 2, which shows a schematic structural diagram of a module of a gaze tracking device provided in an embodiment of the present application: the module needs only one infrared light source and one infrared camera, the infrared light source serving as a bright-pupil light source. It should be noted that for a bright pupil to appear, the light source must be located near the optical axis of the camera or on the same line as it; due to the reflection of light back through the pupil, the pupil then does not image as black but as a very bright region. Using this light source and camera greatly reduces the module size in the gaze tracking device, achieving miniaturization and light weight. The infrared light source in fig. 2 represents the single light source of the embodiment of the present application: the single light source may be one physical source, or a group of sources as shown in fig. 2 whose positions are so close together that the light behaves as a single source overall, which differs from the prior-art scene in which the distance between two light sources exceeds 150 mm.
In order to achieve the estimation of the gaze/gaze point of the gaze tracking apparatus provided with a single camera and a single coaxial light source, a gaze point estimation method is further provided in an embodiment of the present application, and referring to fig. 3, the method may include the following steps:
s101, acquiring an original image captured from a single camera.
S102, collecting human eye characteristic information of the original image, and calculating pupil facula center data of two eyes based on the human eye characteristic information.
In the above steps, the scene is illuminated by the single light source in the gaze tracking device, and an image is collected by the single camera in the gaze tracking device; the collected image is one that includes human eye feature information, such as a human eye image or a human face image. Specifically, the human eye feature information may include pupil-related information such as pupil position and pupil shape, iris-related information such as iris position and iris shape, and spot information formed in the eye by the light source's illumination.
After the human eyes are detected in the original image, the pupil-spot center data of both eyes are calculated, namely the center coordinate data of the pupil images of both eyes and the center coordinate data of the spot images of both eyes. Specifically, a two-dimensional coordinate system may be set on the eye image, and the pupil-spot center data determined according to the origin position and coordinate scale of that system.
S103, calculating to obtain an initial PCR vector based on pupil spot center data;
and S104, carrying out normalization processing on the initial PCR vector according to a preset distance factor to obtain a target PCR vector.
It should be noted that the PCR (Pupil Corneal Reflection) vector represents the vector from the spot center to the pupil center, i.e. the vector formed by connecting the spot center to the pupil center; see fig. 4, which shows a schematic diagram of the PCR vector, where the arrow represents the PCR vector from the spot center to the pupil center. Assuming the pupil center is (x1, y1) and the spot center is (x2, y2), the PCR vector is (x1 - x2, y1 - y2). The vector calculated at this point is the initial PCR vector.
In the embodiment of the application, to reduce the sensitivity of the PCR vector to the distance between the human eye and the lens, the initial PCR vector needs to be normalized by a distance factor, specifically by dividing the abscissa and ordinate values of the initial PCR vector by the distance factor. It should be noted that the distance factor is not a raw distance value, but a function of a relevant distance parameter. For example, taking the inter-pupil distance IPD (Inter-Pupil Distance) as an example, IPD² can be used as the function of the distance parameter, i.e. as the distance factor.
The distance parameter may include the inter-pupil distance of the two eyes, the spot distance between the two eyes, or the distance between any two eye feature points, for example: the inner-canthus distance of the two eyes, the outer-canthus distance of the two eyes, the distance between the upper and lower eyelids, the size of the face, the scale of facial key points, and the like. Correspondingly, in one possible implementation, the distance factor is a function of the square of the spot separation D of the two eyes, i.e. D² is used as the function of the distance parameter, i.e. as the distance factor.
In different processing procedures, other functions may be adopted, such as a cube or a square root, and the choice must be determined in combination with the eye characteristics of the specific user; the embodiment of the present invention therefore does not limit the specific form of the function of the distance parameter that serves as the normalization factor.
For example, in another possible implementation, the distance factor is a function of the distance between designated eye feature points; if that distance is d, the distance factor is d³.
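The distance-factor variants mentioned above (IPD², the squared spot separation D², and the cubed feature-point distance d³) can be sketched as small functions. As the text notes, the exact functional form is not limited by the patent and must be tuned to the user, so these exponents are just the examples given:

```python
def factor_ipd_squared(ipd):
    # IPD^2: squared inter-pupil distance, the first example in the text.
    return ipd ** 2

def factor_spot_squared(d_spots):
    # D^2: squared distance between the two eyes' spots.
    return d_spots ** 2

def factor_feature_cubed(d_feat):
    # d^3: cubed distance between designated eye feature points.
    return d_feat ** 3

def normalize_pcr(pcr, factor):
    """Divide both components of the initial PCR vector by the distance factor."""
    return (pcr[0] / factor, pcr[1] / factor)
```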
And S105, calculating to obtain the gazing point information according to the target PCR vector.
The gaze point information can be estimated by the corneal reflection method: the target PCR vector is input into a preset regression model to calculate the gaze point information; alternatively, the target PCR vector and human-eye parameters can be input into a gaze calculation model to compute the user's gaze coordinates, yielding gaze point information matched to the user. The preset regression model and the gaze calculation model can be obtained by training on samples comprising PCR vectors and gaze point information.
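The patent leaves the form of the regression model open; one common choice in PCR-based gaze estimation is a low-order polynomial regression fitted by least squares on calibration samples. A sketch under that assumption (the quadratic feature set is not taken from the patent):

```python
import numpy as np

def fit_poly_mapping(pcr_vecs, gaze_pts):
    """Fit gaze = coef . [1, x, y, x*y, x^2, y^2] by least squares.

    pcr_vecs: (N, 2) calibration PCR vectors; gaze_pts: (N, 2) known gaze points.
    """
    x, y = np.asarray(pcr_vecs, float).T
    feats = np.stack([np.ones_like(x), x, y, x * y, x**2, y**2], axis=1)
    coef, *_ = np.linalg.lstsq(feats, np.asarray(gaze_pts, float), rcond=None)
    return coef

def map_pcr(coef, pcr):
    """Apply the fitted mapping to one target PCR vector."""
    x, y = pcr
    feats = np.array([1.0, x, y, x * y, x**2, y**2])
    return feats @ coef

# Synthetic calibration grid with a linear ground-truth mapping.
pcr = [(i * 0.1, j * 0.1) for i in range(3) for j in range(3)]
gaze = [(100 * x + 5, 50 * y - 3) for x, y in pcr]
coef = fit_poly_mapping(pcr, gaze)
```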
The invention provides a gaze point estimation method: pupil-spot center data are obtained during gaze point estimation, a target PCR vector is obtained after calculation and normalization based on those data, and the gaze point information is calculated from the target PCR vector. Since only pupil-spot center data are applied, the distance information between the human eye and the camera can be estimated from a single spot plus the pupil position, without needing two spots in one eye. This removes the requirement that two groups of light sources be present, so the number of light sources in existing gaze tracking devices can be reduced, making the devices smaller and lighter.
Example two
In an embodiment of the present application, a method for calculating pupil spot center data of two eyes is provided, and referring to fig. 5, the method includes:
s201, collecting human eye characteristic information of the original image;
s202, pupil image characteristics and light spot image characteristics of two eyes are obtained according to the human eye characteristic information;
s203, calculating to obtain a pupil image coordinate according to the pupil image characteristic;
and S204, calculating to obtain the coordinates of the light spot image according to the characteristics of the light spot image.
The pupil spot center data for both eyes in this embodiment includes pupil image coordinates and spot image coordinates.
After the human eye feature information is obtained from the original image, the spot image can be found by searching the original image with the spot image features as the search condition. For example, the search may be performed on gray values: the original image is first converted to grayscale to obtain the gray value of each pixel, and the spot image in the original image is then located according to the gray value range of spot images.
After the spot image is obtained, the coordinates of the center point of the spot image may be taken as the spot image coordinates. Correspondingly, when determining the coordinates of the pupil image, the pupil image can be obtained by searching in the original image according to the characteristics of the pupil image, and the coordinates of the center of the pupil image are determined as the coordinates of the pupil image.
The pupil region is first extracted from the original image to facilitate extraction of the gaze feature parameters. Since the pupil is the darkest part of the eye image, with a very low gray value and a roughly circular shape, a low gray threshold is set and candidate pupil regions are found by comparing the image gray values with the threshold; using the gray differences between the eye skin, sclera, iris and pupil in the eye image, the image is converted into binary image information from which the pupil region can be extracted. The pupil region image contains the corneal reflection spot region, the complete pupil region and part of the iris region. Compared with other regions, the corneal reflection spot region has the highest gray value; it is small in area and bright, and the corneal reflection spots contained in each pupil region are distributed roughly circularly in the horizontal direction. Therefore, based on these characteristics of the corneal reflection spots, the pupil region is first binarized with a gray threshold, regions above the threshold are taken as candidates (i.e. the bright-spot regions of the pupil region are extracted), noise bright spots in the pupil region are then removed according to area and shape, and the spot image corresponding to the spot region is obtained.
In another possible implementation, a pupil search range may be set centered on the spot center, a pupil gray threshold calculated from the gray histogram, and the pupil center coordinates then obtained.
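The threshold-based pupil and spot extraction described above can be sketched with plain array operations. The two threshold values and the use of the mask centroid as the center are illustrative assumptions, since the patent leaves the thresholds to the implementation:

```python
import numpy as np

def centroid_of_mask(mask):
    """Centroid (x, y) of True pixels; None if the mask is empty."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def find_centers(gray, pupil_max=50, spot_min=200):
    """Pupil center from the darkest pixels, spot center from the brightest.

    gray: 2-D uint8 eye image; both thresholds are illustrative values.
    """
    pupil_center = centroid_of_mask(gray <= pupil_max)  # dark pupil region
    spot_center = centroid_of_mask(gray >= spot_min)    # bright corneal spot
    return pupil_center, spot_center
```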
Assume the pupil image coordinates are (Px, Py), the spot image coordinates are (Gx, Gy), and the PCR vector coordinates are (x, y). Then:

x = norm(Px - Gx)

y = norm(Py - Gy)
Here, norm denotes a normalization operation: to reduce the sensitivity of the PCR vector to the distance between the human eye and the lens, the PCR vector must be normalized by a distance factor. This distance factor can be a function of the interpupillary distance, the inter-spot distance, or the distance between any two eye feature points. It should be noted that the inter-spot distance in the embodiments of the present application refers to the distance between the spots of the two eyes. The normalization factor adopted in the prior art is the distance between the two spots within one eye image, usually denoted IGD (Inter-Glint Distance); in the embodiments of the present application, which use fewer light sources, no IGD exists, so for some gaze point estimation scenes a new normalization factor is needed that relies on fewer light sources while still achieving a satisfactory normalization effect. Other scale information in the image is therefore used as the normalization factor to cancel the effect of distance variation. That is, the normalization factor can be a function of information such as the interpupillary distance, the distance between the inner canthi of the two eyes, the distance between the outer canthi of the two eyes, the distance between the upper and lower eyelids, the size of the face, the distance between the spots of the two eyes, and the scale of facial key points.
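A minimal sketch of this normalization step, assuming the interpupillary distance measured in the image is chosen as the distance factor (one of the candidate factors listed above); the coordinate values below are hypothetical:

```python
import math

def normalized_pcr(pupil_xy, glint_xy, norm_factor):
    """PCR vector (glint center -> pupil center), divided by a distance
    factor so the result is insensitive to the eye-to-lens distance."""
    px, py = pupil_xy
    gx, gy = glint_xy
    return ((px - gx) / norm_factor, (py - gy) / norm_factor)

# Hypothetical pupil and glint centers for the left and right eyes, in pixels.
left_pupil, left_glint = (100.0, 120.0), (96.0, 118.0)
right_pupil, right_glint = (160.0, 121.0), (157.0, 118.0)

# Interpupillary distance in the image, used here as the normalization factor.
ipd = math.dist(left_pupil, right_pupil)

pcr_left = normalized_pcr(left_pupil, left_glint, ipd)
pcr_right = normalized_pcr(right_pupil, right_glint, ipd)
```

Because both the raw PCR vector and the interpupillary distance scale the same way with the eye-to-lens distance, doubling all image coordinates leaves the normalized PCR unchanged, which is exactly the invariance the distance factor is meant to provide.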
EXAMPLE III
Because existing gaze tracking devices use many light sources, the brightness control logic required for effective gaze point estimation is correspondingly complicated. The third embodiment of the present application provides an exposure gain control logic that only needs to operate on the gray value of the pupil region.
Referring to fig. 6, a flow chart of an exposure gain adjustment method is shown, the method comprising:
s301, calculating the average gray value of a pupil area according to the original image;
s302, judging whether to adjust the set exposure gain according to the average gray value, so that the obtained original image meets the light spot searching condition.
According to the image characteristics of the pupil region, the pupil region is located in the original image, and the corresponding image is converted into a grayscale image so that the gray value of each pixel can be obtained; the average gray value of the pupil region is then calculated. Whether this average exceeds a set gray-level threshold is judged; if so, the judgment result is fed back to the camera in the gaze tracking device so that the camera adjusts its exposure gain. If not, the camera can continue capturing images with the current exposure gain. As a specific adjustment method, the exposure-related parameters may be adjusted first; if they have been adjusted to their limit values (namely, the maximum values) and the spot searching condition still cannot be met, the gain-related parameters are then adjusted, until the gain also reaches its limit value. Each limit value represents the maximum value of the corresponding parameter.
Specifically, the camera acquires images continuously. If the gray value of the pupil region in the current frame indicates that the exposure gain needs to be adjusted, the camera starts adjusting from the current frame. How quickly the adjustment takes effect depends on the actual camera hardware; the effect may appear in the next frame or only after five frames. In actual operation, therefore, the statistic can be averaged over a window of five to ten consecutive frames, so that the decision is made over five or ten frames and situations such as insufficient adjustment are prevented. The general sign that adjustment is finished is determined by the specific condition of the image: mainly, the edges of the pupil and the spot are sharp and their contours are easy to extract with an image algorithm; specifically, the gray-level difference between the pupil and the iris region, and between the spot and the pupil, is sufficiently large.
Of course, the same goal can also be achieved by adjusting the light source, for example, by lowering the brightness of the infrared fill light to prevent spot search failure caused by over-exposure.
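The escalation order described above (adjust exposure-related parameters first; once they reach their limit, adjust gain-related parameters) might be sketched like this. The parameter names, step sizes, adjustment direction, and limit values are illustrative assumptions, not values from the patent:

```python
def adjust_for_spot_search(avg_gray, gray_thresh, cam,
                           exposure_step=1, gain_step=1):
    """Decide one adjustment step from the pupil region's average gray.

    cam is a dict holding current and maximum 'exposure' and 'gain'
    values. Returns True if a parameter was adjusted, False if no
    adjustment is needed (threshold not exceeded) or both parameters
    are already at their limit values.
    """
    if avg_gray <= gray_thresh:          # spot searching condition met
        return False
    if cam['exposure'] < cam['exposure_max']:    # exposure first ...
        cam['exposure'] = min(cam['exposure'] + exposure_step,
                              cam['exposure_max'])
        return True
    if cam['gain'] < cam['gain_max']:            # ... then gain
        cam['gain'] = min(cam['gain'] + gain_step, cam['gain_max'])
        return True
    return False                         # both at limit; nothing to adjust

# Simulate repeated frames whose pupil average gray stays above threshold.
cam = {'exposure': 8, 'exposure_max': 10, 'gain': 0, 'gain_max': 2}
steps = 0
while adjust_for_spot_search(130, 100, cam):
    steps += 1
```

In this simulation exposure climbs 8 to 10 (two steps) and gain then climbs 0 to 2 (two more steps); in practice each step would be followed by capturing new frames and re-measuring the pupil gray, as the embodiment's five-to-ten-frame averaging suggests.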
The embodiment of the present application further provides a method for calculating gaze point information, which specifically includes:
and calculating to obtain the fixation point information according to a preset mapping relation and the target PCR vector, wherein the preset mapping relation represents the mapping relation between the PCR vector and the fixation point and/or the fixation direction.
The PCR vectors of the two eyes are taken as input, and the gaze point/gaze direction is estimated and output according to the established mapping between the PCR vector and the gaze point/gaze direction, as follows:
X = a0 + a1·x + a2·x² + a3·y + a4·y² + a5·xy

Y = b0 + b1·x + b2·x² + b3·y + b4·y² + b5·xy

where x and y are the coordinates of the PCR vector in a two-dimensional coordinate system, X and Y are the coordinates of the gaze point in a two-dimensional coordinate system, and the parameters a0, a1, a2, a3, a4, a5 and b0, b1, b2, b3, b4, b5 can be fitted during calibration.
After the user's gaze point information is obtained by tracking, the corresponding user's gaze point may be displayed on the display or the display module.
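The quadratic mapping above can be fitted by least squares once calibration samples (measured PCR vectors paired with known gaze point coordinates) are available. A sketch under assumed conditions: the ground-truth coefficients and calibration points below are synthetic stand-ins for data collected during a real calibration:

```python
import numpy as np

def design_row(x, y):
    # Terms of the quadratic mapping: [1, x, x^2, y, y^2, xy].
    return [1.0, x, x * x, y, y * y, x * y]

def fit_mapping(pcr_points, gaze_points):
    """Fit a0..a5 (for X) and b0..b5 (for Y) by least squares."""
    A = np.array([design_row(x, y) for x, y in pcr_points])
    G = np.asarray(gaze_points, dtype=float)
    coeffs, *_ = np.linalg.lstsq(A, G, rcond=None)
    return coeffs   # shape (6, 2): column 0 = a's, column 1 = b's

def estimate_gaze(coeffs, pcr):
    return np.array(design_row(*pcr)) @ coeffs

# Synthetic ground-truth coefficients used to generate calibration samples.
rng = np.random.default_rng(0)
true_a = np.array([5.0, 300.0, 40.0, -20.0, 8.0, 3.0])
true_b = np.array([2.0, -15.0, 6.0, 280.0, 30.0, -4.0])
pcr_samples = rng.uniform(-0.2, 0.2, size=(12, 2))
gaze_samples = [(np.array(design_row(x, y)) @ true_a,
                 np.array(design_row(x, y)) @ true_b)
                for x, y in pcr_samples]

coeffs = fit_mapping(pcr_samples, gaze_samples)
pred = estimate_gaze(coeffs, (0.05, -0.1))
```

Since the mapping has six unknowns per output coordinate, at least six calibration samples are needed; using more (twelve here) makes the fit overdetermined and more robust to measurement noise in a real calibration.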
In the embodiments of the present application, only the pupils and the spots of the left and right eyes are used for gaze point estimation. The difference from the traditional scheme, in which the distance between the eyes and the camera is estimated from the spots of two light sources, is that this scheme uses the interpupillary distance to estimate the eye distance, thereby solving the problem that gaze estimation previously required two groups of light sources.
The single-camera, single-coaxial-light-source module provided by the embodiments of the present application can be placed below or above a computer display to achieve eye tracking for the display, or built into a mobile phone module to achieve eye tracking for the phone. Besides displays and mobile phones, other devices can adopt similar gaze tracking arrangements to meet their eye tracking requirements; these are not described in detail in the embodiments of the invention.
Example four
A fourth embodiment of the present application further provides a gaze point estimation system, applied to a gaze tracking device provided with a single camera and a single coaxial light source. Referring to fig. 7, the system includes:
an acquisition unit 10 for acquiring an original image captured from the single camera;
the first calculating unit 20 is configured to collect eye feature information of the original image, and calculate pupil spot center data of two eyes based on the eye feature information;
the second calculating unit 30 is configured to calculate and obtain an initial PCR vector based on the pupil spot center data, where the PCR vector represents a vector pointing to the pupil center from the spot center;
a normalization unit 40, configured to perform normalization processing on the initial PCR vector by using a preset distance factor to obtain a target PCR vector, where the preset distance factor represents a normalization parameter for performing normalization processing on the initial PCR vector;
and the third calculating unit 50 is configured to calculate and obtain the gazing point information according to the target PCR vector.
On the basis of the above embodiment, the first calculation unit includes:
the acquisition subunit is used for acquiring the human eye characteristic information of the original image;
the acquisition subunit is used for acquiring pupil image characteristics and light spot image characteristics of two eyes according to the human eye characteristic information;
the first calculating subunit is used for calculating and obtaining pupil image coordinates according to the pupil image characteristics;
the second calculating subunit is used for calculating to obtain the coordinates of the light spot image according to the characteristics of the light spot image; the pupil and light spot center data of the two eyes comprise pupil image coordinates and light spot image coordinates.
On the basis of the above embodiment, the preset distance factor represents a function of a distance parameter, where the distance parameter includes a distance between pupils of two eyes, a distance between faculae of two eyes, or a distance between specific feature points of two eyes.
On the basis of the above embodiment, the acquiring unit is specifically configured to acquire an original image captured from the single camera according to a set exposure gain, and the system further includes:
the gray value calculation unit is used for calculating the average gray value of the pupil area according to the original image;
the judging unit is used for judging whether to adjust the set exposure gain according to the average gray value so that the acquired original image meets the light spot searching condition;
wherein the judging unit is specifically configured to:
judging whether the average gray value exceeds a preset gray threshold value, if so, adjusting the set exposure gain;
the system further comprises:
the adjusting unit is used for adjusting the set exposure gain to obtain a target exposure gain;
and the re-acquisition unit is used for controlling the single camera to acquire images according to the target exposure gain so that the acquired original images meet the target exposure gain.
On the basis of the foregoing embodiment, the third calculating unit is specifically configured to:
and calculating to obtain the fixation point information according to a preset mapping relation and the target PCR vector, wherein the preset mapping relation represents the mapping relation between the PCR vector and the fixation point and/or the fixation direction.
The invention provides a gaze point estimation system. In the gaze point estimation process, pupil-spot center data are obtained; an initial PCR vector is calculated from these data and normalized to obtain a target PCR vector; and the gaze point information is obtained by calculation from the target PCR vector. Because only pupil-spot center data are applied when performing gaze point estimation, the distance information between the human eye and the camera can be estimated using a single spot and the pupil position, without using two pieces of spot information within one eye. This solves the problem that gaze point estimation previously required two groups of light sources, allows the number of light sources of existing gaze tracking devices to be reduced, and enables the miniaturization and light weight of gaze tracking devices.
EXAMPLE five
A fifth embodiment of the present invention provides a processor, where the processor is configured to run a program, and the program, when run, performs the gaze point estimation method described in any one of the first to third embodiments.
EXAMPLE six
A sixth embodiment of the present invention provides an apparatus, which includes a processor, a memory, and a program stored in the memory and operable on the processor, where the processor, when executing the program, implements at least the following steps:
acquiring a raw image captured from the single camera;
collecting the human eye characteristic information of the original image, and calculating pupil facula center data of two eyes based on the human eye characteristic information;
calculating to obtain an initial PCR vector based on the pupil facula center data, wherein the PCR vector represents a vector of the facula center pointing to the pupil center;
normalizing the initial PCR vector by using a preset distance factor to obtain a target PCR vector, wherein the preset distance factor represents a normalization parameter for normalizing the initial PCR vector;
and calculating to obtain the information of the fixation point according to the target PCR vector.
Further, the pupil spot center data of the two eyes includes pupil image coordinates and spot image coordinates, the acquiring of the human eye feature information of the original image and the calculating of the pupil spot center data of the two eyes based on the human eye feature information include:
collecting human eye characteristic information of the original image;
according to the human eye feature information, pupil image features and light spot image features of two eyes are obtained;
calculating to obtain pupil image coordinates according to the pupil image characteristics;
and calculating to obtain the coordinates of the light spot image according to the characteristics of the light spot image.
Further, the preset distance factor represents a function of a distance parameter, wherein the distance parameter includes a distance between pupils of two eyes, a distance between light spots of two eyes, or a distance between designated feature points of two eyes.
Further, the acquiring of the raw image captured from the single camera includes acquiring the raw image captured from the single camera according to a set exposure gain, and the method further includes:
calculating the average gray value of a pupil area according to the original image;
and judging whether the set exposure gain is adjusted or not according to the average gray value, wherein the obtained original image meets the light spot searching condition.
Further, the determining whether to adjust the set exposure gain according to the average gray scale value includes:
and judging whether the average gray value exceeds a preset gray threshold value, and if so, adjusting the set exposure gain.
Further, the method further comprises:
adjusting the set exposure gain to obtain a target exposure gain;
and controlling the single camera to acquire images according to the target exposure gain, so that the acquired original images meet the target exposure gain.
Further, the calculating to obtain the gazing point information according to the target PCR vector includes:
and calculating to obtain the fixation point information according to a preset mapping relation and the target PCR vector, wherein the preset mapping relation represents the mapping relation between the PCR vector and the fixation point and/or the fixation direction.
The device herein may be a server, a PC, a PAD, a mobile phone, etc.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (13)

1. A gaze point estimation method applied to a gaze tracking apparatus having a single camera and a single coaxial light source, comprising:
acquiring a raw image captured from the single camera;
collecting the human eye characteristic information of the original image, and calculating pupil facula center data of two eyes based on the human eye characteristic information;
calculating to obtain an initial PCR vector based on the pupil facula center data, wherein the PCR vector represents a vector of the facula center pointing to the pupil center;
normalizing the initial PCR vector by using a preset distance factor to obtain a target PCR vector, wherein the preset distance factor represents a normalization parameter for normalizing the initial PCR vector;
and calculating to obtain the information of the fixation point according to the target PCR vector.
2. The method according to claim 1, wherein the pupil spot center data of the two eyes comprises pupil image coordinates and spot image coordinates, and the acquiring the human eye feature information of the original image and calculating the pupil spot center data of the two eyes based on the human eye feature information comprises:
collecting human eye characteristic information of the original image;
according to the human eye feature information, pupil image features and light spot image features of two eyes are obtained;
calculating to obtain pupil image coordinates according to the pupil image characteristics;
and calculating to obtain the coordinates of the light spot image according to the characteristics of the light spot image.
3. The method of claim 1, wherein the predetermined distance factor characterizes a function of a distance parameter, wherein the distance parameter comprises a binocular pupil separation, a binocular spot separation, or a distance between designated feature points of both eyes.
4. The method of claim 1, wherein the acquiring the raw image captured from the single camera comprises acquiring the raw image captured from the single camera according to a set exposure gain, the method further comprising:
calculating the average gray value of a pupil area according to the original image;
and judging whether to adjust the set exposure gain or not according to the average gray value so that the acquired original image meets the light spot searching condition.
5. The method of claim 4, wherein the determining whether to adjust the set exposure gain according to the average gray-level value comprises:
and judging whether the average gray value exceeds a preset gray threshold value, and if so, adjusting the set exposure gain.
6. The method of claim 5, further comprising:
adjusting the set exposure gain to obtain a target exposure gain;
and controlling the single camera to acquire images according to the target exposure gain, so that the acquired original images meet the target exposure gain.
7. The method of claim 1, wherein the computing the gaze point information from the target PCR vector comprises:
and calculating to obtain the fixation point information according to a preset mapping relation and the target PCR vector, wherein the preset mapping relation represents the mapping relation between the PCR vector and the fixation point and/or the fixation direction.
8. A gaze point estimation system for use in a gaze tracking device having a single camera and a single coaxial light source, comprising:
an acquisition unit for acquiring an original image captured from the single camera;
the first calculation unit is used for acquiring the human eye characteristic information of the original image and calculating pupil facula center data of two eyes based on the human eye characteristic information;
the second calculation unit is used for calculating and obtaining an initial PCR vector based on the pupil facula center data, and the PCR vector represents a vector of which the facula center points to the pupil center;
the normalization unit is used for performing normalization processing on the initial PCR vector by using a preset distance factor to obtain a target PCR vector, and the preset distance factor represents a normalization parameter for performing normalization processing on the initial PCR vector;
and the third calculating unit is used for calculating and obtaining the gazing point information according to the target PCR vector.
9. The system of claim 8, wherein the first computing unit comprises:
the acquisition subunit is used for acquiring the human eye characteristic information of the original image;
the acquisition subunit is used for acquiring pupil image characteristics and light spot image characteristics of two eyes according to the human eye characteristic information;
the first calculating subunit is used for calculating and obtaining pupil image coordinates according to the pupil image characteristics;
the second calculating subunit is used for calculating to obtain the coordinates of the light spot image according to the characteristics of the light spot image; the pupil and light spot center data of the two eyes comprise pupil image coordinates and light spot image coordinates.
10. The system of claim 8, wherein the acquisition unit is specifically configured to acquire the raw image captured from the single camera according to a set exposure gain, the system further comprising:
the gray value calculation unit is used for calculating the average gray value of the pupil area according to the original image;
the judging unit is used for judging whether to adjust the set exposure gain according to the average gray value so that the acquired original image meets the light spot searching condition;
wherein the judging unit is specifically configured to:
judging whether the average gray value exceeds a preset gray threshold value, if so, adjusting the set exposure gain;
the system further comprises:
the adjusting unit is used for adjusting the set exposure gain to obtain a target exposure gain;
and the re-acquisition unit is used for controlling the single camera to acquire images according to the target exposure gain so that the acquired original images meet the target exposure gain.
11. The system according to claim 8, wherein the third computing unit is specifically configured to:
and calculating to obtain the fixation point information according to a preset mapping relation and the target PCR vector, wherein the preset mapping relation represents the mapping relation between the PCR vector and the fixation point and/or the fixation direction.
12. A processor, characterized in that the processor is configured to run a program, wherein the program when running performs the method of gaze point estimation according to any of claims 1-7.
13. An apparatus comprising a processor, a memory, and a program stored on the memory and executable on the processor, the processor when executing the program at least implementing:
acquiring a raw image captured from the single camera;
collecting the human eye characteristic information of the original image, and calculating pupil facula center data of two eyes based on the human eye characteristic information;
calculating to obtain an initial PCR vector based on the pupil facula center data, wherein the PCR vector represents a vector of the facula center pointing to the pupil center;
normalizing the initial PCR vector by using a preset distance factor to obtain a target PCR vector, wherein the preset distance factor represents a normalization parameter for normalizing the initial PCR vector;
and calculating to obtain the information of the fixation point according to the target PCR vector.
CN201910887940.XA 2019-09-19 2019-09-19 Method, system, processor and equipment for estimating fixation point Pending CN112528713A (en)

Publications (1)

Publication Number Publication Date
CN112528713A true CN112528713A (en) 2021-03-19


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002064031A2 (en) * 2001-02-09 2002-08-22 Sensomotoric Instruments Gmbh Multidimensional eye tracking and position measurement system
CN101803906A (en) * 2010-03-10 2010-08-18 中国科学院光电技术研究所 Automatic defocusing compensation human eye aberration Hartmann measuring instrument
CN103679180A (en) * 2012-09-19 2014-03-26 武汉元宝创意科技有限公司 Sight tracking method based on single light source of single camera
CN104199544A (en) * 2014-08-28 2014-12-10 华南理工大学 Targeted advertisement delivery method based on eye gaze tracking
CN105979162A (en) * 2016-07-21 2016-09-28 凌云光技术集团有限责任公司 Automatic exposure adjustment method and device for extensible dynamic range images
WO2017211066A1 (en) * 2016-06-08 2017-12-14 华南理工大学 Iris and pupil-based gaze estimation method for head-mounted device
EP3413234A1 (en) * 2017-06-09 2018-12-12 Aisin Seiki Kabushiki Kaisha Gaze-tracking device, program, and method
CN109034108A (en) * 2018-08-16 2018-12-18 北京七鑫易维信息技术有限公司 A kind of methods, devices and systems of sight estimation
CN110062168A (en) * 2019-05-05 2019-07-26 北京七鑫易维信息技术有限公司 Shooting parameter adjustment method, device, equipment and the medium of eye movement tracing equipment

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002064031A2 (en) * 2001-02-09 2002-08-22 Sensomotoric Instruments Gmbh Multidimensional eye tracking and position measurement system
CN101803906A (en) * 2010-03-10 2010-08-18 中国科学院光电技术研究所 Automatic defocusing compensation human eye aberration Hartmann measuring instrument
CN103679180A (en) * 2012-09-19 2014-03-26 武汉元宝创意科技有限公司 Sight tracking method based on single light source of single camera
CN104199544A (en) * 2014-08-28 2014-12-10 华南理工大学 Targeted advertisement delivery method based on eye gaze tracking
WO2017211066A1 (en) * 2016-06-08 2017-12-14 华南理工大学 Iris and pupil-based gaze estimation method for head-mounted device
CN105979162A (en) * 2016-07-21 2016-09-28 凌云光技术集团有限责任公司 Automatic exposure adjustment method and device for extensible dynamic range images
EP3413234A1 (en) * 2017-06-09 2018-12-12 Aisin Seiki Kabushiki Kaisha Gaze-tracking device, program, and method
CN109034108A (en) * 2018-08-16 2018-12-18 北京七鑫易维信息技术有限公司 A kind of methods, devices and systems of sight estimation
CN110062168A (en) * 2019-05-05 2019-07-26 北京七鑫易维信息技术有限公司 Shooting parameter adjustment method, device, equipment and the medium of eye movement tracing equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
宫德麟; 施家栋; 张广月; 王建中: "Design and Implementation of a Head-Mounted Eye Tracking System", 科技创新与应用 (Technology Innovation and Application), no. 31, 15 November 2015 (2015-11-15) *
沈志豪; 徐蔚; 韩秋漪; 张善端: "A Preliminary Study of a Spectral Sensitivity Model Based on the Pupillary Light Response", 照明工程学报 (China Illuminating Engineering Journal), no. 06, 31 December 2017 (2017-12-31) *
闫蓓; 吴梦瑶: "A Method for Accurate Pupil Center Localization in Low-Resolution Images", 电子测量技术 (Electronic Measurement Technology), no. 16, 31 August 2018 (2018-08-31) *

Similar Documents

Publication Publication Date Title
JP6577454B2 (en) On-axis gaze tracking system and method
US10048749B2 (en) Gaze detection offset for gaze tracking models
US9864430B2 (en) Gaze tracking via eye gaze model
US20190271858A1 (en) Method, apparatus, and computer program for establishing a representation of a spectacle lens edge
US11715231B2 (en) Head pose estimation from local eye region
JP2019519859A (en) System and method for performing gaze tracking
CN108985210A (en) A gaze tracking method and system based on geometric features of the human eye
US12056274B2 (en) Eye tracking device and a method thereof
US11385710B2 (en) Geometric parameter measurement method and device thereof, augmented reality device, and storage medium
EP3994510A1 (en) Eye tracking latency enhancements
CN110807427A (en) Sight tracking method and device, computer equipment and storage medium
US11163994B2 (en) Method and device for determining iris recognition image, terminal apparatus, and storage medium
JP6870474B2 (en) Gaze detection computer program, gaze detection device and gaze detection method
CN110051319A (en) Adjusting method, device, equipment and the storage medium of eyeball tracking sensor
CN112528714B (en) Single-light-source-based gaze point estimation method, system, processor and equipment
CN112528713A (en) Method, system, processor and equipment for estimating fixation point
KR102074977B1 (en) Electronic devices and methods thereof
JP7542563B2 (en) Eye tracking latency improvement
US20210350554A1 (en) Eye-tracking system
US11156831B2 (en) Eye-tracking system and method for pupil detection, associated systems and computer programs
WO2021095278A1 (en) Image processing method, image processing device, and image processing program
KR20210154731A (en) Method for detecting change of fundus for longitudinal analysis of fundusimage and device performing the same
CN113616153A (en) Method, device, medium and electronic equipment for measuring thickness of cornea

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination