CN117809358A - Eyeball rotation angle determination method, device, equipment and storage medium - Google Patents

Info

Publication number
CN117809358A
Authority
CN
China
Prior art keywords
eyeball
iris
model
center
rotation angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311813365.1A
Other languages
Chinese (zh)
Inventor
胡飞扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jisu Optical Technology Co ltd
Original Assignee
Beijing Jisu Optical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jisu Optical Technology Co ltd
Priority to CN202311813365.1A
Publication of CN117809358A
Legal status: Pending

Landscapes

  • Image Analysis (AREA)

Abstract

The disclosure relates to a method, a device, equipment and a storage medium for determining an eyeball rotation angle. In at least one embodiment of the present disclosure, a first iris texture feature corresponding to the eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position are obtained; further, an eyeball turning angle of the eyeball model is determined based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center; and the eyeball rotation angle of the eyeball model (its roll about the optical axis) is then determined based on the first iris texture feature, the second iris texture feature and the eyeball turning angle. In this way, based on the eyeball rotation angle of the eyeball model, the estimated line of sight can be made to coincide with the actual line of sight; in addition, targeted compensation of the optical display, targeted fine adjustment during pupil-distance adjustment, estimation of the eyeball orientation, and prediction of the imaging distribution of the display optical path on the user's retina all become possible.

Description

Eyeball rotation angle determination method, device, equipment and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of eyeball identification, in particular to a method, a device, equipment and a storage medium for determining an eyeball rotation angle.
Background
With the development of Virtual Reality (VR) technology, VR devices are equipped with an eyeball recognition system that obtains the characteristics of the eyeball, so that related functions, such as iris recognition, pupil-distance adjustment and eye orientation estimation, can be implemented based on an eyeball model.
At present, when estimating the line of sight, the prior art only calculates the turning angle of the eyeball (the change in the direction of its optical axis) and does not consider the rotation angle of the eyeball (its roll about the optical axis), so the estimated line of sight is inaccurate and deviates from the actual one. It is therefore desirable to provide a method for determining the rotation angle of an eyeball, so that the line of sight can be estimated using both the turning angle and the rotation angle of the eyeball, improving the accuracy of line-of-sight estimation.
Disclosure of Invention
At least one embodiment of the present disclosure provides a method, an apparatus, a device, and a storage medium for determining an eyeball rotation angle.
In a first aspect, an embodiment of the present disclosure provides a method for determining an eyeball rotation angle, including:
acquiring a first iris texture feature corresponding to the eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position;
determining an eyeball turning angle of the eyeball model based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center;
and determining the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle.
In some embodiments, the eyeball model is constructed based on images acquired by an infrared camera and images acquired by a visible-light camera;
and the acquiring of the first iris texture feature corresponding to the eyeball model looking at the first preset position and the second iris texture feature corresponding to the eyeball model looking at the second preset position includes:
acquiring a first image group obtained by the infrared camera and the visible-light camera respectively capturing images of the eyeball while it looks at the first preset position, and a second image group obtained by the infrared camera and the visible-light camera respectively capturing images of the eyeball while it looks at the second preset position;
and determining, based on the first image group, the first iris texture feature corresponding to the eyeball model looking at the first preset position, and determining, based on the second image group, the second iris texture feature corresponding to the eyeball model looking at the second preset position.
In some embodiments, determining the eyeball turning angle of the eyeball model based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center includes:
making the eyeball center when the eyeball model looks at the first preset position coincide with the eyeball center when the eyeball model looks at the second preset position;
and calculating the angle through which the central symmetry axis of the eyeball model looking at the first preset position must rotate to coincide with the central symmetry axis of the eyeball model looking at the second preset position, as the eyeball turning angle of the eyeball model.
In some embodiments, determining the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle comprises:
rotating the eyeball model based on the eyeball turning angle so that the central symmetry axis of the eyeball model looking at the first preset position coincides with the central symmetry axis of the eyeball model looking at the second preset position;
and, after the central symmetry axes coincide, rotating the iris texture of the eyeball model about the central symmetry axis until the first iris texture feature coincides with the second iris texture feature, thereby obtaining the eyeball rotation angle of the eyeball model.
In some embodiments, the three-dimensional spatial position information of the central symmetry axis of the eyeball model and of the eyeball center is determined by:
taking the straight line that is perpendicular to the iris plane of the eyeball model and has, as the foot of the perpendicular, the center of the ellipse or circle in three-dimensional space corresponding to the iris edge of the eyeball model, as the central symmetry axis of the eyeball model;
and determining the three-dimensional spatial position information of the eyeball center on the straight line that passes through the iris center (as the foot of the perpendicular) and is perpendicular to the plane in which the iris lies, based on the distance between the iris center and the eyeball center of the eyeball model and the three-dimensional spatial position information of the iris center.
In some embodiments, the distance between the iris center and the eyeball center of the eyeball model is determined by:
acquiring a first straight line that passes through the iris center and is perpendicular to the iris plane when the eyeball model looks at the first preset position, and a second straight line that passes through the iris center and is perpendicular to the iris plane when the eyeball model looks at the second preset position;
determining the intersection point of the first straight line and the second straight line as the eyeball center of the eyeball model;
and calculating the distance between the iris center of the eyeball model and the eyeball center.
In some embodiments, the three-dimensional spatial location information of the iris center is determined by:
fitting to obtain an ellipse or circle representing the iris edge based on the three-dimensional space position information of the points on the iris edge;
and determining the three-dimensional space position of the center of the ellipse or the circle as the three-dimensional space position of the center of the iris.
In a second aspect, an embodiment of the present disclosure further provides a device for determining an eyeball rotation angle, where the device includes:
an acquisition unit, configured to acquire a first iris texture feature corresponding to the eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position;
a first determining unit, configured to determine an eyeball turning angle of the eyeball model based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center;
and a second determining unit, configured to determine the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle.
In a third aspect, an embodiment of the present disclosure further proposes an electronic device, including a memory, a processor, and a computer program stored on the memory, where the processor executes the computer program to implement the steps of the method for determining an eyeball rotation angle according to the first aspect.
In a fourth aspect, an embodiment of the present disclosure further proposes a computer-readable storage medium, wherein the computer-readable storage medium stores a program or instructions that cause a computer to perform the steps of the method for determining an eyeball rotation angle according to the first aspect.
In at least one embodiment of the present disclosure, a first iris texture feature corresponding to the eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position are obtained; further, the eyeball turning angle of the eyeball model is determined based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center; and the eyeball rotation angle of the eyeball model is then determined based on the first iris texture feature, the second iris texture feature and the eyeball turning angle. In this way, based on the eyeball rotation angle of the eyeball model, the estimated line of sight can be made to coincide with the actual line of sight; in addition, targeted compensation of the optical display, targeted fine adjustment during pupil-distance adjustment, estimation of the eyeball orientation, and prediction of the imaging distribution of the display optical path on the user's retina all become possible.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the embodiments or in the description of the prior art are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present disclosure, and other drawings may be obtained from these drawings by those of ordinary skill in the art.
Fig. 1 is a flowchart of a method for determining an eyeball rotation angle according to an embodiment of the disclosure;
fig. 2 is a schematic diagram of an apparatus for determining an eyeball rotation angle according to an embodiment of the disclosure;
FIG. 3 is an exemplary block diagram of an electronic device provided by an embodiment of the present disclosure;
FIG. 4 is a top view of an eyeball model looking at a first preset position provided in an embodiment of the present disclosure;
FIG. 5 is a top view of an eyeball model looking at a second preset position provided in an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the rotation (roll) of the eyeball model after the central symmetry axes of the eyeball models in FIGS. 4 and 5 are made to coincide.
Detailed Description
In order that the above-recited objects, features and advantages of the present disclosure may be more clearly understood, a more particular description of the disclosure will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. It is to be understood that the described embodiments are some, but not all, of the embodiments of the present disclosure. The specific embodiments described herein are to be considered in an illustrative rather than a restrictive sense. All other embodiments derived by a person of ordinary skill in the art based on the described embodiments of the present disclosure fall within the scope of the present disclosure.
It should be noted that in this document, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
Fig. 1 is a schematic flow chart of a method for determining an eyeball rotation angle according to an embodiment of the present disclosure. The execution body of the method is an electronic device, where the electronic device includes, but is not limited to, a VR device, a vehicle-mounted device, a smart phone, a handheld computer, a tablet computer, a wearable device with a display screen, a desktop device, a notebook computer, an all-in-one device, a smart home device, a server, and the like; the server may be an independent server, a cluster of multiple servers, or a combination of a locally deployed server and a cloud-hosted server.
As shown in fig. 1, the method for determining the eyeball rotation angle may include, but is not limited to, steps 101 to 103:
In step 101, a first iris texture feature corresponding to the eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position are obtained.
in this embodiment, the eyeball model is constructed based on an image collected by an infrared camera and an image collected by a visible light camera, and the construction process includes: iris feature extraction, pupil feature extraction and eyeball feature extraction:
iris feature extraction:
acquiring an eyeball infrared image captured by the infrared camera and an eyeball visible-light image captured by the visible-light camera; and identifying points on the iris edge in the eyeball infrared image and points on the iris edge in the eyeball visible-light image;
determining a correspondence between points on the iris edge in the eyeball infrared image and points on the iris edge in the eyeball visible-light image, which includes: extracting texture feature points of the iris in the eyeball infrared image and texture feature points of the iris in the eyeball visible-light image; matching the two sets of texture feature points to obtain a plurality of texture-feature-point matching pairs; determining an iris mapping relation between the eyeball infrared image and the eyeball visible-light image based on the matching pairs; and, based on the iris mapping relation, determining the correspondence between points on the iris edge in the eyeball infrared image and points on the iris edge in the eyeball visible-light image;
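As an illustration only (a sketch, not the implementation specified by the present disclosure), the matching of iris texture feature points and the fitting of the iris mapping relation (a homography) could be carried out with standard computer-vision tools; the ORB detector, the brute-force matcher and the RANSAC threshold below are assumed choices that do not appear in the disclosure.

    import cv2
    import numpy as np

    def estimate_iris_homography(ir_img_gray, vis_img_gray):
        """Match iris texture feature points between the infrared and visible-light
        images and fit the iris mapping relation (homography) between them."""
        orb = cv2.ORB_create(nfeatures=1000)              # assumed detector choice
        kp_ir, des_ir = orb.detectAndCompute(ir_img_gray, None)
        kp_vis, des_vis = orb.detectAndCompute(vis_img_gray, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_ir, des_vis), key=lambda m: m.distance)

        # At least four matched pairs are needed to determine a homography.
        src = np.float32([kp_ir[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_vis[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        inliers = mask.ravel() == 1
        return H, src[inliers], dst[inliers]

A homography is fully determined by four matched pairs; using more pairs together with RANSAC simply makes the fitted mapping robust to occasional mismatches.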
determining three-dimensional spatial position information of the points on the iris edge based on the eyeball infrared image, the eyeball visible-light image and the correspondence between the points on the iris edge in the two images, which includes: converting the eyeball infrared image and the eyeball visible-light image onto the same plane based on the internal and external parameters of the infrared camera and of the visible-light camera; after the conversion, calculating the parallax between the infrared camera and the visible-light camera based on the correspondence; determining, based on the parallax, first depth information between a point on the iris edge and the infrared camera and second depth information between the point and the visible-light camera; and determining the three-dimensional spatial position information of the point on the iris edge based on the first depth information and the second depth information;
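A minimal sketch of how the parallax-to-depth step above could look once both images have been rectified onto the same plane; the pinhole parameters f (focal length in pixels), baseline, cx and cy are assumptions standing in for the calibrated internal and external parameters mentioned in the text.

    import numpy as np

    def iris_edge_points_3d(pts_ir, pts_vis, f, baseline, cx, cy):
        """Recover the 3D position of each matched iris-edge point from its
        horizontal disparity between the two rectified images.
        pts_ir / pts_vis: (N, 2) pixel coordinates of corresponding edge points."""
        pts_ir = np.asarray(pts_ir, dtype=float)
        pts_vis = np.asarray(pts_vis, dtype=float)
        disparity = pts_ir[:, 0] - pts_vis[:, 0]       # parallax between the two cameras
        z = f * baseline / disparity                   # depth from disparity
        x = (pts_ir[:, 0] - cx) * z / f                # back-project into 3D
        y = (pts_ir[:, 1] - cy) * z / f
        return np.stack([x, y, z], axis=1)             # (N, 3) points on the iris edge

Here the depth is computed once in the rectified reference frame; expressing each point relative to the infrared camera or the visible-light camera is then a fixed, known transform.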
determining three-dimensional spatial position information of the iris center and the major and minor axes of the iris in the eyeball model based on the three-dimensional spatial position information of the points on the iris edge, which includes: fitting an ellipse representing the iris edge based on the three-dimensional spatial position information of the points on the iris edge, determining the three-dimensional spatial position of the center of the ellipse as the three-dimensional spatial position of the iris center, and determining the major and minor axes of the ellipse as the major and minor axes of the iris.
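The ellipse fitting in three-dimensional space could, for example, be done by first fitting the iris plane and then fitting a two-dimensional ellipse inside that plane; the sketch below follows that assumed strategy, and the function and variable names are illustrative rather than taken from the disclosure.

    import cv2
    import numpy as np

    def fit_iris_ellipse_3d(edge_pts_3d):
        """Fit a plane to the 3D iris-edge points, fit an ellipse inside that plane,
        and return the iris center (3D), the axis lengths and the plane normal."""
        pts = np.asarray(edge_pts_3d, dtype=float)
        centroid = pts.mean(axis=0)
        # Plane fit: the normal is the right singular vector of the smallest singular value.
        _, _, vt = np.linalg.svd(pts - centroid)
        u, v, normal = vt[0], vt[1], vt[2]

        # Express the edge points in 2D plane coordinates and fit an ellipse there.
        plane_2d = np.stack([(pts - centroid) @ u, (pts - centroid) @ v], axis=1)
        (ecx, ecy), (d1, d2), _ = cv2.fitEllipse(plane_2d.astype(np.float32))

        iris_center_3d = centroid + ecx * u + ecy * v
        major_axis, minor_axis = max(d1, d2), min(d1, d2)   # full axis lengths
        return iris_center_3d, major_axis, minor_axis, normal

The plane normal obtained here can be reused later as the direction of the straight line through the iris center that serves as the central symmetry axis.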
Pupil feature extraction:
determining the major and minor axes of the pupil in the eyeball model based on the major and minor axes of the iris, which includes: fitting, in the same image (either the eyeball infrared image or the eyeball visible-light image), a first ellipse representing the iris edge and a second ellipse representing the pupil edge; determining a major-axis ratio and a minor-axis ratio based on the first ellipse and the second ellipse; and determining the major and minor axes of the pupil based on the major and minor axes of the iris and the two ratios.
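For the ratio-based variant just described, the final scaling step might look as follows; the "_px" arguments denote the axis lengths of the two ellipses fitted in the same image, and all names are illustrative assumptions.

    def pupil_axes_from_ratio(iris_major, iris_minor,
                              iris_major_px, iris_minor_px,
                              pupil_major_px, pupil_minor_px):
        """Scale the iris axes (already known in 3D) by the pixel-space ratios between
        the pupil ellipse and the iris ellipse fitted in the same image."""
        major_ratio = pupil_major_px / iris_major_px
        minor_ratio = pupil_minor_px / iris_minor_px
        return iris_major * major_ratio, iris_minor * minor_ratio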
Determining the major and minor axes of the pupil in the eyeball model further includes: determining, based on the iris mapping relation (i.e., the homography matrix) between the eyeball infrared image and the eyeball visible-light image, a correspondence between points on the pupil edge in the eyeball infrared image and points on the pupil edge in the eyeball visible-light image; after converting the eyeball infrared image and the eyeball visible-light image onto the same plane, calculating the parallax between the infrared camera and the visible-light camera based on the correspondence; determining, based on the parallax, third depth information between a point on the pupil edge and the infrared camera and fourth depth information between the point and the visible-light camera; determining three-dimensional spatial position information of the point on the pupil edge based on the third and fourth depth information; and fitting an ellipse representing the pupil edge based on the three-dimensional spatial position information of the points on the pupil edge, and determining the major and minor axes of that ellipse as the major and minor axes of the pupil.
Eyeball feature extraction:
based on the distance between the iris center and the eyeball center and the three-dimensional spatial position information of the iris center, the three-dimensional spatial position information of the eyeball center is determined on the straight line that passes through the iris center (as the foot of the perpendicular) and is perpendicular to the plane in which the iris lies.
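A sketch of this eyeball-feature step, assuming the iris center, the axis direction (the normal of the iris plane) and the iris-center-to-eyeball-center distance are already known; the sign convention used to pick the side of the iris plane is an assumption.

    import numpy as np

    def eyeball_center_3d(iris_center_3d, axis_direction, distance):
        """Place the eyeball center on the line through the iris center that is
        perpendicular to the iris plane, at the known distance from the iris center."""
        n = np.asarray(axis_direction, dtype=float)
        n = n / np.linalg.norm(n)
        # Assumed convention: the camera looks along +Z, so the eyeball center lies
        # farther from the camera than the iris center; flip the normal if needed.
        if n[2] < 0:
            n = -n
        return np.asarray(iris_center_3d, dtype=float) + distance * n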
After the eyeball model is constructed, the first iris texture feature corresponding to the eyeball model looking at the first preset position and the second iris texture feature corresponding to the eyeball model looking at the second preset position can be obtained. The first preset position and the second preset position are known three-dimensional spatial positions.
In this embodiment, a first image group (comprising an infrared image and a visible-light image) is obtained by the infrared camera and the visible-light camera respectively capturing the eyeball while it looks at the first preset position, and a second image group (comprising an infrared image and a visible-light image) is obtained by the infrared camera and the visible-light camera respectively capturing the eyeball while it looks at the second preset position. Further, the first iris texture feature corresponding to the eyeball model looking at the first preset position can be determined based on the first image group, and the second iris texture feature corresponding to the eyeball model looking at the second preset position can be determined based on the second image group.
In some embodiments, the process of determining the iris texture features based on the infrared image and the visible light image is:
extracting iris texture feature points in the infrared image and iris texture feature points in the visible light image; matching iris texture feature points in the infrared image with iris texture feature points in the visible light image to obtain a plurality of texture feature point matching pairs; at least four texture feature point matching pairs are taken from the texture feature point matching pairs; an iris mapping relationship (i.e., homography matrix) between the infrared image and the visible light image is determined based on at least four texture feature point matching pairs.
Converting the infrared image and the visible light image to the same plane based on the internal and external parameters of the infrared camera and the internal and external parameters of the visible light camera; after the conversion to the same plane is completed, calculating parallax between the infrared camera and the visible light camera based on an iris mapping relation (namely a homography matrix) between the infrared image and the visible light image; determining depth information of iris texture feature points based on parallax; based on the depth information, three-dimensional spatial position information of the iris texture feature points is determined.
Thus, in this embodiment, the user is asked to look at different objects whose three-dimensional positions are known; each time the user looks at an object at a different position, images are captured by the two cameras, and the iris texture information extracted from the visible-light and infrared images is attached to the iris of the eyeball model.
In step 102, the eyeball turning angle of the eyeball model is determined based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center.
In this embodiment, the eyeball center when the eyeball model looks at the first preset position is made to coincide with the eyeball center when the eyeball model looks at the second preset position; further, the angle through which the central symmetry axis of the eyeball model looking at the first preset position must rotate to coincide with the central symmetry axis of the eyeball model looking at the second preset position is calculated as the eyeball turning angle of the eyeball model.
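Once the two central symmetry axes are known as direction vectors (translating the model so that the eyeball centers coincide does not change these directions), the eyeball turning angle reduces to the angle between the two vectors; the sketch below also returns the rotation axis, which is convenient for the roll-angle step later. Variable names are illustrative.

    import numpy as np

    def eyeball_turning_angle(axis_first, axis_second):
        """Turning angle of the eyeball model: the angle through which the central
        symmetry axis at the first gaze position must rotate to coincide with the
        axis at the second gaze position (eyeball centers already coincident)."""
        a = np.asarray(axis_first, dtype=float)
        b = np.asarray(axis_second, dtype=float)
        a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
        angle = np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))
        rotation_axis = np.cross(a, b)        # axis about which the turning happens
        return angle, rotation_axis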
In some embodiments, the three-dimensional spatial position information of the central symmetry axis of the eyeball model and of the eyeball center is determined by the following steps A1 and A2:
Step A1: taking the straight line that is perpendicular to the iris plane of the eyeball model and has, as the foot of the perpendicular, the center of the ellipse or circle in three-dimensional space corresponding to the iris edge of the eyeball model, as the central symmetry axis of the eyeball model;
Step A2: determining the three-dimensional spatial position information of the eyeball center on the straight line that passes through the iris center (as the foot of the perpendicular) and is perpendicular to the plane in which the iris lies, based on the distance between the iris center and the eyeball center of the eyeball model and the three-dimensional spatial position information of the iris center.
In some embodiments, the distance between the iris center and the eyeball center of the eyeball model is determined by the following steps B1 to B3:
Step B1: acquiring a first straight line that passes through the iris center and is perpendicular to the iris plane when the eyeball model looks at the first preset position, and a second straight line that passes through the iris center and is perpendicular to the iris plane when the eyeball model looks at the second preset position;
Step B2: determining the intersection point of the first straight line and the second straight line as the eyeball center of the eyeball model;
Step B3: calculating the distance between the iris center of the eyeball model and the eyeball center.
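In practice the two perpendicular lines rarely intersect exactly, so the "intersection point" of steps B1 to B3 can be taken as the midpoint of the shortest segment between the two lines; the least-squares sketch below follows that reading, which is an assumption rather than something stated in the disclosure.

    import numpy as np

    def eyeball_center_from_two_axes(p1, d1, p2, d2):
        """Intersect (in the least-squares sense) the two lines through the iris
        centers perpendicular to their iris planes; the near-intersection is taken
        as the eyeball center, and its distance to the first iris center gives the
        iris-center-to-eyeball-center distance."""
        p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
        d1 = np.asarray(d1, float) / np.linalg.norm(d1)
        d2 = np.asarray(d2, float) / np.linalg.norm(d2)

        # Solve for the parameters t1, t2 of the closest points on each line.
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        w = p1 - p2
        denom = a * c - b * b                    # ~0 only if the lines are parallel
        t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
        t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
        closest1, closest2 = p1 + t1 * d1, p2 + t2 * d2

        center = (closest1 + closest2) / 2.0     # midpoint of the shortest segment
        distance = np.linalg.norm(center - p1)   # iris-center-to-eyeball-center distance
        return center, distance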
In some embodiments, the three-dimensional spatial position information of the iris center is determined by the following steps C1 and C2:
step C1: fitting to obtain an ellipse or circle representing the iris edge based on the three-dimensional space position information of the points on the iris edge;
step C2: and determining the three-dimensional space position of the center of the ellipse or the circle as the three-dimensional space position of the center of the iris.
In step 103, the eyeball rotation angle of the eyeball model is determined based on the first iris texture feature, the second iris texture feature and the eyeball turning angle.
For example, the eyeball model is rotated based on the eyeball turning angle so that the central symmetry axis of the eyeball model looking at the first preset position coincides with the central symmetry axis of the eyeball model looking at the second preset position; after the central symmetry axes coincide, the iris texture of the eyeball model is rotated about the central symmetry axis until the first iris texture feature coincides with the second iris texture feature, thereby obtaining the eyeball rotation angle of the eyeball model.
Here, the rotation of the eyeball refers to rotation of the eyeball model about its optical axis (i.e., roll), whereas the turning of the eyeball refers to rotation of the eyeball model that changes the direction of its optical axis.
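A sketch of step 103 under the definitions just given: the turning rotation first brings the two central symmetry axes into coincidence, and the remaining rotation about the common axis that best aligns the matched iris texture points is the eyeball rotation (roll) angle. The texture points are assumed to be given as corresponding 3D pairs relative to a common, already-coincident eyeball center; averaging the per-pair angles is an assumed robustness choice, not part of the disclosure.

    import numpy as np

    def rotation_between(a, b):
        """Rodrigues rotation matrix that turns unit vector a onto unit vector b."""
        a = a / np.linalg.norm(a)
        b = b / np.linalg.norm(b)
        k = np.cross(a, b)
        s = np.linalg.norm(k)          # sin of the turning angle
        c = np.dot(a, b)               # cos of the turning angle
        if s < 1e-12:
            return np.eye(3)           # axes already parallel; anti-parallel case needs special handling
        k = k / s
        K = np.array([[0, -k[2], k[1]],
                      [k[2], 0, -k[0]],
                      [-k[1], k[0], 0]])
        return np.eye(3) + s * K + (1 - c) * (K @ K)

    def eyeball_roll_angle(tex_first, tex_second, axis_first, axis_second, eye_center):
        """After the turning rotation brings the first-gaze symmetry axis onto the
        second-gaze axis, find the residual rotation about that common axis that
        maps the first iris texture features onto the second ones (the roll angle)."""
        axis = np.asarray(axis_second, float)
        axis = axis / np.linalg.norm(axis)
        R = rotation_between(np.asarray(axis_first, float), axis)

        angles = []
        for p, q in zip(np.asarray(tex_first, float), np.asarray(tex_second, float)):
            # Express both texture points relative to the eyeball center and apply
            # the turning rotation to the first-gaze point.
            u = R @ (p - eye_center)
            v = q - eye_center
            # Project onto the plane perpendicular to the common axis.
            u = u - np.dot(u, axis) * axis
            v = v - np.dot(v, axis) * axis
            # Signed angle from u to v about the axis.
            angles.append(np.arctan2(np.dot(axis, np.cross(u, v)), np.dot(u, v)))
        # Circular mean of the per-point estimates.
        roll = np.arctan2(np.mean(np.sin(angles)), np.mean(np.cos(angles)))
        return np.degrees(roll)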
Fig. 4 is a top view of the eyeball model looking at the first preset position according to an embodiment of the disclosure. In Fig. 4, the first preset position is point F, point M is the second preset position, DE is the iris, C is the corneal sphere center, A is the eyeball center, and B is an auxiliary point used only for drawing (with no physical meaning). The straight line through the eyeball center A and the corneal sphere center C is the optical axis, i.e., the central symmetry axis of the eyeball model.
Fig. 5 is a top view of the eyeball model looking at the second preset position according to an embodiment of the disclosure. In Fig. 5, the second preset position is point M, point F is the first preset position, DE is the iris, C is the corneal sphere center, A is the eyeball center, and B is an auxiliary point used only for drawing (with no physical meaning). The straight line through the eyeball center A and the corneal sphere center C is the optical axis, i.e., the central symmetry axis of the eyeball model.
From Fig. 4 to Fig. 5, the eyeball model changes the direction of its optical axis from pointing at the first preset position F to pointing at the second preset position M; that is, the eyeball model turns from looking at point F to looking at point M through the eyeball turning angle.
Fig. 6 is a schematic diagram of the rotation (roll) of the eyeball model after the central symmetry axes of the eyeball models in Figs. 4 and 5 are made to coincide. In Fig. 6, point L is the projection of the coincident central symmetry axis on a front view of the eyeball model. The points N, O, P, Q, S, T, U and V represent the iris texture features (the first iris texture feature) when the eyeball model looks at the first preset position F, and the points N', O', P', Q', S', T', U' and V' represent the iris texture features (the second iris texture feature) when the eyeball model looks at the second preset position M; the angle through which the first pattern formed by N, O, P, Q, S, T, U, V must be rotated to coincide with the second pattern formed by N', O', P', Q', S', T', U', V' is the eyeball rotation angle of the eyeball model.
In the above embodiment, the first iris texture feature corresponding to the eyeball model looking at the first preset position and the second iris texture feature corresponding to the eyeball model looking at the second preset position are obtained; further, the eyeball turning angle of the eyeball model is determined based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center; and the eyeball rotation angle of the eyeball model is then determined based on the first iris texture feature, the second iris texture feature and the eyeball turning angle. In this way, based on the eyeball rotation angle of the eyeball model, the estimated line of sight can be made to coincide with the actual line of sight; in addition, targeted compensation of the optical display, targeted fine adjustment during pupil-distance adjustment, estimation of the eyeball orientation, and prediction of the imaging distribution of the display optical path on the user's retina all become possible.
It should be noted that, for simplicity of description, the foregoing method embodiments are expressed as a series of combinations of actions, but those skilled in the art will appreciate that the disclosed embodiments are not limited by the order of the actions described, as some steps may be performed in other orders or concurrently. In addition, those skilled in the art will appreciate that the embodiments described in the specification are all optional embodiments.
Fig. 2 is a schematic diagram of an apparatus for determining an eyeball rotation angle according to an embodiment of the disclosure. The apparatus may be applied to an electronic device, where the electronic device includes, but is not limited to, a VR device, a vehicle-mounted device, a smart phone, a handheld computer, a tablet computer, a wearable device with a display screen, a desktop device, a notebook computer, an all-in-one device, a smart home device, a server, and the like; the server may be an independent server, a cluster of multiple servers, or a combination of a locally deployed server and a cloud-hosted server. The apparatus for determining an eyeball rotation angle provided in the embodiments of the disclosure may perform the processing flow provided in each embodiment of the method for determining an eyeball rotation angle. As shown in fig. 2, the apparatus includes, but is not limited to: an acquisition unit 21, a first determination unit 22 and a second determination unit 23. The functions of each unit are described as follows:
an acquisition unit 21, configured to acquire a first iris texture feature corresponding to the eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position;
a first determining unit 22, configured to determine an eyeball turning angle of the eyeball model based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center;
and a second determining unit 23, configured to determine the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle.
In some embodiments, the eyeball model is constructed based on images acquired by an infrared camera and images acquired by a visible-light camera;
and the acquisition unit 21 is configured to:
acquire a first image group obtained by the infrared camera and the visible-light camera respectively capturing images of the eyeball while it looks at the first preset position, and a second image group obtained by the infrared camera and the visible-light camera respectively capturing images of the eyeball while it looks at the second preset position;
and determine, based on the first image group, the first iris texture feature corresponding to the eyeball model looking at the first preset position, and determine, based on the second image group, the second iris texture feature corresponding to the eyeball model looking at the second preset position.
In some embodiments, the first determining unit 22 is configured to:
make the eyeball center when the eyeball model looks at the first preset position coincide with the eyeball center when the eyeball model looks at the second preset position;
and calculate the angle through which the central symmetry axis of the eyeball model looking at the first preset position must rotate to coincide with the central symmetry axis of the eyeball model looking at the second preset position, as the eyeball turning angle of the eyeball model.
In some embodiments, the second determining unit 23 is configured to:
rotate the eyeball model based on the eyeball turning angle so that the central symmetry axis of the eyeball model looking at the first preset position coincides with the central symmetry axis of the eyeball model looking at the second preset position;
and, after the central symmetry axes coincide, rotate the iris texture of the eyeball model about the central symmetry axis until the first iris texture feature coincides with the second iris texture feature, thereby obtaining the eyeball rotation angle of the eyeball model.
In some embodiments, the three-dimensional spatial position information of the central symmetry axis of the eyeball model and of the eyeball center is determined by:
taking the straight line that is perpendicular to the iris plane of the eyeball model and has, as the foot of the perpendicular, the center of the ellipse or circle in three-dimensional space corresponding to the iris edge of the eyeball model, as the central symmetry axis of the eyeball model;
and determining the three-dimensional spatial position information of the eyeball center on the straight line that passes through the iris center (as the foot of the perpendicular) and is perpendicular to the plane in which the iris lies, based on the distance between the iris center and the eyeball center of the eyeball model and the three-dimensional spatial position information of the iris center.
In some embodiments, the distance between the iris center and the eyeball center of the eyeball model is determined by:
acquiring a first straight line that passes through the iris center and is perpendicular to the iris plane when the eyeball model looks at the first preset position, and a second straight line that passes through the iris center and is perpendicular to the iris plane when the eyeball model looks at the second preset position;
determining the intersection point of the first straight line and the second straight line as the eyeball center of the eyeball model;
and calculating the distance between the iris center of the eyeball model and the eyeball center.
In some embodiments, the three-dimensional spatial location information of the iris center is determined by:
fitting to obtain an ellipse or circle representing the iris edge based on the three-dimensional space position information of the points on the iris edge;
and determining the three-dimensional space position of the center of the ellipse or the circle as the three-dimensional space position of the center of the iris.
In the above embodiment of the apparatus for determining an eyeball rotation angle, the first iris texture feature corresponding to the eyeball model looking at the first preset position and the second iris texture feature corresponding to the eyeball model looking at the second preset position are obtained; further, the eyeball turning angle of the eyeball model is determined based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center; and the eyeball rotation angle of the eyeball model is then determined based on the first iris texture feature, the second iris texture feature and the eyeball turning angle. In this way, based on the eyeball rotation angle of the eyeball model, the estimated line of sight can be made to coincide with the actual line of sight; in addition, targeted compensation of the optical display, targeted fine adjustment during pupil-distance adjustment, estimation of the eyeball orientation, and prediction of the imaging distribution of the display optical path on the user's retina all become possible.
Fig. 3 is an exemplary block diagram of an electronic device provided by an embodiment of the present disclosure. As shown in fig. 3, the electronic device includes: a memory 31, a processor 32 and a computer program stored on said memory 31. It will be appreciated that the memory 31 in this embodiment may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory.
In some embodiments, the memory 31 stores the following elements, executable modules or data structures, or a subset thereof, or an extended set thereof: an operating system and application programs.
The operating system includes various system programs, such as a framework layer, a core library layer and a driver layer, which are used to implement various basic tasks and to process hardware-based tasks. The application programs include various applications, such as a media player and a browser, which are used to implement various application tasks. A program implementing the method for determining an eyeball rotation angle provided by the embodiments of the present disclosure may be included in an application program.
In the embodiments of the present disclosure, the at least one processor 32 is configured to execute the steps of the embodiments of the method for determining an eyeball rotation angle provided in the embodiments of the present disclosure by calling a program or instructions stored in the at least one memory 31 (specifically, a program or instructions stored in an application program), for example the following steps:
acquiring a first iris texture feature corresponding to the eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position;
determining an eyeball turning angle of the eyeball model based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center;
and determining the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle.
The method for determining the rotation angle of the eyeball according to the embodiment of the present disclosure may be applied to the processor 32 or may be implemented by the processor 32. The processor 32 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuitry in hardware in processor 32 or by instructions in the form of software. The processor 32 described above may be a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The steps of the method for determining the rotation angle of the eyeball provided by the embodiment of the disclosure can be directly embodied and executed by a hardware decoding processor or by combining and executing hardware and software modules in the decoding processor. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory 31 and the processor 32 reads the information in the memory 31 and in combination with its hardware performs the steps of the method.
The embodiments of the present disclosure also provide a computer-readable storage medium storing a program or instructions that cause a computer to execute the steps of the embodiments of the method for determining an eyeball rotation angle, for example the following steps:
acquiring a first iris texture feature corresponding to the eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position;
determining an eyeball turning angle of the eyeball model based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center;
and determining the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle.
In order to avoid repetition of the description, a description thereof will not be repeated here. Wherein the computer readable storage medium may be a non-transitory computer readable storage medium.
The disclosed embodiments also provide a computer program product comprising a computer program stored in a computer readable storage medium, which may be a non-transitory computer readable storage medium. At least one processor of the computer reads and executes the computer program from the computer-readable storage medium, so that the computer performs the steps of the embodiments of the method for determining the rotation angle of the eyeball, which are not described herein again for avoiding repetition of the description.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.
Those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the disclosure and form different embodiments.
Those skilled in the art will appreciate that the descriptions of the various embodiments are each focused on, and that portions of one embodiment that are not described in detail may be referred to as related descriptions of other embodiments.
Although embodiments of the present disclosure have been described with reference to the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the disclosure, and such modifications and variations fall within the scope defined by the appended claims.

Claims (10)

1. A method of determining an eyeball rotation angle, the method comprising:
acquiring a first iris texture feature corresponding to an eyeball model looking at a first preset position, and a second iris texture feature corresponding to the eyeball model looking at a second preset position;
determining an eyeball turning angle of the eyeball model based on a central symmetry axis of the eyeball model and three-dimensional spatial position information of an eyeball center;
and determining the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle.
2. The method of claim 1, wherein the eyeball model is constructed based on an image acquired by an infrared camera and an image acquired by a visible-light camera;
and the acquiring of the first iris texture feature corresponding to the eyeball model looking at the first preset position and the second iris texture feature corresponding to the eyeball model looking at the second preset position comprises:
acquiring a first image group obtained by the infrared camera and the visible-light camera respectively capturing images of the eyeball while it looks at the first preset position, and a second image group obtained by the infrared camera and the visible-light camera respectively capturing images of the eyeball while it looks at the second preset position;
and determining, based on the first image group, the first iris texture feature corresponding to the eyeball model looking at the first preset position, and determining, based on the second image group, the second iris texture feature corresponding to the eyeball model looking at the second preset position.
3. The method of claim 1, wherein the determining the eyeball turning angle of the eyeball model based on the central symmetry axis of the eyeball model and the three-dimensional spatial position information of the eyeball center comprises:
making the eyeball center when the eyeball model looks at the first preset position coincide with the eyeball center when the eyeball model looks at the second preset position;
and calculating the angle through which the central symmetry axis of the eyeball model looking at the first preset position must rotate to coincide with the central symmetry axis of the eyeball model looking at the second preset position, as the eyeball turning angle of the eyeball model.
4. The method of claim 1, wherein the determining the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle comprises:
rotating the eyeball model based on the eyeball turning angle so that the central symmetry axis of the eyeball model looking at the first preset position coincides with the central symmetry axis of the eyeball model looking at the second preset position;
and, after the central symmetry axes coincide, rotating the iris texture of the eyeball model about the central symmetry axis until the first iris texture feature coincides with the second iris texture feature, thereby obtaining the eyeball rotation angle of the eyeball model.
5. The method of claim 1, wherein the three-dimensional spatial position information of the central symmetry axis of the eyeball model and of the eyeball center is determined by:
taking the straight line that is perpendicular to the iris plane of the eyeball model and has, as the foot of the perpendicular, the center of the ellipse or circle in three-dimensional space corresponding to the iris edge of the eyeball model, as the central symmetry axis of the eyeball model;
and determining the three-dimensional spatial position information of the eyeball center on the straight line that passes through the iris center (as the foot of the perpendicular) and is perpendicular to the plane in which the iris lies, based on the distance between the iris center and the eyeball center of the eyeball model and the three-dimensional spatial position information of the iris center.
6. The method of claim 5, wherein the distance between the center of the iris of the model eye and the center of the eye is determined by:
acquiring a first straight line that passes through the iris center and is perpendicular to the iris plane when the eyeball model looks at the first preset position, and a second straight line that passes through the iris center and is perpendicular to the iris plane when the eyeball model looks at the second preset position;
determining an intersection point of the first straight line and the second straight line as an eyeball center of the eyeball model;
and calculating the distance between the center of the iris of the eyeball model and the center of the eyeball.
7. The method of claim 5, wherein the three-dimensional spatial location information of the iris center is determined by:
fitting to obtain an ellipse or circle representing the iris edge based on three-dimensional spatial position information of points on the iris edge;
and determining the three-dimensional space position of the center of the ellipse or the circle as the three-dimensional space position of the center of the iris.
8. An apparatus for determining an eyeball rotation angle, the apparatus comprising:
an acquisition unit, configured to acquire a first iris texture feature corresponding to an eyeball model looking at a first preset position and a second iris texture feature corresponding to the eyeball model looking at a second preset position;
a first determining unit, configured to determine an eyeball turning angle of the eyeball model based on a central symmetry axis of the eyeball model and three-dimensional spatial position information of an eyeball center;
and a second determining unit, configured to determine the eyeball rotation angle of the eyeball model based on the first iris texture feature, the second iris texture feature and the eyeball turning angle.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory, wherein the processor executes the computer program to implement the steps of the method of determining an eye rotation angle according to any one of claims 1 to 7.
10. A computer-readable storage medium storing a program or instructions that cause a computer to execute the steps of the method of determining an eyeball rotation angle according to any one of claims 1 to 7.
CN202311813365.1A 2023-12-26 2023-12-26 Eyeball rotation angle determination method, device, equipment and storage medium Pending CN117809358A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202311813365.1A (CN117809358A) | 2023-12-26 | 2023-12-26 | Eyeball rotation angle determination method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202311813365.1A (CN117809358A) | 2023-12-26 | 2023-12-26 | Eyeball rotation angle determination method, device, equipment and storage medium

Publications (1)

Publication Number | Publication Date
CN117809358A | 2024-04-02

Family

ID=90422962

Family Applications (1)

Application Number | Title | Priority Date | Filing Date | Status
CN202311813365.1A (CN117809358A) | Eyeball rotation angle determination method, device, equipment and storage medium | 2023-12-26 | 2023-12-26 | Pending

Country Status (1)

Country Link
CN (1) CN117809358A (en)


Legal Events

Date Code Title Description
PB01: Publication
SE01: Entry into force of request for substantive examination