CN113208558A - Eyeball tracking method and device, electronic equipment and storage medium


Info

Publication number
CN113208558A
Authority
CN
China
Prior art keywords
current
pupil
gaze
display screen
pupil center
Prior art date
Legal status
Granted
Application number
CN202110482099.3A
Other languages
Chinese (zh)
Other versions
CN113208558B (en)
Inventor
沈忱
孙其民
Current Assignee
Nanchang Virtual Reality Institute Co Ltd
Original Assignee
Nanchang Virtual Reality Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Nanchang Virtual Reality Institute Co Ltd filed Critical Nanchang Virtual Reality Institute Co Ltd
Priority to CN202110482099.3A
Publication of CN113208558A
Application granted
Publication of CN113208558B
Legal status: Active
Anticipated expiration


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The application discloses an eyeball tracking method and apparatus, an electronic device, and a storage medium, relating to the field of virtual reality. The method comprises: acquiring the current pupil center position and the current pupil radius of an eyeball to be tracked while the display screen is under the current background light information; determining a first gaze position corresponding to the current pupil center position on the display screen; obtaining a current offset corresponding to the current pupil radius and the first gaze position based on an offset mapping relationship, wherein the offset mapping relationship comprises a correspondence between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions; obtaining a target pupil center position according to the current pupil center position and the current offset; and determining a second gaze position corresponding to the target pupil center position on the display screen as the eyeball gaze position of the eyeball to be tracked under the current background light information, so that an accurate eyeball gaze position under the current background light information can be obtained.

Description

Eyeball tracking method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of virtual reality technologies, and in particular, to an eyeball tracking method and apparatus, an electronic device, and a storage medium.
Background
Eyeball tracking is an applied technology that generally tracks the eye according to characteristic changes of the eyeball and the area around the eyeball, or changes of the iris and pupil. One or more cameras are typically employed to capture an image of the user's eyes, and the position the user is looking at is estimated from the eye image. However, different background light in a scene causes changes in the eye, thereby reducing the accuracy of eye tracking.
Disclosure of Invention
In view of the above problems, the present application provides an eyeball tracking method, an eyeball tracking apparatus, an electronic device, and a storage medium, which can solve the above technical problems.
In a first aspect, an embodiment of the present application provides an eyeball tracking method applied to an electronic device, where the electronic device includes a display screen, and the method includes: acquiring the current pupil center position and the current pupil radius of an eyeball to be tracked while the display screen is under the current background light information; determining a first gaze position corresponding to the current pupil center position on the display screen; obtaining a current offset corresponding to the current pupil radius and the first gaze position based on an offset mapping relationship, where the offset mapping relationship comprises a correspondence between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions; obtaining a target pupil center position according to the current pupil center position and the current offset; and determining a second gaze position corresponding to the target pupil center position on the display screen as the eyeball gaze position of the eyeball to be tracked under the current background light information.
In a second aspect, an embodiment of the present application provides an eyeball tracking apparatus applied to an electronic device, where the electronic device includes a display screen, and the apparatus includes: a current pupil center position acquisition module, used for acquiring the current pupil center position and the current pupil radius of the eyeball to be tracked while the display screen is under the current background light information; a first gaze position determining module, used for determining a first gaze position corresponding to the current pupil center position on the display screen; a shift module, configured to obtain a current offset corresponding to the current pupil radius and the first gaze position based on an offset mapping relationship, where the offset mapping relationship includes a correspondence between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions; a target pupil center position obtaining module, configured to obtain a target pupil center position according to the current pupil center position and the current offset; and a second gaze position determining module, used for determining a second gaze position corresponding to the target pupil center position on the display screen as the eyeball gaze position of the eyeball to be tracked under the current background light information.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the above method.
In a fourth aspect, the present application provides a computer-readable storage medium in which program code is stored, where the program code can be called by a processor to execute the above method.
According to the eyeball tracking method and apparatus, the electronic device, and the storage medium provided by the present application, when the display screen is under the current background light information and the user gazes at the display screen, the current pupil center position and the current pupil radius of the eyeball to be tracked are obtained; it can be understood that the current pupil radius is obtained under the irradiation of the current background light information, which is equivalent to representing the current background light information by the current pupil radius. A first gaze position corresponding to the current pupil center position is then determined on the display screen. Because this first gaze position does not yet account for the influence of the current background light information on the pupil, a current offset corresponding to the current pupil radius and the first gaze position is further obtained based on an offset mapping relationship, where the offset mapping relationship comprises a correspondence between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions. A target pupil center position is obtained according to the current pupil center position and the current offset; that is, the current pupil center is shifted by the current offset to obtain the shifted target pupil center position. Finally, a second gaze position corresponding to the target pupil center position is determined on the display screen; this second gaze position is shifted relative to the first gaze position and serves as the eyeball gaze position of the eyeball to be tracked under the current background light information, so that eye tracking accuracy can be improved.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 shows a schematic diagram of a pupil under different background light;
Fig. 2 shows a schematic diagram of an electronic device provided by an embodiment of the present application;
Fig. 3 is a schematic flowchart illustrating an eyeball tracking method according to an embodiment of the present application;
Fig. 4 is a schematic flowchart illustrating an eyeball tracking method according to another embodiment of the present application;
Fig. 5 is a flowchart illustrating step S2050 of the eyeball tracking method illustrated in Fig. 4 of the present application;
Fig. 6 is a schematic flowchart illustrating an eyeball tracking method according to another embodiment of the present application;
Fig. 7 is a schematic flowchart illustrating an eyeball tracking method according to still another embodiment of the present application;
Fig. 8 is a block diagram of an eyeball tracking apparatus according to an embodiment of the present application;
Fig. 9 is a block diagram of an electronic device for executing an eyeball tracking method according to an embodiment of the present application;
Fig. 10 illustrates a storage unit for storing or carrying program code implementing an eyeball tracking method according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Eyeball tracking is an applied technology that generally tracks the eye according to characteristic changes of the eyeball and the area around the eyeball, or changes of the iris and pupil. One or more cameras are typically employed to capture an image of the user's eyes, and the position the user is looking at is estimated from the eye image. Because the pupil center position is easy to determine, the eyeball can be tracked by tracking the pupil center position: the change of the pupil center position reflects the change of the eyeball gaze direction, so the eyeball can be tracked by capturing changes of the pupil.
However, the inventors have found that different background light in a scene illuminates the eye and causes the pupil to zoom, i.e., the pupil radius differs under different background light, and this zoom is geometrically non-uniform. Referring to Fig. 1, the visual axis L1 is a line originating at the fovea and passing through a point c0 on the pupil, where the visual axis is the true direction of the user's line of sight; the optical axis is a straight line passing through the pupil center position. As shown in part (a) of Fig. 1, when a user gazes at a certain position, the user's eye is irradiated with a first light, a first eye image under the first light is collected, the pupil center position c and the pupil radius r are obtained from the first eye image, and the straight line passing through the pupil center position c in the first eye image is determined as the optical axis L2. As shown in part (b) of Fig. 1, the user gazes at the same position while the illumination intensity is increased from the first light to a second light; a second eye image under the second light is collected, the pupil center position and the pupil radius are obtained from the second eye image, and the straight line passing through this pupil center position is determined as the optical axis L3. It can be seen that, under the two illumination intensities, the visual axis of the user gazing at the same position does not change, while illumination of different intensities causes the pupil to zoom, which changes the pupil center position; the change of the pupil center position changes the optical axis, and therefore the included angle between the optical axis and the visual axis. Since the visual axis is not measurable, the pupil is tracked through the optical axis during actual eye tracking; however, different background light in the scene zooms the pupil and changes the pupil center position, which reduces eye tracking accuracy.
In view of the above problems, the inventors, through long-term research, propose the eyeball tracking method and apparatus, the electronic device, and the storage medium of the embodiments of the present application, in which the pupil center position is shifted under the current background light information, and the position corresponding to the shifted pupil center position is an eyeball gaze position with higher accuracy. The following embodiments describe the specific eyeball tracking method in detail.
Referring to Fig. 2, an electronic device 100 includes a processor 110, a display screen 120, a fill light 130, and a camera 140, where the display screen 120, the fill light 130, and the camera 140 are all connected to the processor 110.
The processor 110 controls the display screen 120 to display under different background light information for the user to view. The processor 110 also controls the fill light 130 to illuminate, supplementing light for the photographed eye in dark conditions; optionally, two fill lights 130 may be provided. The camera 140 is used to capture an eye image of the user and send the eye image to the processor; optionally, the camera 140 may be an infrared camera.
Fig. 3 is a schematic flowchart illustrating an eyeball tracking method according to an embodiment of the present application, in which the pupil center position is shifted under the current background light information, and the position corresponding to the shifted pupil center position is an eyeball gaze position with higher accuracy. In an embodiment, the eyeball tracking method is applied to the eyeball tracking apparatus 200 shown in Fig. 8 and to the electronic device 100 (Figs. 2 and 9) equipped with the eyeball tracking apparatus 200. This embodiment takes the application of the eyeball tracking method to the electronic device 100 as an example to describe the specific process. It is understood that the electronic device in this embodiment may be a mobile terminal, a smartphone, a tablet computer, a wearable electronic device, a head-mounted device for a Virtual Reality (VR) experience, and the like, which is not limited herein. As detailed with respect to the flow shown in Fig. 3, the eyeball tracking method may specifically include the following steps:
Step S1010, acquiring the current pupil center position and the current pupil radius of the eyeball to be tracked while the display screen is under the current background light information.
When the display screen is under the current background light information and the user's eyes gaze at the display screen, the camera captures a current eye image of the user and sends the current eye image to the processor. The processor obtains the current pupil center position and the current pupil radius of the eyeball to be tracked from the current eye image using a preset image processing algorithm, where the preset image processing algorithm may be a thresholding method or a contour method, and the current background light information includes current gray-scale information and/or current brightness information.
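As an illustrative sketch only (the patent names a threshold method or a contour method without giving code), pupil center and radius extraction might look as follows; the use of OpenCV, the function name detect_pupil, and the threshold value are assumptions, not details from the patent:

```python
import cv2

def detect_pupil(eye_image_gray, threshold=40):
    """Estimate the pupil center (x, y) and radius from a grayscale eye image.

    A minimal sketch of the thresholding + contour approach; the threshold
    value 40 is an assumed placeholder, not a value from the patent.
    """
    # The pupil is the darkest region of the eye: keep pixels below threshold.
    _, mask = cv2.threshold(eye_image_gray, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Assume the largest dark blob is the pupil.
    pupil = max(contours, key=cv2.contourArea)
    (x, y), radius = cv2.minEnclosingCircle(pupil)
    return (x, y), radius
```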
Step S1020, determining a first gaze position corresponding to the current pupil center position on the display screen.
In some embodiments, the electronic device may store a plurality of pupil center positions and a plurality of gaze positions in advance and associate them, that is, establish a correspondence (a position mapping relationship) between the plurality of pupil center positions and the plurality of gaze positions; optionally, the position mapping relationship may be a relationship mapping table. When the electronic device obtains the current pupil center position, it reads the position mapping relationship locally and, based on it, obtains the first gaze position corresponding to the current pupil center position. Because the position mapping relationship is stored locally on the electronic device, the first gaze position corresponding to the current pupil center position can be obtained quickly once the current pupil center position is obtained.
In other embodiments, the position mapping relationship may be stored on a server connected to the electronic device: the electronic device sends the obtained current pupil center position to the server, and the server determines the first gaze position corresponding to the current pupil center position on the display screen based on the position mapping relationship and then sends the first gaze position back to the electronic device. Storing the position mapping relationship on the server reduces its occupation of the electronic device's local memory.
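For illustration, a nearest-neighbor query against a locally stored position mapping table might look like the sketch below; the patent only specifies a relationship mapping table, so the array layout, the sample values, and the nearest-neighbor rule are all assumptions:

```python
import numpy as np

# Assumed calibration data: k pupil centers (pixels) and their gaze positions.
pupil_centers = np.array([[310.0, 240.0], [355.0, 238.0], [312.0, 280.0]])
gaze_positions = np.array([[100.0, 100.0], [900.0, 100.0], [100.0, 500.0]])

def lookup_gaze(current_center):
    """Return the gaze position whose stored pupil center is nearest."""
    d = np.linalg.norm(pupil_centers - np.asarray(current_center), axis=1)
    return gaze_positions[np.argmin(d)]
```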
Step S1030, obtaining a current offset corresponding to the current pupil radius and the first gaze location based on an offset mapping relationship, where the offset mapping relationship includes a plurality of offsets, a plurality of pupil radii, and a plurality of gaze locations.
In some embodiments, the electronic device stores a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions in advance and establishes a correspondence among them (i.e., the offset mapping relationship); optionally, the offset mapping relationship may be a relationship mapping table. When the electronic device obtains the current pupil radius and the first gaze position, it reads the offset mapping relationship locally and obtains the current offset corresponding to the current pupil radius and the first gaze position based on it. Because the offset mapping relationship is stored locally on the electronic device, the current offset can be obtained quickly once the current pupil radius and the first gaze position are obtained.
In other embodiments, the offset mapping relationship may be stored in a server connected to the electronic device, the electronic device sends the obtained current pupil radius and the first gaze location to the server, and the server determines, based on the offset mapping relationship, a current offset corresponding to the current pupil radius and the first gaze location on the display screen and then sends the current offset to the electronic device. The offset mapping relation is stored in the server, so that the occupation of the offset mapping relation on a local memory of the electronic equipment can be reduced.
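Likewise for the offset mapping relationship, a hedged sketch of a table keyed on (pupil radius, gaze position) is shown below; the row layout, sample values, scaling factor, and nearest-neighbor rule are all assumptions rather than details from the patent:

```python
import numpy as np

# Assumed offset mapping built during calibration (see steps S2030-S2070):
# each row is (pupil_radius, gaze_x, gaze_y, dx, dy).
offset_table = np.array([
    [18.0, 100.0, 100.0,  1.5, -0.8],
    [24.0, 100.0, 100.0, -2.1,  0.6],
    [18.0, 900.0, 100.0,  2.0, -1.1],
])

def lookup_offset(radius, gaze_xy, radius_scale=10.0):
    """Nearest table entry in (radius, gaze) space.

    radius_scale roughly balances radius units (pixels) against screen
    coordinates; its value here is an arbitrary assumption.
    """
    key = np.array([radius * radius_scale, gaze_xy[0], gaze_xy[1]])
    keys = np.column_stack((offset_table[:, 0] * radius_scale,
                            offset_table[:, 1], offset_table[:, 2]))
    i = np.argmin(np.linalg.norm(keys - key, axis=1))
    return offset_table[i, 3], offset_table[i, 4]  # (dx, dy)
```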
Optionally, the user's eyes may be illuminated by multiple light sources: besides the background light information of the display screen, there may also be ambient light sources, such as lamps in the environment and sunlight, and the current pupil radius is obtained under the combined action of these illuminations. The current offset obtained via the current pupil radius therefore takes the various illuminations into consideration, making the obtained current offset more accurate.
Optionally, when the electronic device is a head-mounted device for a VR experience, the illumination reaching the user's eyes is relatively uniform and comes only from the background light information of the display screen; that is, the user's pupil radius is affected only by the background light information. In this case, the offset mapping relationship may also be established directly from the background light information; that is, the established offset mapping relationship may include a plurality of offsets, a plurality of kinds of background light information, and a plurality of gaze positions, and the current offset corresponding to the current background light information and the first gaze position is obtained based on this offset mapping relationship.
Step S1040, obtaining a target pupil center position according to the current pupil center position and the current offset.
Optionally, the coordinates of the current pupil center position are (x, y), the corresponding current offset includes an offset Δ x on the horizontal axis and an offset Δ y on the vertical axis, and the obtained target pupil center position is (x + Δ x, y + Δ y).
Step S1050, determining a second gaze position corresponding to the target pupil center position on the display screen as the eyeball gaze position of the eyeball to be tracked under the current background light information.
A second gaze position corresponding to the target pupil center position is determined on the display screen based on the position mapping relationship, and this second gaze position is taken as the eyeball gaze position of the eyeball to be tracked under the current background light information.
In the eyeball tracking method provided by this embodiment, when the display screen is under the current background light information and the user gazes at the display screen, the current pupil center position and the current pupil radius of the eyeball to be tracked are obtained; it can be understood that the current pupil radius is obtained under the irradiation of the current background light information, which is equivalent to representing the current background light information by the current pupil radius. A first gaze position corresponding to the current pupil center position is then determined on the display screen. Because this first gaze position does not yet account for the influence of the current background light information on the pupil, a current offset corresponding to the current pupil radius and the first gaze position is further obtained based on an offset mapping relationship, where the offset mapping relationship comprises a correspondence between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions. A target pupil center position is obtained according to the current pupil center position and the current offset; that is, the current pupil center is shifted by the current offset to obtain the shifted target pupil center position. Finally, a second gaze position corresponding to the target pupil center position is determined on the display screen; this second gaze position is shifted relative to the first gaze position and serves as the eyeball gaze position of the eyeball to be tracked under the current background light information, so that eye tracking accuracy can be improved.
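Putting steps S1010 to S1050 together, an end-to-end sketch could look as follows; it reuses the hypothetical helpers detect_pupil, lookup_gaze, and lookup_offset sketched above and is an illustration, not the patent's implementation:

```python
def track_gaze(eye_image_gray):
    """Illustrative pipeline for steps S1010-S1050 (assumed helpers)."""
    detection = detect_pupil(eye_image_gray)        # S1010: center and radius
    if detection is None:
        return None
    (x, y), radius = detection
    first_gaze = lookup_gaze((x, y))                # S1020: first gaze position
    dx, dy = lookup_offset(radius, first_gaze)      # S1030: current offset
    target_center = (x + dx, y + dy)                # S1040: target pupil center
    return lookup_gaze(target_center)               # S1050: second gaze position
```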
On the basis of the above embodiments, this embodiment provides an eyeball tracking method in which the offset mapping relationship is established. Fig. 4 shows a flowchart of an eyeball tracking method provided in another embodiment of the present application; referring to Fig. 4, the eyeball tracking method specifically includes the following steps:
step S2010, acquiring a current pupil center position and a current pupil radius of the eyeball to be tracked when the display screen is under the current background light information.
Step S2020, determining a first gaze position corresponding to the current pupil center position on the display screen.
For details of steps S2010 to S2020, please refer to steps S1010 to S1020, which are not described herein again.
Step S2030, determining n gaze positions on the display screen.
The n gaze positions are uniformly selected on the display screen, where n may be, but is not limited to, 40, 50, and the like.
Step S2040, when the display screen displays each of the n gaze positions under each of m kinds of background light information, acquiring the m eye images corresponding to each gaze position.
For each of the n gaze positions displayed on the display screen, one eye image is collected under the irradiation of each of the m kinds of background light information, so that m eye images are collected for each gaze position and n × m eye images are collected in total for the n gaze positions.
For example, the background light information includes 32 kinds, and there are 2 gaze positions, a first gaze position and a second gaze position. At the first gaze position, one eye image is collected under the irradiation of each of the 32 kinds of background light information, for a total of 32 eye images; at the second gaze position, one eye image is likewise collected under each of the 32 kinds of background light information, for a total of 32 eye images. Thus 2 × 32 = 64 eye images are acquired at the first and second gaze positions. The background light information is not limited to the 32 kinds described above and may be more or fewer, for example, 16 kinds or 60 kinds.
The m kinds of background light information comprise m kinds of gray-scale information and/or m kinds of brightness information. Optionally, the brightness of the m kinds of gray-scale information increases or decreases monotonically, so that the collected pupil radii change monotonically and eye images with the same pupil radius are avoided; likewise, the brightness of the m kinds of brightness information increases or decreases monotonically, so that the collected pupil radii change monotonically and eye images with the same pupil radius are avoided.
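A hedged sketch of the collection loop of steps S2030-S2040 is given below; display.show and camera.capture are assumed placeholder interfaces, not APIs from the patent:

```python
def collect_calibration_images(display, camera, gaze_positions, backlights):
    """Capture one eye image per (gaze position, background light) pair.

    Returns a dict keyed by (gaze index, backlight index), giving the
    n * m eye images described in step S2040.
    """
    images = {}
    for i, position in enumerate(gaze_positions):   # n gaze positions
        for j, light in enumerate(backlights):      # m kinds of background light
            display.show(marker_at=position, backlight=light)  # assumed API
            images[(i, j)] = camera.capture()                  # assumed API
    return images
```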
Step S2050, obtaining m offsets and m pupil radii according to the m eye images corresponding to each gaze position.
Referring to Fig. 5, step S2050 includes the following sub-steps:
step S2051, obtaining m pupil center positions and m pupil radii according to the m eye images corresponding to each fixation position.
According to a preset image processing algorithm, the m pupil center positions and m pupil radii are obtained from the m eye images corresponding to each gaze position, where the preset image processing algorithm may be a thresholding method or a contour method.
Step S2052, determining one eye image as a reference image from the m eye images corresponding to each gaze position.
Optionally, any one of the m eye images corresponding to each gaze position may be determined as the reference image. For example, the first eye image among the m eye images may be selected as the reference image, where the first eye image is the eye image corresponding to the maximum gray-scale information and/or maximum brightness information, or the eye image corresponding to the minimum gray-scale information and/or minimum brightness information.
Step S2053, calculating the difference between each of the m pupil center positions corresponding to each gaze position and the pupil center position corresponding to the reference image, to obtain the m offsets corresponding to each gaze position.
The m eye images corresponding to each gaze position are collected under m different kinds of background light information, i.e., while gazing at the same position. The pupil center position in the reference image is taken to correspond to that gaze position, while the pupil center positions in the remaining m-1 eye images differ from the pupil center position in the reference image. Therefore, the differences between the pupil center positions of the m eye images (the reference image included, whose offset is zero) and the pupil center position of the reference image are calculated to obtain the m offsets. Shifting the pupil center position in each remaining eye image by its offset moves it to the pupil center position corresponding to the reference image, so the gaze position corresponding to the shifted pupil center position on the display screen can then be obtained.
The m pupil center positions corresponding to each gaze position include the pupil center position corresponding to the reference image.
Illustratively, if the reference image corresponds to a pupil center position (x1, y1) and another eye image corresponds to a pupil center position (x2, y2), then the offset on the horizontal axis is (x1 - x2) and the offset on the vertical axis is (y1 - y2).
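A minimal sketch of this difference computation (step S2053), assuming the m pupil centers for one gaze position are stacked in an (m, 2) array and the reference index is known:

```python
import numpy as np

def offsets_for_gaze_position(centers, ref_index=0):
    """Offsets of m pupil centers relative to the reference image's center.

    Row i holds (x_ref - x_i, y_ref - y_i); the reference row is (0, 0),
    matching the (x1 - x2, y1 - y2) example above.
    """
    centers = np.asarray(centers, dtype=float)
    return centers[ref_index] - centers
```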
Step S2060, obtaining n × m offsets and n × m pupil radii according to the n gaze locations and the m offsets and m pupil radii corresponding to each gaze location.
Each gaze position corresponds to m offsets and m pupil radii, so the n gaze positions correspond to n × m offsets and n × m pupil radii.
Step S2070, establishing the offset mapping relationship according to the n × m offsets, the n × m pupil radii, and the n gaze positions.
A correspondence among the n × m offsets, the n × m pupil radii, and the n gaze positions is established, namely the offset mapping relationship. Optionally, the offset mapping relationship may be stored locally on the electronic device, which is convenient for the electronic device to call, or sent to a server connected to the electronic device to reduce its occupation of the electronic device's storage space.
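For illustration, assembling the offset mapping relationship from the calibration images could look like the sketch below; it reuses the hypothetical detect_pupil, the image dict from the collection sketch, and offsets_for_gaze_position from the earlier sketches, and assumes every pupil detection succeeds:

```python
import numpy as np

def build_offset_mapping(images, gaze_positions):
    """Build n * m rows of (pupil_radius, gaze_x, gaze_y, dx, dy).

    `images[(i, j)]` is the eye image for gaze position i under background
    light j, as produced by the collection sketch above (assumed layout).
    """
    m = 1 + max(j for (_, j) in images)
    rows = []
    for i, (gx, gy) in enumerate(gaze_positions):
        detections = [detect_pupil(images[(i, j)]) for j in range(m)]
        centers = np.array([center for center, _ in detections])
        radii = [radius for _, radius in detections]
        offsets = offsets_for_gaze_position(centers, ref_index=0)
        for j in range(m):
            rows.append([radii[j], gx, gy, offsets[j, 0], offsets[j, 1]])
    return np.array(rows)  # the offset mapping relationship (step S2070)
```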
Step S2080, obtaining a current offset corresponding to the current pupil radius and the first gaze location based on an offset mapping relationship, wherein the offset mapping relationship comprises a plurality of offsets, a plurality of pupil radii, and a plurality of gaze locations.
Step S2090, obtaining a target pupil center position according to the current pupil center position and the current offset.
Step S2100, determining a second gaze location corresponding to the target pupil center location on the display screen, as an eyeball gaze location of the eyeball to be tracked under the current background light information.
For detailed descriptions of steps S2080 to S2100, refer to steps S1030 to S1050, which are not described herein again.
In this embodiment, a correspondence among a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions is established and stored, which facilitates invocation by the electronic device.
Optionally, the background light information includes gray-scale information. Fig. 6 shows a flowchart of an eyeball tracking method according to another embodiment of the present application; referring to Fig. 6, the eyeball tracking method may specifically include the following steps:
and S3010, acquiring the current pupil center position and the current pupil radius of the eyeball to be tracked when the display screen is under the current background light information.
Step S3020, determining a first gaze position corresponding to the current pupil center position on the display screen.
Step S3030, determining n gaze positions on the display screen.
For detailed description of S3010 to S3030, refer to step S2010 to step S2030, which are not described herein again.
Step S3040, obtaining a gray scale range corresponding to the display screen, and determining m kinds of gray scale information in the gray scale range.
For example, the gray-scale range of the display screen is 0-255, and m kinds of gray-scale information are determined within this range; for example, 256 kinds of gray-scale information can be determined. Each kind of gray-scale information is produced by adjusting the screen background brightness of the display screen or a Light Emitting Diode (LED) of the electronic device.
Optionally, the background light information further includes brightness information; a brightness range corresponding to the display screen is obtained, and m kinds of brightness information are determined within the brightness range.
For example, the brightness range of the display screen is 0-100, and m kinds of brightness information are determined within this range; for example, 100 kinds of brightness information can be determined. Each kind of brightness information is produced by adjusting the screen background brightness of the display screen or the LED of the electronic device. The brightness range of the display screen is determined by factors such as the LED lamp of the electronic device and the material of the display screen; therefore, it is not limited to 0-100 and may also be 0-120, 0-60, and the like.
Alternatively, the m kinds of background light information include m kinds of gray-scale information and m kinds of brightness information, where the brightness of the m kinds of gray-scale information gradually increases or decreases, the brightness of the m kinds of brightness information gradually increases or decreases, and the m kinds of gray-scale information correspond one-to-one with the m kinds of brightness information, as shown in Table 1, where m may be 8, 16, 32, or 255.
TABLE 1
(Table 1 appears in the original as an image and is not reproduced; it lists the m gray-scale levels alongside their one-to-one corresponding brightness levels.)
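A hedged sketch of generating m monotonically increasing background light levels, pairing gray-scale and brightness one-to-one as Table 1 describes; the ranges follow the 0-255 and 0-100 examples above:

```python
import numpy as np

def backlight_levels(m, gray_range=(0, 255), brightness_range=(0, 100)):
    """Return m (gray, brightness) pairs, both increasing monotonically."""
    grays = np.linspace(*gray_range, num=m).round().astype(int)
    brightnesses = np.linspace(*brightness_range, num=m)
    return list(zip(grays, brightnesses))
```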
Step S3050, when the display screen displays each of the n gaze positions under each of the m kinds of background light information, acquiring the m eye images corresponding to each gaze position.
Step S3060, obtaining m offsets and m pupil radii from the m eye images corresponding to each gaze position.
Step S3070, obtaining n × m offsets and n × m pupil radii according to the n gaze positions and the m offsets and m pupil radii corresponding to each gaze position.
Step S3080, establishing the offset mapping relationship according to the n × m offsets, the n × m pupil radii, and the n gaze positions.
Step S3090, obtaining a current offset corresponding to the current pupil radius and the first gaze location based on an offset mapping relationship, wherein the offset mapping relationship includes a plurality of offsets, a plurality of pupil radii, and a plurality of gaze locations.
Step S3100, obtaining the target pupil center position according to the current pupil center position and the current offset.
Step S3110, determining a second gaze position corresponding to the target pupil center position on the display screen as the eyeball gaze position of the eyeball to be tracked under the current background light information.
For the detailed description of step S3050-step S3110, refer to step S2050-step S2100, which is not described herein again.
Optionally, Fig. 7 shows a schematic flowchart of an eyeball tracking method according to still another embodiment of the present application; referring to Fig. 7, the eyeball tracking method specifically includes the following steps:
and S4010, acquiring the current pupil center position and the current pupil radius of the eyeball to be tracked when the display screen is under the current background light information.
Step S4020, determining a first gaze position corresponding to the current pupil center position on the display screen.
Step S4030, obtaining a current offset amount corresponding to the current pupil radius and the first gaze location based on an offset mapping relationship, where the offset mapping relationship includes a plurality of offset amounts, a plurality of pupil radii, and a plurality of gaze locations.
Step S4040, obtaining the target pupil center position according to the current pupil center position and the current offset.
Step S4050, determining a second gaze position corresponding to the target pupil center position on the display screen, and taking the second gaze position as the eyeball gaze position of the eyeball to be tracked under the current background light information.
For detailed description of steps S4010 to S4050, please refer to steps S1010 to S1050, which are not described herein again.
Step S4060, when the background light information of the display screen changes, acquiring the changed pupil center position and the changed pupil radius of the eyeball to be tracked.
Step S4070, determining a third gaze position corresponding to the changed pupil center position on the display screen.
Step S4080, based on the offset mapping relationship, obtaining a changed offset amount corresponding to the changed pupil radius and the third gaze position.
Step S4090, obtaining the changed target pupil center position according to the changed pupil center position and the changed offset.
Step S4100, determining a fourth gaze position corresponding to the changed target pupil center position on the display screen, as an eyeball gaze position of the eyeball to be tracked under the changed background light information.
For the detailed description of step S4060 to step S4100, please refer to step S1010 to step S1050, which is not described herein again.
In this embodiment, when the background light changes, the eyeball gaze position can be continuously tracked.
To implement the above method embodiments, this embodiment provides an eyeball tracking apparatus applied to an electronic device that includes a display screen. Fig. 8 shows a block diagram of the eyeball tracking apparatus according to an embodiment of the present application; referring to Fig. 8, the eyeball tracking apparatus 300 includes: a current pupil center position acquisition module 310, a first gaze position determination module 320, a shift module 330, a target pupil center position acquisition module 340, and a second gaze position determination module 350.
A current pupil center position obtaining module 310, configured to obtain a current pupil center position and a current pupil radius of an eyeball to be tracked when the display screen is in the current background light information;
a first gaze location determining module 320, configured to determine a first gaze location corresponding to the current pupil center location on the display screen;
a shift module 330, configured to obtain a current offset corresponding to the current pupil radius and the first gaze position based on the offset mapping relationship, where the offset mapping relationship includes a correspondence between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions;
a target pupil center position obtaining module 340, configured to obtain a target pupil center position according to the current pupil center position and the current offset;
a second gaze location determining module 350, configured to determine, on the display screen, a second gaze location corresponding to the target pupil center location, as an eyeball gaze location of the eyeball to be tracked under the current background light information.
Optionally, the eye tracking apparatus 300 further comprises: the device comprises a fixation position determining module, an eye opening image acquisition module, a first offset obtaining module, a second offset obtaining module and a mapping establishing module.
The gaze position determining module is used for determining n gaze positions on the display screen;
the eye-opening image acquisition module is used for acquiring m eye images corresponding to each fixation position when the display screen is respectively positioned at each fixation position in m kinds of background light information display n fixation positions;
a first offset obtaining module, configured to obtain m offsets and m pupil radii according to the m eye images corresponding to each gaze position;
a second offset obtaining module, configured to obtain n × m offsets and n × m pupil radii according to the n gaze positions and the m offsets and m pupil radii corresponding to each gaze position;
and the mapping establishing module is used for establishing the offset mapping relationship according to the n × m offsets, the n × m pupil radii, and the n gaze positions.
Optionally, the background light information includes gray-scale information, and the eyeball tracking apparatus 300 further includes a gray-scale information determining module.
The gray-scale information determining module is used for acquiring a gray-scale range corresponding to the display screen and determining m kinds of gray-scale information in the gray-scale range.
Optionally, the background light information includes brightness information, and the eyeball tracking apparatus 300 further includes a brightness information determining module.
The brightness information determining module is used for acquiring a brightness range corresponding to the display screen and determining m kinds of brightness information in the brightness range.
Optionally, the first offset obtaining module includes: the pupil center position acquisition sub-module, the reference image determination sub-module and the offset calculation sub-module.
The pupil center position acquisition sub-module is used for acquiring m pupil center positions and m pupil radii according to the m eye images corresponding to each gaze position;
a reference image determining submodule for determining one eye image as a reference image from the m eye images corresponding to each gaze position;
and the offset calculation sub-module is used for calculating the difference between the m pupil center positions corresponding to each gaze position and the pupil center position corresponding to the reference image, to obtain the m offsets corresponding to each gaze position.
Optionally, the eyeball tracking apparatus 300 further comprises: a changed pupil center position acquisition module, a third gaze position acquisition module, a changed offset obtaining module, a changed target pupil center position acquisition module, and a fourth gaze position acquisition module.
The changed pupil center position acquisition module is used for acquiring the changed pupil center position and the changed pupil radius of the eyeball to be tracked when the background light information of the display screen is changed;
the third fixation position acquisition module is used for determining a third fixation position corresponding to the changed pupil center position on the display screen;
a changed offset obtaining module, configured to obtain a changed offset corresponding to the changed pupil radius and the third gaze location based on the offset mapping relationship;
the changed target pupil center position acquisition module is used for acquiring a changed target pupil center position according to the changed pupil center position and the changed offset;
and the fourth gaze position acquisition module is used for determining a fourth gaze position corresponding to the changed target pupil center position on the display screen, and the fourth gaze position is used as the eyeball gaze position of the eyeball to be tracked under the changed background light information.
Optionally, the first gaze location determination module comprises: a first gaze location determination submodule.
And the first gaze position determining sub-module is used for determining the first gaze position corresponding to the current pupil center position on the display screen based on a position mapping relationship, where the position mapping relationship comprises a correspondence between a plurality of pupil center positions and a plurality of gaze positions.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
Fig. 9 shows a block diagram of an electronic device 100 for executing an eyeball tracking method according to an embodiment of the present application. The electronic device 100 may be a smartphone, a tablet computer, an electronic book, or another electronic device capable of running an application. The electronic device 100 in the present application may include one or more of the following components: a processor 110, a memory 150, and one or more applications, where the one or more applications may be stored in the memory 150 and configured to be executed by the one or more processors 110 to perform the methods described in the foregoing method embodiments.
Processor 110 may include one or more processing cores. The processor 110 connects various parts throughout the electronic device 100 using various interfaces and lines, and performs various functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 150 and calling data stored in the memory 150. Optionally, the processor 110 may be implemented in hardware using at least one of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like, where the CPU mainly handles the operating system, the user interface, applications, and so on; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It is understood that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 150 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 150 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 150 may include a stored-program area and a stored-data area, where the stored-program area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the various method embodiments described herein, and the like. The stored-data area may store data created by the electronic device 100 in use (such as historical profiles) and the like.
Fig. 10 shows a storage unit for storing or carrying program code implementing an eyeball tracking method according to an embodiment of the present application, i.e., a block diagram of a computer-readable storage medium provided by an embodiment of the present application. The computer-readable medium 400 stores program code that can be called by a processor to execute the methods described in the above method embodiments.
The computer-readable storage medium 400 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 400 includes a non-volatile computer-readable storage medium. The computer-readable storage medium 400 has storage space for program code 410 that performs any of the method steps described above. The program code can be read from or written to one or more computer program products. The program code 410 may be compressed, for example, in a suitable form.
To sum up, according to the eyeball tracking method and apparatus, the electronic device, and the storage medium provided by the present application, when the display screen is under the current background light information and the user gazes at the display screen, the current pupil center position and the current pupil radius of the eyeball to be tracked are obtained; it can be understood that the current pupil radius is obtained under the irradiation of the current background light information, which is equivalent to representing the current background light information by the current pupil radius. A first gaze position corresponding to the current pupil center position is then determined on the display screen. Because this first gaze position does not yet account for the influence of the current background light information on the pupil, a current offset corresponding to the current pupil radius and the first gaze position is further obtained based on an offset mapping relationship, where the offset mapping relationship comprises a correspondence between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions. A target pupil center position is obtained according to the current pupil center position and the current offset; that is, the current pupil center is shifted by the current offset to obtain the shifted target pupil center position. Finally, a second gaze position corresponding to the target pupil center position is determined on the display screen; this second gaze position is shifted relative to the first gaze position and serves as the eyeball gaze position of the eyeball to be tracked under the current background light information, so that eye tracking accuracy can be improved.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. An eyeball tracking method applied to an electronic device, wherein the electronic device comprises a display screen, and the method comprises the following steps:
acquiring the current pupil center position and the current pupil radius of an eyeball to be tracked when the display screen is under the current background light information;
determining a first gaze position corresponding to the current pupil center position on the display screen;
obtaining a current offset corresponding to the current pupil radius and the first gaze position based on an offset mapping relationship, wherein the offset mapping relationship comprises a correspondence between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions;
obtaining a target pupil center position according to the current pupil center position and the current offset;
and determining a second gaze position corresponding to the target pupil center position on the display screen as the eyeball gaze position of the eyeball to be tracked under the current background light information.
2. The method of claim 1, wherein before the obtaining of the current offset corresponding to the current pupil radius and the first gaze position based on the offset mapping relationship, the method further comprises:
determining n gaze positions on the display screen;
when the display screen displays each of the n gaze positions under each of m kinds of background light information, acquiring m eye images corresponding to each gaze position;
obtaining m offsets and m pupil radii according to the m eye images corresponding to each gaze position;
obtaining n × m offsets and n × m pupil radii according to the n gaze positions and the m offsets and m pupil radii corresponding to each gaze position;
and establishing the offset mapping relationship according to the n × m offsets, the n × m pupil radii, and the n gaze positions.
3. The method according to claim 2, wherein the background light information comprises gray-scale information, and before acquiring the m eye images corresponding to each gaze position when the display screen displays each of the n gaze positions under each of the m kinds of background light information, the method further comprises:
and acquiring a gray scale range corresponding to the display screen, and determining m kinds of gray scale information in the gray scale range.
4. The method according to claim 2, wherein the background light information comprises brightness information, and before acquiring the m eye images corresponding to each gaze position when the display screen displays each of the n gaze positions under each of the m kinds of background light information, the method further comprises:
and acquiring a brightness range corresponding to the display screen, and determining m kinds of brightness information in the brightness range.
5. The method according to claim 2, wherein the obtaining of m offsets and m pupil radii according to the m eye images corresponding to each gaze position comprises:
obtaining m pupil center positions and m pupil radii according to the m eye images corresponding to each gaze position;
determining an eye image as a reference image from the m eye images corresponding to each fixation position;
and calculating the difference between the m pupil center positions corresponding to each gaze position and the pupil center position corresponding to the reference image to obtain m offset values corresponding to each gaze position.
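Per gaze position, claim 5 reduces to a vector difference against the reference image's pupil center. A sketch, with the choice of reference index as an assumption:

    import numpy as np

    def offsets_from_centers(pupil_centers, ref_index=0):
        # pupil_centers: m pupil centers measured for one gaze position,
        # shape (m, 2). Each offset is the difference between a measured
        # center and the center found in the chosen reference image.
        centers = np.asarray(pupil_centers, dtype=float)
        return centers - centers[ref_index]          # shape (m, 2): the m offsets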
6. The method according to any one of claims 1 to 5, wherein after determining the second gaze position on the display screen corresponding to the target pupil center position as the eyeball gaze position of the eyeball to be tracked under the current background light information, the method further comprises:
when the background light information of the display screen changes, acquiring a changed pupil center position and a changed pupil radius of the eyeball to be tracked;
determining a third gaze position on the display screen corresponding to the changed pupil center position;
obtaining a changed offset corresponding to the changed pupil radius and the third gaze position based on the offset mapping relationship;
obtaining a changed target pupil center position according to the changed pupil center position and the changed offset;
and determining a fourth gaze position on the display screen corresponding to the changed target pupil center position, as the eyeball gaze position of the eyeball to be tracked under the changed background light information.
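Claim 6 simply re-runs the claim-1 pipeline on fresh measurements once the background light changes; the calibrated mappings themselves do not change. A sketch reusing estimate_gaze from the claim-1 sketch above, with measure_pupil as a hypothetical acquisition hook:

    def on_backlight_change(measure_pupil, position_map, offset_map):
        # Fresh measurements after the change; the mappings are unchanged.
        changed_center, changed_radius = measure_pupil()
        return estimate_gaze(changed_center, changed_radius, position_map, offset_map)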
7. The method according to any one of claims 1 to 5, wherein determining the first gaze position on the display screen corresponding to the current pupil center position comprises:
determining the first gaze position on the display screen corresponding to the current pupil center position based on a position mapping relationship, wherein the position mapping relationship comprises correspondences between a plurality of pupil center positions and a plurality of gaze positions.
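Claim 7 only requires stored correspondences between pupil center positions and gaze positions; the interpolation scheme is left open. One common realization (an assumption here, not something the patent prescribes) is a least-squares second-order polynomial mapping fitted to calibration pairs:

    import numpy as np

    def fit_position_map(pupil_centers, gaze_positions):
        c = np.asarray(pupil_centers, dtype=float)    # (k, 2) pupil centers, image coords
        g = np.asarray(gaze_positions, dtype=float)   # (k, 2) gaze positions, screen coords
        x, y = c[:, 0], c[:, 1]
        A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
        coeff, *_ = np.linalg.lstsq(A, g, rcond=None)  # (6, 2) least-squares fit

        def position_map(center):
            px, py = float(center[0]), float(center[1])
            basis = np.array([1.0, px, py, px * py, px**2, py**2])
            return basis @ coeff                       # predicted gaze position (x, y)
        return position_map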
8. An eye tracking apparatus, applied to an electronic device including a display screen, the apparatus comprising:
the current pupil center position acquisition module is used for acquiring the current pupil center position and the current pupil radius of the eyeball to be tracked when the display screen is under the current background light information;
the first gaze position determining module is used for determining a first gaze position corresponding to the current pupil center position on the display screen;
an offset obtaining module, configured to obtain a current offset corresponding to the current pupil radius and the first gaze position based on an offset mapping relationship, wherein the offset mapping relationship comprises correspondences between a plurality of offsets, a plurality of pupil radii, and a plurality of gaze positions;
a target pupil center position obtaining module, configured to obtain a target pupil center position according to the current pupil center position and the current offset;
and the second gaze position determining module is used for determining a second gaze position corresponding to the target pupil center position on the display screen, and the second gaze position is used as the eyeball gaze position of the eyeball to be tracked under the current background light information.
9. An electronic device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 7.
CN202110482099.3A 2021-04-30 2021-04-30 Eyeball tracking method and device, electronic equipment and storage medium Active CN113208558B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110482099.3A CN113208558B (en) 2021-04-30 2021-04-30 Eyeball tracking method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110482099.3A CN113208558B (en) 2021-04-30 2021-04-30 Eyeball tracking method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113208558A true CN113208558A (en) 2021-08-06
CN113208558B CN113208558B (en) 2022-10-21

Family

ID=77090689

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110482099.3A Active CN113208558B (en) 2021-04-30 2021-04-30 Eyeball tracking method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113208558B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5036347A (en) * 1988-08-31 1991-07-30 Canon Kabushiki Kaisha Visual line detecting device and camera having the same
US4995716A (en) * 1989-03-09 1991-02-26 Par Technology Corporation Method and apparatus for obtaining the topography of an object
JPH06133931A (en) * 1992-10-26 1994-05-17 Nikon Corp Instrument for measuring position of visual line
EP0631222A1 (en) * 1993-06-21 1994-12-28 International Business Machines Corporation Gazing point estimation device
GB9616190D0 (en) * 1996-08-01 1996-09-11 Sharp Kk Eye detection system
JP2005312605A (en) * 2004-04-28 2005-11-10 Ditect:Kk Gaze position display device
US20140300538A1 (en) * 2013-04-08 2014-10-09 Cogisen S.R.L. Method for gaze tracking
US20170184847A1 (en) * 2015-12-28 2017-06-29 Oculus Vr, Llc Determining interpupillary distance and eye relief of a user wearing a head-mounted display
US20180232507A1 (en) * 2016-11-08 2018-08-16 Aerendir Mobile Inc. Unique patterns extracted from involuntary eye motions to identify individuals
US20190076014A1 (en) * 2017-09-08 2019-03-14 Tobii Ab Pupil radius compensation
CN109472189A (en) * 2017-09-08 2019-03-15 托比股份公司 Pupil radius compensation
US20200192473A1 (en) * 2018-10-31 2020-06-18 Tobii Ab Gaze tracking using mapping of pupil center position
CN111539984A (en) * 2018-12-21 2020-08-14 托比股份公司 Continuous calibration based on pupil characteristics
CN109656373A (en) * 2019-01-02 2019-04-19 京东方科技集团股份有限公司 Gaze point positioning method and positioning device, display device and storage medium
CN110245601A (en) * 2019-06-11 2019-09-17 Oppo广东移动通信有限公司 Eyeball tracking method and related product
CN111012301A (en) * 2019-12-19 2020-04-17 北京理工大学 Head-mounted visual accurate aiming system
CN111857333A (en) * 2020-06-29 2020-10-30 维沃移动通信有限公司 Eye movement tracking method and device and electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JUSTIN WANEK et al.: "Representing the retinal line spread shape with mathematical functions", Journal of Zhejiang University (Science B: An International Biomedicine & Biotechnology Journal) *
NIU Xiaoxia: "Effects of eyeball rotation and pupil center displacement on corneal refractive surgery", Journal of Clinical Ophthalmology *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114911445A (en) * 2022-05-16 2022-08-16 歌尔股份有限公司 Display control method of virtual reality device, and storage medium

Also Published As

Publication number Publication date
CN113208558B (en) 2022-10-21

Similar Documents

Publication Publication Date Title
CN113729611B (en) Eye tracking using center position of eyeball
CN108700933B (en) Wearable device capable of eye tracking
CN107077751B (en) Virtual fitting method and device for contact lenses and computer program for implementing method
US10739851B2 (en) Eye-tracking enabled wearable devices
US9405365B2 (en) Systems and methods for identifying gaze tracking scene reference locations
EP3453317B1 (en) Pupil radius compensation
US20080137909A1 (en) Method and apparatus for tracking gaze position
CN113208558B (en) Eyeball tracking method and device, electronic equipment and storage medium
CN116452530A (en) Eye movement tracking method and eye movement tracking device
CN106061054B (en) A kind of information processing method and electronic equipment
CN218413524U (en) Eye tracking device and intelligent glasses
US20240103636A1 (en) Methods for manipulating a virtual object
CN114578940A (en) Control method and device and electronic equipment
CN117555631A (en) Image display method, device, near-eye display equipment and storage medium
CN117275420A (en) Method and device for controlling backlight brightness
CN117784382A (en) Display equipment for preventing and controlling myopia

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant