CN113780414A - Eye movement behavior analysis method, image rendering method, component, device and medium


Info

Publication number
CN113780414A
CN113780414A CN202111062599.8A
Authority
CN
China
Prior art keywords
eye movement
movement behavior
eye
display screen
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111062599.8A
Other languages
Chinese (zh)
Inventor
闫桂新
孙建康
陈丽莉
张浩
董学
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202111062599.8A
Publication of CN113780414A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present disclosure provides an eye movement behavior analysis method applied to a near-eye display device, including: determining a fixation point of a target human eye on a display screen; acquiring a moving speed curve of a fixation point in a preset time period; determining at least one eye movement behavior type corresponding to the target human eye within a preset time period according to the moving speed curve, and acquiring eye movement behavior data corresponding to each eye movement behavior type according to the moving speed curve; and analyzing and calculating the eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics of the target human eyes. The present disclosure also provides an image rendering method, an electronic assembly, a near-eye display device and a computer readable medium.

Description

Eye movement behavior analysis method, image rendering method, component, device and medium
Technical Field
The present disclosure relates to the field of near-eye display technologies, and in particular, to an eye movement behavior analysis method, an image rendering method, an electronic component, a near-eye display device, and a computer-readable medium.
Background
With the continuous progress of electronic technology, virtual reality (VR) and augmented reality (AR) technologies have been increasingly applied in daily life.
At the present stage, near-eye display devices using virtual reality or augmented reality technology suffer from a low picture refresh rate and a large picture delay, so a viewer cannot obtain a smooth viewing experience.
Disclosure of Invention
The present disclosure is directed to at least one of the technical problems in the prior art, and provides an eye movement behavior analysis method, an image rendering method, an electronic component, a near-eye display device, and a computer readable medium.
To achieve the above object, in a first aspect, an embodiment of the present disclosure provides an eye movement behavior analysis method applied to a near-eye display device, the method including:
determining a fixation point of a target human eye on a display screen;
acquiring a moving speed curve of the fixation point in a preset time period;
determining at least one eye movement behavior type corresponding to the target human eyes in the preset time period according to the moving speed curve, and acquiring eye movement behavior data corresponding to each eye movement behavior type according to the moving speed curve;
and analyzing and calculating the eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics of the target human eyes.
In some embodiments, the determining, according to the moving speed profile, at least one eye movement behavior type corresponding to the target human eye within the preset time period includes:
for the moving speed curve, if a rate is less than or equal to a preset first rate threshold, determining that the eye movement behavior type comprises a fixation type;
if a rate is greater than the first rate threshold and less than a preset second rate threshold, determining that the eye movement behavior type comprises a fixation movement type, wherein the first rate threshold is less than the second rate threshold;
if a rate greater than or equal to the second rate threshold exists, determining that the eye movement behavior type comprises a saccade type.
In some embodiments, the eye movement behavior data comprises: frequency and duration of eye movement activity;
the analyzing and calculating based on the eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics of the target human eyes comprises the following steps:
determining a weight coefficient corresponding to each eye movement behavior type according to the eye movement behavior frequency corresponding to each eye movement behavior type;
and analyzing and calculating based on the weight coefficient, the eye movement behavior frequency and the duration corresponding to each eye movement behavior type to generate the eye movement behavior characteristics.
In some embodiments, the determining the gaze point of the target human eye on the display screen includes:
acquiring a human eye side image including the target human eye;
and detecting the pupil state according to the image at the human eye side to determine the fixation point.
In some embodiments, the analyzing and calculating based on the eye movement behavior data corresponding to each of the eye movement behavior types to generate the eye movement behavior characteristics of the target human eyes includes:
analyzing the human eye side image to obtain human eye characteristics of the target human eye;
responding to the condition that the human eye characteristics are not stored in the database, performing analysis and calculation based on the eye movement behavior data corresponding to each eye movement behavior type, generating the eye movement behavior characteristics, and storing the human eye characteristics and the eye movement behavior characteristics into the database;
responding to the fact that the human eye features are stored in the database, and acquiring historical eye movement behavior data corresponding to the target human eyes from the database; analyzing and calculating based on historical eye movement behavior data and eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics; updating the human eye features and the eye movement behavior features into the database.
In some embodiments, the eye movement behavior feature is an eye movement behavior feature under a current scene type;
the method further comprises the following steps:
acquiring a display screen side image comprising the display screen;
and matching the display screen side image with a preset scene template, and determining the current scene type according to the matched scene template.
In some embodiments, the near-eye display device comprises an image capture assembly, a lens assembly, and the display screen; the image projected by the display screen reaches the target human eyes after being shaped by the lens assembly; the human eye side image and the display screen side image are acquired by the image acquisition assembly;
before the display screen side image is matched with a preset scene template, the method further comprises the following steps:
extracting a display screen area from the display screen side image based on the brightness;
correcting the display screen area to obtain the display screen area in a preset shape;
the matching of the display screen side image and a preset scene template comprises the following steps:
and matching the display screen area in a preset shape with the scene template.
In some embodiments, the near-eye display device further comprises a light reflecting component located on the same side as the target human eye, the light reflecting component configured to reflect the image projected by the display screen after being shaped by the lens component; the human eye side image and the display screen side image are the same image, and the image acquisition assembly shoots the target human eyes and the reflective assembly;
the extracting of the display screen area from the display screen side image based on the brightness comprises:
and extracting a light reflecting component area from the display screen side image based on the brightness, and taking the light reflecting component area as the display screen area.
In a second aspect, an embodiment of the present disclosure further provides an image rendering method applied to a near-eye display device, where the method includes:
determining a fixation point of a target human eye on a display screen;
acquiring eye movement behavior characteristics of the target human eyes, wherein the eye movement behavior characteristics are obtained by adopting the eye movement behavior analysis method in any one of the embodiments in advance;
predicting to obtain the predicted position information of the fixation point at the next moment according to the eye movement behavior characteristics;
and rendering the image displayed by the display screen at the next moment according to the predicted position information.
In a third aspect, an embodiment of the present disclosure further provides an electronic component, including:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the eye movement behavior analysis method as in any one of the above embodiments, or the image rendering method as in the above embodiments.
In a fourth aspect, an embodiment of the present disclosure further provides a near-eye display device, including: an image acquisition assembly, a lens assembly, a display screen, and the electronic assembly of the above embodiment.
In a fifth aspect, the disclosed embodiments also provide a computer readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the steps in the eye movement behavior analysis method as in any one of the above embodiments, or implements the steps in the image rendering method as in the above embodiments.
Drawings
The accompanying drawings are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the principles of the disclosure and not to limit the disclosure. The above and other features and advantages will become more apparent to those skilled in the art by describing in detail exemplary embodiments thereof with reference to the attached drawings, in which:
fig. 1 is a flowchart of an eye movement behavior analysis method according to an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating a method of step S3 according to an embodiment of the present disclosure;
FIG. 3 is a flowchart illustrating a method for implementing step S4 according to an embodiment of the present disclosure;
fig. 4 is a flowchart of another eye movement behavior analysis method provided by the embodiments of the present disclosure;
fig. 5 is a flowchart of another eye movement behavior analysis method provided in the embodiments of the present disclosure;
fig. 6 is a flowchart of another eye movement behavior analysis method provided in the embodiments of the present disclosure;
FIG. 7 is a flowchart illustrating a method of step S02 according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a near-eye display device according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of another near-eye display device provided in an embodiment of the present disclosure;
fig. 10 is a flowchart of an image rendering method according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of an electronic assembly according to an embodiment of the disclosure;
fig. 12 is a schematic structural diagram of a computer-readable medium according to an embodiment of the present disclosure.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present disclosure, the eye movement behavior analysis method, the image rendering method, the electronic component, the near-eye display device and the computer readable medium provided in the present disclosure are described in detail below with reference to the accompanying drawings.
Example embodiments will be described more fully hereinafter with reference to the accompanying drawings, but they may be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. Thus, a first element, component, or module discussed below could be termed a second element, component, or module without departing from the teachings of the present disclosure.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Fig. 1 is a flowchart of an eye movement behavior analysis method according to an embodiment of the present disclosure. The method is applied to a near-eye display device, which includes virtual reality devices, augmented reality devices and the like; in some embodiments, the near-eye display device is a wearable device, such as VR glasses, AR glasses or another head-mounted device. As shown in fig. 1, the method includes:
and step S1, determining the fixation point of the target human eyes on the display screen.
The fixation point is the position in the image displayed on the display screen at which the target human eye is currently gazing directly; it can be determined by detecting the eye state in real time or by setting a fixed point.
In some embodiments of the near-eye display device, each of the user's two eyes corresponds to one display screen, and either eye may be taken as the target human eye. If the gazing position in the three-dimensional picture currently perceived by the target human eye is acquired, a functional relation between the space coordinate system and the display screen coordinate system can be derived, based on the display resolution and the physical size of the display screen, from the mapping between the two-dimensional and three-dimensional representations; coordinates can then be converted between the two systems and the gaze point on the display screen determined.
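As a minimal Python sketch of the final conversion step, assuming the gaze position has already been projected onto the screen plane in millimetres with its origin at the screen centre (the linear scaling below, and all names, are illustrative assumptions rather than the patent's method):

```python
import numpy as np

def gaze_to_screen_px(gaze_xy_mm, screen_w_mm, screen_h_mm,
                      res_w_px, res_h_px):
    """Convert a gaze position on the screen plane (mm, origin at the
    screen centre, y pointing down) to display pixel coordinates."""
    x_mm, y_mm = gaze_xy_mm
    # Scale by pixels-per-millimetre derived from resolution and size.
    u = x_mm * (res_w_px / screen_w_mm) + res_w_px / 2.0
    v = y_mm * (res_h_px / screen_h_mm) + res_h_px / 2.0
    # Clamp so the fixation point always lands on a valid pixel.
    return (float(np.clip(u, 0, res_w_px - 1)),
            float(np.clip(v, 0, res_h_px - 1)))
```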
And step S2, acquiring a moving speed curve of the fixation point in a preset time period.
A moving speed curve of the gaze point within a preset time period can be obtained from how the gaze point changes during that period. Specifically, the corresponding moving speed can be calculated from the time the gaze point takes to move from one position to another and the distance between the two positions. In some embodiments, at least two points may be selected within the preset time period based on the movement of the gaze point, and the moving speed curve generated from the selected points. In some embodiments, the preset time period may be a fixed value, or a value set dynamically based on changes in the gaze point during the actual analysis and calculation.
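A short sketch of how such a curve could be computed from discrete gaze samples (function and variable names are assumptions for illustration):

```python
import numpy as np

def speed_curve(points, timestamps):
    """Discrete moving-speed curve of the gaze point.

    points: (N, 2) array of fixation-point samples (pixels or degrees).
    timestamps: (N,) array of sample times in seconds.
    Returns (N-1,) speeds: inter-sample distance over elapsed time.
    """
    pts = np.asarray(points, dtype=float)
    ts = np.asarray(timestamps, dtype=float)
    dist = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    return dist / np.diff(ts)
```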
Step S3, determining at least one eye movement behavior type corresponding to the target human eye within a preset time period according to the moving speed curve, and acquiring eye movement behavior data corresponding to each eye movement behavior type according to the moving speed curve.
In line with the eye-use habits of the human body, the eyes rotate frequently to search for, gaze at, and track targets of interest; the target human eye may therefore make one or more eye movements within the preset time period. In some embodiments, the eye movement behaviors may include saccade behavior, fixation movement behavior, tremor behavior, blinking behavior, and pupil dilation or constriction behavior, each of which may correspond to one eye movement behavior type. In some embodiments, the corresponding eye movement behavior type may be determined according to the direction and/or magnitude of an instantaneous speed, or according to the linear and/or angular velocity at a given moment.
And step S4, performing analysis and calculation based on the eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics of the target human eyes.
The eye movement behavior data corresponding to each eye movement behavior type are analyzed comprehensively to generate the eye movement behavior characteristics of the target human eyes.
The embodiment of the present disclosure provides an eye movement behavior analysis method that determines the gaze point corresponding to the target human eye, obtains the moving speed curve of the gaze point within a preset time period, determines at least one eye movement behavior type and the corresponding eye movement behavior data from that curve, and performs analysis and calculation to generate the eye movement behavior characteristics of the target human eye. Because different eye movement behavior types are distinguished, more accurate eye movement behavior characteristics can be obtained.
Fig. 2 is a flowchart illustrating a specific implementation method of step S3 in the embodiment of the present disclosure. As shown in fig. 2, in step S3, the step of determining at least one eye movement behavior type corresponding to the target human eye within a preset time period according to the moving speed curve includes: step S301 to step S303.
Steps S301 to S303 are evaluated against any given rate in the moving speed curve.
Step S301, if a rate is smaller than or equal to a preset first rate threshold, determining that the eye movement behavior type comprises a fixation type.
Step S302, if a rate is larger than a first rate threshold and smaller than a preset second rate threshold, determining that the eye movement behavior type comprises a fixation movement type.
Wherein the first rate threshold is less than the second rate threshold.
Step S303, if a rate is greater than or equal to the second rate threshold, determining that the eye movement behavior type comprises a saccade type.
In this way, at least one eye movement behavior made by the target human eyes within the preset time period can be identified based on the first rate threshold and the second rate threshold, and the corresponding eye movement behavior types determined. Because the eye movement behavior types are tied to the moving rate of the gaze point, the resulting eye movement behavior characteristics can be used to predict the gaze point and pre-render the display image, improving the synchronization between the predicted position and the actual position.
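A minimal sketch of this threshold classification (the threshold values v1 and v2 are device- and unit-dependent assumptions; the description does not fix them):

```python
def classify_speeds(speeds, v1, v2):
    """Collect the eye movement behavior types present in a speed curve.

    v1 is the first rate threshold and v2 the second, with v1 < v2.
    """
    types = set()
    for v in speeds:
        if v <= v1:
            types.add("fixation")           # step S301
        elif v < v2:
            types.add("fixation_movement")  # step S302
        else:
            types.add("saccade")            # step S303
    return types
```

Combined with the speed-curve sketch above, `classify_speeds(speed_curve(pts, ts), v1, v2)` yields the behavior types present in the preset time period.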
Fig. 3 is a flowchart illustrating a specific implementation method of step S4 in the embodiment of the present disclosure. Specifically, the eye movement behavior data includes eye movement behavior frequency and duration; as shown in fig. 3, in step S4, the step of performing analysis calculation based on the eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics of the target human eye includes: step S401 and step S402.
Step S401, determining respective corresponding weight coefficients according to the eye movement behavior frequency corresponding to each eye movement behavior type.
And S402, analyzing and calculating based on the weight coefficient, the eye movement behavior frequency and the duration time corresponding to each eye movement behavior type to generate eye movement behavior characteristics.
At least one eye movement behavior type is determined according to the moving speed curve, and the eye movement behavior frequency and duration corresponding to each type are then determined. Weight coefficients for the various eye movement behavior types are determined from the eye movement behavior frequencies, and a comprehensive analysis of the weight coefficients, frequencies and durations generates the eye movement behavior characteristics of the target human eyes.
In some embodiments, the method further comprises: the weighting factors, the frequency of eye movement behavior, and the duration are initialized, for example, three types of weighting factors are each initialized to 1/3, and three types of eye movement behavior frequency and duration are each initialized to 1.
In some embodiments, an influence factor is calculated with the following formula:

factor = (w_n1·F_gaze + w_n2·F_glance + w_n3·F_move) × (w_n1·T_gaze + w_n2·T_glance + w_n3·T_move)

The eye movement behavior characteristics of the target human eyes are then generated based on the influence factor and the eye movement behavior data, and the magnitude of the influence factor is related to the precision achieved when the gaze point is predicted using those characteristics. Here, F_gaze, F_glance and F_move respectively denote the frequencies corresponding to the fixation type, the fixation movement type and the saccade type within the preset time period; T_gaze, T_glance and T_move respectively denote the durations corresponding to those types within the preset time period; and w_n1, w_n2 and w_n3 respectively denote the weight parameters corresponding to the fixation type, the fixation movement type and the saccade type.
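A direct transcription of this formula into Python (the dict-based interface is an illustrative assumption):

```python
def influence_factor(freq, dur, weights):
    """factor = (sum of w*F) * (sum of w*T) over the three types.

    freq, dur and weights are dicts keyed by behavior type, holding
    F, T and w respectively; per the initialization described above,
    weights may start at 1/3 each and frequencies/durations at 1.
    """
    order = ("fixation", "fixation_movement", "saccade")
    f_term = sum(weights[k] * freq[k] for k in order)
    t_term = sum(weights[k] * dur[k] for k in order)
    return f_term * t_term
```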
Fig. 4 is a flowchart of another eye movement behavior analysis method provided in the embodiments of the present disclosure. In particular, the method is a specific alternative implementation of the method shown in fig. 1. As shown in fig. 4, the method includes not only steps S2 to S4 but also steps S101 and S102, where steps S101 and S102 are an alternative implementation of step S1. Only steps S101 and S102 are described in detail below.
Step S101, acquiring a human eye side image including a target human eye.
The image acquisition component arranged in the near-eye display device can be used for acquiring the image at the human eye side, or receiving the image at the human eye side transmitted by the host end.
And S102, detecting the pupil state according to the image at the human eye side to determine the fixation point.
In some embodiments, the human eye side image is an eye image or an eyeball image of the user. Detecting the pupil state from the human eye side image means extracting the user's pupil area from that eye or eyeball image. Thereafter, a coordinate system may be established based on the user's pupil position and the coordinates of the pupil center mapped onto the display image on the display screen, or coordinate conversion may be performed according to a coordinate system established in the display image; the mapped point, or the point corresponding to the conversion result, is then taken as the gaze point of the human eye.
In some embodiments, the image acquisition assembly may include an infrared camera. In the display environment of the near-eye display device, the infrared camera can capture a clear picture of the human eye's pupil, and an infrared light source forms a bright glint on the cornea, so the fixation point of the human eye on the display screen can be calculated through a pupil center-corneal glint algorithm.
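A minimal sketch of the pupil-extraction stage, assuming OpenCV and an infrared eye image in which the pupil is the darkest blob; a full pupil center-corneal glint pipeline would additionally detect the corneal glint and apply a per-user calibrated mapping, both omitted here, and the threshold value is illustrative:

```python
import cv2

def pupil_center(ir_eye_img, dark_thresh=40):
    """Estimate the pupil centre of a grayscale IR eye image as the
    centroid of the largest dark blob."""
    _, mask = cv2.threshold(ir_eye_img, dark_thresh, 255,
                            cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```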
Fig. 5 is a flowchart of another eye movement behavior analysis method provided in the embodiments of the present disclosure. In particular, the method is a specific alternative implementation of the method shown in fig. 4. As shown in fig. 5, the method includes not only steps S101 to S3 but also steps S403 to S405, which are an alternative implementation of step S4. Only steps S403 to S405 are described in detail below.
And S403, analyzing the image at the human eye side to obtain the human eye characteristics of the target human eye.
Human eye features, such as iris features, can be used to distinguish different human eyes and thereby identify different persons or users.
Step S404, responding to the condition that the human eye characteristics are not stored in the database, analyzing and calculating based on the eye movement behavior data corresponding to each eye movement behavior type, generating eye movement behavior characteristics, and storing the human eye characteristics and the eye movement behavior characteristics into the database.
Step S405, responding to the fact that the human eye characteristics are stored in the database, obtaining historical eye movement behavior data corresponding to the target human eyes from the database, carrying out analysis and calculation based on the historical eye movement behavior data and the eye movement behavior data corresponding to each eye movement behavior type, generating eye movement behavior characteristics, and updating the human eye characteristics and the eye movement behavior characteristics into the database.
If the human eye features are not stored in the database, the human eye features and the eye movement behavior features are stored into a corresponding data storage area. Specifically, an identification and a data storage area corresponding to that identification are allocated to the human eye features, where the identification is the unique identifier of those features, and the human eye features and the eye movement behavior features are stored into that area. If the human eye features are already stored in the database, historical eye movement behavior data are retrieved from the data storage area corresponding to the human eye features, the analysis is performed based on the historical and current eye movement behavior data, and the resulting eye movement behavior features are updated into that storage area. For example, with multiple frames or pictures watched by the target human eyes within a preset time period as the target objects, features are superimposed based on the eye movement behavior data of each frame, so the eye movement behavior features of the same human eye are refined iteratively.
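A toy in-memory stand-in for this lookup-then-update flow (the store, the keying by eye feature, and the `analyze_fn` callback are illustrative assumptions, not the patent's database design):

```python
class EyeFeatureStore:
    """Maps a hashable human-eye feature (e.g. an iris feature code)
    to that eye's accumulated behavior data and latest feature."""

    def __init__(self):
        self._store = {}

    def analyze(self, eye_feature, behavior_data, analyze_fn):
        entry = self._store.get(eye_feature)
        if entry is None:                       # step S404: new eye
            feature = analyze_fn(behavior_data, history=None)
            entry = {"history": [], "behavior_feature": None}
            self._store[eye_feature] = entry
        else:                                   # step S405: known eye
            feature = analyze_fn(behavior_data,
                                 history=entry["history"])
        entry["history"].append(behavior_data)  # iterative refinement
        entry["behavior_feature"] = feature
        return feature
```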
Fig. 6 is a flowchart of another eye movement behavior analysis method according to an embodiment of the present disclosure. In particular, the method is a specific alternative implementation of the method shown in fig. 4. As shown in fig. 6, the method includes not only steps S101 to S4 but also steps S01 and S02. Only steps S01 and S02 are described in detail below.
And step S01, acquiring a display screen side image comprising a display screen.
The image acquisition component arranged in the near-eye display device can be used for acquiring a display screen side image, or receiving the display screen side image transmitted from a host end, or directly receiving a display picture currently displayed by the display screen transmitted from the host end.
And step S02, matching the display screen side image with a preset scene template, and determining the current scene type according to the matched scene template.
Eye movement behavior varies with the viewing scene: for example, in a reading scene the eye movement behavior is mostly fixation movement behavior, in a movie scene it is mostly fixation behavior, and in an outdoor scene it is mostly saccade behavior.
In step S02, the display screen side image is matched against the scene templates, and the current scene type is determined according to the matched scene template, for example the scene template with the largest matching degree; the eye movement behavior characteristics obtained through analysis and calculation are then the eye movement behavior characteristics under that scene type. In some embodiments, each scene template and scene type has a number, that is, a unique identifier, and the data storage area corresponding to the eye movement behavior features may be determined based on that number together with the identifier of the human eye features.
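A minimal sketch of the matching step, assuming OpenCV and grayscale templates of the same size as the rectified screen region; normalized cross-correlation is one plausible matching-degree measure, not one the description prescribes:

```python
import cv2

def match_scene(screen_region, templates):
    """Return the scene type whose template matches best, plus its
    score; `templates` maps names like "reading"/"movie"/"outdoor"
    to template images."""
    best_type, best_score = None, -1.0
    for scene_type, tpl in templates.items():
        score = float(cv2.matchTemplate(
            screen_region, tpl, cv2.TM_CCOEFF_NORMED).max())
        if score > best_score:
            best_type, best_score = scene_type, score
    return best_type, best_score
```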
It should be noted that, in the embodiment of the present disclosure, the execution order of steps S01 to S02 and steps S101 to S4 is not limited; that is, steps S01 to S02 and steps S101 to S4 may be executed one after the other in either order, or synchronously.
Fig. 7 is a flowchart illustrating a specific implementation method of step S02 in the embodiment of the present disclosure. Specifically, the near-eye display device comprises an image acquisition assembly, a lens assembly and a display screen; the image projected by the display screen reaches the target human eyes after being shaped by the lens assembly; the human eye side image and the display screen side image are acquired by the image acquisition assembly; as shown in fig. 7, the step of matching the display screen-side image with the preset scene template in step S02 includes: step S021 to step S023.
And S021, extracting a display screen area from the display screen side image based on the brightness.
The display screen side image is acquired by the image acquisition assembly within the display environment of the near-eye display device; because the display screen is much brighter than its surroundings in that environment, the captured image can be accurately segmented based on brightness, so that the display screen area is extracted.
And S022, correcting the display screen area to obtain the display screen area in a preset shape.
And S023, matching the display screen area in the preset shape with the scene template.
The display screen area in a directly acquired display screen side image is usually irregular or somewhat distorted, so it needs to be corrected, for example by affine transformation or anti-distortion operations, to obtain a preset shape, such as a rectangle, that matches each scene template.
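A sketch of the extraction-plus-correction pipeline, assuming OpenCV, a grayscale side image, and a perspective (rather than affine) correction; the threshold and output size are illustrative values:

```python
import cv2
import numpy as np

def order_corners(pts):
    """Order 4 corners as top-left, top-right, bottom-right,
    bottom-left, to pair with the destination rectangle."""
    pts = np.asarray(pts, dtype=np.float32)
    s = pts.sum(axis=1)          # x + y: min at TL, max at BR
    d = pts[:, 1] - pts[:, 0]    # y - x: min at TR, max at BL
    return np.float32([pts[np.argmin(s)], pts[np.argmin(d)],
                       pts[np.argmax(s)], pts[np.argmax(d)]])

def extract_and_rectify_screen(side_img, bright_thresh=200,
                               out_size=(320, 180)):
    """Isolate the bright display-screen area by luminance threshold
    and warp it to a preset rectangle for template matching."""
    _, mask = cv2.threshold(side_img, bright_thresh, 255,
                            cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    quad = cv2.boxPoints(cv2.minAreaRect(
        max(contours, key=cv2.contourArea)))
    w, h = out_size
    dst = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    M = cv2.getPerspectiveTransform(order_corners(quad), dst)
    return cv2.warpPerspective(side_img, M, out_size)
```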
Fig. 8 is a schematic structural diagram of a near-eye display device provided in an embodiment of the present disclosure, and fig. 9 is a schematic structural diagram of another near-eye display device provided in an embodiment of the present disclosure. As shown in fig. 8 and 9, the near-eye display device includes an image capturing assembly 1, a lens assembly 2, and a display screen 3, wherein an image projected by the display screen 3 reaches a target human eye a after being shaped by the lens assembly 2, and the image capturing assembly 1 is located above the lens assembly 2.
As shown in fig. 8, in an alternative embodiment the image capturing assembly 1 includes a first collector 101 and a second collector 102. In some embodiments, the first collector 101 is an infrared camera whose shooting direction points toward the target human eye a, and the second collector 102 is a color camera, such as an RGB camera, whose shooting direction points toward the display screen 3; the first collector 101 captures the human eye side image, and the second collector 102 captures the display screen side image.
In another alternative embodiment, as shown in fig. 9, the near-eye display device further includes a reflective component 4 located on the same side as the target human eye a and configured to reflect the image projected by the display screen 3 and shaped by the lens component 2. In this arrangement the human eye side image and the display screen side image are the same image, obtained by the image acquisition assembly 1 shooting both the target human eye and the reflective component 4; specifically, the image acquisition assembly 1 includes a first collector 101 whose shooting direction points toward the target human eye a, and in some embodiments the first collector 101 is an infrared camera.
For the structure shown in fig. 9, in step S021, extracting the display screen region from the display screen side image based on brightness includes: extracting a light reflecting assembly area from the display screen side image based on brightness, and taking the light reflecting assembly area as the display screen area.
Fig. 10 is a flowchart of an image rendering method according to an embodiment of the present disclosure. In particular, the method is applied to a near-eye display device; as shown in fig. 10, the method includes:
and step S5, determining the fixation point of the target human eyes on the display screen.
And step S6, acquiring the eye movement behavior characteristics of the target human eyes.
The eye movement behavior characteristics of the target human eyes are obtained by adopting the eye movement behavior analysis method in any one of the above embodiments in advance.
And step S7, predicting the predicted position information of the fixation point at the next moment according to the eye movement behavior characteristics.
And step S8, rendering the image displayed on the display screen at the next moment according to the predicted position information.
In some embodiments, the rendering region is the region within a predetermined shape of predetermined size whose center of symmetry is the predicted gaze point, and the image of that region is rendered.
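A minimal sketch of deriving such a rendering region (the rectangular shape and box size are illustrative assumptions):

```python
def foveated_region(pred_point, res_w, res_h, box_w=512, box_h=512):
    """Axis-aligned rendering rectangle centred on the predicted
    gaze point, clipped to the panel bounds."""
    px, py = pred_point
    x0 = max(0, int(px) - box_w // 2)
    y0 = max(0, int(py) - box_h // 2)
    x1 = min(res_w, x0 + box_w)
    y1 = min(res_h, y0 + box_h)
    return x0, y0, x1, y1
```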
Specifically, the refresh rate of a near-eye display device is not high. In existing schemes, eye movement behaviors are analyzed without distinguishing their types, so the resulting eye movement behavior characteristics carry a large error. When the gaze point position must be predicted in advance for pre-rendering, differences among eye movement behavior types, among different human eyes, and among different scenes cause the predicted gaze point to fall out of sync with the actual gaze point, leading to inaccurate pre-rendering, picture stutter, and similar problems. The image rendering method of the embodiments of the present disclosure predicts the gaze point more accurately, so the image can be pre-rendered precisely during display on the near-eye display device, improving picture smoothness.
Fig. 11 is a schematic structural diagram of an electronic assembly according to an embodiment of the present disclosure. As shown in fig. 11, the electronic component includes:
one or more processors 101;
a memory 102 on which one or more programs are stored, which when executed by the one or more processors, cause the one or more processors 101 to implement the eye movement behavior analysis method as in any one of the above embodiments, or to implement the image rendering method as in any one of the above embodiments;
one or more I/O interfaces 103 coupled between the processor and the memory and configured to enable information interaction between the processor and the memory.
The processor 101 is a device with data processing capability, and includes but is not limited to a Central Processing Unit (CPU) and the like; memory 102 is a device having data storage capabilities including, but not limited to, random access memory (RAM, more specifically SDRAM, DDR, etc.), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), FLASH memory (FLASH); an I/O interface (read/write interface) 103 is connected between the processor 101 and the memory 102, and can realize information interaction between the processor 101 and the memory 102, which includes but is not limited to a data Bus (Bus) and the like.
In some embodiments, the processor 101, memory 102, and I/O interface 103 are interconnected via a bus 104, which in turn connects with other components of the computing device.
The disclosed embodiment also provides a near-eye display device, which includes: an image acquisition assembly, a lens assembly, a display screen, and the electronic assembly of the above embodiment.
Fig. 12 is a schematic structural diagram of a computer-readable medium according to an embodiment of the present disclosure. The computer readable medium has stored thereon a computer program, wherein the computer program, when executed by a processor, implements the steps in the eye movement behavior analysis method as in any one of the above embodiments, or implements the steps in the image rendering method as in any one of the above embodiments.
It will be understood by those of ordinary skill in the art that all or some of the steps of the methods disclosed above, and functional modules/units in the apparatus, may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to those skilled in the art.
Example embodiments have been disclosed herein, and although specific terms are employed, they are used and should be interpreted in a generic and descriptive sense only and not for purposes of limitation. In some instances, features, characteristics and/or elements described in connection with a particular embodiment may be used alone or in combination with features, characteristics and/or elements described in connection with other embodiments, unless expressly stated otherwise, as would be apparent to one skilled in the art. Accordingly, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the disclosure as set forth in the appended claims.

Claims (12)

1. An eye movement behavior analysis method, wherein the method is applied to a near-eye display device, and the method comprises the following steps:
determining a fixation point of a target human eye on a display screen;
acquiring a moving speed curve of the fixation point in a preset time period;
determining at least one eye movement behavior type corresponding to the target human eyes in the preset time period according to the moving speed curve, and acquiring eye movement behavior data corresponding to each eye movement behavior type according to the moving speed curve;
and analyzing and calculating the eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics of the target human eyes.
2. The eye movement behavior analysis method according to claim 1, wherein the determining at least one eye movement behavior type corresponding to the target human eye within the preset time period according to the moving speed curve comprises:
for the moving speed curve, if a rate is less than or equal to a preset first rate threshold, determining that the eye movement behavior type comprises a fixation type;
if a rate is greater than the first rate threshold and less than a preset second rate threshold, determining that the eye movement behavior type comprises a fixation movement type, wherein the first rate threshold is less than the second rate threshold;
if a rate greater than or equal to the second rate threshold exists, determining that the eye movement behavior type comprises a saccade type.
3. The eye movement behavior analysis method according to claim 1, wherein the eye movement behavior data includes: frequency and duration of eye movement activity;
the analyzing and calculating based on the eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics of the target human eyes comprises the following steps:
determining a weight coefficient corresponding to each eye movement behavior type according to the eye movement behavior frequency corresponding to each eye movement behavior type;
and analyzing and calculating based on the weight coefficient, the eye movement behavior frequency and the duration corresponding to each eye movement behavior type to generate the eye movement behavior characteristics.
4. The eye movement behavior analysis method according to claim 1, wherein the determining a gaze point of a target human eye on a display screen comprises:
acquiring a human eye side image including the target human eye;
and detecting the pupil state according to the image at the human eye side to determine the fixation point.
5. The eye movement behavior analysis method according to claim 4, wherein the analyzing and calculating based on the eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics of the target human eyes comprises:
analyzing the human eye side image to obtain human eye characteristics of the target human eye;
responding to the condition that the human eye characteristics are not stored in the database, performing analysis and calculation based on the eye movement behavior data corresponding to each eye movement behavior type, generating the eye movement behavior characteristics, and storing the human eye characteristics and the eye movement behavior characteristics into the database;
responding to the fact that the human eye features are stored in the database, and acquiring historical eye movement behavior data corresponding to the target human eyes from the database; analyzing and calculating based on historical eye movement behavior data and eye movement behavior data corresponding to each eye movement behavior type to generate the eye movement behavior characteristics; updating the human eye features and the eye movement behavior features into the database.
6. The eye movement behavior analysis method according to claim 4, wherein the eye movement behavior feature is an eye movement behavior feature in a current scene type;
the method further comprises the following steps:
acquiring a display screen side image comprising the display screen;
and matching the display screen side image with a preset scene template, and determining the current scene type according to the matched scene template.
7. The eye movement behavior analysis method according to claim 6, wherein the near-eye display device comprises an image acquisition assembly, a lens assembly, and the display screen; the image projected by the display screen reaches the target human eyes after being shaped by the lens assembly; the human eye side image and the display screen side image are acquired by the image acquisition assembly;
before the display screen side image is matched with a preset scene template, the method further comprises the following steps:
extracting a display screen area from the display screen side image based on the brightness;
correcting the display screen area to obtain the display screen area in a preset shape;
the matching of the display screen side image and a preset scene template comprises the following steps:
and matching the display screen area in a preset shape with the scene template.
8. The eye movement behavior analysis method according to claim 7, wherein the near-eye display device further comprises a light reflecting component, the light reflecting component is located on the same side as the target human eye, and the light reflecting component is configured to reflect the image projected by the display screen and shaped by the lens component; the human eye side image and the display screen side image are the same image, and the image acquisition assembly shoots the target human eyes and the reflective assembly;
the extracting of the display screen area from the display screen side image based on the brightness comprises:
and extracting a light reflecting component area from the display screen side image based on the brightness, and taking the light reflecting component area as the display screen area.
9. An image rendering method, wherein the method is applied to a near-eye display device, the method comprising:
determining a fixation point of a target human eye on a display screen;
acquiring eye movement behavior characteristics of the target human eyes, wherein the eye movement behavior characteristics are obtained by adopting the eye movement behavior analysis method of any one of claims 1 to 8 in advance;
predicting to obtain the predicted position information of the fixation point at the next moment according to the eye movement behavior characteristics;
and rendering the image displayed by the display screen at the next moment according to the predicted position information.
10. An electronic assembly, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the eye movement behavior analysis method as claimed in any one of claims 1-8, or the image rendering method as claimed in claim 9.
11. A near-eye display device, comprising: an image capturing assembly, a lens assembly, a display screen and an electronic assembly as claimed in claim 10.
12. A computer readable medium, on which a computer program is stored, wherein the program, when being executed by a processor, carries out the steps in the eye movement behavior analysis method as claimed in any one of the claims 1 to 8, or carries out the steps in the image rendering method as claimed in claim 9.
CN202111062599.8A 2021-09-10 2021-09-10 Eye movement behavior analysis method, image rendering method, component, device and medium Pending CN113780414A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111062599.8A CN113780414A (en) 2021-09-10 2021-09-10 Eye movement behavior analysis method, image rendering method, component, device and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111062599.8A CN113780414A (en) 2021-09-10 2021-09-10 Eye movement behavior analysis method, image rendering method, component, device and medium

Publications (1)

Publication Number Publication Date
CN113780414A 2021-12-10

Family

ID=78842469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111062599.8A Pending CN113780414A (en) 2021-09-10 2021-09-10 Eye movement behavior analysis method, image rendering method, component, device and medium

Country Status (1)

Country Link
CN (1) CN113780414A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107610044A (en) * 2017-08-29 2018-01-19 歌尔科技有限公司 Image processing method, computer-readable recording medium and virtual reality helmet
US20190354174A1 (en) * 2018-05-17 2019-11-21 Sony Interactive Entertainment Inc. Eye tracking with prediction and late update to gpu for fast foveated rendering in an hmd environment
US20200104039A1 (en) * 2018-09-28 2020-04-02 Snap Inc. Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device
CN111427150A (en) * 2020-03-12 2020-07-17 华南理工大学 Eye movement signal processing method used under virtual reality head-mounted display and wearable device
CN111949131A (en) * 2020-08-17 2020-11-17 陈涛 Eye movement interaction method, system and equipment based on eye movement tracking technology
CN112400150A (en) * 2018-05-17 2021-02-23 索尼互动娱乐股份有限公司 Dynamic graphics rendering based on predicted glance landing sites
CN113362450A (en) * 2021-06-02 2021-09-07 聚好看科技股份有限公司 Three-dimensional reconstruction method, device and system
CN113362449A (en) * 2021-06-01 2021-09-07 聚好看科技股份有限公司 Three-dimensional reconstruction method, device and system


Similar Documents

Publication Publication Date Title
CN109086726B (en) Local image identification method and system based on AR intelligent glasses
US11604509B1 (en) Event camera for eye tracking
CN109558012B (en) Eyeball tracking method and device
KR102121134B1 (en) Eye-traceable wearable devices
US10241329B2 (en) Varifocal aberration compensation for near-eye displays
US20180342066A1 (en) Apparatus and method for hybrid eye tracking
CN106959759B (en) Data processing method and device
WO2016115873A1 (en) Binocular ar head-mounted display device and information display method therefor
CN108881724B (en) Image acquisition method, device, equipment and storage medium
WO2014085092A1 (en) System and method for generating 3-d plenoptic video images
KR101788452B1 (en) Apparatus and method for replaying contents using eye tracking of users
US20150309567A1 (en) Device and method for tracking gaze
CN109885169B (en) Eyeball parameter calibration and sight direction tracking method based on three-dimensional eyeball model
CN109901290B (en) Method and device for determining gazing area and wearable device
JP7081599B2 (en) Information processing equipment, information processing methods, and programs
JP2023515205A (en) Display method, device, terminal device and computer program
US11749141B2 (en) Information processing apparatus, information processing method, and recording medium
US10553164B1 (en) Display latency calibration for liquid crystal display
CN110895433A (en) Method and apparatus for user interaction in augmented reality
CN112926523B (en) Eyeball tracking method and system based on virtual reality
KR101817436B1 (en) Apparatus and method for displaying contents using electrooculogram sensors
CN112651270A (en) Gaze information determination method and apparatus, terminal device and display object
CN113780414A (en) Eye movement behavior analysis method, image rendering method, component, device and medium
CN109963143A (en) A kind of image acquiring method and system of AR glasses
CN111654688B (en) Method and equipment for acquiring target control parameters

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination