CN111311494B - Eyeball tracking and positioning accuracy determination method and related product - Google Patents


Info

Publication number
CN111311494B
Authority
CN
China
Prior art keywords
determining
screen
interpolation
precision
positioning
Prior art date
Legal status
Active
Application number
CN202010090960.7A
Other languages
Chinese (zh)
Other versions
CN111311494A (en)
Inventor
吴义孝
方攀
陈岩
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010090960.7A
Publication of CN111311494A
Application granted
Publication of CN111311494B
Active legal status (current)
Anticipated expiration legal status



Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T3/00 Geometric image transformation in the plane of the image
                    • G06T3/40 Scaling the whole image or part thereof
                        • G06T3/4007 Interpolation-based scaling, e.g. bilinear interpolation
                • G06T7/00 Image analysis
                    • G06T7/0002 Inspection of images, e.g. flaw detection
                        • G06T7/0012 Biomedical image inspection
                    • G06T7/10 Segmentation; Edge detection
                        • G06T7/11 Region-based segmentation
                • G06T2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T2207/30 Subject of image; Context of image processing
                        • G06T2207/30004 Biomedical image processing
                            • G06T2207/30041 Eye; Retina; Ophthalmic
            • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
                    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
                        • G06V40/18 Eye characteristics, e.g. of the iris

Abstract

The embodiments of the present application disclose a method for determining eyeball tracking positioning accuracy, and a related product, applied to an electronic device. The method comprises the following steps: determining N fixation points on a screen, where N is a positive integer; determining, for each of the N fixation points, an accuracy value corresponding to eyeball tracking positioning, to obtain N accuracy values; determining an interpolation parameter corresponding to each of the N accuracy values, to obtain N interpolation parameters; and performing an interpolation operation on each pixel of the screen according to the N interpolation parameters, to obtain an eyeball tracking positioning accuracy distribution map corresponding to the screen. By adopting this method and apparatus, the eyeball tracking positioning accuracy of different areas of the screen can be determined.

Description

Eyeball tracking and positioning accuracy determination method and related product
Technical Field
The application relates to the technical field of electronics, in particular to an eyeball tracking and positioning accuracy determination method and a related product.
Background
With the widespread use of electronic devices (such as mobile phones, tablet computers, and the like), these devices support ever more applications and ever more powerful functions. They are developing in the direction of diversification and personalization, and have become indispensable electronic products in users' daily lives.
Existing eyeball tracking technology remains imperfect. At present, on electronic devices that do not use eyeball tracking, screen information is displayed relatively uniformly. This mode, however, is not well suited to eyeball tracking: current eyeball tracking often yields inconsistent positioning accuracy when the human eye gazes at different areas of the screen, so the display of screen information must be weighted accordingly when eyeball tracking is used. How to determine the eyeball tracking positioning accuracy of different areas of the screen is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of the present application provide an eyeball tracking positioning accuracy determination method and a related product, with which the eyeball tracking positioning accuracy of different areas of a screen can be determined.
In a first aspect, an embodiment of the present application provides an eyeball tracking positioning accuracy determination method, which is applied to an electronic device, and the method includes:
determining N fixation points on a screen, wherein N is a positive integer;
determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values;
determining an interpolation parameter corresponding to each precision value in the N precision values to obtain N interpolation parameters;
and carrying out interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
In a second aspect, an embodiment of the present application provides an eyeball tracking positioning accuracy determination apparatus, which is applied to an electronic device, and includes: a determination unit and an interpolation unit, wherein,
the determining unit is used for determining N fixation points on a screen, wherein N is a positive integer; determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values; determining an interpolation parameter corresponding to each precision value in the N precision values to obtain N interpolation parameters;
and the interpolation unit is used for carrying out interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking positioning precision distribution map corresponding to the screen.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing the steps in the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program enables a computer to perform some or all of the steps described in the first aspect of the embodiment of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has the following beneficial effects:
It can be seen that the eyeball tracking positioning accuracy determination method and related product described in the embodiments of the present application are applied to an electronic device. N fixation points are determined on a screen, where N is a positive integer; an accuracy value corresponding to eyeball tracking positioning is determined for each of the N fixation points, yielding N accuracy values; an interpolation parameter is determined for each of the N accuracy values, yielding N interpolation parameters; and an interpolation operation is performed on each pixel of the screen according to the N interpolation parameters, yielding an eyeball tracking positioning accuracy distribution map corresponding to the screen. In this way, the accuracy value corresponding to each fixation point and the interpolation parameter corresponding to each accuracy value can be determined based on the fixation points, and the interpolation operation performed on each pixel based on the interpolation parameters produces the eyeball tracking positioning accuracy distribution map. The eyeball tracking positioning accuracy in different areas of the screen can thus be determined.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1A is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 1B is a schematic flowchart of a method for determining eyeball tracking positioning accuracy according to an embodiment of the present disclosure;
FIG. 1C is a schematic diagram illustrating a split screen area of a quadtree-based storage structure according to an embodiment of the present disclosure;
FIG. 1D is a schematic diagram illustrating an eyeball tracking positioning accuracy distribution map provided by an embodiment of the present application;
FIG. 1E is a schematic illustration of an example of a target trajectory provided by embodiments of the present application;
FIG. 1F is a schematic diagram illustrating another exemplary target track provided by an embodiment of the present application;
FIG. 2 is a schematic flowchart of another method for determining eyeball tracking positioning accuracy according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of another electronic device provided in an embodiment of the present application;
fig. 4A is a block diagram illustrating functional units of an apparatus for determining eyeball tracking positioning accuracy according to an embodiment of the present application;
fig. 4B is a block diagram illustrating functional units of another apparatus for determining eyeball tracking positioning accuracy according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein may be combined with other embodiments.
The electronic device in the embodiments of the present application may include various handheld devices with wireless communication functions (smartphones, tablet computers, etc.), vehicle-mounted devices (navigators, vehicle-mounted refrigerators, vehicle-mounted vacuum cleaners, etc.), wearable devices (smart watches, smart bracelets, wireless headsets, augmented reality/virtual reality devices, smart glasses), computing devices or other processing devices connected to wireless modems, and various forms of User Equipment (UE), Mobile Stations (MS), terminal devices, and the like. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
The following describes embodiments of the present application in detail.
As shown in fig. 1A, fig. 1A is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device includes a processor, a Memory, a signal processor, a communication interface, a display screen, a speaker, a microphone, a Random Access Memory (RAM), a Touch screen (TP), a camera module, a sensor, and the like. The storage, the signal processor, the display screen, the loudspeaker, the microphone, the RAM, the camera module, the sensor and the TP are connected with the processor, and the communication interface is connected with the signal processor.
The display screen may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, an Active Matrix Organic Light-Emitting Diode (AMOLED) display, or the like.
The camera module can include a common camera and an infrared camera, and is not limited herein. The camera may be a front camera or a rear camera, and is not limited herein.
Wherein the sensor comprises at least one of: light sensors, gyroscopes, infrared light (IR) sensors, fingerprint sensors, pressure sensors, and the like. Among them, the light sensor, also called an ambient light sensor, is used to detect the ambient light brightness. The light sensor may include a light sensitive element and an analog to digital converter. The photosensitive element is used for converting collected optical signals into electric signals, and the analog-to-digital converter is used for converting the electric signals into digital signals. Optionally, the light sensor may further include a signal amplifier, and the signal amplifier may amplify the electrical signal converted by the photosensitive element and output the amplified electrical signal to the analog-to-digital converter. The photosensitive element may include at least one of a photodiode, a phototransistor, a photoresistor, and a silicon photocell.
The processor is the control center of the electronic device. It uses various interfaces and lines to connect all parts of the whole device, and performs the device's various functions and processes its data by running or executing software programs and/or modules stored in the memory and calling data stored in the memory, thereby monitoring the device as a whole.
The processor may integrate an Application Processor (AP) and a modem processor, wherein the AP mainly processes an operating system, a user interface, an application program, and the like, and the modem processor mainly processes wireless communication. It will be appreciated that the modem processor described above may not be integrated into the processor.
The processor includes a Central Processing Unit (CPU) and a Graphics Processing Unit (GPU). The CPU is one of the main components of a computer and its core part; its function is mainly to interpret computer instructions and to process data in computer software. The CPU is responsible for reading, decoding, and executing instructions, and mainly comprises a controller and an arithmetic unit, together with a cache memory and the buses that carry data and control signals between them. The three major core components of a computer are the CPU, internal memory, and input/output devices. The central processing unit's main functions are processing instructions, executing operations, controlling timing, and processing data. The GPU, also called the display core, visual processor, or display chip, is a microprocessor dedicated to image and graphics operations on personal computers, workstations, game consoles, and some mobile devices (e.g., tablet computers, smartphones, etc.). The GPU reduces the graphics card's dependence on the CPU and takes over part of the work originally done by the CPU. In 3D graphics processing in particular, the core technologies adopted by the GPU include hardware T&L (geometric transformation and lighting), cubic environment texture mapping and vertex blending, texture compression and bump mapping, and a dual-texture four-pixel 256-bit rendering engine; hardware T&L can be said to be the hallmark of the GPU.
The memory is used for storing software programs and/or modules, and the processor executes various functional applications and data processing of the electronic equipment by operating the software programs and/or modules stored in the memory. The memory mainly comprises a program storage area and a data storage area, wherein the program storage area can store an operating system, a software program required by at least one function and the like; the storage data area may store data created according to use of the electronic device, and the like. Further, the memory may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
Referring to fig. 1B, fig. 1B is a schematic flowchart of a method for determining eyeball tracking positioning accuracy according to an embodiment of the present disclosure, and as shown in the drawing, the method is applied to the electronic device shown in fig. 1A, and the method for determining eyeball tracking positioning accuracy includes:
101. determining N fixation points on a screen, wherein N is a positive integer.
In this embodiment, a fixation point may be understood as a point or area on the screen on which the user's eyeballs focus. The N fixation points may be preset or may be marked by the user, where N is a positive integer; for example, N is 1, or N is an integer greater than 1. In the embodiment of the present application, a fixation point may be a static point or a dynamic point. For example, N calibration points may be presented on the screen to guide the user to look at them, and these calibration points can then be used as fixation points. As another example, a user may look at a location on the screen and mark that location as a fixation point himself.
102. And determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values.
Each fixation point corresponds to both an actual position gazed at by the eyeball (pupil) and a predicted position calculated by the eyeball tracking algorithm. A certain deviation exists between the actual and predicted positions, and this deviation determines the accuracy value corresponding to eyeball tracking positioning.
In one possible example, the step 102 of determining the accuracy value corresponding to the eye-tracking location corresponding to each gaze point in the N gaze points may include the following steps:
a21, determining a first coordinate position watched by a pupil corresponding to a fixation point i, wherein the fixation point i is any one of the N fixation points;
a22, determining a second coordinate position corresponding to the fixation point i and determined by a pre-stored eyeball tracking algorithm;
and A23, determining an accuracy value corresponding to the eyeball tracking positioning corresponding to the fixation point i according to the first coordinate position and the second coordinate position.
The electronic device may determine a first coordinate position (the actual gaze position) gazed at by the pupil for fixation point i, and a second coordinate position (the predicted gaze position) for fixation point i as computed by the pre-stored eyeball tracking algorithm. It may then determine, from the first and second coordinate positions, the accuracy value corresponding to eyeball tracking positioning for fixation point i: for example, it may calculate the target Euclidean distance between the first and second coordinate positions and determine the accuracy value corresponding to that distance according to a preset mapping relationship between Euclidean distance and accuracy value. In this way, an accuracy value relating the actual gaze position and the gaze position predicted by the eyeball tracking algorithm can be determined.
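As an illustration of steps A21 to A23, the deviation-to-accuracy computation can be sketched as below. The distance thresholds, accuracy scores, and function name are illustrative assumptions; the patent only states that a preset mapping relationship between Euclidean distance and accuracy value exists.

```python
import math

# Hypothetical mapping from Euclidean distance (in pixels) between the
# actual and predicted gaze positions to an accuracy value. These
# thresholds and scores are assumptions, not values from the patent.
DISTANCE_TO_ACCURACY = [
    (20.0, 0.95),   # deviation under 20 px: high accuracy
    (60.0, 0.75),
    (120.0, 0.50),
]

def accuracy_for_fixation_point(actual, predicted):
    """Accuracy value for one fixation point, from the deviation between
    the first coordinate position (actual) and the second (predicted)."""
    distance = math.hypot(actual[0] - predicted[0], actual[1] - predicted[1])
    for max_distance, accuracy in DISTANCE_TO_ACCURACY:
        if distance <= max_distance:
            return accuracy
    return 0.25  # very large deviation: low accuracy

# Prediction 30 px to the right of the actual fixation point.
print(accuracy_for_fixation_point((100.0, 200.0), (130.0, 200.0)))  # 0.75
```

Repeating this for each of the N fixation points yields the N accuracy values of step 102.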
103. And determining an interpolation parameter corresponding to each precision value in the N precision values to obtain N interpolation parameters.
In this embodiment, the interpolation parameter may be at least one of the following: an interpolation algorithm, an interpolation control parameter corresponding to the interpolation algorithm, an interpolation region parameter, and the like, which are not limited herein. The interpolation control parameter corresponding to the interpolation algorithm may be understood as an adjustment parameter used to control the degree of interpolation; the interpolation region parameter may be understood as the specific region within which interpolation is performed, and may include at least one of the following: the shape of the region, the position of the region, the area of the region, and the like, without limitation. In a specific implementation, because the position corresponding to each of the N precision values is not fixed, or the precision values differ in magnitude, the interpolation parameters also differ. Therefore, the interpolation parameter corresponding to each of the N precision values can be determined to obtain N interpolation parameters; each of the N interpolation parameters can be responsible for the interpolation operation over an independent region, and together the interpolation parameters can realize the interpolation operation over the whole screen.
In a specific implementation, the electronic device may plan an independent region for each fixation point based on its position, so that the interpolation operation can subsequently be performed according to the interpolation parameters corresponding to that fixation point. As shown in fig. 1C, taking 3 fixation points as an example (fixation point 1, fixation point 2, and fixation point 3), a quadtree storage structure is used to partition the screen area. The number of fixation points may of course be increased; each time a fixation point is added, the area may be further partitioned according to the position of the new fixation point. In practical applications, when there are multiple fixation points, they may be numbered and the regions divided in numbering order.
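A minimal sketch of the quadtree-style partition shown in fig. 1C, under the assumption that each newly added fixation point splits the region containing it into four quadrants; the `QuadRegion` class and the 1080x2340 screen size are illustrative, not part of the patent.

```python
# Quadtree partition of the screen: a region holds at most one fixation
# point; inserting a second point splits the region into four quadrants
# and redistributes both points among the children.
class QuadRegion:
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.point = None       # fixation point assigned to this region
        self.children = []      # four sub-regions after a split

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

    def split(self):
        hw, hh = self.w / 2, self.h / 2
        self.children = [
            QuadRegion(self.x,      self.y,      hw, hh),
            QuadRegion(self.x + hw, self.y,      hw, hh),
            QuadRegion(self.x,      self.y + hh, hw, hh),
            QuadRegion(self.x + hw, self.y + hh, hw, hh),
        ]

    def insert(self, px, py):
        if self.children:
            for child in self.children:
                if child.contains(px, py):
                    child.insert(px, py)
                    return
        elif self.point is None:
            self.point = (px, py)
        else:
            self.split()
            old = self.point
            self.point = None
            self.insert(*old)
            self.insert(px, py)

# 3 fixation points on an assumed 1080x2340 screen, as in fig. 1C.
screen = QuadRegion(0, 0, 1080, 2340)
for gaze in [(200, 300), (800, 1200), (900, 2000)]:
    screen.insert(*gaze)
```

Each leaf region that holds a fixation point then becomes an independent region for the subsequent interpolation operation.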
In a possible example, in step 103, determining an interpolation parameter corresponding to each of the N precision values to obtain N interpolation parameters, the method may include the following steps:
31. acquiring a target screen state parameter between an eyeball corresponding to an accuracy value j and the screen, wherein the accuracy value j is any one of the N accuracy values;
32. and determining an interpolation parameter j corresponding to the target screen state parameter according to a preset mapping relation between the screen state parameter and the interpolation parameter.
In this embodiment of the present application, the screen state parameter may be at least one of the following: the size of the screen, the state of the screen, the distance between the gaze point and the pupil of the user, the angle between the gaze point and the pupil of the user, and the like, which are not limited herein. The screen state can be a landscape screen state or a portrait screen state.
In a specific implementation, take precision value j as an example, where j is any one of the N precision values. The electronic device can acquire the target screen state parameter between the eyeball and the screen for precision value j. The electronic device may also pre-store a mapping relationship between preset screen state parameters and interpolation parameters, and can then determine the interpolation parameter j corresponding to the target screen state parameter according to that mapping relationship. By analogy, the interpolation parameter corresponding to each precision value can be determined.
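Steps 31 and 32 amount to a lookup in a pre-stored table. A sketch under assumed values follows; the orientation/distance keys, the 400 mm bucket boundary, and the `strength` field are all illustrative assumptions, since the patent does not fix concrete values.

```python
# Hypothetical pre-stored mapping between screen state parameters and
# interpolation parameters. Keys combine screen orientation with a
# coarse bucket of the eye-to-screen distance.
SCREEN_STATE_TO_INTERP = {
    ("portrait",  "near"): {"algorithm": "bilinear", "strength": 1.0},
    ("portrait",  "far"):  {"algorithm": "bilinear", "strength": 0.6},
    ("landscape", "near"): {"algorithm": "bilinear", "strength": 0.9},
    ("landscape", "far"):  {"algorithm": "bilinear", "strength": 0.5},
}

def interpolation_parameter(orientation, distance_mm):
    """Look up the interpolation parameter for one target screen state."""
    bucket = "near" if distance_mm < 400 else "far"
    return SCREEN_STATE_TO_INTERP[(orientation, bucket)]

print(interpolation_parameter("portrait", 320))  # {'algorithm': 'bilinear', 'strength': 1.0}
```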
104. And carrying out interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
The electronic device can perform the interpolation operation on each pixel of the screen according to the N interpolation parameters; in this way, each pixel corresponds to one precision value, yielding the eyeball tracking positioning precision distribution map for the screen, which the electronic device can also display on the screen. Thus, the eyeball positioning precision of part of the screen is used to derive the eyeball tracking positioning precision distribution map of the whole screen, shortening the time the user spends on gaze calibration. This not only greatly improves user experience, but also lays a foundation for subsequently distributing information among regions according to their positioning precision.
In addition, in a specific implementation, the electronic device may divide the screen region using a quadtree storage structure, and may perform a bilinear interpolation operation on the positioning precision values within each region.
In a possible example, in the step 104, performing interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain the eyeball tracking positioning accuracy distribution map corresponding to the screen may include the following steps:
41. determining an interpolation area corresponding to each interpolation parameter in the N interpolation parameters to obtain N areas to be interpolated, wherein the N areas to be interpolated cover each pixel point of the screen;
42. and carrying out interpolation operation on the N areas to be interpolated according to the N interpolation parameters and the N precision values to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
The electronic device may determine the region to be interpolated corresponding to each of the N interpolation parameters to obtain N regions to be interpolated. These regions may be planned in advance, with each region corresponding to one fixation point; alternatively, the area within a certain range of each of the N fixation points may be used as a region to be interpolated. The N regions may then be interpolated according to the N interpolation parameters and the N precision values to obtain the eyeball tracking positioning precision distribution map corresponding to the screen. That is, each region to be interpolated can be interpolated with its corresponding interpolation parameters, taking the precision value of its fixation point as the reference, which can quickly generate the eyeball tracking positioning precision distribution map for the entire screen.
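For a single region to be interpolated, the interpolation operation could be a bilinear blend of precision values, consistent with the bilinear interpolation mentioned above. The 4-corner setup below is an illustrative assumption; the patent only specifies that each region is interpolated with its own parameters, taking its fixation point's precision value as reference.

```python
# Bilinear interpolation over one region: given precision values at the
# four corners, assign an interpolated precision value to every pixel.
def bilinear_region(w, h, top_left, top_right, bottom_left, bottom_right):
    """Return an h x w grid of interpolated precision values."""
    grid = []
    for y in range(h):
        ty = y / (h - 1) if h > 1 else 0.0
        row = []
        for x in range(w):
            tx = x / (w - 1) if w > 1 else 0.0
            top = top_left * (1 - tx) + top_right * tx
            bottom = bottom_left * (1 - tx) + bottom_right * tx
            row.append(top * (1 - ty) + bottom * ty)
        grid.append(row)
    return grid

# Precision 0.9 at the top corners fading to 0.5 at the bottom corners.
patch = bilinear_region(5, 5, 0.9, 0.9, 0.5, 0.5)
print(patch[0][0], patch[4][0])  # 0.9 at the top, 0.5 at the bottom
```

Stitching the interpolated patches of all N regions together covers every pixel of the screen and forms the distribution map.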
In one possible example, after the step 104, the following steps may be further included:
b1, acquiring at least one precision threshold;
and B2, dividing the eyeball tracking and positioning precision distribution map into a plurality of precision grade areas based on the at least one precision threshold.
In the embodiment of the present application, the at least one precision threshold may be preset or system default. In a specific implementation, the electronic device may obtain at least one precision threshold and, given the at least one precision threshold, divide the eyeball tracking positioning precision distribution map into a plurality of precision level regions; each precision level region may correspond to one precision level label and may also correspond to one display color. Furthermore, the electronic device can pre-store a mapping relationship between preset applications and precision levels, and can then determine the application corresponding to each precision level region; or it can pre-store a mapping relationship between preset functions and precision levels, and can then determine the function corresponding to each precision level region. In this way, eyeball tracking positioning for different applications or functions can be realized according to the precision of different regions, which helps realize an accurate eyeball tracking positioning function and improves user experience.
For example, taking 2 precision thresholds as an example, as shown in fig. 1D, the screen area can be divided by the 2 precision thresholds into a low precision level area, a medium precision level area, and a high precision level area.
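The division of step B2 with 2 precision thresholds (as in fig. 1D) can be sketched as follows; the threshold values and level names are illustrative assumptions.

```python
# Two assumed precision thresholds dividing the distribution map into
# low, medium, and high precision level regions.
LOW_T, HIGH_T = 0.5, 0.8

def precision_level(precision):
    if precision < LOW_T:
        return "low"
    if precision < HIGH_T:
        return "medium"
    return "high"

accuracy_map = [[0.3, 0.6], [0.7, 0.9]]  # toy 2x2 distribution map
level_map = [[precision_level(v) for v in row] for row in accuracy_map]
print(level_map)  # [['low', 'medium'], ['medium', 'high']]
```

Each resulting level could then carry its own label or display color, as described above.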
In one possible example, the step 101 of determining N fixation points on the screen may include the following steps:
11. displaying a fixation point a at a preset position of the screen in a first preset size, wherein the inner area of the fixation point a is displayed in a countdown mode, and the fixation point a is any one of the N fixation points;
12. when the countdown is finished, scaling the size of the fixation point a from the first preset size to a second preset size.
The first preset size and the second preset size may be set by the user or set by default in the system, and the first preset size is not equal to the second preset size; for example, the first preset size may be smaller than the second preset size, or the first preset size may be larger than the second preset size.
In a specific implementation, taking the fixation point a as an example, where the fixation point a is any one of the N fixation points and the preset position may be preset or set by default in the system, the electronic device may display the fixation point a at the preset position of the screen in the first preset size, with the inner area of the fixation point a displayed in a countdown mode; when the countdown is finished, the size of the fixation point a is scaled from the first preset size to the second preset size, so that the user can be effectively reminded to pay attention to the fixation point. For example, with a countdown of 5 s, the size of the fixation point a is scaled from the first preset size to the second preset size at the end of the countdown, giving the user enough time to focus on the screen.
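The display-then-scale sequence of steps 11 and 12 can be sketched as follows; `draw(x, y, size, label)` is a hypothetical rendering callback, not an API from the patent, and the injectable `sleep` simply makes the sketch testable.

```python
import time

def show_gaze_point(draw, x, y, first_size, second_size,
                    countdown_s=5, sleep=time.sleep):
    """Sketch of steps 11-12: show fixation point a at the preset
    position (x, y) in the first preset size with a countdown displayed
    inside it, then scale it to the second preset size when the
    countdown ends."""
    for remaining in range(countdown_s, 0, -1):
        draw(x, y, first_size, str(remaining))   # countdown shown inside the point
        sleep(1)
    draw(x, y, second_size, "")                  # scale to the second preset size
```

A real implementation would drive the platform's UI toolkit instead of a callback, but the control flow is the same.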
In one possible example, the step 102 of determining an accuracy value corresponding to the eye tracking location corresponding to each of the N gaze points to obtain N accuracy values may include the following steps:
c21, obtaining the brightness of the target environment;
c22, determining a target track corresponding to the target environment light brightness according to a preset mapping relation between the environment light brightness and the track;
c23, controlling the fixation point a to move according to the target track;
c24, determining a plurality of track positions corresponding to the fixation point a and a plurality of positioning positions corresponding to the plurality of track positions, wherein each track position corresponds to a unique positioning position;
and C25, determining the precision value corresponding to the fixation point a according to the plurality of track positions and the plurality of positioning positions.
The target track may be set by the user or set by default in the system. In a specific implementation, during the movement of the fixation point a, the target track may be retained, that is, the user sees a line on the screen; alternatively, the target track may not be retained, that is, the user only sees a point moving on the screen.
In a specific implementation, the electronic device may include an ambient light sensor, through which ambient light collection may be implemented. The electronic device may obtain the target ambient light brightness, and a mapping relationship between preset ambient light brightness values and tracks may be stored in the electronic device in advance; the target track corresponding to the target ambient light brightness may then be determined according to the mapping relationship, and the fixation point a is controlled to move according to the target track. In this way, different tracks are set for different ambient light conditions; as shown in fig. 1E and 1F, different tracks are provided, which is helpful for improving the user's attention to the fixation point.
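The brightness-to-track lookup can be sketched as a simple breakpoint table; the lux breakpoints and track names below are assumed for illustration and do not appear in the patent.

```python
import bisect

# Hypothetical mapping: ambient-light breakpoints (lux) to preset tracks.
LUX_BREAKPOINTS = [50, 300]                       # assumed values
TRACKS = ["high_contrast_zigzag", "circle", "figure_eight"]

def track_for_lux(lux):
    """Return the preset track for the measured ambient brightness,
    sketching the preset brightness-to-track mapping relationship."""
    return TRACKS[bisect.bisect_right(LUX_BREAKPOINTS, lux)]
```

In practice the table would be tuned so that low-light conditions get a track that is easier to follow.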
Further, the electronic device may determine a plurality of track positions corresponding to the fixation point a and a plurality of positioning positions that correspond to the plurality of track positions and are calculated by the eyeball tracking algorithm, where each track position corresponds to a unique positioning position, and may determine the precision value corresponding to the fixation point a according to the plurality of track positions and the plurality of positioning positions.
In a possible example, the step C25 of determining the accuracy value corresponding to the gaze point a according to the plurality of track positions and the plurality of positioning positions may include the following steps:
c251, calculating according to the plurality of track positions and the plurality of positioning positions to obtain a plurality of distances;
c252, determining the mean value of the plurality of distances to obtain a target mean value;
c253, determining the mean square deviation of the plurality of distances to obtain a target mean square deviation;
c254, determining a target initial precision value corresponding to the target mean value according to a mapping relation between a preset mean value and the initial precision value;
c255, determining a target adjusting coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and an adjusting coefficient;
and C256, adjusting the target initial accuracy value according to the target adjusting coefficient to obtain an accuracy value corresponding to the fixation point a.
In a specific implementation, the electronic device may perform an operation according to the plurality of track positions and the plurality of positioning positions to obtain a plurality of distances. Specifically, since each track position and each positioning position corresponds to a coordinate, the Euclidean distance between each track position and its corresponding positioning position may be calculated to obtain the plurality of distances. The mean value of the plurality of distances may then be determined to obtain the target mean value, and the mean square deviation of the plurality of distances may be determined to obtain the target mean square deviation, where the mean square deviation reflects, to a certain extent, the stability of the precision value of the fixation point a. A mapping relationship between preset mean values and initial precision values and a mapping relationship between preset mean square deviations and adjustment coefficients may be stored in the electronic device in advance.
Further, the electronic device may determine the target initial precision value corresponding to the target mean value according to the mapping relationship between preset mean values and initial precision values, and determine the target adjustment coefficient corresponding to the target mean square deviation according to the mapping relationship between preset mean square deviations and adjustment coefficients; the target initial precision value may then be adjusted according to the target adjustment coefficient to obtain the precision value corresponding to the fixation point a. Specifically, the precision value corresponding to the fixation point a = the target initial precision value × (1 + the target adjustment coefficient). In general, the precision value fluctuates to a certain extent; introducing the mean square deviation to adjust the precision value enables accurate determination of the eyeball tracking and positioning precision value corresponding to the fixation point a.
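Steps C251 to C256 can be sketched as below. The two mapping functions stand in for the patent's preset mean-to-initial-value and mean-square-deviation-to-adjustment-coefficient tables, whose exact form is not specified, so any concrete choice is an assumption.

```python
import math

def gaze_accuracy(track_positions, located_positions,
                  mean_to_initial, mse_to_coeff):
    """Sketch of steps C251-C256: Euclidean distances between each track
    position and its positioning position, then mean and mean square
    deviation of the distances, then initial value adjusted by the
    coefficient. The two mapping callables are placeholders for the
    preset mapping relationships stored on the device."""
    dists = [math.dist(t, l) for t, l in zip(track_positions, located_positions)]
    mean = sum(dists) / len(dists)
    mse = sum((d - mean) ** 2 for d in dists) / len(dists)
    initial = mean_to_initial(mean)
    coeff = mse_to_coeff(mse)
    # precision value = initial value × (1 + adjustment coefficient)
    return initial * (1 + coeff)
```

With perfectly stable distances the mean square deviation is zero and the initial value passes through unchanged; unstable tracking lowers (or raises) it via the coefficient.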
It can be seen that the eyeball tracking positioning accuracy determining method described in the embodiment of the present application is applied to an electronic device, and determines N fixation points on a screen, where N is a positive integer, determines an accuracy value corresponding to eyeball tracking positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values, determines an interpolation parameter corresponding to each accuracy value in the N accuracy values to obtain N interpolation parameters, and performs interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking positioning accuracy distribution map corresponding to the screen.
Referring to fig. 2, fig. 2 is a schematic flowchart of an eyeball tracking positioning accuracy determination method according to an embodiment of the present application, and as shown in the figure, the method is applied to the electronic device shown in fig. 1A, and the eyeball tracking positioning accuracy determination method includes:
201. Determining N fixation points on a screen, wherein N is a positive integer.
202. Determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values.
203. Determining an interpolation parameter corresponding to each precision value in the N precision values to obtain N interpolation parameters.
204. Performing interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
205. Obtaining at least one precision threshold.
206. Dividing the eyeball tracking and positioning precision distribution map into a plurality of precision grade areas based on the at least one precision threshold.
For the detailed description of the steps 201 to 206, reference may be made to corresponding steps of the eyeball tracking positioning accuracy determination method described in fig. 1B, and details are not repeated here.
It can be seen that the eyeball tracking positioning accuracy determination method described in the embodiment of the present application may determine, based on the gaze point, the accuracy value corresponding thereto and the interpolation parameter corresponding to the accuracy value, and perform interpolation operation on each pixel point based on the interpolation parameter to obtain the eyeball tracking positioning accuracy distribution map, may also determine the eyeball tracking positioning accuracy in different areas of the screen, and may further divide the screen into a plurality of accuracy level areas according to the accuracy threshold.
In accordance with the foregoing embodiments, please refer to fig. 3, fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application, and as shown in the drawing, the electronic device includes a processor, a memory, a communication interface, and one or more programs, the one or more programs are stored in the memory and configured to be executed by the processor, and in an embodiment of the present application, the programs include instructions for performing the following steps:
determining N fixation points on a screen, wherein N is a positive integer;
determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values;
determining an interpolation parameter corresponding to each of the N precision values to obtain N interpolation parameters;
and carrying out interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking and positioning precision distribution diagram corresponding to the screen.
It can be seen that, in the electronic device described in this embodiment of the present application, N points of regard are determined on a screen, where N is a positive integer, an accuracy value corresponding to eyeball tracking positioning corresponding to each point of regard in the N points of regard is determined, N accuracy values are obtained, an interpolation parameter corresponding to each accuracy value in the N accuracy values is determined, N interpolation parameters are obtained, an interpolation operation is performed on each pixel point of the screen according to the N interpolation parameters, and an eyeball tracking positioning accuracy distribution map corresponding to the screen is obtained.
In one possible example, in said determining a precision value corresponding to an eye tracking position corresponding to each of said N gaze points, the program comprises instructions for:
determining a first coordinate position watched by a pupil corresponding to a fixation point i, wherein the fixation point i is any one of the N fixation points;
determining a second coordinate position corresponding to the fixation point i and determined by a pre-stored eyeball tracking algorithm;
and determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to the fixation point i according to the first coordinate position and the second coordinate position.
In one possible example, in said determining the interpolation parameter corresponding to each of the N precision values, resulting in N interpolation parameters, the program includes instructions for performing the steps of:
acquiring a target screen state parameter between an eyeball and the screen corresponding to an accuracy value j, wherein the accuracy value j is any one of the N accuracy values;
and determining an interpolation parameter j corresponding to the target screen state parameter according to a preset mapping relation between the screen state parameter and the interpolation parameter.
In a possible example, in the aspect that the interpolation operation is performed on each pixel point of the screen according to the N interpolation parameters to obtain the eyeball tracking and positioning accuracy distribution map corresponding to the screen, the above program includes instructions for executing the following steps:
determining an interpolation area corresponding to each interpolation parameter in the N interpolation parameters to obtain N areas to be interpolated, wherein the N areas to be interpolated cover each pixel point of the screen;
and carrying out interpolation operation on the N areas to be interpolated according to the N interpolation parameters and the N precision values to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
In one possible example, the program further includes instructions for performing the steps of:
obtaining at least one precision threshold;
dividing the eye tracking positioning accuracy distribution map into a plurality of accuracy grade areas based on the at least one accuracy threshold.
In one possible example, in said determining N fixation points on the screen, the above program comprises instructions for performing the following steps:
displaying a fixation point a in a first preset size at a preset position of the screen, wherein the inner area of the fixation point a is displayed in a countdown mode, and the fixation point a is any one of the N fixation points;
and when the countdown is finished, the size of the fixation point a is scaled from the first preset size to a second preset size.
In one possible example, in the determining an accuracy value corresponding to the eye tracking position corresponding to each of the N gaze points to obtain N accuracy values, the program includes instructions for:
obtaining the brightness of the target environment;
determining a target track corresponding to the target environment light brightness according to a preset mapping relation between the environment light brightness and the track;
controlling the fixation point a to move according to the target track;
determining a plurality of track positions corresponding to the fixation point a and a plurality of positioning positions corresponding to the plurality of track positions, wherein each track position corresponds to a unique positioning position;
and determining the precision value corresponding to the fixation point a according to the plurality of track positions and the plurality of positioning positions.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that, in order to realize the above functions, the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments provided herein can be implemented in hardware or a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 4A is a block diagram of functional units of an eye tracking positioning accuracy determination apparatus 400 according to an embodiment of the present application. The eyeball tracking positioning accuracy determination apparatus 400 is applied to an electronic device, and the apparatus 400 comprises: a determination unit 401 and an interpolation unit 402, wherein,
the determining unit 401 is configured to determine N gaze points on a screen, where N is a positive integer; determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values; determining an interpolation parameter corresponding to each precision value in the N precision values to obtain N interpolation parameters;
the interpolation unit 402 is configured to perform interpolation operation on each pixel point of the screen according to the N interpolation parameters, so as to obtain an eyeball tracking positioning accuracy distribution map corresponding to the screen.
It can be seen that the eyeball tracking positioning accuracy determining device described in the embodiment of the present application is applied to an electronic device, and determines N fixation points on a screen, where N is a positive integer, determines an accuracy value corresponding to eyeball tracking positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values, determines an interpolation parameter corresponding to each accuracy value in the N accuracy values to obtain N interpolation parameters, and performs interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking positioning accuracy distribution map corresponding to the screen.
In one possible example, in the aspect of determining the precision value corresponding to the eye tracking position corresponding to each gaze point in the N gaze points, the determining unit 401 is specifically configured to:
determining a first coordinate position watched by a pupil corresponding to a fixation point i, wherein the fixation point i is any one of the N fixation points;
determining a second coordinate position corresponding to the fixation point i and determined by a pre-stored eyeball tracking algorithm;
and determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to the fixation point i according to the first coordinate position and the second coordinate position.
In a possible example, in the aspect of determining an interpolation parameter corresponding to each of the N precision values to obtain N interpolation parameters, the determining unit 401 is specifically configured to:
acquiring a target screen state parameter between an eyeball corresponding to an accuracy value j and the screen, wherein the accuracy value j is any one of the N accuracy values;
and determining an interpolation parameter j corresponding to the target screen state parameter according to a mapping relation between preset screen state parameters and interpolation parameters.
In a possible example, in the aspect that the interpolation operation is performed on each pixel point of the screen according to the N interpolation parameters to obtain the eyeball tracking positioning accuracy distribution map corresponding to the screen, the interpolation unit 402 is specifically configured to:
determining an interpolation area corresponding to each interpolation parameter in the N interpolation parameters to obtain N areas to be interpolated, wherein the N areas to be interpolated cover each pixel point of the screen;
and carrying out interpolation operation on the N areas to be interpolated according to the N interpolation parameters and the N precision values to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
In one possible example, as shown in fig. 4B, fig. 4B is a modified structure of the eyeball tracking positioning accuracy determination apparatus shown in fig. 4A; compared with fig. 4A, the apparatus may further include an obtaining unit 403 and a dividing unit 404, which are specifically as follows:
the obtaining unit 403 is configured to obtain at least one precision threshold;
the dividing unit 404 is configured to divide the eyeball tracking positioning accuracy distribution map into a plurality of accuracy level regions based on the at least one accuracy threshold.
In one possible example, in terms of the determining N fixation points on the screen, the determining unit 401 is specifically configured to:
displaying a fixation point a in a first preset size at a preset position of the screen, wherein the inner area of the fixation point a is displayed in a countdown mode, and the fixation point a is any one of the N fixation points;
and when the countdown is finished, the size of the fixation point a is scaled from the first preset size to a second preset size.
In a possible example, in the determining an accuracy value corresponding to the eye tracking location corresponding to each of the N gaze points to obtain N accuracy values, the determining unit 401 is specifically configured to:
obtaining the brightness of the target environment;
determining a target track corresponding to the target environment light brightness according to a preset mapping relation between the environment light brightness and the track;
controlling the fixation point a to move according to the target track;
determining a plurality of track positions corresponding to the fixation point a and a plurality of positioning positions corresponding to the plurality of track positions, wherein each track position corresponds to a unique positioning position;
and determining the precision value corresponding to the fixation point a according to the plurality of track positions and the plurality of positioning positions.
It can be understood that the functions of each program module of the apparatus for determining eyeball tracking positioning accuracy in this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the related description of the foregoing method embodiment, which is not described herein again.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art will recognize that the embodiments described in this specification are preferred embodiments and that acts or modules referred to are not necessarily required for this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the above-described units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program codes, such as a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by related hardware instructed by a program, and the program may be stored in a computer-readable memory, which may include: flash memory disks, Read-Only Memories (ROMs), Random Access Memories (RAMs), magnetic disks, optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, the specific implementation manner and the application scope may be changed, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. An eyeball tracking positioning accuracy determination method is applied to electronic equipment and comprises the following steps:
determining N fixation points on a screen, wherein the N fixation points are marked by a user, and N is an integer greater than 1;
determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to each fixation point in the N fixation points to obtain N accuracy values;
determining an interpolation parameter corresponding to each of the N precision values to obtain N interpolation parameters;
and carrying out interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
2. The method of claim 1, wherein the determining an accuracy value corresponding to the eyeball tracking and positioning corresponding to each fixation point in the N fixation points comprises:
determining a first coordinate position watched by a pupil corresponding to a fixation point i, wherein the fixation point i is any one of the N fixation points;
determining a second coordinate position corresponding to the fixation point i and determined by a pre-stored eyeball tracking algorithm;
and determining an accuracy value corresponding to the eyeball tracking positioning corresponding to the fixation point i according to the first coordinate position and the second coordinate position.
3. The method according to claim 1 or 2, wherein the determining the interpolation parameter corresponding to each of the N precision values to obtain N interpolation parameters comprises:
acquiring a target screen state parameter between an eyeball and the screen corresponding to an accuracy value j, wherein the accuracy value j is any one of the N accuracy values;
and determining an interpolation parameter j corresponding to the target screen state parameter according to a mapping relation between preset screen state parameters and interpolation parameters.
4. The method according to claim 1 or 2, wherein the performing an interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain the eye tracking and positioning accuracy distribution map corresponding to the screen includes:
determining an interpolation area corresponding to each interpolation parameter in the N interpolation parameters to obtain N areas to be interpolated, wherein the N areas to be interpolated cover each pixel point of the screen;
and carrying out interpolation operation on the N areas to be interpolated according to the N interpolation parameters and the N precision values to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
5. The method according to claim 1 or 2, characterized in that the method further comprises:
obtaining at least one precision threshold;
and dividing the eyeball tracking and positioning precision distribution map into a plurality of precision grade areas based on the at least one precision threshold.
6. The method according to claim 1 or 2, wherein the determining N fixation points on the screen comprises:
displaying a fixation point a at a preset position of the screen in a first preset size, wherein an inner area of the fixation point a displays a countdown, and the fixation point a is any one of the N fixation points;
and when the countdown ends, scaling the fixation point a from the first preset size to a second preset size.
7. The method according to claim 6, wherein the determining the precision value corresponding to the eyeball tracking and positioning corresponding to each of the N fixation points to obtain N precision values comprises:
acquiring a target ambient light brightness;
determining a target track corresponding to the target ambient light brightness according to a preset mapping relation between the ambient light brightness and the track;
controlling the fixation point a to move according to the target track;
determining a plurality of track positions corresponding to the fixation point a and a plurality of positioning positions corresponding to the plurality of track positions, wherein each track position corresponds to a unique positioning position;
and determining the precision value corresponding to the fixation point a according to the plurality of track positions and the plurality of positioning positions.
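Claim 7 leaves the final aggregation open. One plausible precision value for fixation point a is the mean Euclidean distance between each track position (where the point actually was) and the positioning position the tracker reported for it, with lower values meaning better tracking; the function below is a sketch under that assumption.

```python
import math

# Hedged sketch of claim 7's last step: aggregate per-sample tracking error
# into a single precision value for fixation point a.
def precision_value(track_positions, positioning_positions):
    """Mean Euclidean error between true track positions and the
    positioning positions reported by the eyeball tracker."""
    assert len(track_positions) == len(positioning_positions)
    errs = [math.dist(t, p)
            for t, p in zip(track_positions, positioning_positions)]
    return sum(errs) / len(errs)
```

Running this once per fixation point produces the N precision values consumed by claim 3.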
8. An eyeball tracking and positioning accuracy determination apparatus, applied to an electronic device, the apparatus comprising: a determining unit and an interpolation unit, wherein
the determining unit is configured to determine N fixation points on a screen, the N fixation points being marked by a user, and N being an integer greater than 1; determine a precision value of the eyeball tracking and positioning corresponding to each of the N fixation points to obtain N precision values; and determine an interpolation parameter corresponding to each of the N precision values to obtain N interpolation parameters;
and the interpolation unit is configured to perform an interpolation operation on each pixel point of the screen according to the N interpolation parameters to obtain an eyeball tracking and positioning precision distribution map corresponding to the screen.
9. An electronic device, comprising a processor and a memory, wherein the memory stores one or more programs configured to be executed by the processor, and the one or more programs comprise instructions for performing the steps in the method of any one of claims 1-7.
10. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-7.
CN202010090960.7A 2020-02-13 2020-02-13 Eyeball tracking and positioning accuracy determination method and related product Active CN111311494B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010090960.7A CN111311494B (en) 2020-02-13 2020-02-13 Eyeball tracking and positioning accuracy determination method and related product


Publications (2)

Publication Number Publication Date
CN111311494A CN111311494A (en) 2020-06-19
CN111311494B true CN111311494B (en) 2023-04-18

Family

ID=71150989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010090960.7A Active CN111311494B (en) 2020-02-13 2020-02-13 Eyeball tracking and positioning accuracy determination method and related product

Country Status (1)

Country Link
CN (1) CN111311494B (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7963652B2 (en) * 2003-11-14 2011-06-21 Queen's University At Kingston Method and apparatus for calibration-free eye tracking
US7809160B2 (en) * 2003-11-14 2010-10-05 Queen's University At Kingston Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections
CN101576771B (en) * 2009-03-24 2010-12-01 山东大学 Scaling method for eye tracker based on nonuniform sample interpolation
KR102001950B1 (en) * 2012-07-26 2019-07-29 엘지이노텍 주식회사 Gaze Tracking Apparatus and Method
CN107533634A (en) * 2015-03-23 2018-01-02 控制辐射系统有限公司 Eyes tracking system
CN107153519A (en) * 2017-04-28 2017-09-12 北京七鑫易维信息技术有限公司 Image transfer method, method for displaying image and image processing apparatus
CN107168668A (en) * 2017-04-28 2017-09-15 北京七鑫易维信息技术有限公司 Image data transfer method, device and storage medium, processor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant