CN112949551A - Eye key information determination method, device, equipment and storage medium - Google Patents
Eye key information determination method, device, equipment and storage medium
- Publication number
- CN112949551A (application number CN202110297147.1A)
- Authority
- CN
- China
- Prior art keywords
- dimensional
- eye
- eyeball
- information
- mapping
- Prior art date: 2021-03-19
- Legal status: Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
Abstract
The application provides a method, a device, equipment and a storage medium for determining eye key information, wherein the method comprises the following steps: acquiring three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information; constructing a mapping relation between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information; and mapping the key point of the eyeball designated part in the two-dimensional eye image to a three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball, so as to obtain a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space. The method for determining the eye key information can thus determine the position of the eyeball designated part in the three-dimensional space according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information.
Description
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for determining eye key information.
Background
In some application scenarios, the position of an eyeball designated part (such as the pupil or the cornea) in a three-dimensional space needs to be determined according to eye information of a designated object. For example, in application scenarios requiring gaze estimation, the position of the pupil in the three-dimensional space often needs to be determined according to the eye information of the designated object, and the gaze direction is then calculated according to the position of the pupil in the three-dimensional space. How to determine the position of the eyeball designated part in the three-dimensional space according to the eye information of the designated object is therefore a problem that urgently needs to be solved at present.
Disclosure of Invention
In view of the above, the present application provides an eye key information determining method, apparatus, device and storage medium for determining the position of an eyeball designated part in a three-dimensional space according to eye information of a designated object. The technical solution is as follows:
an eye key information determination method, comprising:
acquiring three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information;
constructing a mapping relation between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information;
and mapping the key point of the eyeball designated part in the two-dimensional eye image to a three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball to obtain a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space.
Optionally, the constructing a mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information includes:
constructing a three-dimensional eyeball model according to the three-dimensional eye information;
and determining the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional eye information, the two-dimensional eye image and the three-dimensional eyeball model.
Optionally, the constructing a three-dimensional eyeball model according to the three-dimensional eye information includes:
and fitting the three-dimensional shape of the eyeball and the central point of the eyeball by adopting an ellipse fitting method according to the three-dimensional eye information to obtain a three-dimensional eyeball model.
Optionally, the determining a mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional eye information, the two-dimensional eye image and the three-dimensional eyeball model includes:
acquiring a three-dimensional envelope line according to the three-dimensional eye information and the three-dimensional eyeball model, wherein the three-dimensional envelope line can represent an orbit in a three-dimensional space;
acquiring a two-dimensional envelope line according to the two-dimensional eye image, wherein the two-dimensional envelope line can represent an orbit in a two-dimensional space;
and determining the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional envelope curve and the two-dimensional envelope curve.
Optionally, the determining a mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional envelope and the two-dimensional envelope includes:
acquiring an initial mapping relation as a target mapping relation;
generating a plurality of candidate mapping relations according to the target mapping relation;
selecting an optimal candidate mapping relation from the plurality of candidate mapping relations based on the three-dimensional envelope line and the two-dimensional envelope line;
if the optimal candidate mapping relation meets a preset condition, determining the optimal candidate mapping relation as the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball;
and if the optimal candidate mapping relation does not meet the preset condition, taking the optimal candidate mapping relation as a new target mapping relation, and then executing the step of generating a plurality of candidate mapping relations according to the target mapping relation.
Optionally, the generating a plurality of candidate mapping relationships according to the target mapping relationship includes:
determining an optimization range of the mapping relation according to the target mapping relation and a preset optimization parameter;
and generating a plurality of candidate mapping relations according to the mapping relation optimizing range.
Optionally, the acquiring three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information includes:
acquiring three-dimensional face information and a two-dimensional face image corresponding to the three-dimensional face information;
extracting eye information from the three-dimensional face information to serve as three-dimensional eye information;
extracting eye images from two-dimensional face images corresponding to the three-dimensional face information to serve as two-dimensional eye images corresponding to the three-dimensional eye information;
the obtaining of the initial mapping relationship includes:
and determining a mapping relation between the three-dimensional face and the two-dimensional face as the initial mapping relation according to the three-dimensional face information and the two-dimensional face image.
Optionally, the obtaining an initial mapping relationship includes:
and randomly generating a mapping relation as the initial mapping relation.
Or,
randomly generating a plurality of mapping relations, and determining an optimal mapping relation from the randomly generated plurality of mapping relations based on the three-dimensional envelope line and the two-dimensional envelope line, as the initial mapping relation.
optionally, the selecting an optimal candidate mapping relationship from the multiple candidate mapping relationships based on the three-dimensional envelope and the two-dimensional envelope includes:
for each candidate mapping:
mapping the three-dimensional envelope curve to a two-dimensional space by adopting the candidate mapping relation to obtain a mapped envelope curve;
determining the coincidence degree of the envelope curve after mapping and the two-dimensional envelope curve to obtain the coincidence degree corresponding to the candidate mapping relation;
and after the coincidence degrees respectively corresponding to the candidate mapping relations are obtained, determining the candidate mapping relation corresponding to the highest coincidence degree as the optimal candidate mapping relation.
Optionally, the determining the coincidence degree of the mapped envelope and the two-dimensional envelope includes:
sampling a plurality of points from the mapped envelope line according to a preset sampling rule;
calculating the distance between each point obtained by sampling and the corresponding point on the two-dimensional envelope line;
and summing all the calculated distances, and representing the coincidence degree of the mapped envelope line and the two-dimensional envelope line by the summed distances.
An eye key information determining apparatus, comprising an eye information acquisition module, a mapping relation construction module and an eye key information determining module, wherein:
the eye information acquisition module is used for acquiring three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information;
the mapping relation construction module is used for constructing a mapping relation between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information;
the eye key information determining module is used for mapping key points of the eyeball designated part in the two-dimensional eye image to a three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball to obtain a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space.
An eye key information determining device, comprising: a memory and a processor;
the memory is used for storing programs;
the processor is configured to execute the program to implement each step of the eye key information determination method described in any one of the above.
A readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method of determining eye key information according to any one of the preceding claims.
According to the above scheme, in the eye key information determining method, device, equipment and storage medium provided by the application, the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information are first obtained; the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball is then constructed according to the three-dimensional eye information and the corresponding two-dimensional eye image; and finally the key point of the eyeball designated part in the two-dimensional eye image is mapped to the three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball, so that a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space is obtained. The method for determining the eye key information can therefore determine the position of the eyeball designated part in the three-dimensional space according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained from the provided drawings without creative efforts.
Fig. 1 is a schematic flowchart of a method for determining eye key information according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart illustrating a process of constructing a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information according to the embodiment of the present application;
fig. 3 is a schematic flow chart illustrating a process of determining a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to three-dimensional eye information, a two-dimensional eye image, and a three-dimensional eyeball model according to an embodiment of the present application;
fig. 4 is a schematic flowchart illustrating a process of determining a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to a three-dimensional envelope and a two-dimensional envelope provided in the embodiment of the present application;
fig. 5 is a schematic structural diagram of an eye key information determining apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an eye key information determination device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to determine the position of the eyeball designated part in the three-dimensional space, the inventors of the present application conducted intensive research and finally proposed an eye key information determining method that can determine the position of the eyeball designated part in the three-dimensional space according to three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information. The eye key information determining method can be applied to an electronic device with processing capability; the electronic device may be a server on the network side, or a terminal used on the user side, such as a PC, a notebook computer, a smart phone, a vehicle-mounted terminal or a smart home device. The server on the network side or the terminal used on the user side can determine the position of the eyeball designated part in the three-dimensional space according to the eye key information determining method provided by the present application. Next, the method for determining eye key information provided by the present application will be described through the following embodiments.
First embodiment
Referring to fig. 1, a schematic flow chart of a method for determining eye key information according to an embodiment of the present application is shown, where the method may include:
step S101: and acquiring the three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information.
There are various implementation manners for acquiring the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information, and this embodiment provides two optional implementation manners as follows:
the first implementation mode comprises the following steps: the method comprises the steps of acquiring data acquired by a three-dimensional information acquisition device and a two-dimensional image acquisition device aiming at the eyes of a specified object simultaneously to obtain three-dimensional eye information.
The second implementation mode comprises the following steps: firstly, acquiring data collected simultaneously for the face of a specified object by a three-dimensional information acquisition device and a two-dimensional image acquisition device, so as to obtain three-dimensional face information and a two-dimensional face image corresponding to the three-dimensional face information; then acquiring eye information from the three-dimensional face information to obtain the three-dimensional eye information, and acquiring the eye region from the two-dimensional face image corresponding to the three-dimensional face information to obtain the two-dimensional eye image corresponding to the three-dimensional eye information. A sketch of this second scheme is given below.
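As an illustration of the second implementation mode only, the following minimal sketch cuts the eye data out of synchronized face data. The eye bounding boxes are assumed to come from some external face or landmark detector; the application does not prescribe how the eye region is located, so all names and parameters here are hypothetical.

```python
import numpy as np

def extract_eye_data(face_points_3d, face_image_2d, eye_box_3d, eye_box_2d):
    """Cut the eye region out of synchronized 3-D face points and a 2-D face image.

    eye_box_3d: (min_xyz, max_xyz) bounds of the eye region in 3-D (assumed given).
    eye_box_2d: (x0, y0, x1, y1) bounds of the eye region in the image (assumed given).
    """
    lo, hi = eye_box_3d
    mask = np.all((face_points_3d >= lo) & (face_points_3d <= hi), axis=1)
    eye_points_3d = face_points_3d[mask]          # three-dimensional eye information
    x0, y0, x1, y1 = eye_box_2d
    eye_image_2d = face_image_2d[y0:y1, x0:x1]    # two-dimensional eye image
    return eye_points_3d, eye_image_2d
```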
Step S102: and constructing a mapping relation between the three-dimensional eyeballs and the two-dimensional eyeballs according to the three-dimensional eyeball information and the two-dimensional eyeball image corresponding to the three-dimensional eyeball information.
The mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball is a mapping relationship from the three-dimensional eyeball to the two-dimensional eyeball, that is, the mapping relationship constructed in step S102 is a mapping relationship from a three-dimensional space to a two-dimensional space.
The mapping relationship in this embodiment can be characterized by a spatial rotation projection matrix and an alignment translation projection matrix. Assuming that the mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball is denoted by Feye, the spatial rotation projection matrix by Reye, and the alignment translation projection matrix by Teye, the mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball can be expressed as:
Feye = {Reye, Teye}
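As an illustration only, a minimal sketch of how such a mapping could be applied to three-dimensional eyeball points is given below. The application does not fix a concrete camera model, so an orthographic projection (dropping the depth coordinate after the rigid transform) is assumed here purely for illustration; the function name and shapes are assumptions.

```python
import numpy as np

def map_3d_to_2d(points_3d: np.ndarray, R_eye: np.ndarray, T_eye: np.ndarray) -> np.ndarray:
    """Map Nx3 eyeball points to Nx2 image points using Feye = {Reye, Teye}."""
    rotated = points_3d @ R_eye.T + T_eye   # rigid transform in 3-D
    return rotated[:, :2]                   # assumed orthographic projection to 2-D
```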
step S103: and mapping the key point of the eyeball appointed part in the two-dimensional eye image to a three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball to obtain a mapping result capable of representing the position of the eyeball appointed part in the three-dimensional space.
The eyeball designated part can be, but is not limited to, a pupil, a cornea and the like, and the key point of the eyeball designated part in the two-dimensional eye image can be obtained by performing key point detection on the two-dimensional eye image by using an existing key point detection scheme.
Since the mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball is a mapping relationship from a three-dimensional space to a two-dimensional space, after the mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball is established, the key point of the eyeball designated part in the two-dimensional eye image needs to be mapped back (back-projected) from the two-dimensional space to the three-dimensional space based on the mapping relationship, so as to obtain the key point of the eyeball designated part in the three-dimensional space.
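The embodiment does not spell out how this two-dimensional-to-three-dimensional back-mapping is computed. One common realization, sketched below under the same orthographic-projection assumption as above and using a spherical approximation (center, radius) of the fitted three-dimensional eyeball model, is to cast a viewing ray through the two-dimensional key point and intersect it with the eyeball surface. This is an assumption for illustration, not the patent's stated procedure.

```python
import numpy as np

def backproject_keypoint(kp_2d, R_eye, T_eye, center, radius):
    """Return the 3-D point on the eyeball surface corresponding to a 2-D key point."""
    # Undo the alignment translation, then express the viewing ray in eyeball coordinates.
    origin = R_eye.T @ (np.array([kp_2d[0], kp_2d[1], 0.0]) - T_eye)
    direction = R_eye.T @ np.array([0.0, 0.0, 1.0])   # orthographic viewing direction

    # Solve |origin + t*direction - center|^2 = radius^2 for the nearer intersection.
    oc = origin - center
    b = 2.0 * direction @ oc
    c = oc @ oc - radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                                   # ray misses the fitted eyeball
    t = (-b - np.sqrt(disc)) / 2.0
    return origin + t * direction
```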
According to the method for determining the eye key information provided by the embodiment of the application, the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information are first obtained; the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball is then constructed according to the three-dimensional eye information and the corresponding two-dimensional eye image; and finally the key point of the eyeball designated part in the two-dimensional eye image is mapped to the three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball, so that a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space is obtained. The method for determining the eye key information provided by the embodiment of the application can therefore determine the position of the eyeball designated part in the three-dimensional space according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information.
Second embodiment
The present embodiment focuses on the specific implementation of step S102: constructing the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information.
Referring to fig. 2, a schematic flow chart of constructing a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information is shown, and the flow chart may include:
step S201: and constructing a three-dimensional eyeball model according to the three-dimensional eye information.
The shape of the real eyeball in the three-dimensional space can be approximately treated as a sphere or an ellipsoid. Taking the ellipsoid as an example, its coordinate equation can be expressed as x²/a² + y²/b² + z²/c² = 1, where a, b and c are the radii along the x, y and z axes of the coordinate system and determine the shape of the ellipsoid. Since the three-dimensional eye information is a spatial point cloud P3×D, the present embodiment may fit the three-dimensional shape of the eyeball and the center point of the eyeball by an ellipse fitting method according to the three-dimensional eye information to obtain the three-dimensional eyeball model.
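The embodiment does not specify the concrete fitting procedure beyond "ellipse fitting". As an illustration, the sketch below fits an axis-aligned ellipsoid to the eye point cloud with linear least squares and recovers the eyeball center and radii; the function and variable names are assumptions.

```python
import numpy as np

def fit_ellipsoid(points_3d: np.ndarray):
    """Fit A*x^2 + B*y^2 + C*z^2 + D*x + E*y + F*z = 1 to an Nx3 point cloud.

    Returns the eyeball center (x0, y0, z0) and the radii (a, b, c).
    """
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    M = np.column_stack([x * x, y * y, z * z, x, y, z])
    p, *_ = np.linalg.lstsq(M, np.ones(len(points_3d)), rcond=None)

    center = np.array([-p[3] / (2 * p[0]), -p[4] / (2 * p[1]), -p[5] / (2 * p[2])])
    # Complete the square: A(x-x0)^2 + B(y-y0)^2 + C(z-z0)^2 = 1 + A*x0^2 + B*y0^2 + C*z0^2
    rhs = 1 + p[0] * center[0] ** 2 + p[1] * center[1] ** 2 + p[2] * center[2] ** 2
    radii = np.sqrt(rhs / p[:3])   # assumes the fitted quadric is a proper ellipsoid
    return center, radii
```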
Step S202: and determining the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional eye information, the two-dimensional eye image and the three-dimensional eyeball model.
Referring to fig. 3, a schematic flow chart of determining a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to three-dimensional eye information, a two-dimensional eye image, and a three-dimensional eyeball model is shown, which may include:
step S301: and acquiring a three-dimensional envelope line according to the three-dimensional eye information and the three-dimensional eyeball model.
The three-dimensional envelope can represent an orbit in a three-dimensional space, and it should be noted that the three-dimensional envelope essentially represents an eyeball visible region in the three-dimensional space.
Specifically, the process of acquiring the three-dimensional envelope curve according to the three-dimensional eye information and the three-dimensional eyeball model may include: acquiring the spatial points corresponding to the eye socket from the three-dimensional eyeball model according to the three-dimensional eye information to obtain a point set B3D, and taking the point set B3D as the three-dimensional envelope.
Step S302: and acquiring a two-dimensional envelope according to the two-dimensional eye image.
The two-dimensional envelope can represent an eye socket in a two-dimensional space, and it should be noted that the two-dimensional envelope essentially represents an eyeball visible region in the two-dimensional space.
Specifically, the process of acquiring the two-dimensional envelope from the two-dimensional eye image may include: determining key points around the eye socket from the two-dimensional eye image by adopting an edge key point detection method for two-dimensional images to obtain a point set B2D, and taking the point set B2D as the two-dimensional envelope.
Step S303: and determining the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional envelope line and the two-dimensional envelope line.
Referring to fig. 4, a schematic flow chart illustrating a process of determining a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to a three-dimensional envelope and a two-dimensional envelope may include:
step S401: and acquiring an initial mapping relation as a target mapping relation.
There are various implementation manners for obtaining the initial mapping relationship, and this embodiment provides the following three optional implementation manners:
the first implementation mode comprises the following steps: a mapping relationship is randomly generated (i.e., a spatial rotation projection matrix is randomly generated, and an alignment translation projection matrix is randomly generated) as an initial mapping relationship.
The second implementation mode comprises the following steps: randomly generating a plurality of mapping relations, determining an optimal mapping relation from the randomly generated plurality of mapping relations based on the three-dimensional envelope and the two-dimensional envelope, and taking this optimal mapping relation as the initial mapping relation.
The third implementation mode comprises the following steps: determining the mapping relation between the three-dimensional face and the two-dimensional face according to three-dimensional face information I3D and the two-dimensional face image I2D corresponding to I3D, and taking it as the initial mapping relation. It should be noted that the three-dimensional face information I3D and the corresponding two-dimensional face image I2D are data acquired simultaneously for the face of the specified object by a three-dimensional information acquisition device and a two-dimensional image acquisition device.
Considering that when the mapping relation between the three-dimensional face and the two-dimensional face is used as the initial mapping relation, the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball can be determined more quickly, the present application preferably adopts the third implementation mode to obtain the initial mapping relation.
Step S402: and generating a plurality of candidate mapping relations according to the target mapping relation.
Specifically, the generating a plurality of candidate mapping relations according to the target mapping relation includes: determining an optimization range of the mapping relation according to the preset range parameter and the target mapping relation, and generating a plurality of candidate mapping relations according to the determined optimization range. It should be noted that the plurality of candidate mapping relations are all located within the determined optimization range.
The above embodiment mentions that the mapping relation can be characterized by a spatial rotation projection matrix R and an alignment translation projection matrix T. Assume the target mapping relation is Ftarget = {Rtarget, Ttarget}. Range parameters a and b can be preset, where a corresponds to Rtarget and b corresponds to Ttarget. Based on the range parameters a and b, the optimization ranges [Rtarget - a, Rtarget + a] and [Ttarget - b, Ttarget + b] can be determined. After the optimization ranges are determined, candidate spatial rotation projection matrices are generated within [Rtarget - a, Rtarget + a], candidate alignment translation projection matrices are generated within [Ttarget - b, Ttarget + b], and each generated candidate spatial rotation projection matrix and candidate alignment translation projection matrix together form a candidate mapping relation.
Optionally, the range parameter a may take a value of 15°, and b may be set with reference to dist(Reye, Leye), the distance between the right eye and the left eye of the specified object. The values of a and b are merely examples, and the present application is not limited thereto.
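A minimal sketch of how candidate mappings inside these optimization ranges might be generated is shown below. The rotation range is realized here by perturbing Euler angles within ±a and the translation by perturbing each component within ±b; this parameterization, the number of candidates, and the function names are assumptions, not requirements of the application.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def generate_candidates(R_target, T_target, a_deg=15.0, b=1.0, n=64, rng=None):
    """Generate n candidate mappings {R, T} around the target mapping."""
    rng = rng or np.random.default_rng()
    candidates = []
    for _ in range(n):
        d_angles = rng.uniform(-a_deg, a_deg, size=3)        # rotation perturbation within +/- a
        R_cand = Rotation.from_euler("xyz", d_angles, degrees=True).as_matrix() @ R_target
        T_cand = T_target + rng.uniform(-b, b, size=3)        # translation perturbation within +/- b
        candidates.append((R_cand, T_cand))
    return candidates
```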
Step S403: and selecting an optimal candidate mapping relation from a plurality of candidate mapping relations based on the three-dimensional envelope line and the two-dimensional envelope line.
Specifically, based on the three-dimensional envelope and the two-dimensional envelope, the process of selecting the optimal candidate mapping relationship from the plurality of candidate mapping relationships may include:
step S4031: for each candidate mapping, performing:
step S4031-1: and mapping the three-dimensional envelope curve to a two-dimensional space by adopting the candidate mapping relation to obtain the mapped envelope curve.
Step S4031-2: and determining the coincidence degree of the envelope curve after mapping and the two-dimensional envelope curve so as to obtain the coincidence degree corresponding to the candidate mapping relation.
There are various implementations for determining the coincidence ratio between the mapped envelope and the two-dimensional envelope, and the embodiment provides the following two implementations:
the first implementation mode comprises the following steps: and calculating the distance between each point on the mapped envelope line and the corresponding point on the two-dimensional envelope line, summing all the calculated distances, and representing the coincidence degree of the mapped envelope line and the two-dimensional envelope line by using the summed distance. Illustratively, the mapped envelope comprises 100 points, namely x 1-x 100, and the two-dimensional envelope also comprises 100 points, namely y 1-y 100, then the distance d1 between x1 and y1, the distance d2 between x2 and y2, …, and the distance d100 between x100 and y100 are calculated, then d 1-d 100 are summed, and the coincidence degree of the mapped envelope and the two-dimensional envelope is represented by the summed distance d.
The second implementation mode comprises the following steps: sampling a plurality of points on the mapped envelope, calculating the distance between each sampled point and the corresponding point on the two-dimensional envelope, summing the calculated distances, and representing the coincidence degree of the mapped envelope and the two-dimensional envelope by the summed distance. Illustratively, if the mapped envelope comprises 100 points x1 to x100 and the two-dimensional envelope comprises 100 points y1 to y100, 50 points may be sampled from the mapped envelope, for example x1, x3, x5, ...; the distance d1 between x1 and y1, the distance d3 between x3 and y3, and so on are calculated, finally obtaining 50 distances, and the summed distance is used to represent the coincidence degree of the mapped envelope and the two-dimensional envelope.
In the second implementation manner, distances are calculated for only part of the points, so compared with the first implementation manner the calculation amount is smaller and the coincidence degree can be determined more efficiently.
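The following sketch illustrates the coincidence degree computation of step S4031 under the same assumptions as the earlier sketches: the three-dimensional envelope is mapped with a candidate mapping relation, every k-th mapped point is sampled, and the distances to the corresponding points on the two-dimensional envelope are summed (with stride 1 this reduces to the first implementation). A smaller summed distance corresponds to a higher coincidence degree; names and shapes are illustrative.

```python
import numpy as np

def summed_envelope_distance(envelope_3d, envelope_2d, R_eye, T_eye, stride=2):
    """Sum of point-to-corresponding-point distances between the mapped 3-D
    envelope and the 2-D envelope (lower sum = higher coincidence)."""
    mapped = (envelope_3d @ R_eye.T + T_eye)[:, :2]   # same orthographic assumption as above
    idx = np.arange(0, len(mapped), stride)           # preset sampling rule: every k-th point
    return float(np.linalg.norm(mapped[idx] - envelope_2d[idx], axis=1).sum())
```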
Coincidence degrees respectively corresponding to the respective candidate mapping relationships can be obtained via step S4031.
It should be noted that, in the second implementation manner of step S401, an implementation process of "determining an optimal mapping relationship from a plurality of randomly generated mapping relationships based on a three-dimensional envelope and a two-dimensional envelope" is similar to an implementation process of "selecting an optimal candidate mapping relationship from a plurality of candidate mapping relationships based on a three-dimensional envelope and a two-dimensional envelope".
Step S4032: and determining the candidate mapping relation corresponding to the highest coincidence degree as the optimal candidate mapping relation.
Step S404: and judging whether the optimal candidate mapping relation meets a preset condition, if so, executing the step S405a, otherwise, executing the step S405 b.
Specifically, the process of determining whether the optimal candidate mapping relationship satisfies the preset condition may include: and judging whether the coincidence degree corresponding to the optimal candidate mapping relation is larger than a preset coincidence degree threshold value or not.
Step S405 a: and determining the optimal candidate mapping relation as the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball.
Step S405 b: and taking the optimal candidate mapping relation as a new target mapping relation, and then executing the step S402 and the subsequent steps.
Through the above process, the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball can be determined. After this mapping relation is determined, the key point of the eyeball designated part in the two-dimensional eye image can be mapped to the three-dimensional space according to the mapping relation, so as to obtain the key point of the eyeball designated part in the three-dimensional space; the region of the eyeball designated part can then be determined from the three-dimensional eyeball model according to the key point of the eyeball designated part in the three-dimensional space.
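Putting the pieces together, a minimal sketch of the iterative optimization of steps S401 to S405 might look as follows, reusing the helper sketches above. Because the coincidence degree is represented here by a summed distance, the "preset condition" is expressed as the distance falling below a threshold; the threshold value and the iteration cap are illustrative assumptions.

```python
def estimate_eye_mapping(envelope_3d, envelope_2d, R_init, T_init,
                         dist_threshold=2.0, max_iters=50):
    """Iteratively refine the 3-D eyeball to 2-D eyeball mapping {R, T}."""
    R_target, T_target = R_init, T_init                    # step S401: initial mapping as target
    for _ in range(max_iters):
        candidates = generate_candidates(R_target, T_target)          # step S402
        best = min(candidates,                                         # step S403: best coincidence
                   key=lambda c: summed_envelope_distance(envelope_3d, envelope_2d, *c))
        best_dist = summed_envelope_distance(envelope_3d, envelope_2d, *best)
        if best_dist < dist_threshold:                     # step S404: preset condition satisfied
            return best                                    # step S405a: final eyeball mapping
        R_target, T_target = best                          # step S405b: iterate with a new target
    return R_target, T_target
```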
Third embodiment
The embodiments of the present application further provide an eye key information determining apparatus, which is described below; the eye key information determining apparatus described below and the eye key information determining method described above may be referred to in correspondence with each other.
Referring to fig. 5, a schematic structural diagram of an eye key information determining apparatus according to an embodiment of the present application is shown, which may include: an eye information acquisition module 501, a mapping relation construction module 502 and an eye key information determination module 503.
The eye information obtaining module 501 is configured to obtain three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information.
A mapping relationship establishing module 502, configured to establish a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information.
The eye key information determining module 503 is configured to map a key point of an eyeball designated portion in the two-dimensional eye image to a three-dimensional space according to a mapping relationship between the three-dimensional eyeball and the two-dimensional eyeball, so as to obtain a mapping result that can represent a position of the eyeball designated portion in the three-dimensional space.
Optionally, the mapping relationship building module 502 may include: the three-dimensional eyeball model building module and the mapping relation determining module.
The three-dimensional eyeball model building module is used for building a three-dimensional eyeball model according to the three-dimensional eye information;
the mapping relation determining module is used for determining the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional eye information, the two-dimensional eye image and the three-dimensional eyeball model.
Optionally, the three-dimensional eyeball model building module is configured to fit a three-dimensional shape of an eyeball and a center point of the eyeball according to the three-dimensional eye information by using an ellipse fitting method, so as to obtain a three-dimensional eyeball model.
Optionally, the mapping relationship determining module includes: the device comprises a three-dimensional envelope line acquisition submodule, a two-dimensional envelope line acquisition submodule and a mapping relation determination submodule.
The three-dimensional envelope acquisition submodule is used for acquiring a three-dimensional envelope according to the three-dimensional eye information and the three-dimensional eyeball model, wherein the three-dimensional envelope can represent an orbit in a three-dimensional space.
The two-dimensional envelope acquisition submodule is used for acquiring a two-dimensional envelope according to the two-dimensional eye image, wherein the two-dimensional envelope can represent an orbit in a two-dimensional space.
And the mapping relation determining submodule is used for determining the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional envelope line and the two-dimensional envelope line.
Optionally, the mapping relationship determining sub-module is specifically configured to: obtain an initial mapping relation and take it as a target mapping relation; generate a plurality of candidate mapping relations according to the target mapping relation; select an optimal candidate mapping relation from the plurality of candidate mapping relations based on the three-dimensional envelope and the two-dimensional envelope; if the optimal candidate mapping relation satisfies a preset condition, determine the optimal candidate mapping relation as the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball; and if the optimal candidate mapping relation does not satisfy the preset condition, take the optimal candidate mapping relation as a new target mapping relation and then execute the step of generating a plurality of candidate mapping relations according to the target mapping relation.
Optionally, when the mapping relationship determining submodule generates a plurality of candidate mapping relationships according to a target mapping relationship, the mapping relationship determining submodule is specifically configured to determine an optimization range of the mapping relationship according to the target mapping relationship and a preset optimization parameter, and generate a plurality of candidate mapping relationships according to the optimization range of the mapping relationship.
Optionally, the eye information obtaining module 501 is specifically configured to acquire three-dimensional face information and a two-dimensional face image corresponding to the three-dimensional face information, extract eye information from the three-dimensional face information as the three-dimensional eye information, and extract an eye image from the two-dimensional face image corresponding to the three-dimensional face information as the two-dimensional eye image corresponding to the three-dimensional eye information.
Optionally, the mapping relationship determining submodule, when obtaining an initial mapping relationship, is specifically configured to determine, according to the three-dimensional face information and the two-dimensional face image, a mapping relationship between a three-dimensional face and a two-dimensional face, as the initial mapping relationship.
Optionally, the mapping relationship determining submodule is specifically configured to randomly generate a mapping relationship as the initial mapping relationship when the initial mapping relationship is obtained.
Optionally, the mapping relationship determining submodule is specifically configured to randomly generate a plurality of mapping relationships when obtaining an initial mapping relationship, and determine an optimal mapping relationship from the plurality of randomly generated mapping relationships based on the three-dimensional envelope and the two-dimensional envelope, where the optimal mapping relationship is used as the initial mapping relationship.
Optionally, the mapping relationship determining submodule, when selecting an optimal candidate mapping relationship from the plurality of candidate mapping relationships based on the three-dimensional envelope and the two-dimensional envelope, is specifically configured to map the three-dimensional envelope to a two-dimensional space by using the candidate mapping relationship for each candidate mapping relationship to obtain a mapped envelope, determine a coincidence degree of the mapped envelope and the two-dimensional envelope to obtain a coincidence degree corresponding to the candidate mapping relationship, and determine the candidate mapping relationship corresponding to the highest coincidence degree as the optimal candidate mapping relationship after obtaining the coincidence degrees respectively corresponding to the candidate mapping relationships.
Optionally, when determining the coincidence degree of the mapped envelope and the two-dimensional envelope, the mapping relation determining submodule is specifically configured to sample a plurality of points from the mapped envelope according to a preset sampling rule, calculate the distance between each sampled point and the corresponding point on the two-dimensional envelope, sum all the calculated distances, and represent the coincidence degree of the mapped envelope and the two-dimensional envelope by the summed distance.
According to the device for determining the eye key information provided by the embodiment of the application, the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information are first obtained; the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball is then constructed according to the three-dimensional eye information and the corresponding two-dimensional eye image; and finally the key point of the eyeball designated part in the two-dimensional eye image is mapped to the three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball, so that a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space is obtained. The device for determining the eye key information provided by the embodiment of the application can therefore determine the position of the eyeball designated part in the three-dimensional space according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information.
Fourth embodiment
An embodiment of the present application further provides an eye key information determining device, please refer to fig. 6, which shows a schematic structural diagram of the eye key information determining device, where the eye key information determining device may include: at least one processor 601, at least one communication interface 602, at least one memory 603, and at least one communication bus 604;
in the embodiment of the present application, the number of the processor 601, the communication interface 602, the memory 603, and the communication bus 604 is at least one, and the processor 601, the communication interface 602, and the memory 603 complete communication with each other through the communication bus 604;
the processor 601 may be a central processing unit CPU, or an application Specific Integrated circuit asic, or one or more Integrated circuits configured to implement embodiments of the present invention, or the like;
the memory 603 may include a high-speed RAM memory, and may further include a non-volatile memory (non-volatile memory), etc., such as at least one disk memory;
The memory stores a program, and the processor may call the program stored in the memory, the program being configured to:
acquiring three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information;
constructing a mapping relation between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information;
and mapping the key point of the eyeball designated part in the two-dimensional eye image to a three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball to obtain a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space.
Alternatively, the detailed function and the extended function of the program may be as described above.
Fifth embodiment
Embodiments of the present application further provide a readable storage medium, which may store a program suitable for being executed by a processor, the program being configured to:
acquiring three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information;
constructing a mapping relation between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information;
and mapping the key point of the eyeball designated part in the two-dimensional eye image to a three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball to obtain a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (13)
1. An eye key information determination method, comprising:
acquiring three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information;
constructing a mapping relation between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information;
and mapping the key point of the eyeball designated part in the two-dimensional eye image to a three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball to obtain a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space.
2. The method for determining eye key information according to claim 1, wherein the constructing a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information comprises:
constructing a three-dimensional eyeball model according to the three-dimensional eye information;
and determining the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional eye information, the two-dimensional eye image and the three-dimensional eyeball model.
3. The method for determining eye key information according to claim 2, wherein the constructing a three-dimensional eyeball model according to the three-dimensional eye information comprises:
and fitting the three-dimensional shape of the eyeball and the central point of the eyeball by adopting an ellipse fitting method according to the three-dimensional eye information to obtain a three-dimensional eyeball model.
4. The method for determining eye key information according to claim 2, wherein the determining a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information, the two-dimensional eye image and the three-dimensional eyeball model comprises:
acquiring a three-dimensional envelope line according to the three-dimensional eye information and the three-dimensional eyeball model, wherein the three-dimensional envelope line can represent an orbit in a three-dimensional space;
acquiring a two-dimensional envelope line according to the two-dimensional eye image, wherein the two-dimensional envelope line can represent an orbit in a two-dimensional space;
and determining the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball according to the three-dimensional envelope curve and the two-dimensional envelope curve.
5. The method for determining eye key information according to claim 4, wherein determining a mapping relationship between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional envelope and the two-dimensional envelope comprises:
acquiring an initial mapping relation as a target mapping relation;
generating a plurality of candidate mapping relations according to the target mapping relation;
selecting an optimal candidate mapping relation from the plurality of candidate mapping relations based on the three-dimensional envelope line and the two-dimensional envelope line;
if the optimal candidate mapping relation meets a preset condition, determining the optimal candidate mapping relation as the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball;
and if the optimal candidate mapping relation does not meet the preset condition, taking the optimal candidate mapping relation as a new target mapping relation, and then executing the step of generating a plurality of candidate mapping relations according to the target mapping relation.
6. The method of determining ocular key information of claim 5, wherein the generating a plurality of candidate mappings from a target mapping comprises:
determining an optimization range of the mapping relation according to the target mapping relation and a preset optimization parameter;
and generating a plurality of candidate mapping relations according to the mapping relation optimizing range.
7. The method for determining eye key information according to claim 5, wherein the acquiring three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information comprises:
acquiring three-dimensional face information and a two-dimensional face image corresponding to the three-dimensional face information;
extracting eye information from the three-dimensional face information to serve as three-dimensional eye information;
extracting eye images from two-dimensional face images corresponding to the three-dimensional face information to serve as two-dimensional eye images corresponding to the three-dimensional eye information;
the obtaining of the initial mapping relationship includes:
and determining a mapping relation between the three-dimensional face and the two-dimensional face as the initial mapping relation according to the three-dimensional face information and the two-dimensional face image.
8. The method for determining eye key information according to claim 5, wherein the obtaining an initial mapping relationship comprises:
and randomly generating a mapping relation as the initial mapping relation.
Or,
and randomly generating a plurality of mapping relations, and determining an optimal mapping relation from the randomly generated plurality of mapping relations based on the three-dimensional envelope line and the two-dimensional envelope line as the initial mapping relation.
9. The method for determining eye key information according to claim 5, wherein the selecting an optimal candidate mapping relationship from the plurality of candidate mapping relationships based on the three-dimensional envelope and the two-dimensional envelope comprises:
for each candidate mapping:
mapping the three-dimensional envelope curve to a two-dimensional space by adopting the candidate mapping relation to obtain a mapped envelope curve;
determining the coincidence degree of the envelope curve after mapping and the two-dimensional envelope curve to obtain the coincidence degree corresponding to the candidate mapping relation;
and after the coincidence degrees respectively corresponding to the candidate mapping relations are obtained, determining the candidate mapping relation corresponding to the highest coincidence degree as the optimal candidate mapping relation.
10. The method for determining eye key information according to claim 9, wherein the determining the coincidence degree of the mapped envelope line and the two-dimensional envelope line comprises:
sampling a plurality of points from the mapped envelope line according to a preset sampling rule;
calculating the distance between each point obtained by sampling and the corresponding point on the two-dimensional envelope line;
and summing all the calculated distances, and using the summed distance to represent the coincidence degree of the mapped envelope line and the two-dimensional envelope line.
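Claims 9 and 10 together define how a candidate mapping is scored. The sketch below fills in details the claims leave open, all of which are assumptions: the mapping relation is treated as a 12-parameter (3x4) projection matrix, the "preset sampling rule" is evenly spaced sampling, the "corresponding point" on the two-dimensional envelope line is approximated by the nearest point, and a smaller summed distance is treated as a higher coincidence degree (hence the negated sum is returned as the score).

```python
import numpy as np

def project_envelope(mapping, envelope_3d):
    """Map an (N, 3) 3D envelope into 2D with a 3x4 projection matrix."""
    P = np.asarray(mapping, dtype=float).reshape(3, 4)
    pts_h = np.hstack([np.asarray(envelope_3d, dtype=float),
                       np.ones((len(envelope_3d), 1))])
    proj = pts_h @ P.T                      # (N, 3) homogeneous image points
    return proj[:, :2] / proj[:, 2:3]       # perspective divide -> (N, 2)

def coincidence(mapping, envelope_3d, envelope_2d, n_samples=50):
    """Score a candidate mapping relation by envelope overlap (claims 9-10)."""
    mapped = project_envelope(mapping, envelope_3d)
    # Sampling rule (assumption): n_samples evenly spaced points on the
    # mapped envelope line.
    idx = np.linspace(0, len(mapped) - 1, n_samples).astype(int)
    samples = mapped[idx]
    # Distance from each sampled point to its corresponding (here: nearest)
    # point on the two-dimensional envelope line.
    env2d = np.asarray(envelope_2d, dtype=float)
    dists = np.linalg.norm(samples[:, None, :] - env2d[None, :, :], axis=2)
    summed = dists.min(axis=1).sum()
    # Smaller summed distance == better overlap; negate so that claim 9's
    # "highest coincidence degree" corresponds to the largest score.
    return -float(summed)
```

Under these conventions, coincidence can be passed directly to the refine_mapping and random_initial_mapping sketches given earlier.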
11. An eye key information determination device, comprising: an eye information acquisition module, a mapping relation construction module and an eye key information determining module;
the eye information acquisition module is used for acquiring three-dimensional eye information and a two-dimensional eye image corresponding to the three-dimensional eye information;
the mapping relation construction module is used for constructing a mapping relation between a three-dimensional eyeball and a two-dimensional eyeball according to the three-dimensional eye information and the two-dimensional eye image corresponding to the three-dimensional eye information;
the eye key information determining module is used for mapping key points of the eyeball designated part in the two-dimensional eye image to a three-dimensional space according to the mapping relation between the three-dimensional eyeball and the two-dimensional eyeball to obtain a mapping result capable of representing the position of the eyeball designated part in the three-dimensional space.
12. An eye key information determination equipment, comprising: a memory and a processor;
the memory is used for storing programs;
the processor is configured to execute the program to implement the steps of the eye key information determination method according to any one of claims 1 to 10.
13. A readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, performs the steps of the method for determining eye key information according to any one of claims 1 to 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110297147.1A CN112949551B (en) | 2021-03-19 | 2021-03-19 | Eye key information determination method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112949551A (en) | 2021-06-11 |
CN112949551B (en) | 2024-08-27 |
Family
ID=76227250
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110297147.1A Active CN112949551B (en) | 2021-03-19 | 2021-03-19 | Eye key information determination method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112949551B (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107145224A (en) * | 2017-04-07 | 2017-09-08 | 清华大学 | Human eye sight tracking and device based on three-dimensional sphere Taylor expansion |
CN108182699A (en) * | 2017-12-28 | 2018-06-19 | 北京天睿空间科技股份有限公司 | Three-dimensional registration method based on two dimensional image local deformation |
WO2019218887A1 (en) * | 2018-04-27 | 2019-11-21 | 苏州新光维医疗科技有限公司 | Method and device for converting two-dimensional image into three-dimensional image, and three-dimensional imaging system |
CN109389069A (en) * | 2018-09-28 | 2019-02-26 | 北京市商汤科技开发有限公司 | Blinkpunkt judgment method and device, electronic equipment and computer storage medium |
US10402978B1 (en) * | 2019-01-25 | 2019-09-03 | StradVision, Inc. | Method for detecting pseudo-3D bounding box based on CNN capable of converting modes according to poses of objects using instance segmentation and device using the same |
CN111540383A (en) * | 2019-02-06 | 2020-08-14 | 丰田自动车株式会社 | Voice conversation device, control program, and control method thereof |
CN109872397A (en) * | 2019-02-18 | 2019-06-11 | 北京工业大学 | A kind of three-dimensional rebuilding method of the airplane parts based on multi-view stereo vision |
CN110136273A (en) * | 2019-03-29 | 2019-08-16 | 初速度(苏州)科技有限公司 | A kind of sample data mask method and device in machine learning |
CN110135453A (en) * | 2019-03-29 | 2019-08-16 | 初速度(苏州)科技有限公司 | A kind of laser point cloud data mask method and device |
CN111161397A (en) * | 2019-12-02 | 2020-05-15 | 支付宝(杭州)信息技术有限公司 | Face three-dimensional reconstruction method and device, electronic equipment and readable storage medium |
Non-Patent Citations (5)
Title |
---|
ADRIAN J. CHUNG et al.: "Extraction of visual features with eye tracking for saliency driven 2D/3D registration", Image and Vision Computing, vol. 23, no. 11, pages 999-1008 *
MOHSEN MANSOURYAR et al.: "3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers", arXiv (Human-Computer Interaction), pages 1-6 *
大脑技术: "Pupil-Cornea Tracking Method (Pupil-CR)", pages 1-4, Retrieved from the Internet <URL:zhuanlan.zhihu.com/p/280746366> *
张云峰: "Research and Application of Image and Three-dimensional Model Matching Methods", Information Science and Technology Series, no. 07, pages 7-50 *
陈骥 et al.: "Three-dimensional Reconstruction of Fundus Images", Journal of Biomedical Engineering, vol. 25, no. 1, pages 177-181 *
Also Published As
Publication number | Publication date |
---|---|
CN112949551B (en) | 2024-08-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3698275B1 (en) | Data processing method, apparatus, system and storage media | |
EP3786890A1 (en) | Method and apparatus for determining pose of image capture device, and storage medium therefor | |
CN110276829B (en) | Three-dimensional representation by multi-scale voxel hash processing | |
CN111784821B (en) | Three-dimensional model generation method and device, computer equipment and storage medium | |
EP4002290A1 (en) | Three-dimensional facial model generation method and apparatus, computer device and storage medium | |
JP2020517027A (en) | Method and apparatus for determining facial image quality, electronic device and computer storage medium | |
CN112771573A (en) | Depth estimation method and device based on speckle images and face recognition system | |
WO2012177166A1 (en) | An efficient approach to estimate disparity map | |
Zhang et al. | Indepth: Real-time depth inpainting for mobile augmented reality | |
CN113470112A (en) | Image processing method, image processing device, storage medium and terminal | |
CN114792355A (en) | Virtual image generation method and device, electronic equipment and storage medium | |
US20180247451A1 (en) | System and method for three dimensional object reconstruction and quality monitoring | |
CN113627298A (en) | Training method of target detection model and method and device for detecting target object | |
CN112949551A (en) | Eye key information determination method, device, equipment and storage medium | |
CN110852132B (en) | Two-dimensional code space position confirmation method and device | |
CN113093907A (en) | Man-machine interaction method, system, equipment and storage medium | |
US11783501B2 (en) | Method and apparatus for determining image depth information, electronic device, and media | |
CN115761123B (en) | Three-dimensional model processing method, three-dimensional model processing device, electronic equipment and storage medium | |
CN115031635A (en) | Measuring method and device, electronic device and storage medium | |
CN113902768B (en) | Three-dimensional face model edge optimization method and system based on micro-rendering | |
CN111754632B (en) | Business service processing method, device, equipment and storage medium | |
CN115187821A (en) | Method for verifying correctness before and after model conversion, related device and program product | |
CN114373046A (en) | Method and device for assisting robot to operate and storage medium | |
CN113706543A (en) | Three-dimensional pose construction method and equipment and storage medium | |
CN112991451A (en) | Image recognition method, related device and computer program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||