CN110929672A - Pupil positioning method and electronic equipment - Google Patents

Pupil positioning method and electronic equipment

Info

Publication number
CN110929672A
CN110929672A (application CN201911212042.0A; granted as CN110929672B)
Authority
CN
China
Prior art keywords
eye
image
determining
optimal
contours
Prior art date
Legal status
Granted
Application number
CN201911212042.0A
Other languages
Chinese (zh)
Other versions
CN110929672B (en)
Inventor
杨大业
宋建华
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201911212042.0A priority Critical patent/CN110929672B/en
Publication of CN110929672A publication Critical patent/CN110929672A/en
Application granted granted Critical
Publication of CN110929672B publication Critical patent/CN110929672B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Abstract

The present disclosure provides a pupil positioning method. The method comprises: acquiring a current frame image, wherein the current frame image comprises an eye image; determining an initial region of the eye image in the current frame image; fitting pixels in the initial region to generate a plurality of eye contours; determining an optimal eye contour based on the plurality of eye contours; and, in a case where the optimal eye contour satisfies a preset condition, determining the position of the pupil in the eye image based on the optimal eye contour. The present disclosure also provides an electronic device.

Description

Pupil positioning method and electronic equipment
Technical Field
The disclosure relates to a pupil positioning method and an electronic device.
Background
With the rapid development of artificial intelligence, automatic control, communication, and computer technology, human-machine interaction based on eye tracking has become important in VR (virtual reality) and AR (augmented reality) applications owing to its convenience and efficiency. The most critical step in eye tracking is locating the position of the pupil. In the related art, the pupil position is usually determined based on the Purkinje spot; however, this method has low positioning accuracy, making it difficult to locate the pupil precisely.
Disclosure of Invention
One aspect of the present disclosure provides a pupil positioning method, including: acquiring a current frame image, wherein the current frame image comprises an eye image; determining an initial region of the eye image in the current frame image; fitting the pixels in the initial region to generate a plurality of eye contours; determining an optimal eye contour based on the plurality of eye contours; and under the condition that the optimal eye contour meets the preset condition, determining the position of the pupil in the eye image based on the optimal eye contour.
Optionally, fitting the pixels in the initial region to generate a plurality of eye contours comprises: carrying out edge detection on the image in the initial region to obtain a plurality of edge pixels; performing ellipse fitting on the plurality of edge pixels to form a plurality of ellipses; and determining a plurality of eye contours based on the plurality of ellipses.
Optionally, determining the plurality of eye contours based on the plurality of ellipses comprises: calculating an aspect ratio and an area of each of the plurality of ellipses, wherein the aspect ratio indicates a proportional relationship between a major axis and a minor axis of the ellipse; and taking an ellipse with the aspect ratio and the area both conforming to the eye rules as the eye contour.
Optionally, determining the optimal eye contour based on the plurality of eye contours comprises: judging the similarity among the plurality of eye contours; in a case where the similarity is greater than a first preset value, determining the position coordinates of each of a plurality of eye feature points in the plurality of eye contours respectively; determining an average value of the position coordinates of each of the plurality of eye feature points; and determining the optimal eye contour based on the average value.
Optionally, performing ellipse fitting on the plurality of edge pixels to form a plurality of ellipses comprises: performing ellipse fitting on the plurality of edge pixels by using a least-squares method to form a plurality of ellipses, and determining a loss function corresponding to each of the plurality of ellipses. Determining the optimal eye contour based on the plurality of eye contours comprises determining an optimal ellipse based on the plurality of ellipses. The optimal eye contour satisfying the preset condition comprises the fitting quality of the optimal ellipse being greater than a second preset value, wherein the fitting quality of the optimal ellipse is determined according to the loss function of the optimal ellipse.
Optionally, the method further includes determining a stable region based on a gray value of the image in the initial region when the optimal eye contour does not satisfy a preset condition; performing ellipse fitting on the stable region to obtain a plurality of eye contours; determining an optimal eye contour based on the plurality of eye contours; and determining the position of the pupil in the eye image based on the optimal eye contour.
Optionally, the method further comprises reducing the resolution of the current frame image and processing the current frame image into a grayscale image.
Optionally, determining an initial region of the eye image in the current frame image comprises: determining a historical eye image, wherein the historical eye image is determined according to a previous frame image of a current frame image; and taking the region where the historical eye image is located as an initial region.
Another aspect of the disclosure provides an electronic device comprising one or more processors; a storage device to store one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform: acquiring a current frame image, wherein the current frame image comprises an eye image; determining an initial region of the eye image in the current frame image; extracting edge pixels of the image in the initial region, and determining a plurality of eye contours according to the edge pixels; determining an optimal eye contour based on the plurality of eye contours; and under the condition that the optimal eye contour meets the preset condition, determining the position of the pupil in the eye image based on the optimal eye contour.
Optionally, the processor extracting edge pixels of the image in the initial region, and determining the plurality of eye contours from the edges comprises: carrying out edge detection on the image in the initial region to obtain a plurality of edge pixels; performing ellipse fitting on the plurality of edge pixels to form a plurality of ellipses; and determining a plurality of eye contours based on the plurality of ellipses.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically illustrates an application scenario of pupil positioning according to an embodiment of the present disclosure;
fig. 2A schematically illustrates a flowchart of a pupil positioning method according to an embodiment of the present disclosure;
fig. 2B schematically illustrates an exemplary schematic of an initial region according to an embodiment of the present disclosure;
fig. 3A schematically illustrates a flowchart of a method of generating a plurality of eye contours in operation S203 according to an embodiment of the present disclosure;
fig. 3B schematically illustrates an exemplary diagram of a plurality of ellipses formed by ellipse fitting of edge pixels according to an embodiment of the present disclosure;
fig. 4 schematically shows a flowchart of a method of determining an optimal eye contour according to an embodiment of the present disclosure;
fig. 5 schematically illustrates a flowchart of a pupil positioning method according to another embodiment of the present disclosure;
fig. 6 schematically illustrates a flowchart of a method of determining an initial region according to an embodiment of the present disclosure;
fig. 7 schematically illustrates a flowchart of a pupil positioning method according to another embodiment of the present disclosure;
fig. 8 schematically illustrates a block diagram of a pupil positioning device according to an embodiment of the present disclosure; and
fig. 9 schematically shows a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B, and C, etc." is used, such a construction is generally intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include, but not be limited to, systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together). Where a convention analogous to "at least one of A, B, or C, etc." is used, such a construction is generally intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include, but not be limited to, systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
Embodiments of the present disclosure provide a pupil positioning method. The method comprises the steps of obtaining a current frame image, wherein the current frame image comprises an eye image, determining an initial region of the eye image in the current frame image, fitting pixels in the initial region to generate a plurality of eye contours, determining an optimal eye contour based on the eye contours, and determining the position of a pupil in the eye image based on the optimal eye contour under the condition that the optimal eye contour meets a preset condition.
Fig. 1 schematically illustrates an application scenario of pupil positioning according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, a reading device 110 and a user using the reading device 110 to read an electronic book are included in the application scenario. The reading device 110 may, for example, include pupil positioning means for positioning the pupil of the user. The electronic book may be controlled to automatically turn pages, for example, when the pupil locating device determines that the user's pupil is looking at the lower end of the reading device 110.
The pupil positioning device can realize pupil positioning by using the method according to the embodiment of the disclosure.
Fig. 2A schematically illustrates a flow chart of a pupil location method according to an embodiment of the present disclosure.
As shown in fig. 2A, the method includes operations S201 to S205.
In operation S201, a current frame image is acquired, where the current frame image includes an eye image.
According to an embodiment of the present disclosure, a face image of the user may be acquired by, for example, a near-infrared sensor. Alternatively, the face image may be acquired via a camera or the like.
It is understood that the current frame image is not necessarily the face image of the user, and may be an image of a portion above the nose, for example, as long as the image includes eyes.
In operation S202, an initial region of the eye image in the current frame image is determined.
Fig. 2B schematically illustrates an exemplary schematic of an initial region according to an embodiment of the disclosure. As shown in fig. 2B, the initial region of the eye image in the current frame image determined in operation S202 may include, for example, a region 210 and a region 220.
According to an embodiment of the disclosure, the current frame image may be processed, for example, by a convolutional neural network to identify the initial region where the eye image is located. Fig. 6 schematically illustrates a flowchart of another method for determining the initial region of the eye image in the current frame image, which is described below.
In operation S203, pixels in the initial region are fitted to generate a plurality of eye contours.
According to an embodiment of the present disclosure, edge detection may be performed on the initial region, for example, to obtain the edge of the iris. Wherein an edge may for example comprise a plurality of line segments. Next, a plurality of line segments included by the edges of the irises may be fitted to generate contours of the plurality of irises.
For another example, edge detection may be performed on the initial region to obtain the edge of the pupil. Wherein an edge may for example comprise a plurality of line segments. Next, a plurality of line segments included at the edge of the pupil may be fitted to generate a contour of the plurality of pupils.
In operation S204, an optimal eye contour is determined based on the plurality of eye contours.
According to an embodiment of the present disclosure, for example, an optimal eye contour may be selected from a plurality of eye contours, or an optimal eye contour may be calculated from a plurality of eye contours.
In operation S205, in a case where the optimal eye contour satisfies a preset condition, a position of the pupil in the eye image is determined based on the optimal eye contour.
According to an embodiment of the present disclosure, the preset condition may be set by a person skilled in the art according to data of a large number of eye features, for example.
If the optimal eye contour satisfies the preset condition, the fitted eye contour meets the requirement, and the position of the pupil can be determined from it. If the optimal eye contour does not satisfy the preset condition, the fitted eye contour does not meet the requirement, and the pupil position cannot be determined from it. In that case, other methods may be used to determine the position of the pupil, for example, based on the Purkinje spot.
According to an embodiment of the disclosure, by fitting the pixels of the initial region and determining an optimal eye contour from the fitted plurality of eye contours, the method can locate the pupil according to the optimal eye contour, thereby at least partially solving the problem in the related art that the pupil is difficult to locate accurately using Purkinje spots.
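The control flow of operations S203 through S205, together with the fallback described later in connection with fig. 5, can be sketched as follows. This is an illustrative Python skeleton; the callable names and the dict-based contour representation are assumptions, not part of the disclosure:

```python
def locate_pupil(region, fit_contours, pick_best, quality_ok, fallback):
    """Control-flow skeleton of operations S203-S205 plus the fallback path.

    The four callables stand in for the stages described in the text; their
    names and signatures are illustrative placeholders.
    """
    contours = fit_contours(region)   # S203: fit pixels into eye contours
    best = pick_best(contours)        # S204: choose the optimal contour
    if quality_ok(best):              # S205: preset condition satisfied
        return best["center"]
    return fallback(region)           # otherwise: gray-value stable region

# Toy stand-ins that exercise both branches of the preset condition.
fits_well = lambda r: [{"center": (5, 5), "quality": 0.9}]
fits_badly = lambda r: [{"center": (5, 5), "quality": 0.1}]
pick = lambda cs: cs[0]
good = lambda c: c["quality"] > 0.5
fallback = lambda r: (0, 0)

hit = locate_pupil(None, fits_well, pick, good, fallback)
miss = locate_pupil(None, fits_badly, pick, good, fallback)
```

When the fitted contour passes the quality check, its center is returned directly; otherwise the fallback route of fig. 5 takes over.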
Fig. 3A schematically illustrates an example method flowchart for generating a plurality of eye contours in operation S203 according to an embodiment of the present disclosure.
As shown in fig. 3A, the method may include operations S213 to S233.
In operation S213, edge detection is performed on the image in the initial region to obtain a plurality of edge pixels.
According to an embodiment of the present disclosure, the Sobel operator may be used to compute the image gradient, thereby identifying edges in the initial region. Alternatively, edge detection may be performed using the Marr-Hildreth detection method, or a combination of several edge-detection algorithms may be applied to the initial region.
According to the embodiment of the present disclosure, the edge detection of the initial region may obtain a plurality of line segments, and the plurality of line segments are respectively composed of a plurality of pixels.
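As an illustration of the edge-detection step, the following sketch applies the Sobel operator to a toy grayscale image and thresholds the gradient magnitude. The image size, disc geometry, and threshold are arbitrary illustrative choices, not values from the disclosure:

```python
import numpy as np

def sobel_edges(img, thresh):
    """Mark pixels whose Sobel gradient magnitude exceeds `thresh`.

    `img` is a 2-D float array (grayscale); returns a boolean edge map.
    Border pixels are left unmarked for simplicity.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # horizontal gradient
    ky = kx.T                                                   # vertical gradient
    h, w = img.shape
    edges = np.zeros((h, w), bool)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = (patch * kx).sum()
            gy = (patch * ky).sum()
            edges[y, x] = np.hypot(gx, gy) > thresh
    return edges

# A dark disc (standing in for a pupil) on a bright background:
# edge pixels appear at its rim.
yy, xx = np.mgrid[0:32, 0:32]
img = np.where((xx - 16) ** 2 + (yy - 16) ** 2 < 64, 30.0, 200.0)
edge_map = sobel_edges(img, thresh=100.0)
```

The marked pixels at the rim of the disc are exactly the kind of edge pixels that the subsequent ellipse-fitting step consumes.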
In operation S223, ellipse fitting is performed on the plurality of edge pixels to form a plurality of ellipses.
FIG. 3B schematically shows an exemplary schematic diagram of a plurality of ellipses formed by ellipse fitting of edge pixels according to an embodiment of the disclosure.
As shown in FIG. 3B, edge detection is performed on the initial region, and the obtained edges may include, for example, edges 310-360.
Ellipse fitting is performed on the plurality of edges; for example, edges 310-330 fit an ellipse centered at O1 (for ease of understanding, the triangle markers in FIG. 3B indicate the ellipse centered at O1), while edges 340-360 fit an ellipse centered at O2 (the circle markers in FIG. 3B indicate the ellipse centered at O2).
According to an embodiment of the present disclosure, in operation S223 a plurality of ellipses may be formed by performing ellipse fitting on the plurality of edge pixels using, for example, a least-squares method, the edge pixels having been obtained by edge detection on the initial region. In this embodiment, operations S204 and S205 may comprise determining an optimal ellipse based on the plurality of eye contours and determining that the optimal eye contour satisfies the preset condition if the fitting quality of the optimal ellipse is greater than a second preset value. The fitting quality may be characterized by the loss function of the optimal ellipse; for example, the loss function of the optimal ellipse may be an average of the loss functions produced when the plurality of eye contours are obtained by least-squares fitting of the edge pixels.
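A minimal sketch of least-squares ellipse fitting with a per-fit loss, assuming an algebraic conic parameterization; the disclosure does not fix the exact formulation, so this is one plausible choice:

```python
import numpy as np

def fit_ellipse(xs, ys):
    """Least-squares fit of the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1.

    Returns the coefficient vector and the mean squared residual, which can
    serve as the per-ellipse loss used to judge fitting quality.
    """
    A = np.column_stack([xs**2, xs*ys, ys**2, xs, ys])
    coef, *_ = np.linalg.lstsq(A, np.ones_like(xs), rcond=None)
    loss = float(np.mean((A @ coef - 1.0) ** 2))
    return coef, loss

def ellipse_center(coef):
    """The center is where the conic's gradient vanishes."""
    a, b, c, d, e = coef
    M = np.array([[2 * a, b], [b, 2 * c]])
    return np.linalg.solve(M, [-d, -e])

# Sample points on an ellipse centered at (3, -1) with semi-axes 4 and 2.
t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
xs = 3 + 4 * np.cos(t)
ys = -1 + 2 * np.sin(t)
coef, loss = fit_ellipse(xs, ys)
cx, cy = ellipse_center(coef)
```

On clean edge pixels the loss is near zero; noisier edges give a larger loss, which is what the second-preset-value quality check can threshold.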
In operation S233, a plurality of eye contours are determined based on the plurality of ellipses.
According to the embodiments of the present disclosure, for example, the aspect ratio and the area of each of a plurality of ellipses may be calculated, and an ellipse whose aspect ratio and area both conform to the ocular rule may be taken as the ocular contour. Where the aspect ratio indicates the proportional relationship between the major and minor axes of the ellipse.
For example, an ellipse having an aspect ratio within a first predetermined range and an elliptical area within a second predetermined range may be used as the iris outline.
According to an embodiment of the present disclosure, the first preset range and the second preset range may be determined by those skilled in the art according to a large number of iris shapes.
According to the embodiment of the disclosure, the method can determine the plurality of eye contours by performing ellipse fitting on the plurality of edge pixels, so that the technical effect of determining the eye contours according to the characteristics of pupils or irises is achieved, and the fitting result is more accurate.
It is understood that an ellipse in this disclosure includes a circle, i.e., an ellipse whose major axis and minor axis are equal.
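The aspect-ratio and area filter described above can be sketched as follows. The numeric ranges are illustrative placeholders, since the disclosure leaves the concrete "eye rule" thresholds to be derived from real eye data:

```python
import math

def plausible_eye_contours(ellipses, ratio_range=(1.0, 2.5), area_range=(200.0, 5000.0)):
    """Keep only ellipses whose shape matches typical pupil/iris geometry.

    `ellipses` is a list of (major, minor) semi-axis pairs in pixels; the
    ranges are hypothetical tuning parameters, not values from the patent.
    """
    kept = []
    for major, minor in ellipses:
        ratio = major / minor           # 1.0 for a perfect circle
        area = math.pi * major * minor  # area of an ellipse
        if ratio_range[0] <= ratio <= ratio_range[1] and area_range[0] <= area <= area_range[1]:
            kept.append((major, minor))
    return kept

candidates = [(20.0, 18.0),  # near-circular, iris-sized: kept
              (40.0, 4.0),   # far too elongated: rejected
              (3.0, 2.0)]    # far too small: rejected
good = plausible_eye_contours(candidates)
```

Only the near-circular, plausibly sized candidate survives, mirroring how implausible fits are discarded before the optimal contour is chosen.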
Fig. 4 schematically illustrates an example method flowchart for determining an optimal eye contour in operation S204 according to an embodiment of the present disclosure.
As shown in FIG. 4, the method may include operations S214 to S244.
In operation S214, a similarity between a plurality of eye contours is determined.
For example, the gray level distribution in each eye contour may be analyzed, and the plurality of gray level distributions may be compared to determine the similarity between the plurality of eye contours based on the gray level distributions.
According to an embodiment of the disclosure, in addition to determining the similarity between eye contours through the gray-scale distribution, the similarity can also be determined according to the positions of the eye contours.
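One plausible way to judge the gray-distribution similarity mentioned above is histogram intersection. The metric and bin count here are assumptions for illustration; the disclosure does not specify the comparison:

```python
import numpy as np

def histogram_similarity(patch_a, patch_b, bins=16):
    """Compare the gray-level distributions inside two candidate contours
    using histogram intersection (1.0 = identical distributions).

    The patches stand in for the pixels enclosed by each eye contour.
    """
    ha, _ = np.histogram(patch_a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(patch_b, bins=bins, range=(0, 256))
    # Normalize to probability mass so the intersection lies in [0, 1].
    ha = ha / ha.sum()
    hb = hb / hb.sum()
    return float(np.minimum(ha, hb).sum())

a = np.full((10, 10), 40, dtype=np.uint8)
b = np.full((10, 10), 42, dtype=np.uint8)   # falls in the same bin as `a`
c = np.full((10, 10), 200, dtype=np.uint8)  # entirely different gray level
sim_ab = histogram_similarity(a, b)
sim_ac = histogram_similarity(a, c)
```

Contours whose similarity exceeds the first preset value would then proceed to the feature-point averaging step.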
In operation S224, in a case where the similarity is greater than a first preset value, position coordinates of each of the plurality of eye feature points in the plurality of eye contours, respectively, are determined.
According to the embodiment of the present disclosure, for example, in the case where the similarity of a partial eye contour among a plurality of eye contours is greater than a first preset value, the position coordinates of the eye feature points in the partial eye contour are determined.
The eye feature points may be, for example, specific points of the eye contour, such as the center point of the eye contour and the end points of the contour on the major axis and the minor axis.
In operation S234, an average value of the position coordinates of each of the plurality of eye feature points is determined.
In operation S244, an optimal eye contour is determined based on the average value.
According to the embodiment of the present disclosure, in operations S234 and S244, for example, an average value of the position coordinates of the center points of the plurality of eye contours may be calculated to obtain the position of the center point of the optimal eye contour, and an average value of the position coordinates of the end points corresponding to each of the plurality of eye contours may be calculated to obtain the position of the end point of the optimal eye contour. Thus, the optimal eye contour is determined according to the positions of the central point and the plurality of end points. For example, an optimal iris profile, or an optimal pupil profile, may be obtained.
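Averaging the feature-point coordinates across similar contours (operations S234 and S244) can be sketched as follows. The dict representation and feature-point names are hypothetical:

```python
def average_contour(contours):
    """Average matching feature points across a set of similar contours.

    Each contour is a dict mapping a feature-point name (e.g. 'center',
    'major_end') to an (x, y) coordinate; the optimal contour takes the mean
    position of each named point.
    """
    optimal = {}
    for name in contours[0]:
        xs = [c[name][0] for c in contours]
        ys = [c[name][1] for c in contours]
        optimal[name] = (sum(xs) / len(xs), sum(ys) / len(ys))
    return optimal

# Two similar fitted contours whose feature points are averaged.
fits = [{"center": (100.0, 60.0), "major_end": (130.0, 60.0)},
        {"center": (102.0, 62.0), "major_end": (132.0, 62.0)}]
best = average_contour(fits)
```

The averaged center and end points together define the optimal contour, from which the pupil position can be read off.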
Fig. 5 schematically shows a flow chart of a pupil localization method according to another embodiment of the present disclosure.
As shown in fig. 5, the method may further include operations S501 to S504 on the basis of operations S201 to S204 described in fig. 2A.
In operation S501, in the case where the optimal eye contour does not satisfy the preset condition, a stable region is determined based on a gray value of an image in the initial region.
According to an embodiment of the present disclosure, a plurality of pixels whose gray values conform to the gray characteristic of the pupil may be found in the initial region. For example, the gray value of the pupil in such an image is typically greater than that of other regions of the eye. Therefore, the pixels with the largest gray values can be found in the initial region, and the region formed by those pixels can be used as the stable region.
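A sketch of selecting the stable region from the largest gray values, assuming a bright-pupil near-infrared image as the text describes; the fraction of pixels kept is an arbitrary illustrative parameter (for a dark-pupil image one would take the smallest values instead):

```python
import numpy as np

def stable_region(gray, frac=0.05):
    """Return (xs, ys) coordinates of the brightest `frac` of pixels,
    which serve as the stable region for the fallback ellipse fit.
    """
    flat = np.sort(gray.ravel())
    cutoff = flat[int((1.0 - frac) * (flat.size - 1))]
    ys, xs = np.nonzero(gray >= cutoff)
    return xs, ys

# A bright 6x6 blob in a 40x40 dark frame stands in for the pupil.
img = np.full((40, 40), 30, dtype=np.uint8)
img[10:16, 20:26] = 250
xs, ys = stable_region(img, frac=0.02)
```

The returned pixel coordinates are then what operation S502 would feed into the least-squares ellipse fit.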
In operation S502, a plurality of eye contours are obtained by ellipse fitting the stable region.
According to an embodiment of the present disclosure, a plurality of eye contours may be obtained by fitting a plurality of pixels of the stable region using, for example, a least squares method.
In operation S503, an optimal eye contour is determined based on the plurality of eye contours. The optimal eye contour may be determined, for example, using the operations described in operation S204.
In operation S504, a position of a pupil in the eye image is determined based on the optimal eye contour. For example, the center position of the optimal eye contour may be used as the position of the pupil in the eye image.
According to an embodiment of the present disclosure, when the optimal eye contour determined by ellipse fitting of the edge pixels fails to locate the pupil, the pupil can still be located by the method described in fig. 5. The method depicted in fig. 2A offers high positioning accuracy, and when it cannot determine the pupil position, the method of fig. 5 is automatically adopted to re-analyze the image gray values and obtain a positioning result, thereby at least partially avoiding the situation in which the pupil position cannot be determined.
According to another embodiment of the pupil positioning method of the present disclosure, after the current frame image is acquired in operation S201, the method may further include reducing the resolution of the current frame image and processing it into a grayscale image.
According to an embodiment of the present disclosure, reducing the resolution of the current frame image and converting it to grayscale before the subsequent operations can reduce the amount of computation in operations S202 to S205 and operations S501 to S504, making pupil positioning faster and reducing delay.
According to an embodiment of the present disclosure, the resolution of a near-infrared image is high, and not all of the image data is needed for image processing (e.g., edge detection, ellipse fitting), so the resolution of the near-infrared image can be reduced. Specifically, the resolution of the current frame image may be reduced by, for example, smoothing the image or applying a decimation method.
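Smoothing followed by decimation can be combined into block averaging; a minimal sketch of one such downsampling scheme (the factor is an illustrative choice):

```python
import numpy as np

def downsample(gray, factor=2):
    """Reduce resolution by averaging each `factor` x `factor` block,
    i.e. a box-filter smoothing and decimation performed in one step.
    """
    h, w = gray.shape
    h, w = h - h % factor, w - w % factor  # drop any ragged border rows/cols
    blocks = gray[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
small = downsample(img, factor=2)
```

Each output pixel is the mean of a 2x2 input block, quartering the pixel count that the later edge detection and fitting must process.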
According to another embodiment of the present disclosure, after operation S202 and before operation S203, the image of the initial region may be convolved using a gaussian kernel to reduce the number of resulting edges, thereby reducing the complexity of the fitting.
Fig. 6 schematically illustrates an example method flowchart for determining an initial region in accordance with an embodiment of the present disclosure.
As shown in fig. 6, the method may include operations S212 and S222.
In operation S212, a historical eye image is determined, which is determined from a previous frame image of a current frame image.
According to an embodiment of the present disclosure, the historical eye image of the user may be determined, for example, from the position of the pupil center determined from the previous frame of image. Specifically, for example, the iris region of the eye may be determined according to the pupil center position of the user, so as to determine the top-bottom width and the left-right width of the eye of the user.
In operation S222, a region where the history eye image is located is taken as an initial region.
According to the embodiment of the disclosure, the method can take the eye image position determined by the previous frame image as the initial region, and simplifies the calculation process for identifying the eye image from the current frame image.
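Taking the previous frame's result as the initial region (operations S212 and S222) might look like the following. The rectangle parameterization around the previous pupil center is an assumption about how the history is stored:

```python
def initial_region_from_history(prev_pupil_center, eye_size, frame_shape):
    """Clip a rectangle around the previous frame's pupil center to serve as
    the current frame's initial search region.

    `eye_size` is the (width, height) of the eye region estimated from the
    previous frame; both arguments are illustrative, as the patent leaves the
    exact bookkeeping open. Returns (x0, y0, x1, y1) clipped to the frame.
    """
    cx, cy = prev_pupil_center
    w, h = eye_size
    frame_h, frame_w = frame_shape
    x0 = max(0, int(cx - w / 2))
    x1 = min(frame_w, int(cx + w / 2))
    y0 = max(0, int(cy - h / 2))
    y1 = min(frame_h, int(cy + h / 2))
    return x0, y0, x1, y1

region = initial_region_from_history((120, 80), (60, 30), (240, 320))
```

Because the eyes move little between consecutive frames, this rectangle is a cheap substitute for running full eye detection on every frame.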
Fig. 7 schematically illustrates an example flow chart of a pupil location method according to another embodiment of this disclosure.
As shown in fig. 7, the method may include operations S701 to S714.
In operation S701, a current frame image, which may be, for example, a near-infrared image, is acquired.
In operation S702, the current frame image is converted into a grayscale image, and its resolution may be reduced.
In operation S703, it is determined whether the electronic apparatus is in a state of tracking the line of sight of the user. If the electronic device is in a state of tracking the line of sight of the user, operation S704 is performed. If the electronic device is not in a state of tracking the line of sight of the user, operation S705 is performed.
In operation S704, a mask image may be determined, for example, from the frame preceding the current frame image. The mask image is the eye image in that previous frame.
In operation S705, if the electronic device is in the tracking state, the mask image is preprocessed; for example, the mask image may be convolved to extract eye features. If the electronic device is not in the tracking state, that is, if there is no previous frame image, the current frame image may be preprocessed to determine an eye image, and convolution may then be applied to that eye image.
In operation S706, edge features are extracted and filtered. For example, edge detection may be performed on the eye image.
In operation S707, the edges obtained by edge detection are filtered to eliminate edges that do not conform to eye features; for example, edges may be filtered according to their position and length. Ellipse fitting is then performed on the filtered edges.
According to an embodiment of the present disclosure, the fitted ellipses may in turn be filtered: ellipses whose aspect ratio or area does not conform to the eye rules are discarded, and the remaining ellipses are used as eye contours.
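The aspect-ratio and area filtering of fitted ellipses could be sketched as below; the thresholds standing in for the "eye rules" are illustrative assumptions:

```python
import math

def filter_ellipses(ellipses, min_ratio=0.5, min_area=100.0, max_area=5000.0):
    """Keep only fitted ellipses whose aspect ratio and area are
    plausible for an eye contour. Each ellipse is given as
    (major_axis, minor_axis) in pixels; all thresholds are
    illustrative assumptions, not values from the disclosure."""
    contours = []
    for major, minor in ellipses:
        ratio = minor / major                       # in (0, 1]; 1 = circle
        area = math.pi * (major / 2) * (minor / 2)  # area from semi-axes
        if ratio >= min_ratio and min_area <= area <= max_area:
            contours.append((major, minor))
    return contours

# A near-circular, mid-sized ellipse passes; a sliver and an oversized one do not
kept = filter_ellipses([(40, 36), (80, 10), (200, 190)])
print(kept)  # [(40, 36)]
```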
According to an embodiment of the present disclosure, an optimal eye profile may be determined from a plurality of eye profiles according to the operations described above with reference to fig. 4.
In operation S708, the ellipse-fitting quality of the optimal eye contour is calculated. The fitting quality may be characterized, for example, by the loss function of the optimal ellipse, which may be the average of the loss functions produced when the plurality of eye contours are obtained by least-squares fitting of the edge pixels. The fitting quality may then be computed, for example, by a function inversely proportional to this loss; those skilled in the art can design the quality function according to actual needs.
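One possible quality function of this kind (the exact form is left to the implementer; this choice is only an example) is:

```python
def mean_loss(losses):
    """Average of the per-contour least-squares fitting losses."""
    return sum(losses) / len(losses)

def fit_quality(loss, eps=1e-6):
    """A quality measure inversely proportional to the fitting loss;
    `eps` guards against division by zero. This form is an
    illustrative choice, not prescribed by the disclosure."""
    return 1.0 / (loss + eps)

# A smaller average loss yields a larger fitting quality
q = fit_quality(mean_loss([0.2, 0.4, 0.3]))
```

In operation S709 this quality would then simply be compared against an empirically chosen threshold.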
In operation S709, it is determined whether the fitting quality is greater than a preset threshold. If the fitting quality is greater than the preset threshold, operation S713 may be performed; if not, operation S710 may be performed. The fitting quality may be, for example, a quality function inversely proportional to the loss function of the least-squares fit, and the preset threshold corresponding to that quality function may be determined empirically by those skilled in the art.
In operation S710, a stable region is detected. For example, the method described in operation S501 above may be performed.
In operation S711, pixels within the stable region are filtered. For example, edge pixels of the stable region whose gray values do not match those of the pupil or iris may be filtered out.
In operation S712, ellipse fitting may be performed on the pixels of the stable-region edge, and the center of the resulting ellipse may be taken as the center of the pupil.
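The gray-value filtering of operation S711, which selects the pixels passed on to the ellipse fit of operation S712, might be sketched as follows, with an assumed dark-intensity range standing in for the pupil/iris gray values:

```python
def filter_stable_pixels(edge_pixels, gray, low=0, high=80):
    """Drop edge pixels of the stable region whose gray value falls
    outside the dark range expected for pupil/iris pixels. The
    [low, high] range is an illustrative assumption."""
    return [(x, y) for (x, y) in edge_pixels if low <= gray[y][x] <= high]

# Only the two dark pixels survive the filter
gray = [[200, 40],
        [60, 220]]
pixels = [(0, 0), (1, 0), (0, 1), (1, 1)]
kept = filter_stable_pixels(pixels, gray)
print(kept)  # [(1, 0), (0, 1)]
```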
In operation S713, it is determined whether the electronic device is still in a state of tracking the user's line of sight. If the user's gaze is still being tracked, operation S714 may be performed.
If the electronic device is no longer tracking the user's line of sight, the method may return to operation S701.
In operation S714, a new mask image may be created according to the determined center of the pupil for use in the next frame of image.
Fig. 8 schematically illustrates a block diagram of a pupil positioning device 800 according to an embodiment of the disclosure.
As shown in fig. 8, the pupil location device 800 includes an acquisition module 810, a first determination module 820, a generation module 830, a second determination module 840, and a third determination module 850.
The obtaining module 810, for example, may perform operation S201 described above with reference to fig. 2, for obtaining a current frame image, where the current frame image includes an eye image.
The first determining module 820, for example, may perform operation S202 described above with reference to fig. 2 for determining an initial region of the eye image in the current frame image.
The generating module 830, for example, may perform operation S203 described above with reference to fig. 2 for fitting the pixels in the initial region to generate a plurality of eye contours.
The second determining module 840, for example, may perform operation S204 described above with reference to fig. 2 for determining an optimal eye contour based on the plurality of eye contours.
The third determining module 850, for example, may perform operation S205 described above with reference to fig. 2, for determining the position of the pupil in the eye image based on the optimal eye contour if the optimal eye contour satisfies the preset condition.
According to an embodiment of the present disclosure, fitting the pixels in the initial region to generate a plurality of eye contours comprises: carrying out edge detection on the image in the initial region to obtain a plurality of edge pixels; performing ellipse fitting on the plurality of edge pixels to form a plurality of ellipses; and determining a plurality of eye contours based on the plurality of ellipses.
According to an embodiment of the present disclosure, determining the plurality of eye contours based on the plurality of ellipses comprises: calculating an aspect ratio and an area of each of the plurality of ellipses, wherein the aspect ratio indicates a proportional relationship between a major axis and a minor axis of the ellipse; and taking an ellipse with the aspect ratio and the area both conforming to the eye rules as the eye contour.
According to an embodiment of the present disclosure, determining an optimal eye contour based on a plurality of eye contours comprises: judging the similarity among a plurality of eye contours; determining the position coordinates of each eye feature point in the plurality of eye feature points in the plurality of eye contours respectively under the condition that the similarity is greater than a first preset value; determining an average value of the position coordinates of each of the plurality of eye feature points; and determining an optimal eye profile based on the average.
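The averaging step described above, in which an optimal eye contour is obtained by averaging the position coordinates of corresponding feature points across sufficiently similar contours, might be sketched as:

```python
def average_contour(contours):
    """Average the position coordinates of corresponding feature points
    across several similar eye contours; each contour is a list of
    (x, y) feature points in the same order. An illustrative sketch
    of the averaging step, not the claimed implementation."""
    n = len(contours)
    return [(sum(c[i][0] for c in contours) / n,
             sum(c[i][1] for c in contours) / n)
            for i in range(len(contours[0]))]

# Two similar two-point contours average point-by-point
c1 = [(0.0, 0.0), (10.0, 0.0)]
c2 = [(2.0, 2.0), (12.0, 2.0)]
best = average_contour([c1, c2])
print(best)  # [(1.0, 1.0), (11.0, 1.0)]
```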
According to an embodiment of the present disclosure, fitting an ellipse to a plurality of edge pixels to form a plurality of ellipses includes: performing ellipse fitting on the plurality of edge pixels by using a least square method to form a plurality of ellipses, and determining a loss function corresponding to each ellipse in the plurality of ellipses, wherein the determining the optimal eye contour based on the plurality of eye contours comprises: determining an optimal ellipse based on the plurality of ellipses, wherein the optimal eye contour meeting the preset conditions comprises: the fitting quality of the best ellipse is greater than a second predetermined value, and the fitting quality of the best ellipse is determined according to a loss function of the best ellipse.
According to an embodiment of the present disclosure, the apparatus 800 may further include a fourth determination module, a fitting module, a fifth determination module, and a sixth determination module. The fourth determining module is used for determining a stable region based on the gray value of the image in the initial region under the condition that the optimal eye contour does not meet the preset condition; the fitting module is used for carrying out ellipse fitting on the stable region to obtain a plurality of eye contours; a fifth determination module for determining an optimal eye contour based on the plurality of eye contours; and the sixth determining module is used for determining the position of the pupil in the eye image based on the optimal eye contour.
According to an embodiment of the present disclosure, the apparatus 800 may further include an initialization module for reducing the resolution of the current frame image and processing the current frame image into a grayscale image.
According to an embodiment of the present disclosure, determining an initial region of an eye image in a current frame image includes: determining a historical eye image, wherein the historical eye image is determined according to a previous frame image of a current frame image; and taking the region where the historical eye image is located as an initial region.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any plurality of the obtaining module 810, the first determining module 820, the generating module 830, the second determining module 840 and the third determining module 850 may be combined and implemented in one module, or any one of the modules may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the disclosure, at least one of the obtaining module 810, the first determining module 820, the generating module 830, the second determining module 840, and the third determining module 850 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or in any one of three implementations of software, hardware, and firmware, or in a suitable combination of any of them. Alternatively, at least one of the obtaining module 810, the first determining module 820, the generating module 830, the second determining module 840 and the third determining module 850 may be at least partially implemented as a computer program module, which when executed, may perform a corresponding function.
Fig. 9 schematically shows a block diagram of an electronic device according to an embodiment of the disclosure. The electronic device shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 9, the electronic device 900 includes a processor 910, a computer-readable storage medium 920. The electronic device 900 may perform a method according to an embodiment of the disclosure.
In particular, processor 910 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 910 may also include onboard memory for caching purposes. The processor 910 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage media 920, for example, may be non-volatile computer-readable storage media, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 920 may include a computer program 921, which computer program 921 may include code/computer-executable instructions that, when executed by the processor 910, cause the processor 910 to perform a method according to an embodiment of the present disclosure, or any variation thereof.
The computer program 921 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code in the computer program 921 may include one or more program modules, for example, modules 921A, 921B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, so that when these program modules are executed by the processor 910, the processor 910 can perform the method according to an embodiment of the present disclosure, or any variation thereof.
According to an embodiment of the present disclosure, at least one of the obtaining module 810, the first determining module 820, the generating module 830, the second determining module 840 and the third determining module 850 may be implemented as a computer program module described with reference to fig. 9, which, when executed by the processor 910, may implement the corresponding operations described above.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations are not expressly recited in the present disclosure. In particular, various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit and teachings of the present disclosure. All such combinations and/or sub-combinations fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. A pupil location method, comprising:
acquiring a current frame image, wherein the current frame image comprises an eye image;
determining an initial region of the eye image in the current frame image;
fitting pixels in the initial region to generate a plurality of eye contours;
determining an optimal eye contour based on the plurality of eye contours; and
and under the condition that the optimal eye contour meets a preset condition, determining the position of the pupil in the eye image based on the optimal eye contour.
2. The method of claim 1, wherein said fitting pixels in the initial region to generate a plurality of ocular contours comprises:
performing edge detection on the image in the initial region to obtain a plurality of edge pixels;
performing ellipse fitting on the plurality of edge pixels to form a plurality of ellipses; and
based on the plurality of ellipses, a plurality of eye contours is determined.
3. The method of claim 2, wherein said determining a plurality of eye contours based on the plurality of ellipses comprises:
calculating an aspect ratio and an area of each of the plurality of ellipses, wherein the aspect ratio indicates a proportional relationship between a major axis and a minor axis of the ellipse; and
and taking the ellipse of which the aspect ratio and the area both accord with the eye rule as the eye contour.
4. The method of claim 2, wherein said determining an optimal eye contour based on the plurality of eye contours comprises:
judging the similarity among the eye contours;
determining the position coordinates of each eye feature point in the plurality of eye feature points in the plurality of eye contours respectively under the condition that the similarity is larger than a first preset value;
determining an average value of the position coordinates of each of the plurality of eye feature points; and
an optimal eye contour is determined based on the average.
5. The method of claim 2, wherein said fitting the plurality of edge pixels to form a plurality of ellipses comprises:
fitting the plurality of edge pixels with an ellipse using a least squares method to form a plurality of ellipses, and determining a loss function corresponding to each of the plurality of ellipses,
wherein the determining an optimal eye contour based on the plurality of eye contours comprises: determining an optimal ellipse based on the plurality of ellipses,
the optimal eye contour meeting the preset conditions comprises: the fitting quality of the best ellipse is greater than a second preset value, and the fitting quality of the best ellipse is determined according to a loss function of the best ellipse.
6. The method of claim 1, further comprising:
determining a stable region based on the gray value of the image in the initial region under the condition that the optimal eye contour does not meet a preset condition;
performing ellipse fitting on the stable region to obtain a plurality of eye contours;
determining an optimal eye contour based on the plurality of eye contours; and
determining a location of a pupil in the eye image based on the optimal eye contour.
7. The method of claim 1, further comprising:
reducing the resolution of the current frame image and processing the current frame image into a gray scale image.
8. The method of claim 1, wherein the determining an initial region of the eye image in the current frame image comprises:
determining a historical eye image, wherein the historical eye image is determined according to a previous frame image of the current frame image; and
and taking the region where the historical eye image is located as the initial region.
9. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform:
acquiring a current frame image, wherein the current frame image comprises an eye image;
determining an initial region of the eye image in the current frame image;
extracting edge pixels of the image in the initial region and determining a plurality of eye contours from the edge pixels;
determining an optimal eye contour based on the plurality of eye contours; and
and under the condition that the optimal eye contour meets a preset condition, determining the position of the pupil in the eye image based on the optimal eye contour.
10. The electronic device of claim 9, wherein the extracting, by the processor, of edge pixels of the image in the initial region and the determining of a plurality of eye contours from the edge pixels comprises:
performing edge detection on the image in the initial region to obtain a plurality of edge pixels;
performing ellipse fitting on the plurality of edge pixels to form a plurality of ellipses; and
based on the plurality of ellipses, a plurality of eye contours is determined.
CN201911212042.0A 2019-11-28 2019-11-28 Pupil positioning method and electronic equipment Active CN110929672B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911212042.0A CN110929672B (en) 2019-11-28 2019-11-28 Pupil positioning method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911212042.0A CN110929672B (en) 2019-11-28 2019-11-28 Pupil positioning method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110929672A true CN110929672A (en) 2020-03-27
CN110929672B CN110929672B (en) 2024-03-01

Family

ID=69848234

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911212042.0A Active CN110929672B (en) 2019-11-28 2019-11-28 Pupil positioning method and electronic equipment

Country Status (1)

Country Link
CN (1) CN110929672B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031269A (en) * 2021-03-08 2021-06-25 北京正远展览展示有限公司 VR shows dizzy governing system of anti-dazzle

Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5231674A (en) * 1989-06-09 1993-07-27 Lc Technologies, Inc. Eye tracking method and apparatus
CN101923645A (en) * 2009-06-09 2010-12-22 黑龙江大学 Iris splitting method suitable for low-quality iris image in complex application context
CN102314589A (en) * 2010-06-29 2012-01-11 比亚迪股份有限公司 Fast human-eye positioning method and device
CN103054548A (en) * 2012-07-05 2013-04-24 东北电力大学 Fixation point measurement device and pupil recognition method and Purkinje image recognition method
CN103207664A (en) * 2012-01-16 2013-07-17 联想(北京)有限公司 Image processing method and equipment
CN103390152A (en) * 2013-07-02 2013-11-13 华南理工大学 Sight tracking system suitable for human-computer interaction and based on system on programmable chip (SOPC)
CN103530618A (en) * 2013-10-23 2014-01-22 哈尔滨工业大学深圳研究生院 Non-contact sight tracking method based on corneal reflex
US20140072230A1 (en) * 2011-03-11 2014-03-13 Omron Corporation Image processing device and image processing method
CN103679180A (en) * 2012-09-19 2014-03-26 武汉元宝创意科技有限公司 Sight tracking method based on single light source of single camera
JP2014061085A (en) * 2012-09-20 2014-04-10 National Institute Of Advanced Industrial & Technology Method for detecting ellipse approximating to pupil portion
CN105812778A (en) * 2015-01-21 2016-07-27 成都理想境界科技有限公司 Binocular AR head-mounted display device and information display method therefor
CN105913487A (en) * 2016-04-09 2016-08-31 北京航空航天大学 Human eye image iris contour analyzing and matching-based viewing direction calculating method
CN105930762A (en) * 2015-12-02 2016-09-07 中国银联股份有限公司 Eyeball tracking method and device
CN106325510A (en) * 2016-08-19 2017-01-11 联想(北京)有限公司 Information processing method and electronic equipment
CN106919933A (en) * 2017-03-13 2017-07-04 重庆贝奥新视野医疗设备有限公司 The method and device of Pupil diameter
CN107292242A (en) * 2017-05-31 2017-10-24 华为技术有限公司 A kind of iris identification method and terminal
US20180089834A1 (en) * 2016-09-29 2018-03-29 Magic Leap, Inc. Neural network for eye image segmentation and image quality estimation
CN107870667A (en) * 2016-09-26 2018-04-03 联想(新加坡)私人有限公司 Method, electronic installation and program product for eye tracks selection checking
CN108509908A (en) * 2018-03-31 2018-09-07 天津大学 A kind of pupil diameter method for real-time measurement based on binocular stereo vision
CN109840484A (en) * 2019-01-23 2019-06-04 张彦龙 A kind of pupil detection method based on edge filter, oval evaluation and pupil verifying
CN109857254A (en) * 2019-01-31 2019-06-07 京东方科技集团股份有限公司 Pupil positioning method and device, VR/AR equipment and computer-readable medium
CN110210357A (en) * 2019-05-24 2019-09-06 浙江大学 A kind of ptosis image measuring method based on still photo face recognition
US10417495B1 (en) * 2016-08-08 2019-09-17 Google Llc Systems and methods for determining biometric information
CN110263745A (en) * 2019-06-26 2019-09-20 京东方科技集团股份有限公司 A kind of method and device of pupil of human positioning
CN110276324A (en) * 2019-06-27 2019-09-24 北京万里红科技股份有限公司 The elliptical method of pupil is determined in a kind of iris image

Patent Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5231674A (en) * 1989-06-09 1993-07-27 Lc Technologies, Inc. Eye tracking method and apparatus
CN101923645A (en) * 2009-06-09 2010-12-22 黑龙江大学 Iris splitting method suitable for low-quality iris image in complex application context
CN102314589A (en) * 2010-06-29 2012-01-11 比亚迪股份有限公司 Fast human-eye positioning method and device
US20140072230A1 (en) * 2011-03-11 2014-03-13 Omron Corporation Image processing device and image processing method
CN103207664A (en) * 2012-01-16 2013-07-17 联想(北京)有限公司 Image processing method and equipment
CN103054548A (en) * 2012-07-05 2013-04-24 东北电力大学 Fixation point measurement device and pupil recognition method and Purkinje image recognition method
CN103679180A (en) * 2012-09-19 2014-03-26 武汉元宝创意科技有限公司 Sight tracking method based on single light source of single camera
JP2014061085A (en) * 2012-09-20 2014-04-10 National Institute Of Advanced Industrial & Technology Method for detecting ellipse approximating to pupil portion
CN103390152A (en) * 2013-07-02 2013-11-13 华南理工大学 Sight tracking system suitable for human-computer interaction and based on system on programmable chip (SOPC)
CN103530618A (en) * 2013-10-23 2014-01-22 哈尔滨工业大学深圳研究生院 Non-contact sight tracking method based on corneal reflex
CN105812778A (en) * 2015-01-21 2016-07-27 成都理想境界科技有限公司 Binocular AR head-mounted display device and information display method therefor
CN105930762A (en) * 2015-12-02 2016-09-07 中国银联股份有限公司 Eyeball tracking method and device
US20180365844A1 (en) * 2015-12-02 2018-12-20 China Unionpay Co.,Ltd. Eyeball tracking method and apparatus, and device
CN105913487A (en) * 2016-04-09 2016-08-31 北京航空航天大学 Human eye image iris contour analyzing and matching-based viewing direction calculating method
US10417495B1 (en) * 2016-08-08 2019-09-17 Google Llc Systems and methods for determining biometric information
CN106325510A (en) * 2016-08-19 2017-01-11 联想(北京)有限公司 Information processing method and electronic equipment
CN107870667A (en) * 2016-09-26 2018-04-03 联想(新加坡)私人有限公司 Method, electronic installation and program product for eye tracks selection checking
US20180089834A1 (en) * 2016-09-29 2018-03-29 Magic Leap, Inc. Neural network for eye image segmentation and image quality estimation
CN106919933A (en) * 2017-03-13 2017-07-04 重庆贝奥新视野医疗设备有限公司 The method and device of Pupil diameter
CN107292242A (en) * 2017-05-31 2017-10-24 华为技术有限公司 A kind of iris identification method and terminal
CN108509908A (en) * 2018-03-31 2018-09-07 天津大学 A kind of pupil diameter method for real-time measurement based on binocular stereo vision
CN109840484A (en) * 2019-01-23 2019-06-04 张彦龙 A kind of pupil detection method based on edge filter, oval evaluation and pupil verifying
CN109857254A (en) * 2019-01-31 2019-06-07 京东方科技集团股份有限公司 Pupil positioning method and device, VR/AR equipment and computer-readable medium
CN110210357A (en) * 2019-05-24 2019-09-06 浙江大学 A kind of ptosis image measuring method based on still photo face recognition
CN110263745A (en) * 2019-06-26 2019-09-20 京东方科技集团股份有限公司 A kind of method and device of pupil of human positioning
CN110276324A (en) * 2019-06-27 2019-09-24 北京万里红科技股份有限公司 The elliptical method of pupil is determined in a kind of iris image

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
DAN WITZNER HANSEN 等: "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze", 《IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE》 *
KANG-A CHOI 等: "Improved pupil center localization method for eye-gaze tracking-based human-device interaction", 《2014 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS (ICCE)》 *
XIANMEI WANG 等: "Pupil localization by ASM and regional gray distribution", 《2011 9TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION》 *
余罗 等: "Research on a Precise Pupil Center Localization Algorithm Based on Ellipse Fitting", 《Chinese Journal of Medical Instrumentation》 *
周锋宜: "Research on Gaze Tracking Technology Based on Video Image Processing", 《China Master's Theses Full-text Database, Information Science and Technology》 *
姜太平 等: "An Improved Method for Precise Pupil Detection Based on Face Images", 《Journal of Chinese Computer Systems》 *
李斌: "Research on Real-time Eye Detection and Tracking in Arbitrary Scenes and Its Applications", 《China Doctoral Dissertations Full-text Database, Information Science and Technology》 *


Also Published As

Publication number Publication date
CN110929672B (en) 2024-03-01

Similar Documents

Publication Publication Date Title
CN110874594B (en) Human body appearance damage detection method and related equipment based on semantic segmentation network
EP3161728B1 (en) Hierarchical interlinked multi-scale convolutional network for image parsing
US7352881B2 (en) Method for tracking facial features in a video sequence
KR20210028185A (en) Human posture analysis system and method
WO2019174276A1 (en) Method, device, equipment and medium for locating center of target object region
CN111598038B (en) Facial feature point detection method, device, equipment and storage medium
US10255673B2 (en) Apparatus and method for detecting object in image, and apparatus and method for computer-aided diagnosis
CN104766059A (en) Rapid and accurate human eye positioning method and sight estimation method based on human eye positioning
CN108765315B (en) Image completion method and device, computer equipment and storage medium
EP2887315A1 (en) Calibration device, method for implementing calibration, program and camera for movable body
US9256800B2 (en) Target line detection device and method
CN110781770B (en) Living body detection method, device and equipment based on face recognition
CN110675407A (en) Image instance segmentation method and device, electronic equipment and storage medium
CN113011285B (en) Lane line detection method and device, automatic driving vehicle and readable storage medium
CN113792718B (en) Method for positioning face area in depth map, electronic device and storage medium
EP3089107B1 (en) Computer program product and method for determining lesion similarity of medical image
CN115423870A (en) Pupil center positioning method and device
CN110929672B (en) Pupil positioning method and electronic equipment
AU2019433083B2 (en) Control method, learning device, discrimination device, and program
CN113780040A (en) Lip key point positioning method and device, storage medium and electronic equipment
CN110555344B (en) Lane line recognition method, lane line recognition device, electronic device, and storage medium
CN114463430B (en) Ocean search and rescue system and method based on image processing
Niemeijer et al. Automatic Detection of the Optic Disc, Fovea and Vascular Arch in Digital Color Photographs of the Retina.
EP3244346A1 (en) Determining device and determination method
CN114627542A (en) Eye movement position determination method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant