CN108510542B - Method and device for matching light source and light spot - Google Patents

Method and device for matching light source and light spot

Info

Publication number
CN108510542B
CN108510542B (application number CN201810147220.5A / CN201810147220A)
Authority
CN
China
Prior art keywords
light
distance distribution
distance
light source
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810147220.5A
Other languages
Chinese (zh)
Other versions
CN108510542A (en)
Inventor
刘伟
宫小虎
聂凤梅
王健
杨孟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd filed Critical Beijing 7Invensun Technology Co Ltd
Priority to CN201810147220.5A priority Critical patent/CN108510542B/en
Publication of CN108510542A publication Critical patent/CN108510542A/en
Priority to PCT/CN2019/071326 priority patent/WO2019154012A1/en
Priority to TW108102406A priority patent/TWI680309B/en
Application granted granted Critical
Publication of CN108510542B publication Critical patent/CN108510542B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/193 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • G06V 10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V 10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V 10/145 Illumination specially adapted for pattern recognition, e.g. using gratings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/197 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention discloses a method and a device for matching a light source with a light spot. The method comprises: acquiring a first distance distribution between a plurality of light spots and the center of a pupil, wherein an image to be detected contains the plurality of light spots and the pupil; acquiring a second distance distribution based on a plurality of light sources; comparing the first distance distribution with the second distance distribution to obtain a comparison result; and determining, according to the comparison result, that the light spots satisfying the second distance distribution match the light sources. The invention solves the technical problem in the prior art that light spots cannot be accurately matched to light sources.

Description

Method and device for matching light source and light spot
Technical Field
The invention relates to the field of sight tracking equipment, in particular to a method and a device for matching a light source with a light spot.
Background
Virtual Reality (VR) is a computer simulation technology for creating and experiencing a virtual world: the computer generates a three-dimensional simulated environment that is displayed dynamically in real time, and in this environment the user can obtain perceptions comparable to those in the real world, such as visual, auditory and tactile perception. With the rapid development of science and technology, VR has been widely applied across industries, and the gaze tracking technology used in VR is in turn widely applied in medicine, for example for tracking the line of sight in medical settings.
In the prior art, gaze estimation is mainly performed on remote devices by means of a 3D approximately spherical eyeball model, based on the pupil-center coordinates and the corneal reflections. Moreover, when a VR device uses multiple cameras and multiple light sources, gaze tracking can be completed with only a single-point calibration.
In actual use, however, the light sources are usually not individually identifiable and the relative positions of the light sources and the cameras differ, so during gaze tracking some cameras may fail to capture an image or capture only a poor-quality image, with the result that the light sources and the light spots cannot be matched accurately.
No effective solution has yet been proposed for this problem of inaccurate matching between light spots and light sources in the prior art.
Disclosure of Invention
Embodiments of the invention provide a method and a device for matching a light source with a light spot, which at least solve the technical problem of inaccurate matching between light spots and light sources in the prior art.
According to one aspect of the embodiments of the present invention, there is provided a method of matching a light source with a light spot, including: acquiring a first distance distribution between a plurality of light spots and the center of a pupil, wherein an image to be detected contains the plurality of light spots and the pupil; acquiring a second distance distribution based on a plurality of light sources; comparing the first distance distribution with the second distance distribution to obtain a comparison result; and determining, according to the comparison result, that the light spots satisfying the second distance distribution match the light sources.
According to another aspect of the embodiments of the present invention, there is also provided an apparatus for matching a light source with a light spot, including: a first acquisition module, configured to acquire a first distance distribution between a plurality of light spots and the center of a pupil, wherein an image to be detected contains the plurality of light spots and the pupil; a second acquisition module, configured to acquire a second distance distribution based on a plurality of light sources; a comparison module, configured to compare the first distance distribution with the second distance distribution to obtain a comparison result; and a determining module, configured to determine, according to the comparison result, that the light spots satisfying the second distance distribution match the light sources.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program, wherein the program performs a method of matching a light source with a light spot.
According to another aspect of the embodiments of the present invention, there is also provided a processor for executing a program, wherein the program executes a method of matching a light source and a light spot.
In the embodiments of the invention, light sources are matched to light spots by means of distance distributions: a first distance distribution between a plurality of light spots and the center of the pupil and a second distance distribution based on the plurality of light sources are acquired, the two distributions are compared to obtain a comparison result, and the light spots satisfying the second distance distribution are determined to match the light sources according to that result, wherein the image to be detected contains the plurality of light spots and the pupil. In this way stray light spots in the image are filtered out accurately, the technical effect of accurately matching light sources with light spots is achieved, and the technical problem of inaccurate matching between light spots and light sources in the prior art is solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of a method of matching a light source to a light spot according to an embodiment of the invention;
FIG. 2 is a schematic diagram of an alternative configuration for acquiring images based on multiple light sources according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an alternative image to be detected according to an embodiment of the invention;
FIG. 4 is a schematic illustration of an alternative first distance distribution according to an embodiment of the present invention;
FIG. 5 is a schematic view of an alternative light source distribution according to an embodiment of the present invention;
FIG. 6 is a schematic illustration of an alternative second distance distribution according to an embodiment of the present invention; and
FIG. 7 is a schematic structural diagram of an apparatus for matching a light source and a light spot according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, there is provided an embodiment of a method of matching a light source with a light spot. It should be noted that the steps illustrated in the flowchart of the drawings may be executed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in a different order.
Fig. 1 is a flowchart of a method for matching a light source with a light spot according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
Step S102, acquiring a first distance distribution between a plurality of light spots and the center of a pupil, wherein the image to be detected contains the plurality of light spots and the pupil.
It should be noted that the image acquisition device may acquire an image to be detected, where the image acquisition device may be, but is not limited to, a camera, a mobile phone with a camera function, a tablet, and the like. In addition, the gaze tracking device connected to the image capturing device may process the image captured by the image capturing device, wherein the gaze tracking device may be, but is not limited to, a virtual reality device and a smart terminal capable of performing gaze tracking, such as a mobile phone, a computer, a tablet, a wearable device, and the like. In the case where the device connected to the image capturing device is a gaze tracking device, the image captured by the image capturing device may be a corneal image of the eye.
In an alternative embodiment, an alternative structural diagram for acquiring images based on multiple light sources is shown in FIG. 2. In FIG. 2, a is the eyeball, b is the corneal surface, c is the corneal center of curvature, d is the center of rotation of the eyeball, p is the pupil center, r is the pupil radius, O1 is a camera, I1 and I2 are two light sources, and u11 and u21 are the points at which light from the sources, after reflection by the cornea, is imaged by the camera. A light spot is precisely such an imaging point of a light source reflected by the cornea.
In an alternative embodiment, a schematic diagram of an image to be detected is shown in FIG. 3. The gaze tracking device may process the image to be detected to obtain the center position of each light spot and its distance from the pupil center; for example, the distance between the center of light spot 1 and the pupil center is L1 (not shown in FIG. 3), the distance between the center of light spot 2 and the pupil center is L2 (not shown in FIG. 3), and so on. If there are 8 light spots, 8 distance values are obtained, from which the first distance distribution shown in FIG. 4 can be constructed.
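A minimal sketch of this step (not taken from the patent text) is given below: it assumes the spot centers and the pupil center have already been located in the image, and simply measures each spot's distance to the pupil center. The function and variable names are illustrative assumptions.

```python
import math

def first_distance_distribution(spot_centers, pupil_center):
    """Return the distance from each spot center to the pupil center, in spot order."""
    px, py = pupil_center
    return [math.hypot(x - px, y - py) for (x, y) in spot_centers]

# Example: 8 spot centers give 8 distance values, which, plotted against the
# spot position labels 1..8, form a curve like the one in FIG. 4.
spots = [(102, 80), (120, 92), (125, 110), (118, 128),
         (100, 135), (82, 128), (76, 110), (84, 92)]
distances = first_distance_distribution(spots, pupil_center=(101, 108))
```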
Step S104, a second distance distribution based on the plurality of light sources is acquired.
If a plurality of light sources are arranged in a circle in space, for example 8 equally spaced light sources, the circle becomes an approximate ellipse after a perspective or affine projection transformation; therefore the light spots that such circularly arranged light sources form on the image to be detected are distributed in an ellipse-like shape.
In an alternative embodiment, the light sources are arranged in a circle and their images are distributed in an ellipse, as in the light source distribution diagram shown in FIG. 5. Once the distribution of the light sources is determined, the center of the ellipse can be determined, and the distance between the center position of each light source and the ellipse center is calculated, so as to obtain the second distance distribution of the light sources, as shown in FIG. 6.
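The second distance distribution can be sketched in the same way. The snippet below is a minimal illustration rather than the patent's implementation: it assumes the (projected) light-source center positions are known, approximates the circle/ellipse center by their centroid, and measures each source's distance to that center. All names are illustrative.

```python
import math

def second_distance_distribution(source_centers):
    """Return the distance from each light-source center to the ring center."""
    cx = sum(x for x, _ in source_centers) / len(source_centers)
    cy = sum(y for _, y in source_centers) / len(source_centers)
    return [math.hypot(x - cx, y - cy) for (x, y) in source_centers]

# Example: 8 light sources placed uniformly on a circle of radius 30 (arbitrary units).
sources = [(30 * math.cos(2 * math.pi * k / 8), 30 * math.sin(2 * math.pi * k / 8))
           for k in range(8)]
ring_distances = second_distance_distribution(sources)
```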
Step S106, comparing the first distance distribution with the second distance distribution to obtain a comparison result.
In an alternative embodiment, as can be seen from FIG. 6, if the light sources are distributed in a circle, the second distance distribution takes the form of a distribution resembling a triangular curve. Ideally, the distribution of the light spots in the image to be detected is the same as that of the light sources, so ideally the first distance distribution has the same form as the second. The matching between light spots and light sources can therefore be determined by comparing the first distance distribution with the second distance distribution.
Step S108, determining, according to the comparison result, that the light spots satisfying the second distance distribution match the light sources.
In an alternative embodiment, there are 8 light sources uniformly distributed on a circle, so the distance conversion frequency between every two light sources on the curve of the second distance distribution is π/2. The distance conversion frequency between every two light spots on the curve of the first distance distribution is then obtained. If the conversion frequency between light spot A and light spot B is π/4, light spot B is determined to be a stray (parasitic) spot and the gaze tracking device filters it out; if the conversion frequency between light spot A and light spot B is π/3, which is within an acceptable error range, light spot B is determined to match a light source and the gaze tracking device retains it.
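The comparison and filtering can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the patent's implementation: it takes as given the per-pair conversion frequencies of the spots, the nominal per-pair source frequency (π/2 for 8 uniformly placed sources, as stated above), and a preset tolerance, and it drops spots whose frequency deviates too much. The function and parameter names, and the tolerance value, are invented for illustration.

```python
import math

def filter_stray_spots(spot_pair_freqs, source_pair_freq=math.pi / 2,
                       tolerance=math.pi / 6):
    """Return indices of spots kept after frequency comparison.

    spot_pair_freqs[i] is the conversion frequency between spot i and spot i+1
    on the first distance-distribution curve; source_pair_freq is the nominal
    frequency between every two light sources (pi/2 for 8 uniform sources).
    """
    kept = [0]  # the starting spot is kept by definition
    for i, freq in enumerate(spot_pair_freqs, start=1):
        if abs(freq - source_pair_freq) <= tolerance:
            kept.append(i)        # this spot matches a light source
        # otherwise the spot is treated as a stray reflection and dropped
    return kept

# Mirrors the example above: pi/3 (difference pi/6) is kept, pi/4 (difference
# pi/4) is filtered out, under the assumed tolerance of pi/6.
print(filter_stray_spots([math.pi / 3, math.pi / 4]))
```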
Based on the scheme defined in steps S102 to S108, a comparison result is obtained by acquiring a first distance distribution between a plurality of light spots and the center of the pupil and a second distance distribution based on a plurality of light sources, and then comparing the two distributions; the light spots satisfying the second distance distribution are determined to match the light sources according to that result, where the image to be detected contains the plurality of light spots and the pupil.
It is easy to see that, because the shape in which the light sources are distributed in space is similar to the shape of their projection around the pupil, the first distance distribution characterizes the distribution of the light spots and the second distance distribution characterizes the distribution of the light sources. Therefore, by comparing the first distance distribution with the second, the light spots that do not fit the second distance distribution can be identified accurately, which improves the accuracy of matching light spots to light sources.
Accordingly, this embodiment achieves the purpose of accurately filtering out stray light spots in the image, thereby producing the technical effect of accurately matching light sources with light spots and solving the technical problem of inaccurate matching between light spots and light sources in the prior art.
In an alternative embodiment, before acquiring the first distance distribution between the plurality of light spots and the pupil center, the center positions of the plurality of light spots need to be determined, and the specific steps are as follows:
step S10, acquiring an image to be detected;
step S12, preprocessing the image to be detected;
and step S14, performing light spot extraction on the preprocessed image to be detected to obtain the central positions of a plurality of light spots.
Specifically, after an image to be detected containing a human-eye image is obtained, the gaze tracking device enhances, sharpens and otherwise preprocesses it, extracts candidate light spots from the preprocessed image using a binarized connected-component method, applies a constraint related to the pupil position together with a convex-hull shape constraint to extract the light spots again, and finally obtains the center position of each extracted light spot.
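A minimal sketch of this spot-extraction step is given below, assuming OpenCV is used; the threshold value, blur kernel and area limits are illustrative guesses, and the patent's pupil-position and convex-hull constraints are only represented by a crude size check.

```python
import cv2

def extract_spot_centers(image_gray, thresh=200, min_area=3, max_area=200):
    """Return (x, y) centroids of bright connected regions (candidate light spots)."""
    blurred = cv2.GaussianBlur(image_gray, (5, 5), 0)           # mild preprocessing
    _, binary = cv2.threshold(blurred, thresh, 255, cv2.THRESH_BINARY)
    num, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
    centers = []
    for i in range(1, num):                                     # label 0 is the background
        if min_area <= stats[i, cv2.CC_STAT_AREA] <= max_area:  # crude size constraint
            centers.append(tuple(centroids[i]))
    return centers

# Usage: img = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
#        spots = extract_spot_centers(img)
```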
It should be noted that, after the center positions of the multiple light spots and the center of the pupil are obtained, the first distance distribution may be determined from them, with the following specific steps:
step S1020, obtaining a distance between the center position of each light spot in the plurality of light spots and the center of the pupil to obtain a first distance set;
step S1022, a first distance distribution is determined according to the first distance set.
In an alternative embodiment, 9 light spots exist in the image to be detected, and their distances to the pupil center are L1, L2, L3, L4, L5, L6, L7, L8 and L9. Taking these distances as the ordinate and the position labels of the 9 light spots as the abscissa yields the first distance distribution shown in FIG. 4.
It should be noted that after obtaining the first distance distribution based on the multiple light spots, it is also necessary to determine a second distance distribution based on the multiple light sources, and the specific steps are as follows:
step S1040, obtaining the circle center position of a circle formed by a plurality of light sources;
step S1042, obtaining the distance between the center positions of the plurality of light sources and the circle center position to obtain a second distance set;
step S1044 determines a second distance distribution according to the second distance set.
It should be noted that the method for determining the second distance distribution is similar to the method for determining the first distance distribution, and is not described herein again. In addition, the plurality of light sources can be uniformly distributed or non-uniformly distributed. After the first distance distribution and the second distance distribution are obtained, the first distance distribution and the second distance distribution need to be compared to obtain a comparison result, and then stray light spots are filtered out, and the method specifically comprises the following steps:
step S1060, determining a starting light spot and a starting light source, wherein the starting light spot corresponds to the starting light source;
step S1062, determining a first conversion frequency between every two light spots in the first distance distribution according to the starting light spot;
step S1064, determining a second conversion frequency between every two light sources in the second distance distribution according to the starting light source;
step S1066, comparing the first transform frequency with the second transform frequency to obtain a comparison result.
In an alternative embodiment, the starting light source may be determined from the relative position of the light sources and the human eye, and the starting light spot from the relative position of the light spots and the eye in the image to be detected. For example, taking the inner canthus (the corner of the eye near the bridge of the nose) as a reference, the light source closest to the inner canthus is taken as the starting light source and the light spot closest to it as the starting light spot; a first conversion frequency between every two light spots and a second conversion frequency between every two light sources are then determined in the clockwise direction, and it is judged whether they match: if the frequency difference between the first conversion frequency and the second conversion frequency is less than or equal to a preset frequency, the light spot is determined to match the light source; otherwise it does not match. For example, with light spot A as the starting spot and light source A' as the starting source, the conversion frequencies between every two light spots and between every two light sources are calculated clockwise; if the conversion frequency between light source A' and the next light source is π/2 while the conversion frequency between light spot A and light spot B is π/12, then, because the frequency difference 5π/12 between π/2 and π/12 is greater than the preset frequency π/12, light spot B is determined not to match the corresponding light source.
It should be noted that, if the plurality of light sources are uniformly distributed, and the conversion frequency between every two light sources is a fixed frequency, at this time, it is only necessary to compare the conversion frequency between every two light spots with the fixed frequency. If the light sources are non-uniformly distributed, the conversion frequency between each two light sources may be different, and at this time, the conversion frequency between each two light spots needs to be compared with the conversion frequency between the corresponding light sources.
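For completeness, a minimal sketch of the pairwise comparison in steps S1060 to S1066 is given below. It assumes the per-pair conversion frequencies for the spots and for the sources have already been estimated from the two distance-distribution curves (the estimation itself is not spelled out here), and it covers both the uniform case (a single fixed source frequency) and the non-uniform case (one frequency per source pair). The function name and tolerance parameter are illustrative.

```python
def match_spot_pairs(spot_pair_freqs, source_pair_freqs, tolerance):
    """Return True/False per pair: does spot pair i match source pair i?

    source_pair_freqs may be a single number (uniform layout) or a list with
    one conversion frequency per source pair (non-uniform layout).
    """
    if isinstance(source_pair_freqs, (int, float)):
        source_pair_freqs = [source_pair_freqs] * len(spot_pair_freqs)
    return [abs(s - q) <= tolerance
            for s, q in zip(spot_pair_freqs, source_pair_freqs)]
```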
In an alternative embodiment, after the comparison result of the first and second distance distributions is obtained, the light spots satisfying the second distance distribution are determined to match the light sources according to that result: light spots whose first conversion frequency matches the second conversion frequency are deemed to match a light source, while light spots whose first conversion frequency does not match the second conversion frequency are filtered out.
Example 2
According to an embodiment of the present invention, an embodiment of an apparatus for matching a light source and a light spot is provided. Fig. 7 is a schematic structural diagram of an apparatus for matching a light source and a light spot according to an embodiment of the present invention, and as shown in fig. 7, the apparatus includes: a first obtaining module 701, a second obtaining module 703, a comparing module 705 and a determining module 707.
The first obtaining module 701 is configured to obtain a first distance distribution between a plurality of light spots and a pupil center, where the image to be detected includes the plurality of light spots and the pupil; a second obtaining module 703, configured to obtain a second distance distribution based on the plurality of light sources; a comparison module 705, configured to compare the first distance distribution and the second distance distribution to obtain a comparison result; and a determining module 707, configured to determine, according to the comparison result, that the light spot satisfying the second distance distribution matches the light source.
It should be noted that the first obtaining module 701, the second obtaining module 703, the comparing module 705, and the determining module 707 correspond to steps S102 to S108 in embodiment 1, and the four modules are the same as the corresponding steps in the implementation example and the application scenario, but are not limited to the disclosure in embodiment 1.
In an alternative embodiment, the means for matching the light source to the light spot further comprises: the device comprises a third acquisition module, a processing module and an extraction module. The third acquisition module is used for acquiring an image to be detected; the processing module is used for preprocessing the image to be detected; and the extraction module is used for extracting light spots of the preprocessed image to be detected to obtain the central positions of a plurality of light spots.
It should be noted that the third acquiring module, the processing module and the extracting module correspond to steps S10 to S14 in embodiment 1, and the three modules are the same as the corresponding steps in the implementation example and application scenario, but are not limited to the disclosure in embodiment 1.
In an alternative embodiment, the first obtaining module includes: the device comprises a fourth obtaining module and a first determining module. The fourth acquisition module is used for acquiring the distance between the center position of each light spot in the plurality of light spots and the center of the pupil to obtain a first distance set; a first determining module for determining a first distance distribution according to the first distance set.
It should be noted that the fourth acquiring module and the first determining module correspond to steps S1020 to S1022 in embodiment 1, and the two modules are the same as the corresponding steps in the implementation example and application scenarios, but are not limited to the disclosure in embodiment 1.
In an alternative embodiment, the second obtaining module includes: a fifth obtaining module, a sixth obtaining module and a second determining module. The fifth obtaining module is used for obtaining the circle center position of a circle formed by a plurality of light sources; the sixth obtaining module is used for obtaining the distances between the center positions of the plurality of light sources and the circle center position to obtain a second distance set; and the second determining module is used for determining the second distance distribution according to the second distance set.
It should be noted that the fifth acquiring module, the sixth acquiring module and the second determining module correspond to steps S1040 to S1044 in embodiment 1, and the three modules are the same as the corresponding steps in the implementation example and the application scenario, but are not limited to the disclosure in embodiment 1.
In an alternative embodiment, the comparison module comprises: a third determining module, a fourth determining module, a fifth determining module and a first comparing module. The third determining module is used for determining a starting light spot and a starting light source, wherein the starting light spot corresponds to the starting light source; the fourth determining module is used for determining, according to the starting light spot, a first conversion frequency between every two light spots in the first distance distribution; the fifth determining module is used for determining, according to the starting light source, a second conversion frequency between every two light sources in the second distance distribution; and the first comparing module is used for comparing the first conversion frequency with the second conversion frequency to obtain the comparison result.
It should be noted that the third determining module, the fourth determining module, the fifth determining module and the first comparing module correspond to steps S1060 to S1066 in embodiment 1, and the four modules are the same as the corresponding steps in the implementation example and application scenarios, but are not limited to the disclosure in embodiment 1.
In an alternative embodiment, the determining module includes: a sixth determination module and a filtering module. The sixth determining module is used for determining that the light spot corresponding to the first conversion frequency matched with the second conversion frequency is matched with the light source; and the filtering module is used for filtering out light spots corresponding to the first conversion frequency which is not matched with the second conversion frequency.
Example 3
According to another aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program, wherein the program performs the method of matching a light source with a light spot in embodiment 1.
Example 4
According to another aspect of the embodiments of the present invention, there is also provided a processor for executing a program, wherein the program executes the method for matching a light source and a light spot in embodiment 1.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and such modifications and improvements shall also fall within the protection scope of the present invention.

Claims (10)

1. A method of matching a light source to a spot, comprising:
acquiring first distance distribution of a plurality of light spots and the center of a pupil, wherein an image to be detected comprises the plurality of light spots and the pupil, in the first distance distribution, the distances between the plurality of light spots and the center of the pupil are vertical coordinates, and the position labels of the plurality of light spots are horizontal coordinates;
acquiring a second distance distribution based on the plurality of light sources, wherein acquiring the second distance distribution based on the plurality of light sources comprises: acquiring the circle center position of a circle formed by the plurality of light sources; obtaining the distances between the center positions of the plurality of light sources and the circle center position to obtain a second distance set; determining the second distance distribution according to the second distance set, wherein in the second distance distribution, the distance between the center positions of the plurality of light sources and the circle center position is a vertical coordinate, and the position labels of the center positions of the plurality of light sources are horizontal coordinates;
comparing the first distance distribution with the second distance distribution to obtain a comparison result;
determining that the light spots meeting the second distance distribution are matched with the light source according to the comparison result;
the comparing the first distance distribution and the second distance distribution to obtain a comparison result comprises: determining a starting light spot and a starting light source, wherein the starting light spot corresponds to the starting light source; determining a first conversion frequency between every two light spots in the first distance distribution according to the starting light spot; determining a second transform frequency between each two light sources in the second distance distribution from the starting light source; and comparing the first conversion frequency with the second conversion frequency to obtain the comparison result.
2. The method of claim 1, wherein prior to acquiring the first distance distribution of the plurality of spots from the pupil center, the method further comprises:
acquiring the image to be detected;
preprocessing the image to be detected;
and carrying out light spot extraction on the preprocessed image to be detected to obtain the central positions of the light spots.
3. The method of claim 1, wherein obtaining a first distance distribution of a plurality of spots from a pupil center comprises:
acquiring the distance between the center position of each light spot in the plurality of light spots and the center of the pupil to obtain a first distance set;
determining the first distance distribution from the first set of distances.
4. The method of claim 1, wherein determining that the light spots satisfying the second distance distribution match a light source according to the comparison comprises:
determining that the light spot corresponding to the first conversion frequency matched with the second conversion frequency is matched with the light source;
and filtering out light spots corresponding to the first conversion frequency which is not matched with the second conversion frequency.
5. An apparatus for matching a light source to a light spot, comprising:
the first acquisition module is used for acquiring first distance distribution of a plurality of light spots and the center of a pupil, wherein the image to be detected comprises the plurality of light spots and the pupil, in the first distance distribution, the distances between the plurality of light spots and the center of the pupil are vertical coordinates, and the position labels of the plurality of light spots are horizontal coordinates;
a second obtaining module, configured to obtain a second distance distribution based on a plurality of light sources, wherein the second obtaining module includes: the fifth acquisition module is used for acquiring the circle center position of a circle formed by the plurality of light sources; a sixth obtaining module, configured to obtain distances between center positions of the multiple light sources and the circle center position, to obtain a second distance set; a second determining module, configured to determine the second distance distribution according to the second distance set, where in the second distance distribution, distances between center positions of the plurality of light sources and the circle center position are vertical coordinates, and position labels of the center positions of the plurality of light sources are horizontal coordinates;
the comparison module is used for comparing the first distance distribution with the second distance distribution to obtain a comparison result;
the determining module is used for determining that the light spots meeting the second distance distribution are matched with the light source according to the comparison result;
the comparison module comprises: a third determining module, configured to determine a starting light spot and a starting light source, wherein the starting light spot corresponds to the starting light source; a fourth determining module, configured to determine, according to the starting light spot, a first conversion frequency between every two light spots in the first distance distribution; a fifth determining module, configured to determine, according to the starting light source, a second conversion frequency between every two light sources in the second distance distribution; and a first comparison module, configured to compare the first conversion frequency with the second conversion frequency to obtain the comparison result.
6. The apparatus of claim 5, further comprising:
the third acquisition module is used for acquiring the image to be detected;
the processing module is used for preprocessing the image to be detected;
and the extraction module is used for extracting light spots of the preprocessed image to be detected to obtain the central positions of the light spots.
7. The apparatus of claim 5, wherein the first obtaining module comprises:
a fourth obtaining module, configured to obtain a distance between a center position of each of the multiple light spots and the pupil center, to obtain a first distance set;
a first determining module to determine the first distance distribution according to the first distance set.
8. The apparatus of claim 5, wherein the determining module comprises:
a sixth determining module, configured to determine that a light spot corresponding to the first transform frequency matched to the second transform frequency is matched to the light source;
and the filtering module is used for filtering out light spots corresponding to the first conversion frequency which is not matched with the second conversion frequency.
9. A storage medium storing a computer program, wherein the computer program is executed by a computer to implement the method of matching a light source with a light spot according to any one of claims 1 to 4.
10. A processor configured to run a program, wherein, when the program is executed, the method of matching a light source with a light spot according to any one of claims 1 to 4 is implemented.
CN201810147220.5A 2018-02-12 2018-02-12 Method and device for matching light source and light spot Active CN108510542B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201810147220.5A CN108510542B (en) 2018-02-12 2018-02-12 Method and device for matching light source and light spot
PCT/CN2019/071326 WO2019154012A1 (en) 2018-02-12 2019-01-11 Method and apparatus for matching light sources with light spots
TW108102406A TWI680309B (en) 2018-02-12 2019-01-22 Method and device for coupling light source with light spot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810147220.5A CN108510542B (en) 2018-02-12 2018-02-12 Method and device for matching light source and light spot

Publications (2)

Publication Number Publication Date
CN108510542A CN108510542A (en) 2018-09-07
CN108510542B true CN108510542B (en) 2020-09-11

Family

ID=63375044

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810147220.5A Active CN108510542B (en) 2018-02-12 2018-02-12 Method and device for matching light source and light spot

Country Status (3)

Country Link
CN (1) CN108510542B (en)
TW (1) TWI680309B (en)
WO (1) WO2019154012A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108510542B (en) * 2018-02-12 2020-09-11 北京七鑫易维信息技术有限公司 Method and device for matching light source and light spot
CN112747906B (en) * 2021-01-28 2022-07-22 歌尔光学科技有限公司 Light source detection method, detection device and readable storage medium
CN117455927B (en) * 2023-12-21 2024-03-15 万灵帮桥医疗器械(广州)有限责任公司 Method, device, equipment and storage medium for dividing light spot array and calculating light spot offset

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102715981A (en) * 2012-06-29 2012-10-10 深圳普门科技有限公司 Annular light spot obtaining device for light therapy equipment, glaucoma treatment device and method for controlling annular light spots in annular light spot obtaining device
EP3035110A1 (en) * 2014-12-18 2016-06-22 Optotune AG Optical system for avoiding speckle patterns
CN106778641A (en) * 2016-12-23 2017-05-31 北京七鑫易维信息技术有限公司 Gaze estimation method and device
CN107357429A (en) * 2017-07-10 2017-11-17 京东方科技集团股份有限公司 For determining the method, equipment and computer-readable recording medium of sight

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7001377B1 (en) * 2003-12-22 2006-02-21 Alcon Refractivehorizons, Inc. Optical tracking system and associated methods
US20140375541A1 (en) * 2013-06-25 2014-12-25 David Nister Eye tracking via depth camera
CN103530618A (en) * 2013-10-23 2014-01-22 哈尔滨工业大学深圳研究生院 Non-contact sight tracking method based on corneal reflex
EP3065623B1 (en) * 2013-11-09 2020-04-22 Shenzhen Goodix Technology Co., Ltd. Optical eye tracking
CN103761519B (en) * 2013-12-20 2017-05-17 哈尔滨工业大学深圳研究生院 Non-contact sight-line tracking method based on self-adaptive calibration
CN104732191B (en) * 2013-12-23 2018-08-17 北京七鑫易维信息技术有限公司 The devices and methods therefor of virtual display Eye-controlling focus is realized using Cross ration invariability
US9454699B2 (en) * 2014-04-29 2016-09-27 Microsoft Technology Licensing, Llc Handling glare in eye tracking
TWI617948B (en) * 2015-07-24 2018-03-11 由田新技股份有限公司 Module, method and computer readable medium for eye-tracking correction
CN108604291A (en) * 2016-01-13 2018-09-28 Fove股份有限公司 Expression identification system, expression discrimination method and expression identification program
CN205485920U (en) * 2016-01-21 2016-08-17 京东方科技集团股份有限公司 Show controlling means and display control system
CN107066085B (en) * 2017-01-12 2020-07-10 惠州Tcl移动通信有限公司 Method and device for controlling terminal based on eyeball tracking
CN108510542B (en) * 2018-02-12 2020-09-11 北京七鑫易维信息技术有限公司 Method and device for matching light source and light spot

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102715981A (en) * 2012-06-29 2012-10-10 深圳普门科技有限公司 Annular light spot obtaining device for light therapy equipment, glaucoma treatment device and method for controlling annular light spots in annular light spot obtaining device
EP3035110A1 (en) * 2014-12-18 2016-06-22 Optotune AG Optical system for avoiding speckle patterns
CN106778641A (en) * 2016-12-23 2017-05-31 北京七鑫易维信息技术有限公司 Gaze estimation method and device
CN107357429A (en) * 2017-07-10 2017-11-17 京东方科技集团股份有限公司 For determining the method, equipment and computer-readable recording medium of sight

Also Published As

Publication number Publication date
TWI680309B (en) 2019-12-21
TW201935083A (en) 2019-09-01
CN108510542A (en) 2018-09-07
WO2019154012A1 (en) 2019-08-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant