CN111028234A - Human eye tear film rupture detection method - Google Patents
- Publication number
- CN111028234A (application CN202010149917.3A)
- Authority
- CN
- China
- Prior art keywords
- tear film
- human eye
- pattern
- comparison
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30041—Eye; Retina; Ophthalmic
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Eye Examination Apparatus (AREA)
Abstract
The invention discloses a human eye tear film rupture detection method comprising the following steps: illuminating the eyes of a person to be examined with an illumination device having a certain light-emitting pattern; collecting video of the person's eyes; and judging the state of the person's tear film according to changes in the pattern reflected by the eyes in the video. The method has the beneficial effect that it automatically judges the state of the tear film from these changes in the reflected pattern.
Description
Technical Field
The invention relates to a human eye tear film rupture detection method.
Background
The tear film is a thin liquid layer coating the corneal and conjunctival surfaces. It consists of three parts: an outermost lipid layer, a middle aqueous layer, and an innermost mucin layer. The tear film keeps the ocular surface moist. Blinking spreads tears over the surface of the eyeball to form the tear film. After formation, the film gradually evaporates and thins until it breaks; the interval from formation to the first break is the tear film break-up time. In a healthy eye the tear film is stable and lasts a long time, but in eyes with conditions such as dry eye the film breaks in a short time and its protective function over the ocular surface is reduced. Tear film break-up time is therefore an important parameter of ocular health and the main basis for dry eye diagnosis.
In conventional tear film break-up time measurement, the cornea is stained with sodium fluorescein and an observer watches under a slit-lamp microscope for the time at which the first dry spot appears. The accuracy of the result depends on the observer's experience and on the stability of the stain on the tear film, so the result is subjective and unstable; moreover, the stain must be dripped into the patient's eye before the test, which causes some harm to the patient.
Disclosure of Invention
The invention provides a human eye tear film rupture detection method, which adopts the following technical scheme:
a method for detecting tear film disruption in a human eye, comprising the steps of:
illuminating the eyes of a person to be examined with an illumination device having a certain light-emitting pattern;
collecting video of the eyes of the person to be examined;
and judging the state of the person's tear film according to changes in the pattern reflected by the eyes in the video.
Further, the specific method of judging the state of the person's tear film according to changes in the reflected pattern comprises the following steps:
taking one process of the person opening and closing the eyes as a detection period, identifying the moment at which the eyes are fully open, taking the image at that moment as the reference frame, and taking every frame after the reference frame as a comparison frame;
each comparison frame is compared with the reference frame to identify the differences between them and the times at which the differences arise.
Further, the moment at which the eyes are fully open is identified as follows:
the mean pixel value of each frame is calculated; when the mean falls to a certain value and stays there for a preset duration, the moment at which the mean first fell to that value is taken as the moment the eyes are fully open.
Further, the specific method of comparing each comparison frame with the reference frame to identify their differences and the times at which the differences arise is as follows:
applying contrast-limited adaptive histogram equalization to the reference frame, applying mean filtering to the equalized reference frame, and taking the difference between the reference frame before and after mean filtering to obtain a reference pattern;
applying contrast-limited adaptive histogram equalization to the comparison frame, applying mean filtering to the equalized comparison frame, and taking the difference between the comparison frame before and after mean filtering to obtain a comparison pattern;
subtracting the reference pattern and the comparison pattern to obtain a difference map;
dividing the reference pattern, the comparison pattern and the difference map into the same number of equally sized sub-regions;
comparing corresponding sub-regions of the reference pattern and the comparison pattern and identifying the sub-regions that differ;
mapping the identified differing sub-regions onto the difference map, marking those that contain a difference portion as rupture regions, and recording the rupture time of each rupture region.
Further, the rupture time is the time interval between the comparison frame in which the rupture region appears and the reference frame.
Further, once a sub-region in a comparison frame has been marked as a rupture region, that region is not judged again in the comparison of subsequent comparison frames.
Further, a tear film rupture result topographic map is output, marked with the rupture regions and the rupture time corresponding to each region.
Further, the sub-regions that differ are identified as follows:
the image similarity of corresponding sub-regions in the reference pattern and the comparison pattern is calculated; if the similarity is below a threshold, the sub-regions are judged to differ.
Further, the light-emitting pattern of the illumination device is concentric rings.
Further, the illumination device emits white light.
The method has the beneficial effects that it automatically judges the state of the person's tear film according to changes in the reflected pattern, and that, by identifying the differences between the reference frame and each subsequent comparison frame, it clearly identifies every rupture region and the time at which each region arises.
Drawings
FIG. 1 is a schematic diagram of a human eye tear film break-up detection method of the present invention;
FIG. 2 is a schematic view of a lighting device;
FIG. 3 is a schematic diagram of a reference frame;
FIG. 4 is a schematic illustration of a comparison frame;
FIG. 5 is a schematic diagram of a reference pattern resulting from processing a reference frame;
FIG. 6 is a schematic diagram of a comparison pattern resulting from processing a comparison frame;
FIG. 7 is a schematic illustration of a difference map obtained by differencing a reference pattern and a comparison pattern;
FIG. 8 is a partially enlarged schematic illustration of the division of a reference pattern;
FIG. 9 is a partially enlarged schematic illustration of the division of a comparison pattern;
FIG. 10 is a schematic illustration of mapping a sub-region in which a difference exists to a difference map;
FIG. 11 is a topographic map of the output tear film rupture results.
Detailed Description
The invention is described in detail below with reference to the figures and the embodiments.
As shown in FIG. 1, a method for detecting tear film rupture in a human eye comprises the following steps. S1: illuminate the eyes of the person to be examined with an illumination device having a certain light-emitting pattern. S2: collect video of the person's eyes. S3: judge the state of the person's tear film according to changes in the pattern reflected by the eyes in the video. Through these steps, the state of the tear film is monitored from changes in the light-source pattern reflected by the eyes. It will be appreciated that image acquisition and processing are handled automatically by the associated electronic modules: video is acquired by an image acquisition module, and an image processing module and a computation module automatically process and analyze it to judge the state of the tear film.
For step S1: the eye of the person to be examined is illuminated by an illumination device having a certain light emission pattern.
The method judges whether the tear film has ruptured mainly from changes in the light-source pattern reflected by the eye, so first an illumination device with a certain light-emitting pattern irradiates the eye of the person to be examined. Many light-emitting patterns are possible. In the present invention, to clearly reflect the state of the tear film at different parts of the eye, the concentric illumination device shown in FIG. 2 is used: its illuminating portion consists of several concentric light-emitting rings of different radii, spaced at equal distances. It is understood that the illuminating portion may instead be formed of concentric light-emitting parts of other shapes, such as regular triangles or squares of different side lengths. In the present invention, the light emitted by the illumination device is white.
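The concentric illumination pattern described above can be modelled numerically. The sketch below is an illustration rather than part of the patent; it builds a binary image of equally spaced light-emitting rings, and the image size, ring count and ring width are assumed values:

```python
import numpy as np

def concentric_ring_pattern(size=200, n_rings=5, ring_width=4):
    """Binary image of equally spaced concentric light-emitting rings,
    loosely modelling the illumination device of FIG. 2."""
    y, x = np.mgrid[:size, :size]
    r = np.hypot(x - size / 2, y - size / 2)   # distance of each pixel from centre
    spacing = (size / 2) / (n_rings + 1)       # equal spacing between ring radii
    pattern = np.zeros((size, size), dtype=bool)
    for i in range(1, n_rings + 1):
        # Light up a thin annulus around each ring radius.
        pattern |= np.abs(r - i * spacing) < ring_width / 2
    return pattern
```

Reflected off an intact tear film, such a pattern stays sharp between blinks; local distortion or disappearance of the rings is the cue the later processing steps detect.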
For step S2: the method comprises the steps of collecting video information of human eyes of a person to be detected.
Video of the eyes of the person to be examined is acquired by an image acquisition module, such as a camera.
For step S3: and judging the state of the human eye tear film of the person to be detected according to the change of the human eye reflection pattern in the video information.
An image processing device analyzes changes in the pattern reflected by the eyes in the video and judges the state of the tear film from those changes.
Specifically, one process of the person opening and closing the eyes is taken as a detection period. The moment at which the eyes are fully open is identified, and the image at that moment is taken as the reference frame, shown in FIG. 3. Every frame after the reference frame is taken as a comparison frame; FIG. 4 shows one such comparison frame. Each comparison frame is compared with the reference frame to identify the differences between them and the times at which the differences arise. By analyzing the difference between each comparison frame and the reference frame, the specific areas of tear film rupture and their times of occurrence can be found.
Because the mean pixel value of a frame showing the closed eyelid is far higher than that of a frame showing the open eye, the eye-opening judgment algorithm calculates the mean pixel value of each frame; when the mean falls to a certain value and remains stable at that value for a certain time, the eyes are judged to be fully open. The mean reaches its lowest value when the eyes are fully open, and the moment at which the mean first falls to this lowest value is taken as the moment the eyes are fully open. In the present invention the stabilization duration is set to 0.5 s; it is understood that this interval may be adjusted as circumstances require.
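The eye-opening judgment can be sketched as follows. This is a minimal illustration, not the patented implementation: the frame rate, the default threshold margin and the helper name `detect_eye_open` are assumptions, while the 0.5 s window follows the value given above.

```python
import numpy as np

def detect_eye_open(frame_means, fps=30.0, stable_duration=0.5, open_threshold=None):
    """Return the index of the first frame in which the eye is judged fully open.

    frame_means: per-frame mean pixel intensity; it drops when the eye opens,
    because closed-lid frames are brighter than open-eye frames.
    open_threshold: level the mean must fall to; defaults to the minimum
    plus a small margin (an assumed heuristic).
    """
    means = np.asarray(frame_means, dtype=float)
    if open_threshold is None:
        open_threshold = means.min() + 0.05 * (means.max() - means.min())
    stable_frames = int(round(stable_duration * fps))
    below = means <= open_threshold
    # First index where the mean stays at/below threshold for the whole window.
    for i in range(len(below) - stable_frames + 1):
        if below[i:i + stable_frames].all():
            return i
    return None
```

The returned index identifies the reference frame; `None` means no stable fully-open interval was found in the clip.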
Each comparison frame is compared with the reference frame as follows. Contrast-limited adaptive histogram equalization is applied to the reference frame, mean filtering is applied to the equalized frame, and the difference between the frame before and after mean filtering is taken to obtain the reference pattern; FIG. 5 shows the result of processing the reference frame of FIG. 3. Similarly, contrast-limited adaptive histogram equalization is applied to each comparison frame, mean filtering is applied to the equalized frame, and the difference before and after mean filtering is taken to obtain the comparison pattern; FIG. 6 shows the result of processing the comparison frame of FIG. 4. The reference pattern and the comparison pattern are then subtracted to obtain the difference map shown in FIG. 7. The reference pattern, the comparison pattern and the difference map are divided into the same number of equally sized sub-regions, and corresponding sub-regions of the reference and comparison patterns are compared to identify sub-regions that differ. FIG. 8 is a partial enlarged view of the division of FIG. 5, and FIG. 9 of FIG. 6; each sub-region carries a sub-region number. In the present invention, differing sub-regions are identified by calculating the image similarity of corresponding sub-regions in the reference and comparison patterns; if the similarity is below a threshold, the sub-regions are judged to differ.
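The pattern extraction and sub-region comparison can be sketched as follows. In practice the equalization step would use CLAHE (e.g. OpenCV's `cv2.createCLAHE`); the sketch below omits it and keeps only the mean-filter difference, and it uses a simple normalized mean-absolute-difference similarity in place of an unspecified similarity measure. Both simplifications, and the grid size, are assumptions:

```python
import numpy as np

def extract_pattern(frame, k=5):
    """Difference between a frame and its mean-filtered copy (the patent
    additionally applies CLAHE first; that step is omitted here)."""
    f = frame.astype(float)
    pad = k // 2
    padded = np.pad(f, pad, mode="edge")
    out = np.zeros_like(f)
    # k x k box (mean) filter built from shifted sums of the padded image.
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + f.shape[0], dx:dx + f.shape[1]]
    return np.abs(f - out / (k * k))

def differing_subregions(ref_pat, cmp_pat, grid=(4, 4), threshold=0.8):
    """Split both patterns into an equal grid and return (row, col) indices
    of sub-regions whose similarity falls below the threshold."""
    h, w = ref_pat.shape
    gh, gw = h // grid[0], w // grid[1]
    diffs = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            a = ref_pat[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw]
            b = cmp_pat[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw]
            # Similarity: 1 minus mean absolute difference, normalized to [0, 1].
            sim = 1.0 - np.mean(np.abs(a - b)) / 255.0
            if sim < threshold:
                diffs.append((r, c))
    return diffs
```

The flagged sub-regions are the candidates that the next step maps onto the difference map and checks for rupture.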
The identified differing sub-regions are mapped onto the difference map; those that contain a difference portion are marked as rupture regions, and the rupture time of each is recorded. In FIG. 10, the sub-regions selected by boxes are the identified differing sub-regions, and the boxed sub-regions containing white difference portions are marked as rupture regions. In the present invention, the rupture time is defined as the time interval between the comparison frame in which a rupture region appears and the reference frame.
As an alternative, a boxed sub-region is marked as a rupture region when a white difference portion lies within a certain distance around it. This scheme differs from the one above in that even if the box itself contains no white difference portion, the sub-region is judged ruptured when a white difference portion exists within a certain surrounding area. The distance may be set according to the actual situation.
It will be appreciated that, as time passes, the tear film may rupture at more positions and positions that have already ruptured may continue to change, so a sub-region in which rupture has already been detected might be judged unruptured in a later comparison frame. Therefore, to make the result more accurate, once a sub-region has been marked as a rupture region it remains so marked and is not judged again in subsequent comparison frames; that is, the first rupture judgment is taken as definitive.
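This first-judgment-is-definitive bookkeeping can be sketched as follows; `find_ruptured` is a hypothetical stand-in for the per-frame sub-region comparison described above, and the frame rate is an assumption:

```python
def record_ruptures(ref_idx, n_frames, find_ruptured, fps=30.0):
    """Scan the comparison frames after the reference frame.

    find_ruptured(frame_idx) -> iterable of sub-region ids judged ruptured
    in that comparison frame (stand-in for the sub-region comparison).

    Returns {region_id: rupture_time_seconds}; each region keeps the time
    of its first detection and is never re-judged afterwards.
    """
    rupture_times = {}
    for idx in range(ref_idx + 1, n_frames):
        for region in find_ruptured(idx):
            if region not in rupture_times:  # first judgment is definitive
                rupture_times[region] = (idx - ref_idx) / fps
    return rupture_times
```

The resulting dictionary is exactly what the topographic output map needs: every rupture region paired with its rupture time relative to the reference frame.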
Further, after a detection period is completed, a tear film rupture result topographic map is output through the image display module: all rupture regions and the rupture time of each are marked on the reference frame image, as shown in FIG. 11. In the present invention, the image display module is a liquid crystal display. The map clearly shows the regions in which the tear film ruptured and the time of each rupture, from which the health of the eye can be judged. Detection ends when eye closure is detected; similar to the eye-opening judgment, closure is judged from the change in the mean pixel value of each frame. Specifically, when the mean pixel value of a comparison frame suddenly increases relative to that of the reference frame, the frame is judged to be closed-eye.
The foregoing illustrates and describes the principles, general features, and advantages of the present invention. It should be understood by those skilled in the art that the above embodiments do not limit the present invention in any way, and all technical solutions obtained by using equivalent alternatives or equivalent variations fall within the scope of the present invention.
Claims (10)
1. A method for detecting tear film disruption in a human eye, comprising the steps of:
illuminating the eyes of a person to be examined with an illumination device having a certain light-emitting pattern;
collecting video of the eyes of the person to be examined;
and judging the state of the person's tear film according to changes in the pattern reflected by the eyes in the video.
2. The method of detecting tear film disruption in a human eye of claim 1,
the specific method for judging the state of the human eye tear film of the person to be detected according to the change of the human eye reflection pattern in the video information comprises the following steps:
taking one process of the person opening and closing the eyes as a detection period, identifying the moment at which the eyes are fully open, taking the image at that moment as the reference frame, and taking every frame after the reference frame as a comparison frame;
each comparison frame is compared with the reference frame to identify the differences between them and the times at which the differences are generated.
3. The method of detecting tear film disruption in a human eye of claim 2,
the method for identifying the moment when the human eyes are fully opened comprises the following steps:
calculating the mean pixel value of each frame; when the mean falls to a certain value and stays there for a preset duration, taking the moment at which the mean first fell to that value as the moment the eyes are fully open.
4. The method of detecting tear film disruption in a human eye of claim 2,
the specific method of comparing each comparison frame with the reference frame to identify the differences between each comparison frame and the reference frame and the times at which the differences are generated is as follows:
applying contrast-limited adaptive histogram equalization to the reference frame, applying mean filtering to the equalized reference frame, and taking the difference between the reference frame before and after mean filtering to obtain a reference pattern;
applying contrast-limited adaptive histogram equalization to the comparison frame, applying mean filtering to the equalized comparison frame, and taking the difference between the comparison frame before and after mean filtering to obtain a comparison pattern;
obtaining a difference map by subtracting the reference pattern and the comparison pattern;
dividing the reference pattern, the comparison pattern and the difference map into a plurality of sub-regions of the same number and the same size;
comparing corresponding sub-regions of the reference pattern and the comparison pattern and identifying the sub-regions that differ;
mapping the identified differing sub-regions onto the difference map, marking those that contain a difference portion as rupture regions, and recording the rupture time of each rupture region.
5. The method of detecting tear film disruption in a human eye of claim 4,
the rupture time is the time interval between the comparison frame in which the rupture region appears and the reference frame.
6. The method of detecting tear film disruption in a human eye of claim 4,
once a sub-region in a comparison frame has been marked as a rupture region, the rupture region is not judged again in the comparison judgment of subsequent comparison frames.
7. The method of detecting tear film disruption in a human eye of claim 4,
outputting a tear film rupture result topographic map, wherein the map is marked with the rupture regions and the rupture time corresponding to each rupture region.
8. The method of detecting tear film disruption in a human eye of claim 4,
the specific method for identifying the sub-regions with differences is as follows:
calculating the image similarity of corresponding sub-regions in the reference pattern and the comparison pattern, and judging that the sub-regions differ if the similarity is below a threshold.
9. The method of detecting tear film disruption in a human eye of claim 1,
the light emitting pattern of the lighting device is concentric rings.
10. The method of detecting tear film disruption in a human eye of claim 9,
the illumination device illuminates white light.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2019113972165 | 2019-12-30 | ||
CN201911397216 | 2019-12-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111028234A true CN111028234A (en) | 2020-04-17 |
Family
ID=70199343
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010149917.3A Pending CN111028234A (en) | 2019-12-30 | 2020-03-06 | Human eye tear film rupture detection method |
CN202010224225.0A Pending CN111127463A (en) | 2019-12-30 | 2020-03-26 | Human eye tear film breakage detection device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010224225.0A Pending CN111127463A (en) | 2019-12-30 | 2020-03-26 | Human eye tear film breakage detection device |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN111028234A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114240823A (en) * | 2021-10-29 | 2022-03-25 | 深圳莫廷医疗科技有限公司 | Real-time tear film break-up detection method, computer-readable storage medium, and apparatus |
CN115294071A (en) * | 2022-08-10 | 2022-11-04 | 中山大学中山眼科中心 | Tear film detection system and method based on video data |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104463081A (en) * | 2013-09-16 | 2015-03-25 | 展讯通信(天津)有限公司 | Detection method of human eye state |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104545793B (en) * | 2015-01-28 | 2016-02-10 | 厦门大学 | A kind of contactless breakup time of tear film measuring device and method |
US11369264B2 (en) * | 2017-08-10 | 2022-06-28 | Kyoto Prefectural Public University Corporation | Method for dynamic evaluation of tear fluid layer and device therefor |
-
2020
- 2020-03-06 CN CN202010149917.3A patent/CN111028234A/en active Pending
- 2020-03-26 CN CN202010224225.0A patent/CN111127463A/en active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104463081A (en) * | 2013-09-16 | 2015-03-25 | 展讯通信(天津)有限公司 | Detection method of human eye state |
Non-Patent Citations (1)
Title |
---|
师雷雷 (Shi Leilei): "Research and Application of Key Technologies for Dry Eye Detection", China Master's Theses Full-Text Database, Medicine & Health Sciences Part * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114240823A (en) * | 2021-10-29 | 2022-03-25 | 深圳莫廷医疗科技有限公司 | Real-time tear film break-up detection method, computer-readable storage medium, and apparatus |
CN114240823B (en) * | 2021-10-29 | 2024-10-29 | 深圳莫廷医疗科技有限公司 | Real-time tear film rupture detection method, computer readable storage medium and apparatus |
CN115294071A (en) * | 2022-08-10 | 2022-11-04 | 中山大学中山眼科中心 | Tear film detection system and method based on video data |
Also Published As
Publication number | Publication date |
---|---|
CN111127463A (en) | 2020-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101484365B1 (en) | Image processing apparatus, image processing method and storage medium | |
EP2529661B1 (en) | Ophthalmology device and image categorizing method | |
JP4845698B2 (en) | Eye detection device, eye detection method, and program | |
CN110555845A (en) | Fundus OCT image identification method and equipment | |
CN105828700A (en) | Method For Operating An Eye Tracking Device And Eye Tracking Device For Providing An Active Illumination Control For Improved Eye Tracking Robustness | |
CN111028234A (en) | Human eye tear film rupture detection method | |
CN109961848B (en) | Macular image classification method and device | |
JP2016520350A (en) | Method and system for detecting ocular tissue structure and pathology | |
BRPI0303222B1 (en) | method and apparatus for inspecting optical devices | |
Trokielewicz et al. | Database of iris images acquired in the presence of ocular pathologies and assessment of iris recognition reliability for disease-affected eyes | |
US6735331B1 (en) | Method and apparatus for early detection and classification of retinal pathologies | |
JP5038925B2 (en) | Ophthalmic measuring device | |
US6802837B2 (en) | Device used for the photorefractive keratectomy of the eye using a centering method | |
KR101693802B1 (en) | Apparatus for Analyzing Lipid Layer in Tear Film | |
CN111242212A (en) | Method for detecting atrophy arc of high-myopia fundus image based on machine learning | |
US20240299592A1 (en) | Dye enhanced visualization of cataract surgery | |
WO2016126556A1 (en) | Method and system for objective evaluation of dry eye syndrome | |
CN111110185A (en) | Tear film lipid layer thickness detection device | |
WO2011108995A1 (en) | Automatic analysis of images of the anterior chamber of an eye | |
Rosenthal et al. | Digital measurement of pallor-disc ratio | |
CN114820537A (en) | Dry eye FBUT detection method and system based on deep learning and storage medium | |
CN108230287B (en) | Method and device for detecting crystal region of anterior segment image | |
JPH09198508A (en) | Eye state detector | |
US20220330814A1 (en) | Method for evaluating the stability of a tear film | |
JP2022548111A (en) | Apparatus and method for detecting tear breakup |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20200417 |