CN115886717B - Eye crack width measuring method, device and storage medium - Google Patents


Info

Publication number
CN115886717B
CN115886717B (application CN202210989509.8A)
Authority
CN
China
Prior art keywords
eye
width
crack
distance
video stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210989509.8A
Other languages
Chinese (zh)
Other versions
CN115886717A (en)
Inventor
田超楠
王友翔
王友志
杜东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Baiyi Medical Technology Co ltd
Original Assignee
Shanghai Baiyi Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Baiyi Medical Technology Co ltd filed Critical Shanghai Baiyi Medical Technology Co ltd
Priority to CN202210989509.8A priority Critical patent/CN115886717B/en
Publication of CN115886717A publication Critical patent/CN115886717A/en
Priority to PCT/CN2023/113448 priority patent/WO2024037579A1/en
Application granted granted Critical
Publication of CN115886717B publication Critical patent/CN115886717B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
        • A61B 3/00 Apparatus for testing the eyes; instruments for examining the eyes
            • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
            • A61B 3/14 Arrangements specially adapted for eye photography
    • G06N — Computing arrangements based on specific computational models
        • G06N 3/00 Computing arrangements based on biological models; G06N 3/02 Neural networks
            • G06N 3/044 Recurrent networks, e.g. Hopfield networks
            • G06N 3/0464 Convolutional networks [CNN, ConvNet]
            • G06N 3/08 Learning methods
    • G06T — Image data processing or generation, in general
        • G06T 7/00 Image analysis; G06T 7/10 Segmentation; edge detection
            • G06T 7/12 Edge-based segmentation
            • G06T 7/13 Edge detection
    • G06V — Image or video recognition or understanding
        • G06V 40/18 Eye characteristics, e.g. of the iris


Abstract

The application relates to a method, an apparatus and a storage medium for measuring eye fissure width, wherein the method comprises the following steps: capturing a first eye image of the user looking straight ahead; capturing a video stream of the user in which the eyelids are held still while the eyeballs follow a tracking point moving left and right in the horizontal direction; segmenting the pupil center from the first eye image with a neural network to obtain the vertical pupil center line, segmenting the boundary points between the static eyelid and the moving eyeball from each frame of the video stream with a neural network, and obtaining the outline of the eye fissure as the set of these boundary points; and obtaining the vertical distance spanned by the eye fissure outline on the pupil center line, from which the eye fissure width is derived. In this application, the pupil center and pupil center line are identified directly by neural-network segmentation, the shape of the palpebral aperture is identified from the eyelid-eyeball boundary by a dynamic segmentation method, and the eye fissure width is obtained from the top-to-bottom distance of the outline along the pupil center line, which improves the accuracy of eye fissure width measurement.

Description

Eye crack width measuring method, device and storage medium
Technical Field
The application relates to the technical field of eye examination, and in particular to a method, an apparatus and a storage medium for measuring eye fissure (palpebral fissure) width.
Background
The eye fissure, also called the palpebral fissure, is the opening between the eyelids; the eye fissure width refers to the distance between the upper and lower eyelids measured through the pupil. Clinically, eyelid retraction includes retraction of the upper and lower eyelids and has many causes, including congenital eyelid retraction, retraction due to traumatic scar traction, retraction due to hyperthyroidism, retraction due to senile degeneration, and, in some cases, neurogenic eyelid retraction due to facial paralysis; it is also one of the important indicators in clinical diagnosis. Eyelid retraction can cause cosmetic disfigurement that is distressing to the patient, as well as exposure keratopathy such as corneal ulcers, which can threaten vision. Measurement of eyelid retraction is therefore critical to clinical diagnosis, and its extent can be reflected by measuring the eye fissure width.
Currently, the eye fissure width is usually measured clinically with a millimeter ruler, but inaccurate readings and differences in operator skill and habit easily introduce errors into the measurement data. In the diagnosis and evaluation of patients with related diseases, a variation of 1 mm often indicates a change in the patient's condition, so high measurement accuracy is required; the ruler-based method can therefore lead to missed or incorrect diagnoses and delay treatment as the condition progresses.
In addition, the prior art also contains solutions that use automatic measurement to determine whether the eyelid is retracted. One Chinese patent (application number: 202010803761.6, publication date: 10/30/2020) identifies the cornea and sclera by image recognition and neural-network training, then determines the upper eyelid and the upper margin of the corneal region to decide whether eyelid retraction is present. However, this approach can misjudge persons with sanpaku ("three-white") eyes, or persons whose eyes appear wide open because of proptosis caused by high myopia. The Chinese patent "Blink number analysis method and system based on image processing" (application number: 201910939612.X, publication date: 02/04/2020) determines the iris outline and sclera outline from captured human-eye images, determines the eye fissure boundary from these outlines, and calculates the eye fissure height from the coordinate difference between the upper and lower boundary points of the fissure. However, the eye fissure height is defined as the distance between the upper and lower eyelids along the pupil center line, while the height determined in this manner may be an oblique distance across the fissure.
In addition, although the prior art also discloses methods that segment an eye image by neural-network training, so that the iris, sclera, pupil and background can be distinguished, training accuracy is sometimes low when each part of the eye is segmented from a single still image. Further improvement in the measurement of the eye fissure width is therefore needed.
Disclosure of Invention
In order to overcome the low accuracy or high complexity of eye fissure width measurement in the prior art, embodiments of the present application provide a method, an apparatus and a storage medium for measuring eye fissure width.
In order to achieve the above purpose, the present application adopts the following technical scheme:
In one aspect, a method for measuring eye fissure width is provided, the method comprising:
capturing a first eye image of the user looking straight ahead;
capturing a video stream of the user in which the eyelids are held still while the eyeballs follow a tracking point moving left and right in the horizontal direction;
segmenting the pupil center from the first eye image with a neural network to obtain the vertical pupil center line, segmenting the boundary points between the static eyelid and the moving eyeball from each frame of the video stream with a neural network, and obtaining the outline of the eye fissure as the set of these boundary points;
and obtaining the vertical distance spanned by the eye fissure outline on the pupil center line, from which the eye fissure width is derived.
Further, segmenting the static/dynamic boundary points from each frame of the video stream with a neural network specifically comprises: comparing two adjacent frames of the video stream to obtain the movement rate of each pixel between the two frames, and segmenting the static/dynamic boundary points with a convolutional and recurrent neural network model.
Further, when segmenting the static/dynamic boundary points from each frame of the video stream with a neural network, horizontal and vertical movement are separated, and the vertical movement is used to reject the influence of blinking.
Further, the eye fissure width is:
eye fissure width (B) = pixel distance of the eye fissure width (A) × single-pixel width × distance from the eye fissure to the camera lens (D) / distance from the camera sensor to the camera lens (C), wherein the pixel distance (A) of the eye fissure width is the top-to-bottom distance of the eye fissure outline along the pupil center line.
Further, the distance from the eye fissure to the camera lens is:
distance from the eye fissure to the camera lens (D) = distance from the eye-corner registration point to the camera lens (1) − eyeball protrusion (2), the eyeball protrusion (2) taking the average protrusion of a normal human eye.
Further, error is reduced by averaging the eye fissure width over multiple measurements.
In one aspect, an apparatus for measuring eye fissure width is provided, comprising:
a first acquisition module, configured to capture a first eye image of the user looking straight ahead;
a second acquisition module, configured to capture a video stream of the user in which the eyelids are held still while the eyeballs follow a tracking point moving left and right in the horizontal direction;
a segmentation module, configured to segment the pupil center from the first eye image to obtain the vertical pupil center line, segment the static/dynamic boundary points from each frame of the video stream, and obtain the outline of the eye fissure as the set of these boundary points;
and a calculation module, configured to calculate the eye fissure width from the top-to-bottom distance of the eye fissure outline along the pupil center line.
Further, the segmentation module further comprises: a segmentation sub-module, configured to separate horizontal and vertical movement when segmenting the static/dynamic boundary points from each frame of the video stream, and to use the vertical movement to reject the influence of blinking.
In one aspect, a computer-readable storage medium is provided, in which at least one program code is stored; the program code is loaded and executed by a processor to perform the operations of the above method for measuring eye fissure width.
Compared with the prior art, the technical solution of the application has at least the following beneficial effects: the pupil center and pupil center line are segmented from the static first eye image by a neural network, and the boundary points between eyelid and eyeball are segmented by a dynamic segmentation method based on optical flow, so that the shape of the palpebral aperture is identified; the eye fissure width is then obtained from the vertical distance spanned by the aperture outline on the pupil center line, which improves the recognition accuracy of the eye fissure.
Drawings
FIG. 1 is a flowchart of the eye fissure width measuring method provided by the application;
FIG. 2 is a schematic diagram of the pupil and iris obtained by semantic segmentation according to the application;
FIG. 3 is a schematic diagram of the calculation principle of the eye fissure width provided by the application;
FIG. 4 is a schematic structural diagram of the eye fissure width measuring apparatus provided by the application.
Detailed Description
The following describes in further detail the embodiments of the present application with reference to the drawings and examples. The following examples are illustrative of the application and are not intended to limit the scope of the application.
Fig. 1 is a flowchart of a method for measuring eye fissure width according to an embodiment of the present application. As shown in fig. 1, the method comprises the following steps:
step S1: shooting and acquiring a first eye bitmap in front of the head-up of a user;
the photographed image is performed in a natural light field so that it can be protected from the influence of red blood filaments on imaging.
Step S2: capturing a video stream of the user in which the eyelids are held still while the eyeballs follow a tracking point moving left and right in the horizontal direction;
Specifically, the tracking point is implemented as a row of indicator lights in the horizontal direction that light up in sequence from left to right or from right to left; under their guidance the user moves the eyeballs left and right along with the tracking point while, for example, a 20 s video stream is recorded.
Step S3: segmenting the pupil center from the first eye image with a neural network to obtain the vertical pupil center line, segmenting the static/dynamic boundary points from each frame of the video stream with a neural network, and obtaining the outline of the eye fissure as the set of these boundary points;
In this step, as shown in fig. 2, the acquired eye image is input into the neural network, which outputs the most likely class (such as iris or pupil) of each pixel in the image; that is, a semantic-segmentation neural network separates the parts of the iris and pupil not occluded by the eyelids. The coordinates of the pupil center point in the first eye image are thus obtained directly from the network, and the vertical pupil center line is the vertical line drawn through this point.
Segmenting the static/dynamic boundary points from each frame of the video stream with a neural network, where "static" refers to the eyelid and "dynamic" to the eyeball, the two moving relative to each other while the eyeball rotates, specifically comprises: inputting two adjacent frames of the eye video stream into the neural network, comparing them, and outputting the displacement of each pixel of the earlier frame between the two frames to obtain each pixel's movement rate; the static/dynamic boundary points are then segmented with a convolutional and recurrent neural network model.
When segmenting the static/dynamic boundary points from each frame of the video stream with the neural network, horizontal and vertical movement are separated, and the vertical movement is used to reject the influence of blinking.
Traditionally, optical flow is treated as an energy-minimization problem that trades off a data term against a regularization term. In a variational framework, optical flow can be formulated as a continuous optimization problem, and a dense flow field can be estimated by gradient steps. Over-smoothing and noise sensitivity can be addressed with a robust estimation framework: replacing the quadratic penalty with an L1 data term and total-variation regularization allows motion discontinuities and handles outliers better, and further improvements have come from better matching costs and regularization terms. Such continuous formulations maintain a single flow estimate that is refined at each iteration; to keep the objective smooth, the data term is modeled with a first-order Taylor approximation, so they only work for small displacements. Large displacements are handled with a coarse-to-fine strategy: large displacements are estimated at low resolution on an image pyramid, and small displacements are then refined at high resolution. But this coarse-to-fine strategy may miss small, fast-moving objects and has difficulty recovering from early errors. Like the continuous methods, the present approach maintains and refines a single optical-flow estimate at each iteration; however, because correlation volumes are built for all pairs at both high and low resolution, each local update uses information about both small and large displacements. Moreover, instead of a sub-pixel Taylor approximation of the data term, the update operator of the present application learns to propose a descent direction. In training, initial training is completed on a simulated photo set and then enhanced with more realistic video.
The moving direction of an object in a video is obtained by comparing the differences between consecutive frames. This neural network can be trained on a video stream generated in simulation: because the temporal and spatial continuity of objects is known in the simulation, the objects and their motion between two consecutive frames are known. In one possible implementation, an existing model is trained on simulated video and driving video, such as the representative RAFT (Recurrent All-Pairs Field Transforms for Optical Flow) neural network model. When the model is used, the fact that the eyelids stay still while the eyeball moves is exploited. Vertical (up-down) and horizontal (left-right) movement are processed separately: when the vertical movement in an eye frame is too large, the frame is discarded on the basis of the segmented vertical movement, and only frames with horizontal movement are kept. The absolute values of the horizontal movement of each pixel in all kept frames are then averaged, and a predefined threshold is applied to obtain the eyeball region in the eye frames of the video.
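The blink-rejection and thresholding logic just described can be sketched as follows. The dense flow fields themselves would come from a learned estimator such as RAFT; here they are simply passed in as arrays, and the two threshold values are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def eyeball_mask(flows, blink_thresh=2.0, motion_thresh=0.5):
    """Segment the moving eyeball from the static eyelid.

    flows: dense optical-flow fields of shape (H, W, 2), channels
    (horizontal, vertical) in pixels per frame.  Frames whose mean
    absolute vertical motion is large are discarded as blinks; the
    average absolute horizontal motion of the remaining frames is
    thresholded to mark eyeball pixels."""
    kept = [f for f in flows if np.abs(f[..., 1]).mean() < blink_thresh]
    if not kept:
        raise ValueError("every frame was rejected as a blink")
    mean_abs_dx = np.mean([np.abs(f[..., 0]) for f in kept], axis=0)
    return mean_abs_dx > motion_thresh

# Toy example: the lower half of the image moves laterally (eyeball),
# and one frame shows large vertical motion everywhere (a blink).
still = np.zeros((4, 4, 2))
moving = still.copy(); moving[2:, :, 0] = 3.0  # lateral eyeball motion
blink = still.copy(); blink[..., 1] = 5.0      # vertical blink motion
mask = eyeball_mask([moving, blink, moving])
```

In this toy run the blink frame is discarded and only the lower half of the image, where lateral motion occurred, is marked as eyeball.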
Step S4: obtaining the vertical distance spanned by the eye fissure outline on the pupil center line, and deriving the eye fissure width from this distance.
As shown in fig. 3, in step S4 the eye fissure width is calculated by the similar-triangles principle, as follows:
eye fissure width B = pixel distance A of the eye fissure width × single-pixel width × distance D from the eye fissure to the camera lens / distance C from the camera sensor to the camera lens, where the pixel distance A is the top-to-bottom distance of the eye fissure outline along the pupil center line.
Here, the distance D from the eye fissure to the camera lens = distance 1 from the eye-corner registration point to the camera lens − eyeball protrusion 2.
The eyeball protrusion 2 in the present application takes the average protrusion of a normal human eye; statistically, this average may be 12-14 mm. In actual operation, the distance 1 from the eye-corner registration point to the camera lens is known, the distance C from the camera sensor to the camera lens is known, the pixel distance A of the eye fissure width is obtained from the captured first eye image, and the single-pixel width is a known constant, so the eye fissure width follows from the formula above.
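The similar-triangle conversion can be written directly in code. The numeric inputs below are illustrative assumptions (they are not values from the patent); the protrusion default is the midpoint of the 12-14 mm normal range quoted above.

```python
def fissure_width_mm(pixel_dist, pixel_size_mm, corner_to_lens_mm,
                     sensor_to_lens_mm, protrusion_mm=13.0):
    """Similar-triangle conversion: B = A * pixel width * D / C,
    with D = corner-to-lens distance - eyeball protrusion."""
    d = corner_to_lens_mm - protrusion_mm  # eye-fissure plane to lens
    return pixel_dist * pixel_size_mm * d / sensor_to_lens_mm

# Illustrative call: a fissure spanning 200 pixels, 0.005 mm pixels,
# eye corner 313 mm from the lens, sensor 50 mm behind the lens.
w = fissure_width_mm(200, 0.005, 313.0, 50.0)  # -> 6.0 mm
```

Note that only the ratio D/C matters for the scale factor, so the absolute camera geometry need not be large or precise as long as both distances are measured consistently.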
Since the user may not keep the eyelids perfectly still throughout each tracking pass, the calculated eye fissure width may differ from one measurement to the next. To ensure the accuracy of the calculation, the error is reduced by averaging the eye fissure width over multiple measurements; in one possible implementation, the mean of two or three measurements is taken as the final eye fissure width, improving measurement accuracy.
Fig. 4 is a schematic block diagram of the eye fissure width measuring apparatus. Referring to fig. 4, the apparatus comprises a first acquisition module 401, a second acquisition module 402, a segmentation module 403 and a calculation module 404, wherein:
the first acquisition module 401 is configured to capture a first eye image of the user looking straight ahead;
the second acquisition module 402 is configured to capture a video stream of the user in which the eyelids are held still while the eyeballs follow the tracking point moving left and right in the horizontal direction.
In one possible implementation, the first and second acquisition modules, in response to an acquisition instruction, control the image or video acquisition device to acquire, respectively, the first eye image of the user looking straight ahead and the video stream in which the eyelids are held still while the eyeballs follow the tracking point.
The segmentation module 403 segments the pupil center from the first eye image with a neural network to obtain the vertical pupil center line, segments the static/dynamic boundary points from each frame of the acquired video stream with a neural network, and aggregates the segmented boundary points to extract the outline of the eye fissure.
In one possible implementation, the segmentation module 403 further comprises a segmentation sub-module configured to separate horizontal and vertical movement when segmenting the static/dynamic boundary points from the video frames and to use the vertical movement to reject the influence of blinking, i.e. a blink is recognized when the vertical movement exceeds a certain threshold.
The calculation module 404 is configured to calculate the eye fissure width from the top-to-bottom distance of the eye fissure outline along the pupil center line.
With respect to the eye fissure width measuring apparatus of the above embodiment, the specific manner in which each module performs its operations has been described in detail in the method embodiment; the relevant points can be found in that description.
In an exemplary embodiment, a computer-readable storage medium is also provided, comprising a memory storing at least one program code, which is loaded and executed by a processor to perform the eye fissure width measuring method of the above embodiment. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random-Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing associated hardware, where the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk or an optical disk.
The foregoing description of the preferred embodiments of the application is not intended to be limiting; all modifications, equivalents and alternatives falling within the spirit and principles of the application are intended to be covered.

Claims (6)

1. A method of measuring eye fissure width, the method comprising:
capturing a first eye image of the user looking straight ahead;
capturing a video stream of the user in which the eyelids are held still while the eyeballs follow a tracking point moving left and right in the horizontal direction;
segmenting the pupil center from the first eye image with a neural network to obtain the vertical pupil center line,
segmenting static/dynamic boundary points from each frame of the video stream with a neural network, and obtaining the outline of the eye fissure as the set of these boundary points;
obtaining the vertical distance spanned by the eye fissure outline on the pupil center line, and calculating the eye fissure width from this distance;
wherein segmenting the static/dynamic boundary points from each frame of the video stream with the neural network comprises: comparing two adjacent frames of the video stream to obtain the movement rate of each pixel between the two frames, and segmenting the static/dynamic boundary points with a convolutional and recurrent neural network model;
and wherein, when segmenting the static/dynamic boundary points from each frame of the video stream with the neural network, horizontal and vertical movement are separated, and the vertical movement is used to reject the influence of blinking.
2. The method of measuring eye fissure width according to claim 1, wherein the eye fissure width is:
eye fissure width (B) = pixel distance of the eye fissure width (A) × single-pixel width × distance from the eye fissure to the camera lens (D) / distance from the camera sensor to the camera lens (C);
wherein the pixel distance (A) of the eye fissure width is the top-to-bottom distance of the eye fissure outline along the pupil center line.
3. The method according to claim 2, wherein the distance from the eye fissure to the camera lens is:
distance from the eye fissure to the camera lens (D) = distance from the eye-corner registration point to the camera lens (1) − eyeball protrusion (2), wherein the eyeball protrusion (2) takes the average protrusion of a normal human eye.
4. The method of measuring eye fissure width according to claim 3, wherein error is reduced by averaging multiple measurements of the eye fissure width.
5. An apparatus for measuring eye fissure width, comprising:
a first acquisition module, configured to capture a first eye image of the user looking straight ahead;
a second acquisition module, configured to capture a video stream of the user in which the eyelids are held still while the eyeballs follow a tracking point moving left and right in the horizontal direction;
a segmentation module, configured to segment the pupil center from the first eye image to obtain the vertical pupil center line, segment static/dynamic boundary points from each frame of the video stream, and obtain the outline of the eye fissure as the set of these boundary points;
a calculation module, configured to calculate the eye fissure width from the top-to-bottom distance of the eye fissure outline along the pupil center line;
wherein the segmentation module further comprises: a segmentation sub-module, configured to compare two adjacent frames of the video stream when segmenting static/dynamic boundary points from the frames, obtain the movement rate of each pixel between the two frames, segment the static/dynamic boundary points with a convolutional and recurrent neural network model, separate horizontal and vertical movement, and use the vertical movement to reject the influence of blinking.
6. A computer-readable storage medium, wherein at least one program code is stored in the computer-readable storage medium, the program code being loaded and executed by a processor to implement the method of measuring eye fissure width according to any one of claims 1 to 4.
CN202210989509.8A 2022-08-18 2022-08-18 Eye crack width measuring method, device and storage medium Active CN115886717B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210989509.8A CN115886717B (en) 2022-08-18 2022-08-18 Eye crack width measuring method, device and storage medium
PCT/CN2023/113448 WO2024037579A1 (en) 2022-08-18 2023-08-17 Palpebral fissure height measurement method and apparatus, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210989509.8A CN115886717B (en) 2022-08-18 2022-08-18 Eye crack width measuring method, device and storage medium

Publications (2)

Publication Number Publication Date
CN115886717A CN115886717A (en) 2023-04-04
CN115886717B true CN115886717B (en) 2023-09-29

Family

ID=86487145

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210989509.8A Active CN115886717B (en) 2022-08-18 2022-08-18 Eye crack width measuring method, device and storage medium

Country Status (2)

Country Link
CN (1) CN115886717B (en)
WO (1) WO2024037579A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115886717B (en) * 2022-08-18 2023-09-29 上海佰翊医疗科技有限公司 Eye crack width measuring method, device and storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000023100A (en) * 1998-07-07 2000-01-21 Ricoh Co Ltd Electronic still camera, moving picture photographing method of electronic still camera and recording medium storing computer executable program
JP2006157691A (en) * 2004-11-30 2006-06-15 Nippon Telegr & Teleph Corp <Ntt> Representative image selecting method, apparatus, and program
JP2009181424A (en) * 2008-01-31 2009-08-13 Nec Corp Image processor, method for processing image, and image processing program
CN108062507A (en) * 2016-11-08 2018-05-22 中兴通讯股份有限公司 A kind of method for processing video frequency and device
CN111033442A (en) * 2017-09-01 2020-04-17 奇跃公司 Detailed eye shape model for robust biometric applications
KR20200086742A (en) * 2017-12-14 2020-07-17 삼성전자주식회사 Method and device for determining gaze distance
CN111664839A (en) * 2020-05-20 2020-09-15 重庆大学 Vehicle-mounted head-up display virtual image distance measuring method
CN112384127A (en) * 2018-07-27 2021-02-19 高雄医学大学 Eyelid droop detection method and system
CN112932407A (en) * 2021-01-29 2021-06-11 上海市内分泌代谢病研究所 Face front calibration method and system
CN113080836A (en) * 2021-03-31 2021-07-09 上海青研科技有限公司 Non-center gazing visual detection and visual training equipment
WO2021260526A1 (en) * 2020-06-23 2021-12-30 Mor Research Applications Ltd. System and method for characterizing droopy eyelid
CN114694236A (en) * 2022-03-08 2022-07-01 浙江大学 Eyeball motion segmentation positioning method based on cyclic residual convolution neural network
CN114910052A (en) * 2022-05-27 2022-08-16 深圳市立体通技术有限公司 Camera-based distance measurement method, control method and device and electronic equipment

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
JP4893862B1 (en) * 2011-03-11 2012-03-07 オムロン株式会社 Image processing apparatus and image processing method
CN105913487B (en) * 2016-04-09 2018-07-06 北京航空航天大学 One kind is based on the matched direction of visual lines computational methods of iris edge analysis in eye image
US10580133B2 (en) * 2018-05-30 2020-03-03 Viswesh Krishna Techniques for identifying blepharoptosis from an image
CN111340922B (en) * 2018-12-18 2024-10-15 北京三星通信技术研究有限公司 Positioning and map construction method and electronic equipment
CN110335266B (en) * 2019-07-04 2023-04-07 五邑大学 Intelligent traditional Chinese medicine visual inspection image processing method and device
CN112837805B (en) * 2021-01-12 2024-03-29 浙江大学 Eyelid topological morphology feature extraction method based on deep learning
CN114821756A (en) * 2022-05-20 2022-07-29 上海美沃精密仪器股份有限公司 Blink data statistical method and device based on deep learning
CN115886717B (en) * 2022-08-18 2023-09-29 上海佰翊医疗科技有限公司 Eye crack width measuring method, device and storage medium

Non-Patent Citations (1)

Title
Yan Jianquan. The application of portrait perspective drawing in video portraiture. Guangdong Public Security Science and Technology. 2015, (Issue 120), 13-18. *

Also Published As

Publication number Publication date
WO2024037579A1 (en) 2024-02-22
CN115886717A (en) 2023-04-04

Similar Documents

Publication Publication Date Title
US10564446B2 (en) Method, apparatus, and computer program for establishing a representation of a spectacle lens edge
CN109684915B (en) Pupil tracking image processing method
CN105094300B (en) A kind of sight line tracking system and method based on standardization eye image
CN108985210A (en) A kind of Eye-controlling focus method and system based on human eye geometrical characteristic
US20220100268A1 (en) Eye tracking device and a method thereof
JP2011520503A (en) Automatic concave nipple ratio measurement system
CN112837805B (en) Eyelid topological morphology feature extraction method based on deep learning
CN111667456A (en) Method and device for detecting vascular stenosis in coronary artery X-ray sequence radiography
CN111933275A (en) Depression evaluation system based on eye movement and facial expression
CN109697719A (en) A kind of image quality measure method, apparatus and computer readable storage medium
CN109700449B (en) Effective heart rate measuring system capable of resisting natural head movement
CN115886717B (en) Eye crack width measuring method, device and storage medium
Kaya A novel method for optic disc detection in retinal images using the cuckoo search algorithm and structural similarity index
WO2024037587A1 (en) Palpebral fissure height measurement method and apparatus, and storage medium
CN110598635B (en) Method and system for face detection and pupil positioning in continuous video frames
JP3726122B2 (en) Gaze detection system
Raza et al. Hybrid classifier based drusen detection in colored fundus images
Ghadiri et al. Retinal vessel tortuosity evaluation via circular Hough transform
CN111369496B (en) Pupil center positioning method based on star ray
Charoenpong et al. Pupil extraction system for Nystagmus diagnosis by using K-mean clustering and Mahalanobis distance technique
Domingo et al. Irregular motion recovery in fluorescein angiograms
Liang et al. Location of optic disk in the fundus image based on visual attention
CN112528714A (en) Single light source-based gaze point estimation method, system, processor and equipment
Hazelhoff et al. Behavioral state detection of newborns based on facial expression analysis
CN109188703A (en) A kind of artificial intelligence diagnostic imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant