GB2576139A - Ocular assessment - Google Patents


Info

Publication number
GB2576139A
GB2576139A
Authority
GB
United Kingdom
Prior art keywords
eye
images
features
measurements
user
Prior art date
Legal status
Withdrawn
Application number
GB1811923.0A
Other versions
GB201811923D0 (en)
Inventor
Saha Konal
Current Assignee
Ophthalmic Surgical Services Ltd
Original Assignee
Ophthalmic Surgical Services Ltd
Priority date
Application filed by Ophthalmic Surgical Services Ltd filed Critical Ophthalmic Surgical Services Ltd
Priority to GB1811923.0A
Publication of GB201811923D0
Publication of GB2576139A
Legal status: Withdrawn

Classifications

    • A61B 3/00: Apparatus for testing the eyes; instruments for examining the eyes
    • A61B 3/0025: Operational features characterised by electronic signal processing, e.g. eye models
    • A61B 3/112: Objective instruments for measuring the diameter of pupils
    • A61B 5/004: Features of imaging apparatus adapted for image acquisition of a particular organ or body part
    • A61B 6/486: Diagnostic techniques involving generating temporal series of image data
    • G06T 7/0016: Biomedical image inspection using an image reference approach involving temporal comparison
    • G06T 7/10: Segmentation; edge detection
    • G06T 7/50: Depth or shape recovery
    • G06T 7/60: Analysis of geometric attributes
    • G06T 2207/20112: Image segmentation details
    • G06T 2207/30041: Eye; retina; ophthalmic

Abstract

Method of measuring changes in ocular features comprising: obtaining measurements of eye features from images captured at first and second times; comparing the measurements to quantitatively measure change of appearance of the eye between the first and second times. Image processing may be performed, e.g. pre-processing, edge or corner detection, template matching, deformable parameterised shapes, active contours, normalisation, enhancement, scaling, segmentation. Images may be transformed, sized, scaled or rotated to minimise the sum of differences and to compensate for discrepancies due to distance from camera, focus, pose and image capture variations. Images may be two-dimensional, e.g. 2D video sequences. Three-dimensional images may be captured through stereo imaging, 3D points mapping the eye surface, or structured light pattern projection. Measurements may be based on the spatial relationship of 3D extracted feature positions. Features may be represented by points, lines, curves. Straight-line lengths between spatial points, ratios of distances or curvatures, or absolute or percentage distance changes may be determined. Weighted averages of multiple measurement readings may be determined. Measurements may be taken along the mid pupil line and the medial and lateral limbal lines. Measurements may be used to determine margin reflex distance, pretarsal skin show, upper eyelid skin fold, eyebrow skin show.

Description

OCULAR ASSESSMENT
The present invention relates to assessment of surgical procedures, and particularly, but not exclusively, to determining efficacy of oculoplastic surgery.
Oculoplastic Surgeons commonly perform surgery to change the appearance of patients' eyelids. Surgery is often aimed at raising the upper eyelid margin (ptosis surgery) and/or reducing hooding of the upper eyelids (upper blepharoplasty). In order to document efficacy of intervention, measurements are made of eyelid height using a ruler in clinic. Accurate assessment is important for a number of different reasons including:
• Medicolegal - is there objective proof the intervention has had the desired effect?
• Healthcare quality improvement - we need an accurate, reproducible method for documenting outcomes in order to compare different techniques and refine practice.
• Research - understanding ageing changes and disease progression through accurate documentation of change.
• Healthcare commissioning - we need to prove the efficacy of treatments to help determine the value of an intervention in the context of population healthcare.
It is an object of at least one aspect of the invention to provide improved clinical assessment. Aspects of the invention are set out in the independent claims and preferable features are set out in the dependent claims.
There is described herein a method of measuring changes in ocular features, the method comprising obtaining first measurement data of eye features of a user, derived from one or more images of the eye of the user captured at a first time; obtaining second measurement data of said eye features of a user, derived from one or more images of the user captured at a second time; comparing said first and second measurement data to produce a quantitative measure of change of appearance of the eye between said first and second times.
In this way, a more reliable, repeatable assessment is produced, reducing the variability associated with ruler measurements, which leads to inaccurate documentation of the change in appearance (i.e. the results of physical changes in the eye and surrounding region). Furthermore, the method is less invasive and more comfortable and convenient for a user/patient.
Preferably said one or more images are captured at a first location, and said comparing is performed at a second location, remote from the first location. In this way, a user or patient need not physically visit or be present at a clinic or clinician, and the assessment can be performed remotely, thus saving time and making the process more efficient for user/patient and clinician alike.
Preferably the measurement data is obtained by performing image processing to identify one or more predetermined features of the eye, and to derive one or more predetermined measurements based on the positions of the extracted features. Such image processing can be performed at the first location, or the second location, or the processing can be distributed between the two. The image processing may include techniques such as corner detection, edge detection, template matching, deformable parameterised shapes, and/or active contours.
In certain examples, image processing includes normalising the one or more images and/or measurements. This advantageously allows images (and/or the resulting measurements) to be sized/scaled, and possibly even rotated and/or transformed, to allow a better comparison to be made. Thus discrepancies due to variations in image capture (distance from lens, pose, focus, etc.) can be compensated for.
Although the method may be performed using 2D images, a 3D image or images may equally be used. In this case the measurements derived are advantageously based on 3D positions of extracted features. Also, while still images are envisaged, so too are video images or sequences.
Preferably, the method also comprises outputting the quantitative measure of change, which may be output to a clinician directly, to assist or guide in further actions or procedures, or which may be stored, or sent for further processing.
Preferably the method further comprises issuing one or more instructions to a user to instigate and/or guide image capture. This may comprise a series of instructions to ensure appropriate image capture, and may include directions to perform image capture again in the case that an image or images are determined to be unsuitable.
There is further described herein a diagnostic apparatus for measuring changes in ocular features comprising a communication module configured to communicate with a remote image capture device, and to receive from said remote capture device images of a user's eye captured at different points in time; and an image processing module configured to perform image processing on received captured images to identify one or more predetermined features of the eye, and derive one or more predetermined measurements based on the positions of the extracted features, and a comparison module configured to compare said measurements to produce a quantitative measure of change of appearance of the eye between said points in time.
There is still further described herein a diagnostic apparatus for measuring changes in ocular features comprising a communication module configured to communicate with a remote image capture device, and to receive from said remote capture device measurement data of eye features of a user, said measurement data derived from one or more images of the eye captured at different points in time; and a comparison module configured to compare said measurement data to produce a quantitative measure of change of appearance of the eye between said points in time.
There is yet further described herein a system comprising a local image capture device configured to capture one or more images of a user's eye at different points in time; and a processor, remote from said capture device, configured to obtain measurement data of the eye, derived from said one or more images at said different points in time, and to compare said measurement data to produce a quantitative measure of change of appearance of the eye between said points in time.
In examples said local image capture device is configured to perform image processing on captured images to identify one or more predetermined features of the eye, and derive one or more predetermined measurements based on the positions of the extracted features, and further configured to send said derived measurements to said processor.
In examples said local image capture device is configured to send captured images to said processor, and wherein said processor is configured to perform image processing on received captured images to identify one or more predetermined features of the eye, and derive one or more predetermined measurements based on the positions of the extracted features.
The invention also provides a computer program and a computer program product for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein, and a computer readable medium having stored thereon a program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
The invention extends to methods, apparatus and/or use substantially as herein described with reference to the accompanying drawings.
Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, features of method aspects may be applied to apparatus aspects, and vice versa.
Furthermore, features implemented in hardware may generally be implemented in software, and vice versa. Any reference to software and hardware features herein should be construed accordingly.
Preferred features of the present invention will now be described, purely by way of example, with reference to the accompanying drawings, in which:
Figure 1 illustrates the stages of an exemplary method of assessment;
Figure 2 is a schematic representation of a user's eye; and
Figure 3 represents an example of a system suitable for performing the method of Figure 1.
Referring to Figure 1, an image or images of a user's eye are captured at step 102. The capture may produce one or more still images, or a video sequence of images. An image may further be a 3D image, for example in the form of a series of 3D points mapping a surface representative of the eye. Such a 3D image may for example be obtained by using stereo imaging, or by using a structured light approach, projecting a pattern of light onto the eye region and detecting deformation in the projected pattern.
At step 104, the image or images are processed in order to extract or identify certain features. Prior to feature extraction, the image may be pre-processed, for example to enhance, scale or segment the image. The extracted features are typically a set of predetermined features used to define measurements which characterise the eye. Such features are discussed in greater detail with reference to Figure 2.
Many different techniques may be used for feature extraction. For example, basic techniques include corner and edge detection, and template matching. More sophisticated techniques may also be employed, such as deformable parameterised shapes and active contours. The result is that a set of distinguishable features of the eye is extracted from the image, in such a way that the spatial relationship between these features can be analysed. Most commonly, this is data representing the 2D or 3D positions in the image space of points, lines and/or curves representing the features.
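By way of illustration, the basic template-matching technique mentioned above can be sketched as an exhaustive normalised cross-correlation search. The toy image and patch below are invented for the example; a real system would search for a patch around a landmark such as the canthus or pupil margin:

```python
import numpy as np

def match_template(image, template):
    """Locate a small patch in a grayscale image by normalised
    cross-correlation, returning the (row, col) of the best match."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Toy 8x8 image with a distinctive diagonal feature at rows 3-4, cols 4-5.
img = np.zeros((8, 8))
img[3, 4] = 1.0
img[4, 5] = 1.0
patch = img[3:5, 4:6]               # the "template" to search for
print(match_template(img, patch))   # → (3, 4)
```

Production systems would normally use an optimised library routine rather than this O(n^4) loop, but the scoring function is the same idea.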
In step 106, measurements of the eye are made, based on the extracted features. Again, the specific measurements are discussed in relation to Figure 2. Typically, straight-line lengths between extracted spatial points are determined. Such measurements can be obtained simply by computation given the spatial position data of the extracted features. However, other measures, such as ratios of distances or curvatures, may be determined in more advanced embodiments.
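A sketch of such a measurement computation, assuming hypothetical extracted feature coordinates (the pixel values below are invented for illustration):

```python
import math

def distance(p, q):
    """Straight-line (Euclidean) length between two feature points;
    works for 2D or 3D coordinates alike."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Hypothetical extracted features, as (x, y) image coordinates in pixels.
pupil_centre     = (120.0, 150.0)
upper_lid_margin = (120.0, 118.0)
lower_lid_margin = (120.0, 205.0)

mrd1 = distance(pupil_centre, upper_lid_margin)   # 32.0 px
mrd2 = distance(pupil_centre, lower_lid_margin)   # 55.0 px
ratio = mrd1 / mrd2    # a ratio of distances is unaffected by image scale
```

Note that the ratio is one of the "more advanced" measures: being dimensionless, it needs no scaling between captures.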
The result of steps 102 to 106 is a set of measurements characteristic of a user's eye at a point in time, which allows an objective comparison to be made of the change of configuration or appearance of the eye when compared to corresponding measurements at another point in time. Such comparison is performed at step 108. Here two or more sets of measurements are compared to derive a quantitative assessment of change.
In the case of a single still image, a set of measurements corresponds to a distinct point in time; however, more representative measurements may be taken from a series of images, or a video of the eye over a period of a few seconds or minutes. This would allow multiple readings of a particular measurement to be taken, allowing for averaging. It would also allow a user to blink naturally, which may afford different or improved measurement.
In a single image the user is preferably looking straight ahead; however, multiple images or video may allow for a user to change pose or eye direction, possibly following a prescribed set of movements, again allowing more data to be gathered. In one case the image capture may be performed during a video consultation with a clinician. During the consultation, instructions can be given to the user, such as to adjust the position or pose of the head, change the direction of gaze, and blink.
In order to provide a true comparison, the measurements are scaled or normalised if necessary before comparison. Such normalisation may be performed during steps 102, 104 and/or 106. Alternatively, normalisation may occur at the point of comparison in step 108. For example, normalisation may be performed by reference to one or more features or measurements of the eye which should be time invariant (for example, iris size is generally consistent). It may also be possible to normalise images using the extracted features, with some features expected to be broadly consistent (even if others are not). Scaling or transforming to minimise the sum of differences in features, or in a selected subset of features, may give acceptable results. In some cases, and especially in the case of 3D imaging, it may be possible to determine the pose/position of the camera relative to the face, and therefore normalisation may be by geometric transformation.
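The "minimise the sum of differences" scaling described above has a simple least-squares closed form. A minimal sketch, assuming paired measurements of nominally time-invariant features (the values are illustrative, not from the patent):

```python
def optimal_scale(reference, current):
    """Least-squares scale s minimising sum((s*current_i - reference_i)^2)
    over paired feature measurements.
    Closed form: s = dot(reference, current) / dot(current, current)."""
    num = sum(r * c for r, c in zip(reference, current))
    den = sum(c * c for c in current)
    return num / den

# Illustrative paired measurements of stable features (e.g. iris diameter
# first). The follow-up capture was taken closer to the camera, so every
# value is 1.25x larger than at baseline.
baseline  = [11.8, 30.0, 4.2]
follow_up = [14.75, 37.5, 5.25]
s = optimal_scale(baseline, follow_up)    # 0.8 (= 1/1.25)
normalised = [s * v for v in follow_up]   # now comparable to baseline
```

The same idea extends to full 2D/3D similarity transforms (scale plus rotation and translation) when pose also varies between captures.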
Measurements of a user's eye as set out above are performed at multiple different points in time to determine changes. Typically this will be before and after a surgical procedure; however, measurements may also be taken at other times to monitor for changes in the absence of surgery. A reference to a point in time should be understood as indicating, typically, a particular day or date: as explained above, a single set of measurements may be based on a video or sequence of images of the eye lasting a few seconds, or several minutes, for example.
The form of comparison may vary according to the type and number of measurements. In a simple example, an absolute or a percentage change in a distance will typically be used. However, where multiple measurements are available, averaged changes may be used, or indeed multiple numerical outputs or combinations of such outputs may be used to assess a change. A score may be determined for a particular purpose or aspect by combining sets of selected measurements, e.g. as a weighted average. Different scores may then be used for different purposes.
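A weighted-average score of the kind described can be sketched as follows; the measurement names, values and weights are all illustrative assumptions, not real patient data:

```python
def change_score(before, after, weights):
    """Percentage change per measurement, combined into a single
    weighted-average score as suggested above."""
    pct = {k: 100.0 * (after[k] - before[k]) / before[k] for k in before}
    score = sum(weights[k] * pct[k] for k in weights) / sum(weights.values())
    return score, pct

# Illustrative pre-/post-operative values in mm.
before = {"MRD1": 1.5, "pretarsal_show": 2.0, "brow_show": 14.0}
after  = {"MRD1": 3.0, "pretarsal_show": 4.0, "brow_show": 15.4}
weights = {"MRD1": 2.0, "pretarsal_show": 1.0, "brow_show": 1.0}  # emphasise MRD1

score, changes = change_score(before, after, weights)
# changes: MRD1 +100%, pretarsal_show +100%, brow_show +10%
# score: (2*100 + 1*100 + 1*10) / 4 = 77.5
```

Different weight vectors over the same measurement set would yield the different purpose-specific scores mentioned above.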
The output or outputs of the comparison are produced at step 110. These may be in the form of a standardised parameter or set of parameters, and may be provided to a clinician. The output may be used to guide further courses of action, such as the method of surgical intervention, or an assessment of surgical success.
Considering Figure 2, there is shown a schematic image of a human eye, or ocular region. The pupil is shown at 226, and the iris at 228. The upper eyelid margin is indicated at 224. Reference 222 represents the upper eyelid skin fold, while the eyebrow is shown at 220. Each of these features may be extracted from a real image of an eye as discussed above.
Measurements 212 to 218 are described with reference to these features. The Margin Reflex Distance 1 (MRD1) is the distance from the centre of the pupil to the upper lid margin, and is shown as 216. The Margin Reflex Distance 2 (MRD2) is the distance from the centre of the pupil to the lower lid margin, and is shown as 218. The upper lid skin show (pretarsal skin show) 214 is the distance from the upper lid margin to the upper eyelid skin fold, and the brow skin show 212 is the distance from the upper eyelid skin fold to the lower edge of the eyebrow.
The upper lid skin show 214 and brow skin show 212 can be measured along the mid pupil line 232, and can also be measured along the medial limbal line 230 and the lateral limbal line 234. Measurements at these three positions allow an average to be taken, and also allow an assessment of contour.
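The three-line sampling can be sketched as below. The coordinate values are illustrative, and the particular contour measure (central value minus the mean of the peripheral values) is one possible choice, not a definition taken from the patent:

```python
def skin_show_profile(margin_y, fold_y):
    """Pretarsal skin show sampled along the medial limbal, mid pupil and
    lateral limbal lines (Figure 2 refs 230, 232, 234). Inputs map line
    name -> vertical coordinate of the lid margin / skin fold on that line.
    Returns per-line show, the average, and one simple contour measure."""
    lines = ("medial", "mid_pupil", "lateral")
    show = {ln: fold_y[ln] - margin_y[ln] for ln in lines}
    average = sum(show.values()) / len(show)
    contour = show["mid_pupil"] - (show["medial"] + show["lateral"]) / 2.0
    return show, average, contour

# Illustrative vertical coordinates in mm at the three sampling lines.
margin = {"medial": 10.0, "mid_pupil": 9.0, "lateral": 10.0}
fold   = {"medial": 13.0, "mid_pupil": 13.5, "lateral": 13.2}
show, avg, contour = skin_show_profile(margin, fold)
# show per line: medial 3.0, mid_pupil 4.5, lateral 3.2; avg ≈ 3.57
```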
Figure 3 illustrates the main features of a system for implementing the method described with respect to Figures 1 and 2. A local device 304 is used to acquire data of a user's eye. This will typically be a camera or other imaging device, most commonly a smartphone, although a dedicated hardware device could be used if desired. A user may be able to take measurements, e.g. using a smartphone, without having to visit a facility such as a hospital or clinic, and may be able to do so alone, without being in the presence of a medical professional (although a virtual presence may be possible in the case of a video consultation as mentioned above). In some cases, the local device 304 is able to provide data indicative of the user's eye to a remote processor 306, typically via the internet, but any suitable communication channel may be employed. The remote processor may be a server or any general purpose computer, for example. In embodiments, the processor is able to provide data to the local device, such as in the example of a two-way video consultation as discussed above. Also, prompts or instructions associated with the image capture process may be sent to the local device. A database 308 may be provided for storing user eye data and/or outputs of the processor 306. In some embodiments, the database 308 may be integral with the processor 306, as indicated by the dashed line.
Image capturing step 102 of Figure 1 is performed at the local device; subsequent processing, however, may be distributed between the local device and the remote processor. In one example, the image is sent to the remote processor 306, which performs feature extraction and determines measurements. In other embodiments, feature extraction and measurement are performed at the local device, and only a measurement data set is sent to the remote processor. Alternatively, the processing may be split, with some parts performed locally and some remotely. It is even conceivable that the whole method could be performed on the local device, including the comparison and output steps. The choice of how to distribute the processing, between more server- or cloud-based and more edge-based approaches, depends on factors such as local processing power, data privacy, and operational and clinical considerations. When data is sent between the local and remote devices and/or stored, it may be encrypted to safeguard user privacy.
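As one concrete (hypothetical) form for the measurement data set sent from the local device after edge-side feature extraction, the derived measurements could be serialised as JSON before encryption and transmission. Every field name below is invented for the sketch, not specified by the patent:

```python
import json

# Hypothetical measurement payload assembled on the local device.
payload = {
    "user_id": "anonymised-1234",
    "captured_at": "2018-07-20T10:30:00Z",
    "eye": "right",
    "measurements_mm": {
        "MRD1": 3.1,
        "MRD2": 5.4,
        "pretarsal_show": 4.0,
        "brow_show": 15.2,
    },
    "normalisation": {"reference": "iris_diameter", "value_mm": 11.8},
}

encoded = json.dumps(payload).encode("utf-8")
# In deployment the byte string would be encrypted in transit (e.g. over
# TLS) and possibly at rest, as noted above.
decoded = json.loads(encoded.decode("utf-8"))
assert decoded == payload   # lossless round trip
```

Sending only such a compact measurement set, rather than raw images, reduces both bandwidth and the privacy exposure of the transmitted data.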
The above embodiments and examples are to be understood as illustrative examples. Further embodiments, aspects or examples are envisaged. It is to be understood that any feature described in relation to any one embodiment, aspect or example may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the embodiments, aspects or examples, or any combination of any other of the embodiments, aspects or examples. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the invention, which is defined in the accompanying claims.

Claims (18)

1. A method of measuring changes in ocular features, the method comprising: obtaining first measurement data of eye features of a user, derived from one or more images of the eye of the user captured at a first time;
obtaining second measurement data of said eye features of a user, derived from one or more images of the user captured at a second time;
comparing said first and second measurement data to produce a quantitative measure of change of appearance of the eye between said first and second times.
2. A method according to claim 1, wherein said one or more images are captured at a first location, and wherein said comparing is performed at a second location, remote from the first location.
3. A method according to claim 1 or claim 2, wherein said measurement data is obtained by performing image processing to identify one or more predetermined features of the eye, and derive one or more predetermined measurements based on the positions of the extracted features.
4. A method according to claim 3, wherein said image processing is performed at said first location.
5. A method according to claim 3, wherein said image processing is performed at said second location.
6. A method according to any one of claims 3 to 5, wherein said image processing includes at least one of corner detection, edge detection, template matching, deformable parameterised shapes, and/or active contours.
7. A method according to any one of claims 3 to 6, wherein said image processing includes normalising said one or more images and/or said measurements.
8. A method according to any preceding claim, wherein said one or more images comprises a 3D image.
9. A method according to claim 8, wherein said measurements are based on 3D positions of extracted features.
10. A method according to any preceding claim, wherein said one or more images comprises a video image.
11. A method according to any preceding claim, further comprising outputting said quantitative measure of change.
12. A method according to any preceding claim, further comprising issuing one or more instructions to a user to instigate and/or guide image capture.
13. A diagnostic apparatus for measuring changes in ocular features comprising:
a communication module configured to communicate with a remote image capture device, and to receive from said remote capture device images of a user's eye captured at different points in time; and an image processing module configured to perform image processing on received captured images to identify one or more predetermined features of the eye, and derive one or more predetermined measurements based on the positions of the extracted features, and a comparison module configured to compare said measurements to produce a quantitative measure of change of appearance of the eye between said points in time.
14. A diagnostic apparatus for measuring changes in ocular features comprising:
a communication module configured to communicate with a remote image capture device, and to receive from said remote capture device measurement data of eye features of a user, said measurement data derived from one or more images of the eye captured at different points in time; and a comparison module configured to compare said measurement data to produce a quantitative measure of change of appearance of the eye between said points in time.
15. A diagnostic apparatus according to claim 13 or claim 14, wherein said communication module is further configured to send instructions to said remote capture device to initiate and/or control image capture.
16. A system comprising:
a local image capture device configured to capture one or more images of a user's eye at different points in time; and a processor, remote from said capture device, configured to obtain measurement data of the eye, derived from said one or more images at said different points in time, and to compare said measurement data to produce a quantitative measure of change of appearance of the eye between said points in time.
17. A system according to claim 16, wherein said local image capture device is configured to perform image processing on captured images to identify one or more predetermined features of the eye, and derive one or more predetermined measurements based on the positions of the extracted features, and further configured to send said derived measurements to said processor.
18. A system according to claim 16, wherein said local image capture device is configured to send captured images to said processor, and wherein said processor is configured to perform image processing on received captured images to identify one or more predetermined features of the eye, and derive one or more predetermined measurements based on the positions of the extracted features.
GB1811923.0A 2018-07-20 2018-07-20 Ocular assessment Withdrawn GB2576139A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1811923.0A GB2576139A (en) 2018-07-20 2018-07-20 Ocular assessment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1811923.0A GB2576139A (en) 2018-07-20 2018-07-20 Ocular assessment

Publications (2)

Publication Number Publication Date
GB201811923D0 GB201811923D0 (en) 2018-09-05
GB2576139A true GB2576139A (en) 2020-02-12

Family

ID=63364429

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1811923.0A Withdrawn GB2576139A (en) 2018-07-20 2018-07-20 Ocular assessment

Country Status (1)

Country Link
GB (1) GB2576139A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2497409A1 (en) * 2011-03-10 2012-09-12 Canon Kabushiki Kaisha Optical tomographic image photographing apparatus and control method therefor
WO2016109841A1 (en) * 2014-12-31 2016-07-07 Morphotrust Usa, Llc Detecting facial liveliness
WO2016172532A2 (en) * 2015-04-23 2016-10-27 Bd Kiestra B.V. A method and system for automated microbial colony counting from streaked sample on plated media
WO2016189711A1 (en) * 2015-05-27 2016-12-01 糧三 齋藤 Stress evaluation program for mobile terminal and mobile terminal provided with program
US20170032525A1 (en) * 2014-04-07 2017-02-02 Mimo Ag Method for the analysis of image data representing a three-dimensional volume of biological tissue
US20170172405A1 (en) * 2015-12-02 2017-06-22 Nidek Co., Ltd. Ophthalmologic information processing apparatus and ophthalmologic information processing method
US20180064336A1 (en) * 2016-09-07 2018-03-08 Nidek Co., Ltd. Ophthalmic analysis apparatus and ophthalmic analysis method
WO2018064408A1 (en) * 2016-09-29 2018-04-05 Flir Systems, Inc. Fail-safe detection using thermal imaging analytics
EP3335621A1 (en) * 2016-12-16 2018-06-20 Tomey Corporation Ophthalmic apparatus


Also Published As

Publication number Publication date
GB201811923D0 (en) 2018-09-05


Legal Events

Date Code Title Description
WAP: Application withdrawn, taken to be withdrawn or refused after publication under section 16(1)