WO2008062528A1 - Fundus image analyzer (Analyseur d'image de fond d'œil) - Google Patents

Fundus image analyzer

Info

Publication number
WO2008062528A1
WO2008062528A1 (PCT/JP2006/323413; JP2006323413W)
Authority
WO
WIPO (PCT)
Prior art keywords
fundus image
image
fundus
luminance distribution
distribution information
Prior art date
Application number
PCT/JP2006/323413
Other languages
English (en)
Japanese (ja)
Inventor
Enrico Grisan
Alfredo Ruggeri
Massimo De Luca
Original Assignee
Nidek Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nidek Co., Ltd. filed Critical Nidek Co., Ltd.
Priority to PCT/JP2006/323413 priority Critical patent/WO2008062528A1/fr
Publication of WO2008062528A1 publication Critical patent/WO2008062528A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes

Definitions

  • the present invention relates to a fundus image analysis apparatus that analyzes a fundus image.
  • conventionally, an apparatus for analyzing a fundus image obtained by a fundus camera or the like is known (see, for example, Patent Document 1 and Patent Document 2).
  • Such an analysis device detects an optic disc using image processing technology and analyzes the state of the optic disc.
  • Patent Document 1: JP-A-9-313447
  • Patent Document 2: Japanese Patent Application Laid-Open No. 11-151206
  • the present invention is characterized by having the following configuration.
  • Storage means for storing a photographed fundus image; mask means for specifying and masking a blood vessel portion from the stored fundus image by image processing; luminance distribution acquisition means for obtaining luminance information for all pixels in the fundus image other than those masked by the mask means, thereby obtaining luminance distribution information of the entire fundus image; image analysis means for analyzing the fundus image by comparing the luminance distribution information acquired by the luminance distribution acquisition means with reference luminance distribution information obtained in advance; and notification means for notifying a result analyzed by the image analysis means.
  • the image analysis means performs analysis using a neural network.
  • the image analysis unit compares the luminance distribution information with the reference luminance distribution information, divides the luminance distribution information into three zones, a dark color zone, a gray zone, and a light color zone, and analyzes at least one of the dark color zone and the light color zone using a neural network.
  • the image analysis unit analyzes the fundus image by comparing luminance distribution information of the entire fundus image with reference luminance distribution information obtained in advance. Then, the entire fundus image is further divided into predetermined sectors, and analysis is performed on sectors other than the portion masked by the mask means.
  • the fundus image is an image photographed with green illumination light.
  • the image analysis means obtains the presence or absence of the onset of diabetic retinopathy by the analysis.
  • FIG. 1 is a diagram showing a schematic configuration of a fundus imaging apparatus in the present embodiment.
  • FIG. 2 is a diagram showing an optical system of a fundus imaging apparatus in the present embodiment.
  • FIG. 3 is a diagram showing a configuration of a focus chart.
  • FIG. 4 is a diagram showing an anterior segment image in which an alignment index is formed.
  • FIG. 5A is a diagram for explaining focus adjustment of the fundus imaging apparatus in the present embodiment.
  • FIG. 5B is a diagram for explaining focus adjustment of the fundus imaging apparatus in the present embodiment.
  • FIG. 6 is a flow chart for analyzing the presence or absence of diabetic retinopathy using the neural network.
  • FIG. 7 is a schematic diagram showing a fundus image of a patient's eye developing diabetic retinopathy.
  • FIG. 8 is a diagram showing luminance distribution information of the entire fundus image after removing blood vessels and optic discs.
  • FIG. 9 is a diagram showing a state in which the fundus image is divided into local luminance distribution information.
  • FIG. 1 is a schematic diagram showing a schematic configuration of the fundus imaging apparatus of the present embodiment.
  • the fundus imaging apparatus 1 includes a control unit 2 having a CPU and the like, a monitor 3 for displaying various information, an instruction input unit 4 for performing various settings, a storage unit 5 for storing fundus images and the like, an image analysis unit 6 that analyzes stored fundus images using neural network technology, an output unit 7 for outputting the analysis results, a photographing unit 100 having an optical system for photographing the fundus of the eye E, and a drive unit 101 for driving the photographing unit 100 in the front-back, up-down, and left-right directions (XYZ directions) with respect to the eye E.
  • Reference numeral 8 denotes a photographing window, and the fundus of the subject eye E is photographed by the photographing unit 100 inside the apparatus 1 by positioning the subject eye E in the photographing window 8.
  • the monitor 3, instruction input unit 4, storage unit 5, image analysis unit 6, output unit 7, photographing unit 100 (light source, light receiving element, etc.), and drive unit 101 are electrically connected to the control unit 2, and their drive is controlled by command signals from the control unit 2.
  • FIG. 2 is a diagram illustrating a configuration of an optical system included in the photographing unit 100.
  • the optical system for illuminating the eye to be examined includes a light source 10 that emits infrared light for fundus illumination, a light source 11 that emits flash light in the visible range for photographing the fundus, and a visible light that reflects infrared light. It consists of a dichroic mirror 12 that transmits light, a collimator lens 13, a focus chart 14, a condenser lens 15, a ring slit 16 having a ring-shaped opening, a mirror 17, a relay lens 18, and a half mirror 19.
  • the light source 10 and the light source 11 have a conjugate relationship with the pupil of the eye E to be examined.
  • any light source that emits visible light can be used as the light source 11; in the present embodiment, the light source 11 emits green monochromatic light in order to enhance the blood vessel image in the fundus image to be photographed.
  • a ring-shaped chart 14b having a predetermined size is formed on a filter 14a that transmits visible light and infrared light.
  • This chart 14b is formed by a coating process that transmits visible light but does not transmit infrared light.
  • the focus chart 14 is moved along the optical axis together with a focusing lens 23 described later by the driving means 102, and forms a ring image on the fundus of the eye E as an index at the time of focus.
  • the ring slit 16 is placed at a position conjugate with the pupil of the eye E through the relay lens 18.
  • Infrared light emitted from the light source 10 is reflected by the dichroic mirror 12 and then illuminates the focus chart 14 from behind via the collimator lens 13.
  • the infrared light that has passed through the focus chart 14 illuminates the ring slit 16 through the condenser lens 15.
  • Infrared light that has passed through the ring slit 16 is reflected by the mirror 17, passes through the relay lens 18, is reflected by the half mirror 19, forms an image at the pupil of the eye E, and then illuminates the fundus while forming a ring image that serves as an index for focusing.
  • visible light (in this embodiment, green monochromatic light) emitted from the light source 11 passes through the dichroic mirror 12 and then follows the same optical path as the infrared light from the light source 10 described above to illuminate the fundus of the eye E. Since the chart 14b formed on the focus chart 14 transmits visible light, the visible light emitted from the light source 11 uniformly illuminates the fundus without forming a ring image on the fundus of the eye E.
  • the optical system for photographing the fundus of the eye E includes, in order from the eye E side, the half mirror 19, an objective lens 20, a dichroic mirror 21, an aperture 22, a focusing lens 23, an imaging lens 24, a half mirror 25, and a two-dimensional light receiving element 26.
  • the two-dimensional light receiving element 26 has a conjugate relationship with the fundus of the eye E to be examined.
  • the diaphragm 22 is disposed at a position conjugate with the pupil of the eye E through the objective lens 20.
  • the focusing lens 23 is moved along the optical axis together with the focus chart by the driving means 102.
  • the reflected light from the fundus of the illumination light emitted by the light source 10 or 11 passes through the half mirror 19 and the objective lens 20 and forms an image, and is then received by the two-dimensional light receiving element 26 through the dichroic mirror 21, the aperture 22, the focusing lens 23, the imaging lens 24, and the half mirror 25.
  • Reference numerals 32a and 32b denote light sources for projecting alignment indices for detection in the up / down / left / right and front / rear directions from the front of the eye E and for illuminating the anterior eye portion of the eye E.
  • the light sources 32a and 32b are a pair of rectangular LEDs arranged symmetrically with respect to the photographing optical axis L1, and emit infrared light having a wavelength different from that of the light source 10 described above.
  • the light sources 32a and 32b project a finite-distance index (a rectangular index extending in a direction perpendicular to the optical axis) by a divergent light beam at a predetermined projection angle toward the cornea of the eye E, and illuminate the entire anterior segment.
  • the optical system for photographing the anterior segment of the eye E consists of the half mirror 19, the objective lens 20, and the dichroic mirror 21 shared with the fundus photographing optical system, and a field lens 28, a mirror 29, an imaging lens 30, and a two-dimensional light receiving element 31 arranged in the reflection direction of the dichroic mirror 21.
  • the two-dimensional light receiving element 31 has a conjugate relationship with the pupil of the eye E to be examined.
  • the dichroic mirror 21 transmits visible light and the infrared light from the light source 10, and reflects the infrared light emitted from the light sources 32a and 32b.
  • reference numeral 27 denotes a fixation lamp that emits visible light, placed in the reflection direction of the half mirror 25.
  • at the time of photographing, the subject's face is brought close to the apparatus, and the eye E to be imaged is positioned at the photographing window 8.
  • the control unit 2 turns on one of the nine fixation lamps 27 (in this case, the central fixation lamp located on the optical axis) to make the eye E fixate on it.
  • the control unit 2 turns on the light sources 32a and 32b, causes the two-dimensional light receiving element 31 to receive an anterior segment image of the eye E, and, based on the light reception result, aligns the apparatus (imaging unit 100) with the eye E (alignment).
  • FIG. 4 is a schematic diagram showing an anterior segment image received by the two-dimensional light receiving element 31.
  • the anterior segment of the eye E is illuminated, and rectangular alignment indexes 33L and 33R as shown are projected onto the cornea of the eye E.
  • the control unit 2 specifies the pupil P by image processing of the anterior segment image received by the two-dimensional light receiving element 31 and obtains the center of the specified pupil P. The control unit 2 also obtains, by image processing of the anterior segment image, the intermediate position M between the alignment index images 33L and 33R.
  • the alignment state of the imaging unit 100 in the up/down/left/right directions with respect to the eye E is detected from the positional relationship between the obtained pupil center and the intermediate position M, and the alignment state in the front/rear direction is detected by comparing the image interval between the alignment indexes 33L and 33R with a predetermined value. Since the alignment indexes 33L and 33R are projections of a finite-distance index, their image interval changes with a change in the front-rear distance between the eye E and the imaging unit 100. In the present embodiment, the interval between the alignment indexes corresponding to the appropriate alignment distance in the front-rear direction between the eye E and the imaging unit 100 is obtained in advance as a predetermined value and stored in the storage unit 5.
  • the control unit 2 obtains distance information for moving the photographing unit based on the intermediate position M obtained from the index images formed by the light sources 32a and 32b and the pupil center obtained from the anterior segment image, drives the drive unit 101 so that the two coincide, and moves the whole photographing unit 100 in the vertical and horizontal directions to perform alignment. In addition, the control unit 2 drives the drive unit 101 so that the interval between the alignment indexes 33L and 33R becomes the predetermined interval (predetermined value), and moves the entire imaging unit 100 in the front-rear direction with respect to the eye E. When each alignment state falls within a predetermined allowable range, the control unit determines that alignment is complete.
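The alignment logic above can be sketched as follows; the function name and sign conventions are illustrative assumptions, not from the patent. The XY offset is the vector from the midpoint M of the two corneal index images to the pupil center P, and the front-rear error is the deviation of the index-image spacing from the stored reference value.

```python
def alignment_offsets(pupil_center, index_left, index_right, ref_spacing):
    """Return (dx, dy, dz_error) stage corrections (illustrative convention).

    pupil_center, index_left, index_right: (x, y) positions in the
    anterior-segment image; ref_spacing: stored index interval at the
    proper working distance.
    """
    # Midpoint M of the two alignment index images 33L / 33R.
    mx = (index_left[0] + index_right[0]) / 2.0
    my = (index_left[1] + index_right[1]) / 2.0
    dx = pupil_center[0] - mx            # left/right offset
    dy = pupil_center[1] - my            # up/down offset
    # Front/rear state: index spacing vs. the predetermined value.
    spacing = abs(index_right[0] - index_left[0])
    dz_error = spacing - ref_spacing     # 0 means correct working distance
    return dx, dy, dz_error
```

Driving the unit until all three values fall within the allowable range corresponds to the completion check described above.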
  • when alignment is completed, the control unit 2 turns off the light sources 32a and 32b, turns on the light source 10 for fundus illumination, irradiates the fundus of the eye E with infrared light, and receives the reflected light with the two-dimensional light receiving element 26 to obtain a fundus image.
  • FIG. 5A is a schematic view showing a fundus image received by the two-dimensional light receiving element 26. 200 is an index projected onto the fundus by the focus chart 14.
  • the control unit 2 sets a line 210 passing through the index 200 and extracts, from the luminance information on the set line 210, the luminance information corresponding to the index 200.
  • FIG. 5B is a schematic diagram showing luminance information 220 on the set line 210.
  • the vertical axis indicates the luminance value
  • the horizontal axis indicates the position
  • 200 ′ indicates the luminance information corresponding to the index 200.
  • luminance information corresponding to other parts such as blood vessels in the fundus is excluded.
  • if the fundus is out of focus, the image of the index 200 projected onto the fundus is blurred, and the corresponding luminance information becomes low and broad, as shown by the dotted line in FIG. 5B.
  • the control unit 2 detects the luminance information 220 corresponding to the index 200 and, based on this luminance information, moves the focus chart 14 and the focusing lens 23 in conjunction with each other using the driving means 102 so that the peak reaches the maximum height L1 and the narrowest width W1.
  • as described above, the index 200 is projected onto the fundus using the focus chart 14, and the focus is adjusted based on the light receiving state (luminance information) of the index 200.
  • the focus adjustment is not limited to this; a certain site such as a blood vessel may be extracted from the captured fundus image, and the focus may be adjusted based on the luminance information of that site.
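A focus metric consistent with the peak-height/width criterion above can be sketched as follows (illustrative code; the patent describes the criterion but not an algorithm). Sharper focus yields a taller, narrower luminance peak along the line 210:

```python
def focus_metric(profile):
    """Peak height and width-at-half-maximum of the index's luminance
    profile along the set line; best focus maximizes the height and
    minimizes the width.
    """
    peak = max(profile)
    half = peak / 2.0
    # Indices whose luminance is at least half the peak value.
    above = [i for i, v in enumerate(profile) if v >= half]
    width = above[-1] - above[0] + 1 if above else 0
    return peak, width
```

A blurred index gives a lower peak and a larger width, so the drive loop would move the focusing lens toward the position that improves both values.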
  • when the focus adjustment is completed, the control unit 2 turns off the light source 10 and flashes the light source 11 to illuminate the fundus with visible light (green monochromatic light in this embodiment).
  • the reflected light from the fundus passes through the half mirror 19 and the objective lens 20 and forms an image.
  • the light is received by the two-dimensional light receiving element 26 through the dichroic mirror 21, the aperture 22, the focusing lens 23, the imaging lens 24, and the half mirror 25.
  • the control unit 2 stores the obtained fundus image in the storage unit 5 as a fundus image of the eye E, sequentially turns on the other fixation lamps 27, and obtains a plurality of fundus images of the same eye E in the same manner.
  • the ophthalmologic photographing apparatus also serves as a fundus image analyzer that analyzes a photographed fundus image using a neural network and can determine the presence or absence of the onset of diabetic retinopathy (hereinafter simply referred to as DR).
  • the fundus images obtained by sequentially lighting nine fixation lamps are joined together by an existing image processing technique to form a single fundus image.
  • here is an example of analysis using the fundus image obtained by the presentation of a single fixation lamp.
  • the image analysis unit 6 extracts the fundus image 300 as shown in FIG. 7 stored in the storage unit 5.
  • a blood vessel 301 and an optic nerve head 302 (which may not be photographed depending on the position of the fixation lamp) are photographed.
  • a dark portion 303 caused by bleeding and a bright portion 304 called cotton wool spots are photographed.
  • the image analysis unit 6 uses image processing technology to extract the portions of the fundus image that interfere with the subsequent analysis, such as blood vessels and optic nerve heads, removes (masks) the corresponding pixels, counts all remaining pixels into 256 luminance levels (0 to 255), and obtains luminance distribution information of the entire fundus image.
  • since the image is taken using green monochromatic light, red portions such as blood vessels on the fundus are photographed dark, which makes it easier to extract blood vessels and dark portions in the subsequent processing.
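A minimal sketch of this masked-histogram step (function and variable names are illustrative, not from the patent): all unmasked pixels of the green-channel fundus image are counted into 256 luminance bins.

```python
import numpy as np

def masked_histogram(gray, mask):
    """256-bin luminance histogram over unmasked pixels only.

    gray: 2-D uint8 fundus image (green channel);
    mask: boolean array, True where blood-vessel / optic-disc pixels
    were removed by the mask means.
    """
    vals = gray[~mask]                       # keep only unmasked pixels
    hist, _ = np.histogram(vals, bins=256, range=(0, 256))
    return hist
```

The resulting histogram is the "luminance distribution information of the entire fundus image" compared against the reference distribution of healthy eyes.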
  • FIG. 8 is a diagram showing luminance distribution information (global luminance distribution information) of the entire fundus image after removing the blood vessel 301 and the optic disc 302.
  • the horizontal axis is the brightness value from 0 to 255
  • the vertical axis indicates the number of pixels.
  • a solid line 310 indicates luminance distribution information based on the fundus image 300
  • a dotted line 320 indicates luminance distribution information of a healthy person obtained in advance, and indicates reference luminance distribution information.
  • the healthy person's luminance distribution information (dotted line) 320 is determined in advance by quantitatively obtaining luminance distribution information from the fundus images of a plurality of healthy persons.
  • the image analysis unit 6 divides the luminance distribution into three zones, a dark color zone, a gray zone, and a light color zone, at the boundaries where the luminance distribution information (solid line) 310 to be analyzed and the luminance distribution information 320 of the healthy person intersect.
  • the image analysis unit 6 analyzes the luminance information of the portion corresponding to the dark zone in the luminance distribution information 310 using the neural network.
  • the gray zone and the light color zone are also analyzed using the neural network, and it is determined whether or not DR is indicated based on these output results.
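The three-zone split at the intersection points can be sketched as follows (illustrative code; for simplicity it assumes exactly one crossing on each side of the gray zone):

```python
def split_zones(subject, reference):
    """Split luminance levels into dark / gray / light zones at the
    points where the subject histogram crosses the reference one.

    subject, reference: per-level pixel counts of equal length.
    Returns three ranges of luminance levels.
    """
    diff = [s - r for s, r in zip(subject, reference)]
    # Indices where the sign of (subject - reference) flips.
    cross = [i for i in range(1, len(diff))
             if (diff[i - 1] > 0) != (diff[i] > 0)]
    lo, hi = cross[0], cross[-1]
    return range(0, lo), range(lo, hi), range(hi, len(diff))
```

Each zone's pixels are then summarized into feature data (counts, extrema, peak-relative counts) and fed to the neural network.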
  • the neural network of the present embodiment is a feed-forward network composed of three layers, an input layer, an intermediate layer, and an output layer, and is trained using the backpropagation method with learning data (teacher signals) given as input-output pairs.
  • Input data includes, for example, the total number of pixels in each zone, maximum (minimum) luminance information, the difference in the number of pixels between the gray and dark (light) zones, the number of pixels relative to the peak value in each zone, the bottom in each zone Feature data required for DR determination, such as the number of pixels for the value, is given.
  • the weighting parameters of the network are systematically adjusted until an acceptable response is achieved.
  • the network is trained on all of the pre-prepared training data and then tested on an independent test dataset, and is refined until it correctly classifies both the training and test sets.
  • the tolerance of both is set to a low value so that higher accuracy can be realized.
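A toy version of such a three-layer feed-forward network trained by backpropagation might look like the following (a minimal sketch with illustrative hyperparameters; the patent does not disclose the actual architecture or training details):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyNet:
    """Input -> hidden -> output feed-forward network trained by
    plain backpropagation on a squared-error loss."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        self.y = sigmoid(self.h @ self.W2 + self.b2)
        return self.y

    def train_step(self, X, t):
        """One full-batch backpropagation step; returns the loss
        before the weight update."""
        y = self.forward(X)
        # Output and hidden deltas use the sigmoid derivative s*(1-s).
        d_out = (y - t) * y * (1 - y)
        d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * self.h.T @ d_out
        self.b2 -= self.lr * d_out.sum(axis=0)
        self.W1 -= self.lr * X.T @ d_hid
        self.b1 -= self.lr * d_hid.sum(axis=0)
        return float(((y - t) ** 2).mean())
```

In the embodiment, the inputs would be the per-zone feature data listed above and the output a DR/non-DR indication; here the network is only exercised on a toy task.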
  • FIG. 9 shows a diagram in which the fundus image is divided into local luminance distribution information.
  • the image analysis unit 6 masks the pixels corresponding to blood vessels and optic nerve heads on the fundus image and divides the fundus image into small regions 400 (sectors).
  • for each sector 400 in which no masked pixel exists, the image analysis unit 6 classifies the pixels 401 in the sector as dark, gray, or light and counts them. If the number of dark or light pixels exceeds a predetermined number, the sector 400 is counted as an abnormal sector 400, because it is likely to indicate DR. All sectors 400 containing no masked pixel are analyzed in the same way, and abnormal sectors 400 are detected.
  • the image analysis unit 6 analyzes the dark-pixel portions and the light-pixel portions of the sectors 400 determined to be abnormal, in the same manner as described above using the neural network, and determines for each abnormal sector 400 whether or not it indicates DR based on the output result.
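The per-sector screening described above can be sketched as follows (illustrative labels and threshold; the patent does not specify the sector size or the predetermined number):

```python
def abnormal_sectors(labels, sector_size, threshold):
    """Flag sectors whose dark or light pixel count exceeds threshold.

    labels: 2-D list of per-pixel labels 'dark', 'gray', 'light',
    or 'mask' (masked vessel / optic-disc pixels).  A sector that
    contains any masked pixel is skipped, as in the embodiment.
    Returns the (row, col) origins of abnormal sectors.
    """
    flagged = []
    h, w = len(labels), len(labels[0])
    for r in range(0, h, sector_size):
        for c in range(0, w, sector_size):
            cells = [labels[i][j]
                     for i in range(r, min(r + sector_size, h))
                     for j in range(c, min(c + sector_size, w))]
            if 'mask' in cells:
                continue                      # skip masked sectors
            dark = cells.count('dark')
            light = cells.count('light')
            if dark > threshold or light > threshold:
                flagged.append((r, c))
    return flagged
```

The flagged sectors are the ones subsequently passed to the neural network for the per-sector DR judgment.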
  • if the result of the neural network output indicates DR, the control unit 2 displays (notifies) that fact on the monitor 3; if the result does not indicate DR, the control unit 2 likewise displays that fact on the monitor 3.
  • in the above description, the fundus image is analyzed using the neural network and the result of the analysis is used to determine the presence or absence of the onset of diabetic retinopathy, but the analysis is not limited to this.
  • after removing (masking) blood vessels, the optic nerve head, and the like from the fundus image and obtaining the luminance distribution tendency of each pixel, a similar analysis using a neural network can be performed for other fundus images, based on the whole image or on local differences within the fundus.
  • the fundus imaging apparatus has a configuration for performing image analysis, but the present invention is not limited to this.
  • the image analysis described above can also be applied to a device that does not have a fundus photographing function and analyzes fundus images obtained by other devices.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention provides a fundus image analyzer that enables efficient analysis of a fundus image with as little manual operation as possible. The analyzer for analyzing a captured fundus image comprises storage means for storing a captured fundus image; mask means for identifying a blood vessel portion by image processing of the stored fundus image and masking the identified blood vessel portion; luminance distribution acquisition means for obtaining luminance information for all pixels of the fundus image except the masked portion and acquiring luminance distribution information for the entire fundus image; image analysis means for comparing the acquired luminance distribution information with predetermined reference luminance distribution information to analyze the fundus image; and notification means for indicating the results of the analysis by the image analysis means.
PCT/JP2006/323413 2006-11-24 2006-11-24 Analyseur d'image de fond d'œil WO2008062528A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2006/323413 WO2008062528A1 (fr) 2006-11-24 2006-11-24 Analyseur d'image de fond d'œil

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2006/323413 WO2008062528A1 (fr) 2006-11-24 2006-11-24 Analyseur d'image de fond d'œil

Publications (1)

Publication Number Publication Date
WO2008062528A1 true WO2008062528A1 (fr) 2008-05-29

Family

ID=39429470

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/323413 WO2008062528A1 (fr) 2006-11-24 2006-11-24 Analyseur d'image de fond d'œil

Country Status (1)

Country Link
WO (1) WO2008062528A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005508215A (ja) * 2001-08-30 2005-03-31 フィラデルフィア オフサルミック イメージング システムズ 糖尿病性網膜症の患者をスクリーニングするためのシステムおよび方法
JP2005253796A (ja) * 2004-03-12 2005-09-22 Yokohama Tlo Co Ltd 眼底診断装置
JP2005261789A (ja) * 2004-03-22 2005-09-29 Kowa Co 眼底画像処理方法及び眼底画像処理装置
JP2006280682A (ja) * 2005-04-01 2006-10-19 Hitachi Omron Terminal Solutions Corp ノイズ検出機能を備えた画像診断支援方法


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018027273A (ja) * 2016-08-19 2018-02-22 学校法人自治医科大学 糖尿病網膜症の病期判定支援システムおよび糖尿病網膜症の病期の判定を支援する方法
WO2019073962A1 (fr) * 2017-10-10 2019-04-18 国立大学法人 東京大学 Dispositif et programme de traitement d'images
JPWO2019073962A1 (ja) * 2017-10-10 2019-11-14 国立大学法人 東京大学 画像処理装置及びプログラム
JP2021154159A (ja) * 2017-12-28 2021-10-07 株式会社トプコン 機械学習ガイド付き撮影システム
JP2019177032A (ja) * 2018-03-30 2019-10-17 株式会社ニデック 眼科画像処理装置、および眼科画像処理プログラム
CN110432860A (zh) * 2019-07-01 2019-11-12 中山大学中山眼科中心 基于深度学习识别广域眼底图中格变裂孔的方法和系统

Similar Documents

Publication Publication Date Title
JP5117396B2 (ja) 眼底撮影装置
US7810928B2 (en) Evaluating pupillary responses to light stimuli
JP4113005B2 (ja) 眼の検査用機器
KR100738491B1 (ko) 안과 장치
RU2612500C2 (ru) Система и способ для удаленного измерения оптического фокуса
US7874675B2 (en) Pupillary reflex imaging
US20070171363A1 (en) Adaptive photoscreening system
US6616277B1 (en) Sequential eye screening method and apparatus
US6663242B1 (en) Simultaneous, wavelength multiplexed vision screener
JP5850292B2 (ja) 眼科装置
US20220338733A1 (en) External alignment indication/guidance system for retinal camera
WO2008062528A1 (fr) Analyseur d'image de fond d'œil
JP3950876B2 (ja) 眼底検査装置
US8996097B2 (en) Ophthalmic measuring method and apparatus
JP2010233978A (ja) 視機能検査装置
JP2005102948A (ja) 視野計
US20210307604A1 (en) Ophthalmic photographing apparatus
JP4542350B2 (ja) 前眼部の測定装置
US7404641B2 (en) Method for examining the ocular fundus
WO2000021432A1 (fr) Procedes et appareil d'imagerie oculaire numerique
WO2021085020A1 (fr) Dispositif ophtalmique et son procédé de commande
JP6325856B2 (ja) 眼科装置及び制御方法
JP2005102947A (ja) 眼科装置
US20230233078A1 (en) Vision Screening Device Including Color Imaging
EP4133992A1 (fr) Détermination de la capacité de vision des couleurs à l'aide d'un dispositif d'examen de vision

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06833217

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06833217

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP