WO2011027531A1 - Fundus camera - Google Patents

Fundus camera

Info

Publication number
WO2011027531A1
Authority
WO
WIPO (PCT)
Prior art keywords
fundus
focus
focus detection
detection range
unit
Prior art date
Application number
PCT/JP2010/005332
Other languages
French (fr)
Inventor
Hiroyuki Inoue
Tomoyuki Iwanaga
Shinya Tanaka
Original Assignee
Canon Kabushiki Kaisha
Priority date
Filing date
Publication date
Application filed by Canon Kabushiki Kaisha
Priority to US 13/393,001 (published as US 2012/0154748 A1)
Priority to EP 10813483.4 (published as EP 2473093 A4)
Priority to CN 201080039000.2 (published as CN 102481097 B)
Publication of WO 2011/027531 A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/14: Arrangements specially adapted for eye photography
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28: Systems for automatic generation of focusing signals
    • G02B 7/36: Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B 7/365: Systems for automatic generation of focusing signals using image sharpness techniques, by analysis of the spatial frequency components of the image
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/12: Objective types for looking at the eye fundus, e.g. ophthalmoscopes

Definitions

  • the present invention relates to a fundus camera used in an ophthalmic hospital or a group medical examination to photograph a fundus of an eye to be examined.
  • When a fundus camera is to be focused on the fundus of the eye to be examined, an index is projected onto the fundus. The index image is then observed via a focus lens of an observation photographing system, and the fundus camera is focused on the fundus based on the observed position of the index image.
  • Japanese Patent Application Laid-Open No. 5-95907 discusses a fundus camera that picks up a focus split index image which is split into two and projected on the fundus. The fundus camera then detects a focus state from each position of the focus split index image and attenuates the brightness of the index.
  • Japanese Patent Application Laid-Open No. 8-275921 discusses an ophthalmic apparatus that projects a focus index on the fundus. The apparatus then picks up the focus index image using an imaging optical system and detects the focus state.
  • Japanese Patent Application Laid-Open No. 1-178237 discusses a modified example of an apparatus that electronically picks up an image during observation and performs automatic focusing (AF) by performing contrast detection of the picked-up image itself. More specifically, the apparatus focuses on a first range and a second range of the fundus using a high-frequency component of the fundus image and acquires distances from each range to a focus lens position in a direction of the optical axis.
  • A conventional fundus camera separates the area through which the fundus illumination light flux or the focus split index light flux passes from the area through which the observation photographing light flux passes near the pupil of the eye to be examined, in order to eliminate light reflected from the cornea. Because the aberration of the optical system of the eye varies among individuals, a focusing error may be generated depending on the eye to be examined. More specifically, the error arises when the fundus is photographed by merely setting the focus split index to a predetermined position, so an unfocused fundus image may be acquired.
  • An apparatus that performs contrast detection on the picked-up fundus image itself solves the above-described problem, in which an unfocused fundus image is acquired due to a focusing error that depends on the eye to be examined.
  • However, if the focus detection region is fixedly arranged with respect to a portion of the image pickup system, the following problem arises.
  • The distance in the depth direction of the fundus image differs depending on the region of the fundus.
  • Since the focus detection range is fixed, it becomes necessary to direct the line of sight of the eye to be examined so that the region to be focused matches the focus detection range.
  • Moreover, the AF detection position may change due to a movement of the eye to be examined.
  • the present invention is directed to a fundus camera that solves the above-described problems and is capable of easily performing alignment.
  • According to an aspect of the present invention, a fundus camera includes: a fundus illumination optical system configured to illuminate the fundus of an eye to be examined; a fundus imaging optical system having a focus lens which is driven to focus on the fundus; a focus lens driving unit configured to drive the focus lens; a fundus image pickup unit arranged at a position conjugate to the fundus with respect to the fundus imaging optical system; a display monitor configured to display a fundus image acquired by the fundus image pickup unit; a focus state detection unit configured to detect an AF evaluation value indicating a level of a focus state based on an output signal from the fundus image pickup unit; and a lens drive control unit configured to drive the focus lens based on the AF evaluation value detected by the focus state detection unit. The focus state detection unit includes a fundus position detection unit configured to detect, from the output of the fundus image pickup unit, a specific region of the fundus image using a region pattern unique to that fundus region, and a focus detection range determination unit configured to determine the focus detection range based on the detected specific region.
  • Fig. 1 illustrates a configuration of a fundus camera according to a first exemplary embodiment of the present invention.
  • Fig. 2 illustrates a configuration of a focus state detection unit.
  • Fig. 3 illustrates a configuration of a fundus position detection unit.
  • Fig. 4 illustrates a configuration of a focus detection range determination unit.
  • Fig. 5 is a flowchart illustrating a control method.
  • Fig. 6 illustrates a basic principle of contrast detection.
  • Fig. 7 illustrates a fundus image displayed on a display monitor.
  • Fig. 8 illustrates a method for calculating an AF evaluation value.
  • Fig. 9 illustrates a configuration of a focus detection range determination unit according to a third exemplary embodiment of the present invention.
  • Fig. 10 illustrates a configuration of a focus detection range determination unit according to a fourth exemplary embodiment of the present invention.
  • Fig. 11 illustrates an external view of a fundus camera.
  • Fig. 12 illustrates a detection method performed by a right / left eye detection unit.
  • Fig. 1 illustrates a configuration of the fundus camera.
  • a fundus illumination optical system is formed as described below.
  • An observation light source 1, a photographing light source 2, a lens 3, and a mirror 4 are arranged on an optical axis L1.
  • Relay lenses 5 and 6 and a perforated mirror 7 having an opening at the center are sequentially arranged on an optical axis L2 in a reflection direction of the mirror 4.
  • An objective lens 8 is arranged opposite to an eye to be examined E on an optical axis L3 in the reflection direction of the perforated mirror 7.
  • The observation light source 1 for illuminating the fundus is formed of a halogen lamp that emits ambient light.
  • the photographing light source 2 is formed of a stroboscopic tube that emits visible light.
  • a fundus imaging optical system in the fundus camera illustrated in Fig. 1 is configured as described below.
  • a focus lens 9 for adjusting the focus by moving along the optical axis, a photographing lens 10, and a fundus image pickup unit 11 disposed at a position conjugate to a fundus Er are sequentially arranged at the rear of the perforated mirror 7 on the optical axis L3.
  • An output from the fundus image pickup unit 11 is transmitted to a focus state detection unit 21. Further, an output from the focus state detection unit 21 is transmitted to the focus lens 9 via a lens drive control unit 22 and a focus lens driving unit 23, also to the observation light source 1 via an illumination light quantity control unit 24, and is connected to a display monitor 25.
  • a focus detection range display unit 25a is included in the display monitor 25.
  • An examiner observes the fundus image displayed on the display monitor 25 and, under the observation light source 1, aligns the eye to be examined E with the chassis that contains the optical system. The examiner then adjusts the focus and photographs the fundus using the photographing light source 2.
  • the fundus camera includes an AF function that automatically adjusts the focus.
  • the fundus camera is capable of displaying the focus detection range to the examiner by superposing a frame portion of the focus detection range display unit 25a on the fundus image acquired by the fundus image pickup unit 11.
  • the fundus camera is capable of visually displaying the focus detection position to the user and thus improves the AF operability.
  • Focus detection in such a fundus camera is performed using contrast detection of the fundus image that is formed by the photographing light flux.
  • the fundus camera is thus different from a conventional apparatus which projects the focus index via an anterior eye region outside the image pickup light flux.
  • the fundus camera is capable of performing automatic focusing independent of the aberration of the optical system of the eye to be examined.
  • the focus state detection unit 21 includes a fundus position detection unit 21a that detects a specific position of the fundus Er.
  • the focus state detection unit 21 also includes a focus detection range determination unit 21b that determines the focus detection range based on a signal received from the fundus position detection unit 21a.
  • the focus state detection unit 21 includes an AF evaluation value storing unit 21c that stores the AF evaluation value and the position of the focus lens 9 when the AF evaluation value is acquired.
  • the fundus position detection unit 21a includes a fundus image pattern memory 21d that stores region patterns which are reference images of the specific regions in the fundus image.
  • the region pattern is used to extract the specific region from the fundus image.
  • Position information of the specific region is acquired by performing pattern matching between the region pattern recorded in the fundus image pattern memory 21d and an output signal from the fundus image pickup unit 11.
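As a rough illustration of this matching step, a normalized cross-correlation search over the image can locate the stored region pattern and apply the threshold test described below. This is a sketch only; the function name, the brute-force search, and the threshold value are illustrative assumptions, not details from the patent.

```python
import numpy as np

def match_region(image, pattern, threshold=0.8):
    """Locate a stored fundus region pattern in an image by normalized
    cross-correlation, in the spirit of the pattern matching between the
    fundus image pattern memory and the image pickup output.

    Returns the (row, col) of the best match if its correlation is greater
    than or equal to the threshold, otherwise None (recognition failed).
    """
    ih, iw = image.shape
    ph, pw = pattern.shape
    p = pattern - pattern.mean()
    p_norm = np.sqrt((p * p).sum())
    best_score, best_pos = -1.0, None
    for r in range(ih - ph + 1):
        for c in range(iw - pw + 1):
            w = image[r:r + ph, c:c + pw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * p_norm
            if denom == 0:
                continue  # flat window: no correlation defined
            score = (wz * p).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos if best_score >= threshold else None
```

A real implementation would use an optimized routine (e.g. FFT-based correlation) rather than this quadratic scan, but the threshold decision is the same.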
  • the focus detection range determination unit 21b determines the range to be focused based on the fundus image specific region extracted by the fundus position detection unit 21a.
  • the focus detection range determination unit 21b includes a focus detection range correction unit 21e as illustrated in Fig. 4, so that the examiner can correct the size of the focus detection range.
  • the examiner uses the focus detection range correction unit 21e by operating a cursor with respect to the image on the display monitor 25.
  • the focus state detection unit 21 calculates the AF evaluation value of the focus detection range determined by the focus detection range determination unit 21b.
  • the focus state detection unit 21 also stores information about the position of the focus lens 9 at that time in the AF evaluation value storing unit 21c.
  • Fig. 5 is a flowchart illustrating an AF control method.
  • the examiner instructs start of the AF operation via an AF start switch (not illustrated).
  • the fundus camera then starts performing pattern recognition of the fundus image.
  • the fundus position detection unit 21a calculates a correlation function between the output from the fundus image pickup unit 11 and the region pattern of the fundus image specific region stored in the fundus image pattern memory 21d.
  • The fundus position detection unit 21a then compares the result with a threshold value; a range in which the calculated correlation is greater than or equal to the threshold is determined to match the region pattern.
  • the fundus position detection unit 21a thus determines whether pattern recognition can be performed.
  • In step S3, the fundus camera sequentially drives the focus lens 9 until pattern recognition can be performed.
  • The pattern recognition is performed each time the lens is driven.
  • If pattern recognition can be performed (YES in step S2), the process proceeds to step S4.
  • In step S4, the focus detection range determination unit 21b determines the focus detection range based on the output from the fundus position detection unit 21a.
  • In step S5, the focus state detection unit 21 calculates the AF evaluation value indicating the focus level of the focus detection range. The method for calculating the AF evaluation value will be described below.
  • In step S6, the AF evaluation value storing unit 21c stores the calculated AF evaluation value.
  • Fig. 6 illustrates a principle of the focus detection using contrast detection.
  • The focus detection method is based on a specific high-frequency component of the luminance signal, which reaches its maximum value when the image is in focus.
  • the focus state detection unit 21 thus detects and uses as the AF evaluation value the high-frequency component of the input luminance signal.
  • a position of the focus lens is indicated on a horizontal axis, and the AF evaluation value is indicated on a vertical axis.
  • the AF evaluation value becomes the maximum value at a focus position M2 and decreases at a position M1 which is greatly out of focus.
  • focus correction which matches the aberration of the optical system of the human eye is performed using the contrast detection principle.
  • In step S7, the fundus camera determines whether the maximum point, i.e., the position M2 illustrated in Fig. 6, is included among the AF evaluation values stored in step S6, using the principle of contrast detection. Since the maximum point cannot yet be determined on the initial pass through step S7, the process proceeds to step S3, in which the fundus camera drives the focus lens 9.
  • If the maximum point is detected (YES in step S7), the process proceeds to step S8.
  • In step S8, the focus state detection unit 21 calculates a displacement of the focus lens 9.
  • The displacement calculated in step S8 is the amount by which the focus lens 9 must be driven to reach the position at which the maximum point M2 of the AF evaluation value was detected.
  • In step S9, the lens drive control unit 22 transmits a signal to the focus lens driving unit 23 based on the focus lens displacement calculated in step S8 and drives the focus lens 9. Automatic focusing thus ends.
  • Alternatively, the processes of steps S2 to S5 may be performed again after step S9 to recalculate the AF evaluation value.
  • The recalculated value is then compared with the AF evaluation value at which the maximum point was first determined. Automatic focusing may end when the difference between the AF evaluation values becomes less than or equal to a threshold value.
  • If the maximum point is not detected (NO in step S7), the process proceeds to step S3.
  • In step S3, the fundus camera drives the focus lens 9 by a predetermined amount.
  • The process then returns to step S2, in which the fundus position detection unit 21a again performs pattern recognition.
  • In step S4, the focus detection range determination unit 21b then determines the focus detection range.
  • The focus detection range can thus follow the movement of the eye to be examined E, even when the eye moves during automatic focusing. If pattern recognition cannot be performed, or the maximum point of the AF evaluation values cannot be detected, within a predetermined number of cycles, an error may be determined.
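The flowchart of steps S1 to S9 can be sketched as a simple hill-climbing loop. The callback names and the peak-detection details below are illustrative assumptions standing in for the camera units, not the patent's implementation.

```python
def autofocus(drive_lens, get_lens_pos, recognize, af_value,
              step=1.0, max_cycles=50):
    """Sketch of the AF loop (steps S1-S9 of the flowchart).

    drive_lens(delta): move the focus lens by delta
    get_lens_pos():    current focus lens position
    recognize():       focus detection range via pattern recognition, or None
    af_value(range_):  AF evaluation value for that range
    """
    history = []  # (lens position, AF evaluation value): the storing unit
    for _ in range(max_cycles):
        range_ = recognize()                      # S2: pattern recognition
        if range_ is None:
            drive_lens(step)                      # S3: drive until recognizable
            continue
        # S4-S6: determine the range, evaluate it, store the result
        history.append((get_lens_pos(), af_value(range_)))
        # S7: a maximum exists once the newest value has dropped below a peak
        if len(history) >= 3:
            values = [v for _, v in history]
            peak = max(range(len(values)), key=values.__getitem__)
            if peak < len(values) - 1:
                # S8-S9: displacement back to the peak, then drive there
                drive_lens(history[peak][0] - get_lens_pos())
                return history[peak][0]
        drive_lens(step)                          # S3: keep scanning
    raise RuntimeError("AF error: no maximum found within cycle limit")
```

The error raised after the cycle limit corresponds to the error determination mentioned above when recognition or the maximum point cannot be obtained within a predetermined number of cycles.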
  • Fig. 7 illustrates the fundus image displayed on the display monitor 25.
  • the relative positions of an optic disc N, large and medium blood vessels V, and a yellow spot Y which are unique to the fundus do not vary greatly regardless of the difference between individuals. Further, the relative positions are mirror-reversed between the left and right eyes.
  • Fig. 8 illustrates the AF evaluation values when the focus detection range is the region pattern of the large and medium blood vessels V.
  • One simple method for calculating the AF evaluation value detects the high-frequency component in the image as follows. The luminance signals of a target pixel and the eight pixels that are horizontally, vertically, and diagonally adjacent to it are compared, and the greatest difference between the luminance signals becomes the AF evaluation value of the target pixel.
  • An image G1 is an example of a portion of the image in which the large and medium blood vessels V exist in a vertical direction. The luminance signal of each pixel is either "0" or "1".
  • the AF evaluation values for each pixel are acquired as illustrated in an image A2.
  • a total sum of the AF evaluation values of the pixels can then be set as the AF evaluation value of the entire image.
  • the AF evaluation value can be more easily and speedily calculated by comparing the luminance signals of two adjacent pixels. If there is no difference, the AF evaluation value is set to "0", and if there is a difference, the AF evaluation value is set to "1". Since the number of pixels to be compared is less than the number in the above-described method, the calculation load is reduced. However, if two pixels which are vertically adjacent to each other in the image G1 are compared, an image G3 is acquired, and edges of the large and medium blood vessels V cannot be detected.
  • The differences in luminance between adjacent pixels in the images G1 and G4 are mapped as indicated by the images G2 and G5. A larger mapped value indicates a greater luminance difference between the adjacent pixels, and the total sum is set as the AF evaluation value of the entire image.
  • The large and medium blood vessels V run in circular arcs approximately centered on the yellow spot Y in the fundus Er.
  • High speed automatic focusing can thus be performed with a low load and without lowering the sensitivity of the AF evaluation value by employing a detection method that is selective in such a direction.
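The two evaluation schemes above, the eight-neighbour maximum difference and the cheaper adjacent-pair difference whose axis must be chosen to cross the vessel direction, can be sketched as follows. Function names are hypothetical and small integer luminance images are assumed.

```python
import numpy as np

def af_value_8neighbor(img):
    """AF evaluation value as described: for each pixel, take the largest
    absolute luminance difference to its eight neighbours, then sum the
    per-pixel values over the whole image."""
    h, w = img.shape
    total = 0
    for y in range(h):
        for x in range(w):
            best = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if dy == 0 and dx == 0:
                        continue
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        best = max(best, abs(int(img[y, x]) - int(img[ny, nx])))
            total += best
    return total

def af_value_pairs(img, axis=1):
    """Cheaper variant: absolute differences of adjacent pixel pairs along
    one axis. axis=1 compares horizontal neighbours, which detects vertical
    edges such as a vertically running blood vessel; axis=0 would miss
    them, as in the image G3 example."""
    d = np.abs(np.diff(img.astype(int), axis=axis))
    return int(d.sum())
```

For a vertical vessel, the horizontal-pair variant keeps the edge response at a fraction of the eight-neighbour cost, which is the direction-selective speedup described above.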
  • the large and medium blood vessels V in the fundus Er are employed in performing pattern recognition of the fundus image.
  • patterns of other regions such as the optic disc N or the yellow spot Y may be stored in the fundus image pattern memory 21d, so that automatic focusing is performed with respect to such regions.
  • the focus detection range can thus be automatically determined using pattern recognition, and the AF operability can be improved. Further, the focus detection position can follow the movement of the eye to be examined E, so that focus accuracy can be improved.
  • Since the focus state detection unit 21 refers to the luminance value of each pixel when calculating the AF evaluation value, saturation of the luminance values in the determined focus detection range may also be detected. If the luminance value is saturated, the focus state detection unit 21 transmits a signal to the illumination light quantity control unit 24 to adjust the light quantity of the observation light source 1. Automatic focusing can thus be performed with higher accuracy. For example, if the light quantity of the illumination optical system is adjusted when performing contrast detection on the optic disc N, in which overexposure is easily generated, a highly accurate fundus image of great diagnostic value can be acquired.
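The saturation check might look like the following sketch; the saturation fraction and the gain step are invented parameters, and the function merely models the signal sent to the illumination light quantity control unit.

```python
import numpy as np

def adjust_illumination(region, level, max_val=255, frac=0.01, gain=0.9):
    """If more than `frac` of the pixels in the focus detection range sit
    at the sensor ceiling, lower the observation light level before
    trusting the AF evaluation value; otherwise leave it unchanged.
    Returns the (possibly reduced) light level."""
    saturated = np.mean(region >= max_val)  # fraction of clipped pixels
    if saturated > frac:
        return level * gain  # request a lower light quantity
    return level
```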
  • pattern recognition is performed on a specific region of the fundus Er.
  • The examiner selects, before starting automatic focusing, the region in the fundus Er for which the focus detection range is set.
  • the focus detection range is thus determined based on the selection, and automatic focusing is then performed.
  • the fundus image pattern memory 21d includes a plurality of fundus image patterns, such as the region patterns of the optic disc N, the yellow spot Y, and the large and medium blood vessels V.
  • the examiner previously selects the region to be focused according to a case, using a region selection unit such as the cursor on the display monitor 25.
  • This is equivalent to the fundus position detection unit 21a selecting one of the plurality of fundus image patterns.
  • the fundus position detection unit 21a detects a position of the fundus image pattern selected based on the output from the fundus image pickup unit 11 and transfers the result to the focus detection range determining unit 21b.
  • Such a process and the processes to follow are similar to those described according to the first exemplary embodiment.
  • the examiner may also select a plurality of regions in the fundus instead of one region.
  • the AF evaluation value is calculated for each of the plurality of regions, and the sum of the AF evaluation values is set as an overall evaluation value. Accordingly, an image which is averagely focused with respect to the plurality of regions selected by the examiner can be acquired by detecting the maximum value of the overall evaluation value. As a result, the fundus image in which the region desired by the examiner is focused can be photographed, and the examiner can acquire the fundus image of great diagnostic value.
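The overall evaluation over several selected regions reduces to summing per-region AF values and taking the lens position where the sum peaks. A minimal sketch, with hypothetical helper names:

```python
def overall_af_value(af_values):
    """Overall evaluation value for several selected fundus regions: the
    sum of the per-region AF evaluation values, so the lens position that
    maximizes it is 'averagely' focused over all selected regions."""
    return sum(af_values)

def best_lens_position(scan):
    """scan: list of (lens_position, [per-region AF values]) pairs gathered
    while driving the focus lens; returns the position whose overall
    evaluation value is largest."""
    return max(scan, key=lambda entry: overall_af_value(entry[1]))[0]
```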
  • the fundus image of great diagnostic value can be acquired by performing pattern recognition of the region the examiner particularly desires to focus on in the diagnosis and determining the focus detection range.
  • an appropriate focus detection range can be determined in the optic disc N, the large and medium blood vessels V, and the yellow spot Y in the fundus image which include a comparatively large amount of high-frequency component. Highly accurate contrast detection can thus be performed.
  • Highly accurate contrast detection can be performed by detecting the large and medium blood vessels V, which vary little between individuals, whereas the concavity and convexity of the optic disc N vary greatly between individuals. Further, the running direction of the large and medium blood vessels V can be easily identified, so highly accurate contrast detection with a small calculation load can be performed at high speed and low cost by detecting the contrast in a direction perpendicular to the vessels.
  • an image of high diagnostic value appropriate for a lesion which the examiner is focusing on can be acquired by the examiner selecting the focus detection range from the plurality of regions in the fundus.
  • the examiner selects the focus detection range before starting automatic focusing.
  • the examiner selects the focus detection range from specific regions on which pattern recognition has been performed, and automatic focusing is then performed.
  • the fundus image pattern memory 21d includes a plurality of fundus image patterns, such as the region patterns of the optic disc N, the yellow spot Y, and the large and medium blood vessels V. This is similar to the second exemplary embodiment.
  • the positions of the plurality of fundus image patterns are detected with respect to the output from the fundus image pickup unit 11. The result is then transferred to the focus detection range determination unit 21b. Such a process is different from the first and second exemplary embodiments.
  • the focus detection range determination unit 21b includes a focus detection range correction unit 21e and a focus detection range selection unit 21f.
  • the focus detection range display unit 25a included in the display monitor 25 displays to the examiner the plurality of specific regions of the fundus image extracted by the fundus position detection unit 21a.
  • the examiner uses the focus detection range selection unit 21f, i.e., the cursor, and selects from the plurality of specific regions, one region for setting the focus detection range.
  • the specific regions of the fundus image may be displayed to the examiner when a predetermined number of pattern recognition results has been detected, or when the focus lens 9 has moved over the entire movable range.
  • the examiner can manually correct the position and the size of the focus detection range using the focus detection range correction unit 21e.
  • the examiner can thus acquire the fundus image in which the region desired by the user is correctly focused.
  • the examiner may also select the plurality of regions instead of one fundus region, similar to the second exemplary embodiment.
  • the process for transferring the selected specific region of the fundus image to the focus detection range determination unit 21b and the processes that follow are similar to the first exemplary embodiment.
  • the AF evaluation value of one or a plurality of the focus detection ranges selected by the examiner from a plurality of pattern-recognized fundus image regions is calculated.
  • the AF evaluation values are calculated and evaluated for all of the plurality of fundus image region patterns that are pattern-recognized, and automatic focusing is then performed.
  • the focus state detection unit 21 calculates the AF evaluation values for each of the plurality of specific regions in the fundus image, extracted by the fundus position detection unit 21a. The focus state detection unit 21 then sets the sum of the calculated AF evaluation values as the overall evaluation value. An image which is averagely focused with respect to the plurality of regions selected by the examiner can thus be acquired by detecting the maximum value of the overall evaluation value.
  • the focus detection range determination unit 21b includes a focus detection range narrowing unit 21g.
  • the focus detection range narrowing unit 21g automatically determines the specific region having the highest AF evaluation value as the focus detection range and transfers the result to the focus state detection unit 21.
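The narrowing step is essentially an argmax over the pattern-recognized regions; the sketch below uses a hypothetical mapping from region name to AF evaluation value.

```python
def narrow_focus_range(region_scores):
    """Focus detection range narrowing: of all pattern-recognized fundus
    regions, keep the one with the highest AF evaluation value as the
    focus detection range.

    region_scores: dict mapping region name -> AF evaluation value.
    """
    return max(region_scores, key=region_scores.get)
```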
  • The process for transferring the selected specific region of the fundus image to the focus detection range determination unit 21b and the processes to follow are similar to the above-described exemplary embodiments. As a result, a focused fundus image can be photographed automatically, so that a fundus camera of high AF operability can be acquired.
  • the focus detection range is automatically determined, the AF operability can be improved.
  • the position of the specific region in the fundus image is detected only by pattern recognition performed by the fundus position detection unit 21a.
  • pattern recognition of the optic disc N and detection of the right / left eye are combined.
  • the large and medium blood vessels V including a large amount of the specific high-frequency components are then detected, and automatic focusing is performed.
  • Fig. 11 illustrates an external view of the fundus camera according to the fifth exemplary embodiment.
  • a mount 32 which is movable back and forth and in a horizontal direction as indicated by arrows illustrated in Fig. 11 is disposed on a base 31.
  • a control stick 35 including a photographing switch is disposed on the mount 32.
  • The examiner operates the control stick 35 to adjust the mount 32 in the horizontal direction and align with the right or the left eye.
  • A right / left eye detection unit 36 is disposed between the base 31 and the mount 32. By detecting the horizontal position of the chassis 33, the right / left eye detection unit 36 can detect whether the right eye or the left eye of the eye to be examined E is being observed and photographed.
  • Fig. 12 illustrates a detection method performed by the right / left eye detection unit 36.
  • the right / left eye detection unit 36 which is disposed on a bottom surface of the mount 32 is formed of a microswitch.
  • The right / left eye detection unit 36 is in an "off" state when positioned above a low portion 31a of the base 31 and in an "on" state when positioned above a high portion 31b. More specifically, whether the left or the right eye faces the chassis 33 can be detected by forming the low portion 31a on the left side and the high portion 31b on the right side and detecting the on / off state of the right / left eye detection unit 36.
  • A method for determining the focus detection range by combining the right / left eye detection performed by the right / left eye detection unit 36 with pattern recognition of the optic disc N performed by the fundus position detection unit 21a will be described below.
  • the method for detecting the large and medium blood vessels V illustrated in Fig. 7 will be described below.
  • The structure of the fundus Er can be predicted by detecting a specific region in the fundus Er and determining whether the right or the left eye is being observed.
  • the large and medium blood vessels V can then be detected by the right / left eye detection unit 36 detecting the right or left eye and by performing pattern recognition of the optic disc N.
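A sketch of how the microswitch state and the mirror symmetry of the fundus might be combined. The on/off-to-eye mapping and the assumption that the stored region pattern corresponds to the right eye are illustrative, not stated in the patent.

```python
import numpy as np

def detect_eye(switch_on):
    """Right / left eye from the mount microswitch: the unit reads 'off'
    above the low portion 31a and 'on' above the high portion 31b.
    The on -> right mapping here is an assumed convention."""
    return "right" if switch_on else "left"

def region_pattern_for_eye(pattern, eye):
    """Because the fundus layout is mirror-reversed between the left and
    right eyes, one stored region pattern can serve both by flipping it
    horizontally for the other eye (stored pattern assumed to be for the
    right eye)."""
    return pattern if eye == "right" else pattern[:, ::-1]
```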
  • the process for transferring the result of detecting the large and medium blood vessels V, to the focus detection range determination unit 21b and the processes to follow are similar to the above-described exemplary embodiments.
  • the examiner uses the focus detection range correction unit 21e to manually correct the position and the size of the focus detection range, so that the fundus image in which the region desired by the examiner is correctly focused can be acquired.
  • the large and medium blood vessels V or the yellow spot Y is identified and set as the focus detection range by detecting the optic disc N which can be easily pattern-recognized and by detecting the right or left eye.
  • the calculation load and the calculation time are thus reduced, so that high-speed automatic focusing can be realized.

Abstract

If the position of a focus lens is greatly displaced from the in-focus state when automatic focusing is started and pattern recognition cannot be performed, the process proceeds to step S3. In step S3, the lens is driven in steps until pattern recognition can be performed. If it is determined in step S2 that pattern recognition can be performed, a focus detection range is determined in step S4. In step S5, an AF evaluation value of the range is calculated, and the value is stored in step S6.

Description

FUNDUS CAMERA
The present invention relates to a fundus camera used in an ophthalmic hospital or a group medical examination to photograph a fundus of an eye to be examined.
When a fundus camera is to be focused on the fundus of the eye to be examined, an index is projected on the fundus. An index image is then observed via a focus lens of an observation photographing system, and the fundus camera is focused on the fundus based on the observed position of the index image.
Japanese Patent Application Laid-Open No. 5-95907 discusses a fundus camera that picks up a focus split index image which is split into two and projected on the fundus. The fundus camera then detects a focus state from each position of the focus split index image and attenuates the brightness of the index.
Further, Japanese Patent Application Laid-Open No. 8-275921 discusses an ophthalmic apparatus that projects a focus index on the fundus. The apparatus then picks up the focus index image using an imaging optical system and detects the focus state.
Furthermore, Japanese Patent Application Laid-Open No. 1-178237 discusses a modified example of an apparatus that electronically picks up an image during observation and performs automatic focusing (AF) by contrast detection on the picked-up image itself. More specifically, the apparatus focuses on a first range and a second range of the fundus using a high-frequency component of the fundus image and acquires the distance, along the optical axis, from each range to the focus lens position.
However, a conventional fundus camera separates the area near the pupil of the eye to be examined through which the fundus illumination light flux or the focus split index light flux passes from the area through which the observation photographing light flux passes. This is to eliminate light reflected by the cornea of the eye to be examined. Since the aberration of the optical system of the eye varies among individuals, a focusing error may be generated depending on the eye to be examined. More specifically, the error is generated when the fundus is photographed with the focus split index merely set to a predetermined position, and an unfocused fundus image may thus be acquired.
To solve such a problem, there is an apparatus which electronically picks up an image during observation and performs automatic focusing by performing contrast detection of the picked-up image itself.
Such an apparatus solves the above-described problem in which the unfocused fundus image is acquired due to the focusing error depending on the eye to be examined. However, since a focus detection region is fixedly arranged with respect to a portion of the image pickup system, the following problem arises.
The distance in the depth direction of the fundus image differs depending on the region of the fundus. In conventional AF detection in which the focus detection range is fixed, it thus becomes necessary to direct the line of sight of the eye to be examined so that the region to be focused matches the focus detection range.
Further, it becomes necessary to manually move the detection range when the focus detection range is movable, as in a general AF single-lens reflex camera. Furthermore, the AF detection position may change due to a movement of the eye to be examined.
Japanese Patent Application Laid-Open No. 5-95907
Japanese Patent Application Laid-Open No. 8-275921
Japanese Patent Application Laid-Open No. 1-178237
The present invention is directed to a fundus camera that solves the above-described problems and is capable of easily performing alignment.
According to an aspect of the present invention, a fundus camera includes a fundus illumination optical system configured to illuminate a fundus of an eye to be examined, a fundus imaging optical system having a focus lens which is driven to focus on a fundus, a focus lens driving unit configured to drive the focus lens, a fundus image pickup unit arranged at a position which is conjugate to a fundus with respect to the fundus imaging optical system, a display monitor configured to display a fundus image acquired by the fundus image pickup unit, a focus state detection unit configured to detect an AF evaluation value indicating a level of a focus state based on an output signal from the fundus image pickup unit, and a lens drive control unit configured to drive the focus lens based on the AF evaluation value detected by the focus state detection unit, wherein the focus state detection unit includes a fundus position detection unit configured to detect, with respect to an output from the fundus image pickup unit, a specific region of a fundus image using a region pattern unique to a fundus region, and a focus detection range determination unit configured to determine a focus detection range based on an output from the fundus position detection unit, and calculates the AF evaluation value of the focus detection range determined by the focus detection range determination unit.
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 illustrates a configuration of a fundus camera according to a first exemplary embodiment of the present invention.
Fig. 2 illustrates a configuration of a focus state detection unit.
Fig. 3 illustrates a configuration of a fundus position detection unit.
Fig. 4 illustrates a configuration of a focus detection range determination unit.
Fig. 5 is a flowchart illustrating a control method.
Fig. 6 illustrates a basic principle of contrast detection.
Fig. 7 illustrates a fundus image displayed on a display monitor.
Fig. 8 illustrates a method for calculating an AF evaluation value.
Fig. 9 illustrates a configuration of a focus detection range determination unit according to a third exemplary embodiment of the present invention.
Fig. 10 illustrates a configuration of a focus detection range determination unit according to a fourth exemplary embodiment of the present invention.
Fig. 11 illustrates an external view of a fundus camera.
Fig. 12 illustrates a configuration of a right / left eye detection unit.
Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
The first exemplary embodiment according to the present invention will be described below. Fig. 1 illustrates a configuration of the fundus camera. Referring to Fig. 1, a fundus illumination optical system is formed as described below. An observation light source 1, a photographing light source 2, a lens 3, and a mirror 4 are arranged on an optical axis L1. Relay lenses 5 and 6 and a perforated mirror 7 having an opening at the center are sequentially arranged on an optical axis L2 in a reflection direction of the mirror 4. An objective lens 8 is arranged opposite to an eye to be examined E on an optical axis L3 in the reflection direction of the perforated mirror 7. The observation light source 1 for illuminating the fundus is formed of a halogen lamp that emits ambient light, and the photographing light source 2 is formed of a stroboscopic tube that emits visible light.
A fundus imaging optical system in the fundus camera illustrated in Fig. 1 is configured as described below. A focus lens 9 for adjusting the focus by moving along the optical axis, a photographing lens 10, and a fundus image pickup unit 11 disposed at a position conjugate to a fundus Er are sequentially arranged at the rear of the perforated mirror 7 on the optical axis L3.
An output from the fundus image pickup unit 11 is transmitted to a focus state detection unit 21. Further, an output from the focus state detection unit 21 is transmitted to the focus lens 9 via a lens drive control unit 22 and a focus lens driving unit 23, also to the observation light source 1 via an illumination light quantity control unit 24, and is connected to a display monitor 25. A focus detection range display unit 25a is included in the display monitor 25.
An examiner observes the fundus image displayed on the display monitor 25 and, under illumination by the observation light source 1, aligns the eye to be examined E with a chassis that includes the optical system. The examiner then adjusts the focus and photographs the fundus using the photographing light source 2.
The fundus camera according to the present exemplary embodiment includes an AF function that automatically adjusts the focus. The fundus camera is capable of displaying the focus detection range to the examiner by superposing a frame portion of the focus detection range display unit 25a on the fundus image acquired by the fundus image pickup unit 11. As a result, the fundus camera is capable of visually displaying the focus detection position to the user and thus improves the AF operability.
Focus detection in such a fundus camera is performed using contrast detection of the fundus image that is formed by the photographing light flux. The fundus camera is thus different from a conventional apparatus which projects the focus index via an anterior eye region outside the image pickup light flux. The fundus camera is capable of performing automatic focusing independent of the aberration of the optical system of the eye to be examined.
Referring to Fig. 2, the focus state detection unit 21 includes a fundus position detection unit 21a that detects a specific position of the fundus Er. The focus state detection unit 21 also includes a focus detection range determination unit 21b that determines the focus detection range based on a signal received from the fundus position detection unit 21a. Further, the focus state detection unit 21 includes an AF evaluation value storing unit 21c that stores the AF evaluation value and the position of the focus lens 9 when the AF evaluation value is acquired.
Referring to Fig. 3, the fundus position detection unit 21a includes a fundus image pattern memory 21d that stores region patterns which are reference images of the specific regions in the fundus image. The region pattern is used to extract the specific region from the fundus image. Position information of the specific region is acquired by performing pattern matching between the region pattern recorded in the fundus image pattern memory 21d and an output signal from the fundus image pickup unit 11. Further, the focus detection range determination unit 21b determines the range to be focused based on the fundus image specific region extracted by the fundus position detection unit 21a. However, it is desirable that the focus detection range determination unit 21b includes a focus detection range correction unit 21e as illustrated in Fig. 4, so that the examiner can correct the size of the focus detection range. The examiner uses the focus detection range correction unit 21e by operating a cursor with respect to the image on the display monitor 25.
The focus state detection unit 21 calculates the AF evaluation value of the focus detection range determined by the focus detection range determination unit 21b. The focus state detection unit 21 also stores information about the position of the focus lens 9 at that time in the AF evaluation value storing unit 21c.
Fig. 5 is a flowchart illustrating an AF control method. The examiner instructs start of the AF operation via an AF start switch (not illustrated). In step S1, the fundus camera then starts performing pattern recognition of the fundus image. In step S2, the fundus position detection unit 21a calculates a correlation value between the output from the fundus image pickup unit 11 and the region pattern of the fundus image specific region stored in the fundus image pattern memory 21d. The fundus position detection unit 21a then performs a comparison, and a range in which the calculated value is greater than or equal to a threshold value is determined to match the region pattern. The fundus position detection unit 21a thus determines whether pattern recognition can be performed.
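The threshold comparison in step S2 can be illustrated with a small template-matching sketch. This is illustrative only: the function name, the use of normalized cross-correlation, the threshold value, and the NumPy arrays are assumptions, since the specification states only that a correlation is calculated and compared against a threshold.

```python
import numpy as np

def match_region(image, pattern, threshold=0.7):
    """Slide a stored region pattern over the fundus image and return the
    best-matching position if its correlation reaches the threshold,
    otherwise None (i.e., pattern recognition cannot be performed)."""
    ph, pw = pattern.shape
    ih, iw = image.shape
    # Normalize the pattern once; 1e-9 guards against zero variance.
    p = (pattern - pattern.mean()) / (pattern.std() + 1e-9)
    best_score, best_pos = -1.0, None
    for y in range(ih - ph + 1):
        for x in range(iw - pw + 1):
            w = image[y:y + ph, x:x + pw]
            w = (w - w.mean()) / (w.std() + 1e-9)
            score = float((p * w).mean())  # normalized cross-correlation
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos if best_score >= threshold else None
```

A defocused image blurs the pattern, lowering every correlation score below the threshold, which corresponds to the NO branch of step S2.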
If the position of the focus lens 9 is greatly displaced from the focused position when automatic focusing is started so that pattern recognition cannot be performed (NO in step S2), the process proceeds to step S3. In step S3, the fundus camera sequentially drives the focus lens 9 until pattern recognition can be performed; pattern recognition is attempted each time the lens is driven.
If pattern recognition can be performed (YES in step S2), the process proceeds to step S4. In step S4, the focus detection range determination unit 21b determines the focus detection range based on the output from the fundus position detection unit 21a. In step S5, the focus state detection unit 21 then calculates the AF evaluation value indicating the focus level of the focus detection range. The method for calculating the AF evaluation value will be described below. In step S6, the AF evaluation value storing unit 21c stores the calculated AF evaluation value.
Fig. 6 illustrates a principle of the focus detection using contrast detection. Such a focus detection method is performed based on a specific high-frequency component of a luminance signal, which obtains a maximum value when focused. The focus state detection unit 21 thus detects and uses as the AF evaluation value the high-frequency component of the input luminance signal. Referring to Fig. 6, a position of the focus lens is indicated on a horizontal axis, and the AF evaluation value is indicated on a vertical axis. The AF evaluation value becomes the maximum value at a focus position M2 and decreases at a position M1 which is greatly out of focus. According to the present exemplary embodiment, focus correction which matches the aberration of the optical system of the human eye is performed using the contrast detection principle.
In step S7, the fundus camera uses the principle of contrast detection to determine whether the maximum point, i.e., the position M2 illustrated in Fig. 6, is included in the AF evaluation values stored in step S6. Since the maximum point cannot be determined the first time step S7 is performed, the process proceeds to step S3. In step S3, the fundus camera drives the focus lens 9.
If the maximum point is detected in the AF evaluation values (YES in step S7), the process proceeds to step S8. In step S8, the focus state detection unit 21 calculates a displacement of the focus lens 9, i.e., the amount by which the focus lens 9 must be driven to reach the position at which the maximum point M2 of the AF evaluation value was detected. In step S9, the lens drive control unit 22 transmits a signal to the focus lens driving unit 23 based on the focus lens displacement calculated in step S8 and drives the focus lens 9. Automatic focusing thus ends.
In the above-described process, the focus lens 9 is driven based on the displacement calculated in step S8, and automatic focusing ends in step S9. However, the processes of steps S2 to S5 may be performed again after step S9 to calculate a new AF evaluation value. The new value is then compared with the AF evaluation value at which the maximum point was first determined. Automatic focusing may thus end when the difference between the AF evaluation values becomes less than or equal to a threshold value.
On the other hand, if the maximum point is not detected in the AF evaluation values (NO in step S7), the process proceeds to step S3. In step S3, the fundus camera drives the focus lens 9 by a predetermined amount. The process then returns to step S2, in which the fundus position detection unit 21a again performs pattern recognition. In step S4, the focus detection range determination unit 21b then determines the focus detection range. As a result, the focus detection range can follow the movement of the eye to be examined E, even when the eye to be examined E moves during automatic focusing. If pattern recognition does not succeed or the maximum point of the AF evaluation values is not detected within a predetermined number of cycles, the process may be terminated as an error.
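The control flow of steps S1 through S9 can be sketched in outline as follows. This is an illustrative sketch only: the helper names, the fixed drive step, the cycle limit, and the single callable standing in for capture, pattern recognition, and evaluation are all assumptions, not elements of the disclosed apparatus.

```python
def find_af_peak(history):
    """Step S7: look for an interior maximum among the stored
    (lens position, AF evaluation value) pairs, cf. unit 21c."""
    pts = sorted(history)
    for i in range(1, len(pts) - 1):
        if pts[i][1] > pts[i - 1][1] and pts[i][1] > pts[i + 1][1]:
            return pts[i][0]          # lens position of maximum point M2
    return None                       # no peak yet: keep driving the lens

def autofocus(evaluate, start=0.0, step=1.0, max_cycles=50):
    """Drive a simulated focus lens in fixed steps, store each AF
    evaluation value (S5-S6), and return the detected peak position
    (S8-S9). `evaluate` stands in for capture + pattern recognition +
    AF evaluation of the determined range."""
    position, history = start, []
    for _ in range(max_cycles):
        history.append((position, evaluate(position)))   # S5-S6
        peak = find_af_peak(history)                     # S7
        if peak is not None:
            return peak               # S8-S9: drive lens back to the peak
        position += step              # S3: drive the lens and retry
    raise RuntimeError("AF error: no maximum detected")  # cycle limit
```

For an evaluation curve peaked at position 3.0, `autofocus(lambda p: -(p - 3.0) ** 2)` steps past the peak once, detects the interior maximum, and returns 3.0, mirroring the hill-climb behavior of Fig. 6.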
Fig. 7 illustrates the fundus image displayed on the display monitor 25. Referring to Fig. 7, the relative positions of an optic disc N, large and medium blood vessels V, and a yellow spot Y which are unique to the fundus do not vary greatly regardless of the difference between individuals. Further, the relative positions are mirror-reversed between the left and right eyes.
Fig. 8 illustrates the AF evaluation values when the focus detection range is the region pattern of the large and medium blood vessels V. One method for calculating the AF evaluation value easily detects the high-frequency component in the image. In this method, the luminance signal of a target pixel is compared with those of the eight pixels that are horizontally, vertically, and diagonally adjacent to it. The greatest difference among these luminance signals then becomes the AF evaluation value of the target pixel. An image G1 is an example of a portion of the image in which the large and medium blood vessels V run in a vertical direction. The luminance signal of each pixel is either "0" or "1".
When the above-described detection method is applied to the image, the AF evaluation values of each pixel are acquired as illustrated in an image G2. The total sum of the AF evaluation values of the pixels can then be set as the AF evaluation value of the entire image.
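The eight-neighbour comparison described above can be written out directly. This is an illustrative sketch assuming a NumPy luminance array; a practical implementation would vectorize the loops.

```python
import numpy as np

def af_evaluation_value(img):
    """For each pixel, take the greatest absolute luminance difference to
    its eight horizontal, vertical, and diagonal neighbours, then sum the
    per-pixel values over the whole image (cf. images G1 and G2)."""
    h, w = img.shape
    total = 0
    for y in range(h):
        for x in range(w):
            best = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if (dy or dx) and 0 <= ny < h and 0 <= nx < w:
                        best = max(best,
                                   abs(int(img[y, x]) - int(img[ny, nx])))
            total += best
    return total
```

On a binary image with a one-pixel-wide vertical vessel, every pixel touches an edge, so each contributes 1 to the total; a uniform image evaluates to 0.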
The AF evaluation value can be more easily and speedily calculated by comparing the luminance signals of two adjacent pixels. If there is no difference, the AF evaluation value is set to "0", and if there is a difference, the AF evaluation value is set to "1". Since the number of pixels to be compared is less than the number in the above-described method, the calculation load is reduced. However, if two pixels which are vertically adjacent to each other in the image G1 are compared, an image G3 is acquired, and edges of the large and medium blood vessels V cannot be detected.
On the other hand, if such a method is applied to an image G4 in which the large and medium blood vessels V run in a horizontal direction, an image G5 is acquired. A result similar to the image G2, in which the AF evaluation values are calculated using the previously-described method, can thus be acquired. In other words, selecting a direction-dependent detection method such as this one can shorten the calculation time. However, it then becomes necessary to appropriately select the target image.
As described above, the differences in luminance between adjacent pixels in the images G1 and G4 are mapped as indicated by the images G2 and G5. A greater mapped value indicates a greater luminance difference between the adjacent pixels, and the total sum of the mapped values is set as the AF evaluation value of the entire image.
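The cheaper two-pixel comparison and its direction dependence can be sketched as follows, again assuming a NumPy luminance array; the function name and the axis convention are illustrative.

```python
import numpy as np

def af_value_directional(img, axis=0):
    """Direction-dependent variant: set the evaluation to "1" where two
    pixels adjacent along the chosen axis differ and "0" where they do
    not, then sum. axis=0 compares vertically adjacent pixels (cf.
    image G3); axis=1 compares horizontally adjacent ones (cf. G5)."""
    diff = np.diff(img.astype(int), axis=axis)
    return int(np.count_nonzero(diff))
```

Applied to a G1-like image with a vertical vessel, the vertical comparison returns 0 and misses the edges entirely, while the horizontal comparison detects them, which is why the comparison direction must match the expected edge orientation.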
The large and medium blood vessels V according to the present exemplary embodiment run in circular arcs approximately centered on the yellow spot Y in the fundus Er. The region in which the vessels become thicker exists near the optic disc N, so that the edges of the large and medium blood vessels V lie approximately in directions of plus or minus 45 degrees. High-speed automatic focusing can thus be performed with a low load and without lowering the sensitivity of the AF evaluation value by employing a detection method that is selective in such directions.
According to the present exemplary embodiment, the large and medium blood vessels V in the fundus Er are employed in performing pattern recognition of the fundus image. However, patterns of other regions such as the optic disc N or the yellow spot Y may be stored in the fundus image pattern memory 21d, so that automatic focusing is performed with respect to such regions.
The focus detection range can thus be automatically determined using pattern recognition, and the AF operability can be improved. Further, the focus detection position can follow the movement of the eye to be examined E, so that focus accuracy can be improved.
Furthermore, since the focus state detection unit 21 refers to the luminance value of each pixel when calculating the AF evaluation value, the saturation of the luminance value of the determined focus detection range may be detected. If the luminance value is saturated, the focus state detection unit 21 transmits a signal to the illumination light quantity control unit 24 to adjust the light quantity of the observation light source 1. Automatic focusing can thus be performed with higher accuracy. For example, if the light quantity of the illumination optical system is adjusted when performing contrast detection on the optic disc N in which overexposure is easily generated, a highly accurate fundus image having a great diagnostic value can be acquired.
According to the first exemplary embodiment, pattern recognition is performed on a specific region of the fundus Er. According to a second exemplary embodiment, the examiner selects, before starting automatic focusing, the region in the fundus Er for setting the focus detection range. The focus detection range is thus determined based on the selection, and automatic focusing is then performed.
According to the second exemplary embodiment, the fundus image pattern memory 21d includes a plurality of fundus image patterns, such as the region patterns of the optic disc N, the yellow spot Y, and the large and medium blood vessels V. The examiner selects in advance the region to be focused according to the case, using a region selection unit such as the cursor on the display monitor 25. This is equivalent to the fundus position detection unit 21a selecting one of the plurality of fundus image patterns. Further, the fundus position detection unit 21a detects the position of the selected fundus image pattern based on the output from the fundus image pickup unit 11 and transfers the result to the focus detection range determination unit 21b. Such a process and the processes to follow are similar to those described according to the first exemplary embodiment.
The examiner may also select a plurality of regions in the fundus instead of one region. In such a case, the AF evaluation value is calculated for each of the selected regions, and the sum of the AF evaluation values is set as an overall evaluation value. Accordingly, an image which is, on average, focused with respect to the plurality of regions selected by the examiner can be acquired by detecting the maximum value of the overall evaluation value. As a result, the fundus image in which the regions desired by the examiner are focused can be photographed, and the examiner can acquire a fundus image of great diagnostic value.
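The overall evaluation value can be sketched as a simple sum over the selected regions. The rectangle representation and the function names are illustrative assumptions, not details from the specification.

```python
def overall_evaluation(frame, regions, af_value):
    """Sum the per-region AF evaluation values into one overall
    evaluation value: maximizing this value over focus lens positions
    yields an image focused, on average, across the selected regions.
    `frame` is a 2-D luminance grid, `regions` holds (y, x, height,
    width) rectangles, and `af_value` evaluates one rectangular patch."""
    def patch(y, x, h, w):
        return [row[x:x + w] for row in frame[y:y + h]]
    return sum(af_value(patch(*r)) for r in regions)
```

Any per-patch evaluation function can be plugged in for `af_value`; the peak search over lens positions is unchanged, only the scalar being maximized differs.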
As described above, the fundus image of great diagnostic value can be acquired by performing pattern recognition of the region the examiner particularly desires to focus on in the diagnosis and determining the focus detection range. In other words, an appropriate focus detection range can be determined in the optic disc N, the large and medium blood vessels V, and the yellow spot Y in the fundus image which include a comparatively large amount of high-frequency component. Highly accurate contrast detection can thus be performed.
In particular, while the concavity and convexity of the optic disc N vary greatly between individuals, the large and medium blood vessels V show little individual variation, so that detecting them enables highly accurate contrast detection. Further, the running direction of the large and medium blood vessels V can be easily identified. Highly accurate contrast detection with a small calculation load can thus be performed at high speed and low cost by detecting the contrast in the direction perpendicular to the large and medium blood vessels V.
Furthermore, an image of high diagnostic value, appropriate for the lesion on which the examiner is focusing, can be acquired by the examiner selecting the focus detection range from the plurality of regions in the fundus.
According to the second exemplary embodiment, the examiner selects the focus detection range before starting automatic focusing. According to a third exemplary embodiment, the examiner selects the focus detection range from specific regions on which pattern recognition has been performed, and automatic focusing is then performed.
According to the third exemplary embodiment, the fundus image pattern memory 21d includes a plurality of fundus image patterns, such as the region patterns of the optic disc N, the yellow spot Y, and the large and medium blood vessels V. This is similar to the second exemplary embodiment. According to the third exemplary embodiment, however, the positions of the plurality of fundus image patterns are detected with respect to the output from the fundus image pickup unit 11, and the results are then transferred to the focus detection range determination unit 21b. Such a process is different from the first and second exemplary embodiments.
Referring to Fig. 9, the focus detection range determination unit 21b according to the third exemplary embodiment includes a focus detection range correction unit 21e and a focus detection range selection unit 21f. The focus detection range display unit 25a included in the display monitor 25 displays to the examiner the plurality of specific regions of the fundus image extracted by the fundus position detection unit 21a. The examiner uses the focus detection range selection unit 21f, i.e., the cursor, and selects, from the plurality of specific regions, one region for setting the focus detection range. The specific regions of the fundus image may be displayed to the examiner when a predetermined number of pattern recognition results has been detected, or when the focus lens 9 has moved over the entire movable range.
Further, the examiner can manually correct the position and the size of the focus detection range using the focus detection range correction unit 21e. The examiner can thus acquire the fundus image in which the region desired by the user is correctly focused.
Furthermore, the examiner may also select the plurality of regions instead of one fundus region, similar to the second exemplary embodiment. The process for transferring the selected specific region of the fundus image to the focus detection range determination unit 21b and the processes that follow are similar to the first exemplary embodiment.
According to the second and third exemplary embodiments, the AF evaluation value of one or a plurality of the focus detection ranges selected by the examiner from a plurality of pattern-recognized fundus image regions is calculated. According to a fourth exemplary embodiment, the AF evaluation values are calculated and evaluated for all of the plurality of fundus image region patterns that are pattern-recognized, and automatic focusing is then performed.
According to the fourth exemplary embodiment, the focus state detection unit 21 calculates the AF evaluation value for each of the plurality of specific regions in the fundus image extracted by the fundus position detection unit 21a. The focus state detection unit 21 then sets the sum of the calculated AF evaluation values as the overall evaluation value. An image which is, on average, focused with respect to the plurality of extracted regions can thus be acquired by detecting the maximum value of the overall evaluation value.
Further, referring to Fig. 10, the focus detection range determination unit 21b according to the fourth exemplary embodiment includes a focus detection range narrowing unit 21g. The focus detection range narrowing unit 21g automatically determines the specific region having the highest AF evaluation value as the focus detection range and transfers the result to the focus state detection unit 21. The process for transferring the selected specific region of the fundus image to the focus detection range determination unit 21b and the processes to follow are similar to the above-described exemplary embodiments. As a result, the focused fundus image can be automatically photographed, so that a fundus camera of high AF operability can be acquired.
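The narrowing performed by the focus detection range narrowing unit 21g can be sketched as picking the region with the highest AF evaluation value. The rectangle representation and the function names are illustrative assumptions, not details from the specification.

```python
def narrow_focus_detection_range(frame, regions, af_value):
    """Evaluate every pattern-recognized region and keep the one with
    the highest AF evaluation value as the focus detection range.
    `frame` is a 2-D luminance grid, `regions` holds (y, x, height,
    width) rectangles, and `af_value` evaluates one rectangular patch."""
    def patch(r):
        y, x, h, w = r
        return [row[x:x + w] for row in frame[y:y + h]]
    return max(regions, key=lambda r: af_value(patch(r)))
```

Since the winning region carries the most high-frequency content, focusing there tends to give the sharpest overall result without any examiner interaction.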
In other words, since the focus detection range is automatically determined, the AF operability can be improved.
According to the above-described exemplary embodiments, the position of the specific region in the fundus image is detected only by pattern recognition performed by the fundus position detection unit 21a. According to a fifth exemplary embodiment, pattern recognition of the optic disc N and detection of the right / left eye are combined. The large and medium blood vessels V including a large amount of the specific high-frequency components are then detected, and automatic focusing is performed.
Fig. 11 illustrates an external view of the fundus camera according to the fifth exemplary embodiment. A mount 32, which is movable back and forth and in a horizontal direction as indicated by the arrows illustrated in Fig. 11, is disposed on a base 31. A chassis 33, in which the optical system of the fundus camera illustrated in Fig. 1 is included, and the display monitor 25 are disposed on the mount 32. Further, a control stick 35 including a photographing switch is disposed on the mount 32.
The examiner operates the control stick 35 and moves the mount 32 in the horizontal direction to align with the right or the left eye. A right / left eye detection unit 36 is disposed between the base 31 and the mount 32. The position of the chassis 33 in the horizontal direction is detected, so that the right / left eye detection unit 36 can detect whether the right or the left eye to be examined E is being observed and photographed.
Fig. 12 illustrates a detection method performed by the right / left eye detection unit 36. Referring to Fig. 12, a low portion 31a and a high portion 31b on the top surface of the base 31 form a difference in height. The right / left eye detection unit 36, which is disposed on the bottom surface of the mount 32, is formed of a microswitch. It is in an "off" state when positioned above the low portion of the base 31 and in an "on" state when positioned above the high portion. More specifically, whether the chassis 33 faces the left or the right eye to be examined can be detected by forming the low portion 31a on the left side and the high portion 31b on the right side and detecting the on / off state of the right / left eye detection unit 36.
A method for detecting the focus detection range by the right / left eye detection unit 36 performing the right / left eye detection and the fundus position detection unit 21a performing pattern recognition of the optic disc N will be described below. In particular, the method for detecting the large and medium blood vessels V illustrated in Fig. 7 will be described below.
The structure of the fundus Er can be predicted by detecting a specific region in the fundus Er and determining whether the right or left eye is being observed. The large and medium blood vessels V can thus be located by the right / left eye detection unit 36 detecting which eye is observed and by performing pattern recognition of the optic disc N. The process for transferring the result of detecting the large and medium blood vessels V to the focus detection range determination unit 21b, and the processes that follow, are similar to those of the above-described exemplary embodiments.
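The prediction step above can be sketched as a simple geometric offset from the detected optic disc. The offsets and window size below are illustrative, not values from the patent; the directional assumption (vessel arcades arch from the disc toward the temporal side, i.e. toward the macula) reflects common fundus anatomy but is stated here as an assumption:

```python
def predict_vessel_window(disc_center, eye, temporal_offset=0.15, size=0.2):
    """Return an assumed focus detection window (cx, cy, w, h), in
    normalized image coordinates, for the large and medium blood vessels V.

    Assumption: under the usual fundus photograph orientation, the macula
    (and the arching vessel arcades) lie on the temporal side of the disc:
    toward the image left for a right eye and toward the image right for a
    left eye. All numeric values are placeholders for illustration.
    """
    cx, cy = disc_center
    dx = -temporal_offset if eye == "right" else temporal_offset
    return (cx + dx, cy, size, size)
```

The returned window would then be handed to the focus detection range determination unit 21b in place of a full-image search.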
According to the present exemplary embodiment, only the optic disc N, on which pattern recognition is easily performed, is detected. The other regions in the fundus Er are then predicted from the detection result and determined as the focus detection range. The specific region in the fundus Er and the focus detection range may therefore be displaced due to individual differences. In such a case, the examiner uses the focus detection range correction unit 21e to manually correct the position and the size of the focus detection range, so that a fundus image in which the region desired by the examiner is correctly focused can be acquired.
As described above, the large and medium blood vessels V or the yellow spot Y are identified and set as the focus detection range by detecting the optic disc N, which can be easily pattern-recognized, and by detecting whether the right or left eye is observed. The calculation load and the calculation time are thus reduced, so that high-speed automatic focusing can be realized.
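Restricting the AF evaluation to a small focus detection range is what saves calculation time. The sketch below shows one plausible contrast-based evaluation over such a range and a simple focus sweep; the patent does not specify the exact AF evaluation formula, so the gradient-sum measure and the function names are illustrative assumptions:

```python
import numpy as np

def af_evaluation(image, roi):
    """Sum of absolute horizontal intensity differences inside the focus
    detection range roi = (x0, y0, x1, y1). A sharper (better focused)
    image yields a larger value. Illustrative contrast measure only."""
    x0, y0, x1, y1 = roi
    patch = image[y0:y1, x0:x1].astype(np.float64)
    return float(np.abs(np.diff(patch, axis=1)).sum())

def focus_sweep(capture, positions, roi):
    """Step a hypothetical focus lens through candidate positions
    (capture(p) returns the image at position p) and return the position
    that maximizes the AF evaluation value within the range."""
    return max(positions, key=lambda p: af_evaluation(capture(p), roi))
```

Because only the pixels inside the detection range are touched, the cost per frame scales with the window area rather than the full sensor resolution, which is the speed-up the embodiment describes.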
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
This application claims priority from Japanese Patent Application No. 2009-201290 filed September 1, 2009, which is hereby incorporated by reference herein in its entirety.

Claims (13)

  1. A fundus camera comprising:
    a fundus illumination optical system configured to illuminate a fundus of an eye to be examined;
    a fundus imaging optical system having a focus lens which is driven to focus on a fundus;
    a focus lens driving unit configured to drive the focus lens;
    a fundus image pickup unit arranged at a position which is conjugate to a fundus with respect to the fundus imaging optical system;
    a display monitor configured to display a fundus image acquired by the fundus image pickup unit;
    a focus state detection unit configured to detect an AF evaluation value indicating a level of a focus state based on an output signal from the fundus image pickup unit; and
    a lens drive control unit configured to drive the focus lens based on the AF evaluation value detected by the focus state detection unit,
    wherein the focus state detection unit includes a fundus position detection unit configured to detect, from an output of the fundus image pickup unit, a specific region of a fundus image using a region pattern unique to a fundus region, and a focus detection range determination unit configured to determine a focus detection range based on an output from the fundus position detection unit, and calculates the AF evaluation value of the focus detection range determined by the focus detection range determination unit.
  2. The fundus camera according to claim 1, wherein the display monitor includes a focus detection range display unit configured to display the focus detection range determined by the focus detection range determination unit, superposing the focus detection range on a fundus image acquired by the fundus image pickup unit.
  3. The fundus camera according to claim 1, wherein the region pattern identifies an optic disc of a fundus.
  4. The fundus camera according to claim 1, wherein the region pattern identifies large and medium blood vessels of a fundus.
  5. The fundus camera according to claim 1, wherein the region pattern identifies a yellow spot of a fundus.
  6. The fundus camera according to claim 1, wherein the fundus position detection unit includes a plurality of the region patterns.
  7. The fundus camera according to claim 6, wherein the focus detection range determination unit includes a region selection unit configured to cause an examiner to select in advance from among the plurality of the region patterns.
  8. The fundus camera according to claim 6, wherein the focus detection range determination unit includes a focus detection range selection unit configured to display a plurality of the focus detection ranges identified by a plurality of the region patterns and to cause an examiner to select one or a plurality of focus detection ranges from the displayed plurality of focus detection ranges.
  9. The fundus camera according to claim 1, wherein the lens drive control unit drives the focus lens based on the AF evaluation value with respect to a plurality of the focus detection ranges identified by a plurality of the region patterns.
  10. The fundus camera according to claim 9, wherein the focus detection range determination unit includes a focus detection range narrowing unit configured to determine, based on the AF evaluation value with respect to a plurality of the focus detection ranges, one of the focus detection ranges among a plurality of the focus detection ranges.
  11. The fundus camera according to claim 1, further comprising a right / left eye detection unit configured to determine whether a left eye or a right eye is being observed from a position of a mount on which an optical system is mounted,
    wherein the fundus camera detects a position of large and medium blood vessels or a position of a yellow spot based on a position of an optic disc acquired by the fundus position detection unit and an output from the right / left eye detection unit.
  12. The fundus camera according to claim 1, wherein the focus detection range determination unit further includes a focus detection range correction unit configured to correct a position and size of the focus detection range.
  13. The fundus camera according to claim 1, further comprising an illumination light quantity control unit configured to adjust an illumination light quantity of an observation light source in the fundus illumination optical system,
    wherein the illumination light quantity control unit controls the illumination light quantity based on an output from the focus state detection unit.
PCT/JP2010/005332 2009-09-01 2010-08-30 Fundus camera WO2011027531A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US13/393,001 US20120154748A1 (en) 2009-09-01 2010-08-30 Fundus camera
EP10813483.4A EP2473093A4 (en) 2009-09-01 2010-08-30 Fundus camera
CN201080039000.2A CN102481097B (en) 2009-09-01 2010-08-30 Fundus camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009201290A JP5388765B2 (en) 2009-09-01 2009-09-01 Fundus camera
JP2009-201290 2009-09-01

Publications (1)

Publication Number Publication Date
WO2011027531A1 (en) 2011-03-10

Family

ID=43649087

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/005332 WO2011027531A1 (en) 2009-09-01 2010-08-30 Fundus camera

Country Status (5)

Country Link
US (1) US20120154748A1 (en)
EP (1) EP2473093A4 (en)
JP (1) JP5388765B2 (en)
CN (1) CN102481097B (en)
WO (1) WO2011027531A1 (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5539123B2 (en) * 2010-08-31 2014-07-02 キヤノン株式会社 Ophthalmologic photographing apparatus and photographing method using ophthalmic photographing apparatus
CN102959800B (en) 2010-09-07 2015-03-11 株式会社村田制作所 Antenna apparatus and communication terminal apparatus
JP5943785B2 (en) * 2012-09-12 2016-07-05 キヤノン株式会社 IMAGING DEVICE, IMAGING SYSTEM, IMAGE PROCESSING DEVICE, AND IMAGING DEVICE CONTROL METHOD
JP2014079392A (en) 2012-10-17 2014-05-08 Canon Inc Ophthalmology imaging apparatus
JP2014083376A (en) * 2012-10-26 2014-05-12 Canon Inc Ophthalmologic apparatus, and control method
JP2014083358A (en) * 2012-10-26 2014-05-12 Canon Inc Ophthalmologic apparatus, ophthalmology control method, and program
JP2014094118A (en) * 2012-11-09 2014-05-22 Canon Inc Ophthalmologic photography apparatus and method
JP2014113422A (en) * 2012-12-12 2014-06-26 Canon Inc Ophthalmological photographing apparatus, and control method and program of ophthalmological photographing apparatus
JP6296683B2 (en) * 2013-01-31 2018-03-20 キヤノン株式会社 Ophthalmic apparatus and control method
CN103353677B (en) 2013-06-28 2015-03-11 北京智谷睿拓技术服务有限公司 Imaging device and method thereof
CN103353667B (en) 2013-06-28 2015-10-21 北京智谷睿拓技术服务有限公司 Imaging adjustment Apparatus and method for
CN103353663B (en) 2013-06-28 2016-08-10 北京智谷睿拓技术服务有限公司 Imaging adjusting apparatus and method
CN103431840B (en) * 2013-07-31 2016-01-20 北京智谷睿拓技术服务有限公司 Eye optical parameter detecting system and method
CN103424891B (en) 2013-07-31 2014-12-17 北京智谷睿拓技术服务有限公司 Imaging device and method
CN103431980A (en) 2013-08-22 2013-12-11 北京智谷睿拓技术服务有限公司 Eyesight protection imaging system and method
CN103439801B (en) 2013-08-22 2016-10-26 北京智谷睿拓技术服务有限公司 Sight protectio imaging device and method
CN103500331B (en) 2013-08-30 2017-11-10 北京智谷睿拓技术服务有限公司 Based reminding method and device
CN103605208B (en) 2013-08-30 2016-09-28 北京智谷睿拓技术服务有限公司 content projection system and method
CN103558909B (en) 2013-10-10 2017-03-29 北京智谷睿拓技术服务有限公司 Interaction projection display packing and interaction projection display system
JP5777681B2 (en) * 2013-10-10 2015-09-09 キヤノン株式会社 Control apparatus and control method
CN110930446B (en) * 2018-08-31 2024-03-19 福州依影健康科技有限公司 Pretreatment method and storage device for quantitative analysis of fundus images
CN109889714A (en) * 2019-03-15 2019-06-14 杭州视辉科技有限公司 Eye-ground photography device and its judge the method that electric voltage exception and auto-focusing are taken pictures
JP7447902B2 (en) * 2019-07-16 2024-03-12 株式会社ニデック Ophthalmology imaging device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01178237A (en) * 1988-01-07 1989-07-14 Kowa Co Ophthalmic measuring apparatus
JPH07227380A (en) * 1994-02-21 1995-08-29 Canon Inc Eyeground camera
JPH11188006A (en) * 1997-12-26 1999-07-13 Topcon Corp Ophthalmic device

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19881541B3 (en) * 1997-09-17 2014-08-14 Kabushiki Kaisha Topcon Ophthalmic camera
EP1278452A4 (en) * 2000-04-14 2005-04-13 Fovioptics Inc Non-invasive measurement of blood components using retinal imaging
US20050010091A1 (en) * 2003-06-10 2005-01-13 Woods Joe W. Non-invasive measurement of blood glucose using retinal imaging
DE10313975B4 (en) * 2002-03-28 2007-08-23 Heidelberg Engineering Gmbh Procedure for examining the fundus
AU2003302746A1 (en) * 2002-12-19 2004-07-14 Christopher J. Kolanko Method for diagnosing a disease state using ocular characteristics
JP4359489B2 (en) * 2003-11-28 2009-11-04 株式会社ニデック Fundus camera
JP4377745B2 (en) * 2004-05-14 2009-12-02 オリンパス株式会社 Electronic endoscope
JP4628763B2 (en) * 2004-12-01 2011-02-09 株式会社ニデック Fundus camera
GB0517948D0 (en) * 2005-09-03 2005-10-12 Keeler Ltd Imaging apparatus, portable image capture device and method of assembling composite images from component images
JP4797522B2 (en) * 2005-09-08 2011-10-19 カシオ計算機株式会社 Imaging apparatus and program thereof
KR100806690B1 (en) * 2006-03-07 2008-02-27 삼성전기주식회사 Auto focusing method and auto focusing apparatus therewith
JP4869757B2 (en) * 2006-03-24 2012-02-08 株式会社トプコン Fundus observation device
EP2130486B1 (en) * 2008-06-02 2016-03-23 Nidek Co., Ltd. Ophthalmic Photographing Apparatus


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2473093A4 *

Also Published As

Publication number Publication date
EP2473093A1 (en) 2012-07-11
CN102481097A (en) 2012-05-30
CN102481097B (en) 2015-05-06
EP2473093A4 (en) 2015-08-19
JP2011050531A (en) 2011-03-17
US20120154748A1 (en) 2012-06-21
JP5388765B2 (en) 2014-01-15


Legal Events

Code Description
WWE: Wipo information: entry into national phase; Ref document number: 201080039000.2; Country of ref document: CN
121: Ep: the epo has been informed by wipo that ep was designated in this application; Ref document number: 10813483; Country of ref document: EP; Kind code of ref document: A1
WWE: Wipo information: entry into national phase; Ref document number: 13393001; Country of ref document: US
NENP: Non-entry into the national phase; Ref country code: DE
WWE: Wipo information: entry into national phase; Ref document number: 2010813483; Country of ref document: EP