EP2617351B1 - Image processing apparatus and method - Google Patents

Image processing apparatus and method

Info

Publication number
EP2617351B1
EP2617351B1 (application EP13151715.3A)
Authority
EP
European Patent Office
Prior art keywords
image
polarization
tomographic
display
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Not-in-force
Application number
EP13151715.3A
Other languages
German (de)
English (en)
Other versions
EP2617351A1 (fr)
Inventor
Yoshihiko Iwase
Tomoyuki Makihira
Makoto Sato
Kazuhide Miyata
Ritsuya Tomita
Hiroyuki Shinbata
Daisuke Kibe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc
Publication of EP2617351A1
Application granted
Publication of EP2617351B1
Not-in-force
Anticipated expiration

Classifications

    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0016 Operational features thereof
    • A61B3/0025 Operational features thereof characterised by electronic signal processing, e.g. eye models
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/102 Objective types for optical coherence tomography [OCT]
    • A61B3/12 Objective types for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/14 Arrangements specially adapted for eye photography
    • G01B9/00 Measuring instruments characterised by the use of optical techniques
    • G01B9/02 Interferometers
    • G06T1/00 General purpose image data processing

Definitions

  • the present invention relates to an image processing apparatus and an image processing method for processing an image of a subject.
  • the present invention relates to an image processing apparatus and an image processing method for processing a tomographic image of a subject's eye.
  • an ophthalmologic optical coherence tomography (OCT) apparatus can acquire, in addition to a normal OCT image of the shape of the fundus tissue, a polarization-sensitive OCT image using polarization parameters (i.e., retardation and orientation), which are optical characteristics of the fundus tissue.
  • the polarization-sensitive OCT can form a polarization-sensitive OCT image using such polarization parameters, and can perform distinguishing and segmentation of the fundus tissue.
  • PS-OCT uses the fact that some layers in the retina (or fundus) of an eye reflect polarized light differently from other layers.
  • International Publication No. WO 2010/122118 A1 discusses a polarization-sensitive OCT which employs a light beam that has been modulated to a circularly-polarized beam as a measuring beam for examining the sample.
  • Detection is then performed by splitting the interference beam into two linearly-polarized beams perpendicular to each other, so that the polarization-sensitive OCT image is generated with different layers in the fundus being imaged differently depending on their interaction with the polarized light.
  • United States patent publication US2007/115481A1 discloses a method/system which preserves annotations of different pathological conditions or changes that are recognized on cross-sections within a three dimensional volume of a patient's eye so that the annotations are maintained in a visible state in an en face projection produced with a SVP technique.
  • a missing portion of a retinal pigment epithelium (RPE) layer is recognizable in the polarization-sensitive OCT image of the subject's eye when the eye has a disease causing such a missing portion. It is desirable to have an image processing apparatus capable of providing useful information for diagnosis with respect to such a missing portion.
  • an image processing apparatus as specified in claims 1 to 8.
  • an image processing method as specified in claims 9 and 10.
  • a computer-readable storage medium as specified in claim 11.
  • the position corresponding to the missing portion (i.e., a discontinuous portion in the predetermined layer) can be displayed on an image of the fundus, so that useful information for diagnosis can be provided.
  • An image processing apparatus is capable of displaying, on an image of the fundus (a fundus image), a position corresponding to a discontinuous portion in a predetermined layer (e.g., a missing portion in an RPE layer).
  • the image processing apparatus can thus provide useful information for diagnosis.
  • Fig. 11 illustrates display forms 1102 (i.e., two-dimensional regions), indicating the discontinuity in the RPE layer of the subject's eye (as seen in the tomographic image 1103), superimposed on a fundus image 1101.
  • the display forms 1102 are superimposed on the fundus image 1101 at the positions corresponding to discontinuous regions 1104 in the RPE layer 1105 of the subject's eye in the tomographic image 1103 indicating the polarization state (i.e., the polarization-sensitive OCT image).
  • a tomographic image may be referred to as a polarization-sensitive tomographic image or a tomographic image indicating a polarization state in order to differentiate it from a tomographic intensity image or a conventional tomographic image (discussed below).
  • it is desirable to identify a three-dimensional layer region as the predetermined layer (for example, the RPE layer). Further, it is desirable to identify three-dimensional discontinuous regions as the discontinuous portions, based on the three-dimensional layer region.
  • the user can then easily confirm the missing portions by associating them with each other, using an image obtained by superimposing the display form 1105 indicating the RPE layer on the tomographic intensity image 1103 and an image obtained by superimposing the display forms 1102 indicating the discontinuity on the fundus image 1101, displayed side by side.
  • the acquisition position of the tomographic image indicating the polarization state and the acquisition position of the tomographic intensity image are the same position within the subject's eye.
  • the positions of the acquired tomographic image indicating the polarization state and the acquired tomographic intensity image may be adjusted as they are acquired, or afterwards using post-imaging processing, and the display forms 1102 and 1105 may then be superimposed.
  • the fundus image 1101 may be superimposed on a fundus photograph 1106, the latter being a color fundus image captured by a fundus camera, in a display area of a display screen on a monitor.
  • the fundus image 1101 is an image of a narrower range as compared to the fundus photograph 1106. It is desirable that the fundus photograph 1106 and the fundus image 1101 are aligned based on a characteristic region such as a blood vessel.
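One simple way to perform a vessel-based alignment like the one described above is phase correlation on a common crop of the two images; the following is a sketch under that assumption (the function and variable names are illustrative, not from the patent — shared structures such as blood vessels are what make the correlation peak):

```python
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Estimate the (dy, dx) translation that maps img_b onto img_a
    using phase correlation. Hypothetical helper, not the patent's
    method."""
    f = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
    r = np.fft.ifft2(f / (np.abs(f) + 1e-12))  # normalized cross-power
    dy, dx = np.unravel_index(np.argmax(np.abs(r)), r.shape)
    # Convert wrapped indices to signed shifts.
    if dy > img_a.shape[0] // 2:
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return int(dy), int(dx)

a = np.zeros((32, 32))
a[10:14, 8:12] = 1.0                             # vessel-like bright patch
b = np.roll(np.roll(a, 3, axis=0), -2, axis=1)   # shifted copy
print(phase_correlation_shift(b, a))  # (3, -2)
```

In practice the recovered shift would be applied to place the narrower SLO fundus image at the correct position within the wider fundus photograph.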
  • the image processing apparatus includes a designation unit, such as a mouse, for the user to designate one of the discontinuous portions.
  • the portion designated by the designation unit may be one of a plurality of discontinuous portions 1102 in the fundus image 1101.
  • the user can easily confirm the position of the corresponding missing portion in both images even when there is a plurality of missing portions in the image.
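As a sketch of the idea above, assuming the RPE has already been segmented in the polarization-sensitive volume (the segmentation itself is outside this sketch, and all names are illustrative), the discontinuous A-scan positions can be projected into an en-face map suitable for superimposition on a fundus image:

```python
import numpy as np

def rpe_discontinuity_map(rpe_mask):
    """rpe_mask: boolean volume (B-scans x A-scans x depth) marking
    pixels segmented as RPE. Returns a 2D en-face map that is True
    wherever an A-scan contains no RPE pixel, i.e. a candidate
    discontinuous (missing) position."""
    has_rpe = rpe_mask.any(axis=2)  # True if any depth pixel is RPE
    return ~has_rpe

# Toy volume: RPE at depth index 5, with a 2x2 missing patch.
vol = np.zeros((4, 6, 8), dtype=bool)
vol[:, :, 5] = True
vol[1:3, 2:4, 5] = False
missing = rpe_discontinuity_map(vol)
print(int(missing.sum()))  # 4
```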
  • the fundus image 1101 may be a fundus image indicating the polarization state (i.e. a polarization-sensitive fundus image), such as a retardation map or a birefringence map (to be described below) instead of a fundus intensity image as indicated above.
  • An imaging apparatus according to the present exemplary embodiment is applicable to a subject such as an eye, skin, or an internal organ. Further, the imaging apparatus according to the present exemplary embodiment may be an ophthalmologic apparatus or an endoscope. Methods and apparatuses according to preferred embodiments that achieve the effects described above will be described below. Specifically, the manner in which the polarization-sensitive fundus image (and from it the fundus intensity image), as well as the polarization-sensitive tomographic images (and from them the tomographic intensity image), are acquired will be described.
  • Fig. 1 is a schematic diagram illustrating an ophthalmologic apparatus, which is an example of the imaging apparatus according to the present exemplary embodiment. At least a portion of a signal-processing unit 190 to be described below may be considered as an image processing apparatus. Alternatively, the signal-processing unit 190 in conjunction with the ophthalmologic apparatus may be regarded as an image processing apparatus.
  • the ophthalmologic apparatus includes a polarization-sensitive OCT (PS-OCT) 100, a polarization-sensitive scanning laser ophthalmoscope (PS-SLO) 140, an anterior segment imaging unit 160, an internal fixation lamp 170, and a control unit 200.
  • the ophthalmologic apparatus is aligned by lighting the internal fixation lamp 170 and causing the subject's eye to gaze at the internal fixation lamp 170, and using the image of the anterior segment of the eye captured by the anterior segment imaging unit 160. After completing the alignment, the PS-OCT 100 and the PS-SLO 140 perform imaging of the fundus.
  • the configuration of the PS-OCT 100 will be described below.
  • the light source 101 may be, for example, a super luminescent diode (SLD) or an amplified spontaneous emission (ASE) light source. The light emitted from the light source 101 is guided by a polarization maintaining (PM) fiber 102 and a polarization controller 103 to a fiber coupler 104 having a polarization preserving function.
  • the beam is then split into a measuring beam (hereinafter referred to as a "measuring beam for a tomographic image" or an "OCT measuring beam") and a reference beam corresponding to the measuring beam.
  • the polarization controller 103 adjusts the polarization state of the beam emitted from the light source 101, and adjusts the beam to a linearly-polarized beam.
  • a branching ratio of the fiber coupler 104 is 90 (reference beam) : 10 (measuring beam).
  • the measuring beam is output from a collimator 106 via a PM fiber 105 as a parallel beam.
  • the output measuring beam reaches a dichroic mirror 111 via an X scanner 107, lenses 108 and 109, and a Y scanner 110.
  • the X scanner 107 includes a galvano mirror that scans the measuring beam in a horizontal direction on a fundus Er
  • the Y scanner 110 includes a galvano mirror that scans the measuring beam in a vertical direction on the fundus Er.
  • the X scanner 107 and the Y scanner 110 are controlled by a drive control unit 180, and are capable of scanning the measuring beam in a desired range on the fundus Er.
  • the range in which the measuring beam is scanned on the fundus may be considered as an irradiation position of the measuring beam and as an acquisition range of the tomographic image including an acquisition position of the tomographic image.
  • the X scanner 107 and the Y scanner 110 are examples of a scanning unit for PS-OCT, and may be configured as a common XY scanner.
  • the dichroic mirror 111 reflects light having wavelengths of 800 nm to 900 nm, and transmits light of other wavelengths.
  • the measuring beam reflected off the dichroic mirror 111 passes, via a lens 112, through a λ/4 polarizing plate 113 arranged to be inclined at an angle of 45° from the P-polarization and S-polarization directions, with the optical axis as the rotational axis.
  • the phase of the beam is thus shifted by 90°, and the beam is polarized into a circularly-polarized beam.
  • the λ/4 polarizing plate 113 is an example of a polarization adjustment member for adjusting the polarization state of the measuring beam.
  • the λ/4 polarizing plate 113 may be disposed in a common optical path between a portion of the PS-OCT optical system and a portion of the PS-SLO optical system.
  • variation in the polarization states generated in the images acquired by the PS-OCT optical system and the PS-SLO optical system can be comparatively reduced.
  • the scanning unit for the PS-SLO and the scanning unit for the PS-OCT are arranged in mutually-conjugate positions, and may be arranged to be conjugate with the pupil in the subject's eye.
  • the inclination of the λ/4 polarizing plate 113 is an example of a state of the λ/4 polarizing plate 113, and is an angle from a predetermined position in the case where the optical axis of the polarizing beam splitting surface of a fiber coupler 123, which includes a polarizing beam splitter, is the rotational axis.
  • the λ/4 polarizing plate 113 may be inserted into and removed from the optical path.
  • the λ/4 polarizing plate 113 may be mechanically configured to rotate around the optical axis, or an axis parallel to the optical axis, as the rotational axis.
  • the beam incident on the subject's eye is thus polarized into a circularly-polarized beam by arranging the λ/4 polarizing plate 113 to be inclined at an angle of 45°.
  • however, the beam may not become a circularly-polarized beam on the fundus Er due to the characteristics of the subject's eye.
  • to address this, the drive control unit 180 can perform control to finely adjust the inclination of the λ/4 polarizing plate 113.
  • a focus lens 114 mounted on a stage 116 focuses, on layers in a retina in the fundus Er via an anterior segment Ea of the subject's eye, the measuring beam polarized to a circularly-polarized beam.
  • the measuring beam irradiating the fundus Er is reflected and scattered by each layer in the retina, and returns to the fiber coupler 104 via the above-described optical path.
  • the reference beam branched by the fiber coupler 104 is output as a parallel beam from a collimator 118 via a PM fiber 117.
  • the output reference beam is polarized by a λ/4 polarizing plate 119 arranged to be inclined at an angle of 22.5° from the P-polarization to the S-polarization with the optical axis as the rotational axis, similarly to the measuring beam.
  • the λ/4 polarizing plate 119 is an example of a polarization adjustment member for adjusting the polarization state of the reference beam.
  • the reference beam is reflected, via a dispersion compensation glass 120, by a mirror 122 mounted on a coherence gate stage 121, and returns to the fiber coupler 104.
  • the reference beam passes through the λ/4 polarizing plate 119 twice, so that a linearly-polarized beam returns to the fiber coupler 104.
  • the coherence gate stage 121 is controlled by the drive control unit 180 to deal with differences in an axial length of the subject's eye.
  • the coherence gate is the position corresponding to an optical path length of the reference beam in the optical path of the measuring beam.
  • the optical path length of the reference beam is changeable. However, it is not limited thereto, as long as the difference in the optical path lengths of the measuring beam and the reference beam can be changed.
  • the return beam and the reference beam that have returned to the fiber coupler 104 are combined into an interference beam (also referred to as a combined beam).
  • the interference beam is incident on the fiber coupler 123 including the polarizing beam splitter and is split at the branching ratio of 50:50 into a P-polarized beam and an S-polarized beam of different polarization directions.
  • the P-polarized beam is dispersed by a grating 131 via a PM fiber 124 and a collimator 130, and is received by a lens 132 and a line camera 133.
  • the S-polarized beam is similarly dispersed by a grating 127 via a PM fiber 125 and a collimator 126, and is received by a lens 128 and a line camera 129.
  • the gratings 127 and 131 and the line cameras 129 and 133 are arranged to match the respective polarization directions.
  • the beam received by each of the line cameras 129 and 133 is output as an electrical signal corresponding to the light intensity.
  • the signal-processing unit 190 which is an example of a tomographic image generation unit, then receives the output electrical signals.
  • the inclinations of the λ/4 polarizing plates 113 and 119 can be automatically adjusted based on the inclination of the polarizing beam splitting surface of the polarizing beam splitter.
  • the inclinations of the λ/4 polarizing plates 113 and 119 can also be automatically adjusted with respect to a virtual line connecting the centers of the optic disk and the macula lutea in the fundus.
  • it is desirable to provide an inclination detection unit (not illustrated) to detect the inclinations of the λ/4 polarizing plates 113 and 119.
  • the inclination detection unit can detect the current inclination and detect whether the inclination has reached a predetermined inclination.
  • the inclinations of the λ/4 polarizing plates 113 and 119 can be detected based on the intensity of the received light, and the inclinations can be adjusted so that a predetermined intensity is reached. Furthermore, an object indicating the inclination may be displayed on a graphical user interface (GUI), and the user may adjust the inclination using a mouse. Moreover, a similar result can be acquired by adjusting the polarizing beam splitter and the λ/4 polarizing plates 113 and 119 based on the vertical direction as a polarization basis.
  • a light source 141, i.e., a semiconductor laser, emits a light beam having a central wavelength of 780 nm.
  • the measuring beam emitted from the light source 141 (hereinafter referred to as a measuring beam for a fundus image, or an SLO measuring beam) is transmitted via a PM fiber 142 to a polarizing controller 145 where it is converted into a linearly-polarized beam, and is output from a collimator 143 as a parallel beam.
  • the output measuring beam then passes through a perforated portion of a perforated mirror 144, and reaches a dichroic mirror 154 via a lens 155, an X scanner 146, lenses 147 and 148, and a Y scanner 149.
  • the X scanner 146 includes a galvano mirror that scans the measuring beam in the horizontal direction on the fundus Er
  • the Y scanner 149 includes a galvano mirror that scans the measuring beam in the vertical direction on the fundus Er.
  • the X scanner 146 and the Y scanner 149 are controlled by the drive control unit 180, and are capable of scanning the measuring beam in the desired range on the fundus Er.
  • the X scanner 146 and the Y scanner 149 are examples of a scanning unit for the PS-SLO, and may be configured as a common XY scanner.
  • the dichroic mirror 154 reflects light having wavelengths of 760 nm to 800 nm, and transmits light of other wavelengths.
  • the linearly-polarized measuring beam reflected by the dichroic mirror 154 reaches the fundus Er via the optical path similar to that of the PS-OCT 100.
  • the measuring beam irradiating the fundus Er is reflected and scattered by the fundus Er, and reaches the perforated mirror 144 via the above-described optical path.
  • the beam reflected by the perforated mirror 144 is then split by a polarizing beam splitter 151 via the lens 150 into beams of different polarization directions (i.e., according to the present exemplary embodiment, split into a P-polarized beam and an S-polarized beam).
  • the split beams are received by respective avalanche photodiodes (APDs) 152 and 153, converted into electrical signals, and received by the signal-processing unit 190, i.e., an example of the fundus image generation unit.
  • the position of the perforated mirror 144 is conjugate with the position of the pupil in the subject's eye.
  • the perforated mirror 144 reflects the light that has passed through a peripheral region of the pupil among the light reflected and scattered by the fundus Er irradiated with the measuring beam.
  • both the PS-OCT and the PS-SLO use the PM fiber.
  • a similar configuration and effect may be acquired by using a single mode fiber (SMF) in the case where the polarizing controller controls polarization.
  • the anterior segment imaging unit 160 will be described below.
  • the anterior segment imaging unit 160 irradiates the anterior segment Ea using an irradiation light source 115 including light emitting diodes (LEDs) 115-a and 115-b, which emit irradiation light having a wavelength of 1000 nm.
  • the light reflected by the anterior segment Ea reaches a dichroic mirror 161 via the lens 114, the polarizing plate 113, the lens 112 and the dichroic mirrors 111 and 154.
  • the dichroic mirror 161 reflects light having wavelengths of 980 nm to 1100 nm, and transmits light of other wavelengths.
  • the light reflected by the dichroic mirror 161 is then received by an anterior segment camera 165 via lenses 162, 163, and 164.
  • the light received by the anterior segment camera 165 is converted into an electrical signal and is received by the signal-processing unit 190.
  • the internal fixation lamp 170 will be described below.
  • the internal fixation lamp 170 includes an internal fixation lamp display unit 171 and a lens 172.
  • a plurality of LEDs arranged in a matrix is used as the internal fixation lamp display unit 171.
  • the lighting position of the LEDs is changed by control performed by the drive control unit 180 according to the region to be imaged.
  • the light emitted from the internal fixation lamp display unit 171 is guided to the subject's eye via the lens 172.
  • the internal fixation lamp display unit 171 emits light having a wavelength of 520 nm, and the drive control unit 180 displays a desired pattern.
  • a control unit 200 for controlling the entire apparatus according to the present exemplary embodiment will be described below.
  • the control unit 200 includes the drive control unit 180, the signal-processing unit 190, a display control unit 191, and a display unit 192.
  • the drive control unit 180 controls each unit as described above.
  • the signal-processing unit 190 generates images based on the signals output from the line cameras 129 and 133, the APDs 152 and 153, and the anterior segment camera 165, analyzes the generated images, and generates visualization information of the analysis results.
  • the image generation process will be described in detail below.
  • the display control unit 191 displays, on a display screen in the display unit 192, the images generated by a tomographic image generation unit and a fundus image generation unit and acquired by a fundus image acquisition unit (not illustrated) and a tomographic image acquisition unit (not illustrated).
  • the display unit 192 may be a liquid crystal display.
  • the image data generated by the signal-processing unit 190 may be transmitted to the display control unit 191 via wired or wireless communication.
  • the display control unit 191 may be considered as the image processing apparatus.
  • the fundus image acquisition unit may include the SLO optical system
  • the tomographic image acquisition unit may include the OCT optical system, as the respective imaging system.
  • the fundus image (i.e., a fundus intensity image) may also be referred to as a planar image, and the fundus image acquisition unit as a planar image acquisition unit.
  • the display unit 192 displays display forms indicating various types of information to be described below based on control performed by the display control unit 191.
  • Image data from the display control unit 191 may be transmitted to the display unit 192 via wired or wireless communication.
  • the display unit 192 is included in the control unit 200. However, it is not limited thereto, and the display unit 192 may be separated from the control unit 200.
  • a tablet, which is an example of a portable device, configured by integrating the display control unit 191 and the display unit 192, may be used. In such a case, it is desirable to include a touch panel function in the display unit, so that the user can operate the touch panel to move the display position of the images, enlarge and reduce the images, and change the images to be displayed.
  • Image generation and image analysis processes performed in the signal-processing unit 190 will be described below.
  • the signal-processing unit 190 performs, on the interference signals output from each of the line cameras 129 and 133, reconstruction processing employed in common spectral-domain OCT (SD-OCT).
  • the signal-processing unit 190 thus generates a tomographic image corresponding to a first polarized beam and a tomographic image corresponding to a second polarized beam, i.e., two tomographic images each based on a polarization component.
  • the signal-processing unit 190 performs fixed pattern noise cancellation on the interference signals.
  • the fixed pattern noise cancellation is performed by averaging a plurality of detected A-scan signals to extract the fixed pattern noise, and subtracting the extracted fixed pattern noise from the input interference signal.
  • the signal-processing unit 190 then transforms the interference signal from wavelength to wave number, and performs a Fourier transform on the signal to generate a tomographic signal indicating the polarization state.
  • the signal-processing unit 190 performs the above-described process for the interference signals of the two polarization components, and thus generates the two tomographic images.
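The reconstruction steps above (fixed pattern noise cancellation, wavelength-to-wave-number resampling, Fourier transform) can be sketched for one polarization channel as follows; the array shapes and names are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def reconstruct_ascans(spectra, wavelengths):
    """spectra: (n_ascans, n_pixels) raw line-camera data for one
    polarization channel, sampled evenly in wavelength."""
    # 1) Fixed pattern noise: average the detected A-scan spectra
    #    and subtract the average from each spectrum.
    spectra = spectra - spectra.mean(axis=0)
    # 2) Resample from even spacing in wavelength to even spacing
    #    in wave number k = 2*pi/lambda (k decreases with lambda,
    #    so both axes are reversed for np.interp).
    k = 2 * np.pi / wavelengths
    k_even = np.linspace(k.min(), k.max(), k.size)
    resampled = np.array([np.interp(k_even, k[::-1], row[::-1])
                          for row in spectra])
    # 3) Fourier transform along the spectral axis yields the
    #    complex tomographic signal per A-scan.
    return np.fft.fft(resampled, axis=1)

rng = np.random.default_rng(0)
wavelengths = np.linspace(800e-9, 900e-9, 256)  # source band per the patent
raw = rng.normal(size=(16, 256))                # placeholder spectra
ascans = reconstruct_ascans(raw, wavelengths)
print(ascans.shape)  # (16, 256)
```

Running the same pipeline on the second line camera's signals yields the second polarization channel.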
  • the signal-processing unit 190 aligns the signals output from the APDs 152 and 153 in synchronization with the driving of the X scanner 146 and the Y scanner 149.
  • the signal-processing unit 190 thus generates a fundus image corresponding to the first polarized beam and a fundus image corresponding to the second polarized beam, i.e., two fundus images each based on a polarization component.
  • the signal-processing unit 190 generates a tomographic intensity image from the above-described two tomographic signals.
  • the tomographic intensity image is basically the same as the tomographic image in the conventional OCT.
  • the signal-processing unit 190 similarly generates a fundus intensity image from the two fundus images.
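The patent does not reproduce the combining equation here, but in the PS-OCT literature the intensity image is commonly formed from the two channel amplitudes as sqrt(AH² + AV²); a minimal sketch under that assumption:

```python
import numpy as np

def intensity_image(a_h, a_v):
    """Combine the two polarization-channel tomographic signals into
    an intensity image (standard PS-OCT form, assumed here)."""
    return np.sqrt(np.abs(a_h) ** 2 + np.abs(a_v) ** 2)

# A pixel with channel amplitudes 3 and 4 has intensity 5.
print(intensity_image(np.array([3.0]), np.array([4.0])))  # [5.]
```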
  • Fig. 2A illustrates an example of the tomographic intensity image of the optic disk.
  • the display control unit 191 may display, on the display unit 192, the tomographic intensity image acquired employing the conventional OCT technique, or the fundus intensity image acquired employing the conventional SLO technique.
  • the signal-processing unit 190 generates a retardation image from the tomographic images of the polarization components that are perpendicular to each other.
  • a value δ of each pixel in the retardation image is a value indicating the ratio of the effects received by the vertical polarization component and the horizontal polarization component in the subject's eye, at the position of each pixel of the tomographic image.
  • Fig. 2B illustrates an example of the retardation image of the optic disk generated as described above.
  • the retardation image can be acquired by calculating equation (2) for each B-scan image.
  • the retardation image is a tomographic image indicating the difference of the effect received by the two polarized beams in the subject's eye.
  • the values indicating the above-described ratio are displayed as a color tomographic image. A darker-shaded portion indicates that the value of the ratio is small, and a lighter-shaded portion indicates that the value of the ratio is large.
  • generating the retardation image enables the recognition of a layer in which there is birefringence.
  • for details, refer to "E. Gotzinger et al., Opt. Express 13, 10217, 2005".
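In that literature the per-pixel retardation is computed as δ = arctan(AV/AH); the following numeric sketch assumes this standard form stands in for the patent's equation (2), which is not reproduced on this page:

```python
import numpy as np

def retardation(a_h, a_v):
    """Per-pixel retardation, in degrees, from the amplitudes of the
    two polarization channels (standard PS-OCT form, assumed here)."""
    return np.degrees(np.arctan2(np.abs(a_v), np.abs(a_h)))

# Equal amplitudes in both channels give 45 degrees.
print(retardation(np.array([1.0]), np.array([1.0])))  # [45.]
```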
  • the signal-processing unit 190 can similarly generate a retardation image in a planar direction of the fundus based on the outputs from the APDs 152 and 153.
  • the signal-processing unit 190 generates a retardation map from the retardation images acquired with respect to a plurality of B-scan images.
  • the signal-processing unit 190 detects a retinal pigment epithelium (RPE) in each B-scan image. Since the RPE cancels polarization, the signal-processing unit 190 searches for a retardation distribution of each A-scan along the depth direction in the range from an inner limiting membrane (ILM) without including the RPE. The signal-processing unit 190 then sets a maximum value of the retardation as a representative value of the retardation in the A-scan.
  • the signal-processing unit 190 performs the above-described process on all retardation images, and thus generates the retardation map.
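A minimal sketch of the map-building step described above, assuming the retardation volume and a per-A-scan RPE depth are already available (the array layout and the use of depth 0 as a stand-in for the ILM are illustrative assumptions):

```python
import numpy as np

def retardation_map(ret_volume, rpe_depth):
    """ret_volume: retardation per pixel (B-scans x A-scans x depth).
    rpe_depth: depth index of the RPE per A-scan. For each A-scan,
    the maximum retardation above the RPE (excluding the RPE itself,
    since the RPE cancels polarization) is the representative value."""
    n_b, n_a, _ = ret_volume.shape
    out = np.zeros((n_b, n_a))
    for b in range(n_b):
        for a in range(n_a):
            out[b, a] = ret_volume[b, a, :rpe_depth[b, a]].max()
    return out

ret = np.zeros((1, 2, 10))
ret[0, 0, 3] = 30.0       # retardation peak above the RPE
ret[0, 0, 8] = 80.0       # below the RPE boundary: must be ignored
rpe = np.full((1, 2), 6)  # RPE at depth index 6 everywhere
print(retardation_map(ret, rpe))  # [[30.  0.]]
```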
  • Fig. 2C illustrates an example of the retardation map of the optic disk.
  • the darker-shaded portion indicates that the value of the ratio AV:AH is small, and the lighter-shaded portion indicates that the value of the ratio is large.
  • the layer having birefringence in the optic disk is a retinal nerve fiber layer (RNFL).
  • the ratio AV:AH that gives rise to the value of δ is affected by the birefringence in the RNFL and the thickness of the RNFL.
  • a birefringent layer that allows a beam with a first polarization to pass but inhibits a beam with a second polarization from passing will give a detected intensity value that is much greater for the first beam than for the second beam.
  • A_V, for example, may be much greater than A_H, giving a much higher value of A_V/A_H than if the layer did not have a birefringent property.
  • the value (δ) indicating the ratio becomes large where the RNFL is thick, and small where the RNFL is thin. The thickness of the RNFL over the entire fundus thus becomes recognizable from the retardation map, which can be used in the diagnosis of glaucoma.
  • the signal-processing unit 190 performs linear approximation of the value of the retardation δ in the range from the ILM to the RNFL (the first two anterior layers of the retina) in each A-scan image of the previously generated retardation image.
  • the signal-processing unit 190 determines the acquired slope as the birefringence at the position on the retina in the A-scan image. In other words, since retardation is a product of a distance and the birefringence in the RNFL, a linear relation is acquired by plotting the depth and the value of the retardation in each A-scan image.
  • the signal-processing unit 190 performs the above-described process on all of the acquired retardation images, and generates the map representing the birefringence.
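The linear-fit step can be sketched for a single A-scan. A minimal sketch: the function name and the unit choices are illustrative, and it assumes the retardation samples over the ILM-to-RNFL range have already been extracted.

```python
import numpy as np

def birefringence_from_ascan(depths_um, retardations_deg):
    """Estimate birefringence for one A-scan by a straight-line fit.

    Because retardation is the product of distance and birefringence in the
    RNFL, plotting retardation against depth over the ILM-to-RNFL range gives
    a linear relation; the slope of the least-squares fit is taken as the
    birefringence at that A-scan position.
    """
    slope, _intercept = np.polyfit(depths_um, retardations_deg, 1)
    return slope
```

Applying this to every A-scan of every retardation image produces the birefringence map of Fig. 2D.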
  • Fig. 2D illustrates an example of the birefringence map of the optic disk.
  • the birefringence map directly maps the values of the birefringence.
  • the signal-processing unit 190 calculates a Stokes vector S for each pixel from the acquired tomographic signals A_H and A_V, and the phase difference ΔΦ between the tomographic signals A_H and A_V, using equation (3).
  • the signal-processing unit 190 sets, in each B-scan image, a window measuring approximately 70 μm in the main scanning direction of the measuring beam and 18 μm in the depth direction.
  • the signal-processing unit 190 then averages each element of the Stokes vector calculated for each of the pixels within each window, and calculates the degree of polarization uniformity (DOPU) within the window using equation (4).
  • DOPU = √(Q_m² + U_m² + V_m²)
  • where Q_m, U_m, and V_m are the values acquired by averaging the elements Q, U, and V of the Stokes vector within each window.
  • the signal-processing unit 190 performs the above-described process for all windows in the B-scan images, and generates a DOPU image of the optic disk as illustrated in Fig. 2E .
  • the DOPU image is a tomographic image indicating the degree of polarization uniformity of the two types of polarization.
  • DOPU is a value indicating the uniformity of polarization, and becomes close to "1" when polarization is preserved, and smaller than "1" when polarization is cancelled or not preserved. Since the RPE in the structure of the retina cancels the polarization state, the value of the DOPU in the portion corresponding to the RPE in the DOPU image becomes lower than the values in the other portions. Referring to Fig. 2E , the lighter-shaded portion indicates the RPE.
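The windowed averaging of equation (4) can be sketched as follows. Assumptions not taken from the patent: the window sizes are given in pixels (the approximately 70 μm × 18 μm window would be converted via the scan geometry), the windows are tiled rather than sliding, the Stokes elements are already normalized by intensity, and the function name and array layout are illustrative.

```python
import numpy as np

def dopu_image(Q, U, V, win_z, win_x):
    """DOPU over non-overlapping windows of one B-scan.

    Q, U, V: 2-D arrays (depth x width) of intensity-normalized Stokes
    elements per pixel. Within each window the elements are averaged to
    Qm, Um, Vm, and DOPU = sqrt(Qm^2 + Um^2 + Vm^2) is assigned to that
    window. DOPU stays near 1 where polarization is preserved and falls
    below 1 where it is scrambled, e.g. at the RPE.
    """
    nz, nx = Q.shape[0] // win_z, Q.shape[1] // win_x
    out = np.empty((nz, nx))
    for i in range(nz):
        for j in range(nx):
            sl = np.s_[i * win_z:(i + 1) * win_z, j * win_x:(j + 1) * win_x]
            qm, um, vm = Q[sl].mean(), U[sl].mean(), V[sl].mean()
            out[i, j] = np.sqrt(qm * qm + um * um + vm * vm)
    return out
```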
  • the DOPU image visualizes a layer, such as the RPE, that cancels polarization, so the image of the RPE can be acquired more reliably than from the change in intensity, even when the RPE is deformed due to a disease.
  • the signal-processing unit 190 can similarly generate a DOPU image in the planar direction of the fundus based on the outputs from the APD 152 and 153.
  • the above-described tomographic images corresponding to the first and second polarized beams ( Fig. 2A ), the retardation image ( Fig. 2B ), and the DOPU image ( Fig. 2E ) will be referred to as tomographic images indicating the polarization state.
  • the above-described retardation map ( Fig. 2C ) and the birefringence map ( Fig. 2D ) will also be referred to as fundus images indicating the polarization state.
  • the signal-processing unit 190 uses the above-described intensity image to perform segmentation of the tomographic image.
  • the signal-processing unit 190 applies, to the tomographic image to be processed, a median filter as a type of smoothing filter, and a Sobel filter as a type of edge detection method.
  • the signal-processing unit 190 thus generates respective images (hereinafter referred to as a median image and a Sobel image).
  • the signal-processing unit 190 then generates a profile for each A-scan from the generated median image and Sobel image.
  • the signal-processing unit 190 generates the profile of the intensity value from the median image and the profile of a gradient from the Sobel image.
  • the signal-processing unit 190 detects peaks in the profiles generated from the Sobel image. Further, the signal-processing unit 190 extracts the boundary of each layer in the retina by referring to the profiles of the median image corresponding to the regions before and after the detected peaks and the regions between the detected peaks.
  • the signal-processing unit 190 measures each layer thickness in the direction of the A-scan line, and generates a layer thickness map of each layer.
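The profile-based boundary detection can be sketched for a single A-scan. This is a simplified illustration, not the patent's implementation: it applies the median filter in 1-D per A-scan (the patent filters the whole B-scan), substitutes a 1-D central-difference kernel for the 2-D Sobel filter, and the function name and threshold parameter are illustrative.

```python
import numpy as np

def ascan_layer_boundaries(intensity, kernel=3, min_gradient=0.1):
    """Return candidate layer-boundary indices for one A-scan profile.

    The intensity profile is smoothed with a 1-D median filter, a gradient
    profile is taken with a central-difference kernel (positive at edges
    that brighten with depth), and local gradient peaks above min_gradient
    are returned as candidate boundaries.
    """
    half = kernel // 2
    padded = np.pad(intensity, half, mode='edge')
    median = np.array([np.median(padded[i:i + kernel])
                       for i in range(len(intensity))])
    # np.convolve with [1, 0, -1] yields grad[i] = median[i+1] - median[i-1]
    grad = np.convolve(median, [1, 0, -1], mode='same')
    return [i for i in range(1, len(grad) - 1)
            if grad[i] > min_gradient
            and grad[i] >= grad[i - 1] and grad[i] >= grad[i + 1]]
```

Layer thickness then follows from the distance between consecutive boundaries along each A-scan line.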
  • Fig. 3 is a flowchart illustrating the operation performed by the image-processing apparatus according to the present exemplary embodiment.
  • in step S101, the image-processing apparatus and the subject's eye positioned in front of the image-processing apparatus are aligned.
  • the process unique to the present exemplary embodiment with respect to performing alignment will be described below. Since alignment of the working distance in the X, Y, and Z directions, focusing, and adjustment of the coherence gate are common processes, their description will be omitted.
  • Fig. 4 illustrates a window 400 displayed on the display unit 192 when performing adjustment.
  • a display area 410, i.e., an example of a first display area, displays a fundus image 411 imaged by the PS-SLO 140 and generated by the signal-processing unit 190.
  • a frame 412 indicating an imaging range of the PS-OCT 100 is superimposed on the fundus image 411.
  • an operator sets the imaging range, under control of the drive control unit 180, by operating an instruction device (not illustrated), such as a mouse, to click and drag a cursor displayed on the window 400.
  • the operator designates the frame 412 using the cursor, and moves the frame 412 using the dragging operation.
  • the drive control unit 180 controls a drive angle of a scanner and sets the imaging range.
  • the mouse includes a sensor for detecting a movement signal when the user manually moves the mouse in two dimensions, left and right mouse buttons for detecting that the user has pressed a button, and a wheel mechanism between the two mouse buttons that is movable forward, backward, left, and right.
  • the display unit may include a touch panel function, and the operator may designate the acquisition position on the touch panel.
  • any type of instruction device may be used, such as a joystick, keyboard keys or touch screen.
  • indicators 413 and 414 are displayed for adjusting the angle of the λ/4 polarizing plate 113.
  • the angle of the λ/4 polarizing plate 113 is adjusted based on control of the drive control unit 180.
  • the indicator 413 is for instructing adjustment in a counterclockwise direction
  • the indicator 414 is for instructing adjustment in a clockwise direction.
  • a numerical value displayed beside the indicators 413 and 414 indicates the current angle of the λ/4 polarizing plate 113.
  • the display control unit 191 may display the indicator for adjusting the angle of the λ/4 polarizing plate 119 side by side with the indicator 413 on the display unit 192, or in place of the indicator 413.
  • the operator gives an instruction, by operating the mouse to move the cursor, so that the intensities of the tomographic images of each polarized beam, respectively displayed on a display area 430, i.e., an example of a third display area, and a display area 440, i.e., an example of a fourth display area, become the same.
  • a peak intensity value may be displayed along with tomographic images 431 and 441 of each polarized beam, or a waveform of each interference signal may be displayed, so that the operator performs adjustment while viewing the peak intensity value or the waveform.
  • the tomographic images 431 and 441 of each polarized beam are examples of tomographic images corresponding to the first polarized beam and the second polarized beam, respectively.
  • it is desirable to display the type of each image on the tomographic images 431 and 441 of each polarized beam (or the tomographic images 531 and 541 to be described below). For example, a letter "P" indicating the P-polarized beam and a letter "S" indicating the S-polarized beam may be displayed. Such a display prevents the user from misrecognizing the images.
  • the letters may be displayed above or beside the image instead of being superimposed on the image, as long as the display is arranged to be associated with the image.
  • a display area 420, i.e., an example of a second display area, may display any information.
  • for example, the current adjustment state, such as a message informing "adjusting λ/4 polarizing plate", may be displayed.
  • a display indicating patient information, such as whether the eye is a left eye or a right eye, or image-capturing information, such as an image-capturing mode, may be provided on the window 400.
  • even in a minimum-sized ophthalmologic apparatus, the display control unit 191 can display the fundus intensity image on the display area 410 and the tomographic image indicating the polarization state on the display area 420.
  • the acquisition position of the tomographic image indicating the polarization state may be determined in an initial setting for acquiring a center region of the fundus image indicating the polarization state.
  • adjustment can thus be performed simply, to acquire a tomographic image that accurately indicates the polarization state, and that is finer and corresponds to a narrower range than the fundus image indicating the polarization state.
  • the λ/4 polarizing plate may be automatically adjusted in response to completion of adjustment of the coherence gate, or in response to input of a signal for acquiring the image indicating the polarization state. Further, the λ/4 polarizing plate may be adjusted in advance on an initial setting screen when activating the ophthalmologic apparatus, so that the λ/4 polarizing plate does not need to be adjusted for each image capture.
  • if the λ/4 polarizing plate can be inserted into and removed from the optical path, it is desirable to perform adjustment in the following order: alignment adjustment using the anterior segment image or the luminescent spot in the cornea; focus adjustment using the SLO fundus image; coherence gate adjustment using the OCT tomographic image; and adjustment of the λ/4 polarizing plate after inserting the λ/4 polarizing plate into the optical path. Adjustment can thus be performed before acquiring the image indicating the polarization state, using the normal SLO fundus image and OCT tomographic image with which the user is intuitively familiar.
  • the coherence gate may also be adjusted using the tomographic image indicating the polarization state of the PS-OCT, by inserting the λ/4 polarizing plate after performing focus adjustment. In such a case, the λ/4 polarizing plate may be automatically inserted in response to completion of adjustment of the coherence gate, or in response to input of the signal for acquiring the image indicating the polarization state.
  • the focus may be finely adjusted using the OCT tomographic image after coarsely adjusting the focus using the SLO fundus image.
  • all of these adjustments may be performed automatically in the above-described order, or by the user moving the cursor to a slider corresponding to each type of adjustment displayed on the display unit and dragging it.
  • an icon for instructing insertion or removal of the λ/4 polarizing plate with respect to the optical path may be displayed on the display unit.
  • each of the light sources 101 and 141 emits the measuring beam.
  • the line cameras 129 and 133 and the APD 152 and 153 then receive the return beam, and the signal-processing unit 190 generates and analyzes each image as described above.
  • after the signal-processing unit 190 completes generating and analyzing each image, the display control unit 191 generates output information based on the result, and then outputs the output information to the display unit 192 for display.
  • Fig. 5 illustrates a display example on the display unit 192 according to the present exemplary embodiment.
  • a window 500 displayed on the display unit 192 includes display areas 510, 520, 530, and 540.
  • the display area 510, i.e., an example of the first display area, displays a fundus image 511, and a rectangular frame 512 indicating the position of the tomographic image is superimposed on the fundus image 511.
  • the fundus intensity image is displayed as the fundus image 511.
  • the fundus image may be generated based on a polarization signal.
  • the display area 520, i.e., an example of the second display area, displays a tomographic image 521. Further, the display area 520 displays buttons 522, 523, 524, and 525, i.e., examples of a selection unit for selecting the type of the tomographic image to be displayed. The user may select the type of the tomographic image from a menu instead of using the buttons 522, 523, 524, and 525. In the example illustrated in Fig. 5, the user has selected the button 522. By selecting different images to view, the user may be able to discern different information about the eye, determine where there are discontinuous portions in certain layers, and see in which layers these discontinuous portions lie.
  • the display area 530, i.e., an example of a third display area, and the display area 540, i.e., an example of a fourth display area, respectively display the tomographic images 531 and 541 of each polarized beam.
  • the display areas 530 and 540 may also display each fundus image based on each polarization signal from which the fundus image displayed on the display area 510 has been generated, according to an instruction by the operator via the menu.
  • it is desirable to identify the tomographic intensity image 521, and the retardation image 621 and DOPU image 721 to be described below, by superimposing a display form indicating the type of the image, such as the characters "intensity", "retardation", and "DOPU" shown at the top left of the respective display areas in Figs. 5, 6, and 7.
  • the type of the image may be displayed above or beside the image instead of being superimposed on the image, as long as the characters are arranged to be associated with the image.
  • the tomographic image displayed on the display area 520 can be changed to a retardation image 621 as illustrated in Fig. 6 .
  • the retardation image enables the user to discern discontinuities in the RNFL, for example.
  • the display areas 530 and 540 respectively display the tomographic images 531 and 541 similarly as in Fig. 5 .
  • the tomographic image displayed on the display area 520 can be changed to a DOPU image 721 as illustrated in Fig. 7 .
  • the DOPU image enables the RPE to be discerned and discontinuities in the RPE to be seen.
  • the display area 530 displays the intensity image 521
  • the display area 540 displays the retardation image 621. It is desirable to provide a button for selecting the image for each display area. The user thus becomes capable of easily selecting the images to be compared, such as a plurality of tomographic images indicating different polarization states.
  • the tomographic image displayed on the display area 520 can be changed to an image 821 indicating a segmentation result as illustrated in Fig. 8 .
  • color line segments indicating the layer boundaries are superimposed on the tomographic image and displayed in the image 821, and the RPE is highlighted (shown in the Figure as shaded with dots).
  • the layer selected by the operator using the cursor is highlighted; in the illustrated case this may be the RPE. Highlighting different layers in the retina enables the user to differentiate the layers, to determine whether there are discontinuous portions in the layers, and, if so, in which layers the discontinuous portions are located.
  • the display area 540 displays a tomographic image 841, used in performing segmentation, and buttons 842 and 843. If the operator selects the button 843 instead of button 842 as illustrated in Fig. 8 , the intensity image 841 can be switched to a graph 941, illustrated in Fig. 9 , indicating, for example, the layer thickness (and therefore any discontinuities in the thickness) of the highlighted layer.
  • thickness information of the selected layer may be displayed on the display area 530.
  • the thickness of the selected layer is expressed by a difference in color (or grey shade, as illustrated in Fig. 10 ).
  • An integration image may be displayed in place of the thickness of the selected layer illustrated in Fig. 10 .
  • the integration image refers to a two-dimensional image generated by adding together at least portions of a three-dimensional OCT image in the depth direction (the z direction).
  • the integration image may be generated based on a specific layer or on the entire PS-OCT.
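The integration image described above reduces to a sum along the depth axis. A minimal sketch, with illustrative names: the volume is assumed to be laid out as (z, y, x), and restricting the z range stands in for integrating over a specific layer.

```python
import numpy as np

def integration_image(volume, z_start=None, z_stop=None):
    """Collapse a 3-D OCT volume (z, y, x) into a 2-D en-face integration
    image by adding the voxels together along the depth (z) direction,
    optionally over only the depth range of a specific layer.
    """
    return volume[z_start:z_stop].sum(axis=0)
```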
  • the image to be displayed is changed according to the instruction of the operator.
  • the information on the disease to be diagnosed, such as the name of the disease, may be selected from the menu, and the image for which a priority order has been preset with respect to the disease may be displayed on each display area.
  • an image obtained by superimposing the display form 1102 indicating the discontinuity on the fundus image 1101, as illustrated in Fig. 11, may be displayed.
  • an image obtained by superimposing the fundus image 1101 on the fundus photograph 1106, as illustrated in Fig. 12, may be displayed.
  • the display control unit 191 may display, on one of the display areas in the display unit 192, the retardation map or the birefringence map instead of the above-described images. Moreover, the display control unit 191 may superimpose and display the retardation map and the birefringence map on the fundus intensity image 511. In such a case, it is desirable to superimpose and display the retardation map or the birefringence map on the area indicated by the frame 512.
  • each of the generated images can be efficiently presented to the operator.
  • the operator can select the necessary images with easy operation.
  • the operation becomes easier by previously associating the name of the disease with the image to be displayed.
  • the positions of the display areas in which the above-described images are displayed are not limited thereto.
  • the fundus image may be displayed in the left display area in the display screen.
  • the number of images to be displayed is not limited thereto.
  • only the fundus image and the tomographic image, i.e., two images, may be displayed during image capturing.
  • the display method may then be changed after performing image capturing, and a plurality of tomographic images indicating different polarization states may be displayed side by side on the display screen along with the fundus image.
  • the order and the positions in which the buttons 522, 523, 524, and 525 are arranged are not limited thereto.
  • aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s).
  • the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).


Claims (11)

  1. An image-processing apparatus comprising:
    fundus-image acquiring means (140) configured to acquire a fundus image (1101; 411; 511) of an eye (Er);
    tomographic-image acquiring means (100) configured to acquire a polarization-sensitive tomographic image (1103; 431, 441; 531, 541) of the eye;
    layer-extracting means (190) configured to extract a predetermined layer (1105) from the polarization-sensitive tomographic image (431, 441; 531, 541); and
    identifying means configured to identify a discontinuous portion (1104) of the predetermined layer (1105);
    the image-processing apparatus being configured to obtain a two-dimensional region (1102, 1104) corresponding to the discontinuous portion (1104) of the predetermined layer (1105) of the eye shown in the polarization-sensitive tomographic image (1103); and
    the image-processing apparatus further comprising display control means (191) configured to cause display means (192) to display said two-dimensional region superimposed at a position in the fundus image that corresponds to the discontinuous portion (1104) in the predetermined layer (1105) of the eye shown in the polarization-sensitive tomographic image.
  2. The image-processing apparatus according to claim 1, wherein the tomographic-image acquiring means (100) is configured to acquire a tomographic intensity image (1103; 521) of the eye, and
    wherein the display control means (191) is configured to cause the display means (192) to display a two-dimensional region indicating the predetermined layer (1105) of the eye superimposed on the tomographic intensity image (1103).
  3. The image-processing apparatus according to claim 2, wherein the tomographic-image acquiring means (100) is configured to acquire the tomographic intensity image (1103; 521) based on the polarization-sensitive tomographic image (431, 441; 531, 541).
  4. The image-processing apparatus according to claim 2 or 3, wherein an acquisition position of the polarization-sensitive tomographic image (431, 441; 531, 541) and an acquisition position of the tomographic intensity image (1103; 521) are the same position of the eye (Er).
  5. The image-processing apparatus according to any one of the preceding claims, wherein the display control means (191) is configured to cause the display means (192) to display the fundus image (1101), on which the display form (1102) indicating the discontinuous portion is superimposed, which is itself superimposed on a color fundus image (1106; 411; 511) of the eye (Er).
  6. The image-processing apparatus according to any one of the preceding claims, further comprising designating means configured to designate, in a case where a plurality of discontinuous portions (1104) are present in the predetermined layer (1105) of the eye, at least one of the plurality of discontinuous portions (1102) in the fundus image (1101),
    wherein the image-processing means (190) is configured to generate a two-dimensional region (1102) of the designated discontinuous portion in a format different from the two-dimensional regions representing the other discontinuous portions.
  7. The image-processing apparatus according to claim 6, wherein the signal-processing means (190) is configured to change the two-dimensional region (1104) of a designated discontinuous portion in a tomographic intensity image of the eye into a display form different from the two-dimensional regions representing the other discontinuous portions.
  8. The image-processing apparatus according to claim 1, wherein the tomographic-image acquiring means is configured to acquire a three-dimensional tomographic image formed of a plurality of polarization-sensitive two-dimensional tomographic images of the eye,
    wherein the layer-extracting means (190) is configured to extract a three-dimensional layer region as the predetermined layer (1105) from the three-dimensional tomographic image,
    wherein the identifying means is configured to identify a three-dimensional discontinuous region as the discontinuous portion (1104) based on the three-dimensional layer region,
    wherein the image-processing means (190) is configured to generate the two-dimensional region indicating the three-dimensional discontinuous portion (1102), and
    wherein the display control means (191) is configured to cause the display means (192) to display the two-dimensional region superimposed on another two-dimensional region of the fundus image (1101) corresponding to the three-dimensional discontinuous region.
  9. An image-processing method comprising:
    acquiring (S102, S103) a fundus image of an eye;
    acquiring (S102, S103) a polarization-sensitive tomographic image of the eye;
    extracting a predetermined layer (1105) from the polarization-sensitive tomographic image (431, 441; 531, 541);
    identifying a discontinuous portion (1104) of the predetermined layer (1105);
    obtaining a two-dimensional region (1102, 1104) corresponding to the discontinuous portion (1104) in the predetermined layer (1105) of the eye shown in the polarization-sensitive tomographic image (1103); and
    displaying (S105) the two-dimensional region superimposed at a position in the fundus image that corresponds to the discontinuous portion in the predetermined layer of the eye in the polarization-sensitive tomographic image.
  10. The method according to claim 9, further comprising:
    exposing a fundus of the eye to two differently polarized light beams;
    receiving reflected beams corresponding to the two beams;
    displaying two polarization-sensitive tomographic images (431, 441; 531, 541) based on the respectively received reflected beams;
    adjusting a λ/4 polarizing plate (113) positioned in the optical path of the two beams so as to adjust the relative intensity of the received reflected beams; and
    displaying a tomographic intensity image (521) resulting from interference between the received reflected beams whose polarization has been adjusted.
  11. A computer-readable storage medium storing a program which, when executed on a computer, causes the computer to carry out the image-processing method according to claim 9 or 10.
EP13151715.3A 2012-01-20 2013-01-17 Appareil et procédé de traitement d'images Not-in-force EP2617351B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012010281 2012-01-20
JP2012186591A JP5988772B2 (ja) 2012-01-20 2012-08-27 画像処理装置及び画像処理方法

Publications (2)

Publication Number Publication Date
EP2617351A1 EP2617351A1 (fr) 2013-07-24
EP2617351B1 true EP2617351B1 (fr) 2017-08-02

Family

ID=47681667

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13151715.3A Not-in-force EP2617351B1 (fr) 2012-01-20 2013-01-17 Appareil et procédé de traitement d'images

Country Status (5)

Country Link
US (1) US9033499B2 (fr)
EP (1) EP2617351B1 (fr)
JP (1) JP5988772B2 (fr)
KR (1) KR101570666B1 (fr)
CN (1) CN103211574B (fr)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140096262A (ko) * 2011-08-21 2014-08-05 데이비드 레비츠 스마트폰에 장착된 광 간섭 단층 촬영 시스템
JP6071331B2 (ja) * 2012-08-27 2017-02-01 キヤノン株式会社 画像処理装置及び画像処理方法
JP2014110883A (ja) * 2012-10-30 2014-06-19 Canon Inc 画像処理装置及び画像処理方法
JP6184232B2 (ja) * 2013-07-31 2017-08-23 キヤノン株式会社 画像処理装置及び画像処理方法
JP6418766B2 (ja) * 2014-03-27 2018-11-07 キヤノン株式会社 断層画像処理装置、断層画像処理方法及びプログラム
JP2016002380A (ja) 2014-06-18 2016-01-12 キヤノン株式会社 画像処理装置、その作動方法及びプログラム
JP6594033B2 (ja) 2015-05-14 2019-10-23 キヤノン株式会社 画像処理装置、画像処理方法及びプログラム
JP6736270B2 (ja) * 2015-07-13 2020-08-05 キヤノン株式会社 画像処理装置及び画像処理装置の作動方法
US10945607B2 (en) 2015-12-15 2021-03-16 Horiba, Ltd. Spectroscope, optical inspection device and OCT device
JP6598713B2 (ja) * 2016-03-11 2019-10-30 キヤノン株式会社 情報処理装置
JP6843521B2 (ja) * 2016-04-28 2021-03-17 キヤノン株式会社 画像処理装置及び画像処理方法
US10102682B1 (en) * 2017-04-17 2018-10-16 Raytheon Company System and method for combining 3D images in color
CN110448265B (zh) * 2018-05-08 2021-07-27 广西师范学院 一种双折射晶体快拍穆勒矩阵成像测偏眼底系统
JP7100503B2 (ja) * 2018-06-15 2022-07-13 株式会社トプコン 眼科装置
JP7273394B2 (ja) * 2019-02-18 2023-05-15 株式会社トーメーコーポレーション 眼科装置
JP2023084947A (ja) * 2021-12-08 2023-06-20 株式会社トーメーコーポレーション 画像処理装置及びそれを備えた眼科装置

Family Cites Families (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6112114A (en) 1991-12-16 2000-08-29 Laser Diagnostic Technologies, Inc. Eye examination apparatus employing polarized light probe
US5719399A (en) 1995-12-18 1998-02-17 The Research Foundation Of City College Of New York Imaging and characterization of tissue based upon the preservation of polarized light transmitted therethrough
EP1001283B1 (fr) 1997-07-30 2003-05-21 Hamamatsu Photonics K.K. Element optique, unite d'imagerie, appareil d'imagerie detecteur d'image radiante et analyseur d'empreintes digitales l'utilisant
US7557929B2 (en) 2001-12-18 2009-07-07 Massachusetts Institute Of Technology Systems and methods for phase measurements
CN1917806A (zh) 2004-02-10 2007-02-21 光视有限公司 High efficiency low coherence interferometry
DE102004037479A1 (de) 2004-08-03 2006-03-16 Carl Zeiss Meditec Ag Fourier-domain OCT ray tracing on the eye
EP1872109A1 (fr) 2005-04-22 2008-01-02 The General Hospital Corporation Arrangements, systems and methods for performing spectral-domain polarization-sensitive optical coherence tomography
JP4916779B2 (ja) 2005-09-29 2012-04-18 株式会社トプコン Fundus observation device
JP4819478B2 (ja) 2005-10-31 2011-11-24 株式会社ニデック Ophthalmologic photographing apparatus
US20070109554A1 (en) 2005-11-14 2007-05-17 Feldchtein Felix I Polarization sensitive optical coherence device for obtaining birefringence information
US7593559B2 (en) 2005-11-18 2009-09-22 Duke University Method and system of coregistrating optical coherence tomography (OCT) with other clinical tests
ATE516739T1 (de) 2005-12-06 2011-08-15 Zeiss Carl Meditec Ag Interferometric sample measurement
US7430345B2 (en) 2006-03-02 2008-09-30 The Board Of Trustees Of The Leland Stanford Junior University Polarization controller using a hollow-core photonic-bandgap fiber
JP2007240228A (ja) 2006-03-07 2007-09-20 Fujifilm Corp Optical tomographic imaging apparatus
JP2007252475A (ja) 2006-03-22 2007-10-04 Fujifilm Corp Optical tomographic imaging apparatus and method for adjusting image quality of optical tomographic images
EP2345363A1 (fr) 2006-05-01 2011-07-20 Physical Sciences, Inc. Hybrid spectral-domain optical coherence tomography line-scanning laser ophthalmoscope
WO2007124601A1 (fr) 2006-05-03 2007-11-08 Campbell Melanie C W Method and apparatus for improved eye imaging through selection of polarization types
US20070291277A1 (en) 2006-06-20 2007-12-20 Everett Matthew J Spectral domain optical coherence tomography system
US8223143B2 (en) 2006-10-27 2012-07-17 Carl Zeiss Meditec, Inc. User interface for efficiently displaying relevant OCT imaging data
JP5007114B2 (ja) 2006-12-22 2012-08-22 株式会社トプコン Fundus observation device, fundus image display device, and program
US7625088B2 (en) * 2007-02-22 2009-12-01 Kowa Company Ltd. Image processing apparatus
JP4905695B2 (ja) 2007-03-28 2012-03-28 学校法人慶應義塾 Photoelasticity measurement method and apparatus
JP5044840B2 (ja) 2007-04-17 2012-10-10 国立大学法人 千葉大学 Fundus examination apparatus
JP5149535B2 (ja) 2007-04-27 2013-02-20 国立大学法人 筑波大学 Polarization-sensitive optical coherence tomography apparatus, signal processing method for the apparatus, and display method in the apparatus
JP5448353B2 (ja) 2007-05-02 2014-03-19 キヤノン株式会社 Image forming method using optical coherence tomography, and optical coherence tomography apparatus
RU2344764C1 (ru) 2007-06-14 2009-01-27 Государственное образовательное учреждение высшего профессионального образования "Нижегородская государственная медицинская академия Федерального Агентства по здравоохранению и социальному развитию" (ГОУ ВПО НижГМА Росздрава) Method for examining the ciliary body and the anterior chamber angle of the eye, and method for assessing the severity of blunt ocular trauma
ES2673575T3 (es) 2018-06-22 Alcon Lensx, Inc. Precise targeting of surgical photodisruption
US20090131800A1 (en) 2007-11-15 2009-05-21 Carestream Health, Inc. Multimodal imaging system for tissue imaging
US8208996B2 (en) 2008-03-24 2012-06-26 Carl Zeiss Meditec, Inc. Imaging of polarization scrambling tissue
US7973939B2 (en) 2008-06-17 2011-07-05 Chien Chou Differential-phase polarization-sensitive optical coherence tomography system
JP5331395B2 (ja) 2008-07-04 2013-10-30 株式会社ニデック Optical tomographic imaging apparatus
JP5255524B2 (ja) 2008-07-04 2013-08-07 株式会社ニデック Optical tomographic imaging apparatus and optical tomographic image processing apparatus
US8820931B2 (en) * 2008-07-18 2014-09-02 Doheny Eye Institute Optical coherence tomography-based ophthalmic testing methods, devices and systems
JP2010125291A (ja) * 2008-12-01 2010-06-10 Nidek Co Ltd Ophthalmologic photographing apparatus
JP5455001B2 (ja) 2008-12-26 2014-03-26 キヤノン株式会社 Optical tomographic imaging apparatus and control method for optical tomographic imaging apparatus
JP5618533B2 (ja) 2008-12-26 2014-11-05 キヤノン株式会社 Optical coherence tomographic information acquisition apparatus, imaging apparatus, and imaging method
JP5743380B2 (ja) 2009-03-06 2015-07-01 キヤノン株式会社 Optical tomographic imaging apparatus and optical tomographic imaging method
EP2243420A1 (fr) 2009-04-24 2010-10-27 Schmidt-Erfurth, Ursula Method for determining exudates in the retina
JP2010259492A (ja) 2009-04-30 2010-11-18 Topcon Corp Fundus observation device
JP5610706B2 (ja) 2009-05-22 2014-10-22 キヤノン株式会社 Imaging apparatus and imaging method
JP5054072B2 (ja) 2009-07-28 2012-10-24 キヤノン株式会社 Optical tomographic imaging apparatus
JP5484000B2 (ja) 2009-10-30 2014-05-07 キヤノン株式会社 Adaptive optics apparatus, adaptive optics method, optical imaging apparatus, and optical imaging method
US9506740B2 (en) 2009-12-01 2016-11-29 The Brigham And Women's Hospital System and method for calibrated spectral domain optical coherence tomography and low coherence interferometry
JP5582772B2 (ja) 2009-12-08 2014-09-03 キヤノン株式会社 Image processing apparatus and image processing method
US8463016B2 (en) * 2010-02-05 2013-06-11 Luminescent Technologies, Inc. Extending the field of view of a mask-inspection image
US20130003077A1 (en) 2010-03-31 2013-01-03 Canon Kabushiki Kaisha Tomographic imaging apparatus and control apparatus for tomographic imaging apparatus
JP5451492B2 (ja) 2010-03-31 2014-03-26 キヤノン株式会社 Image processing apparatus, control method therefor, and program
EP2563206B1 (fr) 2010-04-29 2018-08-29 Massachusetts Institute of Technology Method and apparatus for motion correction and image enhancement for optical coherence tomography
JP5627321B2 (ja) 2010-07-09 2014-11-19 キヤノン株式会社 Optical tomographic imaging apparatus and imaging method therefor
JP5721412B2 (ja) 2010-12-02 2015-05-20 キヤノン株式会社 Ophthalmic apparatus, blood flow velocity calculation method, and program
US9492076B2 (en) 2011-02-01 2016-11-15 Korea University Research And Business Foundation Dual focusing optical coherence imaging system
JP5792967B2 (ja) * 2011-02-25 2015-10-14 キヤノン株式会社 Image processing apparatus and image processing system
US9033510B2 (en) 2011-03-30 2015-05-19 Carl Zeiss Meditec, Inc. Systems and methods for efficiently obtaining measurements of the human eye using tracking
JP5843542B2 (ja) * 2011-09-20 2016-01-13 キヤノン株式会社 Image processing apparatus, ophthalmologic photographing apparatus, image processing method, and program
JP6021384B2 (ja) 2012-03-30 2016-11-09 キヤノン株式会社 Optical coherence tomography apparatus and control method
US9107610B2 (en) * 2012-11-30 2015-08-18 Kabushiki Kaisha Topcon Optic neuropathy detection with three-dimensional optical coherence tomography
JP6217085B2 (ja) * 2013-01-23 2017-10-25 株式会社ニデック Ophthalmologic photographing apparatus
US9526412B2 (en) * 2014-01-21 2016-12-27 Kabushiki Kaisha Topcon Geographic atrophy identification and measurement

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
KR101570666B1 (ko) 2015-11-20
US20130188138A1 (en) 2013-07-25
CN103211574A (zh) 2013-07-24
KR20130085978A (ko) 2013-07-30
JP2013165952A (ja) 2013-08-29
US9033499B2 (en) 2015-05-19
JP5988772B2 (ja) 2016-09-07
EP2617351A1 (fr) 2013-07-24
CN103211574B (zh) 2016-08-03

Similar Documents

Publication Publication Date Title
EP2617351B1 (fr) Image processing apparatus and image processing method
US9993152B2 (en) Image processing apparatus and image processing method
JP6071331B2 (ja) Image processing apparatus and image processing method
US9295383B2 (en) Image processing apparatus and image processing method for displaying indexes for different forms in a region of a retinal pigment epithelium
EP2727518B1 (fr) Image processing apparatus and image processing method
EP2617352B1 (fr) Imaging apparatus, imaging method, and program
JP2015029557A (ja) Image processing apparatus and image processing method
US9717409B2 (en) Image processing apparatus and image processing method
EP2725319A1 (fr) Image processing apparatus and image processing method
US8995737B2 (en) Image processing apparatus and image processing method
JP6381622B2 (ja) Image processing apparatus and image processing method
US9192293B2 (en) Image processing apparatus and image processing method
JP6505072B2 (ja) Image processing apparatus and image processing method
US8979267B2 (en) Imaging apparatus and method for controlling the same
US9247873B2 (en) Imaging apparatus

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

17P Request for examination filed

Effective date: 20140124

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

17Q First examination report despatched

Effective date: 20150907

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20170210

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: AT

Ref legal event code: REF

Ref document number: 913542

Country of ref document: AT

Kind code of ref document: T

Effective date: 20170815

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602013024254

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20170802

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 913542

Country of ref document: AT

Kind code of ref document: T

Effective date: 20170802

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20171102

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20171103

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20171102

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20171202

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602013024254

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20180503

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20180117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180117

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180131

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20180928

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20180131

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180131

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180131

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180131

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20130117

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

Ref country code: MK

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170802

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170802

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20211215

Year of fee payment: 10

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602013024254

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230801