WO2019225290A1 - Imaging apparatus and control method thereof (撮影装置及びその制御方法)

Imaging apparatus and control method thereof (撮影装置及びその制御方法)

Info

Publication number: WO2019225290A1
Authority: WIPO (PCT)
Prior art keywords: light, light source, control unit, image, interference
Application number: PCT/JP2019/017726
Other languages: English (en), French (fr), Japanese (ja)
Inventor: 竹野 耕平 (Kohei Takeno)
Original Assignee: キヤノン株式会社 (Canon Inc.)
Application filed by キヤノン株式会社 (Canon Inc.)
Publication of WO2019225290A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions

Definitions

  • The present invention relates to an imaging apparatus that images a measurement object using light from a light source, and to a method of controlling such an apparatus.
  • Optical coherence tomography (OCT) is known as a technique for acquiring tomographic images of a measurement object.
  • OCT splits light from a light source into measurement light and reference light, causes the return light of the measurement light from the measurement object to interfere with the reference light reflected by a reference mirror, and analyzes the intensity of the resulting interference light. A tomographic image of the measurement object is thereby obtained.
  • Known OCT methods include time domain OCT (TD-OCT), spectral domain OCT (SD-OCT), and swept source OCT (SS-OCT), the last of which uses a wavelength-swept (wavelength-scanning) light source. SD-OCT and SS-OCT are also collectively referred to as Fourier domain OCT (FD-OCT).
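  • As general OCT background (a standard single-reflector model, not a formula stated in the patent), the FD-OCT detection principle can be summarized as follows:

```latex
% Spectral interference signal for a single reflector at depth \Delta z.
% S(k): source power spectrum; R_r, R_s: reference and sample reflectivities.
I(k) = S(k)\left[\, R_r + R_s + 2\sqrt{R_r R_s}\,\cos(2k\,\Delta z) \,\right]
```

  • A Fourier transform of I(k) over the wavenumber k yields a peak at the path-length difference 2Δz; this is how FD-OCT recovers the depth profile without the mechanical depth scanning that limits TD-OCT.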
  • Ophthalmic OCT apparatuses are required to image at ever higher speeds.
  • To realize such higher speeds, light sources capable of wavelength sweeping (wavelength scanning) have been adopted, and development is under way to shorten the time required for one sweep.
  • In full-field OCT (FF-OCT), the measurement light illuminates a two-dimensional region of the measurement target rather than a single point, and the return light of the measurement light from the measurement target is received by a two-dimensional sensor.
  • SS-FF-OCT, which combines a two-dimensional sensor capable of high-speed imaging with the wavelength-sweep (wavelength-scan) method described above, is one of the preferred options for increasing imaging speed.
  • Patent Document 1 and Non-Patent Document 1 describe conventional techniques in which measurement light is irradiated onto a two-dimensional region of the measurement object and its return light is received by a two-dimensional sensor.
  • Patent Document 1 describes a technique in which polarized light is split and received, and imaging is performed based on the beat signal of the interference light. However, since the technique of Patent Document 1 is inherently a time domain method, it is difficult to speed up tomographic imaging.
  • Non-Patent Document 1 describes a wavelength-sweep (wavelength-scan) FF-OCT (SS-FF-OCT) technique comprising a wavelength-swept (wavelength-scanning) light source and a two-dimensional sensor.
  • In this technique, however, the acquisition efficiency of the image signal is impaired when, for example, the timing at which the light source starts its wavelength sweep and the timing at which the two-dimensional sensor starts its exposure are shifted relative to each other; in some cases a tomographic image could not be acquired.
  • The present invention has been made in view of such problems, and an object of the present invention is to provide a mechanism capable of acquiring tomographic images of good image quality by high-speed imaging.
  • The imaging apparatus of the present invention comprises: light branching means for branching light from a light source into measurement light and reference light; irradiation means for irradiating a two-dimensional region of a measurement target with the measurement light; detection means, configured with two-dimensionally arranged light receiving elements, for detecting interference light obtained by causing the return light of the measurement light from the measurement target to interfere with the reference light; and control means for performing control that links the operation of the light source with the operation of the detection means.
  • The present invention also encompasses a method of controlling the above-described imaging apparatus.
  • FIG. 1 is a block diagram showing an example of the functional configuration of an imaging apparatus according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of the external configuration of the imaging apparatus according to the embodiment. FIG. 3 is a diagram showing an example of the internal configuration of the optical system shown in FIG. 1. FIG. 4 shows examples of an interference image, an interference signal, and a tomographic signal according to the embodiment. FIG. 5 shows an example of the display screen displayed on the display unit shown in FIG. 1.
  • FIG. 10 is a timing chart showing an example of the operating method of the light source and the two-dimensional sensor shown in FIG. 3, according to the embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating an example of a functional configuration of a photographing apparatus 10 according to an embodiment of the present invention.
  • As shown in FIG. 1, the imaging apparatus 10 comprises an optical system 100, an input unit 200, an overall control unit 300, an image generation unit 400, a display control unit 500, a storage unit 600, and a display unit 700 as its functional components.
  • the optical system 100 is a component that irradiates the measurement target T with measurement light and detects return light of the measurement light from the measurement target T based on the control of the overall control unit 300.
  • the input unit 200 is a component that inputs various information and the like to the overall control unit 300 and the like.
  • the input unit 200 includes, for example, a keyboard and a mouse that allow a user to input an operation.
  • the overall control unit 300 is a component that performs overall control of the operation of the imaging device 10 based on information input from the input unit 200, for example.
  • the image generation unit 400 is a component that generates the image IM by processing the signal S that is the output of the optical system 100 based on the control of the overall control unit 300.
  • the display control unit 500 is a component that performs control to display, for example, the image IM generated by the image generation unit 400 and information input from the input unit 200 on the display unit 700.
  • In the present embodiment, the display control unit 500 is provided as a component separate from the overall control unit 300. Alternatively, the functions of the display control unit 500 may be incorporated into the overall control unit 300, so that the overall control unit 300 performs display control of the display unit 700.
  • the storage unit 600 is a configuration unit that stores programs and various information necessary for the overall control unit 300, the image generation unit 400, and the display control unit 500 to perform various processes.
  • the storage unit 600 is a configuration unit that stores various types of information acquired by the overall control unit 300, the image generation unit 400, and the display control unit 500 performing various types of processing.
  • the storage unit 600 stores information for specifying the measurement target T together with the image IM generated by the image generation unit 400.
  • the display unit 700 is a component that includes a display device such as a liquid crystal display, for example.
  • FIG. 2 is a diagram illustrating an example of an external configuration of the photographing apparatus 10 according to the embodiment of the present invention.
  • As shown in FIG. 2, the imaging apparatus 10 includes, as its external components, an optical head 11, a stage unit 12, a base unit 13, a PC (personal computer) 14, a face receiver 15, the input unit 200, and the display unit 700.
  • FIG. 2 shows an XYZ coordinate system for specifying positions in three-dimensional space. FIG. 2 also shows an example in which the subject's eye (the eye to be examined) E serves as the measurement target T of FIG. 1; in this case, the imaging apparatus 10 is an ophthalmic imaging apparatus.
  • the optical head 11 is a casing including the optical system 100 shown in FIG.
  • the stage unit 12 is a component that moves the optical head 11 with respect to the base unit 13 in the XYZ directions by a motor or the like based on the control of the overall control unit 300 shown in FIG.
  • the base unit 13 is a component that supports the optical head 11 via the stage unit 12.
  • The base unit 13 is also a component that supports the face receiver 15.
  • the face receiver 15 is a component for fixing the subject's face F.
  • the PC 14 is a computer that includes the overall control unit 300, the image generation unit 400, the display control unit 500, and the storage unit 600 shown in FIG.
  • The PC 14 can realize the overall control unit 300, the image generation unit 400, and the display control unit 500 shown in FIG. 1 as software modules executed on hardware such as a CPU.
  • In the present embodiment, a form is described in which these software modules are realized by a CPU (not shown) of the PC 14 executing a program stored in the storage unit 600.
  • the image generation unit 400 may be realized by dedicated hardware such as an ASIC
  • the display control unit may be realized by a dedicated processor such as a GPU different from the CPU.
  • a form in which the connection between the optical system 100 provided in the optical head 11 and the PC 14 is realized by a configuration via a network, for example, is also applicable to the present invention.
  • FIG. 3 is a diagram showing an example of the internal configuration of the optical system 100 shown in FIG. 1. Note that the example shown in FIG. 3 also applies the eye E as the measurement target T shown in FIG. 1.
  • As shown in FIG. 3, the optical system 100 comprises a light source 101, optical fibers 102-1 to 102-3, a coupler 103, a collimating lens 104, a beam splitter 105, an aperture 107, an eyepiece optical system 140, a reference optical system 150, and a light receiving optical system 160. Although not shown in FIG. 3, the optical system 100 also includes a wide-angle fundus photographing unit for confirming the photographing position on the fundus Er of the eye E, an anterior ocular segment observation unit for facilitating alignment, and a fixation lamp optical system for presenting a fixation position to the eye E.
  • Known configurations can be used in the present embodiment for the wide-angle fundus photographing unit, the anterior ocular segment observation unit, and the fixation lamp optical system; since they are not central to the present invention, their description is omitted.
  • the light source 101 is a wavelength swept light source (wavelength scanning light source) configured to be able to sweep the wavelength of light to be output.
  • Under the control of the overall control unit 300, the light source 101 can change parameters such as the wavelength at which each scan starts, the wavelength width of one scan, and the scan speed, i.e. the number of scans per second. In the present embodiment, the standard scan speed is 25 scans per second.
  • light intensity data for each wavelength of the light source 101 (hereinafter referred to as “spectrum data”) is measured and stored in the storage unit 600 in advance.
  • the light emitted from the light source 101 enters the coupler 103 via the single mode optical fiber 102-1.
  • the coupler 103 is a light branching unit that branches the light from the light source into the measurement light 121 and the reference light 123. Then, the measurement light 121 branched by the coupler 103 is guided to the collimator lens 104 via the single mode optical fiber 102-2, and then to the beam splitter 105 via the aperture 107.
  • the reference beam 123 branched by the coupler 103 is guided to the reference optical system 150 via the single mode optical fiber 102-3.
  • the aperture 107 is provided to cut out a region where the intensity distribution of the measuring light 121 is approximately uniform.
  • the region of the measurement light 121 that is about half or more of the peak intensity passes through the aperture 107.
  • the aperture 107 is configured such that the aperture diameter is variable, and the region that irradiates the fundus Er can be changed. With this configuration, even when the eye E is nearsighted or farsighted, the overall control unit 300 can perform control so that the irradiation area is the same as that when the eye E is normally sighted. Note that the aperture 107 can be controlled in conjunction with the focus adjustment mechanism 141.
  • the focal length of the collimating lens 104 is selected so that the intensity distribution of the measuring light 121 is substantially uniform under the condition that the aperture diameter of the aperture 107 is the largest. It should be noted that by using the light source 101 with higher output, the permissible decrease in intensity with respect to the peak intensity may be tightened, and a more uniform region may be selected and used.
  • the measurement light 121 passes through the beam splitter 105 and is guided to the eyepiece optical system 140. Specifically, the measurement light 121 guided to the eyepiece optical system 140 is guided to the eye E via the focus adjustment mechanism 141, the scanner 142, the relay optical system 143, the scanner 144, the eyepiece 145, and the aperture 146.
  • In this way, a two-dimensional region of the fundus Er is irradiated.
  • That is, the eyepiece optical system 140 is an irradiation unit that irradiates the measurement light 121 onto a two-dimensional region of the measurement target, namely the eye E (more specifically, the fundus Er of the eye E).
  • the measurement light 121 reflected by the fundus Er of the eye E follows the optical path followed by the measurement light 121 in the reverse order as the return light 122 of the measurement light.
  • The return light 122 of the measurement light enters the eyepiece optical system 140 and is guided to the beam splitter 105 via the aperture 146, the eyepiece lens 145, the scanner 144, the relay optical system 143, the scanner 142, and the focus adjustment mechanism 141. The return light 122 guided to the beam splitter 105 then enters the light receiving optical system 160 via the beam splitter 105.
  • FIG. 3 schematically shows that the light reflected by a part of the two-dimensional region of the fundus Er irradiated with the measurement light 121 is imaged on the two-dimensional sensor 162 as the return light 122.
  • the present embodiment is configured such that all of the return light 122 reflected by the two-dimensional region of the fundus Er irradiated with the measurement light 121 forms an image on the two-dimensional sensor 162.
  • The focus adjustment mechanism 141 includes mirrors 141-1 and 141-2, a stage 141-3, and a prism 141-4, and is a mechanism that adjusts the imaging relationship between the fundus Er and the two-dimensional sensor 162.
  • The measurement light 121 incident on the focus adjustment mechanism 141 from the light source 101 side is reflected by one surface of the prism 141-4, is then sequentially reflected by the mirror 141-1 and the mirror 141-2, and is finally reflected by the other surface of the prism 141-4.
  • the mirror 141-1 and the mirror 141-2 are arranged on the stage 141-3.
  • The overall control unit 300 can change the optical path length of the measurement light 121 by moving the stage 141-3 in the direction indicated by the arrow in FIG. 3, based on input from the input unit 200 or the like.
  • the scanner 142 and the scanner 144 are, for example, galvano scanners in which the angle of the reflecting surface that reflects light is variable.
  • Under the control of the overall control unit 300, the scanner 142 and the scanner 144 steer the light in two mutually orthogonal directions: the X direction (horizontal direction) and the Y direction (vertical direction), respectively.
  • the scanner 142 and the scanner 144 are arranged so as to be conjugate with the pupil P of the eye E by the relay optical system 143 and the eyepiece lens 145.
  • the overall control unit 300 can control the measurement light 121 incident on the eye E to pass through substantially the same region of the pupil P regardless of the steering by the scanner 142 and the scanner 144. Further, by this control, the measurement light 121 is not partially blocked by the eye E, and steering can be performed efficiently.
  • The aperture 146 is disposed near the condensing point of the measurement light 121 close to the eye E, and is configured so that the fundus Er can be irradiated appropriately.
  • the reference light 123 emitted from the single-mode optical fiber 102-3 is guided to the reference optical system 150 as described above. Specifically, the reference light 123 incident on the reference optical system 150 is guided to the collimating lens 151, the aperture 153, the dispersion compensation glass 154, and the transfer optical element 155. Thereafter, the reference beam 123 is guided to the beam splitter 105 via the transfer optical element 155.
  • the dispersion compensation glass 154 is used to compensate for dispersion caused by the optical element constituting the eye E or the optical system 100.
  • the exit end of the optical fiber 102-3 and the collimating lens 151 are disposed on the stage 152.
  • Based on input from the input unit 200 or the like, the overall control unit 300 drives the stage 152 in the optical axis direction according to differences in the axial length of the eye E, so that the coherence gate position can be adjusted.
  • The coherence gate position is the position at which there is no difference between the optical path length of the reference light 123 and the optical path lengths of the measurement light 121 and the return light 122.
  • In the present embodiment, the optical path length of the reference light 123 is changed, but the invention is not limited to this form; it suffices that the difference between the optical path lengths of the measurement light 121 and return light 122 and the optical path length of the reference light 123 can be changed.
  • In the present embodiment, the stage 152 is moved so as to move the exit end of the optical fiber 102-3 and the collimating lens 151, but the present invention is not limited to this form. For example, an optical element may instead be disposed immediately before the aperture 153; to increase the operating speed of the vertical tracking described later, it is desirable to mount only light-weight elements on the stage 152.
  • the beam splitter 105 is a multiplexing unit that combines the return light 122 incident from the eyepiece optical system 140 and the reference light 123 incident from the reference optical system 150 to generate interference light 124. Then, the interference light 124 generated by the beam splitter 105 is guided to the light receiving optical system 160.
  • the interference light 124 guided to the light receiving optical system 160 is received by the two-dimensional sensor 162 via the imaging optical system 161.
  • the two-dimensional sensor 162 is a detection unit that detects incident interference light 124 as an interference signal. Then, the two-dimensional sensor 162 stores the detected interference signal data in the internal memory 163.
  • The two-dimensional sensor 162 includes a plurality of two-dimensionally arranged light receiving elements (hereinafter referred to as "pixels"), and each pixel converts the interference light 124 incident on it into an interference signal according to the exposure timing.
  • FIG. 4 is a diagram illustrating an example of an interference image, an interference signal, and a tomographic signal according to the embodiment of the present invention. Specifically, FIG. 4A shows an example of the interference image 810, FIG. 4B shows an example of the interference signal 820, and FIG. 4C shows an example of the tomographic signal 830.
  • a signal generated by one exposure of the two-dimensional sensor 162 is an interference image 810 in which interference fringes are superimposed on the front view of the fundus Er.
  • Here, N is the total number of coordinates in the interference image 810, and the upper left of the interference image 810 is the reference coordinate 811 (X1, Y1).
  • The signal of one wavelength scan at the coordinate 812 (Xi, Yi) of the interference image 810, the fundus image shown in FIG. 4A, becomes the interference signal 820 shown in FIG. 4B.
  • The tomographic signal 830 shown in FIG. 4C is obtained by processing the interference signal 820 shown in FIG. 4B, and a tomographic image is generated from a plurality of such tomographic signals 830.
  • the tomographic direction shown on the horizontal axis in FIG. 4C is a direction corresponding to the depth direction (Z direction) of the eye E (more specifically, the fundus Er of the eye E in the present embodiment).
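  • For illustration, the relationship among FIGS. 4A to 4C can be sketched as follows, assuming the single volume data is available as a NumPy array of shape (K, H, W) holding the K interference images of one wavelength scan, already resampled to equal wavenumber steps; this is a minimal sketch, not the patent's implementation:

```python
import numpy as np

def tomographic_signal(volume: np.ndarray, xi: int, yi: int) -> np.ndarray:
    """volume: (K, H, W) stack of interference images (FIG. 4A)."""
    interference_signal = volume[:, yi, xi]   # one pixel vs. wavenumber (FIG. 4B)
    windowed = interference_signal * np.hanning(len(interference_signal))
    return np.abs(np.fft.rfft(windowed))      # depth profile (FIG. 4C)
```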
  • The two-dimensional sensor 162 is preferably capable of capturing at a speed of 4000 frames per second or more.
  • The data of the partial region 813 of the interference image 810 shown in FIG. 4A is sent to the image generation unit 400 in real time, based on the user's designation in the preview area 922 described later with reference to FIG. 5.
  • This image is further displayed on the display unit 700 in real time.
  • a huge amount of data transfer becomes unnecessary, and a preview image can be presented to the user in real time.
  • Based on the preview image, the user can determine whether the focus, the coherence gate position, and the steering described above are appropriate.
  • In the present embodiment, by appropriately selecting the collimating lens 151 and the aperture 153, the reference light 123 is imaged on the two-dimensional sensor 162 over a wider area than the measurement light 121.
  • the accuracy required for adjusting the optical system is relaxed, and it is possible to perform shooting more stably.
  • The overall control unit 300 is configured as a software module realized by the CPU of the PC 14 as described above, and controls each component of the optical system 100 shown in FIG. 3. In the present embodiment, the overall control unit 300 controls the overall operation of the imaging apparatus 10 and also functions as a unit that performs various selection, measurement, and arithmetic processes. In addition, the overall control unit 300 receives input from the user operating the imaging apparatus 10 via the input unit 200; specifically, information such as a patient ID for identifying the eye E, parameters necessary for imaging, and the selection of a pattern for scanning the fundus Er are input to the overall control unit 300 via the input unit 200. The overall control unit 300 controls each component of the imaging apparatus 10 based on the various information input via the input unit 200, and has the function of saving the obtained signals, images, and other data.
  • [Image Generation Unit 400] Next, the image generation unit 400 shown in FIG. 1 will be described.
  • the image generation unit 400 generates and outputs an image related to the eye E by performing various processes on the signal S output from the optical system 100.
  • the display control unit 500 performs control for displaying the image acquired from the image generation unit 400 on the display unit 700 based on the control of the overall control unit 300.
  • FIG. 5 is a diagram showing an example of the display screen 900 displayed on the display unit 700 shown in FIG. 1 according to the embodiment of the present invention.
  • The display unit 700 can also display an input screen for information identifying the eye E, such as the patient ID input to the overall control unit 300.
  • a known configuration can be used, and since it is not the central configuration of the present invention, description thereof is omitted.
  • As a user-operable interface, the display screen 900 provides a left/right eye switching button 901, an alignment adjustment unit 902, a focus adjustment slider bar 903, a coherence gate adjustment slider bar 904, a coherence gate automatic adjustment button 905, a display mode pull-down menu 906, a scan mode pull-down menu 907, and a shooting button 908.
  • Each image displayed in the display areas 910, 920, 930, 940 and 950 is generated by the image generation unit 400.
  • In the display area 910, an anterior segment image of the eye E is displayed, allowing the alignment between the optical head 11 and the eye E to be confirmed. Further, the overall control unit 300 automatically detects the scattered light of the measurement light 121 from the cornea and highlights it as a bright spot 911 on the anterior segment image in the display area 910, so that the user can easily recognize the position in the pupil at which the measurement light 121 enters the eye E. In the display area 910, a target circle 912 presenting the target position of the pupil of the eye E is superimposed on the anterior segment image so that the user can easily determine whether the alignment is appropriate.
  • Further, the mark 913 is superimposed on the anterior segment image; a fundus image with little corneal reflection is obtained by aligning the bright spot 911 with the mark 913. The distance from the center of the pupil of the eye E may also be superimposed on the anterior segment image in the display area 910 so that the user can easily grasp the incident position of the measurement light 121.
  • In the optical system 100 of the present embodiment, the reflectance of a dichroic mirror (not shown) that separates the anterior ocular segment observation unit from the measurement light 121 is appropriately selected so that the luminance value of the corneal reflection displayed in the display area 910 is not saturated.
  • In the display area 920, a wide-area fundus plane image of the fundus Er of the eye E is displayed in real time, and a tomographic imaging area 921 and a preview area 922 are superimposed on the fundus plane image.
  • the imaging area 921 is an area that is a target for imaging a tomographic image that is designated by the user using the input unit 200.
  • the preview area 922 is used for designating a display position of the tomographic image in the direction indicated by each corresponding arrow in the display areas 930 and 940, and is designated by the user using the input unit 200.
  • In the following, a series of interference images acquired over the area specified by the imaging area 921 is referred to as volume data.
  • A series of interference images acquired by one wavelength scan is referred to as single volume data, and the volume data is generated from a plurality of single volumes.
  • Acquiring volume data is referred to as shooting, and acquiring single volume data is referred to as single shooting.
  • the shooting area 921 and the preview area 922 can be moved in conjunction with each other or can be moved independently, and can be switched between linked and independent operation by a switch (not shown).
  • The preview area 922 automatically changes size in conjunction with the area (two-dimensional region) that the measurement light 121 irradiates on the fundus Er, so that the user can easily identify the range over which data can be acquired by single shooting without steering.
  • the display mode pull-down menu 906 can select the type of tomographic image to be displayed in the display area 950.
  • the display mode pull-down menu 906 can select a three-dimensional tomographic image in addition to a horizontal or vertical tomographic image of the region designated by the imaging region 921.
  • When a tomographic image is displayed in the display area 950, the position of the displayed tomographic image can be changed by moving the crosshair superimposed on the fundus plane image in the display area 920.
  • a horizontal tomographic image calculated from single volume data is displayed in the display area 930, and a vertical tomographic image is displayed in the display area 940.
  • an arrow is superimposed on the preview area 922 so that the direction of the acquired data of the tomographic image can be easily determined.
  • When a three-dimensional tomographic image is displayed in the display area 950, it can be moved, rotated, enlarged or reduced, and contrast-adjusted on the display area 950 by operating the input unit 200. In addition, only a specific retinal layer of the fundus Er of the eye E can be displayed.
  • each image generated by the image generation unit 400 can be efficiently presented to the user.
  • an image required by the user can be selected with a simple operation.
  • the operation is further simplified by associating a disease name with an image to be displayed in advance.
  • FIG. 6 is a flowchart illustrating an example of a processing procedure in the photographing method of the photographing apparatus 10 according to the embodiment of the present invention.
  • Note that acquisition of wide-angle fundus images of the fundus Er by the wide-angle fundus photographing unit is started prior to the processing of the flowchart shown in FIG. 6, with images acquired at a predetermined frame rate.
  • The wide-angle fundus image is displayed in the display area 920 in real time.
  • Step S101 (Selection of Left or Right Eye)> First, based on operation of the left/right eye switching button 901, the overall control unit 300 selects the right eye (R) or the left eye (L) as the eye E to be examined.
  • the overall control unit 300 moves the optical head 11 using data stored in advance in the storage unit 600 based on the selection of the left and right eyes.
  • the overall control unit 300 may calculate the amount of movement using the acquired data of the anterior ocular segment observation unit and move the optical head 11 with higher accuracy.
  • Step S102 (Selecting Shooting Mode)> Subsequently, when the user designates a photographing mode from the scan mode pull-down menu 907, the overall control unit 300 selects a photographing mode based on the designation.
  • As the imaging mode, it is possible to select, for example, a standard mode (Standard Mode) in which imaging is performed at the standard scan speed, a high resolution mode (High Resolution Mode) in which the resolution in the tomographic direction of the fundus Er is improved, or a high-speed mode (High Speed Mode) in which imaging is performed faster than the standard scan speed.
  • In the present embodiment, the user designates the imaging mode. Alternatively, the user may select a disease to be diagnosed from a menu (not shown) (selecting a disease name), and imaging may then be performed with parameters prioritized in advance for that disease.
  • Step S103 (Start of Wavelength Sweep of Light Source)> Subsequently, the overall control unit 300 turns on the light source 101 and starts the wavelength sweep (wavelength scan) of the light output from the light source 101.
  • At this time, based on the imaging mode selected in step S102, the overall control unit 300 sets parameters stored in advance in the storage unit 600, such as the scan speed of the light source 101 and the imaging region (ROI) of the two-dimensional sensor 162, and then starts the wavelength scan of the light output from the light source 101.
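  • For illustration only, the per-mode parameters might be stored as in the following sketch. The mode names come from the text; apart from the standard scan speed of 25 scans per second mentioned earlier, the numeric values are hypothetical placeholders:

```python
# Hypothetical parameter table; only the 25 scans/s standard speed is from the text.
IMAGING_MODES = {
    "Standard Mode":        {"scans_per_second": 25, "sensor_roi": (2048, 2048)},
    "High Resolution Mode": {"scans_per_second": 12, "sensor_roi": (2048, 2048)},
    "High Speed Mode":      {"scans_per_second": 50, "sensor_roi": (1024, 1024)},
}

def configure_for_mode(mode: str) -> dict:
    # The overall control unit would push these settings to the light
    # source 101 and the two-dimensional sensor 162.
    return IMAGING_MODES[mode]
```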
  • The overall control unit 300 then acquires the data of the partial region 813 based on the preview area 922, and causes the display areas 930 and 940 to display the images generated by the image generation unit 400.
  • Step S104 (Alignment Adjustment)> Subsequently, when the user clicks the pupil center of the anterior segment image displayed in the display area 910 via the input unit 200, the overall control unit 300 moves the optical head 11 so that it is appropriately aligned with the pupil center based on the clicked position. At this time, the overall control unit 300 performs control so that the incident position of the measurement light 121 deviates from the apex of the cornea, automatically adjusting so that the reflected light of the measurement light 121 from the cornea of the eye E does not reach the two-dimensional sensor 162.
  • the user can finely adjust the alignment by clicking a button of the alignment adjustment unit 902 via the input unit 200.
  • In the display area 910, the target circle 912 presenting the target position of the pupil is superimposed on the anterior ocular segment image so that the user can easily determine whether the alignment is appropriate.
  • Further, the mark 913 is superimposed on the anterior ocular segment image; by aligning the bright spot 911 so that it overlaps the mark 913, a fundus image with little corneal reflection can be acquired.
  • the mark 913 is displayed at a position based on the parameters stored in the storage unit 600, and can be switched between display and non-display using a switch (not shown).
  • Step S105 (Focus Adjustment)> Subsequently, when the user operates the focus adjustment slider bar 903 while referring to the wide-angle fundus image displayed in the display area 920, the overall control unit 300 adjusts the focus of the wide-angle fundus photographing unit based on the user's operation input value. Further, the overall control unit 300 drives the focus adjustment mechanism 141 in conjunction with the focus adjustment of the wide-angle fundus photographing unit.
  • the overall control unit 300 adjusts the aperture diameter of the aperture 107 in conjunction with the movement of the focus adjustment mechanism 141. For example, when the eye E is a myopic eye, the irradiation area of the fundus Er is narrowed. In this case, adjustment is performed so that the aperture diameter of the aperture 107 is increased.
  • the overall control unit 300 controls the light source 101 so that the amount of light incident on the eye E is substantially constant in conjunction with the aperture diameter of the aperture 107.
  • the focus adjustment mechanism 141 and the aperture 107 are linked based on parameters stored in advance in the storage unit 600.
  • The aperture diameter of the aperture 107 can also be adjusted manually, so that imaging is possible even when the irradiation area of the fundus Er changes due to conditions such as miosis of the eye E.
  • Since the size of the preview area 922 is displayed in conjunction with the area that the measurement light 121 irradiates on the fundus Er, the user can easily recognize the shooting range visually.
  • Since the focus adjustment mechanism 141 changes the optical path length, adjustment is easier if the focus is adjusted before the coherence gate adjustment performed in step S107 described later. When the focus is adjusted at a timing other than this step, it is desirable to adjust the coherence gate in conjunction with it.
  • Step S106 (shooting position selection)> Subsequently, when the user designates the desired preview position via the input unit 200, the overall control unit 300 adjusts the position of the preview area 922 based on the designation.
  • Step S107 (Coherence Gate Adjustment)> Subsequently, when the user presses the automatic coherence gate adjustment button 905, the overall control unit 300 determines the coherence gate position based on the luminance value of the image, and drives the stage 152.
  • the user can finely adjust the coherence gate by sliding the coherence gate adjustment slider bar 904 using the input unit 200.
  • Step S108 (Shooting Area Adjustment)> Subsequently, while checking the images displayed in the display areas 930 and 940, the user designates, via the input unit 200, the position and size of the shooting area 921 and the position of the preview area 922 so as to obtain the desired shooting range; the overall control unit 300 then performs adjustment based on the designation.
  • Step S109 (Start Shooting)> Subsequently, the overall control unit 300 starts photographing a tomographic image of the eye E based on the shooting area 921.
  • When the shooting area 921 is specified to be narrower than the preview area 922, the overall control unit 300 performs steering so that the centers of the shooting area 921 and the preview area 922 coincide, and performs single shooting to acquire single volume data.
  • When the shooting area 921 is designated wider than the preview area 922, the overall control unit 300 automatically determines the shooting order so that the volume data of the shooting area 921 can be acquired.
  • FIG. 7 is a diagram for explaining a volume data acquisition method according to the embodiment of the present invention.
  • the overall control unit 300 hides the preview area 922 and automatically alternates steering and single shooting so that volume data in the shooting area 921 is acquired.
  • In the steering, the overall control unit 300 controls the scanner 142 and the scanner 144 so that they change the irradiation position of the measurement light 121 on the fundus Er in two directions: the X direction (horizontal direction) and the Y direction (vertical direction), respectively.
  • The overall control unit 300 sets the steering movement amount and the overlap amount between single volume data based on the parameters stored in advance in the storage unit 600, and performs control so as to include the shooting area 921.
  • Here, "include" means that the obtained volume data covers an area wider than the area specified by the shooting area 921; for this reason, the area intended by the user can be photographed reliably.
  • parameters such as a steering amount when performing single shooting are stored in the storage unit 600 in association with single volume data.
  • the overall control unit 300 performs single photographing at the approximate center of the photographing region 921 and stores the tomographic image in the partial region 813 in the storage unit 600 as a reference tomographic image. In subsequent single shooting, the overall control unit 300 sequentially performs single shooting so as to be adjacent to the already shot single volume data. Further, the overall control unit 300 automatically adjusts the focus and the coherence gate position before steering to the next adjacent imaging position based on the result of comparing the tomographic image of the partial region 813 with the reference tomographic image. Further, the overall control unit 300 may be configured to perform re-imaging by adjusting the focus or coherence gate position when it is determined that the focus or coherence gate position is significantly shifted. With this configuration, it is possible to always perform single imaging at the optimum focus and coherence gate position.
  • When the aperture shape is rectangular, as shown in FIG. 7A, the steering movement amount at the time of shooting and the overlap amount between single volumes can be set easily.
  • When the aperture shape is circular, as shown in FIG. 7B, the Gaussian beam emitted from the single mode optical fiber 102-2 can be used effectively.
  • an area where single imaging is completed is shown as a captured area 923.
  • the photographed area 923 is displayed in a semi-transparent color in the display area 920 so that the user can easily determine whether the area is a single photographed area.
  • In FIGS. 7A and 7B, an area where single shooting is currently being performed is shown as an in-shooting area 924.
  • This in-photographing area 924 is displayed in a manner that can be distinguished by a color different from that of the photographed area 923, for example.
  • an area where single imaging has not yet been performed is shown as an unimaged area 925.
  • the unphotographed area 925 is displayed in a manner that can be distinguished by a different color from the photographed area 923 and the in-photographing area 924, for example.
  • An area where single shooting has failed or where re-shooting has been performed may also be displayed in a distinguishable manner, in a color different from those of the captured area 923, the in-shooting area 924, and the unphotographed area 925.
  • the user can easily grasp the shooting progress.
  • Here, an example in which the regions are distinguished by color has been described, but a different display method may be used; for example, the regions may be distinguished by superimposing text or the like.
  • After the start of shooting in step S109, the overall control unit 300 detects movement of the eye E based on a reference wide-angle fundus image stored in the storage unit 600 and the wide-angle fundus images acquired in real time, and performs tracking to correct that movement. For example, every time a part of the wide-angle fundus image is acquired, the overall control unit 300 calculates the amount of movement of the eye E using the phase-only correlation method, and drives and controls the scanners 142 and 144 so that approximately the same position on the fundus Er is imaged. This tracking corresponds to tracking in the X and Y directions and is referred to herein as lateral tracking. With this configuration, lateral tracking faster than the frame rate of the wide-angle fundus image can be realized.
  • the overall control unit 300 may automatically detect the position of the coherence gate based on the luminance value of the tomographic image and perform tracking for driving and controlling the stage 152. This tracking corresponds to tracking in the Z direction, and is referred to herein as vertical tracking.
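  • As a reference, the following is a minimal sketch of estimating the (dy, dx) shift between a reference fundus image and a newly acquired one by phase-only correlation, the general method named above; it is illustrative and not the patent's exact implementation:

```python
import numpy as np

def phase_only_correlation_shift(reference: np.ndarray, current: np.ndarray):
    """Estimate the integer-pixel (dy, dx) shift between two images."""
    F = np.fft.fft2(reference)
    G = np.fft.fft2(current)
    cross_power = F * np.conj(G)
    cross_power /= np.abs(cross_power) + 1e-12        # keep only the phase
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Wrap peak indices to signed shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, correlation.shape))
```

  • The estimated shift would then be converted into drive commands for the scanners 142 and 144 (lateral tracking); vertical tracking instead drives the stage 152 based on the coherence gate position.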
  • Step S110 (Judgment End of Shooting)> Subsequently, the overall control unit 300 determines whether or not to end shooting depending on whether or not the user has instructed to end shooting via the input unit 200. If the result of this determination is that shooting is not completed (S110 / NO), the process returns to step S101, and the processing after step S101 is performed again. At this time, for example, if the setting for omitting the selection of the left and right eyes (S101) and the selection of the shooting mode (S102) is made, the process returns to step S103, and the processes after step S103 are performed again. May be.
  • If the result of determination in step S110 is that shooting is to be terminated (S110/YES), processing for ending shooting is performed, and the processing of the flowchart of FIG. 6 then ends.
  • [Control Method of Imaging Apparatus 10 (Method for Generating a Three-dimensional Tomographic Image)]
  • the overall control unit 300 transfers the volume data stored in the internal memory 163 of the two-dimensional sensor 162 to the image generation unit 400.
  • The image generation unit 400 generates a single three-dimensional tomographic image from each set of single volume data, and then aligns and stitches the plurality of single three-dimensional tomographic images together to generate the three-dimensional tomographic image.
  • FIG. 8 is a flowchart showing an example of a processing procedure in the method for generating a single three-dimensional tomographic image of the imaging apparatus 10 according to the embodiment of the present invention.
  • Step S201 (Alignment)> First, the image generation unit 400 aligns the series of interference images in the single volume data. At this time, the image generation unit 400 uses the interference image acquired at the wavelength with the highest intensity in the spectrum data of the light source 101 as a reference image, performs a correlation calculation on the interference images, and aligns the interference images with one another.
  • Step S202 (Acquisition of Interference Signal at Coordinates (Xi, Yi))> Subsequently, the image generation unit 400 acquires an interference signal at the coordinates 812 (Xi, Yi) illustrated in FIG. 4A.
  • Step S203 (Spectrum Processing)> Subsequently, the image generation unit 400 performs spectrum processing of the interference signal acquired in step S202. Specifically, the image generation unit 400 first multiplies the spectrum data by an appropriate factor and subtracts it from the interference signal. In the present embodiment, since the interference signals are acquired at equal wavelength intervals, the image generation unit 400 then rescales them so as to obtain interference signals at equal wavenumber intervals. Further, the image generation unit 400 performs dispersion correction of the interference signal based on parameters measured in advance and stored in the storage unit 600.
  • Step S204 (Window Function Processing)> Subsequently, the image generation unit 400 multiplies the interference signal subjected to the spectrum processing in step S203 by a Hanning function as a window function.
  • the window function used for the processing in step S204 is not limited to the Hanning function exemplified here, and for example, a rectangular function, a Tukey function, or the like can be used.
  • Step S205 (FFT operation)> Subsequently, the image generation unit 400 performs an FFT operation on the interference signal that has been subjected to the window function processing in step S204, and acquires a tomographic signal.
  • An example of the tomographic signal acquired in step S205 is the tomographic signal 830 shown in FIG. 4C.
  • Step S206 (Storage)> Subsequently, the image generation unit 400 stores the tomographic signal data acquired in step S205 in the storage unit 600.
  • Step S207 (Determining Whether the Next Coordinate Needs Calculation)> Subsequently, the image generation unit 400 determines whether the index i is smaller than the total number N of coordinates.
  • Step S208 (Set Next Coordinate)> If the result of determination in step S207 is that the index i is smaller than the total number N of coordinates (S207 / YES), it is determined that there is a coordinate that has not yet undergone calculation processing, and the process proceeds to step S208.
  • the image generation unit 400 (or the overall control unit 300) sets the next index (i ++). Then, it returns to step S202 and performs the process after step S202 again.
  • If the result of determination in step S207 is that the index i is not smaller than the total number N of coordinates (S207/NO), it is determined that calculation has been performed for all coordinates, and the processing of the flowchart in FIG. 8 ends.
  • A single three-dimensional tomographic image can be generated by performing the processing of the flowchart of FIG. 8.
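  • A condensed sketch of steps S202 to S205 for one coordinate is shown below, under stated assumptions: the aligned single volume data is a (K, H, W) NumPy array, the pre-measured spectrum data is a length-K array, linear interpolation stands in for the rescaling of step S203, and the dispersion correction is omitted for brevity:

```python
import numpy as np

def single_tomographic_signal(volume, spectrum, wavelengths, xi, yi, scale=1.0):
    signal = volume[:, yi, xi].astype(float)          # S202: interference signal
    signal -= scale * spectrum                        # S203: subtract source spectrum
    k = 2 * np.pi / wavelengths                       # wavenumber of each sample
    order = np.argsort(k)                             # equal-wavelength -> increasing k
    k_uniform = np.linspace(k.min(), k.max(), len(k))
    signal = np.interp(k_uniform, k[order], signal[order])  # S203: rescaling
    signal *= np.hanning(len(signal))                 # S204: window function
    return np.abs(np.fft.rfft(signal))                # S205: FFT -> tomographic signal
```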
  • FIG. 9 is a flowchart illustrating an example of a processing procedure in the three-dimensional tomographic image generation method of the imaging apparatus 10 according to the embodiment of the present invention.
  • Step S301 (Alignment)> First, the image generation unit 400 aligns the plurality of single three-dimensional tomographic images based on parameters such as the steering amount and the overlap amount stored in the storage unit 600. Specifically, the image generation unit 400 identifies adjacent single three-dimensional tomographic images from the steering amount, performs a correlation calculation on the regions expected to overlap, and determines the position of each single three-dimensional tomographic image.
  • Step S302 (Bonding)> Subsequently, based on the position determined in step S301, the image generation unit 400 performs an averaging process on the overlapping regions, and performs a single three-dimensional tomographic image bonding. Thereafter, the processing of the flowchart of FIG. 9 ends.
  • A three-dimensional tomographic image can be generated by performing the processing of the flowchart of FIG. 9.
  • the generated three-dimensional tomographic image is trimmed based on the size specified in the imaging region 921 under the control of the display control unit 500 and displayed on the display unit 700.
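  • A simplified sketch of steps S301 and S302 follows, assuming the tile positions (offsets) have already been recovered by the correlation calculation of step S301 and that overlapping voxels are combined by averaging as described for step S302; the array layout is an assumption, not the patent's data format:

```python
import numpy as np

def stitch_tiles(tiles, offsets, out_shape):
    """tiles: list of (D, H, W) single 3D tomograms; offsets: list of (oy, ox)
    positions from S301; out_shape: (D, H_total, W_total) of the result."""
    acc = np.zeros(out_shape, dtype=np.float64)   # accumulated intensity
    cnt = np.zeros(out_shape, dtype=np.int32)     # overlap counts
    for tile, (oy, ox) in zip(tiles, offsets):
        d, h, w = tile.shape
        acc[:, oy:oy + h, ox:ox + w] += tile
        cnt[:, oy:oy + h, ox:ox + w] += 1
    return acc / np.maximum(cnt, 1)               # S302: average overlap regions
```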
  • FIG. 10 is a timing chart showing an embodiment of the present invention and showing an example of an operation method of the light source 101 and the two-dimensional sensor 162 shown in FIG.
  • In FIG. 10, a single shooting operation is performed a plurality of times at the same location on the fundus Er of the eye E; this includes the case of re-shooting when the coherence gate position is adjusted.
  • FIG. 10 (1a) to (1c) are timing charts showing time-series operations of the operation method of the light source 101 and the two-dimensional sensor 162 in the first example of the present embodiment.
  • Specifically, FIG. 10(1a) shows the timing of the trigger signal generated by the overall control unit 300, FIG. 10(1b) shows the timing of the wavelength scan operation of the light source 101, and FIG. 10(1c) shows the timing of the exposure operation of the two-dimensional sensor 162.
  • In the first example, the overall control unit 300 performs control to link (synchronize) the operation of the light source 101 and the operation of the two-dimensional sensor 162 using the trigger signal 1001 shown in FIG. 10(1a). Specifically, the overall control unit 300 links the wavelength sweep operation of the light source 101 shown in FIG. 10(1b) with the exposure operation of the two-dimensional sensor 162 shown in FIG. 10(1c). At this time, as the operation of sweeping the wavelength of the light output from the light source 101, the overall control unit 300 controls the wavelength to be swept stepwise, as shown in FIG. 10(1b).
  • For example, the overall control unit 300 detects the rising edge of the trigger signal 1001 shown in FIG. 10(1a). Based on this detection and preset parameters, it then causes the light source 101 to perform a wavelength scan that steps from the scan start wavelength 1010, holding each wavelength for a single wavelength section 1011, as shown in FIG. 10(1b).
  • In this way, a signal at a single wavelength can be obtained during each exposure of the two-dimensional sensor 162, which makes it possible to improve the resolution.
  • In the present embodiment, the two-dimensional sensor 162 operates with a global shutter at intervals of the exposure section 1081 shown in FIG. 10(1c). Based on the trigger signal 1001, the overall control unit 300 interrupts the current exposure of the two-dimensional sensor 162 shown in the exposure section 1080 and starts the exposure again in the exposure section 1081. With this control, insufficiently exposed data can be excluded from the images (tomographic images and the like) generated by the image generation unit 400, and tomographic images of good image quality can be acquired.
  • the overall control unit 300 controls the light source 101 to return to the scan start wavelength 1010 after performing a wavelength scan with a predetermined number of wavelength steps.
  • the section 1030 corresponds to one wavelength scan.
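  • As general FD-OCT background (not stated in the patent), the size and number of the wavelength steps bound the measurable depth range and the axial resolution:

```latex
% Standard FD-OCT relations (general background, not from the patent).
% \delta k: wavenumber step between successive single wavelength sections;
% \lambda_0: center wavelength; \Delta\lambda: total swept wavelength width.
z_{\max} = \frac{\pi}{2\,\delta k}, \qquad
\delta z \approx \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda}
```

  • That is, finer wavenumber steps extend the depth range, while a wider total sweep improves the axial resolution; this is why stepping through many single wavelength sections per scan matters for image quality.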
  • the flyback section 1020 is an operation section in which a scanner (not shown) used inside the light source 101 to perform wavelength scanning returns to the initial position.
  • In the flyback section 1020, the overall control unit 300 controls the light source 101 so that it is turned off.
  • The exposure section 1083, which falls within the flyback section 1020, is therefore an exposure in which the fundus Er is not captured by the two-dimensional sensor 162. With this control, invalid data can be excluded from the images (tomographic images and the like) generated by the image generation unit 400, and tomographic images of good image quality can be acquired.
  • Further, the overall control unit 300 performs control so that the steering described above is performed in the flyback section 1020; this control enables efficient, high-speed shooting. The overall control unit 300 may also adjust the coherence gate position based on the tomographic image of the partial region 813 during the flyback section 1020, in which case even more efficient high-speed imaging is possible.
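  • A minimal control-loop sketch of this first example is shown below. The LightSource and Sensor objects and all of their methods are hypothetical stand-ins (the patent does not define such an API), and the timings are placeholders:

```python
import time

def run_one_wavelength_scan(light_source, sensor, num_steps, dwell_s):
    # On the trigger: interrupt the running exposure 1080 and restart (1081).
    sensor.abort_exposure()
    sensor.start_exposures()
    light_source.on()
    light_source.goto_start_wavelength()   # scan start wavelength 1010
    for _ in range(num_steps):             # one single wavelength section 1011 per exposure
        time.sleep(dwell_s)                # hold the wavelength during the exposure
        light_source.step_wavelength()
    light_source.off()                     # off during the flyback section 1020
    light_source.flyback()                 # internal scanner returns to its initial position
```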
  • FIG. 10 (2a) to (2c) are timing charts showing time-series operations of the operation method of the light source 101 and the two-dimensional sensor 162 in the second example of the present embodiment. Specifically, FIG. 10 (2a) shows the timing of the exposure operation of the two-dimensional sensor 162, FIG. 10 (2b) shows the exposure timing signal generated by the two-dimensional sensor 162, and FIG. 10 (2c) The timing of the wavelength scan operation of the light source 101 is shown.
  • In the first example described above, the overall control unit 300 performs control to link the operation of the light source 101 and the operation of the two-dimensional sensor 162 with reference to the trigger signal 1001 generated by its own processing.
  • In the second example, by contrast, this linking control is performed with reference to the exposure timing signal of the two-dimensional sensor 162 shown in FIG. 10(2b).
  • That is, the overall control unit 300 can generate the trigger signal 1001 based on the exposure timing signal generated by the two-dimensional sensor 162 (in other words, based on the exposure operation of the two-dimensional sensor 162).
  • the two-dimensional sensor 162 generates an exposure timing signal 1005 for each exposure that converts light into a signal.
  • For example, the overall control unit 300 causes the light source 101 to start a wavelength scan every predetermined number of exposure timing signals 1005. Further, in the second example, to link the operation of the light source 101 with the operation of the two-dimensional sensor 162, the overall control unit 300 drives the light source 101 so that it maintains the off state during a standby section 1021 provided separately from the flyback section 1020. With this control, after the standby section 1021 has elapsed, the light source 101 performs its stepped wavelength scan starting from the single wavelength section 1012 (the same wavelength as the single wavelength section 1011).
  • the operation of the single wavelength section 1012 of the light source 101 is interlocked with the operation of the exposure section 1081 of the two-dimensional sensor 162.
  • the exposure operation of the two-dimensional sensor 162 can be interlocked with the operation of the light source 101 without interruption, so that efficient and high-speed shooting can be performed.
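  • A sketch of this second example follows; the callback-style interface and the method names are hypothetical stand-ins, not an API defined by the patent:

```python
class ExposureSyncedScanner:
    """Starts a stepped wavelength scan every N exposure timing signals 1005."""

    def __init__(self, light_source, exposures_per_scan):
        self.light_source = light_source
        self.exposures_per_scan = exposures_per_scan
        self.count = 0

    def on_exposure_timing_signal(self):
        # Called once per exposure timing signal 1005 from the sensor.
        self.count += 1
        if self.count % self.exposures_per_scan == 0:
            # The off state is held through the standby section 1021, then
            # the next stepped scan begins at single wavelength section 1012.
            self.light_source.start_stepped_scan()
```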
  • FIGS. 10 (3a) to (3b) are timing charts showing time-series operations of the operation method of the light source 101 and the two-dimensional sensor 162 in the third example of the present embodiment. Specifically, FIG. 10 (3 a) shows the timing of the exposure operation of the two-dimensional sensor 162, and FIG. 10 (3 b) shows the timing of the wavelength scan operation of the light source 101.
  • In the first and second examples, the trigger signal 1001 and the exposure timing signal 1005 are used, respectively, to control the operation of the light source 101 and the operation of the two-dimensional sensor 162.
  • In the third example, the control that links the operation of the light source 101 and the operation of the two-dimensional sensor 162 is performed without using the trigger signal 1001 or the exposure timing signal 1005.
  • Instead, the overall control unit 300 controls the operation of sweeping the wavelength of the light output from the light source 101 based on the result of comparing luminance information of the interference image, obtained from the interference signal produced when the two-dimensional sensor 162 detects the interference light 124, with luminance information of a reference image.
  • The interference image of the partial region 813 is sent from the image generation unit 400 to the overall control unit 300 in real time.
  • The overall control unit 300 acquires luminance information of the interference image of the partial region 813 (hereinafter referred to as “partial luminance information”) from the received interference image.
  • Here, the average luminance of the interference image in the partial region 813 is used as the partial luminance information.
  • The overall control unit 300 compares the acquired partial luminance information with reference partial luminance information and detects the exposure section 1085, shown in FIG. 10 (3a), in which the partial luminance information is low.
  • The reference partial luminance information is the average luminance of the interference image (reference image) for each wavelength, acquired in advance and stored in the storage unit 600.
  • At the end of one wavelength scan, the overall control unit 300 sets an adjustment section 1022 based on the exposure section 1087 and the exposure section 1085.
  • To link the operation of the light source 101 with the operation of the two-dimensional sensor 162, the overall control unit 300 drives the light source 101 so as to maintain the off state during the adjustment section 1022, which is provided separately from the flyback section 1020.
  • After the adjustment section 1022 has elapsed, the light source 101 performs a stepped wavelength scan starting from the single wavelength section 1013 (the same wavelength as the single wavelength section 1011).
  • The operation of the single wavelength section 1013 of the light source 101 is thereby interlocked with the operation of the exposure section 1088 of the two-dimensional sensor 162.
  • The exposure operation of the two-dimensional sensor 162 can thus be interlocked with the operation of the light source 101 without interruption, so efficient, high-speed imaging can be performed (a rough sketch of this luminance-based linkage follows below).
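As a rough sketch of this third, luminance-driven linkage: the controller below compares the average luminance of each partial-region interference image against a per-wavelength reference stored in advance, flags low-luminance exposure sections, and inserts an adjustment section before restarting the stepped scan. The class and method names, and the simple threshold test, are illustrative assumptions rather than the application's actual interface.

```python
import numpy as np

# Minimal sketch (hypothetical names): luminance-based linkage of the
# light source and the two-dimensional sensor, using neither a trigger
# signal nor an exposure timing signal.

class LuminanceLinkedController:
    def __init__(self, light_source, reference_luminance, tolerance=0.8):
        self.light_source = light_source
        # reference_luminance[i]: average luminance of the reference
        # interference image at wavelength step i, acquired in advance.
        self.reference_luminance = reference_luminance
        self.tolerance = tolerance        # assumed fraction of the reference
        self.low_sections = []

    def on_partial_image(self, step_index, partial_image):
        """Receives the partial-region interference image in real time."""
        luminance = float(np.mean(partial_image))   # partial luminance information
        if luminance < self.tolerance * self.reference_luminance[step_index]:
            self.low_sections.append(step_index)    # low-luminance exposure section

    def on_scan_end(self):
        """At the end of one wavelength scan, set the adjustment section."""
        if self.low_sections:
            # Keep the source off through the adjustment section so the next
            # single-wavelength section aligns with a fresh exposure.
            self.light_source.turn_off()
        self.low_sections.clear()
        # Restart the stepped scan from the initial single wavelength.
        self.light_source.start_step_scan(from_initial_wavelength=True)
```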
  • As described above, the imaging apparatus 10 includes: the coupler 103 (light branching unit), which branches the light from the light source 101 into the measurement light 121 and the reference light 123; the eyepiece optical system 140 (irradiation means), which irradiates a two-dimensional region of the eye E serving as the measurement target T (more specifically, the fundus Er of the eye E) with the measurement light 121; the two-dimensional sensor 162 (detection means), which has light-receiving elements arranged two-dimensionally and detects the interference light 124 obtained by causing the return light 122 from the eye E to interfere with the reference light 123; and the overall control unit 300 (control means), which performs control to link the operation of the light source 101 with the operation of the two-dimensional sensor 162.
  • With this configuration, a tomographic image of good image quality can be acquired by high-speed imaging.
  • The present invention is not limited to the eye E.
  • Any target of which a tomographic image can be captured using the light source 101 can serve as the measurement target T.
  • Accordingly, the imaging device 10 is not limited to an ophthalmic imaging device.
  • The present invention can also be realized by a process in which a program that implements one or more functions of the above-described embodiments is supplied to a system or apparatus via a network or a storage medium, and one or more processors in a computer of that system or apparatus read and execute the program.
  • It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.
  • This program, and a computer-readable storage medium storing the program, are included in the present invention.

PCT/JP2019/017726 2018-05-24 2019-04-25 Imaging apparatus and control method therefor WO2019225290A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-099493 2018-05-24
JP2018099493A JP7195769B2 (ja) 2018-05-24 2018-05-24 撮影装置及びその作動方法

Publications (1)

Publication Number Publication Date
WO2019225290A1 (ja)

Family

ID=68616604

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/017726 WO2019225290A1 (ja) 2018-05-24 2019-04-25 Imaging apparatus and control method therefor

Country Status (2)

Country Link
JP (1) JP7195769B2
WO (1) WO2019225290A1

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7253253B2 (ja) * 2019-11-07 2023-04-06 株式会社ユニバーサルエンターテインメント Gaming machine
JP7253254B2 (ja) * 2019-11-07 2023-04-06 株式会社ユニバーサルエンターテインメント Gaming machine
JP7253252B2 (ja) * 2019-11-07 2023-04-06 株式会社ユニバーサルエンターテインメント Gaming machine
JP7253250B2 (ja) * 2019-11-07 2023-04-06 株式会社ユニバーサルエンターテインメント Gaming machine
JP7327521B2 (ja) * 2020-01-22 2023-08-16 株式会社ニコン Optical coherence tomograph and method of controlling optical coherence tomograph

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008151155A2 (en) * 2007-05-31 2008-12-11 Board Of Regents, The University Of Texas System Polarization-sensitive spectral interferometry
JP2009042197A (ja) * 2007-08-13 2009-02-26 Topcon Corp Optical image measurement device
JP2013512441A (ja) * 2010-03-17 2013-04-11 Lightlab Imaging, Inc. Method and apparatus for reducing intensity noise for interference sensing and image acquisition systems
US20140028997A1 (en) * 2012-07-27 2014-01-30 Praevium Research, Inc. Agile imaging system


Also Published As

Publication number Publication date
JP2019201952A (ja) 2019-11-28
JP7195769B2 (ja) 2022-12-26

Similar Documents

Publication Publication Date Title
EP3636138B1 (en) Ophthalmic apparatus, controlling method thereof, and recording medium
JP6354979B2 (ja) Fundus imaging apparatus
US10849499B2 (en) Ophthalmologic apparatus and method of controlling the same
WO2019225290A1 (ja) Imaging apparatus and control method therefor
JP6184232B2 (ja) Image processing apparatus and image processing method
JP6349878B2 (ja) Ophthalmologic imaging apparatus, ophthalmologic imaging method, and ophthalmologic imaging program
US10786153B2 (en) Ophthalmologic imaging apparatus
JP6899632B2 (ja) Ophthalmologic imaging apparatus
JP2017221525A (ja) Optical coherence tomography apparatus and optical coherence tomography control program
JP2015029559A (ja) Imaging apparatus and imaging method
US10321819B2 (en) Ophthalmic imaging apparatus
JP2019201951A (ja) Imaging apparatus and control method therefor
JP7104516B2 (ja) Tomographic imaging apparatus
JP6923392B2 (ja) Ophthalmologic apparatus
JP6946643B2 (ja) Optical coherence tomography imaging apparatus
JP2018000620A (ja) Ophthalmologic imaging apparatus
JP6160807B2 (ja) Ophthalmologic imaging apparatus and ophthalmologic imaging program
WO2022186115A1 (ja) OCT apparatus and ophthalmologic image processing program
JP2020168266A (ja) Ophthalmologic optical coherence tomography apparatus and control method therefor
JP2016055123A (ja) Ophthalmologic imaging apparatus, ophthalmologic imaging system, and ophthalmologic imaging program
JP2021094456A (ja) Ophthalmologic apparatus
JP6437055B2 (ja) Image processing apparatus and image processing method
JP2020126071A (ja) Optical coherence tomography apparatus and optical coherence tomography calculation program
JP7309404B2 (ja) Imaging apparatus and control method therefor
JP6836212B2 (ja) Ophthalmologic imaging apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19806903

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19806903

Country of ref document: EP

Kind code of ref document: A1