WO2021193179A1 - Dispositif endoscopique - Google Patents

Dispositif endoscopique Download PDF

Info

Publication number
WO2021193179A1
WO2021193179A1 (PCT/JP2021/010282)
Authority
WO
WIPO (PCT)
Prior art keywords
light
optical
endoscope device
optical fiber
data
Prior art date
Application number
PCT/JP2021/010282
Other languages
English (en)
Japanese (ja)
Inventor
のりこ 安間
達夫 長崎
広朗 長崎
Original Assignee
のりこ 安間
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by のりこ 安間 filed Critical のりこ 安間
Publication of WO2021193179A1 publication Critical patent/WO2021193179A1/fr

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045 - Control thereof
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 - Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 - Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 - Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 - Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/26 - Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes using light guides

Definitions

  • The present invention relates to a compact, small-diameter endoscope device that, without using an imaging optical system or an image sensor, performs pixel-by-pixel focusing to achieve high-definition imaging with a deep depth of focus, continuous switching from wide-field to microscopic magnified imaging, and, in addition, pixel-level spectral analysis.
  • The main issues for conventional endoscope devices are as follows. The size and diameter must be reduced to lessen patient discomfort and to allow observation of narrow spaces in the body.
  • Wide-field, high-definition imaging must be achieved in order to quickly detect lesions of 3 mm or less (which cannot be detected by X-ray imaging or CT).
  • To satisfy both requirements, the diagonal of the image sensor's imaging surface must be kept to 1.4 mm or less, and the pixel pitch would have to be 0.65 μm, which exceeds the practical limits of image-sensor pixel pitch and sensitivity and of optical-system resolution.
  • If the image sensor is also to provide spectrum detection, pixels with dedicated spectral filters must be prepared separately from the RGB-filtered pixels, which makes miniaturizing the image sensor even more difficult.
  • An optical system covering wide-field observation through microscopic magnified observation is large, forcing the diameter of the endoscope device to increase. If an electronic zoom is used to raise the magnification so that the optical system can be kept small, a high-definition image cannot be provided. Displaying a wide-field, high-definition image on a high-definition screen matched to visual resolution leads to quicker detection of a diseased part than displaying an electronically zoomed image whose effective resolution falls needlessly below visual resolution, because visually screening a wide-field, high-definition screen is much faster than operating the endoscope device to screen.
  • Patent Document 1 discloses an endoscope device that does not use an image sensor; it forms an image by scanning spot light in two dimensions.
  • However, high-speed scanning of 15.7 kHz is required for the horizontal lines, and 65.7 kHz for a progressively scanned HD image, which is difficult to achieve.
  • In addition, fatigue (transmission loss) of the optical fiber sets in within minutes because of the limited bending durability of the optical fiber.
  • In Patent Document 1, further thinning of the endoscope device is also limited by the size of the imaging optical system.
  • If the imaging optical system is to cover everything from wide-field imaging to microscopic magnified imaging, it becomes larger and reducing the diameter becomes difficult.
  • The present invention has been made in view of such circumstances. In the present invention, the imaging optical system, image sensor, and the like provided in a conventional endoscope device are replaced with a single optical fiber and a one-dimensional scanning mechanism, and the reflected light received by the optical fiber is processed outside the imaging unit.
  • This solves both the problems caused by providing an image sensor and the like and the problems caused by the high-speed horizontal scanning of Patent Document 1.
  • The purpose is thus to reduce the size and diameter of the part of the endoscope device inserted into the human body.
  • To solve the above problems collectively, the present invention is an imaging-type endoscope device that uses neither an imaging optical system nor an image sensor.
  • The outline of the imaging method of the present invention will be described with reference to FIG. 1.
  • The diffused light 10 is obliquely irradiated onto the subject surface 3, and the reflected light obtained is subjected to OCI (Optical Coherence Imaging) processing to perform resolution in the line 4 direction.
  • While the resolution in the line 4 direction is repeated, the endoscope device performs pipeline processing to resolve the scanning direction Y orthogonal to the line 4 direction, thereby imaging the subject surface 3.
  • The OCI process is technically based on the OCT (Optical Coherence Tomography) process, but since the detection target and purpose differ, the name OCI is used in this application. The details of the OCI processing are described later.
  • The principle of the above resolution will be described with reference to FIGS. 2(a) to 2(c).
  • Resolution in the direction in which the illumination light is diffused is performed by the OCI process; as shown in FIG. 2(a), this gives the same resolution as transmitting and receiving a spherical light pulse P1.
  • Resolution in the scanning direction Y, orthogonal to the central axis X of the diffused light 10 (see FIG. 1), is obtained by the synthetic aperture processing; as shown in FIG. 2(b), this gives the fan-shaped resolution P2, and the overall resolution is the product of the two.
  • The overall resolution is therefore the same as if arc-shaped light pulses 5 with a thickness on the order of microns were transmitted and received, like a radar, to obtain the reflected light.
  • Since the arc-shaped light pulse 5 enters the subject surface 3 obliquely within the diffusion range, in-focus reflected light is obtained at any position on the line 4 and resolution is performed for each pixel. The depth of focus is therefore deep, and by switching between the OCI range OC and the synthetic aperture range SA, continuous switching from wide-field imaging to microscopic magnified imaging becomes possible.
  • In other words, the imaging method of the present invention provides dynamic focusing that greatly extends the depth of field while maintaining high resolution by electrical processing, and a super zoom that switches continuously from wide-field, high-definition imaging to microscopic magnified imaging.
  • When the diffused light 10 illuminates the subject at an oblique angle θ, the image obtained by the endoscope device of the present invention is the same as an image observed from the observation direction 7, the tangential direction of the wavefront 5 at the reflection point 6 where the line 4 and the wavefront 5 of the illumination light intersect.
  • For a translucent subject, the reflected signals (reflected brightness) from the same wavefront inside the subject are superimposed, so the image obtained is the same as a transmission image observed from the tangential direction (observation direction 7).
  • Near-infrared light (wavelength 0.68 μm to 1.5 μm) can also be used, and the irradiation angle θ of the illumination is adjusted according to the purpose.
  • a first aspect of the endoscope device of the present invention is an optical fiber that guides light supplied from a light source, emits it as diffused light, and receives reflected light returned from a subject.
  • An optical interference resolution processing unit that performs resolution processing in the traveling direction of the wavefront of the diffused light according to the imaging principle of the optical coherence tomography method.
  • a scanning mechanism that scans the diffused light in a scanning direction that intersects the central axis of the diffused light.
  • a storage unit that stores a data string generated by the resolution processing of the optical interference resolution processing unit; and a synthetic aperture processing unit that extracts, from the data string stored in the storage unit, data matching the optical path length from one end of the optical fiber to the position of the pixel to be detected, and adds them.
  • the second aspect of the endoscope device of the present invention is the endoscope device according to the first aspect, in which the scanning direction is orthogonal to the central axis of the diffused light.
  • the third aspect of the endoscope device of the present invention is the endoscope device according to the first or second aspect, and the light source produces wideband light or wideband wavelength sweep light.
  • A fourth aspect of the endoscope device of the present invention is the endoscope device according to any one of the first to third aspects, in which the optical interference resolution processing unit operates on the reflected light received by the optical fiber.
  • A sixth aspect of the endoscope device of the present invention is the endoscope device according to any one of the first to fifth aspects, provided with a discriminating means that calculates, from the reflection spectra of known subjects forming a plurality of clusters, spectral components in descending order of Fisher ratio, and uses those spectral components to discriminate the cluster from the reflection spectrum of an unknown subject.
  • the seventh aspect of the endoscope device of the present invention is the endoscope device according to the sixth aspect, and the identification means uses an AI that performs deep learning.
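As a rough structural sketch of how the components enumerated in the first aspect above fit together (an illustrative assumption, not the patent's implementation; all class, method, and parameter names below are invented for this sketch), the fiber readout, the OCI resolution step, the per-position storage of data strings, and the synthetic aperture addition can be arranged as a simple pipeline:

```python
import numpy as np

class EndoscopeImagingPipeline:
    """Illustrative skeleton: optical fiber + 1-D scan + OCI resolution + synthetic aperture."""

    def __init__(self, n_scan_positions, n_wavenumber_samples):
        self.n_scan = n_scan_positions           # positions of the one-dimensional scanning mechanism
        self.n_k = n_wavenumber_samples          # spectral samples from the line detector
        # storage unit: one complex data string per scan position
        self.storage = np.zeros((n_scan_positions, n_wavenumber_samples // 2), dtype=complex)

    def oci_resolve(self, interference_spectrum):
        """Resolution along the line direction: FFT of the detected interference spectrum
        (spectral-domain, OCT-style processing)."""
        depth_profile = np.fft.rfft(interference_spectrum)
        return depth_profile[: self.n_k // 2]

    def store(self, scan_index, data_string):
        self.storage[scan_index] = data_string   # data string generated by the OCI step

    def synthetic_aperture(self, pixel_opl_per_scan, aperture_indices, opl_axis):
        """Extract and add the data whose optical path length matches the pixel to be detected."""
        total = 0.0 + 0.0j
        for i in aperture_indices:
            j = np.argmin(np.abs(opl_axis - pixel_opl_per_scan[i]))  # address matching this pixel's OPL
            total += self.storage[i, j]
        return np.abs(total)
```

In use, `oci_resolve` and `store` would run once per position of the one-dimensional scanning mechanism, after which `synthetic_aperture` is evaluated for every pixel to be detected.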
  • the imaging unit of the endoscope device since the imaging unit of the endoscope device has a simple configuration including an optical fiber and a scanning mechanism in a one-dimensional direction, it is possible to make the diameter of the insertion unit of the endoscope device extremely small.
  • In the endoscope device of the present invention, horizontal scanning is performed by electrical processing (for example, Fourier transform processing), so the high-speed horizontal scanning required for HD, 4K, and 8K can be realized, and image refresh requires only a scanning mechanism with a low vertical scanning frequency (for example, 60 Hz).
  • Despite its small size and small diameter, the endoscope device of the present invention is therefore capable of high-definition HD, 4K, and 8K imaging, can switch continuously from such images to microscopic magnified images, and can increase the depth of field electrically.
  • Although the endoscope device of the present invention is small and has a small diameter, it can identify the substance of the subject at the pixel level by spectral analysis.
  • Since the endoscope device of the present invention does not use a heat-sensitive image sensor, autoclave sterilization is possible.
  • FIG. 1 is a block diagram showing the structure of an endoscope device according to an embodiment. FIGS. 2(a) to 2(c) illustrate the principle of imaging by the endoscope device of the embodiment: (a) shows the resolution by OCI, (b) shows the resolution by the synthetic aperture, and (c) schematically shows a light pulse diffused on the subject surface. FIG. 3 illustrates the observed image. FIG. 4 illustrates the relationship of the round-trip optical path length between the fiber end emitting the irradiation light and an arc. FIG. 5 is a block diagram showing the structure of a scanning mechanism.
  • FIGS. 6(a) and 6(b) are diagrams for explaining imaging of a living body by OCT processing and by OCI processing, respectively.
  • A block diagram shows OCI processing using a broadband light source, and a further figure explains OCI processing by Fourier analysis.
  • A block diagram shows OCI processing using a wavelength-swept light source.
  • A conceptual diagram shows the concept of the synthetic aperture in the sector scan method.
  • A block diagram shows an example of interpolation processing that complements the acquired data and of synthetic aperture processing by Fourier transform.
  • A block diagram shows the structure of a scanning mechanism using a MEMS mirror.
  • (a) and (b) are configuration diagrams of a scanning mechanism that vibrates the optical fiber end with a piezoelectric bimorph, and (c) is a configuration diagram of a scanning mechanism in which two piezoelectric bimorph oscillators with orthogonal vibration directions give the optical fiber end a rotational vibration.
  • the reflected light from the arc ARCs (ARC1 and ARC2) is superimposed in the same phase because the round-trip optical path lengths from the optical fiber end 51 are the same, and is received by the optical fiber end 51.
  • Line 4 shows one of the lines on the subject surface 3 orthogonal to the arc ARC.
  • the optical path length OPL from the optical fiber end 51 to the arc ARC gradually increases as the position of the arc ARC becomes farther from one end (line end) 104 of the line 4.
  • the reflected light received at the optical fiber end 51 is guided to the OCI processing unit 207 via an optical circulator (see reference numeral 203 in FIG. 1).
  • In the OCI processing unit 207, by the same processing as OCT, resolution is performed in the direction orthogonal to the arc ARC (the longitudinal direction of the line 4) by utilizing the difference in the round-trip optical path length OPL from the optical fiber end 51 to the arc ARC shown in FIG. 4.
  • The operation of the OCI processing unit 207 will be described in detail in Sections 2) to 5) below.
  • Next, the diffused light 10 is scanned in the scanning direction Y intersecting the central axis X by the one-dimensional scanning mechanism 8, and the data strings required for the synthetic aperture are acquired and stored in the storage unit 211.
  • FIG. 5 shows an example of the scanning mechanism.
  • Reference numeral 511 indicates an optical fiber that rotates and reciprocates while obliquely illuminating the subject surface 3, 555 indicates a microgalvano scanner, 555 indicates an optical rotary joint, 112 indicates an optical fiber for transmission, and 12 indicates a GRIN (Gradient Index) lens.
  • the GRIN lens 12 is used not for forming an image but for adjusting the diffusion distribution of light intensity, and the aperture diameter can be made relatively small.
  • Instead, a rotary scanning micromotor may be used; in that case the coil spring 533 becomes unnecessary. Micromotors of 0.9 mm and 0.6 mm diameter are commercially available, so size is not a concern. Examples of other scanning mechanisms are described later.
  • The synthetic aperture processing unit 213 synthesizes the aperture in the scanning direction Y, and resolution in the scanning direction Y is thereby performed.
  • The operation of the synthetic aperture processing unit 213 will be described in detail in Sections 6) and 7) below.
  • The display processing unit 205 performs matrix conversion to an RGB image, processing for displaying the AI determination result, scanning conversion matched to the display device 215, and the like, and the display device 215 displays the RGB image and the spectrum image.
  • OCI processing is optical interference resolution processing (processing based on the imaging principle of optical coherence tomography) built on OCT processing, but because the purposes differ as described below, the two processes also differ.
  • Since the purpose of OCT processing is to detect an internal tomographic image from the living-body surface OS, the depth of field D1 is set deep as shown in FIG. 6(a); the numerical aperture NA1 of the optical system therefore cannot be increased (the resolution cannot be raised), and NA1 is kept relatively small.
  • In addition, the wavelength band used in OCT processing is limited to near-infrared light with high biological transparency, and wavelength-dependent attenuation and scattering are superimposed during propagation in the living body, so RGB image generation and quantitative spectral analysis are not possible.
  • Since the OCI process aims to detect an image of the biological surface OS, which is the surface of the subject, the numerical aperture NA2 can be set relatively large, as shown in FIG. 6(b).
  • Because the irradiation light does not have to propagate through the living body, it is not attenuated, so there is a margin in sensitivity, and high microscopic resolution can be realized by restricting the aperture of the optical fiber end 51 with a pinhole.
  • Broadband light can be used and, unlike OCT processing, there is no superimposed wavelength-dependent attenuation or scattering from propagation through the body, so RGB image detection and quantitative spectrum analysis are possible.
  • In OCI, OCT processing technology is used as a base: OCT-style processing is performed with broadband light, and processing for RGB image detection and spectrum analysis is added.
  • RGB image detection and spectrum analysis are described in detail in [2. Spectrum analysis].
  • The OCI processing unit using a broadband light source will now be described. As shown in FIG. 7, the light of the broadband light source 21 is divided into two by the branch coupler 22, and one of the branched beams is reflected by the reflector 24 via the optical circulator 23 to obtain the reference light 25.
  • the other branched light is guided to the endoscope tip portion 201 by the optical fiber 1 via the optical circulator 26, and is obliquely irradiated to the subject surface 3 as shown in FIG.
  • the reflected light from the arc ARC shown in FIG. 4 is received by the optical fiber 1, and is interfered with the reference light by the interference coupler 27 via the optical circulator 26.
  • Interference fringes are generated at a frequency proportional to the difference between the optical path length OPL of the reflected light and that of the reference light 25, and such fringes are superimposed for each of the reflection points on the line 4. The longer the optical path length OPL of the reflected light relative to that of the reference light 25, the higher the frequency of the interference fringes.
  • The output light of the interference coupler 27 is split into wavenumber (reciprocal-wavelength) components by the toroidal grating spectroscope 28; receiving these components with the light-receiving line detector 29 removes the optical carrier, and an electric signal in which the wavenumber components are arranged in time series is obtained.
  • This electric signal contains components representing the interference fringes generated according to the differences in round-trip optical path length between the reflected light and the reference light 25, superimposed for the number of reflection points on the line 4.
  • When it is Fourier transformed, each frequency component corresponds to the difference between the optical path length of the reference light 25 and the optical path length OPL of an arc-shaped reflection point ARC, and the amplitude of each frequency component corresponds to the reflection intensity.
  • In this way, resolution is performed in the direction orthogonal to the arc-shaped reflection points ARC, and the reflected signal from each arc-shaped reflection point ARC is obtained.
  • As the wavelength band is widened, the number of interference-fringe cycles increases in proportion, so the width of a single peak after the Fourier transform of the wavelength signal becomes narrower and the resolution in the line 4 direction increases.
  • That is, the resolution δ is governed by the width of the wavelength band; in the expression for δ, Δλ denotes the wavelength bandwidth, λc the central wavelength, and θ the angle at which the central axis X of the diffused light intersects the line 4.
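For orientation only, a common spectral-domain OCT estimate (assuming a Gaussian source spectrum) gives the range resolution, and dividing by the projection of the line direction onto the illumination axis, set by the angle θ defined above, gives a rough line-direction resolution; the patent's own expression may differ in constant factors and in exactly how θ enters:

```latex
\delta_{\text{range}} \approx \frac{2\ln 2}{\pi}\,\frac{\lambda_c^{2}}{\Delta\lambda},
\qquad
\delta \approx \frac{\delta_{\text{range}}}{\cos\theta}.
```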
  • In addition, as the wavelength band is widened the output of the light-receiving line detector 29 becomes larger, so the sensitivity of the OCI process also increases.
  • The following is a supplementary explanation of OCI processing by Fourier analysis.
  • The first term of the equation shown in FIG. 8(a) schematically represents the broadband illumination light.
  • The second term in FIG. 8(a) expresses, with a δ function, the relationship between the optical path lengths of the reference light 25 and of the reflected light when there is a single reflection point on the line 4.
  • The value L indicates the round-trip optical path length of the reflected light when the optical path length OPL of the reference light 25 is taken as 0.
  • The process of interfering the reference light 25 with the reflected light can then be expressed as the superposition integral (*) of the illumination light with the reflection positions of the reference light 25 and the reflected light, as in the equation of FIG. 8(a).
  • In the second term of FIG. 8(a), k indicates the amplitude of the reference wave and A the amplitude of the reflected wave.
  • Splitting the composite product of the equation of FIG. 8(a) with the toroidal grating spectroscope 28 (see FIG. 7) and converting it into an electric signal with the light-receiving line detector 29 is equivalent to Fourier transforming (spectrally decomposing) that equation and removing the optical carrier; by the superposition-integral theorem, the electric signal of the third term of FIG. 8(b) is obtained as the product of the wavenumber band of the illumination light in the first term and the interference fringes of the second term.
  • The first term in FIG. 8(b) represents the wavenumber components of the broadband light.
  • Since the second terms of FIGS. 8(a) and 8(b) form a Fourier-transform pair, the waveform after the Fourier transform is easy to understand if the position axis (time axis) and the wavenumber axis (frequency axis) are exchanged.
  • When the position (distance) axis is Fourier transformed it becomes the wavenumber (spatial-frequency) axis, but since the signal is converted into a time-series electric signal by the light-receiving line detector 29 (see FIG. 7), the wavenumber axis in FIG. 8(b) is also the time axis.
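The spectral-domain processing described above can be illustrated with a minimal numerical sketch (all values below, such as the wavelength range, reflector positions, and reflectivities, are assumptions chosen for illustration): an interference spectrum sampled uniformly in wavenumber is Fourier transformed, and each reflection point on the line appears as a peak at a position proportional to its path-length difference from the reference.

```python
import numpy as np

# Assumed broadband source: 0.4-0.8 um, sampled uniformly in wavenumber (rad/um)
n = 2048
k = np.linspace(2 * np.pi / 0.8, 2 * np.pi / 0.4, n)

# Assumed round-trip path-length differences (um) and reflectivities of points on the line
opl = np.array([30.0, 55.0, 80.0])
refl = np.array([1.0, 0.6, 0.3])

# Spectral interferogram: fringes per reflection point (DC and autocorrelation terms ignored)
spectrum = np.zeros(n)
for L, a in zip(opl, refl):
    spectrum += a * np.cos(k * L)          # fringe frequency grows with path difference

# Fourier transform over wavenumber -> peaks at the path-length differences
profile = np.abs(np.fft.rfft(spectrum * np.hanning(n)))
dk = k[1] - k[0]
axis = 2 * np.pi * np.arange(profile.size) / (n * dk)   # path-length axis in um

for L in opl:
    idx = np.argmin(np.abs(axis - L))
    peak = np.argmax(profile[idx - 5:idx + 6]) + idx - 5
    print(f"expected {L:5.1f} um -> peak near {axis[peak]:5.1f} um")
```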
  • In this case, light whose wavenumber (the reciprocal of the wavelength) is linearly swept by the wavelength-swept light source 31 is input to the branch coupler 32, and the output light of the interference coupler 33 is converted into an electric signal by the light-receiving detector 34.
  • The other processing is the same as the OCI processing with the broadband light source 21 described with reference to FIG. 7.
  • In this configuration the grating spectroscope 28 and the light-receiving line detector 29 shown in FIG. 7 are not required, and a single light-receiving detector 34 with good sensitivity and SN can be used.
  • The light source, or a combination of light sources, is selected according to the purpose, taking into account applications such as the multispectral analysis described later.
  • The interpolation circuit 46 shown in FIG. 11 performs amplitude and phase interpolation using the data of the preceding and following addresses, improving the accuracy of the read data AD-1 to AD-n.
  • The addresses of the data AD-1 to AD-n to be read lie, to a first-order approximation, on a parabola 175.
  • When the phase shifts corresponding to the optical path differences OPD-1 to OPD-n of the read data AD-1 to AD-n are matched and the data are added, the result is the same as if the pixel 173 had been detected by an optical system with an aperture of the same size.
  • The amplitude of the detected pixel 173 grows in proportion to the number of data strings DL-1 to DL-n, while for the other pixels within the synthetic aperture 171 the phases (positions) of the arcs shift and cancel each other out.
  • A supplementary explanation of this phase cancellation is given in Section 7) using Fourier analysis.
  • The correlation calculation unit 48 performs, along the direction of the data string 177, a correlation calculation between the data DL-1 to DL-n and the reference signal RS used for phase matching.
  • The range 177 of the data strings DL-1 to DL-n (the range 171 of the synthetic aperture) and the reference signal RS must be switched adaptively according to the optical path length of the pixel 173 to be detected (that is, according to the reflection position on the line 4).
  • The reference signal RS required for each pixel 173 to be detected (synthesized) is stored in the look-up table of the reference signal generation unit 45.
  • Since the reference signal RS is a function of the optical path length of the pixel 173 to be detected, it may instead be generated by calculation.
  • The range of data over which the Fourier transform 401 is performed is the range of data stored in the scanning direction Y.
  • When the reference signal in the scanning direction Y is constant, as in the scanning method of FIG. 5, the Fourier-transform method 401 of FIG. 12 requires fewer multiplications than the correlation-calculation method.
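A minimal delay-and-sum sketch of the synthetic aperture step just described (the geometry, wavelength, scan pitch, and function names are assumptions, not the patent's parameters): for each scan position the round-trip optical path length to the target pixel is computed, the complex OCI data are read at the matching address, phase-corrected, and summed. In the patent this phase correction corresponds to the reference signal RS read from the look-up table of FIG. 11, or is absorbed into the Fourier transform of FIG. 12 when RS is constant along Y.

```python
import numpy as np

wavelength = 0.6e-6                       # assumed central wavelength (m)
k0 = 2 * np.pi / wavelength

# Assumed fiber-end positions along the scanning direction Y (the aperture to be synthesized)
y_scan = np.arange(-170e-6, 170e-6, 1e-6)          # 1 um interval P, about 340 positions
target = np.array([0.0, 1.0e-3])                   # assumed pixel at (y, range) = (0, 1 mm)

def round_trip_opl(fiber_y, pixel):
    """Round-trip optical path length from one fiber-end position to the pixel."""
    return 2.0 * np.hypot(pixel[0] - fiber_y, pixel[1])

def synthesize(oci_data, opl_axis, pixel):
    """Delay-and-sum: pick the sample whose OPL matches, undo its phase, and add.
    oci_data: iterable of complex OCI data strings, one per position in y_scan.
    opl_axis: optical path length corresponding to each address in a data string."""
    acc = 0.0 + 0.0j
    ref_opl = round_trip_opl(0.0, pixel)                     # OPL at the aperture centre
    for yi, line in zip(y_scan, oci_data):
        opl = round_trip_opl(yi, pixel)
        sample = line[np.argmin(np.abs(opl_axis - opl))]     # address matching this pixel's OPL
        acc += sample * np.exp(-1j * k0 * (opl - ref_opl))   # reference-signal-style phase correction
    return np.abs(acc)
```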
  • The small-diameter endoscope (2 mm in diameter) shown in FIG. 5 is passed through the forceps channel (2 to 3 mm in diameter) 405 of an existing endoscope (a 6 mm diameter thin endoscope) 407.
  • It is pressed against the subject surface to prevent blurring and to increase the numerical aperture of the synthetic aperture.
  • The observer first screens for the diseased part with the wide-field image of the parent endoscope 407, then performs high-definition observation of the region of interest with the small-diameter endoscope 500 through the forceps channel 405. The observer continues screening while shortening the observation distance of the high-definition image (that is, while enlarging the image), and presses the small-diameter endoscope 500 against any screened site that requires cytological examination to observe the magnified microscopic image.
  • The small-diameter endoscope 500 has a deep depth of focus and switches automatically from high-definition imaging to microscopic magnified imaging simply as the distance to the subject changes, so this operation is easy to perform.
  • The small-diameter endoscope 500 has a first flexible portion 409 that can be bent by wire control and a spring-shaped second flexible portion 411 that can also bend.
  • The optical fiber end 51 is rotationally scanned over an arc-shaped scanning range 343 at a radius of 0.7 mm from the center (rotary scanning axis) R of the endoscope device 301, and the imaging range lies about 1 mm from the optical fiber end 51.
  • The resolution needed to obtain 1000 x 1000 pixels over the 2 x 2 mm imaging range 571 is 2 μm or less, according to the sampling theorem.
  • When the resolution δ in the line 4 direction is calculated from the OCI resolution formula given above, a resolution of 2 μm or less is easily achieved by using the visible light band for the diffused light.
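As a rough check under assumed numbers (a visible band of 0.4 to 0.7 μm, so Δλ ≈ 0.3 μm and λc ≈ 0.55 μm, with the angular factor ignored), the standard OCT-style estimate indeed falls below the 2 μm pitch required by the sampling theorem for 1000 x 1000 pixels over 2 x 2 mm:

```latex
\frac{2\,\text{mm}}{1000} = 2\,\mu\text{m},
\qquad
\delta \approx \frac{2\ln 2}{\pi}\cdot\frac{(0.55\,\mu\text{m})^{2}}{0.3\,\mu\text{m}}
\approx 0.9\,\mu\text{m} \;<\; 2\,\mu\text{m}.
```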
  • Reference numeral 511 is an optical fiber for illumination, 525 is a bearing, 259 is a magnet, 556 is a micromotor, 533 is a coil spring, 533 is an optical rotary joint, 513 is an electric coil, and 571 is the 2 x 2 mm imaging range.
  • Reference numeral 341 in FIG. 15 indicates a surface mucous membrane, and 343 indicates a scanning range of 1.34 mm at the end 51 of the optical fiber.
  • The interval P at which the diffused light 10 is rotationally scanned to acquire the data strings DL-1 to DL-n of FIG. 10 is examined by Fourier analysis.
  • Since the aperture 171 to be synthesized is sampled discretely, it can be expressed, as in the equation of FIG. 16(a), using the comb function of the first term, which represents the interval P at which the data strings DL-1 to DL-n are acquired.
  • Since the synthetic aperture processing is a Fourier transform, by the superposition-integral theorem it can be expressed, as in the equation of FIG. 16(b), as the comb function with interval 1/P obtained by Fourier transforming the comb function, convolved with the sinc function obtained by Fourier transforming the rectangular aperture, and multiplied by the distribution of the diffused light at the focal point (the light-receiving sensitivity distribution of the optical fiber) in the third term.
  • The axis of the equation of FIG. 16(b) is expressed in spatial frequency; converted into units of distance, it gives the waveform of FIG. 16(c).
  • As the interval P becomes wider, the spacing between the 0th-order and ±1st-order lobes of the waveform in FIG. 16(c) becomes narrower, and the ±1st-order PSF enters the 0th-order PSF (Point Spread Function, that is, the light-receiving sensitivity distribution) as an artifact (a virtual image that does not actually exist).
  • The integrated value of the ±1st-order PSF (side lobe) overlapping the 0th-order PSF, that is, the artifact, must be kept smaller than the noise of the signal.
  • The ±1st and higher orders cannot be suppressed simply by narrowing the diffusion range DA of the diffused light 10 (see FIG. 1); instead, the effect of the ±1st order is suppressed by adjusting the interval P.
  • The smaller the interval P, the wider the spacing between the 0th and ±1st orders and the smaller the influence of the ±1st order, but the number of data increases correspondingly, so P is set by weighing SN against the amount of calculation (circuit scale).
  • In this embodiment, the interval P in the equation of FIG. 16(a) is set to 1 μm, the optical fiber end 51 is scanned over the 1.34 mm scanning range 343 (a scanning angle of 68 degrees) including the aperture to be synthesized, 1340 data strings are acquired, and the synthetic aperture processing is performed using the 340 of those data strings that correspond to the size of the aperture.
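The role of the interval P can be visualized numerically with the sketch below (the pitch, aperture size, and the Gaussian stand-in for the diffused-light receive-sensitivity distribution are assumptions for illustration): a 340 μm aperture sampled at pitch P is Fourier transformed, the result is weighted by the sensitivity distribution, and the 0th-order PSF is compared with the ±1st-order replica spaced at 1/P.

```python
import numpy as np

P = 1e-6                  # assumed acquisition interval (m)
aperture = 340e-6         # assumed synthesized aperture (m), cf. the 340 data strings above
dx = P / 10               # fine spatial grid so the +/-1st-order replicas at 1/P are representable
n = 1 << 16

# comb(P) x rect(aperture): unit samples at the acquisition positions on the fine grid
sampled = np.zeros(n)
positions = np.arange(0.0, aperture, P)
sampled[(positions / dx).round().astype(int)] = 1.0

# Fourier transform of the sampled aperture -> 0th-order PSF plus replicas at multiples of 1/P
spectrum = np.abs(np.fft.fft(sampled))
u = np.fft.fftfreq(n, d=dx)                            # conjugate coordinate; replicas appear at 1/P

# Gaussian stand-in for the diffused-light / receive-sensitivity distribution of the fiber
sensitivity = np.exp(-0.5 * (u / (0.35 / P)) ** 2)     # assumed width: 0.35 of the replica spacing
psf = spectrum * sensitivity

main = psf[0]
first = psf[np.argmin(np.abs(u - 1.0 / P))]
print(f"0th order: {main:.1f}  +1st order artifact: {first:.2f}  ratio: {first / main:.2e}")
```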
  • FIG. 17 shows a scanning method for scanning diffused light 10 (see FIG. 3).
  • In FIG. 17, a is a linear method, b is a convex method, c is a sector scan method, and d is a multi-scan method combining a, b, and c.
  • The aperture can be synthesized with any of these scanning methods.
  • The convex method b is convenient for imaging the inside of a blood vessel or the airway, and the linear method is convenient for imaging a joint or the like, so the scanning method can be chosen according to the shape of the subject.
  • As described later, the multi-scan method d is used to suppress interference patterns.
  • the scanning method in FIG. 5 corresponds to the convex method in b.
  • the diameter of the endoscope can be reduced because the scanning range 343 (see FIG. 15) for scanning the diffused light 10 may be the size of the opening.
  • Since the scanning range 343 over which the synthetic aperture is possible is the area where the diffusion ranges DA (see FIG. 1) of the scanned diffused light 10 overlap, the diffusion range DA of the illumination must be widened by the amount by which the viewing angle is widened.
  • In addition, the reference signal RS must be generated in accordance with the deflection of the central axis X, so the number of reference signals RS (see FIGS. 11 and 12) increases.
  • However, the amount of synthetic aperture calculation can be reduced considerably merely by allowing the peripheral resolution δ to fall by about 3 dB relative to the center of the diffusion range DA.
  • In practice, the aperture AP (the scanning range of the diffused light) is chosen to suit the pixel 42 (see FIG. 18) with the longest optical path length, the interval P to suit the pixel 42 with the shortest optical path length, and the diffusion range DA of the diffused light 10 (see FIG. 1) to suit the pixel 42 with the largest deflection angle (see FIG. 18).
  • FIG. 18 shows a conceptual diagram of phase matching of the sector scan method c.
  • Reference numeral AP in FIG. 18 indicates an opening required for detection.
  • The wavelength band for spectrum analysis spans from the visible region, excluding highly invasive ultraviolet rays and X-rays, to the infrared and terahertz regions.
  • Reflection and absorption in each wavelength band: the mechanism of light reflection and absorption in the living body differs depending on the wavelength band. In the visible band, changes in the spectral components absorbed by exciting the vibrations and spins of biomolecules are superimposed on the Rayleigh-scattered reflected light to form color.
  • The optimal spectral-space axes for identifying the substance (the axes with the largest Fisher ratio between the clusters to be identified) are determined in advance by multivariate analysis, and the number of axes is determined from the cumulative contribution rate.
  • Assigning the information obtained on those axes to the axes of the (three-dimensional) color space with high visual resolution (for example, YIQ, whose visual resolution ratio is 4 : 1.5 : 0.5) effectively supports diagnostic imaging; once displayed in the color space, the observer's visual brain performs the non-linear discrimination.
  • Alternatively, the clusters formed in that axis space may be narrowed down to three axes by AI (deep learning), which is good at non-linear identification, converted as described above to the axes of a color space that human vision recognizes easily, and displayed to support diagnosis.
  • The scale of the AI can be reduced significantly by narrowing down the number of AI inputs through multivariate analysis as preprocessing; as with data compression, the number of input axes can be narrowed to at most five or six.
  • AI is better than humans at learning a large amount of information in a short time and at giving answers quickly from a large amount of information, provided the scene and scope of application are limited.
  • The reason is that AI does not tire, can learn a large amount of information day and night at electronic speed, and does not forget either the information used for learning or the learning result (in humans, forgetting is said to be a means of debugging the brain); if the scenes and scope of application are limited, it is therefore often superior to humans.
  • AI is effective in cases such as multispectral analysis, where the spectral pattern recognition itself is simple but the number of variables is large and unstable noise from various sources, such as Rayleigh scattering, is mixed in.
  • First, the endoscope device acquires as much multispectral image data as possible by dividing the visible-to-infrared wavelength band as finely as possible. Tags consistent with the definitive diagnosis and the information needed for preprocessing to reduce cluster dispersion (for example, the wavelength-band characteristics of the light source at the time of data acquisition and the wavelength-band characteristics up to conversion into an electric signal by the light-receiving line detector) are attached to the image data, which are sent to an external computer's storage device and stored.
  • Next, the optimal spectral-space axes for identifying the target substance are calculated by principal component analysis or the FS (Foley-Sammon) transform. Since the spectral-space axes are obtained in descending order of contribution rate, their number is narrowed down in the same way as for data compression.
  • the accumulated multi-spectral data is projected onto the narrowed-down spectral space axis, and the accumulated multi-spectral data is used as training data to cause AI on the computer to perform "supervised learning" to identify the cluster.
  • the configuration of AI is the same as the configuration of AI installed in the endoscope device.
  • The basis-vector components of the narrowed-down spectral space axes and the knowledge data learned by the AI are sent from the computer to the endoscope device, where they are stored and used for identification; these coefficients are switched for each substance to be identified.
  • The number of narrowed-down spectral space axes is at most five or six (in the experience of the inventors), so the AI is small in scale, easy to incorporate into the endoscope device, and fast at identification, making real-time substance identification in vivo possible.
  • In other words, the endoscope device has two functions: it acquires a large number of multispectral data in vivo and stores them in an external storage device, and it detects the signals of the spectral axes determined by multivariate analysis to enable real-time substance identification in vivo. Examples of these two functions are described in Section 5).
  • FIG. 20 shows the wavelength bands of various spectral images generated by the Fourier transform.
  • the output of the light receiving line detector 29 may be multiplied by a predetermined coefficient to perform a Fourier transform to generate a Y (luminance) signal.
  • the band including the near-infrared region 85 having good biopermeability shown in FIG. 20 may be Fourier transformed and generated as a W signal 81.
  • the FFT 62 of FIG. 19 performs the Fourier transform of the R band shown in FIG. 20 to generate the R signal 82. Pixels of the R signal 82 are interpolated by the interpolation memory unit 63 shown in FIG. 19, and the number of pixels and the time axis are synchronized with the W signal 81. Subsequently, the FFT 62 performs the Fourier transform of the B band shown in FIG. 20 to generate the B signal 83.
  • the B signal 83 is also interpolated by the interpolation memory unit 64 shown in FIG. 19, and the number of pixels and the time axis are synchronized with the W signal 81.
  • The data strings of the W, R, and B signals are stored in the memories 211-1 to 211-3 of the storage unit 211; the synthetic aperture processing unit 213 then applies synthetic aperture processing 213-1 to 213-3 to the W, R, and B signals, after which the matrix conversion unit 205-1 performs matrix conversion to an RGB signal to generate the video signal.
  • Since the wavelength bandwidth of the R signal 82 and the B signal 83 is one third of that of the W signal 81, their resolution is also one third, but this poses no problem because the resolution of the human eye for R and B is about one third of that for luminance.
  • It is also possible to multiply the output of the light-receiving line detector 29 (see FIG. 7) by the coefficients of the XYZ color-matching functions and Fourier transform the result to obtain XYZ signals.
  • Similarly, the R signal 82 and the B signal 83 can be generated by Fourier transforming the divided R and B bands (see FIG. 20).
  • Even when the broadband light source 21 (see FIG. 7) is the linear sum of a plurality of light sources such as R, G, B, and infrared, everything from the illumination to the Fourier transform is a linear process; by the principle of superposition, extracting the signals 82 and 83 corresponding to the R and B bands from the output of the light-receiving line detector 29 and Fourier transforming them is therefore the same as acquiring the signals separately with single R and B light sources and Fourier transforming them.
  • More generally, by multiplying the output of the light-receiving line detector 29 by predetermined coefficients to select each spectral band, images corresponding to the respective spectra can be obtained.
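A minimal sketch of the band splitting just described (band edges, weights, and array shapes are assumptions): the wavenumber-ordered detector output is multiplied by per-band coefficients and Fourier transformed band by band to obtain a line image for each spectral band (W, R, B, XYZ, or a multispectral slice).

```python
import numpy as np

def band_image(detector_output, wavelengths_um, band, weights=None):
    """Weight the wavenumber-ordered detector output for one spectral band and FFT it
    to obtain the line profile for that band."""
    lo, hi = band                                   # band edges in micrometres (assumed)
    coeff = ((wavelengths_um >= lo) & (wavelengths_um < hi)).astype(float)
    if weights is not None:                         # e.g. a colour-matching function
        coeff = coeff * weights
    return np.abs(np.fft.rfft(detector_output * coeff))

# Example: assumed 2048-sample spectral readout covering 0.4-0.8 um
n = 2048
wl = np.linspace(0.4, 0.8, n)
readout = np.random.default_rng(0).normal(size=n)    # placeholder interferogram

w_line = band_image(readout, wl, (0.4, 0.8))          # full band -> W (luminance) signal
r_line = band_image(readout, wl, (0.6, 0.7))          # assumed R band
b_line = band_image(readout, wl, (0.43, 0.49))        # assumed B band
```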
  • Images of the multispectral bands MS1 to MSn (reference numeral 84 in FIG. 20), obtained by dividing the wavelength band from visible light to infrared light as finely as possible, are acquired by the endoscope device.
  • the site of the cancer tissue and the site of the normal tissue for which the multispectral MS1 to MSn are acquired are designated on the displayed RGB image by an input means such as a mouse.
  • the control unit 71 generates a gate signal 76 for cutting out a designated area based on the mouse information.
  • the multispectral MS1 to MSn which are the outputs of the light receiving line detector (reference numeral 29 in FIG. 7), are Fourier transformed by FFT68 to generate images of the respective spectra. A designated area is cut out from those images by the gate signal 76 and stored in the memory unit 69.
  • Tags for the definitive diagnosis of the cancer (indicating its type, malignancy, degree of progression, and so on) and the information required for preprocessing (such as the spectral characteristics of the illumination) are attached to the cut-out multispectral MS1 to MSn image data, which are sent to an external computer and stored.
  • image data of multispectral MS1 to MSn of such cancer tissue and normal tissue are acquired and accumulated for each case.
  • the processing by the data format creation unit 70 described above may be performed on an external computer by sending the images of the multispectral MS1 to MSn and the corresponding RGB images to a storage device managed by an external computer.
  • the resolution of the spectrum and the resolution of the spectral image are in a trade-off relationship.
  • More precisely, a trade-off based on SN, including the time required, holds. To apply multispectral analysis to texture enhancement and contour identification, emphasis is placed on the resolution (bandwidth) of the spectral image; to emphasize the accuracy of substance identification, the spectral resolution is emphasized and the number of multispectral bands acquired is increased.
  • the balance between the number of multispectral MS1 to MSn and the bandwidth is appropriately set according to the purpose of application. Since the bandwidths of the multispectral MS1 to MSn used for multivariate analysis should be divided as finely as possible, the emphasis is on the resolution of the spectrum, but the target site is specified by an input means such as a mouse. Since a certain resolution is required for the spectral image in order to cut out from the image, the balance is set according to the purpose.
  • The large amount of multispectral image data of cancer tissue and normal tissue accumulated on the external computer is preprocessed by the computer: the spectral characteristics of the illumination, the wavelength-band characteristics of the optical processing circuit, the average brightness, and so on are normalized.
  • This preprocessing is important for reducing cluster dispersion and improving identification accuracy.
  • The computer then performs multivariate analysis of the preprocessed image data.
  • When the spectral data of cancer tissue and normal tissue are plotted pixel by pixel in the space whose multidimensional orthogonal axes are the multispectral components (O in FIG. 21), clusters of cancer tissue and of normal tissue are formed.
  • If preprocessing has normalized the spectral characteristics of the illumination, the average brightness, and the sensitivity variation of the endoscope device, the dispersion of each cluster is caused mainly by the instability of the superimposed Rayleigh scattering.
  • So that the two clusters 361 and 362 of cancer tissue and normal tissue can be separated and distinguished, a projection space is obtained by an orthogonal transformation that minimizes the variance of each cluster 361, 362 and maximizes the distance between them (that is, maximizes the Fisher ratio).
  • For example, when this projection space is obtained by the FS (Foley-Sammon) transform, the projection axes (eigenvectors) EU1 to EUn are calculated in descending order of Fisher ratio. The inventors have found that, as with data compression, the number of axes EU1 to EUn can be narrowed to at most five or six.
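A minimal sketch of choosing projection axes in descending order of Fisher ratio for the two-class case (a generic Fisher-discriminant computation under assumed data shapes, not the patent's exact FS transform): the within-class and between-class scatter matrices are formed and the generalized eigenvectors with the largest eigenvalues, that is, the largest Fisher ratios, are kept.

```python
import numpy as np

def fisher_axes(x_cancer, x_normal, n_axes=6):
    """x_*: (pixels, n_bands) multispectral samples. Returns up to n_axes projection axes
    ordered by decreasing Fisher ratio (generalized eigenvalue of Sb v = lambda Sw v).
    Note: with only two classes Sb has rank 1, so only the leading axis has a nonzero
    Fisher ratio; the Foley-Sammon transform obtains further axes by imposing orthogonality."""
    mu_c, mu_n = x_cancer.mean(0), x_normal.mean(0)
    mu = np.concatenate([x_cancer, x_normal]).mean(0)
    sw = np.cov(x_cancer, rowvar=False) + np.cov(x_normal, rowvar=False)     # within-class scatter
    sb = np.outer(mu_c - mu, mu_c - mu) + np.outer(mu_n - mu, mu_n - mu)     # between-class scatter
    evals, evecs = np.linalg.eig(np.linalg.pinv(sw) @ sb)
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_axes]]              # basis vectors EU1..EUn as columns

# Usage sketch: project a pixel's multispectral vector onto the learned axes
# eu = fisher_axes(cancer_pixels, normal_pixels)      # (n_bands, n_axes)
# projected = pixel_spectrum @ eu                     # coordinates on EU1..EUn
```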
  • The AI, which has a configuration similar to the AI 137 in FIG. 23, is composed of a multi-layer neural network capable of deep learning.
  • "supervised learning” is performed on AI for each identification target.
  • the neuron coefficients obtained by learning are sent to AI137 in FIG. 23 and stored.
  • FIG. 21 schematically shows the identification of clusters in the projected space of the calculated intrinsic spectra EU1 to EUn so that the non-linear identification by AI can be easily understood visually.
  • Reference numeral 363 indicates the projection of the cancer-tissue cluster onto the EU1-EU2 plane, 364 indicates the projection of the normal-tissue cluster, and Z indicates the non-linear threshold that the AI identifies on that plane. This shows that the data of the two orthogonal axes EU1 and EU2 are converted into single-axis information (here, an axis distinguishing cancer tissue from normal tissue) by the non-linear discrimination of the AI.
  • AI enables non-linear identification in the information space by performing identification for each minute area of the information space for each layer, and performs conversion to different information axes.
  • the number of orthogonal axes increases or decreases as needed (due to backpropagation).
  • The basis-vector component values of EU1 to EUn are sent from the external computer to the control unit 71 shown in FIG. 19 and stored in its memory as the coefficients 72-1 to 72-n of the multipliers 73-1 to 73-n. Values necessary for preprocessing, such as corrections obtained by monitoring the spectral characteristics of the broadband light source, are folded into the coefficients 72-1 to 72-n.
  • The coefficients 72-1 to 72-n are read from the memory of the control unit 71 and multiplied by the multipliers 73-1 to 73-n with the spectral components output in time series from the light-receiving line detector 29 (see FIG. 7) to generate time-series signals projected onto the EU1 to EUn axes. Fourier transforming these time-series signals with the FFTs 74-1 to 74-n yields the spectral images for EU1 to EUn.
  • The YIQ signals, whose visual resolution ratio is 4 : 1.5 : 0.5, are assigned to the EU axes in descending order of contribution, then converted to RGB by the matrix conversion unit 205-2 and displayed.
  • In this way the identification accuracy on the tissue image can be maximized; the observer's visual brain then performs non-linear discrimination in this color space.
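The display mapping just described can be sketched as follows (the assignment of EU components to Y, I, Q and the normalization are assumptions; the YIQ-to-RGB matrix is the standard NTSC one): the three EU projections with the largest contributions are assigned to Y, I, and Q and matrix-converted to RGB.

```python
import numpy as np

# Standard NTSC YIQ -> RGB conversion matrix
YIQ_TO_RGB = np.array([[1.0,  0.956,  0.621],
                       [1.0, -0.272, -0.647],
                       [1.0, -1.106,  1.703]])

def eu_to_rgb(eu_images):
    """eu_images: (n_axes, H, W) spectral images projected on EU1..EUn, ordered by contribution.
    Assigns EU1->Y, EU2->I, EU3->Q (an assumed mapping) and converts to RGB for display."""
    def norm(a):                        # normalize each component to a displayable range
        a = a - a.min()
        return a / (a.max() + 1e-12)
    y = norm(eu_images[0])
    i = norm(eu_images[1]) - 0.5        # chroma components centred on zero
    q = norm(eu_images[2]) - 0.5
    yiq = np.stack([y, i, q], axis=-1)              # (H, W, 3)
    rgb = yiq @ YIQ_TO_RGB.T
    return np.clip(rgb, 0.0, 1.0)
```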
  • the element that senses the contrast of the image is not the difference in brightness between the pixel of interest and neighboring pixels, but the ratio of brightness (according to the Retinex theory of the visual model).
  • Alternatively, the images of the spectral components corresponding to the axes may be input to the trained AI 137, narrowed down to three axes, and assigned to the YIQ signals described above for display.
  • The substance may also be identified directly by the AI 137. Depending on the degree of activation of the AI output, the hue of the cancerous region can be changed or the outline of the cancer emphasized, converting the display into one that is easy to identify visually and thereby supporting diagnosis effectively.
  • As shown in FIG. 24, the detected spectral components are first used to distinguish, for each region of the image, between the two clusters of normal tissue and cancer tissue, and the cancer tissue is determined.
  • The type and malignancy of the cancer are then determined from combinations of two-cluster identifications and, as shown in 3. of FIG. 24, the degree of progression is judged from further combinations of such identifications.
  • For each identification, the optimal coefficients 72-1 to 72-n of EU1 to EUn are read from the control unit 71 and used, and the knowledge data of the AI 137 are switched, so the small-scale AI 137 shown in FIG. 23 can identify multiple clusters without increasing the time required for identification. 6) Application examples of spectrum analysis
  • the endoscope device of the present invention can be used for the spectrum analysis of biological substances having unique reflection characteristics with respect to the spectrum of light (particularly infrared rays).
  • the endoscope device of the present invention can also be used for spectrum analysis of biomolecules having the property of binding to a specific dye, or biomolecules having staining or fluorescence that develop color by reacting with a specific enzyme.
  • Examples include staining and fluorescent dyes such as ICG (indocyanine green), 5-ALA, BBG, triamcinolone acetonide, and fluorescein.
  • the staining is based on the absorption spectrum, and Rayleigh scattered light is mixed in, but the absorption is strong, so the stability and SN are high.
  • the result of staining can be emphasized by using the above-mentioned spectrum analysis. Since some dyes, such as ICG, are weakly toxic, the amount used can be reduced by emphasizing the dyeing results.
  • Various new reagents and therapeutic agents are being developed, such as fluorescent dyes (probes) that attach to cancer, photodynamic therapy in which the attached dye kills the cancer when irradiated with near-infrared light, and photoimmunotherapy in which remaining cancer is killed through the immune response to the cancer destroyed by the photodynamic therapy. For these as well, the state of the probe is expected to be recognizable by the spectral analysis described above.
  • Raman-scattered light has extremely low signal energy (about 10⁻⁶ of the excitation light), but the Rayleigh scattering of the excitation light can be removed by a spectral filter that exploits the wavelength shift, so Raman scattering is well suited to detection by the reflection method.
  • Since Raman scattering is a linear phenomenon in which the scattered light is emitted first and the remaining energy is absorbed as intermolecular vibration, the Raman spectrum of the subject can be detected by the spectral analysis described above.
  • By combining a method for improving the image resolution of the Raman spectrum, as disclosed in Japanese Patent Application No. 2019-087128, with optimization of the detection band for each substance determined by multivariate analysis, and by identifying the substance with AI, the sensitivity of identification can be improved.
  • the endoscope device of the present invention it is possible to utilize the combined use of the SERS (Surface Enhanced Raman Scattering) effect of Raman scattering and the process of CARS (Coherent anti-Stokes Raman Scattering).
  • FIG. 27 shows an example using a micromotor 231 (0.9 mm in diameter) as the scanning mechanism.
  • The tip of the micromotor 231 is connected to the optical fiber end 51 via a flexible joint 237; driving the micromotor 231 rotates the flexible joint 237 and the optical fiber end 51. Since the tip of the endoscope device carrying the optical fiber end 51 can be made even thinner, this arrangement is suitable for observing luminal subjects such as blood vessels and airways.
  • the length of the flexible joint 237 is appropriately set within a range in which the rigidity of the flexible joint 237 can be maintained so that the rotation speed does not become uneven due to twisting.
  • Reference numeral 235 indicates a sliding member, 233 indicates an optical rotary joint, and RX1 indicates the rotation axis.
  • FIG. 28 is an example of a scanning mechanism using the MEMS mirror 131; reference numeral 51 indicates the optical fiber end, and 137 indicates the virtual scanning line of the optical fiber end 51.
  • FIG. 29 shows a scanning mechanism that vibrates and scans the optical fiber end with a voice coil or a piezoelectric bimorph (such as a piezo ceramic or polyvinylidene fluoride), taking advantage of the high bending durability of the optical fiber.
  • With this mechanism the tip can be made thinner, and the scanning can be matched to the shape of the subject.

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Astronomy & Astrophysics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The problem addressed by the present invention is to provide an endoscope device that requires neither an imaging optical system nor an image sensor, that can be made small and of small diameter, and that performs imaging and spectral analysis. The solution according to the invention is an endoscope device comprising: an optical fiber (1) that guides light supplied from a light source (209), emits the light as diffused light (10), and receives reflected light returning from a subject (3); an optical interference resolution processing unit (207) that performs resolution processing along the traveling direction of a wavefront (5) of the diffused light (10) according to the imaging principle of optical coherence tomography; a scanning mechanism (8) that scans the diffused light (10) in a scanning direction (Y) intersecting a central axis (X) of the diffused light (10); a storage unit (211) that stores a data string (177) generated by the resolution processing of the optical interference resolution processing unit (207); and a synthetic aperture processing unit (213) that extracts, from the data string (177) stored in the storage unit (211), data (AD-1 to AD-n) matching an optical path length (OPL) from an end (51) of the optical fiber (1) to the position of a pixel (173) to be detected, and adds them.
PCT/JP2021/010282 2020-03-21 2021-03-15 Dispositif endoscopique WO2021193179A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-050385 2020-03-21
JP2020050385A JP6748932B1 (ja) 2020-03-21 2020-03-21 内視鏡装置

Publications (1)

Publication Number Publication Date
WO2021193179A1 true WO2021193179A1 (fr) 2021-09-30

Family

ID=72276724

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010282 WO2021193179A1 (fr) 2020-03-21 2021-03-15 Dispositif endoscopique

Country Status (2)

Country Link
JP (1) JP6748932B1 (fr)
WO (1) WO2021193179A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116548910B (zh) * 2023-05-19 2023-12-08 北京至真互联网技术有限公司 一种眼科相干断层扫描仪的分辨率自适应调节方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007135989A (ja) * 2005-11-21 2007-06-07 Olympus Corp 分光内視鏡
JP2010201077A (ja) * 2009-03-05 2010-09-16 Fujifilm Corp 生体断層画像生成装置及びその情報処理方法
JP2017505667A (ja) * 2014-01-31 2017-02-23 ザ ジェネラル ホスピタル コーポレイション 光プローブ、光強度検出、撮像方法及びシステム
JP2017170006A (ja) * 2016-03-25 2017-09-28 キヤノン株式会社 断層撮像システム、画像処理方法及びプログラム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007135989A (ja) * 2005-11-21 2007-06-07 Olympus Corp 分光内視鏡
JP2010201077A (ja) * 2009-03-05 2010-09-16 Fujifilm Corp 生体断層画像生成装置及びその情報処理方法
JP2017505667A (ja) * 2014-01-31 2017-02-23 ザ ジェネラル ホスピタル コーポレイション 光プローブ、光強度検出、撮像方法及びシステム
JP2017170006A (ja) * 2016-03-25 2017-09-28 キヤノン株式会社 断層撮像システム、画像処理方法及びプログラム

Also Published As

Publication number Publication date
JP2021146008A (ja) 2021-09-27
JP6748932B1 (ja) 2020-09-02

Similar Documents

Publication Publication Date Title
CN104684457B (zh) 使用oct光源和扫描光学器件的二维共焦成像
US10045692B2 (en) Self-referenced optical coherence tomography
RU2489091C2 (ru) Способ визуализации с помощью оптической томографии и устройство визуализации с помощью оптической томографии
EP2583617A2 (fr) Systèmes pour générer des images de lumière fluorescente
US20200129068A1 (en) Intraoral oct with color texture
JP2005510323A (ja) 多重スペクトルコード化を用いる共焦点顕微鏡法並びに分光法コード化共焦点顕微鏡法のためのシステム及び装置
WO2020222307A1 (fr) Dispositif endoscopique capable d'imagerie à haute définition et d'analyse spectrale
JP2008542758A (ja) スペクトルコード化ヘテロダイン干渉法を画像化に使用可能なシステム、方法、及び装置
JP2004502957A (ja) 高分解能コヒーレント光画像化のための方法及び装置
JP2006522341A (ja) 光路長が変更された異なる角度の光の合成により光学的に干渉する断層撮影におけるスペックルの減少
CN107957401A (zh) 一种可用于介入式肿瘤诊断的高光谱显微成像仪
CN112168144A (zh) 一种用于烧伤皮肤的光学相干层析成像系统
US6088099A (en) Method for interferometer based spectral imaging of moving objects
Guay-Lord et al. Combined optical coherence tomography and hyperspectral imaging using a double-clad fiber coupler
WO2021193179A1 (fr) Dispositif endoscopique
JP5653087B2 (ja) 光断層画像化装置及びその作動方法
Qiu et al. Spectral imaging with scattered light: from early cancer detection to cell biology
US8567948B2 (en) Device and method for examining the eye fundus, especially the photoreceptors
CN107224267B (zh) 一种眼底高光谱成像装置
WO2011158848A1 (fr) Dispositif de tomographie optique et procédé de tomographie optique
JP2023095625A (ja) 3次元撮像装置
JP3474883B2 (ja) 可動物の、干渉形に基づくスペクトル結像装置
Fujimoto Optical coherence tomography: principles and applications
JP5373389B2 (ja) 光構造情報取得装置及びその光干渉信号処理方法
JP5657941B2 (ja) 光断層画像化装置及びその作動方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21774521

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21774521

Country of ref document: EP

Kind code of ref document: A1