WO2020052623A1 - System and method for real-time imaging of visible light and excited fluorescence - Google Patents

System and method for real-time imaging of visible light and excited fluorescence

Info

Publication number
WO2020052623A1
Authority
WO
WIPO (PCT)
Prior art keywords
light intensity
intensity value
visible spectrum
spectrum segment
image
Prior art date
Application number
PCT/CN2019/105570
Other languages
English (en)
French (fr)
Inventor
胡文忠
张宇
戴玉蓉
陈继东
聂红林
Original Assignee
上海逸思医学影像设备有限公司
Priority date
Filing date
Publication date
Application filed by 上海逸思医学影像设备有限公司
Publication of WO2020052623A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 - Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0071 - Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence, by measuring fluorescence emission
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons

Definitions

  • the present application relates to optical imaging technology, and more particularly to the field of endoscopic multispectral imaging.
  • Optical imaging is widely used in clinical medicine, and has the advantages of being harmless to the human body, non-invasive, high sensitivity, and in vivo multi-target imaging.
  • fluorescent imaging has become an important tool for intraoperative navigation.
  • the application technology based on the near-infrared fluorescent developer Indocyanine Green (ICG) is also increasingly mature.
  • After being injected intravenously, the indocyanine green fluorescent agent binds to plasma proteins and, when excited by near-infrared light at a wavelength of about 805 nm, releases fluorescence at about 835 nm, a wavelength longer than that of the excitation light.
  • the imaging system processes the captured fluorescent signals through algorithms and displays white light, fluorescence, or superimposed images on the screen in real time. Compared with traditional organic dyes, this method of real-time fluorescence development has higher contrast, better recognition of target tissues, and avoids dye dispersion in the later stage of surgery.
  • each image frame of the system output image is a multi-spectral fusion image.
  • one frame of image can be divided into visible spectrum image information and fluorescence spectrum image information.
  • When fluorescent contrast agent navigation is used for surgical tumor resection, the fluorescently labeled tissue areas are expected to be excised accurately, which requires clear fluorescent tissue areas and well-defined boundaries of the fluorescent regions.
  • the operator also hopes that the background tissue area of the visible spectrum can also be clearly defined so as to better identify the tissue.
  • However, the tissue of the labeled fluorescent region contains both fluorescence-spectrum and visible-spectrum information, and the light intensity of the visible spectrum is usually much stronger than that of the fluorescence spectrum. As a result, the fluorescence tends to be overwhelmed by the visible light, so that it appears dark, indistinct, and low in contrast in the multispectral image; at the boundary of the fluorescent region, where the tissue lesion is milder and the excited fluorescence correspondingly weaker, the boundary becomes even more blurred under a strong visible-spectrum light intensity.
  • The present application provides a multispectral imaging system, including: a light source module that provides visible light and generates excitation light, the illuminated area being illuminated by a combination of the visible light and the excitation light and releasing fluorescence when excited by the excitation light; and a camera module comprising one or more image sensors that receive the visible light and the fluorescence in the light reflected from the illuminated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the light intensity value of the visible spectrum segment changes periodically over the consecutive video frames, so that the bright and dark parts of the visible-light video image are interleaved across consecutive frames.
  • The light intensity value of the visible spectrum segment includes a first light intensity value and a second light intensity value of the visible spectrum segment, which appear alternately and cyclically in each periodic frame of the consecutive video frames.
  • In the video progressive-scan mode, the one or more image sensors acquire the visible-spectrum image information and the fluorescence-spectrum image information of the multispectral image. Each periodic frame consists of two multispectral fusion images; the light intensity value of the fluorescence spectrum segment remains unchanged over consecutive frames, while the light intensity value of the visible spectrum segment is set to the first light intensity value of the visible spectrum segment in the first half of the periodic frame and to the second light intensity value in the second half, the first value being different from the second value.
  • the first light intensity value in the visible spectrum segment and the second light intensity value in the visible spectrum segment are set according to a preset reference light intensity value in the visible spectrum segment.
  • the first light intensity value in the visible spectrum segment and the second light intensity value in the visible spectrum segment are set according to the light intensity value in the fluorescence spectrum segment.
  • The light intensity value of the visible spectrum segment varies as a periodic function over the consecutive video frames.
  • The difference in visible-spectrum light intensity between adjacent frames is adjusted by adjusting the maximum amplitude between the peaks and troughs of the periodic function.
  • the light intensity value of the visible spectrum band is controlled by adjusting the light source module.
  • In the video progressive-scan mode, the light intensity value of the fluorescence spectrum segment includes a first light intensity value and a second light intensity value of the fluorescence spectrum segment, which appear alternately and cyclically in each periodic frame of the video image, and the periodic change of the fluorescence-spectrum light intensity value is opposite in phase to the periodic change of the visible-spectrum light intensity value.
  • the light intensity value of the fluorescence spectrum segment and the light intensity value of the visible spectrum segment change in a periodic function in the video frame image, and the change in the light intensity value of the fluorescence spectrum segment is related to the current or voltage of the fluorescent light source.
  • the change in the light intensity value in the visible spectrum corresponds to the change in the current or voltage of the visible light source.
  • In the video progressive-scan mode, each periodic frame consists of four image frames, and the first or the second light intensity value of the visible spectrum segment appears in two consecutive frames.
  • The light intensity value of the visible spectrum segment is set to the first visible-light intensity value in the first frame, to the second visible-light intensity value in the second frame, to the second visible-light intensity value again in the third frame, and to the first visible-light intensity value in the fourth frame.
  • the light intensity value in the visible spectrum segment changes as a periodic function in the video frame image, and the change in the light intensity value in the visible spectrum segment corresponds to a change in the current or voltage of the visible light source.
  • In the video interlaced-scan mode, each frame of the video is divided into two field images, and the first light intensity value and the second light intensity value of the visible spectrum segment alternate periodically between the different field images.
  • each frame of a video multispectral image is divided into two field images for interlaced video output, and the two field images are complementary to form a multispectral image.
  • the first light intensity value in the visible spectrum segment is different from the second light intensity value in the visible spectrum segment.
  • the light intensity value in the visible spectrum segment changes as a periodic function in the video frame image, and the change in the light intensity value in the visible spectrum segment corresponds to a change in the current or voltage of the visible light source.
  • The second light intensity value of the visible spectrum segment is set as follows: calculate the minimum value of the active fluorescent pixels, the maximum value of the active fluorescent pixels, and the average value of all active fluorescent pixels; calculate the median value of the active fluorescent pixels from the calculated minimum and maximum values; select the smaller of the median value and the average value of the active fluorescent pixels; and, according to the selected minimum, determine the preset reference light intensity value of the visible spectrum segment as the second light intensity value of the visible spectrum segment.
  • the first light intensity value in the visible spectrum segment is set according to the imaging quality of the image in the visible spectrum segment.
  • The maximum difference between the first light intensity value and the second light intensity value of the visible spectrum segment determines the maximum amplitude of the light source current or voltage used to adjust the light intensity.
  • The present application also provides a multispectral imaging method, comprising: illuminating an area with a combination of visible light and excitation light, the illuminated area releasing fluorescence when excited by the excitation light; and receiving the visible light and the fluorescence in the light reflected from the illuminated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the light intensity value of the visible spectrum segment changes periodically over the consecutive video frames, so that the bright and dark parts of the visible-light video image are interleaved across consecutive frames.
  • The light intensity value of the visible spectrum segment includes a first light intensity value and a second light intensity value of the visible spectrum segment, which appear alternately and cyclically in each periodic frame of the consecutive video frames.
  • each periodic frame is two multi-spectral fusion images, and the light intensity value of the fluorescence spectrum segment in the image frame remains unchanged in successive frames of the image.
  • the light intensity value of the visible spectrum segment is set to the first light intensity value of the visible spectrum segment in the first half period frame, and the light intensity value of the visible spectrum segment is set to the second light intensity value of the visible spectrum segment in the second half period frame.
  • the first light intensity value in the visible spectrum segment is different from the second light intensity value in the visible spectrum segment.
  • In the video progressive-scan mode, the light intensity value of the fluorescence spectrum segment includes a first light intensity value and a second light intensity value of the fluorescence spectrum segment, which appear alternately and cyclically in each periodic frame of the video image, and the periodic change of the fluorescence-spectrum light intensity value is opposite in phase to the periodic change of the visible-spectrum light intensity value.
  • In the video progressive-scan mode, each periodic frame consists of four image frames, and the first or the second light intensity value of the visible spectrum segment appears in two consecutive frames.
  • In the video interlaced-scan mode, each frame of the video is divided into two field images, and the first light intensity value and the second light intensity value of the visible spectrum segment alternate periodically between the different field images.
  • Before the multispectral image is output, multispectral image fusion of the visible-spectrum image and the fluorescence-spectrum image is performed.
  • The application also provides a computer-readable storage medium storing computer-executable instructions for performing the following operations: illuminating an area with a combination of visible light and excitation light, the illuminated area releasing fluorescence when excited by the excitation light; receiving the visible light and the fluorescence in the light reflected from the illuminated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the light intensity value of the visible spectrum segment changes periodically over the consecutive video frames, so that the bright and dark parts of the visible-light video image are interleaved across consecutive frames.
  • The present application also provides a light intensity adjustment method for multispectral imaging, including: calculating the minimum value of the active fluorescent pixels, the maximum value of the active fluorescent pixels, and the average value of all active fluorescent pixels; calculating the median value of the active fluorescent pixels from the calculated minimum and maximum values; selecting the smaller of the median value and the average value of the active fluorescent pixels; determining, from the selected minimum, the preset reference light intensity value of the visible spectrum segment as the second light intensity value of the visible spectrum segment; setting the first light intensity value of the visible spectrum segment according to the imaging quality of the visible-spectrum image; calculating the maximum difference between the first and second light intensity values of the visible spectrum segment; and determining the maximum amplitude used to control the light source current or voltage so as to adjust the light intensity.
  • The calculation of the median value of the active fluorescent pixels may also take the maximum gray level of the pixels into account.
  • Compared with the prior art, the system and method for real-time imaging of visible light and excited fluorescence provided by the present application make the visible-light background image clear and bright, while the fluorescence imaging has high contrast and low noise.
  • the application can be widely applied in the field of medical imaging technology, including various fields such as medical endoscopes, fluorescence microscopes and the like.
  • FIG. 1 is a schematic structural diagram of a multispectral imaging system according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of a multi-spectral imaging system using a single image sensor according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of a light source module using multiple monochromatic light source combinations according to an embodiment of the present application
  • FIG. 4 is a schematic diagram of a camera module using two image sensors according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a camera module using four image sensors according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an imaging method under a progressive output mode according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of another imaging method in a progressive output mode according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of another imaging method in a progressive output mode according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an imaging method under an interlaced output mode according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of a method for confirming a light intensity reference preset value according to an embodiment of the present application.
  • FIG. 11 is a block diagram of an exemplary computer system according to one embodiment of the present application.
  • FIG. 1 shows an exemplary structure diagram of a multi-spectral imaging system according to an embodiment of the present application.
  • a multispectral imaging system can be used for a fluorescent endoscope.
  • The fluorescence endoscope camera system may include a light source module 1, a camera module 2 for imaging, a data acquisition and preprocessing module 3 for image acquisition and preprocessing, a digital image processing module 4, a system controller 5, and a light source controller 6.
  • the camera module 2 may include an optical lens assembly and one or more image sensors (described in detail in the following figure).
  • the system controller 5 may generate a corresponding control signal according to the feedback processing result of the digital image processing module 4.
  • The light source controller 6 may perform lighting control on the light source module 1 according to a control signal of the system controller 5. The light intensity value of the visible spectrum segment can be controlled by adjusting the light source module.
  • FIG. 2 shows an exemplary structure diagram of a multi-spectral imaging system using a single image sensor according to an embodiment of the present application.
  • the light source module 1 may include an incident visible light LED 11, a near-infrared (NIR) exciter 12, and a combination beam splitter 13.
  • the camera module 2 may include an optical lens assembly 9 and a single image sensor 10 (for example, a CCD or CMOS image sensor).
  • the optical lens assembly 9 may further include an optical filter 8.
  • The incident visible light LED 11 can provide illumination in the visible wavelength range (400 nm to 760 nm).
  • the near-infrared (NIR) exciter 12 can generate NIR excitation light in a near-infrared wavelength range (790 nm to 820 nm), particularly near-infrared excitation light in a wavelength range around 800 nm.
  • The visible light and the NIR excitation light can be combined by the combining beam splitter 13 and irradiated onto the illuminated area (for example, tissue).
  • When tissue injected with the fluorescent developer is irradiated with the NIR excitation light from the light source module 1, the fluorescent groups in the tissue are excited and emit near-infrared fluorescence in a band longer than the excitation wavelength, for example 830 nm to 900 nm.
  • the reflected light from the irradiated area may include three spectral bands such as visible light, excitation light (for example, NIR excitation light), and fluorescence (for example, near-infrared fluorescence).
  • the NIR excitation light is completely blocked by the optional optical filter 8 and cannot enter the image sensor, while visible light and fluorescence enter the single image sensor 10 with NIR pixels through the optical lens assembly 9.
  • the data from the image sensor 10 can enter the digital image processing module 4 through the data acquisition and preprocessing module 3 to form a visible spectrum image and a fluorescent spectrum image.
  • the system controller 5 may output a signal to the light source controller 6 according to the feedback from the digital image processing module 4.
  • the light source controller 6 can control the light source module 1 by controlling a driving current or a voltage.
  • FIG. 3 is an exemplary schematic diagram of a light source module employing multiple monochromatic light source combinations according to an embodiment of the present application.
  • the light source module 1 may include a combination of three monochrome light sources as shown in FIG. 3.
  • The visible light source consists of three monochromatic light sources: a red LED 18 (620 nm to 760 nm), a green LED 19 (492 nm to 577 nm), and a blue LED 20 (400 nm to 450 nm), which are combined by the corresponding combining beam splitters 15, 16, and 17.
  • the camera module 2 may include one or more image sensors.
  • the camera module 2 may be a single image sensor as shown in FIG. 2, or a dual image sensor combination as shown in FIG. 4.
  • FIG. 4 is an exemplary schematic diagram of a camera module employing two image sensors according to an embodiment of the present application.
  • the camera module 2 may include a plurality of image sensors, and a corresponding dichroic prism.
  • the camera module 2 may employ an image sensor 32 (such as a CCD / CMOS image sensor) for visible light and an image sensor 33 (such as a CCD / CMOS image sensor) for fluorescence. Visible light and fluorescence from the tissue enter the corresponding image sensors through the dichroic prism 31, respectively.
  • FIG. 5 is an exemplary schematic diagram of a camera module using four image sensors according to an embodiment of the present application.
  • The camera module 2 may include four image sensors and three corresponding beam-splitting prisms.
  • When a 4-CCD/CMOS image sensor combination is used, prism light-splitting (for example, a combination of the beam-splitting prisms 34, 35, and 36) guides the light to three RGB monochromatic image sensors (for example, the red image sensor 38, the green image sensor 39, and the blue image sensor 40) and to one NIR fluorescence image sensor 37, and each image sensor receives the reflected light of its corresponding wavelength band to form a multispectral image.
  • It can be understood that the periodic adjustment of light intensity over consecutive video frames in the present application can be implemented with one image sensor, with two image sensors, or with a four-sensor solution.
  • A multispectral fusion image can also be obtained by integrating fluorescence pixels into a single image sensor, or by a 4-CCD/CMOS arrangement in which three RGB image sensors synthesize the visible-light image and a fourth image sensor captures the fluorescence image.
  • Therefore, this application uses one or more image sensors to generate fused image frames of visible light and fluorescence, and the number of image sensors in each embodiment is only an example and not a limitation.
  • FIG. 6 is a schematic diagram of an imaging method in a progressive output mode according to an embodiment of the present application.
  • Figure 6 shows an alternative way of periodic light intensity changes in successive frame images.
  • each image frame of the system output image is a multi-spectral fusion image, which includes image information of visible spectrum and image information of fluorescence spectrum.
  • the light intensity of the fluorescence spectrum can be set to remain constant in all consecutive frames of the image.
  • In the first frame of the system image output, the visible spectrum segment can be set to a high light intensity value; in the second frame, to a low light intensity value; in the third frame, to a high value again; and in the fourth frame, to a low value again. Subsequent image frames change in the same way.
  • The high and low light intensity values of the visible spectrum segment thus appear alternately and periodically in successive image frames.
  • One period of the high/low light intensity change spans two image frames; that is, one periodic frame consists of two images, the first half of the periodic frame at the high visible-spectrum intensity and the second half at the low visible-spectrum intensity.
  • The high and low intensity values of the visible spectrum segment are both defined relative to the preset reference light intensity of the visible spectrum segment, or relative to the intensity of the fluorescence spectrum segment.
  • Although the visible-spectrum brightness differs between adjacent frames, because of the persistence of vision of the human eye and its lower sensitivity to brightness changes above a certain light level, the bright and dark parts of the visible-light video are interwoven across consecutive frames, and the eye does not readily perceive this rapid alternation.
  • Among the information contained in each image frame, the frames with high visible-spectrum intensity mainly bring out the image detail and sharpness of the background tissue, while the frames with low visible-spectrum intensity bring out the image detail of the fluorescent region and its contrast with the background tissue. In this way, over the sequence of frames, the persistence of vision of the human eye conveys both the structural detail of the background tissue under high light intensity and the specific image detail of the fluorescent area.
  • The light intensity value of the visible spectrum segment is adjusted as a periodic function over the consecutive video frames.
  • The light intensity can be controlled by the light source driving current or voltage.
  • To produce consecutive frames in which the visible spectrum segment alternates between bright and dark, the light source driving current may be a continuously varying periodic function: the larger the driving current, the greater the light source intensity. The periodic function may be, for example, a sine function, a cosine function, a tangent function, a square-wave periodic function, a triangular periodic function, or a step periodic function.
  • For example, the driving current may vary as the sinusoidal function shown in FIG. 6, whose half-period equals the exposure time of one image frame (an illustrative sketch of this scheduling is given after this list).
  • The maximum amplitude between the peaks and troughs of this function can be adjusted to tune the visible-light intensity difference between adjacent frames, matching the eye's sensitivity to intensity changes for a better visual effect. In practice, the frame rate can be adjusted between 60 fps (frames per second) and 120 fps; raising the frame rate into this range suppresses the screen flicker, and the resulting viewer discomfort, that overly slow or overly large intensity changes would otherwise cause.
  • FIG. 7 is a schematic diagram of another imaging method in a progressive output mode according to an embodiment of the present application.
  • FIG. 7 shows another alternative way of periodic light intensity change in continuous frame images.
  • In the progressive video output mode, when greater emphasis is placed on the contrast between the visible-spectrum image and the fluorescent-region image in consecutive periodic frames, the light intensity in the system's periodic frames can change as shown in FIG. 7.
  • the periodic change of the light intensity value in the fluorescence spectrum segment is opposite to the periodic change of the light intensity value in the visible spectrum segment.
  • When the visible spectrum segment is at the high light intensity value, the fluorescence spectrum segment is at the low light intensity value; when the visible spectrum segment is at the low light intensity value, the fluorescence spectrum segment is at the high light intensity value.
  • the driving current corresponding to each light source also changes as shown in FIG. 7.
  • the difference in light intensity between the visible spectrum segment and the fluorescence spectrum segment is more prominent, so that in a continuous period frame, the visible spectrum segment image and the fluorescence spectrum segment image are more distinct.
  • the light intensity value in the fluorescence spectrum band and the light intensity value in the visible spectrum band are adjusted and changed by a periodic function in the video frame image.
  • FIG. 8 is a schematic diagram of another imaging method in a progressive output mode according to an embodiment of the present application.
  • FIG. 8 shows another alternative way of periodic light intensity change in continuous frame images.
  • In the progressive video output mode, when the light source driving current varies as a cosine-like periodic function, the visible-spectrum light intensity in the system's image frames changes as shown in FIG. 8.
  • In this case one periodic frame consists of four images, and the high or low visible-spectrum light intensity value appears in two consecutive images; that is, the dwell time of a given light intensity state within the consecutive frames is increased.
  • the visible spectrum image and fluorescence spectrum image are more distinct in different light intensity states.
  • FIG. 9 is a schematic diagram of an imaging method under an interlaced output mode according to an embodiment of the present application.
  • FIG. 9 illustrates a method for adjusting periodic light intensity in continuous frame images in an interlaced output mode.
  • the driving current of the light source changes as a continuous periodic function, such as a sinusoidal periodic function.
  • When the image adopts the interlaced output mode, one frame of the image is divided into an odd field and an even field, and the high and low visible-spectrum light intensity values appear alternately and periodically in the different fields (a sketch of this field scheduling is given after this list).
  • The odd field carries the high visible-spectrum light intensity and the even field the low intensity, or vice versa.
  • When interlaced output is used, the odd-numbered lines of the image (upper field) are scanned first, and then the even-numbered lines (lower field); the two complement each other to form a complete picture. Although the brightness of the upper field has decayed by the time the lower field is scanned, the bright and dark parts of the visible spectrum are interwoven and, owing to the characteristics of human vision, the alternation is difficult to perceive.
  • With interlaced output, the image information is divided into two fields that are output separately, which gives the visual effect of this application greater continuity.
  • interlaced output can support higher resolution image output.
  • the light intensity value in the visible spectrum band may be high first and then low (as shown in the illustrated embodiment), but the application is not limited thereto.
  • the light intensity value of the visible spectrum band may be set to be low first and then high.
  • the intensity value of the fluorescence spectrum can also be set to high first and then low.
  • the change in the light intensity value of the fluorescence spectrum segment may correspond to the change in the current or voltage of the fluorescent light source.
  • the change in the light intensity value in the visible spectrum band may correspond to the change in the current or voltage of the visible light source.
  • Optionally, before the multispectral image is output, image fusion of the visible-spectrum image and the fluorescence-spectrum image may be performed first; after this image processing, the visual effect of the output fused multispectral image is further enhanced.
  • FIG. 10 is a schematic diagram of a method for confirming a light intensity reference preset value according to an embodiment of the present application.
  • The high and low intensity values of the visible spectrum segment are both defined relative to the preset reference light intensity of the visible spectrum segment, or relative to the light intensity of the fluorescence spectrum segment. The following describes the feedback adjustment of the reference intensity used for the changes in the visible-spectrum light intensity value.
  • When the low visible-spectrum intensity serves as the background for the fluorescent region, the digital image processing module continuously calculates the minimum value, the maximum value, and the average value of all active fluorescent pixels in the image frame, and computes the median value of the active fluorescent pixels from the calculated minimum and maximum values.
  • calculating the median value of the active fluorescent pixels may also be combined with the maximum gray level of the pixels.
  • The maximum gray level of a pixel here can be the maximum dynamic range of the fluorescent pixel, that is, the maximum light intensity that the pixel sensor unit can sense.
  • The median value and the average value of the active fluorescent pixels are then compared, and the smaller of the two is selected.
  • Based on the selected minimum, the optimal preset light intensity for the low-intensity frames of the visible light source is determined.
  • the high intensity value in the visible spectrum of the image is determined according to the imaging quality of the visible light image.
  • Finally, the maximum difference between the high and low light intensity values of the visible light source is calculated, the maximum amplitude A of the light source control current is determined, and the light intensity adjustment of the system's fluorescence mode is complete (see the sketch after this list for an illustrative computation).
  • Based on the persistence of vision of the human eye and combined with different video output modes, the solution of the present application adjusts the visible-spectrum light intensity of different frames within the continuous frame information of the video image, so that the high and low visible-spectrum light intensities are output alternately and cyclically in different frames.
  • In a continuous sequence of frames, the system can therefore output high-brightness visible-spectrum images that bring out the visible-light detail of the background tissue, as well as high-contrast fluorescence image information against a low visible-light background.
  • In this way, the series of consecutive frames conveys both the clear detail of the visible-spectrum image and the high-contrast detail of the near-infrared fluorescence image; that is, by outputting periodic frames of different light intensity and contrast on alternating frames, clear visible-light image information is output while the high contrast of the fluorescence image information is preserved.
  • This overcomes the low contrast of the fluorescent region against a high-intensity visible-light background, and the poor visible-light image quality against a low-intensity visible-light background, that occur when existing systems image fluorescence and the visible spectrum simultaneously.
  • Computer system 1100 may include a logic processor 1102, such as an execution core. Although one logical processor 1102 is illustrated, in other embodiments the computer system 1100 may have multiple logical processors, for example multiple execution cores per processor chip and/or multiple processor chips, where each processor chip can have multiple execution cores. As shown, various computer-readable storage media 1110 may be interconnected by one or more system buses that couple the various system components to the logical processor 1102.
  • the system bus can be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • the computer-readable storage medium 1110 may include, for example, a random access memory (RAM) 1104, a storage device 1106 (e.g., an electromechanical hard drive, a solid state hard drive, etc.), firmware 1108 (e.g., flash memory RAM or ROM), and removable storage devices 1118 (such as, for example, CD-ROMs, floppy disks, DVDs, flash drives, external storage devices, etc.).
  • the computer-readable storage medium 1110 may provide non-volatile and volatile storage of computer-executable instructions 1122, data structures, program modules, and other data for the computer system 1100.
  • A basic input/output system (BIOS) 1120, which contains the basic routines that help to transfer information between elements of the computer system 1100, such as during startup, may be stored in the firmware 1108.
  • A large number of programs, including the operating system and/or application programs, may be stored on the firmware 1108, the storage device 1106, the RAM 1104, and/or the removable storage device 1118 and executed by the logical processor 1102. Commands and information may be received by the computer system 1100 through an input device 1116.
  • the input device 1116 may include, but is not limited to, a keyboard and a pointing device.
  • Other input devices may include a microphone, joystick, gamepad, scanner, and so on. These and other input devices are often connected to the logical processor 1102 through a serial port interface coupled to the system bus, but may also be connected through other interfaces, such as a parallel port, a game port, or a universal serial bus (USB).
  • a display or other type of display device may also be connected to the system bus via an interface, such as a video adapter, which may be part of or connected to the graphics processing unit 1112.
  • computers typically include other peripheral output devices, such as speakers and printers (not shown).
  • the exemplary system of FIG. 11 may further include a host adapter, a small computer system interface (SCSI) bus, and an external storage device connected to the SCSI bus.
  • Computer system 1100 may operate in a networked environment using logical connections to one or more remote computers, such as a certain remote computer.
  • the remote computer may be another computer, server, router, network PC, peer device, or other common network node, and may typically include multiple or all of the units described above with respect to the computer system 1100.
  • When used in a LAN or WAN networking environment, the computer system 1100 may be connected to the LAN or WAN through a network interface card 1114.
  • a network card (NIC) 1114 (which may be internal or external) may be connected to the system bus.
  • program modules or portions thereof depicted with respect to computer system 1100 may be stored in a remote memory storage device. It should be appreciated that the network connections described herein are exemplary and other means of establishing a communications link between the computers may be used.
  • the functions and processes described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
  • Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a computer.
  • By way of example and not limitation, such computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • the software code may be stored in a memory such as the memory of a mobile station and executed by a processor such as a desktop computer, laptop computer, server computer, microprocessor of a mobile device, or the like.
  • the memory can be implemented inside the processor or outside the processor.
  • As used herein, the term "memory" refers to any type of long-term, short-term, volatile, non-volatile, or other memory and is not limited to any particular type of memory, any particular number of memories, or the type of media upon which memory is stored.
  • any reference to a claim element in a singular form such as a reference using the articles “a”, “an” or “the” should not be interpreted as limiting the element to the singular.
  • Any reference to elements herein using designations such as "first" and "second" generally does not limit the quantity or order of those elements.
  • A reference to first and second elements does not mean that only two elements are employed there, nor that the first element must precede the second element in some way.
  • a set of elements may include one or more elements.
  • the skilled person can implement the described structure in different ways for each particular application, but such an implementation should not be interpreted as causing a departure from the scope of this application.
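
The bullets above describe the progressive-scan drive schemes of FIG. 6 to FIG. 8 in prose. The following minimal Python sketch is not part of the patent: the function and parameter names (frame_drive_levels, sinusoidal_drive) and the normalized drive values are hypothetical, and it only illustrates one way such per-frame scheduling could be expressed, namely a two-frame or four-frame periodic frame for the visible drive, a fluorescence drive that is either constant or varied in opposite phase, and a sinusoidal current whose half-period equals one frame exposure.

```python
import math

def frame_drive_levels(frame_index, visible_high, visible_low,
                       frames_per_state=1, reverse_fluorescence=False,
                       fluo_high=1.0, fluo_low=0.5):
    """Return (visible_drive, fluorescence_drive) for one progressive-scan frame.

    frames_per_state=1: two-frame periodic frame as in FIG. 6 (high, low, high, low, ...).
    frames_per_state=2: four-frame periodic frame as in FIG. 8, where each intensity
        state persists for two consecutive frames (the figure's high, low, low, high
        sequence is the same pattern with a phase offset).
    reverse_fluorescence=True: fluorescence drive varies in opposite phase to the
        visible drive, as in FIG. 7; otherwise it is held constant (FIG. 6 and FIG. 8).
    """
    high_phase = (frame_index // frames_per_state) % 2 == 0
    visible = visible_high if high_phase else visible_low
    fluo = (fluo_low if high_phase else fluo_high) if reverse_fluorescence else fluo_high
    return visible, fluo

def sinusoidal_drive(t, frame_time, i_high, i_low):
    """Continuous sinusoidal drive current whose half-period equals one frame exposure
    time, so consecutive frames integrate alternately bright and dark visible light."""
    mean = 0.5 * (i_high + i_low)
    amplitude = 0.5 * (i_high - i_low)  # peak-to-trough amplitude sets the frame-to-frame difference
    return mean + amplitude * math.sin(math.pi * t / frame_time)
```

In use, a capture loop would call frame_drive_levels(n, ...) once per frame n and program the visible and excitation source drivers accordingly through the light source controller.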
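For the interlaced output mode of FIG. 9, the same idea is applied per field rather than per frame. The sketch below is again only an illustration under assumed names (field_drive_levels, weave_fields); the patent specifies only that the odd and even fields carry the high and low visible-spectrum intensities in alternation and that the two fields complement each other into a full picture.

```python
def field_drive_levels(visible_high, visible_low, odd_field_bright=True):
    """Return (odd_field_drive, even_field_drive) for one interlaced frame.
    Either assignment is allowed: odd field bright and even field dark, or the reverse."""
    if odd_field_bright:
        return visible_high, visible_low
    return visible_low, visible_high

def weave_fields(odd_field_rows, even_field_rows):
    """Recombine the two complementary fields into one full frame: the odd (upper-field)
    lines are scanned first, then the even (lower-field) lines, and the rows interleave
    to form the complete picture."""
    frame = []
    for odd_row, even_row in zip(odd_field_rows, even_field_rows):
        frame.append(odd_row)
        frame.append(even_row)
    return frame
```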
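Finally, the reference-intensity feedback of FIG. 10 can be written down compactly. The patent states the steps (minimum, maximum, and average of the active fluorescent pixels; a median derived from the minimum and maximum, optionally together with the maximum gray level; the smaller of median and average as the preset reference, i.e. the second, low visible-spectrum intensity; the first, high intensity chosen from visible-image quality; and their maximum difference fixing the drive amplitude A), but not the exact formulas. The midpoint used as the "median", the clipping by the maximum gray level, the direct use of the intensity difference as the current amplitude, and the numpy-based interface below are therefore assumptions for illustration only.

```python
import numpy as np

def visible_reference_and_amplitude(active_fluor_pixels, first_intensity,
                                    max_gray_level=None):
    """Compute the second (low) visible-spectrum intensity value and the maximum
    drive amplitude A from the statistics of the active fluorescent pixels."""
    px = np.asarray(active_fluor_pixels, dtype=float)
    px_min, px_max, px_mean = px.min(), px.max(), px.mean()

    # "Median" of the active fluorescent pixels derived from the computed minimum and
    # maximum; optionally the sensor's maximum gray level (its full dynamic range,
    # i.e. the largest intensity the pixel unit can sense) is folded in as well.
    upper = px_max if max_gray_level is None else min(px_max, max_gray_level)
    px_median = 0.5 * (px_min + upper)

    # The smaller of the median and the average becomes the preset reference, i.e.
    # the second (low) light intensity value of the visible spectrum segment.
    second_intensity = min(px_median, px_mean)

    # The first (high) value is chosen for visible-image quality and supplied by the
    # caller; the maximum difference between the two determines the maximum amplitude
    # A used to control the visible light source current or voltage.
    amplitude_a = abs(first_intensity - second_intensity)
    return second_intensity, amplitude_a
```

In a running system, the digital image processing module 4 would recompute these statistics continuously and the system controller 5 would feed the result back to the light source controller 6.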

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Endoscopes (AREA)

Abstract

A multispectral imaging system includes: a light source module (1) that provides visible light and generates excitation light, the illuminated area being illuminated by a combination of the visible light and the excitation light and releasing fluorescence when excited by the excitation light; and a camera module (2) comprising one or more image sensors (10) that receive the visible light and the fluorescence in the light reflected from the illuminated area, so as to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the light intensity value of the visible spectrum segment changes periodically over consecutive video frames, so that the bright and dark parts of the visible-light video image are interleaved across consecutive frames. The multispectral imaging system makes the visible-light background image clear and bright while the fluorescence imaging has high contrast and low noise.

Description

一种用于可见光和激发荧光实时成像的系统和方法 技术领域
本申请涉及光学成像技术,更具体地涉及内窥镜多光谱成像领域。
背景技术
光学成像在临床医学中应用广泛,具有对人体无害、非侵入性、高灵敏度和可进行在体多目标成像的优点。其中,随着荧光显影剂的不断拓展和创新,荧光成像更是成为术中导航的重要工具。目前,基于近红外荧光显影剂吲哚菁绿(Indocyanine Green,ICG)的应用技术也日益成熟。荧光显影剂吲哚菁绿通过静脉注射后,与血浆蛋白结合,通过805nm波段左右的近红外光的激发,释放出比激发光波长更长的835nm左右波段荧光。成像系统将捕捉到的荧光信号通过算法处理,在屏幕上实时地显示白光、荧光或叠加图像。这种实时荧光显影的方法与传统的有机染料相比,对比度更高,对目标组织的识别效果更佳,且避免了手术后期染料弥散。
对于可见光和荧光同时成像而言,系统输出图像的每一图像帧为多光谱融合图像,根据光谱波长段的不同,一帧图像中可以分为可见光谱段图像信息和荧光谱段图像信息。在采用荧光显影剂导航进行肿瘤手术切割的时候,通常荧光标记的组织区域是希望能够被精准切除,而这就需要荧光组织区域清晰且荧光区域边界分明。同时,手术者也希望可见光谱段背景组织区域也能够清晰明确,以便更好的识别组织。然而,被标记的荧光区域组织上既包含荧光谱段又有可见光谱段的信息,且可见光谱段的光强度通常是远远强于荧光谱段的光强度。这导致了荧光通常被可见光淹没,导致在多光谱图像中荧光较暗,不清晰,对比度较低等问题,而通常荧光区域边界处组织病变程度更低,被激发的荧光通常更弱,在强的可见光谱段光强度下,荧光区域的边界更加的模糊不清楚。
因此,本领域需要改进的多光谱成像技术,以提升可见光和激发荧光同时成像的图像质量。
发明内容
为了实现上述目的,本申请提供了一种多光谱成像系统,包括:光源模块,提供可见光并产生激发光,被照射区由可见光和激发光组合照射,被照射区受激发光的激发而释放出荧光;相机模块,包括一个或多个图像传感器,所述图像传感器接收来自被照射区的反射光中的可见光和荧光,以形成包含可见光谱段图像信息和荧光谱段图像信息的多光谱图像;其中,可见光谱段的光强度值在连续的视频帧图像中呈周期性变化,使可见光视频图像在连续帧中亮暗部分相交织。
根据本申请的一个方面,可见光谱段的光强度值包括可见光谱段的第一光强度值和可见光谱段的第二光强度值,可见光谱段的第一光强度值和可见光谱段的第二光强度值分别在连续的视频帧图像的每个周期帧中交替周期性循环出现。
根据本申请的一个实施例,在视频图像采取逐行扫描方式下,所述一个或多个图像传感器获取多光谱图像中的可见光谱段图像信息和荧光谱段图像信息;每个周期帧为两帧多光谱融合图像,图像帧中荧光谱段的光强度值在图像的连续帧中保持不变,可见光谱段的光强度值在上半个周期帧被设置成可见光谱段的第一光强度值,可见光谱段的光强度值在下半个周期帧被设置成可见光谱段的第二光强度值,可见光谱段的第一光强度值不同于可见光谱段的第二光强度值。
根据本申请的一个方面,可见光谱段的第一光强度值和可见光谱段的第二光强度值是根据预设的可见光谱段的基准光强度值来设置的。
根据本申请的一个方面,可见光谱段的第一光强度值和可见光谱段的第二光强度值是根据荧光谱段的光强度值来设置的。
根据本申请的一个方面,可见光谱段的光强度值在连续视频帧图像中呈周期函数变化。
根据本申请的一个方面,通过调节所述周期函数的波峰波谷间的最大幅值来调节图像的相邻帧图像中可见光谱段的光强度差值。
根据本申请的一个方面,通过调节光源模块从而控制可见光谱段的光强度值。
根据本申请的另一个实施例,在采取视频逐行扫描方式中,荧光谱段的光强度值包括荧光谱段的第一光强度值和荧光谱段的第二光强度值,荧光谱段的第一光强度值和荧光谱段的第二光强度值分别在视频图像的每个周期帧中交替周期性循环出现,并且荧光谱段的光强度值的周期性变化与可见光谱段的光强度值的周期性变 化呈反向。
根据本申请的一个方面,荧光谱段的光强度值和可见光谱段的光强度值在视频帧图像中呈周期函数变化,且荧光谱段的光强度值的变化与荧光光源的电流或电压的变化相对应,可见光谱段的光强度值的变化与可见光光源的电流或电压的变化相对应。
根据本申请的又一个实施例,在采取视频逐行扫描方式中,每个周期帧为四帧图像,可见光谱段的第一光强度值或可见光谱段的第二光强度值出现在连续的两帧图像中。
根据本申请的一个方面,在第一帧中可见光谱段的光强度值被设置成可见光的第一光强度值,在第二帧中可见光谱段的光强度值被设置成可见光的第二光强度值,在第三帧中可见光谱段的光强度值被设置成可见光的第二光强度值,在第四帧中可见光谱段的光强度值被设置成可见光的第一光强度值。
根据本申请的一个方面,可见光谱段的光强度值在视频帧图像中呈周期函数进行变化,且可见光谱段的光强度值的变化与可见光光源的电流或电压的变化相对应。
根据本申请的又一个实施例,在采取视频隔行扫描方式中,视频的每一帧图像分为两个场图像,可见光谱段的第一光强度值和可见光谱段的第二光强度值分别在不同场图像中交替周期性出现。
根据本申请的一个方面,视频每一帧多光谱图像被分为两个场图像进行隔行扫描视频输出,两个场图像互补形成多光谱图像。
根据本申请的一个方面,可见光谱段的第一光强度值不同于可见光谱段的第二光强度值。
根据本申请的一个方面,可见光谱段的光强度值在视频帧图像中呈周期函数进行变化,且可见光谱段的光强度值的变化与可见光光源的电流或电压的变化相对应。
根据本申请的一个方面,可见光谱段的第二光强度值根据如下方式来设置:计算活跃荧光像素最小值、活跃荧光像素最大值、所有活跃荧光像素的平均值;根据计算出的活跃荧光像素最小值和活跃荧光像素最大值计算活跃荧光像素的中值;选取荧光像素的中值和活跃荧光像素的平均值两者中的最小值;以及 根据选取的最小值,确定预设的可见光谱段的基准光强度值,作为可见光谱段的第二光强度值。
根据本申请的一个方面,可见光谱段的第一光强度值是根据可见光谱段图像的成像质量来设置的。
根据本申请的一个方面,通过可见光谱段的第一光强度值与可见光谱段的第二光强度值的最大差值,确定用于控制光源电流或电压的最大幅值,以调节光强度。
本申请还提供了一种多光谱成像方法,包括:由可见光和激发光组合照射,被照射区受激发光的激发而释放出荧光;接收来自被照射区的反射光中的可见光和荧光,以形成包含可见光谱段图像信息和荧光谱段图像信息的多光谱图像;其中,可见光谱段的光强度值在连续的视频帧图像中呈周期性变化,使可见光视频图像在连续帧中亮暗部分相交织。
根据本申请的一个方面,可见光谱段的光强度值包括可见光谱段的第一光强度值和可见光谱段的第二光强度值,可见光谱段的第一光强度值和可见光谱段的第二光强度值分别在连续的视频帧图像的每个周期帧中交替周期性循环出现。
根据本申请的一个实施例,在视频图像采取逐行扫描方式下,每个周期帧为两帧多光谱融合图像,图像帧中荧光谱段的光强度值在图像的连续帧中保持不变,可见光谱段的光强度值在上半个周期帧被设置成可见光谱段的第一光强度值,可见光谱段的光强度值在下半个周期帧被设置成可见光谱段的第二光强度值,可见光谱段的第一光强度值不同于可见光谱段的第二光强度值。
根据本申请的另一个实施例,在采取视频逐行扫描方式中,荧光谱段的光强度值包括荧光谱段的第一光强度值和荧光谱段的第二光强度值,荧光谱段的第一光强度值和荧光谱段的第二光强度值分别在视频图像的每个周期帧中交替周期性循环出现,并且荧光谱段的光强度值的周期性变化与可见光谱段的光强度值的周期性变化呈反向。
根据本申请的又一个实施例,在采取视频逐行扫描方式中,每个周期帧为四帧图像,可见光谱段的第一光强度值或可见光谱段的第二光强度值出现在连续的两帧图像中。
根据本申请的又一个实施例,在采取视频隔行扫描方式中,视频的每一帧图像分为两个场图像,可见光谱段的第一光强度值和可见光谱段的第二光强度值分别在 不同场图像中交替周期性出现。
根据本申请的又一个实施例,在输出多光谱图像之前,进行可见光谱段图像与荧光谱段图像的多光谱图像融合。
本申请还提供了一种计算机可读存储介质,所述计算机可读存储介质存储用于执行以下操作的计算机可执行指令:由可见光和激发光组合照射,被照射区受激发光的激发而释放出荧光;接收来自被照射区的反射光中的可见光和荧光,以形成包含可见光谱段图像信息和荧光谱段图像信息的多光谱图像;其中,可见光谱段的光强度值在连续的视频帧图像中呈周期性变化,使可见光视频图像在连续帧中亮暗部分相交织。
本申请还提供了一种用于多光谱成像中光强度调节方法,包括:计算活跃荧光像素最小值、活跃荧光像素最大值、所有活跃荧光像素的平均值;根据计算出的活跃荧光像素最小值、活跃荧光像素最大值,计算活跃荧光像素的中值;选取荧光像素的中值和活跃荧光像素的平均值两者中的最小值;根据选取的最小值,确定预设的可见光谱段的基准光强度值,作为可见光谱段的第二光强度值;根据可见光谱段图像的成像质量来设置可见光谱段的第一光强度值;计算可见光谱段的第一光强度值与可见光谱段的第二光强度值的最大差值;以及确定用于控制光源电流或电压的最大幅值,以调节光强度。
根据本申请的一个实施例,计算活跃荧光像素的中值还可以结合像素最大灰度级。
相比于已有技术,本申请提供的用于可见光和激发荧光实时成像的系统和方法能够使得可见光背景图像成像清晰明亮,同时荧光成像对比度高,噪点低。
本申请可广泛应用于医疗成像技术领域,包括医疗内窥镜、荧光显微镜等各个领域。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1为根据本申请一个实施例的多光谱成像系统的结构示意图;
图2为根据本申请一个实施例的采用单个图像传感器的多光谱成像系统的结构示意图;
图3为根据本申请一个实施例的光源模块采用多个单色光源组合的示意图;
图4为根据本申请一个实施例的相机模块采用两个图像传感器的示意图;
图5为根据本申请一个实施例的相机模块采用四个图像传感器的示意图;
图6为根据本申请一个实施例的逐行输出方式下一种成像方法示意图;
图7为根据本申请一个实施例的逐行输出方式下另一种成像方法示意图;
图8为根据本申请一个实施例的逐行输出方式下又一种成像方法示意图;
图9为根据本申请一个实施例的隔行输出方式下一种成像方法示意图;
图10为根据本申请一个实施例的光强度基准预设值确认方法示意图;以及
图11是根据本申请的一个实施例的示例性计算机系统框图。
具体实施方式
将参照附图详细描述各种实施例。在可能之处,相同附图标记将贯穿附图用于指代相同或类似部分。对特定示例和实现所作的引用是用于说明性目的,而无意限定本申请或权利要求的范围。
措辞“示例性”在本文中用于表示“用作示例、实例或示出”。本文中描述为“示例性”的任何实现不必然被解释为优于或胜过其他实现。
图1示出了根据本申请一个实施例的多光谱成像系统的示例性结构示意图。在该示例中,多光谱成像系统可以用于荧光内窥镜。如图1所示,荧光内窥镜摄像系统可以包括光源模块1,用于成像的相机模块2,进行图像采集和预处理的数据采集和预处理模块3、数字图像处理模块4,系统控制器5和光源控制器6。其中,相机模块2可以包括光学镜头组件和一个或多个图像传感器(下图将详述)。系统控制器5可以根据数字图像处理模块4的反馈处理结果生成相应的控制信号。光源控制器6可以根据系统控制器5的控制信号对光源模块1进行照明控制。可以通过调节光源模块从而控制可见光谱段的光强度值。
图2示出了根据本申请一个实施例的采用单个图像传感器的多光谱成像系统的示例性结构示意图。如图2所示,光源模块1可以包括入射可见光LED 11、近红外(NIR)激发器12和组合分光镜13。在该示例中,相机模块2可以包括光学镜头组件9和单个图像传感器10(例如,CCD或CMOS图像传感器)。可任选地,光学镜头组件9还可以包括光学滤波器8。入射可见光LED 11可以提供可见光波长段(400nm~760nm)的照明。近红外(NIR)激发器12可以产生近红外波段(790nm~820nm)的NIR激发光,特别是800nm附近波长段的近红外激发光。可见光和NIR激发光可以通过组合分光镜13对被照射区(例如,组织)进行组合照射。当被注射荧光显影剂的组织被光源模块1中的NIR激发光照射时,组织中的荧光基团被激发而释放出比激发光波长更长波段的近红外荧光,例如830nm~900nm的荧光。来自被照射区(例如,组织)的反射光可以包括例如可见光、激发光(例如,NIR激发光)和荧光(例如,近红外荧光)三种光谱波段。其中NIR激发光被可选的光学滤波器8完全阻挡而不能进入图像传感器,而可见光和荧光透过光学镜头组件9进入具有NIR像素的单个图像传感器10。来自图像传感器10的数据可以通过数据采集和预处理模块3进入数字图像处理模块4,以形成可见光谱段图像和荧光谱段图像。系统控制器5可以根据数字图像处理模块4的反馈而输出信号给光源控制器6。光源控制器6可以通过控制驱动电流或者电压控制光源模块1。
图3为根据本申请一个实施例的光源模块采用多个单色光源组合的示例性示意图。该示例中,光源模块1可以包括如图3所示的三个单色光源组合形式。可见光源由红光LED 18(620nm~760nm)、绿光LED 19(492nm~577nm)和蓝光LED 20(400nm~450nm)三个单色光源通过相应的组合分光镜15、组合分光镜16和组合分光镜17进行组合。在多个单色光源组合的示例中,相机模块2可以包括一个或多个图像传感器。例如,相机模块2可以是如图2所示的单个图像传感器,也可以是如图4所示的双图像传感器组合。
图4为根据本申请一个实施例的相机模块采用两个图像传感器的示例性示意图。相机模块2可以包括多个图像传感器,以及相应的分光棱镜。该示例中,相机模块2可以采用可见光的图像传感器32(如,CCD/CMOS图像传感器)和荧光的图像传感器33(如,CCD/CMOS图像传感器)。来自组织的可见光 和荧光通过分光棱镜31分别进入相应的图像传感器。
图5为根据本申请一个实施例的相机模块采用四个图像传感器的示例性示意图。相机模块2可以包括四个图像传感器,以及相应的三个分光棱镜。如图5所示,当采用4-CCD/CMOS图像传感器组合时,采用棱镜分光技术(例如,分光棱镜34、分光棱镜35和分光棱镜36的组合)将光分别引导至三个RGB单色图像传感器(例如,红光图像传感器38、绿光图像传感器39和蓝光图像传感器40)和一个NIR荧光图像传感器37中,其中各图像传感器分别接收所对应波长段的反射光,以形成多光谱图像。
可以理解,本申请的光强度在视频图像连续帧中周期性调节可以通过一个图像传感器,也可以通过两个图像传感器,还可以通过四个传感器方案。单个图像传感器中集成荧光像素也可以获取多光谱融合图像,三个图像传感器(RGB)合成可见光图像,第四个图像传感器生成荧光的4CCD/CMOS也可以形成融合图像。因此,本申请采用一个或者多个图像传感器来产生可见光和荧光的融合图像帧,各个实施例中图像传感器的数量仅为列举而非限定。
图6为根据本申请一个实施例的逐行输出方式下一种成像方法示意图。图6示出了连续帧图像中周期性光强度变化的一个可选方式。在逐行输出方式下,系统输出图像的每一图像帧为多光谱融合图像,包括可见光谱段图像信息和荧光谱段图像信息。在所有连续帧图像中,荧光谱段的光强度可以被设置成保持不变。而系统图像输出的第一帧中,可见光谱段可以被设置为高光强度值,而第二帧中的可见光谱段可以被设置为低光强度值,第三帧图像中的可见光谱段又可以被设置为高光强度值,第四帧图像中的可见光谱段又可以变为低光强度值。后续图像帧亦是如此变化。可见光谱段高和低光强度值分别在图像的每一帧中交替周期性循环出现。图中高和低光强度值变化的一个周期为两帧图像的时间,即是一个周期帧为两帧图像,上半个周期帧为可见光谱段高光强度,下半个周期帧为可见光谱段低光强度。应当理解的是,图像可见光谱段的高强度或者低强度值均是相对于预设置的可见光谱段基准光强度,亦或者是相对于荧光谱段的强度。虽然图像相邻帧的可见光谱段光强具有亮暗差异,但是由于人的视觉暂留特性,以及人眼在一定光强之上对亮度变化敏感性较低的特点,这样可见光视频图像在连续图像帧中的亮暗部分交织在一起,人眼不易察觉这种 快速的可见光谱段亮暗变化。此外,在各图像帧包含的信息中,可见光谱段为高光强度的帧图像主要突出背景组织的图像细节和清晰度,而可见光谱段低光强帧图像则突出荧光区域的图像细节,同时突出与背景组织的对比度。如此,在连续图像帧的过程中,利用人眼的视觉暂留特性,即反映了背景组织高光强下的结构图像细节,也体现了荧光区域的具体图像细节。
此外,可见光谱段的光强度值在连续视频帧图像中通过呈周期函数进行调节变化。光强度可以由光源驱动电流或者电压进行控制。为产生连续周期性可见光谱段明暗交替帧图像,光源驱动电流可以为连续周期性变化函数。驱动电流越大,光源强度越大,光源驱动电流为周期性变化函数,例如正弦函数、余弦函数、正切函数、方波周期函数、三角周期函数、阶跃周期函数等。例如图6所示正弦函数变化,其半个周期时间为一帧图像曝光时间。
通过调节该函数波峰波谷间的最大幅值以调节相邻帧图像中可见光光强度差值,以适应人眼对光强变化敏感度,达到更好的视觉效果。此外,在具体实施中,图像的帧率范围可以在60fps(每秒帧数)至120fps之间进行调节。可以适当提高图像帧率至60fps-120fps,以消除光强变化过慢或者过大带来的屏闪现象,从而消除光强变化过大带来的人眼不适应。
图7为根据本申请一个实施例的逐行输出方式下另一种成像方法示意图。图7示出了连续帧图像中周期性光强度变化的另一个可选方式。在视频逐行输出方式下,当连续图像周期帧中更加强调可见光谱段图像和荧光区域图像之间的对比度的时候,系统图像周期帧中光强变化可如图7所示。荧光谱段的光强度值的周期性变化与可见光谱段的光强度值的周期性变化呈反向。可见光谱段为高光强度值时,荧光谱段为低光强。而可见光谱段为低光强度值时,荧光谱段为高光强变化。各光源对应的驱动电流也如图7所示变化。如此在一帧多光谱融合图像中,可见光谱段和荧光谱段的图像光强对比差值更加突出,从而在连续的一个周期帧中,可见光谱段图像和荧光谱段图像更加的分明。在该示例中,荧光谱段的光强度值和可见光谱段的光强度值通过在视频帧图像中呈周期函数进行调节变化。
图8为根据本申请一个实施例的逐行输出方式下又一种成像方法示意图。图8示出了连续帧图像中周期性光强度变化的又一个可选方式。在视频逐行输 出方式下,当光源驱动电流为类似余弦函数周期变化时,系统图像帧信息中可见光谱段光强度变化如图8所示。根据前述的描述,一个周期帧为4帧图像,可见光谱段高或者低光强度值出现在连续的两帧图像中,即是在连续帧图像中提高了某一光强状态的周期时间。可见光谱段图像和荧光谱段图像不同光强状态更加的分明。
图9为根据本申请一个实施例的隔行输出方式下一种成像方法示意图。图9示出了隔行输出方式下连续帧图像中周期性光强度调节方法。当图像采用隔行输出方式下,如图9所示,此时光源驱动电流为连续周期性函数变化,例如正弦周期函数等。当图像采用隔行输出方式下,图像一帧图像分为奇数和偶数两个场图像信息,可见光谱段高和低光强度值分别在不同场图像中交替周期性出现。奇场为可见光谱段高光强度,偶场为可见光谱段低光强度,或者相反变化。最后,图像采用隔行输出显示。在采用隔行输出时,先扫描图像奇数行(上场),然后再扫描图像偶数行(下场),两者互补成完整的画面。虽然扫描下场时,上场的亮度衰减了,但是由于可见光谱段亮暗的部分相交织,利用人的视觉特点,反而不易察觉。采用隔行输出,图像信息被分为两个场图像分别输出,这种方式对于本申请的视觉效果更加具有连续性。此外,隔行输出能支持更高分辨率的图像输出。
应当理解的是,作为示例,可见光谱段的光强度值可以采取先高后低(如图示实施方式),但本申请并不局限于此。作为替换,在每个周期帧中,可见光谱段的光强度值可以被设置成先低后高。而荧光谱段的光强值也可以被设置成先高后低。
此外,在本申请的各实施例中,荧光谱段的光强度值的变化可以与荧光光源的电流或电压的变化相对应。可见光谱段的光强度值的变化可以与可见光光源的电流或电压的变化相对应。
此外,可选地,在输出多光谱图像之前,可以先对可见光谱段图像与荧光谱段图像进行图像融合,经图像处理后,输出的多光谱融合图像视觉效果更加。
图10为根据本申请一个实施例的光强度基准预设值确认方法示意图。上文提到,图像可见光谱段的高强度或者低强度值均是相对于预设置的可见光谱段基准光强度、或者是相对于荧光谱段的光强度。下面将描述可见光谱段的光强 度值变化的基准强度反馈调节。如图10所示,在可见光谱段低光强度作为荧光区域背景时,数字图像处理模块不断计算图像帧中处于活跃的荧光像素最小值、最大值和所有活跃荧光像素的平均值。根据活跃荧光像素的最小值、最大值和像素最大灰度级,计算活跃荧光像素的中值。可选地,计算活跃荧光像素的中值还可以结合像素最大灰度级。此处像素最大灰度级可以是荧光像素最大动态范围,即是像素传感器单元可感受到光强度的最大值。然后,比较活跃荧光像素中值和平均值的大小,选取荧光像素的中值和活跃荧光像素的平均值两者中的最小值。根据选取的最小值确定可见光光源低光强度帧时的最佳预设光强度。而图像可见光谱段的高强度值则根据可见光图像的成像质量进行确定。最后,计算可见光光源高光强度值和低光强度的最大差值,确定光源控制电流的最大幅值A,完成系统荧光模式的光强调节。
本申请的方案根据人眼的视觉暂留特性,结合不同视频输出方式,对连续帧图像信息中,通过调节视频图像中不同帧图像的可见光谱段光强度,使得高和低可见光谱段光强度在不同帧图像中交替周期性循环输出。这样在一个连续的帧图像信息中,既可以输出高亮度的可见光谱段图像,更加突出背景组织的可见光图像细节,也可以输出低可见光强度背景下高对比度的荧光图像信息。如此,在一系列的连续帧图像信息中,即体现了可见光谱段图像的清晰细节,又体现了荧光近红外谱段图像的高对比度下的图像细节。即采用隔帧输出不同光强度和对比度的图像周期帧技术,在保证荧光图像信息高对比度的同时,也输出清晰可见光图像信息。
根据本申请的方案,解决了现有的荧光和可见光谱段同时成像中时,高可见光强度背景图像下荧光区域对比度低的缺陷。解决了市场上荧光和可见光谱段同时成像中时,低可见光强度背景图像下可见光图像成像质量差的缺陷。
应当理解,上述实施例仅作为示例而非限制,除了上述逐行扫描和隔行扫描方案之外,本领域技术人员还可以构想更多其他类似方法,既能体现可见光谱段图像的清晰细节,又能体现荧光近红外谱段图像的高对比度下的图像细节。这样的实现方式不应被解读成导致脱离了本申请的范围。
参考图11,示出了一种示例性的计算机系统1100。计算机系统1100可以包括逻辑处理器1102,例如执行核。尽管图示了一个逻辑处理器1102,但在 其它的实施例中,计算机系统1100可以具有多个逻辑处理器,例如,每个处理器基片多个执行核,和/或多个处理器基片,其中每个处理器基片可以具有多个执行核。如图所示,各种计算机可读存储介质1110可以通过一条或多条系统总线互连,所述系统总线将各种系统组件耦合到逻辑处理器1102。系统总线可以是若干类型的总线结构中的任何类型,包括存储器总线或存储器控制器、外围总线、以及使用各种各样的总线体系结构中的任一种的本地总线。在示例性的实施例中,计算机可读存储介质1110可以包括例如随机存取存储器(RAM)1104、存储设备1106(例如,机电硬驱动器、固态硬驱动器等等)、固件1108(例如,快闪RAM或ROM)、以及可拆卸存储设备1118(诸如像CD-ROM、软盘、DVD、闪存驱动器、外部存储设备等等)。本领域的技术人员应当意识到,可以使用其它类型的计算机可读存储介质,诸如磁带盒、闪存卡和/或数字视频盘。计算机可读存储介质1110可以提供计算机可执行的指令1122、数据结构、程序模块、以及用于计算机系统1100的其它数据的非易失性和易失性存储。基本输入/输出系统(BIOS)1120——其包含诸如在启动期间帮助在计算机系统1100的单元之间转移信息的基本例行程序——可以被存储在固件1108中。大量的程序可以被存储在固件1108、存储设备1106、RAM 1104和/或可拆卸存储设备1118上,并可以由逻辑处理器1102执行,逻辑处理器1102包括操作系统和/或应用程序。命令和信息可以通过输入设备1116被计算机系统1100所接收,输入设备1116可以包括但不限于键盘和指向设备。其它的输入设备可以包括话筒、操纵杆、游戏手柄、扫描仪等等。这些和其它输入设备常常通过被耦合到系统总线的串行端口接口而被连接到逻辑处理器1102,但是也可以通过其它的接口连接,诸如并行端口、游戏端口或通用串行总线(USB)。显示器或其它类型的显示设备也可以经由接口连接到系统总线,比如视频适配器,其可以是图形处理单元1112的一部分或被连接到图形处理单元1112。除了显示器,计算机典型地包括其它的外围输出设备,诸如扬声器和打印机(未示出)。图11的示例性系统还可以包括主机适配器、小型计算机系统接口(SCSI)总线、以及连接到SCSI总线的外部存储设备。计算机系统1100可以在使用到一个或多个远程计算机(诸如,某个远程计算机)的逻辑连接的联网环境中操作。远程计算机可以是另外的计算机、服务器、 路由器、网络PC、对等设备或其它常见的网络节点,以及典型地可以包括上面相对于计算机系统1100描述的单元中的多个或所有单元。当在LAN或WAN联网环境中使用时,计算机系统1100可以通过网络接口卡1114被连接到LAN或WAN。网卡(NIC)1114(其可以是内部的或外部的)可以被连接到系统总线。在联网环境中,相对于计算机系统1100描绘的程序模块或者它们的一些部分可以被存储在远程存储器存储设备中。应意识到,这里描述的网络连接是示范性的,以及可以使用在计算机之间建立通信链路的其它手段。
在一个或更多个示例性实施例中,所描述的功能和过程可以在硬件、软件、固件、或其任何组合中实现。如果在软件中实现,则各功能可以作为一条或更多条指令或代码存储在计算机可读介质上或藉其进行传送。计算机可读介质包括计算机存储介质和通信介质两者,其包括促成计算机程序从一地向另一地转移的任何介质。存储介质可以是能被计算机访问的任何可用介质。作为示例而非限定,这样的计算机可读介质可包括RAM、ROM、EEPROM、CD-ROM或其它光盘存储、磁盘存储或其它磁存储设备、或能被用来携带或存储指令或数据结构形式的合意程序代码且能被计算机访问的任何其它介质。对于固件和/或软件实现,这些方法可以用执行本文中所描述功能的模块(例如,程序、函数等等)来实现。有形地体现指令的任何机器可读介质可用于实现本文中所描述的方法体系。例如,软件代码可被存储在例如移动站的存储器之类的存储器中,并由例如台式计算机、膝上型计算机、服务器计算机、移动设备的微处理器等处理器执行。存储器可以实现在处理器内部或处理器外部。如本文所使用的,术语“存储器”是指任何类型的长期、短期、易失性、非易失性、或其他存储器,而并不限于任何特定类型的存储器或特定数目的存储器、或记忆存储在其上的介质的类型。
上述描述和图示仅作为说明性示例提供。对单数形式的权利要求元素的任何引述,例如使用冠词“一”、“某”或“该”的引述不应解释为将该元素限定为单数。例如本文使用的“第一”、“第二”等名称对源深的任何引用通常不限制那些元素的数量或次序。对第一和第二元件的引用并不意味着此处采用仅仅两个元件,也不意味着第一元件必须以某种方式在第二元件之前。除非另外说明,否则一组元件可以包括一个或多个元件。技术人员对于每种特定应用 可用不同的方式来实现所描述的结构,但这样的实现方式不应被解读成导致脱离了本申请的范围。
提供所公开的实施例的先前描述是为了使本领域任何技术人员皆能制作或使用本申请。对这些实施例的各种修改对本领域技术人员来说将是显而易见的,且本文所定义的一般原理可被应用于其它实施例而不背离本申请的精神或范围。由此,本申请并非旨在限定于本文中示出的实施例,而是应被授予与所附权利要求和本文中公开的原理和新颖性特征一致的最广义的范围。

Claims (30)

  1. 一种多光谱成像系统,包括:
    光源模块,提供可见光并产生激发光,被照射区由可见光和激发光组合照射,被照射区受激发光的激发而释放出荧光;
    相机模块,包括一个或多个图像传感器,所述图像传感器接收来自被照射区的反射光中的可见光和荧光,以形成包含可见光谱段图像信息和荧光谱段图像信息的多光谱图像;
    其中,可见光谱段的光强度值在连续的视频帧图像中呈周期性变化,使可见光视频图像在连续帧中亮暗部分相交织。
  2. 根据权利要求1所述的多光谱成像系统,其特征在于:
    可见光谱段的光强度值包括可见光谱段的第一光强度值和可见光谱段的第二光强度值,可见光谱段的第一光强度值和可见光谱段的第二光强度值分别在连续的视频帧图像的每个周期帧中交替周期性循环出现。
  3. 根据权利要求2所述的多光谱成像系统,其特征在于:
    在视频图像采取逐行扫描方式下,所述一个或多个图像传感器获取多光谱图像中的可见光谱段图像信息和荧光谱段图像信息;每个周期帧为两帧多光谱融合图像,图像帧中荧光谱段的光强度值在图像的连续帧中保持不变,可见光谱段的光强度值在上半个周期帧被设置成可见光谱段的第一光强度值,可见光谱段的光强度值在下半个周期帧被设置成可见光谱段的第二光强度值,可见光谱段的第一光强度值不同于可见光谱段的第二光强度值。
  4. 根据权利要求2所述的多光谱成像系统,其特征在于:
    可见光谱段的第一光强度值和可见光谱段的第二光强度值是根据预设的可见光谱段的基准光强度值来设置的。
  5. 根据权利要求2所述的多光谱成像系统,其特征在于:
    可见光谱段的第一光强度值和可见光谱段的第二光强度值是根据荧光谱段的光强度值来设置的。
  6. 根据权利要求3所述的多光谱成像系统,其特征在于:
    可见光谱段的光强度值在连续视频帧图像中呈周期函数变化。
  7. 根据权利要求6所述的多光谱成像系统,其特征在于:
    通过调节所述周期函数的波峰波谷间的最大幅值来调节图像的相邻帧图像中可见光谱段的光强度差值。
  8. 根据权利要求1所述的多光谱成像系统,其特征在于:
    通过调节光源模块从而控制可见光谱段的光强度值。
  9. 根据权利要求2所述的多光谱成像系统,其特征在于:
    在采取视频逐行扫描方式中,荧光谱段的光强度值包括荧光谱段的第一光强度值和荧光谱段的第二光强度值,荧光谱段的第一光强度值和荧光谱段的第二光强度值分别在视频图像的每个周期帧中交替周期性循环出现,并且荧光谱段的光强度值的周期性变化与可见光谱段的光强度值的周期性变化呈反向。
  10. The multispectral imaging system according to claim 9, wherein:
    the light intensity value of the fluorescence spectrum segment and the light intensity value of the visible spectrum segment vary as periodic functions over the video frame images, the variation of the light intensity value of the fluorescence spectrum segment corresponds to the variation of the current or voltage of the fluorescence light source, and the variation of the light intensity value of the visible spectrum segment corresponds to the variation of the current or voltage of the visible light source.
  11. The multispectral imaging system according to claim 2, wherein:
    in a progressive video scanning mode, each periodic frame consists of four frames of images, and the first light intensity value of the visible spectrum segment or the second light intensity value of the visible spectrum segment appears in two consecutive frames of images.
  12. The multispectral imaging system according to claim 11, wherein:
    the light intensity value of the visible spectrum segment is set to the first light intensity value of the visible light in the first frame, to the second light intensity value of the visible light in the second frame, to the second light intensity value of the visible light in the third frame, and to the first light intensity value of the visible light in the fourth frame.
  13. The multispectral imaging system according to claim 11, wherein:
    the light intensity value of the visible spectrum segment varies as a periodic function over the video frame images, and the variation of the light intensity value of the visible spectrum segment corresponds to the variation of the current or voltage of the visible light source.
  14. The multispectral imaging system according to claim 2, wherein:
    in an interlaced video scanning mode, each frame of the video image is divided into two field images, and the first light intensity value of the visible spectrum segment and the second light intensity value of the visible spectrum segment alternately and periodically appear in different field images.
  15. The multispectral imaging system according to claim 14, wherein:
    each frame of the multispectral video image is divided into two field images for interlaced video output, and the two field images complement each other to form the multispectral image.
  16. The multispectral imaging system according to claim 14, wherein:
    the first light intensity value of the visible spectrum segment is different from the second light intensity value of the visible spectrum segment.
  17. The multispectral imaging system according to claim 14, wherein:
    the light intensity value of the visible spectrum segment varies as a periodic function over the video frame images, and the variation of the light intensity value of the visible spectrum segment corresponds to the variation of the current or voltage of the visible light source.
  18. The multispectral imaging system according to claim 4 or 5, wherein the second light intensity value of the visible spectrum segment is set in the following manner:
    calculating a minimum value of the active fluorescent pixels, a maximum value of the active fluorescent pixels, and an average value of all the active fluorescent pixels;
    calculating a mid value of the active fluorescent pixels from the calculated minimum value of the active fluorescent pixels and maximum value of the active fluorescent pixels;
    selecting the smaller of the mid value of the fluorescent pixels and the average value of the active fluorescent pixels; and
    determining, from the selected smaller value, a preset reference light intensity value of the visible spectrum segment as the second light intensity value of the visible spectrum segment.
  19. The multispectral imaging system according to claim 18, wherein:
    the first light intensity value of the visible spectrum segment is set according to the imaging quality of the visible spectrum segment image.
  20. The multispectral imaging system according to claim 19, wherein:
    the maximum amplitude used to control the current or voltage of the light source is determined from the maximum difference between the first light intensity value of the visible spectrum segment and the second light intensity value of the visible spectrum segment, so as to adjust the light intensity.
  21. A multispectral imaging method, comprising:
    illuminating an area with a combination of visible light and excitation light, the illuminated area releasing fluorescence when excited by the excitation light;
    receiving the visible light and the fluorescence in the light reflected from the illuminated area to form a multispectral image containing visible spectrum segment image information and fluorescence spectrum segment image information;
    wherein the light intensity value of the visible spectrum segment varies periodically over consecutive video frame images, so that bright and dark portions of the visible light video image are interleaved across consecutive frames.
  22. The multispectral imaging method according to claim 21, wherein:
    the light intensity value of the visible spectrum segment comprises a first light intensity value of the visible spectrum segment and a second light intensity value of the visible spectrum segment, and the first light intensity value of the visible spectrum segment and the second light intensity value of the visible spectrum segment alternately and periodically recur in each periodic frame of the consecutive video frame images.
  23. The multispectral imaging method according to claim 22, wherein:
    when the video image is acquired in a progressive scanning mode, each periodic frame consists of two frames of multispectral fused images, the light intensity value of the fluorescence spectrum segment in the image frames remains unchanged over consecutive frames of the image, the light intensity value of the visible spectrum segment is set to the first light intensity value of the visible spectrum segment in the first half of the periodic frame and to the second light intensity value of the visible spectrum segment in the second half of the periodic frame, and the first light intensity value of the visible spectrum segment is different from the second light intensity value of the visible spectrum segment.
  24. The multispectral imaging method according to claim 22, wherein:
    in a progressive video scanning mode, the light intensity value of the fluorescence spectrum segment comprises a first light intensity value of the fluorescence spectrum segment and a second light intensity value of the fluorescence spectrum segment, the first light intensity value of the fluorescence spectrum segment and the second light intensity value of the fluorescence spectrum segment alternately and periodically recur in each periodic frame of the video image, and the periodic variation of the light intensity value of the fluorescence spectrum segment is opposite in phase to the periodic variation of the light intensity value of the visible spectrum segment.
  25. The multispectral imaging method according to claim 22, wherein:
    in a progressive video scanning mode, each periodic frame consists of four frames of images, and the first light intensity value of the visible spectrum segment or the second light intensity value of the visible spectrum segment appears in two consecutive frames of images.
  26. The multispectral imaging method according to claim 22, wherein:
    in an interlaced video scanning mode, each frame of the video image is divided into two field images, and the first light intensity value of the visible spectrum segment and the second light intensity value of the visible spectrum segment alternately and periodically appear in different field images.
  27. The multispectral imaging method according to claim 22, wherein:
    before the multispectral image is output, multispectral image fusion of the visible spectrum segment image and the fluorescence spectrum segment image is performed.
  28. A computer-readable storage medium storing computer-executable instructions for performing the following operations:
    illuminating an area with a combination of visible light and excitation light, the illuminated area releasing fluorescence when excited by the excitation light;
    receiving the visible light and the fluorescence in the light reflected from the illuminated area to form a multispectral image containing visible spectrum segment image information and fluorescence spectrum segment image information;
    wherein the light intensity value of the visible spectrum segment varies periodically over consecutive video frame images, so that bright and dark portions of the visible light video image are interleaved across consecutive frames.
  29. A light intensity adjustment method for use in multispectral imaging, comprising:
    calculating a minimum value of the active fluorescent pixels, a maximum value of the active fluorescent pixels, and an average value of all the active fluorescent pixels;
    calculating a mid value of the active fluorescent pixels from the calculated minimum value of the active fluorescent pixels and maximum value of the active fluorescent pixels;
    selecting the smaller of the mid value of the fluorescent pixels and the average value of the active fluorescent pixels;
    determining, from the selected smaller value, a preset reference light intensity value of the visible spectrum segment as the second light intensity value of the visible spectrum segment;
    setting the first light intensity value of the visible spectrum segment according to the imaging quality of the visible spectrum segment image;
    calculating a maximum difference between the first light intensity value of the visible spectrum segment and the second light intensity value of the visible spectrum segment; and
    determining a maximum amplitude for controlling the current or voltage of the light source, so as to adjust the light intensity.
  30. The method according to claim 29, wherein:
    the calculation of the mid value of the active fluorescent pixels further incorporates the maximum gray level of the pixels.
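The intensity-adjustment steps recited in claims 29 and 30 (and the corresponding system feature of claims 18-20), together with the per-frame alternation of claims 3 and 12, can be illustrated with a short, non-limiting Python sketch. The sketch is not part of the disclosure: the threshold that defines an "active" fluorescent pixel, the clamp of the mid value by the maximum gray level, and the linear mapping from the selected fluorescence statistic to the visible-light setpoint are assumptions introduced only to make the example runnable.

```python
import numpy as np

def visible_intensity_setpoints(fluo_frame, first_value, active_threshold=1, max_gray=255):
    """Illustrative reading of claims 29-30; assumptions are flagged in the comments."""
    fluo = np.asarray(fluo_frame, dtype=float)
    active = fluo[fluo >= active_threshold]      # "active" pixels: assumed to mean above a threshold
    if active.size == 0:
        return first_value, first_value, 0.0     # no fluorescence present: nothing to adjust

    f_min = float(active.min())                  # minimum of the active fluorescent pixels
    f_max = float(active.max())                  # maximum of the active fluorescent pixels
    f_mean = float(active.mean())                # average of all active fluorescent pixels

    # Mid value from the computed min and max; claim 30 lets the maximum gray level
    # enter this step, modelled here as a simple clamp (an assumption).
    f_mid = min((f_min + f_max) / 2.0, float(max_gray))

    selected = min(f_mid, f_mean)                # smaller of the mid value and the average

    # Map the selected statistic to a preset reference intensity of the visible spectrum
    # segment, used as the second setpoint; the linear mapping below is a placeholder.
    second_value = selected / float(max_gray) * first_value

    # Maximum difference between the two setpoints; this amplitude would drive the
    # current or voltage swing of the visible light source (claim 20).
    max_amplitude = abs(first_value - second_value)
    return first_value, second_value, max_amplitude


def visible_setpoint_for_frame(frame_index, first_value, second_value, period_frames=2):
    """Per-frame visible-light setpoint for the progressive-scan schemes of claims 3 and 12."""
    phase = frame_index % period_frames
    if period_frames == 2:                       # claim 3: first value, then second value
        return first_value if phase == 0 else second_value
    if period_frames == 4:                       # claim 12: first, second, second, first
        return first_value if phase in (0, 3) else second_value
    raise ValueError("only the 2-frame and 4-frame periods of claims 3 and 12 are sketched")
```

Under these assumptions, for an 8-bit sensor a call such as visible_intensity_setpoints(fluo_frame, first_value=200) would return the two setpoints and the drive amplitude, and visible_setpoint_for_frame(i, first_value, second_value) would schedule them frame by frame.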
PCT/CN2019/105570 2018-09-12 2019-09-12 一种用于可见光和激发荧光实时成像的系统和方法 WO2020052623A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811063689.7A CN110893095A (zh) 2018-09-12 2018-09-12 一种用于可见光和激发荧光实时成像的系统和方法
CN201811063689.7 2018-09-12

Publications (1)

Publication Number Publication Date
WO2020052623A1 true WO2020052623A1 (zh) 2020-03-19

Family

ID=69777319

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/105570 WO2020052623A1 (zh) 2018-09-12 2019-09-12 一种用于可见光和激发荧光实时成像的系统和方法

Country Status (2)

Country Link
CN (1) CN110893095A (zh)
WO (1) WO2020052623A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111887810A (zh) * 2020-07-24 2020-11-06 西北大学 一种近红外二区共径离轴光学-ct双模态成像系统及方法

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220156606A1 (en) * 2020-11-13 2022-05-19 International Business Machines Corporation Identification of a section of bodily tissue for pathology tests
CN114027765B (zh) * 2020-11-20 2023-03-24 上海微觅医疗器械有限公司 荧光内窥镜系统、控制方法和存储介质
CN113749772A (zh) * 2021-04-22 2021-12-07 上海格联医疗科技有限公司 一种增强近红外4k荧光导航系统
CN113749771A (zh) * 2021-04-30 2021-12-07 上海格联医疗科技有限公司 分子影像近红外二区荧光导航系统
CN113208567A (zh) * 2021-06-07 2021-08-06 上海微创医疗机器人(集团)股份有限公司 多光谱成像系统、成像方法和存储介质
CN113610823B (zh) * 2021-08-13 2023-08-22 南京诺源医疗器械有限公司 图像处理方法、装置、电子设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1341003A (zh) * 1999-01-26 2002-03-20 牛顿实验室公司 用于内窥镜的自发荧光成象系统
CN1802560A (zh) * 2003-06-03 2006-07-12 不列颠哥伦比亚癌症研究所 使用多激发-发射对和同时多信道图像检测进行荧光成像的方法和装置
CN101433458A (zh) * 2007-11-15 2009-05-20 卡尔斯特里姆保健公司 用于组织成像的多模式成像系统
EP2454985A1 (en) * 2009-07-16 2012-05-23 Yamano Optical Co., Ltd. Aperture stop
CN105496354A (zh) * 2014-09-23 2016-04-20 岩崎电气株式会社 摄像系统
CN107635451A (zh) * 2015-04-03 2018-01-26 苏州国科美润达医疗技术有限公司 用于在可见和红外波长下进行同时成像的方法和装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006509573A (ja) * 2002-12-13 2006-03-23 イエトメド リミテッド 手術中に腫瘍と正常組織とを実時間で区別するのに特に有用な光学的検査方法および装置
PL2291640T3 (pl) * 2008-05-20 2019-07-31 University Health Network Urządzenie i sposób obrazowania i monitorowania w oparciu o fluorescencję
CN103300812A (zh) * 2013-06-27 2013-09-18 中国科学院自动化研究所 基于内窥镜的多光谱视频导航系统和方法
CN107518879A (zh) * 2017-10-11 2017-12-29 北京数字精准医疗科技有限公司 一种荧光成像装置及方法

Also Published As

Publication number Publication date
CN110893095A (zh) 2020-03-20

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 19860869
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 19860869
    Country of ref document: EP
    Kind code of ref document: A1