WO2020052626A1 - Multispectral imaging system and method based on image exposure - Google Patents

Multispectral imaging system and method based on image exposure

Info

Publication number
WO2020052626A1
WO2020052626A1 (PCT/CN2019/105574)
Authority
WO
WIPO (PCT)
Prior art keywords
image
visible spectrum
exposure time
time value
light
Prior art date
Application number
PCT/CN2019/105574
Other languages
English (en)
French (fr)
Inventor
胡文忠
张宇
戴玉蓉
陈继东
聂红林
Original Assignee
上海逸思医学影像设备有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海逸思医学影像设备有限公司
Publication of WO2020052626A1 publication Critical patent/WO2020052626A1/zh

Classifications

    • A61B 5/0059 - Measuring for diagnostic purposes; identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0075 - by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • A61B 1/046 - Endoscopes combined with photographic or television appliances, for infrared imaging
    • A61B 1/043 - Endoscopes combined with photographic or television appliances, for fluorescence imaging
    • A61B 1/0655 - Endoscopes with illuminating arrangements; control therefor
    • A61B 1/0661 - Endoscope light sources
    • A61B 1/0684 - Endoscope light sources using light emitting diodes (LED)
    • A61B 5/00 - Measuring for diagnostic purposes; identification of persons
    • A61B 5/0071 - by measuring fluorescence emission
    • A61B 5/0082 - adapted for particular medical purposes
    • A61B 5/0084 - for introduction into the body, e.g. by catheters
    • G01N 21/64 - Fluorescence; phosphorescence (systems in which the material investigated is optically excited and emits light or changes the wavelength of the incident light)

Definitions

  • Optical imaging is widely used in clinical medicine; it is harmless to the human body, non-invasive, highly sensitive, and capable of in vivo multi-target imaging.
  • With the continuing development of fluorescent contrast agents, fluorescence imaging has become an important tool for intraoperative navigation.
  • Application techniques based on the near-infrared fluorescent contrast agent indocyanine green (ICG) are also increasingly mature.
  • After intravenous injection, indocyanine green binds to plasma proteins and, when excited by near-infrared light at about 805 nm, emits fluorescence at about 835 nm, a wavelength longer than that of the excitation light.
  • The imaging system processes the captured fluorescence signal algorithmically and displays the white-light image, the fluorescence image, or a superimposed image on the screen in real time. Compared with traditional organic dyes, this real-time fluorescence imaging offers higher contrast and better identification of the target tissue, and it avoids dye dispersion late in the operation.
  • For simultaneous visible-light and fluorescence imaging, each frame of the system output is a multi-spectral fusion image.
  • According to spectral band, one frame can be divided into visible-spectrum image information and fluorescence-spectrum image information.
  • When fluorescence-guided navigation is used for tumor resection, the labeled tissue area should be excised accurately, which requires a clear fluorescent region with a well-defined boundary.
  • The operator also wants the visible-spectrum background tissue to be clear, so that tissue can be identified more easily.
  • However, the labeled fluorescent region carries both fluorescence-spectrum and visible-spectrum information, and the visible-spectrum light intensity is usually far stronger than the fluorescence intensity.
  • The plurality of image sensors include a first image sensor corresponding to the fluorescence spectral band and a second image sensor corresponding to the visible spectral band; each periodic frame consists of two image frames, and the exposure time value of the first image sensor remains unchanged across consecutive frames.
  • The exposure time value of the second image sensor is set to the first exposure time value of the visible-spectrum image in the first half of the periodic frame and to the second exposure time value of the visible-spectrum image in the second half of the periodic frame, the first exposure time value being different from the second exposure time value.
  • The plurality of image sensors may use a progressive-scan exposure method.
  • Alternatively, the image sensor may use an interlaced exposure method.
  • Each image frame is divided into two field images.
  • The odd field corresponds to the odd-numbered rows of the image sensor's color filter array, and the even field corresponds to the even-numbered rows.
  • Each periodic frame includes four consecutive fields: the first field is exposed at the first exposure time value of the visible-spectrum image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
  • The first exposure time value of the visible-spectrum image is greater than the second exposure time value.
  • The image information of adjacent odd and even fields is output after post-processing image enhancement.
  • The application also provides a multispectral imaging method, which includes: illuminating an irradiated area with a combination of visible light and excitation light, the irradiated area emitting fluorescence when excited by the excitation light; and receiving the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive video frames.
  • The exposure time value of the visible-spectrum image includes a first exposure time value and a second exposure time value of the visible-spectrum image, which appear alternately and cyclically in consecutive video frames.
  • Each periodic frame consists of two image frames, and the exposure time value of the fluorescence-spectrum image remains unchanged across consecutive frames.
  • The exposure time value of the visible-spectrum image is set to the first exposure time value in the first half of the periodic frame and to the second exposure time value in the second half of the periodic frame, the two values being different.
  • The visible spectral band may instead be exposed by an interlaced exposure method.
  • Each image frame is divided into two field images: the odd field corresponds to the odd rows of the image and the even field to the even rows.
  • Each periodic frame includes four consecutive fields: the first field is exposed at the first exposure time value of the visible-spectrum image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
  • The application also provides a computer-readable storage medium storing computer-executable instructions for performing the following operations: illuminating an irradiated area with a combination of visible light and excitation light, the irradiated area emitting fluorescence when excited by the excitation light; and receiving the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive video frames.
  • the image exposure-based multispectral imaging system and method provided by the present application can make the visible background image clear and bright, while the fluorescence imaging has high contrast and low noise.
  • the application can be widely applied in the field of medical imaging technology, including various fields such as medical endoscopes, fluorescence microscopes and the like.
  • FIG. 1 is a schematic structural diagram of a multispectral imaging system according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of a multi-spectral imaging system using a single image sensor according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of a light source module using multiple monochromatic light source combinations according to an embodiment of the present application
  • FIG. 4 is a schematic diagram of a camera module using two image sensors according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a camera module using four image sensors according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an imaging method in a multi-image sensor mode progressive output mode according to an embodiment of the present application
  • FIG. 7 is a schematic diagram of an arrangement of a color filter array in a single image sensor mode according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an imaging method in an interlaced output mode according to an embodiment of the present application.
  • FIG. 9 is a block diagram of an exemplary computer system according to one embodiment of the present application.
  • FIG. 1 shows an exemplary structure diagram of a multi-spectral imaging system according to an embodiment of the present application.
  • a multispectral imaging system can be used for a fluorescent endoscope.
  • The fluorescence endoscope camera system may include a light source module 1, a camera module 2 for imaging, a data acquisition and preprocessing module 3 for image acquisition and preprocessing, a digital image processing module 4, a system controller 5, and a light source controller 6.
  • the camera module 2 may include an optical lens assembly and one or more image sensors (described in detail in the following figure).
  • the system controller 5 may generate a corresponding control signal according to the feedback processing result of the digital image processing module 4.
  • the light source controller 6 may perform lighting control on the light source module 1 according to a control signal of the system controller 5.
  • FIG. 2 shows an exemplary structure diagram of a multi-spectral imaging system using a single image sensor according to an embodiment of the present application.
  • the light source module 1 may include an incident visible light LED 11, a near-infrared (NIR) exciter 12, and a combination beam splitter 13.
  • the camera module 2 may include an optical lens assembly 9 and a single image sensor 10 (for example, a CCD or CMOS image sensor).
  • the optical lens assembly 9 may further include an optical filter 8.
  • The incident visible-light LED 11 can provide illumination in the visible wavelength range (400 nm to 760 nm).
  • The near-infrared (NIR) exciter 12 can generate NIR excitation light in the near-infrared range (790 nm to 820 nm), in particular near-infrared excitation light around 800 nm.
  • The visible light and the NIR excitation light can be combined by the combining beam splitter 13 and directed onto the irradiated area (for example, tissue).
  • When the tissue injected with the fluorescent contrast agent is irradiated with the NIR excitation light from the light source module 1, the fluorophores in the tissue are excited and emit near-infrared fluorescence at a longer wavelength than the excitation light, for example fluorescence of 830 nm to 900 nm.
  • The light reflected from the irradiated area may include three spectral bands: visible light, excitation light (e.g., NIR excitation light), and fluorescence (e.g., near-infrared fluorescence).
  • The NIR excitation light is completely blocked by the optional optical filter 8 and cannot reach the image sensor, while the visible light and the fluorescence pass through the optical lens assembly 9 into the single image sensor 10, which has NIR pixels.
  • the data from the image sensor 10 can enter the digital image processing module 4 through the data acquisition and preprocessing module 3 to form a visible spectrum image and a fluorescent spectrum image.
  • the system controller 5 may output a signal to the light source controller 6 according to the feedback from the digital image processing module 4.
  • the light source controller 6 can control the light source module 1 by controlling a driving current or a voltage.
  • FIG. 3 is an exemplary schematic diagram of a light source module employing multiple monochromatic light source combinations according to an embodiment of the present application.
  • The light source module 1 may include a combination of three monochromatic light sources as shown in FIG. 3.
  • The visible light source consists of three monochromatic LEDs: a red LED 18 (620 nm to 760 nm), a green LED 19 (492 nm to 577 nm), and a blue LED 20 (400 nm to 450 nm), combined through the corresponding combining beam splitters 15, 16, and 17.
  • the camera module 2 may include one or more image sensors.
  • the camera module 2 may be a single image sensor as shown in FIG. 2, or a dual image sensor combination as shown in FIG. 4.
  • FIG. 4 is an exemplary schematic diagram of a camera module employing two image sensors according to an embodiment of the present application.
  • the camera module 2 may include a plurality of image sensors, and a corresponding dichroic prism.
  • the camera module 2 may employ an image sensor 32 (such as a CCD / CMOS image sensor) for visible light and an image sensor 33 (such as a CCD / CMOS image sensor) for fluorescence. Visible light and fluorescence from the tissue enter the corresponding image sensors through the dichroic prism 31, respectively.
  • For simultaneous visible-light and fluorescence imaging, each frame of the system output is a multi-spectral fusion image of visible light and fluorescence.
  • According to spectral band, one frame can be divided into visible-spectrum image information and fluorescence-spectrum image information.
  • In the mode in which multiple image sensors separately capture the image signals of different spectral bands for later fusion (for example, FIG. 4 and FIG. 5 above), the light intensity in each image frame is determined jointly by the intensity of the light reflected from the tissue and the integration time of that light on the image sensor.
  • Reducing the visible light intensity can increase the contrast of the fluorescent region, but it also reduces the brightness and sharpness of the visible background tissue.
  • High visible light intensity brings out the spatial details of the visible-light image, while low visible light intensity brings out the details of the fluorescence image.
  • Increasing the fluorescence contrast by reducing the light intensity of the visible background image sacrifices the imaging quality of the visible image.
  • Moreover, for a given system signal-to-noise ratio, image noise under low illumination is more prominent, blurring image borders and making the noise conspicuous.
  • FIG. 6 is a schematic diagram of an imaging method in a multi-image sensor mode progressive output mode according to an embodiment of the present application.
  • In the multi-sensor mode with progressive image output, the following exposure adjustment method can be adopted.
  • The exposure time of the fluorescence image sensor remains unchanged, in accordance with the frame rate of the video.
  • In the first output frame, the image sensor corresponding to the visible spectral band uses a long exposure time.
  • In the second frame, the image sensor corresponding to the visible spectral band uses a short exposure time.
  • In the third frame, the image sensor corresponding to the visible spectral band again uses a long exposure time.
  • In the fourth frame, the image sensor corresponding to the visible spectral band uses a short exposure time.
  • Subsequent image frames change periodically in the same way.
  • The long and short exposure time values of the visible spectral band appear alternately and cyclically in successive frames of the image.
  • In other words, one periodic frame consists of two image frames: the first half-period frame uses the first exposure time of the visible-spectrum image (e.g., the long exposure time), and the second half-period frame uses the second exposure time (e.g., the short exposure time).
  • The exposure time of the visible spectral band may be long first and then short (as in the illustrated embodiment), but the application is not limited to this.
  • Because the human eye is less sensitive to brightness changes above a certain light intensity, the bright and dark parts of consecutive frames are interleaved and the eye can hardly perceive this rapid change in visible-band brightness.
  • The exposure time difference can be adjusted for the desired visual effect, so that the human eye does not perceive noticeable flicker.
  • The human eye is more sensitive to brightness at very low brightness and less sensitive at high brightness.
  • FIG. 7 is a schematic diagram of color filter array arrangement in a single image sensor mode according to an embodiment of the present application.
  • As pixel counts increase, the signal transmission path in a CCD may not be able to handle the large amount of data obtained at once by progressive scanning, which slows down image processing.
  • In this case interlaced scanning can be used. For a single image sensor with NIR pixels integrated in its color filter array, the light intensity of the visible image and of the fluorescence image can therefore be adjusted separately by interlaced exposure and interlaced output.
  • Some CCD or CMOS image sensors can only perform interlaced exposure and read out in interlaced output mode. In this mode, the electronic shutter resets all photodiodes before the exposure, as in progressive-scan mode.
  • In the color filter array of FIG. 7, the NIR (near-infrared fluorescence) pixels and the blue (B) pixels share one row, while the red (R) and green (G) pixels share another row.
  • The odd rows of the color filter array can be set as the red (R) and green (G) array, and the even rows as the blue (B) and NIR array.
  • The exposure times of the odd and even rows of the image can be set separately to adjust the brightness of each monochromatic component.
  • The color visible image is synthesized from the RGB array of the color filter array. Since the brightness of the visible light equals the sum of the brightnesses of the primary colors taking part in the color mixing, reducing the exposure time of the odd rows alone reduces the spectral components of R and G, and therefore the total visible-band light intensity of the image.
  • This, however, makes the blue component of the mixed visible primaries too strong and produces a color cast.
  • To compensate, gamma correction can be performed in image post-processing, in which the weight of the blue component is re-determined from the difference between the odd- and even-row exposure times and the light intensity, so as to restore the true color of the visible-spectrum image.
  • Alternatively, the row shared with the NIR pixels may contain red (R) or green (G) pixels instead (not shown in the figure).
  • In that case, the weight of the corresponding R or G component is adjusted in the same way.
  • FIG. 8 shows a schematic diagram of an imaging method in an interlaced output mode according to an embodiment of the present application.
  • In this example a single image sensor is used.
  • In the interlaced-exposure, interlaced-output mode, the following frame exposure adjustment can be used.
  • One image frame is divided into two field images: an odd field and an even field.
  • The odd field corresponds to the odd rows of the image sensor's color filter array.
  • The even field corresponds to the even rows of the image sensor.
  • The first field uses a long exposure.
  • The second field uses a short exposure.
  • The third field also uses a short exposure.
  • The fourth field uses a long exposure.
  • Four consecutive fields form one exposure change cycle.
  • The corresponding change in frame light intensity is also shown in FIG. 8.
  • The visible-band light intensity is high in the first frame and low in the second frame; it is high in the third frame and low in the fourth frame.
  • The converted digital image signal enters the digital image processing module 4.
  • The digital image processing module 4 applies post-processing image enhancement separately to the field images captured with different exposure times on the odd and even rows of the image sensor, synthesizes them into a frame, and presents the result on a display (not shown).
  • Alternatively, after enhancement the fields may be displayed directly on the display in interlaced fashion.
  • With interlaced output, the image information is divided into two fields that are output separately, which makes the visual effect of the present invention more continuous; interlaced output can also support higher-resolution image output.
  • In the continuous frame information, different exposure times are set to adjust the visible light intensity of different frames of the video, so that high and low visible-band light intensities are output alternately in different frames.
  • A continuous sequence of frames can thus output both a high-brightness visible-spectrum image, highlighting the visible details of the background tissue, and high-contrast fluorescence image information against a low-visible-intensity background.
  • In this way, both the clear details of the visible-spectrum image and the high-contrast details of the near-infrared fluorescence image are presented.
  • By using the periodic-frame technique of outputting images of different light intensity and contrast every other frame, the system outputs high-definition visible-light image information while maintaining the high contrast of the fluorescence image information.
  • Computer system 900 may include a logical processor 902, such as an execution core. Although one logical processor 902 is illustrated, in other embodiments the computer system 900 may have multiple logical processors, for example multiple execution cores per processor chip and/or multiple processor chips, where each processor chip may have multiple execution cores. As shown, various computer-readable storage media 910 may be interconnected by one or more system buses that couple the various system components to the logical processor 902.
  • The system bus can be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • The computer-readable storage media 910 may include, for example, random access memory (RAM) 904, a storage device 906 (e.g., an electromechanical hard drive, a solid-state drive, etc.), firmware 908 (e.g., flash RAM or ROM), and removable storage devices 918 (such as, for example, CD-ROMs, floppy disks, DVDs, flash drives, external storage devices, etc.).
  • A basic input/output system (BIOS) 920, which contains the basic routines that help to transfer information between units of the computer system 900 during startup, may be stored in firmware 908.
  • A number of programs, including an operating system and/or application programs, may be stored on the firmware 908, the storage device 906, the RAM 904, and/or the removable storage devices 918, and may be executed by the logical processor 902.
  • Commands and information may be received by the computer system 900 through an input device 916, which may include, but is not limited to, a keyboard and pointing device.
  • Other input devices may include a microphone, joystick, gamepad, scanner, and so on.
  • These and other input devices are often connected to the logical processor 902 through a serial port interface coupled to the system bus, but may also be connected through other interfaces, such as a parallel port, a game port, or a universal serial bus (USB).
  • a display or other type of display device may also be connected to the system bus via an interface, such as a video adapter, which may be part of or connected to the graphics processing unit 912.
  • In addition to the display, computers typically include other peripheral output devices, such as speakers and printers (not shown).
  • the exemplary system of FIG. 9 may also include a host adapter, a small computer system interface (SCSI) bus, and an external storage device connected to the SCSI bus.
  • Computer system 900 may operate in a networked environment using logical connections to one or more remote computers.
  • the remote computer may be another computer, server, router, network PC, peer device, or other common network node, and may typically include multiple or all of the units described above with respect to computer system 900.
  • When used in a LAN or WAN networking environment, computer system 900 may be connected to the LAN or WAN through a network interface card 914.
  • a network card (NIC) 914 (which may be internal or external) may be connected to the system bus.
  • program modules or portions thereof depicted relative to computer system 900 may be stored in a remote memory storage device. It should be appreciated that the network connections described herein are exemplary and other means of establishing a communications link between the computers may be used.
  • the functions and processes described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
  • Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a computer.
  • Such computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • the software code may be stored in a memory such as the memory of a mobile station and executed by a processor such as a desktop computer, laptop computer, server computer, microprocessor of a mobile device, or the like.
  • the memory can be implemented inside the processor or outside the processor.
  • The term "memory" refers to any type of long-term, short-term, volatile, non-volatile, or other memory, and is not limited to any particular type of memory, a specific number of memories, or the type of media on which memory is stored.
  • any reference to a claim element in a singular form such as a reference using the articles “a”, “an” or “the” should not be interpreted as limiting the element to the singular.
  • Any reference to elements using designations such as "first", "second", etc. as used herein generally does not limit the quantity or order of those elements.
  • Reference to the first and second elements does not mean that only two elements are used here, nor does it mean that the first element must precede the second element in some way.
  • a set of elements may include one or more elements.
  • the skilled person can implement the described structure in different ways for each particular application, but such implementation should not be interpreted as causing a departure from the scope of the present application.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Immunology (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Endoscopes (AREA)

Abstract

A multispectral imaging system and method based on image exposure. The multispectral imaging system comprises: a light source module (1) that provides visible light and generates excitation light, the irradiated area being illuminated by the combination of the visible light and the excitation light and emitting fluorescence when excited by the excitation light; and a camera module (2) comprising one or more image sensors (10), the image sensor (10) receiving the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive video frames. The visible-light background image can thus be imaged clearly and brightly, while the fluorescence image has high contrast and low noise.

Description

Multispectral imaging system and method based on image exposure
Technical Field
The present application relates to optical imaging technology, and more particularly to the field of endoscopic multispectral imaging.
Background
Optical imaging is widely used in clinical medicine; it is harmless to the human body, non-invasive, highly sensitive, and capable of in vivo multi-target imaging. With the continuing development of fluorescent contrast agents, fluorescence imaging has become an important tool for intraoperative navigation. Application techniques based on the near-infrared fluorescent contrast agent indocyanine green (ICG) are now increasingly mature. After intravenous injection, indocyanine green binds to plasma proteins and, when excited by near-infrared light at about 805 nm, emits fluorescence at about 835 nm, a wavelength longer than that of the excitation light. The imaging system processes the captured fluorescence signal algorithmically and displays the white-light image, the fluorescence image, or a superimposed image on the screen in real time. Compared with traditional organic dyes, this real-time fluorescence imaging method offers higher contrast and better identification of the target tissue, and it avoids dye dispersion late in the operation.
For simultaneous visible-light and fluorescence imaging, each frame of the system output is a multispectral fusion image; according to spectral band, one frame can be divided into visible-spectrum image information and fluorescence-spectrum image information. When fluorescence-guided navigation is used for tumor resection, the fluorescently labeled tissue area should be excised accurately, which requires that the fluorescent region be clear and that its boundary be well defined. At the same time, the operator also wants the visible-spectrum background tissue to be clear, so that the tissue can be identified more easily. However, the labeled fluorescent region carries both fluorescence-spectrum and visible-spectrum information, and the visible-spectrum light intensity is usually far stronger than the fluorescence intensity. The fluorescence is therefore often swamped by the visible light, so that it appears dark, unclear and low in contrast in the multispectral image; and since tissue at the boundary of the fluorescent region is usually less diseased and emits weaker fluorescence, the boundary becomes even more blurred under strong visible-band illumination.
There is therefore a need in the art for improved multispectral imaging techniques that raise the image quality of simultaneous visible-light and excited-fluorescence imaging.
Summary of the Invention
To achieve the above object, the present application provides a multispectral imaging system comprising: a light source module that provides visible light and generates excitation light, the irradiated area being illuminated by the combination of the visible light and the excitation light and emitting fluorescence when excited by the excitation light; and a camera module comprising one or more image sensors that receive the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically over consecutive video frames, so that visible-spectrum image information of different light intensities is output alternately and cyclically in different frames of the video.
According to one aspect of the application, the exposure time value of the visible-spectrum image includes a first exposure time value and a second exposure time value of the visible-spectrum image, which appear alternately and cyclically in each periodic frame of the video.
According to one embodiment of the application, the plurality of image sensors include a first image sensor corresponding to the fluorescence spectral band and a second image sensor corresponding to the visible spectral band. Each periodic frame consists of two image frames: the exposure time value of the first image sensor remains unchanged across consecutive frames, while the exposure time value of the second image sensor is set to the first exposure time value of the visible-spectrum image in the first half of the periodic frame and to the second exposure time value in the second half, the first and second exposure time values being different.
According to one aspect of the application, the first and second exposure time values of the visible-spectrum image are set according to a reference exposure time value of the visible-spectrum image preset by the system.
According to one aspect of the application, the first and second exposure time values of the visible-spectrum image may also be set according to the exposure time value of the fluorescence spectral band.
According to one aspect of the application, the plurality of image sensors use a progressive-scan exposure method.
According to one aspect of the application, the light intensity difference between adjacent image frames is moderated by one or more of the following: adjusting the exposure time values of the visible-spectrum image in adjacent frames; applying post-processing image enhancement, or a weighted post-processing algorithm, to the visible-spectrum portion of adjacent frames; adjusting the hardware gain applied to the output images of the image sensors; or increasing the image frame rate.
According to another embodiment of the application, the image sensor uses an interlaced exposure method. Each image frame is divided into two field images: the odd field corresponds to the odd-numbered rows of the sensor's color filter array and the even field to the even-numbered rows. Each periodic frame includes four consecutive fields, of which the first field is exposed at the first exposure time value of the visible-spectrum image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
According to one aspect of the application, the first exposure time value of the visible-spectrum image is greater than the second exposure time value.
According to one aspect of the application, the image information of adjacent odd and even fields is output separately after post-processing image enhancement.
The application also provides a multispectral imaging method, comprising: illuminating an irradiated area with a combination of visible light and excitation light, the irradiated area emitting fluorescence when excited by the excitation light; and receiving the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive video frames.
According to one embodiment of the application, the exposure time value of the visible-spectrum image includes a first exposure time value and a second exposure time value of the visible-spectrum image, which appear alternately and cyclically in consecutive video frames.
According to one aspect of the application, each periodic frame consists of two image frames: the exposure time value of the fluorescence-spectrum image remains unchanged across consecutive frames, while the exposure time value of the visible-spectrum image is set to the first exposure time value in the first half of the periodic frame and to the second exposure time value in the second half, the first and second exposure time values being different.
According to another embodiment of the application, the visible spectral band is exposed by an interlaced exposure method. Each image frame is divided into two field images: the odd field corresponds to the odd rows of the image and the even field to the even rows. Each periodic frame includes four consecutive fields, of which the first field is exposed at the first exposure time value of the visible-spectrum image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
The application also provides a computer-readable storage medium storing computer-executable instructions for performing the following operations: illuminating an irradiated area with a combination of visible light and excitation light, the irradiated area emitting fluorescence when excited by the excitation light; and receiving the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive video frames.
Compared with the prior art, the image-exposure-based multispectral imaging system and method provided by the present application make the visible-light background image clear and bright while giving the fluorescence image high contrast and low noise.
The present application can be widely used in the field of medical imaging technology, including medical endoscopes, fluorescence microscopes and other fields.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present application or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a multispectral imaging system according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a multispectral imaging system using a single image sensor according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a light source module using a combination of several monochromatic light sources according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a camera module using two image sensors according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a camera module using four image sensors according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the imaging method in the multi-sensor, progressive-output mode according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the color filter array arrangement in the single-sensor mode according to an embodiment of the present application;
FIG. 8 is a schematic diagram of the imaging method in the interlaced-output mode according to an embodiment of the present application; and
FIG. 9 is a block diagram of an exemplary computer system according to an embodiment of the present application.
Detailed Description
Various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numerals are used throughout the drawings to refer to the same or similar parts. References to specific examples and implementations are for illustrative purposes and are not intended to limit the scope of the application or of the claims.
The word "exemplary" is used herein to mean "serving as an example, instance, or illustration". Any implementation described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other implementations.
FIG. 1 shows an exemplary structural diagram of a multispectral imaging system according to an embodiment of the present application. In this example, the multispectral imaging system can be used for a fluorescence endoscope. As shown in FIG. 1, the fluorescence endoscope camera system may include a light source module 1, a camera module 2 for imaging, a data acquisition and preprocessing module 3 for image acquisition and preprocessing, a digital image processing module 4, a system controller 5 and a light source controller 6. The camera module 2 may include an optical lens assembly and one or more image sensors (described in detail in the figures below). The system controller 5 may generate corresponding control signals according to the feedback processing results of the digital image processing module 4. The light source controller 6 may control the illumination of the light source module 1 according to the control signals of the system controller 5.
FIG. 2 shows an exemplary structural diagram of a multispectral imaging system using a single image sensor according to an embodiment of the present application. As shown in FIG. 2, the light source module 1 may include a visible-light LED 11, a near-infrared (NIR) exciter 12 and a combining beam splitter 13. In this example, the camera module 2 may include an optical lens assembly 9 and a single image sensor 10 (for example, a CCD or CMOS image sensor). Optionally, the optical lens assembly 9 may further include an optical filter 8. The visible-light LED 11 can provide illumination in the visible wavelength range (400 nm to 760 nm). The near-infrared (NIR) exciter 12 can generate NIR excitation light in the near-infrared range (790 nm to 820 nm), in particular near-infrared excitation light around 800 nm. The visible light and the NIR excitation light can be combined by the combining beam splitter 13 and directed onto the irradiated area (for example, tissue). When tissue injected with the fluorescent contrast agent is irradiated with the NIR excitation light from the light source module 1, the fluorophores in the tissue are excited and emit near-infrared fluorescence at a longer wavelength than the excitation light, for example fluorescence of 830 nm to 900 nm. The light reflected from the irradiated area (for example, tissue) may include three spectral bands: visible light, excitation light (for example, NIR excitation light) and fluorescence (for example, near-infrared fluorescence). The NIR excitation light is completely blocked by the optional optical filter 8 and cannot reach the image sensor, while the visible light and the fluorescence pass through the optical lens assembly 9 into the single image sensor 10, which has NIR pixels. The data from the image sensor 10 can pass through the data acquisition and preprocessing module 3 into the digital image processing module 4 to form a visible-spectrum image and a fluorescence-spectrum image. The system controller 5 may output signals to the light source controller 6 according to feedback from the digital image processing module 4, and the light source controller 6 may control the light source module 1 by controlling the driving current or voltage.
FIG. 3 is an exemplary schematic diagram of a light source module using a combination of several monochromatic light sources according to an embodiment of the present application. In this example, the light source module 1 may include a combination of three monochromatic light sources as shown in FIG. 3. The visible light source consists of three monochromatic LEDs: a red LED 18 (620 nm to 760 nm), a green LED 19 (492 nm to 577 nm) and a blue LED 20 (400 nm to 450 nm), combined through the corresponding combining beam splitters 15, 16 and 17. In this multi-monochromatic-source example, the camera module 2 may include one or more image sensors; for example, it may use a single image sensor as in FIG. 2 or a combination of two image sensors as in FIG. 4.
FIG. 4 is an exemplary schematic diagram of a camera module using two image sensors according to an embodiment of the present application. The camera module 2 may include a plurality of image sensors and a corresponding dichroic prism. In this example, the camera module 2 may use an image sensor 32 for visible light (for example, a CCD/CMOS image sensor) and an image sensor 33 for fluorescence (for example, a CCD/CMOS image sensor). The visible light and the fluorescence from the tissue enter the corresponding image sensors through the dichroic prism 31.
FIG. 5 is an exemplary schematic diagram of a camera module using four image sensors according to an embodiment of the present application. The camera module 2 may include four image sensors and three corresponding dichroic prisms. As shown in FIG. 5, when a 4-CCD/CMOS sensor combination is used, prism beam splitting (for example, the combination of dichroic prisms 34, 35 and 36) directs the light to three monochromatic RGB image sensors (for example, a red image sensor 38, a green image sensor 39 and a blue image sensor 40) and one NIR fluorescence image sensor 37, each of which receives the reflected light of its corresponding wavelength band to form the multispectral image.
For simultaneous visible-light and fluorescence imaging, each frame of the system output is a multispectral fusion image of visible light and fluorescence; according to spectral band, one frame can be divided into visible-spectrum image information and fluorescence-spectrum image information. In the mode in which multiple image sensors separately capture the image signals of different spectral bands for later fusion (for example, FIG. 4 and FIG. 5 above), the light intensity in each image frame is determined jointly by the intensity of the light reflected from the tissue and the integration time of that light on the image sensor.
Reducing the visible light intensity can increase the contrast of the fluorescent region, but it also reduces the brightness and sharpness of the visible background tissue. In the multispectral fusion image of the visible spectral band and the fluorescence, high visible light intensity brings out the spatial details of the visible-light image, while low visible light intensity brings out the details of the fluorescence image. Increasing the fluorescence contrast by lowering the illumination of the visible background image sacrifices the imaging quality of the visible image; moreover, for a given system signal-to-noise ratio, image noise under low illumination is more prominent, blurring the image borders while making the noise conspicuous.
With the illumination intensity of the light source kept constant, increasing the exposure time increases the brightness of a frame, and reducing the exposure time correspondingly reduces it: provided the image remains sharp, a long exposure gives a brighter image and a short exposure a dimmer one. As described above, within the exposure time of one frame, reducing the exposure time of the image sensor corresponding to the visible spectral band reduces the light intensity of the visible-spectrum image information in the multispectral image, and thereby improves the contrast and clarity of the fluorescence-spectrum image in the multispectral fusion image.
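As a rough numerical illustration of this relation, the following minimal Python sketch (the relative intensity values are hypothetical, and real sensors add noise and non-linearity) treats the accumulated pixel signal as illumination multiplied by exposure time; halving the visible-band exposure halves the visible signal while the fluorescence signal, captured at a fixed exposure, is unchanged, so the fluorescence-to-visible ratio doubles.

    # Illustrative only: relative sensor response under constant illumination,
    # assuming the accumulated signal is proportional to illumination x exposure time.
    VIS_ILLUM = 1.0      # relative visible-light intensity reflected by the tissue (hypothetical)
    FLUO_ILLUM = 0.02    # relative fluorescence intensity, much weaker (hypothetical)

    def relative_signal(illumination, exposure_time):
        """Accumulated pixel signal for a band with constant illumination."""
        return illumination * exposure_time

    for t_vis in (1.0, 0.5):                      # long vs. short visible exposure (frame periods)
        vis = relative_signal(VIS_ILLUM, t_vis)
        fluo = relative_signal(FLUO_ILLUM, 1.0)   # fluorescence exposure kept constant
        print(f"t_vis={t_vis}: visible={vis:.2f}, fluorescence/visible={fluo / vis:.3f}")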
FIG. 6 is a schematic diagram of the imaging method in the multi-sensor, progressive-output mode according to an embodiment of the present application. As shown in FIG. 6, in the multi-sensor mode with progressive image output, the following exposure adjustment can be used. Over all consecutive frames of the video, the exposure time of the fluorescence image sensor remains unchanged, in accordance with the video frame rate. In the first frame of the system output, the image sensor corresponding to the visible spectral band uses a long exposure time, and in the second frame a short exposure time; in the third frame it uses a long exposure time, and in the fourth frame a short exposure time. Subsequent frames change periodically in the same way: the long and short exposure time values of the visible spectral band appear alternately and cyclically in successive frames. In other words, one periodic frame consists of two image frames, the first half-period frame using the first exposure time of the visible-spectrum image (for example, the long exposure time) and the second half-period frame using the second exposure time (for example, the short exposure time). It should be understood that within each periodic frame the visible-band exposure may be long first and then short (as in the illustrated embodiment), but the application is not limited to this; alternatively, the first half-period frame may be set to the second exposure time (for example, the short exposure time) and the second half-period frame to the first exposure time (for example, the long exposure time). It should also be understood that the long or short exposure time of the visible spectral band is defined relative to a preset reference exposure time of the visible spectral band, or relative to the exposure time of the fluorescence spectral band. Optionally, the long exposure time can be matched optimally to the frame rate. Although the visible-band exposure times of adjacent frames differ, so that different frames differ in brightness, the persistence of vision and the eye's reduced sensitivity to brightness changes above a certain light intensity mean that, with the bright and dark parts of consecutive frames interleaved in this way, the eye can hardly perceive this rapid change in visible-band brightness. The exposure time difference can be tuned for the desired visual effect, so that no noticeable flicker is perceived. According to color theory, the human eye is sensitive to brightness at very low brightness but insensitive at high brightness.
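The periodic-frame exposure schedule described above for the dual-sensor, progressive-output mode can be summarized by the following minimal Python sketch. The millisecond values and the way the schedule is handed to the sensors are assumptions for illustration; a real system would program its sensors through the camera driver under the system controller.

    from itertools import count

    FLUO_EXPOSURE_MS = 33.0   # fluorescence sensor: fixed exposure matched to the frame rate (hypothetical)
    VIS_LONG_MS = 30.0        # first exposure time value of the visible-spectrum image (hypothetical)
    VIS_SHORT_MS = 10.0       # second exposure time value of the visible-spectrum image (hypothetical)

    def exposure_schedule():
        """Yield (frame_index, visible_ms, fluorescence_ms) for consecutive frames.
        One periodic frame is two image frames: long visible exposure, then short."""
        for frame in count():
            visible = VIS_LONG_MS if frame % 2 == 0 else VIS_SHORT_MS
            yield frame, visible, FLUO_EXPOSURE_MS

    if __name__ == "__main__":
        schedule = exposure_schedule()
        for _ in range(4):
            frame, vis_ms, fluo_ms = next(schedule)
            print(f"frame {frame}: visible sensor {vis_ms} ms, fluorescence sensor {fluo_ms} ms")

Swapping which half-period receives the long exposure only changes the parity test in the sketch, which matches the alternative ordering mentioned above.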
In addition, to avoid obvious flicker caused by too large a light intensity difference between adjacent frames, especially when the frame rate is not high, the visible-band exposure times of adjacent frames can be adjusted and optimized while the visible-band portion of adjacent frames is enhanced in post-processing, processed with a weighted post-processing algorithm, or corrected by adjusting the hardware gain applied to the output images of the different image sensors, so that the brightness of the frames whose visible band is dark is boosted before output. This adapts the output to the eye's sensitivity to light intensity changes and gives a better visual result. In a specific implementation, the frame rate can also be raised appropriately to eliminate the discomfort caused by excessive intensity changes due to the exposure time difference.
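One of the flicker-mitigation options mentioned above, digitally boosting the visible channel of the short-exposure (dark) frames so that adjacent frames have a similar mean brightness, could look roughly like the following sketch. The simple exposure-ratio gain is an assumption for illustration and stands in for the application's actual enhancement or weighting algorithm.

    import numpy as np

    VIS_LONG_MS, VIS_SHORT_MS = 30.0, 10.0        # hypothetical exposure values

    def equalize_dark_frame(visible_frame: np.ndarray, is_short_exposure: bool) -> np.ndarray:
        """Scale a short-exposure visible frame toward the long-exposure brightness."""
        if not is_short_exposure:
            return visible_frame
        gain = VIS_LONG_MS / VIS_SHORT_MS         # crude first-order compensation
        boosted = visible_frame.astype(np.float32) * gain
        return np.clip(boosted, 0, 255).astype(np.uint8)

    # Example: a flat 8-bit test patch captured with the short exposure.
    dark_patch = np.full((4, 4), 40, dtype=np.uint8)
    print(equalize_dark_frame(dark_patch, is_short_exposure=True).mean())   # about 120 after boosting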
Furthermore, among the information contained in the individual frames, the frames in which the visible band has a high exposure mainly bring out the image details and sharpness of the background tissue, while the frames with a low visible-band exposure bring out the image details of the fluorescent region and its contrast against the background tissue. In a sequence of consecutive frames, the persistence of vision thus allows both the structural details of the background tissue under high exposure and the specific image details of the fluorescent region to be presented.
FIG. 7 is a schematic diagram of the color filter array arrangement in the single-sensor mode according to an embodiment of the present application.
As the pixel count of a CCD of a given size keeps increasing, the signal transmission path in the CCD may not be able to handle the large amount of data produced at once by progressive scanning, which slows down image processing. In this case, especially in high-pixel-count digital cameras, interlaced scanning can be used. For a single image sensor with NIR pixels integrated in its color filter array, the light intensity of the visible image and of the fluorescence image can therefore be adjusted separately by interlaced exposure and interlaced output. Some CCD or CMOS image sensors can only perform interlaced exposure and read out in interlaced output mode. In this mode the electronic shutter resets all photodiodes before the exposure, as in progressive-scan mode, but at the end of the exposure not all charges can be transferred out of the photodiodes at the same time: the charges of the odd and even rows must be transferred at different times, so the odd and even rows have different exposure times; the sensor first exposes the odd rows of the image and then the even rows.
When the camera system uses a single image sensor, its mosaic color filter array is as shown in FIG. 7: the NIR (near-infrared fluorescence) pixels and the blue (B) pixels share one row, while the red (R) and green (G) pixels share another row. The odd rows of the color filter array can be set as the R and G array and the even rows as the B and NIR array. According to the interlaced exposure principle, the exposure times of the odd and even rows of the image can be set separately to adjust the brightness of each monochromatic component. The color visible image is synthesized from the RGB array of the color filter array. Since the brightness of the visible light equals the sum of the brightnesses of the primary colors taking part in the color mixing, reducing the exposure time of the odd rows alone, with the exposure time of the even rows (B and NIR) unchanged, reduces the spectral components of R and G and therefore the total visible-band light intensity of the image. This, however, makes the blue component of the mixed visible primaries too strong and produces a color cast. Optionally, gamma correction can therefore be performed in image post-processing, in which the weight of the blue component is re-determined from the difference between the odd- and even-row exposure times and the light intensity, so as to restore the true color of the visible-spectrum image.
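The color-cast compensation described above could be sketched as follows; the linear blue-channel re-weighting and the gamma value are illustrative assumptions, not the exact correction used by the application.

    import numpy as np

    def rebalance_blue(rgb: np.ndarray, t_odd_ms: float, t_even_ms: float,
                       gamma: float = 2.2) -> np.ndarray:
        """Scale the blue channel by the odd/even row exposure ratio, then gamma-encode.
        Assumes odd rows (R, G) were exposed for t_odd_ms and even rows (B, NIR) for t_even_ms."""
        out = rgb.astype(np.float32) / 255.0
        out[..., 2] *= t_odd_ms / t_even_ms       # blue was collected during the longer even-row exposure
        out = np.clip(out, 0.0, 1.0) ** (1.0 / gamma)
        return (out * 255.0).astype(np.uint8)

    gray_patch = np.dstack([np.full((2, 2), 120, np.uint8)] * 3)   # neutral test patch, RGB order
    print(rebalance_blue(gray_patch, t_odd_ms=10.0, t_even_ms=30.0)[0, 0])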
Alternatively, the row shared with the NIR pixels may instead contain red (R) or green (G) pixels (not shown in the figure); in that case the weight of the corresponding R or G component is adjusted in the later gamma correction in the same way.
FIG. 8 shows a schematic diagram of the imaging method in the interlaced-output mode according to an embodiment of the present application. In this example a single image sensor is used. In the interlaced-exposure, interlaced-output mode, the following frame exposure adjustment can be used. As shown in FIG. 8, under interlaced exposure one image frame is divided into two field images: an odd field and an even field. The odd field corresponds to the odd rows of the sensor's color filter array and the even field to the even rows of the sensor. The first field uses a long exposure, the second field a short exposure, the third field also a short exposure, and the fourth field a long exposure; in other words, four consecutive fields form one exposure change cycle. The corresponding change in frame light intensity is also shown in FIG. 8: the visible-band light intensity is high in the first frame and low in the second, high in the third and low in the fourth.
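The four-field exposure cycle and the resulting alternation of frame brightness can be expressed as a small Python sketch; the exposure time values are hypothetical.

    T1_MS, T2_MS = 30.0, 10.0                  # first (long) and second (short) exposure values
    CYCLE = (T1_MS, T2_MS, T2_MS, T1_MS)       # four consecutive fields form one exposure cycle

    def field_exposure(field_index: int) -> float:
        """Exposure time of the n-th field (fields numbered from 0)."""
        return CYCLE[field_index % 4]

    for frame in range(4):
        odd_field = field_exposure(2 * frame)         # odd rows: R and G pixels
        even_field = field_exposure(2 * frame + 1)    # even rows: B and NIR pixels
        visible = "high" if odd_field == T1_MS else "low"
        print(f"frame {frame + 1}: odd field {odd_field} ms, even field {even_field} ms, "
              f"visible-band intensity {visible}")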
During image transmission, the converted digital image signal enters the digital image processing module 4, which applies post-processing image enhancement separately to the field images captured with the different exposure times on the odd and even rows of the image sensor, synthesizes them into a frame, and presents the result on a display (not shown). Optionally, after the post-processing enhancement, the fields may also be displayed directly on the display in interlaced fashion.
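A minimal sketch of how the two fields of one frame could be interleaved back into a full frame after separate post-processing is given below; "enhance" is a placeholder for whatever enhancement the digital image processing module 4 actually applies, so only the interleaving logic is shown.

    import numpy as np

    def enhance(field: np.ndarray) -> np.ndarray:
        return field                               # placeholder for the real enhancement step

    def merge_fields(odd_field: np.ndarray, even_field: np.ndarray) -> np.ndarray:
        """Interleave an odd field and an even field (each H/2 x W) into a full frame."""
        half_height, width = odd_field.shape
        frame = np.empty((2 * half_height, width), dtype=odd_field.dtype)
        frame[0::2, :] = enhance(odd_field)        # rows from the odd lines of the color filter array
        frame[1::2, :] = enhance(even_field)       # rows from the even lines of the color filter array
        return frame

    odd = np.full((2, 4), 200, np.uint8)           # e.g. long-exposure R/G rows
    even = np.full((2, 4), 60, np.uint8)           # e.g. short-exposure B/NIR rows
    print(merge_fields(odd, even))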
It should be understood that adjusting the different exposure times of the odd and even rows of the color filter array changes the visible-band light intensity within an image frame, and the odd and even fields of the final image are output and displayed with different exposure times. Because the bright and dark visible-band parts are interleaved, the characteristics of human vision make the change hard to perceive. With interlaced output, the image information is divided into two fields that are output separately, which makes the visual effect of the present invention more continuous; at the same time, interlaced output can support higher-resolution image output.
In the present application, based on the persistence of vision and in combination with the different video output modes, different exposure times are set within the continuous frame information to adjust the visible light intensity of different frames of the video, so that high and low visible-band light intensities are output alternately and cyclically in different frames. A continuous sequence of frames can thus output both a high-brightness visible-spectrum image that highlights the visible details of the background tissue and high-contrast fluorescence image information against a low-visible-intensity background. In this way a series of consecutive frames presents both the clear details of the visible-spectrum image and the high-contrast details of the near-infrared fluorescence image. By using the periodic-frame technique of outputting images of different light intensity and contrast every other frame, the system outputs high-definition visible-light image information while maintaining the high contrast of the fluorescence image information.
The solution of the present application thus overcomes the drawback, in existing simultaneous fluorescence and visible-band imaging, of low fluorescent-region contrast against a high-visible-intensity background image, and also the drawback, in products on the market, of poor visible-image quality against a low-visible-intensity background image. In addition, by exploiting differential adjustment of the exposure time, the application makes the control of image brightness more convenient, flexible and precise.
It should be understood that the above embodiments are merely examples and not limitations. Besides the progressive-scan and interlaced-scan schemes described above, a person skilled in the art can devise other similar methods that present both the clear details of the visible-spectrum image and the high-contrast details of the near-infrared fluorescence image. Such implementations should not be interpreted as departing from the scope of the present application.
Referring to FIG. 9, an exemplary computer system 900 is shown. The computer system 900 may include a logical processor 902, for example an execution core. Although one logical processor 902 is illustrated, in other embodiments the computer system 900 may have multiple logical processors, for example multiple execution cores per processor chip and/or multiple processor chips, where each processor chip may have multiple execution cores. As shown, various computer-readable storage media 910 may be interconnected by one or more system buses that couple the various system components to the logical processor 902. The system bus may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. In an exemplary embodiment, the computer-readable storage media 910 may include, for example, random access memory (RAM) 904, a storage device 906 (for example, an electromechanical hard drive, a solid-state drive, etc.), firmware 908 (for example, flash RAM or ROM), and removable storage devices 918 (such as CD-ROMs, floppy disks, DVDs, flash drives, external storage devices, etc.). Those skilled in the art will appreciate that other types of computer-readable storage media may be used, such as magnetic tape cassettes, flash memory cards and/or digital video disks. The computer-readable storage media 910 can provide non-volatile and volatile storage of computer-executable instructions 922, data structures, program modules and other data for the computer system 900. A basic input/output system (BIOS) 920, containing the basic routines that help to transfer information between units of the computer system 900, such as during start-up, may be stored in the firmware 908. A number of programs, including an operating system and/or application programs, may be stored on the firmware 908, the storage device 906, the RAM 904 and/or the removable storage devices 918, and may be executed by the logical processor 902. Commands and information may be received by the computer system 900 through input devices 916, which may include, but are not limited to, a keyboard and a pointing device. Other input devices may include a microphone, joystick, game pad, scanner, and so on. These and other input devices are often connected to the logical processor 902 through a serial port interface coupled to the system bus, but may also be connected by other interfaces, such as a parallel port, a game port or a universal serial bus (USB). A display or other type of display device may also be connected to the system bus via an interface, such as a video adapter, which may be part of, or connected to, a graphics processing unit 912. In addition to the display, computers typically include other peripheral output devices, such as speakers and printers (not shown). The exemplary system of FIG. 9 may also include a host adapter, a small computer system interface (SCSI) bus, and an external storage device connected to the SCSI bus. The computer system 900 may operate in a networked environment using logical connections to one or more remote computers. The remote computer may be another computer, a server, a router, a network PC, a peer device or another common network node, and typically may include many or all of the units described above with respect to the computer system 900. When used in a LAN or WAN networking environment, the computer system 900 may be connected to the LAN or WAN through a network interface card 914. The network interface card (NIC) 914, which may be internal or external, may be connected to the system bus. In a networked environment, the program modules depicted relative to the computer system 900, or portions thereof, may be stored in a remote memory storage device. It will be appreciated that the network connections described here are exemplary and that other means of establishing a communications link between the computers may be used.
In one or more exemplary embodiments, the functions and processes described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on, or transmitted over, a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example and not limitation, such computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. For firmware and/or software implementations, the methods may be implemented with modules (for example, procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used to implement the methods described herein. For example, software code may be stored in a memory, such as the memory of a mobile station, and executed by a processor, such as the microprocessor of a desktop computer, laptop computer, server computer or mobile device. The memory may be implemented inside the processor or outside the processor. As used herein, the term "memory" refers to any type of long-term, short-term, volatile, non-volatile or other memory, and is not limited to any particular type of memory, number of memories, or type of media on which memory is stored.
The above description and figures are provided merely as illustrative examples. Any reference to a claim element in the singular, for example using the articles "a", "an" or "the", should not be construed as limiting the element to the singular. Any reference to elements using designations such as "first", "second" and so on, as used herein, generally does not limit the quantity or order of those elements; a reference to first and second elements does not mean that only two elements may be employed, nor that the first element must precede the second in some manner. Unless stated otherwise, a set of elements may include one or more elements. A skilled person may implement the described structures in different ways for each particular application, but such implementations should not be interpreted as causing a departure from the scope of the present application.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the appended claims and the principles and novel features disclosed herein.

Claims (15)

  1. A multispectral imaging system, comprising:
    a light source module that provides visible light and generates excitation light, wherein an irradiated area is illuminated by the combination of the visible light and the excitation light and emits fluorescence when excited by the excitation light; and
    a camera module comprising one or more image sensors, wherein the image sensors receive the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information;
    wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive video frames.
  2. The multispectral imaging system according to claim 1, wherein:
    the exposure time value of the visible-spectrum image comprises a first exposure time value and a second exposure time value of the visible-spectrum image, and the first and second exposure time values appear alternately and cyclically in consecutive video frames.
  3. The multispectral imaging system according to claim 2, wherein:
    the plurality of image sensors comprise a first image sensor corresponding to the fluorescence spectral band and a second image sensor corresponding to the visible spectral band; each periodic frame consists of two image frames; the exposure time value of the first image sensor remains unchanged across consecutive frames; the exposure time value of the second image sensor is set to the first exposure time value of the visible-spectrum image in the first half of the periodic frame and to the second exposure time value of the visible-spectrum image in the second half of the periodic frame; and the first exposure time value of the visible-spectrum image is different from the second exposure time value of the visible-spectrum image.
  4. The multispectral imaging system according to claim 2, wherein:
    the first and second exposure time values of the visible-spectrum image are set according to a preset reference exposure time value of the visible spectral band.
  5. The multispectral imaging system according to claim 2, wherein:
    the first and second exposure time values of the visible-spectrum image are set according to the exposure time value of the fluorescence-spectrum image.
  6. The multispectral imaging system according to claim 3, wherein:
    the plurality of image sensors use a progressive-scan exposure method.
  7. The multispectral imaging system according to claim 3, wherein the light intensity difference between adjacent image frames is moderated by one or more of the following:
    adjusting the exposure time values of the visible-spectrum image in adjacent frames; applying post-processing image enhancement, or weighted post-processing, to the visible-spectrum portion of adjacent frames; adjusting the hardware gain applied to the output images of the plurality of image sensors; or increasing the image frame rate.
  8. The multispectral imaging system according to claim 2, wherein:
    the image sensor uses an interlaced exposure method; each image frame is divided into two field images, the odd field corresponding to the odd-numbered rows of the color filter array of the image sensor and the even field to the even-numbered rows; and each periodic frame comprises four consecutive fields, of which the first field is exposed at the first exposure time value of the visible-spectrum image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
  9. The multispectral imaging system according to claim 2, wherein:
    the first exposure time value of the visible-spectrum image is greater than the second exposure time value of the visible-spectrum image.
  10. The multispectral imaging system according to claim 8, wherein:
    the image information of adjacent odd and even fields is subjected to post-processing image enhancement and is output for interlaced display.
  11. A multispectral imaging method, comprising:
    illuminating an irradiated area with a combination of visible light and excitation light, the irradiated area emitting fluorescence when excited by the excitation light; and
    receiving the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information;
    wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive video frames.
  12. The multispectral imaging method according to claim 11, wherein:
    the exposure time value of the visible-spectrum image comprises a first exposure time value and a second exposure time value of the visible-spectrum image, and the first and second exposure time values appear alternately and cyclically in consecutive video frames.
  13. The multispectral imaging method according to claim 12, wherein:
    each periodic frame consists of two image frames; the exposure time value of the fluorescence-spectrum image remains unchanged across consecutive frames; the exposure time value of the visible-spectrum image is set to the first exposure time value in the first half of the periodic frame and to the second exposure time value in the second half of the periodic frame; and the first exposure time value is different from the second exposure time value.
  14. The multispectral imaging method according to claim 12, wherein:
    the visible-spectrum image is exposed by an interlaced exposure method; each image frame is divided into two field images, the odd field corresponding to the odd rows of the image and the even field to the even rows; and each periodic frame comprises four consecutive fields, of which the first field is exposed at the first exposure time value of the visible-spectrum image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
  15. A computer-readable storage medium storing computer-executable instructions for performing the following operations:
    illuminating an irradiated area with a combination of visible light and excitation light, the irradiated area emitting fluorescence when excited by the excitation light; and
    receiving the visible light and the fluorescence in the light reflected from the irradiated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information;
    wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive video frames.
PCT/CN2019/105574 2018-09-12 2019-09-12 一种基于图像曝光的多光谱成像系统和方法 WO2020052626A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811064400.3 2018-09-12
CN201811064400.3A CN110893096A (zh) 2018-09-12 2018-09-12 一种基于图像曝光的多光谱成像系统和方法

Publications (1)

Publication Number Publication Date
WO2020052626A1 true WO2020052626A1 (zh) 2020-03-19

Family

ID=69778168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/105574 WO2020052626A1 (zh) 2018-09-12 2019-09-12 一种基于图像曝光的多光谱成像系统和方法

Country Status (2)

Country Link
CN (1) CN110893096A (zh)
WO (1) WO2020052626A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115719415A (zh) * 2022-03-28 2023-02-28 南京诺源医疗器械有限公司 一种视野可调双视频融合成像方法及系统
CN115802120A (zh) * 2022-11-29 2023-03-14 中国科学院长春光学精密机械与物理研究所 一种多谱段探测器的成像系统
CN116849624A (zh) * 2023-08-31 2023-10-10 南京诺源医疗器械有限公司 基于4cmos图像传感器的荧光成像方法及系统
US11882366B2 (en) 2021-02-26 2024-01-23 Hill-Rom Services, Inc. Patient monitoring system

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115426459B (zh) * 2022-08-30 2024-05-21 广州星博科仪有限公司 一种多光谱相机及其预曝光的优化方法
CN116228612A (zh) * 2022-12-30 2023-06-06 瑞龙诺赋(上海)医疗科技有限公司 图像处理系统以及图像处理方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6527708B1 (en) * 1999-07-02 2003-03-04 Pentax Corporation Endoscope system
CN103340601A (zh) * 2013-06-27 2013-10-09 中国科学院自动化研究所 基于内窥镜的多光谱成像系统和方法
CN103491847A (zh) * 2011-06-07 2014-01-01 奥林巴斯医疗株式会社 内窥镜装置和荧光观察的光量控制方法
CN105074433A (zh) * 2013-03-29 2015-11-18 浜松光子学株式会社 荧光观察装置以及荧光观察方法
CN107072520A (zh) * 2014-08-29 2017-08-18 莱英罗斯有限责任公司 以可见光波长和红外波长并行成像的内窥镜系统
CN107440669A (zh) * 2017-08-25 2017-12-08 北京数字精准医疗科技有限公司 一种双通道内窥式成像系统

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006509573A (ja) * 2002-12-13 2006-03-23 イエトメド リミテッド 手術中に腫瘍と正常組織とを実時間で区別するのに特に有用な光学的検査方法および装置
ES2715633T3 (es) * 2008-05-20 2019-06-05 Univ Health Network Dispositivo y método para formación de imágenes y supervisión por fluorescencia
JP5707758B2 (ja) * 2010-07-13 2015-04-30 ソニー株式会社 撮像装置、撮像システム、手術用ナビゲーションシステム、及び撮像方法
JP2017518693A (ja) * 2014-05-12 2017-07-06 フィリップス ライティング ホールディング ビー ヴィ 符号化された光の検出
CN106214173A (zh) * 2016-09-12 2016-12-14 张宇 一种便携式医学影像装置
CN106254782A (zh) * 2016-09-28 2016-12-21 北京旷视科技有限公司 图像处理方法及装置和相机
CN108478174B (zh) * 2018-03-20 2023-07-25 广东欧谱曼迪科技有限公司 基于曝光反馈的双相机系统及其术中荧光导航调整方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6527708B1 (en) * 1999-07-02 2003-03-04 Pentax Corporation Endoscope system
CN103491847A (zh) * 2011-06-07 2014-01-01 奥林巴斯医疗株式会社 内窥镜装置和荧光观察的光量控制方法
CN105074433A (zh) * 2013-03-29 2015-11-18 浜松光子学株式会社 荧光观察装置以及荧光观察方法
CN103340601A (zh) * 2013-06-27 2013-10-09 中国科学院自动化研究所 基于内窥镜的多光谱成像系统和方法
CN107072520A (zh) * 2014-08-29 2017-08-18 莱英罗斯有限责任公司 以可见光波长和红外波长并行成像的内窥镜系统
CN107440669A (zh) * 2017-08-25 2017-12-08 北京数字精准医疗科技有限公司 一种双通道内窥式成像系统

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11882366B2 (en) 2021-02-26 2024-01-23 Hill-Rom Services, Inc. Patient monitoring system
CN115719415A (zh) * 2022-03-28 2023-02-28 南京诺源医疗器械有限公司 一种视野可调双视频融合成像方法及系统
CN115719415B (zh) * 2022-03-28 2023-11-10 南京诺源医疗器械有限公司 一种视野可调双视频融合成像方法及系统
CN115802120A (zh) * 2022-11-29 2023-03-14 中国科学院长春光学精密机械与物理研究所 一种多谱段探测器的成像系统
CN116849624A (zh) * 2023-08-31 2023-10-10 南京诺源医疗器械有限公司 基于4cmos图像传感器的荧光成像方法及系统
CN116849624B (zh) * 2023-08-31 2023-11-10 南京诺源医疗器械有限公司 基于4cmos的图像传感器荧光成像方法及系统

Also Published As

Publication number Publication date
CN110893096A (zh) 2020-03-20

Similar Documents

Publication Publication Date Title
WO2020052626A1 (zh) 一种基于图像曝光的多光谱成像系统和方法
WO2020052623A1 (zh) 一种用于可见光和激发荧光实时成像的系统和方法
JP6939000B2 (ja) 撮像装置及び撮像方法
JP5507376B2 (ja) 撮像装置
JP4728450B2 (ja) 撮像装置
US10694117B2 (en) Masking approach for imaging multi-peak fluorophores by an imaging system
US11877720B2 (en) Enhanced fluorescence imaging for imaging system
US20100245552A1 (en) Image processing device, imaging device, computer-readable storage medium, and image processing method
CN109475283B (zh) 内窥镜系统
JP6203452B1 (ja) 撮像システム
CN110974135B (zh) 内窥镜用光源装置
JP5399187B2 (ja) 画像取得装置の作動方法および画像取得装置
CN112991367A (zh) 成像系统及生成可见光视频和彩色光视频的方法
US20220151474A1 (en) Medical image processing device and medical observation system
JP2022027195A (ja) 3板式カメラ
JP6535701B2 (ja) 撮像装置
JP2020137955A (ja) 医療用制御装置及び医療用観察システム
US20240259690A1 (en) Auto-exposure management of multi-component images
JP2019168423A (ja) 画像取得装置及び画像取得方法
JP7281308B2 (ja) 医療用画像処理装置及び医療用観察システム
US10918269B2 (en) Medical light source apparatus and medical observation system
JP6896053B2 (ja) 特に顕微鏡および内視鏡のための、蛍光発光性蛍光体のhdrモノクローム画像を作成するためのシステムおよび方法
JP2021146198A (ja) 医療用画像処理装置及び医療用観察システム
JP2021132812A (ja) 医療用画像処理装置及び医療用観察システム
JP2021003347A (ja) 医療用画像処理装置及び医療用観察システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19859678

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19859678

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17/08/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19859678

Country of ref document: EP

Kind code of ref document: A1