WO2020052626A1 - A multispectral imaging system and method based on image exposure
- Publication number: WO2020052626A1 (PCT/CN2019/105574)
- Authority: WIPO (PCT)
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/0075—Measuring for diagnostic purposes using light, by spectroscopy, e.g. Raman spectroscopy, infrared absorption spectroscopy
- A61B5/0071—Measuring for diagnostic purposes using light, by measuring fluorescence emission
- A61B5/0084—Measuring for diagnostic purposes using light, adapted for introduction into the body, e.g. by catheters
- A61B1/046—Endoscopes combined with photographic or television appliances, for infrared imaging
- A61B1/043—Endoscopes combined with photographic or television appliances, for fluorescence imaging
- A61B1/0655—Control for endoscope illuminating arrangements
- A61B1/0684—Endoscope light sources using light emitting diodes [LED]
- G—PHYSICS; G01—MEASURING; TESTING; G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/64—Fluorescence; Phosphorescence
Description
- Optical imaging is widely used in clinical medicine, offering the advantages of being harmless to the human body, non-invasive, highly sensitive, and capable of in vivo multi-target imaging.
- fluorescent imaging has become an important tool for intraoperative navigation.
- the application technology based on the near-infrared fluorescent developer Indocyanine Green (ICG) is also increasingly mature.
- after being injected intravenously, the indocyanine green fluorescent agent binds to plasma proteins and is excited by near-infrared light at a wavelength of about 805 nm, releasing fluorescence at a wavelength of about 835 nm, which is longer than the wavelength of the excitation light.
- the imaging system processes the captured fluorescent signals through algorithms and displays white light, fluorescence, or superimposed images on the screen in real time. Compared with traditional organic dyes, this method of real-time fluorescence development has higher contrast, better recognition of target tissues, and avoids dye dispersion in the later stage of surgery.
- each image frame of the system output image is a multi-spectral fusion image.
- one frame of image can be divided into visible spectrum image information and fluorescence spectrum image information.
- in fluorescent-contrast-agent-guided navigation for surgical tumor resection, the labeled tissue areas are expected to be accurately excised, and this requires clear fluorescent tissue areas with well-defined boundaries.
- the operator also expects the background tissue in the visible spectrum to be clearly rendered, so that the tissue can be better identified.
- the tissue of the labeled fluorescent region contains information of both the fluorescence spectrum and the visible spectrum, and the light intensity of the visible spectrum is usually much stronger than that of the fluorescence spectrum.
- the plurality of image sensors includes a first image sensor corresponding to the fluorescence spectrum segment and a second image sensor corresponding to the visible spectrum segment; each period frame consists of two frames of images, and the exposure time value of the first image sensor remains unchanged in consecutive frames; the exposure time value of the second image sensor is set to the first exposure time value of the visible spectrum segment image in the first half of the period frame and to the second exposure time value of the visible spectrum segment image in the second half of the period frame, where the first exposure time value of the visible spectrum segment image is different from the second exposure time value.
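As a sketch, the alternating schedule claimed above can be written out in Python; the frame count, the 33 ms fluorescence exposure, and the 20 ms / 5 ms visible exposures are invented placeholders, not values from the patent.

```python
# Hypothetical sketch of the two-sensor exposure schedule: the fluorescence
# sensor keeps a fixed exposure, while the visible-light sensor alternates
# between a long and a short exposure every frame.

def exposure_schedule(n_frames, t_fluor=33.0, t_vis_long=20.0, t_vis_short=5.0):
    """Return per-frame (fluorescence, visible) exposure times in ms.

    One period frame = two consecutive frames: the first half uses the
    long visible exposure, the second half the short one.
    """
    schedule = []
    for frame in range(n_frames):
        t_vis = t_vis_long if frame % 2 == 0 else t_vis_short
        schedule.append((t_fluor, t_vis))
    return schedule

# e.g. exposure_schedule(4) ->
# [(33.0, 20.0), (33.0, 5.0), (33.0, 20.0), (33.0, 5.0)]
```

The fluorescence exposure stays constant so that fluorescence contrast is stable, while the visible channel cycles through its two exposure values with a period of two frames, as the claim describes.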
- the plurality of image sensors adopt a progressive scanning exposure method.
- the image sensor adopts an interlaced exposure method.
- Each frame of the image is divided into two field images.
- the odd field image corresponds to the odd-numbered lines of the image sensor color filter array
- the even field image corresponds to the even-numbered lines of the color filter array of the image sensor.
- Each period frame image includes four consecutive field images: the first field is exposed at the first exposure time value of the visible spectrum segment image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
- the first exposure time value of the visible spectrum segment image is greater than the second exposure time value of the visible spectrum segment image.
- the image information of adjacent odd and even fields is output after being subjected to post-image enhancement processing.
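Assuming "long" and "short" stand for the first and second exposure time values (the text above states the first is greater than the second), the four-field cycle just described can be sketched as:

```python
# Hypothetical sketch of the four-field exposure cycle (long, short,
# short, long) described above for the interlaced exposure mode.

FIELD_CYCLE = ("long", "short", "short", "long")

def field_exposure(field_index):
    """Exposure class for the field at a given 0-based index."""
    return FIELD_CYCLE[field_index % 4]

# Fields pair into frames: frame k is fields (2k, 2k+1), so frame 0 is
# (long, short), frame 1 is (short, long), and the pattern repeats.
frames = [(field_exposure(2 * k), field_exposure(2 * k + 1)) for k in range(4)]
# frames == [("long","short"), ("short","long"), ("long","short"), ("short","long")]
```

Because the cycle spans four fields (two frames), consecutive frames never carry the same long/short ordering, which is what makes the bright and dark halves interleave.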
- the application also provides a multispectral imaging method, which includes: irradiating an illuminated area with a combination of visible light and excitation light, the illuminated area being excited by the excitation light to release fluorescence; and receiving visible light and fluorescence from the reflected light of the illuminated area to form a multispectral image containing image information of the visible spectrum segment and image information of the fluorescence spectrum segment; wherein the exposure time value of the visible spectrum segment image changes periodically, so that visible spectrum segment image information of different light intensities is output alternately and cyclically in consecutive video frames.
- the exposure time value of the visible spectrum segment image includes a first exposure time value and a second exposure time value, which appear alternately and cyclically in consecutive video frames.
- each period frame consists of two frames of images, and the exposure time value of the fluorescence spectrum segment image remains unchanged in consecutive frames.
- the exposure time value of the visible spectrum segment image is set to the first exposure time value in the first half of the period frame and to the second exposure time value in the second half of the period frame; the first exposure time value of the visible spectrum segment image is different from the second exposure time value.
- the visible spectrum segment is exposed by an interlaced exposure method.
- Each frame of the image is divided into two field images.
- the odd field image corresponds to the odd lines of the image
- the even field image corresponds to the even lines of the image.
- Each period frame includes four consecutive field images, where the first field is exposed at the first exposure time value of the visible spectrum segment image, the second field at the second exposure time value, the third field at the second exposure time value, and the fourth field at the first exposure time value.
- the application also provides a computer-readable storage medium storing computer-executable instructions for performing the following operations: irradiating the illuminated area with a combination of visible light and excitation light, the illuminated area being excited to release fluorescence; receiving visible light and fluorescence from the reflected light of the illuminated area to form a multispectral image containing image information of the visible spectrum segment and image information of the fluorescence spectrum segment; wherein the exposure time value of the visible spectrum segment image changes periodically, so that visible spectrum segment image information of different light intensities is output alternately and cyclically in consecutive video frames.
- the image exposure-based multispectral imaging system and method provided by the present application can make the visible background image clear and bright, while the fluorescence imaging has high contrast and low noise.
- the application can be widely applied in the field of medical imaging technology, including various fields such as medical endoscopes, fluorescence microscopes and the like.
- FIG. 1 is a schematic structural diagram of a multispectral imaging system according to an embodiment of the present application.
- FIG. 2 is a schematic structural diagram of a multi-spectral imaging system using a single image sensor according to an embodiment of the present application
- FIG. 3 is a schematic diagram of a light source module using multiple monochromatic light source combinations according to an embodiment of the present application
- FIG. 4 is a schematic diagram of a camera module using two image sensors according to an embodiment of the present application.
- FIG. 5 is a schematic diagram of a camera module using four image sensors according to an embodiment of the present application.
- FIG. 6 is a schematic diagram of an imaging method in a multi-image sensor mode progressive output mode according to an embodiment of the present application
- FIG. 7 is a schematic diagram of an arrangement of a color filter array in a single image sensor mode according to an embodiment of the present application.
- FIG. 8 is a schematic diagram of an imaging method in an interlaced output mode according to an embodiment of the present application.
- FIG. 9 is a block diagram of an exemplary computer system according to one embodiment of the present application.
- FIG. 1 shows an exemplary structure diagram of a multi-spectral imaging system according to an embodiment of the present application.
- a multispectral imaging system can be used for a fluorescent endoscope.
- the fluorescence endoscope camera system may include a light source module 1, a camera module 2 for imaging, a data acquisition and preprocessing module 3 for image acquisition and preprocessing, a digital image processing module 4, a system controller 5, and a light source controller 6.
- the camera module 2 may include an optical lens assembly and one or more image sensors (described in detail in the following figure).
- the system controller 5 may generate a corresponding control signal according to the feedback processing result of the digital image processing module 4.
- the light source controller 6 may perform lighting control on the light source module 1 according to a control signal of the system controller 5.
- FIG. 2 shows an exemplary structure diagram of a multi-spectral imaging system using a single image sensor according to an embodiment of the present application.
- the light source module 1 may include an incident visible light LED 11, a near-infrared (NIR) exciter 12, and a combination beam splitter 13.
- the camera module 2 may include an optical lens assembly 9 and a single image sensor 10 (for example, a CCD or CMOS image sensor).
- the optical lens assembly 9 may further include an optical filter 8.
- the incident visible light LED 11 can provide illumination in the visible light wavelength range (400 nm to 760 nm).
- the near-infrared (NIR) exciter 12 can generate NIR excitation light in a near-infrared wavelength range (790 nm to 820 nm), particularly near-infrared excitation light in a wavelength range around 800 nm.
- the visible light and the NIR excitation light can be combined and irradiated to the irradiated area (for example, a tissue) through the combined spectroscope 13.
- the tissue injected with the fluorescent developer is irradiated with the NIR excitation light in the light source module 1
- the fluorescent groups in the tissue are excited to emit near-infrared fluorescence with a longer wavelength band than the excitation light wavelength, such as fluorescence of 830 nm to 900 nm.
- the reflected light from the irradiated area may include three spectral bands such as visible light, excitation light (e.g., NIR excitation light), and fluorescence (e.g., near-infrared fluorescence).
- the NIR excitation light is completely blocked by the optional optical filter 8 and cannot enter the image sensor, while visible light and fluorescence enter the single image sensor 10 with NIR pixels through the optical lens assembly 9.
- the data from the image sensor 10 can enter the digital image processing module 4 through the data acquisition and preprocessing module 3 to form a visible spectrum image and a fluorescent spectrum image.
- the system controller 5 may output a signal to the light source controller 6 according to the feedback from the digital image processing module 4.
- the light source controller 6 can control the light source module 1 by controlling a driving current or a voltage.
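The feedback path sketched above (image statistics → system controller 5 → light source controller 6 → drive current) might look like a simple proportional adjustment; the target level, gain, and current limits below are invented for illustration and are not specified by the patent.

```python
# Hypothetical sketch of the brightness feedback loop: the digital image
# processing module reports mean image brightness, and the light source
# controller nudges the LED drive current toward a target level.

def adjust_drive_current(current_ma, mean_brightness, target=128.0,
                         gain=0.5, i_min=0.0, i_max=1000.0):
    """One proportional-control step on the LED drive current (mA)."""
    error = target - mean_brightness
    new_current = current_ma + gain * error
    return max(i_min, min(i_max, new_current))

# A too-dark image (mean 100) raises the current; a too-bright image
# (mean 200) lowers it; the result is clamped to the driver's range.
# adjust_drive_current(500, 100) -> 514.0
# adjust_drive_current(500, 200) -> 464.0
```

In practice such a controller would also need smoothing and rate limits so that illumination changes are not visible to the operator.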
- FIG. 3 is an exemplary schematic diagram of a light source module employing multiple monochromatic light source combinations according to an embodiment of the present application.
- the light source module 1 may include a combination of three monochrome light sources as shown in FIG. 3.
- the visible light source consists of three monochromatic light sources: a red LED 18 (620 nm to 760 nm), a green LED 19 (492 nm to 577 nm), and a blue LED 20 (400 nm to 450 nm), combined through the corresponding combination beam splitters 15, 16, and 17.
- the camera module 2 may include one or more image sensors.
- the camera module 2 may be a single image sensor as shown in FIG. 2, or a dual image sensor combination as shown in FIG. 4.
- FIG. 4 is an exemplary schematic diagram of a camera module employing two image sensors according to an embodiment of the present application.
- the camera module 2 may include a plurality of image sensors, and a corresponding dichroic prism.
- the camera module 2 may employ an image sensor 32 (such as a CCD / CMOS image sensor) for visible light and an image sensor 33 (such as a CCD / CMOS image sensor) for fluorescence. Visible light and fluorescence from the tissue enter the corresponding image sensors through the dichroic prism 31, respectively.
- each image frame of the system output image is a multi-spectral fusion image of visible light and fluorescence.
- one frame of image can be divided into visible spectrum segment image information and fluorescence spectrum segment image information.
- in the mode where multiple image sensors separately capture image signals in different spectral wavelength bands for subsequent fusion (as in FIG. 4 and FIG. 5 above), the light intensity in each image frame is determined jointly by the intensity of the light reflected by the tissue and the integration time of the reflected light on the image sensor.
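The relationship just stated — the recorded signal is determined jointly by reflected intensity and integration time — can be illustrated with a toy linear-sensor calculation (all numbers below are illustrative, not from the patent):

```python
# Toy illustration: to first order, the recorded pixel signal is the
# product of incident light intensity and exposure (integration) time,
# clipped at the sensor's saturation level.

def pixel_signal(intensity, exposure_ms, full_well=4095):
    """Linear sensor model: signal = intensity * exposure, saturating."""
    return min(intensity * exposure_ms, full_well)

# Visible reflected light is much stronger than fluorescence, so at an
# exposure that keeps the visible signal in range, the fluorescence
# signal is weak; a longer exposure helps fluorescence but would
# saturate the visible channel.
vis_strong = pixel_signal(intensity=200, exposure_ms=20)   # 4000, near saturation
fluo_weak  = pixel_signal(intensity=5,   exposure_ms=20)   # 100, low contrast
fluo_long  = pixel_signal(intensity=5,   exposure_ms=200)  # 1000, better contrast
```

This is why separate exposure control per spectral band (or per frame) is useful: no single integration time serves both the strong visible signal and the weak fluorescence signal.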
- reducing the visible light intensity can increase the contrast of the fluorescent region, but it also reduces the brightness and sharpness of the visible background tissue.
- high visible light intensity can highlight the image space details of visible light images, while low light intensity highlights the details of fluorescent images.
- Increasing the fluorescence contrast by reducing the light intensity of the visible background image will sacrifice the imaging quality of the visible image.
- image noise under low illumination is more prominent and obvious, blurring image borders and making the noise conspicuous.
- FIG. 6 is a schematic diagram of an imaging method in a multi-image sensor mode progressive output mode according to an embodiment of the present application.
- the following exposure adjustment methods can be adopted.
- the exposure time of the fluorescent image sensor remains unchanged according to the frame rate of the video image.
- in the first frame, the image sensor corresponding to the visible spectrum segment uses a long exposure time
- in the second frame, the image sensor corresponding to the visible spectrum segment uses a short exposure time
- in the third frame, the image sensor corresponding to the visible spectrum segment uses a long exposure time
- in the fourth frame, the image sensor corresponding to the visible spectrum segment uses a short exposure time.
- subsequent image frames also change periodically.
- the different (long and short) exposure time values of the visible spectrum segment appear alternately and cyclically in consecutive image frames.
- one period frame is two frames of images: the first half of the period frame uses the first exposure time of the visible spectrum segment image (e.g., the long exposure time), and the second half uses the second exposure time (e.g., the short exposure time).
- the exposure time of the visible spectrum segment may be long first and then short (as shown in the illustrated embodiment), but the application is not limited thereto.
- above a certain light intensity, the human eye is less sensitive to changes in brightness; with the bright and dark parts of consecutive image frames interleaved in this way, it is difficult for the human eye to perceive such rapid changes in the brightness of the visible spectrum segment.
- the exposure time difference can be adjusted according to the specific visual effect, so that the human eye does not perceive a noticeable flicker effect.
- the human eye is more sensitive to brightness at very low brightness, but not sensitive at high brightness.
- FIG. 7 is a schematic diagram of color filter array arrangement in a single image sensor mode according to an embodiment of the present application.
- the signal transmission path in the CCD may not be able to handle the large amount of data obtained at one time by progressive scanning, which causes a decline in image processing speed.
- interlaced scanning technology can therefore be used: for a single image sensor with NIR pixels integrated into its color filter array, the light intensity information of the visible light image and the fluorescent image can be adjusted separately by interlaced exposure and interlaced output.
- Some CCD or CMOS image sensors can only perform interlaced exposure and read out in interlaced output mode. In this mode, the electronic shutter resets all photodiodes before exposure, as in progressive scan mode.
- blue light B pixels are in one line
- red R and green G pixels are in one line
- the odd-numbered lines of the color filter array can be set as the red R and green G arrays
- the even-numbered lines can be set as the blue and NIR arrays.
- the exposure time of odd and even lines of an image can be set separately to adjust the brightness of each monochromatic light.
- the color visible light image is synthesized from the RGB array in the color filter array; the brightness of the visible light equals the sum of the brightnesses of the primary colors participating in color mixing.
- reducing the exposure time of the odd rows of the image alone reduces the red R and green G spectral components, resulting in a reduction in the total light intensity of the image in the visible spectrum segment.
- this will cause the blue light intensity in the mixed primary color of visible light to be too high, resulting in color cast.
- gamma correction may be performed in the post-processing of the image, and the weight ratio of the blue light component in the gamma correction is re-determined according to the difference in exposure time between the even and odd lines and the light intensity, so as to restore the true color of the visible spectrum segment image.
- in other arrangements, the line shared with the NIR pixels may instead contain red R or green G pixels (not shown in the figure).
- the weight ratio of the corresponding red light R or green light G component needs to be adjusted.
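A minimal sketch of the color-cast compensation described above, assuming a purely linear sensor response and treating the odd/even exposure ratio as the only correction factor (the gamma value and the correction formula here are illustrative, not taken from the patent):

```python
# Hypothetical sketch: when odd lines (R, G) get a shorter exposure than
# even lines (B, NIR), the blue channel is overweighted in the mix; a
# simple compensation scales blue by the exposure ratio before gamma.

def rebalance_blue(r, g, b, t_odd_ms, t_even_ms, gamma=2.2):
    """Rescale B to the odd-line exposure, then gamma-encode all channels.

    Assumes a linear sensor: signal scales with exposure time, so the
    blue value recorded at t_even corresponds to b * (t_odd / t_even)
    at the odd-line exposure.
    """
    b_corrected = b * (t_odd_ms / t_even_ms)
    encode = lambda v: max(min(v, 1.0), 0.0) ** (1.0 / gamma)
    return encode(r), encode(g), encode(b_corrected)

# With odd lines at 5 ms and even lines at 20 ms, blue is scaled by
# 0.25 before gamma encoding, removing the blue cast.
```

If the NIR pixels instead share a line with red or green pixels, the same rescaling would be applied to that component rather than to blue, as the preceding bullet notes.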
- FIG. 8 shows a schematic diagram of an imaging method in an interlaced output mode according to an embodiment of the present application.
- when a single image sensor is used in the interlaced-exposure, interlaced-output mode, the following frame image exposure adjustment methods can be used.
- one frame of an image is divided into two field images: an odd field image and an even field image.
- the odd field image corresponds to the odd line of the image sensor color filter array
- the even field image corresponds to the even line of the image sensor.
- the first field is a long exposure
- the second field is a short exposure
- the third field is also a short exposure
- the fourth field is a long exposure.
- the exposure in four consecutive field images is one exposure change cycle.
- the corresponding change in the light intensity of the image frame is also shown in Figure 8.
- the light intensity of the visible spectrum segment is high in the first frame of the image and low in the second frame; it is high again in the third frame and low in the fourth frame.
- the converted digital image signal enters the digital image processing module 4.
- the digital image processing module 4 performs post-processing image enhancement on the field images with different exposure times on the odd and even lines of the image sensor, and then synthesizes them into frames for output on a display (not shown).
- alternatively, the field images may be displayed directly on the display in an interlaced manner.
- with interlaced output, the image information is divided into two field images that are output separately, which gives a more continuous visual effect; at the same time, interlaced output can support higher-resolution image output.
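One way the enhancement-then-synthesis step above might look, assuming each frame interleaves a long-exposure odd field with a short-exposure even field and that exposure normalization is the only enhancement applied (a simplification of whatever processing module 4 actually performs):

```python
# Hypothetical sketch: interleave an odd field (long exposure) and an
# even field (short exposure) back into one frame, normalizing each
# field by its exposure time so row brightness is consistent.

def synthesize_frame(odd_field, even_field, t_odd_ms, t_even_ms):
    """odd_field / even_field: lists of rows (lists of raw pixel values)."""
    frame = []
    for odd_row, even_row in zip(odd_field, even_field):
        frame.append([v / t_odd_ms for v in odd_row])    # odd-numbered line
        frame.append([v / t_even_ms for v in even_row])  # even-numbered line
    return frame

# Two one-pixel-wide fields: after normalization, both rows report the
# same scene intensity despite different raw values.
frame = synthesize_frame([[200.0]], [[50.0]], t_odd_ms=20, t_even_ms=5)
# frame == [[10.0], [10.0]]
```

A real pipeline would add denoising of the short-exposure rows and saturation handling of the long-exposure rows before interleaving, but the normalization step is what lets differently exposed fields combine into a single consistent frame.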
- by setting different exposure times, the present application adjusts the visible light intensity of different frame images in the video, so that high and low visible spectrum segment light intensities are output alternately in different frames of the image.
- the continuous frame image information can not only output a high-brightness visible spectrum segment image that highlights the visible-light details of the background tissue, but also output high-contrast fluorescent image information against a low visible-light-intensity background.
- both the clear details of the visible spectrum image and the high-contrast image details of the fluorescence near-infrared spectrum image are reflected.
- By adopting a period-frame technique that outputs different light intensities and contrasts in alternating frames, the system ensures high contrast of the fluorescent image information while also outputting high-definition visible light image information.
- Computer system 900 may include a logic processor 902, such as an execution core. Although one logical processor 902 is illustrated, in other embodiments the computer system 900 may have multiple logical processors, for example multiple execution cores per processor chip and/or multiple processor chips, where each chip can have multiple execution cores. As shown, various computer-readable storage media 910 may be interconnected by one or more system buses that couple the various system components to the logical processor 902.
- the system bus can be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- computer-readable storage media 910 may include, for example, random access memory (RAM) 904, a storage device 906 (e.g., an electromechanical hard drive or solid-state drive), firmware 908 (e.g., flash RAM or ROM), and removable storage devices 918 (e.g., CD-ROMs, floppy disks, DVDs, flash drives, external storage devices, etc.).
- a basic input/output system (BIOS) 920, containing the basic routines that help to transfer information between units of the computer system 900 during startup, may be stored in firmware 908.
- a large number of programs may be stored on the firmware 908, the storage device 906, the RAM 904, and / or the removable storage device 918, and may be executed by the logical processor 902, which includes an operating system and / or application programs.
- Commands and information may be received by the computer system 900 through an input device 916, which may include, but is not limited to, a keyboard and pointing device.
- Other input devices may include a microphone, joystick, gamepad, scanner, and so on.
- these input devices are often connected through a serial port interface coupled to the system bus, but may also be connected through other interfaces, such as a parallel port, a game port, or a universal serial bus (USB).
- a display or other type of display device may also be connected to the system bus via an interface, such as a video adapter, which may be part of or connected to the graphics processing unit 912.
- computers typically include other peripheral output devices, such as speakers and printers (not shown).
- the exemplary system of FIG. 9 may also include a host adapter, a small computer system interface (SCSI) bus, and an external storage device connected to the SCSI bus.
- Computer system 900 may operate in a networked environment using logical connections to one or more remote computers.
- the remote computer may be another computer, a server, a router, a network PC, a peer device, or another common network node, and typically includes many or all of the units described above with respect to computer system 900.
- when used in a LAN or WAN networking environment, computer system 900 may be connected to the LAN or WAN through a network interface card 914.
- a network card (NIC) 914 (which may be internal or external) may be connected to the system bus.
- program modules or portions thereof depicted relative to computer system 900 may be stored in a remote memory storage device. It should be appreciated that the network connections described herein are exemplary and other means of establishing a communications link between the computers may be used.
- the functions and processes described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
- Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another.
- A storage medium may be any available medium that can be accessed by a computer.
- Such computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
- The software code may be stored in a memory, such as the memory of a mobile station, and executed by a processor, such as the processor of a desktop computer, laptop computer, or server computer, or the microprocessor of a mobile device.
- the memory can be implemented inside the processor or outside the processor.
- Memory refers to any type of long-term, short-term, volatile, non-volatile, or other memory, and is not limited to any particular type of memory, any particular number of memories, or the type of media on which memory is stored.
- any reference to a claim element in a singular form such as a reference using the articles “a”, “an” or “the” should not be interpreted as limiting the element to the singular.
- Any reference herein to an element using the designations "first", "second", etc. generally does not limit the number or order of those elements.
- Reference to first and second elements does not mean that only two elements may be used, nor that the first element must precede the second element in some way.
- a set of elements may include one or more elements.
- the skilled person can implement the described structure in different ways for each particular application, but such implementation should not be interpreted as causing a departure from the scope of the present application.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Pathology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Radiology & Medical Imaging (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- Chemical & Material Sciences (AREA)
- Immunology (AREA)
- General Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
- Endoscopes (AREA)
Abstract
Description
Claims (15)
- A multispectral imaging system, comprising: a light source module that provides visible light and generates excitation light, an illuminated area being illuminated by the combination of the visible light and the excitation light and releasing fluorescence when excited by the excitation light; and a camera module comprising one or more image sensors, the image sensors receiving the visible light and the fluorescence in the light reflected from the illuminated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive frames of the video.
- The multispectral imaging system according to claim 1, wherein the exposure time value of the visible-spectrum image comprises a first exposure time value and a second exposure time value of the visible-spectrum image, the first and second exposure time values each appearing alternately and periodically in consecutive frames of the video.
- The multispectral imaging system according to claim 2, wherein the plurality of image sensors comprise a first image sensor corresponding to the fluorescence spectral band and a second image sensor corresponding to the visible spectral band; each period comprises two frames, the exposure time value of the first image sensor remains unchanged over consecutive frames, the exposure time value of the second image sensor is set to the first exposure time value of the visible-spectrum image during the first half of the period and to the second exposure time value of the visible-spectrum image during the second half of the period, and the first exposure time value differs from the second exposure time value.
- The multispectral imaging system according to claim 2, wherein the first and second exposure time values of the visible-spectrum image are set according to a preset reference exposure time value for the visible spectral band.
- The multispectral imaging system according to claim 2, wherein the first and second exposure time values of the visible-spectrum image are set according to the exposure time value of the fluorescence-spectrum image.
- The multispectral imaging system according to claim 3, wherein the plurality of image sensors use a progressive-scan exposure mode.
- The multispectral imaging system according to claim 3, wherein the light-intensity difference between adjacent frames of the image is adjusted by one or more of: adjusting the exposure time values of the visible-spectrum images in adjacent frames; applying post-capture image enhancement or weighted-algorithm processing to the visible-spectrum images of adjacent frames; adjusting the hardware gain amplification values applied to the images output by the plurality of image sensors; or increasing the image frame rate.
- The multispectral imaging system according to claim 2, wherein the image sensor uses an interlaced exposure mode; each frame of the image is divided into two field images, the odd-field image corresponding to the odd rows of the color filter array of the image sensor and the even-field image corresponding to the even rows of the color filter array; each period comprises four consecutive fields, wherein the first field is exposed with the first exposure time value of the visible-spectrum image, the second field with the second exposure time value, the third field with the second exposure time value, and the fourth field with the first exposure time value.
- The multispectral imaging system according to claim 2, wherein the first exposure time value of the visible-spectrum image is greater than the second exposure time value of the visible-spectrum image.
- The multispectral imaging system according to claim 8, wherein post-capture image enhancement is applied to the image information of adjacent odd and even fields, which are then separately displayed and output by interlaced scanning.
- A multispectral imaging method, comprising: illuminating an illuminated area with the combination of visible light and excitation light, the illuminated area releasing fluorescence when excited by the excitation light; and receiving the visible light and the fluorescence in the light reflected from the illuminated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive frames of the video.
- The multispectral imaging method according to claim 11, wherein the exposure time value of the visible-spectrum image comprises a first exposure time value and a second exposure time value of the visible-spectrum image, the first and second exposure time values each appearing alternately and periodically in consecutive frames of the video.
- The multispectral imaging method according to claim 12, wherein each period comprises two frames, the exposure time value of the fluorescence-spectrum image remains unchanged over consecutive frames, the exposure time value of the visible-spectrum image is set to the first exposure time value during the first half of the period and to the second exposure time value during the second half of the period, and the first exposure time value differs from the second exposure time value.
- The multispectral imaging method according to claim 12, wherein the visible-spectrum image is exposed in an interlaced exposure mode; each frame of the image is divided into two field images, the odd-field image corresponding to the odd rows of the image and the even-field image corresponding to the even rows of the image; each period comprises four consecutive fields, wherein the first field is exposed with the first exposure time value of the visible-spectrum image, the second field with the second exposure time value, the third field with the second exposure time value, and the fourth field with the first exposure time value.
- A computer-readable storage medium storing computer-executable instructions for: illuminating an illuminated area with the combination of visible light and excitation light, the illuminated area releasing fluorescence when excited by the excitation light; and receiving the visible light and the fluorescence in the light reflected from the illuminated area to form a multispectral image containing visible-spectrum image information and fluorescence-spectrum image information; wherein the exposure time value of the visible-spectrum image varies periodically, so that visible-spectrum image information of different light intensities is output alternately and cyclically in consecutive frames of the video.
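As a non-authoritative illustration of the exposure scheduling recited above, the two-frame alternation of claims 2–3 and the four-field interlaced pattern of claim 8 can be sketched in Python; the function names and the plain-list representation are illustrative assumptions, not part of the application.

```python
# Illustrative sketch (not from the application) of the periodic exposure
# schedules recited in the claims. t1 and t2 stand for the first and second
# exposure time values of the visible-spectrum image.

def frame_exposure_schedule(t1, t2, num_frames):
    # Claims 2-3: each period spans two frames; the first half-period uses
    # t1 and the second half-period uses t2, alternating cyclically.
    return [t1 if i % 2 == 0 else t2 for i in range(num_frames)]

def field_exposure_schedule(t1, t2, num_fields):
    # Claim 8 (interlaced exposure): each period spans four consecutive
    # fields exposed with t1, t2, t2, t1.
    pattern = (t1, t2, t2, t1)
    return [pattern[i % 4] for i in range(num_fields)]

print(frame_exposure_schedule(8.0, 2.0, 4))   # [8.0, 2.0, 8.0, 2.0]
print(field_exposure_schedule(8.0, 2.0, 8))   # [8.0, 2.0, 2.0, 8.0, 8.0, 2.0, 2.0, 8.0]
```

Alternating a long and a short exposure in this way yields adjacent frames (or fields) of different brightness, which the claims exploit to output alternating-intensity visible-spectrum images while the fluorescence-channel exposure stays fixed.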
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811064400.3 | 2018-09-12 | ||
CN201811064400.3A CN110893096A (zh) | 2018-09-12 | 2018-09-12 | Multispectral imaging system and method based on image exposure
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020052626A1 true WO2020052626A1 (zh) | 2020-03-19 |
Family
ID=69778168
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/105574 WO2020052626A1 (zh) | 2018-09-12 | 2019-09-12 | Multispectral imaging system and method based on image exposure
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110893096A (zh) |
WO (1) | WO2020052626A1 (zh) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115719415A (zh) * | 2022-03-28 | 2023-02-28 | 南京诺源医疗器械有限公司 | Dual-video fusion imaging method and system with adjustable field of view |
CN115802120A (zh) * | 2022-11-29 | 2023-03-14 | 中国科学院长春光学精密机械与物理研究所 | Imaging system for a multi-spectral-band detector |
CN116849624A (zh) * | 2023-08-31 | 2023-10-10 | 南京诺源医疗器械有限公司 | Fluorescence imaging method and system based on a 4-CMOS image sensor |
US11882366B2 (en) | 2021-02-26 | 2024-01-23 | Hill-Rom Services, Inc. | Patient monitoring system |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115426459B (zh) * | 2022-08-30 | 2024-05-21 | 广州星博科仪有限公司 | Multispectral camera and pre-exposure optimization method therefor |
CN116228612A (zh) * | 2022-12-30 | 2023-06-06 | 瑞龙诺赋(上海)医疗科技有限公司 | Image processing system and image processing method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6527708B1 (en) * | 1999-07-02 | 2003-03-04 | Pentax Corporation | Endoscope system |
CN103340601A (zh) * | 2013-06-27 | 2013-10-09 | 中国科学院自动化研究所 | Endoscope-based multispectral imaging system and method |
CN103491847A (zh) * | 2011-06-07 | 2014-01-01 | 奥林巴斯医疗株式会社 | Endoscope device and light-quantity control method for fluorescence observation |
CN105074433A (zh) * | 2013-03-29 | 2015-11-18 | 浜松光子学株式会社 | Fluorescence observation device and fluorescence observation method |
CN107072520A (zh) * | 2014-08-29 | 2017-08-18 | 莱英罗斯有限责任公司 | Endoscope system for parallel imaging at visible and infrared wavelengths |
CN107440669A (zh) * | 2017-08-25 | 2017-12-08 | 北京数字精准医疗科技有限公司 | Dual-channel endoscopic imaging system |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006509573A (ja) * | 2002-12-13 | 2006-03-23 | イエトメド リミテッド | Optical examination method and apparatus particularly useful for real-time intraoperative discrimination between tumor and normal tissue |
ES2715633T3 (es) * | 2008-05-20 | 2019-06-05 | Univ Health Network | Device and method for fluorescence imaging and monitoring |
JP5707758B2 (ja) * | 2010-07-13 | 2015-04-30 | ソニー株式会社 | Imaging device, imaging system, surgical navigation system, and imaging method |
JP2017518693A (ja) * | 2014-05-12 | 2017-07-06 | フィリップス ライティング ホールディング ビー ヴィ | Detection of coded light |
CN106214173A (zh) * | 2016-09-12 | 2016-12-14 | 张宇 | Portable medical imaging device |
CN106254782A (zh) * | 2016-09-28 | 2016-12-21 | 北京旷视科技有限公司 | Image processing method and device, and camera |
CN108478174B (zh) * | 2018-03-20 | 2023-07-25 | 广东欧谱曼迪科技有限公司 | Dual-camera system based on exposure feedback and intraoperative fluorescence navigation adjustment method therefor |
-
2018
- 2018-09-12 CN CN201811064400.3A patent/CN110893096A/zh active Pending
-
2019
- 2019-09-12 WO PCT/CN2019/105574 patent/WO2020052626A1/zh active Application Filing
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11882366B2 (en) | 2021-02-26 | 2024-01-23 | Hill-Rom Services, Inc. | Patient monitoring system |
CN115719415A (zh) * | 2022-03-28 | 2023-02-28 | 南京诺源医疗器械有限公司 | Dual-video fusion imaging method and system with adjustable field of view |
CN115719415B (zh) * | 2022-03-28 | 2023-11-10 | 南京诺源医疗器械有限公司 | Dual-video fusion imaging method and system with adjustable field of view |
CN115802120A (zh) * | 2022-11-29 | 2023-03-14 | 中国科学院长春光学精密机械与物理研究所 | Imaging system for a multi-spectral-band detector |
CN116849624A (zh) * | 2023-08-31 | 2023-10-10 | 南京诺源医疗器械有限公司 | Fluorescence imaging method and system based on a 4-CMOS image sensor |
CN116849624B (zh) * | 2023-08-31 | 2023-11-10 | 南京诺源医疗器械有限公司 | Image-sensor fluorescence imaging method and system based on 4-CMOS |
Also Published As
Publication number | Publication date |
---|---|
CN110893096A (zh) | 2020-03-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020052626A1 (zh) | Multispectral imaging system and method based on image exposure | |
WO2020052623A1 (zh) | System and method for real-time imaging of visible light and excited fluorescence | |
JP6939000B2 (ja) | Imaging device and imaging method | |
JP5507376B2 (ja) | Imaging device | |
JP4728450B2 (ja) | Imaging device | |
US10694117B2 (en) | Masking approach for imaging multi-peak fluorophores by an imaging system | |
US11877720B2 (en) | Enhanced fluorescence imaging for imaging system | |
US20100245552A1 (en) | Image processing device, imaging device, computer-readable storage medium, and image processing method | |
CN109475283B (zh) | Endoscope system | |
JP6203452B1 (ja) | Imaging system | |
CN110974135B (zh) | Light source device for endoscope | |
JP5399187B2 (ja) | Method for operating an image acquisition device, and image acquisition device | |
CN112991367A (zh) | Imaging system and method for generating visible-light video and color-light video | |
US20220151474A1 (en) | Medical image processing device and medical observation system | |
JP2022027195A (ja) | Three-chip camera | |
JP6535701B2 (ja) | Imaging device | |
JP2020137955A (ja) | Medical control device and medical observation system | |
US20240259690A1 (en) | Auto-exposure management of multi-component images | |
JP2019168423A (ja) | Image acquisition device and image acquisition method | |
JP7281308B2 (ja) | Medical image processing device and medical observation system | |
US10918269B2 (en) | Medical light source apparatus and medical observation system | |
JP6896053B2 (ja) | System and method for creating HDR monochrome images of fluorescing fluorophores, in particular for microscopes and endoscopes | |
JP2021146198A (ja) | Medical image processing device and medical observation system | |
JP2021132812A (ja) | Medical image processing device and medical observation system | |
JP2021003347A (ja) | Medical image processing device and medical observation system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19859678 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19859678 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17/08/2021) |
|