CN114869207A - Electronic endoscope imaging system and method - Google Patents

Electronic endoscope imaging system and method

Info

Publication number
CN114869207A
Authority
CN
China
Prior art keywords
light
image
infrared light
excitation
excitation light
Prior art date
Legal status
Pending
Application number
CN202210549242.0A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee
Shanghai Weimi Medical Instrument Co ltd
Original Assignee
Shanghai Weimi Medical Instrument Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Weimi Medical Instrument Co ltd filed Critical Shanghai Weimi Medical Instrument Co ltd
Priority to CN202210549242.0A
Publication of CN114869207A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/06: Instruments for performing medical examinations of the interior of cavities or tubes of the body, with illuminating arrangements
    • A61B 1/0661: Endoscope light sources
    • A61B 1/00002: Operational features of endoscopes
    • A61B 1/00004: Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body, combined with photographic or television appliances

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Signal Processing (AREA)
  • Endoscopes (AREA)

Abstract

The present application relates to an electronic endoscope imaging system and method. The system comprises: a light source device for continuously emitting visible light toward an object to be observed while alternately emitting first near-infrared light and second near-infrared light, the visible light being reflected by the object to form reflected light and the near-infrared light exciting excitation light at the object; an image acquisition device positioned on the optical paths of the reflected light, the first excitation light, and the second excitation light; and an image processing device, electrically connected with the image acquisition device, for fusing the visible light image and the first excitation light image formed at the same instant into a first fused image, fusing the visible light image and the second excitation light image formed at the same instant into a second fused image, and then generating a target fused image. Information from different penetration depths of the object to be observed can thus be presented in a single image, allowing a physician to observe the metabolic information and the vascular morphology information of the object simultaneously, which facilitates diagnosis and treatment.

Description

Electronic endoscope imaging system and method
Technical Field
The application relates to the technical field of medical instruments, in particular to an electronic endoscope imaging system and method.
Background
With the development of medical technology, endoscopy has been introduced so that diseased sites inside a patient can be observed clearly and operations performed accurately: an endoscope widens the surgical field of view, acquires images of the diseased site, helps the physician observe it, and improves the physician's diagnostic and therapeutic capability.
In the conventional art, an image of a lesion is acquired by emitting visible light and near infrared light to the lesion.
In practice, however, observing tissue metabolism while avoiding cutting blood vessels during surgery requires the metabolic information and the vascular morphology information of the lesion to be acquired at the same time, and the images acquired by the conventional technique cannot contain both.
Disclosure of Invention
In view of the above, it is necessary to provide an electronic endoscope imaging system and method capable of generating an image that contains both the metabolic information and the vascular morphology information of a lesion site.
An electronic endoscope imaging system, the system comprising: a light source device for continuously emitting visible light toward an object to be observed while alternately emitting first near-infrared light and second near-infrared light, wherein the wavelength of the first near-infrared light is less than the wavelength of the second near-infrared light; the visible light is reflected by the object to be observed to form reflected light, the first near-infrared light excites the object to be observed to generate first excitation light, and the second near-infrared light excites the object to be observed to generate second excitation light, the object to be observed containing a developer corresponding to the first near-infrared light; an image acquisition device positioned on the optical paths of the reflected light, the first excitation light, and the second excitation light, for collecting the reflected light to form a visible light image, collecting the first excitation light to form a first excitation light image, and collecting the second excitation light to form a second excitation light image; and an image processing device, electrically connected with the image acquisition device, for fusing the visible light image and the first excitation light image formed at the same instant into a first fused image, fusing the visible light image and the second excitation light image formed at the same instant into a second fused image, and fusing the first fused image and the second fused image generated at adjacent instants into a target fused image.
In one embodiment, the image acquisition device includes: a beam splitting prism positioned on the optical paths of the reflected light, the first excitation light, and the second excitation light, for changing the optical path of at least one of them so that the optical path of the reflected light is not coplanar with the optical paths of the first and second excitation light; a first image sensor group positioned on the optical path of the reflected light after the beam splitting prism, for collecting the reflected light to form the visible light image; and a second image sensor group positioned on the optical path of the first and second excitation light after the beam splitting prism, for collecting the first excitation light to form the first excitation light image and collecting the second excitation light to form the second excitation light image.
In one embodiment, the beam splitting prism includes a first triangular prism and a second triangular prism joined by cementing their inclined surfaces. The acute angle between the inclined surface of the first triangular prism and the optical paths of the reflected light, the first excitation light, and the second excitation light is 45°, so that the optical path of at least one of the reflected light, the first excitation light, and the second excitation light is changed and the optical path of the reflected light becomes perpendicular to the optical paths of the first and second excitation light. Both the first triangular prism and the second triangular prism are 45° right-angle triangular prisms.
In one embodiment, the inclined surface of the first triangular prism is coated with a semi-transparent, semi-reflective multilayer dielectric film that reflects the reflected light and transmits the first and second excitation light, and the inclined surface of the second triangular prism is coated with a high-transmission film that transmits the first and second excitation light.
In one embodiment, the image acquisition device further comprises a light guide lens group positioned on the optical paths of the reflected light, the first excitation light, and the second excitation light, for converging them into the beam splitting prism.
In one embodiment, the image acquisition device further comprises a filter positioned on the optical paths of the reflected light, the first excitation light, and the second excitation light, for screening out all other light so that only the reflected light and the first and second excitation light pass.
In one embodiment, the light source device includes: a light source emitting module for emitting at least one of the visible light, the first near-infrared light, and the second near-infrared light; and a driving module connected with the light source emitting module, for driving the light source emitting module to continuously emit the visible light toward the object to be observed and to alternately emit the first near-infrared light and the second near-infrared light.
In one embodiment, the light source emitting module includes: a visible light emitting module for emitting the visible light; a first near-infrared light emitting module for emitting the first near-infrared light; and a second near-infrared light emitting module for emitting the second near-infrared light. The driving module includes: a visible light driving module connected with the visible light emitting module, for supplying its driving power; a first near-infrared light driving module connected with the first near-infrared light emitting module, for supplying its driving power; a second near-infrared light driving module connected with the second near-infrared light emitting module, for supplying its driving power; and a timing control module connected with the first and second near-infrared light driving modules, for controlling them to work alternately according to a preset timing signal.
In one embodiment, the first image sensor group comprises two visible light image sensors arranged side by side in a coplanar manner; the second image sensor group comprises two near infrared light image sensors which are arranged side by side in a coplanar manner.
An electronic endoscopic imaging method, the method comprising:
continuously emitting visible light toward an object to be observed while alternately emitting first near-infrared light and second near-infrared light, wherein the wavelength of the first near-infrared light is less than the wavelength of the second near-infrared light; the visible light is reflected by the object to be observed to form reflected light, the first near-infrared light excites the object to be observed to generate first excitation light, and the second near-infrared light excites the object to be observed to generate second excitation light, the object to be observed containing a developer corresponding to the first near-infrared light;
collecting the reflected light to form a visible light image, collecting the first excitation light to form a first excitation light image, and collecting the second excitation light to form a second excitation light image;
and fusing the visible light image and the first excitation light image formed at the same instant into a first fused image, fusing the visible light image and the second excitation light image formed at the same instant into a second fused image, and fusing the first fused image and the second fused image generated at adjacent instants into a target fused image.
The electronic endoscope imaging system and method work as follows. The light source device continuously emits visible light toward the object to be observed while alternately emitting the first near-infrared light and the second near-infrared light; the irradiated object reflects the visible light as reflected light, is excited by the first near-infrared light to generate first excitation light, and is excited by the second near-infrared light to generate second excitation light. The image acquisition device collects the reflected light and the two excitation lights, generating a visible light image from the reflected light and first and second excitation light images from the first and second excitation light respectively. Because the wavelength of the first near-infrared light is shorter than that of the second near-infrared light, the first near-infrared light penetrates less deeply; the first excitation light image and the second excitation light image therefore contain content from different depths of the object to be observed, and images of different penetration depths are obtained. The image processing device fuses the visible light image and the first excitation light image formed at the same instant into a first fused image, fuses the visible light image and the second excitation light image formed at the same instant into a second fused image, and fuses the first and second fused images generated at adjacent instants into a target fused image. Because the two near-infrared lights are emitted alternately, the first and second excitation light images are each generated at half the frame rate of the visible light image, so every first fused image has a corresponding second fused image, and the resulting target fused images pair them one to one. The target fused image therefore has a high frame rate and is clear and coherent, presenting information from different depths of the object simultaneously. In summary, the system of the present application can present information from different penetration depths of the object to be observed in a single image, allowing a physician to observe the metabolic information and the vascular morphology information of the object at the same time, which facilitates diagnosis and treatment.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application or in the conventional technology, the drawings needed for describing them are briefly introduced below. The drawings described below are obviously only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an electronic endoscopic imaging system in one embodiment;
FIG. 2 is a schematic flow chart illustrating the fusion of the visible light image and the first excitation light image according to one embodiment;
FIG. 3 is a schematic flow chart illustrating the fusion of the visible light image and the second excitation light image according to one embodiment;
FIG. 4 is a schematic illustration of image fusion in one embodiment;
FIG. 5 is a schematic structural diagram of an image pickup apparatus according to an embodiment;
FIG. 6 is a schematic structural diagram of a lens barrel at the front part of the image capturing device in one embodiment;
FIG. 7 is a schematic flow chart of 3D fusion of images according to an embodiment;
FIG. 8 is a front view of an image pickup apparatus in one embodiment;
FIG. 9 is a side view of the structure of an image pickup apparatus in one embodiment;
FIG. 10 is a graph of the intensity of light detected by the image sensor in one embodiment;
FIG. 11 is a schematic structural view of an image pickup apparatus in another embodiment;
FIG. 12 is a diagram illustrating an exemplary light guide lens assembly;
FIG. 13 is a spectrum of the output beam in one embodiment;
FIG. 14 is a spectral plot of a light beam collected in one embodiment;
FIG. 15 is a schematic diagram of a signal transmission flow according to an embodiment;
FIG. 16 is a schematic structural view of a light source device in one embodiment;
FIG. 17 is a schematic diagram showing a detailed structure of a light source device in one embodiment;
FIG. 18 is a timing diagram of near infrared light emission in one embodiment;
FIG. 19 is a signal transmission schematic of an electronic endoscopic imaging system in one embodiment;
FIG. 20 is a schematic diagram illustrating the overall workflow of an electronic endoscopic imaging system in one embodiment;
FIG. 21 is a schematic diagram illustrating the signal transmission of the light source output in one embodiment;
FIG. 22 is a flow diagram of a method of electronic endoscopic imaging according to an embodiment.
Description of reference numerals: 10-light source device, 20-image acquisition device, 30-image processing device, 11-light source emitting module, 12-driving module, 21-beam splitting prism, 22-first image sensor group, 23-second image sensor group, 211-first triangular prism, 212-second triangular prism, 100-first plane, 200-second plane, 221-first visible light sensor, 222-second visible light sensor, 231-first near-infrared light sensor, 232-second near-infrared light sensor, 300-light source light outlet, 251-first lens, 252-second lens, 24-light guide lens group, 240-light guide beam, 241-lens group, 400-visible light, 500-excitation light, 110-visible light emitting module, 111-first near-infrared light emitting module, 112-second near-infrared light emitting module, 120-visible light driving module, 121-first near-infrared light driving module, 122-second near-infrared light driving module, 123-timing control module, 40-display device.
Detailed Description
To facilitate an understanding of the present application, the present application will now be described more fully with reference to the accompanying drawings. Embodiments of the present application are set forth in the accompanying drawings. This application may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another.
Spatially relative terms, such as "under," "below," "beneath," "over," "above," and the like, may be used herein to describe one element or feature's relationship to another element or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements or features described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the exemplary terms "below" and "beneath" can encompass both an orientation of above and below. The device may also be otherwise oriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
It will be understood that when an element is referred to as being "connected" to another element, it can be directly connected to the other element or be connected to the other element through intervening elements. Further, "connection" in the following embodiments is understood to mean "electrical connection", "communication connection", or the like, if there is a transfer of electrical signals or data between the connected objects.
As used herein, the singular forms "a", "an" and "the" may include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises/comprising," "includes" or "including," etc., specify the presence of stated features, integers, steps, operations, components, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, components, parts, or combinations thereof.
As described in the background, endoscope imaging systems in the related art cannot acquire an image containing both the metabolic information and the vascular morphology information of a lesion. The inventors found the cause to be that such systems emit visible light together with near-infrared light of a single fixed wavelength: the penetration depth of near-infrared light of a given wavelength is fixed, so only image information at one fixed tissue depth of the lesion can be acquired, and because the metabolic information and the vascular morphology lie at different tissue depths, near-infrared light of a fixed wavelength can reveal only one of them.
In view of the above, the present invention provides an electronic endoscopic imaging system and method capable of generating an image including both metabolic information and vascular morphology information of a lesion.
In one embodiment, as shown in fig. 1, there is provided an electronic endoscope imaging system comprising a light source device 10, an image acquisition device 20, and an image processing device 30. Wherein:
The light source device 10 is used for continuously emitting visible light toward the object to be observed and alternately emitting the first near-infrared light and the second near-infrared light.
Specifically, the wavelength of the first near-infrared light is smaller than the wavelength of the second near-infrared light. The visible light is reflected by the object to be observed to form reflected light, the first near-infrared light excites the object to be observed to generate first excitation light, and the second near-infrared light excites it to generate second excitation light; the object to be observed contains a developer corresponding to the first near-infrared light.
Illustratively, when the developer is indocyanine green, the wavelength of the first near-infrared light is 780 nm to 811 nm. The developer distributes with the tissue metabolism of the object to be observed, and under near-infrared excitation of sufficient intensity, electrons in the developer enter an excited state and then drop back to the ground state, emitting fluorescence. Normally, about 97% of indocyanine green is cleared from the blood within 20 minutes of intravenous injection. Where a liver tumor or a cirrhotic nodule is present, the biliary excretion function of the hepatocytes in the diseased tissue is impaired, so indocyanine green is selectively retained there; this delayed-clearance phenomenon distinguishes the diseased tissue from the surrounding normal tissue and makes it visible. The excitation wavelength of indocyanine green is 808 nm, so the first near-infrared light can excite the developer to fluoresce; the first excitation light thus generated has a wavelength of 830-850 nm, preferably 835 nm, and reflects the metabolic information of the object to be observed. Other chemical substances can also serve as the developer: rhodamine-series dyes have a corresponding excitation wavelength of 520-600 nm, so the first near-infrared light is 520-600 nm when they are used; cyanine (Cy) dyes correspond to 550-780 nm, so the first near-infrared light is 550-780 nm when they are used; and AlexaFluor-series dyes correspond to 340-680 nm, so the first near-infrared light is 340-680 nm when they are used.
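For illustration only, the developer-to-excitation-band correspondence listed above can be kept in a small lookup table; a minimal Python sketch, where the dictionary and helper names are our own and the values simply restate this paragraph:

```python
# Excitation bands (nm) for the developers named above; values restate the text.
DEVELOPER_EXCITATION_NM = {
    "indocyanine green": (780, 811),   # peak 808 nm; first excitation light at 830-850 nm
    "rhodamine series": (520, 600),
    "cyanine (Cy) series": (550, 780),
    "AlexaFluor series": (340, 680),
}

def first_nir_band_for(developer: str):
    """Return the band the first near-infrared light must fall in for this developer."""
    return DEVELOPER_EXCITATION_NM[developer]
```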
Illustratively, the wavelength of the second near-infrared light is 855 nm to 865 nm. The penetration depth of near-infrared light at this wavelength is about 6 mm, deep enough to reach the deeper layers of the object to be observed; the second excitation light generated by this excitation, at 855-865 nm, therefore reflects the vascular morphology information of the object.
The image acquisition device 20, positioned on the optical paths of the reflected light, the first excitation light, and the second excitation light, is used for collecting the reflected light to form a visible light image, collecting the first excitation light to form a first excitation light image, and collecting the second excitation light to form a second excitation light image.
Specifically, the image acquisition device 20 includes a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor, capable of converting optical signals into electrical signals.
The image processing device 30, electrically connected with the image acquisition device 20, is used for fusing the visible light image and the first excitation light image formed at the same instant into a first fused image, fusing the visible light image and the second excitation light image formed at the same instant into a second fused image, and fusing the first and second fused images generated at adjacent instants into a target fused image.
Specifically, fig. 2 illustrates the fusion of the visible light image and the first excitation light image. The visible light signal is processed by a visible-light ISP (Image Signal Processing) pipeline to obtain the visible light image, and the first excitation light signal is processed by a near-infrared ISP to obtain the first excitation light image. The first excitation light image is then converted to grayscale with enhanced brightness and contrast, and threshold segmentation separates the informative regions of the image; fluorescence coloring is applied, the first excitation light image is registered to the visible light image of the same instant using a global motion vector estimation algorithm, and the two images of the same frame are fused with a Laplacian pyramid algorithm. A three-level pyramid is used as a balance between image quality and fusion speed, yielding the first fused image, which contains both fluorescence information and visible light information.
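For illustration, the chain just described (grayscale enhancement, threshold segmentation, fluorescence coloring, three-level Laplacian pyramid fusion) maps onto standard OpenCV operations. A minimal Python sketch, assuming the two frames are already registered and the same size, with the threshold value, colormap, and blend weights chosen arbitrarily since the patent does not specify them:

```python
import cv2
import numpy as np

def laplacian_pyramid(img, levels=3):
    """Three-level Laplacian pyramid, matching the quality/speed trade-off above."""
    gauss = [img.astype(np.float32)]
    for _ in range(levels - 1):
        gauss.append(cv2.pyrDown(gauss[-1]))
    lap = [gauss[i] - cv2.pyrUp(gauss[i + 1],
                                dstsize=(gauss[i].shape[1], gauss[i].shape[0]))
           for i in range(levels - 1)]
    lap.append(gauss[-1])                      # keep the coarsest level as-is
    return lap

def collapse(lap):
    """Rebuild an image from a Laplacian pyramid."""
    img = lap[-1]
    for level in reversed(lap[:-1]):
        img = cv2.pyrUp(img, dstsize=(level.shape[1], level.shape[0])) + level
    return np.clip(img, 0, 255).astype(np.uint8)

def fuse_first(visible_bgr, first_excitation_gray):
    """Grayscale enhancement, threshold segmentation, fluorescence coloring,
    then pyramid blending with the registered visible frame."""
    g = cv2.normalize(first_excitation_gray, None, 0, 255, cv2.NORM_MINMAX)
    _, mask = cv2.threshold(g, 40, 255, cv2.THRESH_BINARY)   # threshold is a guess
    color = cv2.applyColorMap(g, cv2.COLORMAP_SPRING)        # "fluorescence coloring"
    color = cv2.bitwise_and(color, color, mask=mask)
    fused = [0.6 * a + 0.4 * b                               # weights are placeholders
             for a, b in zip(laplacian_pyramid(visible_bgr),
                             laplacian_pyramid(color))]
    return collapse(fused)
```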
Specifically, fig. 3 illustrates the fusion of the visible light image and the second excitation light image. The second excitation light signal is processed by the near-infrared ISP to obtain the second excitation light image. That image is then converted to grayscale with enhanced brightness and contrast, smoothed by Gaussian denoising, and segmented using the Hessian matrix to extract the vascular tissue, which is then electronically stained. The second excitation light image is registered to the visible light image of the same instant using a global motion vector estimation algorithm, and the two images of the same frame are fused with the Laplacian pyramid algorithm.
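A corresponding sketch for the vascular channel; the Frangi vesselness filter used below is one standard Hessian-matrix-based vessel enhancer and stands in for whatever Hessian segmentation the authors actually use, with every parameter guessed:

```python
import cv2
import numpy as np
from skimage.filters import frangi   # Hessian-based vesselness measure

def stain_vessels(second_excitation_gray):
    """Gaussian denoising, Hessian-based vessel extraction, electronic staining."""
    smooth = cv2.GaussianBlur(second_excitation_gray, (5, 5), 1.5)
    vesselness = frangi(smooth.astype(np.float32) / 255.0, sigmas=range(1, 8, 2))
    mask = vesselness > (vesselness.mean() + 2.0 * vesselness.std())
    stained = np.zeros(second_excitation_gray.shape + (3,), np.uint8)
    stained[mask] = (0, 0, 255)      # stain extracted vessels red (BGR); arbitrary
    return stained                   # ready to register and pyramid-fuse as above
```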
Illustratively, as shown in fig. 4, the first and second excitation light images alternate in time while the visible light images are continuous. A first excitation light image is fused with the visible light image of the same instant to obtain a first fused image, a second excitation light image is fused with the visible light image of the same instant to obtain a second fused image, and the first and second fused images of adjacent instants are then fused to obtain the target fused image.
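The frame pairing implied by fig. 4 can be written down directly; a sketch with invented names, where addWeighted stands in for the final fusion step:

```python
import cv2

def build_target_frames(frames):
    """frames: per-instant pairs (first_fused, second_fused), exactly one of which
    is None because the two NIR sources alternate. Each first fused image is merged
    with the second fused image of the adjacent instant into a target fused image."""
    targets, pending = [], None
    for first, second in frames:
        if first is not None:
            pending = first                      # hold until the adjacent frame arrives
        elif second is not None and pending is not None:
            targets.append(cv2.addWeighted(pending, 0.5, second, 0.5, 0))
            pending = None
    return targets
```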
In this embodiment, the light source device continuously emits visible light toward the object to be observed while alternately emitting the first near-infrared light and the second near-infrared light; the irradiated object reflects the visible light as reflected light, is excited by the first near-infrared light to generate first excitation light, and is excited by the second near-infrared light to generate second excitation light. The image acquisition device collects the reflected light and the two excitation lights, generating a visible light image from the reflected light and first and second excitation light images from the first and second excitation light respectively. Because the wavelength of the first near-infrared light is shorter than that of the second near-infrared light, the first near-infrared light penetrates less deeply; the two excitation light images therefore contain content from different depths of the object to be observed, and images of different penetration depths are obtained. The image processing device fuses the visible light image and the first excitation light image formed at the same instant into a first fused image, fuses the visible light image and the second excitation light image formed at the same instant into a second fused image, and fuses the first and second fused images generated at adjacent instants into a target fused image. Because the two near-infrared lights are emitted alternately, the first and second excitation light images are each generated at half the frame rate of the visible light image, so every first fused image has a corresponding second fused image, and the resulting target fused images pair them one to one. The target fused image therefore has a high frame rate and is clear and coherent, presenting information from different depths of the object simultaneously. In summary, the system of the present application can present information from different penetration depths of the object to be observed in a single image, allowing a physician to observe the metabolic information and the vascular morphology information of the object at the same time, which facilitates diagnosis and treatment.
In one embodiment, as shown in fig. 5, the image acquisition device 20 comprises a beam splitting prism 21, a first image sensor group 22, and a second image sensor group 23. Wherein:
The beam splitting prism 21 is located on the optical paths of the reflected light, the first excitation light, and the second excitation light, and is configured to change the optical path of at least one of them so that the optical path of the reflected light is not coplanar with the optical paths of the first and second excitation light.
Specifically, as shown in fig. 6, the beam splitting prism 21 includes a first triangular prism 211 and a second triangular prism 212 joined by cementing their inclined surfaces. The acute angle between the inclined surface of the first triangular prism 211 and the optical paths of the reflected light, the first excitation light, and the second excitation light is 45°, so that the optical path of at least one of them is changed and the optical path of the reflected light becomes perpendicular to the optical paths of the first and second excitation light. Both the first triangular prism 211 and the second triangular prism 212 are 45° right-angle triangular prisms. The reflected light and the excitation light, now traveling on mutually perpendicular paths, strike the first plane 100 and the second plane 200 respectively.
Specifically, the inclined surface of the first triangular prism 211 is coated with a semi-transparent, semi-reflective multilayer dielectric film that reflects the reflected light and transmits the first and second excitation light; the inclined surface of the second triangular prism 212 is coated with a high-transmission film that transmits the first and second excitation light, and the two inclined surfaces face each other.
Illustratively, the beam splitting prism 21 is configured either to reflect the visible light in the incident beam and transmit the near-infrared light, or to reflect the near-infrared light and transmit the visible light. Whichever it reflects, the positions of the first and second image sensor groups are adapted accordingly, as long as the first image sensor group sits on the optical path of the visible light and the second image sensor group on the optical path of the near-infrared light.
The first image sensor group 22, positioned on the optical path of the reflected light after the beam splitting prism 21, is used for collecting the reflected light to form the visible light image.
Specifically, the first image sensor group includes two visible light image sensors arranged side by side and coplanar: the two sensors sit next to each other, face the same direction, and have their photosensitive areas in the same plane, so they can simultaneously collect two visible light beams and form two visible light images with horizontal parallax.
The second image sensor group 23, positioned on the optical path of the first and second excitation light after the beam splitting prism 21, is used for collecting the first excitation light to form the first excitation light image and collecting the second excitation light to form the second excitation light image.
Specifically, the second image sensor group includes two near-infrared light image sensors arranged side by side and coplanar: again the sensors sit next to each other, face the same direction, and have their photosensitive areas in the same plane, so they can simultaneously collect two near-infrared beams and form two excitation light images with horizontal parallax.
Specifically, since the beam splitting prism 21 separates the reflected light and the excitation light onto mutually perpendicular optical paths, the first and second image sensor groups are likewise mounted perpendicular to each other, at the first plane 100 and the second plane 200 respectively. This arrangement leaves as much clearance as possible around the sensors, which aids their heat dissipation and prevents heat from accumulating in the front section of the lens. It also uses the space inside the endoscope barrel efficiently: one sensor group sits on the barrel cross-section and the other on the barrel side wall, so no extra space is needed and the barrel diameter is unaffected. The arrangement further lets the two sensor groups receive the visible light and the near-infrared light simultaneously, so the collected beams are aligned in time, which simplifies the subsequent image fusion. Finally, it leaves room for the beam splitting prism 21, whose position fully covers the photosensitive areas of the image sensors, guaranteeing that the optical signals are captured to the maximum extent, making full use of the sensors' photosensitive areas, and providing the basis for subsequent high-resolution image processing.
Specifically, each image sensor group contains two sensors placed side by side, which satisfies the hardware requirement for 3D (three-dimensional) imaging: two beams with horizontal parallax are received simultaneously, forming two 2D images with horizontal parallax at the same instant, and these can be fused directly into a 3D image whose resolution and brightness are not attenuated relative to the 2D images before fusion. The images acquired by the two sensors of a group are fed separately into the image processing device and processed through the fusion flow described above, yielding two 2D target fusion images with horizontal parallax at each instant; a 3D vision processing algorithm then combines them into a 3D target fusion image. Because the 3D target fusion image is stereoscopic, the physician can observe the metabolic information and the vascular morphology information of a lesion even more clearly during diagnosis and treatment.
Exemplarily, as shown in fig. 7, the left road target fusion image and the right road target fusion image with horizontal parallax may be directly and respectively output, or may be synthesized by a 3D visual processing algorithm to obtain a 3D target fusion image.
Illustratively, fig. 8 is a front view and fig. 9 a side view of the image acquisition device. The two sensor groups are mounted perpendicular to each other inside the endoscope barrel, one on the side where the beam splitting prism 21 reflects the beam and one on the side where it transmits the beam, so visible light and near-infrared light can be captured simultaneously. Each group has two sensors: the first image sensor group 22 includes a first visible light sensor 221 and a second visible light sensor 222, and the second image sensor group 23 includes a first near-infrared light sensor 231 and a second near-infrared light sensor 232. The two collected visible light channels and the two near-infrared channels can each undergo 3D vision processing to produce a visible light 3D image and a near-infrared 3D image. Light source light outlets 300 are provided above and below the barrel to keep the illumination of the scene uniform, making the imaging more stable. A first lens 251 and a second lens 252 are placed in front of the beam splitting prism 21 to converge the beams.
Illustratively, the 3D vision processing is to fuse two 2D images with horizontal parallax to obtain a 3D image, which may be implemented by using existing image fusion software or algorithms.
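The patent leaves this 3D step to existing software or algorithms; a red-cyan anaglyph is one simple way to fuse a horizontal-parallax pair into a single 3D view, sketched here for illustration only:

```python
import numpy as np

def anaglyph(left_bgr: np.ndarray, right_bgr: np.ndarray) -> np.ndarray:
    """Red-cyan anaglyph: red channel from the left target fusion image, blue and
    green from the right; each channel keeps its original resolution and brightness."""
    out = right_bgr.copy()
    out[..., 2] = left_bgr[..., 2]   # channel 2 is red in OpenCV's BGR order
    return out
```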
Illustratively, the image sensors are RGB (red, green, blue) CMOS (Complementary Metal Oxide Semiconductor) sensors. Fig. 10 plots the detected light intensity of such a CMOS sensor: photon conversion efficiency is high in the 400-700 nm band and still relatively high (about 50%) in the near-infrared band (830-865 nm), so both the visible light images and the near-infrared images generated by the CMOS sensors have good imaging quality. QE (quantum efficiency) describes the photon conversion efficiency; the higher the value, the better the sensor captures optical signals at that wavelength.
In this embodiment, the beam splitting prism separates the reflected light from the excitation light, and each is collected by its corresponding image sensor group to form the visible light image and the excitation light images. The two separated beams are perpendicular, and the two sensor groups are likewise mounted perpendicular to each other, matching the geometry of the endoscope barrel. Two images can be generated simultaneously, which facilitates the subsequent fusion; and because each group consists of two sensors arranged side by side and coplanar, the captured images carry horizontal parallax, allowing 3D images to be generated without loss of resolution or brightness. The result is two binocular visible light channels and two binocular near-infrared channels of high resolution, brightness, and definition, ready for the subsequent image fusion.
In one embodiment, as shown in fig. 11, the image acquisition device further includes a light guide lens group 24.
The light guide lens group 24, positioned on the optical paths of the reflected light, the first excitation light, and the second excitation light, converges them into the beam splitting prism 21.
Exemplarily, as shown in fig. 6 and 12, the light guide lens group 24 includes a lens group 241 whose specific structure is shown in fig. 6: objective lenses arranged inside the lens barrel converge the beams so that they travel along the intended track to the beam splitting prism 21 behind them.
In this embodiment, by providing the light guide lens group, the reflected light and the excitation light can be transmitted according to a specific track and converged onto the beam splitter prism, thereby facilitating subsequent imaging.
In one embodiment, the image acquisition device 20 further comprises a filter positioned on the optical paths of the reflected light, the first excitation light, and the second excitation light, used to screen out all other light so that only the reflected light and the first and second excitation light are collected.
Specifically, the filter can be an infrared cut-off filter together with a lens coating layer covering it, used to block infrared light in fixed wavelength bands. For example, as shown in fig. 13, the visible light occupies 400-700 nm, the first near-infrared light 805-810 nm, and the second near-infrared light 855-865 nm. As shown in fig. 14, visible light reflected by the object to be observed spans 400-700 nm; under the first near-infrared light, the object emits an excited near-infrared signal in the 830-850 nm band; and under the second near-infrared light, it emits an excited signal in the 855-865 nm band. The incident light collected by the image acquisition device is therefore either reflected visible light at 400-700 nm together with near-infrared light at 830-850 nm, or reflected visible light at 400-700 nm together with near-infrared light at 855-865 nm. Light outside these ranges is removed by the filter: the infrared cut-off filter blocks the narrow-band near-infrared source light at 798-818 nm, guaranteeing that all collected near-infrared light was excited by the object to be observed, and the lens coating layer blocks the band between the visible light and the narrow-band near-infrared light.
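For illustration only, the pass/block behaviour just described reduces to two pass bands; a minimal Python sketch, with the band edges copied from this paragraph and the function name our own:

```python
def passes_filter_stack(wavelength_nm: float) -> bool:
    """True if light of this wavelength survives the infrared cut-off filter and the
    lens coating layer: reflected visible light (400-700 nm) and the excited NIR
    bands (830-865 nm) pass; the 798-818 nm source band and the gap between
    visible and NIR are blocked."""
    return 400.0 <= wavelength_nm <= 700.0 or 830.0 <= wavelength_nm <= 865.0
```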
Illustratively, as shown in fig. 15, a signal transmission flow diagram of image capturing apparatus 20 is shown. Incident light sequentially passes through the light guide lens group, the infrared cut-off filter plate and the lens film coating layer, is divided into visible light and near infrared light through the light splitting prism, and is respectively transmitted to the first image sensor group and the second image sensor group.
In this embodiment, the filter removes light of all other wavelengths and passes only the reflected light and the first and second excitation light, ensuring that the collected beams are exclusively those reflected or excited by the object to be observed and avoiding interference from ambient light.
In one embodiment, as shown in fig. 16, the light source device 10 includes a light source emitting module 11 and a driving module 12, wherein:
the light source emitting module 11 is configured to emit at least one of visible light, first near-infrared light, and second near-infrared light.
Specifically, the wavelength of visible light is 400nm to 700 nm.
Specifically, the wavelength of the first near-infrared light is 805 nm to 810 nm, the band required to excite the fluorescent developer so that it emits fluorescence.
Specifically, the wavelength of the second near-infrared light is 855-865 nm. Its penetration depth is about 6 mm, so it can reach deeper blood vessel positions and reveal the morphology of the vessels.
Specifically, as shown in fig. 17, the light source emitting module 11 includes: a visible light emitting module 110 for emitting the visible light, a first near-infrared light emitting module 111 for emitting the first near-infrared light, and a second near-infrared light emitting module 112 for emitting the second near-infrared light.
The driving module 12, connected with the light source emitting module 11, is used for driving the light source emitting module 11 to continuously emit the visible light toward the object to be observed and to alternately emit the first near-infrared light and the second near-infrared light.
Specifically, as shown in fig. 17, the driving module 12 includes: a visible light driving module 120 connected with the visible light emitting module, for supplying its driving power; a first near-infrared light driving module 121 connected with the first near-infrared light emitting module, for supplying its driving power; a second near-infrared light driving module 122 connected with the second near-infrared light emitting module, for supplying its driving power; and a timing control module 123 connected with the first near-infrared light driving module 121 and the second near-infrared light driving module 122, for controlling them to work alternately according to a preset timing signal.
Specifically, after receiving the preset timing signal, the timing control module 123 controls the first near-infrared light driving module 121 and the second near-infrared light driving module 122 to alternately operate according to the preset timing signal, so that the first near-infrared light and the second near-infrared light are alternately output according to the preset timing signal.
Illustratively, the preset timing signals are shown in fig. 18: PWM (Pulse Width Modulation) signals PWM1 and PWM2 correspond to the first and second near-infrared light respectively, and each near-infrared light is output while its PWM signal is high, so the two are output alternately. The preset timing signal is kept consistent with the timing of the image processing device: while the light source device emits the first near-infrared light, the image processing device generates a first fused image from the visible light image and the first excitation light image; while it emits the second near-infrared light, the image processing device generates a second fused image from the visible light image and the second excitation light image.
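For illustration, the alternation the timing control module enforces can be sketched in software; the enable callbacks and the half period below are assumptions, not part of the disclosed hardware:

```python
import itertools
import time

def run_timing_control(enable_first_nir, enable_second_nir, frame_period_s=1 / 60):
    """Drive the two NIR driver modules out of phase, one frame each, mirroring the
    PWM1/PWM2 diagram of fig. 18; the callbacks and period are hypothetical."""
    for first_active in itertools.cycle((True, False)):
        enable_first_nir(first_active)        # first NIR emits while PWM1 is high
        enable_second_nir(not first_active)   # second NIR emits while PWM2 is high
        time.sleep(frame_period_s)
```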
Specifically, the light source device can converge the visible light with the first near-infrared light, or the visible light with the second near-infrared light, and output them as a single beam.
Specifically, a light guide module consisting of a series of lens groups adjusts the beam paths so that, by reflection, the visible light and the first near-infrared light, or the visible light and the second near-infrared light, are converged into that one output beam.
In the present embodiment, the light source device can output light beams of three different wavelengths: it continuously outputs white light while alternately outputting the first near-infrared light and the second near-infrared light according to the timing signal, providing the illumination required by the system of the present application.
In one embodiment, as shown in fig. 19, the electronic endoscopic imaging system further comprises: a display device 40.
The display device 40 is connected to the image processing device 30, and is used for displaying the image output by the image processing device 30 for the doctor to observe.
The image processing device 30 also outputs a preset timing signal to the light source device 10, controlling the light source device 10 to alternately output the first near-infrared light and the second near-infrared light in accordance with that signal.
Illustratively, the preset timing signal is the one shown in fig. 18 and described above: PWM1 and PWM2 correspond to the first and second near-infrared light, and the image processing device generates the first or second fused image in step with whichever near-infrared light the light source device is currently emitting.
Illustratively, as shown in fig. 20, a full workflow diagram of an electronic endoscopic imaging system is provided.
The image acquisition device 20 is an endoscope barrel structure composed of an endoscope lens and detectors, capable of capturing external light beams.
The white light and near-infrared light emitted by the light source device 10 are delivered through the light guide bundle 240 running through the middle of the image capture device 20 and exit from the device.
For example, as shown in fig. 21, the light source device 10 couples its converged output beam into the light guide bundle 240, which exits through the image capture device 20; the light guide bundle 240 may be an optical fiber capable of transmitting the light output by the light source device 10 to the image capture device 20.
After receiving the visible light image and the excitation light images, the image processing device 30 processes them according to the method of the above embodiments to obtain the target fused image, and after 3D vision processing outputs the 3D target fused image to the display device 40; one possible form of this last packing step is sketched below.
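Purely as an illustration of what the 3D output might look like, the sketch below assembles a left-eye and a right-eye target fused frame (claim 9 places two image sensors side by side in each sensor group) into a side-by-side stereo frame for the display. The patent does not specify the 3D vision processing; the packing format, the array shapes, and the function name pack_side_by_side are assumptions.

    import numpy as np

    def pack_side_by_side(left, right):
        # Concatenate two H x W x 3 fused frames into one H x 2W x 3 stereo frame.
        assert left.shape == right.shape, "stereo pair must share one resolution"
        return np.concatenate([left, right], axis=1)

    h, w = 1080, 1920
    left_fused = np.zeros((h, w, 3), dtype=np.uint8)    # placeholder target fused frames
    right_fused = np.zeros((h, w, 3), dtype=np.uint8)
    stereo_frame = pack_side_by_side(left_fused, right_fused)
    print(stereo_frame.shape)                           # (1080, 3840, 3)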
In one embodiment, as shown in fig. 22, there is provided an electronic endoscopic imaging method comprising:
In step S2200, visible light is continuously emitted toward the object to be observed while the first near-infrared light and the second near-infrared light are alternately emitted.
Specifically, the wavelength of the first near-infrared light is smaller than that of the second near-infrared light; the visible light is reflected by the object to be observed to form reflected light, the first near-infrared light excites the object to be observed to generate the first excitation light, and the second near-infrared light excites the object to be observed to generate the second excitation light.
In step S2202, the reflected light is collected to form a visible light image, the first excitation light is collected to form a first excitation light image, and the second excitation light is collected to form a second excitation light image.
In step S2204, the visible light image and the first excitation light image formed at the same time are fused into a first fused image, and the visible light image and the second excitation light image formed at the same time are fused into a second fused image.
In step S2206, the first fused image and the second fused image generated at adjacent times are fused into the target fused image.
In this embodiment, visible light is continuously emitted toward the object to be observed while the first near-infrared light and the second near-infrared light are alternately emitted. Under visible light irradiation, the object reflects the visible light to form reflected light; under irradiation by the first near-infrared light the object is excited to generate the first excitation light, and under irradiation by the second near-infrared light it is excited to generate the second excitation light.

The reflected light, the first excitation light, and the second excitation light are collected; a visible light image is generated from the reflected light, and a first excitation light image and a second excitation light image are generated from the first and second excitation light respectively. Because the wavelength of the first near-infrared light is smaller than that of the second near-infrared light, the first excitation light image and the second excitation light image capture different penetration depths of the object to be observed.

The visible light image and the first excitation light image formed at the same time are fused into a first fused image, the visible light image and the second excitation light image formed at the same time are fused into a second fused image, and the first fused image and the second fused image generated at adjacent times are fused into the target fused image. Because the first near-infrared light and the second near-infrared light are emitted alternately, the first excitation light image and the second excitation light image are each generated at half the frame rate of the visible light image, so each first fused image has a corresponding second fused image, and the resulting target fused images contain first and second fused images in one-to-one correspondence. The target fused image therefore has a high frame rate and is clear and temporally consistent, presenting information from different depths of the object simultaneously and clearly.

In summary, the method presents information from different penetration depths of the object to be observed in a single image, allowing the doctor to observe metabolic information and blood vessel morphology at the same time and facilitating diagnosis and treatment. A minimal sketch of this frame-pairing pipeline follows.
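The sketch below walks through steps S2200 to S2206 under stated assumptions: a simple weighted blend stands in for the patent's unspecified fusion operation, capture() is a placeholder for one acquisition instant, and the array shapes are arbitrary.

    import numpy as np

    H, W = 480, 640

    def capture():
        # Placeholder for one acquisition instant: (visible image, excitation image).
        return np.zeros((H, W, 3), np.uint8), np.zeros((H, W, 3), np.uint8)

    def blend(a, b, weight=0.5):
        # Weighted blend standing in for the patent's unspecified fusion operation.
        mixed = weight * a.astype(np.float32) + (1.0 - weight) * b.astype(np.float32)
        return mixed.astype(np.uint8)

    targets = []
    for k in range(3):                            # three adjacent pairs of instants
        vis_a, exc_first = capture()              # instant 2k:   first NIR fires
        vis_b, exc_second = capture()             # instant 2k+1: second NIR fires
        fused_first = blend(vis_a, exc_first)     # step S2204, first fused image
        fused_second = blend(vis_b, exc_second)   # step S2204, second fused image
        targets.append(blend(fused_first, fused_second))  # step S2206, target image
    print(len(targets))                           # one target fused image per pair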
It should be understood that, although the steps in the flowchart of fig. 22 are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in fig. 22 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
In the description herein, references to "some embodiments," "other embodiments," "desired embodiments," and the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic descriptions using such terminology do not necessarily refer to the same embodiment or example.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as such combinations contain no contradiction, they should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An electronic endoscopic imaging system, comprising:
the light source device is used for continuously emitting visible light to an object to be observed and alternately emitting first near-infrared light and second near-infrared light; wherein the wavelength of the first near-infrared light is less than the wavelength of the second near-infrared light; the visible light is reflected on the object to be observed to form reflected light, the first near-infrared light excites the object to be observed to generate first excitation light, and the second near-infrared light excites the object to be observed to generate second excitation light, wherein the object to be observed contains a developer corresponding to the first near-infrared light;
the image capture device is positioned on the light paths of the reflected light, the first excitation light and the second excitation light, and is used for collecting the reflected light to form a visible light image, collecting the first excitation light to form a first excitation light image, and collecting the second excitation light to form a second excitation light image;
and the image processing device is electrically connected with the image acquisition device and is used for fusing the visible light image and the first excitation light image formed at the same moment into a first fused image, fusing the visible light image and the second excitation light image formed at the same moment into a second fused image, and fusing the first fused image and the second fused image generated at adjacent moments into a target fused image.
2. The system of claim 1, wherein the image capture device comprises:
the beam splitter prism is positioned on the light paths of the reflected light, the first excitation light and the second excitation light and used for changing the light path of at least one of the reflected light, the first excitation light and the second excitation light so that the light path of the reflected light is not coplanar with the light path of the first excitation light and the light path of the second excitation light;
the first image sensor group is positioned on a light path of the reflected light passing through the light splitting prism and used for collecting the reflected light to form a visible light image;
and the second image sensor group is positioned on the light path of the first excitation light and the second excitation light passing through the light splitting prism and used for collecting the first excitation light to form a first excitation light image and collecting the second excitation light to form a second excitation light image.
3. The system of claim 2, wherein the beam splitting prism comprises: a first triangular prism and a second triangular prism connected at their inclined surfaces, wherein the inclined surface of the first triangular prism forms an acute included angle of 45° with the light paths of the reflected light, the first excitation light and the second excitation light, and is used for changing the light path of at least one of the reflected light, the first excitation light and the second excitation light so that the light path of the reflected light is perpendicular to the light paths of the first excitation light and the second excitation light, wherein the first triangular prism and the second triangular prism are 45° right-angle triangular prisms.
4. The system of claim 3, wherein the inclined surface of the first triangular prism is coated with a semi-transparent and semi-reflective multi-layer dielectric film for reflecting the reflected light and transmitting the first excitation light and the second excitation light;
and the inclined surface of the second triangular prism is coated with a high-transmission film for transmitting the first excitation light and the second excitation light.
5. The system of claim 2, wherein the image capture device further comprises:
and the light guide lens group is positioned on the light path of the reflected light, the first excitation light and the second excitation light and is used for converging the reflected light, the first excitation light and the second excitation light into the light splitting prism.
6. The system of claim 2, wherein the image capture device further comprises:
and the filter is positioned on the light path of the reflected light, the first excitation light and the second excitation light and is used for filtering the reflected light, the first excitation light and the second excitation light.
7. The system of claim 1, wherein the light source device comprises:
the light source emitting module is used for emitting at least one of the visible light, the first near infrared light and the second near infrared light;
and the driving module is connected with the light source transmitting module and used for driving the light source transmitting module to continuously transmit the visible light to the object to be observed and alternately transmit the first near-infrared light and the second near-infrared light.
8. The system of claim 7,
the light source emission module includes:
a visible light emitting module for emitting the visible light;
the first near infrared light emitting module is used for emitting the first near infrared light;
the second near-infrared light emitting module is used for emitting the second near-infrared light;
the driving module includes:
the visible light driving module is connected with the visible light emitting module and used for providing a driving power supply for the visible light emitting module;
the first near-infrared light driving module is connected with the first near-infrared light emitting module and used for providing a driving power supply for the first near-infrared light emitting module;
the second near-infrared light driving module is connected with the second near-infrared light emitting module and used for providing a driving power supply for the second near-infrared light emitting module;
and the time sequence control module is respectively connected with the first near-infrared light driving module and the second near-infrared light driving module and is used for controlling the first near-infrared light driving module and the second near-infrared light driving module to alternately work according to a preset time sequence signal.
9. The system of claim 2,
the first image sensor group comprises two visible light image sensors which are arranged side by side in a coplanar manner;
the second image sensor group comprises two near infrared light image sensors which are arranged side by side in a coplanar manner.
10. An electronic endoscopic imaging method, comprising:
continuously emitting visible light and alternately emitting first near-infrared light and second near-infrared light to an object to be observed; wherein the wavelength of the first near-infrared light is less than the wavelength of the second near-infrared light; the visible light is reflected on the object to be observed to form reflected light, the first near-infrared light excites the object to be observed to generate first excitation light, and the second near-infrared light excites the object to be observed to generate second excitation light, wherein the object to be observed contains a developer corresponding to the first near-infrared light;
collecting the reflected light to form a visible light image, collecting the first exciting light to form a first exciting light image, and collecting the second exciting light to form a second exciting light image;
and fusing the visible light image and the first excitation light image formed at the same moment into a first fused image, fusing the visible light image and the second excitation light image formed at the same moment into a second fused image, and fusing the first fused image and the second fused image generated at adjacent moments into a target fused image.
CN202210549242.0A 2022-05-20 2022-05-20 Electronic endoscope imaging system and method Pending CN114869207A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210549242.0A CN114869207A (en) 2022-05-20 2022-05-20 Electronic endoscope imaging system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210549242.0A CN114869207A (en) 2022-05-20 2022-05-20 Electronic endoscope imaging system and method

Publications (1)

Publication Number Publication Date
CN114869207A true CN114869207A (en) 2022-08-09

Family

ID=82678610

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210549242.0A Pending CN114869207A (en) 2022-05-20 2022-05-20 Electronic endoscope imaging system and method

Country Status (1)

Country Link
CN (1) CN114869207A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115251809A (en) * 2022-09-28 2022-11-01 科弛医疗科技(北京)有限公司 Endoscope with a detachable handle

Similar Documents

Publication Publication Date Title
US8547425B2 (en) Fluorescence observation apparatus and fluorescence observation method
JP2021508542A (en) Hyperspectral imaging in a light-deficient environment
US10473911B2 (en) Simultaneous visible and fluorescence endoscopic imaging
US20080081990A1 (en) Apparatus, system, kit and method for heart mapping
CN104883946A (en) Image processing apparatus, electronic device, endoscope apparatus, program, and image processing method
JP2013099573A (en) System and method for generating fluorescent light image
JP2011200367A (en) Image pickup method and device
CN114007480A (en) Pulsed illumination in hyperspectral, fluorescence and laser mapping imaging systems
CN104010558A (en) Fluorescent light observation device, fluorescent light observation method and fluorescent light observation device function method
CN114245868A (en) Wide dynamic range using monochromatic image sensors for hyperspectral and fluorescence imaging and topographical laser mapping
CN114128243A (en) Hyperspectral and fluorescence imaging with topological laser scanning in low light environments
EP3641622B1 (en) System for endoscopic imaging and method for processing images
US20130217968A9 (en) Endoscope apparatus, method, and computer readable medium
CN114449940A (en) Laser scanning and tool tracking imaging in a starved environment
CN113208567A (en) Multispectral imaging system, imaging method and storage medium
CN114173633A (en) Fluorescence imaging in low light environments
CN114869207A (en) Electronic endoscope imaging system and method
US20190239749A1 (en) Imaging apparatus
CN113436129A (en) Image fusion system, method, device, equipment and storage medium
WO2021099127A1 (en) Device, apparatus and method for imaging an object
JP2018128759A (en) Shininess removing device
WO2021044590A1 (en) Endoscope system, treatment system, endoscope system operation method and image processing program
Obukhova et al. Learning from multiple modalities of imaging data for cancer detection/diagnosis
US11689689B2 (en) Infrared imaging system having structural data enhancement
CN115177199A (en) Image processing system, surgical system, and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination