CN211213049U - Electronic endoscope - Google Patents


Info

Publication number: CN211213049U
Application number: CN201922004888.7U
Authority: CN (China)
Prior art keywords: light, filter, green, image signal, image
Legal status: Active (listed by Google Patents as an assumption, not a legal conclusion)
Original language: Chinese (zh)
Inventors: 王森豪 (Wang Senhao), 邱建军 (Qiu Jianjun), 汪洋 (Wang Yang)
Assignee (original and current): Sonoscape Medical Corp
Application filed by Sonoscape Medical Corp; priority to CN201922004888.7U; granted and published as CN211213049U


Abstract

An electronic endoscope includes a light source device for emitting illumination light and excitation light to a subject, wherein the illumination light includes green narrow-band light and the excitation light excites the subject to generate green fluorescence; an image pickup device including a first filter, a second filter, and an image sensor; and a processor for generating a fluorescence image based on the fluorescence image signal, and/or a green narrow-band light image based on the green narrow-band light image signal, and/or a fused image based on the fluorescence image signal and the green narrow-band light image signal. The electronic endoscope can separate the green fluorescence signal from the green illumination light signal, realizing simultaneous imaging of a fluorescence image and a green light image.

Description

Electronic endoscope
Technical Field
The application relates to the field of medical instruments, in particular to an electronic endoscope.
Background
An electronic endoscope images the object in the cavity under observation onto an imaging element via a small objective optical system, transmits the resulting image signal to an image processing system via an image cable or optical cable, and outputs the processed image on a monitor for observation.
Currently, the imaging modes of the electronic endoscope mainly include a white light imaging mode, a narrow band imaging mode, a fluorescence imaging mode, a monochromatic light imaging mode, and the like. Each of these imaging modes has advantages and disadvantages, and therefore, in order to be able to distinguish normal tissue from diseased tissue more accurately and quickly, simultaneous imaging or fusion imaging may be performed using two or more of these modes.
However, in realizing simultaneous or fusion imaging of multiple imaging modes, the inventors found the following: because the wavelength band of the fluorescence signal collected in the fluorescence imaging mode usually overlaps the green band of the illumination used for narrow-band or white-light imaging, the green fluorescence signal and the green illumination light signal cannot be effectively separated. If time-division illumination is used to separate the signals, motion artifacts appear between adjacent frames of the tissue image, the frame rate of the video output by the processor drops, and the video stutters, which degrades imaging quality.
Therefore, how to separate the green fluorescence signal and the green illumination light signal in the same frame to achieve simultaneous imaging of the green fluorescence image and the green illumination light image is a technical problem that needs to be solved at present.
SUMMARY OF THE UTILITY MODEL
The purpose of the present application is to provide an electronic endoscope capable of separating a green fluorescence signal and a green illumination light signal in the same frame, and realizing simultaneous imaging of a green fluorescence image and a green light image.
In order to solve the above technical problem, the present application provides an electronic endoscope, including:
a light source device for emitting illumination light and excitation light to a subject; wherein the illumination light includes green narrow-band light, and the excitation light is light for exciting the subject to generate green fluorescence;
an image pickup device including a first filter, a second filter, and an image sensor; wherein the first filter transmits the green fluorescence generated by the subject and cuts off the illumination light and the excitation light; the second filter cuts off the excitation light; and the image sensor receives the green fluorescence passing through the first filter to form a fluorescence image signal of the subject, and also receives the green fluorescence and the green narrow-band light passing through the second filter to form a green narrow-band light image signal of the subject;
a processor for generating a fluorescence image based on the fluorescence image signal; and/or generating a green narrowband light image based on the green narrowband light image signal; and/or generating a fused image based on the fluorescence image signal and the green narrowband light image signal.
Optionally, the first filter is a filter having a transmission characteristic corresponding to a spectral range of the green fluorescence and a notch characteristic corresponding to a spectral range of the green narrow-band light.
Optionally, a center wavelength at the notch of the first filter coincides with a center wavelength of the green narrowband light, and a half-peak width at the notch is greater than a half-peak width of the green narrowband light.
Optionally, the half-peak width of the green narrow-band light is less than 20 nm.
Optionally, the central wavelength of the green narrow-band light is between 520 nm and 580 nm.
Optionally, the central wavelength of the excitation light is between 380 nm and 480 nm.
Optionally, the first filter and the second filter are arranged to form a first filter array, and the image sensor is disposed behind an optical path of the first filter array;
then, the processor is specifically configured to:
performing interpolation processing on the fluorescence image signal and the green narrow-band light image signal, respectively;
generating a fluorescence image of the subject based on the interpolated fluorescence image signal; and/or
generating a green narrow-band light image of the subject based on the interpolated green narrow-band light image signal; and/or
generating a fused image of the subject based on the interpolated fluorescence image signal and green narrow-band light image signal.
Optionally, the excitation light is blue-violet narrow-band light, and the illumination light further includes blue broadband light and red broadband light; the second filter is also used for cutting off the blue broadband light and the red broadband light;
the first filter array further comprises: a third filter, a fourth filter and a fifth filter; wherein the third filter is configured to transmit only blue-violet narrowband light reflected by the subject, the fourth filter is configured to transmit only blue broadband light reflected by the subject, and the fifth filter is configured to transmit only red broadband light reflected by the subject;
In that case,
the image sensor is further configured to: receive the blue-violet narrow-band light passing through the third filter to form a blue-violet narrow-band light image signal of the subject, receive the blue broadband light passing through the fourth filter to form a blue broadband light image signal of the subject, and receive the red broadband light passing through the fifth filter to form a red broadband light image signal of the subject;
the processor is further configured to:
generating a white light image of the subject based on the red broadband light image signal, the green narrow-band light image signal, and the blue broadband light image signal; and/or
generating a narrow-band image of the subject based on the blue-violet narrow-band light image signal and the green narrow-band light image signal; and/or
generating a blue-violet narrow-band light image of the subject based on the blue-violet narrow-band light image signal; and/or
generating a blue broadband light image of the subject based on the blue broadband light image signal; and/or
generating a red broadband light image of the subject based on the red broadband light image signal.
Optionally, the image sensor includes a first image sensor and a second image sensor, the first filter and the second filter are independently disposed, wherein,
the first image sensor is arranged behind the optical path of the first filter and used for receiving the green fluorescence passing through the first filter and forming a fluorescence image signal of the object;
the second image sensor is arranged behind the optical path of the second filter and used for receiving the green fluorescence and the green narrow-band light which pass through the second filter and forming a green narrow-band light image signal of the shot object;
Then, the processor generates a fused image based on the fluorescence image signal and the green narrow-band light image signal by:
performing alignment correction on the pixel positions of the fluorescence image signal and the green narrow-band light image signal; and
generating the fused image based on the alignment-corrected fluorescence image signal and green narrow-band light image signal.
Optionally, the excitation light is blue-violet narrow-band light, and the illumination light further includes blue broadband light and red broadband light; the second filter is also used for cutting off the blue broadband light and the red broadband light; the image pickup apparatus further includes: a third filter, a fourth filter, and a fifth filter;
wherein the third filter is configured to transmit only blue-violet narrowband light reflected by the subject, the fourth filter is configured to transmit only blue broadband light reflected by the subject, and the fifth filter is configured to transmit only red broadband light reflected by the subject;
the second filter, the third filter, the fourth filter, and the fifth filter are arranged to form a second filter array, the second image sensor is located behind an optical path of the second filter array, and the second image sensor is further configured to: receiving the blue-violet narrowband light that passes through the third filter and forms a blue-violet narrowband light image signal of the subject, receiving the blue broadband light that passes through the fourth filter and forms a blue broadband light image signal of the subject, and receiving the red broadband light that passes through the fifth filter and forms a red broadband light image signal of the subject;
then, the processor is further configured to:
generating a white light image of the subject based on the red broadband light image signal, the green narrow-band light image signal, and the blue broadband light image signal; and/or
generating a narrow-band image of the subject based on the blue-violet narrow-band light image signal and the green narrow-band light image signal; and/or
generating a blue-violet narrow-band light image of the subject based on the blue-violet narrow-band light image signal; and/or
generating a blue broadband light image of the subject based on the blue broadband light image signal; and/or
generating a red broadband light image of the subject based on the red broadband light image signal.
The electronic endoscope provided by the present application has the following beneficial effects. By using green narrow-band light to provide the green illumination, extracting the green fluorescence signal with the first filter, and extracting the green illumination light signal with the second filter, the green fluorescence signal and the green illumination light signal can be separated within the same frame, realizing simultaneous imaging of a fluorescence image and a green light image. Although the green fluorescence and the green narrow-band light overlap in one wavelength band, so that the first filter cuts off the green fluorescence in that band along with the green narrow-band light, the band occupied by the green narrow-band light is narrow; even with this overlapping portion of the green fluorescence removed, the final fluorescence imaging is not greatly affected. Likewise, although the second filter cuts off only the excitation light and therefore passes a mixture of the green narrow-band light and the green fluorescence, the green fluorescence signal is weak and accounts for only a small proportion of the mixture compared with the green narrow-band light, so it can be considered not to affect the imaging effect of the green narrow-band light. Consequently, in realizing simultaneous imaging of the fluorescence image and the green light image, the electronic endoscope requires no motion-artifact correction between adjacent frames, loses no imaging time resolution, and suffers essentially no loss of imaging quality.
Drawings
To explain the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an electronic endoscope provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a light source device according to an embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of an electronic endoscope employing a single image sensor structure according to an embodiment of the present application;
FIG. 4 is a schematic view of an illumination spectrum provided by an embodiment of the present application;
fig. 5 is a schematic diagram of a coating of the first filter array 8a according to an embodiment of the present disclosure;
FIG. 6 is a graph illustrating a spectral transmittance curve of a first filter according to an embodiment of the present disclosure;
FIG. 7 is a graph illustrating a spectral transmittance curve of a second filter according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic endoscope adopting a dual image sensor structure according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a minimum Bayer filter unit according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described completely below with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments that a person skilled in the art can derive from them without creative effort fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic endoscope a1 according to an embodiment of the present disclosure, which may include the following devices: a light source device 100, an image pickup device 200, and a processor 300.
Among them, the light source device 100 is a device for emitting illumination light and excitation light to a subject.
The light beam emitted from the light source device 100 can be irradiated on the subject via the light guide device and the illumination lens. The subject may be any living tissue that can generate green fluorescence when excited by excitation light. Specifically, the green fluorescence may be autofluorescence (i.e., fluorescence generated by excitation of an endogenous fluorescent substance in the subject, such as NADH, collagen, riboflavin, etc., by excitation light) or drug fluorescence (i.e., green fluorescence generated by excitation of a fluorescent drug accumulated on the surface of the subject by excitation light), which is not particularly limited in the embodiments of the present application.
In the present embodiment, the illumination light emitted from the light source device 100 includes green narrowband light for providing green illumination light for imaging. In some embodiments, the central wavelength (or peak wavelength) of the green narrow-band light may be located near the absorption peak of hemoglobin, for example, between 520nm and 580nm, in order to better highlight the blood vessels in the middle and deep layers of the tissue mucosa.
The excitation light emitted from the light source device 100 in this embodiment may be determined according to the fluorescent substance to be excited, and for example, if the fluorescent substance to be excited is an endogenous fluorescent substance in the subject, the excitation light having a center wavelength between 380nm and 480nm may be selected.
Specifically, a light source that emits green narrowband light and a light source that emits excitation light may be included in the light source device 100.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a light source apparatus according to an embodiment of the present disclosure. In the light source apparatus shown in fig. 2, the beams of a plurality of light sources are combined by a dichroic-mirror beam-combining optical path, and the combined mixed light is irradiated onto the subject. Specifically, if the light source device of fig. 2 is applied to the electronic endoscope in fig. 1, light emitting element 1 in fig. 2 may emit the green narrow-band light, light emitting element 2 may emit excitation light capable of exciting the subject to generate green fluorescence, and light emitting elements 3, 4, … N may be omitted or used to emit other illumination light. Further, a light source controller may be disposed in the light source device shown in fig. 2 to activate any one or several light emitting elements and thereby obtain the corresponding image signals.
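The selective activation of light emitting elements by such a controller can be sketched as follows. This is an illustrative sketch only: the `LightSourceController` class, the element names, and the wavelength values are assumptions for demonstration, not part of the disclosure.

```python
# Illustrative sketch (not from the patent): a controller that activates
# any subset of light emitting elements so the dichroic-mirror-combined
# beam contains only the desired illumination/excitation bands.

class LightSourceController:
    def __init__(self, elements):
        # elements: mapping from element name to its center wavelength (nm)
        self.elements = elements
        self.active = set()

    def activate(self, *names):
        for name in names:
            if name not in self.elements:
                raise KeyError(f"unknown element: {name}")
            self.active.add(name)

    def deactivate_all(self):
        self.active.clear()

    def emitted_wavelengths(self):
        # wavelengths present in the combined beam
        return sorted(self.elements[n] for n in self.active)

# Hypothetical element set for the fluorescence + green narrow-band mode:
ctrl = LightSourceController({
    "green_narrowband": 540,  # within the 520-580 nm range in the description
    "excitation": 405,        # within the 380-480 nm range in the description
})
ctrl.activate("green_narrowband", "excitation")
print(ctrl.emitted_wavelengths())  # -> [405, 540]
```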
Of course, in addition to the light source device 100 emitting the illumination light and the excitation light by combining the light paths, in some embodiments, two light sources may be provided to emit the green narrowband light and the excitation light simultaneously and independently of each other.
The image pickup apparatus 200 is an apparatus for acquiring an image signal of a subject, and the image pickup apparatus 200 can generate an image signal of the subject from illumination light reflected by the subject and green fluorescence generated by the subject, on the basis of irradiation of the subject by the light source apparatus 100. Specifically, the image pickup apparatus 200 includes a first filter 201, a second filter 202, and an image sensor 203.
The first filter 201 in the present embodiment is configured to transmit green fluorescence generated by the subject and cut off the illumination light and the excitation light, that is, the first filter 201 allows only green fluorescence generated after the subject is excited by the excitation light to pass through.
Specifically, the first filter 201 may be a filter having a transmission characteristic corresponding to the spectral range of the green fluorescence and a notch characteristic corresponding to the spectral range of the green narrow-band light. That is, the transmission characteristic of the first filter 201 means that light within the spectral range of the fluorescence is allowed to pass, while the notch characteristic means that light in the wavelength band occupied by the green narrow-band light is not allowed to pass.
Although the green fluorescence and the green narrow-band light overlap in one wavelength band, so that the first filter 201 cuts off the green fluorescence in that band along with the green narrow-band light, the band occupied by the green narrow-band light is narrow; even with this overlapping portion of the green fluorescence removed, the final fluorescence imaging is not greatly affected.
Moreover, because the emission intensity of the green narrow-band light is much greater than that of the green fluorescence, the green narrow-band light must be cut off completely to prevent its leakage from affecting fluorescence imaging. Therefore, in some embodiments, the center wavelength of the notch of the first filter 201 coincides with the center wavelength of the green narrow-band light, and the half-peak width of the notch is greater than the half-peak width of the green narrow-band light. Further, to reduce the loss of the green fluorescence signal passing through the first filter 201, green narrow-band light with a half-peak width of less than 20 nm may be selected.
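The notch relationship described above (notch centered on the green narrow-band light, with a half-peak width greater than that of the light) can be illustrated numerically. The Gaussian spectral shapes and the specific widths (15 nm light, 30 nm notch) below are assumptions for demonstration only, not values from the disclosure:

```python
import numpy as np

# Illustrative model (assumed parameters, not from the patent): the first
# filter as a green-fluorescence passband with a Gaussian notch whose center
# matches the green narrow-band light and whose FWHM exceeds the light's.

wavelengths = np.arange(480, 620, 1.0)  # nm

def gaussian(wl, center, fwhm):
    sigma = fwhm / 2.3548  # convert FWHM to standard deviation
    return np.exp(-0.5 * ((wl - center) / sigma) ** 2)

green_nb_center, green_nb_fwhm = 540.0, 15.0  # narrow-band light, <20 nm
notch_fwhm = 30.0                             # notch wider than the light

# Passband over an assumed green-fluorescence range, times the notch:
fluor_passband = ((wavelengths >= 500) & (wavelengths <= 600)).astype(float)
first_filter = fluor_passband * (1.0 - gaussian(wavelengths, green_nb_center,
                                                notch_fwhm))

# Fraction of the green narrow-band light leaking through the first filter:
nb_spectrum = gaussian(wavelengths, green_nb_center, green_nb_fwhm)
leak = np.sum(nb_spectrum * first_filter) / np.sum(nb_spectrum)
print(f"narrow-band leakage fraction: {leak:.3f}")
```

The sketch shows the trade-off in the text: the notch fully blocks the narrow-band center while transmitting most fluorescence only a few tens of nanometers away.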
The second filter 202 in this embodiment cuts off the excitation light; that is, it filters out the excitation light to pass a mixture of the green narrow-band light and the green fluorescence. However, because the green fluorescence (particularly autofluorescence) generated by the subject is weak compared with the green narrow-band light reflected by the subject and accounts for only a small proportion of the mixed light transmitted through the second filter 202, the green fluorescence passing through the second filter 202 does not affect the imaging effect of the green narrow-band light.
The image sensor 203 converts an optical signal into an electrical signal based on the photoelectric conversion function of an optoelectronic device; it may specifically be a CCD sensor or a CMOS sensor. The image sensor 203 in this embodiment can separately receive the green fluorescence passing through the first filter 201 and the mixture of green narrow-band light and green fluorescence passing through the second filter 202, generating a fluorescence image signal (electrical signal) from the received green fluorescence (optical signal) and a green narrow-band light image signal (electrical signal) from the received mixed light (optical signal). The fluorescence image signal and the green narrow-band light image signal of the subject can therefore be acquired separately within the same frame time.
In some embodiments, such as the electronic endoscope a2 shown in fig. 3, a single-image-sensor configuration may be employed to acquire the fluorescence image signal and the green narrow-band light image signal, which reduces the volume of the scope insertion portion. Specifically, the first filter 201 and the second filter 202 may be arranged to form a first filter array, with the image sensor 203 disposed behind the optical path of the first filter array; on the imaging plane of the image sensor 203, the fluorescence image signal and the green narrow-band light image signal are then formed at positions corresponding to the arrangement of the first filter 201 and the second filter 202, respectively.
Alternatively, in other embodiments, such as the electronic endoscope a3 shown in fig. 8, in order to improve the image sharpness of the image, a dual image sensor structure may be used to collect the fluorescence image signal and the green narrowband light image signal. Specifically, in this embodiment, the image sensor may include a first image sensor and a second image sensor, the first filter 201 and the second filter 202 are provided independently of each other, and the first image sensor is provided behind the optical path of the first filter 201 to receive green fluorescence passing through the first filter 201 and form a fluorescence image signal of the subject; the second image sensor is disposed behind the optical path of the second filter 202 to receive the green fluorescence and the green narrowband light passing through the second filter 202 and form a green narrowband light image signal of the subject.
The processor 300 may be any device with logic-operation and image-processing capability, for example one or more microcontroller units (MCUs) or programmable logic circuits. It synchronously receives the fluorescence image signal and the green narrow-band light image signal generated by the image sensor 203 and generates any one or more of a fluorescence image, a green narrow-band light image, and a fused image: a fluorescence image based on the fluorescence image signal; and/or a green narrow-band light image based on the green narrow-band light image signal; and/or a fused image based on the fluorescence image signal and the green narrow-band light image signal.
After the green narrow-band light irradiates the subject, the hemoglobin in the subject absorbs illumination light near the band of the green narrow-band light. The resulting green narrow-band light image therefore helps identify and highlight the morphology of middle- and deep-layer mucosal blood vessels, improves the contrast between a lesion and the surrounding normal tissue, and facilitates observing the vascular distribution of the lesion area and making pathological judgments based on surface structure morphology.
Because lesion tissue and normal tissue differ in their content of fluorescent substances and in mucosal thickness, the green fluorescence they generate differs in intensity. A fluorescence image therefore allows the region of lesion tissue and the region of normal tissue to be rapidly distinguished, giving high lesion-detection sensitivity.
However, in autofluorescence imaging the attenuation of autofluorescence does not directly reflect a tumor lesion; it is expressed only indirectly through thickening of the mucosal epithelium and blood aggregation. Autofluorescence is therefore also attenuated in benign lesions in which blood accumulates, such as inflammatory lesions, so autofluorescence imaging suffers from false positives (i.e., inflammatory lesions are misjudged as tumor lesions).
For this reason, in the embodiment of the present application, when the green fluorescence signal is an autofluorescence signal, a fused image may be generated based on the acquired fluorescence image signal and green narrow-band light image signal. Digestive tract tumor lesions are generally accompanied by thickening of the mucosal epithelium and thus reflect green narrow-band light strongly, whereas inflammatory lesions are generally accompanied by surface blood aggregation; blood has a spectral absorption peak near 540 nm and absorbs green narrow-band light strongly. The green narrow-band reflected light of tumor tissue is therefore stronger than that of inflammatory tissue, giving green narrow-band light high specificity between tumor and inflammatory tissue. Thus, the location of lesion tissue can first be identified quickly by autofluorescence imaging, and tumor tissue can then be identified within the lesion by green narrow-band light imaging. Forming a fused image from the fluorescence image signal and the green narrow-band light image signal therefore improves both the speed and the reliability of tumor identification.
To generate the fused image, this embodiment may first generate the fluorescence image and the green narrow-band light image, and then perform an image fusion operation on the two same-frame images. The fusion operation may specifically be: superimposing the fluorescence image on the green narrow-band light image with a certain proportion and transparency; or assigning different colors to the two images by color mapping and mapping them into RGB space to form a true-color image. Because the fused image obtained in this way is fused from a fluorescence image and a green narrow-band light image of the same frame, it contains no motion artifacts and needs no motion-artifact correction, and the imaging frame rate of the output video can be increased when the fused image is output to a display device.
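The two fusion operations described above can be sketched as follows. The blending weight, the color assignments, and the test images are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Illustrative sketch (assumed weights/colors, not from the patent) of the
# two fusion operations, applied to same-frame grayscale images in [0, 1].

def overlay_fusion(fluor, green_nb, alpha=0.4):
    # Superimpose the fluorescence image on the green narrow-band image
    # with a fixed proportion/transparency alpha.
    return np.clip((1 - alpha) * green_nb + alpha * fluor, 0.0, 1.0)

def color_map_fusion(fluor, green_nb):
    # Assign each image a different color and combine in RGB space: here
    # (an assumption) fluorescence -> green channel, narrow-band -> magenta.
    rgb = np.zeros(fluor.shape + (3,))
    rgb[..., 0] = green_nb  # R
    rgb[..., 1] = fluor     # G
    rgb[..., 2] = green_nb  # B
    return rgb

# Hypothetical frames: a bright "lesion" region in the fluorescence image.
fluor = np.full((4, 4), 0.2)
fluor[1:3, 1:3] = 0.9
green_nb = np.full((4, 4), 0.6)

fused = overlay_fusion(fluor, green_nb)
rgb = color_map_fusion(fluor, green_nb)
```

Because both inputs come from the same frame, no motion compensation step appears anywhere in either function, matching the text above.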
It should be noted that, in the present embodiment, the fluorescence image signal and the green narrowband light image signal are merged as an example, but the present invention is not limited thereto. In other embodiments, for example, when the illumination light emitted by the light source device 100 also includes light in other wavelength bands, image signals corresponding to light in other wavelength bands may be further combined with the fluorescence image signal and the green narrowband light image signal to perform fusion, so as to obtain a fused image more suitable for practical application requirements.
In practical application, any one or more of the fluorescence image, the green narrow-band light image and the fused image can be generated according to the requirements of the practical application or observation mode, and output to the monitor for the doctor's reference.
In a specific implementation, the processor 300 may perform different image processing on the acquired fluorescence image signal and the green narrowband light image signal based on the specific configuration of the employed image pickup apparatus 200.
For example, in an embodiment that uses a single-image-sensor structure, optical signal separation is performed by the first filter array, so the fluorescence image signal and the green narrow-band light image signal acquired by the image sensor suffer a certain loss of spatial resolution. In this embodiment, the processor 300 therefore needs to separate and extract the pixels corresponding to green fluorescence and the pixels corresponding to green narrow-band light according to the position of each filter in the first filter array, and perform interpolation processing on the fluorescence image signal and the green narrow-band light image signal so that every pixel of the resulting image has a corresponding gray value, thereby obtaining a fluorescence image signal and a green narrow-band light image signal at the resolution of the image sensor. Further, a fluorescence image of the subject is generated based on the interpolated fluorescence image signal; and/or a green narrow-band light image of the subject is generated based on the interpolated green narrow-band light image signal; and/or a fused image of the subject is generated based on the interpolated fluorescence image signal and green narrow-band light image signal.
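The separation-and-interpolation step might look like the following sketch, which stands in for whatever demosaicing the processor actually performs. Everything here is an assumption for illustration: the boolean `mask` marking which sensor positions sit behind a given filter type, and the use of simple nearest-neighbour filling as the interpolation:

```python
import numpy as np

def extract_and_interpolate(raw, mask):
    """Separate the pixels behind one filter type (True entries of the
    assumed boolean `mask`, which repeats with the filter-array mosaic)
    and fill every other position from the nearest sampled pixel, so
    each pixel of the output image has a gray value."""
    h, w = raw.shape
    ys, xs = np.nonzero(mask)          # sampled positions for this filter
    vals = raw[ys, xs]
    yy, xx = np.mgrid[0:h, 0:w]
    # brute-force nearest-neighbour fill (fine for a sketch; a real
    # implementation would use a proper demosaicing interpolation)
    d2 = (yy.ravel()[:, None] - ys[None, :]) ** 2 \
       + (xx.ravel()[:, None] - xs[None, :]) ** 2
    nearest = np.argmin(d2, axis=1)
    return vals[nearest].reshape(h, w)
```

Running this once with the first-filter mask and once with the second-filter mask would yield the full-resolution fluorescence and green narrow-band signals that the subsequent image generation steps consume.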
For another example, in an embodiment adopting a dual-image-sensor structure, the first image sensor and the second image sensor are offset from each other in spatial position, so there is a corresponding pixel offset between the fluorescence image signal formed on the first image sensor and the green narrow-band light image signal formed on the second image sensor. Therefore, in this embodiment, when the processor 300 generates the fused image based on the fluorescence image signal and the green narrow-band light image signal, it first needs to perform alignment correction on the positions of the pixel points in the two signals, and then generate the fused image based on the alignment-corrected fluorescence image signal and green narrow-band light image signal.
Specifically, the process by which the processor 300 aligns and corrects the positions of the pixel points in the fluorescence image signal and the green narrow-band light image signal may include: determining the pixel offset between the fluorescence image and the green narrow-band light image according to the spatial position information of the first image sensor and the second image sensor, and then correcting the positions of the pixel points in the two signals according to that pixel offset. As another possible implementation, in some embodiments spatial position matching may instead be performed according to characteristic image structures in the fluorescence image signal and the green narrow-band light image signal, so as to obtain the pixel offset between the fluorescence image and the green narrow-band light image.
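Both alignment routes might be sketched as follows. This is an illustrative sketch, not the patent's method: the known-offset correction is modelled as a simple integer circular shift, and the "characteristic image structure" matching is stood in for by phase correlation, one common way to estimate a translation between two images:

```python
import numpy as np

def align_by_offset(img, dy, dx):
    """Shift `img` by a known integer pixel offset (dy, dx) -- e.g. one
    derived from the sensors' spatial position information -- so that
    its pixels line up with the reference image (circular shift used
    for simplicity)."""
    return np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)

def estimate_offset(ref, img):
    """Estimate the integer pixel offset of `img` relative to `ref`
    from the image content itself, using phase correlation (one
    possible realisation of structure-based spatial matching)."""
    R = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    corr = np.fft.ifft2(R / (np.abs(R) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:  # unwrap shifts past the half-period
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

In practice the offset estimated once (from calibration or from image content) would be reused for every frame, since the relative sensor positions are fixed.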
As can be seen from the above embodiments, the electronic endoscope provided by the present application has the following beneficial effects: by using green narrow-band light as the green illumination light, extracting the fluorescence signal with the first filter and extracting the green illumination light signal with the second filter, the fluorescence signal and the green illumination light signal can be separated within the same frame, and the fluorescence image and the green light image can be formed simultaneously. Although the green fluorescence and the green narrow-band light overlap in one wavelength band, so that the first filter cuts off the green fluorescence in that band together with the green narrow-band light, the band occupied by the green narrow-band light is narrow; even though the first filter also cuts off the portion of the green fluorescence overlapping it, the final fluorescence imaging is not greatly affected. Likewise, although the second filter cuts off only the fluorescence excitation light and therefore passes a mixture of green narrow-band light and green fluorescence, the green fluorescence signal is weak and accounts for only a small proportion compared with the green narrow-band light, so it can be considered not to affect the imaging effect of the green narrow-band light. Therefore, in achieving simultaneous imaging of the fluorescence image and the green light image, the electronic endoscope requires no motion artifact correction between adjacent frames, loses no imaging time resolution, and suffers essentially no loss of imaging quality.
Further, in order to synchronously realize multi-mode imaging, namely, simultaneously perform one or more of fluorescence imaging, white light imaging, narrow-band imaging, fusion imaging, monochrome imaging and the like, the electronic endoscope provided by the above embodiment can be further adapted and expanded. Hereinafter, the electronic endoscope using the single image sensor structure and the electronic endoscope using the dual image sensor structure will be described in detail as examples, respectively.
Fig. 3 is a schematic structural diagram of an electronic endoscope adopting a single image sensor structure according to an embodiment of the present application.
Referring to fig. 3, the electronic endoscope a2 may include the following devices: a green narrow-band light source 1a, a fluorescence excitation light source 2a, other illumination light sources 3a, a light source controller 4a, a light guide device 5a, an illumination lens 6a, an objective lens 7a, a first filter array 8a, an image sensor 203, a signal cable 10a, a processor 300 and the like.
In this embodiment, the light source device of the electronic endoscope a2 may be configured for combined multi-path illumination: a plurality of light-emitting components, such as the green narrow-band light source 1a, the fluorescence excitation light source 2a and the other illumination light sources 3a, are arranged on the light path; their light output is controlled by the light source controller 4a, and the beams they emit are combined by dichroic mirrors and guided into the light guide device 5 a. The illumination light and excitation light emitted from the light source device are then guided by the light guide device 5a to the illumination lens 6a, through which they are uniformly irradiated onto the subject. Green fluorescence generated by the subject, together with illumination light and excitation light reflected by the subject, passes through the objective lens 7a and the first filter array 8a and is received by the image sensor 203 to generate corresponding image signals; the image signals are transmitted to the processor 300 via the signal cable 10a for image processing.
As a possible implementation manner, the green narrowband light source 1a in this embodiment may emit green narrowband light with a central wavelength near 540 nm. Preferably, the half-peak width of the green narrow-band light is less than 20 nm. In this embodiment, the fluorescence excitation light source 2a can emit a blue-violet narrowband light having a central wavelength near 415nm, and can excite a subject to emit green autofluorescence. As a specific embodiment, the illumination spectra emitted by the green narrow-band light source 1a and the fluorescence excitation light source 2a can be as shown in fig. 4. In addition, the other illumination light sources 3a in this embodiment may be light sources emitting light in other color bands, for example, in order to realize white light imaging and improve color rendering property in white light imaging, it may emit blue broadband light with a wavelength range of 430nm to 480nm and red broadband light with a wavelength range of more than 600 nm.
The first filter array 8a in this embodiment may include a first filter, a second filter, a third filter, a fourth filter, and a fifth filter arranged according to a preset rule. The first filter is used for transmitting the green fluorescence generated by the object to be shot and cutting off the illumination light and the excitation light (namely, used for collecting a green fluorescence signal generated by the object to be shot); a second filter for cutting off the excitation light, the blue broadband light and the red broadband light (i.e., for collecting a green narrowband light signal reflected by the subject and a green fluorescence signal generated by the subject); the third filter is used for transmitting only the excitation light reflected by the object (i.e. for collecting the blue-violet narrowband optical signal reflected by the object), the fourth filter is used for transmitting only the blue broadband light reflected by the object (i.e. for collecting the blue broadband optical signal reflected by the object), and the fifth filter is used for transmitting only the red broadband light reflected by the object (i.e. for collecting the red broadband optical signal reflected by the object).
Specifically, the first filter array 8a may be implemented by arranging micromirror arrays with different spectral passbands, for example in a mosaic-like arrangement. Referring to fig. 5, fig. 5 is a schematic diagram of the coating of the first filter array 8a according to an embodiment of the present disclosure, where F1 denotes the first filter, F2 denotes the second filter, and FO denotes the filters corresponding to the other illumination lights (i.e., the third filter, the fourth filter and the fifth filter). Specifically, the spectral transmittance curve of the first filter in this embodiment may be as shown in fig. 6: it has a transmission characteristic in the wavelength band between 460nm and 610nm (the band corresponding to green fluorescence) and a notch characteristic with a center wavelength between 520nm and 560nm and a half-peak width greater than that of the green narrow-band light, so that most of the green fluorescence signal is retained while the green narrow-band light signal is filtered out, achieving separation of the fluorescence signal from the green illumination light signal. The spectral transmittance curve of the second filter in this embodiment may be as shown in fig. 7: it has a transmission characteristic in the wavelength band from 480nm to 600nm, retaining the green fluorescence signal and the green narrow-band light signal within this band while cutting off the blue-violet narrow-band light signal, the blue broadband light signal and the red broadband light signal. Alternatively, in other embodiments, the second filter may have a transmission characteristic only in the wavelength band of the green narrow-band light, so as to reduce the green fluorescence component in the mixed light it passes. By analogy, the spectral transmittances of the third filter, the fourth filter and the fifth filter can be determined, and the coated first filter array 8a is finally formed.
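The band-pass-with-notch behaviour of the first filter can be modelled as an idealised (binary) transmittance function. This sketch is not the actual coated filter response (a real curve is smooth, as in fig. 6); the specific notch center and half-width values are illustrative picks from the ranges stated in the text:

```python
def first_filter_transmittance(wavelength_nm,
                               pass_lo=460, pass_hi=610,
                               notch_center=540, notch_half_width=15):
    """Idealised spectral transmittance of the first filter:
    transmissive over the green-fluorescence band (460-610 nm) and
    notched around the green narrow-band illumination, so that the
    narrow-band reflection is rejected while most fluorescence passes.
    Numeric defaults are illustrative assumptions."""
    in_passband = pass_lo <= wavelength_nm <= pass_hi
    in_notch = abs(wavelength_nm - notch_center) <= notch_half_width
    return 1.0 if (in_passband and not in_notch) else 0.0
```

With these defaults, 540 nm narrow-band light is blocked while fluorescence at, say, 490 nm or 580 nm is passed, which is exactly the separation behaviour the paragraph describes.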
The image sensor 203 in this embodiment is disposed behind the optical path of the first filter array 8a, so that the green fluorescent light signal, the mixed light signal of green fluorescent light + green narrowband light, the blue-violet narrowband light signal, the blue broadband light signal, and the red broadband light signal respectively transmitted through the first filter, the second filter, the third filter, the fourth filter, and the fifth filter in the first filter array 8a can be imaged at corresponding positions of the image sensor 203, and a fluorescent image signal, a green narrowband light image signal, a blue-violet narrowband light image signal, a blue broadband light image signal, and a red broadband light image signal are correspondingly formed. These image signals are transmitted to the processor 300 through the signal cable 10a, and image processing is performed.
After receiving the image signals, the processor 300 may first perform interpolation processing on the image signals respectively; then, based on the image signal after the interpolation processing, any one or more of a fluorescence image, a green narrow-band light image, a fusion image, a white light image, a narrow-band image, a blue-violet narrow-band light image, a blue wide-band light image, and a red wide-band light image of the subject are generated. Specifically, a fluorescence image of the subject is generated based on the fluorescence image signal subjected to the interpolation processing; and/or generating a green narrowband light image of the object based on the green narrowband light image signal subjected to the interpolation processing; and/or generating a fused image of the shot object based on the fluorescence image signal and the green narrow-band light image signal after interpolation processing; and/or generating a white light image of the shot object based on the red broadband light image signal, the green broadband light image signal and the blue broadband light image signal which are subjected to interpolation processing; and/or generating a narrow-band image of the object based on the blue-violet narrow-band optical image signal and the green narrow-band optical image signal after interpolation processing; and/or generating a blue-violet narrowband optical image of the object based on the interpolated blue-violet narrowband optical image signal; and/or generating a blue broadband light image of the object based on the blue broadband light image signal subjected to the interpolation processing; and/or generating a red broadband light image of the object based on the red broadband light image signal subjected to the interpolation processing.
In this embodiment, the electronic endoscope can emit excitation light in a blue-violet narrowband light band, green narrowband light, blue broadband light, and red broadband light to a subject, and further can synthesize optical signals filtered by a plurality of filters according to an image type to be obtained to obtain a corresponding image. Therefore, the electronic endoscope provided by the embodiment can simultaneously acquire images of a plurality of imaging modes of the subject according to actual requirements.
Further, since the green fluorescence signal is usually weak, the present application further provides an electronic endoscope using a dual image sensor structure, wherein one image sensor is used for collecting and forming a fluorescence image signal, and the other image sensor is used for collecting and respectively forming image signals corresponding to each illumination light and/or excitation light.
Specifically, please refer to fig. 8, which is a schematic structural diagram of an electronic endoscope employing a dual image sensor structure according to an embodiment of the present application. The electronic endoscope a3 may include the following: a green narrow-band light source 1b, a fluorescence excitation light source 2b, another illumination light source 3b, a light source controller 4b, a light guide device 5b, an illumination lens 6b, a first objective lens 7b, a second objective lens 7c, a first filter 8b, a second filter array 8c, a first image sensor 203a, a second image sensor 203b, a signal cable 10b, a processor 300 and the like.
As can be seen from a comparison between fig. 8 and 3, the electronic endoscope A3 provided in the present embodiment has substantially the same structure as the electronic endoscope a2 shown in fig. 3, except that: the electronic endoscope a3 provided by the present embodiment includes two imaging optical paths: (1) a first objective lens 7b + a first filter 8b + a first image sensor 203a, and (2) a second objective lens 7c + a second filter array 8c + a second image sensor 203 b. The image signals formed in the first image sensor 203a and the second image sensor 203b are each transmitted to the processor 300 through the signal cable 10 b.
The characteristics of the first filter 8b are similar to those of the first filter F1 in the embodiment described with reference to fig. 3; for related details, refer to the corresponding descriptions in the above embodiments. Thus, the first image sensor 203a in the present embodiment is configured to receive the green fluorescence passed through the first filter 8b and form a fluorescence image signal of the subject.
Here, the second filter array 8c is also substantially the same as the first filter array 8a in the above embodiment, except that: the second filter array 8c in the present embodiment is formed by arranging a second filter, a third filter, a fourth filter, and a fifth filter. Specifically, the second filter array 8c may be constituted by a minimum bayer filter unit repeating arrangement as shown in fig. 9, where G in fig. 9 denotes a second filter for transmitting a mixed light of green fluorescent light and green narrow-band light and cutting off the excitation light and other illumination light; UV denotes a third filter for transmitting only blue-violet narrowband light reflected by the subject; b denotes a fourth filter for transmitting only blue broadband light reflected by the subject; r denotes a fifth filter for transmitting only red broadband light reflected by the subject.
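A Bayer-like array of this kind is just a minimal unit tiled across the sensor. The sketch below is illustrative only; the exact 2x2 placement of the four filters within the unit is an assumption, since the text defers the layout to fig. 9:

```python
import numpy as np

# Assumed minimal repeating unit of the second filter array 8c:
#   G  = second filter (green fluorescence + green narrow-band)
#   UV = third filter  (blue-violet narrow-band)
#   B  = fourth filter (blue broadband)
#   R  = fifth filter  (red broadband)
UNIT = np.array([["G", "UV"],
                 ["B", "R"]])

def build_filter_array(rows, cols):
    """Tile the minimal Bayer-like unit to cover a sensor of
    rows x cols pixels, cropping any overhang at the edges."""
    reps_y = -(-rows // 2)  # ceil(rows / 2)
    reps_x = -(-cols // 2)  # ceil(cols / 2)
    return np.tile(UNIT, (reps_y, reps_x))[:rows, :cols]
```

Each sensor pixel then sees only the light passed by the filter type at its position, which is why the per-band image signals described next must later be interpolated back to full resolution.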
Thus, the second image sensor 203b in the present embodiment is specifically configured to: receiving the green fluorescence and the green narrowband light passing through the second filter and forming a green narrowband light image signal of the subject; receiving the blue-violet narrowband light passed through the third filter and forming a blue-violet narrowband light image signal of the subject, receiving the blue broadband light passed through the fourth filter and forming a blue broadband light image signal of the subject, and receiving the red broadband light passed through the fifth filter and forming a red broadband light image signal of the subject.
The image signals of the same frame formed on the first image sensor 203a and the second image sensor 203b can be synchronously transmitted to the processor 300 through the signal cable 10 b. Upon receiving the fluorescence image signal transmitted by the first image sensor 203a, the processor 300 may generate and output a fluorescence image directly based on that signal. Upon receiving the green narrow-band light image signal, the blue-violet narrow-band light image signal, the blue broadband light image signal and/or the red broadband light image signal sent by the second image sensor 203b, the processor 300 needs to perform interpolation processing on these image signals and then generate the corresponding images based on the interpolated signals; for example, it generates a green narrow-band light image of the subject based on the interpolated green narrow-band light image signal; and/or a white light image of the subject based on the interpolated red broadband light image signal, green broadband light image signal and blue broadband light image signal; and/or a narrow-band image of the subject based on the interpolated blue-violet narrow-band light image signal and green narrow-band light image signal; and/or a blue-violet narrow-band light image of the subject based on the interpolated blue-violet narrow-band light image signal; and/or a blue broadband light image of the subject based on the interpolated blue broadband light image signal; and/or a red broadband light image of the subject based on the interpolated red broadband light image signal.
It should be noted that, since the first image sensor 203a and the second image sensor 203b have a spatial position offset, if the processor 300 needs to perform the fusion processing on the fluorescence image signal acquired by the first image sensor 203a and the image signal acquired by the second image sensor 203b, the processor 300 needs to perform the alignment correction processing on the fluorescence image signal and the image signal acquired by the second image sensor 203b, and then perform the image fusion. For example, if the fluorescence image signal and the green narrowband optical image signal need to be fused to generate a fused image, the processor 300 further needs to perform alignment correction on the positions of the pixel points in the acquired fluorescence image signal and the green narrowband optical image signal after the interpolation processing; further, a fusion image is generated based on the fluorescence image signal and the green narrowband light image signal after the alignment correction.
In practical application, the processor 300 may first recognize the user's operation of the electronic endoscope to determine the imaging mode, then control the light source controller 4b according to the selected imaging mode and select the corresponding image processing mode. For example, when the processor recognizes that the user has set the device to the green narrow-band light and autofluorescence fusion imaging mode, it may send a signal to the light source controller 4b to control the green narrow-band light source 1b and the fluorescence excitation light source 2b to emit light simultaneously in a certain proportion; at the same time, the processor 300 may control the first image sensor 203a and the second image sensor 203b to image synchronously and to transmit the collected fluorescence image signal and green narrow-band light image signal, respectively, to the processor 300. After receiving the fluorescence image signal and the green narrow-band light image signal, the processor 300 may perform interpolation processing on the green narrow-band light image signal; then, according to the positions of the first image sensor 203a and the second image sensor 203b, perform spatial position matching of the two images, correcting the pixel points of the fluorescence image and of the interpolated green narrow-band light image to consistent spatial positions; then fuse the corrected fluorescence image signal with the green narrow-band light image signal to obtain a fused image; after fusion is completed, the fused image may be output to a monitor for display.
As can be seen from the above embodiment, collecting the green fluorescence signal with a separate image sensor enlarges the photosensitive/imaging area available to the green fluorescence signal and improves the imaging quality of the fluorescence image.
In summary, the electronic endoscope provided by the embodiment of the present application can realize effective separation of green narrowband light and a fluorescence signal at an imaging end through a design of narrowband illumination and narrowband notch imaging, and on this basis, can realize synchronous imaging of a narrowband diffuse reflection light signal and a fluorescence signal under an endoscope system.
Further, the green narrow-band light image has the characteristic of high specificity (i.e., it reflects the positions of blood vessels or hemoglobin in the mucosa well and offers good contrast), while the autofluorescence image has the characteristic of high sensitivity (i.e., lesion tissue is identified with high sensitivity, so the positions of tumor or inflammatory tissue can be recognized quickly). By synchronously observing or fusing the green narrow-band light image and the autofluorescence image, the embodiment of the present application combines the high specificity of the green narrow-band light image with the high sensitivity of the autofluorescence image, increasing the detection rate of early cancer tissue and reducing its false detection rate.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.
It is further noted that, in the present specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. An electronic endoscope, comprising:
a light source device for emitting illumination light and excitation light to a subject; wherein the illumination light includes green narrow-band light, and the excitation light is light for exciting the subject to generate green fluorescence;
an image pickup device including a first filter, a second filter, and an image sensor; wherein the first filter is used for transmitting green fluorescence generated by the object to be shot and cutting off the illumination light and the excitation light; the second filter is used for cutting off the exciting light; the image sensor is used for receiving the green fluorescence which passes through the first filter and forming a fluorescence image signal of the object, and is also used for receiving the green fluorescence and the green narrow-band light which pass through the second filter and forming a green narrow-band light image signal of the object;
a processor for generating a fluorescence image based on the fluorescence image signal; and/or generating a green narrowband light image based on the green narrowband light image signal; and/or generating a fused image based on the fluorescence image signal and the green narrowband light image signal.
2. The electronic endoscope of claim 1, wherein the first filter is a filter having a transmission characteristic corresponding to a spectral range of the green fluorescence and a notch characteristic corresponding to a spectral range of the green narrowband light.
3. The electronic endoscope of claim 2, wherein a center wavelength at the notch of the first filter coincides with a center wavelength of the green narrowband light, and a half-peak width at the notch is greater than a half-peak width of the green narrowband light.
4. The electronic endoscope of claim 1, wherein the green narrowband light has a half-peak width of less than 20 nm.
5. The electronic endoscope of claim 1, wherein the central wavelength of the green narrowband light is between 520nm and 580 nm.
6. The electronic endoscope of claim 1, wherein the central wavelength of the excitation light is between 380nm and 480 nm.
7. The electronic endoscope of any one of claims 1-6, wherein the first filter and the second filter are arranged to form a first filter array, the image sensor being disposed behind an optical path of the first filter array;
then, the processor is specifically configured to:
respectively carrying out interpolation processing on the fluorescence image signal and the green narrow-band light image signal;
generating a fluorescence image of the subject based on the fluorescence image signal subjected to the interpolation processing; and/or,
generating a green narrowband light image of the subject based on the green narrowband light image signal subjected to the interpolation processing; and/or,
and generating a fused image of the subject based on the fluorescence image signal and the green narrow-band light image signal after the interpolation processing.
8. The electronic endoscope of claim 7, wherein the excitation light is blue-violet narrowband light, and the illumination light further comprises blue broadband light and red broadband light; the second filter is also used for cutting off the blue broadband light and the red broadband light;
the first filter array further comprises: a third filter, a fourth filter and a fifth filter; wherein the third filter is configured to transmit only blue-violet narrowband light reflected by the subject, the fourth filter is configured to transmit only blue broadband light reflected by the subject, and the fifth filter is configured to transmit only red broadband light reflected by the subject;
then,
the image sensor is further configured to: receiving the blue-violet narrowband light that passes through the third filter and forms a blue-violet narrowband light image signal of the subject, receiving the blue broadband light that passes through the fourth filter and forms a blue broadband light image signal of the subject, and receiving the red broadband light that passes through the fifth filter and forms a red broadband light image signal of the subject;
the processor is further configured to:
generating a white light image of the subject based on the red broadband light image signal, the green broadband light image signal, and the blue broadband light image signal; and/or,
generating a narrow-band image of the subject based on the blue-violet narrow-band light image signal and the green narrow-band light image signal; and/or,
generating a blue-violet narrowband light image of the subject based on the blue-violet narrowband light image signal; and/or,
generating a blue broadband light image of the subject based on the blue broadband light image signal; and/or,
generating a red broadband light image of the subject based on the red broadband light image signal.
9. The electronic endoscope of any one of claims 1 through 6, wherein the image sensor comprises a first image sensor and a second image sensor, the first filter and the second filter being disposed independently of each other, wherein,
the first image sensor is disposed downstream of the first filter in the optical path, and is configured to receive the green fluorescence passing through the first filter and form a fluorescence image signal of the subject;
the second image sensor is disposed downstream of the second filter in the optical path, and is configured to receive the green fluorescence and the green narrow-band light passing through the second filter and form a green narrow-band light image signal of the subject;
then the processor generating a fused image based on the fluorescence image signal and the green narrow-band light image signal comprises:
performing alignment correction on the pixel positions of the fluorescence image signal and the green narrow-band light image signal; and
generating the fused image based on the alignment-corrected fluorescence image signal and green narrow-band light image signal.
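The two steps of claim 9 can be sketched as below. This is only an illustration under simplifying assumptions: the misalignment between the two sensors is modeled as a small integer translation found by brute-force search, and the fusion is a plain weighted blend. A real dual-sensor endoscope would typically use a calibrated geometric correction rather than a per-frame search, and the function names are hypothetical:

```python
import numpy as np

def estimate_shift(ref, mov, max_shift=3):
    """Brute-force search for the integer (dy, dx) translation that
    best aligns `mov` to `ref` (minimum sum of squared differences)."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(mov, dy, axis=0), dx, axis=1)
            err = np.sum((ref - shifted) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def fuse(fluor, green_nb, alpha=0.5):
    """Align the green narrow-band signal to the fluorescence signal,
    then blend the two aligned signals with weight `alpha`."""
    dy, dx = estimate_shift(fluor, green_nb)
    aligned = np.roll(np.roll(green_nb, dy, axis=0), dx, axis=1)
    return alpha * fluor + (1 - alpha) * aligned
```

In practice the blend weight, or a more elaborate fusion rule, would be chosen so that the fluorescence signal remains distinguishable against the green narrow-band background.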
10. The electronic endoscope of claim 9, wherein the excitation light is blue-violet narrowband light, and the illumination light further comprises blue broadband light and red broadband light; the second filter is also used for cutting off the blue broadband light and the red broadband light; the image pickup apparatus further includes: a third filter, a fourth filter and a fifth filter;
wherein the third filter is configured to transmit only blue-violet narrowband light reflected by the subject, the fourth filter is configured to transmit only blue broadband light reflected by the subject, and the fifth filter is configured to transmit only red broadband light reflected by the subject;
the second filter, the third filter, the fourth filter, and the fifth filter are arranged to form a second filter array, the second image sensor is located downstream of the second filter array in the optical path, and the second image sensor is further configured to: receive the blue-violet narrowband light passing through the third filter and form a blue-violet narrowband light image signal of the subject; receive the blue broadband light passing through the fourth filter and form a blue broadband light image signal of the subject; and receive the red broadband light passing through the fifth filter and form a red broadband light image signal of the subject;
then the processor is further configured to:
generate a white light image of the subject based on the red broadband light image signal, the green broadband light image signal, and the blue broadband light image signal; and/or
generate a narrow-band image of the subject based on the blue-violet narrow-band light image signal and the green narrow-band light image signal; and/or
generate a blue-violet narrowband light image of the subject based on the blue-violet narrowband light image signal; and/or
generate a blue broadband light image of the subject based on the blue broadband light image signal; and/or
generate a red broadband light image of the subject based on the red broadband light image signal.
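Claim 10 places four filters in front of a single sensor as a filter array, so each raw frame interleaves the four image signals. A minimal sketch of separating them, assuming a repeating 2×2 mosaic tile whose per-position filter assignment is hypothetical (the claim does not fix the layout):

```python
import numpy as np

def split_filter_array(raw):
    """Split a raw mosaic frame from a sensor behind a 2x2 filter array
    into four per-filter sub-images. The tile assignment below is a
    hypothetical example:
      (0,0) second filter (green),  (0,1) third filter (blue-violet NB),
      (1,0) fourth filter (blue BB), (1,1) fifth filter (red BB).
    """
    return {
        "green":       raw[0::2, 0::2],
        "blue_violet": raw[0::2, 1::2],
        "blue":        raw[1::2, 0::2],
        "red":         raw[1::2, 1::2],
    }
```

Each sub-image has half the raw resolution in each dimension; a practical processor would interpolate (demosaic) the missing positions before composing the white-light or narrow-band images.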
CN201922004888.7U 2019-11-18 2019-11-18 Electronic endoscope Active CN211213049U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201922004888.7U CN211213049U (en) 2019-11-18 2019-11-18 Electronic endoscope

Publications (1)

Publication Number Publication Date
CN211213049U true CN211213049U (en) 2020-08-11

Family

ID=71931135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201922004888.7U Active CN211213049U (en) 2019-11-18 2019-11-18 Electronic endoscope

Country Status (1)

Country Link
CN (1) CN211213049U (en)

Similar Documents

Publication Publication Date Title
JP5611892B2 (en) Endoscope system and method for operating endoscope system
EP2641531B1 (en) Augmented stereoscopic visualization for a surgical robot
US9066676B2 (en) Endoscopic image display apparatus
WO2017145529A1 (en) Calculation system
JP4855728B2 (en) Illumination device and observation device
US7102142B2 (en) Method of apparatus for generating fluorescence diagnostic information
JP5911496B2 (en) ENDOSCOPE SYSTEM, PROCESSOR DEVICE THEREOF, AND METHOD FOR OPERATING ENDOSCOPY SYSTEM
JP5496075B2 (en) Endoscopic diagnosis device
JP5654511B2 (en) Endoscope system, processor device for endoscope system, and method for operating endoscope system
JP5757891B2 (en) Electronic endoscope system, image processing apparatus, operation method of image processing apparatus, and image processing program
US9271635B2 (en) Fluorescence endoscope apparatus
JP5159904B2 (en) Endoscopic diagnosis device
US20100245616A1 (en) Image processing device, imaging device, computer-readable recording medium, and image processing method
JP2004504090A (en) Compact fluorescent endoscopic imaging system
JP2013013656A (en) Endoscope system, endoscope system processor and image display method
JP2016052391A (en) Imaging system
WO2014084134A1 (en) Observation device
US11160442B2 (en) Endoscope apparatus
JP6259747B2 (en) Processor device, endoscope system, operating method of processor device, and program
JP7350954B2 (en) Endoscopic image processing device, endoscope system, operating method of endoscopic image processing device, endoscopic image processing program, and storage medium
WO2014156604A1 (en) Endoscope system, operational method therefor and processor device
JP2010525922A (en) Improving image acquisition by filtering in multiple endoscopic systems
CN110731748A (en) electronic endoscope
CN211213049U (en) Electronic endoscope
CN216984858U (en) Illumination module, endoscope imaging system and endoscope

Legal Events

Date Code Title Description
GR01 Patent grant