CN108670203A - A kind of imaging device - Google Patents
A kind of imaging device
- Publication number
- CN108670203A CN108670203A CN201810558755.1A CN201810558755A CN108670203A CN 108670203 A CN108670203 A CN 108670203A CN 201810558755 A CN201810558755 A CN 201810558755A CN 108670203 A CN108670203 A CN 108670203A
- Authority
- CN
- China
- Prior art keywords
- light
- image
- narrow
- fluorescence
- band
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0062—Arrangements for scanning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0033—Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0071—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by measuring fluorescence emission
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
- Endoscopes (AREA)
Abstract
This application discloses an imaging device, which includes a light source, an acquisition device and a processing device. The light source is used for emitting preset multi-path narrow-band light; the multi-path narrow-band light is used to form white light and narrow-band characteristic light, and also serves as fluorescence excitation light. The acquisition device collects multi-path reflected light to obtain multi-path reflected light data, and collects fluorescence to obtain fluorescence data. The multi-path reflected light is the light reflected by the tissue to be diagnosed from the narrow-band light that forms the white light and the narrow-band characteristic light; the fluorescence is the autofluorescence of the tissue to be diagnosed after the narrow-band light serving as fluorescence excitation light excites the tissue. The processing device forms a white light image and a characteristic light image of the tissue to be diagnosed according to the multi-path reflected light data, and forms a fluorescence image of the tissue to be diagnosed according to the multi-path reflected light data and the fluorescence data. The imaging device of the present application can therefore generate a white light image, a characteristic light image and a fluorescence image.
Description
Technical Field
The present application relates to the field of imaging, and more particularly, to an imaging apparatus.
Background
At present, in order to improve the accuracy of early cancer diagnosis for a tissue to be diagnosed, it is necessary to determine whether the tissue has early canceration by combining information from different types of images of that tissue. The different types of images may include white light images, characteristic light images, and fluorescence images. The white light image provides color and structural information of the tissue under white light illumination; the characteristic light image provides blood-vessel network contrast information based on hemoglobin absorption in the tissue; the fluorescence image provides autofluorescence information of the tissue.
At present, an imaging device can generate only a white light image and a characteristic light image, or only a white light image and a fluorescence image. Thus, the image combinations used to determine whether the tissue to be diagnosed has early canceration are generally: the white light image combined with the characteristic light image, or the white light image combined with the fluorescence image.
Different types of images reflect different biological characteristic information of the tissue to be diagnosed, and the more such information is available, the higher the accuracy of judging whether the tissue has early canceration. Therefore, there is a need for an imaging device capable of generating white light images, characteristic light images, and fluorescence images, so that the accuracy of judging whether the tissue to be diagnosed has early canceration can be improved by combining all three.
Disclosure of Invention
Based on this, the present application proposes an imaging device to generate a white light image, a characteristic light image, and a fluorescence image.
The technical scheme provided by the application is as follows:
the imaging device disclosed in the application includes: a light source, an acquisition device and a processing device; wherein,
the light source is used for emitting preset multi-path narrow-band light; the multi-path narrow-band light is used for forming white light and narrow-band characteristic light and is used as fluorescence exciting light;
the acquisition device is used for acquiring multi-path reflected light to obtain multi-path reflected light data and acquiring fluorescence to obtain fluorescence data; the multipath reflected light is: the narrow-band light of the multi-path narrow-band light, which forms white light and narrow-band characteristic light, is reflected by the tissue to be diagnosed; the fluorescence is: after the narrow-band light serving as fluorescence excitation light in the multi-path narrow-band light excites the tissue to be diagnosed, the spontaneous fluorescence of the tissue to be diagnosed;
the processing device is used for forming a white light image and a characteristic light image of the tissue to be diagnosed according to the multipath reflected light data and forming a fluorescence image of the tissue to be diagnosed according to the multipath reflected light data and the fluorescence data.
Wherein the acquisition device comprises a first sensor;
the light source is specifically used for emitting first narrow-band light and second narrow-band light in a time-sharing manner; the first narrow-band light is: narrow-band light used for forming white light and narrow-band characteristic light in the multi-path narrow-band light; the second narrow-band light is: narrow-band light serving as fluorescence excitation light in the multi-path narrow-band light;
the first sensor is used for collecting the multipath reflected light reflected by the tissue to be diagnosed through the first narrow-band light to generate multipath reflected light data, and collecting the spontaneous fluorescence generated by the tissue to be diagnosed excited by the second narrow-band light to generate fluorescence data.
The acquisition device comprises objective lenses and a second sensor, and the number of the objective lenses is a target number; the target number is the number of paths of narrow-band light emitted by the light source; the preset wavelength ranges allowed to pass through by different objective lenses do not overlap with each other; the light allowed to pass through by different objective lenses does not overlap on the second sensor;
the objective lens is used for correspondingly focusing the multipath reflected light and the fluorescence on non-overlapping areas of the second sensor;
the second sensor is used for acquiring the multi-path reflected light to obtain the multi-path reflected light data and acquiring the fluorescence to obtain the fluorescence data.
The acquisition device comprises an optical filter and a third sensor;
the optical filter is used for allowing a different path of the reflected light or the fluorescence to pass in different preset time periods; wherein only one path of reflected light or the fluorescence is allowed to pass in each preset time period;
the third sensor is used for acquiring the multi-path reflected light to obtain the multi-path reflected light data and acquiring the fluorescence to obtain the fluorescence data.
Wherein the processing device is further configured to:
inputting the white light image, the characteristic light image and the fluorescence image into a trained preset model; the trained preset model is used for identifying an abnormal tissue area of suspected cancer in the image and determining the probability that the abnormal tissue area is the cancer;
acquiring, as output by the trained preset model, a first abnormal tissue region of suspected cancer in the white light image and a first probability that the first abnormal tissue region is cancer, a second abnormal tissue region of suspected cancer in the characteristic light image and a second probability that the second abnormal tissue region is cancer, and a third abnormal tissue region of suspected cancer in the fluorescence image and a third probability that the third abnormal tissue region is cancer;
determining a fourth abnormal tissue region suspected of being cancerous in the tissue to be diagnosed based on the first abnormal tissue region, the second abnormal tissue region, and the third abnormal tissue region;
determining a fourth probability that the fourth abnormal tissue region is cancer based on the first, second, and third probabilities.
Wherein the processing device is further configured to:
after the white light image, the characteristic light image and the fluorescence image are generated, any two or three of the white light image, the characteristic light image and the fluorescence image are fused to obtain a fused image.
Wherein the processing device is further configured to mark the fourth abnormal tissue region in the fusion image.
Wherein the image forming apparatus further comprises: a display device;
the display device is used for displaying one or more of the white light image, the characteristic light image, the fluorescence image, the fusion image and the fusion image marked with the fourth abnormal tissue area.
Wherein, the display device is further configured to:
displaying diagnostic information in a case where the fused image marked with the fourth abnormal tissue region is displayed; the diagnostic information indicates a probability that the fourth abnormal tissue region is cancer.
Wherein, the light source is used for emitting multipath narrow-band light and comprises:
the light source is specifically used for emitting red narrow-band light, green narrow-band light, blue narrow-band light and fluorescence excitation light.
The beneficial effects of this application are as follows:
because the light source of an imaging device in the prior art is a broad-spectrum light source, or the number of its narrow bands is insufficient, it can only generate characteristic light and white light, or white light and fluorescence excitation light, but cannot generate characteristic light, white light and fluorescence excitation light at the same time; therefore, a white light image, a characteristic light image and a fluorescence image cannot all be generated. The light source in the embodiments of the present application emits multiple paths of narrow-band light, which can form white light and narrow-band characteristic light and can also be used as fluorescence excitation light. The narrow-band light used for forming the white light and the narrow-band characteristic light is reflected by the tissue to be diagnosed as multiple paths of reflected light, and the narrow-band light serving as fluorescence excitation light excites the tissue to be diagnosed, after which the tissue emits autofluorescence. The wavelength range of each path of narrow-band light can be set as required, so that the resulting multi-path narrow-band light can form white light and narrow-band characteristic light and can serve as fluorescence excitation light. The acquisition device in the embodiments of the application can acquire the multiple paths of reflected light to obtain multi-path reflected light data, and acquire the fluorescence to obtain fluorescence data. Therefore, the imaging apparatus in the embodiments of the present application can generate a white light image and a characteristic light image based on the obtained multi-path reflected light data, and generate a fluorescence image based on the obtained fluorescence data. Further, based on the information of the generated white light image, characteristic light image and fluorescence image, the accuracy of judging whether the tissue to be diagnosed has early canceration is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
Fig. 1 is a schematic structural view of an image forming apparatus in the present application;
FIG. 2 is a schematic structural diagram of yet another imaging device of the present application;
FIG. 3 is a schematic structural diagram of yet another imaging device of the present application;
FIG. 4 is a schematic structural diagram of yet another imaging device of the present application;
fig. 5 is a schematic structural view of still another image forming apparatus of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, there is shown a schematic structural diagram of an imaging apparatus in the present application, which may include a light source 101, a first sensor 102, a processing device 103, and a display device 104.
Among them, the light source 101 may emit red narrow-band light with a central wavelength of 620 nm, green narrow-band light with a central wavelength of 532 nm, blue narrow-band light with a central wavelength of 450 nm, and blue-violet narrow-band light with a central wavelength of 415 nm. The red narrow-band light with a central wavelength of 620 nm, the green narrow-band light with a central wavelength of 532 nm and the blue narrow-band light with a central wavelength of 450 nm can form white light; the blue narrow-band light with a central wavelength of 450 nm and the blue-violet light with a central wavelength of 415 nm can form narrow-band characteristic light; in addition, the light source 101 may emit light with a central wavelength of 405 nm as the fluorescence excitation light.
After the light source 101 emits the red narrow-band light, the green narrow-band light, the blue narrow-band light, the blue-violet narrow-band light and the fluorescence excitation light, the red, green, blue and blue-violet narrow-band lights respectively irradiate the tissue to be diagnosed, and the tissue to be diagnosed reflects red narrow-band reflected light, green narrow-band reflected light, blue narrow-band reflected light and blue-violet narrow-band reflected light; the fluorescence excitation light irradiates the tissue to be diagnosed, and the tissue spontaneously emits fluorescence after being excited.
In practical application, the light source 101 can also emit red narrow-band light with a central wavelength of 620nm, green narrow-band light with a wavelength range of 540-600 nm, and blue narrow-band light with a central wavelength of 450 nm. The white light can be formed by red narrow-band light with the central wavelength of 620nm, green narrow-band light with the wavelength range of 540-600 nm and blue narrow-band light with the central wavelength of 450 nm; the green narrow-band light with the wavelength range of 540-600 nm and the blue narrow-band light with the central wavelength of 450nm can form narrow-band characteristic light; the green narrow-band light with the wavelength range of 540-600 nm can be used as fluorescence excitation light.
The tissue to be diagnosed reflects red narrow-band reflected light, green narrow-band reflected light and blue narrow-band reflected light; meanwhile, the green narrow-band light with the wavelength range of 540-600 nm can also excite the autofluorescence of the tissue to be diagnosed.
The above gives two implementations of the multiple paths of narrow-band light emitted by the light source 101; in practical applications, the light source 101 may also emit other combinations of narrow-band light. The light source 101 only needs to emit multiple paths of narrow-band light that can form white light and narrow-band characteristic light and can also serve as fluorescence excitation light. This embodiment does not specifically limit the color, wavelength range, central wavelength or other specific content of the multi-path narrow-band light emitted by the light source 101.
The first sensor 102 is configured to acquire multiple paths of reflected light reflected by the tissue to be diagnosed to obtain multiple paths of reflected light data, and acquire spontaneous fluorescence of the tissue to be diagnosed to obtain fluorescence data.
When the first sensor 102 collects the multiple paths of reflected light reflected by the tissue to be diagnosed and the fluorescence spontaneously emitted by the tissue, the fluorescence may be covered by the multiple paths of reflected light. Therefore, in this embodiment, the light source 101 emits a first narrow-band light and a second narrow-band light in a time-sharing manner, wherein the first narrow-band light is the narrow-band light used for forming the white light and the narrow-band characteristic light among the multiple paths of narrow-band light, and the second narrow-band light is the narrow-band light serving as the fluorescence excitation light among the multiple paths of narrow-band light.
When the light source 101 emits the first narrow-band light and the second narrow-band light in a time-sharing manner, the first sensor 102 can collect the multiple paths of reflected light and the fluorescence in a time-sharing manner, and thus obtain the multi-path reflected light data and the fluorescence data, so that the fluorescence is prevented from being covered by the multiple paths of reflected light.
For example, suppose the light source 101 emits the following multiple paths of narrow-band light: red narrow-band light with a central wavelength of 620 nm, green narrow-band light with a central wavelength of 532 nm, blue narrow-band light with a central wavelength of 450 nm, blue-violet light with a central wavelength of 415 nm, and fluorescence excitation light with a central wavelength of 405 nm. The red narrow-band light with a central wavelength of 620 nm, the green narrow-band light with a central wavelength of 532 nm and the blue narrow-band light with a central wavelength of 450 nm are used for synthesizing white light; the blue narrow-band light with a central wavelength of 450 nm and the blue-violet light with a central wavelength of 415 nm are used for forming narrow-band characteristic light; the fluorescence excitation light with a central wavelength of 405 nm is used for exciting the autofluorescence of the tissue to be diagnosed. In this case, the light source 101 first emits the red narrow-band light with a central wavelength of 620 nm, the green narrow-band light with a central wavelength of 532 nm, the blue narrow-band light with a central wavelength of 450 nm and the blue-violet narrow-band light with a central wavelength of 415 nm; the four narrow-band lights are reflected by the tissue to be diagnosed as four paths of reflected light, and the first sensor 102 collects the four paths of reflected light to obtain four paths of reflected light data.
Then, the light source 101 emits fluorescence excitation light having a central wavelength of 405nm, the emitted fluorescence excitation light excites autofluorescence of the tissue to be diagnosed, and then, the first sensor 102 acquires the fluorescence to obtain fluorescence data.
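For illustration only, the following minimal sketch summarizes the time-sharing acquisition sequence described above. The `light_source`/`sensor` interface (`emit`, `capture`) and the wavelength labels are assumptions introduced for the example and are not part of the original disclosure.

```python
# Illustrative sketch of the time-sharing acquisition described above.
# The light_source/sensor interfaces (emit, capture) are hypothetical.

def acquire_frame(light_source, sensor):
    """Acquire one set of multi-path reflected light data and fluorescence data."""
    # Phase 1: emit the first narrow-band light (the paths that form white
    # light and narrow-band characteristic light) and collect the reflections.
    light_source.emit(["red_620nm", "green_532nm", "blue_450nm", "violet_415nm"])
    reflected_data = sensor.capture()

    # Phase 2: emit only the fluorescence excitation light and collect the
    # autofluorescence, so the weak fluorescence is not covered by the
    # much stronger reflected light.
    light_source.emit(["excitation_405nm"])
    fluorescence_data = sensor.capture()

    return reflected_data, fluorescence_data
```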
After the first sensor 102 acquires the multiple paths of reflected light data and the fluorescence data, the processing device 103 generates a white light image and a characteristic light image based on the multiple paths of reflected light data, and generates a fluorescence image based on the fluorescence data. Specifically, when the light source 101 emits the red narrow-band light with a central wavelength of 620 nm, the green narrow-band light with a central wavelength of 532 nm, the blue narrow-band light with a central wavelength of 450 nm, the blue-violet narrow-band light with a central wavelength of 415 nm and the fluorescence excitation light with a central wavelength of 405 nm, the processing device 103 convolves the red reflected light data, the green reflected light data and the blue reflected light data with the convolution factor corresponding to the white light image to obtain the white light image; convolves the green reflected light data, the blue reflected light data and the blue-violet reflected light data with the convolution factor corresponding to the characteristic light image to obtain the characteristic light image; and convolves the fluorescence data and the blue reflected light data with the convolution factor corresponding to the fluorescence image to obtain the fluorescence image.
The convolution factors corresponding to the white light image, the characteristic light image and the fluorescence image are matrices; the number of rows of the matrix corresponding to each image is the same as the number of types of light data required for generating that image, and the number of columns of the matrix is 3. For example, the white light image corresponds to three types of data, i.e., the red reflected light data, the green reflected light data and the blue reflected light data, so the number of rows of the convolution factor corresponding to the white light image is 3; the characteristic light image corresponds to three types of data, i.e., the green reflected light data, the blue reflected light data and the blue-violet reflected light data, so the number of rows of the convolution factor corresponding to the characteristic light image is 3; the fluorescence image corresponds to two types of data, i.e., the fluorescence data and the blue reflected light data, so the number of rows of the convolution factor corresponding to the fluorescence image is 2. In practical applications, besides the convolution method, the fluorescence image can also be generated by dividing the fluorescence data by the blue narrow-band light data.
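For illustration only, the following NumPy sketch shows how per-channel acquisition data can be mixed into the three images with such factor matrices and how the ratio-based fluorescence image can be formed; the factor values and frame size used here are placeholders, since the disclosure does not specify them.

```python
# Minimal sketch of forming the three images from per-channel data with
# "convolution factor" matrices (rows = number of input channels, columns = 3).
# All numeric values are placeholders, not values from the disclosure.
import numpy as np

H, W = 480, 640  # example frame size
red, green, blue, violet, fluo = (np.random.rand(H, W) for _ in range(5))

def mix(channels, factor):
    """Mix N channel images (each H x W) into an RGB image via an (N, 3) factor."""
    stacked = np.stack(channels, axis=-1)       # (H, W, N)
    return np.clip(stacked @ factor, 0.0, 1.0)  # (H, W, 3)

# White light image: red/green/blue reflected data with a 3x3 factor.
white_rgb = mix([red, green, blue], np.eye(3))

# Characteristic light image: green/blue/blue-violet data with a 3x3 factor.
feature_rgb = mix([green, blue, violet], np.array([[0.0, 1.0, 0.0],
                                                   [0.0, 0.0, 1.0],
                                                   [1.0, 0.0, 0.0]]))

# Fluorescence image: fluorescence and blue reflected data with a 2x3 factor.
fluo_rgb = mix([fluo, blue], np.array([[0.0, 1.0, 0.0],
                                       [1.0, 0.0, 0.0]]))

# Alternative mentioned in the text: divide the fluorescence data by the
# blue narrow-band reflected data (small constant avoids division by zero).
fluo_ratio = fluo / (blue + 1e-6)
```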
The display device 104 is used for displaying one or more of the white light image, the characteristic light image and the fluorescence image generated by the processing device 103.
Referring to fig. 2, there is shown a schematic structural view of still another image forming apparatus in the present application, the image forming apparatus including: a light source 201, an objective lens 202, an objective lens 203, an objective lens 204, an objective lens 205 and a second sensor 206, processing means 207 and display means 208.
The specific content of the multipath narrowband light emitted by the light source 201 is the same as that of the light source 101 in the imaging device corresponding to fig. 1, and the specific content may refer to the specific content of the light source 101 in fig. 1, which is not described herein again.
In the imaging device, the objective lenses 202, 203, 204 and 205 have specific coatings, so that different objective lenses pass narrow-band light in different wavelength ranges. For example, when the multiple paths of narrow-band light emitted by the light source are red narrow-band light, green narrow-band light, blue narrow-band light and fluorescence excitation light respectively, the objective lens 202 can only pass the red narrow-band light, the objective lens 203 can only pass the green narrow-band light, the objective lens 204 can only pass the blue narrow-band light, and the objective lens 205 can only pass the fluorescence. In the present imaging apparatus, the positional relationship between the objective lenses and the second sensor 206 needs to be such that the light beams passed by the different objective lenses do not overlap on the second sensor. For example, the objective lenses 202, 203, 204 and 205 are arranged in a row with a certain distance between adjacent objective lenses, so that the light passing through different objective lenses irradiates different, mutually non-overlapping areas of the second sensor 206.
It should be noted that, in this imaging apparatus, since the light source emits four kinds of narrow-band light, the number of objective lenses is 4. In practical applications, the number of objective lenses only needs to be equal to the number of paths of narrow-band light emitted by the light source, and is therefore determined by the specific configuration of the light source.
After the objective lens 202, the objective lens 203, the objective lens 204 and the objective lens 205 focus the multiple paths of reflected light reflected by the tissue to be diagnosed and the fluorescence onto mutually non-overlapping regions of the second sensor 206, the second sensor 206 collects the reflected light in the different regions to obtain the multi-path reflected light data, and collects the fluorescence to obtain the fluorescence data.
After the second sensor 206 obtains the multiple paths of reflected light data and the fluorescence data, the processing device 207 generates a white light image and a characteristic light image based on the multiple paths of reflected light data, and generates a fluorescence image based on the fluorescence data. For a specific implementation manner of the processing device 207 generating the white light image, the characteristic light image and the fluorescence image, reference may be made to the process of the processing device 103 in fig. 1 generating the white light image, the characteristic light image and the fluorescence image, which is not described herein again.
The display device 208 is used for displaying one or more of the white light image, the characteristic light image and the fluorescence image generated by the processing device 207.
Referring to fig. 3, there is shown a schematic structural diagram of still another imaging apparatus in the present application, which may include a light source 301, an optical filter 302, a third sensor 303, a processing device 304, and a display device 305.
The specific content of the multi-path narrow-band light emitted by the light source 301 is the same as that of the light source 101 in the imaging device corresponding to fig. 1; for details, refer to the description of the light source 101 in fig. 1, which is not repeated here.
In the imaging apparatus, the filter 302 passes narrow-band light of different wavelength ranges to irradiate the third sensor 303 at different times, so that the third sensor 303 collects each path of reflected light reflected by the tissue to be diagnosed and the spontaneous fluorescence of the tissue to be diagnosed, respectively. Specifically, the filter 302 may be a liquid crystal tunable filter, which only allows one color of reflected light or fluorescence to pass through in a preset time period. Therefore, the third sensor 303 may collect different reflected lights to obtain multiple reflected light data and collect fluorescence to obtain fluorescence data in a plurality of consecutive preset time periods.
After the third sensor 303 obtains the multiple paths of reflected light data and the fluorescence data, the processing device 304 generates a white light image and a characteristic light image based on the multiple paths of reflected light data, and generates a fluorescence image based on the fluorescence data. For a specific implementation manner of the processing device 304 generating the white light image, the characteristic light image and the fluorescence image, reference may be made to the process of the processing device 103 in fig. 1 generating the white light image, the characteristic light image and the fluorescence image, which is not described herein again.
The display device 305 is used for displaying one or more of the white light image, the characteristic light image and the fluorescence image generated by the processing device 304.
Referring to fig. 4, there is shown a schematic structural diagram of still another imaging apparatus in the present application, which includes a light source 401, an acquisition device 402, a processing device 403 and a display device 404.
The light source 401 has the same function and the same specific implementation as the light source 101 in the imaging device of fig. 1, and specific contents may refer to the light source 101 in the imaging device of fig. 1, which is not described herein again.
The functions and specific implementation of the acquisition device 402 are the same as those of the first sensor 102 in the imaging apparatus of fig. 1; for details, refer to the first sensor 102 in the imaging apparatus of fig. 1, which are not described herein again.
For the processing means 403, it is used to generate a white light image and a characteristic light image based on the multiplexed reflected light data, and to generate a fluorescence image based on the multiplexed reflected light data and the fluorescence data. As for the process of generating the white light image, the characteristic light image and the fluorescence image by the processing device 403, the process is the same as the process of generating the white light image, the characteristic light image and the fluorescence image by the processing device 103 in the imaging apparatus in fig. 1, and specific contents may refer to the processing device 103 in the imaging apparatus in fig. 1, and are not described herein again.
In the imaging apparatus, the processing device 403 is further configured to fuse any two or three of the white light image, the characteristic light image and the fluorescence image to obtain a fused image, in addition to generating the white light image, the characteristic light image and the fluorescence image. Specifically, when any two or three of the white light image, the characteristic light image and the fluorescence image are fused, in order that the fused image information can more directly reflect the spatial profile structure information for determining whether the tissue to be diagnosed is early-stage cancerous, the processing device 403 respectively performs filtering processing on the white light image, the characteristic light image and the fluorescence image through an FIR filter to obtain a filtered white light image, a filtered characteristic light image and a filtered fluorescence image. Then, the processing device 403 fuses any two or three of the filtered white light image, the filtered characteristic light image and the filtered fluorescence image.
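A minimal sketch of this filter-then-fuse step is given below for illustration; the FIR kernel and the fusion weights are assumptions, as the disclosure does not specify them.

```python
# Sketch: FIR-filter each image, then fuse two or three of them by a
# weighted sum. Kernel and weights are illustrative assumptions.
import numpy as np
from scipy.signal import convolve2d

def fir_filter(image, kernel):
    """Apply a 2-D FIR kernel to every color channel of an (H, W, 3) image."""
    return np.stack(
        [convolve2d(image[..., c], kernel, mode="same", boundary="symm")
         for c in range(image.shape[-1])],
        axis=-1,
    )

def fuse(images, weights):
    """Weighted fusion of two or three images of identical shape."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return sum(wi * img for wi, img in zip(w, images))

# Placeholder inputs; in practice these would be the white light,
# characteristic light and fluorescence images formed earlier.
white_rgb, feature_rgb, fluo_rgb = (np.random.rand(480, 640, 3) for _ in range(3))

# Example sharpening kernel that emphasises spatial profile structure.
kernel = np.array([[0, -1, 0],
                   [-1, 5, -1],
                   [0, -1, 0]], dtype=float)

filtered = [fir_filter(img, kernel) for img in (white_rgb, feature_rgb, fluo_rgb)]
fused = fuse(filtered, weights=[0.5, 0.3, 0.2])
```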
In the imaging device, in order to better assist a doctor in judging whether the tissue to be diagnosed has early canceration, the preset model is trained so that the trained preset model can identify abnormal tissue regions of suspected cancer in the white light image, the characteristic light image and the fluorescence image, and can determine the probability that each abnormal tissue region is cancer. The training process of the preset model may be as follows: a large number of white light image samples, in which abnormal tissue regions of suspected early cancer and the probabilities that those regions are cancer have been pre-labeled, are input into the preset model to train it, so that the trained preset model can identify abnormal tissue regions of suspected cancer in white light images and the probabilities that those regions are cancer. For the characteristic light image and the fluorescence image, the method of training the preset model is the same as the method of training the preset model with white light images, and details are not repeated here.
It should be noted that, the imaging device only provides a method for training the preset model, and in practical application, the preset model may also be trained in other manners, and the embodiment does not limit the specific training manner, as long as the trained preset model can identify the abnormal tissue region suspected of cancer in the white light image and determine the probability that the abnormal tissue region is cancer; abnormal tissue regions suspected of being cancerous and the probability of the abnormal tissue regions being cancerous can be identified in the characteristic light image; and an abnormal tissue region in which cancer is suspected in the fluorescence image and a probability that the abnormal tissue region is cancer may be identified.
After obtaining the trained preset model, the imaging apparatus inputs the white light image, the characteristic light image, and the fluorescence image generated by the processing device 403 into the trained preset model, and the trained preset model determines an abnormal tissue region suspected of early cancer in each input image and a probability that the abnormal tissue region is cancer; and outputting the abnormal tissue region of the suspected early cancer in each image and the diagnosis information of the probability that the abnormal tissue region is cancer.
The processing device 403 may further determine the abnormal tissue region of suspected cancer in the tissue to be diagnosed, and the probability that this region is cancer, by combining the abnormal tissue regions of suspected cancer in the images output by the trained preset model and the probabilities that those regions are cancer. Furthermore, the processing device 403 may mark the determined abnormal tissue region of suspected cancer in the tissue to be diagnosed in the fusion image.
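The disclosure does not state how the per-image regions and probabilities are combined into the overall region and probability; the sketch below uses a union of binary region masks and a weighted average of the probabilities purely as an illustrative assumption.

```python
# Illustrative combination of the three per-image results; the union and
# weighted-average rules below are assumptions, not the disclosed method.
import numpy as np

H, W = 480, 640
# Placeholder binary masks of the suspected-cancer regions found in the
# white light, characteristic light and fluorescence images.
mask_white = np.zeros((H, W), dtype=bool);   mask_white[100:180, 200:300] = True
mask_feature = np.zeros((H, W), dtype=bool); mask_feature[110:190, 210:310] = True
mask_fluo = np.zeros((H, W), dtype=bool);    mask_fluo[120:170, 220:290] = True

def combine_regions(m1, m2, m3):
    """Overall abnormal tissue region as the union of the three region masks."""
    return m1 | m2 | m3

def combine_probabilities(p1, p2, p3, weights=(1.0, 1.0, 1.0)):
    """Overall probability as a weighted average of the three probabilities."""
    w = np.asarray(weights, dtype=float)
    return float(np.dot(w, [p1, p2, p3]) / w.sum())

overall_region = combine_regions(mask_white, mask_feature, mask_fluo)
overall_probability = combine_probabilities(0.72, 0.81, 0.65)
```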
The display device 404 is used for displaying one or more of the white light image, the characteristic light image, the fluorescence image, the fusion image, and the fusion image marked with the abnormal tissue region of suspected cancer, as generated by the processing device 403.
In the present imaging apparatus, the processing device 403 can synthesize the fusion image, and can determine, through the trained preset model, the abnormal tissue region of suspected cancer in each of the white light image, the characteristic light image and the fluorescence image, as well as the probability that each such region is cancer. It then comprehensively determines the probability that the abnormal tissue region in the tissue to be diagnosed is cancer according to the probabilities that the abnormal tissue regions corresponding to the individual images are cancer. Since the processing device 403 comprehensively determines this probability based on the information of the three images, the accuracy of the determined probability is improved.
Referring to fig. 5, there is shown yet another imaging apparatus in the present application comprising a light source 501, acquisition means 502, processing means 503 and display means 504.
The specific implementations of the light source 501, the acquisition device 502, and the processing device 503 are the same as the specific implementations of the light source 401, the acquisition device 402, and the processing device 403 in fig. 4, and are not described herein again.
The imaging apparatus includes a display device 504, and the display device 504 can display any one or more of the white light image, the characteristic light image, the fluorescence image, the fusion image, and the fusion image marked with the abnormal tissue region of suspected cancer, as generated by the processing device 503. The display device 504 may also display the diagnostic information, determined by the processing device 503, indicating the probability that the abnormal tissue region in the tissue to be diagnosed is cancer.
In the imaging device, the display device can display the fusion image marked with the abnormal tissue area and the diagnosis information of the probability that the abnormal tissue area is cancer, thereby providing a higher-accuracy basis for a doctor to judge the probability that the abnormal tissue area in the tissue to be diagnosed is cancer.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts among the embodiments may be referred to one another. In this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. The terms "comprising", "including" and the like are to be construed in an inclusive rather than an exclusive or exhaustive sense, that is, as meaning "including but not limited to". The invention can be applied to various fields, such as mobile phones.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. An image forming apparatus, characterized by comprising: the device comprises a light source, a collecting device and a processing device; wherein,
the light source is used for emitting preset multi-path narrow-band light; the multi-path narrow-band light is used for forming white light and narrow-band characteristic light and is used as fluorescence exciting light;
the acquisition device is used for acquiring multi-path reflected light to obtain multi-path reflected light data and acquiring fluorescence to obtain fluorescence data; the multipath reflected light is: the narrow-band light of the multi-path narrow-band light, which forms white light and narrow-band characteristic light, is reflected by the tissue to be diagnosed; the fluorescence is: after the narrow-band light serving as fluorescence excitation light in the multi-path narrow-band light excites the tissue to be diagnosed, the spontaneous fluorescence of the tissue to be diagnosed;
the processing device is used for forming a white light image and a characteristic light image of the tissue to be diagnosed according to the multipath reflected light data and forming a fluorescence image of the tissue to be diagnosed according to the multipath reflected light data and the fluorescence data.
2. The imaging apparatus of claim 1, wherein the acquisition device comprises a first sensor;
the light source is specifically used for emitting first narrow-band light and second narrow-band light in a time-sharing manner; the first narrow-band light is: narrow-band light used for forming white light and narrow-band characteristic light in the multi-path narrow-band light; the second narrow-band light is: narrow-band light serving as fluorescence excitation light in the multi-path narrow-band light;
the first sensor is used for collecting the multipath reflected light reflected by the tissue to be diagnosed through the first narrow-band light to generate multipath reflected light data, and collecting the spontaneous fluorescence generated by the tissue to be diagnosed excited by the second narrow-band light to generate fluorescence data.
3. The imaging apparatus of claim 1, wherein the acquisition device comprises objective lenses and a second sensor, the number of the objective lenses being a target number; the target number is the number of paths of narrow-band light emitted by the light source; the preset wavelength ranges allowed to pass through by different objective lenses do not overlap with each other; the light allowed to pass through by different objective lenses does not overlap on the second sensor;
the objective lens is used for correspondingly focusing the multipath reflected light and the fluorescence on non-overlapping areas of the second sensor;
the second sensor is used for acquiring the multi-path reflected light to obtain the multi-path reflected light data and acquiring the fluorescence to obtain the fluorescence data.
4. The imaging apparatus of claim 1, wherein the acquisition device comprises a filter and a third sensor;
the optical filter is used for allowing a different path of the reflected light or the fluorescence to pass in different preset time periods; wherein only one path of reflected light or the fluorescence is allowed to pass in each preset time period;
the third sensor is used for acquiring the multi-path reflected light to obtain the multi-path reflected light data and acquiring the fluorescence to obtain the fluorescence data.
5. The imaging apparatus of claim 1, wherein the processing device is further configured to:
inputting the white light image, the characteristic light image and the fluorescence image into a trained preset model; the trained preset model is used for identifying an abnormal tissue area of suspected cancer in the image and determining the probability that the abnormal tissue area is the cancer;
acquiring, as output by the trained preset model, a first abnormal tissue region of suspected cancer in the white light image and a first probability that the first abnormal tissue region is cancer, a second abnormal tissue region of suspected cancer in the characteristic light image and a second probability that the second abnormal tissue region is cancer, and a third abnormal tissue region of suspected cancer in the fluorescence image and a third probability that the third abnormal tissue region is cancer;
determining a fourth abnormal tissue region suspected of being cancerous in the tissue to be diagnosed based on the first abnormal tissue region, the second abnormal tissue region, and the third abnormal tissue region;
determining a fourth probability that the fourth abnormal tissue region is cancer based on the first, second, and third probabilities.
6. The imaging apparatus of claim 5, wherein the processing device is further configured to:
after the white light image, the characteristic light image and the fluorescence image are generated, any two or three of the white light image, the characteristic light image and the fluorescence image are fused to obtain a fused image.
7. The imaging apparatus of claim 6,
the processing device is further configured to mark the fourth abnormal tissue region in the fused image.
8. The imaging apparatus of claim 7, further comprising: a display device;
the display device is used for displaying one or more of the white light image, the characteristic light image, the fluorescence image, the fusion image and the fusion image marked with the fourth abnormal tissue area.
9. The imaging apparatus of claim 8, wherein the display device is further configured to:
displaying diagnostic information in a case where the fused image marked with the fourth abnormal tissue region is displayed; the diagnostic information indicates a probability that the fourth abnormal tissue region is cancer.
10. The imaging apparatus of any of claims 1 to 9, wherein the light source is configured to emit multiple narrow-band lights, and comprises:
the light source is specifically used for emitting red narrow-band light, green narrow-band light, blue narrow-band light and fluorescence excitation light.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810558755.1A CN108670203A (en) | 2018-06-01 | 2018-06-01 | A kind of imaging device |
PCT/CN2018/122235 WO2019227907A1 (en) | 2018-06-01 | 2018-12-20 | Imaging device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810558755.1A CN108670203A (en) | 2018-06-01 | 2018-06-01 | A kind of imaging device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108670203A true CN108670203A (en) | 2018-10-19 |
Family
ID=63809598
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810558755.1A Pending CN108670203A (en) | 2018-06-01 | 2018-06-01 | A kind of imaging device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108670203A (en) |
WO (1) | WO2019227907A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109448825A (en) * | 2018-12-17 | 2019-03-08 | 深圳开立生物医疗科技股份有限公司 | A kind of picture frame extraction system and method |
WO2019227907A1 (en) * | 2018-06-01 | 2019-12-05 | 深圳开立生物医疗科技股份有限公司 | Imaging device |
CN114947696A (en) * | 2022-06-17 | 2022-08-30 | 西安交通大学 | White light-narrow band-fluorescence integrated endoscope and use method thereof |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101623191A (en) * | 2009-08-14 | 2010-01-13 | 北京航空航天大学 | Device and method for noninvasively detecting property of stomach tissue |
CN102274000A (en) * | 2011-05-16 | 2011-12-14 | 易定容 | Narrow-band multispectral fluorescent vagina check device |
EP2474265B1 (en) * | 2011-01-11 | 2013-08-21 | Fujifilm Corporation | Endoscope diagnosis system |
CN103284677A (en) * | 2012-03-02 | 2013-09-11 | 熊大曦 | Imaging device for biological tissue observation |
CN105476597A (en) * | 2015-12-22 | 2016-04-13 | 佛山市南海区欧谱曼迪科技有限责任公司 | Multi-mode electronic colposcope system |
CN107635452A (en) * | 2015-06-02 | 2018-01-26 | 奥林巴斯株式会社 | Special optical endoscope device |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7268485B2 (en) * | 2003-10-07 | 2007-09-11 | Eastman Kodak Company | White-emitting microcavity OLED device |
CN101256312A (en) * | 2008-03-25 | 2008-09-03 | 京东方科技集团股份有限公司 | Backlight source |
JP5460506B2 (en) * | 2009-09-24 | 2014-04-02 | 富士フイルム株式会社 | Endoscope apparatus operating method and endoscope apparatus |
US10197256B2 (en) * | 2013-03-15 | 2019-02-05 | Molex, Llc | LED assembly having frame with plurality of traces |
CN103393391A (en) * | 2013-06-20 | 2013-11-20 | 中国科学院苏州生物医学工程技术研究所 | Multifunctional medical instrument for alimentary canal endoscopic surgery |
JP2016107003A (en) * | 2014-12-10 | 2016-06-20 | 富士フイルム株式会社 | Medical image processor and operation method thereof |
CN108670203A (en) * | 2018-06-01 | 2018-10-19 | 深圳开立生物医疗科技股份有限公司 | A kind of imaging device |
- 2018
  - 2018-06-01: CN application CN201810558755.1A filed, published as CN108670203A (status: Pending)
  - 2018-12-20: PCT application PCT/CN2018/122235 filed, published as WO2019227907A1 (Application Filing)
Also Published As
Publication number | Publication date |
---|---|
WO2019227907A1 (en) | 2019-12-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB03 | Change of inventor or designer information | Inventor after: Chen Yunliang; Feng Nengyun; Li Peng. Inventor before: Chen Yunliang; Feng Nengyun |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20181019 |