CN115731205B - Image processing device and method for endoscope, electronic device, and storage medium


Publication number
CN115731205B
Authority
CN
China
Prior art keywords
image
pixel
pixel value
difference
narrowband
Prior art date
Legal status
Active
Application number
CN202211504982.9A
Other languages
Chinese (zh)
Other versions
CN115731205A (en)
Inventor
张仕鹏
付野
李宗州
Current Assignee
Shanghai Aohua Endoscopy Co ltd
Peking University
Original Assignee
Shanghai Aohua Endoscopy Co ltd
Peking University
Priority date
Filing date
Publication date
Application filed by Shanghai Aohua Endoscopy Co ltd, Peking University
Priority to CN202211504982.9A
Publication of CN115731205A
Application granted
Publication of CN115731205B
Legal status: Active
Anticipated expiration

Landscapes

  • Endoscopes (AREA)

Abstract

The invention provides an image processing apparatus and method for an endoscope, an electronic device, and a storage medium. The image processing apparatus of the endoscope includes an image processing module that processes the difference information between a first narrowband image and a second narrowband image in a preset processing manner, to obtain a false-color composite image in which the tissue difference features of the subject are enhanced. This increases the imaging contrast and the color distinction of different tissues in the subject, facilitating subsequent accurate lesion diagnosis based on the tissue-enhanced image.

Description

Image processing device and method for endoscope, electronic device, and storage medium
Technical Field
The present invention relates to the technical field of endoscopes, and in particular, to an image processing apparatus, an image processing method, an electronic device, and a storage medium for an endoscope.
Background
In the medical field, endoscopes are widely used, and diagnosing color changes in images obtained by endoscopy is an important means of detecting digestive tract diseases. Changes in image color help determine the location and nature of a lesion more accurately.
In the related art, the colors representing different tissues in images currently obtained with an endoscope are similar, so the tissues cannot be distinguished effectively; this easily causes visual fatigue in the observer and in turn reduces the accuracy of lesion diagnosis.
Disclosure of Invention
The invention provides an image processing apparatus and method for an endoscope, an electronic device, and a storage medium, which address the defect in the prior art that the colors representing different tissues in the subject are similar. They increase the imaging contrast and the color distinction of different tissues in the subject (such as superficial and deep blood vessels), facilitating accurate lesion diagnosis.
The present invention provides an image processing apparatus of an endoscope, comprising: an image acquisition module and an image processing module;
The image acquisition module is configured to acquire a first narrowband image and a second narrowband image under the condition that a subject is irradiated by first narrowband light and second narrowband light, where the subject includes a first preset tissue and a second preset tissue, the first narrowband image has a corresponding relationship with the first narrowband light and the first preset tissue, and the second narrowband image has a corresponding relationship with the second narrowband light and the second preset tissue;
The image processing module is used for processing the difference information of the first narrow-band image and the second narrow-band image by adopting a preset processing mode to obtain a false color synthetic image with the tissue difference characteristic of the detected body enhanced.
According to the image processing device of the endoscope, the preset processing mode comprises brightness unification processing, difference characteristic enhancement processing and channel allocation processing, and the image processing module comprises a brightness unification processing unit, a difference characteristic enhancement processing unit and a channel allocation processing unit;
The brightness unification processing unit is configured to perform brightness unification processing on the first narrowband image and the second narrowband image based on an initial pixel value of the first narrowband image and an initial pixel value of the second narrowband image, so as to obtain a first image and a second image, where the first image corresponds to the first narrowband image, and the second image corresponds to the second narrowband image;
The difference characteristic enhancement processing unit is configured to obtain a first image with enhanced difference characteristics of the first preset tissue and the second preset tissue and a second image with enhanced difference characteristics of the first preset tissue and the second preset tissue based on the pixel values of the first image and the pixel values of the second image;
the channel allocation processing unit is used for performing color channel allocation on the first image with the enhanced difference characteristic and the second image with the enhanced difference characteristic based on a preset allocation rule to obtain a false color composite image with the enhanced tissue difference characteristic of the detected body.
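The channel allocation step can be sketched as follows. The patent only names a "preset allocation rule" without specifying it at this point, so the mapping below (enhanced first, blue-violet image to the G and B output channels; enhanced second, green image to the R channel, as in typical narrowband-imaging displays) is purely an illustrative assumption, as are the function name and toy values:

```python
import numpy as np

def allocate_channels(enhanced_first: np.ndarray, enhanced_second: np.ndarray) -> np.ndarray:
    """Combine two difference-enhanced images into a false-color RGB composite.

    ASSUMED allocation rule (the patent only says "preset allocation rule"):
    the enhanced first (blue-violet) image feeds the G and B output channels,
    and the enhanced second (green) image feeds the R output channel.
    """
    h, w = enhanced_first.shape
    rgb = np.empty((h, w, 3), dtype=np.float64)
    rgb[..., 0] = enhanced_second  # R channel: mid-to-deep vessel image
    rgb[..., 1] = enhanced_first   # G channel: superficial vessel image
    rgb[..., 2] = enhanced_first   # B channel: superficial vessel image
    return rgb

# Tiny 1x1 example with made-up pixel values.
composite = allocate_channels(np.array([[10.0]]), np.array([[40.0]]))
```

Because the superficial-vessel image drives two of the three output channels, superficial structures appear in a different hue from mid-to-deep structures, which is what produces the false-color separation.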
According to the image processing device of the endoscope, the difference characteristic enhancement processing unit comprises a common computing subunit, a unique computing subunit and a difference characteristic enhancement subunit;
the sharing calculating subunit is configured to obtain a sharing image based on the pixel value of the first image and the pixel value of the second image;
the unique calculation subunit is configured to obtain a first unique image based on the pixel value of the first image and the common pixel value of the common image, and obtain a second unique image based on the pixel value of the second image and the common pixel value;
The difference feature enhancement subunit is configured to obtain the first image with the enhanced difference feature based on the pixel value of the first image and the unique pixel value of the second unique image, and obtain the second image with the enhanced difference feature based on the pixel value of the second image and the unique pixel value of the first unique image.
According to the image processing apparatus of the endoscope provided by the invention, the common computing subunit is specifically configured to, for each pixel point having the same pixel coordinates in the first image and the second image, obtain the pixel value of the first image and the pixel value of the second image at those coordinates, and take the smaller of the two as the pixel value at those coordinates, to obtain the common image; or,
the common computing subunit is specifically configured to, for each pixel point having the same pixel coordinates in the first image and the second image, take a weighted average of the pixel value of the first image and the pixel value of the second image at those coordinates as the pixel value at those coordinates, to obtain the common image.
According to the image processing apparatus of the endoscope, the unique computing subunit is specifically configured to, for each pixel point having the same pixel coordinates in the first image and the common image, take the difference between the pixel value of the first image and the common pixel value at those coordinates as a first difference, and use the first difference as the pixel value at those coordinates, to obtain the first unique image;
and, for each pixel point having the same pixel coordinates in the second image and the common image, take the difference between the pixel value of the second image and the common pixel value at those coordinates as a second difference, and use the second difference as the pixel value at those coordinates, to obtain the second unique image.
According to the image processing apparatus of the endoscope, the difference feature enhancement subunit is specifically configured to, for each pixel point having the same pixel coordinates in the first image and the second unique image, take the difference between the pixel value of the first image and the unique pixel value of the second unique image at those coordinates as a third difference, and use the third difference as the pixel value at those coordinates, to obtain the difference-feature-enhanced first image;
and, for each pixel point having the same pixel coordinates in the second image and the first unique image, take the difference between the pixel value of the second image and the unique pixel value of the first unique image at those coordinates as a fourth difference, and use the fourth difference as the pixel value at those coordinates, to obtain the difference-feature-enhanced second image.
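Taken together, the subunits recited above amount to the following pipeline. This is a minimal sketch using the min-based variant of the common image; the source does not state how negative differences are handled, so they are left unclipped here, and the toy pixel values are illustrative:

```python
import numpy as np

def enhance_difference(b_img: np.ndarray, g_img: np.ndarray):
    """Difference-feature enhancement of two brightness-unified images.

    common   : per-pixel minimum of the two images (one claimed variant)
    unique_i : each image minus the common image (first/second differences)
    enhanced : each image minus the OTHER image's unique part (third/fourth
               differences), suppressing structure specific to the other tissue.
    """
    common = np.minimum(b_img, g_img)   # common image
    unique_b = b_img - common           # first unique image (first difference)
    unique_g = g_img - common           # second unique image (second difference)
    enhanced_b = b_img - unique_g       # third difference: enhanced first image
    enhanced_g = g_img - unique_b       # fourth difference: enhanced second image
    return enhanced_b, enhanced_g

# 1x2 toy images standing in for the brightness-unified first/second images.
b = np.array([[100.0, 40.0]])
g = np.array([[60.0, 90.0]])
eb, eg = enhance_difference(b, g)
```

At each pixel, exactly one of the two unique images is nonzero, so each enhanced image is pushed down precisely where the other image is locally stronger, widening the gap between the two tissue signals.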
According to the image processing device of the endoscope, the brightness unification processing unit is specifically used for determining the first image based on the initial pixel value of the first narrow-band image;
obtaining a pixel coefficient based on the initial pixel value of the first narrowband image and the initial pixel value of the second narrowband image;
The second image is determined based on the initial pixel values of the second narrowband image and the pixel coefficients.
The invention also provides an image processing method of the endoscope, which is applied to the image processing device of any one of the endoscopes, and comprises the following steps:
Acquiring a first narrowband image and a second narrowband image under the condition that a subject is irradiated by first narrowband light and second narrowband light through an image acquisition module, wherein the subject comprises a first preset tissue and a second preset tissue, the first narrowband image has a corresponding relation with the first narrowband light and the first preset tissue, and the second narrowband image has a corresponding relation with the second narrowband light and the second preset tissue;
And processing the difference information of the first narrow-band image and the second narrow-band image by an image processing module in a preset processing mode to obtain a false color composite image with the tissue difference characteristics of the detected body enhanced.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the image processing method of an endoscope as described in any of the above when executing the computer program.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image processing method of an endoscope as described in any of the above.
The invention provides an image processing apparatus and method for an endoscope, an electronic device, and a storage medium. The image processing apparatus of the endoscope includes an image processing module that processes the difference information between a first narrowband image and a second narrowband image in a preset processing manner, to obtain a false-color composite image in which the tissue difference features of the subject are enhanced. This increases the imaging contrast and the color distinction of different tissues in the subject, facilitating subsequent accurate lesion diagnosis based on the tissue-enhanced image.
Drawings
In order to more clearly illustrate the technical solutions of the invention or of the prior art, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. It is apparent that the drawings described below show some embodiments of the invention, and that a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic view of an image processing apparatus of an endoscope according to the present invention;
FIG. 2 is a second schematic view of an image processing apparatus of an endoscope according to the present invention;
FIG. 3 is a flow chart of an image processing method of an endoscope provided by the present invention;
FIG. 4 is a schematic structural diagram of an electronic device according to the present invention.
Reference numerals:
110: an image acquisition module; 120: an image processing module; 201: a light source module; 202: an image sensor; 203: an endoscope; 204: and a display module.
Detailed Description
To make the objects, technical solutions and advantages of the invention clearer, the technical solutions of the invention are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
The image processing apparatus of the endoscope provided by the invention can emphasize, in a single scene, blood vessel images such as the superficial microvessels of the digestive tract mucosa, thereby enhancing image detail and inter-image differences. A person skilled in the art will appreciate that, beyond the enhanced display of digestive tract mucosal capillaries, the design concept of the invention can be applied to other clinical scenarios.
As shown in fig. 1, an image processing apparatus of an endoscope according to the present invention may include: an image acquisition module 110 and an image processing module 120.
The image acquisition module 110 is configured to acquire a first narrowband image and a second narrowband image when the first narrowband light and the second narrowband light illuminate the subject.
In one embodiment, the first narrowband light and the second narrowband light may illuminate the subject at the same time, in which case the image acquisition module 110 may acquire an original RGB image under the simultaneous illumination. The first narrowband image and the second narrowband image are then obtained from the color channels of the original RGB image according to the characteristics of the first and second narrowband light.
The subject may include a first preset tissue and a second preset tissue. When fine blood vessels are observed, with superficial blood vessels as the first preset tissue and mid-to-deep blood vessels as the second preset tissue, the first narrowband light may be blue-violet narrowband light with a center wavelength of 410 nm to 440 nm, and the second narrowband light may be green narrowband light with a center wavelength of 530 nm to 550 nm.
The first narrowband light is the narrowband light corresponding to the absorption peak of the first preset tissue, and the second narrowband light is the narrowband light corresponding to the absorption peak of the second preset tissue. Thus, the B-channel information of the original RGB image can be taken as the first narrowband image, and the G-channel information as the second narrowband image.
In another embodiment, the first narrowband light and the second narrowband light may illuminate the subject at different times, i.e., as time-division illumination. In this case, the image acquisition module 110 may acquire, while the first narrowband light illuminates the subject, the narrowband image carrying the specific local enhancement information of the first preset tissue, i.e., the first narrowband image, and, while the second narrowband light illuminates the subject, the narrowband image carrying the specific local enhancement information of the second preset tissue, i.e., the second narrowband image.
That is, the first narrowband image has a correspondence with the first narrowband light and the first preset tissue, and the second narrowband image has a correspondence with the second narrowband light and the second preset tissue. The first narrowband image includes specific local enhancement information of a first preset tissue. The second narrowband image comprises specific local enhancement information of a second preset tissue.
The bandwidth of the first narrowband light is a first preset bandwidth range and that of the second narrowband light a second preset bandwidth range; the two ranges may or may not coincide and can be set according to the actual situation.
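Under simultaneous illumination, the channel split described above can be sketched as follows. The synthetic frame and the function name are illustrative assumptions, not from the patent; real frames come from the image sensor:

```python
import numpy as np

def split_narrowband(rgb_frame: np.ndarray):
    """Split an H x W x 3 RGB frame captured under simultaneous narrowband
    illumination into the first (B-channel) and second (G-channel)
    narrowband images."""
    first_nb = rgb_frame[..., 2].astype(np.float64)   # B channel: superficial vessels
    second_nb = rgb_frame[..., 1].astype(np.float64)  # G channel: mid-to-deep vessels
    return first_nb, second_nb

# Synthetic 2x2 frame standing in for sensor output.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[..., 1] = 100  # G channel intensity
frame[..., 2] = 200  # B channel intensity
first_nb, second_nb = split_narrowband(frame)
```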
The image processing module 120 is configured to process difference information of the first narrowband image and the second narrowband image by adopting a preset processing manner, so as to obtain a false color composite image with enhanced tissue difference characteristics of the subject.
After the first narrowband image and the second narrowband image are acquired, in order to distinguish the first preset tissue from the second preset tissue, a preset processing mode can be adopted to process the difference information of the first narrowband image and the second narrowband image, so as to obtain a false color composite image with enhanced tissue difference characteristics of the detected body.
The preset processing manner may include brightness unification processing, difference feature enhancement processing, and channel allocation processing, which increases the imaging contrast and color distinction of different tissues in the subject and facilitates subsequent accurate lesion diagnosis based on the tissue-enhanced image.
In order to facilitate understanding of the image processing apparatus of the endoscope provided by the present invention, the image processing apparatus of the endoscope provided by the present invention will be described with reference to fig. 2.
As shown in fig. 2, the image processing apparatus of the endoscope provided by the invention includes an image processing module 120, an image acquisition module (not shown) composed of an endoscope 203 and an image sensor 202 arranged at the distal tip of the endoscope 203 (the end not connected to the image processing apparatus), a light source module 201, and a display module 204.
The light source module 201 of the image processing apparatus of the endoscope may provide an illumination light source including first and second narrowband lights. In the application process, in the case where the first narrowband light and the second narrowband light supplied from the light source module 201 irradiate the subject, the first narrowband image, and the second narrowband image may be acquired by the image acquisition module.
The light source module 201 may provide the first narrowband light and the second narrowband light simultaneously, or provide the first narrowband light before the second, or the second before the first; any of these orders is acceptable.
It should be noted that the first preset tissue and the second preset tissue may be adjusted according to the actual situation and are not specifically limited in this embodiment. It will be appreciated that if the first and second preset tissues change, the first and second narrowband light change accordingly.
For example, a fine blood vessel is taken as a subject, a superficial blood vessel is taken as a first preset tissue, and a medium deep blood vessel is taken as a second preset tissue. The light source module 201 may provide blue-violet narrow-band light (first narrow-band light) having a center wavelength of 410nm to 440nm, and the light source module 201 may provide green narrow-band light (second narrow-band light) having a center wavelength of 530nm to 550 nm.
The blue-violet narrowband light with a center wavelength of 410 nm to 440 nm corresponds to the absorption peak of superficial blood vessels, and the green narrowband light with a center wavelength of 530 nm to 550 nm corresponds to the absorption peak of mid-to-deep blood vessels. The image acquisition module may then acquire the first and second narrowband images.
In one embodiment, the image acquisition module acquires the first and second narrowband images as follows: the image sensor 202 arranged at the tip of the endoscope 203 is a color sensor and captures the original RGB image of the subject under simultaneous illumination by the first and second narrowband light; the B-channel information of the original RGB image is taken as the first narrowband image, i.e., the superficial blood vessel image, and the G-channel information as the second narrowband image, i.e., the mid-to-deep blood vessel image.
In a specific implementation, after an image is acquired by the image sensor 202 disposed on the top of the endoscope 203, noise information in the image may be removed by a digital signal processing method or a denoising method, so as to obtain an original RGB image, and further obtain a first narrowband image and a second narrowband image. Thus, the acquired narrow-band image can be ensured to be more accurate and clear.
In another embodiment, the image acquisition module acquires the first and second narrowband images as follows: the image sensor 202 arranged at the tip of the endoscope 203 captures an image of the subject under the first narrowband light as the first narrowband image, i.e., the superficial blood vessel image, and an image of the subject under the second narrowband light as the second narrowband image, i.e., the mid-to-deep blood vessel image. When the first and second narrowband light are used as time-division illumination, the image sensor 202 may be either a color sensor or a black-and-white sensor.
In a specific implementation, after an image is acquired by the image sensor 202 disposed on the top of the endoscope 203, noise information in the image may be removed by a digital signal processing method or a denoising method, so as to obtain a first narrowband image or a second narrowband image. Thus, the acquired narrow-band image can be ensured to be more accurate and clear.
After the first and second narrowband images are acquired, the image processing module 120 may process their difference information in the preset processing manner to obtain the false-color composite image with enhanced tissue difference features of the subject. The composite image may then be displayed by the display module 204 for subsequent accurate lesion diagnosis based on the tissue-enhanced image.
In one embodiment, the tissue enhancement images of multiple subjects may be acquired in real time, and the tissue enhancement images of the multiple subjects may be integrated frame by frame to obtain a video stream related to the subjects, and displayed by the display module 204 for the user to diagnose the illness state.
As an implementation of the embodiment of the invention, the preset processing manner may include brightness unification processing, difference feature enhancement processing, and channel allocation processing, and the image processing module may correspondingly include a brightness unification processing unit, a difference feature enhancement processing unit, and a channel allocation processing unit.
The brightness unification processing unit is used for performing brightness unification processing on the first narrow-band image and the second narrow-band image based on the initial pixel value of the first narrow-band image and the initial pixel value of the second narrow-band image to obtain a first image and a second image. Wherein the first image corresponds to a first narrowband image and the second image corresponds to a second narrowband image.
The brightness unification processing unit is specifically configured to determine the first image based on an initial pixel value of the first narrowband image.
In one embodiment, a first image (which may be denoted as B') is obtained based on the B-channel luminance (first narrowband image). The first image may be determined based on initial pixel values for each pixel point of the first narrowband image.
The first image determination method can be represented by the formula (1):
B’=B (1)
wherein B is the initial pixel value of the first narrowband image and B' is the pixel value of the first image.
A pixel coefficient, which may be denoted k, is then obtained based on the initial pixel values of the first narrowband image and of the second narrowband image.
In one embodiment, the sum of the initial pixel values of the pixels in the first narrowband image (denoted sum(B)) may be divided by the sum of the initial pixel values of the pixels in the second narrowband image (denoted sum(G)), and the resulting quotient taken as the pixel coefficient, so that the two images end up with the same total brightness.
The pixel coefficient can be represented by equation (2):
k = sum(B) / sum(G) (2)
wherein sum(B) is the sum of the initial pixel values of the pixels in the first narrowband image and sum(G) is the sum of the initial pixel values of the pixels in the second narrowband image.
In another embodiment, the average of the initial pixel values of the pixels in the first narrowband image (denoted avg(B)) may be divided by the average of the initial pixel values of the pixels in the second narrowband image (denoted avg(G)), and the resulting quotient taken as the pixel coefficient; this is equally reasonable and is not specifically limited here.
The second image is determined based on the initial pixel values of the second narrowband image and the pixel coefficients. Specifically, the second image (may be denoted as G ') may be determined based on the product of the G channel luminance (second narrowband image) and the pixel coefficient, that is, G is an initial pixel value of the second narrowband image and G' is a pixel value of the second image.
The manner in which the second image is determined may be represented by equation (3):
G’=k*G (3)
in the above-described formula (1) -formula (3), G represents a pixel value of a G channel (an initial pixel value corresponding to the second narrowband image) among RGB channels with respect to the original RGB image; b represents the pixel value of the B channel (corresponding to the initial pixel value of the first narrowband image) among the RGB channels with respect to the original RGB image.
The brightness unification processing unit may be further configured to perform brightness unification by histogram stretching or the like, so as to obtain a first image and a second image.
In this way, after the first and second narrowband images are processed by the brightness unification processing unit, brightness differences caused by the differing response intensities of the image sensor in the two narrowband ranges are avoided. The pixel averages of the first image and the second image become uniform, ensuring consistent brightness between them.
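The brightness unification step can be sketched as follows. The pixel coefficient is written here as sum(B) / sum(G), the direction that actually equalizes the mean brightness of the two images as described above (the formula (2) as printed uses a backslash, read here as left division); the toy values are illustrative:

```python
import numpy as np

def unify_brightness(b_nb: np.ndarray, g_nb: np.ndarray):
    """Brightness unification per formulas (1)-(3): B' = B, G' = k * G,
    with k chosen as sum(B) / sum(G) so both images end up with the
    same mean pixel value."""
    b_prime = b_nb.astype(np.float64)      # formula (1): B' = B
    k = b_nb.sum() / g_nb.sum()            # formula (2): pixel coefficient
    g_prime = k * g_nb.astype(np.float64)  # formula (3): G' = k * G
    return b_prime, g_prime

b = np.array([[10.0, 30.0]])
g = np.array([[5.0, 5.0]])
b2, g2 = unify_brightness(b, g)  # k = 40 / 10 = 4, so g2 = [[20, 20]]
```

After this step both images have the same mean, so any remaining per-pixel differences reflect tissue structure rather than illumination or sensor response.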
The difference characteristic enhancement processing unit is configured to obtain a first image with enhanced difference characteristics of the first preset tissue and the second preset tissue and a second image with enhanced difference characteristics of the first preset tissue and the second preset tissue based on the pixel values of the first image and the pixel values of the second image.
In order to be able to show the detailed features of the first preset tissue and the second preset tissue, a first image with enhanced difference features of the first preset tissue and the second preset tissue and a second image with enhanced difference features of the first preset tissue and the second preset tissue may be obtained based on the pixel values of the first image and the pixel values of the second image.
In one embodiment, the difference feature enhancement processing unit includes a common computing subunit, a unique computing subunit, and a difference feature enhancement subunit.
The common computing subunit is configured to obtain a common image based on the pixel value of the first image and the pixel value of the second image.
In one embodiment, the sharing calculating subunit is specifically configured to obtain, for each pixel point in the first image and the second image, where the pixel coordinates of the pixel point are the same, a pixel value of the first image corresponding to the pixel coordinates, and a pixel value of the second image corresponding to the pixel coordinates, and take, as a pixel value corresponding to the pixel coordinates, a pixel value with a small value from among the pixel values of the first image and the pixel value of the second image, to obtain the sharing image.
That is, for each pixel in the common image, its pixel value is the smaller of the corresponding pixel values of the first image and the second image.
For example, for pixel coordinate a: in the first image the pixel value at a is m, in the second image the pixel value at a is n, and m is smaller than n, so the pixel value at a in the common image is determined to be m. Proceeding in the same way, the pixel value of each pixel point in the common image can be determined.
The pixel value of each pixel point in the common image can be determined using formula (4).
Com=min(B’,G’) (4)
Wherein Com is the pixel value of the pixel point in the common image, and B' and G' are the pixel values of the first image and the second image at the same pixel coordinates.
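Per pixel, formula (4) amounts to an element-wise minimum over the two brightness-unified images; a minimal NumPy sketch (the function name is hypothetical):

```python
import numpy as np

def common_image_min(b, g):
    # Formula (4): Com = min(B', G'), taken at each pixel coordinate
    return np.minimum(b, g)
```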
In another embodiment, the common computing subunit is specifically configured to obtain, for each pixel point having the same pixel coordinates in the first image and the second image, the pixel value of the first image at those coordinates and the pixel value of the second image at those coordinates, and to take the weighted average of the two as the pixel value at those coordinates, thereby obtaining the common image.
That is, for each pixel in the common image, the pixel value corresponding to the pixel is a weighted average of the pixel value of the first image and the pixel value of the second image.
For example, for pixel coordinate B: in the first image the pixel value at B is x, in the second image the pixel value at B is y, and the two images are weighted equally, so the pixel value at B in the common image is determined to be (x+y)/2. Proceeding in the same way, the pixel value of each pixel point in the common image can be determined.
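The weighted-average variant can be sketched as follows (function name and default equal weights are illustrative):

```python
import numpy as np

def common_image_weighted(b, g, w_b=0.5, w_g=0.5):
    # Weighted average of the two images at each pixel coordinate;
    # with equal weights this reduces to (x + y) / 2 as in the example
    return (w_b * b + w_g * g) / (w_b + w_g)
```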
The unique calculation subunit is configured to obtain a first unique image based on the pixel value of the first image and the common pixel value of the common image, and obtain a second unique image based on the pixel value of the second image and the common pixel value.
The unique calculating subunit is specifically configured to, for each pixel point having the same pixel coordinates in the first image and the common image, take the difference between the pixel value of the first image at those coordinates and the common pixel value at those coordinates as a first difference, and use the first difference as the pixel value at those coordinates to obtain the first unique image.
The first difference may be obtained by equation (5).
B0= B’-Com (5)
Wherein B0 is a first difference value, that is, a unique pixel value of the first unique image.
And for each pixel point having the same pixel coordinates in the second image and the common image, the difference between the pixel value of the second image at those coordinates and the common pixel value at those coordinates is taken as a second difference. The second difference is then used as the pixel value at those coordinates, to obtain the second unique image.
The second difference may be obtained by equation (6).
G0= G’-Com (6)
Wherein G0 is the second difference, i.e. the unique pixel value of the second unique image.
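Formulas (5) and (6) together can be sketched as a single subtraction step (function name hypothetical; inputs are the brightness-unified images and the common image):

```python
def unique_images(b, g, com):
    """Formulas (5) and (6): subtract the common image from each
    brightness-unified image to isolate its unique component."""
    b0 = b - com  # first unique image, B0 = B' - Com
    g0 = g - com  # second unique image, G0 = G' - Com
    return b0, g0
```

The same code works element-wise on NumPy arrays or on single pixel values.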
The difference feature enhancement subunit is configured to obtain the first image with the enhanced difference feature based on the pixel value of the first image and the unique pixel value of the second unique image, and obtain the second image with the enhanced difference feature based on the pixel value of the second image and the unique pixel value of the first unique image.
In one embodiment, for each pixel point having the same pixel coordinates in the first image and the second unique image, the difference between the pixel value of the first image at those coordinates and the unique pixel value of the second unique image at those coordinates may be taken as a third difference, and the third difference used as the pixel value at those coordinates, so as to obtain the first image with enhanced difference features.
The pixel value of the pixel point in the first image after the difference feature is enhanced can be obtained by the formula (7).
B1= B’- G0 (7)
Wherein B1 is a pixel value of the first image after the difference feature is enhanced.
For each pixel point having the same pixel coordinates in the second image and the first unique image, the difference between the pixel value of the second image at those coordinates and the unique pixel value of the first unique image at those coordinates is taken as a fourth difference, and the fourth difference is used as the pixel value at those coordinates, so as to obtain the second image with enhanced difference features.
The pixel value of the pixel point in the second image after the difference feature is enhanced can be obtained by the formula (8).
G1= G’- B0 (8)
Wherein G1 is a pixel value of the second image after the difference feature is enhanced.
In this way, the interference of the second unique image is removed from the first image with enhanced difference features, and the interference of the first unique image is removed from the second image with enhanced difference features, so that the detailed features of the first preset tissue and the second preset tissue are displayed more clearly, laying a foundation for accurate disease diagnosis.
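Formulas (7) and (8) can be sketched as follows (function name hypothetical; b0 and g0 are the unique images from formulas (5) and (6)):

```python
def enhance_difference(b, g, b0, g0):
    """Formulas (7) and (8): remove each image's unique interference from
    the other to obtain the difference-feature-enhanced images."""
    b1 = b - g0  # B1 = B' - G0
    g1 = g - b0  # G1 = G' - B0
    return b1, g1
```

For example, with B' = 10, G' = 7 and the min-based common image Com = 7, the unique values are B0 = 3, G0 = 0, giving B1 = 10 and G1 = 4.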
In the case where the pixel value of each pixel point in the common image is the smaller of the pixel values of the first image and the second image, the first image with enhanced difference features and the second image with enhanced difference features can be determined more quickly and simply in the following manner.
A difference image is obtained based on the pixel values of the first image and the pixel values of the second image.
For each pixel point having the same pixel coordinates in the first image and the second image, the absolute value of the difference between the pixel value of the second image and the pixel value of the first image at those coordinates may be used as the pixel value at those coordinates in the difference image.
The pixel values of the difference image may be determined using equation (9).
Δ=| G’ - B’ | (9)
Where Δ is the pixel value of the difference image.
And further it can be determined whether the initial pixel value of the second image is greater than the initial pixel value of the first image.
For each pixel point having the same pixel coordinates in the first narrowband image and the second narrowband image, it can be judged whether the initial pixel value of the second narrowband image at those coordinates is greater than the initial pixel value of the first narrowband image. In the case where the initial pixel value of the second narrowband image is greater, the second narrowband image is taken as the second image with enhanced difference features.
And for each pixel point having the same pixel coordinates in the first narrowband image and the difference image, the difference between the pixel value of the first narrowband image at those coordinates and the difference pixel value of the difference image at those coordinates is taken as a fifth difference, and the fifth difference is used as the pixel value at those coordinates, to obtain the first image with enhanced difference features.
The pixel values of the pixel points in the second image after the difference feature enhancement can be determined by using the formula (10).
G1= G’ (10)
Wherein G1 is a pixel value of the second image after the difference feature is enhanced.
The pixel value of the pixel point in the first image after the difference feature is enhanced can be determined by adopting the formula (11).
B1= B’-Δ (11)
Wherein B1 is a pixel value of the first image after the difference feature is enhanced.
In the case where the initial pixel value of the second narrowband image is not greater than that of the first narrowband image, the first narrowband image is taken as the first image with enhanced difference features. And for each pixel point having the same pixel coordinates in the second narrowband image and the difference image, the difference between the pixel value of the second narrowband image at those coordinates and the difference pixel value of the difference image at those coordinates is taken as a sixth difference, and the sixth difference is used as the pixel value at those coordinates, to obtain the second image with enhanced difference features.
The pixel values of the pixel points in the first image after the difference feature enhancement can be determined by using the formula (12).
B1= B’ (12)
Wherein B1 is a pixel value of the first image after the difference feature is enhanced.
The pixel values of the pixel points in the second image after the difference feature enhancement can be determined by using formula (13).
G1= G’ -Δ (13)
Wherein G1 is a pixel value of the second image after the difference feature is enhanced.
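The fast path of formulas (9) through (13) can be sketched with a single absolute-difference image and a per-pixel comparison (function name hypothetical; the comparison here uses the brightness-unified values, consistent with the formulas). Where G' > B', the min-based common image equals B', so B1 = B' - Δ and G1 = G'; the symmetric case holds otherwise, so this route reproduces the common/unique computation:

```python
import numpy as np

def enhance_fast(b, g):
    """Fast path, formulas (9)-(13): one absolute-difference image plus a
    per-pixel comparison replaces the common/unique image computation."""
    delta = np.abs(g - b)               # formula (9)
    b1 = np.where(g > b, b - delta, b)  # formulas (11) and (12)
    g1 = np.where(g > b, g, g - delta)  # formulas (10) and (13)
    return b1, g1
```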
The channel allocation processing unit is used for performing color channel allocation on the first image with the enhanced difference characteristic and the second image with the enhanced difference characteristic based on a preset allocation rule to obtain a false color composite image with the enhanced tissue difference characteristic of the detected body.
Specifically, channel allocation can be performed on the first image with enhanced difference features and the second image with enhanced difference features based on the visual antagonism theory and a preset allocation rule, so as to obtain a false color composite image with enhanced tissue difference features of the subject.
In one embodiment, the preset allocation rule may be that the enhanced image whose corresponding narrowband illumination wavelength is smaller than a preset wavelength is assigned to the B and G channels, and the enhanced image whose corresponding narrowband illumination wavelength is greater than or equal to the preset wavelength is assigned to the R channel, so that a false color composite image with enhanced tissue difference features of the subject can be obtained.
Specifically, the pixel values of the first image with enhanced difference features may be assigned to the B channel and the G channel of the RGB channels, and the pixel values of the second image with enhanced difference features may be assigned to the R channel of the RGB channels.
The preset allocation rule may be expressed as:
R = m1 × G1
G = m2 × B1
B = m3 × B1
Wherein R, G and B respectively represent the channels of the RGB channels; m1, m2 and m3 denote gain coefficients; B1 denotes the first image with enhanced difference features, and G1 denotes the second image with enhanced difference features.
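A sketch of this channel allocation (NumPy; the function name is hypothetical, and the pairing of m1, m2, m3 with particular channels is an assumption, since the text lists the coefficients without an explicit mapping):

```python
import numpy as np

def allocate_channels(b1, g1, m1=1.0, m2=1.0, m3=1.0):
    """The enhanced blue-violet image B1 feeds the B and G channels and the
    enhanced green image G1 feeds the R channel, each scaled by a gain."""
    r = m1 * g1
    g = m2 * b1
    b = m3 * b1
    return np.stack([r, g, b], axis=-1)  # H x W x 3 false color composite
```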
As an embodiment of the present invention, when allocating channels, the channel allocation processing unit may realize different color preferences by setting different gain coefficients, and take the image obtained after applying the color preference and assigning to the RGB channels as the processed image.
As an embodiment of the present invention, gamma color correction may be performed on the processed image by the channel allocation processing unit, so that the brightness contrast of the image is improved.
As an embodiment of the present invention, after the processed image is acquired, the channel allocation processing unit may filter the processed image based on high-frequency filtering to obtain a structure-enhanced image, and the false color composite image with enhanced tissue difference features of the subject may then be obtained based on the structure-enhanced image. This further increases the imaging contrast and color distinction of different tissues in the subject, making accurate lesion diagnosis more convenient.
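The gamma correction and high-frequency structure enhancement steps can be sketched as follows. The gamma value, the 3x3 box blur, and the unsharp-masking form of the high-frequency boost are illustrative assumptions; the text only specifies "filtering based on high-frequency filtering":

```python
import numpy as np

def gamma_correct(img, gamma=0.8, max_val=255.0):
    # Gamma color correction to improve brightness contrast
    return max_val * (img / max_val) ** gamma

def structure_enhance(img, strength=1.0):
    """High-frequency boost via unsharp masking: subtract a low-pass (box
    blurred) copy from the image and add the residual back."""
    h, w = img.shape
    pad = np.pad(img, 1, mode="edge")
    low = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return img + strength * (img - low)  # add back the high-frequency detail
```

A flat region is left unchanged by the structure enhancement, while edges are sharpened.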
Therefore, by allocating channels to the first image with enhanced difference features and the second image with enhanced difference features according to the visual antagonism theory, the details of the images corresponding to superficial blood vessels (the first preset tissue) and middle-and-deep blood vessels (the second preset tissue) are improved, the color difference between these blood vessels and the mucosa is increased, the visibility of superficial and middle-and-deep blood vessels is improved, and the lesion detection rate can thereby be increased.
The image processing method of the endoscope provided by the present invention will be described below, and the image processing method of the endoscope described below and the image processing apparatus of the endoscope described above may be referred to correspondingly to each other.
As shown in fig. 3, the image processing method of an endoscope provided by the present invention is applied to the image processing device of the endoscope, and the method includes:
S301, acquiring, by the image acquisition module, a first narrowband image and a second narrowband image when the first narrowband light and the second narrowband light illuminate the subject.
The subject includes a first preset tissue and a second preset tissue, the first narrowband image has a corresponding relationship with the first narrowband light and the first preset tissue, and the second narrowband image has a corresponding relationship with the second narrowband light and the second preset tissue.
S302, processing difference information of the first narrow-band image and the second narrow-band image by an image processing module in a preset processing mode to obtain a false color synthetic image with enhanced tissue difference characteristics of the detected body.
As an embodiment of the present invention, the preset processing manner includes a brightness unification process, a difference feature enhancement process, and a channel allocation process.
The step of processing, by using a preset processing manner by using an image processing module, the difference information of the first narrowband image and the second narrowband image to obtain a false color composite image with enhanced tissue difference characteristics of the subject may include:
The image processing module includes a brightness unification processing unit, a difference feature enhancement processing unit, and a channel allocation processing unit.
And carrying out brightness unification on the first narrow-band image and the second narrow-band image by a brightness unification processing unit based on the initial pixel value of the first narrow-band image and the initial pixel value of the second narrow-band image to obtain a first image and a second image.
Wherein the first image corresponds to the first narrowband image and the second image corresponds to the second narrowband image.
And acquiring a first image with enhanced difference characteristics of the first preset tissue and the second preset tissue and a second image with enhanced difference characteristics of the first preset tissue and the second preset tissue based on the pixel values of the first image and the pixel values of the second image through a difference characteristic enhancement processing unit.
And carrying out color channel allocation on the first image with the enhanced difference characteristics and the second image with the enhanced difference characteristics based on a preset allocation rule by a channel allocation processing unit to obtain a false color composite image with the enhanced tissue difference characteristics of the detected body.
As one embodiment of the present invention, the step of obtaining, by the difference feature enhancement processing unit, the first image with enhanced difference features of the first preset tissue and the second preset tissue and the second image with enhanced difference features of the first preset tissue and the second preset tissue based on the pixel values of the first image and the pixel values of the second image may include:
The difference feature enhancement processing unit includes a common computing subunit, a unique computing subunit, and a difference feature enhancement subunit.
And obtaining a common image based on the pixel value of the first image and the pixel value of the second image through the common computing subunit.
Obtaining a first unique image based on the pixel value of the first image and the common pixel value of the common image by the unique calculation subunit, and obtaining a second unique image based on the pixel value of the second image and the common pixel value.
The first image with enhanced difference features is obtained through the difference feature enhancement subunit based on the pixel value of the first image and the unique pixel value of the second unique image, and the second image with enhanced difference features is obtained based on the pixel value of the second image and the unique pixel value of the first unique image.
As an embodiment of the present invention, the step of obtaining, by the common computing subunit, the common image based on the pixel value of the first image and the pixel value of the second image may include:
The common computing subunit, for each pixel point having the same pixel coordinates in the first image and the second image, takes the smaller of the pixel value of the first image at those coordinates and the pixel value of the second image at those coordinates as the pixel value at those coordinates, to obtain the common image; or,
The common computing subunit takes, for each pixel point having the same pixel coordinates in the first image and the second image, the weighted average of the pixel value of the first image at those coordinates and the pixel value of the second image at those coordinates as the pixel value at those coordinates, to obtain the common image.
As an embodiment of the present invention, the step of obtaining, by the unique calculation subunit, the first unique image based on the pixel value of the first image and the common pixel value of the common image, and obtaining the second unique image based on the pixel value of the second image and the common pixel value may include:
The unique calculation subunit is used for obtaining a first unique image by taking a difference value between a pixel value of the first image corresponding to the pixel coordinate and a common pixel value corresponding to the pixel coordinate as a first difference value and taking the first difference value as a pixel value corresponding to the pixel coordinate for each pixel point with the same pixel coordinate in the first image and the common image;
And aiming at each pixel point with the same pixel coordinate in the second image and the common image, taking the difference value between the pixel value of the second image corresponding to the pixel coordinate and the common pixel value corresponding to the pixel coordinate as a second difference value, and taking the second difference value as the pixel value corresponding to the pixel coordinate to obtain the second unique image.
As an embodiment of the present invention, the step of obtaining, by the difference feature enhancement subunit, the first image with enhanced difference features based on the pixel value of the first image and the unique pixel value of the second unique image, and obtaining the second image with enhanced difference features based on the pixel value of the second image and the unique pixel value of the first unique image, may include:
The difference feature enhancement subunit, for each pixel point having the same pixel coordinates in the first image and the second unique image, takes the difference between the pixel value of the first image at those coordinates and the unique pixel value of the second unique image at those coordinates as a third difference, and uses the third difference as the pixel value at those coordinates, to obtain the first image with enhanced difference features;
And aiming at each pixel point with the same pixel coordinate in the second image and the first unique image, taking the difference value between the pixel value of the second image corresponding to the pixel coordinate and the unique pixel value of the first unique image corresponding to the pixel coordinate as a fourth difference value, and taking the fourth difference value as the pixel value corresponding to the pixel coordinate to obtain the second image with the enhanced difference characteristic.
As one embodiment of the present invention, the step in which the brightness unification processing unit performs brightness unification processing on the first narrowband image and the second narrowband image based on the initial pixel value of the first narrowband image and the initial pixel value of the second narrowband image to obtain the first image and the second image may include the following steps:
Determining, by the luminance unification processing unit, the first image based on initial pixel values of the first narrowband image;
obtaining a pixel coefficient based on the initial pixel value of the first narrowband image and the initial pixel value of the second narrowband image;
The second image is determined based on the initial pixel values of the second narrowband image and the pixel coefficients.
Fig. 4 illustrates a physical schematic diagram of an electronic device, as shown in fig. 4, which may include: processor 410, communication interface (Communications Interface) 420, memory 430, and communication bus 440, wherein processor 410, communication interface 420, and memory 430 communicate with each other via communication bus 440. The processor 410 may invoke logic instructions in the memory 430 to perform the image processing method of the endoscope.
Further, the logic instructions in the memory 430 described above may be implemented in the form of software functional units and may be stored in a computer-readable storage medium when sold or used as a stand-alone product. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
In another aspect, the present invention also provides a computer program product comprising a computer program storable on a non-transitory computer readable storage medium, the computer program, when executed by a processor, being capable of performing the image processing method of an endoscope as described above.
In yet another aspect, the present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image processing method of an endoscope described above.
The apparatus embodiments described above are merely illustrative; units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without creative effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
It will further be appreciated that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. An image processing apparatus of an endoscope, characterized in that the image processing apparatus of an endoscope includes: an image acquisition module and an image processing module;
the image acquisition module is configured to acquire a first narrowband image and a second narrowband image under the condition that a subject is irradiated by first narrowband light and second narrowband light, where the first narrowband light is blue-violet narrowband light, the second narrowband light is green narrowband light, the subject includes a first preset tissue and a second preset tissue, the first narrowband image has a corresponding relationship with the first narrowband light and the first preset tissue, and the second narrowband image has a corresponding relationship with the second narrowband light and the second preset tissue;
The image processing module is used for processing the difference information of the first narrow-band image and the second narrow-band image by adopting a preset processing mode to obtain a false color synthetic image with the tissue difference characteristic of the detected body enhanced;
The image processing module comprises a brightness unification processing unit, a difference characteristic enhancement processing unit and a channel allocation processing unit;
The brightness unification processing unit is configured to perform brightness unification processing on the first narrowband image and the second narrowband image based on an initial pixel value of the first narrowband image and an initial pixel value of the second narrowband image, so as to obtain a first image and a second image, where the first image corresponds to the first narrowband image, and the second image corresponds to the second narrowband image;
The difference characteristic enhancement processing unit is configured to obtain a first image with enhanced difference characteristics of the first preset tissue and the second preset tissue and a second image with enhanced difference characteristics of the first preset tissue and the second preset tissue based on the pixel values of the first image and the pixel values of the second image;
the channel allocation processing unit is used for performing color channel allocation on the first image with the enhanced difference characteristic and the second image with the enhanced difference characteristic based on a preset allocation rule to obtain a false color composite image with the enhanced tissue difference characteristic of the detected body.
2. The image processing apparatus of an endoscope according to claim 1, wherein the difference feature enhancement processing unit includes a common computing subunit, a unique computing subunit, a difference feature enhancement subunit;
the sharing calculating subunit is configured to obtain a sharing image based on the pixel value of the first image and the pixel value of the second image;
the unique calculation subunit is configured to obtain a first unique image based on the pixel value of the first image and the common pixel value of the common image, and obtain a second unique image based on the pixel value of the second image and the common pixel value;
The difference feature enhancement subunit is configured to obtain the first image with the enhanced difference feature based on the pixel value of the first image and the unique pixel value of the second unique image, and obtain the second image with the enhanced difference feature based on the pixel value of the second image and the unique pixel value of the first unique image.
3. The image processing apparatus according to claim 2, wherein the common computing subunit is specifically configured to, for each pixel point having the same pixel coordinates in the first image and the second image, take the smaller of the pixel value of the first image corresponding to the pixel coordinates and the pixel value of the second image corresponding to the pixel coordinates as the pixel value corresponding to the pixel coordinates, to obtain the common image; or,
The common computing subunit is specifically configured to, for each pixel point having the same pixel coordinates in the first image and the second image, take the weighted average of the pixel value of the first image corresponding to the pixel coordinates and the pixel value of the second image corresponding to the pixel coordinates as the pixel value corresponding to the pixel coordinates, to obtain the common image.
4. The image processing apparatus according to claim 2, wherein the unique calculation subunit is specifically configured to take, for each pixel point having the same pixel coordinates in the first image and the common image, the difference between the pixel value of the first image at the pixel coordinates and the common pixel value at the pixel coordinates as a first difference, and take the first difference as the pixel value at the pixel coordinates to obtain the first unique image;
and, for each pixel point having the same pixel coordinates in the second image and the common image, to take the difference between the pixel value of the second image at the pixel coordinates and the common pixel value at the pixel coordinates as a second difference, and take the second difference as the pixel value at the pixel coordinates to obtain the second unique image.
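The per-pixel subtraction of claim 4 can be sketched as follows (plain-Python grayscale images; a production implementation would likely also clamp negative values, which the claim does not address):

```python
def unique_images(img1, img2, common):
    """Unique (band-specific) components per claim 4:
    unique_k[y][x] = img_k[y][x] - common[y][x]."""
    u1 = [[p - c for p, c in zip(row, crow)] for row, crow in zip(img1, common)]
    u2 = [[p - c for p, c in zip(row, crow)] for row, crow in zip(img2, common)]
    return u1, u2
```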
5. The image processing apparatus according to claim 2, wherein the difference feature enhancement subunit is specifically configured to take, for each pixel point having the same pixel coordinates in the first image and the second unique image, the difference between the pixel value of the first image at the pixel coordinates and the unique pixel value of the second unique image at the pixel coordinates as a third difference, and take the third difference as the pixel value at the pixel coordinates to obtain the first image with the enhanced difference feature;
and, for each pixel point having the same pixel coordinates in the second image and the first unique image, to take the difference between the pixel value of the second image at the pixel coordinates and the unique pixel value of the first unique image at the pixel coordinates as a fourth difference, and take the fourth difference as the pixel value at the pixel coordinates to obtain the second image with the enhanced difference feature.
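Claim 5 then cross-subtracts: each image loses the component unique to the other band. A sketch under the same plain-Python conventions:

```python
def enhance_difference(img1, img2, unique1, unique2):
    """Claim 5: subtract the OTHER image's unique component pixel-wise,
    suppressing features specific to the counterpart band and thereby
    widening the contrast between the two tissue responses."""
    e1 = [[p - u for p, u in zip(row, urow)] for row, urow in zip(img1, unique2)]
    e2 = [[p - u for p, u in zip(row, urow)] for row, urow in zip(img2, unique1)]
    return e1, e2
```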
6. The image processing device of an endoscope according to any one of claims 1-5, wherein said brightness unification processing unit is specifically configured to determine said first image based on the initial pixel values of said first narrowband image;
obtain a pixel coefficient based on the initial pixel values of the first narrowband image and the initial pixel values of the second narrowband image;
and determine the second image based on the initial pixel values of the second narrowband image and the pixel coefficient.
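Claim 6 leaves the pixel-coefficient formula open. One plausible reading, sketched below, takes the ratio of mean brightness, which rescales the second narrowband image to the first image's average level; the mean-ratio choice is an assumption, not stated in the claim:

```python
def brightness_unify(nb1, nb2):
    """First image = first narrowband image unchanged; second image =
    second narrowband image scaled by an assumed mean-ratio coefficient."""
    def mean(img):
        return sum(sum(row) for row in img) / sum(len(row) for row in img)

    coeff = mean(nb1) / mean(nb2)  # hypothetical pixel coefficient
    img1 = [row[:] for row in nb1]
    img2 = [[p * coeff for p in row] for row in nb2]
    return img1, img2
```

After this step the two images have equal mean brightness, so the subsequent subtractions in claims 4-5 compare tissue structure rather than overall exposure.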
7. An image processing method of an endoscope, characterized by being applied to the image processing apparatus of an endoscope according to any one of claims 1 to 6, the method comprising:
acquiring, through an image acquisition module, a first narrowband image and a second narrowband image under the condition that a subject is irradiated by first narrowband light and second narrowband light, wherein the first narrowband light is blue-violet narrowband light, the second narrowband light is green narrowband light, the subject comprises a first preset tissue and a second preset tissue, the first narrowband image corresponds to the first narrowband light and the first preset tissue, and the second narrowband image corresponds to the second narrowband light and the second preset tissue;
processing, by an image processing module, the difference information of the first narrowband image and the second narrowband image in a preset processing mode to obtain a false color composite image with enhanced tissue difference features of the subject;
wherein the image processing module comprises a brightness unification processing unit, a difference feature enhancement processing unit and a channel allocation processing unit;
performing, by the brightness unification processing unit, brightness unification processing on the first narrowband image and the second narrowband image based on the initial pixel values of the first narrowband image and the initial pixel values of the second narrowband image to obtain a first image and a second image, wherein the first image corresponds to the first narrowband image and the second image corresponds to the second narrowband image;
acquiring, by the difference feature enhancement processing unit, based on the pixel values of the first image and the pixel values of the second image, a first image with enhanced difference features of the first preset tissue and the second preset tissue and a second image with enhanced difference features of the first preset tissue and the second preset tissue;
and performing, by the channel allocation processing unit, color channel allocation on the first image with the enhanced difference features and the second image with the enhanced difference features based on a preset allocation rule to obtain the false color composite image with enhanced tissue difference features of the subject.
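The "preset allocation rule" in the final step is not specified by the claim. The sketch below assumes one common narrow-band-imaging convention, routing the green-band image to the red channel and the blue-violet-band image to the green and blue channels; any fixed mapping of the two enhanced images onto RGB channels would satisfy the claim language:

```python
def false_color_composite(enh_blue_violet, enh_green):
    """Stack two enhanced grayscale images into an RGB false-color image.
    The channel mapping (green band -> R, blue-violet band -> G and B) is
    an assumed rule; the patent only requires some preset allocation."""
    return [
        [(g, bv, bv) for bv, g in zip(bv_row, g_row)]
        for bv_row, g_row in zip(enh_blue_violet, enh_green)
    ]
```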
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the image processing method of the endoscope according to claim 7.
9. A non-transitory computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the image processing method of the endoscope according to claim 7.
CN202211504982.9A 2022-11-28 2022-11-28 Image processing device and method for endoscope, electronic device, and storage medium Active CN115731205B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211504982.9A CN115731205B (en) 2022-11-28 2022-11-28 Image processing device and method for endoscope, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN115731205A CN115731205A (en) 2023-03-03
CN115731205B true CN115731205B (en) 2024-04-26

Family

ID=85298850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211504982.9A Active CN115731205B (en) 2022-11-28 2022-11-28 Image processing device and method for endoscope, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN115731205B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007111357A (en) * 2005-10-21 2007-05-10 Olympus Medical Systems Corp Living body imaging apparatus and living body observing system
KR20080104191A (en) * 2006-04-12 2008-12-01 올림푸스 메디칼 시스템즈 가부시키가이샤 Endoscope device
CN102727158A (en) * 2011-04-01 2012-10-17 富士胶片株式会社 Endoscope system and calibration method
CN204379226U (en) * 2014-12-19 2015-06-10 佛山市南海区欧谱曼迪科技有限责任公司 A kind of Narrow-Band Imaging endoscope apparatus
CN105380587A (en) * 2014-09-03 2016-03-09 Hoya株式会社 Image capturing system and electronic endoscope system
JP2016174976A (en) * 2016-06-29 2016-10-06 富士フイルム株式会社 Endoscope system
CN107625513A (en) * 2017-09-30 2018-01-26 华中科技大学 Enhancing shows Narrow-Band Imaging endoscopic system and its imaging method
CN109584210A (en) * 2018-10-30 2019-04-05 南京理工大学 Multispectral three-dimensional vein imaging system
CN114298944A (en) * 2021-12-30 2022-04-08 上海闻泰信息技术有限公司 Image enhancement method, device, equipment and storage medium
CN114391792A (en) * 2021-09-13 2022-04-26 南京诺源医疗器械有限公司 Tumor prediction method and device based on narrow-band imaging and imaging endoscope
CN114532960A (en) * 2022-02-18 2022-05-27 郑州市中医院(郑州市红十字医院) Ureter endoscopic equipment and system based on image enhancement

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
EP2335560B1 (en) * 2008-10-17 2014-11-26 Olympus Medical Systems Corp. Endoscope system and endoscopic image processing apparatus
JP2017131559A (en) * 2016-01-29 2017-08-03 ソニー・オリンパスメディカルソリューションズ株式会社 Medical imaging device, medical image acquisition system, and endoscope apparatus
JP6522539B2 (en) * 2016-03-18 2019-05-29 富士フイルム株式会社 Endoscope system and method of operating the same
JP6495539B2 (en) * 2016-03-29 2019-04-03 富士フイルム株式会社 Image processing apparatus, method of operating image processing apparatus, and image processing program
JP7163386B2 (en) * 2018-06-19 2022-10-31 オリンパス株式会社 Endoscope device, method for operating endoscope device, and program for operating endoscope device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xiaoqi Liu et al., "Transfer Learning with Convolutional Neural Network for Early Gastric Cancer Classification on Magnifiying Narrow-Band Imaging," ICIP 2018, 2018-12-31, pp. 1388-1392 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant