CN115731205A - Image processing device and method for endoscope, electronic device, and storage medium - Google Patents


Info

Publication number
CN115731205A
Authority
CN
China
Prior art keywords
image
pixel
difference
pixel value
narrow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211504982.9A
Other languages
Chinese (zh)
Other versions
CN115731205B (en)
Inventor
张仕鹏
付野
李宗州
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Aohua Endoscopy Co ltd
Peking University
Original Assignee
Shanghai Aohua Endoscopy Co ltd
Peking University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Aohua Endoscopy Co ltd, Peking University filed Critical Shanghai Aohua Endoscopy Co ltd
Priority to CN202211504982.9A priority Critical patent/CN115731205B/en
Publication of CN115731205A publication Critical patent/CN115731205A/en
Application granted granted Critical
Publication of CN115731205B publication Critical patent/CN115731205B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Endoscopes (AREA)

Abstract

The invention provides an image processing device and method for an endoscope, an electronic device, and a storage medium. The image processing device of the endoscope comprises an image acquisition module and an image processing module. The image acquisition module is used for acquiring a first narrow-band image and a second narrow-band image while a subject is irradiated by first narrow-band light and second narrow-band light, the subject comprising a first preset tissue and a second preset tissue; the first narrow-band image corresponds to the first narrow-band light and the first preset tissue, and the second narrow-band image corresponds to the second narrow-band light and the second preset tissue. The image processing module is used for processing the difference information of the first narrow-band image and the second narrow-band image in a preset processing manner to obtain a false color composite image with enhanced tissue difference characteristics of the subject. In this way, the imaging contrast and color discrimination of different tissues in the subject can be increased, facilitating accurate lesion diagnosis based on the tissue-enhanced image.

Description

Image processing device and method for endoscope, electronic device, and storage medium
Technical Field
The present invention relates to the field of endoscope technologies, and in particular, to an image processing apparatus and method for an endoscope, an electronic device, and a storage medium.
Background
In the medical field, endoscopes are widely used. Diagnosing color changes in images obtained by endoscopy is an important means of detecting digestive tract diseases, and such color changes help determine the position and nature of a lesion more accurately.
In the related art, the characterization colors of different tissues in the subject appear similar in images obtained with current endoscopes, so the tissues cannot be effectively distinguished; this easily causes visual fatigue in the observer and in turn reduces the accuracy of lesion diagnosis.
Disclosure of Invention
The invention provides an image processing device and method for an endoscope, an electronic device, and a storage medium, which address the defect in the prior art that the characterization colors of different tissues in a subject are similar, thereby increasing the imaging contrast and color discrimination of different tissues (such as deep and superficial blood vessels) in the subject and facilitating accurate lesion diagnosis.
The present invention provides an image processing apparatus of an endoscope, including: the device comprises an image acquisition module and an image processing module;
the image acquisition module is used for acquiring a first narrow-band image and a second narrow-band image under the condition that a subject is irradiated by first narrow-band light and second narrow-band light, wherein the subject comprises a first preset tissue and a second preset tissue, the first narrow-band image is in a corresponding relationship with the first narrow-band light and the first preset tissue, and the second narrow-band image is in a corresponding relationship with the second narrow-band light and the second preset tissue;
the image processing module is configured to process difference information of the first narrow-band image and the second narrow-band image in a preset processing manner to obtain a false color composite image with enhanced tissue difference characteristics of the subject.
According to the image processing device of the endoscope provided by the invention, the preset processing manner comprises brightness unification processing, difference feature enhancement processing, and channel distribution processing, and the image processing module comprises a brightness unification processing unit, a difference feature enhancement processing unit, and a channel distribution processing unit;
the brightness unification processing unit is configured to perform brightness unification processing on the first narrowband image and the second narrowband image based on an initial pixel value of the first narrowband image and an initial pixel value of the second narrowband image to obtain a first image and a second image, where the first image corresponds to the first narrowband image and the second image corresponds to the second narrowband image;
the difference feature enhancement processing unit is used for acquiring a first image after the difference features of the first preset tissue and the second preset tissue are enhanced and a second image after the difference features of the first preset tissue and the second preset tissue are enhanced based on the pixel values of the first image and the pixel values of the second image;
and the channel distribution processing unit is used for carrying out color channel distribution on the first image subjected to the enhancement of the difference characteristic and the second image subjected to the enhancement of the difference characteristic based on a preset distribution rule to obtain a false color composite image subjected to the enhancement of the tissue difference characteristic of the detected body.
According to the image processing apparatus of an endoscope provided by the present invention, the difference feature enhancement processing unit includes a common calculation subunit, an unique calculation subunit, a difference feature enhancement subunit;
the shared calculating subunit is used for obtaining a shared image based on the pixel value of the first image and the pixel value of the second image;
the unique calculation subunit is configured to obtain a first unique image based on the pixel value of the first image and the common pixel value of the common image, and obtain a second unique image based on the pixel value of the second image and the common pixel value;
the difference feature enhancement subunit is configured to obtain the first image after the difference feature enhancement based on the pixel values of the first image and the unique pixel values of the second unique image, and to obtain the second image after the difference feature enhancement based on the pixel values of the second image and the unique pixel values of the first unique image.
According to the image processing apparatus of the endoscope provided by the present invention, the common calculation subunit is specifically configured to, for each pixel point having the same pixel coordinate in the first image and the second image, take the smaller of the pixel value of the first image at the pixel coordinate and the pixel value of the second image at the pixel coordinate as the pixel value corresponding to the pixel coordinate, so as to obtain the common image; or alternatively,
the common calculating subunit is specifically configured to, for each pixel point having the same pixel coordinate in the first image and the second image, use a weighted average of a pixel value of the first image corresponding to the pixel coordinate and a pixel value of the second image corresponding to the pixel coordinate as a pixel value corresponding to the pixel coordinate, so as to obtain the common image.
According to the image processing apparatus for an endoscope provided by the present invention, the unique calculating subunit is specifically configured to, for each pixel point having the same pixel coordinate in the first image and the common image, obtain the first unique image by taking a difference between a pixel value of the first image corresponding to the pixel coordinate and a common pixel value corresponding to the pixel coordinate as a first difference, and taking the first difference as a pixel value corresponding to the pixel coordinate;
and regarding each pixel point with the same pixel coordinate in the second image and the common image, taking the difference value between the pixel value of the second image corresponding to the pixel coordinate and the common pixel value corresponding to the pixel coordinate as a second difference value, and taking the second difference value as the pixel value corresponding to the pixel coordinate to obtain the second unique image.
According to the image processing apparatus of the endoscope provided by the present invention, the difference feature enhancement subunit is specifically configured to, for each pixel point having the same pixel coordinate in the first image and the second unique image, take the difference between the pixel value of the first image at the pixel coordinate and the unique pixel value of the second unique image at the pixel coordinate as a third difference value, and take the third difference value as the pixel value corresponding to the pixel coordinate, so as to obtain the first image after the difference feature enhancement;
and regarding each pixel point with the same pixel coordinate in the second image and the first unique image, taking the difference value between the pixel value of the second image corresponding to the pixel coordinate and the unique pixel value of the first unique image corresponding to the pixel coordinate as a fourth difference value, and taking the fourth difference value as the pixel value corresponding to the pixel coordinate to obtain the second image with enhanced difference characteristics.
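The unique-image and enhancement steps above can be sketched as element-wise array operations. The following is a minimal NumPy sketch, not the patent's exact implementation: the function name and the assumption that all operations are pixel-wise over equally sized float arrays are illustrative.

```python
import numpy as np

def enhance_difference_features(first_img, second_img, common_img):
    """Sketch of the unique-image and difference-feature-enhancement steps.

    All arrays are assumed to be the same shape, with one value per pixel.
    """
    # First / second differences: each image minus the common image
    first_unique = first_img - common_img
    second_unique = second_img - common_img
    # Third / fourth differences: each image minus the OTHER image's
    # unique part, which suppresses the other tissue's features
    first_enhanced = first_img - second_unique
    second_enhanced = second_img - first_unique
    return first_enhanced, second_enhanced
```

In practice the results may need clipping back to the valid pixel range (e.g. 0-255) before channel distribution; the patent text does not specify this.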
According to the image processing apparatus of an endoscope provided by the present invention, the brightness unification processing unit is specifically configured to determine the first image based on an initial pixel value of the first narrowband image;
obtaining a pixel coefficient based on the initial pixel value of the first narrowband image and the initial pixel value of the second narrowband image;
determining the second image based on the initial pixel values of the second narrowband image and the pixel coefficient.
The present invention also provides an image processing method for an endoscope, which is applied to any one of the image processing apparatuses for an endoscope described above, the method including:
acquiring a first narrow-band image and a second narrow-band image by an image acquisition module under the condition that a subject is irradiated by first narrow-band light and second narrow-band light, wherein the subject comprises a first preset tissue and a second preset tissue, the first narrow-band image is in corresponding relation with the first narrow-band light and the first preset tissue, and the second narrow-band image is in corresponding relation with the second narrow-band light and the second preset tissue;
and processing the difference information of the first narrow-band image and the second narrow-band image by an image processing module in a preset processing mode to obtain a false color composite image with enhanced tissue difference characteristics of the detected body.
The present invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the image processing method of the endoscope as described in any of the above when executing the computer program.
The present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image processing method of an endoscope as described in any of the above.
Drawings
In order to more clearly illustrate the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic view of an image processing apparatus of an endoscope according to the present invention;
FIG. 2 is a second schematic view of the image processing apparatus of the endoscope according to the present invention;
FIG. 3 is a flow chart of an image processing method of an endoscope according to the present invention;
fig. 4 is a schematic structural diagram of an electronic device provided by the present invention.
Reference numerals:
110: an image acquisition module; 120: an image processing module; 201: a light source module; 202: an image sensor; 203: an endoscope; 204: and a display module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides an image processing device for an endoscope that can highlight blood vessel images, such as micro blood vessels in the surface layer of the digestive tract mucosa, within a scene, thereby enhancing image details and differences. Those skilled in the art will appreciate that the design concept of this solution is applicable to clinical scenarios other than image-enhanced display of the microvasculature of the digestive tract mucosa.
As shown in fig. 1, an image processing apparatus of an endoscope according to the present invention may include: an image acquisition module 110 and an image processing module 120.
The image acquisition module 110 is configured to acquire the first narrow-band image and the second narrow-band image when the subject is irradiated with the first narrow-band light and the second narrow-band light.
In one embodiment, the first and second narrow-band lights may illuminate the subject simultaneously, and the image acquisition module 110 may acquire an original RGB image while the first and second narrow-band lights illuminate the subject as simultaneous illumination light. The first narrow-band image and the second narrow-band image are then obtained from the color channels of the original RGB image according to the characteristics of the first and second narrow-band lights.
The subject may include a first preset tissue and a second preset tissue. For example, when fine blood vessels are the subject, superficial blood vessels may be the first preset tissue and mid-to-deep blood vessels the second preset tissue; in this case the first narrow-band light may be blue-violet narrow-band light with a center wavelength of 410 nm-440 nm, and the second narrow-band light may be green narrow-band light with a center wavelength of 530 nm-550 nm.
The first narrow-band light is the narrow-band light corresponding to the absorption peak of the first preset tissue to the light, and the second narrow-band light is the narrow-band light corresponding to the absorption peak of the second preset tissue to the light. Therefore, the B-channel information in the original RGB image can be taken as the first narrowband image. And taking G channel information in the original RGB image as a second narrow-band image.
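Extracting the two narrow-band images from a simultaneous-illumination frame can be sketched as a simple channel split. This is an illustrative NumPy sketch, assuming the frame is an H x W x 3 array in RGB channel order; the function name is hypothetical.

```python
import numpy as np

def split_narrowband_images(rgb_frame):
    """Split a simultaneous-illumination RGB frame into the two narrow-band
    images: the B channel carries the blue-violet (first narrow-band)
    response and the G channel the green (second narrow-band) response."""
    first_narrowband = rgb_frame[..., 2].astype(np.float64)   # B channel
    second_narrowband = rgb_frame[..., 1].astype(np.float64)  # G channel
    return first_narrowband, second_narrowband
```

Note that many camera APIs deliver frames in BGR order rather than RGB; the channel indices would need to be swapped accordingly.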
In another embodiment, the first narrowband light and the second narrowband light may not irradiate the object at the same time, that is, the first narrowband light and the second narrowband light respectively irradiate the object at different timings as time-sharing illumination light, in which case the image acquisition module 110 may acquire a narrowband image corresponding to the specific local enhancement information of the first preset tissue, that is, the first narrowband image, in a case where the first narrowband light irradiates the object. The image acquiring module 110 may further acquire a narrowband image corresponding to the specific local enhancement information of the second preset tissue, that is, the second narrowband image, when the second narrowband light irradiates the subject.
That is, the first narrow-band image corresponds to the first narrow-band light and the first predetermined tissue, and the second narrow-band image corresponds to the second narrow-band light and the second predetermined tissue. The first narrowband image comprises specific local enhancement information of a first preset tissue. The second narrowband image includes specific local enhancement information of a second preset tissue.
The bandwidth range of the first narrow-band light is a first preset bandwidth range, and the bandwidth range of the second narrow-band light is a second preset bandwidth range, where the first preset bandwidth range may be consistent with or inconsistent with the second preset bandwidth range, and the first preset bandwidth range and the second preset bandwidth range may be set according to actual situations, which is reasonable.
The image processing module 120 is configured to process difference information of the first narrow-band image and the second narrow-band image by using a preset processing manner, so as to obtain a false color composite image with enhanced tissue difference characteristics of the subject.
After the first narrow-band image and the second narrow-band image are acquired, in order to distinguish the first preset tissue from the second preset tissue, a preset processing mode may be adopted to process difference information of the first narrow-band image and the second narrow-band image, so as to obtain a false color composite image with enhanced tissue difference characteristics of the subject.
The preset processing manner may include brightness unification processing, difference feature enhancement processing, and channel distribution processing, so that the imaging contrast and color discrimination of different tissues in the subject can be increased, facilitating subsequent accurate lesion diagnosis based on the tissue-enhanced image.
In order to facilitate understanding of the image processing apparatus for an endoscope according to the present invention, the image processing apparatus for an endoscope according to the present invention will be described with reference to fig. 2.
As shown in fig. 2, the image processing apparatus of the endoscope according to the present invention includes an image processing module 120, an image acquisition module (not numbered in the figure) composed of an endoscope 203 and an image sensor 202 provided at the distal tip of the endoscope 203 (the end not connected to the image processing apparatus), a light source module 201, and a display module 204.
The light source module 201 of the image processing apparatus of the endoscope may provide an illumination light source including the first narrow-band light and the second narrow-band light. In use, when the first narrow-band light and the second narrow-band light provided by the light source module 201 irradiate the subject, the first narrow-band image and the second narrow-band image may be acquired by the image acquisition module.
The light source module 201 may simultaneously provide an illumination light source of the first and second narrow-band lights. It is also possible to provide the illumination source of the first narrow-band light first and then the illumination source of the second narrow-band light. It is also possible to provide the illumination source for the second narrow-band light first and then the illumination source for the first narrow-band light. This is all reasonable.
It should be noted that the first preset organization and the second preset organization may be adjusted according to actual situations, and are not specifically limited in this embodiment. It will be appreciated that if the first predetermined tissue and the second predetermined tissue are changed, the first and second narrow band lights are changed accordingly.
For example, fine blood vessels are the subject, superficial blood vessels are the first preset tissue, and mid-to-deep blood vessels are the second preset tissue. The light source module 201 may provide blue-violet narrow-band light (the first narrow-band light) with a center wavelength of 410 nm-440 nm and green narrow-band light (the second narrow-band light) with a center wavelength of 530 nm-550 nm.
The blue-violet narrow-band light with a center wavelength of 410 nm-440 nm corresponds to the absorption peak of superficial blood vessels, and the green narrow-band light with a center wavelength of 530 nm-550 nm corresponds to the absorption peak of mid-to-deep blood vessels. The image acquisition module can then acquire the first narrow-band image and the second narrow-band image.
In one embodiment, the manner of acquiring the first narrowband image and the second narrowband image by the image acquisition module is as follows: the original RGB image of the subject under the irradiation of the first and second narrow-band lights as the simultaneous illumination light is acquired by the image sensor 202 provided on the top of the endoscope 203, and the image sensor 202 is a color sensor, and therefore, the B-channel information in the original RGB image can be taken as the first narrow-band image, that is, the shallow blood vessel image. And taking G channel information in the original RGB image as a second narrow-band image, namely a middle-deep layer blood vessel image.
In a specific implementation, after an image is acquired through the image sensor 202 disposed on the top of the endoscope 203, noise information in the image may be removed through a digital signal processing method or a denoising method to obtain an original RGB image, and then a first narrowband image and a second narrowband image are acquired. Therefore, the acquired narrow-band image can be ensured to be more accurate and clear.
In another embodiment, the manner of acquiring the first narrowband image and the second narrowband image by the image acquisition module is as follows: by the image sensor 202 provided on the top of the endoscope 203, an image of the subject under illumination of the first narrow-band light is acquired as a first narrow-band image, i.e., a superficial blood vessel image, and an image of the subject under illumination of the second narrow-band light is acquired as a second narrow-band image, i.e., a middle-deep blood vessel image. When the first narrow-band light and the second narrow-band light are time-division illumination light, the image sensor 202 may be a color sensor or a black-and-white sensor.
In a specific implementation, after an image is acquired by the image sensor 202 disposed on the top of the endoscope 203, noise information in the image may be removed by means of digital signal processing or denoising, so as to obtain a first narrowband image or a second narrowband image. Therefore, the acquired narrow-band image can be ensured to be more accurate and clear.
After the first narrow-band image and the second narrow-band image are acquired, the image processing module 120 may process the difference information of the first narrow-band image and the second narrow-band image in a preset processing manner to obtain a false color composite image with enhanced tissue difference characteristics of the subject. After the false color composite image is acquired, it may be displayed by the display module 204 for subsequent accurate lesion diagnosis based on the tissue-enhanced image.
In one embodiment, the tissue enhanced images of the plurality of samples may be acquired in real time, and the tissue enhanced images of the plurality of samples may be integrated frame by frame to obtain a video stream about the samples, and the video stream may be displayed by the display module 204 for the user to diagnose the disease condition.
As an implementation manner of the embodiment of the present invention, the preset processing manner may include brightness unification processing, difference feature enhancement processing, and channel distribution processing. Correspondingly, the image processing module may include a brightness unification processing unit, a difference feature enhancement processing unit, and a channel distribution processing unit.
The brightness unification processing unit is configured to perform brightness unification processing on the first narrowband image and the second narrowband image based on an initial pixel value of the first narrowband image and an initial pixel value of the second narrowband image, so as to obtain a first image and a second image. The first image corresponds to a first narrow-band image, and the second image corresponds to a second narrow-band image.
The luminance unification processing unit is specifically configured to determine the first image based on an initial pixel value of the first narrowband image.
In one embodiment, a first image (which may be denoted as B') is derived based on B-channel luminance (first narrowband image). The first image may be determined based on initial pixel values of pixels of the first narrowband image.
The first image determination manner may be expressed by formula (1):
B’=B (1)
that is, B is an initial pixel value of the first narrowband image, and B' is a pixel value of the first image.
Deriving a pixel coefficient based on the initial pixel value of the first narrowband image and the initial pixel value of the second narrowband image, wherein the pixel coefficient may be represented by k.
In one embodiment, the sum of the initial pixel values of the pixels in the first narrowband image (which may be denoted as sum (B)) may be divided by the sum of the initial pixel values of the pixels in the second narrowband image (which may be denoted as sum (G)), and the resulting quotient may be taken as the pixel coefficient, so that the rescaled second image matches the total brightness of the first image.
The pixel coefficient can be expressed by equation (2):
k=sum(B)/sum(G) (2)
where sum (G) is the sum of the initial pixel values of the respective pixels in the second narrowband image, and sum (B) is the sum of the initial pixel values of the respective pixels in the first narrowband image.
In another embodiment, the average value of the initial pixel values of the pixels in the first narrowband image (denoted as avg (B)) may be divided by the average value of the initial pixel values of the pixels in the second narrowband image (denoted as avg (G)), and the resulting quotient may be used as the pixel coefficient. This is also reasonable and is not specifically limited herein.
The second image is determined based on the initial pixel values of the second narrowband image and the pixel coefficient. Specifically, the second image (which may be denoted as G ') may be determined as the product of the G-channel luminance (the second narrowband image) and the pixel coefficient; that is, G is the initial pixel value of the second narrowband image and G' is the pixel value of the second image.
The manner of determining the second image can be expressed by formula (3):
G’=k*G (3)
In formulas (1)-(3), G denotes the pixel value of the G channel among the RGB channels of the original RGB image (corresponding to the initial pixel value of the second narrowband image), and B denotes the pixel value of the B channel (corresponding to the initial pixel value of the first narrowband image).
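The brightness unification of formulas (1)-(3) can be sketched as follows. This is a minimal NumPy sketch assuming the pixel coefficient k scales the second image so that both images have the same total (and therefore same average) pixel value; the function name is illustrative.

```python
import numpy as np

def unify_brightness(b, g):
    """Brightness unification: keep the first image as-is and rescale the
    second so that both images end up with the same pixel sum."""
    b_prime = b.astype(np.float64)        # formula (1): B' = B
    k = b_prime.sum() / g.sum()           # formula (2): k = sum(B) / sum(G)
    g_prime = k * g.astype(np.float64)    # formula (3): G' = k * G
    return b_prime, g_prime
```

After this step, sum(B') equals sum(G'), so the two images' average brightness is unified, as required before the difference-feature enhancement.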
The luminance unification processing unit may be further configured to unify luminance by histogram stretching or the like to obtain the first image and the second image.
In this way, after the first narrow-band image and the second narrow-band image are processed by the brightness unification processing unit, differences in the brightness of the generated images, caused by the image sensor responding with different intensities in the two narrow-band light ranges, can be avoided. The pixel averages of the first image and the second image thus become uniform, and their brightness is unified.
The difference feature enhancement processing unit is configured to obtain a first image after enhancement of the difference features of the first preset tissue and the second preset tissue and a second image after enhancement of the difference features of the first preset tissue and the second preset tissue based on the pixel value of the first image and the pixel value of the second image.
In order to display the detailed features of the first preset tissue and the second preset tissue, a first image obtained by enhancing the difference features of the first preset tissue and the second preset tissue and a second image obtained by enhancing the difference features of the first preset tissue and the second preset tissue may be obtained based on the pixel value of the first image and the pixel value of the second image.
In one embodiment, the difference feature enhancement processing unit comprises a common calculation subunit, a unique calculation subunit, and a difference feature enhancement subunit.
And the shared calculating subunit is used for obtaining a shared image based on the pixel value of the first image and the pixel value of the second image.
In an embodiment, the common calculating subunit is specifically configured to, for each pixel point having the same pixel coordinate in the first image and the second image, obtain a pixel value of the first image corresponding to the pixel coordinate and a pixel value of the second image corresponding to the pixel coordinate, and use a smaller one of the pixel values of the first image and the pixel values of the second image as a pixel value corresponding to the pixel coordinate, so as to obtain a common image.
That is, for each pixel point in the common image, the pixel value corresponding to the pixel point is the pixel value with the smaller value between the pixel value of the first image and the pixel value of the second image.
For example, regarding the pixel coordinate a, in the first image, the pixel value corresponding to the pixel coordinate a is m, in the second image, the pixel value corresponding to the pixel coordinate a is n, and m is smaller than n, so that the pixel value corresponding to the pixel coordinate a in the common image can be determined to be m, and so on, the pixel value of each pixel point in the common image can be determined.
For the pixel value of each pixel point in the common image, the formula (4) can be adopted for confirmation.
Com=min(B’,G’) (4)
Wherein Com is a pixel value of a pixel point in the common image.
In another embodiment, the common calculating subunit is specifically configured to, for each pixel point having the same pixel coordinate in the first image and the second image, obtain a pixel value of the first image corresponding to the pixel coordinate and a pixel value of the second image corresponding to the pixel coordinate. And taking the weighted average value of the pixel value of the first image and the pixel value of the second image as the pixel value corresponding to the pixel coordinate to obtain a shared image.
That is to say, for each pixel point in the common image, the pixel value corresponding to the pixel point is a weighted average of the pixel value of the first image and the pixel value of the second image.
For example, for pixel coordinate B, the pixel value corresponding to coordinate B in the first image is x and the pixel value corresponding to coordinate B in the second image is y; with the two pixel values weighted equally, the pixel value corresponding to coordinate B in the common image can be determined to be (x + y)/2. The pixel value of each pixel point in the common image can be determined in the same way.
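Both variants of the common calculation subunit reduce to one elementwise operation over the unified images (a sketch; the `use_min` switch and the equal default weights are assumptions):

```python
import numpy as np

def common_image(b_img: np.ndarray, g_img: np.ndarray,
                 use_min: bool = True, w: float = 0.5) -> np.ndarray:
    """Common image Com: either the per-pixel minimum (formula (4))
    or a weighted average of the two brightness-unified images."""
    if use_min:
        return np.minimum(b_img, g_img)  # Com = min(B', G')
    return w * b_img + (1.0 - w) * g_img
```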
The unique calculation subunit is configured to obtain a first unique image based on the pixel value of the first image and the common pixel value of the common image, and obtain a second unique image based on the pixel value of the second image and the common pixel value.
The unique calculating subunit is specifically configured to, for each pixel point having the same pixel coordinate in the first image and the common image, use a difference between a pixel value of the first image corresponding to the pixel coordinate and the common pixel value corresponding to the pixel coordinate as a first difference. And the first difference value is used as a pixel value corresponding to the pixel coordinate to obtain a first unique image.
The first difference value may be obtained by equation (5).
B0= B’-Com (5)
Where B0 is the first difference value, i.e., the unique pixel value of the first unique image.
And regarding each pixel point with the same pixel coordinate in the second image and the common image, taking the difference value between the pixel value of the second image corresponding to the pixel coordinate and the common pixel value corresponding to the pixel coordinate as a second difference value. And the second difference value is used as a pixel value corresponding to the pixel coordinate to obtain a second unique image.
The second difference value may be obtained by equation (6).
G0= G’-Com (6)
Where G0 is the second difference value, i.e., the unique pixel value of the second unique image.
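Formulas (5) and (6) are plain per-pixel subtractions; a sketch (function name assumed):

```python
import numpy as np

def unique_images(b_img: np.ndarray, g_img: np.ndarray, com: np.ndarray):
    """Unique components: subtract the common image from each unified image."""
    b0 = b_img - com  # formula (5): first unique image
    g0 = g_img - com  # formula (6): second unique image
    return b0, g0
```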
The difference feature enhancement subunit is configured to obtain the first image after difference feature enhancement based on the pixel values of the first image and the unique pixel values of the second unique image, and to obtain the second image after difference feature enhancement based on the pixel values of the second image and the unique pixel values of the first unique image.
In an embodiment, for each pixel point having the same pixel coordinate in the first image and the second unique image, a difference between a pixel value of the first image corresponding to the pixel coordinate and a unique pixel value of the second unique image corresponding to the pixel coordinate may be used as a third difference, and the third difference is used as a pixel value corresponding to the pixel coordinate, so as to obtain the first image with enhanced difference characteristics.
The pixel value of the pixel point in the first image after the difference characteristic enhancement can be obtained through a formula (7).
B1= B’- G0 (7)
Wherein, B1 is the pixel value of the first image after the enhancement of the difference feature.
For each pixel point having the same pixel coordinate in the second image and the first unique image, the difference between the pixel value of the second image corresponding to the pixel coordinate and the unique pixel value of the first unique image corresponding to the pixel coordinate can be used as a fourth difference, and the fourth difference is used as the pixel value corresponding to the pixel coordinate, so as to obtain the second image with enhanced difference features.
The pixel value of the pixel point in the second image after the difference characteristic enhancement can be obtained through a formula (8).
G1= G’- B0 (8)
Wherein G1 is the pixel value of the second image after the enhancement of the difference feature.
In this way, the interference of the second unique image is eliminated from the first image after difference feature enhancement, and the interference of the first unique image is eliminated from the second image after difference feature enhancement, so that the detail features of the first preset tissue and the second preset tissue can be displayed more clearly, laying a foundation for accurate lesion diagnosis.
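The cross-subtraction of formulas (7)-(8) — each image minus the other image's unique component — can be sketched as follows (function name assumed):

```python
import numpy as np

def enhance_pair(b_img: np.ndarray, g_img: np.ndarray,
                 b0: np.ndarray, g0: np.ndarray):
    """Difference feature enhancement: remove the OTHER image's unique part."""
    b1 = b_img - g0  # formula (7): first image after enhancement
    g1 = g_img - b0  # formula (8): second image after enhancement
    return b1, g1
```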
When the pixel value of each pixel point in the common image is the smaller of the pixel value of the first image and the pixel value of the second image, the first image after difference feature enhancement and the second image after difference feature enhancement can be determined more quickly and conveniently in the following way.
A difference image is obtained based on the pixel values of the first image and the pixel values of the second image.
For each pixel point with the same pixel coordinate in the first image and the second image, an absolute value of a difference between a pixel value of the second image corresponding to the pixel coordinate and a pixel value of the first image may be used as a pixel value of the pixel coordinate in the difference image.
The pixel value of the difference image may be determined using equation (9).
Δ=| G’ - B’ | (9)
Where Δ is a pixel value of the difference image.
Then, whether the pixel value of the second image is larger than the pixel value of the first image can be judged.
For each pixel point with the same pixel coordinate in the first image and the second image, whether the pixel value of the second image corresponding to the pixel coordinate is larger than the pixel value of the first image can be judged. In the case that the pixel value of the second image is larger than that of the first image, the pixel value of the second image is taken as the pixel value of the second image with enhanced difference features at that pixel coordinate.
For each pixel point with the same pixel coordinate in the first image and the difference image, the difference between the pixel value of the first image corresponding to the pixel coordinate and the difference pixel value of the difference image corresponding to the pixel coordinate is taken as a fifth difference, and the fifth difference is taken as the pixel value of the pixel coordinate, so as to obtain the first image with enhanced difference features.
The pixel value of the pixel point in the second image after the enhancement of the difference feature can be determined by using the formula (10).
G1= G’ (10)
Wherein, G1 is a pixel value of the second image after the enhancement of the difference feature.
The pixel value of the pixel point in the first image after the enhancement of the difference feature can be determined by using formula (11).
B1= B’-Δ (11)
Wherein, B1 is a pixel value of the first image after the enhancement of the difference feature.
In the case that the pixel value of the second image is not larger than the pixel value of the first image, the pixel value of the first image is taken as the pixel value of the first image with enhanced difference features at that pixel coordinate. For each pixel point with the same pixel coordinate in the second image and the difference image, the difference between the pixel value of the second image corresponding to the pixel coordinate and the difference pixel value of the difference image corresponding to the pixel coordinate is taken as a sixth difference, and the sixth difference is taken as the pixel value of the pixel coordinate, so as to obtain the second image with enhanced difference features.
The pixel value of the pixel point in the first image after the enhancement of the difference feature can be determined by using formula (12).
B1= B’ (12)
Wherein, B1 is a pixel value of the first image after the enhancement of the difference feature.
The pixel value of the pixel point in the second image after the enhancement of the difference feature can be determined by using formula (13).
G1= G’ -Δ (13)
Wherein G1 is the pixel value of the second image after the enhancement of the difference feature.
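When the common image is the per-pixel minimum, the two branches above collapse into one vectorized pass over the difference image Δ; a sketch of formulas (9)-(13) (function name assumed), which is per-pixel equivalent to the cross-subtraction of formulas (7)-(8):

```python
import numpy as np

def enhance_fast(b_img: np.ndarray, g_img: np.ndarray):
    """Shortcut enhancement: subtract delta from whichever image is smaller
    at each pixel, and keep the larger image unchanged there."""
    delta = np.abs(g_img - b_img)                  # formula (9)
    g_larger = g_img > b_img
    b1 = np.where(g_larger, b_img - delta, b_img)  # formulas (11)/(12)
    g1 = np.where(g_larger, g_img, g_img - delta)  # formulas (10)/(13)
    return b1, g1
```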
And the channel allocation processing unit is used for performing color channel allocation on the first image subjected to the difference characteristic enhancement and the second image subjected to the difference characteristic enhancement based on a preset allocation rule to obtain a false color composite image subjected to the tissue difference characteristic enhancement of the detected body.
Specifically, the first image with enhanced difference characteristics and the second image with enhanced difference characteristics may be subjected to channel allocation based on the visual antagonism principle and a preset allocation rule, so as to obtain a false color composite image with enhanced tissue difference characteristics of the subject.
In one embodiment, the preset allocation rule may be that images with enhanced difference features whose corresponding narrow-band illumination wavelength is smaller than a preset wavelength are allocated to the B and G channels, while those whose corresponding wavelength is greater than or equal to the preset wavelength are allocated to the R channel, so that the false color composite image with enhanced tissue difference features of the subject can be obtained.
Specifically, the pixel values of the first image after difference feature enhancement may be assigned to both the B channel and the G channel of the RGB channels, and the pixel values of the second image after difference feature enhancement may be assigned to the R channel of the RGB channels.
The preset allocation rule can be expressed as:
R = m1*G1
G = m2*B1
B = m3*B1
wherein R, G, and B respectively represent the channels of the RGB image; m1, m2, and m3 represent gain coefficients; B1 represents the first image after difference feature enhancement (the detail-enhanced first image); and G1 represents the second image after difference feature enhancement (the detail-enhanced second image).
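The allocation rule can be sketched as a channel stack (the unit default gains and the function name are assumptions; in practice the gains would be tuned for color preference):

```python
import numpy as np

def allocate_channels(b1: np.ndarray, g1: np.ndarray,
                      m1: float = 1.0, m2: float = 1.0, m3: float = 1.0):
    """Channel allocation: R <- m1*G1, G <- m2*B1, B <- m3*B1,
    producing an H x W x 3 false color composite."""
    return np.stack([m1 * g1, m2 * b1, m3 * b1], axis=-1)
```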
As an embodiment of the present invention, when allocating channels, the channel allocation processing unit may set different gain coefficients to obtain a color preference, and use the image obtained after applying the color preference and allocating to the RGB channels as the processed image.
As an embodiment of the present invention, the processed image may be subjected to gamma color correction by the channel allocation processing unit, so as to improve the brightness contrast of the image.
As an embodiment of the present invention, after the processed image is acquired, the channel allocation processing unit may perform filtering processing on the processed image based on high-frequency filtering to obtain a structure-enhanced image, and obtain the false color composite image with enhanced tissue difference features of the subject based on the processed image and the structure-enhanced image. In this way, the imaging contrast and the color discrimination of different tissues in the subject can be further increased, facilitating accurate lesion diagnosis.
It can be seen that, the channel distribution is performed on the first image with enhanced difference characteristics and the second image with enhanced difference characteristics according to the visual antagonism principle, so that the details of the images corresponding to the surface blood vessel (first preset tissue) and the middle-deep blood vessel (second preset tissue) can be improved, the color difference between the surface blood vessel and the mucosa can be improved, the visibility of the surface blood vessel and the middle-deep blood vessel can be improved, and the detection rate of pathological changes can be improved.
The following describes an image processing method of an endoscope provided by the present invention, and the image processing method of an endoscope described below and the image processing apparatus of an endoscope described above can be referred to in correspondence with each other.
As shown in fig. 3, an image processing method of an endoscope according to the present invention is applied to an image processing apparatus of the endoscope, and includes:
s301, the image acquisition module acquires a first narrow-band image and a second narrow-band image when the subject is irradiated with the first narrow-band light and the second narrow-band light.
The subject includes a first preset tissue and a second preset tissue, the first narrow-band image corresponds to the first narrow-band light and the first preset tissue, and the second narrow-band image corresponds to the second narrow-band light and the second preset tissue.
S302, the image processing module processes the difference information of the first narrow-band image and the second narrow-band image in a preset processing manner to obtain a false color composite image with enhanced tissue difference features of the subject.
As an embodiment of the present invention, the preset processing manner includes a luminance unification process, a difference feature enhancement process, and a channel assignment process.
The step of processing the difference information of the first narrow-band image and the second narrow-band image by the image processing module in a preset processing manner to obtain the false color composite image with enhanced tissue difference characteristics of the subject may include:
the image processing module comprises a brightness unification processing unit, a difference feature enhancement processing unit, and a channel allocation processing unit.
And performing brightness unified processing on the first narrowband image and the second narrowband image through a brightness unified processing unit based on the initial pixel value of the first narrowband image and the initial pixel value of the second narrowband image to obtain a first image and a second image.
Wherein the first image corresponds to the first narrowband image and the second image corresponds to the second narrowband image.
And acquiring a first image after the difference features of the first preset tissue and the second preset tissue are enhanced and a second image after the difference features of the first preset tissue and the second preset tissue are enhanced by a difference feature enhancement processing unit based on the pixel values of the first image and the pixel values of the second image.
And carrying out color channel distribution on the first image subjected to the enhancement of the difference characteristic and the second image subjected to the enhancement of the difference characteristic through a channel distribution processing unit based on a preset distribution rule to obtain a false color composite image subjected to the enhancement of the tissue difference characteristic of the detected body.
As an embodiment of the present invention, the step of acquiring, by the difference feature enhancement processing unit, the first image in which the difference features of the first preset tissue and the second preset tissue are enhanced and the second image in which the difference features of the first preset tissue and the second preset tissue are enhanced based on the pixel values of the first image and the pixel values of the second image may include:
the difference feature enhancement processing unit comprises a common calculation subunit, a unique calculation subunit, and a difference feature enhancement subunit.
Obtaining, by the common calculation subunit, a common image based on the pixel values of the first image and the pixel values of the second image.
Obtaining, by the unique calculation subunit, a first unique image based on the pixel value of the first image and the common pixel value of the common image, and obtaining a second unique image based on the pixel value of the second image and the common pixel value.
Obtaining, by the difference feature enhancement subunit, the difference feature enhanced first image based on the pixel values of the first image and the unique pixel values of the second unique image, and obtaining the difference feature enhanced second image based on the pixel values of the second image and the unique pixel values of the first unique image.
As an embodiment of the present invention, the step of obtaining, by the shared calculating subunit, a shared image based on the pixel values of the first image and the pixel values of the second image may include:
by the common calculation subunit, aiming at each pixel point with the same pixel coordinate in the first image and the second image, taking the pixel value of the first image corresponding to the pixel coordinate and the pixel value with a smaller value in the pixel values of the second image corresponding to the pixel coordinate as the pixel value corresponding to the pixel coordinate to obtain the common image; or the like, or a combination thereof,
the common calculating subunit is specifically configured to, for each pixel point having the same pixel coordinate in the first image and the second image, use a weighted average of a pixel value of the first image corresponding to the pixel coordinate and a pixel value of the second image corresponding to the pixel coordinate as a pixel value corresponding to the pixel coordinate, so as to obtain the common image.
As an embodiment of the present invention, the obtaining, by the unique calculating subunit, a first unique image based on the pixel value of the first image and the shared pixel value of the shared image, and a second unique image based on the pixel value of the second image and the shared pixel value may include:
by the unique calculation subunit, regarding each pixel point with the same pixel coordinate in the first image and the common image, taking a difference value between a pixel value of the first image corresponding to the pixel coordinate and the common pixel value corresponding to the pixel coordinate as a first difference value, and taking the first difference value as a pixel value corresponding to the pixel coordinate to obtain the first unique image;
and regarding each pixel point with the same pixel coordinate in the second image and the common image, taking the difference value between the pixel value of the second image corresponding to the pixel coordinate and the common pixel value corresponding to the pixel coordinate as a second difference value, and taking the second difference value as the pixel value corresponding to the pixel coordinate to obtain the second unique image.
As an embodiment of the present invention, the obtaining, by the difference feature enhancing unit, the first image after the difference feature enhancement based on the pixel values of the first image and the unique pixel values of the second unique image, and the second image after the difference feature enhancement based on the pixel values of the second image and the unique pixel values of the first unique image may include:
regarding each pixel point with the same pixel coordinate in the first image and the second unique image, the difference value between the pixel value of the first image corresponding to the pixel coordinate and the unique pixel value of the second unique image corresponding to the pixel coordinate is used as a third difference value, and the third difference value is used as the pixel value corresponding to the pixel coordinate, so as to obtain the first image with enhanced difference characteristics;
and regarding each pixel point with the same pixel coordinate in the second image and the first unique image, taking the difference value between the pixel value of the second image corresponding to the pixel coordinate and the unique pixel value of the first unique image corresponding to the pixel coordinate as a fourth difference value, and taking the fourth difference value as the pixel value corresponding to the pixel coordinate to obtain the second image with enhanced difference characteristics.
In an embodiment of the present invention, the step of performing, by the luminance unification processing unit, luminance unification processing on the first narrowband image and the second narrowband image based on an initial pixel value of the first narrowband image and an initial pixel value of the second narrowband image to obtain a first image and a second image may comprise the following steps:
Determining, by the brightness unification processing unit, the first image based on an initial pixel value of the first narrowband image;
obtaining a pixel coefficient based on the initial pixel value of the first narrowband image and the initial pixel value of the second narrowband image;
determining the second image based on the initial pixel values and the pixel coefficients of the second narrowband image.
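The method steps above can be strung together into a single pipeline sketch (a non-normative illustration; the min-based common image, unit gains, and the omission of gamma correction and high-frequency filtering are simplifying assumptions):

```python
import numpy as np

def process(b_nb: np.ndarray, g_nb: np.ndarray,
            gains=(1.0, 1.0, 1.0)) -> np.ndarray:
    """S301/S302 pipeline: brightness unification, difference feature
    enhancement, and channel allocation into a false color composite."""
    b = b_nb.astype(np.float64)
    g = g_nb.astype(np.float64)
    g = g * (b.sum() / g.sum())      # brightness unification (G' = k*G)
    com = np.minimum(b, g)           # common image, formula (4)
    b1 = b - (g - com)               # formulas (5)-(8) combined
    g1 = g - (b - com)
    m1, m2, m3 = gains               # channel allocation
    return np.stack([m1 * g1, m2 * b1, m3 * b1], axis=-1)
```

For display, the result would typically still be clipped or normalized to the output bit depth and, per the embodiments above, gamma-corrected and high-frequency filtered.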
Fig. 4 illustrates a physical structure diagram of an electronic device, which may include, as shown in fig. 4: a processor (processor) 410, a communication Interface (Communications Interface) 420, a memory (memory) 430 and a communication bus 440, wherein the processor 410, the communication Interface 420 and the memory 430 are in communication with each other via the communication bus 440. The processor 410 may invoke logic instructions in the memory 430 to perform image processing methods of the endoscope.
In addition, the logic instructions in the memory 430 may be implemented in the form of software functional units and stored in a computer readable storage medium when the software functional units are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, and various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product comprising a computer program, the computer program being storable on a non-transitory computer readable storage medium, the computer program, when executed by a processor, being capable of executing the above-mentioned image processing method of an endoscope.
In yet another aspect, the present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, is implemented to perform an image processing method of an endoscope.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. Based on the understanding, the above technical solutions substantially or otherwise contributing to the prior art may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the various embodiments or some parts of the embodiments.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An image processing apparatus of an endoscope, comprising: the device comprises an image acquisition module and an image processing module;
the image acquisition module is configured to acquire a first narrowband image and a second narrowband image when a subject is irradiated by first narrowband light and second narrowband light, where the subject includes a first preset tissue and a second preset tissue, the first narrowband image has a corresponding relationship with the first narrowband light and the first preset tissue, and the second narrowband image has a corresponding relationship with the second narrowband light and the second preset tissue;
the image processing module is configured to process difference information of the first narrow-band image and the second narrow-band image in a preset processing manner to obtain a false color composite image with enhanced tissue difference characteristics of the subject.
2. The image processing apparatus of an endoscope according to claim 1, wherein the preset processing manner includes brightness unification processing, difference feature enhancement processing, and channel assignment processing, and the image processing module includes a brightness unification processing unit, a difference feature enhancement processing unit, and a channel assignment processing unit;
the brightness unification processing unit is configured to perform brightness unification processing on the first narrowband image and the second narrowband image based on an initial pixel value of the first narrowband image and an initial pixel value of the second narrowband image to obtain a first image and a second image, where the first image corresponds to the first narrowband image and the second image corresponds to the second narrowband image;
the difference feature enhancement processing unit is used for acquiring a first image after the difference features of the first preset tissue and the second preset tissue are enhanced and a second image after the difference features of the first preset tissue and the second preset tissue are enhanced based on the pixel values of the first image and the pixel values of the second image;
and the channel allocation processing unit is used for performing color channel allocation on the first image subjected to the difference characteristic enhancement and the second image subjected to the difference characteristic enhancement based on a preset allocation rule to obtain a false color composite image subjected to the tissue difference characteristic enhancement of the detected body.
3. The image processing apparatus of an endoscope according to claim 2, wherein said difference feature enhancement processing unit includes a common calculation subunit, a unique calculation subunit, and a difference feature enhancement subunit;
the shared calculating subunit is used for obtaining a shared image based on the pixel value of the first image and the pixel value of the second image;
the unique calculation subunit is configured to obtain a first unique image based on the pixel value of the first image and the common pixel value of the common image, and obtain a second unique image based on the pixel value of the second image and the common pixel value;
the difference feature enhancement subunit is configured to obtain the first image with enhanced difference features based on the pixel values of the first image and the unique pixel values of the second unique image, and obtain the second image with enhanced difference features based on the pixel values of the second image and the unique pixel values of the first unique image.
4. The image processing apparatus of an endoscope according to claim 3, wherein the common calculation subunit is configured to, for each pixel point having the same pixel coordinate in the first image and the second image, take the smaller of the pixel value of the first image corresponding to the pixel coordinate and the pixel value of the second image corresponding to the pixel coordinate as the pixel value corresponding to that pixel coordinate, to obtain the common image; or
the common calculation subunit is configured to, for each pixel point having the same pixel coordinate in the first image and the second image, take a weighted average of the pixel value of the first image corresponding to the pixel coordinate and the pixel value of the second image corresponding to the pixel coordinate as the pixel value corresponding to that pixel coordinate, to obtain the common image.
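The two alternatives of claim 4 can be sketched as follows in Python/NumPy; the equal 0.5/0.5 weights in the weighted-average variant are an assumption, since the claim leaves the weights open.

```python
import numpy as np

def common_image_min(first, second):
    # Variant 1: the smaller of the two pixel values at each coordinate.
    return np.minimum(first, second)

def common_image_weighted(first, second, w1=0.5, w2=0.5):
    # Variant 2: a weighted average; equal weights are assumed here.
    return w1 * first + w2 * second

first = np.array([[10.0, 200.0]])
second = np.array([[30.0, 100.0]])
common_min = common_image_min(first, second)
common_avg = common_image_weighted(first, second)
```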
5. The image processing apparatus of an endoscope according to claim 3, wherein the unique calculation subunit is configured to, for each pixel point having the same pixel coordinate in the first image and the common image, take the difference between the pixel value of the first image corresponding to the pixel coordinate and the common pixel value corresponding to the pixel coordinate as a first difference, and take the first difference as the pixel value corresponding to that pixel coordinate, to obtain the first unique image;
and, for each pixel point having the same pixel coordinate in the second image and the common image, take the difference between the pixel value of the second image corresponding to the pixel coordinate and the common pixel value corresponding to the pixel coordinate as a second difference, and take the second difference as the pixel value corresponding to that pixel coordinate, to obtain the second unique image.
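Claim 5 then reduces to two pixel-wise subtractions; with the minimum-based common image of claim 4, the unique images are non-negative. A minimal sketch:

```python
import numpy as np

def unique_images(first, second, common):
    # First/second unique image: each brightness-unified image minus
    # the common image, pixel by pixel.
    return first - common, second - common

first = np.array([[10.0, 200.0]])
second = np.array([[30.0, 100.0]])
common = np.minimum(first, second)  # common image per claim 4, variant 1
u_first, u_second = unique_images(first, second, common)
```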
6. The image processing apparatus of an endoscope according to claim 3, wherein the difference feature enhancement subunit is configured to, for each pixel point having the same pixel coordinate in the first image and the second unique image, take the difference between the pixel value of the first image corresponding to the pixel coordinate and the unique pixel value of the second unique image corresponding to the pixel coordinate as a third difference, and take the third difference as the pixel value corresponding to that pixel coordinate, to obtain the first image with enhanced difference features;
and, for each pixel point having the same pixel coordinate in the second image and the first unique image, take the difference between the pixel value of the second image corresponding to the pixel coordinate and the unique pixel value of the first unique image corresponding to the pixel coordinate as a fourth difference, and take the fourth difference as the pixel value corresponding to that pixel coordinate, to obtain the second image with enhanced difference features.
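Claim 6 is again pixel-wise subtraction: each image is reduced by the other image's unique component, suppressing structures specific to the other band. Note the third and fourth differences can go negative; a real implementation would presumably clip or rescale, which the claim does not address. A sketch:

```python
import numpy as np

def enhance_difference(first, second, u_first, u_second):
    # Third difference: first image minus the second unique image.
    enhanced_first = first - u_second
    # Fourth difference: second image minus the first unique image.
    enhanced_second = second - u_first
    return enhanced_first, enhanced_second

first = np.array([[10.0, 200.0]])
second = np.array([[30.0, 100.0]])
common = np.minimum(first, second)
e_first, e_second = enhance_difference(
    first, second, first - common, second - common
)
```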
7. The image processing apparatus of an endoscope according to any one of claims 2 to 6, wherein the brightness unification processing unit is configured to: determine the first image based on the initial pixel values of the first narrow-band image;
obtain a pixel coefficient based on the initial pixel values of the first narrow-band image and the initial pixel values of the second narrow-band image; and
determine the second image based on the initial pixel values of the second narrow-band image and the pixel coefficient.
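Claim 7 leaves the derivation of the pixel coefficient open; one plausible reading is a global gain equal to the ratio of mean initial pixel values, so that both images end up at the same average brightness. A sketch under that assumption:

```python
import numpy as np

def unify_brightness(first_nb, second_nb):
    # First image: the first narrow-band image, taken as-is.
    first = first_nb.astype(np.float64)
    second = second_nb.astype(np.float64)
    # Pixel coefficient: assumed here to be the ratio of mean initial
    # pixel values (the claim does not fix the exact formula).
    coeff = first.mean() / second.mean()
    # Second image: the second narrow-band image scaled by the coefficient.
    return first, second * coeff

nb_first = np.array([[100, 100]], dtype=np.uint8)
nb_second = np.array([[40, 60]], dtype=np.uint8)
img_first, img_second = unify_brightness(nb_first, nb_second)
```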
8. An image processing method of an endoscope, which is applied to the image processing apparatus of the endoscope according to any one of claims 1 to 7, the method comprising:
acquiring, by an image acquisition module, a first narrow-band image and a second narrow-band image under the condition that a subject is irradiated with first narrow-band light and second narrow-band light, wherein the subject includes a first preset tissue and a second preset tissue, the first narrow-band image corresponds to the first narrow-band light and the first preset tissue, and the second narrow-band image corresponds to the second narrow-band light and the second preset tissue;
and processing, by an image processing module, difference information of the first narrow-band image and the second narrow-band image in a preset processing manner to obtain a false color composite image with enhanced tissue difference features of the subject.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the image processing method of the endoscope as claimed in claim 8 when executing the computer program.
10. A non-transitory computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the image processing method of the endoscope according to claim 8.
CN202211504982.9A 2022-11-28 2022-11-28 Image processing device and method for endoscope, electronic device, and storage medium Active CN115731205B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211504982.9A CN115731205B (en) 2022-11-28 2022-11-28 Image processing device and method for endoscope, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN115731205A true CN115731205A (en) 2023-03-03
CN115731205B CN115731205B (en) 2024-04-26

Family

ID=85298850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211504982.9A Active CN115731205B (en) 2022-11-28 2022-11-28 Image processing device and method for endoscope, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN115731205B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007111357A (en) * 2005-10-21 2007-05-10 Olympus Medical Systems Corp Living body imaging apparatus and living body observing system
KR20080104191A (en) * 2006-04-12 2008-12-01 올림푸스 메디칼 시스템즈 가부시키가이샤 Endoscope device
US20100331624A1 (en) * 2008-10-17 2010-12-30 Olympus Medical Systems Corp. Endoscope system and endoscopic image processing apparatus
CN102727158A (en) * 2011-04-01 2012-10-17 富士胶片株式会社 Endoscope system and calibration method
CN204379226U (en) * 2014-12-19 2015-06-10 佛山市南海区欧谱曼迪科技有限责任公司 A kind of Narrow-Band Imaging endoscope apparatus
CN105380587A (en) * 2014-09-03 2016-03-09 Hoya株式会社 Image capturing system and electronic endoscope system
JP2016174976A (en) * 2016-06-29 2016-10-06 富士フイルム株式会社 Endoscope system
US20170215711A1 (en) * 2016-01-29 2017-08-03 Sony Olympus Medical Solutions Inc. Medical imaging device, medical image acquisition system, and endoscope apparatus
CN107625513A (en) * 2017-09-30 2018-01-26 华中科技大学 Enhancing shows Narrow-Band Imaging endoscopic system and its imaging method
US20190008361A1 (en) * 2016-03-18 2019-01-10 Fujifilm Corporation Endoscopic system and method of operating same
US20190021580A1 (en) * 2016-03-29 2019-01-24 Fujifilm Corporation Image processing apparatus, method for operating image processing apparatus, and image processing program
CN109584210A (en) * 2018-10-30 2019-04-05 南京理工大学 Multispectral three-dimensional vein imaging system
US20210088772A1 (en) * 2018-06-19 2021-03-25 Olympus Corporation Endoscope apparatus, operation method of endoscope apparatus, and information storage media
CN114298944A (en) * 2021-12-30 2022-04-08 上海闻泰信息技术有限公司 Image enhancement method, device, equipment and storage medium
CN114391792A (en) * 2021-09-13 2022-04-26 南京诺源医疗器械有限公司 Tumor prediction method and device based on narrow-band imaging and imaging endoscope
CN114532960A (en) * 2022-02-18 2022-05-27 郑州市中医院(郑州市红十字医院) Ureter endoscopic equipment and system based on image enhancement

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIAOQI LIU ET AL.: "TRANSFER LEARNING WITH CONVOLUTIONAL NEURAL NETWORK FOR EARLY GASTRIC CANCER CLASSIFICATION ON MAGNIFIYING NARROW-BAND IMAGING", ICIP 2018, 31 December 2018, pages 1388-1392 *

Also Published As

Publication number Publication date
CN115731205B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
US11769265B2 (en) Skin assessment using image fusion
JP5242381B2 (en) Medical image processing apparatus and medical image processing method
US8027533B2 (en) Method of automated image color calibration
US8837821B2 (en) Image processing apparatus, image processing method, and computer readable recording medium
JP6346576B2 (en) Image processing device
US7454046B2 (en) Method and system for analyzing skin conditions using digital images
US9364147B2 (en) System, method and device for automatic noninvasive screening for diabetes and pre-diabetes
JP6584090B2 (en) Image processing device
US20190133513A1 (en) Enhancing pigmentation in dermoscopy images
JP2007190364A (en) Image processing method and apparatus
JP6210962B2 (en) Endoscope system, processor device, operation method of endoscope system, and operation method of processor device
US20160089011A1 (en) Endoscope system, processor device, and method for operating endoscope system
JP2012005512A (en) Image processor, endoscope apparatus, endoscope system, program, and image processing method
CN112469324B (en) Endoscope system
US10356378B2 (en) Image processing device, image processing method, and computer-readable recording medium
WO2018235179A1 (en) Image processing device, endoscope device, method for operating image processing device, and image processing program
EP4216808A1 (en) Acne severity grading methods and apparatuses
CN115736791B (en) Endoscopic imaging device and method
CN113038868A (en) Medical image processing system
CN115731205B (en) Image processing device and method for endoscope, electronic device, and storage medium
JP7163386B2 (en) Endoscope device, method for operating endoscope device, and program for operating endoscope device
JPH01138876A (en) Color picture processing unit
CN111784664B (en) Method for generating distribution map of tumor lymph nodes
JP2015006398A (en) Image processing device, image processing method, and program
JP2594896B2 (en) Image coloring method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant