CN117011155A - Image processing method of polarized light endoscope and computer readable storage medium - Google Patents


Info

Publication number
CN117011155A
Authority
CN
China
Prior art keywords
information
polarized light
polarization
image
polarization degree
Prior art date
Legal status
Pending
Application number
CN202210474252.2A
Other languages
Chinese (zh)
Inventor
郭毅军
黄潇峰
唐豪
Current Assignee
Chongqing Xishan Science and Technology Co Ltd
Original Assignee
Chongqing Xishan Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Xishan Science and Technology Co Ltd filed Critical Chongqing Xishan Science and Technology Co Ltd
Priority to CN202210474252.2A
Publication of CN117011155A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/40 - Image enhancement or restoration using histogram techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/20 - Image enhancement or restoration using local operators
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10068 - Endoscopic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20021 - Dividing image into blocks, subimages or windows

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Endoscopes (AREA)

Abstract

Embodiments of the present application provide an image processing method of a polarized light endoscope and a computer-readable storage medium, the method comprising: acquiring a plurality of initial images, wherein the plurality of initial images are obtained based on polarized light at a plurality of polarization angles acquired by a polarized light sensor of a polarized light endoscope; determining light intensity information and synthetic polarization degree information of the environment where the polarized light endoscope is located according to the plurality of initial images; determining a plurality of pieces of polarized intensity information according to the light intensity information and the synthetic polarization degree information; and generating a target synthetic image according to the polarized intensity information, preset polarization degree information of the target object and preset polarization degree information of the background, wherein the target synthetic image is used for representing polarization imaging of the environment where the polarized light endoscope is located. The method makes the target object in the generated target synthetic image clearer.

Description

Image processing method of polarized light endoscope and computer readable storage medium
Technical Field
The present application relates to the technical field of endoscopes, and in particular, to an image processing method of a polarized light endoscope and a computer readable storage medium.
Background
The endoscope is an important auxiliary tool in the medical field. It can enter the human body and its viscera through a natural orifice or a minimally invasive surgical incision to image lesion tissues that cannot be observed directly with other instruments. Existing endoscopes can be divided into 4K endoscopes, fluorescence endoscopes, narrow-band light endoscopes, 3D endoscopes and the like, each of which achieves good functionality and performance by means of its own working principle. However, these endoscopes struggle to image clearly in a surgical environment containing blood, turbid water, tissue debris and the like, which degrades the imaging result. The polarized light endoscope can better solve this problem: a polarized light sensor is arranged inside it, so polarization imaging can be realized, and polarization imaging can remove the influence of blood, turbid water, tissue debris and the like from the image.
However, in the existing imaging method of the polarized light endoscope, the object is not clearly distinguished from the background, which affects the definition of the object in the imaging result.
Disclosure of Invention
The present application aims to overcome the above defects in the prior art and provides an image processing method for a polarized light endoscope and a computer-readable storage medium, which perform imaging by exploiting the difference in polarization information between the target object and the background, so that the definition of the target object in the imaging result is greatly improved.
In a first aspect, an embodiment of the present application provides an image processing method of a polarized light endoscope, including:
acquiring a plurality of initial images, wherein the plurality of initial images are correspondingly obtained based on polarized light of a plurality of polarized angles acquired by a polarized light sensor of a polarized light endoscope;
determining light intensity information and synthetic polarization degree information of the environment where the polarized light endoscope is positioned according to the plurality of initial images;
determining a plurality of polarized intensity information according to the light intensity information and the synthesized polarization degree information;
and generating a target synthetic image according to the polarization intensity information, the preset polarization degree information of the target object and the preset polarization degree information of the background, wherein the target synthetic image is used for representing polarization imaging of the environment where the polarized light endoscope is positioned.
According to the method, light intensity information and synthetic polarization degree information of the environment where the polarized light endoscope is located can be determined from a plurality of initial images acquired at a plurality of polarization angles, and a plurality of pieces of polarized intensity information can be determined from the light intensity information and the synthetic polarization degree information. From the polarized intensity information, the correlation among the transmitted intensities in the current environment can be analyzed, and imaging processing is performed using this correlation together with the polarization degree information of the target object and the polarization degree information of the background obtained in advance. Because the imaging process fully considers the difference in polarization degree between the target object and the background and the correlation of the polarization degrees in multiple directions in the actual environment, the target object is more prominent relative to the background, the target object in the generated target synthetic image is clearer, and the definition of the target object in the target synthetic image is greatly improved.
As an optional implementation manner, the determining, according to the plurality of initial images, the light intensity information and the synthetic polarization degree information of the environment where the polarized light endoscope is located includes:
determining light intensity information of the environment where the polarized light endoscope is located and polarized light components in at least one direction according to the plurality of initial images;
and determining the synthetic polarization degree information according to the light intensity information and the polarized light component in the at least one direction.
In the method, the synthetic polarization degree information is determined using both the light intensity information of the environment where the polarized light endoscope is located and the polarized light component in at least one direction, so the determined synthetic polarization degree information can more accurately represent the overall polarization degree of the environment where the polarized light endoscope is located.
As an optional implementation manner, the determining, according to the plurality of initial images, the light intensity information of the environment where the polarized light endoscope is located and the polarized light component in at least one direction includes:
determining a Stokes vector by taking the plurality of initial images as input parameters;
obtaining the light intensity information and the polarized light component in the at least one direction based on the Stokes vector;
Wherein the polarized light component in the at least one direction comprises at least one of: an X-axis direction linear polarized light component, a 45-degree direction linear polarized light component, and a circular polarized light component.
In the method, the light intensity information and the polarized light components in at least one direction are obtained by calculating the Stokes vector, so that the accuracy of the obtained light intensity information and the accuracy of each polarized light component can be ensured.
As an alternative implementation manner, the determining the synthetic polarization degree information according to the light intensity information and the polarized light component in the at least one direction includes:
and determining the synthetic polarization degree information according to the light intensity information, the linear polarized light component in the X-axis direction, the linear polarized light component in the 45-degree direction and the circular polarized component.
In the method, the synthetic polarization degree information is calculated by using the polarized light components in the three directions and the light intensity information, and the calculated synthetic polarization degree information can be ensured to be more accurate because the polarized light components represented by the polarized light components in the three directions are more complete.
As an alternative implementation manner, the determining the synthetic polarization degree information according to the light intensity information and the polarized light component in the at least one direction includes:
And determining the synthetic polarization degree information according to the light intensity information, the X-axis direction linear polarized light component and the 45-degree direction linear polarized light component.
In the method, the circular polarization component is ignored, and the synthesized polarization degree information is calculated by only using the linear polarization light component in the X-axis direction and the linear polarization light component in the 45-degree direction, so that the calculation complexity can be reduced and the calculation efficiency can be improved on the premise of not affecting the accuracy of the calculation result.
As an optional implementation manner, the determining a plurality of polarized intensity information according to the light intensity information and the synthesized polarization degree information includes:
and respectively determining minimum brightness polarization information and maximum brightness polarization information according to the synthesized polarization degree information and the light intensity information.
In the method, the minimum brightness polarization information and the maximum brightness polarization information are determined, so that a target composite image with higher target object definition can be obtained by using the minimum brightness polarization information and the maximum brightness polarization information later.
As an optional implementation manner, the generating a target composite image according to the multiple pieces of polarization intensity information, the preset polarization degree information of the target object, and the preset polarization degree information of the background includes:
determining first parameter information according to the minimum brightness polarization information and the polarization degree information of the background, wherein the first parameter information is used for representing the degree of interaction between the polarization degree of the background and the minimum brightness polarization information;
determining second parameter information according to the maximum brightness polarization information and the polarization degree information of the background, wherein the second parameter information is used for representing the degree of interaction between the polarization degree of the background and the maximum brightness polarization information;
determining third parameter information according to the polarization degree information of the background and the polarization degree information of the target object, wherein the third parameter information is used for representing the difference between the polarization degree information of the background and the polarization degree information of the target object;
and generating the target synthetic image according to the first parameter information, the second parameter information and the third parameter information.
In the method, the degree of interaction between the background polarization degree and the minimum and maximum brightness polarization information, as well as the difference between the polarization degree information of the background and the polarization degree information of the target object, can be calculated by using the maximum brightness polarization information, the minimum brightness polarization information, the polarization degree information of the target object and the polarization degree information of the background, and a target synthetic image with a clearer target object can be obtained by using these three pieces of information.
As an optional implementation manner, after the generating the target composite image according to the plurality of polarized intensity information, the preset polarization degree information of the target object, and the preset polarization degree information of the background, the method further includes:
determining transmittance information and global stray light information of the target synthetic image;
and carrying out haze removal treatment on the target synthetic image according to the transmittance information and the global stray light information to obtain a haze-removed image.
In the method, the target synthetic image is subjected to haze removal treatment by utilizing the transmittance information and the global stray light information, so that the definition of the haze-removed image can be greatly improved.
As an optional implementation manner, the determining the transmittance information and the global stray light information of the target composite image includes:
gradient-guided filtering is carried out on the target synthetic image, so that the transmittance information is obtained;
determining a minimum value map of the target synthetic image;
extracting a pixel value of a target pixel point from the minimum value map, and taking the pixel value as the global stray light information, wherein the target pixel point comprises: the pixel point with the maximum brightness value in the minimum value map.
As an optional implementation manner, after the performing haze removal processing on the target synthetic image according to the transmittance information and the global stray light information to obtain a haze-removed image, the method further includes:
dividing the haze-removed image into a plurality of image blocks;
and respectively carrying out pixel value equalization processing and bilinear interpolation processing on each image block to obtain an enhanced image composed of the image blocks subjected to the pixel value equalization processing and the bilinear interpolation processing.
As an optional implementation manner, after the pixel value equalization processing and the bilinear interpolation processing are performed on each image block respectively to obtain an enhanced image composed of the image blocks after the pixel value equalization processing and the bilinear interpolation processing, the method further includes:
and inputting the enhanced image into a pre-trained color recovery model, and performing color correction on a target object and/or a background in the enhanced image by using the color recovery model to obtain a corrected image.
In a second aspect, an embodiment of the present application provides an image processing apparatus of a polarized light endoscope, including:
the acquisition module is used for acquiring a plurality of initial images, wherein the initial images are correspondingly obtained based on polarized light of a plurality of polarized angles acquired by a polarized light sensor of the polarized light endoscope;
The first determining module is used for determining light intensity information and synthetic polarization degree information of the environment where the polarized light endoscope is positioned according to the plurality of initial images;
the second determining module is used for determining a plurality of polarized intensity information according to the light intensity information and the synthesized polarization degree information;
the generation module is used for generating a target synthetic image according to the polarization intensity information, the preset polarization degree information of the target object and the preset polarization degree information of the background, wherein the target synthetic image is used for representing polarization imaging of the environment where the polarized light endoscope is located.
In a third aspect, an embodiment of the present application provides an image processing apparatus including: a processor and a memory storing machine readable instructions executable by the processor, the processor executing the machine readable instructions when the image processing apparatus is running to perform the steps of the image processing method of a polarized light endoscope of the first aspect described above.
In a fourth aspect, an embodiment of the present application provides an endoscopic imaging system, which may include a polarized light endoscope and the image processing apparatus described in the third aspect, where the polarized light endoscope is communicatively connected to the image processing apparatus.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor performs the steps of the image processing method of a polarized light endoscope according to the first aspect described above.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a system architecture of an image processing method of a polarized light endoscope according to an embodiment of the present application;
FIG. 2 is a flow chart of an image processing method of a polarized light endoscope according to an embodiment of the present application;
FIG. 3 is another flow chart of an image processing method of a polarized light endoscope according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of another embodiment of a method for processing images of a polarized light endoscope;
FIG. 5 is a schematic flow chart of an image processing method of a polarized light endoscope according to an embodiment of the present application;
FIG. 6 is an exemplary diagram of an overall process flow according to an embodiment of the present application;
FIG. 7 is a block diagram of an image processing apparatus of a polarized light endoscope according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an image processing apparatus 80 according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for the purpose of illustration and description only and are not intended to limit the scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this disclosure, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to or removed from the flow diagrams by those skilled in the art under the direction of the present disclosure.
In addition, the described embodiments are only some, but not all, embodiments of the application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that the term "comprising" will be used in embodiments of the application to indicate the presence of the features stated hereafter, but not to exclude the addition of other features.
The imaging method of the polarized light endoscope in the related art does not explicitly distinguish the target object from the background. Here, the target object may refer to lesion tissue, and the background may include the cavity around the lesion tissue, liquid in the cavity, and the like. Because there is no explicit distinction between the target object and the background, the same parameters and manner are used in the imaging process for both. However, different objects do not reflect incident light in the same way, and therefore imaging the target object and the background with the same parameters and manner affects the definition of the target object in the imaging result.
Based on the above problems, the embodiment of the application provides an image processing method of a polarized light endoscope, which fully considers the difference of the object and the background in the polarization degree, and performs imaging processing based on the polarization degree corresponding to the object, the polarization degree corresponding to the background and a plurality of polarization intensity information, so that the definition of the object in an imaging result can be greatly improved.
Fig. 1 is a schematic system architecture diagram of an image processing method of a polarized light endoscope according to an embodiment of the present application. As shown in fig. 1, the method may be applied to an endoscope imaging system, where the endoscope imaging system may include a polarized light endoscope and an image processing apparatus. The polarized light endoscope and the image processing apparatus can be communicatively connected in a wired or wireless manner. The polarized light endoscope may extend into the interior of the target object, and a polarized light sensor is arranged inside the polarized light endoscope. When the endoscope extends into the interior of the target object, the polarized light sensor can collect polarized light signals in the internal environment of the target object and convert them into electrical signals, so as to obtain an initial image. Illustratively, the polarized light endoscope is provided with an optical prism and an optical lens in addition to the polarized light sensor. The optical lens provides a light path through which the white light generated by the light source irradiates the internal environment of the target object; the white light irradiating the internal environment of the target object forms return light, the return light is guided to the optical prism by the light path formed by the optical lens, and the optical prism transmits the return light into the light field range of the polarized light sensor, so that the polarized light sensor collects the polarized light signals and converts them into the initial image. The polarized light sensor can then transmit the initial image to the image processing apparatus, which performs imaging processing by using the method of the embodiment of the application to obtain a target composite image, and then performs haze removal processing and the like to obtain an output image. The image processing apparatus may be, for example, a device including a field programmable gate array (Field Programmable Gate Array, abbreviated as FPGA) and a graphics processing unit (Graphics Processing Unit, abbreviated as GPU). For example, in the following embodiments, the initial images sent by the polarized light sensor may be processed by the FPGA to obtain the target composite image, and the image dehazing processing, enhancement processing, color correction processing and the like may be performed by the GPU to obtain the output image.
In one example, the target object may be, for example, a lesion tissue (or referred to as focal tissue) in a human body, and the internal environment of the target object may be, for example, an abdominal cavity of a human body, where the abdominal cavity includes a background such as blood, turbid water, and tissue dust. When the polarized light endoscope penetrates into the abdominal cavity of a person, the polarized light sensor can collect and convert the polarized light signals to obtain an initial image containing lesion tissues and the background in the abdominal cavity.
Fig. 2 is a schematic flow chart of an image processing method of a polarized light endoscope according to an embodiment of the present application, and an execution subject of the method may be the aforementioned image processing apparatus. As shown in fig. 2, the method includes:
s201, acquiring a plurality of initial images, wherein the plurality of initial images are correspondingly obtained based on polarized light with a plurality of polarized angles acquired by a polarized light sensor of a polarized light endoscope.
Alternatively, as described above, the polarized light sensor can collect a polarized light signal in the internal environment of the target object and convert the polarized light signal into an electrical signal. The front end of the polarized light sensor is provided with a polarizer, which is a rotatable optical device and can therefore correspond to a plurality of polarization angles. The rotation angle of the polarizer of the polarized light sensor may range from 0 degrees to 360 degrees. A certain angle among the rotation angles may be preset as the initial angle; with reference to the initial angle, the clockwise direction may be taken as the forward direction and the counterclockwise direction as the reverse direction. For example, when the polarizer is rotated 30 degrees clockwise from the initial angle, the polarization angle is 30 degrees, and collecting polarized light at the rotated angle can be regarded as collecting polarized light at a polarization angle of 30 degrees. When the polarizer is rotated 30 degrees counterclockwise from the initial angle, the polarization angle is 360 degrees minus 30 degrees, i.e., 330 degrees, and collecting polarized light at the rotated angle can be regarded as collecting polarized light at a polarization angle of 330 degrees.
As an example, in the embodiment of the present application, polarized light of four polarization angles of 0 degrees, 45 degrees, 90 degrees and 135 degrees may be collected by a polarized light sensor and converted into electrical signals, so as to obtain four initial images, where each initial image corresponds to one polarization angle of four polarization angles of 0 degrees, 45 degrees, 90 degrees and 135 degrees.
S202, determining light intensity information and synthetic polarization degree information of the environment where the polarized light endoscope is located according to the plurality of initial images.
Alternatively, the environment of the polarized light endoscope refers to the internal environment of the target object, such as the abdominal cavity of a person. The abdominal cavity contains pathological tissues, blood, turbid water, tissue scraps and the like. Among them, the lesion tissue is the target in the following examples, and blood, turbid water, tissue dust, and the like other than the target are used as the background in the following examples.
Alternatively, since the plurality of initial images are images of different polarization angles, light intensity information and synthetic polarization degree information of the environment in which the polarized light endoscope is located can be determined based on the initial images. The light intensity information may indicate the total light intensity of the environment in which the polarized light endoscope is located. The resultant polarization degree information may be indicative of the overall polarization degree of the environment in which the polarized light endoscope is located.
Alternatively, the polarization degree of light represents the proportion of the polarized light in a beam of light to the whole light energy, and the polarization degree can be calculated by the following formula (1):
DOP = P_polarized / P_total = P_polarized / (P_polarized + P_unpolarized) (1)
Wherein P_polarized represents the intensity of the polarized light in a beam of light, in mW; P_unpolarized represents the intensity of the unpolarized light in the beam, in mW; P_total represents the total light intensity, in mW; and DOP represents the polarization degree of the light.
Optionally, the polarization states of different polarization angles can be obtained by analyzing the plurality of initial images, and the synthetic polarization degree can be obtained by fusion processing of the polarization states of different polarization angles.
S203, determining a plurality of polarized intensity information according to the light intensity information and the synthesized polarization degree information.
Alternatively, the polarized intensity information may indicate a transmitted intensity of polarized light, and the plurality of polarized intensity information may represent a plurality of polarized images in different directions.
For example, the plurality of polarized intensity information may include maximum brightness polarization information and minimum brightness polarization information. It should be understood that, under the same illumination intensity, different polarization angles yield polarization information of different brightness. The polarization information with the largest brightness is called the maximum brightness polarization information, i.e., the polarization information with the relatively best definition among all polarization directions, and is therefore also called the optimal polarization information; conversely, the polarization information with the smallest brightness is called the minimum brightness polarization information, i.e., the polarization information with the relatively worst definition among all polarization directions, and is also called the worst polarization information.
S204, generating a target composite image according to the plurality of polarized intensity information, the preset target object polarized degree information and the preset background polarized degree information, wherein the target composite image is used for representing polarized imaging of the environment where the polarized light endoscope is located.
Alternatively, the polarization degree information of the target object and the polarization degree information of the background may be verified in advance through simulation, experiment, and the like.
Optionally, the plurality of polarized intensity information represents a plurality of polarization images of the polarized light endoscope in different directions in the current environment. Using these polarization images in different directions, the correlation among the various transmitted intensities in the current environment can be analyzed, and by using this correlation together with the polarization degree information of the target object and the polarization degree information of the background, the target object can be made more prominent relative to the background, so that the target object in the generated target synthetic image is correspondingly clearer.
In this embodiment, according to a plurality of initial images obtained at a plurality of polarization angles, light intensity information and synthetic polarization degree information of the environment where the polarized light endoscope is located can be determined, and a plurality of pieces of polarized intensity information can be determined according to the light intensity information and the synthetic polarization degree information. From the polarized intensity information, the correlation among the transmitted intensities in the current environment can be analyzed, and imaging processing is performed using this correlation together with the polarization degree information of the target object and the polarization degree information of the background obtained in advance. Because the imaging process fully considers the difference in polarization degree between the target object and the background and the correlation of the various transmitted intensities in the actual environment, the target object is more prominent relative to the background, the target object in the generated target synthetic image is correspondingly clearer, and the definition of the target object in the target synthetic image is greatly improved.
As an alternative embodiment, the above step S202 may be implemented as follows.
Fig. 3 is another flow chart of an image processing method of a polarized light endoscope according to an embodiment of the present application, as shown in fig. 3, the step S202 may include:
s301, determining light intensity information of the environment where the polarized light endoscope is located and polarized light components in at least one direction according to the plurality of initial images.
Optionally, each of the plurality of initial images is an image at one polarization angle, and the polarization angles corresponding to the initial images may be different from each other and may be uniformly distributed. For example, the polarization angles may include 0 degrees, 45 degrees, 90 degrees and 135 degrees as described above. The light intensity information and the polarized light component in the at least one direction may be determined using the plurality of initial images corresponding to the plurality of polarization angles. The at least one direction may be one or more of the directions indicated by the plurality of polarization angles, and/or may be a direction other than the directions indicated by the plurality of polarization angles. It should be understood that, in the embodiments of the present application, the polarization angle corresponding to an initial image refers to the polarization angle at which the polarized light was collected and converted to obtain that initial image; correspondingly, that initial image is the one collected and converted at that polarization angle.
S302, determining the synthetic polarization degree information according to the light intensity information and the polarized light component in the at least one direction.
Alternatively, each of the polarized light components in the at least one direction may represent a polarized light component in one direction, and the synthetic polarization degree information may be determined by fusion analysis processing of the polarized light components in the directions. That is, by fusing the polarized light components in at least one direction, the overall degree of polarization of the environment in which the polarized light endoscope is located can be analyzed.
In this embodiment, because the light intensity information of the environment where the polarized light endoscope is located and the synthesized polarization degree information of the polarized light component in at least one direction are used, the determined synthesized polarization degree information can more accurately represent the overall polarization degree of the environment where the polarized light endoscope is located.
As an alternative embodiment, when determining the light intensity information of the environment in which the polarized light endoscope is located and the polarized light component in at least one direction in step S301, the determination may be performed by calculating the Stokes vector, as explained in detail below.
Alternatively, a Stokes vector may be determined with the above-mentioned plurality of initial images as input parameters; the Stokes vector itself is four-dimensional, i.e., it comprises four parameters. Further, the light intensity information and the polarized light component in the at least one direction are obtained based on the Stokes vector.
Wherein the polarized light component in the at least one direction comprises at least one of: an X-axis direction linear polarized light component, a 45-degree direction linear polarized light component, and a circular polarized light component.
For example, if the polarization angles corresponding to the initial images are 0 degrees, 45 degrees, 90 degrees and 135 degrees, the four parameters of the Stokes vector can be calculated by the following formulas (2) to (5) and used as the light intensity information and the polarized light components in the respective directions.
I = I_0° + I_90° (2)
Q = I_0° - I_90° (3)
U = I_45° - I_135° (4)
V = I_45°,π/2 - I_135°,π/2 (5)
Wherein, in the above formulas (2) to (5), I_0° represents the initial image corresponding to a polarization angle of 0 degrees, I_90° represents the initial image corresponding to a polarization angle of 90 degrees, I_45° represents the initial image corresponding to a polarization angle of 45 degrees, and I_135° represents the initial image corresponding to a polarization angle of 135 degrees (I_45°,π/2 and I_135°,π/2 denote the corresponding measurements used for the circular component). I represents the light intensity information described above, Q represents the X-axis direction linearly polarized light component, U represents the 45-degree direction linearly polarized light component, and V represents the circularly polarized light component.
In this embodiment, the accuracy of the obtained light intensity information and each polarized light component can be ensured by calculating the stokes vector to obtain the light intensity information and the polarized light component in at least one direction.
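For illustration only (this sketch is not part of the patent text), the per-pixel Stokes parameters of formulas (2) to (5) could be computed from the four initial images roughly as follows; the function name, the grayscale floating-point input assumption, and the optional quarter-wave-plate inputs for the circular component are assumptions of this sketch.

```python
import numpy as np

def stokes_from_initial_images(i_0, i_45, i_90, i_135, i_45_qwp=None, i_135_qwp=None):
    """Per-pixel Stokes parameters from initial images captured at polarization
    angles of 0, 45, 90 and 135 degrees (formulas (2)-(5)).  The optional
    quarter-wave-plate measurements are only needed for the circular component V;
    without them V is returned as zeros (illustrative assumption)."""
    i_0, i_45, i_90, i_135 = (np.asarray(x, dtype=np.float64) for x in (i_0, i_45, i_90, i_135))
    I = i_0 + i_90      # light intensity information, formula (2)
    Q = i_0 - i_90      # X-axis direction linear component, formula (3)
    U = i_45 - i_135    # 45-degree direction linear component, formula (4)
    if i_45_qwp is not None and i_135_qwp is not None:
        V = np.asarray(i_45_qwp, np.float64) - np.asarray(i_135_qwp, np.float64)  # circular component, formula (5)
    else:
        V = np.zeros_like(I)
    return I, Q, U, V
```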
On the basis of the light intensity information and the polarized light components in at least one direction obtained from the Stokes vector, the synthetic polarization degree information in step S302 may be determined using all four parameters of the Stokes vector, or using only a part of the four parameters. The two modes are described below.
In the first aspect, the synthetic polarization degree information may be determined based on the light intensity information, the X-axis direction linear polarization component, the 45-degree direction linear polarization component, and the circular polarization component.
In this embodiment, the synthetic polarization degree information is determined using all four parameters of the Stokes vector. The synthetic polarization degree information can be calculated, for example, by the following formula (6):
P = √(Q² + U² + V²) / I (6)
Wherein I represents the light intensity information described above, Q represents the X-axis direction linearly polarized light component, U represents the 45-degree direction linearly polarized light component, V represents the circularly polarized light component, and P represents the synthetic polarization degree information.
In this manner, the synthesized polarization degree information is calculated using the polarized light components in the three directions and the light intensity information, and since the polarized light components represented by the polarized light components in the three directions are more complete, it is possible to ensure that the calculated synthesized polarization degree information is more accurate.
In the second aspect, the synthetic polarization degree information may be determined based on the light intensity information, the X-axis direction linear polarization component, and the 45-degree direction linear polarization component.
In this embodiment, the synthetic polarization degree information is determined using three of the four parameters of the Stokes vector. The synthetic polarization degree information can be calculated, for example, by the following formula (7):
P = √(Q² + U²) / I (7)
Wherein I represents the light intensity information described above, Q represents the X-axis direction linearly polarized light component, U represents the 45-degree direction linearly polarized light component, and P represents the synthetic polarization degree information.
Because the circularly polarized component of the light returned by the underwater target object and of the background scattered light is extremely small, this mode ignores the circular polarization component and calculates the synthetic polarization degree information using only the X-axis direction linearly polarized light component and the 45-degree direction linearly polarized light component, which reduces the computational complexity and improves the computational efficiency without affecting the accuracy of the calculation result.
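As a minimal sketch of formulas (6) and (7) (the epsilon guard against division by zero is an added assumption, and the Stokes arrays are assumed to come from the previous sketch):

```python
import numpy as np

def synthetic_polarization_degree(I, Q, U, V=None, eps=1e-6):
    """Synthetic polarization degree information.  With V supplied this follows
    formula (6); with V omitted it follows the simplified formula (7) that
    ignores the circularly polarized component."""
    squared = Q ** 2 + U ** 2 if V is None else Q ** 2 + U ** 2 + V ** 2
    P = np.sqrt(squared) / np.maximum(I, eps)   # avoid division by zero (assumption)
    return np.clip(P, 0.0, 1.0)                 # a degree of polarization lies in [0, 1]
```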
A method of determining a plurality of pieces of polarization intensity information from the light intensity information and the synthesized polarization degree information in the aforementioned step S203 will be described below.
Alternatively, the minimum luminance polarization information and the maximum luminance polarization information may be determined according to the above-described synthetic polarization degree information and the above-described light intensity information, respectively.
The minimum luminance polarization information and the maximum luminance polarization information may be calculated, for example, by the following formulas (8) and (9):
Imax = I × (1 + P) / 2 (8)
Imin = I × (1 - P) / 2 (9)
Wherein, in the above formulas (8) and (9), I represents the above light intensity information, P represents the synthetic polarization degree information, Imax represents the maximum luminance polarization information, and Imin represents the minimum luminance polarization information.
In this embodiment, by determining the minimum luminance polarization information and the maximum luminance polarization information, the target composite image with higher target object definition can be obtained by using the minimum luminance polarization information and the maximum luminance polarization information later.
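A one-line sketch of the reconstructed formulas (8) and (9), assuming I and P are the arrays produced by the previous sketches:

```python
def brightness_polarization_information(I, P):
    """Maximum and minimum brightness polarization information,
    Imax = I*(1+P)/2 and Imin = I*(1-P)/2 (formulas (8) and (9))."""
    return I * (1.0 + P) / 2.0, I * (1.0 - P) / 2.0
```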
After the minimum luminance polarization information and the maximum luminance polarization information are determined, the target composite image may be generated using the minimum luminance polarization information and the maximum luminance polarization information, as described below.
Fig. 4 is a schematic flow chart of another image processing method of a polarized light endoscope according to an embodiment of the present application, as shown in fig. 4, the step S204 may include:
s401, determining first parameter information according to the minimum brightness polarization information and the polarization degree information of the background, wherein the first parameter information is used for representing interaction degree between the polarization degree of the background and the polarization information at the minimum.
For example, the polarization degree information of the background may be added to a preset constant and multiplied by the minimum brightness polarization information to obtain the first parameter information, where the first parameter information can measure the interaction degree between the polarization degree and the minimum brightness polarization information.
S402, determining second parameter information according to the maximum brightness polarization information and the polarization degree information of the background, wherein the second parameter information is used for representing the degree of interaction between the polarization degree of the background and the maximum brightness polarization information.
For example, the polarization degree information of the background may be subtracted from a preset constant and multiplied by the polarization information of the maximum brightness to obtain the second parameter information, where the second parameter information can measure the interaction degree between the polarization degree of the background and the maximum polarization information.
S403, determining third parameter information according to the polarization degree information of the background and the polarization degree information of the target object, wherein the third parameter information is used for representing the difference between the polarization degree information of the background and the polarization degree information of the target object.
For example, the polarization degree information of the target object may be subtracted from the polarization degree information of the background to obtain the third parameter information, where the third parameter information can measure the difference between the polarization degree information of the background and the polarization degree information of the target object.
It should be noted that the execution sequence of the steps S401 to S403 is not limited to the order described above, and may be executed in parallel.
S404, generating the target synthetic image according to the first parameter information, the second parameter information and the third parameter information.
The above-described target synthetic image can be calculated, for example, by the following formula (10):
OBJ = [Imin × (1 + Pscat) + Imax × (1 - Pscat)] / (Pscat - Pobj) (10)
Wherein Imax represents the maximum luminance polarization information, Imin represents the minimum luminance polarization information, Pscat represents the polarization degree information of the background, Pobj represents the polarization degree information of the target object, and OBJ represents the target synthetic image.
Specifically, the first parameter information is calculated by the Imin × (1 + Pscat) part, the second parameter information is calculated by the Imax × (1 - Pscat) part, and the third parameter information is calculated by the Pscat - Pobj part; the first parameter information is added to the second parameter information and the sum is divided by the third parameter information, thereby obtaining the target synthetic image.
It should be noted that, various information in the execution process of the foregoing embodiments may be represented in a matrix form, and accordingly, the calculation process between the information is calculation between the matrices. After the calculation of the above formula (10) is completed, the obtained matrix can directly represent the above target composite image.
In this embodiment, the degree of influence of the background on the minimum brightness polarization information, the degree of influence of the background on the maximum brightness polarization information, and the difference between the polarization degree information of the background and the polarization degree information of the target object can be calculated by using the maximum brightness polarization information, the minimum brightness polarization information, the polarization degree information of the target object and the polarization degree information of the background.
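A sketch of the synthesis step as described around formula (10); Pscat and Pobj stand for the preset polarization degree information of the background and of the target object, and the epsilon guard on the denominator is an added assumption of this sketch rather than part of the patent.

```python
import numpy as np

def target_composite_image(i_max, i_min, p_scat, p_obj, eps=1e-6):
    """Target composite image from the maximum/minimum brightness polarization
    information and the preset background/target polarization degrees,
    following the description of formula (10)."""
    first = i_min * (1.0 + p_scat)              # first parameter information
    second = i_max * (1.0 - p_scat)             # second parameter information
    third = p_scat - p_obj                      # third parameter information
    third = np.where(np.abs(third) < eps, eps, third)  # avoid division by zero (assumption)
    return (first + second) / third
```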
The above embodiment describes the process of obtaining a target synthetic image after a plurality of initial images are acquired; the foregoing process may be referred to as depolarization synthesis processing. On the basis of the target synthetic image obtained by the depolarization synthesis processing, haze removal processing, enhancement processing, color correction processing and the like can further be performed on the target synthetic image so as to further improve its image quality.
In the specific implementation process, any one of the haze removal treatment, the enhancement treatment, and the color correction treatment, or a part or all of them may be selected, and when a part or all of them is selected, the execution order of the respective treatment processes may be any order. For example, when the dehazing treatment and the enhancement treatment are selected, the dehazing treatment may be performed on the target synthetic image first, and then the enhancement treatment may be performed on the image after the dehazing treatment. Or, the enhancement processing may be performed on the target synthetic image first, and then the haze removal processing may be performed on the image after the enhancement processing.
The following embodiments of the present application will be described by taking as an example the selection of all the processes in the above-described haze removal process, enhancement process, and color correction process, and the execution order thereof is the haze removal process, enhancement process, and color correction process in this order. It should be understood that this is not a limitation of the present application. When any one or a part of the processes is selected, or other execution order is selected, the implementation of the corresponding process in the following examples may be used for the implementation of the inside of each process. For example, if only the dehazing process and the enhancement process are selected and the order of execution is to enhance the process first and then to dehazing process, the implementation of enhancement process may be that of enhancement process in the following example, and the implementation of dehazing process may be that of dehazing process in the following example, differing only in the input image.
Fig. 5 is a schematic flow chart of another image processing method of a polarized light endoscope according to an embodiment of the present application, as shown in fig. 5, after obtaining the target composite image according to the foregoing embodiment, the method further includes:
s501, determining transmittance information and global spurious information of the target synthetic image.
Optionally, gradient-oriented filtering may be performed on the target synthetic image to obtain the transmittance information. Meanwhile, determining a minimum value diagram of the target synthetic image, extracting a pixel value of a target pixel point from the minimum value diagram, and taking the pixel value as the global spurious information, wherein the target pixel point comprises: and the pixel point with the maximum brightness value in the minimum value graph.
For example, the target synthetic image may first be subjected to gradient-guided filtering to obtain an initial transmittance value, and the transmittance information may then be calculated using the following formula (11).
u′=1-w×u (11)
Where u is the initial value of transmittance, w is the haze removal coefficient, and u' is the transmittance information.
For example, the minimum value in R, G, B of the target synthetic image may be calculated, and then the minimum value filtering process may be performed, so that the minimum value map may be obtained.
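As an illustrative sketch of this estimation step (OpenCV with the ximgproc contrib module is an assumed dependency standing in for the gradient-guided filtering; the window size, dehazing coefficient w and normalization are assumptions, not values from the patent):

```python
import numpy as np
import cv2  # opencv-contrib-python assumed, for cv2.ximgproc.guidedFilter

def estimate_transmittance_and_stray_light(obj_img, w=0.95, patch=15):
    """obj_img: target composite image as float32 in [0, 1], shape (H, W, 3).
    Returns the transmittance information u' (formula (11)) and the global
    stray light information p."""
    # Minimum value map: per-pixel minimum over R, G, B, then a local minimum filter.
    min_map = obj_img.min(axis=2).astype(np.float32)
    min_map = cv2.erode(min_map, np.ones((patch, patch), np.uint8))

    # Global stray light information: pixel value at the brightest point of the map.
    p = float(min_map.max())

    # Initial transmittance value, smoothed by a guided filter on the gray image
    # (a stand-in for the gradient-guided filtering mentioned in the text).
    guide = cv2.cvtColor((obj_img * 255).astype(np.uint8), cv2.COLOR_BGR2GRAY)
    u = cv2.ximgproc.guidedFilter(guide, (min_map / max(p, 1e-6)).astype(np.float32), patch, 1e-3)

    return 1.0 - w * u, p   # formula (11): u' = 1 - w * u
```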
S502, performing haze removal treatment on the target synthetic image according to the transmittance information and the global stray light information to obtain a haze-removed image.
Illustratively, the haze-removed image can be calculated by the following formula (12):
H = (OBJ - p) / u′ + p (12)
Wherein OBJ is the target composite image, u′ is the transmittance information, p is the global stray light information, and H is the haze-removed image.
In this embodiment, the target synthetic image is subjected to haze removal processing by using the transmittance information and the global stray light information, so that the sharpness of the haze-removed image can be greatly improved.
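A minimal sketch of the haze removal step, assuming the reconstructed formula (12) and the outputs of the previous sketch:

```python
import numpy as np

def remove_haze(obj_img, u_prime, p, t_min=1e-3):
    """Haze removal: H = (OBJ - p) / u' + p, with a lower bound on the
    transmittance (t_min is an added assumption for numerical stability)."""
    t = np.maximum(u_prime, t_min)[..., None]   # broadcast over the color channels
    return np.clip((obj_img - p) / t + p, 0.0, 1.0)
```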
Optionally, on the basis of obtaining the image after the haze removal treatment, enhancement treatment can be further performed on the image so as to enhance the contrast of the image.
Alternatively, the image after the haze removal processing may be divided into a plurality of image blocks, and pixel value equalization processing and bilinear interpolation processing may be performed on each image block, so as to obtain an enhanced image composed of the image blocks after the pixel value equalization processing and bilinear interpolation processing.
Wherein histogram equalization may be performed for each image block. Specifically, the histogram of each image block is first corrected, so that pixel values concentrated in a few levels are redistributed over all levels, realizing a homogenization effect. In addition, bilinear interpolation processing is performed on each image block to remove the blocking effect caused by the local processing.
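The block-wise equalization with bilinear interpolation described above closely resembles contrast-limited adaptive histogram equalization; a sketch using OpenCV's CLAHE on the luminance channel (the clip limit, tile grid and LAB color space are illustrative choices, not taken from the patent):

```python
import numpy as np
import cv2

def enhance_contrast(dehazed_img, clip_limit=2.0, grid=(8, 8)):
    """Block-wise histogram equalization plus bilinear interpolation between
    blocks, applied to the luminance channel of the haze-removed image."""
    img_u8 = np.clip(dehazed_img * 255.0, 0, 255).astype(np.uint8)
    lab = cv2.cvtColor(img_u8, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=grid)
    enhanced = cv2.merge((clahe.apply(l), a, b))
    return cv2.cvtColor(enhanced, cv2.COLOR_LAB2BGR)
```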
Optionally, on the basis of obtaining the enhanced image, color correction processing can be performed on the enhanced image, so that the obtained corrected image can be close to the true color of the environment where the polarized light endoscope is located, and the image of the environment where the polarized light endoscope is located can be displayed more clearly.
Optionally, the enhanced image may be input into a pre-trained color recovery model, and the color recovery model corrects the color of the target object and/or the background in the enhanced image to obtain a corrected image.
For example, the depth information of the image may be analyzed in advance to establish a Lambertian reflection model of the environment where the polarized light endoscope is located, and the corrected image is obtained by performing correction based on the model, so as to reconstruct the colors of the various tissues in the environment where the polarized light endoscope is located. For example, the colors of blood, turbid water, tissue debris and the like in the abdominal cavity of a person can be reconstructed.
The haze removal processing, the enhancement processing, and the color correction processing described above can together be regarded as a sharpening process applied to the image. On this basis, fig. 6 is an exemplary diagram of the overall processing flow according to an embodiment of the present application. As shown in fig. 6, initial images corresponding to the four polarization angles of 0 degrees, 45 degrees, 90 degrees, and 135 degrees are first obtained, a target synthetic image is obtained through the foregoing depolarization synthesis process, and the foregoing sharpening process is then performed on the target synthetic image, thereby obtaining an output image with higher image quality.
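Purely as an illustrative sketch of how the stages of fig. 6 fit together, the code below first derives the Stokes components and the linear degree of polarization from the four angle images using the textbook formulation (assumed here, since the application's own formulas appear earlier in the description) and then chains the processing steps; synthesize_target_image is a hypothetical placeholder for the depolarization synthesis, and the remaining helpers are the sketches shown above.

```python
import numpy as np

def linear_stokes_and_dolp(i0, i45, i90, i135, eps=1e-6):
    """Textbook linear Stokes components and degree of polarization from images
    captured behind 0/45/90/135-degree polarizers (assumed formulation)."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)                # light intensity information
    s1 = i0 - i90                                     # X-axis direction linear component
    s2 = i45 - i135                                   # 45-degree direction linear component
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)    # synthetic polarization degree
    return s0, s1, s2, dolp

def process_polarized_frames(i0, i45, i90, i135):
    """End-to-end flow of fig. 6. synthesize_target_image is a hypothetical
    placeholder for the depolarization synthesis; the other helpers are the
    sketches given earlier in this section."""
    s0, s1, s2, dolp = linear_stokes_and_dolp(i0, i45, i90, i135)
    obj = synthesize_target_image(s0, dolp)          # depolarization synthesis (placeholder)
    u_prime, p = estimate_dehaze_params(obj)         # transmittance + global stray light
    dehazed = remove_haze(obj, u_prime, p)           # formula (12)
    dehazed_8u = (np.clip(dehazed, 0.0, 1.0) * 255).astype(np.uint8)
    enhanced = enhance_blocks(dehazed_8u)            # block equalization + interpolation
    return correct_color(enhanced)                   # stand-in for the color recovery model
```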
Based on the same inventive concept, an embodiment of the present application further provides an image processing apparatus of a polarized light endoscope corresponding to the image processing method of the polarized light endoscope described above. Because the principle by which the apparatus solves the problem is similar to that of the image processing method of the polarized light endoscope in the embodiments of the present application, the implementation of the apparatus may refer to the implementation of the method, and repeated description is omitted.
Fig. 7 is a block diagram of an image processing apparatus of a polarized light endoscope according to an embodiment of the present application, and as shown in fig. 7, the apparatus includes:
the acquiring module 701 is configured to acquire a plurality of initial images, where the plurality of initial images are obtained based on polarized light corresponding to a plurality of polarized angles acquired by the polarized light sensor of the polarized light endoscope.
The first determining module 702 is configured to determine, according to the plurality of initial images, light intensity information and synthetic polarization degree information of an environment where the polarized light endoscope is located.
A second determining module 703, configured to determine a plurality of polarized intensity information according to the light intensity information and the synthesized polarization degree information.
The generating module 704 is configured to generate a target synthetic image according to the plurality of pieces of polarization intensity information, the preset polarization degree information of the target object, and the preset polarization degree information of the background, where the target synthetic image is used for representing polarization imaging of the environment where the polarized light endoscope is located.
As an alternative embodiment, the first determining module 702 is specifically configured to:
determining light intensity information of the environment where the polarized light endoscope is located and polarized light components in at least one direction according to the plurality of initial images;
And determining the synthetic polarization degree information according to the light intensity information and the polarized light component in the at least one direction.
As an alternative embodiment, the first determining module 702 is specifically configured to:
determining a plurality of Stokes vectors by taking the plurality of initial images as input parameters;
obtaining the light intensity information and the polarized light component in the at least one direction based on the plurality of Stokes vectors;
wherein the polarized light component in the at least one direction comprises at least one of: an X-axis direction linear polarized light component, a 45-degree direction linear polarized light component, and a circular polarized light component.
As an alternative embodiment, the first determining module 702 is specifically configured to:
and determining the synthetic polarization degree information according to the light intensity information, the linear polarized light component in the X-axis direction, the linear polarized light component in the 45-degree direction, and the circular polarized light component.
As an alternative embodiment, the first determining module 702 is specifically configured to:
and determining the synthetic polarization degree information according to the light intensity information, the X-axis direction linear polarized light component and the 45-degree direction linear polarized light component.
As an alternative embodiment, the second determining module 703 is specifically configured to:
and respectively determining minimum brightness polarization information and maximum brightness polarization information according to the synthesized polarization degree information and the light intensity information.
As an alternative embodiment, the generating module 704 is specifically configured to:
determining first parameter information according to the minimum brightness polarization information and the polarization degree information of the background, wherein the first parameter information is used for representing the degree of interaction between the polarization degree of the background and the minimum brightness polarization information;
determining second parameter information according to the maximum brightness polarization information and the polarization degree information of the background, wherein the second parameter information is used for representing the degree of interaction between the polarization degree of the background and the maximum brightness polarization information;
determining third parameter information according to the polarization degree information of the background and the polarization degree information of the target object, wherein the third parameter information is used for representing the difference between the polarization degree information of the background and the polarization degree information of the target object;
and generating the target synthetic image according to the first parameter information, the second parameter information and the third parameter information.
As an alternative embodiment, the generating module 704 is further configured to:
determining transmittance information and global stray light information of the target synthetic image;
and performing haze removal processing on the target synthetic image according to the transmittance information and the global stray light information to obtain a haze-removed image.
As an alternative embodiment, the generating module 704 is specifically configured to:
performing gradient-guided filtering on the target synthetic image to obtain the transmittance information;
determining a minimum value map of the target synthetic image;
extracting a pixel value of a target pixel point from the minimum value map and taking the pixel value as the global stray light information, wherein the target pixel point is the pixel point with the largest brightness value in the minimum value map.
As an alternative embodiment, the generating module 704 is further configured to:
dividing the haze-removed image into a plurality of image blocks;
and respectively carrying out pixel value equalization processing and bilinear interpolation processing on each image block to obtain an enhanced image composed of the image blocks subjected to the pixel value equalization processing and the bilinear interpolation processing.
As an alternative embodiment, the generating module 704 is further configured to:
And inputting the enhanced image into a pre-trained color recovery model, and performing color correction on a target object and/or a background in the enhanced image by using the color recovery model to obtain a corrected image.
An embodiment of the present application further provides an image processing device 80. As shown in fig. 8, which is a schematic structural diagram of the image processing device 80 provided in the embodiment of the present application, the device includes a processor 81, a memory 82, and optionally a bus 83. The memory 82 stores machine-readable instructions executable by the processor 81 (for example, execution instructions corresponding to the acquiring module 701, the first determining module 702, the second determining module 703, and the generating module 704 in the apparatus of fig. 7). When the image processing device 80 is running, the processor 81 communicates with the memory 82 through the bus 83, and when the machine-readable instructions are executed by the processor 81, the method steps in the foregoing method embodiments are performed.
An embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program which, when executed by a processor, performs the steps of the image processing method of the polarized light endoscope described above.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and apparatus described above may refer to the corresponding processes in the method embodiments and are not repeated here. In the several embodiments provided by the present application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative; the division of the modules is merely a logical functional division, and there may be other divisions in actual implementation; for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some communication interfaces, and the indirect coupling or communication connection between apparatuses or modules may be in electrical, mechanical, or other forms.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely illustrative of the present application and does not limit it; any variations or alternatives that would readily occur to a person skilled in the art shall fall within the scope of the present application.

Claims (12)

1. A method of processing an image of a polarized light endoscope, comprising:
acquiring a plurality of initial images, wherein the plurality of initial images are correspondingly obtained based on polarized light of a plurality of polarized angles acquired by a polarized light sensor of a polarized light endoscope;
determining light intensity information and synthetic polarization degree information of the environment where the polarized light endoscope is positioned according to the plurality of initial images;
determining a plurality of polarized intensity information according to the light intensity information and the synthesized polarization degree information;
and generating a target synthetic image according to the polarization intensity information, the preset polarization degree information of the target object and the preset polarization degree information of the background, wherein the target synthetic image is used for representing polarization imaging of the environment where the polarized light endoscope is positioned.
2. The method of claim 1, wherein the determining light intensity information and synthetic polarization degree information of the environment where the polarized light endoscope is positioned according to the plurality of initial images comprises:
determining light intensity information of the environment where the polarized light endoscope is located and a polarized light component in at least one direction according to the plurality of initial images;
and determining the synthetic polarization degree information according to the light intensity information and the polarized light component in the at least one direction.
3. The method of claim 2, wherein determining light intensity information of an environment in which the polarized light endoscope is located and a polarized light component in at least one direction from the plurality of initial images comprises:
determining Stokes vectors by taking the plurality of initial images as input parameters;
obtaining the light intensity information and the polarized light component in the at least one direction based on the stokes vector;
wherein the polarized light component in the at least one direction comprises at least one of: an X-axis direction linear polarized light component, a 45-degree direction linear polarized light component, and a circular polarized light component.
4. A method according to claim 3, wherein said determining said synthetic polarization degree information from said light intensity information and said polarized light component in said at least one direction comprises:
and determining the synthetic polarization degree information according to the light intensity information, the linear polarized light component in the X-axis direction, the linear polarized light component in the 45-degree direction, and the circular polarized light component.
5. A method according to claim 3, wherein said determining said synthetic polarization degree information from said light intensity information and said polarized light component in said at least one direction comprises:
and determining the synthetic polarization degree information according to the light intensity information, the X-axis direction linear polarized light component and the 45-degree direction linear polarized light component.
6. The method of claim 1, wherein said determining a plurality of polarized intensity information from said light intensity information and said synthetic polarization degree information comprises:
and respectively determining minimum brightness polarization information and maximum brightness polarization information according to the synthesized polarization degree information and the light intensity information.
7. The method of claim 6, wherein the generating a target synthetic image according to the polarization intensity information, the preset polarization degree information of the target object, and the preset polarization degree information of the background comprises:
determining first parameter information according to the minimum brightness polarization information and the polarization degree information of the background, wherein the first parameter information is used for representing the degree of interaction between the polarization degree of the background and the minimum brightness polarization information;
determining second parameter information according to the maximum brightness polarization information and the polarization degree information of the background, wherein the second parameter information is used for representing the degree of interaction between the polarization degree of the background and the maximum brightness polarization information;
determining third parameter information according to the polarization degree information of the background and the polarization degree information of the target object, wherein the third parameter information is used for representing the difference between the polarization degree information of the background and the polarization degree information of the target object;
and generating the target synthetic image according to the first parameter information, the second parameter information and the third parameter information.
8. The method according to any one of claims 1-7, wherein after the generating a target synthetic image according to the polarization intensity information, the preset polarization degree information of the target object, and the preset polarization degree information of the background, the method further comprises:
determining transmittance information and global stray light information of the target synthetic image;
and performing haze removal processing on the target synthetic image according to the transmittance information and the global stray light information to obtain a haze-removed image.
9. The method of claim 8, wherein the determining transmittance information and global stray light information of the target synthetic image comprises:
performing gradient-guided filtering on the target synthetic image to obtain the transmittance information;
determining a minimum value map of the target synthetic image;
extracting a pixel value of a target pixel point from the minimum value map and taking the pixel value as the global stray light information, wherein the target pixel point is the pixel point with the largest brightness value in the minimum value map.
10. The method of claim 8, wherein after the performing haze removal processing on the target synthetic image according to the transmittance information and the global stray light information to obtain a haze-removed image, the method further comprises:
dividing the haze-removed image into a plurality of image blocks;
and respectively carrying out pixel value equalization processing and bilinear interpolation processing on each image block to obtain an enhanced image composed of the image blocks subjected to the pixel value equalization processing and the bilinear interpolation processing.
11. The method according to claim 10, wherein after performing pixel value equalization processing and bilinear interpolation processing on each image block, respectively, to obtain an enhanced image composed of the image blocks after the pixel value equalization processing and bilinear interpolation processing, the method further comprises:
And inputting the enhanced image into a pre-trained color recovery model, and performing color correction on a target object and/or a background in the enhanced image by using the color recovery model to obtain a corrected image.
12. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the image processing method of a polarized light endoscope according to any one of claims 1 to 11.
CN202210474252.2A 2022-04-29 2022-04-29 Image processing method of polarized light endoscope and computer readable storage medium Pending CN117011155A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210474252.2A CN117011155A (en) 2022-04-29 2022-04-29 Image processing method of polarized light endoscope and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210474252.2A CN117011155A (en) 2022-04-29 2022-04-29 Image processing method of polarized light endoscope and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN117011155A true CN117011155A (en) 2023-11-07

Family

ID=88564108

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210474252.2A Pending CN117011155A (en) 2022-04-29 2022-04-29 Image processing method of polarized light endoscope and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN117011155A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117593437A (en) * 2024-01-18 2024-02-23 华伦医疗用品(深圳)有限公司 Endoscope real-time image processing method and system based on GPU
CN117593437B (en) * 2024-01-18 2024-05-14 华伦医疗用品(深圳)有限公司 Endoscope real-time image processing method and system based on GPU

Similar Documents

Publication Publication Date Title
WO2014155778A1 (en) Image processing device, endoscopic device, program and image processing method
CN105072968A (en) Image processing device, endoscopic device, program and image processing method
US10966592B2 (en) 3D endoscope apparatus and 3D video processing apparatus
JPH11332820A (en) Fluorescent endoscope
US9826884B2 (en) Image processing device for correcting captured image based on extracted irregularity information and enhancement level, information storage device, and image processing method
US20190045170A1 (en) Medical image processing device, system, method, and program
US11030745B2 (en) Image processing apparatus for endoscope and endoscope system
JP2017213097A (en) Image processing device, image processing method, and program
CN117011155A (en) Image processing method of polarized light endoscope and computer readable storage medium
CN109091099A (en) The high definition miniature electronic endoscopic system of binocular vision
CN109771052B (en) Three-dimensional image establishing method and system based on multi-view imaging and multi-polarization state imaging
EP3247113B1 (en) Image processing device, image processing method, program, and endoscope system
CN112261399B (en) Capsule endoscope image three-dimensional reconstruction method, electronic device and readable storage medium
CN109068035A (en) A kind of micro- camera array endoscopic imaging system of intelligence
Stoyanov et al. Intra-operative visualizations: Perceptual fidelity and human factors
CN110115557B (en) Hyperspectral endoscopic imaging device and imaging method
CN115994999A (en) Goblet cell semantic segmentation method and system based on boundary gradient attention network
CN114549368A (en) Endoscope imaging system and method based on image sensor
CN116664441A (en) Medical endoscope color restoration method, device and storage medium
US20230081476A1 (en) Method of multiple image reconstruction and registration
JPH04314181A (en) Processing method for endoscope image
Kwan et al. Development of a Light Field Laparoscope for Depth Reconstruction
CN117398042A (en) AI-assisted detection 3D endoscope system and imaging method
WO2023124982A1 (en) Auxiliary navigation method, apparatus and device for bronchoscope
WO2023184526A1 (en) System and method of real-time stereoscopic visualization based on monocular camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination