CN110751605A - Image processing method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN110751605A
Authority
CN
China
Prior art keywords
image, boundary, point, value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910983689.7A
Other languages
Chinese (zh)
Other versions
CN110751605B (en)
Inventor
余力
冯能云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sonoscape Medical Corp
Original Assignee
Sonoscape Medical Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sonoscape Medical Corp filed Critical Sonoscape Medical Corp
Priority to CN201910983689.7A priority Critical patent/CN110751605B/en
Publication of CN110751605A publication Critical patent/CN110751605A/en
Priority to PCT/CN2020/092220 priority patent/WO2021073101A1/en
Application granted granted Critical
Publication of CN110751605B publication Critical patent/CN110751605B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/77 Retouching; Inpainting; Scratch removal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20036 Morphological image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20192 Edge enhancement; Edge preservation

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Epidemiology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Endoscopes (AREA)

Abstract

The application provides an image processing method, comprising the following steps: acquiring a color endoscope image from a time-sequence sampling video; extracting a boundary area image from the color endoscope image and determining boundary points from the boundary area image; obtaining a restoration value for each boundary point by using pixel point information within a preset region range of the boundary point and gradient information determined from the boundary point, where the pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in the x direction and in the y direction; and replacing the values of the boundary points in the color endoscope image with all the restoration values to obtain a restored image. The method solves the problem of repairing the color boundaries caused by the movement of in-vivo instruments or internal organs; the image is more intuitive and clear, and medical staff can conveniently observe the restored image. The application also provides an image processing apparatus, an electronic device and a computer-readable storage medium, which all have the above beneficial effects.

Description

Image processing method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of endoscope technology, and in particular to an image processing method, an image processing apparatus, an electronic device, and a computer-readable storage medium.
Background
Time-series sampling is one of the core technologies in the field of endoscopy. Images obtained by time-series sampling have higher definition than those obtained with a Bayer pattern, and higher luminance in the dye mode. However, the color boundaries introduced by this imaging mode remain a problem to be solved.
Therefore, providing a solution to the above technical problem is a task that urgently needs to be addressed by those skilled in the art.
Disclosure of Invention
The application aims to provide an image processing method, an image processing apparatus, an electronic device and a computer-readable storage medium that make the image more intuitive and clear, so that medical staff can conveniently observe the restored image. The specific scheme is as follows:
the application provides an image processing method, comprising the following steps:
acquiring a color endoscope image in a time sequence sampling video;
extracting a boundary area image in the color endoscope image, and determining a boundary point according to the boundary area image;
obtaining a restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, wherein the pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in the x direction and gradient information in the y direction;
and replacing the values of the boundary points in the color endoscope image with all the restoration values to obtain a restored image.
Optionally, extracting a boundary area image in the color endoscope image, and determining a boundary point according to the boundary area image, includes:
extracting a color boundary region of the color endoscope image;
carrying out binarization and dilation processing on the color boundary area to obtain a boundary area image;
judging whether the gray value of a pixel point in the boundary area image is 0 or not;
and if so, determining the pixel point with the gray value of 0 as the boundary point.
Optionally, extracting a color boundary region of the color endoscope image includes:
judging whether the R component of a pixel point in the color endoscope image is greater than the G component multiplied by a first preset multiple, or greater than the B component multiplied by a second preset multiple;
if yes, determining the pixel points whose R component is greater than the first preset multiple of the G component, or greater than the second preset multiple of the B component, as boundary region points, to obtain the color boundary region formed by all the boundary region points.
Optionally, between the binarization and dilation processing of the color boundary region and the obtaining of the boundary region image, the method further includes:
performing erosion processing.
Optionally, obtaining a restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, includes:
determining a reference channel, obtaining a first image formed by dividing the first channel values by the reference channel data corresponding to the reference channel, and likewise obtaining a second image formed by dividing the second channel values by the reference channel data;
determining a first value for the first image and a second value for the second image by using the pixel point information in the boundary area image within the preset region range of all the boundary points and the gradient information determined from the boundary points;
and multiplying the first value by the reference value corresponding to the reference channel to obtain a first restoration value, and multiplying the second value by the reference value corresponding to the reference channel to obtain a second restoration value, so as to obtain the restoration value consisting of the reference value, the first restoration value and the second restoration value.
Optionally, determining the first value of the first image and the second value of the second image by using the pixel point information in the boundary region image within the preset region range of all the boundary points and the gradient information determined from the boundary points, includes:
determining a plurality of non-repair points within the preset region range of the boundary point;
calculating the contribution value of each non-repair point using a first preset algorithm;
determining the value of the boundary point according to a second preset algorithm based on the contribution value of each non-repair point, so as to obtain the first value of the first image and the second value of the second image;
the first predetermined algorithm is ω ═ dir abs (cos θ), and the second predetermined algorithm is
Figure BDA0002236021580000031
Wherein the content of the first and second substances,
ω is a contribution value of the unnecessary repair point, Δ x is an x distance between the unnecessary repair point and the boundary point, Δ y is a y distance between the unnecessary repair point and the boundary point, gradx is x-direction gradient information, grady is y-direction gradient information, I isKnowIs a value in the color endoscopic image that does not require a repair point.
Optionally, the reference channel is a G channel.
The application provides an image processing apparatus, including:
the color endoscope image acquisition module is used for acquiring a color endoscope image in the time sequence sampling video;
the boundary point extraction module is used for extracting a boundary area image in the color endoscope image and determining boundary points according to the boundary area image;
a restoration value determining module, configured to determine the restoration value of each boundary point by using pixel point information within the preset region range of the boundary point and gradient information determined from the boundary point, where the pixel point information includes a contribution value and distance information, and the gradient information includes gradient information in the x direction and in the y direction;
and a restoration module, configured to replace the values of the boundary points in the color endoscope image with all the restoration values to obtain a restored image.
Optionally, the boundary point extracting module includes:
a color boundary region extraction unit configured to extract a color boundary region of the color endoscopic image;
a boundary area image obtaining unit, configured to perform binarization and dilation processing on the color boundary area to obtain a boundary area image;
the judging unit is used for judging whether the gray value of the pixel point in the boundary area image is 0 or not;
and the boundary point determining unit is used for determining the pixel points with the gray value of 0 as the boundary points if so.
Optionally, the color boundary region extracting unit includes:
the judgment subunit is used for judging whether the R component of a pixel point in the color endoscope image is greater than the G component multiplied by a first preset multiple, or greater than the B component multiplied by a second preset multiple;
and the color boundary region determining unit is used for determining, if yes, the pixel points whose R component is greater than the first preset multiple of the G component, or greater than the second preset multiple of the B component, as boundary region points, to obtain the color boundary region formed by all the boundary region points.
Optionally, the boundary area image obtaining unit further includes:
and the erosion subunit is used for performing erosion processing.
Optionally, the repair value determining module includes:
the first image and second image obtaining unit is used for determining a reference channel, obtaining a first image formed by dividing the first channel values by the reference channel data corresponding to the reference channel, and likewise obtaining a second image formed by dividing the second channel values by the reference channel data;
a first value and second value obtaining unit, configured to determine a first value for the first image and a second value for the second image by using the pixel point information in the boundary area image within the preset region range of all the boundary points and the gradient information determined from the boundary points;
and a restoration value determining unit, configured to multiply the first value by the reference value corresponding to the reference channel to obtain a first restoration value, and multiply the second value by the reference value to obtain a second restoration value, so as to obtain the restoration value consisting of the reference value, the first restoration value and the second restoration value.
Optionally, the first value and second value obtaining unit includes:
a non-repair point determining subunit, configured to determine a plurality of non-repair points within the preset region range of the boundary point;
a contribution value determining subunit, configured to calculate the contribution value of each non-repair point using a first preset algorithm;
and a first value and second value obtaining subunit, configured to determine the value of the boundary point according to a second preset algorithm based on the contribution value of each non-repair point, so as to obtain the first value of the first image and the second value of the second image;
the first predetermined algorithm is ω ═ dir abs (cos θ), and the second predetermined algorithm is
Figure BDA0002236021580000051
Wherein the content of the first and second substances,
Figure BDA0002236021580000052
ω is a contribution value of the unnecessary repair point, Δ x is an x distance between the unnecessary repair point and the boundary point, Δ y is a y distance between the unnecessary repair point and the boundary point, gradx is x-direction gradient information, grady is y-direction gradient information, I isKnowIs a value in the color endoscopic image that does not require a repair point.
The application provides an electronic device, including:
a memory for storing a computer program;
a processor for implementing the steps of the image processing method as described above when executing the computer program.
The present application provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the image processing method as described above.
The application provides an image processing method, comprising the following steps: acquiring a color endoscope image from a time-sequence sampling video; extracting a boundary area image from the color endoscope image and determining boundary points from the boundary area image; obtaining a restoration value for each boundary point by using pixel point information within a preset region range of the boundary point and gradient information determined from the boundary point, where the pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in the x direction and in the y direction; and replacing the values of the boundary points in the color endoscope image with all the restoration values to obtain a restored image.
Thus, the boundary area image of the color endoscope image in the time-sequence sampling video is extracted and the boundary points are determined; the restoration values are obtained from the pixel point information within the preset region range of each boundary point and the gradient information determined from the boundary point; and the boundary point values in the original color endoscope image are replaced by the restoration values, achieving image restoration. This solves the problem of repairing the color boundaries caused by the movement of in-vivo instruments or internal organs; the image is more intuitive and clear, and medical staff can conveniently observe the restored image. The application also provides an image processing apparatus, an electronic device and a computer-readable storage medium, all having the above beneficial effects, which are not described herein again.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only embodiments of the present application; for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating color boundary region extraction in another image processing method according to an embodiment of the present disclosure;
fig. 3 is a flowchart of determining a restoration value in another image processing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. The described embodiments are obviously only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a flowchart of an image processing method according to an embodiment of the present application, which specifically includes:
and S110, acquiring a color endoscope image in the time-sequence sampling video.
In the field of endoscope technology, images in a video obtained by time-sequence sampling have higher definition than images in a video obtained with a Bayer template, and have higher brightness in the dyeing mode. During an operation, however, color boundaries can be produced by the movement of in-vivo instruments or internal organs, which hinders observation by medical staff and thereby affects the result of the operation. This embodiment therefore provides an image processing method that can repair the color boundaries caused by the movement of in-vivo instruments or internal organs, making the image more intuitive and clear so that medical staff can conveniently observe the restored image.
And S120, extracting a boundary area image in the color endoscope image, and determining a boundary point according to the boundary area image.
This embodiment does not limit the method used to extract the boundary area image, as long as the purpose of this embodiment can be achieved. It can be understood that the boundary area image includes a plurality of pixel points, from which the boundary points are then determined. This embodiment likewise does not limit how the boundary points are determined; the user can customize the setting.
In one implementable manner, please refer to fig. 2, which is a flowchart of extracting a color boundary area in another image processing method according to an embodiment of the present application. Specifically, extracting a boundary area image from the color endoscope image and determining boundary points from the boundary area image includes:
and S121, extracting a color boundary area of the color endoscope image.
In one implementable embodiment, extracting a color border region of a color endoscope image comprises:
judging whether the R component of a pixel point in the color endoscope image is greater than the G component multiplied by a first preset multiple, or greater than the B component multiplied by a second preset multiple; if yes, determining the pixel points whose R component is greater than the first preset multiple of the G component, or greater than the second preset multiple of the B component, as boundary region points, to obtain the color boundary region formed by all the boundary region points.
It can be understood that, in the field of endoscope technology, the high amount of hemoglobin in a color endoscope image makes red dominant, resulting in a high R component. For a pixel point in the color endoscope image, when its R component is greater than the G component multiplied by the first preset multiple, or greater than the B component multiplied by the second preset multiple, the pixel point is a boundary region point. All the boundary region points are thus determined by thresholding, yielding the color boundary region composed of all the boundary region points.
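The thresholding test described above can be sketched in Python/NumPy as follows. The patent does not fix the preset multiples, so the values 1.5 used here are illustrative assumptions:

```python
import numpy as np

def color_boundary_mask(rgb, k_g=1.5, k_b=1.5):
    """Mark pixels whose R component exceeds k_g * G or k_b * B.

    rgb: H x W x 3 array; k_g and k_b stand in for the first and
    second preset multiples (illustrative values, not from the patent).
    Returns a boolean H x W mask of boundary-region points.
    """
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    return (r > k_g * g) | (r > k_b * b)

# A red-dominant pixel is flagged as a boundary-region point; a
# color-balanced pixel is not.
img = np.array([[[200, 50, 50], [100, 100, 100]]], dtype=np.uint8)
mask = color_boundary_mask(img)
```

The union of all flagged pixels forms the color boundary region described in S121.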
And S122, performing binarization and dilation processing on the color boundary area to obtain a boundary area image.
Binarization of the color boundary area produces a binarized (black-and-white) image, which is then dilated. Dilation merges all background points in contact with an object into the object, expanding the boundary outwards and making the boundary area image more complete.
In one implementable manner, between the binarization and dilation processing of the color boundary region and the obtaining of the boundary region image, the method further includes: performing erosion processing. Erosion eliminates boundary points and contracts the boundary inwards; erosion followed by dilation is called the open operation. Opening eliminates small objects, separates objects at thin connections, and smooths the boundaries of larger objects without significantly changing their area.
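The erode-then-dilate (opening) step can be sketched with plain NumPy, assuming a 3x3 structuring element (in practice OpenCV's morphology routines would typically be used; this minimal version only illustrates the effect):

```python
import numpy as np

def _shift_stack(binary):
    """Stack the 3x3 neighborhood of every pixel of a 0/1 image (zero-padded)."""
    padded = np.pad(binary, 1)
    h, w = binary.shape
    return np.stack([padded[dy:dy + h, dx:dx + w]
                     for dy in range(3) for dx in range(3)])

def erode(binary):
    # A pixel survives only if its whole 3x3 neighborhood is foreground.
    return _shift_stack(binary).min(axis=0)

def dilate(binary):
    # A pixel becomes foreground if any pixel in its 3x3 neighborhood is.
    return _shift_stack(binary).max(axis=0)

# Open a binarized mask: the 3x3 blob survives, the isolated pixel is removed.
mask = np.zeros((7, 7), dtype=np.uint8)
mask[2:5, 2:5] = 1
mask[0, 6] = 1
opened = dilate(erode(mask))
```

This matches the text: opening removes small isolated objects while leaving the area of larger objects essentially unchanged.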
And S123, judging whether the gray value of the pixel point in the boundary area image is 0 or not.
And S124, if yes, determining the pixel point with the gray value of 0 as a boundary point.
When the gray value of a pixel point in the boundary area image is 0, the point is a boundary point. It can be understood that the boundary area image consists of a black portion, with pixel value 0, and a white portion, with pixel value 255; the points with pixel value 0 are the boundary points. Determining the boundary points in this way is more efficient.
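The test of S123/S124 amounts to collecting the zero-valued pixels. A quick sketch, using a small hypothetical boundary area image:

```python
import numpy as np

# Boundary area image: white (255) everywhere except two boundary points (0).
boundary_img = np.full((4, 4), 255, dtype=np.uint8)
boundary_img[1, 2] = 0
boundary_img[3, 0] = 0

# Boundary points are the pixel coordinates whose gray value is 0.
boundary_points = [tuple(p) for p in np.argwhere(boundary_img == 0)]
```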
S130, obtaining a restoration value of the boundary point by utilizing pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point.
The pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in the x direction and gradient information in the y direction.
The preset area range is not limited in this embodiment, and may be 5 × 5, 3 × 3, or 4 × 4, as long as the purpose of this embodiment can be achieved.
In one implementable manner, please refer to fig. 3, which is a flowchart of determining a restoration value in another image processing method according to an embodiment of the present application. Specifically, obtaining the restoration value of a boundary point by using pixel point information within the preset region range of the boundary point and gradient information determined from the boundary point includes:
S131, determining a reference channel, obtaining a first image formed by dividing the first channel values by the reference channel data corresponding to the reference channel, and obtaining a second image formed by dividing the second channel values by the reference channel data.
This embodiment does not limit the reference channel; it may be any one of the R, G and B channels. The first channel values and second channel values are the corresponding data in the color endoscope image. When the R channel is the reference channel, in one implementable embodiment the first channel is the G channel and the second channel is the B channel, the first image is formed by dividing the G channel data by the R channel data, and the second image is formed by dividing the B channel data by the R channel data; in another implementable embodiment the first channel is the B channel and the second channel is the G channel, the first image is formed by dividing the B channel data by the R channel data, and the second image is formed by dividing the G channel data by the R channel data. When the G channel is the reference channel, in one implementable embodiment the first channel is the R channel and the second channel is the B channel, the first image is formed by dividing the R channel data by the G channel data, and the second image is formed by dividing the B channel data by the G channel data; in another implementable embodiment the first channel is the B channel and the second channel is the R channel, the first image is formed by dividing the B channel data by the G channel data, and the second image is formed by dividing the R channel data by the G channel data.
When the B channel is the reference channel, in one implementable embodiment the first channel is the G channel and the second channel is the R channel, the first image is formed by dividing the G channel data by the B channel data, and the second image is formed by dividing the R channel data by the B channel data; in another implementable embodiment the first channel is the R channel and the second channel is the G channel, the first image is formed by dividing the R channel data by the B channel data, and the second image is formed by dividing the G channel data by the B channel data. Preferably, the reference channel is the G channel.
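With the G channel as the reference (the preferred choice), the two ratio images can be computed as follows. The small epsilon guarding against division by zero is an implementation detail assumed here, not specified by the patent:

```python
import numpy as np

def ratio_images(rgb, eps=1e-6):
    """Return the first image (R/G) and second image (B/G).

    rgb: H x W x 3 array; the G channel is taken as the reference channel.
    eps avoids division by zero (assumed detail).
    """
    rgb = rgb.astype(np.float64)
    g = rgb[..., 1] + eps
    first = rgb[..., 0] / g   # first channel values / reference channel data
    second = rgb[..., 2] / g  # second channel values / reference channel data
    return first, second

img = np.array([[[120.0, 60.0, 30.0]]])
first, second = ratio_images(img)
```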
S132, determining a first value for the first image and a second value for the second image by using the pixel point information in the boundary area image within the preset region range of all the boundary points and the gradient information determined from the boundary points.
In one implementable manner, determining the first value of the first image and the second value of the second image by using the pixel point information within the preset region range of all the boundary points and the gradient information determined from the boundary points includes: determining a plurality of non-repair points within the preset region range of the boundary point; calculating the contribution value of each non-repair point using the first preset algorithm; and determining the value of the boundary point according to the second preset algorithm based on the contribution value of each non-repair point, so as to obtain the first value of the first image and the second value of the second image. The first preset algorithm is ω = dir · abs(cos θ), and the second preset algorithm is

I = (Σᵢ ωᵢ · I_Know,ᵢ) / (Σᵢ ωᵢ),

where ω is the contribution value of a non-repair point, dir = 1/√(Δx² + Δy²), cos θ = (Δx·gradx + Δy·grady) / (√(Δx² + Δy²) · √(gradx² + grady²)), Δx is the x distance between the non-repair point and the boundary point, Δy is the y distance between the non-repair point and the boundary point, gradx is the x-direction gradient information, grady is the y-direction gradient information, and I_Know is the value of the non-repair point in the color endoscope image.
The non-repair points are the points other than the boundary points within the preset region range; that is, in the boundary area image, the gray value of these pixel points within the preset region range of the boundary point is 255. The contribution value of each non-repair point is calculated with the first preset algorithm, the value of the boundary point is obtained with the second preset algorithm based on the contribution values, and the first value and the second value are finally obtained. For example, when the first image is an R/G image (an image formed by dividing the R channel data by the G channel data) and the non-repair points within the preset region range of a boundary point b are b1, b2 and b3: for b1, ω₁ = dir₁ · abs(cos θ₁), where Δx₁ is the x distance between b1 and the boundary point, Δy₁ is the y distance between b1 and the boundary point, and I_Know1 is the R value of b1 in the color endoscope image; for b2, ω₂ = dir₂ · abs(cos θ₂), where Δx₂ and Δy₂ are the x and y distances between b2 and the boundary point, and I_Know2 is the R value of b2; for b3, ω₃ = dir₃ · abs(cos θ₃), where Δx₃ and Δy₃ are the x and y distances between b3 and the boundary point, and I_Know3 is the R value of b3. The value of the final boundary point b in the first image is then

(ω₁·I_Know1 + ω₂·I_Know2 + ω₃·I_Know3) / (ω₁ + ω₂ + ω₃).
Performing the above processing for all the boundary points yields the numerical values of all the boundary points of the first image, and these numerical values together form the first numerical value of the first image; the second numerical value of the second image is obtained similarly.
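The per-boundary-point computation above, a contribution-weighted average over the points which do not need to be repaired, can be sketched as follows (the function name and the handling of a zero total weight are assumptions):

```python
def repair_value(contributions):
    """Second preset algorithm: contribution-weighted average of the
    known values. `contributions` is a list of (weight, known_value)
    pairs, one per point that does not need to be repaired within the
    boundary point's preset region."""
    total = sum(w for w, _ in contributions)
    if total == 0:
        return 0.0          # no usable neighbours (assumed fallback)
    return sum(w * v for w, v in contributions) / total
```

For the b1, b2, b3 example, calling `repair_value([(w1, i1), (w2, i2), (w3, i3)])` reproduces the expression (ω1·I1 + ω2·I2 + ω3·I3) / (ω1 + ω2 + ω3).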
S133: multiplying the first numerical value by a reference numerical value corresponding to the reference channel to obtain a first repair numerical value, and multiplying the second numerical value by the reference numerical value corresponding to the reference channel to obtain a second repair numerical value, so as to obtain a repair numerical value consisting of the reference numerical value, the first repair numerical value and the second repair numerical value.
In an implementation manner, when the first image is an R/G image (an image formed by dividing R channel data by G channel data), a first numerical value is obtained; all the first numerical values can be regarded as R/G numerical values and multiplied by the reference numerical value G corresponding to the reference channel, the G channel, to obtain an R repair numerical value, i.e., the first repair numerical value. When the second image is a B/G image (an image formed by dividing B channel data by G channel data), a second numerical value is obtained; all the second numerical values can be regarded as B/G numerical values and multiplied by the reference numerical value G to obtain a B repair numerical value, i.e., the second repair numerical value. Finally, the repair numerical value includes the R repair numerical value, the B repair numerical value and the reference numerical value G.
In another implementation, when the first image is an R/B image (an image formed by dividing R channel data by B channel data), a first numerical value is obtained; all the first numerical values can be regarded as R/B numerical values and multiplied by the reference numerical value B corresponding to the reference channel, the B channel, to obtain an R repair numerical value, i.e., the first repair numerical value. When the second image is a G/B image (an image formed by dividing G channel data by B channel data), a second numerical value is obtained; all the second numerical values can be regarded as G/B numerical values and multiplied by the reference numerical value B to obtain a G repair numerical value, i.e., the second repair numerical value. Finally, the repair numerical value includes the R repair numerical value, the G repair numerical value and the reference numerical value B.
In another implementation, when the first image is a G/R image (an image formed by dividing G channel data by R channel data), a first numerical value is obtained; all the first numerical values can be regarded as G/R numerical values and multiplied by the reference numerical value R corresponding to the reference channel, the R channel, to obtain a G repair numerical value, i.e., the first repair numerical value. When the second image is a B/R image (an image formed by dividing B channel data by R channel data), a second numerical value is obtained; all the second numerical values can be regarded as B/R numerical values and multiplied by the reference numerical value R to obtain a B repair numerical value, i.e., the second repair numerical value. Finally, the repair numerical value includes the G repair numerical value, the B repair numerical value and the reference numerical value R.
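For the G-reference implementation, rebuilding the repair triple at one boundary point can be sketched as (the function name is an assumption):

```python
def compose_repair(rg_value, bg_value, g_value):
    """Rebuild the (R, G, B) repair triple at one boundary point from
    the repaired R/G and B/G ratios and the untouched reference value G."""
    r_repair = rg_value * g_value    # first repair numerical value
    b_repair = bg_value * g_value    # second repair numerical value
    return (r_repair, g_value, b_repair)
```

The B-reference and R-reference implementations are identical in shape, with the ratio pair and the untouched channel swapped accordingly.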
S140: replacing the numerical values of the boundary points in the color endoscope image with all the repair numerical values, so as to obtain a repaired image.
Based on the above technical scheme, the boundary region image of the color endoscope image in the time-sequence sampled video is extracted and the boundary points are determined; a repair numerical value is obtained from the pixel point information within the preset region range of each boundary point and the gradient information determined according to the boundary point, and the boundary point numerical values in the original color endoscope image are replaced by the repair numerical values. Image repair is thereby realized: the color boundaries caused by the movement of instruments or internal organs are repaired, the image becomes more intuitive and clear, and medical staff can conveniently observe the repaired image.
Based on the foregoing technical solution, this embodiment provides a specific image processing method, including:
1. Acquiring a color endoscope image in a time-sequence sampled video.
2. Judging whether the R component of a pixel point in the color endoscope image is greater than the G component multiplied by a first preset multiple, or whether the R component is greater than the B component multiplied by a second preset multiple.
3. If yes, determining the pixel points whose R component is greater than the first preset multiple of the G component, or greater than the second preset multiple of the B component, as boundary region points, so as to obtain a color boundary region formed by all the boundary region points.
4. Carrying out binarization and expansion processing on the color boundary region to obtain a boundary region image.
5. Judging whether the gray value of a pixel point in the boundary region image is 0.
6. If yes, determining the pixel point with the gray value of 0 as a boundary point.
7. Determining the reference channel as the G channel, obtaining a first image formed by dividing R channel data by G channel data, namely an R/G image, and obtaining a second image formed by dividing B channel data by G channel data, namely a B/G image.
8. Determining a plurality of points which do not need to be repaired within the preset region range of each boundary point.
9. Calculating the contribution value of each point which does not need to be repaired by using the first preset algorithm ω = dir · abs(cosθ), wherein dir is determined by the distances Δx and Δy, θ is the angle between the gradient (gradx, grady) and the offset (Δx, Δy), Δx is the x distance between the point which does not need to be repaired and the boundary point, Δy is the y distance between the point which does not need to be repaired and the boundary point, gradx is the x-direction gradient information, and grady is the y-direction gradient information.
10. Determining the numerical values of the boundary points according to the second preset algorithm based on the contribution values of the points which do not need to be repaired, so as to obtain the R/G numerical value of the R/G image and the B/G numerical value of the B/G image. The second preset algorithm is I = (Σ ω · I_Know) / (Σ ω), where ω is the contribution value of a point which does not need to be repaired and I_Know is the numerical value of that point in the color endoscope image.
11. Multiplying the R/G numerical value by the G numerical value corresponding to the G channel to obtain an R repair numerical value, and multiplying the B/G numerical value by the G numerical value to obtain a B repair numerical value, so as to obtain a repair numerical value consisting of the G numerical value, the R repair numerical value and the B repair numerical value.
12. Replacing the numerical values of the boundary points in the color endoscope image with all the repair numerical values to obtain a repaired image.
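Steps 2 to 6 can be sketched in pure Python on a small grid of (R, G, B) tuples; the preset multiples k1 and k2, the one-pixel 3x3 dilation element, and the decision to return boundary coordinates directly instead of a 0/255 gray image are all assumptions for illustration (the optional erosion step of claim 4 is omitted):

```python
def boundary_points(rgb, k1=2.0, k2=2.0):
    """Mark pixels with R > k1*G or R > k2*B as the color boundary
    region, dilate the mark by one pixel (3x3 structuring element), and
    return the resulting boundary-point coordinates. In the patent, a
    marked pixel would get gray value 0 in the boundary region image and
    the rest gray value 255; here the coordinates are returned directly."""
    h, w = len(rgb), len(rgb[0])
    mask = [[r > k1 * g or r > k2 * b for (r, g, b) in row] for row in rgb]
    dilated = set()
    for i in range(h):
        for j in range(w):
            if mask[i][j]:
                # one-pixel dilation of the marked region
                for di in (-1, 0, 1):
                    for dj in (-1, 0, 1):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w:
                            dilated.add((ni, nj))
    return dilated
```

On a 3x3 grid with a single reddish pixel at the center, dilation spreads the mark to all nine positions, which is the intended effect: widening the color boundary so that its whole neighbourhood is repaired.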
Referring to fig. 4, fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. The image processing apparatus described below and the image processing method described above may be referred to correspondingly. The image processing apparatus includes:
a color endoscope image acquisition module 100, configured to acquire a color endoscope image in a time-series sampling video;
a boundary point extracting module 200, configured to extract a boundary area image in the color endoscope image, and determine a boundary point according to the boundary area image;
a restoration value determining module 300, configured to determine a restoration value of a boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, where the pixel point information includes a contribution value and distance information, and the gradient information includes x-direction gradient information and y-direction gradient information;
and a restoration module 400 for replacing the values of the boundary points in the color endoscopic image with all the restoration values to obtain a restored image.
In some specific embodiments, the boundary point extracting module 200 includes:
a color boundary region extraction unit for extracting a color boundary region of the color endoscopic image;
a boundary area image acquisition unit, which is used for carrying out binarization and expansion processing on the color boundary area to obtain a boundary area image;
the judging unit is used for judging whether the gray value of the pixel point in the boundary area image is 0 or not;
and the boundary point determining unit is used for determining, if so, the pixel point with the gray value of 0 as the boundary point.
In some specific embodiments, the color boundary region extracting unit includes:
the judgment subunit is used for judging whether the R component of the pixel point in the color endoscope image is larger than the G component multiplied by a first preset multiple or judging whether the R component of the pixel point in the color endoscope image is larger than the B component multiplied by a second preset multiple;
and the color boundary region determining unit is used for determining, if so, the pixel points of which the R component is greater than the first preset multiple of the G component, or of which the R component is greater than the second preset multiple of the B component, as boundary region points, so as to obtain the color boundary region formed by all the boundary region points.
In some specific embodiments, the boundary area image obtaining unit further includes:
and the etching subunit is used for performing etching treatment.
In some specific embodiments, the restoration value determining module 300 includes:
the first image and second image obtaining unit is used for determining a reference channel, obtaining a first image formed by data obtained by dividing the first channel numerical value by reference channel data corresponding to the reference channel, and simultaneously obtaining a second image formed by data obtained by dividing the second channel numerical value by the reference channel data;
the first numerical value and second numerical value obtaining unit is used for determining a first numerical value of the first image and a second numerical value of the second image by utilizing pixel point information in a preset region range of all boundary points and gradient information determined according to the boundary points;
and the repair value determining unit is used for multiplying the first value by the reference value corresponding to the reference channel to obtain a first repair value, and multiplying the second value by the reference value corresponding to the reference channel to obtain a second repair value so as to obtain a repair value consisting of the reference value, the first repair value and the second repair value.
In some specific embodiments, the first and second value obtaining units include:
the non-repair-required point determining subunit is used for determining a plurality of non-repair-required points in the preset area range of the boundary point;
the contribution value determining subunit is used for calculating the contribution value of each point which does not need to be repaired by utilizing a first preset algorithm;
a first numerical value and second numerical value obtaining subunit, configured to determine, based on the contribution value of each unnecessary-to-repair point, a numerical value of a boundary point according to a second preset algorithm, so as to obtain a first numerical value of the first image and a second numerical value of the second image;
the first preset algorithm is ω = dir · abs(cosθ), and the second preset algorithm is

I = (Σ ω · I_Know) / (Σ ω),

wherein ω is the contribution value of the point which does not need to be repaired, dir is determined by the distances Δx and Δy, θ is the angle between the gradient (gradx, grady) and the offset (Δx, Δy), Δx is the x distance between the point which does not need to be repaired and the boundary point, Δy is the y distance between the point which does not need to be repaired and the boundary point, gradx is the x-direction gradient information, grady is the y-direction gradient information, and I_Know is the numerical value of the point which does not need to be repaired in the color endoscope image.
Since the embodiment of the image processing apparatus portion and the embodiment of the image processing method portion correspond to each other, please refer to the description of the embodiment of the image processing method portion for the embodiment of the image processing apparatus portion, which is not repeated here.
In the following, an electronic device provided by an embodiment of the present application is introduced, and the electronic device described below and the image processing method described above may be referred to correspondingly.
The present embodiment provides an electronic device, including:
a memory for storing a computer program;
a processor for implementing the steps of the image processing method as described above when executing the computer program.
Since the embodiment of the electronic device portion and the embodiment of the image processing method portion correspond to each other, please refer to the description of the embodiment of the image processing method portion for the embodiment of the electronic device portion, which is not repeated here.
In the following, a computer-readable storage medium provided by an embodiment of the present application is introduced, and the computer-readable storage medium described below and the image processing method described above may be referred to correspondingly.
The present embodiment provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method as described above.
Since the embodiment of the computer-readable storage medium portion corresponds to the embodiment of the image processing method portion, please refer to the description of the embodiment of the image processing method portion for the embodiment of the computer-readable storage medium portion, which is not repeated here.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The image processing method, the image processing apparatus, the electronic device, and the computer-readable storage medium provided by the present application are described in detail above. The principles and embodiments of the present application are explained herein using specific examples, which are provided only to help understand the method and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.

Claims (10)

1. An image processing method, comprising:
acquiring a color endoscope image in a time sequence sampling video;
extracting a boundary area image in the color endoscope image, and determining a boundary point according to the boundary area image;
obtaining a restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, wherein the pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in the x direction and gradient information in the y direction;
and replacing the numerical values of the boundary points in the color endoscope image by all the repairing numerical values so as to obtain a repaired image.
2. The image processing method according to claim 1, wherein extracting a boundary area image in the color endoscope image and determining a boundary point from the boundary area image includes:
extracting a color boundary region of the color endoscope image;
carrying out binarization and expansion processing on the color boundary area to obtain a boundary area image;
judging whether the gray value of a pixel point in the boundary area image is 0 or not;
and if so, determining the pixel point with the gray value of 0 as the boundary point.
3. The image processing method according to claim 2, wherein extracting the color boundary region of the color endoscopic image includes:
judging whether the R component of the pixel point in the color endoscope image is larger than the G component multiplied by a first preset multiple or not, or judging whether the R component of the pixel point in the color endoscope image is larger than the B component multiplied by a second preset multiple or not;
if yes, determining the pixel points of which the R component is greater than the first preset multiple of the G component, or of which the R component is greater than the second preset multiple of the B component, as boundary region points, so as to obtain the color boundary region formed by all the boundary region points.
4. The image processing method according to claim 2, wherein between the binarization and expansion processing of the color boundary region and the obtaining of the boundary region image, further comprising:
and carrying out corrosion treatment.
5. The image processing method according to claim 2, wherein obtaining the restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point comprises:
determining a reference channel, obtaining a first image formed by data obtained by dividing a first channel numerical value by reference channel data corresponding to the reference channel, and simultaneously obtaining a second image formed by data obtained by dividing a second channel numerical value by the reference channel data;
determining a first numerical value of the first image and a second numerical value of the second image by using the pixel point information in the boundary area image within the preset area range of all the boundary points and gradient information determined according to the boundary points;
and multiplying the first numerical value by a reference numerical value corresponding to the reference channel to obtain a first repair numerical value, and multiplying the second numerical value by a reference numerical value corresponding to the reference channel to obtain a second repair numerical value, so as to obtain the repair numerical value consisting of the reference numerical value, the first repair numerical value and the second repair numerical value.
6. The image processing method according to claim 5, wherein determining a first value of the first image and a second value of the second image using the pixel point information in the boundary region image within the preset region range of all the boundary points and gradient information determined according to the boundary points comprises:
determining a plurality of points which do not need to be repaired within the preset area range of the boundary point;
calculating the contribution value of each non-repair-required point by using a first preset algorithm;
determining the numerical value of the boundary point according to a second preset algorithm based on the contribution value of each non-repair-required point so as to obtain the first numerical value of the first image and the second numerical value of the second image;
the first predetermined algorithm is ω ═ dir abs (cos θ), and the second predetermined algorithm is
Wherein the content of the first and second substances,
ω is the contribution of the non-repair-required point, ΔxIs the x distance, Δ, between the point not requiring repair and the boundary pointyIs the y distance between the point not needing to be repaired and the boundary point, gradx is the gradient information in the x direction, grady is the gradient information in the y direction, IKnowIs a value in the color endoscopic image that does not require a repair point.
7. The image processing method according to claim 5, wherein the reference channel is a G channel.
8. An image processing apparatus characterized by comprising:
the color endoscope image acquisition module is used for acquiring a color endoscope image in the time sequence sampling video;
the boundary point extraction module is used for extracting a boundary area image in the color endoscope image and determining boundary points according to the boundary area image;
a restoration value determining module, configured to determine a restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, where the pixel point information includes a contribution value and distance information, and the gradient information includes gradient information in an x direction and gradient information in a y direction;
and the restoration module is used for replacing the numerical values of the boundary points in the color endoscope image with all the restoration numerical values so as to obtain a restored image.
9. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the image processing method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 7.
CN201910983689.7A 2019-10-16 2019-10-16 Image processing method and device, electronic equipment and readable storage medium Active CN110751605B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910983689.7A CN110751605B (en) 2019-10-16 2019-10-16 Image processing method and device, electronic equipment and readable storage medium
PCT/CN2020/092220 WO2021073101A1 (en) 2019-10-16 2020-05-26 Image processing method and apparatus, electronic device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910983689.7A CN110751605B (en) 2019-10-16 2019-10-16 Image processing method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN110751605A true CN110751605A (en) 2020-02-04
CN110751605B CN110751605B (en) 2022-12-23

Family

ID=69278571

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910983689.7A Active CN110751605B (en) 2019-10-16 2019-10-16 Image processing method and device, electronic equipment and readable storage medium

Country Status (2)

Country Link
CN (1) CN110751605B (en)
WO (1) WO2021073101A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112288718A (en) * 2020-10-29 2021-01-29 推想医疗科技股份有限公司 Image processing method and apparatus, electronic device, and computer-readable storage medium
CN112561830A (en) * 2020-12-23 2021-03-26 安徽大学 Endoscope image highlight repair method and device
WO2021073101A1 (en) * 2019-10-16 2021-04-22 深圳开立生物医疗科技股份有限公司 Image processing method and apparatus, electronic device, and readable storage medium
CN113132705A (en) * 2021-04-20 2021-07-16 Oppo广东移动通信有限公司 Image color edge correction method, correction device, electronic device and storage medium
WO2023093693A1 (en) * 2021-11-23 2023-06-01 上海微觅医疗器械有限公司 Medical tool inspection method and system, computer device, and storage medium
CN116309160A (en) * 2023-03-10 2023-06-23 北京百度网讯科技有限公司 Image resolution restoration method, device, equipment and storage medium

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN113012076B (en) * 2021-04-27 2023-06-23 广东工业大学 Dunhuang fresco restoration method based on adjacent pixel points and self-encoder

Citations (6)

Publication number Priority date Publication date Assignee Title
CN101783861A (en) * 2010-02-09 2010-07-21 腾讯科技(深圳)有限公司 Method and device for beautifying picture
CN102231792A (en) * 2011-06-29 2011-11-02 南京大学 Electronic image stabilization method based on characteristic coupling
CN105957027A (en) * 2016-04-22 2016-09-21 西南石油大学 MRF sample image restoring method based on required directional structural feature statistics
CN107146229A (en) * 2017-04-05 2017-09-08 西安电子科技大学 Polyp of colon image partition method based on cellular Automation Model
CN108830780A (en) * 2018-05-09 2018-11-16 北京京东金融科技控股有限公司 Image processing method and device, electronic equipment, storage medium
CN109903322A (en) * 2019-01-24 2019-06-18 江苏大学 A kind of depth camera depth image restorative procedure

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JPWO2016194179A1 (en) * 2015-06-03 2018-03-29 オリンパス株式会社 Imaging apparatus, endoscope apparatus, and imaging method
CN110751605B (en) * 2019-10-16 2022-12-23 深圳开立生物医疗科技股份有限公司 Image processing method and device, electronic equipment and readable storage medium


Cited By (8)

Publication number Priority date Publication date Assignee Title
WO2021073101A1 (en) * 2019-10-16 2021-04-22 深圳开立生物医疗科技股份有限公司 Image processing method and apparatus, electronic device, and readable storage medium
CN112288718A (en) * 2020-10-29 2021-01-29 推想医疗科技股份有限公司 Image processing method and apparatus, electronic device, and computer-readable storage medium
CN112561830A (en) * 2020-12-23 2021-03-26 安徽大学 Endoscope image highlight repair method and device
CN112561830B (en) * 2020-12-23 2022-11-18 安徽大学 Endoscope image highlight repair method and device
CN113132705A (en) * 2021-04-20 2021-07-16 Oppo广东移动通信有限公司 Image color edge correction method, correction device, electronic device and storage medium
WO2023093693A1 (en) * 2021-11-23 2023-06-01 上海微觅医疗器械有限公司 Medical tool inspection method and system, computer device, and storage medium
CN116309160A (en) * 2023-03-10 2023-06-23 北京百度网讯科技有限公司 Image resolution restoration method, device, equipment and storage medium
CN116309160B (en) * 2023-03-10 2024-04-12 北京百度网讯科技有限公司 Image resolution restoration method, device, equipment and storage medium

Also Published As

Publication number Publication date
WO2021073101A1 (en) 2021-04-22
CN110751605B (en) 2022-12-23

Similar Documents

Publication Publication Date Title
CN110751605B (en) Image processing method and device, electronic equipment and readable storage medium
EP2188779B1 (en) Extraction method of tongue region using graph-based approach and geometric properties
JP5094036B2 (en) Endoscope insertion direction detection device
JP5871325B2 (en) Information processing apparatus, information processing system, information processing method, program, and recording medium
JP2007244519A (en) Image analysis apparatus
WO2013187206A1 (en) Image processing device, image processing method, and image processing program
CN111091559A (en) Depth learning-based auxiliary diagnosis system for small intestine sub-scope lymphoma
TWI673683B (en) System and method for identification of symptom image
CN108294728A (en) wound state analysis method and system
CN108615045B (en) Method, device and equipment for screening images shot by capsule endoscopy
CN110855889A (en) Image processing method, image processing apparatus, image processing device, and storage medium
CN111784686A (en) Dynamic intelligent detection method, system and readable storage medium for endoscope bleeding area
JP2008185337A (en) System, method and program for evaluating pathological image
JP7075773B2 (en) Image processing equipment, microscope system, image processing method and image processing program
JP5286215B2 (en) Outline extracting apparatus, outline extracting method, and outline extracting program
JP2018171516A (en) Image processing method, diagnosis device, and program
CN113496470B (en) Image processing method and device, electronic equipment and storage medium
JP2019118670A (en) Diagnosis support apparatus, image processing method, and program
Tie-Rui et al. The research of X-ray bone fracture image enhancement algorithms
WO2021009804A1 (en) Method for learning threshold value
CN111563839A (en) Fundus image conversion method and device
CN110140150B (en) Image processing method and device and terminal equipment
CN113221909B (en) Image processing method, image processing apparatus, and computer-readable storage medium
CN110264418A (en) Method for enhancing picture contrast, system and device
CN114418920B (en) Endoscope multi-focus image fusion method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant