CN110751605B - Image processing method and device, electronic equipment and readable storage medium - Google Patents

Image processing method and device, electronic equipment and readable storage medium

Info

Publication number
CN110751605B
CN110751605B (granted from application CN201910983689.7A; earlier published as CN110751605A)
Authority
CN
China
Prior art keywords
image
boundary
value
point
numerical value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910983689.7A
Other languages
Chinese (zh)
Other versions
CN110751605A (en
Inventor
余力
冯能云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sonoscape Medical Corp
Original Assignee
Sonoscape Medical Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sonoscape Medical Corp filed Critical Sonoscape Medical Corp
Priority to CN201910983689.7A priority Critical patent/CN110751605B/en
Publication of CN110751605A publication Critical patent/CN110751605A/en
Priority to PCT/CN2020/092220 priority patent/WO2021073101A1/en
Application granted granted Critical
Publication of CN110751605B publication Critical patent/CN110751605B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T 5/77
    • G06T 7/13 Edge detection (under G06T 7/00 Image analysis; G06T 7/10 Segmentation; Edge detection)
    • G16H 30/40 ICT specially adapted for processing medical images, e.g. editing (under G16H 30/00 ICT for handling or processing of medical images)
    • G06T 2207/10024 Color image (image acquisition modality)
    • G06T 2207/20036 Morphological image processing
    • G06T 2207/20192 Edge enhancement; Edge preservation (under G06T 2207/20 Special algorithmic details)

Abstract

The application provides an image processing method comprising the following steps: acquiring a color endoscope image from a time-sequentially sampled video; extracting a boundary region image from the color endoscope image and determining boundary points from it; obtaining a repair value for each boundary point by using the pixel point information within a preset area range of the boundary point and the gradient information determined from the boundary point, wherein the pixel point information comprises a contribution value and distance information, and the gradient information comprises x-direction and y-direction gradient information; and replacing the values of the boundary points in the color endoscope image with all the repair values to obtain a repaired image. The method solves the problem of repairing the color boundaries in the image caused by movement of instruments or internal organs in the body; the image is more intuitive and clear, and medical staff can conveniently observe the repaired image. The application also provides an image processing apparatus, an electronic device and a computer-readable storage medium, all of which have the same beneficial effects.

Description

Image processing method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of endoscope technologies, and in particular, to an image processing method, an image processing apparatus, an electronic device, and a computer-readable storage medium.
Background
Time-sequential sampling is one of the core technologies in the field of endoscopy. An image obtained by time-sequential sampling is sharper than one obtained with a Bayer pattern and is brighter in dye mode, but the color boundaries produced by this imaging mode remain a problem to be solved.
Therefore, how to solve the above technical problem is an issue for those skilled in the art.
Disclosure of Invention
The aim of the application is to provide an image processing method, an image processing apparatus, an electronic device and a computer-readable storage medium, so that the image is more intuitive and clear and medical staff can conveniently observe the repaired image. The specific scheme is as follows:
the application provides an image processing method, comprising the following steps:
acquiring a color endoscope image in a time sequence sampling video;
extracting a boundary area image in the color endoscope image, and determining a boundary point according to the boundary area image;
obtaining a restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, wherein the pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in an x direction and gradient information in a y direction;
and replacing the values of the boundary points in the color endoscope image with all the repair values to obtain a repaired image.
Optionally, extracting a boundary area image in the color endoscope image, and determining a boundary point according to the boundary area image, includes:
extracting a color boundary area of the color endoscope image;
carrying out binarization and expansion processing on the color boundary area to obtain a boundary area image;
judging whether the gray value of a pixel point in the boundary area image is 0 or not;
and if so, determining the pixel point with the gray value of 0 as the boundary point.
Optionally, extracting a color boundary region of the color endoscope image includes:
judging whether the R component of a pixel point in the color endoscope image is larger than the G component multiplied by a first preset multiple, or larger than the B component multiplied by a second preset multiple;
if so, determining the pixel points whose R component exceeds the first preset multiple of the G component, or the second preset multiple of the B component, as boundary region points, and obtaining the color boundary region formed by all the boundary region points.
Optionally, between performing binarization and expansion processing on the color boundary region and obtaining the boundary region image, the method further includes:
performing erosion processing.
Optionally, obtaining a restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, includes:
determining a reference channel, obtaining a first image formed by data obtained by dividing a first channel numerical value by reference channel data corresponding to the reference channel, and simultaneously obtaining a second image formed by data obtained by dividing a second channel numerical value by the reference channel data;
determining a first numerical value of the first image and a second numerical value of the second image by using the pixel point information in the boundary area image within the preset area range of all the boundary points and gradient information determined according to the boundary points;
and multiplying the first numerical value by a reference numerical value corresponding to the reference channel to obtain a first repair numerical value, and multiplying the second numerical value by a reference numerical value corresponding to the reference channel to obtain a second repair numerical value, so as to obtain the repair numerical value consisting of the reference numerical value, the first repair numerical value and the second repair numerical value.
Optionally, determining a first numerical value of the first image and a second numerical value of the second image by using the pixel point information in the boundary region image within the preset region range of all the boundary points and the gradient information determined according to the boundary points, includes:
determining a number of non-repair points within the preset area range of the boundary point;
calculating the contribution value of each non-repair point by using a first preset algorithm;
determining the value of the boundary point according to a second preset algorithm, based on the contribution value of each non-repair point, so as to obtain the first value of the first image and the second value of the second image;
the first preset algorithm is ω = dir abs (cos θ), and the second preset algorithm is
Figure BDA0002236021580000031
Wherein, the first and the second end of the pipe are connected with each other,
Figure BDA0002236021580000032
ω is a contribution value of the unnecessary repair point, Δ x is an x distance between the unnecessary repair point and the boundary point, Δ y is a y distance between the unnecessary repair point and the boundary point, gradx is x-direction gradient information, grady is y-direction gradient information, I is Know Is a value in the color endoscopic image that does not require a repair point.
Optionally, the reference channel is a G channel.
The application provides an image processing apparatus, including:
the color endoscope image acquisition module is used for acquiring a color endoscope image in the time sequence sampling video;
the boundary point extraction module is used for extracting a boundary area image in the color endoscope image and determining boundary points according to the boundary area image;
the restoration value determining module is used for determining the restoration value of the boundary point by utilizing pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, wherein the pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in the x direction and gradient information in the y direction;
and the restoration module is used for replacing the numerical values of the boundary points in the color endoscope image with all the restoration numerical values so as to obtain a restored image.
Optionally, the boundary point extracting module includes:
a color boundary region extraction unit configured to extract a color boundary region of the color endoscopic image;
a boundary area image obtaining unit, configured to perform binarization and expansion processing on the color boundary area to obtain a boundary area image;
the judging unit is used for judging whether the gray value of the pixel point in the boundary area image is 0 or not;
and the boundary point determining unit is used for determining the pixel points whose gray value is 0 as the boundary points if so.
Optionally, the color boundary region extracting unit includes:
the judgment subunit is used for judging whether the R component of the pixel point in the color endoscope image is larger than the G component multiplied by a first preset multiple or judging whether the R component of the pixel point in the color endoscope image is larger than the B component multiplied by a second preset multiple;
and the color boundary region determining unit is used for determining, if so, the pixel points whose R component exceeds the first preset multiple of the G component, or the second preset multiple of the B component, as boundary region points, to obtain the color boundary region formed by all the boundary region points.
Optionally, the boundary area image obtaining unit further includes:
and the etching subunit is used for performing etching treatment.
Optionally, the repair value determining module includes:
the first image and second image obtaining unit is used for determining a reference channel, obtaining a first image formed by data obtained by dividing a first channel numerical value by reference channel data corresponding to the reference channel, and simultaneously obtaining a second image formed by data obtained by dividing a second channel numerical value by the reference channel data;
a first numerical value and second numerical value obtaining unit, configured to determine a first numerical value of the first image and a second numerical value of the second image by using the pixel point information in the boundary region image within the preset region range of all the boundary points and gradient information determined according to the boundary points;
and the repair numerical value determining unit is used for multiplying the first numerical value by a reference numerical value corresponding to the reference channel to obtain a first repair numerical value, and multiplying the second numerical value by a reference numerical value corresponding to the reference channel to obtain a second repair numerical value so as to obtain the repair numerical value consisting of the reference numerical value, the first repair numerical value and the second repair numerical value.
Optionally, the first and second numerical value obtaining units include:
a non-repair point determining subunit, configured to determine a number of non-repair points within the preset area range of the boundary point;
a contribution value determining subunit, configured to calculate the contribution value of each non-repair point by using a first preset algorithm;
a first value and second value obtaining subunit, configured to determine, according to a second preset algorithm, the value of the boundary point based on the contribution value of each non-repair point, so as to obtain the first value of the first image and the second value of the second image;
the first preset algorithm is ω = dir · abs(cosθ), and the second preset algorithm is

I_b = (Σ ω_i · I_Know,i) / (Σ ω_i)

where the sum runs over the non-repair points within the preset area range of boundary point b, and

cosθ = (Δx · gradx + Δy · grady) / (sqrt(Δx² + Δy²) · sqrt(gradx² + grady²))

Here, ω is the contribution value of the non-repair point, dir is the distance weight determined from Δx and Δy, Δx is the x distance between the non-repair point and the boundary point, Δy is the y distance between the non-repair point and the boundary point, gradx is the x-direction gradient information, grady is the y-direction gradient information, and I_Know is the value of the non-repair point in the color endoscope image.
The application provides an electronic device, including:
a memory for storing a computer program;
a processor for implementing the steps of the image processing method as described above when executing the computer program.
The present application provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the image processing method as described above.
The application provides an image processing method comprising: acquiring a color endoscope image from a time-sequentially sampled video; extracting a boundary region image from the color endoscope image and determining boundary points from it; obtaining a repair value for each boundary point by using the pixel point information within a preset area range of the boundary point and the gradient information determined from the boundary point, wherein the pixel point information comprises a contribution value and distance information, and the gradient information comprises x-direction and y-direction gradient information; and replacing the values of the boundary points in the color endoscope image with all the repair values to obtain a repaired image.
It can be seen that the boundary region image of the color endoscope image in the time-sequentially sampled video is extracted, the boundary points are determined, the repair values are obtained from the pixel point information within the preset area range of the boundary points and the gradient information determined from them, and the boundary point values in the original color endoscope image are replaced by the repair values. This achieves image repair and solves the problem of repairing the color boundaries caused by movement of instruments or internal organs in the body; the image is more intuitive and clear, and medical staff can conveniently observe the repaired image. The application also provides an image processing apparatus, an electronic device and a computer-readable storage medium, all of which have the above beneficial effects and are not described again here.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the following drawings show only embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating color boundary region extraction in another image processing method according to an embodiment of the present disclosure;
fig. 3 is a flowchart of determining a modified value in another image processing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. It is obvious that the described embodiments are only some, not all, of the embodiments of the present application; all other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a flowchart of an image processing method according to an embodiment of the present disclosure, which specifically includes:
and S110, acquiring a color endoscope image in the time-sequence sampling video.
In the field of endoscope technology, images in a video obtained by time-sequential sampling are sharper than those in a video obtained with a Bayer template, and are brighter in dye mode. During surgery, however, the movement of instruments or internal organs produces color boundaries, which hinder observation by medical staff and thus affect the outcome of the operation. This embodiment therefore provides an image processing method that repairs the color boundaries caused by such movement, making the image more intuitive and clear and convenient for medical staff to observe after repair.
And S120, extracting a boundary area image in the color endoscope image, and determining a boundary point according to the boundary area image.
This embodiment does not limit how the boundary region image is extracted, as long as the purpose of the embodiment is achieved. The boundary region image comprises a number of pixel points, from which the boundary points are then determined; the way the boundary points are determined is likewise not limited and can be customized by the user.
In one implementation, referring to fig. 2 (a flowchart of color boundary region extraction in another image processing method provided by an embodiment of the application), extracting the boundary region image from the color endoscope image and determining the boundary points from it includes:
and S121, extracting a color boundary area of the color endoscope image.
In one implementation, extracting a color boundary region of a color endoscopic image includes:
judging whether the R component of a pixel point in the color endoscope image is larger than the G component multiplied by a first preset multiple, or larger than the B component multiplied by a second preset multiple; if so, determining that pixel point as a boundary region point, and obtaining the color boundary region formed by all the boundary region points.
In the field of endoscope technology, the high hemoglobin content makes a color endoscope image predominantly red, so the R component is high. A pixel point is therefore taken to be a boundary region point when its R component exceeds the first preset multiple of the G component, or when it exceeds the second preset multiple of the B component. All boundary region points are determined by this thresholding, and the color boundary region composed of all of them is obtained.
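The thresholding just described can be sketched as follows; the multiples k1 and k2 are illustrative placeholders, since the text does not fix their values:

```python
import numpy as np

def extract_color_boundary_region(image, k1=1.5, k2=1.5):
    """Mark pixels whose R component dominates G or B as boundary-region points.

    `image` is an H x W x 3 array in R, G, B order; k1 and k2 stand in for the
    first and second preset multiples (their values here are assumptions).
    Returns a boolean mask that is True at boundary-region points.
    """
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    return (r > k1 * g) | (r > k2 * b)
```

The union of the two comparisons mirrors the "or" in the judgment step: a pixel qualifies if either threshold is exceeded.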
And S122, carrying out binarization and expansion processing on the color boundary area to obtain a boundary area image.
The color boundary region is binarized to obtain a binary (black-and-white) image, which is then dilated. Dilation merges into the object all background points in contact with it, expanding the boundary outwards so that the boundary region image is more complete.
In one implementation, between binarizing and dilating the color boundary region and obtaining the boundary region image, the method further includes erosion. Erosion eliminates boundary points and shrinks the boundary inwards; erosion followed by dilation is called an opening operation, which eliminates small objects, separates objects at thin connections, and smooths the boundaries of larger objects without noticeably changing their area.
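The dilation and erosion steps can be sketched with a 3x3 square structuring element; this is a minimal NumPy sketch (in practice a library routine such as OpenCV's dilate/erode would be used):

```python
import numpy as np

def dilate(mask, iterations=1):
    """Binary dilation: a pixel becomes True if it or any of its 8 neighbours
    is True, expanding the boundary outwards as described above."""
    out = mask.copy()
    for _ in range(iterations):
        p = np.pad(out, 1, constant_values=False)
        out = (p[:-2, :-2] | p[:-2, 1:-1] | p[:-2, 2:] |
               p[1:-1, :-2] | p[1:-1, 1:-1] | p[1:-1, 2:] |
               p[2:, :-2] | p[2:, 1:-1] | p[2:, 2:])
    return out

def erode(mask, iterations=1):
    """Binary erosion via the duality erode(m) = NOT dilate(NOT m): a pixel
    stays True only if it and all 8 of its neighbours are True (pixels
    outside the image are treated as True here)."""
    return ~dilate(~mask, iterations)
```

Applying `erode` before `dilate` gives the opening operation mentioned above.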
And S123, judging whether the gray value of the pixel point in the boundary area image is 0 or not.
And S124, if yes, determining the pixel point with the gray value of 0 as a boundary point.
When the gray value of a pixel point in the boundary area image is 0, that point is a boundary point. After binarization the boundary area image contains only two gray values: 255 for the part that does not need repair and 0 for the boundary, so any pixel whose gray value is 0 is a boundary point. Determining the boundary points this way is efficient.
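A minimal sketch of this test, assuming the boundary area image stores 0 for boundary pixels and 255 elsewhere:

```python
import numpy as np

def boundary_points(region_image):
    """Return the (row, col) coordinates of pixels whose gray value is 0 in
    the binarized, dilated boundary-area image; per the method above, these
    are the points to repair."""
    rows, cols = np.where(region_image == 0)
    return list(zip(rows.tolist(), cols.tolist()))
```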
S130, obtaining a restoration value of the boundary point by utilizing pixel point information in a preset area range of the boundary point and gradient information determined according to the boundary point.
The pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in the x direction and gradient information in the y direction.
The preset area range is not limited in this embodiment, and may be 5 × 5, 3 × 3, or 4 × 4, as long as the purpose of this embodiment can be achieved.
In one implementation, referring to fig. 3 (a flowchart of repair value determination in another image processing method provided by an embodiment of the application), obtaining the repair value of a boundary point by using the pixel point information within the preset area range of the boundary point and the gradient information determined from the boundary point includes:
s131, determining a reference channel, obtaining a first image formed by data obtained by dividing the first channel value by reference channel data corresponding to the reference channel, and obtaining a second image formed by data obtained by dividing the second channel value by the reference channel data.
The reference channel is not limited in this embodiment and may be any one of the R, G and B channels; the first channel value and the second channel value are the corresponding data in the color endoscope image. The first and second channels are simply the two channels other than the reference channel, and each ratio image is formed by dividing that channel's data by the reference channel data. For example, with R as the reference channel, the two images are the G/R and B/R images (in either order); with G as the reference channel, they are the R/G and B/G images; and with B as the reference channel, they are the G/B and R/B images. Preferably, the reference channel is the G channel.
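A sketch of the ratio-image construction, defaulting to G as the reference channel (index 1 in RGB order); the epsilon guard against division by zero is an addition not stated in the text:

```python
import numpy as np

def ratio_images(image, reference=1):
    """Form the two ratio images described above by dividing the other two
    channels by the reference channel. `image` is H x W x 3 in R, G, B order;
    with the default reference the results are the R/G and B/G images."""
    eps = 1e-6  # assumed guard against division by zero
    ref = image[..., reference].astype(float) + eps
    others = [c for c in range(3) if c != reference]
    first = image[..., others[0]] / ref   # e.g. R/G
    second = image[..., others[1]] / ref  # e.g. B/G
    return first, second
```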
S132, determining a first numerical value of the first image and a second numerical value of the second image by utilizing pixel point information in the boundary region image in the preset region range of all the boundary points and gradient information determined according to the boundary points.
In one implementation, determining the first value of the first image and the second value of the second image by using the pixel point information within the preset area range of all the boundary points and the gradient information determined from the boundary points includes: determining a number of non-repair points within the preset area range of the boundary point; calculating the contribution value of each non-repair point with the first preset algorithm; and determining the value of the boundary point with the second preset algorithm, based on the contribution value of each non-repair point, to obtain the first value of the first image and the second value of the second image. The first preset algorithm is ω = dir · abs(cosθ), and the second preset algorithm is

I_b = (Σ ω_i · I_Know,i) / (Σ ω_i)

where the sum runs over the non-repair points within the preset area range of boundary point b, and

cosθ = (Δx · gradx + Δy · grady) / (sqrt(Δx² + Δy²) · sqrt(gradx² + grady²))

Here, ω is the contribution value of the non-repair point, dir is the distance weight determined from Δx and Δy, Δx is the x distance between the non-repair point and the boundary point, Δy is the y distance between the non-repair point and the boundary point, gradx is the x-direction gradient information, grady is the y-direction gradient information, and I_Know is the value of the non-repair point in the color endoscope image.
The points that do not require repair are the points other than the boundary points within the preset region range; that is, they are the pixel points whose gray value in the boundary region image is 255 within the preset region range of the boundary point. The contribution value of each point that does not require repair is calculated with the first preset algorithm, the value of the boundary point is then obtained with the second preset algorithm based on those contribution values, and the first value and the second value are finally obtained in this way.
For example, suppose the first image is an R/G image (an image formed by dividing the R channel data by the G channel data) and the points that do not require repair within the preset region range of a boundary point b are b1, b2 and b3. For b1, ω1 = dir1*abs(cosθ1), where Δx1 is the x distance between b1 and the boundary point, Δy1 is the y distance between b1 and the boundary point, and I_Know1 is the R value of b1 in the color endoscopic image. For b2, ω2 = dir2*abs(cosθ2), where Δx2 is the x distance between b2 and the boundary point, Δy2 is the y distance between b2 and the boundary point, and I_Know2 is the R value of b2 in the color endoscopic image. For b3, ω3 = dir3*abs(cosθ3), where Δx3 is the x distance between b3 and the boundary point, Δy3 is the y distance between b3 and the boundary point, and I_Know3 is the R value of b3 in the color endoscopic image. The value of the boundary point b in the first image is then
(ω1*I_Know1 + ω2*I_Know2 + ω3*I_Know3) / (ω1 + ω2 + ω3).
This processing is performed for all the boundary points to obtain the values of all the boundary points of the first image, and these values constitute the first value of the first image; the second value of the second image is obtained in the same way.
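The value assigned to a boundary point — the ω-weighted combination of the known values I_Know1, I_Know2, I_Know3 in the worked example — can be sketched as a plain weighted average. This mirrors the form of the example and is an illustrative sketch, not the patent's rendered formula.

```python
def repair_value(known_points):
    """Sketch of the second preset algorithm: weighted average of known values.

    known_points: list of (weight, value) pairs for the points that do not
    require repair inside the boundary point's preset region range.
    """
    total_w = sum(w for w, _ in known_points)
    if total_w == 0:
        return 0.0  # no usable neighbours; caller decides how to handle this
    return sum(w * v for w, v in known_points) / total_w
```

For three neighbours, `repair_value([(w1, v1), (w2, v2), (w3, v3)])` returns (ω1·I1 + ω2·I2 + ω3·I3)/(ω1 + ω2 + ω3), matching the boundary-point expression above.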
S133: Multiply the first value by the reference value corresponding to the reference channel to obtain a first repair value, and multiply the second value by the reference value corresponding to the reference channel to obtain a second repair value, so as to obtain a repair value consisting of the reference value, the first repair value and the second repair value.
In one implementation, when the first image is an R/G image (an image formed by dividing the R channel data by the G channel data), the first value is obtained; all the first values can then be regarded as R/G values and are multiplied by the reference value G corresponding to the reference channel, the G channel, to obtain the R repair value, i.e. the first repair value. When the second image is a B/G image (an image formed by dividing the B channel data by the G channel data), the second value is obtained; all the second values can then be regarded as B/G values and are multiplied by the reference value G to obtain the B repair value, i.e. the second repair value. The repair value finally consists of the R repair value, the reference value G, and the B repair value.
In another implementation, when the first image is an R/B image (an image formed by dividing the R channel data by the B channel data), the first value is obtained; all the first values can then be regarded as R/B values and are multiplied by the reference value B corresponding to the reference channel, the B channel, to obtain the R repair value, i.e. the first repair value. When the second image is a G/B image (an image formed by dividing the G channel data by the B channel data), the second value is obtained; all the second values can then be regarded as G/B values and are multiplied by the reference value B to obtain the G repair value, i.e. the second repair value. The repair value finally consists of the R repair value, the G repair value, and the reference value B.
In another implementation, when the first image is a G/R image (an image formed by dividing the G channel data by the R channel data), the first value is obtained; all the first values can then be regarded as G/R values and are multiplied by the reference value R corresponding to the reference channel, the R channel, to obtain the G repair value, i.e. the first repair value. When the second image is a B/R image (an image formed by dividing the B channel data by the R channel data), the second value is obtained; all the second values can then be regarded as B/R values and are multiplied by the reference value R to obtain the B repair value, i.e. the second repair value. The repair value finally consists of the reference value R, the G repair value, and the B repair value.
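All three implementations follow the same pattern: repair the two ratio images at the boundary points, then multiply each repaired ratio by the reference-channel value to recover the two missing channels. A minimal numpy sketch, assuming the G channel is the reference (the first implementation above); the function name is illustrative.

```python
import numpy as np

def rebuild_rgb(ratio_rg, ratio_bg, g_ref):
    """Multiply repaired ratio values by the reference channel (G assumed).

    ratio_rg, ratio_bg -- repaired R/G and B/G values at the boundary points
    g_ref              -- reference G values at the same points
    Returns the repair values (R, G, B) to write back into the image.
    """
    g_ref = np.asarray(g_ref, dtype=float)
    r_repair = np.asarray(ratio_rg, dtype=float) * g_ref  # R = (R/G) * G
    b_repair = np.asarray(ratio_bg, dtype=float) * g_ref  # B = (B/G) * G
    return r_repair, g_ref, b_repair
```

The B-as-reference and R-as-reference variants differ only in which channel supplies `g_ref` and which two ratios are repaired.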
S140: Replace the values of the boundary points in the color endoscope image with all the repair values, so as to obtain the repaired image.
Based on the above technical scheme, the boundary region image of the color endoscope image in the time-series sampled video is extracted and the boundary points are determined. The repair value is obtained from the pixel point information within the preset region range of each boundary point and the gradient information determined from the boundary point, and the boundary point value in the original color endoscope image is replaced with the repair value to realize image repair. This solves the problem of repairing the color boundary caused by the movement of internal instruments or internal organs, makes the image more intuitive and clear, and allows medical staff to observe the repaired image conveniently.
Based on the foregoing technical solution, this embodiment provides a specific image processing method, including:
1. Acquire the color endoscopic image in the time-series sampled video.
2. Judge whether the R component of a pixel point in the color endoscope image is greater than the G component multiplied by a first preset multiple, or greater than the B component multiplied by a second preset multiple.
3. If yes, determine the pixel points whose R component exceeds the first preset multiple of the G component, or whose R component exceeds the second preset multiple of the B component, as boundary region points, and obtain the color boundary region formed by all the boundary region points.
4. Binarize the color boundary region and apply expansion processing to obtain the boundary region image.
5. Judge whether the gray value of a pixel point in the boundary region image is 0.
6. If yes, determine the pixel points with a gray value of 0 as the boundary points.
7. Determine the reference channel as the G channel; obtain the first image formed by dividing the R channel data by the G channel data, i.e. the R/G image, and at the same time obtain the second image formed by dividing the B channel data by the G channel data, i.e. the B/G image.
8. Determine a plurality of points that do not require repair within the preset region range of each boundary point.
9. Calculate the contribution value of each point that does not require repair with the first preset algorithm ω = dir*abs(cosθ), where dir and cosθ are given by formula images not reproduced here, Δx is the x distance between the point that does not require repair and the boundary point, Δy is the y distance between the point that does not require repair and the boundary point, gradx is the x-direction gradient information, and grady is the y-direction gradient information.
10. Determine the value of each boundary point with a second preset algorithm, based on the contribution values of the points that do not require repair, so as to obtain the R/G value of the R/G image and the B/G value of the B/G image. The second preset algorithm combines the known values as the ω-weighted average Σ(ω·I_Know)/Σω over the points that do not require repair (the rendered formula image is not reproduced), where ω is the contribution value of a point that does not require repair and I_Know is the value of that point in the color endoscopic image.
11. Multiply the R/G value by the G value corresponding to the G channel to obtain the R repair value, and multiply the B/G value by the G value corresponding to the G channel to obtain the B repair value, so as to obtain the repair value consisting of the G value, the R repair value and the B repair value.
12. Replace the values of the boundary points in the color endoscope image with all the repair values to obtain the repaired image.
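Steps 2-6 above (thresholding the color components, then binarizing and growing the boundary region) can be sketched with numpy. The preset multiples used here (1.5) are illustrative values — the text does not disclose concrete multiples — and the hand-rolled one-pixel growth stands in for a proper morphological dilation (e.g. cv2.dilate). The gray-value convention (boundary points 0, other points 255) follows the text.

```python
import numpy as np

def boundary_mask(rgb, k1=1.5, k2=1.5):
    """Sketch of steps 2-6: mark color-boundary pixels of an endoscope frame.

    rgb    -- float array of shape (H, W, 3), channel order R, G, B.
    k1, k2 -- first/second preset multiples (illustrative values only).
    Returns a uint8 image in which boundary points are 0 and all other
    points are 255, matching the gray-value convention in the text.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    boundary = (r > k1 * g) | (r > k2 * b)          # boundary region points
    mask = np.where(boundary, 0, 255).astype(np.uint8)
    # Grow the 0-valued boundary region by one pixel in the four axis
    # directions -- a minimal stand-in for a morphological dilation.
    grown = mask.copy()
    grown[1:, :] = np.minimum(grown[1:, :], mask[:-1, :])
    grown[:-1, :] = np.minimum(grown[:-1, :], mask[1:, :])
    grown[:, 1:] = np.minimum(grown[:, 1:], mask[:, :-1])
    grown[:, :-1] = np.minimum(grown[:, :-1], mask[:, 1:])
    return grown
```

Pixels that remain 255 in the returned mask are the "points that do not require repair" used by steps 8-10.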
Referring to fig. 4, fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. The image processing apparatus described below and the image processing method described above may be referred to correspondingly. The image processing apparatus includes:
a color endoscope image acquisition module 100, configured to acquire a color endoscope image in a time-series sampling video;
a boundary point extraction module 200, configured to extract a boundary area image in the color endoscope image, and determine a boundary point according to the boundary area image;
a restoration value determining module 300, configured to determine a restoration value of a boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, where the pixel point information includes a contribution value and distance information, and the gradient information includes gradient information in an x direction and gradient information in a y direction;
and a restoration module 400 for replacing the values of the boundary points in the color endoscopic image with all the restoration values to obtain a restored image.
In some specific embodiments, the boundary point extracting module 200 includes:
a color boundary region extraction unit for extracting a color boundary region of the color endoscopic image;
a boundary area image acquisition unit, which is used for carrying out binarization and expansion processing on the color boundary area to obtain a boundary area image;
the judging unit is used for judging whether the gray value of the pixel point in the boundary area image is 0 or not;
and a boundary point determining unit, configured to, if so, determine the pixel points with a gray value of 0 as the boundary points.
In some specific embodiments, the color boundary region extracting unit includes:
the judgment subunit is used for judging whether the R component of the pixel point in the color endoscope image is larger than the G component multiplied by a first preset multiple or judging whether the R component of the pixel point in the color endoscope image is larger than the B component multiplied by a second preset multiple;
and a color boundary region determining unit, configured to, if so, determine the pixel points whose R component exceeds the first preset multiple of the G component, or whose R component exceeds the second preset multiple of the B component, as boundary region points, so as to obtain the color boundary region formed by all the boundary region points.
In some specific embodiments, the boundary area image obtaining unit further includes:
an erosion subunit, configured to perform erosion processing.
In some specific embodiments, the repair value determining module 300 includes:
the first image and second image obtaining unit is used for determining a reference channel, obtaining a first image formed by data obtained by dividing the first channel numerical value by reference channel data corresponding to the reference channel, and simultaneously obtaining a second image formed by data obtained by dividing the second channel numerical value by the reference channel data;
the first numerical value and second numerical value obtaining unit is used for determining a first numerical value of the first image and a second numerical value of the second image by utilizing pixel point information in a preset region range of all boundary points and gradient information determined according to the boundary points;
and the restoration value determining unit is used for multiplying the first value by the reference value corresponding to the reference channel to obtain a first restoration value, and multiplying the second value by the reference value corresponding to the reference channel to obtain a second restoration value, so as to obtain a restoration value consisting of the reference value, the first restoration value and the second restoration value.
In some specific embodiments, the first and second value obtaining units include:
the non-repair-required point determining subunit is used for determining a plurality of non-repair-required points in the preset area range of the boundary point;
the contribution value determining subunit is used for calculating the contribution value of each point which does not need to be repaired by utilizing a first preset algorithm;
a first numerical value and second numerical value obtaining subunit, configured to determine, based on the contribution value of each unnecessary-to-repair point, a numerical value of a boundary point according to a second preset algorithm, so as to obtain a first numerical value of the first image and a second numerical value of the second image;
the first predetermined algorithm is ω = dir abs (cos θ), and the second predetermined algorithm is
Figure BDA0002236021580000141
Wherein, the first and the second end of the pipe are connected with each other,
Figure BDA0002236021580000142
ω is a contribution value of the unnecessary repair point, Δ x is an x distance between the unnecessary repair point and the boundary point, Δ y is a y distance between the unnecessary repair point and the boundary point, gradx is x-direction gradient information, grady is y-direction gradient information, I is Know Is a value in the color endoscopic image that does not require a repair point.
Since the embodiment of the image processing apparatus portion and the embodiment of the image processing method portion correspond to each other, please refer to the description of the embodiment of the image processing method portion for the embodiment of the image processing apparatus portion, which is not repeated here.
In the following, an electronic device provided by an embodiment of the present application is introduced, and the electronic device described below and the image processing method described above may be referred to correspondingly.
The present embodiment provides an electronic device, including:
a memory for storing a computer program;
a processor for implementing the steps of the image processing method as described above when executing the computer program.
Since the embodiment of the electronic device portion corresponds to the embodiment of the image processing method portion, please refer to the description of the embodiment of the image processing method portion for the embodiment of the electronic device portion, and details are not repeated here for the moment.
In the following, a computer-readable storage medium provided by an embodiment of the present application is introduced, and the computer-readable storage medium described below and the image processing method described above may be referred to correspondingly.
The present embodiment provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method as described above.
Since the embodiment of the computer-readable storage medium portion and the embodiment of the image processing method portion correspond to each other, please refer to the description of the embodiment of the image processing method portion for the embodiment of the computer-readable storage medium portion, which is not repeated herein for the moment.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts among the embodiments may be referred to each other. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in that embodiment, its description is relatively simple, and the relevant points may be found in the description of the method.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the components and steps of the various examples have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The image processing method, the image processing apparatus, the electronic device, and the computer-readable storage medium provided in the present application are described in detail above. The principles and embodiments of the present application are described herein using specific examples, which are only used to help understand the method and its core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.

Claims (9)

1. An image processing method, comprising:
acquiring a color endoscope image in a time sequence sampling video;
extracting a boundary area image in the color endoscope image, and determining a boundary point according to the boundary area image;
obtaining a restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, wherein the pixel point information comprises a contribution value and distance information, and the gradient information comprises gradient information in the x direction and gradient information in the y direction; the contribution value refers to the contribution weight of the value of the pixel point which does not need to be repaired in the preset area range of the boundary point in the corresponding boundary point repairing calculation process, and the distance information refers to the distance between the pixel point which does not need to be repaired in the preset area range and the boundary point;
replacing the values of the boundary points in the color endoscope image with all the repairing values so as to obtain a repaired image;
the obtaining of the restoration value of the boundary point by using the pixel point information in the preset region range of the boundary point and the gradient information determined according to the boundary point includes:
determining a reference channel, obtaining a first image formed by data obtained by dividing a first channel numerical value by reference channel data corresponding to the reference channel, and simultaneously obtaining a second image formed by data obtained by dividing a second channel numerical value by the reference channel data;
determining a first numerical value of the first image and a second numerical value of the second image by using the pixel point information in the boundary area image within the preset area range of all the boundary points and gradient information determined according to the boundary points;
and multiplying the first numerical value by a reference numerical value corresponding to the reference channel to obtain a first repair numerical value, and multiplying the second numerical value by the reference numerical value corresponding to the reference channel to obtain a second repair numerical value, so as to obtain the repair numerical value consisting of the reference numerical value, the first repair numerical value and the second repair numerical value.
2. The image processing method according to claim 1, wherein extracting a boundary area image in the color endoscope image and determining a boundary point from the boundary area image includes:
extracting a color boundary region of the color endoscope image;
carrying out binarization and expansion processing on the color boundary area to obtain a boundary area image;
judging whether the gray value of a pixel point in the boundary area image is 0 or not;
and if so, determining the pixel point with the gray value of 0 as the boundary point.
3. The image processing method according to claim 2, wherein extracting the color boundary region of the color endoscopic image includes:
judging whether the R component of the pixel points in the color endoscope image is larger than the G component multiplied by a first preset multiple or not, or judging whether the R component of the pixel points in the color endoscope image is larger than the B component multiplied by a second preset multiple or not;
if yes, determining the pixel points whose R component is greater than the G component multiplied by the first preset multiple, or whose R component is greater than the B component multiplied by the second preset multiple, as boundary region points, and obtaining the color boundary region formed by all the boundary region points.
4. The image processing method according to claim 2, further comprising, between the binarization and expansion processing of the color boundary region and the obtaining of the boundary region image:
performing erosion processing.
5. The image processing method according to claim 1, wherein determining a first numerical value of the first image and a second numerical value of the second image using the pixel point information in the boundary region image within the preset region range of all the boundary points and gradient information determined according to the boundary points comprises:
determining a plurality of points which do not need to be repaired within the preset area range of the boundary point;
calculating the contribution value of each point which does not need to be repaired by utilizing a first preset algorithm;
determining the numerical value of the boundary point according to a second preset algorithm based on the contribution value of each non-repair-required point so as to obtain the first numerical value of the first image and the second numerical value of the second image;
the first preset algorithm is ω = dir abs (cos θ), and the second preset algorithm is
Figure FDA0003912541240000021
Wherein, the first and the second end of the pipe are connected with each other,
Figure FDA0003912541240000022
ω is the contribution of the non-repair-required point,Δ x is the x distance between the point where no repair is required and the boundary point, Δ y is the y distance between the point where no repair is required and the boundary point, gradx is the x-direction gradient information, grady is the y-direction gradient information, I Know Is a value in the color endoscopic image that does not require a repair point.
6. The image processing method according to claim 1, wherein the reference channel is a G channel.
7. An image processing apparatus characterized by comprising:
the color endoscope image acquisition module is used for acquiring a color endoscope image in the time sequence sampling video;
the boundary point extraction module is used for extracting a boundary area image in the color endoscope image and determining boundary points according to the boundary area image;
a restoration value determining module, configured to determine a restoration value of the boundary point by using pixel point information in a preset region range of the boundary point and gradient information determined according to the boundary point, where the pixel point information includes a contribution value and distance information, and the gradient information includes gradient information in an x direction and gradient information in a y direction; the contribution value refers to the contribution weight of the value of the pixel point which does not need to be repaired in the preset area range of the boundary point in the corresponding boundary point repairing calculation process, and the distance information refers to the distance between the pixel point which does not need to be repaired in the preset area range and the boundary point;
the restoration module is used for replacing the numerical values of the boundary points in the color endoscope image with all the restoration numerical values so as to obtain a restored image;
wherein the repair value determination module includes:
the first image and second image obtaining unit is used for determining a reference channel, obtaining a first image formed by data obtained by dividing a first channel numerical value by reference channel data corresponding to the reference channel, and simultaneously obtaining a second image formed by data obtained by dividing a second channel numerical value by the reference channel data;
a first numerical value and second numerical value obtaining unit, configured to determine a first numerical value of the first image and a second numerical value of the second image by using the pixel point information in the preset region range of all the boundary points and gradient information determined according to the boundary points;
and the restoration value determining unit is used for multiplying the first value by the reference value corresponding to the reference channel to obtain a first restoration value, and multiplying the second value by the reference value corresponding to the reference channel to obtain a second restoration value, so as to obtain the restoration value consisting of the reference value, the first restoration value and the second restoration value.
8. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the image processing method according to any one of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 6.
CN201910983689.7A 2019-10-16 2019-10-16 Image processing method and device, electronic equipment and readable storage medium Active CN110751605B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910983689.7A CN110751605B (en) 2019-10-16 2019-10-16 Image processing method and device, electronic equipment and readable storage medium
PCT/CN2020/092220 WO2021073101A1 (en) 2019-10-16 2020-05-26 Image processing method and apparatus, electronic device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910983689.7A CN110751605B (en) 2019-10-16 2019-10-16 Image processing method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN110751605A CN110751605A (en) 2020-02-04
CN110751605B true CN110751605B (en) 2022-12-23

Family

ID=69278571

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910983689.7A Active CN110751605B (en) 2019-10-16 2019-10-16 Image processing method and device, electronic equipment and readable storage medium

Country Status (2)

Country Link
CN (1) CN110751605B (en)
WO (1) WO2021073101A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110751605B (en) * 2019-10-16 2022-12-23 深圳开立生物医疗科技股份有限公司 Image processing method and device, electronic equipment and readable storage medium
CN112288718B (en) * 2020-10-29 2021-11-02 推想医疗科技股份有限公司 Image processing method and apparatus, electronic device, and computer-readable storage medium
CN112561830B (en) * 2020-12-23 2022-11-18 安徽大学 Endoscope image highlight repair method and device
CN113132705A (en) * 2021-04-20 2021-07-16 Oppo广东移动通信有限公司 Image color edge correction method, correction device, electronic device and storage medium
CN113012076B (en) * 2021-04-27 2023-06-23 广东工业大学 Dunhuang fresco restoration method based on adjacent pixel points and self-encoder
CN114359135A (en) * 2021-11-23 2022-04-15 上海微创医疗机器人(集团)股份有限公司 Medical tool detection method, system, computer device and storage medium
CN116309160B (en) * 2023-03-10 2024-04-12 北京百度网讯科技有限公司 Image resolution restoration method, device, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102231792A (en) * 2011-06-29 2011-11-02 南京大学 Electronic image stabilization method based on characteristic coupling
CN107146229A (en) * 2017-04-05 2017-09-08 西安电子科技大学 Polyp of colon image partition method based on cellular Automation Model

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101783861B (en) * 2010-02-09 2011-11-23 腾讯科技(深圳)有限公司 Method and device for beautifying picture
WO2016194179A1 (en) * 2015-06-03 2016-12-08 オリンパス株式会社 Imaging device, endoscope and imaging method
CN105957027B (en) * 2016-04-22 2018-09-21 西南石油大学 A kind of MRF sample block image repair methods based on required direction structure characteristic statistics
CN108830780B (en) * 2018-05-09 2020-09-01 京东数字科技控股有限公司 Image processing method and device, electronic device and storage medium
CN109903322B (en) * 2019-01-24 2023-06-09 江苏大学 Depth camera depth image restoration method
CN110751605B (en) * 2019-10-16 2022-12-23 深圳开立生物医疗科技股份有限公司 Image processing method and device, electronic equipment and readable storage medium


Also Published As

Publication number Publication date
WO2021073101A1 (en) 2021-04-22
CN110751605A (en) 2020-02-04

Similar Documents

Publication Publication Date Title
CN110751605B (en) Image processing method and device, electronic equipment and readable storage medium
JP5871325B2 (en) Information processing apparatus, information processing system, information processing method, program, and recording medium
CN106056562B (en) A kind of face image processing process, device and electronic equipment
JP5094036B2 (en) Endoscope insertion direction detection device
US8798344B2 (en) Image processing apparatus, image processing method and computer-readable recording device
JP5576775B2 (en) Image processing apparatus, image processing method, and image processing program
JP2007244519A (en) Image analysis apparatus
CN110855889B (en) Image processing method, image processing apparatus, image processing device, and storage medium
CN103945755B (en) Image processing apparatus
Suman et al. Image enhancement using geometric mean filter and gamma correction for WCE images
CN110136161A (en) Image characteristics extraction analysis method, system and device
KR20160118037A (en) Apparatus and method for detecting lesion from medical image automatically
JP6519703B2 (en) Image processing method, diagnostic device, and program
WO2018158817A1 (en) Image diagnosis device, image diagnosis method, and program
JP5543871B2 (en) Image processing device
CN110140150B (en) Image processing method and device and terminal equipment
CN110264418A (en) Method for enhancing picture contrast, system and device
WO2018030519A1 (en) Breast region detecting system, breast region detecting method, and program
CN114418920B (en) Endoscope multi-focus image fusion method
JP7372072B2 (en) Image processing device, image processing method, image processing program, and image inspection system
CN115482216B (en) Laparoscopic image enhancement method and system
CN117541800B (en) Laryngoscope image-based laryngeal anomaly segmentation method
JP2005274299A (en) Method for evaluating degree of corrosion on hardware
JP6662246B2 (en) Diagnosis support device, image processing method in diagnosis support device, and program
CN115797276A (en) Method, device, electronic device and medium for processing focus image of endoscope

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant