CN107977973B - Method and device for acquiring irradiation field boundary of beam limiter in medical diagnosis image

Publication number
CN107977973B
Authority
CN
China
Prior art keywords
image
edge
processed
boundary
value
Prior art date
Legal status
Active
Application number
CN201610942122.1A
Other languages
Chinese (zh)
Other versions
CN107977973A (en)
Inventor
李海春
王柳
Current Assignee
Beijing Neusoft Medical Equipment Co Ltd
Original Assignee
Beijing Neusoft Medical Equipment Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Neusoft Medical Equipment Co Ltd
Priority to CN201610942122.1A
Publication of CN107977973A
Application granted
Publication of CN107977973B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10116 X-ray image

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The embodiment of the invention discloses a method for acquiring the irradiation field boundary of a beam limiter in a medical diagnosis image, which comprises the following steps: acquiring an image to be processed; obtaining a vertical gradient image and a horizontal gradient image of the image; acquiring the accumulated gradient sum of each straight line in the vertical gradient image and in the horizontal gradient image to obtain a first set and a second set; searching for the maximum value and the minimum value in the first set and in the second set, and determining the straight lines corresponding to these maximum and minimum values as a first edge, a second edge, a third edge and a fourth edge, respectively; and determining the first edge, the second edge, the third edge and the fourth edge as the boundaries of the irradiation field of the beam limiter in the image to be processed. The method provided by the embodiment of the invention can accurately detect a beam limiter irradiation field whose boundary has weak contrast or is discontinuous under the influence of noise, makes it convenient to remove the area outside the beam limiter irradiation field during subsequent processing, and improves the accuracy of subsequent image processing results.

Description

Method and device for acquiring irradiation field boundary of beam limiter in medical diagnosis image
Technical Field
The invention relates to the technical field of image processing, in particular to a method and a device for acquiring a radiation field boundary of a beam limiter in a medical diagnosis image.
Background
In order to minimize the X-ray dose received by the patient, a beam limiter is usually used in medical imaging apparatus to block X-rays that would otherwise harm the human body. The beam limiter typically uses four lead plates to block the X-rays on four sides, so that the X-rays only irradiate the remaining region of interest. In the original image obtained by X-ray imaging, the gray level of the region blocked by the beam limiter is therefore lower than that of the other regions. In subsequent processing and display of the image, the region occluded by the beam limiter affects both the speed of image processing and the display effect. It is therefore necessary to accurately detect the position of the irradiation field of the beam limiter in the image before image processing, and to remove the region occluded by the beam limiter, so as to improve the performance of image processing and the accuracy of the processing result.
The range of the irradiation field of the beam limiter is most often obtained by a detection method based on the Hough transform. This method first computes a binary image of the original image, then converts the binary image into a parameter space by the Hough transform, turning the edge detection problem into the problem of searching for peaks in the parameter space. However, in some original images obtained in practice, the gray level of the area blocked by the beam limiter may be very close to the gray level of the unblocked area, so the contrast at the boundary of the irradiation field is weak. Moreover, image noise also affects binary segmentation, which makes it difficult to perform an appropriate binary segmentation when the binary image is computed. As a result, the irradiation field boundary detected by the Hough transform is discontinuous and inaccurate, which in turn affects subsequent image processing and the accuracy of the processing result.
Therefore, a need exists in the art for a method and an apparatus for obtaining the boundary of the irradiation field of a beam limiter in a medical diagnostic image, which can accurately detect the position of the irradiation field of the beam limiter in the original image, so as to improve the accuracy of subsequent image processing results.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a method and a device for acquiring the irradiation field boundary of a beam limiter in a medical diagnosis image, which can accurately detect the position of the irradiation field of the beam limiter in an original image so as to improve the accuracy of subsequent image processing results.
The embodiment of the invention provides a method for acquiring a radiation field boundary of a beam limiter in a medical diagnosis image, which comprises the following steps:
acquiring an image to be processed;
respectively acquiring the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction;
acquiring gradient accumulated sum of all pixel points on each straight line in the image to be processed in the vertical direction to obtain a first set;
acquiring the gradient accumulated sum of all pixel points on each straight line in the image to be processed in the horizontal direction to obtain a second set;
searching the maximum value and the minimum value in the first set, determining that a straight line corresponding to the maximum value in the first set is a first edge, and determining that a straight line corresponding to the minimum value in the first set is a second edge;
searching the maximum value and the minimum value in the second set, determining that the straight line corresponding to the maximum value in the second set is a third edge, and determining that the straight line corresponding to the minimum value in the second set is a fourth edge;
and determining the first edge, the second edge, the third edge and the fourth edge as boundaries of an irradiation field of the beam limiter in the image to be processed.
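For illustration only, the following Python/NumPy sketch implements the steps above for the common case in which the irradiation field edges are aligned with the image axes; the function names, the simple forward-difference gradients and the synthetic test image are assumptions made for the example, not details fixed by the invention.

```python
import numpy as np

def find_field_boundaries(image):
    """Locate the four beam-limiter irradiation field boundaries of an image
    whose field edges are (approximately) aligned with the image axes.
    Returns (top_row, bottom_row, left_col, right_col)."""
    img = image.astype(np.float64)

    # Gradient of every pixel in the vertical and horizontal directions
    # (simple forward differences, used here only for illustration).
    grad_v = np.zeros_like(img)
    grad_h = np.zeros_like(img)
    grad_v[1:, :] = img[1:, :] - img[:-1, :]   # row-to-row change
    grad_h[:, 1:] = img[:, 1:] - img[:, :-1]   # column-to-column change

    # First set: accumulated vertical gradients of every horizontal line (row).
    # Second set: accumulated horizontal gradients of every vertical line (column).
    first_set = grad_v.sum(axis=1)
    second_set = grad_h.sum(axis=0)

    # The maxima and minima of the two sets give the four candidate edges.
    top_row = int(np.argmax(first_set))      # dark-to-bright transition
    bottom_row = int(np.argmin(first_set))   # bright-to-dark transition
    left_col = int(np.argmax(second_set))
    right_col = int(np.argmin(second_set))
    return top_row, bottom_row, left_col, right_col

if __name__ == "__main__":
    # Synthetic test image: a dark frame around a brighter rectangular field.
    img = np.full((200, 300), 100.0)
    img[40:160, 60:240] = 900.0
    print(find_field_boundaries(img))   # (40, 160, 60, 240)
```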
Preferably, the respectively obtaining the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction specifically includes:
respectively obtaining the gray level mean values of all pixel points in a first preset range, a second preset range, a third preset range and a fourth preset range to obtain a first gray level mean value, a second gray level mean value, a third gray level mean value and a fourth gray level mean value;
the first preset range is located right above a preset point, the second preset range is located right below the preset point, the third preset range is located right left of the preset point, the fourth preset range is located right of the preset point, the first preset range, the second preset range, the third preset range and the fourth preset range are the same in size and the same in distance with the preset point, and the preset point is any pixel point in the image to be processed;
acquiring the gradient of the preset point in the vertical direction according to the first gray average value and the second gray average value;
and acquiring the gradient of the preset point in the horizontal direction according to the third gray average value and the fourth gray average value.
Preferably, the determining that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed specifically includes:
when the absolute value of the maximum value in the first set is larger than a first initial threshold value, determining that the first edge is a first boundary of an irradiation field of a beam limiter in the image to be processed;
when the absolute value of the minimum value in the first set is larger than a second initial threshold, determining that the second edge is a second boundary of an irradiation field of a beam limiter in the image to be processed;
when the absolute value of the maximum value in the second set is greater than a third initial threshold, determining that the third edge is a third boundary of an irradiation field of a beam limiter in the image to be processed;
and when the absolute value of the minimum value in the second set is greater than a fourth initial threshold, determining that the fourth edge is a fourth boundary of the irradiation field of the beam limiter in the image to be processed.
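A minimal sketch of this confirmation step, continuing the axis-aligned example above; the dictionary layout and the threshold values are placeholders, and the way the initial thresholds can be chosen is described further below.

```python
import numpy as np

def confirm_boundaries(first_set, second_set, thresholds):
    """Keep a candidate edge only if the absolute accumulated gradient at the
    corresponding extremum exceeds its initial threshold; rejected candidates
    are reported as None. `thresholds` holds placeholder values."""
    return {
        "first":  int(np.argmax(first_set))  if abs(first_set.max())  > thresholds["first"]  else None,
        "second": int(np.argmin(first_set))  if abs(first_set.min())  > thresholds["second"] else None,
        "third":  int(np.argmax(second_set)) if abs(second_set.max()) > thresholds["third"]  else None,
        "fourth": int(np.argmin(second_set)) if abs(second_set.min()) > thresholds["fourth"] else None,
    }
```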
Preferably, the method further comprises the following steps:
acquiring a low-frequency image of the image to be processed;
determining a region with Y-axis coordinates smaller than (H/2) − H×f in the low-frequency image as an upper moving region of the boundary of the irradiation field of the beam limiter, and determining a region with Y-axis coordinates larger than (H/2) + H×f in the low-frequency image as a lower moving region of the boundary of the irradiation field of the beam limiter;
determining a region with X-axis coordinates smaller than (W/2) − W×f in the low-frequency image as a left moving region of the boundary of the irradiation field of the beam limiter, and determining a region with X-axis coordinates larger than (W/2) + W×f in the low-frequency image as a right moving region of the boundary of the irradiation field of the beam limiter;
determining correspondence between the first edge, the second edge, the third edge, and the fourth edge and the upper movement region, the lower movement region, the left movement region, and the right movement region, each movement region corresponding to one edge;
respectively calculating the gray average values of all pixel points in the upper moving area, the lower moving area, the left moving area and the right moving area to obtain an upper gray average value, a lower gray average value, a left gray average value and a right gray average value;
when the upper gray average value is greater than or equal to A, the judgment threshold value of the upper moving area is equal to a preset maximum judgment threshold value; when the upper gray average value is less than or equal to B, the judgment threshold value of the upper moving area is equal to a preset minimum judgment threshold value; when the upper gray average value is larger than B and smaller than A, setting a judgment threshold value of the upper moving area according to the upper gray average value;
when the lower gray average value is greater than or equal to A, the judgment threshold value of the lower moving area is equal to the preset maximum judgment threshold value; when the lower gray average value is less than or equal to B, the judgment threshold value of the lower moving area is equal to the preset minimum judgment threshold value; when the lower gray average value is larger than B and smaller than A, setting a judgment threshold value of the lower moving area according to the lower gray average value;
when the left gray average value is greater than or equal to A, the judgment threshold value of the left moving area is equal to the preset maximum judgment threshold value; when the left gray average value is less than or equal to B, the judgment threshold value of the left moving area is equal to the preset minimum judgment threshold value; when the left gray average value is larger than B and smaller than A, setting a judgment threshold value of the left moving area according to the left gray average value;
when the right gray average value is greater than or equal to A, the judgment threshold value of the right moving area is equal to the preset maximum judgment threshold value; when the right gray average value is less than or equal to B, the judgment threshold value of the right moving area is equal to the preset minimum judgment threshold value; when the right gray average value is larger than B and smaller than A, setting a judgment threshold value of the right moving area according to the right gray average value;
h is the maximum value of the Y-axis coordinate of the low-frequency image, W is the maximum value of the X-axis coordinate of the low-frequency image, f is a preset truncation coefficient, A is a preset maximum gray average value, and B is a preset minimum gray average value;
the first initial threshold, the second initial threshold, the third initial threshold, and the fourth initial threshold are respectively equal to judgment thresholds of moving areas corresponding to the first edge, the second edge, the third edge, and the fourth edge.
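One possible reading of this adaptive-threshold rule is sketched below. H, W, f, A, B and the maximum/minimum judgment thresholds are all preset parameters of the method; the linear interpolation used for mean values between B and A is an added assumption, since the text only states that the threshold is set according to the region's gray average value.

```python
import numpy as np

def judgment_threshold(region_mean, A, B, t_max, t_min):
    """Map a moving region's gray average value to its judgment threshold:
    >= A -> preset maximum threshold, <= B -> preset minimum threshold,
    otherwise interpolate linearly between them (the interpolation is an
    assumption; the text only says the threshold is set according to the mean)."""
    if region_mean >= A:
        return t_max
    if region_mean <= B:
        return t_min
    return t_min + (t_max - t_min) * (region_mean - B) / (A - B)

def region_thresholds(low_freq, f, A, B, t_max, t_min):
    """Split the low-frequency image into the four moving regions defined above
    and derive one judgment threshold per region."""
    H, W = low_freq.shape   # H: extent of the Y axis (rows), W: extent of the X axis (columns)
    regions = {
        "upper": low_freq[: int(H / 2 - H * f), :],
        "lower": low_freq[int(H / 2 + H * f):, :],
        "left":  low_freq[:, : int(W / 2 - W * f)],
        "right": low_freq[:, int(W / 2 + W * f):],
    }
    return {name: judgment_threshold(float(r.mean()), A, B, t_max, t_min)
            for name, r in regions.items()}
```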
Preferably, the determining that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed specifically includes:
when the absolute value of the difference between the maximum value in the first set and the gradient accumulation sum of all pixel points on the first straight line in the vertical direction is larger than a first preset difference value, determining the first edge as a first boundary of an irradiation field of a beam limiter in the image to be processed;
when the absolute value of the difference between the minimum value in the first set and the gradient accumulation sum of all pixel points on the second straight line in the vertical direction is larger than a second preset difference value, determining the second edge as a second boundary of the beam limiter irradiation field in the image to be processed;
when the absolute value of the difference between the maximum value in the second set and the gradient accumulation sum of all pixel points on a third straight line in the horizontal direction is larger than a third preset difference value, determining that the third edge is a third boundary of the beam limiter irradiation field in the image to be processed;
when the absolute value of the difference between the minimum value in the second set and the gradient accumulation sum of all pixel points on a fourth straight line in the horizontal direction is larger than a fourth preset difference value, determining that the fourth edge is a fourth boundary of the beam limiter irradiation field in the image to be processed;
wherein the absolute value of the difference between the distance between the first straight line and a first preset point in the image to be processed and the distance between the first boundary and the first preset point is smaller than a first distance; the absolute value of the difference between the distance between the second straight line and a second preset point in the image to be processed and the distance between the second boundary and the second preset point is smaller than a second distance; the absolute value of the difference between the distance between the third straight line and a third preset point in the image to be processed and the distance between the third boundary and the third preset point is smaller than a third distance; and the absolute value of the difference between the distance between the fourth straight line and a fourth preset point in the image to be processed and the distance between the fourth boundary and the fourth preset point is smaller than a fourth distance.
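A compact sketch of this difference-based check for a single boundary; `line_sums` is the first or second set, and the reference line index (a line whose distance to the preset point is close to that expected for the boundary) and the preset difference value are placeholders.

```python
import numpy as np

def confirm_by_difference(line_sums, candidate_idx, reference_idx, preset_diff):
    """Keep the candidate edge only if its accumulated gradient differs from
    that of the nearby reference line by more than the preset difference value."""
    return abs(float(line_sums[candidate_idx]) - float(line_sums[reference_idx])) > preset_diff

# Illustrative numbers: the candidate first edge at row 40 is accepted as the
# first boundary only if it clearly stands out against a reference row nearby.
first_set = np.zeros(200)
first_set[40] = 950.0
print(confirm_by_difference(first_set, 40, 45, preset_diff=500.0))   # True
```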
Preferably, the method further comprises the following steps:
when the first boundary and the second boundary are not both determined to be boundaries of the beam limiter irradiation field in the image to be processed, and/or the third boundary and the fourth boundary are not both determined to be boundaries of the beam limiter irradiation field in the image to be processed, searching again for the boundary of the beam limiter irradiation field in the region where no boundary has been determined;
wherein the region of undetermined boundary belongs to the image to be processed.
Preferably, the determining that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed specifically includes:
respectively judging whether the first edge and the second edge are parallel and whether the third edge and the fourth edge are parallel;
when the first edge is parallel to the second edge and the third edge is parallel to the fourth edge, continuously judging whether two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are mutually vertical;
when two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are perpendicular to each other, determining that the first edge, the second edge, the third edge and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed.
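When the candidate edges are obtained from a Radon-transform parameterization (described later), each edge carries a direction angle, so the parallelism and perpendicularity checks reduce to comparing angles; the sketch below and its tolerance value are illustrative assumptions.

```python
def edges_form_rectangle(alpha1, alpha2, alpha3, alpha4, tol_deg=1.0):
    """Check that edge 1 is parallel to edge 2, edge 3 is parallel to edge 4,
    and adjacent edges are perpendicular, given the line direction angles in
    degrees (e.g. from a Radon-transform parameterization)."""
    def angle_diff(a, b):
        d = abs(a - b) % 180.0
        return min(d, 180.0 - d)

    parallel = (angle_diff(alpha1, alpha2) <= tol_deg and
                angle_diff(alpha3, alpha4) <= tol_deg)
    # With the two parallel pairs established, checking one adjacent pair
    # suffices for the perpendicularity condition.
    perpendicular = abs(angle_diff(alpha1, alpha3) - 90.0) <= tol_deg
    return parallel and perpendicular

print(edges_form_rectangle(0.0, 0.0, 90.0, 90.0))   # True
```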
Preferably, the acquiring the image to be processed further includes:
performing edge removing processing on the image to be processed, and removing a first preset number of pixel points at the edge of the image to be processed;
after the image to be processed is subjected to convolution calculation and updated, removing a second preset number of pixel points at the edge of the image to be processed;
performing down-sampling processing on the image to be processed, and updating the image to be processed;
and the size of a convolution kernel used for the convolution calculation is equal to the sampling interval in the down-sampling processing.
The embodiment of the invention also provides a device for acquiring the irradiation field boundary of the beam limiter in the medical diagnosis image, which comprises the following steps: the device comprises a first acquisition module, a second acquisition module, a third acquisition module, a fourth acquisition module, a first determination module and a second determination module;
the first acquisition module is used for acquiring an image to be processed;
the second obtaining module is used for respectively obtaining the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction;
the third obtaining module is configured to obtain a gradient accumulated sum in the vertical direction of all pixel points on each straight line in the image to be processed, so as to obtain a first set;
the fourth obtaining module is configured to obtain a gradient accumulated sum of all pixel points in each straight line in the image to be processed in the horizontal direction to obtain a second set;
the first determining module is configured to search for a maximum value and a minimum value in the first set, determine that a straight line corresponding to the maximum value in the first set is a first edge, and determine that a straight line corresponding to the minimum value in the first set is a second edge;
the first determining module is further configured to search for a maximum value and a minimum value in the second set, determine that a straight line corresponding to the maximum value in the second set is a third edge, and determine that a straight line corresponding to the minimum value in the second set is a fourth edge;
the second determining module is configured to determine that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of the beam limiter in the image to be processed.
Preferably, the second determining module includes: a first determination submodule;
the first determining submodule is used for determining that the first edge is a first boundary of an irradiation field of the beam limiter in the image to be processed when the absolute value of the maximum value in the first set is larger than a first initial threshold;
the first determining submodule is further configured to determine that the second edge is a second boundary of an irradiation field of the beam limiter in the image to be processed when an absolute value of a minimum value in the first set is greater than a second initial threshold;
the first determining submodule is further configured to determine that the third edge is a third boundary of an irradiation field of the beam limiter in the image to be processed, when an absolute value of a maximum value in the second set is greater than a third initial threshold;
the first determining sub-module is further configured to determine, when an absolute value of a minimum value in the second set is greater than a fourth initial threshold, that the fourth edge is a fourth boundary of an irradiation field of the beam limiter in the image to be processed.
Preferably, the second determining module includes: a boundary determination submodule;
the boundary determining submodule is used for determining the first edge as a first boundary of an irradiation field of the beam limiter in the image to be processed when the absolute value of the difference between the maximum value in the first set and the gradient accumulation sum of all pixel points on the first straight line in the vertical direction is larger than a first preset difference value;
the boundary determining submodule is further configured to determine that the second edge is a second boundary of the irradiation field of the beam limiter in the image to be processed when an absolute value of a difference between a minimum value in the first set and a gradient accumulation sum of all pixel points on a second straight line in the vertical direction is greater than a second preset difference value;
the boundary determining submodule is further configured to determine that the third edge is a third boundary of the irradiation field of the beam limiter in the image to be processed when an absolute value of a difference between a maximum value in the second set and a gradient accumulation sum of all pixel points on a third straight line in the horizontal direction is greater than a third preset difference value;
the boundary determining submodule is further configured to determine that the fourth edge is a fourth boundary of the irradiation field of the beam limiter in the image to be processed when an absolute value of a difference between the minimum value in the second set and a gradient accumulation sum of all pixel points on a fourth straight line in the horizontal direction is greater than a fourth preset difference value;
wherein the absolute value of the difference between the distance between the first straight line and a first preset point in the image to be processed and the distance between the first boundary and the first preset point is smaller than a first distance; the absolute value of the difference between the distance between the second straight line and a second preset point in the image to be processed and the distance between the second boundary and the second preset point is smaller than a second distance; the absolute value of the difference between the distance between the third straight line and a third preset point in the image to be processed and the distance between the third boundary and the third preset point is smaller than a third distance; and the absolute value of the difference between the distance between the fourth straight line and a fourth preset point in the image to be processed and the distance between the fourth boundary and the fourth preset point is smaller than a fourth distance.
Preferably, the method further comprises the following steps: a search module;
the searching module is configured to search the boundary of the beam limiter irradiation field again in an area where the boundary is not determined when the first boundary and the second boundary are not determined at the same time as the boundary of the beam limiter irradiation field in the image to be processed and/or when the third boundary and the fourth boundary are not determined at the same time as the boundary of the beam limiter irradiation field in the image to be processed;
wherein the region of undetermined boundary belongs to the image to be processed.
Preferably, the second determining module specifically includes: a fourth judgment submodule, a fifth judgment submodule and a ninth determination submodule;
the fourth judgment submodule is configured to respectively judge whether the first edge and the second edge are parallel and whether the third edge and the fourth edge are parallel;
the fifth judgment sub-module is configured to, when the fourth judgment sub-module judges that the first edge and the second edge are parallel and the third edge and the fourth edge are parallel, continuously judge whether two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are perpendicular to each other;
the ninth determining submodule is configured to determine that the first edge, the second edge, the third edge, and the fourth edge are boundaries of a radiation field of the beam limiter in the image to be processed, when the fifth determining submodule determines that two adjacent edges of the first edge, the second edge, the third edge, and the fourth edge are all perpendicular to each other.
Preferably, the method further comprises the following steps: the system comprises a first processing module, a second processing module and a third processing module;
the first processing module is used for performing edge deletion processing on the image to be processed, and triggering the second processing module after removing a first preset number of pixel points on the edge of the image to be processed;
the second processing module is configured to perform convolution calculation on the to-be-processed image to update the to-be-processed image, remove a second preset number of pixel points at the edge of the to-be-processed image, and trigger the third processing module;
the third processing module is configured to perform downsampling processing on the to-be-processed image, and update the to-be-processed image;
and the size of a convolution kernel used for the convolution calculation is equal to the sampling interval in the down-sampling processing.
Compared with the prior art, the invention has at least the following advantages:
the method for acquiring the irradiation field boundary of the beam limiter in the medical diagnosis image, provided by the embodiment of the invention, comprises the steps of firstly respectively calculating the gradients of an image to be processed in the vertical direction and the horizontal direction, and then calculating the gradient accumulated sum of all pixel points on each straight line in the image to be processed in the vertical direction and the horizontal direction to obtain the vertical gradient accumulated sum and the horizontal gradient accumulated sum of each straight line in the image to be processed. And then, respectively searching straight lines corresponding to the maximum vertical gradient accumulation, the maximum horizontal gradient accumulation sum, the minimum vertical gradient accumulation sum and the minimum horizontal gradient accumulation sum to obtain four edges. The four edges are the boundary lines of the irradiation field of the beam limiter in the image to be processed. The irradiation field of the beam limiter in the image to be processed is an internal area enclosed by the four edges. The method for acquiring the irradiation field boundary of the beam limiter in the medical diagnosis image, provided by the embodiment of the invention, can accurately detect the irradiation field of the beam limiter with weak boundary contrast and discontinuous boundary under the influence of noise to obtain the continuous irradiation field boundary of the beam limiter, is convenient for removing the area except the irradiation field of the beam limiter in the image during subsequent processing, and improves the precision of the post-image processing result.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments described in the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is an original frontal X-ray image of the hand bones, obtained after the beam limiter blocks the redundant X-rays;
FIG. 2 is a schematic flowchart of an embodiment of the method for obtaining the boundary of the irradiation field of a beam limiter in a medical diagnostic image according to the present invention;
FIG. 3a is the vertical gradient image of the original frontal hand-bone X-ray image shown in FIG. 1;
FIG. 3b is the horizontal gradient image of the original frontal hand-bone X-ray image shown in FIG. 1;
FIG. 4a is a schematic diagram of a vertical gradient operator according to an embodiment of the present invention;
FIG. 4b is a diagram illustrating a horizontal gradient operator according to an embodiment of the present invention;
FIG. 5 is a schematic flow chart of a gradient calculation method according to an embodiment of the present invention;
FIG. 6a is a partial matrix of an image to be processed according to an embodiment of the present invention;
FIG. 6b is a matrix generated by row accumulation of a part of matrices in the image to be processed according to the embodiment of the present invention;
FIG. 6c is a matrix generated by row accumulation and column accumulation of a portion of matrices in an image to be processed according to an embodiment of the present invention;
FIG. 7a is the vertical line-integral image of the original frontal hand-bone X-ray image shown in FIG. 1;
FIG. 7b is the horizontal line-integral image of the original frontal hand-bone X-ray image shown in FIG. 1;
FIG. 8a is a schematic diagram of the boundary lines detected in the original frontal hand-bone X-ray image shown in FIG. 1 by the method for obtaining the irradiation field boundary of the beam limiter in a medical diagnosis image according to the embodiment of the present invention;
FIG. 8b is a schematic diagram of the boundary lines detected in an original axial X-ray image of the patella by the method for obtaining the irradiation field boundary of the beam limiter in a medical diagnosis image according to the embodiment of the present invention;
FIG. 8c is a schematic diagram of the boundary lines detected in an original frontal X-ray image of the lumbar spine by the method for obtaining the irradiation field boundary of the beam limiter in a medical diagnosis image according to the embodiment of the present invention;
FIG. 8d is a schematic diagram of the boundary lines detected in an original frontal X-ray image of the sacrum by the method for obtaining the irradiation field boundary of the beam limiter in a medical diagnosis image according to the embodiment of the present invention;
fig. 9 is a schematic structural diagram of an embodiment of the apparatus for acquiring a boundary of an irradiation field of a beam limiter in a medical diagnostic image according to the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Before describing the method and apparatus for obtaining the irradiation field boundary of the beam limiter in a medical diagnostic image, an original image obtained by using a beam limiter in an X-ray imaging system is first described. Fig. 1 shows an original frontal X-ray image of the hand bones obtained after a beam limiter is used to block the redundant X-rays. The white area containing the frontal hand-bone X-ray image is the irradiation field of the beam limiter in the image, and it is rectangular or square. It can be seen that the gray level at the wrist is close to the gray level of the black frame outside the irradiation field, so it is difficult to perform an appropriate binary segmentation with the existing method for acquiring the irradiation field of the beam limiter, which affects subsequent image processing and the accuracy of the processing result.
To this end, the embodiment of the invention provides a method for acquiring the irradiation field boundary of the beam limiter in a medical diagnosis image. After the image to be processed is obtained, the gradients of all pixel points of the image in the vertical and horizontal directions are obtained, and the accumulated gradient sum of all pixel points on each straight line in the image in the vertical and horizontal directions is calculated. The straight lines corresponding to the maximum accumulated gradient sum and the minimum accumulated gradient sum in the vertical and horizontal directions are then searched for, yielding four edges. These four edges are the boundary lines of the irradiation field of the beam limiter in the image to be processed. With this method, a person skilled in the art can accurately detect the irradiation field of the beam limiter in an original medical diagnosis image whose boundary has weak contrast or is discontinuous under the influence of noise, and obtain a continuous irradiation field boundary, so that the area outside the irradiation field can be conveniently removed in subsequent processing and the accuracy of subsequent image processing results is improved.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
The method comprises the following steps:
referring to fig. 2, the figure is a schematic flowchart of an embodiment of the method for acquiring the irradiation field boundary of the beam limiter in the medical diagnostic image according to the present invention.
The method for acquiring the irradiation field boundary of the beam limiter in the medical diagnosis image, provided by the embodiment, comprises the following steps:
s201: acquiring an image to be processed;
in a preferred embodiment of the present invention, the acquired image to be processed may also be preprocessed. Because the original image is interfered by influence factors such as noise and the like, the noise can influence the accuracy of edge identification in the image in the process of acquiring the irradiation field of the beam limiter. Therefore, the original image can be subjected to noise reduction processing to reduce the influence of noise. Moreover, because the resolution of the original image is relatively high and can generally reach 3000 × 3000 pixels, the original image can be subjected to alternate sampling processing to reduce the calculation amount and improve the acquisition efficiency of the irradiation field of the beam limiter.
As an example, the acquired image to be processed may be pre-processed by:
firstly, performing edge removing processing on the image to be processed, and removing a first preset number of pixel points at the edge of the image to be processed;
Since the edge portion of the image contains less image information and is more affected by noise, in this example the invalid pixels near the edge of the original image are removed first to reduce the amount of subsequent calculation, for example the 10 pixels nearest each edge.
Before the image to be processed is downsampled, in order to avoid losing original image information, a convolution kernel with the same size as the sampling interval is used to perform a convolution calculation on the image to be processed; that is, the convolution calculation is performed on the image to be processed to update it. Since the convolution calculation produces an edge response, in this example some pixels at the edge of the convolved image are also removed; that is, a second preset number of pixel points at the edge of the image to be processed after the convolution calculation are removed.
And finally, performing down-sampling processing on the image to be processed, and updating the image to be processed.
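A short sketch of this preprocessing chain; the crop margins and the sampling interval are illustrative values, and the box (averaging) kernel implemented with SciPy's uniform_filter is an assumption, since the text only requires a kernel whose size equals the sampling interval.

```python
import numpy as np
from scipy.ndimage import uniform_filter   # assumes SciPy is available

def preprocess(image, edge_crop=10, interval=4, conv_crop=2):
    """Crop the image border, smooth with a kernel whose size equals the
    sampling interval, crop the convolution edge response, then downsample."""
    img = image.astype(np.float64)

    # 1. Remove a first preset number of border pixels (little information, more noise).
    img = img[edge_crop:-edge_crop, edge_crop:-edge_crop]

    # 2. Convolve with an interval x interval kernel so that downsampling
    #    discards as little original information as possible.
    img = uniform_filter(img, size=interval, mode="nearest")

    # 3. Remove a second preset number of border pixels (convolution edge response).
    img = img[conv_crop:-conv_crop, conv_crop:-conv_crop]

    # 4. Downsample at the sampling interval and return the updated image.
    return img[::interval, ::interval]
```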
S202: respectively acquiring the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction;
It can be understood that an image gradient algorithm calculates the gray level variation between each pixel point in the image and the pixel points in a certain neighborhood of it, and this variation is obtained with a gradient operator during the actual calculation. For example, the gradient of the image function f(x, y) at the point (x, y) is a vector having a magnitude and a direction; specifically, the gradient of the point (x, y) in the x-axis direction can be denoted Gx, and the gradient in the y-axis direction can be denoted Gy. An image composed of image gradients is called a gradient image. The gradients of each pixel point of the image function f(x, y) in the x-axis and y-axis directions can be approximated as Gx = f(x, y) − f(x−1, y) and Gy = f(x, y) − f(x, y−1), respectively. When an edge of the irradiation field of the beam limiter exists in the image function f(x, y), the gray level changes sharply there, because the gray level at the edge differs greatly from the gray level of the nearby non-edge area, so the pixel points on the edge have a large gradient value in the x-axis or y-axis direction. Conversely, in a relatively smooth portion of the image function f(x, y), the gray level change of the region is small, and the gradient of the pixel points in that region is correspondingly small.
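The finite-difference approximation above maps directly onto array differences; a tiny NumPy illustration on a toy image with one vertical edge (the array values are arbitrary):

```python
import numpy as np

# A toy image f(x, y) with one vertical edge (values are arbitrary).
f = np.array([[10., 10., 90., 90.],
              [10., 10., 90., 90.]])

Gx = np.zeros_like(f)
Gy = np.zeros_like(f)
Gx[:, 1:] = f[:, 1:] - f[:, :-1]   # Gx = f(x, y) - f(x-1, y): 80 at the vertical edge
Gy[1:, :] = f[1:, :] - f[:-1, :]   # Gy = f(x, y) - f(x, y-1): zero, no horizontal edge
print(Gx)
print(Gy)
```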
The boundary lines of the irradiation field of the beam limiter in the image to be processed are generally horizontal or vertical straight lines. In the horizontal gradient image of the image to be processed, the gray level change between a pixel point on a horizontal boundary line and the other pixel points in the horizontal direction is small, so the horizontal boundary lines of the image to be processed are not highlighted in the horizontal gradient image and the horizontal boundary features cannot be obtained from it. Similarly, the vertical boundary lines of the image to be processed are not highlighted in the vertical gradient image, and the vertical boundary features cannot be obtained from it. Therefore, in order to accurately obtain the four boundary lines of the irradiation field of the beam limiter in the image to be processed, the gradients of the image to be processed in the vertical and horizontal directions need to be calculated separately, so as to ensure that the edges of the irradiation field of the beam limiter in the image to be processed are completely detected. Fig. 3a is the gradient image of the original frontal hand-bone X-ray image shown in fig. 1 in the vertical direction (i.e., the y-axis), and fig. 3b is its gradient image in the horizontal direction (i.e., the x-axis).
It should be noted that, according to the actual situation, a person skilled in the art may use any gradient calculation method and gradient operator (such as the Laplacian operator, the Sobel operator, etc.) to obtain the gradients of each pixel point in the image to be processed in the vertical and horizontal directions; the present invention is not limited in this respect, and specific embodiments of gradient acquisition are not listed here.
In some possible implementations of the present embodiment, in order to reduce the influence of noise and make the edge position of the irradiation field of the beam limiter in the gradient image more obvious, the vertical gradient operator and the horizontal gradient operator used in the gradient calculation process are respectively shown in fig. 4a and 4 b. Accordingly, referring to fig. 5, a person skilled in the art can calculate the gradient of each pixel point in the image to be processed in the vertical direction and the horizontal direction by the following steps.
At this time, respectively obtaining the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction includes:
s2021: respectively obtaining the gray level mean values of all pixel points in a first preset range, a second preset range, a third preset range and a fourth preset range to obtain a first gray level mean value, a second gray level mean value, a third gray level mean value and a fourth gray level mean value;
the first preset range is located right above a preset point, the second preset range is located right below the preset point, the third preset range is located right left of the preset point, the fourth preset range is located right of the preset point, the first preset range, the second preset range, the third preset range and the fourth preset range are the same in size and the same in distance with the preset point, and the preset point is any pixel point in the image to be processed;
as can be seen from the vertical gradient operator shown in fig. 4a and the horizontal gradient operator shown in fig. 4b, when the gradient of a certain pixel point is calculated, the first preset range is a 5 × 5 pixel point region where two pixel points are spaced from the pixel point right above the pixel point, the second preset range is a 5 × 5 pixel point region where two pixel points are spaced from the pixel point right below the pixel point, the third preset range is a 5 × 5 pixel point region where two pixel points are spaced from the pixel point right left, and the fourth preset range is a 5 × 5 pixel point region where two pixel points are spaced from the pixel point right.
In an example, to reduce the calculation amount, the obtaining gray-scale average values of all pixel points in a first preset range, a second preset range, a third preset range and a fourth preset range corresponding to a certain pixel point in the image to be processed respectively further includes:
firstly, respectively obtaining the accumulated sum of all pixel gray levels in a first area, a second area, a third area and a fourth area corresponding to each preset range in the image to be processed to obtain a first value, a second value, a third value and a fourth value;
in the example shown in fig. 4, it is assumed that four vertices of a first predetermined range of a certain pixel point in the image to be processed are points (x)1,y1)、(x2,y2)、(x3,y3) And (x)4,y4) (as shown in fig. 4 a). Then, the first region is the coordinate (x) in the image to be processed1-1,y1-1) pixel points and a square area defined by pixel points of coordinates (0,0), coordinate (x)1-1,y1-a line between a pixel point of 1) and a pixel point of coordinate (0,0) is a diagonal of the first region; the second area is a coordinate (x)2-1,y2) And the pixel point of (2) and the pixel point of the coordinate (0,0) to determine the square area, coordinate (x)2-1,y2) The connecting line between the pixel point of (a) and the pixel point of the coordinate (0,0) is a diagonal line of the second area; the third area is a coordinate (x)3,y3-1) pixel points and a square area defined by pixel points of coordinates (0,0), coordinate (x)3,y3-the line between the pixel point of 1) and the pixel point of coordinate (0,0) is a diagonal of the third region; the fourth area is a coordinate (x)4,y4) And the pixel point of (2) and the pixel point of the coordinate (0,0) to determine the square area, coordinate (x)4,y4) The connecting line between the pixel point of (1) and the pixel point of the coordinate (0,0) is a diagonal line of the fourth area; wherein, the coordinate (0,0) is the coordinate of the pixel point at the upper left corner of the image to be processed;
similarly, the coordinate ranges of the first region, the second region, the third region and the fourth region corresponding to the second preset range, the third preset range and the fourth preset range of the pixel point are similar to the coordinate ranges of the first region, the second region, the third region and the fourth region corresponding to the first preset range, and are not repeated here.
As an example, a row-column accumulation method may be adopted to obtain the gray-level mean values of all the pixels in the first preset range, the second preset range, the third preset range and the fourth preset range, so as to further reduce the amount of calculation and increase the calculation speed.
The row and column accumulation method is specifically as follows: according to the formula
sum_l_{m,n} = Σ_{i=1}^{n} G_{m,i}
the row accumulated value sum_l_{m,n} of the pixel point with coordinate (m, n) in the image to be processed is calculated, where G_{m,i} is the gray level of the pixel point with coordinate (m, i) in the image to be processed; then, according to the formula
sum_all_{m,n} = Σ_{j=1}^{m} sum_l_{j,n}
the accumulated gray level sum sum_all_{m,n} of all pixel points in the region corresponding to the pixel point with coordinate (m, n) is calculated. The pixel point with coordinate (m, n) may be the pixel point at the lower right corner of the first region, the second region, the third region or the fourth region, respectively.
In the example shown in fig. 4, the coordinates of the pixel points at the lower right corners of the first region, the second region, the third region and the fourth region are (x1−1, y1−1), (x2−1, y2), (x3, y3−1) and (x4, y4), respectively.
Similarly, the column accumulated value sum_r_{m,n} of the pixel point with coordinate (m, n) in the image to be processed may first be calculated according to the formula
sum_r_{m,n} = Σ_{i=1}^{m} G_{i,n}
where G_{i,n} is the gray level of the pixel point with coordinate (i, n) in the image to be processed; the accumulated gray level sum sum_all_{m,n} of all pixel points in the region corresponding to the pixel point with coordinate (m, n) is then calculated according to the formula
sum_all_{m,n} = Σ_{j=1}^{n} sum_r_{m,j}
where the pixel point with coordinate (m, n) may again be the pixel point at the lower right corner of the first region, the second region, the third region or the fourth region, respectively.
It can be understood that the row-column accumulation step is executed first, so that for each pixel point of the image to be processed the accumulated gray level sum of all pixel points in the region determined by that pixel point and the origin of the image is obtained in advance; then, when the accumulated gray level sum within a certain preset range is needed, the accumulated sums of the first region, the second region, the third region and the fourth region corresponding to that range are read directly according to the coordinates of the pixel points at their lower right corners, which reduces the amount of calculation.
Secondly, the first value is added to the fourth value, and the second value and the third value are subtracted from the result, which gives the accumulated gray level sum of all pixel points in the corresponding preset range; in this way the accumulated gray level sums of all pixel points in the first preset range, the second preset range, the third preset range and the fourth preset range are obtained;
and finally, the gray level mean values of all pixel points in the first preset range, the second preset range, the third preset range and the fourth preset range are obtained from the corresponding accumulated gray level sums, by dividing each accumulated sum by the number of pixel points in the corresponding range.
For convenience of understanding, the method for obtaining the mean gray level of all the pixels in a certain preset range, which is described in the above example, is described below with reference to fig. 6a to 6 c.
Fig. 6a shows, by way of example, a partial matrix of the image to be processed; the numbers are the gray levels of the corresponding pixel points. When the gray level mean of all the pixels inside the square frame in fig. 6a needs to be calculated, the gray levels of the pixels in the same row of the matrix are first accumulated to obtain fig. 6b. Specifically, the value at each coordinate point in fig. 6b is equal to the sum of the gray levels of the pixels located in the same row to the left of that pixel point in fig. 6a, plus the gray level of the pixel point itself. Then, the values in the same column of the matrix shown in fig. 6b are accumulated in the same manner to obtain fig. 6c. At this point, the value at each coordinate point in fig. 6c is equal to the accumulated gray level sum of all pixel points in the rectangular area of fig. 6a whose diagonal is the line between the pixel point corresponding to that coordinate point and the origin of the image to be processed (the point whose value is 1 in the upper left corner). The accumulated gray level sum of all the pixels in the frame shown in fig. 6a is therefore equal to K1 + K4 − K2 − K3 = 5 + 161 − 42 − 53 = 71, and the gray level mean is equal to 71/25.
A diagonal line of the first region is a connection line between a point a in fig. 6a and an origin (i.e., a point e) of the image to be processed, a diagonal line of the second region is a connection line between a point b in fig. 6a and the origin of the image to be processed, a diagonal line of the third region is a connection line between a point c in fig. 6a and the origin of the image to be processed, a diagonal line of the fourth region is a connection line between a point d in fig. 6a and the origin of the image to be processed, K1 is the first value, K2 is the second value, K3 is the third value, and K4 is the fourth value.
Similarly, the gray level mean values of all the pixel points in the first preset range, the second preset range, the third preset range and the fourth preset range corresponding to each pixel point in the image to be processed, that is, the first gray level mean value, the second gray level mean value, the third gray level mean value and the fourth gray level mean value, can be calculated.
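The row and column accumulation described above is the classic integral-image (summed-area table) technique; the NumPy sketch below reproduces the K1 + K4 − K2 − K3 combination on an example region (the coordinates and test image are illustrative only, and row/column indexing is used in place of the (x, y) notation of the text):

```python
import numpy as np

def integral_image(gray):
    """Row accumulation followed by column accumulation, padded with one zero
    row and column so that region sums need no special cases at the border."""
    s = gray.astype(np.float64).cumsum(axis=1).cumsum(axis=0)
    return np.pad(s, ((1, 0), (1, 0)))

def region_mean(iimg, top, left, bottom, right):
    """Mean gray level of the inclusive region [top..bottom, left..right],
    obtained from four look-ups as K1 + K4 - K2 - K3."""
    K1 = iimg[top, left]              # block above and to the left of the region
    K2 = iimg[bottom + 1, left]       # block to the left, spanning the region's rows
    K3 = iimg[top, right + 1]         # block above, spanning the region's columns
    K4 = iimg[bottom + 1, right + 1]  # full block down to the region's lower-right corner
    total = K1 + K4 - K2 - K3
    return total / ((bottom - top + 1) * (right - left + 1))

# Example: the mean of a 5x5 region with four look-ups instead of 25 additions.
img = np.arange(100, dtype=np.float64).reshape(10, 10)
ii = integral_image(img)
print(region_mean(ii, 2, 3, 6, 7), img[2:7, 3:8].mean())   # both print 45.0
```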
S2022: when pixel points lack in the first preset range, the second preset range, the third preset range or the fourth preset range, completing the pixel points in the preset ranges according to the gray scale of the edge pixel points of the image to be processed;
it can be understood that, when the gradient operator shown in fig. 4a and 4b is used to calculate the gradient of the pixel points in the image to be processed, since some of the pixel points may be located at the edge of the image to be processed, the pixel points within a certain or several preset ranges are not complete or have no pixel points. Thus, when the gradient of the pixels is calculated by the method described in the step S2021, the calculation result of the gray level mean of all the pixels in the preset range is inaccurate. When the pixel points are lacked in the preset range, the result of the gray average value is smaller than the true value; and when no pixel point exists in the preset range, the gray average value is zero. Obviously, this will cause errors in the gradient calculation results, and affect the acquisition of the beam limiter illumination field boundary in the image to be processed.
In one example, the pixels in the preset range may be completed according to the gray levels of the pixels at the edge of the image to be processed near the preset range. At this time, for convenience of calculation, the edge of the image to be processed may be expanded outward at the end of the image preprocessing process, so that the pixels in the preset range in any direction of each pixel of the image to be processed before expansion are complete. For example, when the vertical gradient operator and the horizontal gradient operator shown in fig. 4a and 4b are used, the edge of the image to be processed may be expanded outward by 7 pixels in advance, and the expansion principle is to repeat the gray scale of the pixel at the edge of the image.
S2023: acquiring the gradient of the preset point in the vertical direction according to the first gray average value and the second gray average value;
s2024: acquiring the gradient of the preset point in the horizontal direction according to the third gray average value and the fourth gray average value;
thus, the gradients of each pixel point in the image to be processed in the vertical and horizontal directions can be obtained one by one according to the method described in the above steps S2021-S2024.
In one example, the gradient Grad_v of each pixel point in the image to be processed in the vertical direction can be calculated according to the following formula:
Grad_v = (mean_down − mean_up) / max(mean_down, mean_up)
where mean_up is the first gray average value of the pixel point, mean_down is the second gray average value of the pixel point, and max(mean_down, mean_up) is the larger of the first and second gray average values of the pixel point.
Correspondingly, the gradient Grad_h of each pixel point in the image to be processed in the horizontal direction can be calculated according to the following formula:
Grad_h = (mean_right − mean_left) / max(mean_right, mean_left)
where mean_right is the fourth gray average value of the pixel point, mean_left is the third gray average value of the pixel point, and max(mean_right, mean_left) is the larger of the third and fourth gray average values of the pixel point.
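Putting the region means and the normalized-difference formulas together gives a per-pixel gradient; the sketch below follows the 5 × 5 regions spaced two pixels from the centre pixel shown in fig. 4a/4b, while the small constant guarding against division by zero and the assumption that (r, c) lies far enough from the border (or that the image has been padded as in step S2022) are additions for the example.

```python
import numpy as np

def mean_gradient(gray, r, c, half=2, gap=3):
    """Vertical and horizontal gradients of the pixel at (r, c), computed from
    the mean gray levels of 5x5 regions above, below, left of and right of it.
    half=2 gives 5x5 regions; gap=3 leaves two pixels between each region and
    the centre pixel, matching the operators of fig. 4a and 4b."""
    g = gray.astype(np.float64)

    def region(top, left):
        return g[top:top + 2 * half + 1, left:left + 2 * half + 1].mean()

    mean_up    = region(r - gap - 2 * half, c - half)            # first preset range
    mean_down  = region(r + gap,            c - half)            # second preset range
    mean_left  = region(r - half,           c - gap - 2 * half)  # third preset range
    mean_right = region(r - half,           c + gap)             # fourth preset range

    # Normalized differences, as in the Grad_v / Grad_h formulas above.
    grad_v = (mean_down - mean_up) / max(mean_down, mean_up, 1e-9)
    grad_h = (mean_right - mean_left) / max(mean_right, mean_left, 1e-9)
    return grad_v, grad_h

# Example: gradient of the pixel at row 50, column 80 of a padded image `img`.
# grad_v, grad_h = mean_gradient(img, 50, 80)
```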
In addition, due to the influence of noise or of the human tissue in the image to be processed, pixel points inside the irradiation field area of the beam limiter may also have a non-zero gradient, which is generally smaller than the gradient of the pixel points on the boundary of the irradiation field. If the pixel points with a non-zero gradient inside the irradiation field area are also used as candidate positions of the irradiation field boundary in the image to be processed, the amount of calculation required to obtain the irradiation field boundary of the beam limiter in the medical diagnosis image increases, and the boundary detection result may be affected. As a preferred implementation of the present application, a person skilled in the art can set a gradient threshold according to the actual situation: when the gradient of a certain pixel point is smaller than the gradient threshold, its gradient is set to zero so that it does not participate in subsequent calculations. This reduces the amount of calculation in the subsequent steps and, to a certain degree, reduces the influence of interference factors on the detection result of the irradiation field boundary of the beam limiter.
Specifically, after the obtaining the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction respectively, the method further includes: judging whether the gradient of each pixel point in the image to be processed in the vertical direction is larger than a first preset gradient one by one, and if not, setting the gradient of the pixel point in the vertical direction to be zero; and judging whether the gradient of each pixel point in the image to be processed in the horizontal direction is larger than a second preset gradient one by one, and if not, setting the gradient of the pixel point in the horizontal direction to be zero.
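Continuing the sketch above, the thresholding of weak gradients could look like the following. Whether the comparison uses the signed gradient or its absolute value is not spelled out here, so the absolute value is assumed, and the threshold values are placeholders:

```python
import numpy as np

def suppress_weak_gradients(grad, threshold):
    """Set gradients whose magnitude does not exceed `threshold` to zero,
    so they do not contribute to the later line integrals."""
    grad = np.asarray(grad, dtype=np.float64)
    return np.where(np.abs(grad) > threshold, grad, 0.0)

# grad_v = suppress_weak_gradients(grad_v, 0.05)   # first preset gradient (placeholder)
# grad_h = suppress_weak_gradients(grad_h, 0.05)   # second preset gradient (placeholder)
```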
It should be noted that, as can be seen from fig. 3a and 3b, the irradiation field boundary in the image to be processed is more prominent in the gradient images than the other straight lines that can be detected there.
Therefore, the gradients of all pixel points in each straight line in the image to be processed in the vertical direction and the horizontal direction can be accumulated to obtain a vertical gradient accumulated sum (namely, a first set) and a horizontal gradient accumulated sum (namely, a second set). And respectively finding out four straight lines corresponding to the maximum vertical gradient accumulation sum, the maximum horizontal gradient accumulation sum, the minimum vertical gradient accumulation sum and the minimum horizontal gradient accumulation sum. These straight lines are the boundaries of the radiation field of the beam limiter in the image to be processed. The method comprises the following specific steps:
S203: acquiring the gradient accumulated sum of all pixel points on each straight line in the image to be processed in the vertical direction to obtain a first set;

S204: acquiring the gradient accumulated sum of all pixel points on each straight line in the image to be processed in the horizontal direction to obtain a second set;
In some possible implementations of this embodiment, the first set and the second set may be obtained using a Radon transform, specifically as follows: a vertical gradient image is generated according to the gradient of each pixel point in the image to be processed in the vertical direction, and Radon transformation is performed on the vertical gradient image to obtain the first set; a horizontal gradient image is generated according to the gradient of each pixel point in the image to be processed in the horizontal direction, and Radon transformation is performed on the horizontal gradient image to obtain the second set.
For two-dimensional images, the Radon transform can be roughly understood as follows: the image F(x, y) is line-integrated along different straight lines in F(x, y), where each straight line is characterized by its distance d to the image origin (namely the pixel point with coordinates (0, 0)) and its direction angle α; the resulting image R(d, α) is the Radon transform of F(x, y). That is, the value of each point in R(d, α) is the line integral of F(x, y) along one particular straight line.
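As a sketch (not the patent's own code), the two sets can be obtained with the radon function of scikit-image, assuming the vertical and horizontal gradient images are available as NumPy arrays; the angle sampling step is a placeholder:

```python
import numpy as np
from skimage.transform import radon

def line_integral_sets(grad_v_img: np.ndarray, grad_h_img: np.ndarray,
                       angle_step: float = 0.5):
    """Radon-transform both gradient images.

    Entry [d, t] of each result is the line integral (gradient accumulated
    sum) along the straight line with signed distance index d from the image
    centre and direction angle theta[t]."""
    theta = np.arange(0.0, 180.0, angle_step)
    first_set = radon(grad_v_img, theta=theta, circle=False)    # vertical gradients
    second_set = radon(grad_h_img, theta=theta, circle=False)   # horizontal gradients
    return first_set, second_set, theta
```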
It should be noted that a straight line in the image to be processed could in principle be a line segment of any length and at any angle to the horizontal direction. However, in order to facilitate the subsequent calculation and judgment, improve the accuracy of the beam limiter boundary acquisition, and reduce the interference of internal information in the image to be processed, each straight line considered here has its two end points on the edges of the image to be processed, i.e., it runs across the whole image.
In addition, in a preferred implementation of this embodiment, the accumulated gradients of all pixel points on each straight line in the vertical and horizontal directions may be plotted as a vertical line integral image and a horizontal line integral image, which facilitates the subsequent judgment and verification and makes the result easy for a user to observe. Fig. 7a is the vertical line integral image of the hand bone orthopaedics X-ray original image shown in fig. 1, and fig. 7b is the horizontal line integral image of the same image.
Thus, for any straight line in the image to be processed, let d be its distance from the central point of the image to be processed and θ be its angle with the horizontal direction. The gradient accumulated sum (i.e., the line integral result) of all pixel points on each straight line in the vertical and horizontal directions is calculated and recorded at the corresponding position in the corresponding line integral image. For example, the abscissa of the line integral image represents the angle θ between the straight line and the horizontal direction, and the ordinate represents the distance d from the straight line to the center point of the image to be processed. In this case, the gray level of the pixel point with coordinates (d1, θ1) in the vertical line integral image is set to the accumulated sum of the vertical gradients of all pixel points on the straight line whose distance from the center of the image to be processed is d1 and whose angle with the horizontal direction is θ1; the gray level of the pixel point with coordinates (d2, θ2) in the horizontal line integral image is set to the accumulated sum of the horizontal gradients of all pixel points on the straight line whose distance from the center of the image to be processed is d2 and whose angle with the horizontal direction is θ2. It is understood that those skilled in the art can also draw the line integral images in other ways according to the actual situation, which is not described in detail here.
S205: searching the maximum value and the minimum value in the first set, determining that a straight line corresponding to the maximum value in the first set is a first edge, and determining that a straight line corresponding to the minimum value in the first set is a second edge;
S206: Searching the maximum value and the minimum value in the second set, determining that the straight line corresponding to the maximum value in the second set is a third edge, and determining that the straight line corresponding to the minimum value in the second set is a fourth edge;
According to the definition of the gradient, if a straight line exists at a certain position in the image to be processed, there must be a relatively large difference between the gray levels on that line and its surroundings, so the sum of the gradients of all pixel points on the line in the vertical or horizontal direction must be particularly large or particularly small; this appears as an extremum at the corresponding coordinate in the line integral image. Taking the vertical line integral image as an example, the maximum and minimum gray values in it are found, and their coordinates give, respectively, the distance of each irradiation field boundary of the beam limiter from the central point of the image to be processed and its angle with the horizontal direction.
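A sketch of locating the extrema in the line integral (Radon) images and reading off the line parameters; the distance returned is an index into the Radon result rather than a distance in pixels, an implementation detail left open here:

```python
import numpy as np

def extremal_lines(sinogram: np.ndarray, theta: np.ndarray):
    """Return ((d_index, angle) of the maximum, (d_index, angle) of the minimum)
    of a line integral image whose rows index distances and whose columns
    index the angles listed in `theta`."""
    d_max, t_max = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    d_min, t_min = np.unravel_index(np.argmin(sinogram), sinogram.shape)
    return (d_max, float(theta[t_max])), (d_min, float(theta[t_min]))

# first_edge, second_edge = extremal_lines(first_set, theta)    # S205
# third_edge, fourth_edge = extremal_lines(second_set, theta)   # S206
```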
In summary, as can be seen from the original hand bone orthopaedics X-ray image shown in fig. 1, even when the contrast of the irradiation field boundary of the beam limiter at the bottom of the image is weak, the contrast between the pixel points on the boundary line and the surrounding pixel points is still stronger than the contrast between the pixel points on any straight line inside the irradiation field and their surroundings. The boundary lines of the irradiation field of the beam limiter in the image to be processed can therefore be distinguished by calculating the accumulated sum of the gradients of all pixel points on each straight line in the vertical and horizontal directions and using these accumulated sums as the basis for the decision. The boundary lines obtained through the above steps are four continuous straight lines.
S207: and determining the first edge, the second edge, the third edge and the fourth edge as boundaries of an irradiation field of the beam limiter in the image to be processed.
Obviously, the irradiation field of the beam limiter in the image to be processed is an area enclosed by the first edge, the second edge, the third edge and the fourth edge.
In order to further improve the accuracy of obtaining the irradiation field of the beam limiter, whether the first edge, the second edge, the third edge and the fourth edge are boundaries of the irradiation field of the beam limiter in the image to be processed can be verified according to a preset rule, so that the first edge, the second edge, the third edge and the fourth edge are determined to be boundaries of the irradiation field of the beam limiter in the image to be processed. In a specific implementation, the first edge, the second edge, the third edge, and the fourth edge are determined as the boundaries of the radiation field of the beam limiter in the image to be processed, and at least three possible implementation manners are described in detail below.
In a first possible implementation manner, the method for determining the first edge, the second edge, the third edge, and the fourth edge as the boundaries of the irradiation field of the beam limiter in the image to be processed includes:
respectively judging whether the first edge and the second edge are parallel and whether the third edge and the fourth edge are parallel; when the first edge is parallel to the second edge and the third edge is parallel to the fourth edge, continuously judging whether two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are mutually vertical; when two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are perpendicular to each other, the first edge, the second edge, the third edge and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed.
It will be understood that, in practice, the two lead plates of the beam limiter in opposite directions are parallel to each other and the two lead plates in adjacent directions are perpendicular to each other. Therefore, among the four detected edges, the edges in opposite directions must be parallel and the edges in adjacent directions must be perpendicular.
If the judgment result in the above step is negative, whether the calculated values of the first set and the second set are correct or not needs to be considered.
It should be noted that, in practical applications, due to the influence of operation errors or equipment errors, the four boundaries of the irradiation field of the beam limiter in the image to be processed are not necessarily completely parallel or perpendicular, and there is an error of several degrees in general. Therefore, for any two beam limiter irradiation field boundary lines, it is necessary to ensure that the angular deviation is smaller than the set angular threshold, for example, 2 degrees.
At this time, the angle between the first edge and the second edge and the angle between the third edge and the fourth edge need to be greater than 180a-2 degrees and less than 180a +2 degrees, and a is an integer. The included angle between two adjacent edges in the first edge, the second edge, the third edge and the fourth edge needs to be larger than 90b-2 degrees and smaller than 90b +2 degrees, and b is an integer.
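A small sketch of the parallelism/perpendicularity check with the 2-degree tolerance mentioned above (the function and parameter names are illustrative, not from the patent):

```python
def _near_multiple(delta: float, base: float, tol: float) -> bool:
    """True if `delta` lies within `tol` degrees of an integer multiple of `base`."""
    r = delta % base
    return min(r, base - r) <= tol

def edges_consistent(a1: float, a2: float, a3: float, a4: float, tol: float = 2.0) -> bool:
    """a1..a4 are the angles (degrees from the horizontal) of the first to fourth
    edges: opposite edges must be parallel and adjacent edges perpendicular,
    each up to an angular deviation of `tol` degrees."""
    parallel = (_near_multiple(abs(a1 - a2), 180.0, tol)
                and _near_multiple(abs(a3 - a4), 180.0, tol))
    perpendicular = all(_near_multiple(abs(x - y) - 90.0, 180.0, tol)
                        for x in (a1, a2) for y in (a3, a4))
    return parallel and perpendicular
```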
In a second possible implementation manner, determining the first edge, the second edge, the third edge, and the fourth edge as the boundary of the irradiation field of the beam limiter in the image to be processed includes:
when the absolute value of the difference between the maximum value in the first set and the accumulated sum of the gradients of all pixel points on a first straight line in the vertical direction is larger than a first preset difference value, the first edge is a boundary of the irradiation field of the beam limiter in the image to be processed;

when the absolute value of the difference between the minimum value in the first set and the accumulated sum of the gradients of all pixel points on a second straight line in the vertical direction is larger than a second preset difference value, the second edge is a boundary of the irradiation field of the beam limiter in the image to be processed;

when the absolute value of the difference between the maximum value in the second set and the accumulated sum of the gradients of all pixel points on a third straight line in the horizontal direction is larger than a third preset difference value, the third edge is a boundary of the irradiation field of the beam limiter in the image to be processed;

when the absolute value of the difference between the minimum value in the second set and the accumulated sum of the gradients of all pixel points on a fourth straight line in the horizontal direction is larger than a fourth preset difference value, the fourth edge is a boundary of the irradiation field of the beam limiter in the image to be processed;

wherein the absolute value of the difference between the distance from the first straight line to a first preset point in the image to be processed and the distance from the first edge to the first preset point is smaller than a first distance, the absolute value of the difference between the distance from the second straight line to a second preset point and the distance from the second edge to the second preset point is smaller than a second distance, the absolute value of the difference between the distance from the third straight line to a third preset point and the distance from the third edge to the third preset point is smaller than a third distance, and the absolute value of the difference between the distance from the fourth straight line to a fourth preset point and the distance from the fourth edge to the fourth preset point is smaller than a fourth distance.
Those skilled in the art can specifically set the distance differences between the first boundary and the first straight line, the second boundary and the second straight line, the third boundary and the third straight line, and the fourth boundary and the fourth straight line according to actual situations, that is, the first distance, the second distance, the third distance, and the fourth distance, which are not listed herein. In addition, the first preset difference, the second preset difference, the third preset difference and the fourth preset difference can also be specifically set according to actual conditions.
It should be noted that there is only one irradiation field boundary of the beam limiter in each of the four directions (up, down, left and right) of the image to be processed. Therefore, in the image to be processed, the gradient accumulated sum of any straight line lying outside a certain span around the straight lines corresponding to the maximum and minimum of the first set should be far smaller than that maximum, or far larger than that minimum. Similarly, the gradient accumulated sum of any straight line lying outside a certain span around the straight lines corresponding to the maximum and minimum of the second set should be far smaller than that maximum, or far larger than that minimum.

Reflected in the line integral images, the gray level of any point in the area beyond a certain span of the maximum gray point is far smaller than that of the maximum point, and the gray level of any point in the area beyond a certain span of the minimum gray point is far larger than that of the minimum point. When these conditions are met, the distances and angles corresponding to the maximum and minimum values can be read from the vertical and horizontal line integral images, giving the positions and angles of the irradiation field boundaries of the beam limiter in the image to be processed.
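The idea described in the two preceding paragraphs can be sketched as follows for the maximum of one line integral image (the check for the minimum is symmetric); the span and the required gap are placeholders to be tuned in practice:

```python
import numpy as np

def extremum_is_distinct(sinogram: np.ndarray, span: int = 5, min_gap: float = 1.0) -> bool:
    """Verify that outside a band of `span` distance rows around the maximum,
    every line integral stays at least `min_gap` below the maximum."""
    d_max, _ = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    outside = np.delete(sinogram, np.s_[max(d_max - span, 0):d_max + span + 1], axis=0)
    return outside.size == 0 or (sinogram.max() - outside.max()) > min_gap
```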
In a third possible implementation manner, determining the first edge, the second edge, the third edge, and the fourth edge as the boundary of the irradiation field of the beam limiter in the image to be processed includes:
when the absolute value of the maximum value in the first set is larger than a first initial threshold, determining that the first edge is a first boundary of the irradiation field of the beam limiter in the image to be processed; when the absolute value of the minimum value in the first set is larger than a second initial threshold, determining that the second edge is a second boundary of the irradiation field of the beam limiter in the image to be processed; when the absolute value of the maximum value in the second set is larger than a third initial threshold, determining that the third edge is a third boundary of the irradiation field of the beam limiter in the image to be processed; and when the absolute value of the minimum value in the second set is larger than a fourth initial threshold, determining that the fourth edge is a fourth boundary of the irradiation field of the beam limiter in the image to be processed. When a certain edge is not determined to be a boundary of the irradiation field of the beam limiter, it is judged that no irradiation field boundary of the beam limiter exists in the direction corresponding to that edge in the image to be processed.
It can be understood that, in general, the boundary line of the irradiation field of the beam limiter is more obvious than other detectable edges in the image to be processed, and the gradient sum of all pixel points on the boundary line of the irradiation field in the image to be processed in the vertical and horizontal directions is much larger or much smaller than the gradient sum of all pixel points on other detected straight lines in the corresponding gradient map. At this time, in order to improve the accuracy of the determination, a person skilled in the art may set the first initial threshold, the second initial threshold, the third initial threshold, and the fourth initial threshold according to the actual situation. When the absolute value of the maximum value in the first set and the second set is greater than the initial threshold corresponding to the maximum value, and the absolute value of the minimum value in the first set and the second set is greater than the initial threshold corresponding to the minimum value, the edge corresponding to the maximum value or the minimum value can be judged to be the real irradiation field boundary of the beam limiter in the image to be processed.
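A sketch of this third verification; the thresholds t1 to t4 are placeholders, and the patent derives them from the gray means of the moving regions described below:

```python
import numpy as np

def accept_edges(first_set: np.ndarray, second_set: np.ndarray,
                 t1: float, t2: float, t3: float, t4: float) -> dict:
    """Keep an edge only when the absolute value of its extremal line integral
    exceeds the corresponding initial threshold."""
    return {
        "first":  abs(float(first_set.max()))  > t1,
        "second": abs(float(first_set.min()))  > t2,
        "third":  abs(float(second_set.max())) > t3,
        "fourth": abs(float(second_set.min())) > t4,
    }
```

An entry that evaluates to False means no irradiation field boundary was found in that direction, which is what triggers the secondary search described later.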
In a preferred embodiment, the first initial threshold value, the second initial threshold value, the third initial threshold value and the fourth initial threshold value may be set by the following method.
Firstly, acquiring a low-frequency image of the image to be processed;
it can be understood that the detail features of the image to be processed are filtered out from the low-frequency image, and only the gray level change trend of the image to be processed is saved. Therefore, the first initial threshold, the second initial threshold, the third initial threshold and the fourth initial threshold are obtained according to the gray scale features in the low-frequency image, so that the interference of high-frequency information (namely detail features) in the image to be processed can be reduced, and the result is more accurate.
In specific implementation, the image to be processed can be decomposed by adopting a multi-resolution pyramid algorithm to obtain a low-frequency image. In this embodiment, an existing gaussian or laplacian pyramid algorithm may be employed. It should be noted that, in order to ensure the accuracy of the search result, a larger operator is required to calculate the low-frequency image of the image to be processed.
Secondly, determining the region with Y-axis coordinates smaller than (H/2) - H × f in the low-frequency image as the upper moving region of the irradiation field boundary of the beam limiter, and the region with Y-axis coordinates larger than (H/2) + H × f as the lower moving region; determining the region with X-axis coordinates smaller than (W/2) - W × f in the low-frequency image as the left moving region of the irradiation field boundary of the beam limiter, and the region with X-axis coordinates larger than (W/2) + W × f as the right moving region; where H is the maximum value of the Y-axis coordinate of the low-frequency image, W is the maximum value of the X-axis coordinate of the low-frequency image, and f is a preset truncation coefficient.
It can be understood that the point with zero height and width in the image to be processed is the pixel point at the upper left corner of the image.
It should be noted here that, in practical applications, the beam limiter illumination field in the image to be processed is generally located at the center of the image, and the boundary of the beam limiter illumination field is generally located in a region close to the edge of the image to be processed. Moreover, in the practical design of the beam limiter, the lead plate of the beam limiter has a maximum shielding range, so that the beam limiter cannot shield all the X-rays. Therefore, the skilled person sets a preset truncation factor (i.e. f) according to the actual situation, so as to calculate the maximum moving range of the four boundaries of the irradiation field of the beam limiter in each direction of the image to be processed. At this time, in order to improve the accuracy of the determination, the first initial threshold, the second initial threshold, the third initial threshold, and the fourth initial threshold may be specifically set according to the actual situation of the moving region to which the corresponding edge belongs.
Thirdly, determining the correspondence among the first edge, the second edge, the third edge and the fourth edge and the upper moving area, the lower moving area, the left moving area and the right moving area, wherein each moving area corresponds to one edge;
fourthly, respectively obtaining the gray average values of all pixel points in the upper moving area, the lower moving area, the left moving area and the right moving area to obtain an upper gray average value, a lower gray average value, a left gray average value and a right gray average value;
fifthly, when the upper gray average value is larger than or equal to a preset maximum gray average value A, the judgment threshold value of the upper moving area is equal to a preset maximum judgment threshold value; when the upper gray average value is less than or equal to a preset minimum gray average value B, the judgment threshold value of the upper moving area is equal to a preset minimum judgment threshold value; when the upper gray average value is larger than a preset minimum gray average value B and the upper gray average value is smaller than a preset maximum gray average value A, setting a judgment threshold value of the upper moving area according to the upper gray average value;
sixthly, when the lower gray average value is greater than or equal to a preset maximum gray average value A, the judgment threshold value of the lower moving area is equal to the preset maximum judgment threshold value; when the lower gray average value is less than or equal to a preset minimum gray average value B, the judgment threshold value of the lower moving area is equal to the preset minimum judgment threshold value; when the lower gray average value is larger than a preset minimum gray average value B and the lower gray average value is smaller than a preset maximum gray average value A, setting a judgment threshold value of the lower moving area according to the lower gray average value;
seventhly, when the left gray average value is greater than or equal to a preset maximum gray average value A, the judgment threshold value of the left moving area is equal to the preset maximum judgment threshold value; when the left part gray average value is smaller than or equal to a preset minimum gray average value B, the judgment threshold value of the left moving area is equal to the preset minimum judgment threshold value; when the left gray average value is larger than a preset minimum gray average value B and the left gray average value is smaller than a preset maximum gray average value A, setting a judgment threshold value of the left moving area according to the left gray average value;
eighthly, when the right gray average value is larger than or equal to a preset maximum gray average value A, the judgment threshold value of the right moving area is equal to the preset maximum judgment threshold value; when the right gray average value is less than or equal to a preset minimum gray average value B, the judgment threshold value of the right moving area is equal to the preset minimum judgment threshold value; when the right gray average value is larger than a preset minimum gray average value B and the right gray average value is smaller than a preset maximum gray average value A, setting a judgment threshold value of the right moving area according to the right gray average value;
Ninthly, the first initial threshold, the second initial threshold, the third initial threshold and the fourth initial threshold are respectively set equal to the judgment thresholds of the moving areas corresponding to the first edge, the second edge, the third edge and the fourth edge.
Those skilled in the art can set the preset maximum judgment threshold, the preset maximum gray threshold and the preset minimum gray threshold according to actual conditions, which are not listed here.
In short, when the gray average value of any moving area is greater than or equal to the preset maximum gray threshold, the judgment threshold of the moving area is equal to the preset maximum judgment threshold; when the gray average value of the moving area is smaller than or equal to a preset minimum gray threshold, the judgment threshold of the moving area is equal to a preset minimum judgment threshold; and when the gray average value of the moving area is smaller than a preset maximum gray threshold value and larger than a preset minimum gray threshold value, setting a judgment threshold value of the moving area according to the gray average value of the moving area.
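The clamping rule spelled out above can be condensed into a small helper; between the two clamping cases the text only says the threshold is set "according to" the gray average, so a linear interpolation is assumed here purely for illustration:

```python
def judgment_threshold(region_mean: float,
                       mean_max: float, mean_min: float,
                       thr_max: float, thr_min: float) -> float:
    """Per-region judgment threshold: clamp to thr_max / thr_min when the
    region's gray mean reaches the preset maximum / minimum gray value,
    otherwise interpolate between the two (assumed behaviour)."""
    if region_mean >= mean_max:
        return thr_max
    if region_mean <= mean_min:
        return thr_min
    frac = (region_mean - mean_min) / (mean_max - mean_min)
    return thr_min + frac * (thr_max - thr_min)
```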
When the X-ray camera system is used, the boundary contrast of the irradiation field of the beam limiter in the X-ray original images generated by different body parts is different. Therefore, when the gray average value of a certain moving region is smaller than the preset maximum gray threshold value and larger than the preset minimum gray threshold value, a person skilled in the art can set the judgment threshold value of the moving region according to different body part types shown in the image to be processed and the gray average value of the moving region, so as to improve the accuracy of the judgment of the irradiation field boundary.
It should be noted that, in a specific implementation, one of three possible implementation manners of the above method for determining the first edge, the second edge, the third edge, and the fourth edge as the irradiation field boundary of the beam limiter in the image to be processed may be adopted, or the three possible implementation manners may also be used in combination, for example, the verification method described in the second possible implementation manner is adopted first, and then the verification method described in the third possible implementation manner is adopted.
Furthermore, in some usage scenarios the beam limiter does not block the rays in all four directions, i.e., there is not necessarily a boundary line of the irradiation field of the beam limiter in every direction of the image to be processed; in that case one or more boundaries of the irradiation field coincide with the edges of the image to be processed. Moreover, due to the complexity of the information in the image to be processed and the interference of factors such as external noise, when verification is performed by the methods described in the second and third possible implementations, it may also happen that one or more edges are not determined to be irradiation field boundaries of the beam limiter. It is therefore necessary to further judge whether the determination result obtained by the above implementations is correct.
In a preferred implementation of this embodiment, in order to further improve the accuracy of acquiring the irradiation field of the beam limiter, when one or more edges are not determined to be boundaries of the irradiation field in the image to be processed, a secondary search is performed on the basis of the detection result, so as to confirm whether an irradiation field boundary of the beam limiter really does not exist in the corresponding direction of the image to be processed.
In one example, the region of the image to be processed with Y-axis coordinates smaller than (H/2) - H × f may be determined as the upper region of the irradiation field boundary of the beam limiter, and the region with Y-axis coordinates larger than (H/2) + H × f as the lower region; the region with X-axis coordinates smaller than (W/2) - W × f may be determined as the left region, and the region with X-axis coordinates larger than (W/2) + W × f as the right region. Here H is the maximum value of the Y-axis coordinate of the low-frequency image, W is the maximum value of the X-axis coordinate of the low-frequency image, and f is a preset truncation coefficient. The correspondence between the first edge, the second edge, the third edge and the fourth edge and the upper region, the lower region, the left region and the right region is then determined, each region corresponding to one edge.
At this time, when the first edge and the second edge are not both determined to be boundaries of the irradiation field of the beam limiter in the image to be processed, and/or the third edge and the fourth edge are not both determined to be boundaries of the irradiation field of the beam limiter in the image to be processed, the irradiation field boundary of the beam limiter is searched for again in each region with an undetermined boundary; a region with an undetermined boundary is the region corresponding to an edge that has not been determined to be an irradiation field boundary of the beam limiter in the image to be processed.
As an example, searching again for the irradiation field boundary of the beam limiter in a region with an undetermined boundary includes:
firstly, setting a secondary judgment threshold according to an initial threshold corresponding to the undetermined edge;
it can be understood that the secondary judgment threshold needs to be smaller than the judgment threshold of the first search area corresponding to the undetermined boundary area, and the secondary judgment threshold should be a number greater than or equal to zero. The skilled person can specifically set the secondary judgment threshold according to the actual situation, and the two thresholds are not listed here.
Secondly, judging whether the absolute value of the gradient accumulation sum of all the pixel points on the undetermined edge in the vertical direction is larger than the secondary judgment threshold value or not, or judging whether the absolute value of the gradient accumulation sum of all the pixel points on the undetermined edge in the horizontal direction is larger than the secondary judgment threshold value or not; if not, the edge of the image to be processed in the area of the undetermined boundary is the boundary of the irradiation field of the beam limiter; if so, acquiring the boundary of the irradiation field of the beam limiter in the region of the undetermined boundary.
It is understood that when the above-described condition of the secondary search is not satisfied, it indicates that the boundary of the beam limiter irradiation field does not exist in the region where the boundary is not determined. At this time, the edge of the image to be processed is the boundary of the beam limiter illumination field in this direction. When the condition of the secondary search is satisfied, it is indicated that there may be a boundary of the irradiation field of the beam limiter in the region of the undetermined boundary, and the boundary of the irradiation field of the beam limiter needs to be searched again in the region of the undetermined boundary.
In some possible implementations, acquiring the irradiation field boundary of the beam limiter in the region with an undetermined boundary includes the following steps (see also the sketch after this list):

Step 1: acquiring a low-frequency image of the image to be processed;

In a specific implementation, the image to be processed can be decomposed by a multi-resolution pyramid algorithm to obtain the low-frequency image; an existing Gaussian or Laplacian pyramid algorithm may be employed. It should be noted that, in order to ensure the accuracy of the search result, a relatively large operator is required to calculate the low-frequency image of the image to be processed.

Step 2: determining the region corresponding to the region with the undetermined boundary in the low-frequency image to obtain a first search region;

Step 3: acquiring the gray mean value of all pixel points in the first search region, and setting a first threshold according to this gray mean value;

It will be appreciated that the gray mean values of the regions to which the irradiation field boundaries of the beam limiter belong are not necessarily the same in different images. Therefore, the first gray judgment threshold needs to be set according to the gray mean value of all pixel points in the first search region of the low-frequency image and a preset coefficient chosen for the body part displayed in the image to be processed.

Step 4: acquiring the gray mean value of all pixel points on each straight line in the first search region to obtain a third set;

Step 5: judging whether the minimum value in the third set is larger than the first threshold; if not, the edge of the image to be processed in the region with the undetermined boundary is the irradiation field boundary of the beam limiter; if so, determining that an undetermined straight line is the irradiation field boundary of the beam limiter in that region, the position of this undetermined straight line in the image to be processed being the same as the position, in the low-frequency image, of the straight line corresponding to the minimum value in the third set.
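The sketch referred to above: a simplified version of steps 1-5 for a horizontal candidate boundary. A heavy Gaussian blur stands in for the pyramid decomposition and only image rows are scanned inside the region, both of which are simplifying assumptions rather than the patent's exact procedure:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def secondary_search_low_freq(image: np.ndarray,
                              region_rows: slice, region_cols: slice,
                              first_threshold: float):
    """Return the row index of a candidate boundary inside the search region,
    or None when the image edge should be taken as the boundary."""
    low = gaussian_filter(image.astype(np.float64), sigma=8)   # low-frequency image
    search = low[region_rows, region_cols]                     # first search region
    row_means = search.mean(axis=1)                            # gray mean per line (third set)
    if row_means.min() <= first_threshold:
        return None                                            # image edge is the boundary
    return region_rows.start + int(np.argmin(row_means))       # candidate boundary row
```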
In one example, to further verify whether the straight line corresponding to the minimum value in the third set is really an irradiation field boundary of the beam limiter and thereby improve accuracy, determining that the undetermined straight line is the irradiation field boundary of the beam limiter in the region with the undetermined boundary includes: obtaining a high-frequency image of the image to be processed from the image to be processed and the low-frequency image; determining the region corresponding to the region with the undetermined boundary in the high-frequency image to obtain a second search region; acquiring the gray mean value of all pixel points in the second search region and judging whether it is greater than a second threshold; if not, the undetermined straight line is the irradiation field boundary of the beam limiter in the image to be processed; if so, the edge of the image to be processed in the region with the undetermined boundary is the irradiation field boundary of the beam limiter.

It should be noted here that the high-frequency image only contains the main high-frequency information of the image to be processed, i.e., its detail features. A person skilled in the art may also sum several high-frequency images obtained from the multi-resolution pyramid decomposition to obtain the high-frequency image.

When no irradiation field boundary line of the beam limiter exists in the second search region, the image in that region contains more detail information, so the gray mean value of the corresponding region in the high-frequency image is larger; when an irradiation field boundary line of the beam limiter does exist in the second search region, the image in that region contains less detail information, so the gray mean value of the corresponding region in the high-frequency image is smaller. Therefore, when the gray mean value of all pixel points in the second search region is not larger than the second threshold, it can be determined that the undetermined straight line is the irradiation field boundary of the beam limiter in that region.
It is understood that the second threshold value can be specifically set by those skilled in the art according to the actual situation. For example, according to different body parts displayed by the image to be processed, and the like.
In some possible implementations of this embodiment, to further ensure the accuracy of the result obtained by the boundary of the irradiation field of the beam limiter, the result (i.e., the determined first boundary, second boundary, third boundary, and fourth boundary) may be verified again before the final result. In one example, the result may be verified again using a method similar to the first possible implementation, the verification method including: respectively judging whether the first boundary and the second boundary are parallel and whether the third boundary and the fourth boundary are parallel; when the first boundary and the second boundary are parallel and the third boundary and the fourth boundary are parallel, continuously judging whether two adjacent boundaries in the first boundary, the second boundary, the third boundary and the fourth boundary are mutually vertical or not; when two adjacent boundaries of the first boundary, the second boundary, the third boundary and the fourth boundary are all perpendicular to each other, the first boundary, the second boundary, the third boundary and the fourth boundary are boundaries of an irradiation field of a beam limiter in the image to be processed.
It should be noted here that, in practical applications, due to the influence of operation errors or equipment errors, the four boundaries of the irradiation field of the beam limiter in the image to be processed are not necessarily completely parallel or perpendicular, and there is typically an error of several degrees. Therefore, for any two beam limiter irradiation field boundary lines, it is necessary to ensure that the angular deviation is smaller than the set angular threshold, for example, 2 degrees.
At this time, the angle between the first boundary and the second boundary and the angle between the third boundary and the fourth boundary need to be greater than 180a-2 degrees and less than 180a +2 degrees, and a is an integer. The included angle between two adjacent boundaries in the first boundary, the second boundary, the third boundary and the fourth boundary needs to be larger than 90b-2 degrees and smaller than 90b +2 degrees, and b is an integer.
It should be noted here that, when one or more boundaries of the irradiation field of the beam limiter in the image to be processed are image edges, it is also necessary to verify whether the image edges and other detected boundaries satisfy the above-mentioned condition.
In order to facilitate understanding of the above-described method for acquiring the boundary of the irradiation field of the beam limiter in the medical diagnostic image, a preferred embodiment of the present embodiment will be described in detail.
After an original image of an irradiated part is obtained by using an X-ray machine, in order to reduce noise interference and reduce calculation amount, the original image is preprocessed to obtain an image to be processed.
The preprocessing comprises the following steps: removing some pixels at the edge of the image; performing convolution processing on the image to suppress noise; removing the pixels affected by the edge response caused by the convolution; and then reducing the dimensions of the image by interlaced sampling (keeping every other row and column).
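A compact sketch of this preprocessing chain, with placeholder border width and kernel size (the patent does not fix these values):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def preprocess(raw: np.ndarray, crop: int = 4, kernel: int = 5) -> np.ndarray:
    """Illustrative preprocessing: crop the outer border, smooth, crop the
    pixels affected by the convolution's edge response, then down-sample
    by taking every other row and column."""
    img = raw[crop:-crop, crop:-crop].astype(np.float64)   # remove border pixels
    img = uniform_filter(img, size=kernel)                 # denoising convolution
    edge = kernel // 2
    img = img[edge:-edge, edge:-edge]                      # drop edge-response pixels
    return img[::2, ::2]                                   # interlaced down-sampling
```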
And then, calculating the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction respectively to obtain a vertical gradient image and a horizontal gradient image. The specific method of gradient calculation is as described in the above embodiments, and is not described here.
And secondly, calculating the gradient accumulation sum of all pixel points in each straight line in the image to be processed in the vertical direction and the horizontal direction respectively. Generating a vertical line integral image according to the position relationship between each straight line in the image to be processed and the central point and the horizontal line of the image to be processed and the accumulated sum (line integral result) of the gradients of all pixel points on the straight line in the vertical direction; and generating a horizontal line integral image according to the position relation between each straight line in the image to be processed and the central point and the horizontal line of the image to be processed and the accumulated sum (line integral result) of the gradients of all pixel points on the straight line in the horizontal direction.
And finally, searching four maximum value points and minimum value points on the vertical line integral image and the horizontal line integral image. The four straight lines corresponding to the four points are the boundaries of the irradiation fields of the beam limiter in the image to be processed.
On the basis, the result needs to be judged and verified so as to improve the accuracy of the result. The judging and verifying steps are as follows:
the method comprises the steps of firstly, obtaining four moving areas of the boundary of an irradiation field of a beam limiter, and determining which moving area the four straight lines respectively belong to. And respectively acquiring judgment threshold values of the four moving areas, and judging whether the line integral results of the four straight lines are greater than the judgment threshold values of the corresponding moving areas. If so, determining the straight line as the boundary of the irradiation field of the beam limiter; if not, the boundary of the irradiation field of the beam limiter does not exist in the moving area corresponding to the straight line. The method for obtaining the judgment threshold of each moving region is the same as the method described in the above embodiment, and is not described herein again.
And secondly, on the vertical line integral image and the horizontal line integral image respectively, verifying for each straight line determined in the first step as an irradiation field boundary of the beam limiter that, in the area outside a certain span around the corresponding extremum point, the line integral results are all far smaller than the maximum value (or, for a minimum, far larger than the minimum value). If this is not the case, no irradiation field boundary of the beam limiter exists in the moving area corresponding to that straight line.
And thirdly, taking the boundaries of the irradiation fields of the beam limiters in the upper and lower directions as an example, if only the boundary line of the irradiation field of the beam limiter in one direction is found through the steps and the boundary of the irradiation field of the beam limiter in the other direction does not exist, secondary judgment is needed to determine whether the boundary line of the irradiation field of the beam limiter does not exist in the other direction. Specifically, by setting a secondary judgment threshold, if there is a point in the direction in which the boundary line of the irradiation field of the beam limiter does not exist, and the absolute value of the line integration result is greater than the secondary judgment threshold, it indicates that there may be a boundary of the irradiation field of the beam limiter in the other direction. At this time, the boundary of the irradiation field of the beam limiter is searched in the direction according to the method A and the method B:
Method A: low-frequency image verification. A first gray judgment threshold is acquired according to the gray average value of the corresponding moving area in the low-frequency image of the image to be processed. Then, for the direction in which the initial judgment found no beam limiter boundary, the minimum of the gray average values of all pixel points on each straight line in the corresponding area of the low-frequency image is found. If this minimum value is larger than the gray judgment threshold, it is judged that an irradiation field boundary of the beam limiter exists in that direction, the position of the straight line corresponding to the minimum gray average value being the position of the irradiation field boundary in the image to be processed; otherwise, there is no irradiation field boundary of the beam limiter in that direction.
the method B comprises the following steps: and (5) high-frequency image verification. And defining a second gray level judgment threshold, and calculating the gray level mean value of all pixel points in the corresponding area in the high-frequency image of the image to be processed in the direction in which the beam limiter is not present in the initial judgment. And if the pixel mean value is larger than the second gray level judgment threshold value, the boundary of the irradiation field of the beam limiter does not exist in the direction.
The high frequency image verification is performed on the basis of low frequency image verification, that is, if the low frequency image verification determines that there is a boundary of the beam limiter irradiation field in a certain direction, but the boundary does not satisfy the condition of the high frequency image verification, it is also determined that there is no boundary of the beam limiter irradiation field in the direction.
The secondary searching method for the irradiation field boundary of the beam limiter in the left and right directions is similar, and is not described herein again. And defining the edge of the image to be processed as the boundary of the irradiation field of the beam limiter in the direction of judging that the boundary of the irradiation field of the beam limiter does not exist in the image to be processed.
And fourthly, for the four irradiation field boundaries of the beam limiter (some of which may be edges of the image to be processed), ensuring that the angular deviation of the included angle between every two boundaries is smaller than the set angle threshold. That is, the detection result must have the irradiation field boundaries in opposite directions (up-down or left-right) parallel and the irradiation field boundaries in adjacent directions perpendicular. When this condition is not satisfied, it is necessary to check whether the calculation of the gradient accumulated values is correct.
By judging and verifying the initial detection result through the steps, the real position of the boundary of the irradiation field of the beam limiter in the image to be processed can be obtained, and the range of the irradiation field of the beam limiter in the image to be processed is further obtained.
FIG. 8a is a diagram illustrating a boundary line of the hand bone orthopaedics original image shown in FIG. 1 detected by the method for obtaining the boundary of the irradiation field of the beam limiter in the medical diagnosis image according to the embodiment of the present invention; FIG. 8b shows the boundary lines of the patellar axial X-ray raw image detected by the method for acquiring the boundary of the irradiation field of the beam limiter in the medical diagnostic image according to the embodiment of the invention; FIG. 8c is a diagram illustrating the boundary lines of the lumbar vertebra normal position X-ray original image detected by the method for acquiring the boundary of the irradiation field of the beam limiter in the medical diagnosis image according to the embodiment of the present invention; fig. 8d shows the boundary line of the sacral orthopaedics original image detected by the method for acquiring the boundary of the irradiation field of the beam limiter in the medical diagnosis image provided by the embodiment of the invention. As can be seen from the detection result graph, the method for acquiring the boundary of the radiation field of the beam limiter in the medical diagnosis image provided by the embodiment of the invention can accurately detect the radiation field of the beam limiter with weak boundary contrast and discontinuous boundary influenced by noise, and obtain a continuous boundary line of the radiation field of the beam limiter. In the subsequent image processing process, a person skilled in the art can directly and accurately remove the area outside the irradiation field of the beam limiter in the image according to the acquired boundary line, so that the precision of the subsequent image processing result is improved.
In the method for obtaining the irradiation field boundary of the beam limiter in the medical diagnosis image according to this embodiment, the gradients of the image to be processed in the vertical direction and the horizontal direction are respectively calculated, and then the gradient cumulative sum of all the pixel points in each straight line in the image to be processed in the vertical direction and the horizontal direction is calculated, so as to obtain the vertical gradient cumulative sum and the horizontal gradient cumulative sum of each straight line in the image to be processed. And then, respectively searching straight lines corresponding to the maximum vertical gradient accumulation, the maximum horizontal gradient accumulation sum, the minimum vertical gradient accumulation sum and the minimum horizontal gradient accumulation sum to obtain four edges. The four edges are the boundary lines of the irradiation field of the beam limiter in the image to be processed. The irradiation field of the beam limiter in the image to be processed is an internal area enclosed by the four edges. The method for obtaining the irradiation field boundary of the beam limiter in the medical diagnosis image provided by the embodiment can accurately detect the irradiation field of the beam limiter with weak boundary contrast and discontinuous boundary due to the influence of noise, so as to obtain the continuous irradiation field boundary of the beam limiter, so that the region except the irradiation field of the beam limiter in the image can be removed in the subsequent processing, and the precision of the post-image processing result is improved.
Based on the method for acquiring the irradiation field boundary of the beam limiter in the medical diagnosis image provided by the embodiment, the embodiment of the invention also provides a device for acquiring the irradiation field of the beam limiter in the medical diagnosis image.
The embodiment of the device is as follows:
referring to fig. 9, it is a schematic structural diagram of an embodiment of the apparatus for acquiring a boundary of an irradiation field of a beam limiter in a medical diagnostic image according to the present invention.
The apparatus for acquiring the irradiation field boundary of the beam limiter in the medical diagnosis image provided by the embodiment comprises: a first obtaining module 100, a second obtaining module 200, a third obtaining module 300, a fourth obtaining module 400, a first determining module 500 and a second determining module 600;
the first obtaining module 100 is configured to obtain an image to be processed;
the second obtaining module 200 is configured to obtain gradients of each pixel point in the image to be processed in a vertical direction and a horizontal direction, respectively;
in some possible implementation manners of this embodiment, the second obtaining module 200 includes: a fifth acquisition sub-module, an image processing sub-module, and a sixth acquisition sub-module (all not shown in the figure);
the fifth obtaining submodule is used for respectively obtaining the gray level mean values of all the pixel points in the first preset range, the second preset range, the third preset range and the fourth preset range to obtain a first gray level mean value, a second gray level mean value, a third gray level mean value and a fourth gray level mean value;
the first preset range is located right above a preset point, the second preset range is located right below the preset point, the third preset range is located right left of the preset point, the fourth preset range is located right of the preset point, the first preset range, the second preset range, the third preset range and the fourth preset range are the same in size and the same in distance with the preset point, and the preset point is any pixel point in the image to be processed;
the image processing submodule is used for completing pixel points in the preset range according to the gray level of the edge pixel points of the image to be processed when pixel points are lacked in the first preset range, the second preset range, the third preset range or the fourth preset range;
the sixth obtaining submodule is configured to obtain a gradient of the preset point in the vertical direction according to the first gray average value and the second gray average value;
the sixth obtaining sub-module is further configured to obtain a gradient of the preset point in the horizontal direction according to the third gray average value and the fourth gray average value.
In some possible implementation manners of this embodiment, the fifth obtaining sub-module includes: a seventh acquisition sub-module, an eighth acquisition sub-module, and a ninth acquisition sub-module (all not shown in the figure);
the seventh obtaining submodule is used for respectively obtaining the accumulated sum of the gray levels of all pixel points in the first area, the second area, the third area and the fourth area in the image to be processed to obtain a first value, a second value, a third value and a fourth value;
the first area is a square area determined by a first point and the pixel point at the upper left corner of the image to be processed, the second area is a square area determined by a second point and the pixel point at the upper left corner of the image to be processed, the third area is a square area determined by a third point and the pixel point at the upper left corner of the image to be processed, and the fourth area is a square area determined by the pixel point at the lower right corner of a first range and the pixel point at the upper left corner of the image to be processed; the coordinates of the first point are (x1-1, y1-1), the coordinates of the second point are (x2-1, y2), the coordinates of the third point are (x3, y3-1), the coordinates of the pixel point at the upper left corner of the first range are (x1, y1), the coordinates of the pixel point at the lower left corner of the first range are (x2, y2), and the coordinates of the pixel point at the upper right corner of the first range are (x3, y3); the first range is the first preset range, the second preset range, the third preset range or the fourth preset range;
the eighth obtaining submodule is configured to add the first value to the fourth value and then subtract the second value and the third value to obtain the accumulated sum of the gray levels of all pixel points in the first range, thereby obtaining the accumulated sums of the gray levels of all pixel points in the first preset range, the second preset range, the third preset range and the fourth preset range;
the ninth obtaining submodule is configured to obtain a mean gray level value of all pixel points in the first preset range, the second preset range, the third preset range, and the fourth preset range according to an accumulated sum of gray levels of all pixel points in the first preset range, the second preset range, the third preset range, and the fourth preset range.
In some possible implementation manners of this embodiment, the seventh obtaining sub-module includes: a first computation submodule and a second computation submodule (neither shown in the figure);
the first calculation submodule is used for calculating, according to the formula
sum_l_{m,n} = Σ_{i=1}^{n} G_{m,i},
the row accumulated value sum_l_{m,n} of the pixel point with the coordinate (m, n) in the image to be processed, wherein G_{m,i} is the gray level of the pixel point with the coordinate (m, i) in the image to be processed;
the second calculation submodule is used for calculating, according to the formula
sum_all_{m,n} = Σ_{j=1}^{m} sum_l_{j,n},
the accumulated sum sum_all_{m,n} of the gray levels of all pixel points in the area corresponding to the pixel point with the coordinate (m, n);
And the pixel point with the coordinate of (m, n) is the pixel point at the lower right corner of the first area, the second area, the third area or the fourth area.
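Taken together, the two formulas describe an integral image: sum_l accumulates gray levels along a row, sum_all accumulates sum_l down the rows, and the gray-level sum of any rectangular range then follows from four look-ups (first value plus fourth value minus second value minus third value). A minimal sketch, with illustrative function names, is:

```python
import numpy as np

def integral_image(img: np.ndarray) -> np.ndarray:
    g = img.astype(np.float64)
    sum_l = np.cumsum(g, axis=1)        # row accumulated value sum_l[m, n]
    sum_all = np.cumsum(sum_l, axis=0)  # sum_all[m, n]: sum over rows <= m and columns <= n
    return sum_all

def rect_sum(sum_all: np.ndarray, y1: int, x1: int, y2: int, x2: int) -> float:
    """Gray-level sum over the rectangle with corners (y1, x1) and (y2, x2), inclusive."""
    total = sum_all[y2, x2]                  # "fourth value": whole area up to (y2, x2)
    if y1 > 0:
        total -= sum_all[y1 - 1, x2]         # subtract the strip above the range
    if x1 > 0:
        total -= sum_all[y2, x1 - 1]         # subtract the strip to the left of the range
    if y1 > 0 and x1 > 0:
        total += sum_all[y1 - 1, x1 - 1]     # "first value": add back the doubly removed corner
    return float(total)
```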
The third obtaining module 300 is configured to obtain a gradient cumulative sum of all pixel points in each straight line in the image to be processed in the vertical direction, so as to obtain a first set;
in some possible implementation manners of this embodiment, the third obtaining module 300 specifically includes: a first generation sub-module and a first processing sub-module (neither shown in the figure);
the first generation submodule is used for generating a vertical gradient image according to the gradient of each pixel point in the image to be processed in the vertical direction;
the first processing submodule is used for carrying out Radon transform on the vertical gradient image to obtain the first set;
the fourth obtaining module 400 is configured to obtain a gradient cumulative sum of all pixel points in each straight line in the image to be processed in the horizontal direction, so as to obtain a second set;
in some possible implementation manners of this embodiment, the fourth obtaining module 400 specifically includes: a second generation submodule and a second processing submodule (both not shown in the figure);
the second generation submodule is used for generating a horizontal gradient image according to the gradient of each pixel point in the image to be processed in the horizontal direction;
and the second processing submodule is used for carrying out Radon transform on the horizontal gradient image to obtain the second set.
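One possible way to realise this gradient accumulation over every straight line is to apply the Radon transform to the two gradient images; the use of scikit-image below and the angle sampling are assumptions made for illustration only.

```python
import numpy as np
from skimage.transform import radon

def line_gradient_sums(grad_v: np.ndarray, grad_h: np.ndarray, angles=None):
    """Accumulate each gradient image along straight lines at the given angles."""
    if angles is None:
        angles = np.arange(0.0, 180.0, 1.0)                  # one projection per degree
    first_set = radon(grad_v, theta=angles, circle=False)    # vertical-gradient line sums
    second_set = radon(grad_h, theta=angles, circle=False)   # horizontal-gradient line sums
    return first_set, second_set
```

Each entry of the returned arrays is the gradient accumulation of one straight line; the largest and smallest entries of the first set then give the first and second edges, and those of the second set give the third and fourth edges.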
The first determining module 500 is configured to search for a maximum value and a minimum value in the first set, determine that a straight line corresponding to the maximum value in the first set is a first edge, and determine that a straight line corresponding to the minimum value in the first set is a second edge;
the first determining module 500 is further configured to search for a maximum value and a minimum value in the second set, determine that a straight line corresponding to the maximum value in the second set is a third edge, and determine that a straight line corresponding to the minimum value in the second set is a fourth edge;
the second determining module 600 is configured to determine that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of the beam limiter in the image to be processed.
In some possible implementation manners of this embodiment, the second determining module 600 includes: a first determination submodule (not shown in the figure);
the first determining submodule is used for determining that the first edge is a first boundary of an irradiation field of the beam limiter in the image to be processed when the absolute value of the maximum value in the first set is larger than a first initial threshold;
the first determining submodule is further configured to determine that the second edge is a second boundary of an irradiation field of the beam limiter in the image to be processed when an absolute value of a minimum value in the first set is greater than a second initial threshold;
the first determining submodule is further configured to determine that the third edge is a third boundary of an irradiation field of the beam limiter in the image to be processed, when an absolute value of a maximum value in the second set is greater than a third initial threshold;
the first determining sub-module is further configured to determine, when an absolute value of a minimum value in the second set is greater than a fourth initial threshold, that the fourth edge is a fourth boundary of an irradiation field of the beam limiter in the image to be processed.
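The acceptance test performed by the first determining submodule can be sketched as follows; t1 to t4 stand for the four initial thresholds and are placeholders here.

```python
import numpy as np

def accept_edges(first_set: np.ndarray, second_set: np.ndarray, t1, t2, t3, t4):
    """Keep an extreme accumulated gradient only if its absolute value exceeds its threshold."""
    boundaries = {}
    if abs(first_set.max()) > t1:
        boundaries["first"] = np.unravel_index(np.argmax(first_set), first_set.shape)
    if abs(first_set.min()) > t2:
        boundaries["second"] = np.unravel_index(np.argmin(first_set), first_set.shape)
    if abs(second_set.max()) > t3:
        boundaries["third"] = np.unravel_index(np.argmax(second_set), second_set.shape)
    if abs(second_set.min()) > t4:
        boundaries["fourth"] = np.unravel_index(np.argmin(second_set), second_set.shape)
    return boundaries
```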
In some possible implementation manners of this embodiment, the apparatus provided in this embodiment further includes: a third determining module, a fourth determining module, a fifth obtaining module and a threshold setting module (all not shown in the figure);
the first obtaining module 100 is further configured to obtain a low-frequency image of the image to be processed;
the third determining module is configured to determine a region in the low-frequency image with Y-axis coordinates smaller than (H/2) - H×f as an upper moving region of the boundary of the irradiation field of the beam limiter, and determine a region in the low-frequency image with Y-axis coordinates larger than (H/2) + H×f as a lower moving region of the boundary of the irradiation field of the beam limiter; the third determining module is further configured to determine a region in the low-frequency image with X-axis coordinates smaller than (W/2) - W×f as a left moving region of the boundary of the irradiation field of the beam limiter, and determine a region in the low-frequency image with X-axis coordinates larger than (W/2) + W×f as a right moving region of the boundary of the irradiation field of the beam limiter; wherein H is the maximum value of the Y-axis coordinate of the low-frequency image, W is the maximum value of the X-axis coordinate of the low-frequency image, and f is a preset truncation coefficient;
the fourth determining module is configured to determine correspondence relationships between the first edge, the second edge, the third edge, and the fourth edge and the upper moving area, the lower moving area, the left moving area, and the right moving area, where each moving area corresponds to one edge;
the fifth obtaining module is configured to obtain gray level mean values of all pixel points in the upper moving region, the lower moving region, the left moving region and the right moving region respectively to obtain an upper gray level mean value, a lower gray level mean value, a left gray level mean value and a right gray level mean value;
the threshold setting module is used for setting the judgment threshold of the upper moving area to be equal to a preset maximum judgment threshold when the upper gray average value is greater than or equal to a preset maximum gray average value A; setting the judgment threshold of the upper moving area to be equal to a preset minimum judgment threshold when the upper gray average value is less than or equal to a preset minimum gray average value B; and setting the judgment threshold of the upper moving area according to the upper gray average value when the upper gray average value is greater than the preset minimum gray average value B and smaller than the preset maximum gray average value A;
the threshold setting module is further configured to set the judgment threshold of the lower moving area to be equal to the preset maximum judgment threshold when the lower gray average value is greater than or equal to the preset maximum gray average value A; set the judgment threshold of the lower moving area to be equal to the preset minimum judgment threshold when the lower gray average value is less than or equal to the preset minimum gray average value B; and set the judgment threshold of the lower moving area according to the lower gray average value when the lower gray average value is greater than the preset minimum gray average value B and smaller than the preset maximum gray average value A;
the threshold setting module is further configured to set the judgment threshold of the left moving area to be equal to the preset maximum judgment threshold when the left gray average value is greater than or equal to the preset maximum gray average value A; set the judgment threshold of the left moving area to be equal to the preset minimum judgment threshold when the left gray average value is less than or equal to the preset minimum gray average value B; and set the judgment threshold of the left moving area according to the left gray average value when the left gray average value is greater than the preset minimum gray average value B and smaller than the preset maximum gray average value A;
the threshold setting module is further configured to set the judgment threshold of the right moving area to be equal to the preset maximum judgment threshold when the right gray average value is greater than or equal to the preset maximum gray average value A; set the judgment threshold of the right moving area to be equal to the preset minimum judgment threshold when the right gray average value is less than or equal to the preset minimum gray average value B; and set the judgment threshold of the right moving area according to the right gray average value when the right gray average value is greater than the preset minimum gray average value B and smaller than the preset maximum gray average value A;
the first initial threshold, the second initial threshold, the third initial threshold, and the fourth initial threshold are respectively equal to judgment thresholds of moving areas corresponding to the first edge, the second edge, the third edge, and the fourth edge.
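The per-region threshold rule applied by the threshold setting module can be sketched as follows; the linear interpolation used for gray averages strictly between B and A is an assumption, since the text only states that the threshold is set according to the region gray average.

```python
def region_threshold(mean_gray: float, A: float, B: float,
                     t_max: float, t_min: float) -> float:
    """Judgment threshold of a moving area from its gray average (A > B, t_max > t_min)."""
    if mean_gray >= A:
        return t_max                 # preset maximum judgment threshold
    if mean_gray <= B:
        return t_min                 # preset minimum judgment threshold
    # assumed monotone interpolation between the two preset judgment thresholds
    return t_min + (t_max - t_min) * (mean_gray - B) / (A - B)
```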
In some possible implementation manners, the apparatus provided in this embodiment further includes: a fifth determining module, a sixth determining module and a searching module (all not shown in the figure);
the fifth determining module is configured to determine a region in the to-be-processed image with Y-axis coordinates smaller than (H/2) - H×f as an upper region of the irradiation field boundary of the beam limiter, determine a region in the to-be-processed image with Y-axis coordinates larger than (H/2) + H×f as a lower region of the irradiation field boundary of the beam limiter, determine a region in the to-be-processed image with X-axis coordinates smaller than (W/2) - W×f as a left region of the irradiation field boundary of the beam limiter, and determine a region in the to-be-processed image with X-axis coordinates larger than (W/2) + W×f as a right region of the irradiation field boundary of the beam limiter, wherein H is the maximum value of the Y-axis coordinate of the low-frequency image, W is the maximum value of the X-axis coordinate of the low-frequency image, and f is a preset truncation coefficient;
the sixth determining module is configured to determine correspondence relationships between the first edge, the second edge, the third edge, and the fourth edge and the upper area, the lower area, the left area, and the right area, where each area corresponds to one edge;
the searching module is configured to search the boundary of the beam limiter irradiation field again in an area where the boundary is not determined when the first boundary and the second boundary are not determined at the same time as the boundary of the beam limiter irradiation field in the image to be processed and/or when the third boundary and the fourth boundary are not determined at the same time as the boundary of the beam limiter irradiation field in the image to be processed;
and the area of the undetermined boundary is the area corresponding to the edge of the irradiation field boundary of the beam limiter, which is not determined in the image to be processed.
In some possible implementation manners of this embodiment, the search module includes: a setting sub-module, a first judging sub-module, a second determining sub-module and a first obtaining sub-module (all not shown in the figure);
the setting submodule is used for setting a secondary judgment threshold according to the initial threshold corresponding to the undetermined edge;
the first judging submodule is configured to judge whether absolute values of gradient accumulation sums of all pixel points in the undetermined edge in the vertical direction are greater than the secondary judgment threshold, or judge whether absolute values of gradient accumulation sums of all pixel points in the undetermined edge in the horizontal direction are greater than the secondary judgment threshold;
the second determining submodule is used for determining that the edge of the image to be processed in the area of the undetermined boundary is the boundary of the irradiation field of the beam limiter when the judgment result of the first judging submodule is negative;
the first obtaining sub-module is configured to, if a determination result of the first determining sub-module is yes, obtain a boundary of the irradiation field of the beam limiter in the area where the boundary is not determined.
In some possible implementation manners of this embodiment, the first obtaining sub-module includes: a second obtaining submodule, a third determining submodule, a second setting submodule, a third obtaining submodule, a second judging submodule, a fourth determining submodule and a fifth determining submodule (all of which are not shown in the figure);
the second obtaining submodule is used for obtaining a low-frequency image of the image to be processed;
the third determining submodule is used for determining a region corresponding to the region of the undetermined boundary in the low-frequency image to obtain a first search region;
the second setting submodule is used for acquiring the gray level mean value of all pixel points in the first search area and setting a first threshold according to the gray level mean value of all pixel points in the first search area;
the third obtaining submodule is used for obtaining the gray average value of all pixel points on each straight line in the first searching area to obtain a third set;
the second judgment submodule is configured to judge whether the minimum value in the third set is greater than the first threshold;
the fourth determining submodule is configured to determine, when the judgment result of the second judgment submodule is negative, that the edge of the image to be processed in the area where the boundary is not determined is the boundary of the irradiation field of the beam limiter;
and the fifth determining submodule is configured to determine, when the judgment result of the second judgment submodule is yes, that the to-be-determined straight line is the boundary of the irradiation field of the beam limiter in the region where the boundary is not determined, wherein the position of the straight line corresponding to the minimum value in the third set in the low-frequency image is the same as the position of the to-be-determined straight line in the to-be-processed image.
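For a horizontal boundary, the secondary search performed in the low-frequency image can be sketched as follows; the per-row gray means play the role of the third set, and a return value of None stands for falling back to the image edge as the boundary.

```python
import numpy as np

def search_low_freq(low_freq_region: np.ndarray, first_threshold: float):
    """Search the first search region of the low-frequency image for a candidate boundary row."""
    third_set = low_freq_region.mean(axis=1)   # gray mean of every straight line (row)
    idx = int(np.argmin(third_set))
    if third_set[idx] > first_threshold:
        return idx                             # to-be-determined straight line
    return None                                # use the edge of the image instead
```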
In some possible implementation manners of this embodiment, the fifth determining sub-module includes: a fourth obtaining sub-module, a sixth determining sub-module, a third judging sub-module, a seventh determining sub-module and an eighth determining sub-module (all not shown in the figure);
the fourth obtaining submodule is used for obtaining a high-frequency image of the image to be processed according to the image to be processed and the low-frequency image;
the sixth determining submodule is used for determining a region corresponding to the region of the undetermined boundary in the high-frequency image to obtain a second search region;
the third judgment submodule is used for acquiring the gray level mean value of all the pixel points in the second search area and judging whether the gray level mean value of all the pixel points in the second search area is larger than a second threshold value or not;
the seventh determining submodule is configured to determine that the to-be-determined straight line is the boundary of the irradiation field of the beam limiter in the to-be-processed image if the judgment result of the third judgment submodule is negative;
and the eighth determining submodule is configured to determine, when the judgment result of the third judgment submodule is yes, that the edge of the image to be processed in the area where the boundary is not determined is the boundary of the irradiation field of the beam limiter.
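The high-frequency verification can be sketched as follows, assuming the high-frequency image is the difference between the image to be processed and its low-frequency image; a True return value means the to-be-determined straight line is kept as the boundary, otherwise the image edge is used.

```python
import numpy as np

def verify_with_high_freq(image: np.ndarray, low_freq: np.ndarray,
                          region_slice, second_threshold: float) -> bool:
    """region_slice is e.g. (slice(r0, r1), slice(c0, c1)) selecting the second search region."""
    high_freq = image.astype(np.float64) - low_freq.astype(np.float64)
    region_mean = high_freq[region_slice].mean()
    return region_mean <= second_threshold
```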
In some possible implementation manners of this embodiment, the second determining module 600 includes: a boundary determination submodule (not shown in the figure);
the boundary determining submodule is used for determining the first edge as a first boundary of an irradiation field of the beam limiter in the image to be processed when the absolute value of the difference between the maximum value in the first set and the gradient accumulation sum of all pixel points on the first straight line in the vertical direction is larger than a first preset difference value;
the boundary determining submodule is further configured to determine that the second edge is a second boundary of the irradiation field of the beam limiter in the image to be processed when an absolute value of a difference between a minimum value in the first set and a gradient accumulation sum of all pixel points on a second straight line in the vertical direction is greater than a second preset difference value;
the boundary determining submodule is further configured to determine that the third edge is a third boundary of the beam limiter illumination field in the image to be processed when an absolute value of a difference between a maximum value in the second set and a gradient accumulation sum of all pixel points on a third straight line in the horizontal direction is greater than a third preset difference value;
the boundary determining submodule is further configured to determine that the fourth edge is a fourth boundary of the irradiation field of the beam limiter in the image to be processed when an absolute value of a difference between the minimum value in the second set and a gradient accumulation sum of all pixel points on a fourth straight line in the horizontal direction is greater than a fourth preset difference value;
wherein the absolute value of the difference between the distance between the first straight line and a first preset point in the image to be processed and the distance between the first boundary and the first preset point is smaller than a first distance; the absolute value of the difference between the distance between the second straight line and a second preset point in the image to be processed and the distance between the second boundary and the second preset point is smaller than a second distance; the absolute value of the difference between the distance between the third straight line and a third preset point in the image to be processed and the distance between the first boundary and the third preset point is smaller than a third distance; and the absolute value of the difference between the distance between the fourth straight line and a fourth preset point in the image to be processed and the distance between the fourth boundary and the fourth preset point is smaller than a fourth distance.
In some possible implementation manners, the apparatus provided in this embodiment further includes: a first judgment module and a second judgment module (both not shown in the figure);
the first judging module is used for judging whether the gradient of each pixel point in the image to be processed in the vertical direction is larger than a first preset gradient one by one, and if not, the gradient of the pixel point in the vertical direction is set to be zero;
and the second judging module is used for judging whether the gradient of each pixel point in the image to be processed in the horizontal direction is larger than a second preset gradient one by one, and if not, setting the gradient of the pixel point in the horizontal direction to be zero.
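A sketch of this gradient pruning is given below; comparing gradient magnitudes with the preset values is an interpretation of "larger than a preset gradient" and is an assumption.

```python
import numpy as np

def prune_gradients(grad_v: np.ndarray, grad_h: np.ndarray, g1: float, g2: float):
    """Zero out gradients whose magnitude does not exceed the preset gradients g1 and g2."""
    grad_v = np.where(np.abs(grad_v) > g1, grad_v, 0.0)
    grad_h = np.where(np.abs(grad_h) > g2, grad_h, 0.0)
    return grad_v, grad_h
```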
In some possible implementation manners of this embodiment, the second determining module 600 specifically includes: a fourth judgment sub-module, a fifth judgment sub-module, and a ninth determination sub-module (all not shown in the drawings);
the fourth judgment submodule is configured to respectively judge whether the first edge and the second edge are parallel and whether the third edge and the fourth edge are parallel;
the fifth judgment sub-module is configured to, when the fourth judgment sub-module judges that the first edge and the second edge are parallel and the third edge and the fourth edge are parallel, continuously judge whether two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are perpendicular to each other;
the ninth determining submodule is configured to determine that the first edge, the second edge, the third edge, and the fourth edge are boundaries of a radiation field of the beam limiter in the image to be processed, when the fifth determining submodule determines that two adjacent edges of the first edge, the second edge, the third edge, and the fourth edge are all perpendicular to each other.
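When each edge is described by a Radon projection angle in degrees (taken here to lie in [0, 180)), the parallelism and perpendicularity checks reduce to angle comparisons, as in the following sketch; the tolerance is an illustrative assumption.

```python
def edges_form_rectangle(theta1: float, theta2: float,
                         theta3: float, theta4: float, tol: float = 1.0) -> bool:
    """First/second edges parallel, third/fourth edges parallel, adjacent edges perpendicular."""
    parallel_12 = abs(theta1 - theta2) <= tol
    parallel_34 = abs(theta3 - theta4) <= tol
    perpendicular = abs(abs(theta1 - theta3) - 90.0) <= tol
    return parallel_12 and parallel_34 and perpendicular
```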
In some possible implementation manners, the apparatus provided in this embodiment further includes: a third judging module, a fourth judging module and a seventh determining module (all not shown in the figure);
the third judging module is configured to respectively judge whether the first boundary and the second boundary are parallel and whether the third boundary and the fourth boundary are parallel;
the fourth determining module is configured to, when the third determining module determines that the first boundary and the second boundary are parallel and the third boundary and the fourth boundary are parallel, continue to determine whether two adjacent boundaries of the first boundary, the second boundary, the third boundary and the fourth boundary are perpendicular to each other;
the seventh determining module is configured to, when the fourth determining module determines that two adjacent boundaries of the first boundary, the second boundary, the third boundary, and the fourth boundary are all perpendicular to each other, determine that the first boundary, the second boundary, the third boundary, and the fourth boundary are boundaries of an irradiation field of a beam limiter in the image to be processed.
In some possible implementation manners, the apparatus provided in this embodiment further includes: a first processing module, a second processing module and a third processing module (none shown in the figures);
the first processing module is used for performing edge deletion processing on the image to be processed, and triggering the second processing module after removing a first preset number of pixel points on the edge of the image to be processed;
the second processing module is configured to perform convolution calculation on the to-be-processed image to update the to-be-processed image, remove a second preset number of pixel points at the edge of the to-be-processed image, and trigger the third processing module;
the third processing module is configured to perform downsampling processing on the to-be-processed image, and update the to-be-processed image;
and the size of a convolution kernel used for the convolution calculation is equal to the sampling interval in the down-sampling processing.
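The preprocessing chain of the first, second and third processing modules can be sketched as follows; the border width, the box-kernel smoothing and the number of pixels trimmed after the convolution are illustrative assumptions, with the sampling interval equal to the kernel size as stated above.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def preprocess(image: np.ndarray, border: int = 4, k: int = 4) -> np.ndarray:
    """Trim the image border, smooth with a k x k box kernel, then downsample at interval k."""
    if border > 0:
        image = image[border:-border, border:-border]  # remove a first preset number of edge pixels
    img = uniform_filter(image.astype(np.float64), size=k)  # convolution (box kernel of size k)
    t = k // 2                                         # pixels affected by the kernel at the edge
    if t > 0:
        img = img[t:-t, t:-t]                          # remove a second preset number of edge pixels
    return img[::k, ::k]                               # downsample at sampling interval k
```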
In the apparatus for acquiring the irradiation field boundary of the beam limiter in a medical diagnostic image according to this embodiment, after the first acquiring module acquires the image to be processed, the second acquiring module calculates the gradients of the image to be processed in the vertical direction and the horizontal direction, and the third acquiring module and the fourth acquiring module then calculate the gradient cumulative sum of all pixel points on each straight line in the image to be processed in the vertical direction and the horizontal direction, so as to obtain the vertical gradient cumulative sum and the horizontal gradient cumulative sum of every straight line in the image to be processed. The first determining module then searches for the straight lines corresponding to the maximum vertical gradient cumulative sum, the minimum vertical gradient cumulative sum, the maximum horizontal gradient cumulative sum and the minimum horizontal gradient cumulative sum, obtaining four edges, and the second determining module determines the four edges as the boundary lines of the irradiation field of the beam limiter in the image to be processed. The irradiation field of the beam limiter in the image to be processed is the internal area enclosed by the four edges. The apparatus provided by this embodiment can accurately detect a beam limiter irradiation field whose boundary has weak contrast or is discontinuous under the influence of noise, and obtain a continuous irradiation field boundary, so that the region outside the irradiation field of the beam limiter can be removed in subsequent processing and the precision of later image processing results is improved.
It should be noted that the embodiments in this specification are described in a progressive manner, each embodiment focuses on its differences from the other embodiments, and for the same or similar parts the embodiments may be referred to one another. Since the system or apparatus disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively brief, and reference may be made to the description of the method for the relevant points.
It is further noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The foregoing is merely a preferred embodiment of the invention and is not intended to limit the invention in any way. Although the present invention has been disclosed above with reference to preferred embodiments, they are not intended to limit it. Those skilled in the art can, without departing from the scope of the technical solution of the present invention, use the methods and technical contents disclosed above to make various possible changes and modifications to the technical solution, or amend it into equivalent embodiments of equivalent variation. Therefore, any simple modification, equivalent change or modification made to the above embodiments according to the technical essence of the present invention, without departing from the content of the technical solution of the present invention, still falls within the scope of protection of the technical solution of the present invention.

Claims (14)

1. A method of obtaining a boundary of an illumination field of a beam limiter in a medical diagnostic image, comprising:
acquiring an image to be processed;
respectively acquiring the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction;
acquiring gradient accumulated sum of all pixel points on each straight line in the image to be processed in the vertical direction to obtain a first set;
acquiring the gradient accumulated sum of all pixel points on each straight line in the image to be processed in the horizontal direction to obtain a second set;
searching the maximum value and the minimum value in the first set, determining that a straight line corresponding to the maximum value in the first set is a first edge, and determining that a straight line corresponding to the minimum value in the first set is a second edge;
searching the maximum value and the minimum value in the second set, determining that the straight line corresponding to the maximum value in the second set is a third edge, and determining that the straight line corresponding to the minimum value in the second set is a fourth edge;
determining the first edge, the second edge, the third edge and the fourth edge as boundaries of an irradiation field of a beam limiter in the image to be processed;
after the obtaining the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction respectively, the method further comprises the following steps: judging whether the gradient of each pixel point in the image to be processed in the vertical direction is larger than a first preset gradient one by one, and if not, setting the gradient of the pixel point in the vertical direction to be zero; and judging whether the gradient of each pixel point in the image to be processed in the horizontal direction is larger than a second preset gradient one by one, and if not, setting the gradient of the pixel point in the horizontal direction to be zero.
2. The method according to claim 1, wherein the obtaining the gradient of each pixel point in the image to be processed in the vertical direction and the horizontal direction respectively specifically comprises:
respectively obtaining the gray level mean values of all pixel points in a first preset range, a second preset range, a third preset range and a fourth preset range to obtain a first gray level mean value, a second gray level mean value, a third gray level mean value and a fourth gray level mean value;
the first preset range is located right above a preset point, the second preset range is located right below the preset point, the third preset range is located right left of the preset point, the fourth preset range is located right of the preset point, the first preset range, the second preset range, the third preset range and the fourth preset range are the same in size and the same in distance with the preset point, and the preset point is any pixel point in the image to be processed;
acquiring the gradient of the preset point in the vertical direction according to the first gray average value and the second gray average value;
and acquiring the gradient of the preset point in the horizontal direction according to the third gray average value and the fourth gray average value.
3. The method according to claim 1, wherein the determining that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed specifically includes:
when the absolute value of the maximum value in the first set is larger than a first initial threshold value, determining that the first edge is a first boundary of an irradiation field of a beam limiter in the image to be processed;
when the absolute value of the minimum value in the first set is larger than a second initial threshold, determining that the second edge is a second boundary of an irradiation field of a beam limiter in the image to be processed;
when the absolute value of the maximum value in the second set is greater than a third initial threshold, determining that the third edge is a third boundary of an irradiation field of a beam limiter in the image to be processed;
and when the absolute value of the minimum value in the second set is greater than a fourth initial threshold, determining that the fourth edge is a fourth boundary of the irradiation field of the beam limiter in the image to be processed.
4. The method of claim 3, further comprising:
acquiring a low-frequency image of the image to be processed;
determining a region with Y-axis coordinates smaller than (H/2) - H×f in the low-frequency image as an upper moving region of the boundary of the irradiation field of the beam limiter, and determining a region with Y-axis coordinates larger than (H/2) + H×f in the low-frequency image as a lower moving region of the boundary of the irradiation field of the beam limiter;
determining a region with X-axis coordinates smaller than (W/2) - W×f in the low-frequency image as a left moving region of the boundary of the irradiation field of the beam limiter, and determining a region with X-axis coordinates larger than (W/2) + W×f in the low-frequency image as a right moving region of the boundary of the irradiation field of the beam limiter;
determining correspondence between the first edge, the second edge, the third edge, and the fourth edge and the upper movement region, the lower movement region, the left movement region, and the right movement region, each movement region corresponding to one edge;
respectively calculating the gray average values of all pixel points in the upper moving area, the lower moving area, the left moving area and the right moving area to obtain an upper gray average value, a lower gray average value, a left gray average value and a right gray average value;
when the upper gray average value is greater than or equal to A, the judgment threshold value of the upper moving area is equal to a preset maximum judgment threshold value; when the upper gray average value is less than or equal to B, the judgment threshold value of the upper moving area is equal to a preset minimum judgment threshold value; when the upper gray average value is larger than B and smaller than A, setting a judgment threshold value of the upper moving area according to the upper gray average value;
when the lower gray average value is greater than or equal to A, the judgment threshold value of the lower moving area is equal to the preset maximum judgment threshold value; when the lower gray average value is less than or equal to B, the judgment threshold value of the lower moving area is equal to the preset minimum judgment threshold value; when the lower gray average value is larger than B and smaller than A, setting a judgment threshold value of the lower moving area according to the lower gray average value;
when the left part gray average value is greater than or equal to A, the judgment threshold value of the left moving area is equal to the preset maximum judgment threshold value; when the left part gray average value is less than or equal to B, the judgment threshold value of the left moving area is equal to the preset minimum judgment threshold value; when the left part gray level average value is larger than B and the left part gray level average value is smaller than A, setting a judgment threshold value of the left moving area according to the left part gray level average value;
when the right gray average value is greater than or equal to A, the judgment threshold value of the right moving area is equal to the preset maximum judgment threshold value; when the right gray average value is less than or equal to B, the judgment threshold value of the right moving area is equal to the preset minimum judgment threshold value; when the right gray average value is greater than B and smaller than A, setting a judgment threshold value of the right moving area according to the right gray average value;
H is the maximum value of the Y-axis coordinate of the low-frequency image, W is the maximum value of the X-axis coordinate of the low-frequency image, f is a preset truncation coefficient, A is a preset maximum gray average value, and B is a preset minimum gray average value;
the first initial threshold, the second initial threshold, the third initial threshold, and the fourth initial threshold are respectively equal to judgment thresholds of moving areas corresponding to the first edge, the second edge, the third edge, and the fourth edge.
5. The method according to claim 1, wherein the determining that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed specifically includes:
when the absolute value of the difference between the maximum value in the first set and the gradient accumulation sum of all pixel points on the first straight line in the vertical direction is larger than a first preset difference value, determining the first edge as a first boundary of an irradiation field of a beam limiter in the image to be processed;
when the absolute value of the difference between the minimum value in the first set and the gradient accumulation sum of all pixel points on the second straight line in the vertical direction is larger than a second preset difference value, determining the second edge as a second boundary of the beam limiter illumination field in the image to be processed;
when the absolute value of the difference between the maximum value in the second set and the gradient accumulation sum of all pixel points on a third straight line in the horizontal direction is larger than a third preset difference value, determining that the third edge is a third boundary of the beam limiter irradiation field in the image to be processed;
when the absolute value of the difference between the minimum value in the second set and the gradient accumulation sum of all pixel points on a fourth straight line in the horizontal direction is larger than a fourth preset difference value, determining that the fourth edge is a fourth boundary of the beam limiter illumination field in the image to be processed;
wherein the absolute value of the difference between the distance between the first straight line and a first preset point in the image to be processed and the distance between the first boundary and the first preset point is smaller than a first distance; the absolute value of the difference between the distance between the second straight line and a second preset point in the image to be processed and the distance between the second boundary and the second preset point is smaller than a second distance; the absolute value of the difference between the distance between the third straight line and a third preset point in the image to be processed and the distance between the first boundary and the third preset point is smaller than a third distance; and the absolute value of the difference between the distance between the fourth straight line and a fourth preset point in the image to be processed and the distance between the fourth boundary and the fourth preset point is smaller than a fourth distance.
6. The method of any of claims 3 to 5, further comprising:
when the first boundary and the second boundary are not determined to be the boundary of the beam limiter irradiation field in the image to be processed at the same time and/or the third boundary and the fourth boundary are not determined to be the boundary of the beam limiter irradiation field in the image to be processed at the same time, searching the boundary of the beam limiter irradiation field in the region without the determined boundary again;
wherein the region of undetermined boundary belongs to the image to be processed.
7. The method according to claim 1, wherein the determining that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed specifically includes:
respectively judging whether the first edge and the second edge are parallel and whether the third edge and the fourth edge are parallel;
when the first edge is parallel to the second edge and the third edge is parallel to the fourth edge, continuously judging whether two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are mutually vertical;
when two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are perpendicular to each other, determining that the first edge, the second edge, the third edge and the fourth edge are boundaries of an irradiation field of a beam limiter in the image to be processed.
8. The method of claim 1, wherein the obtaining the image to be processed further comprises:
performing edge removing processing on the image to be processed, and removing a first preset number of pixel points at the edge of the image to be processed;
after the image to be processed is subjected to convolution calculation and updated, removing a second preset number of pixel points at the edge of the image to be processed;
performing down-sampling processing on the image to be processed, and updating the image to be processed;
and the size of a convolution kernel used for the convolution calculation is equal to the sampling interval in the down-sampling processing.
9. An apparatus for obtaining a boundary of an irradiation field of a beam limiter in a medical diagnostic image, comprising: the device comprises a first acquisition module, a second acquisition module, a third acquisition module, a fourth acquisition module, a first determination module and a second determination module;
the first acquisition module is used for acquiring an image to be processed;
the second obtaining module is used for respectively obtaining the gradients of each pixel point in the image to be processed in the vertical direction and the horizontal direction;
the third obtaining module is configured to obtain a gradient accumulated sum in the vertical direction of all pixel points on each straight line in the image to be processed, so as to obtain a first set;
the fourth obtaining module is configured to obtain a gradient accumulated sum of all pixel points in each straight line in the image to be processed in the horizontal direction to obtain a second set;
the first determining module is configured to search for a maximum value and a minimum value in the first set, determine that a straight line corresponding to the maximum value in the first set is a first edge, and determine that a straight line corresponding to the minimum value in the first set is a second edge;
the first determining module is further configured to search for a maximum value and a minimum value in the second set, determine that a straight line corresponding to the maximum value in the second set is a third edge, and determine that a straight line corresponding to the minimum value in the second set is a fourth edge;
the second determining module is configured to determine that the first edge, the second edge, the third edge, and the fourth edge are boundaries of an irradiation field of the beam limiter in the image to be processed;
after the second obtaining module, the apparatus further comprises:
the judging unit is used for judging whether the gradient of each pixel point in the image to be processed in the vertical direction is larger than a first preset gradient one by one, and if not, the gradient of the pixel point in the vertical direction is set to be zero; and judging whether the gradient of each pixel point in the image to be processed in the horizontal direction is larger than a second preset gradient one by one, and if not, setting the gradient of the pixel point in the horizontal direction to be zero.
10. The apparatus of claim 9, wherein the second determining module comprises: a first determination submodule;
the first determining submodule is used for determining that the first edge is a first boundary of an irradiation field of the beam limiter in the image to be processed when the absolute value of the maximum value in the first set is larger than a first initial threshold;
the first determining submodule is further configured to determine that the second edge is a second boundary of an irradiation field of the beam limiter in the image to be processed when an absolute value of a minimum value in the first set is greater than a second initial threshold;
the first determining submodule is further configured to determine that the third edge is a third boundary of an irradiation field of the beam limiter in the image to be processed, when an absolute value of a maximum value in the second set is greater than a third initial threshold;
the first determining sub-module is further configured to determine, when an absolute value of a minimum value in the second set is greater than a fourth initial threshold, that the fourth edge is a fourth boundary of an irradiation field of the beam limiter in the image to be processed.
11. The apparatus of claim 9, wherein the second determining module comprises: a boundary determination submodule;
the boundary determining submodule is used for determining the first edge as a first boundary of an irradiation field of the beam limiter in the image to be processed when the absolute value of the difference between the maximum value in the first set and the gradient accumulation sum of all pixel points on the first straight line in the vertical direction is larger than a first preset difference value;
the boundary determining submodule is further configured to determine that the second edge is a second boundary of the irradiation field of the beam limiter in the image to be processed when an absolute value of a difference between a minimum value in the first set and a gradient accumulation sum of all pixel points on a second straight line in the vertical direction is greater than a second preset difference value;
the boundary determining submodule is further configured to determine that the third edge is a third boundary of the beam limiter illumination field in the image to be processed when an absolute value of a difference between a maximum value in the second set and a gradient accumulation sum of all pixel points on a third straight line in the horizontal direction is greater than a third preset difference value;
the boundary determining submodule is further configured to determine that the fourth edge is a fourth boundary of the irradiation field of the beam limiter in the image to be processed when an absolute value of a difference between the minimum value in the second set and a gradient accumulation sum of all pixel points on a fourth straight line in the horizontal direction is greater than a fourth preset difference value;
wherein the absolute value of the difference between the distance between the first straight line and a first preset point in the image to be processed and the distance between the first boundary and the first preset point is smaller than a first distance; the absolute value of the difference between the distance between the second straight line and a second preset point in the image to be processed and the distance between the second boundary and the second preset point is smaller than a second distance; the absolute value of the difference between the distance between the third straight line and a third preset point in the image to be processed and the distance between the first boundary and the third preset point is smaller than a third distance; and the absolute value of the difference between the distance between the fourth straight line and a fourth preset point in the image to be processed and the distance between the fourth boundary and the fourth preset point is smaller than a fourth distance.
12. The apparatus of claim 10 or 11, further comprising: a search module;
the searching module is configured to search the boundary of the beam limiter irradiation field again in an area where the boundary is not determined when the first boundary and the second boundary are not determined at the same time as the boundary of the beam limiter irradiation field in the image to be processed and/or when the third boundary and the fourth boundary are not determined at the same time as the boundary of the beam limiter irradiation field in the image to be processed;
wherein the region of undetermined boundary belongs to the image to be processed.
13. The apparatus according to claim 9, wherein the second determining module specifically includes: a fourth judgment submodule, a fifth judgment submodule and a ninth determination submodule;
the fourth judgment submodule is configured to respectively judge whether the first edge and the second edge are parallel and whether the third edge and the fourth edge are parallel;
the fifth judgment sub-module is configured to, when the fourth judgment sub-module judges that the first edge and the second edge are parallel and the third edge and the fourth edge are parallel, continuously judge whether two adjacent edges of the first edge, the second edge, the third edge and the fourth edge are perpendicular to each other;
the ninth determining submodule is configured to determine that the first edge, the second edge, the third edge, and the fourth edge are boundaries of a radiation field of the beam limiter in the image to be processed, when the fifth determining submodule determines that two adjacent edges of the first edge, the second edge, the third edge, and the fourth edge are all perpendicular to each other.
14. The apparatus of claim 9, further comprising: the system comprises a first processing module, a second processing module and a third processing module;
the first processing module is used for performing edge deletion processing on the image to be processed, and triggering the second processing module after removing a first preset number of pixel points on the edge of the image to be processed;
the second processing module is configured to perform convolution calculation on the to-be-processed image to update the to-be-processed image, remove a second preset number of pixel points at the edge of the to-be-processed image, and trigger the third processing module;
the third processing module is configured to perform downsampling processing on the to-be-processed image, and update the to-be-processed image;
and the size of a convolution kernel used for the convolution calculation is equal to the sampling interval in the down-sampling processing.
CN201610942122.1A 2016-10-25 2016-10-25 Method and device for acquiring irradiation field boundary of beam limiter in medical diagnosis image Active CN107977973B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610942122.1A CN107977973B (en) 2016-10-25 2016-10-25 Method and device for acquiring irradiation field boundary of beam limiter in medical diagnosis image

Publications (2)

Publication Number Publication Date
CN107977973A CN107977973A (en) 2018-05-01
CN107977973B true CN107977973B (en) 2020-08-11

Family

ID=62004089

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610942122.1A Active CN107977973B (en) 2016-10-25 2016-10-25 Method and device for acquiring irradiation field boundary of beam limiter in medical diagnosis image

Country Status (1)

Country Link
CN (1) CN107977973B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110910373B (en) * 2019-11-25 2023-01-24 西南交通大学 Identification method of orthotropic steel bridge deck fatigue crack detection image
CN111161297B (en) * 2019-12-31 2023-06-16 上海联影医疗科技股份有限公司 Method and device for determining edge of beam limiter and X-ray system
WO2021121234A1 (en) * 2019-12-20 2021-06-24 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for determining collimator edge
CN111708022B (en) * 2020-07-15 2022-02-08 四川长虹电器股份有限公司 Method and device for calculating scanning area boundary of millimeter wave radar

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101540040A (en) * 2008-03-21 2009-09-23 深圳迈瑞生物医疗电子股份有限公司 Method and device for automatically detecting boundary of beam-limiting device
JP2010250387A (en) * 2009-04-10 2010-11-04 Hitachi Computer Peripherals Co Ltd Image recognition device and program
CN102020036A (en) * 2010-11-02 2011-04-20 昆明理工大学 Visual detection method for transparent paper defect of outer package of strip cigarette
CN102243705A (en) * 2011-05-09 2011-11-16 东南大学 Method for positioning license plate based on edge detection
CN103208106A (en) * 2012-01-16 2013-07-17 上海西门子医疗器械有限公司 Method and device for detecting collimation side and X-ray imaging device
CN103985135A (en) * 2014-06-07 2014-08-13 山西中创伟业科技有限公司 License plate location method based on difference edge images
CN104161531A (en) * 2014-05-04 2014-11-26 上海联影医疗科技有限公司 Beam limiting device edge obtaining method and device and X-ray photographic equipment
CN104715478A (en) * 2015-03-05 2015-06-17 深圳市安健科技有限公司 A method and system for detecting exposure area in image picture

Also Published As

Publication number Publication date
CN107977973A (en) 2018-05-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant