CN111210423B - Breast contour extraction method, system and device of NIR image - Google Patents

Breast contour extraction method, system and device of NIR image

Info

Publication number
CN111210423B
CN111210423B (application CN202010034194.2A)
Authority
CN
China
Prior art keywords
contour
image
gradient
point
breast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010034194.2A
Other languages
Chinese (zh)
Other versions
CN111210423A (en)
Inventor
高爽
马贝
李世维
谢晓青
何芸芸
容若文
张国旺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dolby Medical Technology Co ltd
Original Assignee
Zhejiang Dolby Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dolby Medical Technology Co ltd filed Critical Zhejiang Dolby Medical Technology Co ltd
Priority to CN202010034194.2A priority Critical patent/CN111210423B/en
Publication of CN111210423A publication Critical patent/CN111210423A/en
Application granted granted Critical
Publication of CN111210423B publication Critical patent/CN111210423B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30068 Mammography; Breast

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a breast contour extraction method, system, and device for NIR images. In the embodiments of the application, image information captured under LED light sources at multiple angles is combined to obtain complete illumination information for the breast, and the position of maximum pixel-value change, that is, the contour, is extracted by combining the gradient at each pixel with its distance from a central point. In addition, by curve-fitting the contour over each small angular range and supplementing the contour outside the illuminated range in polar coordinates, a complete and smoother breast contour is obtained.

Description

Breast contour extraction method, system and device of NIR image
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method, a system, and a device for extracting a breast contour of an NIR image.
Background
Currently, breast cancer is diagnosed by molybdenum-target X-ray, CT, MR, and diffuse optical tomography (DOT) imaging; a DOT system reconstructs the light absorption coefficient of the breast region into an image for diagnosis. The breast region itself is obtained by contour extraction. Since 127 LED light sources sit near the breast region, conventional thresholding and Canny edge detection are susceptible to LED brightness and tend to merge the LED area into the breast region. Moreover, a medical DICOM image contains multiple frames, each of which reflects the breast contour under only some of the light sources, so contour extraction must combine information from multiple frames. Because the LED light sources sometimes do not illuminate the full extent of the breast, part of the edge contour must be supplemented.
Disclosure of Invention
In order to solve the technical problem, embodiments of the present application provide a method, a system, and an apparatus for extracting a breast contour of an NIR image.
A first aspect of an embodiment of the present application provides a method for extracting a breast contour of an NIR image, which may include:
acquiring image data in a scanning period, extracting the image information of the first several frames from the image data, and preprocessing the image information;
acquiring gray-scale gradient information of the image based on the acquired image data, establishing a central point, and combining the gradient at each pixel with its distance information from the center;
fitting a curved surface based on the combined gradient-and-distance values and the image coordinates, and mapping the surface to a polar coordinate system;
based on the fitted surface in polar coordinates, taking at each angle the position where the combined gradient-and-distance value is minimal as the contour point for that angle, fitting a curve to the points surrounding each contour point, and smoothing to obtain a smooth contour curve;
and supplementing the contour over the remaining angles based on the smooth contour curve, and mapping the contour curve over the full polar coordinate range to a rectangular coordinate system.
Further, the acquiring of image data in a scanning period, extracting the image information of the first several frames, and preprocessing specifically include:
acquiring the images in the first period, during which the scanning air bag pressurizes the breast to a preset value, and merging them into one frame after normalization;
converting the merged image data into log-domain image data.
Further, the acquiring of gray-scale gradient information of the image based on the acquired image data, establishing a central point, and combining the gradient at each pixel with the distance information to the center specifically include:
based on the image data converted to the log domain, calculating the horizontal and vertical gradient information of the gray-scale image, and, taking as center a point of the image near the lower part of the breast, combining the gradient at each pixel with its distance to the center, resulting in an image in which each pixel holds the combination of its gradient value and its distance-to-center information.
Further, the fitting of the curved surface based on the combined gradient-and-distance values and the image coordinates and its mapping to the polar coordinate system specifically include:
performing surface fitting based on the gradient of each pixel, its distance information to the center, and the corresponding coordinates of each pixel point;
setting an angle range and a radius range in polar coordinates and mapping the curved surface obtained above into polar coordinates, wherein the polar angle is limited to -90 degrees to 90 degrees and the radius is limited to 10 to 90, so as to remove information at the center and at the edges;
taking the minimum value corresponding to each angle of the gradient image in polar coordinates as the contour, with the contour information in this 180-degree range serving as the basis for contour supplementation.
Further, based on the fitted surface in polar coordinates, taking at each angle the position where the combined gradient-and-distance value is minimal as the contour point for that angle, fitting a curve to the points surrounding each contour point, and smoothing to obtain a smooth contour curve specifically include:
based on the mapped surface data, taking the minimum-gradient position corresponding to each angle in polar coordinates as a candidate contour; fitting the points within a surrounding range of each candidate contour point to a curve; removing the points in a range if fewer than 3 points fall within it; and removing the current contour point if its distance to the fitted curve exceeds 3 pixels, finally forming a smooth contour curve;
and supplementing the contour over the remaining angle range that has no data, taking as radius the radius value of the boundary point, to obtain the complete contour of the whole breast.
A second aspect of the embodiments of the present application provides a breast contour extraction system for NIR images, which may include:
the image acquisition unit is used for acquiring image data in a first scanning period after pressurizing the breast, extracting image information of a plurality of previous frames in the image data, and preprocessing the image information;
a mapping unit for acquiring gray scale image gradient information of the image based on the acquired image data, and establishing a central point, combining the gradient at each pixel with distance information to the center; fitting a curved surface based on the gradient, the combined value of the distance to the center and the image coordinate and mapping the curved surface to a polar coordinate system;
the contour acquisition unit is used for taking, at each angle on the fitted surface in polar coordinates, the position where the combined gradient-and-distance value is minimal as the contour point for that angle, fitting a curve to the points surrounding each contour point, and smoothing to obtain a smooth contour curve;
and the contour output unit is used for supplementing the contour of the residual angle based on the smooth contour curve and mapping the contour curve under the whole polar coordinate system to the rectangular coordinate system.
Further, the image acquisition unit includes:
an image acquisition unit acquiring an image in a first cycle in which a breast is pressurized to a preset value;
the preprocessing unit is used for performing normalization processing after summing processing is performed on the image data;
and an image domain conversion unit which converts the merged image data into log domain image data.
Further, the mapping unit includes:
a central point establishing unit for calculating the central point of the whole image and the width and height of the image based on the image data output by the image domain conversion unit;
the data processing unit is used for acquiring gradients of the image in two mutually perpendicular directions, combining the gradient data with the distance to the central point to form image data in which the gradient is combined with the distance to the center, and fitting a curved surface;
and a coordinate system conversion unit for converting the fitted curved surface into a polar coordinate system to form gradient data in the polar coordinate system.
Further, the contour acquisition unit includes:
the contour point acquisition unit is used for taking, at each angle of the gradient data in the polar coordinate system, the position where the combined gradient-and-distance value is minimal as the contour point for that angle, forming a candidate contour;
the curve fitting unit is used for fitting the points within a surrounding range of each candidate contour point to a curve, removing the points in a range if fewer than 3 points fall within it, and removing the current contour point if its distance to the fitted curve exceeds 3 pixels, finally forming a smooth contour curve;
and the contour complementing unit complements the contour with the remaining angle range without data, and obtains a complete contour curve by taking the radius value of the boundary point from the radius.
In a third aspect, an embodiment of the present application provides a breast contour extraction apparatus for NIR images, which includes a memory and a processor, where the memory has stored thereon computer-executable instructions, and the processor executes the computer-executable instructions on the memory to implement the method of the first aspect.
In the embodiments of the application, the method combines image information captured under LED light sources at multiple angles to obtain complete illumination information for the breast, and obtains the position of maximum pixel-value change, namely the contour, by combining the gradient with distance-from-center information; in addition, by curve-fitting the contour over each small range and supplementing the contour outside the illuminated range in polar coordinates, a complete, smoother breast contour is obtained.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a flow chart of contour extraction provided by an embodiment of the present application;
FIG. 2 is the breast image after fusion and normalization of the first 3 frames;
FIG. 3 is an image of gradient information combined with the distance of each point from a center point;
FIG. 4 is a gradient image mapped to polar coordinates;
FIG. 5 is an image of the contour at angles -90° to 90°;
FIG. 6 is the full breast contour range after supplementation;
FIG. 7 is a schematic block diagram of a contour extraction system provided herein;
FIG. 8 is a schematic structural diagram of an extraction device provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
Fig. 1 is a schematic flowchart of a breast contour extraction method for NIR images according to the present application.
The method comprises the following specific steps:
s101, acquiring and preprocessing image data in a first scanning period after the breast is pressurized to 100 mmHg.
It can be understood that the method requires image data at the very beginning of breast cancer diagnosis, and the overall method places requirements on acquisition and output timing, so the scheme cannot take extended time to collect a large amount of data. To meet these requirements, this step uses the images of the first scanning period (during which a pressure of 100 mmHg is applied to the breast) and extracts the first 3 frames (the first period contains only three frames, so this is all of the image data of the first period), then performs normalization, so as to obtain complete information on the breast under illumination by the LED light sources at various angles. To increase brightness, the resulting image is converted to the log domain.
As a specific example of the implementation of the method,
step 1a: carrying out normalized summation on the first three frames of images to obtain
Figure BDA0002364382090000081
Figure BDA0002364382090000082
I i Indicating the gray value of the ith frame. The resulting sum image is then further normalized image = sum/max (sum).
Step 1b: the resulting image is converted to the log domain, i.e., log (image), and the resulting log domain image is shown in FIG. 2.
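Steps 1a and 1b can be sketched as follows; this is a minimal reading of the patent text, assuming NumPy arrays for the frames. The function name `preprocess` and the small epsilon that guards `log(0)` are illustrative additions, not from the patent.

```python
import numpy as np

def preprocess(frames):
    """Fuse the first 3 frames, normalize, and move to the log domain.

    `frames` is assumed to be a sequence of 2-D grayscale arrays, one per
    frame; per-frame normalization before summation follows step 1a.
    """
    total = np.zeros_like(frames[0], dtype=np.float64)
    for frame in frames[:3]:
        total += frame / frame.max()   # normalized summation (step 1a)
    image = total / total.max()        # image = sum / max(sum)
    # Step 1b: log-domain conversion; epsilon avoids log(0) at dark pixels.
    return np.log(image + 1e-12)
```

The log conversion lifts the darker breast periphery relative to the bright LED spots, which is what makes the later gradient step usable.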
And S102, acquiring gray level image gradient information of the image based on the acquired image data, establishing a central point, and combining the gradient of each pixel with the distance information from the central point.
It can be understood that the image data converted to the log domain has enhanced brightness, so the overall gray-level differences are large, which facilitates forming the gradient values. The horizontal and vertical gradient information of the gray-scale image is calculated, and the gradient at each pixel is combined with its distance to the center, taking as center a point of the image near the lower part of the breast. The result is then mapped to polar coordinates, excluding the regions at the center and edge portions, to form a preliminary contour region.
As a specific example of the implementation of the method,
in step 2, the specific steps of calculating the gradient and combining the distances from each point to the central point are as follows:
step 2a: defining a center point (x) c ,y c ),
Figure BDA0002364382090000083
Where width is the image width and height is the image height.
And step 2b: calculating image gradients grad in x and y directions respectively x ,grad y . The gradient information is combined with the distance of each point to the center point,
Figure BDA0002364382090000091
where dx, dy represent each point (x, y) to the center point (x) c ,y c ) X, y axes of (1) distance x-x c ,y-y c . The resulting image is shown in fig. 3.
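Step 2 might look like the following in outline. The combination Gimg = grad_x·dx + grad_y·dy is a reading of the step 2b text (the patent's exact formula appears only as an inline image), and the default midpoint center is an assumption; the description places the center near the lower part of the breast.

```python
import numpy as np

def gradient_distance_image(log_image, center=None):
    """Combine x/y gradients with each pixel's offset from the center point."""
    h, w = log_image.shape
    if center is None:
        xc, yc = w / 2.0, h / 2.0  # assumed default; see step 2a
    else:
        xc, yc = center
    # np.gradient returns (d/dy, d/dx) for a 2-D array.
    grad_y, grad_x = np.gradient(log_image)
    x = np.arange(w)[None, :]
    y = np.arange(h)[:, None]
    # dx, dy: per-axis distance of each pixel (x, y) to (xc, yc).
    return grad_x * (x - xc) + grad_y * (y - yc)
```

Weighting the gradient by the distance to the center suppresses the strong LED edges near the sources while emphasizing the breast boundary farther out.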
And S103, fitting a curved surface based on the gradient, the combined value of the distances to the center and the image coordinates and mapping the curved surface to a polar coordinate system.
It is understood that in this step the horizontal and vertical gradients at each pixel are multiplied by the horizontal and vertical distances to the center point, respectively, giving image information of gradient values combined with distance. After this image information is mapped to polar coordinates (polar coordinates carry radius information, so the data is examined in polar coordinates to obtain the final image data), examination of the contour range proceeds according to the radius limits of the polar grid.
A contour drawn from the minimum gradient value at each angle is not smooth at certain inflection points and must be smoothed before subsequent contour supplementation. Therefore, based on the preliminary contour region, a curved surface is fitted to the combined gradient-and-distance values over the image coordinates, and the fitted surface is then mapped into a 180-degree polar range, which serves as the basis for contour supplementation. In this embodiment, the surface is mapped to a polar range of -90 degrees to 90 degrees.
As a specific example of the implementation of the method,
step 3a: taking the radius r to be 10-90 to filter out the area at the center and far away; angle of rotation
Figure BDA0002364382090000092
And the angles are in one-to-one correspondence with the radii at 90 degrees to obtain a two-dimensional polar coordinate grid. And transforming the polar coordinate grid to a rectangular coordinate grid by using a polar coordinate rectangular coordinate transformation formula.
And step 3b: and (c) fitting a curved surface by using the Gimg obtained in the step 2b and the corresponding (x, y), and interpolating at the rectangular coordinate grid obtained in the step 3a to obtain an image Rimg combining the gradient and the distance from the gradient to the central point under the polar coordinate. As shown in fig. 4.
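A rough stand-in for steps 3a and 3b: nearest-neighbor sampling here replaces the patent's surface fit and interpolation, and the axis convention for the polar-to-rectangular transform is an assumption, since the patent does not fix one.

```python
import numpy as np

def to_polar(gimg, center, r_range=(10, 90)):
    """Sample the combined gradient image on a polar grid (radii 10..90,
    angles -90 deg to 90 deg in 1-degree steps)."""
    xc, yc = center
    radii = np.arange(r_range[0], r_range[1] + 1)
    angles_deg = np.arange(-90, 91)
    theta = np.deg2rad(angles_deg)
    # Assumed convention: x = xc + r*sin(theta), y = yc - r*cos(theta),
    # so theta = 0 points straight up from the center.
    xs = xc + radii[:, None] * np.sin(theta)[None, :]
    ys = yc - radii[:, None] * np.cos(theta)[None, :]
    xi = np.clip(np.round(xs).astype(int), 0, gimg.shape[1] - 1)
    yi = np.clip(np.round(ys).astype(int), 0, gimg.shape[0] - 1)
    rimg = gimg[yi, xi]  # shape: (num_radii, num_angles)
    return radii, angles_deg, rimg
```

A proper surface fit with interpolation at the grid points, as the patent describes, would replace the rounding here; the grid shapes and ranges are the same either way.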
And S104, based on the fitted surface, at each angle take the position where the combined gradient-and-distance value is minimal as the contour point for that angle; for each contour point, consider the points within -10 degrees to 10 degrees around it: if fewer than 3 points fall in this range, delete all points in the range; otherwise fit a curve in the range, delete the current contour point if its distance to the fitted curve exceeds 3 pixels, and keep the other points in the range.
It will be appreciated that rectangular image coordinates are x-axis and y-axis values, while polar image data carries radius and angle information. Therefore, when the contour radius is limited, the image is mapped to polar coordinates, and the later supplementation to the full angle range is also carried out in polar coordinates.
In the present application, since the selected angle range in this embodiment is -90 degrees to 90 degrees, there are 181 minima in total, and these minimum positions are taken as the candidate contour. The points within a 20-degree window around each candidate contour point are fitted to a curve (the points in a window are deleted when there are fewer than 3), points deviating markedly from the curve are removed to form a smooth contour curve, the contour over the remaining angle range without data is supplemented, and the radius takes the radius value of the boundary point, yielding the complete contour of the whole breast.
As a specific example of the implementation of the method,
in step 4, the specific steps of smoothing the contour edge are as follows:
step 4a: find out each angle in step 3b
Figure BDA0002364382090000101
Corresponding minimum value of Rimg, i.e. profile.
And 4b: for each point in the contour
Figure BDA0002364382090000102
As a reference point, find->
Figure BDA0002364382090000103
If the number of the points is less than 3, all the points are removed at the point of-10 degrees to 10 degrees; if more than 3, a curve is fitted, if the distance of the reference point to the curve>3 pixel values, the reference point is removed. The resulting profile range is shown in fig. 5.
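Steps 4a and 4b might look like this in outline. The quadratic local fit is an assumed choice, since the patent does not specify the curve model, and the function name is illustrative.

```python
import numpy as np

def smooth_contour(rimg, radii, angles_deg, window=10, max_dist=3.0):
    """Pick the minimum position per angle, then prune outliers locally."""
    # Step 4a: candidate contour = radius minimizing Rimg at each angle.
    contour_r = radii[np.argmin(rimg, axis=0)].astype(float)
    keep = np.ones(len(angles_deg), dtype=bool)
    for i, ang in enumerate(angles_deg):
        in_win = np.abs(angles_deg - ang) <= window
        pts = np.flatnonzero(in_win & keep)
        if len(pts) < 3:
            keep[pts] = False  # too few neighbors: drop the whole window
            continue
        # Step 4b: local quadratic fit; reject points far from the fit.
        coef = np.polyfit(angles_deg[pts], contour_r[pts], 2)
        if abs(np.polyval(coef, ang) - contour_r[i]) > max_dist:
            keep[i] = False
    return angles_deg[keep], contour_r[keep]
```

Pruning against a local fit removes the isolated inflection-point spikes the description mentions, while a smooth candidate contour passes through unchanged.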
And S105, supplementing the contour over the remaining angles based on the smooth contour curve, and mapping the contour curve over the full polar range to a rectangular coordinate system.
The specific steps for supplementing the complete profile outside the LED illumination range are as follows:
step 5a: get
Figure BDA0002364382090000111
-180°~-90°,/>
Figure BDA0002364382090000112
Supplementing the contour range at the left boundary.
And step 5b: get
Figure BDA0002364382090000113
90°~180°,/>
Figure BDA0002364382090000114
Supplementing the contour range at the right boundary. The complete profile range is supplemented as in fig. 6.
As a specific example, to supplement the contour in the range of -180 degrees to -90 degrees, the angle is taken from -180 degrees to -90 degrees and the radius is taken as the radius at -90 degrees, with which a circular arc is drawn over that range. In the same way, the radius at 90 degrees is used to draw an arc from 90 degrees to 180 degrees, thereby completing the contour.
And finally, converting the contour under the polar coordinate into a rectangular coordinate system.
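Steps 5a and 5b plus the final coordinate conversion can be sketched as follows, using the same assumed axis convention as the earlier polar sampling sketch; `supplement_and_to_xy` is an illustrative name.

```python
import numpy as np

def supplement_and_to_xy(angles_deg, contour_r, center):
    """Extend the contour to -180..-90 and 90..180 degrees by reusing the
    boundary radii (circular arcs, steps 5a/5b), then map to x/y."""
    left = np.arange(-180, angles_deg[0])       # angles below the left boundary
    right = np.arange(angles_deg[-1] + 1, 181)  # angles above the right boundary
    full_theta = np.concatenate([left, angles_deg, right])
    full_r = np.concatenate([
        np.full(len(left), contour_r[0]),    # radius of the left boundary point
        contour_r,
        np.full(len(right), contour_r[-1]),  # radius of the right boundary point
    ])
    xc, yc = center
    t = np.deg2rad(full_theta)
    # Polar -> rectangular, same convention as the polar sampling sketch.
    xs = xc + full_r * np.sin(t)
    ys = yc - full_r * np.cos(t)
    return xs, ys
```

Reusing the boundary radius keeps the supplemented arcs continuous with the extracted contour at ±90 degrees.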
The embodiment of the application also provides a breast contour extraction system of the NIR image, which is used for executing any one of the extraction methods. Specifically, referring to fig. 7, fig. 7 is a schematic block diagram of an extraction system provided in an embodiment of the present application. The system of the embodiment comprises: an image acquisition unit 310, a mapping unit 320, a contour acquisition unit 330, a contour output unit 340.
The image acquiring unit 310 is configured to acquire all image information in a first period from the image data in the scanning period, and perform preprocessing.
The method specifically comprises the following steps:
an image acquisition unit 311 that acquires an image in a first cycle in which the breast is pressurized to a preset value;
a preprocessing unit 312, configured to perform normalization processing after summing the image data;
the image domain conversion unit 313 converts the combined image data into image data in a log domain.
The mapping unit 320 is configured to obtain gray scale image gradient information of the image based on the obtained image data, and set up a central point, and combine the gradient at each pixel with distance information from the center; and fitting a curved surface based on the gradient, the combined value of the distance to the center and the image coordinates and mapping the curved surface to a polar coordinate system.
The method specifically comprises the following steps:
a central point establishing unit 321 for calculating the central point of the whole image and the width and height of the image based on the image data output by the image domain converting unit;
the data processing unit 322 is used for acquiring gradients of the image in two mutually perpendicular directions, combining the gradient data with the distance of the central point to form data formed by combining the gradients with the central point, and fitting a curved surface;
the coordinate system conversion unit 323 converts the fitted curved surface into a polar coordinate system, and forms gradient data in the polar coordinate system.
The above-mentioned contour acquiring unit 330 is configured to, based on the fitted surface, take at each angle the position where the combined gradient-and-distance value is minimal as the contour point for that angle; for each contour point, if fewer than 3 points fall within -10 degrees to 10 degrees around it, delete all points in that range; otherwise fit a curve in the range, delete the current contour point if its distance to the fitted curve exceeds 3 pixels, and keep the other points, finally acquiring a smooth contour curve.
The method specifically comprises the following steps:
a contour point obtaining unit 331, configured to take, at each angle of the gradient data in the polar coordinate system, the position where the combined gradient-and-distance value is minimal as the contour point for that angle, forming a candidate contour;
a curve fitting unit 332, configured to fit the points within a surrounding range of each candidate contour point to a curve, remove the points in a range if fewer than 3 points fall within it, and remove the current contour point if its distance to the fitted curve exceeds 3 pixels, so as to form a smooth contour curve;
and the contour complementing unit 333 complements the contour with the remaining angle range without data, and obtains a complete contour curve by taking the radius value of the boundary point as the radius.
The above-mentioned contour output unit 340 is configured to complement the contour of the remaining angle based on the smooth contour curve, and map the contour curve in the entire polar coordinate system to the rectangular coordinate system.
Because the system described above corresponds to the method, its specific embodiments are not set forth again here.
Fig. 8 is a schematic structural diagram of an extraction device according to an embodiment of the present application. The extraction apparatus 4000 comprises a processor 41 and may further comprise an input device 42, an output device 43 and a memory 44. The input device 42, the output device 43, the memory 44, and the processor 41 are connected to each other via a bus.
The memory includes, but is not limited to, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or a portable read-only memory (CD-ROM), which is used for storing instructions and data.
The input means are for inputting data and/or signals and the output means are for outputting data and/or signals. The output means and the input means may be separate devices or may be an integral device.
The processor may include one or more processors, for example, one or more Central Processing Units (CPUs), and in the case of one CPU, the CPU may be a single-core CPU or a multi-core CPU. The processor may also include one or more special purpose processors, which may include GPUs, FPGAs, etc., for accelerated processing.
The memory is used to store program codes and data for the network device.
The processor is used for calling the program codes and data in the memory and executing the steps in the method embodiment. Specifically, reference may be made to the description of the method embodiment, which is not repeated herein.
It will be appreciated that fig. 8 only shows a simplified design of the extraction apparatus. In practical applications, the extraction apparatus may further include other necessary components, including but not limited to any number of input/output devices, processors, controllers, and memories, and all devices that can implement the embodiments of the present application are within the scope of the present application.
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the division of the unit is only one logical function division, and other division may be implemented in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. The shown or discussed mutual coupling, direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are generated wholly or partially when the computer program instructions are loaded and executed on a computer. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on or transmitted over a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that includes one or more available media. The available medium may be a read-only memory (ROM) or a random access memory (RAM), a magnetic medium such as a floppy disk, a hard disk, a magnetic tape, or a magnetic disk, an optical medium such as a digital versatile disc (DVD), or a semiconductor medium such as a solid state disk (SSD).
Although the preferred embodiments of the present invention have been described in detail, the present invention is not limited to the details of the foregoing embodiments; various equivalent changes (for example, in number, shape, or position) may be made within the technical scope of the present invention, and all such equivalent changes fall within the protection scope of the present invention.

Claims (10)

1. A method of breast contour extraction for NIR images, comprising:
acquiring image data in a scanning period, extracting image information of a plurality of previous frames in the image data, and preprocessing the image information;
acquiring grayscale image gradient information of the image based on the acquired image data, establishing a central point, and combining the gradient at each pixel with its distance information to the center;
fitting a curved surface based on the combined value of the gradient and the distance to the center together with the image coordinates, and mapping the curved surface to a polar coordinate system;
based on the fitted surface in polar coordinates, taking, at each angle, the position where the combined value of the gradient and the distance to the center is minimum as the contour point corresponding to that angle, fitting a curve to the points in the neighborhood of each contour point, and performing smoothing to obtain a smooth contour curve;
and supplementing the contour over the remaining angles based on the smooth contour curve, and mapping the contour curve over the entire polar coordinate range to a rectangular coordinate system.
2. The method of breast contour extraction of NIR images according to claim 1,
the acquiring of the image data in the scanning period, the extracting of the image information of the plurality of previous frames, and the preprocessing specifically comprise:
acquiring the images during the first period in which the scanning air bag pressurizes the breast to a preset value, and merging the images into one frame after normalization processing;
converting the merged image data into log-domain image data.
3. The method of breast contour extraction of NIR images according to claim 1,
the acquiring grayscale image gradient information of the image based on the acquired image data, establishing a central point, and combining the gradient of each pixel with the distance information to the central point specifically comprises:
based on the image data converted to the log domain, calculating the horizontal and vertical gradient information of the grayscale image, and, taking a point of the image close to the lower part of the breast as the center, combining the gradient at each pixel with its distance to the center, resulting in an image in which the gradient value at each pixel is combined with its distance to the center point.
4. The method of breast contour extraction of NIR images according to claim 3,
the fitting of the curved surface based on the combined value of the gradient and the distance to the center together with the image coordinates, and the mapping to the polar coordinate system, specifically comprises:
performing surface fitting based on the gradient of each pixel, its distance information to the center, and the corresponding coordinates of each pixel point;
setting an angle range and a radius range in polar coordinates, and mapping the curved surface obtained in the above step to polar coordinates, wherein the polar angle is limited to -90 degrees to 90 degrees and the radius is limited to the range of 10 to 90, so as to remove information at the center and at the edge;
taking the minimum value corresponding to each angle of the gradient image in polar coordinates as the contour, and using the contour information within the 180-degree range as the basis for contour supplementation.
5. The method of breast contour extraction of NIR images according to claim 4,
the taking, based on the fitted surface in polar coordinates, of the position where the combined value of the gradient and the distance to the center is minimum as the contour point corresponding to each angle, the fitting of a curve to the points in the neighborhood of each contour point, and the smoothing to obtain a smooth contour curve specifically comprise:
based on the mapped curved-surface data, taking the position of the minimum gradient corresponding to each angle in polar coordinates as a candidate contour, and fitting the points within a neighborhood of each point on the candidate contour into a curve; if the number of points within the neighborhood is less than 3, removing the points in that neighborhood, and if the distance from the current contour point to the fitted curve is more than 3 pixels, removing the current contour point, finally forming a smooth contour curve;
and supplementing the contour over the remaining angle range that has no data, taking the radius value of the boundary point as the radius, to obtain the complete contour of the whole breast.
6. A breast contour extraction system for NIR images, comprising:
the image acquisition unit is used for acquiring image data in a first scanning period after pressurizing the breast, extracting image information of a plurality of previous frames in the image data, and preprocessing the image information;
a mapping unit for acquiring gray scale image gradient information of the image based on the acquired image data, and establishing a central point, combining the gradient at each pixel with distance information to the center; fitting a curved surface based on the gradient, the combined value of the distance to the center and the image coordinate and mapping the curved surface to a polar coordinate system;
the contour acquisition unit is used for taking, based on the fitted surface in polar coordinates, the position where the combined value of the gradient and the distance to the center is minimum as the contour point corresponding to each angle, fitting a curve to the points in the neighborhood of each contour point, and performing smoothing to obtain a smooth contour curve;
and the contour output unit is used for supplementing the contour over the remaining angles based on the smooth contour curve and mapping the contour curve over the entire polar coordinate system to the rectangular coordinate system.
7. The breast contour extraction system of NIR images of claim 6,
the image acquisition unit includes:
an image acquisition unit acquiring an image in a first cycle in which a breast is pressurized to a preset value;
the preprocessing unit is used for performing normalization processing after summing the image data;
and an image domain conversion unit which converts the merged image data into image data in a log domain.
8. The breast contour extraction system of NIR images of claim 7,
the mapping unit includes:
a central point establishing unit for calculating the central point of the whole image and the width and height of the image based on the image data output by the image domain conversion unit;
the data processing unit is used for acquiring gradients of the image in two mutually perpendicular directions, combining the gradient data with the distance to the central point to form image data in which the gradient is combined with the distance to the center, and fitting a curved surface;
and a coordinate system conversion unit for converting the fitted curved surface into a polar coordinate system to form gradient data in the polar coordinate system.
9. The breast contour extraction system of NIR images of claim 7,
the contour acquisition unit includes:
the contour point acquisition unit is used for taking, at each angle according to the gradient data in the polar coordinate system, the position where the combined value of the gradient and the distance to the center is minimum as the contour point corresponding to that angle, forming a candidate contour;
the curve fitting unit is used for fitting the points within a neighborhood of each point on the candidate contour into a curve; if the number of points within the neighborhood is less than 3, the points in that neighborhood are removed, and if the distance from the current contour point to the fitted curve is more than 3 pixels, the current contour point is removed, finally forming a smooth contour curve;
and the contour complementing unit complements the contour over the remaining angle range that has no data, taking the radius value of the boundary point as the radius, to obtain a complete contour curve.
10. A breast contour extraction apparatus for NIR images, comprising a memory having stored thereon computer-executable instructions and a processor which, when executing the computer-executable instructions on the memory, implements the method of any one of claims 1-5.
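The core of the claimed pipeline, combining the grayscale gradient with the distance to a center point, sampling the combined map in polar coordinates over -90 to 90 degrees with radius 10 to 90, and taking the per-angle minimum as the candidate contour, can be sketched as below. This is an illustrative sketch only: the function names, the placement of the center near the lower breast, and the weighting of the two combined terms are all assumptions, since the patent does not publish reference code.

```python
import numpy as np

def candidate_contour(img, n_angles=181, r_min=10, r_max=90, weight=0.1):
    """Illustrative sketch of claims 1-4: per-angle minimum of a map that
    combines the gradient magnitude with the distance to the center.
    The center placement and the weight of the distance term are assumed."""
    h, w = img.shape
    gy, gx = np.gradient(img.astype(float))      # vertical / horizontal gradients
    grad = np.hypot(gx, gy)                      # gradient magnitude
    cx, cy = w / 2.0, h * 0.75                   # center near the lower breast (assumed)
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.hypot(xx - cx, yy - cy)            # distance of each pixel to the center
    combined = grad + weight * dist              # combined value at each pixel
    angles = np.deg2rad(np.linspace(-90, 90, n_angles))
    radii = np.arange(r_min, r_max)              # radius limited to 10..90
    contour_r = np.empty(n_angles)
    for i, th in enumerate(angles):
        xs = np.clip((cx + radii * np.cos(th)).astype(int), 0, w - 1)
        ys = np.clip((cy + radii * np.sin(th)).astype(int), 0, h - 1)
        # minimum position of the combined value along this polar ray
        contour_r[i] = radii[np.argmin(combined[ys, xs])]
    return contour_r
```

The returned per-angle radii would then be smoothed, outlier-filtered, completed over the remaining angles, and mapped back to rectangular coordinates, as described in claims 5 and in the description of units 333 and 340.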
CN202010034194.2A 2020-01-13 2020-01-13 Breast contour extraction method, system and device of NIR image Active CN111210423B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010034194.2A CN111210423B (en) 2020-01-13 2020-01-13 Breast contour extraction method, system and device of NIR image


Publications (2)

Publication Number Publication Date
CN111210423A CN111210423A (en) 2020-05-29
CN111210423B true CN111210423B (en) 2023-04-07

Family

ID=70789607

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010034194.2A Active CN111210423B (en) 2020-01-13 2020-01-13 Breast contour extraction method, system and device of NIR image

Country Status (1)

Country Link
CN (1) CN111210423B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111640114B (en) * 2020-06-16 2024-03-15 北京安德医智科技有限公司 Image processing method and device
CN113598825A (en) * 2021-09-16 2021-11-05 浙江衡玖医疗器械有限责任公司 Breast positioning imaging method for ultrasonic imaging system and application thereof
CN116844697B (en) * 2023-02-24 2024-01-09 萱闱(北京)生物科技有限公司 Image multidimensional visualization method, device, medium and computing equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104978730A (en) * 2014-04-10 2015-10-14 上海联影医疗科技有限公司 Division method and device of left ventricular myocardium
JP2016142666A (en) * 2015-02-04 2016-08-08 日本メジフィジックス株式会社 Technique for extracting tumor contours from nuclear medicine image
CN108475428A (en) * 2015-12-22 2018-08-31 皇家飞利浦有限公司 The coronary artery segmentation of cardiac module guiding
CN108471995A (en) * 2015-09-30 2018-08-31 上海联影医疗科技有限公司 The system and method for determining breast area in medical image
CN110309814A (en) * 2019-07-11 2019-10-08 中国工商银行股份有限公司 A kind of iris identification method and device based on edge detection
CN110458813A (en) * 2019-03-08 2019-11-15 腾讯科技(深圳)有限公司 Image-region localization method, device and Medical Image Processing equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040101184A1 (en) * 2002-11-26 2004-05-27 Radhika Sivaramakrishna Automatic contouring of tissues in CT images


Also Published As

Publication number Publication date
CN111210423A (en) 2020-05-29

Similar Documents

Publication Publication Date Title
CN111210423B (en) Breast contour extraction method, system and device of NIR image
WO2021129325A1 (en) Ultrasonic image lesion segmentation method and apparatus, and computer device
US9058650B2 (en) Methods, apparatuses, and computer program products for identifying a region of interest within a mammogram image
CN108154531B (en) Method and device for calculating area of body surface damage region
EP2564352A1 (en) Microcalcification detection and classification in radiographic images
KR20150098119A (en) System and method for removing false positive lesion candidate in medical image
JP7391267B2 (en) Medical image processing methods, devices, equipment, storage media and computer programs
Abdel-Nasser et al. Temporal mammogram image registration using optimized curvilinear coordinates
CN106709920B (en) Blood vessel extraction method and device
US20240005545A1 (en) Measuring method and measuring apparatus of blood vessel diameter of fundus image
CN116385436B (en) Cholelithiasis auxiliary detection system based on CT image
CN114693604A (en) Spine medical image processing method, device, equipment and storage medium
CN111986139A (en) Method and device for measuring intima-media thickness in carotid artery and storage medium
CN113888566A (en) Target contour curve determining method and device, electronic equipment and storage medium
CN111539926B (en) Image detection method and device
WO2023232067A1 (en) Systems and methods for lesion region identification
US11551371B2 (en) Analyzing symmetry in image data
KR101758805B1 (en) Apparatus and method for bonding medical images
CN116091560A (en) Image and model registration method and related product
CN115147360B (en) Plaque segmentation method and device, electronic equipment and readable storage medium
US20160210774A1 (en) Breast density estimation
CN113962958B (en) Sign detection method and device
CN115439423A (en) CT image-based identification method, device, equipment and storage medium
CN111275668A (en) Breast blood vessel extraction method, system and device of NIR image
CN112819051A (en) Capsule endoscopy image similarity evaluation method, system, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant