CN110507358B - Image processing method and system for measuring the thickness of the fetal nuchal translucency from an ultrasound image


Info

Publication number
CN110507358B
Authority
CN
China
Legal status
Active
Application number
CN201810488866.XA
Other languages
Chinese (zh)
Other versions
CN110507358A (en)
Inventor
郑末晶
丁红
陈良旭
刘建平
张新玲
张永
郑乐
王博源
Current Assignee
Zhuhai Appletree Biotechnology Co ltd
Original Assignee
Zhuhai Appletree Biotechnology Co ltd
Application filed by Zhuhai Appletree Biotechnology Co ltd
Priority to CN201810488866.XA
Publication of CN110507358A
Application granted
Publication of CN110507358B

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures
    • A61B 8/0866 Detecting organic movements or changes involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing involving processing of medical diagnostic data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10132 Ultrasound image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30044 Fetus; Embryo


Abstract

The invention discloses an image processing method for measuring the thickness of the fetal nuchal translucency (NT) from an ultrasound image. The method proceeds as follows: input the ultrasound image to be measured, acquire its scale conversion parameter and the thickness region of the fetal nuchal translucency, i.e. the NT region, and detect the image edges of the double-parallel-line structure in the NT region; select the image edge on one side and determine the gray-gradient direction of its pixel points; starting from a pixel point on the selected image edge, track along the direction of descending gray gradient until the opposite edge is reached, and record the distance data; poll all pixel points on the selected image edge once and select the maximum distance length obtained after polling, which is the thickness of the fetal nuchal translucency. The method obtains the NT value automatically and effectively solves the technical problems of cumbersome manual operation in existing measurement methods and the detection error such operation introduces.

Description

Image processing method and system for measuring the thickness of the fetal nuchal translucency from an ultrasound image
Technical Field
The invention belongs to the field of target recognition and measurement in digital images, relates to automated analysis and target-measurement techniques for obstetric ultrasound images, and particularly relates to an image processing method and system for measuring the thickness of the fetal nuchal translucency in an ultrasound image.
Background
Congenital malformations are of many types and high incidence. They are a major medical problem that seriously affects population quality and places a heavy burden on society. Prenatal diagnosis is a secondary intervention measure against congenital malformation and plays an important role in reducing its occurrence; improving the prenatal diagnostic techniques for congenital malformation can directly reduce its incidence.
Obstetric ultrasound image analysis is an important means of evaluating intrauterine growth and development of the fetus; it is non-invasive, inexpensive and real-time, and occupies an important position in medical imaging. Within obstetric ultrasound analysis, measurement of fetal nuchal translucency (NT) thickness by ultrasound imaging is one of the important indicators of intrauterine growth and development. NT refers to the clear fluid at the back of the fetal neck. It is present only between gestational weeks 11 and 13+6; from week 14 onward it is normally gradually absorbed by the lymphatic system. Between weeks 11 and 13+6, the thicker the NT, the higher the probability of chromosomal abnormalities, cardiac defects and other problems after birth. At present, the accuracy of measuring the thickness of the fetal nuchal translucency in obstetric ultrasound images is unsatisfactory: on the one hand, the physical characteristics of ultrasound imaging hardware give the image a low signal-to-noise ratio, which increases the difficulty of manual diagnosis and makes the diagnostic process time-consuming and laborious; on the other hand, manual measurement of NT introduces random error as well as the operator's own visual error.
With the development of computer technology, medical image processing has become widely applied, and applying image processing to ultrasound images to improve NT detection accuracy is a feasible direction of development. Image processing can determine the NT region by detecting the target line features of the nuchal translucency (the NT region appears as locally highlighted parallel line segments in the ultrasound image) and then measure its thickness. However, because many other regions of the ultrasound image share this appearance, directly applying a generic target detection method misdetects the NT region and the nuchal translucency frequently.
With the increasing demands of medical diagnosis on the accuracy of NT thickness measurement, and the continuing shift toward automated target detection, analysis and measurement in the medical field, there is a need to research an image processing method or processing system for measuring fetal NT thickness from ultrasound images, and to provide a set of automated ultrasound-image analysis tools from the standpoint of digital image processing and computer vision.
Disclosure of Invention
In order to overcome the above problems, the present inventors have made intensive studies and provide an image processing method and processing system for measuring the thickness of the fetal nuchal translucency from an ultrasound image. The method first determines an accurate NT region position, detects the image edges of the double-parallel-line structure in the NT region with the Canny edge detector, derives a distance-tracking direction from the fact that in the ultrasound image the NT structure has bright edges and a dark middle region, and obtains the maximum distance across the double-parallel-line structure, i.e. the NT value, thereby completing the invention. The detection method is novel and unique; building prior information from the medical image into the algorithm effectively improves the accuracy of the NT value, the method of automatically acquiring the NT region improves it further, and the error and labor cost of manual detection are reduced.
The invention aims to provide the following technical scheme:
(1) an image processing method for measuring the thickness of the fetal nuchal translucency from an ultrasound image, the method comprising the steps of:
step 1), inputting the ultrasound image to be measured, acquiring its scale conversion parameter and the thickness region of the fetal nuchal translucency, i.e. the NT region, and detecting the image edges of the double-parallel-line structure in the NT region;
step 2), selecting the image edge on one side and determining the gray-gradient direction of its pixel points;
step 3), taking a pixel point on the image edge selected in step 2) as the starting point, tracking along the direction of descending gray gradient until the opposite edge is reached, and recording the distance data;
step 4), repeating step 3) until all pixel points on the selected image edge have been polled once, and selecting the maximum distance length obtained after polling, which is the thickness of the fetal nuchal translucency.
(2) A system for implementing the image processing method of (1) above for measuring the thickness of the fetal nuchal translucency from an ultrasound image, the system comprising:
an image input module for inputting an ultrasound image, acquiring its scale conversion parameter, and detecting the image edges of the double-parallel-line structure in the thickness region of the fetal nuchal translucency, i.e. the NT region;
a gradient direction detection module for determining the gray-gradient direction of the pixel points on the image edge of either side;
a tracking and recording module for tracking a selected pixel point on the image edge along the direction of descending gray gradient until the opposite edge is reached, and repeating the tracking operation until all pixel points on the selected image edge have been polled once;
a data recording module for judging the reasonableness of the distance data and recording the distance information;
and a data judgment module for screening the recorded data and determining the maximum distance length obtained after polling, which is the thickness of the fetal nuchal translucency.
The image processing method and processing system provided by the invention for measuring the thickness of the fetal nuchal translucency from an ultrasound image have the following beneficial effects:
(1) the invention exploits the fact that in the ultrasound image the NT structure has bright edges and a dark middle region: taking a pixel point on the NT edge as the starting point and the direction of descending gray gradient as the distance-tracking direction, the distances across the double-parallel-line structure are obtained and the maximum distance length is taken as the NT value; this measurement approach is novel and detects the NT value scientifically and effectively by using the characteristics of the medical ultrasound image;
(2) the method uses the fact that the NT structure in the ultrasound image approximates a pair of parallel lines to define a verification rule that screens the obtained distance data for reasonableness, which effectively guarantees detection accuracy;
(3) in the invention, the NT region is determined by first locating the head region and then applying the average offset between the head region and the NT region; judging region positions from the large region to the small region minimizes the error in determining the NT region;
(4) the method assigns a head-region pre-weight to each pixel point in the image based on the center point of the nose tip region; assigning pre-weights relative to the nose tip incorporates prior knowledge of the NT ultrasound image into the target detection method, eliminates most false detections through that prior knowledge, and improves the overall performance of the algorithm;
(5) in the invention, a head-region pre-weight is assigned to each pixel point within a set range near the nose tip region based on its center point, the head region detector outputs a detection region together with a score that the region belongs to the head, and the product of pre-weight and score is used as the final criterion for judging the head region; combining the two parameters most closely related to the head region into one parameter improves the accuracy of head-region judgment, and the same approach likewise improves the accuracy of NT-region judgment.
Drawings
FIG. 1 shows a flow chart for obtaining the NT value in a preferred embodiment of the invention;
FIG. 2 shows an example of partitioning an ultrasound image into regions in a preferred embodiment of the invention;
FIG. 3 shows an NT region image determined by the invention;
FIG. 4 shows the image edges of the double-parallel-line structure in the NT region detected by the Canny edge detector in the invention;
FIG. 5 shows an example of distance tracking along the descending gray-gradient direction from a pixel point on an image edge;
FIG. 6 shows a flow chart for detecting the NT position in an ultrasound image in a preferred embodiment of the invention;
FIG. 7 shows an example of an original ultrasound image containing the NT, the detection of the nose tip region, and the head pre-weighting produced for the ultrasound image by the detection method of the invention;
FIG. 8 shows an example of the detection process of the detection method of the invention for the head region of an ultrasound image;
FIG. 9 shows an example of the detection process of the present algorithm for the NT region of an ultrasound image.
Detailed Description
The invention is explained in further detail below with reference to the drawings. The features and advantages of the invention will become clearer from the description.
As shown in FIG. 1, the object of the invention is to provide an image processing method for measuring the thickness of the fetal nuchal translucency from an ultrasound image, the method comprising the following steps:
step 1), inputting the ultrasound image to be measured, acquiring its scale conversion parameter and the thickness region of the fetal nuchal translucency, i.e. the NT region, and detecting the image edges of the double-parallel-line structure in the NT region;
step 2), selecting the image edge on one side and determining the gray-gradient direction of its pixel points;
step 3), taking a pixel point on the image edge selected in step 2) as the starting point, tracking along the direction of descending gray gradient until the opposite edge is reached, and recording the distance data;
step 4), repeating step 3) until all pixel points on the selected image edge have been polled once, and selecting the maximum distance length obtained after polling, which is the thickness of the fetal nuchal translucency.
Step 1): input the ultrasound image to be measured and detect the image edges of the double-parallel-line structure in the thickness region of the fetal nuchal translucency (the NT region).
The ultrasound image is acquired by a medical ultrasound instrument; the image format is a common raster format such as JPG or BMP.
In the invention, the ultrasound image satisfies the following conditions (the qualified-image criteria):
(i) the fetal head is completely displayed, and head and chest occupy 70-85% of the whole image area;
(ii) the angle between the tangent of the mid-sagittal line of the head and the tangent of the anterior chest is 130-160 degrees;
(iii) the nose tip region of the head is a relatively large highlighted region of the whole image.
In the invention, the scaling parameter of the ultrasound image to be measured is acquired; the scaling parameter expresses the actual distance (e.g., how many millimeters) each pixel represents.
In one embodiment, if the instrument configuration can display the scaling parameter directly, it is read (manually or by machine) from the panel of the medical ultrasound instrument.
In another embodiment, if the instrument configuration cannot display the scaling parameter directly, the scale and image-magnification data are acquired from the instrument panel and the scaling parameter of the ultrasound image is calculated from them.
In another preferred embodiment, if the instrument configuration cannot display the scaling parameter directly, it can be obtained through the following sub-steps:
step 1.1), divide the input ultrasound image to obtain a scale region, an image region and an image magnification region, as shown in FIG. 2;
regarding the division in step 1.1): since the interfaces displayed by ultrasound instruments of different manufacturers differ slightly, the number of regions used in practice can be chosen flexibly according to the actual situation, provided that the scale region, the image region and the image magnification region are clearly obtained, giving a basis for image processing and parameter extraction;
step 1.2), acquire the image scale and the image magnification, and compute the scaling parameter of the image by combining the two; the scale expresses the actual physical size corresponding to pixel distance in the image.
In a preferred embodiment, step 1.2) comprises the following sub-steps:
step 1.2.1), crop the scale region, binarize the cropped image, and obtain the pixel distance between any two adjacent scale points (or the average value δ of that distance);
step 1.2.2), crop the image magnification region and obtain the image magnification f from it using optical character recognition (OCR);
step 1.2.3), combine the scale and magnification information to obtain the scaling parameter r of the image: r = f × C/δ, where C is the actual physical size represented by the pixel distance between any two adjacent scale points.
In a preferred embodiment, the scale region and the image magnification region are sharpened before the scale and the magnification are acquired. Sharpening compensates the outlines of the characters/images and enhances their edges and gray-level transitions, making the image clearer and facilitating the subsequent scale projection and OCR character extraction.
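For illustration, the following is a minimal Python sketch of step 1.2) under stated assumptions: the scale region and magnification region have already been cropped as in step 1.1) and are 8-bit grayscale with a horizontal scale strip, OpenCV and pytesseract stand in for the unspecified binarization and OCR components, and the helper names, the tick-detection heuristic and the tick size C are illustrative rather than prescribed by the invention.

```python
# Sketch of step 1.2): recover the scaling parameter r = f * C / delta.
import cv2
import numpy as np
import pytesseract

def pixel_tick_spacing(scale_region):
    """Average pixel distance (delta) between adjacent scale points."""
    _, binary = cv2.threshold(scale_region, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    profile = binary.sum(axis=0)                     # tick marks appear as column peaks
    cols = np.where(profile > 0.5 * profile.max())[0]
    # collapse runs of adjacent columns into single tick positions
    ticks = [run.mean() for run in
             np.split(cols, np.where(np.diff(cols) > 1)[0] + 1)]
    return float(np.mean(np.diff(ticks)))

def read_magnification(mag_region):
    """Read the image magnification f from the magnification region via OCR."""
    text = pytesseract.image_to_string(mag_region, config="--psm 7")
    return float("".join(c for c in text if c.isdigit() or c == "."))

def scaling_parameter(scale_region, mag_region, tick_size_c):
    """r = f * C / delta, with C the physical size of one tick interval."""
    delta = pixel_tick_spacing(scale_region)
    f = read_magnification(mag_region)
    return f * tick_size_c / delta
```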
In the invention, the NT region position can be obtained by an automated detection algorithm or marked manually. When obtained by the automated detection algorithm, the target-region (ROI, region of interest) image is cropped according to the position parameters of the NT region, as shown in FIG. 3. The position parameters of the NT region take the form (x, y, w, h), where x and y are the abscissa and ordinate of the upper-left pixel of the detection region and w and h are its width and height.
In the invention, the Canny edge detector is used to detect the image edges of the double-parallel-line structure in the NT region, as shown in FIG. 4. An image edge is a part of a local image area with significant brightness change; its gray profile can generally be regarded as a step, i.e. a sharp change from one gray value to a markedly different one across a small buffer zone. The image edges of the double-parallel-line structure are the edges of the fetal nuchal translucency.
The Canny edge detector is chosen for its high detection rate, accurate localization and clear response: the algorithm identifies as many actual edges in the image as possible, and the identified edges lie as close as possible to the actual edges in the image.
In a preferred embodiment, before the Canny edge detector is applied, the NT region is Gaussian-smoothed to reduce image noise and its brightness is equalized with the contrast-limited adaptive histogram equalization (CLAHE) algorithm. Denoising before brightness equalization prevents noise in the image from being amplified during equalization, which would increase the difficulty of denoising and degrade image quality.
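A minimal sketch of this preprocessing chain, assuming OpenCV; the kernel size, CLAHE settings and Canny thresholds are illustrative choices, not values fixed by the invention.

```python
# Gaussian denoising, then CLAHE brightness equalization, then Canny edges.
import cv2

def nt_edges(nt_roi_gray):
    smoothed = cv2.GaussianBlur(nt_roi_gray, (5, 5), 0)           # suppress speckle noise first
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))   # contrast-limited equalization
    equalized = clahe.apply(smoothed)
    return cv2.Canny(equalized, 50, 150)                          # binary edge map of the ROI
```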
Step 2): select the image edge on one side and determine the gray-gradient direction of its pixel points.
Although step 1) determines the edges of the fetal nuchal translucency, automated detection still needs a tracking direction leading from a pixel point on one image edge to the other image edge. On the medical ultrasound image the nuchal translucency appears with bright edges and a dark middle region. Combining this with the characteristics of the ultrasound image, the inventors propose: starting from a pixel point on the image edge, compute the gray-gradient directions of all pixel points within a set range centered on that pixel point, in particular within a 5 × 5 pixel range, and average them; the descending side of this average direction (one end of the average direction is the direction of increasing gray level, the other the direction of decreasing gray level) gives the tracking direction toward the image edge on the other side.
In the invention, the gray gradient at pixel point (x, y) of the ultrasound image is:

Gx(x, y) = I(x+1, y) - I(x-1, y)    (1)
Gy(x, y) = I(x, y+1) - I(x, y-1)    (2)

where Gx(x, y), Gy(x, y) and I(x, y) denote, respectively, the horizontal gray gradient, the vertical gray gradient and the pixel value at point (x, y) of the ultrasound image. The gradient magnitude G(x, y) and the gray-gradient direction A(x, y) at point (x, y) are then:

G(x, y) = √(Gx(x, y)² + Gy(x, y)²)    (3)
A(x, y) = arctan(Gy(x, y) / Gx(x, y))    (4)
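The following NumPy sketch implements equations (1)-(4) together with the 5 × 5 averaging described above. Averaging unit gradient vectors instead of raw angles, to avoid wrap-around at ±180 degrees, is an implementation choice of this sketch and is not spelled out in the invention.

```python
# Gray gradients and the averaged descending-gradient direction at an edge pixel.
# Arrays are indexed [y, x].
import numpy as np

def gray_gradients(img):
    img = img.astype(np.float64)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]       # Gx(x, y) = I(x+1, y) - I(x-1, y)
    gy[1:-1, :] = img[2:, :] - img[:-2, :]       # Gy(x, y) = I(x, y+1) - I(x, y-1)
    return gx, gy

def descent_direction(gx, gy, x, y, half=2):
    """Unit vector of the average gray-gradient direction in the (2*half+1)^2
    window around (x, y), pointing toward *descending* gray level."""
    wx = gx[y - half:y + half + 1, x - half:x + half + 1]
    wy = gy[y - half:y + half + 1, x - half:x + half + 1]
    mag = np.hypot(wx, wy) + 1e-9                # per-pixel gradient magnitude G(x, y)
    mean_vec = np.array([(wx / mag).mean(), (wy / mag).mean()])
    mean_vec /= np.linalg.norm(mean_vec) + 1e-9
    return -mean_vec                             # gradient points uphill; negate for descent
```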
and 3) tracking along the direction of the gradient decrease of the gray level of the pixel point by taking the pixel point on the edge of the image selected in the step 2) as a starting point until the opposite edge is reached and the distance data is recorded. As shown in fig. 5, the point indicated by the arrow is the starting point, and the line segment between the two edges is the tracking path.
In the present invention, step 3) comprises the following substeps:
substep 3.1), taking the selected pixel point on the image edge as the starting point, track along the direction of descending gray gradient until the opposite edge is reached;
substep 3.2), obtain the gray-gradient direction of the image at the termination point; specifically, compute the gray-gradient directions of all pixel points within a set range centered on the termination point, in particular within a 5 × 5 pixel range, average them, and take the descending side of the average direction as the gray-gradient direction of the image at the termination point;
substep 3.3), verify the tracking path; after successful verification, record the distance between starting point and termination point, the angle between their gray-gradient directions, and the point pair formed by their coordinates.
In substep 3.3), the criterion for successful verification is that the angle between the gray-gradient direction at the termination point and that at the starting point is greater than 150 degrees and less than 180 degrees.
If the two edges of the fetal nuchal translucency formed exactly parallel lines, the tracking path would be perpendicular to both edges and the angle between the gradient directions at the starting and termination points would be 180 degrees. In fact the two edges form only an approximately parallel structure, so the tracking path is approximately perpendicular to them and the angle between the two gradient directions is close to 180 degrees, with a certain angular deviation allowed.
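A sketch of the tracking and verification of step 3), reusing descent_direction() from the previous sketch. Marching in a straight line along the starting direction at unit steps with a fixed iteration cap is an illustrative assumption; re-evaluating the direction at every step would be a possible variant.

```python
# March from an edge pixel along the descending gradient to the opposite edge,
# then apply the 150-180 degree verification rule.
import numpy as np

def trace_to_opposite_edge(edge_map, gx, gy, start, max_steps=200):
    """Return (end_point, distance_px, angle_deg), or None if the trace fails."""
    d_start = descent_direction(gx, gy, *start)
    pos = np.array(start, dtype=np.float64)
    for _ in range(max_steps):
        pos += d_start                                     # one unit step along the track
        xi, yi = int(round(pos[0])), int(round(pos[1]))
        if not (0 <= yi < edge_map.shape[0] and 0 <= xi < edge_map.shape[1]):
            return None                                    # left the ROI without a hit
        if edge_map[yi, xi] and np.hypot(xi - start[0], yi - start[1]) > 1.5:
            d_end = descent_direction(gx, gy, xi, yi)
            cos_angle = np.clip(np.dot(d_start, d_end), -1.0, 1.0)
            angle = np.degrees(np.arccos(cos_angle))       # angle between the two directions
            if 150.0 < angle < 180.0:                      # near-antiparallel: verified pair
                dist = float(np.hypot(xi - start[0], yi - start[1]))
                return (xi, yi), dist, angle
            return None                                    # reached an edge but failed check
    return None
```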
In step 4), step 3) is repeated until all pixel points on the selected image edge have been polled once, and the maximum distance length obtained after polling is selected, which is the thickness of the fetal nuchal translucency.
In step 4), the specific procedure for selecting the maximum distance length obtained after polling is as follows:
substep 4.1), compile statistics of the angles between the gray-gradient directions of all starting/termination point pairs, and remove the point pairs whose angle occurs rarely; preferably, these are the point pairs whose angle occurs fewer than 5 times;
substep 4.2), sort all remaining point pairs by the distance between the points of each pair, and select the maximum distance length, i.e. the corresponding NT distance;
substep 4.3), convert this image-detection parameter, expressed in pixels, into the actual NT value via the scaling parameter (a sketch of this procedure is given after this list).
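A sketch of this polling procedure, building on trace_to_opposite_edge() above. Rounding angles to whole degrees for the occurrence statistics is an assumption, since the invention does not state how angles are binned.

```python
# Poll every edge pixel, keep verified point pairs, drop rare-angle pairs,
# and convert the largest surviving distance with the scaling parameter r.
from collections import Counter

def nt_thickness(edge_points, edge_map, gx, gy, r):
    pairs = []
    for p in edge_points:                                  # poll each edge pixel once
        hit = trace_to_opposite_edge(edge_map, gx, gy, p)
        if hit is not None:
            end, dist, angle = hit
            pairs.append((p, end, dist, int(round(angle))))
    counts = Counter(angle for *_, angle in pairs)
    kept = [pr for pr in pairs if counts[pr[3]] >= 5]      # remove rare-angle point pairs
    if not kept:
        return None
    start, end, dist, _ = max(kept, key=lambda pr: pr[2])  # maximum distance length
    return dist * r, (start, end)                          # NT value in physical units
```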
The invention also provides a detection method for automatically acquiring the position of the NT region in the ultrasound image; this detection method supplies the NT-region position parameters for step 1), after which the image edges of the double-parallel-line structure in the NT region can be detected. As shown in FIG. 6, the method comprises the steps of:
step 1'), establish a standard image library and acquire the average offset between the head region and the NT region;
step 2'), input an ultrasound image and obtain its scale conversion parameter;
step 3'), determine the head region;
step 4'), determine the NT region, i.e. the position of the fetal nuchal translucency, from the determined head region and the average offset between the head region and the NT region.
Step 1'), establishing the standard image library: collect a batch of ultrasound images containing a head region and an NT region, acquire their scale conversion parameters, and label the head region and the NT region. The collected ultrasound images satisfy the qualified-image criteria of step 1) above.
In the invention, let the number of images in the standard image library be N, the scale conversion parameters be {r_i | i = 1, 2, …, N}, the head regions be {(xh_i, yh_i, wh_i, hh_i) | i = 1, 2, …, N} and the NT regions be {(xn_i, yn_i, wn_i, hn_i) | i = 1, 2, …, N}, where xh_i and yh_i are the abscissa and ordinate of the pixel points of the head region, and xn_i and yn_i are the abscissa and ordinate of the pixel points of the NT region. The average offset of the NT region relative to the head region is then

(Δx_hn, Δy_hn) = ( (1/N) Σ_{i=1..N} r_i (xn_i - xh_i), (1/N) Σ_{i=1..N} r_i (yn_i - yh_i) ).
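A short sketch of this computation under the reconstruction above; each library annotation is assumed to supply the scale parameter r_i and the corresponding head-region and NT-region coordinates.

```python
# Mean physical offset of the NT region relative to the head region.
import numpy as np

def average_offset(annotations):
    """annotations: iterable of (r_i, (xh_i, yh_i), (xn_i, yn_i)) tuples."""
    offsets = np.array([[r * (xn - xh), r * (yn - yh)]
                        for r, (xh, yh), (xn, yn) in annotations])
    return offsets.mean(axis=0)        # (delta_x_hn, delta_y_hn), in physical units
```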
In a preferred embodiment, step 1') further comprises training a head region detector. Preferably, the head region detector is trained with the Adaboost algorithm on HOG features, using the ultrasound images of the standard image library in which the head region position has been marked.
In a preferred embodiment, step 1') further comprises training an NT region detector. Preferably, the NT region detector is trained with the Adaboost algorithm on HOG features, using the ultrasound images of the standard image library in which the NT region position has been marked.
The HOG feature is a local-area descriptor that describes object edges well and is insensitive to brightness variations and small offsets. Adaboost is a classifier algorithm whose basic idea is to combine, by boosting, a large number of simple classifiers of ordinary classification ability into one strong classifier of high classification ability.
In the invention, training samples are selected from the standard image library. First, head-region (region-of-interest) images are cropped from the ultrasound images as positive samples, and several sub-images are cropped at random from non-head regions as negative samples; after the strong classifier has been trained, it can be applied to locate the fetal head region.
Correspondingly, NT-region (region-of-interest) images are cropped from the ultrasound images as positive samples, and several sub-images are cropped at random from non-NT regions as negative samples; after the strong classifier has been trained, it can be applied to locate the fetal NT region.
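For illustration, a training sketch that substitutes scikit-image's HOG and scikit-learn's AdaBoostClassifier for the unspecified implementations in the invention; the window size and HOG parameters are illustrative choices.

```python
# Train a region detector (head or NT) from HOG features with AdaBoost.
# All patches must share one size (e.g. 64x64 grayscale crops).
import numpy as np
from skimage.feature import hog
from sklearn.ensemble import AdaBoostClassifier

def hog_features(patches):
    return np.array([hog(p, orientations=9, pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)) for p in patches])

def train_region_detector(positive_patches, negative_patches):
    X = np.vstack([hog_features(positive_patches), hog_features(negative_patches)])
    y = np.hstack([np.ones(len(positive_patches)), np.zeros(len(negative_patches))])
    return AdaBoostClassifier(n_estimators=200).fit(X, y)

# At detection time, slide a window over the image and use
# clf.predict_proba(hog_features([window]))[0, 1] as the region score.
```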
Step 3'), determining the head region.
Specifically, step 3') comprises the following sub-steps:
step 3.1'), locate the position of the nose tip region of the head;
step 3.2'), taking the center point of the nose tip region as the center, assign a head-region pre-weight to each pixel point within a set range near the nose tip region;
step 3.3'), perform head-region detection on the input image with the head region detector, outputting detection regions and, for each, a score that it belongs to the head region;
step 3.4'), adjust each detection region's head score by the head-region pre-weight at the region's center point to obtain a final score, and determine the detection region with the highest final score to be the head region.
In the invention, in step 3.1'), an iterative threshold segmentation method locates the position of the nose tip region of the head by finding the brightest region of a set size. The nose tip region is determined according to a characteristic of NT medical ultrasound images: the nose tip of the head should be a large highlighted region of the whole ultrasound image. This prior knowledge allows the position of the nose tip region to be obtained effectively and accurately.
Specifically, step 3.1') comprises the following sub-steps:
step 3.1.1'), using the iterative threshold segmentation method, take threshold values from large to small (255 down to 100) to obtain segmented binary images;
step 3.1.2'), compute the connected domains of each obtained binary image, preferably with a connected-domain search algorithm;
step 3.1.3'), when exactly one connected domain is obtained in the image and its area is greater than 20 pixels, that connected domain is determined to be the nose tip region, see FIG. 7b.
In step 3.1.1'), because the nose tip region is difficult to identify in the original image and noise points may be misjudged as the nose tip region, the original image is converted by iterative threshold segmentation (binarization) into an image with only the two gray levels black and white, as shown for example in FIG. 7a; this reduces the difficulty of identification, so that the position of the nose tip region can be determined accurately and conveniently from the characteristics of the NT medical ultrasound image. Here a binary image is one in which every pixel is either black or white, with no intermediate gray transition.
In step 3.1.3'), changing the threshold from large to small and applying the prior condition that the area of the nose tip region is greater than 20 pixels effectively eliminates accidental noise interference.
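A sketch of step 3.1'), using OpenCV's connected-component analysis in place of the unspecified connected-domain search algorithm; an 8-bit grayscale input is assumed.

```python
# Lower the threshold from 255 toward 100 until the binary image contains
# exactly one connected domain larger than 20 pixels: the nose tip region.
import cv2

def locate_nose_tip(gray):
    for t in range(255, 99, -1):               # threshold values from large to small
        _, binary = cv2.threshold(gray, t, 255, cv2.THRESH_BINARY)
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
        # label 0 is the background, so n == 2 means exactly one foreground domain
        if n == 2 and stats[1, cv2.CC_STAT_AREA] > 20:
            return centroids[1]                # (x_c, y_c) of the nose tip region
    return None
```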
In the invention, in step 3.2'), the center point coordinates (x_c, y_c) of the connected domain (the nose tip region) are

(x_c, y_c) = (1/M) Σ_{(x, y) ∈ Ω} (x, y),

where Ω is the set of pixel points in the connected domain and M their number. Based on the center point of the nose tip region, each pixel point within the set range near the nose tip region is assigned the head-region pre-weight

w_h(x, y) = exp( -((x - x_c)² + (y - y_c)²) / (2 σ_h²) ),

see FIG. 7c, where x and y are the abscissa and ordinate of each pixel point within the set range near the nose tip region in the ultrasound image, and σ_h is the standard deviation of the distance between those pixel points and the center point (x_c, y_c). σ_h takes a value of 95-110, preferably 100. The value of σ_h determines the region to which pre-weights are assigned, i.e. the approximate extent of the head region: if σ_h is too small (below 95), the determined range is too small and the center position of the head region is determined with deviation; if σ_h is too large (above 110), the determined range is too large and its reference value for the head region is low; either condition greatly reduces the accuracy of the subsequent NT-region determination. When σ_h is 100, the pre-weighted region generally contains the head region, the range is moderate, and the accuracy is high.
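A sketch of the pre-weight map, assuming the Gaussian form reconstructed above.

```python
# Pre-weight map w(x, y) = exp(-((x - xc)^2 + (y - yc)^2) / (2 sigma^2)).
import numpy as np

def pre_weight_map(shape, center, sigma=100.0):
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    xc, yc = center
    return np.exp(-((xs - xc) ** 2 + (ys - yc) ** 2) / (2.0 * sigma ** 2))
```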
In the invention, the head region detector is trained on HOG features with the Adaboost algorithm; that is, in step 3.3') the Adaboost algorithm outputs target detection results, as shown in FIG. 8a, each comprising the position of a detection region and a score scoreh_i that the region belongs to the head. The target detection results take the form

{(xh_i, yh_i, wh_i, hh_i, scoreh_i) | i = 1, 2, …, P},

where xh_i and yh_i are the abscissa and ordinate of the upper-left pixel of the detection region, wh_i and hh_i its width and height, scoreh_i the score that it belongs to the head region, and P the number of target detection results output.
In step 3.4'), each detection region's head score is adjusted by the head-region pre-weight at the region's center point to give the final score, and the detection region with the highest final score is determined to be the head region, as shown in FIG. 8b. The final score is

scorehw_i = w_h(xh_i + wh_i/2, yh_i + hh_i/2) × scoreh_i,

and the detection region with the highest final score, (xh_imax, yh_imax, wh_imax, hh_imax) with i_max = argmax_{i ∈ 1, 2, …, P} scorehw_i, is taken as the head region. From the obtained head region, the center position of the head region is

(x_h, y_h) = (xh_imax + wh_imax/2, yh_imax + hh_imax/2).
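A sketch of the rescoring in steps 3.3')-3.4'): multiply each candidate's detector score by the pre-weight at its center and keep the argmax. The same routine serves the NT-region selection of step 4') when given the NT pre-weight map (σ_n = 90) and the NT detections; the detection-tuple layout follows the form given above.

```python
def select_region(detections, weight_map):
    """detections: list of (x, y, w, h, score) tuples; returns the winning box."""
    def final_score(d):
        x, y, w, h, score = d
        return weight_map[int(y + h / 2), int(x + w / 2)] * score
    return max(detections, key=final_score)
```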
Step 4'), determining the thickness region of the fetal nuchal translucency from the determined head region and the average offset between the head region and that thickness region.
Step 4') comprises the following sub-steps:
step 4.1'), from the center position of the head region and the average offset of the NT region relative to the head region, combined with the scale conversion parameter of the image, calculate the center point of the NT region, and assign an NT-region pre-weight to each pixel point within the set range based on that center point;
step 4.2'), perform NT-region detection on the input image with the NT region detector, outputting detection regions and, for each, a score that it belongs to the NT region;
step 4.3'), adjust each detection region's NT score by the NT-region pre-weight at the region's center point to obtain a final score, and determine the detection region with the highest final score to be the NT region.
In step 4.1'), specifically, from the center position (x_h, y_h) of the head region, the center point of the NT region is calculated as

x_n = x_h + Δx_hn × 1/r,  y_n = y_h + Δy_hn × 1/r,

where r is the scaling parameter of the input image. Based on this center point, the NT-region pre-weight is set as

w_n(x, y) = exp( -((x - x_n)² + (y - y_n)²) / (2 σ_n²) ),

see FIG. 9a, where x and y are the abscissa and ordinate of each pixel point within the set range near the NT region in the ultrasound image, and σ_n is the standard deviation of the distance between those pixel points and the center point (x_n, y_n). σ_n takes a value of 85-95, preferably 90. The value of σ_n determines the pre-weighted region, i.e. the approximate range of the NT region: if σ_n is too small (below 85), the determined range is too small and may fail to contain the real NT region; if σ_n is too large (above 95), the determined range is too large and its reference value is low; when σ_n is 90, the pre-weighted region generally contains the NT region, the range is moderate, and the accuracy is high.
In the invention, the NT region detector is likewise trained on HOG features with the Adaboost algorithm; that is, in step 4.2') the Adaboost algorithm outputs target detection results, each comprising the position of a detection region and a score scoren_i that the region belongs to the NT region. The target detection results take the form

{(xn_i, yn_i, wn_i, hn_i, scoren_i) | i = 1, 2, …, Q},

where xn_i and yn_i are the abscissa and ordinate of the upper-left pixel of the detection region, wn_i and hn_i its width and height, scoren_i the score that it belongs to the NT region, and Q the number of target detection results output; see FIG. 9b.
In step 4.3'), each detection region's NT score is adjusted by the NT-region pre-weight at the region's center point, and the detection region with the highest final score is taken as the NT region. The final score is

scorenw_i = w_n(xn_i + wn_i/2, yn_i + hn_i/2) × scoren_i,

and the detection region with the highest final score, (xn_imax, yn_imax, wn_imax, hn_imax) with i_max = argmax_{i ∈ 1, 2, …, Q} scorenw_i, is taken as the NT region; see FIG. 9c.
Another object of the invention is to provide a system for implementing the above image processing method for measuring the thickness of the fetal nuchal translucency from an ultrasound image, the system comprising:
an image input module for inputting an ultrasound image, acquiring its scale conversion parameter, and detecting the image edges of the double-parallel-line structure in the thickness region of the fetal nuchal translucency, i.e. the NT region;
a gradient direction detection module for determining the gray-gradient direction of the pixel points on the image edge of either side;
a tracking and recording module for tracking a selected pixel point on the image edge along the direction of descending gray gradient until the opposite edge is reached, and repeating the tracking operation until all pixel points on the selected image edge have been polled once;
a data recording module for judging the reasonableness of the distance data and recording the distance information;
and a data judgment module for screening the recorded data and determining the maximum distance length obtained after polling, which is the thickness of the fetal nuchal translucency.
In the invention, the image input module comprises an image input sub-module, an image division sub-module, a scale acquisition sub-module, a magnification acquisition sub-module, a conversion relation sub-module and an edge detection sub-module, wherein:
the image input sub-module inputs the ultrasound image to be processed;
the image division sub-module divides the input image into regions, obtaining the scale region, the image region and the image magnification region;
the scale acquisition sub-module crops the scale region, binarizes the cropped image, and obtains the pixel distance between any two adjacent scale points or the average value δ of that distance;
the magnification acquisition sub-module crops the image magnification region and obtains the image magnification value f from it using optical character recognition (OCR);
the conversion relation sub-module combines the scale and magnification information to obtain the scaling parameter r of the image, r = f × C/δ, where C is the actual physical size represented by the unit scale of the scale;
and the edge detection sub-module detects the image edges of the double-parallel-line structure in the NT region, preferably with the Canny edge detector.
In the invention, the gradient direction detection module, starting from the pixel points on the image edge, computes the gray-gradient directions of all pixel points within a set range centered on each such pixel point, in particular within a 5 × 5 pixel range, obtains the average direction, and determines its descending side (one end of the average direction is the direction of increasing gray level, the other the direction of decreasing gray level), giving the tracking direction toward the image edge on the other side.
In the invention, the data recording module judges the reasonableness of the distance data by the criterion that the angle between the gray-gradient direction at the termination point and that at the starting point is greater than 150 degrees and less than 180 degrees.
In the invention, the data judgment module comprises a data statistics sub-module, a data screening sub-module, a data sorting sub-module and a data conversion sub-module, wherein:
the data statistics sub-module records the distance between starting point and termination point, the angle between their gray-gradient directions, and the point pair formed by their coordinates;
the data screening sub-module compiles statistics of the angles between the gray-gradient directions of all starting/termination point pairs and removes the point pairs whose angle occurs fewer than 5 times;
the data sorting sub-module sorts all remaining point pairs by the distance between the points of each pair and selects the maximum distance length;
and the data conversion sub-module converts the maximum distance length detected in the image into the actual NT value via the scaling parameter.
Examples
Example 1
Model building: 3500 early-pregnancy ultrasound images stored by the obstetrics and gynecology ultrasound workstation of Sun Yat-sen Memorial Hospital, Sun Yat-sen University between January 2013 and January 2016 were collected. Images meeting the inclusion criteria of the study were screened out: the fetal head is completely displayed and head and chest occupy 70-85% of the whole image area; the angle between the tangent of the mid-sagittal line of the head and the tangent of the anterior chest is 130-160 degrees; the nose tip region of the head is a relatively large highlighted region of the whole image.
1500 images meeting the measurement conditions were screened manually; their scale conversion parameters were acquired and the head region and NT region were labeled. First, the average offset between head region and NT region was acquired; second, the head region detector was trained on HOG features with the Adaboost algorithm; third, the NT region detector was trained on HOG features with the Adaboost algorithm.
An ultrasound image containing the NT region was input, as shown in FIG. 7a. Using the iterative threshold segmentation method, at threshold 203 the binary image obtained contained exactly 1 connected domain with area greater than 20 pixels, and that connected domain was determined to be the nose tip region, as shown in FIG. 7b. With the center point of the nose tip region (coordinates (485, 139)) as the center, a head-region pre-weight was assigned to each pixel point within the set range near the nose tip region, with σ_h = 100, as shown in FIG. 7c.
The head region detector was used to detect the specific position of the head, outputting 7 target detection results: (313,3,110,110,0.824), (389,127,143,143,0.781), (458,194,110,110,0.752), (341,28,124,124,0.704), (419,318,126,126,0.68), (181,164,118,118,0.607), (175,5,143,143,0.569), as shown in FIG. 8a. The detection results were adjusted by the pre-weights; the highest final score was 0.636 and the head center coordinates were (460.5, 198.5), as shown in FIG. 8b.
The center point of the NT region was calculated from the center of the head region and the average offset of the NT region relative to the head region, combined with the scale conversion parameter of the image, and an NT-region pre-weight was assigned to each pixel point within the set range based on that center point, with σ_n = 90, as shown in FIG. 9a. NT-region detection was performed on the input image with the NT region detector, outputting 3 detection results: (461,279,66,66,0.884), (273,362,71,71,0.857), (329,397,66,66,0.739), as shown in FIG. 9b. The detection results were adjusted by the pre-weights; the highest final score was 0.493 and the NT detection region result was (329,397,66,66), as shown in FIG. 9c.
The two point coordinates of the NT calibration detected manually by a doctor were (358,422) and (353,447); the NT region finally detected by the algorithm contains both manually calibrated points, so the detection result is correct.
The NT region was cropped, Gaussian smoothing and brightness equalization were applied to the target region, and the image edges of the double-parallel-line structure in the NT region were detected with the Canny edge detector.
Taking pixel points on the image edge as starting points and a 5 × 5 pixel extension as the examination range, the average direction of the image gray gradient within that range at the image edge was calculated, and the tracking direction toward the image edge on the other side was obtained by determining the direction of descending gray gradient.
Taking each selected pixel point on the image edge as the starting point, tracking proceeded along the direction of descending gray gradient until the opposite edge was reached; the gray-gradient direction of the image at the termination point was calculated, and whenever the angle between the gray-gradient direction at the termination point and that at the starting point was greater than 150 degrees and less than 180 degrees, the distance between starting point and termination point, the angle between their gradient directions, and the point pair formed by their coordinates were recorded.
All pixel points on the selected image edge were polled once, statistics of all gradient-direction angles were compiled, and point pairs occurring fewer than 5 times were removed; all remaining point pairs were sorted by distance length, the maximum distance length was selected and converted into the actual NT value via the proportionality coefficient. The NT value was determined to correspond to the point pair (356,422) and (354,448), with an NT value of 0.20 cm.
The two point coordinates of the NT calibration detected manually by a doctor were (358,422) and (353,447), with an NT value of 0.22 cm; the result of the method is very close to the manual detection result, so the detection result is correct.
The invention has been described above in connection with preferred embodiments, but these embodiments are merely exemplary and illustrative. Various substitutions and improvements may be made to the invention on this basis, and all of them fall within the protection scope of the invention.

Claims (7)

1. An image processing method for measuring the thickness of the fetal nuchal translucency from an ultrasound image, the method comprising the steps of:
step 1), inputting the ultrasound image to be measured, acquiring its scale conversion parameter and the thickness region of the fetal nuchal translucency, and detecting the image edges of the double-parallel-line structure in the thickness region of the fetal nuchal translucency;
the position of the thickness region of the fetal nuchal translucency in step 1) being acquired automatically through the following steps:
step 1'), establishing a standard image library and acquiring the average offset between the head region and the thickness region of the fetal nuchal translucency;
step 2'), inputting an ultrasound image and obtaining its scale conversion parameter;
step 3'), determining the head region, which specifically comprises the following sub-steps:
step 3.1'), locating the position of the nose tip region of the head;
step 3.2'), taking the center point of the nose tip region as the center, assigning a head-region pre-weight to each pixel point within a set range near the nose tip region;
step 3.3'), performing head-region detection on the input image with a head region detector, outputting detection regions and, for each, a score that it belongs to the head region;
step 3.4'), adjusting each detection region's head score by the head-region pre-weight at the region's center point to obtain a final score, and determining the detection region with the highest final score to be the head region;
step 4'), determining the thickness region of the fetal nuchal translucency, i.e. the position of the fetal nuchal translucency, from the determined head region and the average offset between the head region and the thickness region of the fetal nuchal translucency;
step 2), selecting the image edge on one side and determining the gray-gradient direction of its pixel points;
step 3), taking a pixel point on the image edge selected in step 2) as the starting point, tracking along the direction of descending gray gradient until the opposite edge is reached, and recording the distance data;
step 4), repeating step 3) until all pixel points on the selected image edge have been polled once, and selecting the maximum distance length obtained after polling, which is the thickness of the fetal nuchal translucency;
in step 1), the scaling parameter being read directly from the panel of the medical ultrasound instrument, or obtained through the following sub-steps:
step 1.1), dividing the input ultrasound image to obtain a scale region, an image region and an image magnification region;
step 1.2), acquiring the image scale and the image magnification, and computing the scaling parameter of the image by combining the two;
step 1.2) comprising the following sub-steps:
step 1.2.1), cropping the scale region, binarizing the cropped image, and obtaining the pixel distance between any two adjacent scale points or the average value δ of that distance;
step 1.2.2), cropping the image magnification region and obtaining the image magnification f from it using optical character recognition;
step 1.2.3), combining the scale and magnification information to obtain the scaling parameter r of the image, r = f × C/δ, where C is the actual physical size represented by the pixel distance between any two adjacent scale points.
2. The method according to claim 1, characterized in that in step 1), the Canny edge detection operator is used to detect the image edges of the double parallel line structure in the fetal NT thickness region;
and image edge detection is performed after Gaussian smoothing and image brightness equalization have been applied, in that order, to the NT thickness region.
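A minimal sketch of the preprocessing chain of claim 2, assuming OpenCV; the Gaussian kernel size, the Canny thresholds, and the reading of "brightness equalization" as histogram equalization are illustrative assumptions, not values taken from the patent.

```python
import cv2

def detect_nt_edges(nt_region):
    """Edge detection within the NT thickness region (claim 2, sketch).

    nt_region -- grayscale crop of the NT thickness region (8-bit NumPy array)
    """
    # Gaussian smoothing first, to suppress ultrasound speckle (kernel assumed).
    smoothed = cv2.GaussianBlur(nt_region, (5, 5), 0)
    # Brightness equalization, rendered here as plain histogram equalization.
    equalized = cv2.equalizeHist(smoothed)
    # Canny edge detection; the two thresholds are illustrative only.
    return cv2.Canny(equalized, 50, 150)
```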
3. The method according to claim 1, wherein in step 2), for each pixel on the image edge, the average direction of the image gray gradient within a set range around the edge is computed, the direction of decreasing gray level is determined, and the tracking direction toward the opposite image edge is thereby obtained.
4. The method according to claim 3, wherein in step 2), for each pixel on the image edge, the average direction of the image gray gradient within a 5 × 5 pixel range on one side of the edge is computed, the direction of decreasing gray level is determined, and the tracking direction toward the opposite image edge is thereby obtained.
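Claims 3 and 4 derive the tracking direction from a windowed average of the gray gradient. One possible rendering is sketched below; the Sobel operator is an assumption (the claims do not prescribe a gradient operator), and the edge pixel is assumed to lie at least two pixels from the image border.

```python
import cv2
import numpy as np

def tracking_direction(image, edge_point, window=5):
    """Average gray-gradient descent direction near an edge pixel (claims 3-4).

    image      -- grayscale NT region (NumPy array)
    edge_point -- (row, col) of a pixel on the selected image edge
    window     -- side length of the averaging window; claim 4 uses 5 x 5
    """
    gx = cv2.Sobel(image, cv2.CV_64F, 1, 0, ksize=3)  # gradient along columns
    gy = cv2.Sobel(image, cv2.CV_64F, 0, 1, ksize=3)  # gradient along rows
    r, c = edge_point
    half = window // 2
    # Average the gradient vectors over the window centered on the edge pixel.
    mean_gx = gx[r - half:r + half + 1, c - half:c + half + 1].mean()
    mean_gy = gy[r - half:r + half + 1, c - half:c + half + 1].mean()
    norm = np.hypot(mean_gx, mean_gy)
    # Gray level decreases opposite the mean gradient, so the unit vector
    # pointing toward the darker NT interior (and the opposite edge) is:
    return (-mean_gy / norm, -mean_gx / norm)  # (d_row, d_col)
```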
5. The method according to claim 1, characterized in that step 3) comprises the following sub-steps:
sub-step 3.1), taking a pixel on the selected image edge as the starting point, tracking along the direction of decreasing gray level until the opposite edge is reached, where tracking terminates;
sub-step 3.2), obtaining the image gray-gradient direction at the termination point;
sub-step 3.3), verifying the tracking path and, upon successful verification, recording the distance between the starting point and the termination point, the included angle between their gray-gradient directions, and the point pair formed by their coordinates;
wherein the criterion for successful verification is that the included angle between the gray-gradient direction at the termination point and that at the starting point is greater than 150 degrees and less than 180 degrees.
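The tracking-and-verification loop of claim 5 might look like the sketch below, which reuses the hypothetical `tracking_direction` helper from the claim 4 sketch; the unit step size, the step limit, and the 2-pixel guard against re-hitting the starting edge are assumptions.

```python
import numpy as np

def track_to_opposite_edge(image, edge_mask, start, max_steps=200):
    """Trace from a start pixel along descending gray level to the opposite
    edge and verify the path (claim 5, sub-steps 3.1-3.3, sketch).

    edge_mask -- boolean array marking detected edge pixels
    start     -- (row, col) pixel on the selected image edge
    Returns (distance_px, angle_deg, (start, end)) on success, else None.
    """
    start_dir = np.array(tracking_direction(image, start))
    pos = np.array(start, dtype=float)
    for _ in range(max_steps):
        pos += start_dir                          # sub-step 3.1: unit steps
        r, c = int(round(pos[0])), int(round(pos[1]))
        if not (0 <= r < edge_mask.shape[0] and 0 <= c < edge_mask.shape[1]):
            return None                           # left the region: no hit
        if edge_mask[r, c] and np.hypot(r - start[0], c - start[1]) > 2:
            end = (r, c)
            # Sub-step 3.2: gradient direction at the termination point.
            end_dir = np.array(tracking_direction(image, end))
            # Sub-step 3.3: the included angle between the start and end
            # gray-gradient directions (equal to the angle between the two
            # descent vectors used here) must lie between 150 and 180 deg.
            cos_a = np.clip(np.dot(start_dir, end_dir), -1.0, 1.0)
            angle = np.degrees(np.arccos(cos_a))
            if 150.0 < angle < 180.0:
                dist = float(np.hypot(end[0] - start[0], end[1] - start[1]))
                return dist, angle, (tuple(start), end)
            return None
    return None
```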
6. The method according to claim 1, wherein the specific procedure for selecting the maximum distance obtained after polling in step 4) is as follows:
sub-step 4.1), compiling statistics of the included angles between the gray-gradient directions of all starting and termination points, and removing point pairs whose included angle occurs fewer than 5 times;
sub-step 4.2), sorting the remaining point pairs by the distance between the points of each pair, and selecting the maximum distance, which corresponds to the fetal NT thickness distance;
and sub-step 4.3), converting the maximum distance detected in the image into the actual fetal NT thickness value by means of the scale conversion parameter.
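A sketch of the selection logic of claim 6, consuming the point pairs produced by the claim 5 sketch; binning the included angles to whole degrees and treating r as a multiplicative pixel-to-physical factor are assumptions, since the claim fixes neither.

```python
from collections import Counter

def nt_thickness(point_pairs, r):
    """Select the NT thickness from verified point pairs (claim 6, sketch).

    point_pairs -- list of (distance_px, angle_deg, (start, end)) tuples
    r           -- scale conversion parameter from claim 1, step 1.2.3
    """
    # Sub-step 4.1: count included angles (rounded to whole degrees, an
    # assumption) and drop point pairs whose angle occurs fewer than 5 times.
    counts = Counter(round(angle) for _, angle, _ in point_pairs)
    kept = [p for p in point_pairs if counts[round(p[1])] >= 5]
    if not kept:
        return None
    # Sub-step 4.2: the maximum distance among the remaining pairs is the
    # NT thickness in pixels (equivalent to sorting and taking the largest).
    max_px = max(dist for dist, _, _ in kept)
    # Sub-step 4.3: convert to an actual thickness via the scale parameter.
    return max_px * r
```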
7. A processing system implementing the image processing method for measuring the thickness of the fetal nuchal translucency from an ultrasound image according to any one of claims 1 to 6.
CN201810488866.XA 2018-05-21 2018-05-21 Image processing method and system for measuring thickness of fetal nuchal transparency from ultrasonic image Active CN110507358B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810488866.XA CN110507358B (en) 2018-05-21 2018-05-21 Image processing method and system for measuring thickness of fetal nuchal transparency from ultrasonic image

Publications (2)

Publication Number Publication Date
CN110507358A (en) 2019-11-29
CN110507358B (en) 2022-01-11

Family

ID=68621676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810488866.XA Active CN110507358B (en) 2018-05-21 2018-05-21 Image processing method and system for measuring thickness of fetal nuchal transparency from ultrasonic image

Country Status (1)

Country Link
CN (1) CN110507358B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1915177A (en) * 2005-08-17 2007-02-21 北京天惠华数字技术有限公司 Method for recording identification of fetus sexes through B ultrasonic
KR20080004775A (en) * 2006-07-06 2008-01-10 이화여자대학교 산학협력단 Method for automated measurement of nuchal translucency in a fetal ultrasound image
CN101292883A (en) * 2007-04-23 2008-10-29 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic three-dimensional quick imaging method and apparatus
CN102151149A (en) * 2010-12-24 2011-08-17 深圳市理邦精密仪器股份有限公司 Method and system for automatically measuring ultrasound image of fetus
CN102415902A (en) * 2010-09-13 2012-04-18 株式会社東芝 Ultrasonic diagnostic apparatus and ultrasonic image processing apparatus
EP2444002A1 (en) * 2010-10-22 2012-04-25 Samsung Medison Co., Ltd. 3D ultrasound system for intuitive displaying an abnormality of a fetus and method for operating 3D ultrasound system
WO2012053612A1 (en) * 2010-10-20 2012-04-26 株式会社東芝 Ultrasonic diagnosis device, control method and image processing device
CN102592281A (en) * 2012-01-16 2012-07-18 北方工业大学 Image matching method
CN103263278A (en) * 2013-01-23 2013-08-28 郑末晶 Image processing method for automatically measuring thickness of fetal nuchal translucency from ultrasonic image
CN103479399A (en) * 2013-10-11 2014-01-01 华北电力大学(保定) Automatic retrieval method for calcified plaque frames in intravascular ultrasound image sequence
CN103996194A (en) * 2014-05-23 2014-08-20 华中科技大学 Automatic intima-media membrane partitioning method based on ultrasound carotid artery image
CN104156967A (en) * 2014-08-18 2014-11-19 深圳市开立科技有限公司 Nuchal translucency image segmentation method, device and system
CN105662460A (en) * 2014-12-05 2016-06-15 三星麦迪森株式会社 Ultrasound method and apparatus for processing ultrasound image

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2391625A (en) * 2002-08-09 2004-02-11 Diagnostic Ultrasound Europ B Instantaneous ultrasonic echo measurement of bladder urine volume with a limited number of ultrasound beams
US6676605B2 (en) * 2002-06-07 2004-01-13 Diagnostic Ultrasound Bladder wall thickness measurement system and methods
JP5158679B2 (en) * 2007-09-14 2013-03-06 国立大学法人岐阜大学 Image processing apparatus, image processing program, storage medium, and ultrasonic diagnostic apparatus
KR101971622B1 (en) * 2012-01-04 2019-08-13 삼성전자주식회사 The method and apparatus for measuring biometrics of object
KR102273831B1 (en) * 2014-01-07 2021-07-07 삼성메디슨 주식회사 The Method and Apparatus for Displaying Medical Image
CN107007302B (en) * 2017-04-27 2020-11-10 深圳开立生物医疗科技股份有限公司 Ultrasonic equipment, ultrasonic image processing method and device
CN107569257A (en) * 2017-09-29 2018-01-12 深圳开立生物医疗科技股份有限公司 Ultrasonoscopy processing method and system, ultrasonic diagnostic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Prenatal ultrasound medical image processing; Zhou Yin; Telecom World; 2017-02-12; full text *
Application of fetal nuchal translucency thickness measurement in prenatal screening; Wei Yuan, Yang Zi, Liu Zhaohui, et al.; Chinese Journal of Birth Health and Heredity; 2006-07-25; full text *

Similar Documents

Publication Publication Date Title
CN114972329B (en) Image enhancement method and system of surface defect detector based on image processing
CN108186051B (en) Image processing method and system for automatically measuring double-apical-diameter length of fetus from ultrasonic image
CN108564085B (en) Method for automatically reading of pointer type instrument
CN110210448B (en) Intelligent face skin aging degree identification and evaluation method
CN108378869B (en) Image processing method and processing system for automatically measuring head circumference length of fetus from ultrasonic image
CN108961280B (en) Fundus optic disc fine segmentation method based on SLIC super-pixel segmentation
CN108537751B (en) Thyroid ultrasound image automatic segmentation method based on radial basis function neural network
CN112818988A (en) Automatic reading identification method and system for pointer instrument
CN108615239B (en) Tongue image segmentation method based on threshold technology and gray level projection
CN109376740A (en) A kind of water gauge reading detection method based on video
CN103263278A (en) Image processing method for automatically measuring thickness of fetal nuchal translucency from ultrasonic image
CN112907519A (en) Metal curved surface defect analysis system and method based on deep learning
CN102254159A (en) Interpretation method for digital readout instrument
CN111415339B (en) Image defect detection method for complex texture industrial product
CN110580697B (en) Video image processing method and system for measuring thickness of fetal nape transparency from ultrasonic video image
CN111724355A (en) Image measuring method for abalone body type parameters
CN111639629A (en) Pig weight measuring method and device based on image processing and storage medium
WO2021139447A1 (en) Abnormal cervical cell detection apparatus and method
CN111583250B (en) Deep learning-based ultrasonic image mitral valve positioning method and system
CN108416304B (en) Three-classification face detection method using context information
CN110051384B (en) Method and system for detecting position of transparency of neck and back of fetus by combining medical statistical information
CN116258864B (en) Village planning construction big data management system
CN110348353B (en) Image processing method and device
CN110507358B (en) Image processing method and system for measuring thickness of fetal nuchal transparency from ultrasonic image
CN116486212A (en) Water gauge identification method, system and storage medium based on computer vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant