WO2015139267A1 - Method and apparatus for automatically identifying measurement item, and ultrasound imaging apparatus - Google Patents

Method and apparatus for automatically identifying measurement item, and ultrasound imaging apparatus

Info

Publication number
WO2015139267A1
WO2015139267A1 PCT/CN2014/073777 CN2014073777W WO2015139267A1 WO 2015139267 A1 WO2015139267 A1 WO 2015139267A1 CN 2014073777 W CN2014073777 W CN 2014073777W WO 2015139267 A1 WO2015139267 A1 WO 2015139267A1
Authority
WO
WIPO (PCT)
Prior art keywords
image frame
slice image
specified slice
measurement
measurement item
Prior art date
Application number
PCT/CN2014/073777
Other languages
English (en)
French (fr)
Inventor
邹耀贤
姚斌
魏芅
林穆清
曾佐祺
Original Assignee
深圳迈瑞生物医疗电子股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳迈瑞生物医疗电子股份有限公司 filed Critical 深圳迈瑞生物医疗电子股份有限公司
Priority to CN201911159310.7A priority Critical patent/CN110811691B/zh
Priority to CN201480047617.7A priority patent/CN105555198B/zh
Priority to PCT/CN2014/073777 priority patent/WO2015139267A1/zh
Priority to EP14885962.2A priority patent/EP3127486B1/en
Publication of WO2015139267A1 publication Critical patent/WO2015139267A1/zh
Priority to US15/271,095 priority patent/US10898109B2/en
Priority to US17/144,786 priority patent/US11717183B2/en
Priority to US18/211,516 priority patent/US20230329581A1/en

Links

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1075Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0866Detecting organic movements or changes, e.g. tumours, cysts, swellings involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0875Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of bone
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4444Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B8/4461Features of the scanning mechanism, e.g. for moving the transducer within the housing of the probe
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/467Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B8/469Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5223Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Definitions

  • The present invention relates to a medical device, and more particularly to an ultrasound imaging apparatus and a method and device for automatically identifying a measurement item.
  • Ultrasound instruments are generally used by doctors to observe the internal tissue structure of the human body: the doctor places the ultrasound probe on the skin surface over the body part of interest and obtains an ultrasound slice image of that part.
  • Because it is safe, convenient, non-invasive and inexpensive, ultrasound has become a principal examination method in clinical diagnosis.
  • In ultrasound measurement, to obtain indices such as the size of a target of interest, the doctor often needs to perform many measurements. In a given measurement mode (the tissue site being measured) there are usually many measurement items, and measurement is a continuous interaction between user and machine in which the user repeatedly selects a measurement item and then moves the trackball to measure, which frequently takes a great deal of time. For example, in the abdominal mode the commonly used measurement items include the sizes of the liver, gallbladder, spleen and kidney, while in the obstetric mode the head circumference (HC), biparietal diameter (BPD), abdominal circumference (AC) and femur length (FL) must be measured in every examination.
  • In a typical examination, after the doctor has obtained the relevant standard slice, the doctor first presses the measurement key; once the system displays the measurement menu, the doctor moves the trackball, selects the corresponding measurement item in the menu, and then performs that measurement. Taking an obstetric examination as an example, after obtaining the standard slice the doctor presses the measurement key, the system displays the measurement menu and enters the measurement state, and the doctor then moves the trackball and selects the corresponding item in the menu.
  • Taking the head circumference item as an example, the doctor first moves the cursor to the measurement menu by rotating the trackball and selects the head circumference item; after selecting it, the doctor rotates the trackball to move the cursor to one side of the skull halo in the slice image,
  • presses the confirm button to place the first point,
  • moves the cursor to the other side and presses the confirm button to place the second point, which defines one axis of the ellipse,
  • and then moves the cursor to adjust the length of the other axis; the two previously placed points may also need to be adjusted until the ellipse fits the position of the fetal skull.
  • A single measurement often requires placing many points before the ellipse fits the measured structure; for a line-segment target, at least two points must be clicked. According to some literature, doctors spend 20% to 30% of their time on measurement.
  • According to a first aspect of the present application, an embodiment provides a method of automatically identifying a measurement item, comprising:
  • an image acquisition step, which acquires the gray value of each pixel in a specified slice image frame, the gray value of a pixel corresponding to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal;
  • an identification step, which identifies, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame;
  • a measurement step, which measures the measurement item parameters of the specified slice image frame based on the identified measurement item.
  • According to a second aspect, an embodiment provides an apparatus for automatically identifying a measurement item, including:
  • an image acquisition module configured to acquire the gray value of each pixel in the specified slice image frame, the gray value of a pixel corresponding to the ultrasonic echo formed by the examined tissue reflecting the ultrasound signal;
  • an identification module configured to identify, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame;
  • a measurement module configured to measure the measurement item parameters of the specified slice image frame based on the identified measurement item.
  • According to a third aspect, an embodiment provides an ultrasound imaging apparatus including:
  • a probe for transmitting ultrasonic waves to the tissue under examination and receiving ultrasonic echoes; a signal processor for processing the ultrasonic echoes to generate ultrasound image data; and an image processor for processing the ultrasound image data and generating a slice image, the image processor including the above-described apparatus for automatically identifying a measurement item.
  • With the method/apparatus for automatically identifying a measurement item of the present invention, the measurement item of a specified slice can be identified automatically from the content of the specified slice image, so the step in which the user selects the measurement item is omitted from the ultrasound measurement process, making measurement more convenient and automated.
  • BRIEF DESCRIPTION OF THE DRAWINGS: FIG. 1 is a structural diagram of an ultrasound imaging apparatus according to an embodiment of the present application;
  • FIG. 2 is a structural diagram of an apparatus for automatically identifying a measurement item according to an embodiment of the present application;
  • FIG. 3 is a flowchart of a method for automatically identifying a measurement item in an embodiment of the present application;
  • FIG. 4 is a structural diagram of an identification module provided in Embodiment 1 of the present application;
  • FIG. 5 is a flowchart of an identification method provided in Embodiment 1 of the present application;
  • FIG. 6 is a structural diagram of an identification module provided in Embodiment 2 of the present application;
  • FIG. 7 is a flowchart of an identification method provided in Embodiment 2 of the present application;
  • FIG. 8 shows example slice diagrams for the measurement items in the obstetric measurement mode of Embodiment 2, in which FIG. 8a illustrates a head circumference slice, FIG. 8b illustrates an abdominal circumference slice, and FIG. 8c illustrates a femur slice.
  • FIG. 1 shows the structure of the ultrasound imaging apparatus, which includes an ultrasonic generating circuit 11, a probe 1, a signal processor 2, an image processor 3 and a display 4.
  • The probe 1 is used to transmit ultrasonic waves to the scanned target tissue and to receive ultrasonic echoes.
  • The ultrasonic generating circuit 11 generates waveform data and drives the array elements of the probe 1 through the transmitting channel 12 so that ultrasonic waves are transmitted into the examined tissue; the tissue reflects and absorbs the ultrasound to form ultrasonic echoes, which the probe 1 receives and outputs to the signal processor 2 through the receiving channel 13.
  • The signal processor 2 processes the ultrasonic echoes to generate ultrasound image data.
  • The signal processor 2 first passes the ultrasonic echoes received by the receiving channel 13 through a beamforming stage to obtain a radio frequency (RF) signal, and then performs quadrature demodulation to obtain a quadrature-demodulated baseband signal.
  • The processed ultrasound image data is output to the image processor 3.
  • The image processor 3 processes the ultrasound image data and generates a slice image, which is sent to the display 4 for display.
  • The image processor 3 includes an apparatus for automatically identifying a measurement item, which processes the ultrasound image data output by the signal processor 2, identifies the measurement item corresponding to the slice image specified by the user, and further measures the parameters specified for that measurement item.
  • The display 4 displays the slice image generated by the image processor 3 together with the measurement parameters.
  • The structure of the apparatus for automatically identifying a measurement item is shown in FIG. 2 and includes an image acquisition module 31, an identification module 32 and a measurement module 33.
  • The image acquisition module 31 is configured to acquire the gray value of each pixel in the specified slice image frame, the gray value of a pixel corresponding to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal.
  • The identification module 32 is configured to identify, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame; for example, the gray values are processed and compared against a preset data model, and the measurement item is identified from the analysis result.
  • The measurement module 33 is configured to measure the measurement item parameters of the specified slice image frame based on the identified measurement item.
  • In another embodiment, the apparatus may further include a measurement mode acquisition module 34 for acquiring the measurement mode selected by the user; when identifying the measurement item, the identification module 32 identifies the measurement item corresponding to the specified slice image frame according to the measurement mode the user has entered. A sketch of this module decomposition is given below.
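  • For illustration only, the following sketch expresses the module decomposition of FIG. 2 as a small Python class. The class and method names (AutoItemIdentificationApparatus, acquire_gray_values, and so on) and the injected identifier and measurer objects are assumptions made for this example; they do not come from the patent.

```python
import numpy as np

class AutoItemIdentificationApparatus:
    """Illustrative skeleton of modules 31-34 in FIG. 2 (names are assumed, not from the patent)."""

    def __init__(self, identifier, measurer):
        self.identifier = identifier      # identification module 32
        self.measurer = measurer          # measurement module 33
        self.measurement_mode = None      # filled in by module 34 when available

    def acquire_gray_values(self, frame):
        # Image acquisition module 31: the specified slice image frame is taken
        # as its per-pixel gray values (echo-intensity map).
        return np.asarray(frame, dtype=np.float64)

    def set_measurement_mode(self, mode):
        # Measurement mode acquisition module 34, e.g. "obstetric" or "abdominal".
        self.measurement_mode = mode

    def process(self, frame):
        gray = self.acquire_gray_values(frame)
        items = self.identifier.identify(gray, self.measurement_mode)
        return {item: self.measurer.measure(gray, item) for item in items}
```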
  • The method flow for automatically identifying a measurement item based on the above ultrasound imaging apparatus is shown in FIG. 3 and includes the following steps:
  • Step 400: detect a measurement instruction input by the user. A measurement instruction is generated when the user presses the measurement key or selects a measurement option. When a measurement instruction is detected, the following steps are performed.
  • Step 410: image acquisition. The image processor sends the processed ultrasound image to the display; the user specifies a slice image by observing the displayed image, and the gray value of each pixel in the specified slice image frame is obtained from the stored image data according to the slice image specified by the user. The specified slice image is usually the currently displayed image. The gray value of a pixel corresponds to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal: bone produces a strong echo and therefore large gray values in the image, whereas soft tissue produces a weak echo and therefore small gray values.
  • Step 420: identify the measurement item. In a given measurement mode (the tissue site being measured) there are often many measurement items to be measured: for example, in the abdominal measurement mode the commonly used items include the sizes of the liver, gallbladder, spleen and kidney, while in the obstetric measurement mode the commonly used items include head circumference (HC), biparietal diameter (BPD), abdominal circumference (AC) and femur length (FL). Other measurement modes, with their own related measurement items, exist in other embodiments.
  • The measurement item is determined by the slice image of the measurement target. The slice images corresponding to different measurement items always differ to some extent, which makes automatic identification feasible. One feasible scheme is to preset in the system a slice image data model for each measurement item and to compare the gray values of the pixels in the specified slice image frame against the preset data models in order to identify the measurement item.
  • The preset data model may be a feature that distinguishes one slice image from other, different slice images, such as a training sample model, or it may be physical characteristics of the slice corresponding to each measurement item, such as its shape, brightness range or size.
  • For an obstetric examination, the head circumference and abdominal circumference are elliptical targets, while the biparietal diameter and femur length are line-segment targets; the head circumference and biparietal diameter can be measured on the same image slice. These four measurement items therefore involve three measurement slices: the head circumference slice (HC and BPD share the same slice), the abdominal slice and the femur slice. If the slice image specified by the doctor is the abdominal slice, the measurement item is the abdominal circumference (AC); if it is the femur slice, the measurement item is the femur length (FL); if it is the head circumference slice, the measurement items are the head circumference (HC) and/or the biparietal diameter (BPD). On this basis, in this embodiment the image processor analyzes the gray values of the pixels and recognizes which slice the currently specified slice image is, thereby obtaining the measurement items corresponding to the specified slice image frame.
  • Step 430: measure parameters. Based on the measurement item identified in step 420, the measurement item parameters of the specified slice image frame are measured. In practice the parameters may be measured manually, semi-automatically or automatically. When a slice image corresponds to two or more measurement items, for example the head circumference slice with its items head circumference (HC) and biparietal diameter (BPD), the identified items can be measured one by one during the measurement process.
  • In another embodiment, before the measurement item is identified in step 420, the method may further include step 440: acquire the measurement mode used when examining the living tissue, i.e. the measurement mode selected by the user. Obtaining the user-selected measurement mode narrows the recognition range of the measurement item, which improves recognition efficiency and may further improve recognition accuracy.
  • This embodiment automatically identifies the measurement item from the image content, eliminating the step in which the doctor moves the trackball to select a measurement item in the menu and thereby improving the efficiency of the doctor's measurements.
  • The key point of this application is the added technique of automatically identifying the measurement item; there are various possible schemes for identifying the measurement item corresponding to a slice image. The automatic identification of measurement items is further described below through specific embodiments.
  • Embodiment 1: there are differences between the slices of different measurement items. Based on this fact, this embodiment extracts features that can distinguish the slices of different measurement items and then identifies the measurement item from these features.
  • In one specific embodiment, the slice features can be extracted and classified by a machine learning method.
  • FIG. 4 is a structural diagram of an identification module 32 provided by this embodiment.
  • The specific structure includes a feature generation unit 3211, a comparison unit 3212 and a lookup unit 3213.
  • The feature generation unit 3211 is configured to generate the feature of the specified slice image frame from the gray values of the pixels in the specified slice image frame; the comparison unit 3212 is configured to compare the feature of the specified slice image frame with the feature of each training sample in a preset training sample model; and the lookup unit 3213 is configured to find the training sample whose feature is closest to the feature of the specified slice image frame and to take the measurement item corresponding to that training sample as the measurement item corresponding to the specified slice image frame.
  • Based on the above identification module structure, this embodiment also discloses a specific machine learning recognition method.
  • A machine learning method usually learns the characteristics of each class of samples from training samples (a set of samples whose measurement items are already known) and then compares the features of the slice sample under test with those of the training samples to determine which class of measurement item the slice under test belongs to.
  • In the prior art, common machine learning methods include Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Kernel Principal Component Analysis (KPCA), Locality Preserving Projections (LPP), Support Vector Machines (SVM), Artificial Neural Networks (ANNs), and so on.
  • Usually the dimensionality of the image data acquired by the image acquisition module 31 is very high: a W*H image can be regarded as a W*H-dimensional vector, and there is often strong correlation between the dimensions of such a high-dimensional vector; in other words, the representation of high-dimensional data is highly redundant. A common remedy is to project the high-dimensional data into a low-dimensional space, eliminating the redundancy between dimensions, and PCA is one such method: in essence, the algorithm finds the projection that best represents the original high-dimensional data in the least-mean-square sense.
  • This embodiment is further described below using PCA as an example. Those skilled in the art will understand that, in other embodiments, the technical solution of this embodiment can also be implemented, following the same idea and without inventive effort, with other existing machine learning methods such as LDA, KPCA, LPP, SVM or ANNs.
  • FIG. 5 is a flowchart of the identification method based on the above identification module; the specific steps are as follows:
  • Step 510: feature generation. The feature of the specified slice image frame is generated from the gray values of the pixels in the specified slice image frame.
  • The idea of this embodiment is to compare the feature of the specified slice image frame with the features of preset training samples in order to obtain the measurement item category of the specified slice image frame. Therefore, the features of the training samples and the feature of the specified slice image frame must first be obtained.
  • In one specific embodiment, the features of the training samples and of the specified slice image frame may be eigenvalues, eigenvectors, or a combination of eigenvalues and eigenvectors.
  • A preferred approach is to use the projection coefficients of a training sample's feature vector onto the training sample mean as the feature of that training sample, and the projection coefficients of the specified slice image frame's feature vector onto the training sample mean as the feature of the specified slice image frame. The advantage of this approach is that the high-dimensional data is projected into a low-dimensional space, eliminating redundancy between dimensions and thereby reducing the amount of computation and improving efficiency.
  • For the training sample library, suppose there are N training samples, each training sample image having resolution W×H. Each image is unfolded into an M-dimensional column vector with M = W×H, so the images of the training library can be represented as an M×N matrix, written [I_1, ..., I_N], where I_i is the i-th training sample vector.
  • First compute the mean of the samples (hereinafter the average sample): m = (1/N) Σ_{i=1..N} I_i, where m is the average sample. Subtracting the average sample from each sample in the training library yields new zero-mean training samples L = [I_1 − m, ..., I_N − m].
  • The covariance matrix of the new samples is C = L L^T, where L^T is the transpose of L.
  • After the covariance matrix C is obtained, its eigenvectors are needed; but because the dimension of C is too large, computing the eigenvectors of C directly is not feasible, so the eigenvectors of the small matrix R = L^T L can be computed first.
  • Let V be the eigenvector matrix of the small matrix R and Λ its eigenvalue matrix, so that (L^T L) V = V Λ. Multiplying both sides by L gives (L L^T)(L V) = (L V) Λ, and therefore the orthogonalized eigenvectors of C = L L^T are E = L V Λ^(−1/2).
  • The eigenvalue matrix Λ is a diagonal matrix whose eigenvalues are arranged from largest to smallest, i.e. Λ_11 ≥ Λ_22 ≥ ..., where Λ_jj denotes the element in row j, column j of Λ.
  • In fact, a large proportion of the eigenvalues are very small, or even zero, so only the larger eigenvalues and their corresponding eigenvectors need to be retained, e.g. only the first n, in which case only the first n columns of V are kept, i.e. the retained eigenvector matrix V has dimension N×n. There are many ways to choose n: it can be fixed at some number, or chosen as an n satisfying Σ_{i=1..n} Λ_ii ≥ P · Σ_i Λ_ii (the sum on the right taken over all eigenvalues), where P is a percentage; for example P = 95% means that 95% of the information in the original data is retained.
  • After the above computation, the projection of each training sample onto the average sample (i.e. the feature, or principal components, of each training sample) is obtained as F_i = E^T (I_i − m) (1), where E^T is the transpose of E and F_i is the feature of I_i. This projection reduces an M×1 sample to n×1, eliminating the correlation between dimensions of the high-dimensional data, and in the least-mean-square sense the n×1 data best represents the original data.
  • Those skilled in the art will understand that the above computations over the sample library can be performed offline, with the results (e.g. the matrix E and the features of the training samples) stored inside the system.
  • For the specified slice image frame, the gray values of all pixels in the frame are obtained and likewise unfolded into an M-dimensional vector I_test; the feature of the specified slice image frame is then computed, in analogy with formula (1), as
  • w = E^T (I_test − m) (2)
  • where w is the projection coefficient of the slice image frame's feature vector onto the training sample mean (i.e. the feature of the slice image frame), I_test is the feature vector of the slice image frame, m is the mean of the training samples, E is the orthogonalized eigenvector matrix of the training samples, and E^T is the transpose of E.
  • Using formula (2), the feature of the specified slice image frame can be computed. A minimal numerical sketch of this training and projection procedure is given below.
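  • As an illustration of formulas (1) and (2), here is a minimal NumPy sketch of the offline training computation (average sample m, projection matrix E, training features F_i) and of the projection of a specified slice image frame. The function and variable names (train_pca, frame_feature, train_images, energy) are assumptions made for this example, not names used by the patent.

```python
import numpy as np

def train_pca(train_images, energy=0.95):
    """train_images: list of 2-D gray-value arrays, all W x H, whose measurement items are known."""
    X = np.stack([img.ravel().astype(np.float64) for img in train_images], axis=1)  # M x N matrix
    m = X.mean(axis=1, keepdims=True)                 # average sample, M x 1
    L = X - m                                         # zero-mean training samples
    R = L.T @ L                                       # small N x N matrix
    vals, V = np.linalg.eigh(R)                       # eigen-decomposition of R
    order = np.argsort(vals)[::-1]                    # sort eigenvalues in descending order
    vals, V = vals[order], V[:, order]
    keep = np.searchsorted(np.cumsum(vals) / vals.sum(), energy) + 1   # retain ~95% of the energy
    vals, V = vals[:keep], V[:, :keep]
    E = L @ V / np.sqrt(vals)                         # orthogonalized eigenvectors of L L^T
    F = E.T @ L                                       # training features, column i is formula (1)
    return m, E, F

def frame_feature(frame, m, E):
    """Formula (2): w = E^T (I_test - m) for the specified slice image frame."""
    I_test = frame.ravel().astype(np.float64)[:, None]
    return E.T @ (I_test - m)
```

  • In this form the training side would be run once, offline, and only m, E, F and the item labels stored on the device, matching the offline computation noted above.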
  • Step 520: feature comparison. The feature of the specified slice image frame is compared with the feature of each training sample in the preset training sample model.
  • The feature of the specified slice image frame computed with formula (2) in step 510 is compared with the feature of each training sample in the preset training sample model given by formula (1):
  • d_i = || w − F_i ||
  • where d_i is the norm of the difference between the feature of the specified slice image frame and the feature F_i of the i-th training sample, 1 ≤ i ≤ N. In one specific embodiment, the feature of the specified slice image frame is compared with every sample in the sample training library.
  • Step 530: measurement item lookup. The training sample whose feature is closest to the feature of the specified slice image frame is found, and the measurement item corresponding to that training sample is taken as the measurement item corresponding to the specified slice image frame.
  • After the feature of the specified slice image frame has been compared with the features of the training samples in the preset training sample model, the measurement item corresponding to the specified slice image frame is found by ind = index( min_{1≤i≤N} d_i ).
  • The function index denotes the sequence number i at which d_i takes its minimum, meaning the test sample is closest to the i-th training sample, so the measurement item to which the test sample belongs is the same as the measurement item to which the i-th training sample belongs.
  • The measurement item category of each training sample is known, which completes the lookup of the measurement item corresponding to the specified slice image frame. A small sketch of this nearest-sample lookup is shown below.
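  • Continuing the sketch above, steps 520 and 530 amount to a nearest-neighbour search in the projected feature space. The function below is one possible reading of those steps; its argument names are assumptions made for this example.

```python
import numpy as np

def identify_measurement_item(frame, m, E, F, train_items):
    """F: n x N matrix of training features from formula (1); train_items: list of N item labels."""
    I_test = frame.ravel().astype(np.float64)[:, None]   # unfold the frame into an M x 1 vector
    w = E.T @ (I_test - m)                                # formula (2)
    d = np.linalg.norm(F - w, axis=0)                     # d_i = ||w - F_i|| for each training sample
    ind = int(np.argmin(d))                               # index of the closest training sample
    return train_items[ind]                               # its measurement item is the result
```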
  • Further, in another specific embodiment, different measurement modes correspond to different sample models and have different sample libraries. Therefore, by acquiring the measurement mode selected by the user and using it to select the sample library before the feature comparison of step 520, the number of samples involved in the comparison can be reduced, improving recognition efficiency and possibly further improving recognition accuracy.
  • Embodiment 2: because a machine learning method must learn the characteristics of each class of samples before comparison, as many training samples as possible should be collected, covering as many variations in position, size and shape as possible. When sufficient training samples are not available, an image processing method can be used for the analysis instead.
  • FIG. 6 is a structural diagram of an identification module 32 provided by this embodiment.
  • The specific structure includes an extraction unit 3221, an identification unit 3222 and a determination unit 3223.
  • The extraction unit 3221 is configured to extract the highlighted (bright) portion of the specified slice image frame from the gray values of its pixels.
  • The identification unit 3222 is configured to identify the highlighted portion according to the measurement mode and determine the slice type of the specified slice image frame.
  • The determination unit 3223 is configured to determine the measurement item corresponding to the specified slice image frame from the identified slice type and its corresponding measurement items.
  • Based on the above identification module, this embodiment also discloses a measurement item identification method that uses image processing to extract features of the specified slice image frame, such as gray level and shape, and then analyzes these features together to reach a decision.
  • Because the image characteristics differ between measurement modes, the image processing method may need to be designed differently for different measurement modes.
  • Taking the obstetric measurement mode as an example, the most common measurement items in an obstetric examination are head circumference (HC), biparietal diameter (BPD), abdominal circumference (AC) and femur length (FL). The head circumference and abdominal circumference are elliptical targets, the biparietal diameter and femur length are line-segment targets, and the head circumference and biparietal diameter can be measured on the same image slice; these four measurement items therefore involve three measurement slices: the head circumference slice (HC and BPD share the same slice), the abdominal slice and the femur slice.
  • The head circumference slice contains the fetal skull. On ultrasound the skull appears bright, and the near-field and far-field skull form an ellipse; inside the ellipse is the fetal brain structure, whose gray level is clearly lower than that of the skull.
  • The femur in the femur slice also appears bright, but the bone as a whole is relatively straight, with only a small amount of curvature.
  • The abdominal slice contains an elliptical target with a large internal gradient, but its boundary does not appear bright, and the gray level inside the abdominal circumference is close to the gray level at the boundary. Therefore, when an image processing method is used to identify the measurement item, the measurement mode selected by the user should be obtained first; once the measurement mode is known, the image processing method can be used to identify the measurement item.
  • FIG. 7 is a flowchart of the identification process of this embodiment. The technical solution of this embodiment is further described below taking the obstetric measurement mode as an example.
  • The specific steps of this embodiment include:
  • Step 600: obtain the measurement mode.
  • Step 610: extract the highlighted portion. The highlighted portion of the specified slice image frame is extracted from the gray values of its pixels.
  • In one specific embodiment, after the gray data of the specified slice image has been read, the image may first be pre-processed and the highlighted portion then extracted from the pre-processed image.
  • Image pre-processing is mainly used to suppress image noise, increase the continuity of boundaries and emphasize the highlighted portion; many algorithms of this kind exist, for example anisotropic smoothing.
  • Of course, in other embodiments the image need not be pre-processed, and the highlighted portion can be extracted directly from the original gray data.
  • In one specific embodiment, a clustering segmentation algorithm may be used to extract the highlighted portion of the specified slice image frame: the clustering algorithm divides the gray values of the pixels in the specified slice image frame into several classes; the gray values of the largest (brightest) one or more classes are kept unchanged while the gray values of the other classes are set to 0, yielding a feature image.
  • For example, the gray values of the specified slice image are clustered into N classes (e.g. N = 3) and ordered from large to small; the first M classes (e.g. M ≥ 1 and M < N) are taken as the brightest class (i.e. the highlighted portion), the points belonging to the remaining N−M classes are set to 0, and the gray values of the other points are kept unchanged to obtain the feature image.
  • Connected regions are then delineated in the feature image, and the one or more connected regions with the greatest brightness are found, giving the highlighted portion of the specified slice image frame.
  • In one specific embodiment, the connected regions can be ranked by brightness and the first X taken as the brightest connected regions, where X is a preset system parameter for which a value of 1 to 10 is generally reasonable; in other embodiments other values may be chosen according to actual needs.
  • In another embodiment, the specified slice image may instead be convolved with a preset M×N operator, and the image obtained by the convolution taken as the feature image; the values of the elements of the operator can be determined according to actual needs.
  • Further, the feature image often contains a great deal of noise; morphological algorithms (such as morphological erosion, opening, or removal of small-area regions) can remove the noise and make the image boundaries more continuous. A minimal sketch of the clustering-based extraction of the highlighted portion is given below.
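  • As a rough sketch of step 610, the function below clusters the gray values, keeps the brightest class, applies a morphological opening, and returns the X brightest connected regions. The patent does not prescribe a particular clustering algorithm; k-means, the scikit-learn and SciPy helpers, and the parameter names (n_classes, keep_brightest, top_x) are assumptions made for this example.

```python
import numpy as np
from scipy import ndimage
from sklearn.cluster import KMeans

def extract_highlight(frame, n_classes=3, keep_brightest=1, top_x=3):
    """Return a mask of the top_x brightest connected regions of the slice image frame."""
    gray = np.asarray(frame, dtype=np.float64)
    # Cluster the gray values into n_classes classes (1-D k-means on intensities).
    labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(gray.reshape(-1, 1))
    labels = labels.reshape(gray.shape)
    # Rank the classes by mean brightness and keep the brightest one(s), zeroing the rest.
    order = np.argsort([gray[labels == k].mean() for k in range(n_classes)])[::-1]
    feature = np.where(np.isin(labels, order[:keep_brightest]), gray, 0.0)
    # Morphological clean-up to suppress noise and make boundaries more continuous.
    mask = ndimage.binary_opening(feature > 0, iterations=1)
    # Label connected regions and keep the X regions with the highest mean brightness.
    labelled, n = ndimage.label(mask)
    if n == 0:
        return np.zeros_like(mask)
    means = ndimage.mean(gray, labelled, index=np.arange(1, n + 1))
    keep = np.argsort(means)[::-1][:top_x] + 1        # region labels start at 1
    return np.isin(labelled, keep)
```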
  • Step 620: identify the slice type. The highlighted portion is identified according to the measurement mode to determine the slice type of the specified slice image frame.
  • In the obstetric measurement mode, both the head circumference and the abdominal circumference measurement items are approximately elliptical structures.
  • The main differences between the head circumference slice (shown in FIG. 8a) and the abdominal slice (shown in FIG. 8b) are:
  • the head circumference slice contains the skull 81, which appears as a bright (highly echogenic) region on ultrasound, while the interior of the head circumference consists of other brain structures that appear much darker, so the contrast with the brightness of the skull 81 is obvious;
  • the abdominal slice contains the fetal abdomen 82, whose boundary is clearly less bright than the head circumference and whose internal echo is relatively uniform, differing little from the boundary.
  • Both the head circumference slice and the femur slice (shown in FIG. 8c) contain bright bone; the main differences are:
  • the head circumference slice contains two symmetrical, arc-shaped bones 81;
  • the femur slice contains a relatively straight bone 83, and under normal circumstances there is only one bone. Based on these physical facts, in this step the brightness and shape of the connected regions found earlier are analyzed and compared to determine which type of slice the slice image is.
  • In one specific embodiment, a brightness threshold for the connected regions can be set in a given measurement mode (e.g. obstetrics) to determine whether the specified slice image contains bone. For example, when the average brightness of a connected region is greater than a first threshold, the specified slice image contains bone; when the average brightness of the connected region is less than a second threshold, the specified slice image does not contain bone, the first threshold being greater than the second threshold. In other embodiments it is also possible to set only a single brightness threshold for the connected regions.
  • For the analysis of bone shape, a simple method is to use curvature to determine whether, and how much, the bone is bent. For example, a curvature threshold is set in the given measurement mode (e.g. obstetrics): when the curvature of the connected region is greater than the threshold, the specified slice image contains a curved bone; when the curvature of the connected region is less than the threshold, the specified slice image contains a straight bone.
  • One possible definition of curvature is as follows: take one point at each end of the connected region and one point in the middle, and compute the angle formed by the lines joining the two end points to the middle point; this angle is the curvature. Of course, other definitions may be adopted in other embodiments, as long as they reflect the degree of bending of the connected region.
  • Taking the obstetric measurement mode as an example: if the average brightness of the X connected regions is greater than a threshold, the specified slice image is considered to contain bone, and the slice may be a head circumference or femur slice; the curvature of the connected region is then examined, and if it is greater than a threshold the bone is considered curved and the specified slice is judged to be a head circumference slice, whereas if it is less than the threshold the bone is considered straight and the specified slice is judged to be a femur slice. If the average brightness of the X connected regions is less than the threshold, the slice contains no bright bone and is judged to be an abdominal slice. A sketch of this decision rule is given below.
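  • The decision rule of step 620 can be summarized in a few lines. In the sketch below, the threshold defaults (which assume 8-bit gray values) and the angle-based curvature proxy are illustrative assumptions; the patent only requires that the measure reflect the degree of bending of the connected region.

```python
import numpy as np

def region_curvature(region_mask):
    """Curvature proxy: 180 degrees minus the angle at a middle point between the two end points.
    A straight region gives a value near 0; an arc gives a larger value."""
    ys, xs = np.nonzero(region_mask)
    pts = np.column_stack([xs, ys])[np.argsort(xs)]   # order the region points left to right
    p0, pm, p1 = pts[0], pts[len(pts) // 2], pts[-1]  # two end points and one middle point
    v0, v1 = p0 - pm, p1 - pm
    cosang = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1) + 1e-9)
    return 180.0 - np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def classify_obstetric_slice(gray, region_mask, bright_thresh=180.0, bend_thresh=30.0):
    """Return 'head circumference', 'femur' or 'abdominal' for an obstetric-mode slice."""
    if not region_mask.any() or gray[region_mask].mean() < bright_thresh:
        return "abdominal"                            # no bright bone -> abdominal slice
    # Bright bone present: a large bend suggests the arc-shaped skull, a small bend a straight femur.
    return "head circumference" if region_curvature(region_mask) > bend_thresh else "femur"
```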
  • Step 630: determine the measurement item. The measurement item corresponding to the specified slice image frame is determined from the identified slice type and its corresponding measurement items.
  • Those skilled in the art will understand that all or part of the steps of the various methods in the above embodiments can be implemented by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium may include a read-only memory, a random access memory, a magnetic disk, an optical disc, or the like.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Physiology (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Pregnancy & Childbirth (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Quality & Reliability (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Dentistry (AREA)
  • Rheumatology (AREA)
  • Geometry (AREA)
  • Gynecology & Obstetrics (AREA)
  • Data Mining & Analysis (AREA)
  • Epidemiology (AREA)
  • Databases & Information Systems (AREA)
  • Primary Health Care (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

A method and apparatus for automatically identifying a measurement item: an image acquisition module (31) acquires, in a specified slice image frame, the gray value of each pixel corresponding to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal; an identification module (32) then identifies, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame; and a measurement module (33) measures the measurement item parameters of the specified slice image frame based on the identified measurement item. Because the measurement item of the specified slice can be identified automatically from the content of the specified slice image, the step in which the user moves the trackball to select the measurement item is omitted from the ultrasound measurement process, which improves measurement efficiency.

Description

METHOD AND APPARATUS FOR AUTOMATICALLY IDENTIFYING A MEASUREMENT ITEM, AND ULTRASOUND IMAGING APPARATUS

TECHNICAL FIELD
The present invention relates to a medical device, and more particularly to an ultrasound imaging apparatus and a method and apparatus thereof for automatically identifying a measurement item.

BACKGROUND
Ultrasound instruments are generally used by doctors to observe the internal tissue structure of the human body: the doctor places the ultrasound probe on the skin surface over the body part of interest and obtains an ultrasound slice image of that part. Because it is safe, convenient, non-invasive and inexpensive, ultrasound has become a principal examination method in clinical diagnosis.

In ultrasound measurement, to obtain indices such as the size of a target of interest, the doctor often needs to perform many measurements. In a given measurement mode (the tissue site being measured) there are often many measurement items to be measured, and measurement is a continuous interaction between user and machine in which the user repeatedly selects a measurement item and then moves the trackball to measure, which frequently takes a great deal of time. For example, in the abdominal mode the commonly used measurement items include the sizes of the liver, gallbladder, spleen and kidney, and in the obstetric mode the head circumference (HC), biparietal diameter (BPD), abdominal circumference (AC) and femur length (FL) must be measured in every examination. In a typical examination, after the doctor has obtained the relevant standard slice, the doctor first presses the measurement key; once the system displays the measurement menu, the doctor moves the trackball, selects the corresponding measurement item in the menu, and then performs that measurement. Taking an obstetric examination as an example, after obtaining the relevant standard slice the doctor presses the measurement key, the system displays the measurement menu and enters the measurement state, and the doctor then moves the trackball and selects the corresponding item in the menu. Taking the selection of the head circumference item as an example, the doctor first moves the cursor to the measurement menu by rotating the trackball and selects the head circumference item; after selecting it, the doctor rotates the trackball to move the cursor to one side of the skull halo in the slice image, presses the confirm button to place the first point, moves the cursor to the other side and presses the confirm button to place the second point to obtain one axis of the ellipse, and then moves the cursor to adjust the length of the other axis; the two previously placed points may also need to be adjusted in between, until the ellipse fits the position of the fetal skull. A single measurement often requires placing many points before the ellipse fits the measured structure; for a line-segment target, at least two points must also be clicked. According to some literature, doctors spend 20% to 30% of their time on measurement.

In view of this, some patents and publications have already proposed automatic measurement techniques intended to save the doctor's measurement time. In these techniques, however, the user must still manually select the corresponding measurement item in a menu according to the slice that has been obtained before automatic or semi-automatic measurement is performed, which directly limits the degree of automation of the automatic measurement, and the doctor's handling of the measurement items also adds to the measurement time; moreover, during an examination doctors do not like being distracted by constantly pressing buttons and selecting menus.
SUMMARY OF THE INVENTION
According to a first aspect of the present application, an embodiment provides a method of automatically identifying a measurement item, comprising:
an image acquisition step, which acquires the gray value of each pixel in a specified slice image frame, the gray value of a pixel corresponding to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal;
an identification step, which identifies, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame;
a measurement step, which measures the measurement item parameters of the specified slice image frame based on the identified measurement item.
According to a second aspect of the present application, an embodiment provides an apparatus for automatically identifying a measurement item, including:
an image acquisition module for acquiring the gray value of each pixel in the specified slice image frame, the gray value of a pixel corresponding to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal;
an identification module for identifying, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame;
a measurement module for measuring the measurement item parameters of the specified slice image frame based on the identified measurement item.
According to a third aspect of the present application, an embodiment provides an ultrasound imaging apparatus, including:
a probe for transmitting ultrasonic waves to the examined living tissue and receiving ultrasonic echoes; a signal processor for processing the ultrasonic echoes to generate ultrasound image data; and an image processor for processing the ultrasound image data and generating a slice image, the image processor including the above apparatus for automatically identifying a measurement item.
With the method/apparatus for automatically identifying a measurement item according to the present invention, the measurement item of a specified slice can be identified automatically from the content of the specified slice image, so the step in which the user selects the measurement item is omitted from the ultrasound measurement process, making measurement more convenient and automated.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a structural diagram of an ultrasound imaging apparatus according to an embodiment of the present application;
FIG. 2 is a structural diagram of an apparatus for automatically identifying a measurement item according to an embodiment of the present application;
FIG. 3 is a flowchart of a method for automatically identifying a measurement item in an embodiment of the present application;
FIG. 4 is a structural diagram of an identification module provided in Embodiment 1 of the present application;
FIG. 5 is a flowchart of an identification method provided in Embodiment 1 of the present application;
FIG. 6 is a structural diagram of an identification module provided in Embodiment 2 of the present application;
FIG. 7 is a flowchart of an identification method provided in Embodiment 2 of the present application;
FIG. 8 shows example slice diagrams for the measurement items in the obstetric measurement mode of Embodiment 2, in which FIG. 8a illustrates a head circumference slice, FIG. 8b illustrates an abdominal circumference slice, and FIG. 8c illustrates a femur slice.
DETAILED DESCRIPTION
Medical ultrasound imaging is generally used by doctors to observe the internal tissue structure of the human body: the doctor places the operating probe on the skin surface over the body part of interest and obtains an ultrasound slice image of the internal tissue of that part. Referring to FIG. 1, FIG. 1 shows the structure of an ultrasound imaging apparatus, which includes an ultrasonic generating circuit 11, a probe 1, a signal processor 2, an image processor 3 and a display 4, in which: the probe 1 is used to transmit ultrasonic waves to the scanned target tissue and to receive ultrasonic echoes. The ultrasonic generating circuit 11 generates waveform data and drives the array elements of the probe 1 through the transmitting channel 12 so that ultrasonic waves are transmitted into the examined tissue; the tissue reflects and absorbs the ultrasound to form ultrasonic echoes, which the probe 1 receives and outputs to the signal processor 2 through the receiving channel 13.

The signal processor 2 processes the ultrasonic echoes to generate ultrasound image data. The signal processor 2 first passes the ultrasonic echoes received by the receiving channel 13 through a beamforming stage to obtain a radio frequency (RF) signal, and then performs quadrature demodulation to obtain a quadrature-demodulated baseband signal. The processed ultrasound image data is output to the image processor 3.

The image processor 3 processes the ultrasound image data and generates a slice image, which is sent to the display 4 for display. The image processor 3 includes an apparatus for automatically identifying a measurement item, which processes the ultrasound image data output by the signal processor 2, identifies the measurement item corresponding to the slice image specified by the user, and further measures the parameters specified for that measurement item.

The display 4 displays the slice image generated by the image processor 3 together with the measurement parameters. The structure of the apparatus for automatically identifying a measurement item is shown in FIG. 2 and includes an image acquisition module 31, an identification module 32 and a measurement module 33.

The image acquisition module 31 is configured to acquire the gray value of each pixel in the specified slice image frame, the gray value of a pixel corresponding to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal.

The identification module 32 is configured to identify, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame; for example, the gray values are processed and compared against a preset data model, and the measurement item is identified from the analysis result.

The measurement module 33 is configured to measure the measurement item parameters of the specified slice image frame based on the identified measurement item.

In another embodiment, the apparatus for automatically identifying a measurement item may further include a measurement mode acquisition module 34 for acquiring the measurement mode selected by the user; when identifying the measurement item, the identification module 32 identifies the measurement item corresponding to the specified slice image frame according to the measurement mode the user has entered.

The method flow for automatically identifying a measurement item based on the above ultrasound imaging apparatus is shown in FIG. 3 and includes the following steps:

Step 400: detect a measurement instruction input by the user. A measurement instruction is generated when the user presses the measurement key or selects a measurement option. When a measurement instruction is detected, the following steps are performed.

Step 410: image acquisition. The image processor sends the processed ultrasound image to the display; the user specifies a slice image by observing the displayed image, and the gray value of each pixel in the specified slice image frame is obtained from the stored image data according to the slice image specified by the user. The specified slice image is usually the currently displayed image. The gray value of a pixel corresponds to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal: bone produces a strong echo and therefore large gray values in the image, whereas soft tissue produces a weak echo and therefore small gray values.

Step 420: identify the measurement item. In a given measurement mode (the tissue site being measured) there are often many measurement items to be measured: for example, in the abdominal measurement mode the commonly used items include the sizes of the liver, gallbladder, spleen and kidney, while in the obstetric measurement mode the commonly used items include head circumference (HC), biparietal diameter (BPD), abdominal circumference (AC), femur length (FL) and so on. Other measurement modes, with their own related measurement items, exist in other embodiments. The measurement item is determined by the slice image of the measurement target, and the slice images corresponding to different measurement items always differ to some extent, which makes automatic identification feasible. One feasible scheme is to preset in the system a slice image data model for each measurement item and to compare the gray values of the pixels in the specified slice image frame against the preset data models in order to identify the measurement item. The preset data model may be a feature that distinguishes one slice image from other, different slice images, such as a training sample model, or it may be physical characteristics of the slice corresponding to each measurement item, such as its shape, brightness range or size. When the slice image of a measurement target is determined, the measurement items to be performed on that target are also determined; conversely, when a measurement item is determined, the slice image of the measurement target corresponding to that item must be displayed. For example, for an obstetric examination the head circumference and abdominal circumference are elliptical targets, the biparietal diameter and femur length are line-segment targets, and the head circumference and biparietal diameter can be measured on the same image slice; these four measurement items therefore involve three measurement slices: the head circumference slice (HC and BPD share the same slice), the abdominal slice and the femur slice. If the slice image specified by the doctor is the abdominal slice, the measurement item is the abdominal circumference (AC); if it is the femur slice, the measurement item is the femur length (FL); if it is the head circumference slice, the measurement items are the head circumference (HC) and/or the biparietal diameter (BPD). On this basis, in this embodiment the image processor analyzes and processes the gray values of the pixels and recognizes which slice image the currently specified slice image is, thereby obtaining the measurement items corresponding to the specified slice image frame.

Step 430: measure parameters. Based on the measurement item identified in step 420, the measurement item parameters of the specified slice image frame are measured. In practice the parameters may be measured manually, semi-automatically or automatically. When a slice image corresponds to two or more measurement items, for example the head circumference slice with its items head circumference (HC) and biparietal diameter (BPD), the identified items can be measured one by one during the measurement process.

In another embodiment, before the measurement item is identified in step 420, the method may further include step 440: acquire the measurement mode used when examining the living tissue, i.e. the measurement mode selected by the user. Obtaining the user-selected measurement mode narrows the recognition range of the measurement item, which improves recognition efficiency and may further improve recognition accuracy.

This embodiment automatically identifies the measurement item from the image content, eliminating the step in which the doctor moves the trackball to select a measurement item in the menu and thereby improving the efficiency of the doctor's measurements.
The key point of this application is the added technique of automatically identifying the measurement item; there are various possible schemes for identifying the measurement item corresponding to a slice image. The automatic identification of measurement items of the present application is further described below through specific embodiments.

Embodiment 1:
There are differences between the slices of different measurement items. Based on this fact, this embodiment extracts features that can distinguish the slices of different measurement items and then identifies the measurement item from these features. In one specific embodiment, the slice features can be extracted and classified by a machine learning method. Referring to FIG. 4, which is a structural diagram of an identification module 32 provided by this embodiment, the specific structure includes a feature generation unit 3211, a comparison unit 3212 and a lookup unit 3213. The feature generation unit 3211 is configured to generate the feature of the specified slice image frame from the gray values of the pixels in the specified slice image frame; the comparison unit 3212 is configured to compare the feature of the specified slice image frame with the feature of each training sample in a preset training sample model; the lookup unit 3213 is configured to find the training sample whose feature is closest to the feature of the specified slice image frame and to take the measurement item corresponding to that training sample as the measurement item corresponding to the specified slice image frame.

Based on the above identification module structure, this embodiment also discloses a specific machine learning recognition method. A machine learning method usually learns the characteristics of each class of samples from the information in training samples (a set of samples whose measurement items are already known) and then compares the features of the slice sample under test with those of the training samples to determine which class of measurement item the slice under test belongs to. In the prior art, common machine learning methods include Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Kernel Principal Component Analysis (KPCA), Locality Preserving Projections (LPP), Support Vector Machines (SVM), Artificial Neural Networks (ANNs), and so on.

Usually the dimensionality of the image data acquired by the image acquisition module 31 is very high: a W*H image can be regarded as a W*H-dimensional vector, and there is often strong correlation between the dimensions of this high-dimensional vector; in other words, the representation of high-dimensional data is highly redundant. A common approach is to project the high-dimensional data into a low-dimensional space to eliminate the redundancy between dimensions, and PCA is one such method: in essence, the algorithm finds the projection that best represents the original high-dimensional data in the least-mean-square sense. This embodiment is further described below using PCA as an example; those of ordinary skill in the art will understand that, in other embodiments, the technical solution of this embodiment can also be implemented, following the idea of this embodiment and without inventive effort, with other existing machine learning methods such as LDA, KPCA, LPP, SVM or ANNs.

Referring to FIG. 5, which is a flowchart of the identification method based on the above identification module, the specific steps include:

Step 510: feature generation. The feature of the specified slice image frame is generated from the gray values of the pixels in the specified slice image frame.

The idea of this embodiment is to compare the feature of the specified slice image frame with the features of the preset training samples so as to obtain the measurement item category of the specified slice image frame. Therefore, the features of the training samples and the feature of the specified slice image frame must first be obtained. In one specific embodiment, the features of the training samples and of the specified slice image frame may be eigenvalues, eigenvectors, or a combination of both. A preferred approach is to use the projection coefficients of a training sample's feature vector onto the training sample mean as the feature of that training sample, and the projection coefficients of the specified slice image frame's feature vector onto the training sample mean as the feature of the specified slice image frame; the advantage of this approach is that the high-dimensional data is projected into a low-dimensional space, eliminating redundancy between dimensions and thereby reducing the amount of computation and improving efficiency.

For the training sample library, suppose there are N training samples in the library and that each training sample image has resolution W×H. Each picture is unfolded into an M-dimensional long vector, with M = W×H, so the images of the training library can be represented as an M×N matrix, written [I_1, ..., I_N], where I_i is a training sample vector.

First compute the mean of the samples (hereinafter the average sample):

m = (1/N) Σ_{i=1..N} I_i

where m is the average sample. Subtracting the average sample from the samples in the training library gives new training samples with zero mean:

L = [I_1 − m, ..., I_N − m]

The covariance matrix of the new samples is then:

C = L L^T

where L^T is the transpose of the matrix L.

After the covariance matrix C of the new samples is obtained, the eigenvalues of C should be solved for; but because the dimension of C is too large, directly computing the eigenvectors of C is not feasible, and the eigenvectors of the small matrix R = L^T L can be computed first.

Let V be the eigenvector matrix of the small matrix R and Λ the eigenvalue matrix; then

(L^T L) V = V Λ

Multiplying both sides of the equation by L gives

(L L^T)(L V) = (L V) Λ

and therefore the orthogonalized eigenvectors of C = L L^T are

E = L V Λ^(−1/2)

where the eigenvalue matrix Λ is a diagonal matrix whose eigenvalues are arranged from largest to smallest, i.e. Λ_11 ≥ Λ_22 ≥ ..., and Λ_jj denotes the element in row j, column j of Λ.

In fact, a large proportion of the eigenvalues are very small, or even zero; therefore only the larger eigenvalues and their corresponding eigenvectors need to be retained, e.g. only the first n, in which case only the first n columns of the eigenvector matrix V are selected, i.e. the retained V has dimension N×n. There are many ways to choose n: it can be fixed at some number, or n can be chosen to satisfy

Σ_{i=1..n} Λ_ii ≥ P · Σ_i Λ_ii

(the sum on the right taken over all eigenvalues), where P is a percentage; for example P = 95% means that 95% of the information of the original data is retained.

After the above computation, the projection of each training sample onto the average sample (i.e. the feature, or principal components, of each training sample) can be obtained as:

F_i = E^T (I_i − m)    (1)

where E^T is the transpose of the matrix E and F_i is the feature of I_i. This projection reduces an M×1 sample to n×1, eliminating the correlation between the dimensions of the high-dimensional data, and in the least-mean-square sense this n×1 data best represents the original data.

Those of ordinary skill in the art will understand that the above computations over the sample library can be performed offline and the results (e.g. the matrix E and the features of the training samples) stored inside the system.

For the specified slice image frame, the gray values of all pixels in the frame can be obtained and likewise unfolded into an M-dimensional vector I_test; the feature of the specified slice image frame is computed in accordance with formula (1), giving:

w = E^T (I_test − m)    (2)

where w is the projection coefficient of the slice image frame's feature vector onto the training sample mean (i.e. the feature of the slice image frame), I_test is the feature vector of the slice image frame, m is the mean of the training samples, E is the orthogonalized eigenvector matrix of the training samples, and E^T is the transpose of E.

Using formula (2), the feature of the specified slice image frame can be computed.

Step 520: feature comparison. The feature of the specified slice image frame is compared with the feature of each training sample in the preset training sample model.

The feature of the specified slice image frame computed with formula (2) in step 510 is compared with the feature of each training sample in the preset training sample model of formula (1):

d_i = || w − F_i ||

where d_i is the norm of the difference between the feature of the specified slice image frame and the feature F_i of the i-th training sample, 1 ≤ i ≤ N. In one specific embodiment, the feature of the specified slice image frame should be compared with every sample in the sample training library.

Step 530: measurement item lookup. The training sample whose feature is closest to the feature of the specified slice image frame is found, and the measurement item corresponding to that training sample is taken as the measurement item corresponding to the specified slice image frame.

After the feature of the specified slice image frame has been compared with the features of the training samples in the preset training sample model, the measurement item corresponding to the specified slice image frame is looked up by:

ind = index( min_{1≤i≤N} d_i )

where the function index denotes the sequence number i at which d_i takes its minimum, meaning the test sample is closest to the i-th sample; the measurement item to which the test sample belongs is therefore the same as the measurement item to which the i-th training sample belongs, and since the measurement item category of each training sample is known, the lookup of the measurement item corresponding to the specified slice image frame is complete.

Further, in another specific embodiment, different measurement modes correspond to different sample models and have different sample libraries; therefore, selecting the sample library by acquiring the user-selected measurement mode before the feature comparison of step 520 reduces the number of samples involved in the comparison, improving recognition efficiency and possibly further improving recognition accuracy.
Embodiment 2:
Because a machine learning method must learn the characteristics of each class of samples before comparison, as many training samples as possible should be collected, and the training samples should cover as many variations of position, size, shape and so on as possible. When sufficient training samples are not available, an image processing method can be used for the analysis and decision instead. This embodiment uses an image processing method to extract image features such as the gray level and shape of each measurement slice to determine the measurement item. Referring to FIG. 6, which is a structural diagram of an identification module 32 provided by this embodiment, the specific structure includes an extraction unit 3221, an identification unit 3222 and a determination unit 3223.

The extraction unit 3221 is configured to extract the highlighted portion of the specified slice image frame from the gray values of its pixels. The identification unit 3222 is configured to identify the highlighted portion according to the measurement mode and determine the slice type of the specified slice image frame. The determination unit 3223 is configured to determine the measurement item corresponding to the specified slice image frame from the identified slice type and its corresponding measurement items.

Based on the above identification module, this embodiment also discloses a measurement item identification method that uses image processing to extract features of the specified slice image frame, such as gray level and shape, and then analyzes these features together to reach a decision.

Because the image characteristics differ between measurement modes, the image processing method may need to be designed differently for different measurement modes. Taking the obstetric measurement mode as an example, the most common measurement items in an obstetric examination are head circumference (HC), biparietal diameter (BPD), abdominal circumference (AC) and femur length (FL); the head circumference and abdominal circumference are elliptical targets, the biparietal diameter and femur length are line-segment targets, and the head circumference and biparietal diameter can be measured on the same image slice, so these four measurement items involve three measurement slices: the head circumference slice (HC and BPD share the same slice), the abdominal slice and the femur slice. The head circumference slice contains the fetal skull, which appears bright on ultrasound, and the near-field and far-field skull form an ellipse; inside the ellipse is the fetal brain structure, whose gray level is clearly lower than that of the skull. The femur in the femur slice also appears bright, but the bone as a whole is relatively straight, with only a small amount of curvature. The abdominal slice contains an elliptical target with a large internal gradient, but its boundary does not appear bright, and the gray level inside the abdominal circumference is close to the gray level at the boundary. Therefore, when an image processing method is used to identify the measurement item, the measurement mode selected by the user should be obtained first; once the measurement mode is known, the image processing method can be used to identify the measurement item.

Referring to FIG. 7, which is the identification flowchart of this embodiment, the technical solution of this embodiment is further described below taking the obstetric measurement mode as an example; the specific steps of this embodiment include:

Step 600: obtain the measurement mode.

Step 610: extract the highlighted portion. The highlighted portion of the specified slice image frame is extracted from the gray values of its pixels.

In one specific embodiment, after the gray data of the specified slice image has been read, the image may first be pre-processed and the highlighted portion then extracted from the pre-processed image. Image pre-processing is mainly used to suppress image noise, increase the continuity of boundaries and emphasize the highlighted portion; there are many algorithms of this kind, for example anisotropic smoothing. Of course, in other embodiments the image need not be pre-processed, and the highlighted portion can be extracted from the original gray data instead.

In one specific embodiment, a clustering segmentation algorithm may be used to extract the highlighted portion of the specified slice image frame: the clustering algorithm divides the gray values of the pixels in the specified slice image frame into several classes; the gray values of the largest one or more classes are kept unchanged while the gray values of the other classes are set to 0, yielding a feature image. For example, the gray values of the specified slice image are clustered into N classes (e.g. N = 3) and ordered from large to small; the first M classes (e.g. M ≥ 1 and M < N) are taken as the brightest class (i.e. the highlighted portion), the points whose brightness falls in the remaining N−M classes are set to 0, and the gray values of the other points are kept unchanged to obtain the feature image. Connected regions are then delineated in the feature image, and the one or more connected regions with the greatest brightness are found, giving the highlighted portion of the specified slice image frame. In one specific embodiment, the connected regions can be ranked by brightness and the first X taken as the brightest connected regions, where X is a preset system parameter for which a value of 1 to 10 is generally reasonable; in other embodiments other values may be selected according to actual needs.

In another embodiment, the specified slice image may instead be convolved with a preset M×N operator, and the image obtained by the convolution taken as the feature image; the values of the elements of the operator can be determined according to actual needs.

Further, the feature image often contains a great deal of noise; some morphological algorithms (such as morphological erosion, opening, or removal of small-area regions) can be used to remove the noise and make the image boundaries more continuous.

Step 620: identify the slice type. The highlighted portion is identified according to the measurement mode to determine the slice type of the specified slice image frame.

In the obstetric measurement mode, both the head circumference and the abdominal circumference measurement items are approximately elliptical structures. The main differences between the head circumference slice (shown in FIG. 8a) and the abdominal slice (shown in FIG. 8b) are: the head circumference slice contains the skull 81, which appears as a bright echo on ultrasound, while the interior of the head circumference consists of other brain structures that appear darker on ultrasound, so the contrast with the brightness of the skull 81 is obvious; the abdominal slice contains the fetal abdomen 82, whose boundary is clearly less bright than the head circumference and whose internal echo is relatively uniform, differing little from the boundary. Both the head circumference slice and the femur slice (shown in FIG. 8c) contain bright bone; the main differences are that the head circumference slice contains two symmetrical, arc-shaped bones 81, while the femur slice contains a relatively straight bone 83, and under normal circumstances there is only one bone. Based on these physical facts, in this step the brightness and shape of the connected regions found earlier are analyzed and compared to determine which type of slice the slice image is.

In one specific embodiment, a brightness threshold for the connected regions can be set in a given measurement mode (e.g. obstetrics) to determine whether the specified slice image contains bone. For example, when the average brightness of a connected region is greater than a first threshold, the specified slice image contains bone; when the average brightness of the connected region is less than a second threshold, the specified slice image does not contain bone, the first threshold being greater than the second threshold. In other embodiments it is also possible to set only a single brightness threshold for the connected regions.

For the analysis of bone shape, a simple method is to use curvature to determine whether, and to what degree, the bone is bent. For example, a curvature threshold is set in the given measurement mode (e.g. obstetrics): when the curvature of a connected region is greater than the curvature threshold, the specified slice image contains a curved bone; when the curvature of the connected region is less than the curvature threshold, the specified slice image contains a straight bone. One possible definition of curvature is as follows: take one point at each end of the connected region and one point in the middle, and the angle formed by the lines joining the two end points to the middle point is the curvature; of course, other definitions may be adopted in other embodiments, as long as they reflect the degree of bending of the connected region.

Taking the obstetric measurement mode as an example, if the average brightness of the X connected regions is greater than a threshold, the specified slice image is considered to contain bone, and the slice may be a head circumference or femur slice; the curvature of the connected region is then examined, and if the curvature is greater than a threshold the bone is considered curved and the specified slice is judged to be a head circumference slice, whereas if the curvature is less than the threshold the bone is considered straight and the specified slice is judged to be a femur slice. If the average brightness of all X connected regions is less than the threshold, the slice contains no bright bone and is judged to be an abdominal slice.

Step 630: determine the measurement item. The measurement item corresponding to the specified slice image frame is determined from the identified slice type and its corresponding measurement items. Those skilled in the art will understand that all or part of the steps of the various methods in the above embodiments can be implemented by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium may include a read-only memory, a random access memory, a magnetic disk, an optical disc, or the like.

The above describes the present invention with specific examples, which are only intended to aid understanding of the present invention and not to limit it. Those of ordinary skill in the art may, in accordance with the idea of the present invention, make changes to the specific embodiments described above.

Claims

1. A method of automatically identifying a measurement item, comprising:
an image acquisition step of acquiring the gray value of each pixel in a specified slice image frame, the gray value of a pixel corresponding to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal;
an identification step of identifying, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame;
a measurement step of measuring, based on the identified measurement item, the measurement item parameters of the specified slice image frame.
2. The method of claim 1, wherein the identification step identifies the measurement item based on a comparative analysis of the gray values of the pixels against a preset data model.
3. The method of claim 2, further comprising, before the identification step, acquiring the measurement mode used when examining the living tissue.
4. The method of any one of claims 1-3, wherein the identification step comprises:
generating the feature of the specified slice image frame from the gray values of the pixels in the specified slice image frame;
comparing the feature of the specified slice image frame with the feature of each training sample in a preset training sample model;
finding the training sample whose feature is closest to the feature of the specified slice image frame, and taking the measurement item corresponding to that training sample as the measurement item corresponding to the specified slice image frame.
5. The method of claim 4, wherein the feature of a training sample is the projection coefficient of the training sample's feature vector onto the mean of the training samples, the feature of the specified slice image frame is the projection coefficient of the specified slice image frame's feature vector onto the mean of the training samples, and a feature vector is the set of all gray values in one image frame.
6. The method of claim 5, wherein the feature of the specified slice image frame is calculated by the formula
w = E^T (I_test − m)
where I_test is the feature vector of the specified slice image frame, m is the mean of the training samples, T denotes matrix transposition, E is the orthogonalized eigenvector matrix of the training samples, and w is the projection coefficient of the specified slice image frame's feature vector onto the mean of the training samples.
7. The method of claim 3, wherein the identification step comprises:
extracting the highlighted portion of the specified slice image frame from the gray values of the pixels in the specified slice image frame;
identifying the highlighted portion according to the measurement mode and determining the slice type of the specified slice image frame;
determining the measurement item corresponding to the specified slice image frame from the identified slice type and its corresponding measurement items.
8. The method of claim 7, wherein extracting the highlighted portion of the specified slice image frame comprises:
dividing the gray values of the pixels in the specified slice image frame into several classes using a clustering segmentation algorithm;
keeping the gray values of the largest one or more classes unchanged while setting the gray values of the other classes to 0, thereby obtaining a feature image;
delineating connected regions in the feature image and finding the one or more connected regions with the greatest brightness, thereby obtaining the highlighted portion of the specified slice image frame.
9. The method of claim 7, wherein identifying the highlighted portion according to the measurement mode comprises:
analyzing the brightness and shape of the connected regions found;
determining the slice type of the specified slice image frame from the measurement mode and the analysis result.
10. The method of claim 1, wherein the measurement step includes manual measurement, semi-automatic measurement and automatic measurement.
11. An apparatus for automatically identifying a measurement item, comprising:
an image acquisition module for acquiring the gray value of each pixel in a specified slice image frame, the gray value of a pixel corresponding to the ultrasonic echo formed by the examined living tissue reflecting the ultrasound signal;
an identification module for identifying, based on the gray values of the pixels, at least one measurement item corresponding to the specified slice image frame;
a measurement module for measuring, based on the identified measurement item, the measurement item parameters of the specified slice image frame.
12. The apparatus of claim 11, wherein the identification module identifies the measurement item based on a comparative analysis of the gray values of the pixels against a preset data model.
13. The apparatus of claim 12, further comprising a measurement mode acquisition module for acquiring the measurement mode used when examining the living tissue.
14. The apparatus of any one of claims 11-13, wherein the identification module comprises:
a feature generation unit for generating the feature of the specified slice image frame from the gray values of the pixels in the specified slice image frame;
a comparison unit for comparing the feature of the specified slice image frame with the feature of each training sample in a preset training sample model;
a lookup unit for finding the training sample whose feature is closest to the feature of the specified slice image frame and taking the measurement item corresponding to that training sample as the measurement item corresponding to the specified slice image frame.
15. The apparatus of claim 14, wherein the feature of a training sample is the projection coefficient of the training sample's feature vector onto the mean of the training samples, the feature of the specified slice image frame is the projection coefficient of the specified slice image frame's feature vector onto the mean of the training samples, and a feature vector is the set of all gray values in one image frame.
16. The apparatus of claim 15, wherein the feature generation unit generates the feature of the specified slice image frame using the formula
w = E^T (I_test − m)
where I_test is the feature vector of the specified slice image frame, m is the mean of the training samples, T denotes matrix transposition, E is the orthogonalized eigenvector matrix of the training samples, and w is the projection coefficient of the specified slice image frame's feature vector onto the mean of the training samples.
17. The apparatus of claim 13, wherein the identification module comprises:
an extraction unit for extracting the highlighted portion of the specified slice image frame from the gray values of the pixels in the specified slice image frame;
an identification unit for identifying the highlighted portion according to the measurement mode and determining the slice type of the specified slice image frame;
a determination unit for determining the measurement item corresponding to the specified slice image frame from the identified slice type and its corresponding measurement items.
18. The apparatus of claim 17, wherein, when extracting the highlighted portion of the specified slice image frame, the extraction unit:
divides the gray values of the pixels in the specified slice image frame into several classes using a clustering segmentation algorithm;
keeps the gray values of the largest one or more classes unchanged while setting the gray values of the other classes to 0, thereby obtaining a feature image;
delineates connected regions in the feature image and finds the one or more connected regions with the greatest brightness, thereby obtaining the highlighted portion of the specified slice image frame.
19. The apparatus of claim 18, wherein, when identifying the highlighted portion according to the measurement mode, the identification unit:
analyzes the brightness and shape of the connected regions found;
determines the slice type of the specified slice image frame from the measurement mode and the analysis result.
20. The apparatus of claim 18, wherein, before extracting the highlighted portion of the specified slice image frame, the extraction unit further pre-processes the image to increase boundary continuity, and, after obtaining the feature image, further post-processes the feature image to remove noise and increase boundary continuity.
21. An ultrasound imaging apparatus, comprising:
a probe for transmitting ultrasonic waves to the examined living tissue and receiving ultrasonic echoes;
a signal processor for processing the ultrasonic echoes to generate ultrasound image data;
an image processor for processing the ultrasound image data and generating a slice image, the image processor including:
the apparatus for automatically identifying a measurement item of any one of claims 11-20.
PCT/CN2014/073777 2014-03-20 2014-03-20 自动识别测量项的方法、装置及一种超声成像设备 WO2015139267A1 (zh)

Priority Applications (7)

Application Number Priority Date Filing Date Title
CN201911159310.7A CN110811691B (zh) 2014-03-20 2014-03-20 自动识别测量项的方法、装置及一种超声成像设备
CN201480047617.7A CN105555198B (zh) 2014-03-20 2014-03-20 自动识别测量项的方法、装置及一种超声成像设备
PCT/CN2014/073777 WO2015139267A1 (zh) 2014-03-20 2014-03-20 自动识别测量项的方法、装置及一种超声成像设备
EP14885962.2A EP3127486B1 (en) 2014-03-20 2014-03-20 Method and device for automatic identification of measurement item and ultrasound imaging apparatus
US15/271,095 US10898109B2 (en) 2014-03-20 2016-09-20 Method and device for automatic identification of measurement item and ultrasound imaging apparatus
US17/144,786 US11717183B2 (en) 2014-03-20 2021-01-08 Method and device for automatic identification of measurement item and ultrasound imaging apparatus
US18/211,516 US20230329581A1 (en) 2014-03-20 2023-06-19 Method and device for automatic identification of measurement item and ultrasound imaging apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/073777 WO2015139267A1 (zh) 2014-03-20 2014-03-20 自动识别测量项的方法、装置及一种超声成像设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/271,095 Continuation US10898109B2 (en) 2014-03-20 2016-09-20 Method and device for automatic identification of measurement item and ultrasound imaging apparatus

Publications (1)

Publication Number Publication Date
WO2015139267A1 true WO2015139267A1 (zh) 2015-09-24

Family

ID=54143685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/073777 WO2015139267A1 (zh) 2014-03-20 2014-03-20 自动识别测量项的方法、装置及一种超声成像设备

Country Status (4)

Country Link
US (3) US10898109B2 (zh)
EP (1) EP3127486B1 (zh)
CN (2) CN105555198B (zh)
WO (1) WO2015139267A1 (zh)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109276275A (zh) * 2018-10-26 2019-01-29 深圳开立生物医疗科技股份有限公司 一种超声图像标准切面提取及测量方法和超声诊断设备
CN109276274A (zh) * 2018-10-26 2019-01-29 深圳开立生物医疗科技股份有限公司 一种超声图像标准切面识别及测量方法和超声诊断设备
CN109589140A (zh) * 2018-12-26 2019-04-09 深圳开立生物医疗科技股份有限公司 一种超声测量多项目处理方法和超声诊断系统
CN109589141A (zh) * 2018-12-28 2019-04-09 深圳开立生物医疗科技股份有限公司 一种超声诊断辅助方法、系统和超声诊断设备
CN109589139A (zh) * 2018-12-06 2019-04-09 深圳开立生物医疗科技股份有限公司 一种超声测量生物量自动确认方法和超声诊断系统
CN110177504A (zh) * 2017-01-16 2019-08-27 深圳迈瑞生物医疗电子股份有限公司 超声图像中参数测量的方法和超声成像系统
CN110680399A (zh) * 2019-10-25 2020-01-14 深圳度影医疗科技有限公司 一种产前超声图像的自动测量方法、存储介质及超声设备
CN113274056A (zh) * 2021-06-30 2021-08-20 深圳开立生物医疗科技股份有限公司 一种超声扫查方法及相关装置

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10716536B2 (en) 2013-07-17 2020-07-21 Tissue Differentiation Intelligence, Llc Identifying anatomical structures
US10154826B2 (en) 2013-07-17 2018-12-18 Tissue Differentiation Intelligence, Llc Device and method for identifying anatomical structures
WO2015139267A1 (zh) * 2014-03-20 2015-09-24 深圳迈瑞生物医疗电子股份有限公司 自动识别测量项的方法、装置及一种超声成像设备
US11986341B1 (en) 2016-05-26 2024-05-21 Tissue Differentiation Intelligence, Llc Methods for accessing spinal column using B-mode imaging to determine a trajectory without penetrating the the patient's anatomy
CN108882917A (zh) * 2016-05-30 2018-11-23 深圳迈瑞生物医疗电子股份有限公司 一种心脏容积识别分析系统和方法
US11701086B1 (en) 2016-06-21 2023-07-18 Tissue Differentiation Intelligence, Llc Methods and systems for improved nerve detection
EP3590116A1 (en) * 2017-03-01 2020-01-08 Koninklijke Philips N.V. Echocardiogram context measurement tool
US10918357B2 (en) * 2017-06-30 2021-02-16 General Electric Company Methods and systems for automatically determining an anatomical measurement of ultrasound images
CN111031930A (zh) * 2017-08-25 2020-04-17 FUJIFILM Corporation Acoustic wave diagnostic apparatus and control method for acoustic wave diagnostic apparatus
CN108078592A (zh) * 2017-12-29 2018-05-29 SonoScape Medical Corp. Ultrasound image processing method and device, ultrasound diagnostic device, and readable storage medium
CN109044398B (zh) * 2018-06-07 2021-10-19 Shenzhen Wisonic Medical Technology Co., Ltd. Ultrasound system imaging method and device, and computer-readable storage medium
CN111374698A (zh) * 2018-12-29 2020-07-07 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Ultrasound imaging system and related workflow system and method
CN109925002A (zh) * 2019-01-15 2019-06-25 Hu Qiuming Artificial intelligence echocardiography data acquisition system and data acquisition method thereof
KR20200132144A (ko) * 2019-05-15 2020-11-25 Samsung Medison Co., Ltd. Ultrasound imaging apparatus and display method thereof
WO2021120065A1 (zh) * 2019-12-18 2021-06-24 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Automatic measurement method for anatomical structures and ultrasound imaging system
CN114072059B (zh) * 2019-12-27 2024-09-06 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Ultrasound imaging device and method for quickly setting up an automatic ultrasound workflow
CN111862072A (zh) * 2020-07-29 2020-10-30 Nantong University Method for measuring abdominal circumference based on CT images
JP7410624B2 (ja) * 2020-09-14 2024-01-10 Canon Inc. Ultrasonic diagnostic apparatus, measurement condition setting method, and program
CN112370078B (zh) * 2020-11-10 2024-01-26 Anhui University of Science and Technology Image detection method based on ultrasound imaging and Bayesian optimization
US11944501B2 (en) * 2022-02-16 2024-04-02 GE Precision Healthcare LLC Systems and methods for automatic measurements of medical images

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010057665A1 (en) * 2008-11-21 2010-05-27 Cnr-Consiglio Nazionale Delle Ricerche Ultrasonic apparatus for measuring a labor progress parameter
CN102151149A (zh) * 2010-12-24 2011-08-17 Edan Instruments, Inc. Automatic measurement method and system for fetal ultrasound images
CN102274051A (zh) * 2011-05-27 2011-12-14 Edan Instruments, Inc. Automatic measurement method and system for bladder volume in ultrasound images
US20130231564A1 (en) * 2010-08-26 2013-09-05 Koninklijke Philips Electronics N.V. Automated three dimensional aortic root measurement and modeling

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5588435A (en) * 1995-11-22 1996-12-31 Siemens Medical Systems, Inc. System and method for automatic measurement of body structures
US5795296A (en) * 1996-03-29 1998-08-18 University Of Washington Pipeline process for automatically measuring object boundary from ultrasound image samples
US5605155A (en) * 1996-03-29 1997-02-25 University Of Washington Ultrasound system for automatically measuring fetal head size
US6258033B1 (en) * 1999-11-30 2001-07-10 Agilent Technologies, Inc. Ultrasound method employing echoes from a region of interest to enable quantization of backscatter signals
US6561980B1 (en) * 2000-05-23 2003-05-13 Alpha Intervention Technology, Inc Automatic segmentation of prostate, rectum and urethra in ultrasound imaging
JP4614548B2 (ja) * 2001-01-31 2011-01-19 Panasonic Corporation Ultrasonic diagnostic apparatus
US6733454B1 (en) * 2003-02-26 2004-05-11 Siemens Medical Solutions Usa, Inc. Automatic optimization methods and systems for doppler ultrasound imaging
US7563229B2 (en) * 2003-06-11 2009-07-21 Ge Medical Systems Global Technology Company Llc Method and apparatus for automatically measuring delay of tissue motion and deformation
ITPI20040066A1 (it) * 2004-09-21 2004-12-21 Cnr Consiglio Naz Delle Ricerche Method and device for the automatic assessment of cardiovascular function indices by processing echographic images
KR100747093B1 (ko) * 2005-01-12 2007-08-07 Medison Co., Ltd. Method for automatically detecting the boundary of an object using ultrasound diagnostic images and ultrasound diagnostic system
JP4701011B2 (ja) * 2005-05-31 2011-06-15 GE Medical Systems Global Technology Company, LLC Ultrasonic diagnostic apparatus
JP4934143B2 (ja) * 2006-10-10 2012-05-16 Hitachi Medical Corporation Medical image diagnostic apparatus and medical image measurement method
US8556814B2 (en) * 2007-10-04 2013-10-15 Siemens Medical Solutions Usa, Inc. Automated fetal measurement from three-dimensional ultrasound data
CN101785681B (zh) * 2010-01-13 2012-06-20 Beihang University Quantitative measurement and analysis system for infant skull development
US20110196236A1 (en) * 2010-02-08 2011-08-11 Gokul Swamy System and method of automated gestational age assessment of fetus
US8879813B1 (en) * 2013-10-22 2014-11-04 Eyenuk, Inc. Systems and methods for automated interest region detection in retinal images
WO2015139267A1 (zh) * 2014-03-20 2015-09-24 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method and device for automatic identification of measurement item and ultrasound imaging apparatus
US10806391B2 (en) * 2016-09-26 2020-10-20 General Electric Company Method and system for measuring a volume of an organ of interest
US10970837B2 (en) * 2019-03-18 2021-04-06 Siemens Healthcare Gmbh Automated uncertainty estimation of lesion segmentation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010057665A1 (en) * 2008-11-21 2010-05-27 Cnr-Consiglio Nazionale Delle Ricerche Ultrasonic apparatus for measuring a labor progress parameter
US20130231564A1 (en) * 2010-08-26 2013-09-05 Koninklijke Philips Electronics N.V. Automated three dimensional aortic root measurement and modeling
CN102151149A (zh) * 2010-12-24 2011-08-17 Edan Instruments, Inc. Automatic measurement method and system for fetal ultrasound images
CN102274051A (zh) * 2011-05-27 2011-12-14 Edan Instruments, Inc. Automatic measurement method and system for bladder volume in ultrasound images

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110177504A (zh) * 2017-01-16 2019-08-27 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method for measuring parameters in ultrasonic image and ultrasonic imaging system
US11826194B2 (en) 2017-01-16 2023-11-28 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method for measuring parameters in ultrasonic image and ultrasonic imaging system
US11744540B2 (en) 2017-01-16 2023-09-05 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method for measuring parameters in ultrasonic image and ultrasonic imaging system
US11744541B2 (en) 2017-01-16 2023-09-05 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method for measuring parameters in ultrasonic image and ultrasonic imaging system
CN109276274A (zh) * 2018-10-26 2019-01-29 SonoScape Medical Corp. Ultrasound image standard plane recognition and measurement method and ultrasound diagnostic device
CN109276275A (zh) * 2018-10-26 2019-01-29 SonoScape Medical Corp. Ultrasound image standard plane extraction and measurement method and ultrasound diagnostic device
CN109589139A (zh) * 2018-12-06 2019-04-09 SonoScape Medical Corp. Automatic confirmation method for ultrasound biometric measurements and ultrasound diagnostic system
CN109589140B (zh) * 2018-12-26 2022-04-01 SonoScape Medical Corp. Multi-item ultrasound measurement processing method and ultrasound diagnostic system
CN109589140A (zh) * 2018-12-26 2019-04-09 SonoScape Medical Corp. Multi-item ultrasound measurement processing method and ultrasound diagnostic system
CN109589141A (zh) * 2018-12-28 2019-04-09 SonoScape Medical Corp. Ultrasound diagnosis assistance method and system, and ultrasound diagnostic device
CN110680399B (zh) * 2019-10-25 2020-12-29 Shenzhen Duying Medical Technology Co., Ltd. Automatic measurement method for prenatal ultrasound images, storage medium and ultrasound device
CN110680399A (zh) * 2019-10-25 2020-01-14 Shenzhen Duying Medical Technology Co., Ltd. Automatic measurement method for prenatal ultrasound images, storage medium and ultrasound device
CN113274056A (zh) * 2021-06-30 2021-08-20 SonoScape Medical Corp. Ultrasound scanning method and related device

Also Published As

Publication number Publication date
US20210128020A1 (en) 2021-05-06
EP3127486A1 (en) 2017-02-08
US11717183B2 (en) 2023-08-08
US10898109B2 (en) 2021-01-26
CN110811691B (zh) 2022-08-05
EP3127486A4 (en) 2018-01-24
US20170007161A1 (en) 2017-01-12
CN110811691A (zh) 2020-02-21
CN105555198A (zh) 2016-05-04
EP3127486B1 (en) 2024-07-31
EP3127486C0 (en) 2024-07-31
US20230329581A1 (en) 2023-10-19
CN105555198B (zh) 2019-12-24

Similar Documents

Publication Publication Date Title
US11717183B2 (en) Method and device for automatic identification of measurement item and ultrasound imaging apparatus
US20170367685A1 (en) Method for processing 3d image data and 3d ultrasonic imaging method and system
JP6467041B2 (ja) Ultrasonic diagnostic apparatus and image processing method
CN110325119B (zh) Ovarian follicle count and size determination
CN110945560B (zh) Fetal ultrasound image processing
Berton et al. Segmentation of the spinous process and its acoustic shadow in vertebral ultrasound images
US5795296A (en) Pipeline process for automatically measuring object boundary from ultrasound image samples
US9801614B2 (en) Ultrasound diagnostic apparatus, ultrasound image processing method, and non-transitory computer readable recording medium
US8699766B2 (en) Method and apparatus for extracting and measuring object of interest from an image
US20160081663A1 (en) Method and system for automated detection and measurement of a target structure
US20130046168A1 (en) Method and system of characterization of carotid plaque
US11464490B2 (en) Real-time feedback and semantic-rich guidance on quality ultrasound image acquisition
CA3085619C (en) Echo window artifact classification and visual indicators for an ultrasound system
US20110196236A1 (en) System and method of automated gestational age assessment of fetus
CN111820948B (zh) Fetal growth parameter measurement method, system and ultrasound device
Potočnik et al. Computerized detection and recognition of follicles in ovarian ultrasound images: a review
US20220249060A1 (en) Method for processing 3d image data and 3d ultrasonic imaging method and system
JP7299100B2 (ja) Ultrasonic diagnostic apparatus and ultrasonic image processing method
CN116138807A (zh) Ultrasound imaging device and ultrasound detection method for the abdominal aorta
KR20230039084A (ko) Lower limb alignment evaluation method and device for lower limb alignment evaluation using the same

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
Ref document number: 201480047617.7
Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 14885962
Country of ref document: EP
Kind code of ref document: A1

REEP Request for entry into the european phase
Ref document number: 2014885962
Country of ref document: EP

WWE Wipo information: entry into national phase
Ref document number: 2014885962
Country of ref document: EP

NENP Non-entry into the national phase
Ref country code: DE