CN115239720A - Classical Graf-based DDH ultrasonic image artificial intelligence diagnosis system and method - Google Patents

Info

Publication number
CN115239720A
Authority
CN
China
Prior art keywords
feature
image
ddh
graf
standard section
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211155349.3A
Other languages
Chinese (zh)
Inventor
张思成
方继红
徐静远
孙军
刘传彬
谢洪涛
蒋健一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Children's Hospital Anhui Xinhua Hospital Anhui Institute Of Pediatrics Anhui Hospital Of Pediatrics Affiliated To Fudan University
University of Science and Technology of China USTC
Original Assignee
Anhui Children's Hospital Anhui Xinhua Hospital Anhui Institute Of Pediatrics Anhui Hospital Of Pediatrics Affiliated To Fudan University
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Children's Hospital Anhui Xinhua Hospital Anhui Institute Of Pediatrics Anhui Hospital Of Pediatrics Affiliated To Fudan University and University of Science and Technology of China USTC
Priority to CN202211155349.3A
Publication of CN115239720A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V 10/449 Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V 10/451 Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V 10/454 Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/54 Extraction of image or video features relating to texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30008 Bone

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention discloses an artificial intelligence diagnosis system and method for DDH ultrasound images based on the classical Graf method. The diagnosis system comprises a classifier, used to identify the standard plane in an ultrasound image, and a regressor, used to locate the key points in the standard plane, calculate from their positions the α and β angles required by the Graf method, and grade the DDH. The classifier identifies the standard plane as follows: features are extracted and encoded from the image crop taken from the ultrasound frame, the encoded feature map is converted into a feature vector by a pooling operation, and the feature vector is passed through a fully connected layer to decide whether the crop shows a standard plane. The diagnosis system can screen standard hip ultrasound frames of infants with immature hip joints, making diagnosis faster, more objective and more accurate.

Description

Classical Graf-based DDH ultrasonic image artificial intelligence diagnosis system and method
Technical Field
The invention relates to the technical field of ultrasound imaging, and in particular to an artificial intelligence diagnosis system and method for DDH ultrasound images based on the classical Graf method.
Background
Grading the severity of DDH (developmental dysplasia of the hip in children) from ultrasound images with the Graf method is widely accepted by pediatric orthopedists. At present, hip ultrasound screening in most hospitals is performed and evaluated by experienced clinicians, and DDH diagnosis often suffers from limited accuracy and repeatability. Diagnosing DDH from ultrasound images with artificial intelligence is therefore an emerging research problem. Hareendranathan et al. devised a novel rounding index for DDH ultrasound image diagnosis, analyzing DDH with the help of manual labeling by clinicians. Quader et al. further proposed an automatic bone boundary detection system for DDH analysis. Sezer et al. proposed a two-step, segmentation-based method to analyze hip ultrasound images. With convolutional neural networks (CNNs), a fast and accurate automatic DDH analysis and diagnosis system can be constructed. However, these techniques still depend heavily on manual labeling by clinicians and are not fully automated.
Disclosure of Invention
In view of the technical problems described in the background, the invention provides a classical Graf-based artificial intelligence diagnosis system and method for DDH ultrasound images. The diagnosis system can screen standard hip ultrasound frames of infants with immature hip joints, making diagnosis faster, more objective and more accurate.
The invention provides a classical Graf-based artificial intelligence diagnosis system for DDH ultrasound images, comprising:
a classifier, used to identify the standard plane in an ultrasound image; and
a regressor, used to locate the key points in the standard plane, calculate from their positions the α and β angles required by the Graf method, and grade the DDH.
Preferably, the classifier identifies the standard plane as follows: features are extracted and encoded from the image crop taken from the ultrasound frame, the encoded feature map is converted into a feature vector by a pooling operation, and the feature vector is passed through a fully connected layer to decide whether the crop shows a standard plane.
Preferably, the feature extraction proceeds stage by stage as
F_{n+1} = Pool(BN(Conv(F_n)))
where F_n is the feature map of the n-th stage, Pool is 2×2 max pooling, BN is batch normalization, and Conv is a convolution; and
the encoded feature map is converted into a feature vector by a pooling operation:
f = AvgPool(F_m)
where AvgPool is global average pooling, m is the total number of feature-extraction stages applied to the input image, and F_m is the feature map obtained after the m stages.
Preferably, the fully connected layer computes a dot product between its weight vector and the feature vector to produce a confidence score:
p' = Sigmoid(w · f)
where Sigmoid is the sigmoid activation function, w is the fully connected layer weight, and p' is the predicted confidence.
Preferably, the parameters used to compute the sigmoid activation are determined by gradient descent on the loss function
Loss = -[p log(p') + (1-p) log(1-p')]
where p indicates whether the image crop is a standard plane, taking the value 1 for a standard plane and 0 otherwise.
Preferably, the regressor locates the key points in the standard plane as follows:
S11: apply three deconvolution layers to the feature map F extracted from the image crop:
F^{i+1} = ReLU(BN(Deconv(F^i)))
where F^i is the feature map after i deconvolution layers, ReLU is the activation function, BN is batch normalization, and Deconv is a deconvolution (transposed convolution) layer;
S12: apply a 1×1 convolution to the feature map F^3 obtained after the three deconvolutions to produce a heatmap H' for each key point;
S13: for each key point of the standard plane, generate a corresponding target heatmap with a Gaussian function:
H(i, j) = exp{-((i - i')^2 + (j - j')^2) / (2δ^2)}
where (i, j) are coordinates on the target heatmap, (i', j') is the key point position, δ controls the decay of the Gaussian, and H is the target heatmap;
S14: compute the loss between the target heatmap H and the regressor output H' to optimize the neural network:
Loss = MSE(H, H')
where MSE is the mean squared error.
Preferably, the bony roof angle α is the angle between the baseline and the bone roof line, and the cartilage roof angle β is the angle between the baseline and the cartilage roof line; the baseline is the straight line through the perichondrium apex and the iliac margin point, the bone roof line is the straight line through the lowest point of the ilium and the bony acetabular apex, and the cartilage roof line is the straight line through the point where the cartilaginous roof turns from concave to convex and the center of the labrum.
The invention also provides a diagnosis method for the classical Graf-based DDH ultrasound image artificial intelligence diagnosis system, comprising the following steps:
S21: extract and encode features from the image crop taken from the ultrasound frame;
S22: convert the encoded feature map into a feature vector by a pooling operation;
S23: pass the feature vector through the fully connected layer to decide whether the crop shows a standard plane;
S24: upsample the feature map of a crop judged to be a standard plane in S23;
S25: apply a 1×1 convolution to the upsampled result to obtain heatmaps, and take a weighted projection of the responses on each heatmap to obtain the final key point coordinates;
S26: calculate the α and β angles required by the Graf method from the key point coordinates and grade the DDH. A minimal end-to-end sketch of this pipeline is given below.
The invention further provides a processing device comprising a memory and a processor; the memory stores at least one instruction, which is loaded and executed by the processor to implement the above method.
The invention also provides a computer-readable storage medium storing at least one instruction, which is loaded and executed by a processor to implement the above method.
The beneficial technical effects of the invention are as follows:
the feature extraction uses a neural network built from convolutional, activation and pooling layers to process the input image, with a ResNet backbone extracting a feature map that represents the image; the extracted features capture information such as the edges, texture and relative positions in the hip ultrasound image. Immature hip joints can be identified more sensitively than with manual labeling and diagnosis, and the average errors of the α and β angles are below 2.22° and 2.28°, respectively.
Drawings
FIG. 1 is a flow chart of the classical Graf-based DDH ultrasound image artificial intelligence diagnosis system proposed by the invention;
FIG. 2 is a schematic diagram of a standard plane with the key points labeled, where 1 is the perichondrium apex, 2 is the iliac margin point, 3 is the lowest point of the ilium, 4 is the bony acetabular apex, 5 is the point where the cartilaginous roof turns from concave to convex, and 6 is the center of the labrum;
FIG. 3 is a schematic diagram of the reference lines drawn according to the classical Graf method;
FIG. 4 compares the detection results of the proposed diagnosis system with manual diagnosis, where O marks the manually labeled key points and x marks the key points located by the diagnosis system.
Detailed Description
Example 1
Referring to FIG. 1, the classical Graf-based DDH ultrasound image artificial intelligence diagnosis system provided by the invention includes:
1) a classifier, used to identify the standard plane in an ultrasound image.
In one embodiment, the classifier identifies the standard plane as follows: features are extracted and encoded from the image crop taken from the ultrasound frame, the encoded feature map is converted into a feature vector by a pooling operation, and the feature vector is passed through a fully connected layer to decide whether the crop shows a standard plane.
Specifically, the feature extraction proceeds stage by stage as
F_{n+1} = Pool(BN(Conv(F_n)))
where F_n is the feature map of the n-th stage, Pool is 2×2 max pooling, BN is batch normalization, and Conv is a convolution.
The encoded feature map is converted into a feature vector by a pooling operation:
f = AvgPool(F_m)
where AvgPool is global average pooling (the vector obtained after global pooling expresses the classification information of the image), m is the total number of feature-extraction stages applied to the input image, and F_m is the resulting feature map. In this embodiment five stages are used, with F_0 denoting the input image and F_5 the final image features; a minimal sketch of such an encoder follows.
The fully connected layer computes a dot product between its weight vector and the feature vector to produce a confidence score:
p' = Sigmoid(w · f)
where Sigmoid is the sigmoid activation function, w is the fully connected layer weight, and p' is the predicted confidence.
The network parameters, such as the convolution kernels and the fully connected layer weight, are randomly initialized and then learned by gradient descent on a cross-entropy loss:
Loss = -[p log(p') + (1-p) log(1-p')]
where p indicates whether the manually labeled ultrasound frame is a standard frame (1 for a standard frame, 0 otherwise). Minimizing this loss by gradient descent lets the model learn the characteristics of standard frames in ultrasound images; a minimal sketch of this head and one training step is given below.
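A minimal sketch of the classification head and one gradient-descent step on the cross-entropy loss might look as follows; the feature dimension, batch size and optimizer settings are assumptions, not values given in the patent.

```python
# Illustrative sketch (assumptions: 256-d features, SGD, batch of 8), showing
# p' = Sigmoid(w * f) and one gradient-descent step on the cross-entropy loss.
import torch
import torch.nn as nn

head = nn.Linear(256, 1, bias=False)                  # the fully connected weight w
optimizer = torch.optim.SGD(head.parameters(), lr=1e-2)

f = torch.randn(8, 256)                               # feature vectors f from the encoder
p = torch.randint(0, 2, (8, 1)).float()               # p = 1 standard plane, 0 otherwise

p_hat = torch.sigmoid(head(f))                        # p' = Sigmoid(w * f)
loss = nn.functional.binary_cross_entropy(p_hat, p)   # -[p log p' + (1-p) log(1-p')]

optimizer.zero_grad()
loss.backward()
optimizer.step()                                      # one gradient-descent update of w
```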
2) a regressor, used to locate the key points in the standard plane, calculate from their positions the α and β angles required by the Graf method, and grade the DDH.
During training, the regressor builds a loss function from the image features produced by the neural network and the key point positions labeled by clinicians on the ultrasound images, which constrains and optimizes the network. As a preferred implementation, the regressor locates the key points in the standard plane as follows:
S11: apply three deconvolution layers to the feature map F extracted from the image crop:
F^{i+1} = ReLU(BN(Deconv(F^i)))
where F^i is the feature map after i deconvolution layers, ReLU is the activation function, BN is batch normalization, and Deconv is a deconvolution (transposed convolution) layer;
S12: apply a 1×1 convolution to the feature map F^3 obtained after the three deconvolutions to produce a heatmap H' for each key point;
S13: for each labeled key point of the standard plane, generate a corresponding target heatmap with a Gaussian function:
H(i, j) = exp{-((i - i')^2 + (j - j')^2) / (2δ^2)}
where (i, j) are coordinates on the target heatmap, (i', j') is the key point position, δ controls the decay of the Gaussian and is typically set to 3 pixels, and H is the target heatmap;
S14: compute the loss between the target heatmap H and the regressor output H' to optimize the neural network:
Loss = MSE(H, H')
where MSE is the mean squared error. In this diagnosis system, frames that are not key frames do not update the deconvolution parameters of the regression branch during training of the whole network. A combined sketch of S11 to S14 is given below.
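Steps S11 to S14 can be illustrated with the following PyTorch sketch. The channel widths, the stride-2 transposed convolutions, the six output heatmaps (one per landmark of FIG. 2) and the masking strategy used to keep non-key frames from updating the deconvolution branch are assumptions of this sketch rather than details fixed by the patent.

```python
# Illustrative sketch of S11-S14 (assumed widths, six landmarks, delta = 3);
# one possible realization, not the patent's code.
import torch
import torch.nn as nn


class HeatmapRegressor(nn.Module):
    """S11-S12: three deconvolution stages, then a 1x1 convolution to heatmaps H'."""

    def __init__(self, c_in: int = 256, c_mid: int = 128, num_keypoints: int = 6):
        super().__init__()
        blocks, c = [], c_in
        for _ in range(3):  # F^{i+1} = ReLU(BN(Deconv(F^i))), resolution doubles each time
            blocks += [
                nn.ConvTranspose2d(c, c_mid, kernel_size=4, stride=2, padding=1),  # Deconv
                nn.BatchNorm2d(c_mid),                                             # BN
                nn.ReLU(inplace=True),                                             # ReLU
            ]
            c = c_mid
        self.decoder = nn.Sequential(*blocks)
        self.to_heatmaps = nn.Conv2d(c_mid, num_keypoints, kernel_size=1)  # 1x1 conv

    def forward(self, feature_map):                          # feature_map = F
        return self.to_heatmaps(self.decoder(feature_map))   # predicted heatmaps H'


def gaussian_target(height: int, width: int, i_kp: float, j_kp: float,
                    delta: float = 3.0) -> torch.Tensor:
    """S13: target H(i, j) = exp(-((i - i')^2 + (j - j')^2) / (2 * delta^2))."""
    i = torch.arange(height, dtype=torch.float32)[:, None]   # row coordinate i
    j = torch.arange(width, dtype=torch.float32)[None, :]    # column coordinate j
    return torch.exp(-((i - i_kp) ** 2 + (j - j_kp) ** 2) / (2.0 * delta ** 2))


def regressor_loss(h_pred: torch.Tensor,       # (B, K, H, W) predicted heatmaps H'
                   h_target: torch.Tensor,     # (B, K, H, W) Gaussian targets H
                   is_standard: torch.Tensor   # (B,) 1.0 for standard-plane frames
                   ) -> torch.Tensor:
    """S14: MSE(H, H'), restricted to standard-plane frames so that non-key
    frames contribute no gradient to the deconvolution branch."""
    mask = is_standard.bool()
    if not mask.any():
        return h_pred.sum() * 0.0   # zero loss and zero gradients for this batch
    return nn.functional.mse_loss(h_pred[mask], h_target[mask])


# Example: an 8x8 encoder feature map yields six 64x64 heatmaps, compared against
# 64x64 Gaussian targets built around the labeled landmark coordinates.
# h_pred = HeatmapRegressor()(torch.randn(2, 256, 8, 8))     # shape (2, 6, 64, 64)
```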
For the α and β angles used in the invention, referring to FIG. 2 and FIG. 3, the bony roof angle α is the angle between the baseline and the bone roof line, and the cartilage roof angle β is the angle between the baseline and the cartilage roof line; the baseline is the straight line through the perichondrium apex and the iliac margin point, the bone roof line is the straight line through the lowest point of the ilium and the bony acetabular apex, and the cartilage roof line is the straight line through the point where the cartilaginous roof turns from concave to convex and the center of the labrum.
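For orientation, the sketch below shows how α and β could be computed from the six located landmarks using the line definitions above, followed by a deliberately simplified α-only Graf typing. The landmark names, the (x, y) pixel convention and the grading thresholds (a common summary of the classical Graf types, ignoring age- and β-based subtyping) are assumptions of this sketch and are not specified in the patent.

```python
# Illustrative sketch: alpha and beta from the six landmarks of FIG. 2, using
# the baseline (points 1-2), bone roof line (points 3-4) and cartilage roof
# line (points 5-6) defined above. Not the patent's code.
import numpy as np


def line_angle_deg(p1, p2, q1, q2) -> float:
    """Angle in degrees (0..90) between the undirected lines p1-p2 and q1-q2."""
    u = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)
    v = np.asarray(q2, dtype=float) - np.asarray(q1, dtype=float)
    cos = abs(np.dot(u, v)) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))


def graf_angles(kp: dict) -> tuple:
    """kp maps landmark names (assumed names) to (x, y) pixel coordinates."""
    baseline = (kp["perichondrium_apex"], kp["iliac_margin"])              # points 1, 2
    bone_roof = (kp["lowest_iliac_point"], kp["bony_acetabular_apex"])     # points 3, 4
    cartilage_roof = (kp["cartilage_turning_point"], kp["labrum_center"])  # points 5, 6
    alpha = line_angle_deg(*baseline, *bone_roof)       # bony roof angle
    beta = line_angle_deg(*baseline, *cartilage_roof)   # cartilage roof angle
    return alpha, beta


def graf_type(alpha: float) -> str:
    """Very simplified typing from alpha alone; thresholds follow the common
    summary of the classical Graf classification, not the patent, and age- and
    beta-based subtyping is omitted."""
    if alpha >= 60.0:
        return "I"
    if alpha >= 50.0:
        return "IIa/IIb"
    if alpha >= 43.0:
        return "IIc/D"
    return "III/IV"
```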
In addition, as FIG. 4 shows, the detection results of the proposed diagnosis system are essentially consistent with manual diagnosis, and the average errors of the α and β angles are below 2.22° and 2.28°, respectively, which further indicates that the proposed artificial intelligence diagnosis system is highly accurate.
Example 2
The invention provides a diagnosis method for the classical Graf-based DDH ultrasound image artificial intelligence diagnosis system, comprising the following steps:
S21: extract and encode features from the image crop taken from the ultrasound frame;
S22: convert the encoded feature map into a feature vector by a pooling operation;
S23: pass the feature vector through the fully connected layer to decide whether the crop shows a standard plane;
S24: upsample the feature map of a crop judged to be a standard plane in S23;
S25: apply a 1×1 convolution to the upsampled result to obtain heatmaps, and take a weighted projection of the responses on each heatmap to obtain the final key point coordinates (one possible realization is sketched after this step list);
S26: calculate the α and β angles required by the Graf method from the key point coordinates and grade the DDH.
Wherein: the processing method for extracting the features of the feature image comprises the following steps:
F_{n+1}=Pool(BN(Conv(F_{n})))
in the formula: f _ { n } represents the feature of the nth stage; pool indicates maximum pooling of 2*2; BN denotes batch regularization and Conv denotes convolution operation.
The method for converting the coded feature map into the feature vector by performing pooling operation comprises the following steps:
f=AvgPool(F_m)
in the formula: avgPool represents global average pooling; m is the total number of the input images for stage extraction; f _ m represents the feature finally obtained after the feature extraction of the input image in m stages.
The full-connection layer completes dot product through the weight and the feature vector, and obtains a numerical value representing a confidence coefficient:
p’=Sigmoid(w*f)
in the formula: sigmoid denotes Sigmoid activation function, w is full connection weight, and p' is confidence of final prediction.
Parameters in the sigmoid activation function adopt a gradient descent method to determine values, and the loss function according to the gradient descent method is as follows:
Loss=pLog(p’)+(1-p)Log(1-p’)
in the formula: and p represents whether the characteristic image is a standard section, and takes a value of 1 when the characteristic image is the standard section and takes a value of 0 when the characteristic image is a non-standard section.
The final key point coordinates are located as follows:
S11: apply three deconvolution layers to the feature map F extracted from the image crop:
F^{i+1} = ReLU(BN(Deconv(F^i)))
where F^i is the feature map after i deconvolution layers, ReLU is the activation function, BN is batch normalization, and Deconv is a deconvolution (transposed convolution) layer;
S12: apply a 1×1 convolution to the feature map F^3 obtained after the three deconvolutions to produce a heatmap H' for each key point;
S13: for each labeled key point of the standard plane, generate a corresponding target heatmap with a Gaussian function:
H(i, j) = exp{-((i - i')^2 + (j - j')^2) / (2δ^2)}
where (i, j) are coordinates on the target heatmap, (i', j') is the key point position, δ controls the decay of the Gaussian, and H is the target heatmap;
S14: compute the loss between the target heatmap H and the regressor output H' to optimize the neural network:
Loss = MSE(H, H')
where MSE is the mean squared error.
Example 3
The processing device provided by the invention comprises a memory and a processor; the memory stores at least one instruction, which is loaded and executed by the processor to implement the diagnosis method of Example 2.
Example 4
The invention provides a computer-readable storage medium storing at least one instruction, which is loaded and executed by a processor to implement the diagnosis method of Example 2.
The present invention is not limited to the above preferred embodiments, and any modifications, equivalent substitutions, improvements, etc. within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A classical Graf-based DDH ultrasound image artificial intelligence diagnosis system, characterized by comprising:
a classifier, used to identify the standard plane in an ultrasound image; and
a regressor, used to locate the key points in the standard plane, calculate from their position information the α and β angles required by the Graf method, and grade the DDH.
2. The classical Graf-based DDH ultrasound image artificial intelligence diagnosis system according to claim 1, wherein the classifier identifies the standard plane as follows: features are extracted and encoded from the image crop taken from the ultrasound frame, the encoded feature map is converted into a feature vector by a pooling operation, and the feature vector is passed through a fully connected layer to decide whether the crop shows a standard plane.
3. The classical Graf-based DDH ultrasound image artificial intelligence diagnosis system according to claim 2, wherein the feature extraction proceeds stage by stage as
F_{n+1} = Pool(BN(Conv(F_n)))
where F_n is the feature map of the n-th stage, Pool is 2×2 max pooling, BN is batch normalization, and Conv is a convolution; and
the encoded feature map is converted into a feature vector by a pooling operation:
f = AvgPool(F_m)
where AvgPool is global average pooling, m is the total number of feature-extraction stages applied to the input image, and F_m is the feature map obtained after the m stages.
4. The classical Graf-based DDH ultrasound image artificial intelligence diagnosis system according to claim 2, wherein the fully connected layer computes a dot product between its weight vector and the feature vector to produce a confidence score:
p' = Sigmoid(w · f)
where Sigmoid is the sigmoid activation function, w is the fully connected layer weight, and p' is the predicted confidence.
5. The classical Graf-based DDH ultrasound image artificial intelligence diagnosis system according to claim 4, wherein the parameters used to compute the sigmoid activation are determined by gradient descent on the loss function
Loss = -[p log(p') + (1-p) log(1-p')]
where p indicates whether the image crop is a standard plane, taking the value 1 for a standard plane and 0 otherwise.
6. The classical Graf-based DDH ultrasound image artificial intelligence diagnosis system according to claim 1, wherein the regressor locates the key points in the standard plane as follows:
S11: apply three deconvolution layers to the feature map F extracted from the image crop:
F^{i+1} = ReLU(BN(Deconv(F^i)))
where F^i is the feature map after i deconvolution layers, ReLU is the activation function, BN is batch normalization, and Deconv is a deconvolution layer;
S12: apply a 1×1 convolution to the feature map F^3 obtained after the three deconvolutions to produce a heatmap H' for each key point;
S13: for each key point of the standard plane, generate a corresponding target heatmap with a Gaussian function:
H(i, j) = exp{-((i - i')^2 + (j - j')^2) / (2δ^2)}
where (i, j) are coordinates on the target heatmap, (i', j') is the key point position, δ controls the decay of the Gaussian, and H is the target heatmap;
S14: compute the loss between the target heatmap H and the regressor output H' to optimize the neural network:
Loss = MSE(H, H')
where MSE is the mean squared error.
7. The classical Graf-based DDH ultrasound image artificial intelligence diagnosis system according to claim 1, wherein the bony roof angle α is the angle between the baseline and the bone roof line, the cartilage roof angle β is the angle between the baseline and the cartilage roof line, the baseline is the straight line through the perichondrium apex and the iliac margin point, the bone roof line is the straight line through the lowest point of the ilium and the bony acetabular apex, and the cartilage roof line is the straight line through the point where the cartilaginous roof turns from concave to convex and the center of the labrum.
8. A diagnosis method using the classical Graf-based DDH ultrasound image artificial intelligence diagnosis system according to any one of claims 1 to 7, characterized by comprising the following steps:
S21: the classifier extracts and encodes features from the image crop taken from the ultrasound frame;
S22: the classifier converts the encoded feature map into a feature vector by a pooling operation;
S23: the classifier passes the feature vector through the fully connected layer to decide whether the crop shows a standard plane;
S24: upsample the feature map of a crop judged to be a standard plane in S23;
S25: apply a 1×1 convolution to the upsampled result to obtain heatmaps, and take a weighted projection of the responses on each heatmap to obtain the final key point positions;
S26: calculate the α and β angles required by the Graf method from the key point positions and grade the DDH.
9. A processing device comprising a memory and a processor, the memory having stored therein at least one instruction, the at least one instruction being loaded and executed by the processor to implement the method of claim 8.
10. A computer-readable storage medium having stored therein at least one instruction which is loaded and executed by a processor to perform the method of claim 8.
CN202211155349.3A 2022-09-22 2022-09-22 Classical Graf-based DDH ultrasonic image artificial intelligence diagnosis system and method Pending CN115239720A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211155349.3A CN115239720A (en) 2022-09-22 2022-09-22 Classical Graf-based DDH ultrasonic image artificial intelligence diagnosis system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211155349.3A CN115239720A (en) 2022-09-22 2022-09-22 Classical Graf-based DDH ultrasonic image artificial intelligence diagnosis system and method

Publications (1)

Publication Number Publication Date
CN115239720A true CN115239720A (en) 2022-10-25

Family

ID=83667214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211155349.3A Pending CN115239720A (en) 2022-09-22 2022-09-22 Classical Graf-based DDH ultrasonic image artificial intelligence diagnosis system and method

Country Status (1)

Country Link
CN (1) CN115239720A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170006946A (en) * 2015-07-10 2017-01-18 삼성메디슨 주식회사 Untrasound dianognosis apparatus and operating method thereof
US20180259608A1 (en) * 2015-11-29 2018-09-13 Arterys Inc. Automated cardiac volume segmentation
CN109919943A (en) * 2019-04-16 2019-06-21 广东省妇幼保健院 Infant hip joint angle automatic testing method, system and calculating equipment
US20210056293A1 (en) * 2019-08-19 2021-02-25 Zhuhai Eeasy Technology Co., Ltd. Face detection method
CN110895809A (en) * 2019-10-18 2020-03-20 中国科学技术大学 Method for accurately extracting key points in hip joint image
CN111882531A (en) * 2020-07-15 2020-11-03 中国科学技术大学 Automatic analysis method for hip joint ultrasonic image
CN112215829A (en) * 2020-10-21 2021-01-12 深圳度影医疗科技有限公司 Positioning method of hip joint standard tangent plane and computer equipment
CN112907507A (en) * 2021-01-14 2021-06-04 杭州米迪智能科技有限公司 Graf method hip joint ultrasonic image measuring method, device, equipment and storage medium
CN113724328A (en) * 2021-08-31 2021-11-30 瓴域影诺(北京)科技有限公司 Hip joint key point detection method and system
CN113951925A (en) * 2021-10-28 2022-01-21 扬州市妇幼保健院 Graf ultrasonic technology-based hip joint standard ultrasonic image acquisition method and intelligent system
CN114795258A (en) * 2022-04-18 2022-07-29 浙江大学 Child hip joint dysplasia diagnosis system
CN114862789A (en) * 2022-05-04 2022-08-05 上海交通大学医学院附属仁济医院 Children hip joint ultrasonic image quality detection method based on deep learning

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JINGYUAN XU et al.: "Hip Landmark Detection With Dependency Mining in Ultrasound Image", IEEE Transactions on Medical Imaging *
JINGYUAN XU et al.: "Multi-task hourglass network for online automatic diagnosis of developmental dysplasia of the hip", World Wide Web *
SICHENG ZHANG et al.: "Collagen I in the Hip Capsule Plays a Role in Postoperative Clinical Function in Patients With Developmental Dysplasia of the Hip", Frontiers in Pediatrics *
YAOSHENG LU et al.: "Multitask Deep Neural Network for the Fully Automatic Measurement of the Angle of Progression", Computational and Mathematical Methods in Medicine *
ZHOUBING XU et al.: "Less is More: Simultaneous View Classification", arXiv:1805.10376v2 [cs.CV] *
孙锡玮 et al.: "Ultrasound artificial intelligence-assisted diagnosis of developmental dysplasia of the hip" (超声人工智能辅助诊断发育性髋关节发育不良), Chinese Journal of Orthopaedics (中华骨科杂志) *

Similar Documents

Publication Publication Date Title
CN110210463B (en) Precise ROI-fast R-CNN-based radar target image detection method
CN107609525B (en) Remote sensing image target detection method for constructing convolutional neural network based on pruning strategy
WO2019200747A1 (en) Method and device for segmenting proximal femur, computer apparatus, and storage medium
CN111259930A (en) General target detection method of self-adaptive attention guidance mechanism
CN112967243A (en) Deep learning chip packaging crack defect detection method based on YOLO
CN114972213A (en) Two-stage mainboard image defect detection and positioning method based on machine vision
CN112085024A (en) Tank surface character recognition method
CN112907519A (en) Metal curved surface defect analysis system and method based on deep learning
CN109801305B (en) SAR image change detection method based on deep capsule network
CN113378676A (en) Method for detecting figure interaction in image based on multi-feature fusion
CN109949280B (en) Image processing method, image processing apparatus, device storage medium, and growth evaluation system
CN117854072B (en) Automatic labeling method for industrial visual defects
CN112598031A (en) Vegetable disease detection method and system
CN116665011A (en) Coal flow foreign matter identification method for coal mine belt conveyor based on machine vision
CN114548253A (en) Digital twin model construction system based on image recognition and dynamic matching
CN115294033A (en) Tire belt layer difference level and misalignment defect detection method based on semantic segmentation network
CN116452899A (en) Deep learning-based echocardiographic standard section identification and scoring method
CN115995040A (en) SAR image small sample target recognition method based on multi-scale network
CN115861226A (en) Method for intelligently identifying surface defects by using deep neural network based on characteristic value gradient change
Wu et al. Fast particle picking for cryo-electron tomography using one-stage detection
CN111223113B (en) Nuclear magnetic resonance hippocampus segmentation algorithm based on dual dense context-aware network
CN116245855B (en) Crop variety identification method, device, equipment and storage medium
CN117522891A (en) 3D medical image segmentation system and method
CN112991280A (en) Visual detection method and system and electronic equipment
CN115830302B (en) Multi-scale feature extraction fusion power distribution network equipment positioning identification method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20221025