CN111563880B - Transverse process spinous process detection positioning method based on target detection and clustering - Google Patents
- Publication number
- CN111563880B (application CN202010270822.7A)
- Authority
- CN
- China
- Prior art keywords
- transverse
- clustering
- spinous
- image
- transverse process
- Prior art date
- Legal status: Active
Classifications

- G06T7/0012: Biomedical image inspection
- G06F18/2321: Non-hierarchical clustering using statistics or function optimisation, e.g. modelling of probability density functions
- G06F18/23213: Non-hierarchical clustering with a fixed number of clusters, e.g. K-means clustering
- G06N3/045: Neural networks; combinations of networks
- G06N3/08: Neural network learning methods
- G06T2207/10132: Image acquisition modality: ultrasound image
- G06T2207/30012: Subject of image: spine; backbone
Abstract
The invention discloses a transverse process and spinous process detection and positioning method based on target detection and clustering, comprising the following steps. Using an ultrasonic probe equipped with a position sensor, the spine within the region of interest is scanned from top to bottom, yielding a series of original ultrasonic images that contain the transverse processes, the spinous processes, and the position information of each pixel. The transverse processes and spinous processes in each frame are detected and localized with a tiny-YOLOv3-based target detection method, producing a series of three-dimensional discrete coordinate points. These discrete points are then clustered with a clustering algorithm, and each cluster center is taken as the three-dimensional spatial coordinate of a transverse process or spinous process, achieving accurate localization of the transverse process and spinous process positions of the spine. The method is robust, highly automated, accurate, simple, convenient to operate, and highly practical.
Description
Technical Field
The invention relates to a medical ultrasonic spine transverse process spinous process identification and positioning technology, in particular to a transverse process spinous process detection and positioning method based on target detection and clustering.
Background
Scoliosis is a three-dimensional deformity of the spine, and the positional distribution of the transverse processes and spinous processes is of great significance for its assessment. Ultrasound imaging is radiation-free and low-cost, and is widely used in the screening, diagnosis, and treatment of scoliosis; conventional ultrasound imaging comprises two-dimensional and three-dimensional approaches. Conventional two-dimensional ultrasound imaging is ill-suited to whole-spine examination because the probe is narrow and the field of view is small; it often must be combined with wide-field imaging techniques to acquire large-field images, and the three-dimensional positions of the transverse and spinous processes are difficult to compute accurately from two-dimensional image information alone. Conventional three-dimensional ultrasound imaging, on the other hand, requires expensive hardware, is computationally complex, has lower imaging resolution, and is not robust to noisy, boundary-blurred ultrasound images, making it difficult to obtain the three-dimensional positions of the transverse and spinous processes from the reconstructed volumes.
Disclosure of Invention
The invention aims to overcome the defects and shortcomings of the prior art by providing a transverse process and spinous process detection and positioning method based on target detection and clustering. A target detection algorithm identifies and detects the transverse processes and spinous processes in each frame of ultrasonic image, and a clustering algorithm clusters the detected discrete coordinate points, thereby obtaining relatively accurate transverse process and spinous process position information. The method is simple, efficient, fast, and highly practical.
The object of the invention is achieved by at least one of the following technical solutions.
A transverse process spinous process detection positioning method based on target detection and clustering comprises the following steps:
marking an ultrasonic image containing a transverse process and a spinous process, wherein the target to be marked comprises the transverse process and the spinous process;
training a tiny YOLOv3 target detection model by using the marked ultrasonic image;
detecting and positioning by using a trained tiny YOLOv3 target detection model to obtain a rectangular boundary box of each target, and converting the central point of the rectangular boundary box into a discrete point of a three-dimensional space;
and clustering discrete points in the three-dimensional space by using a clustering algorithm, wherein the clustering center is used as the three-dimensional position coordinates of the transverse process and the spinous process.
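The four steps above can be sketched end to end as follows. This is a hedged illustration only: `detect_targets` and `image_to_world` are hypothetical stand-ins for the trained tiny YOLOv3 detector and the calibrated coordinate transform described later, and the frame data is synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans

def detect_targets(frame):
    """Placeholder detector: returns (label, bounding-box centre) pairs.
    A real implementation would run the trained tiny YOLOv3 model."""
    return [("SP", (320.0 + np.random.randn(), 240.0 + np.random.randn()))]

def image_to_world(cx, cy, depth):
    """Placeholder for P_w = T1 . T2 . P_image; here `depth` stands in
    for the probe position reported by the position sensor."""
    return np.array([cx, cy, depth])

frames = range(160)  # one synthetic sweep of ultrasonic frames
sp_points = []
for z, frame in enumerate(frames):            # step 3: detect and convert
    for label, (cx, cy) in detect_targets(frame):
        if label == "SP":
            sp_points.append(image_to_world(cx, cy, float(z)))

# step 4: cluster the 3-D discrete points; cluster centres are taken
# as the spinous-process coordinates
centres = KMeans(n_clusters=16, n_init=10, random_state=0).fit(
    np.array(sp_points)).cluster_centers_
print(centres.shape)  # (16, 3)
```

In practice the same loop would also accumulate the left and right transverse-process point sets and cluster each separately.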
Further, before the marking of the ultrasonic image containing the transverse process and the spinous process, an ultrasonic probe with a position sensor is used for scanning the spine so as to obtain the original ultrasonic image data containing the transverse process and the spinous process.
Further, the scanning of the spine is specifically: the ultrasonic probe with the position sensor scans and collects the spine from the thoracic vertebra to the lumbar vertebra of the spine from top to bottom in the transverse direction.
Further, the training of the tiny YOLOv3 target detection model is performed, wherein the iteration number of the training process is 8000-10000, the number of input samples for each iteration is 64, the learning rate is 0.001, and the weight attenuation coefficient is 0.0005.
Further, the targets to be marked comprise transverse processes and spinous processes, specifically: the transverse processes located on the left side of the thoracic vertebrae (TP_L_tv), the right side of the thoracic vertebrae (TP_R_tv), the left side of the lumbar vertebrae (TP_L_lv), and the right side of the lumbar vertebrae (TP_R_lv), and the Spinous Processes (SP).
Further, the clustering algorithm is KMeans or DBSCAN, and clusters discrete points corresponding to the left transverse process (tp_l_tv, tp_l_lv), the right transverse process (tp_r_tv, tp_r_lv) and the Spinous Process (SP) to obtain a left transverse process cluster center, a right transverse process cluster center and a spinous process cluster center respectively.
Further, the center point of the rectangular bounding box is converted into a discrete point in three-dimensional space, specifically:

Let the coordinate point of the center of the target's rectangular bounding box be P_image. The target points P_image in all ultrasonic images are converted to the world coordinate system C_w by a three-dimensional coordinate transformation, yielding a series of discrete points in three-dimensional space. The transformation formula is:

P_w = T_1 · T_2 · P_image

where T_1 is the transformation matrix from the transmitter coordinate system C_t to the world coordinate system C_w; the transmitter is the electromagnetic-signal-transmitting electronic unit of the electromagnetic positioning system, the transformation into C_t is an intrinsic parameter of the device itself, and the coordinate information acquired by the position sensor has already been converted by the device manufacturer into the transmitter coordinate system. T_2 is the transformation matrix from the image coordinate system C_p to the transmitter coordinate system C_t; the image coordinate system C_p is established with the origin at the upper left corner of the image, the x-axis along the transverse direction of the image, the y-axis along the ultrasound beam direction, and the z-axis along the normal vector perpendicular to the ultrasound image plane. P_image denotes the center-point coordinates of each detected target bounding box, and P_w denotes the coordinate point of the target in the world coordinate system.
Further, the discrete points in the series of three-dimensional spaces are divided into a left transverse process point set, a right transverse process point set, and a spinous process point set, and the detected discrete points are clustered by the KMeans clustering algorithm: the cluster centers of the left transverse process point set are taken as the three-dimensional position coordinates of the left transverse processes, the cluster centers of the right transverse process point set as the three-dimensional position coordinates of the right transverse processes, and the cluster centers of the spinous process point set as the three-dimensional position coordinates of the spinous processes.
Compared with the prior art, the invention has the beneficial effects that:
according to the spine transverse process spinous process detection positioning method based on the deep learning model tiny yolv 3 and the clustering algorithm, which is provided by the invention, the transverse process and the spinous process in each frame of ultrasonic image are identified and detected by utilizing a target detection algorithm, and the detected discrete coordinate points are clustered by utilizing the clustering algorithm, so that relatively accurate transverse process and spinous process position information is obtained; the advantages of deep learning end-to-end learning and the unsupervised characteristic of a clustering algorithm are combined, automatic identification and detection are achieved, the spatial position of the transverse process spinous process is obtained automatically, and the accuracy of transverse process spinous process detection and positioning is improved. The method has the advantages of high robustness, high intelligent degree, simplicity, convenient operation and high practicability.
Drawings
The invention will be further described with reference to the drawings and embodiments.
Fig. 1 is a flowchart of the transverse process and spinous process detection and positioning method based on tiny YOLOv3 and KMeans in the present embodiment.
Fig. 2 is an acquired ultrasound image in the present embodiment.
Fig. 3 is a result diagram of detection using the tiny YOLOv3 target detection algorithm in the present embodiment.
Fig. 4 is an explanatory diagram of the ultrasound image coordinate system in the present embodiment.
Fig. 5 is a graph of the coordinate points corresponding to the transverse processes and spinous processes obtained by KMeans clustering in the present embodiment.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not limiting of the invention, as reference may be made to conventional techniques for non-specifically identified process parameters.
Examples:
a transverse process spinous process detection positioning method based on target detection and clustering, the basic flow of which is shown in fig. 1, the method comprises the following steps:
s1, scanning the spine by adopting an ultrasonic probe with a position sensor so as to obtain original ultrasonic image data containing transverse processes and spinous processes.
A position sensor from the miniBIRD electromagnetic positioning system (Ascension Technology Corporation, Burlington, VT, USA) is mounted on the ultrasonic probe to record the three-dimensional position of the probe as it moves. The probe carrying the sensor is placed against the skin over the spine in the transverse orientation, i.e., with the ultrasonic beam plane of the probe perpendicular to the spinal axis. The acquisition area covers the range from the thoracic vertebrae to the lumbar vertebrae, and scanning proceeds from top to bottom to obtain a series of ultrasonic images containing the outlines of the transverse and spinous processes, while the electromagnetic position sensor acquires the position of the probe in real time. An acquired ultrasound image converted to JPG format is shown in fig. 2.
The acquired raw ultrasound image data is divided into a training set and a test set.
S2, marking the ultrasonic image.
The original ultrasonic image data in the training set is converted into an image in a JPG format, and five targets in the image are marked, wherein the five targets are respectively a left transverse process (TP_L_tv) of the thoracic vertebra, a right transverse process (TP_R_tv) of the thoracic vertebra, a left transverse process (TP_L_lv) of the lumbar vertebra, a right transverse process (TP_R_lv) of the lumbar vertebra and a Spinous Process (SP).
S3, training a tiny YOLOv3 target detection model.
The tiny YOLOv3 target detection model is trained with the marked ultrasonic images. Training runs for 8000-10000 iterations with 64 input samples per iteration, a learning rate of 0.001, and a weight decay coefficient of 0.0005, on a computer with an NVIDIA GPU, yielding a tiny YOLOv3 model capable of detecting the transverse and spinous processes in ultrasonic images.
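As a hedged illustration, these hyperparameters correspond to fields of a darknet-style tiny-YOLOv3 configuration file; the exact layout depends on the darknet release used, so this fragment is an assumption rather than the patent's actual configuration:

```ini
[net]
batch=64              ; 64 input samples per iteration
learning_rate=0.001
decay=0.0005          ; weight decay coefficient
max_batches=10000     ; upper end of the 8000-10000 iteration range
```

The detection layers would additionally be configured for the five target classes (TP_L_tv, TP_R_tv, TP_L_lv, TP_R_lv, SP).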
The original ultrasonic image data in the test set is converted into an image in a JPG format through format conversion, and is input into a trained tiny YOLOv3 target detection model to test the performance of the model.
And S4, detecting and positioning by using a trained tiny YOLOv3 target detection model to obtain a rectangular boundary box of each target, and converting the central point of the rectangular boundary box into a discrete point of a three-dimensional space.
Each frame of ultrasonic image is detected with the trained tiny YOLOv3 model to obtain a rectangular bounding box for each target; as shown in fig. 3, five classes of targets are detected and localized: the left transverse process of the thoracic vertebrae (TP_L_tv), the right transverse process of the thoracic vertebrae (TP_R_tv), the left transverse process of the lumbar vertebrae (TP_L_lv), the right transverse process of the lumbar vertebrae (TP_R_lv), and the Spinous Process (SP). Let the coordinate point of the center of a target's rectangular bounding box be P_image. The target points P_image in all ultrasonic images are converted to the world coordinate system C_w by the following three-dimensional coordinate transformation, yielding a series of discrete points in three-dimensional space:

P_w = T_1 · T_2 · P_image    (1)

where T_1 is the transformation matrix from the transmitter coordinate system C_t to the world coordinate system C_w. The transmitter is the electromagnetic-signal-transmitting electronic unit of the electromagnetic positioning system; the transformation into C_t is an intrinsic parameter of the device itself, and the coordinate information acquired by the position sensor has already been converted by the device manufacturer into the transmitter coordinate system C_t. T_2 is the transformation matrix from the image coordinate system C_p to the transmitter coordinate system C_t. The image coordinate system C_p is established as shown in fig. 4, with its origin at the upper left corner of the image, the x-axis along the transverse direction of the image, the y-axis along the ultrasound beam direction, and the z-axis along the normal vector perpendicular to the ultrasound image plane. P_image denotes the center-point coordinates of the detected target bounding boxes, and P_w denotes the coordinate point of the target in the world coordinate system. The three-dimensional point-set data for clustering is obtained by equation (1).
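The transformation chain P_w = T_1 · T_2 · P_image can be sketched with homogeneous 4x4 matrices. This is a minimal sketch under stated assumptions: the rotation and translation values below are placeholders, since the real T_1 and T_2 come from the electromagnetic tracker's calibration and the sensor readings.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# T1: transmitter -> world (placeholder pose; identity rotation)
T1 = make_transform(np.eye(3), np.array([100.0, 50.0, 0.0]))
# T2: image -> transmitter (placeholder pose reported by the position sensor)
T2 = make_transform(np.eye(3), np.array([5.0, -3.0, 20.0]))

def image_point_to_world(cx, cy, T1, T2):
    """Map a bounding-box centre (cx, cy) in image coordinates to world space.
    The image plane lies at z = 0 in the image coordinate system C_p."""
    p_image = np.array([cx, cy, 0.0, 1.0])   # homogeneous image point
    p_world = T1 @ T2 @ p_image              # P_w = T1 . T2 . P_image
    return p_world[:3]

print(image_point_to_world(320.0, 240.0, T1, T2))
```

With the placeholder translations above, the point (320, 240) maps to (425, 287, 20); in practice each frame carries its own sensor-reported T_2.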
S5, clustering discrete points in the three-dimensional space by using a clustering algorithm, wherein a clustering center is used as the three-dimensional position coordinates of the transverse process and the spinous process.
The three-dimensional point-set data obtained in step S4 is divided into a left transverse process point set, a right transverse process point set, and a spinous process point set, and the detected discrete points in each set are clustered with the KMeans method. For the left and right transverse process point sets the empirical K value is usually 30; for the spinous process point set it is usually 16. Finally, the cluster center points are output as the spatial position information of the left transverse processes, right transverse processes, and spinous processes respectively, as shown in fig. 5.
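Step S5 with the empirical K values can be sketched as follows; the point sets here are synthetic stand-ins for the detected discrete points, with about ten noisy observations around each assumed true location.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def cluster_centres(points, k):
    """Cluster a 3-D point set with KMeans and return the cluster centres."""
    return KMeans(n_clusters=k, n_init=10,
                  random_state=0).fit(points).cluster_centers_

# synthetic detections: ~10 noisy observations around each true location
left_tp = np.vstack([rng.uniform(0, 100, 3) + rng.normal(0, 0.5, (10, 3))
                     for _ in range(30)])
sp = np.vstack([rng.uniform(0, 100, 3) + rng.normal(0, 0.5, (10, 3))
                for _ in range(16)])

left_tp_coords = cluster_centres(left_tp, 30)   # K = 30 per transverse set
sp_coords = cluster_centres(sp, 16)             # K = 16 for the spinous set
print(left_tp_coords.shape, sp_coords.shape)
```

DBSCAN could be substituted here, as the disclosure allows, which would avoid fixing K in advance at the cost of choosing a neighbourhood radius instead.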
By clustering the three-dimensional discrete points by adopting a KMeans algorithm, finer position information can be searched, and the clustering center point is used as a three-dimensional space coordinate point of the transverse process and the spinous process, so that the accurate positioning of the positions of the transverse process and the spinous process of the spine can be realized.
The present invention can be preferably implemented as described above.
The foregoing examples are preferred embodiments of the present invention, which are described in more detail and are not to be construed as limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the protection scope of the present invention is subject to the claims.
Claims (4)
1. The transverse process spinous process detection and positioning method based on target detection and clustering is characterized by comprising the following steps of:
marking an ultrasonic image containing a transverse process and a spinous process, wherein the target to be marked comprises the transverse process and the spinous process;
training a tiny YOLOv3 target detection model by using the marked ultrasonic image;
detecting each frame of ultrasonic image by using a trained tiny YOLOv3 model to obtain a rectangular boundary frame of each target, and converting the central point of the rectangular boundary frame into a discrete point of a three-dimensional space;
clustering discrete points in the three-dimensional space by using a clustering algorithm, wherein a clustering center is used as a three-dimensional position coordinate of a transverse process and a spinous process;
wherein the targets to be marked comprise transverse processes and spinous processes, specifically: transverse processes located on the left side of the thoracic vertebrae (tp_l_tv), the right side of the thoracic vertebrae (tp_r_tv), the left side of the lumbar vertebrae (tp_l_lv), and the right side of the lumbar vertebrae (tp_r_lv), and the Spinous Processes (SP);
the clustering algorithm is KMeans or DBSCAN, and clusters discrete points corresponding to left transverse processes (tp_l_tv, tp_l_lv), right transverse processes (tp_r_tv, tp_r_lv) and Spinous Processes (SP) to obtain a left transverse process clustering center, a right transverse process clustering center and a spinous process clustering center respectively;
the method for converting the center point of the rectangular bounding box into the discrete point of the three-dimensional space comprises the following steps:
let the coordinate point of the center of the target's rectangular bounding box be P_image; the target points P_image in all ultrasonic images are converted to the world coordinate system C_w by a three-dimensional coordinate transformation, yielding a series of discrete points in three-dimensional space, wherein the transformation formula is:

P_w = T_1 · T_2 · P_image

where T_1 is the transformation matrix from the transmitter coordinate system C_t to the world coordinate system C_w; the transmitter is the electromagnetic-signal-transmitting electronic unit of the electromagnetic positioning system, the transformation into C_t is an intrinsic parameter of the device itself, and the coordinate information acquired by the position sensor has already been converted by the device manufacturer into the transmitter coordinate system; T_2 is the transformation matrix from the image coordinate system C_p to the transmitter coordinate system C_t; the image coordinate system C_p is established with the origin at the upper left corner of the image, the x-axis along the transverse direction of the image, the y-axis along the ultrasound beam direction, and the z-axis along the normal vector perpendicular to the ultrasound image plane; P_image denotes the center-point coordinates of the detected target bounding boxes, and P_w denotes the coordinate point of the target in the world coordinate system;
dividing the discrete points in the series of three-dimensional spaces into a left transverse process point set, a right transverse process point set and a spinous process point set, clustering the detected discrete points by adopting a clustering algorithm, taking the clustering center point result of the obtained left transverse process point set as the three-dimensional position coordinate of the left transverse process point set, taking the clustering center point result of the obtained right transverse process point set as the three-dimensional position coordinate of the right transverse process point set, and taking the clustering center point result of the obtained spinous process point set as the three-dimensional position coordinate of the spinous process.
2. The transverse process spinous process detection and positioning method based on target detection and clustering as claimed in claim 1, wherein the method comprises the following steps: before the marking of the ultrasonic image containing the transverse process and the spinous process, an ultrasonic probe with a position sensor is adopted to scan the spine so as to obtain the original ultrasonic image data containing the transverse process and the spinous process.
3. The transverse process spinous process detection and positioning method based on target detection and clustering as claimed in claim 2, wherein the method comprises the following steps: the spine scanning method specifically comprises the following steps of: the ultrasonic probe with the position sensor scans and collects the spine from the thoracic vertebra to the lumbar vertebra of the spine from top to bottom in the transverse direction.
4. The method for detecting and positioning spinous processes of transverse processes based on target detection and clustering according to claim 1, wherein the training of the tiny YOLOv3 target detection model is performed, the number of iterations of the training process is 8000-10000, the number of input samples per iteration is 64, the learning rate is 0.001, and the weight attenuation coefficient is 0.0005.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010270822.7A CN111563880B (en) | 2020-04-08 | 2020-04-08 | Transverse process spinous process detection positioning method based on target detection and clustering |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111563880A CN111563880A (en) | 2020-08-21 |
CN111563880B true CN111563880B (en) | 2023-11-10 |
Family
ID=72071552
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010270822.7A Active CN111563880B (en) | 2020-04-08 | 2020-04-08 | Transverse process spinous process detection positioning method based on target detection and clustering |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111563880B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112132064A (en) * | 2020-09-25 | 2020-12-25 | 新希望六和股份有限公司 | Method, device, equipment and medium for identifying number of pregnant sacs based on artificial intelligence |
CN115620042B (en) * | 2022-12-20 | 2023-03-10 | 菲特(天津)检测技术有限公司 | Gear model determination method and system based on target detection and clustering |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106361376A (en) * | 2016-09-23 | 2017-02-01 | 华南理工大学 | Ultrasonic wide-view imaging method for spinal scoliosis |
CN108670301A (en) * | 2018-06-06 | 2018-10-19 | 西北工业大学 | A kind of backbone transverse process localization method based on ultrasonic image |
CN109064473A (en) * | 2018-07-26 | 2018-12-21 | 华南理工大学 | A kind of 2.5D ultrasonic wide-scene image partition method |
Also Published As
Publication number | Publication date |
---|---|
CN111563880A (en) | 2020-08-21 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |