CN116687445B - Automatic positioning and tracking method, device, equipment and storage medium for ultrasonic fetal heart - Google Patents


Info

Publication number: CN116687445B (application number CN202310943953.0A)
Authority: CN (China)
Prior art keywords: heart, image, ultrasonic, data, current
Legal status: Active (the legal status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN116687445A (application publication)
Inventor: 芦振寰
Assignee (current and original): Shenzhen Wisonic Medical Technology Co., Ltd. (the listed assignees may be inaccurate)
Application filed by Shenzhen Wisonic Medical Technology Co., Ltd.
Priority to CN202310943953.0A
Published as application CN116687445A, then granted and published as CN116687445B

Classifications

    • A61B 8/02 - Diagnosis using ultrasonic waves: measuring pulse or heart rate
    • A61B 8/0866 - Detecting organic movements or changes involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
    • A61B 8/0883 - Detecting organic movements or changes for diagnosis of the heart
    • A61B 8/0891 - Detecting organic movements or changes for diagnosis of blood vessels
    • A61B 8/523 - Ultrasonic data/image processing for generating planar views from image data in a user-selectable plane not corresponding to the acquisition plane
    • A61B 8/5246 - Ultrasonic data/image processing combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
    • A61B 8/5292 - Ultrasonic data/image processing using additional data, e.g. patient information, image labeling, acquisition parameters
    • G06N 3/0442 - Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • G06N 3/045 - Combinations of networks
    • G06N 3/0464 - Convolutional networks [CNN, ConvNet]
    • G06V 10/245 - Aligning, centring, orientation detection or correction of the image by locating a pattern; special marks for positioning
    • G06V 10/62 - Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; pattern tracking
    • G06V 10/764 - Recognition using pattern recognition or machine learning: classification, e.g. of video objects
    • G06V 10/806 - Fusion of extracted features at the sensor, preprocessing, feature-extraction or classification level
    • G06V 10/82 - Recognition using neural networks
    • G06V 2201/031 - Recognition of patterns in medical or anatomical images of internal organs
    • G06V 2201/07 - Target detection

Abstract

The invention relates to the field of data processing and discloses an automatic positioning and tracking method, device, equipment and storage medium for an ultrasonic fetal heart. The method comprises the following steps: acquiring ultrasonic data of the fetal heart at the current moment and at historical moments; processing the ultrasonic data to obtain a current ultrasonic image and historical ultrasonic images; generating an image sequence based on the historical ultrasonic images; judging whether the heart image in the current ultrasonic image is a standard cardiac section, to obtain a judgment result; and positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image. By analyzing the ultrasonic data of the fetal heart, the ultrasonic probe automatically positions and tracks the moving fetal heart, assists the doctor in measurement and observation, reduces the doctor's workload, reduces interference during fetal heart scanning, and greatly improves the accuracy of fetal heart screening.

Description

Automatic positioning and tracking method, device, equipment and storage medium for ultrasonic fetal heart
Technical Field
The invention relates to the technical field of data processing, and in particular to an automatic positioning and tracking method, device, equipment and storage medium for an ultrasonic fetal heart.
Background
Fetal heart ultrasound screening is a prenatal ultrasound examination that assesses the structure, function and rhythm of the fetal heart in order to detect congenital cardiovascular malformations, arrhythmias or abnormal cardiac function. The screening includes examination of standard sections at various positions to assess heart position, size, cardiac axis, the atrioventricular valves, the ventricular septum, pulmonary venous return, the crossing relationship of the great arteries, and so on.
When performing a fetal heart examination, the doctor first needs to locate a standard cardiac section and then carry out a series of operations to make measurements and observations, for example entering the zoom mode and the blood-flow mode, opening the PW (pulsed-wave Doppler) spectrum, and so on. These operations take time and effort, and the fetus or the probe is likely to move in the meantime, so the standard cardiac section has to be found again. This wastes time and delays the examination.
Disclosure of Invention
In view of the above, embodiments of the present invention provide an automatic positioning and tracking method, device, equipment and storage medium for an ultrasonic fetal heart, to solve the problem that the prior art cannot achieve automatic positioning and tracking, which makes observation inconvenient for the doctor.
In a first aspect, an embodiment of the present invention provides an automatic positioning and tracking method for an ultrasonic fetal heart, including:
acquiring ultrasonic data of the fetal heart at the current moment and at historical moments;
processing the ultrasonic data to obtain a current ultrasonic image and historical ultrasonic images;
generating an image sequence based on the historical ultrasonic images;
judging whether the heart image in the current ultrasonic image is a standard cardiac section, to obtain a judgment result;
and positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image.
By analyzing the ultrasonic data of the fetal heart, the ultrasonic probe can automatically position and track the moving fetal heart, assisting the doctor in measurement and observation, reducing the doctor's workload, reducing interference during fetal heart scanning, and greatly improving the accuracy of fetal heart screening.
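The claimed steps can be sketched as a simple control loop. All callables here are hypothetical placeholders for the modules the patent describes, not an actual device API, and frames are assumed to carry a timestamp `t`:

```python
def track_fetal_heart(acquire_frames, to_image, is_standard_view,
                      predict_vertical_move, keep_heart_in_view,
                      max_iters=10):
    """Hypothetical sketch of the five claimed steps: image, classify,
    and steer the probe until a standard cardiac section is found,
    then keep the heart inside the field of view."""
    for _ in range(max_iters):
        current_raw, history_raw = acquire_frames()           # step 1: raw data
        current_img = to_image(current_raw)                   # step 2: images
        history = [to_image(r) for r in history_raw]
        sequence = sorted(history, key=lambda f: f["t"])      # step 3: sequence
        if is_standard_view(current_img):                     # step 4: judge
            keep_heart_in_view(current_img)                   # step 5: track
            return True
        predict_vertical_move(sequence)                       # steer probe, retry
    return False
```

Each placeholder corresponds to one module of the device described in the second aspect below.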
In an alternative embodiment, processing the ultrasound data to obtain the current ultrasound image and the historical ultrasound images includes:
taking the modulus of the complex-valued ultrasonic data at each moment to obtain a position signal;
carrying out logarithmic compression on the position signal to obtain a compressed signal;
carrying out data-image preprocessing on the compressed signal to obtain image data;
and carrying out coordinate conversion on the image data to obtain an ultrasonic image for each moment.
By optimizing the ultrasonic data in this way, noise interference is reduced, the accuracy of the resulting ultrasonic image is effectively improved, and reliable support is provided for the subsequent analysis.
In an alternative embodiment, before the data-image preprocessing of the compressed signal, the method further comprises:
performing spatial compounding on the compressed signal.
Images of the same area transmitted and received at multiple angles are compounded; by weakening the coherence effect, speckle noise is greatly reduced and the contrast resolution of the whole image is improved.
In an alternative embodiment, generating an image sequence based on the historical ultrasound images includes:
sorting the historical ultrasonic images according to their time information to obtain a sequence to be processed;
and extracting, from the sequence to be processed, the ultrasonic images at the moments closest to the current moment, according to a preset required number and a preset time interval, to obtain the image sequence.
By generating the image sequence, the movement trend of the fetal heart can be analyzed later, better guiding the direction in which the probe should be moved.
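A minimal sketch of this selection step. The required number and the time interval are illustrative presets, not values from the patent, and frames are assumed to carry a timestamp `t`:

```python
def build_image_sequence(history, count=8, interval=2):
    """Sort historical frames by timestamp, then keep `count` frames
    closest to the current moment, spaced `interval` frames apart,
    returned in chronological order."""
    ordered = sorted(history, key=lambda frame: frame["t"])  # sequence to be processed
    newest_first = ordered[::-1]
    picked = newest_first[::interval][:count]                # preset spacing and number
    return picked[::-1]                                      # back to chronological order
```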
In an alternative embodiment, judging whether the heart image in the current ultrasound image is a standard cardiac section, to obtain a judgment result, includes:
extracting the heart image from the current ultrasonic image;
and inputting the heart image into a preset classification model to obtain an output result, wherein the output result is the judgment result, and the possible outputs are "standard cardiac section" and "non-standard cardiac section".
Through the preset classification model, the heart image can be accurately identified and it can be judged whether it is a standard cardiac section, so that adjustments can be made based on this result in the subsequent steps, saving time and improving efficiency.
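The patent does not disclose the classifier's internals (the CPC codes suggest a trained CNN). As a hypothetical stand-in, the two-stage extract-then-classify logic can be sketched with a trivial intensity-threshold "model"; the bounding box, cutoff and model class are all illustrative assumptions:

```python
def crop_heart(image, bbox):
    """Extract the heart region from a 2-D image (list of rows);
    bbox = (row0, row1, col0, col1), half-open."""
    r0, r1, c0, c1 = bbox
    return [row[c0:c1] for row in image[r0:r1]]

class ThresholdModel:
    """Toy stand-in for the preset classification model: deems the
    patch a standard cardiac section when its mean intensity passes
    a cutoff.  A real system would use a trained CNN instead."""
    def __init__(self, cutoff=0.5):
        self.cutoff = cutoff

    def predict(self, patch):
        flat = [v for row in patch for v in row]
        return sum(flat) / len(flat)

def judge_standard_section(image, bbox, model):
    """Both steps combined: extract the heart image, run the model,
    return True for 'standard cardiac section'."""
    return model.predict(crop_heart(image, bbox)) >= model.cutoff
```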
In an alternative embodiment, when the judgment result is a non-standard cardiac section, positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image includes:
analyzing the vertical movement direction of the ultrasonic probe according to the image sequence;
moving the ultrasonic probe in that vertical direction;
and acquiring a new current ultrasonic image after the ultrasonic probe has moved, then returning to the step of judging whether the heart image in the current ultrasonic image is a standard cardiac section, until the judgment result is a standard cardiac section.
By determining the vertical movement direction of the ultrasonic probe, the heart position is tracked, so that the doctor can locate the heart more accurately and conveniently and can better perform the related screening and measurement work.
In an alternative embodiment, analyzing the vertical movement direction of the ultrasound probe from the image sequence comprises:
inputting the image sequence into a preset motion-prediction model to obtain the vertical movement direction of the ultrasonic probe.
The movement trend of the heart is judged by the preset motion-prediction model, so that the vertical movement direction of the ultrasonic probe is obtained more accurately and the correct direction guidance is fed back in time.
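The motion-prediction model's internals are not disclosed (the G06N 3/0442 classification hints at a recurrent network such as an LSTM). As a hypothetical stand-in, one can extrapolate the trend of the heart centroid's vertical coordinate across the image sequence:

```python
def predict_vertical_direction(centroids):
    """Stand-in for the preset motion-prediction model: average the
    frame-to-frame vertical displacement of the heart centroid and
    extrapolate.  `centroids` is a chronological list of (x, y)."""
    ys = [y for _, y in centroids]
    steps = [b - a for a, b in zip(ys, ys[1:])]
    drift = sum(steps) / len(steps)
    if drift > 0:
        return "down"   # centroid moving toward larger row indices
    if drift < 0:
        return "up"
    return "hold"
```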
In an alternative embodiment, when the judgment result is a standard cardiac section, positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image includes:
judging whether the whole heart image in the current ultrasonic image lies within the image area of the current ultrasonic image;
if the heart image is within the image area, directly refreshing the heart position;
if the heart image is not entirely within the image area, translating the ultrasonic probe according to the heart image and the image area, so that the heart image comes to lie within the image area.
By locating and tracking the heart position in time through image judgment, the whole heart image is kept within the image area, which facilitates the doctor's observation, reduces interference with the doctor's work, and allows the heart to be located more accurately and conveniently, so that the related screening and measurement work can be performed better.
In an alternative embodiment, translating the ultrasound probe according to the heart image and the image area includes:
analyzing, from the current ultrasonic image, the heart data that lie outside the image area;
calculating the relative distance that the ultrasonic probe needs to move, according to the heart data and the image area;
and translating the ultrasonic probe towards the heart by the relative distance.
The relative distance the ultrasonic probe needs to move is obtained by calculation, so that automatic tracking is realized accurately, with no need for manual operation.
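A minimal sketch of the distance computation, assuming axis-aligned bounding boxes in image coordinates; the mapping from pixels to physical probe motion is device-specific and omitted:

```python
def probe_translation(heart_bbox, image_bbox):
    """Return (dx, dy): how far, in image coordinates, the field of
    view must shift so the heart bounding box lies inside the image
    area.  Boxes are (x0, y0, x1, y1); zero means already inside."""
    hx0, hy0, hx1, hy1 = heart_bbox
    ix0, iy0, ix1, iy1 = image_bbox
    dx = dy = 0
    if hx0 < ix0:
        dx = hx0 - ix0        # heart sticks out on the left
    elif hx1 > ix1:
        dx = hx1 - ix1        # heart sticks out on the right
    if hy0 < iy0:
        dy = hy0 - iy0        # sticks out at the top
    elif hy1 > iy1:
        dy = hy1 - iy1        # sticks out at the bottom
    return dx, dy
```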
In a second aspect, an embodiment of the present invention provides an automatic positioning and tracking device for an ultrasonic fetal heart, including:
an acquisition module, used for acquiring ultrasonic data of the fetal heart at the current moment and at historical moments;
a processing module, used for processing the ultrasonic data to obtain a current ultrasonic image and historical ultrasonic images;
a generation module, used for generating an image sequence based on the historical ultrasonic images;
a judgment module, used for judging whether the heart image in the current ultrasonic image is a standard cardiac section, to obtain a judgment result;
and a positioning and tracking module, used for positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image.
In a third aspect, embodiments of the present invention provide a computer device comprising:
the device comprises a memory and a processor, wherein the memory is in communication connection with the processor, the memory stores computer instructions, and the processor executes the computer instructions so as to execute the automatic positioning and tracking method of the ultrasonic fetal heart provided by the embodiment of the invention.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium storing computer instructions for causing a computer to perform the automatic positioning and tracking method for an ultrasonic fetal heart provided by the embodiments of the invention.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention; a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic flowchart of an automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of another automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of yet another automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention;
FIG. 4 is a schematic flowchart of yet another automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention;
FIG. 5 is a schematic flowchart of yet another automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention;
FIG. 6 is a schematic flowchart of yet another automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention;
FIG. 7 is a structural block diagram of an automatic positioning and tracking device for an ultrasonic fetal heart according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of the hardware structure of a computer device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments are described clearly and completely below with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
An embodiment of the invention provides an automatic positioning and tracking method for an ultrasonic fetal heart. By analyzing ultrasonic data of the fetal heart, it automatically tracks fetal heart movement, assists the doctor in measurement and observation, reduces the doctor's workload, reduces interference during fetal heart scanning, and greatly improves the accuracy of fetal heart screening.
In accordance with an embodiment of the present invention, an embodiment of an automatic positioning and tracking method for an ultrasonic fetal heart is provided. It should be noted that the steps shown in the flowcharts of the figures may be executed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that given here.
In this embodiment, an automatic positioning and tracking method for an ultrasonic fetal heart is provided, which can be used to assist a doctor in fetal heart screening. Fig. 1 is a flowchart of the method according to an embodiment of the present invention; as shown in fig. 1, the flow includes the following steps:
step S101: ultrasound data of the fetal heart at the current time and at the historical time is acquired. Specifically, ultrasonic data is acquired by transmitting ultrasonic waves by a probe and receiving ultrasonic echoes by each array element channel. The probe is generally composed of a plurality of strip piezoelectric transducers (each single piezoelectric transducer is called an array element) with the same size at equal intervals; or two-dimensional arrays are connected, and array elements are arranged into a two-dimensional matrix shape. The probe used in the present invention is not essentially different from a general ultrasonic imaging system, and is externally emitted ultrasonic waves by converting voltage pulse excitation applied thereto into mechanical vibration through a built-in piezoelectric transducer. After the ultrasonic wave is sent out, the ultrasonic wave is scattered or reflected back to the probe by human tissue, and the piezoelectric transducer corresponding to each array element on the probe can convert the mechanical vibration caused by the echo into an electric signal, so that an original echo analog signal is formed in a corresponding channel. And synthesizing the original received echoes of all channels according to the geometric relationship and the physical principle, uploading the beamformed result to a PC end for processing, and analyzing the uploaded data packet into a organized data format according to a preset design format to obtain ultrasonic data.
Step S102: processing the ultrasonic data to obtain a current ultrasonic image and historical ultrasonic images. Specifically, by optimizing the ultrasonic data, noise interference is reduced, so that the accuracy of the resulting ultrasonic images is effectively improved and reliable support is provided for the subsequent analysis.
Step S103: generating an image sequence based on the historical ultrasonic images. Specifically, generating the image sequence makes it convenient to analyze the movement trend of the fetal heart later, better guiding the direction in which to move the probe and facilitating the positioning and tracking of the fetal heart.
Step S104: judging whether the heart image in the current ultrasonic image is a standard cardiac section, to obtain a judgment result. Specifically, knowing whether the image is a standard cardiac section allows the subsequent steps to make adjustments based on this result, saving time and improving efficiency.
Step S105: positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image. Specifically, positioning and tracking the heart facilitates the doctor's observation, reduces interference with the doctor's work, and allows the heart to be located more accurately and conveniently, so that the related screening and measurement work can be performed better.
Through steps S101 to S105, the automatic positioning and tracking method provided by this embodiment analyzes the ultrasonic data of the fetal heart so that the ultrasonic probe automatically positions and tracks the fetal heart as it moves, assisting the doctor in measurement and observation, reducing the doctor's workload, reducing interference during fetal heart scanning, and greatly improving the accuracy of fetal heart screening.
In this embodiment, an automatic positioning and tracking method for an ultrasonic fetal heart is provided, which can be used to assist a doctor in fetal heart screening. Fig. 2 is a flowchart of the method according to an embodiment of the present invention; as shown in fig. 2, the flow includes the following steps:
step S201: ultrasound data of the fetal heart at the current time and at the historical time is acquired. Specifically, the further detailed description of this step is the same as the step S101, and will not be repeated here.
Step S202: and processing the ultrasonic data to obtain a current ultrasonic image and a historical ultrasonic image. Specifically, the further detailed description of this step is the same as the step S102 described above, and will not be repeated here.
Step S203: an image sequence is generated based on the historical ultrasound images. Specifically, the further detailed description of this step is the same as the step S103 described above, and will not be repeated here.
Step S204: judging whether the heart image in the current ultrasonic image is a heart standard section or not, and obtaining a judging result. Specifically, the further detailed description of this step is the same as the step S104 described above, and will not be repeated here.
Step S205: and positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image. Specifically, this step is the same as step S105, and will not be described in detail here.
Specifically, step S202 described above includes:
Step S2021: taking the modulus of the complex-valued ultrasonic data at each moment to obtain a position signal. Specifically, the parsed data are arranged as a two-dimensional matrix of (number of sampling points) by (number of scanning lines), but the signal at each position is a complex signal. For more intuitive imaging, the modulus of the complex signal is taken to obtain the signal energy, so that the signal at each position is represented by its energy.
Step S2022: carrying out logarithmic compression on the position signal to obtain a compressed signal. Specifically, since the range of the data values after taking the modulus is very large, applying a logarithmic compression makes the tissue levels of the image clearer, so that the human eye can better perceive the structure of the image.
Step S2023: carrying out data-image preprocessing on the compressed signal to obtain image data. Specifically, the image preprocessing includes gain and dynamic-range transformation, image enhancement, edge-enhancement filtering, dead-zone adjustment, frame correlation, full-field gain, gray-scale mapping, and other parameters.
Step S2024: carrying out coordinate conversion on the image data to obtain an ultrasonic image for each moment. Specifically, the polar coordinate system based on the scanning lines is converted into the real physical rectangular coordinate system, and the rectangular-coordinate image is output and displayed.
Specifically, by optimizing the ultrasonic data in this way, noise interference is reduced, so that the accuracy of the resulting ultrasonic image is effectively improved and reliable support is provided for the subsequent analysis.
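Steps S2021 and S2022 together amount to envelope detection followed by log compression. A minimal sketch; the 60 dB dynamic range is a typical, illustrative choice, not a value from the patent:

```python
import math

def envelope_and_log(iq_lines, dynamic_range_db=60.0):
    """Take the modulus of each complex (IQ) sample to get the signal
    energy (step S2021), then log-compress relative to the peak into
    a fixed dynamic range and clip to [0, 1] (step S2022)."""
    env = [[abs(s) for s in line] for line in iq_lines]
    peak = max(max(line) for line in env) or 1.0
    out = []
    for line in env:
        row = []
        for e in line:
            db = 20.0 * math.log10(max(e, 1e-12) / peak)       # dB below peak
            row.append(max(0.0, 1.0 + db / dynamic_range_db))  # map to [0, 1]
        out.append(row)
    return out
```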
In some optional embodiments, before the step S2023, further includes: and performing space compounding processing on the compressed signals. Specifically, the same area images transmitted and received at a plurality of angles are compounded, the degree of speckle noise is greatly weakened in a mode of weakening the coherent effect, and the overall contrast resolution of the images is improved.
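The spatial compounding described above reduces to averaging co-registered frames acquired at different steering angles; because speckle is largely incoherent between angles while tissue structure is correlated, the average suppresses speckle. A minimal sketch (the function name and equal-weight averaging are illustrative assumptions):

```python
import numpy as np

def spatial_compound(angle_frames):
    """Average the same-area images transmitted/received at several angles.

    angle_frames: list of equally shaped 2-D arrays, one per steering angle.
    Returns the compounded image with weakened coherent speckle.
    """
    stack = np.stack(angle_frames, axis=0).astype(np.float64)
    return stack.mean(axis=0)
```

Averaging N frames with independent speckle reduces the speckle standard deviation by roughly a factor of sqrt(N), which is the contrast-resolution gain the text refers to.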
In this embodiment, an automatic positioning and tracking method for an ultrasonic fetal heart is provided, which can be used for assisting a doctor in fetal heart screening, and fig. 3 is a flowchart of the automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention, as shown in fig. 3, the flowchart includes the following steps:
Step S301: ultrasound data of the fetal heart at the current time and at the historical time is acquired. Specifically, the further detailed description of this step is the same as the step S101, and will not be repeated here.
Step S302: and processing the ultrasonic data to obtain a current ultrasonic image and a historical ultrasonic image. Specifically, the further detailed description of this step is the same as the step S102 described above, and will not be repeated here.
Step S303: an image sequence is generated based on the historical ultrasound images. Specifically, the further detailed description of this step is the same as the step S103 described above, and will not be repeated here.
Step S304: judging whether the heart image in the current ultrasonic image is a heart standard section or not, and obtaining a judging result. Specifically, the further detailed description of this step is the same as the step S104 described above, and will not be repeated here.
Step S305: and positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image. Specifically, this step is the same as step S105, and will not be described in detail here.
Specifically, step S303 described above includes:
step S3031: and sequencing the historical ultrasonic images according to the time information of the historical ultrasonic images to obtain a sequence to be processed.
Step S3032: and intercepting ultrasonic images at the moment closest to the current moment from the sequence to be processed according to the preset required quantity and the preset time interval to obtain an image sequence.
Specifically, generating an image sequence facilitates the subsequent analysis of the movement trend of the fetal heart, so as to better guide the movement direction of the probe. The preset required quantity and the preset time interval can be adjusted according to the accuracy requirements. Except when scanning has just started, images of the preset required quantity are collected going back in time; once the preset required quantity of ultrasonic images has been acquired, each newly input image frame replaces the earliest piece of image data, so that the module always holds a fixed amount of sequence data. The sequence data are stored in time order: the first image corresponds to time t-n, where n is the sequence length stored by the current module, and the last image corresponds to time t.
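The fixed-length, time-ordered buffer described above maps naturally onto a bounded deque. A minimal sketch (class and method names are assumptions; the preset time interval, i.e. frame decimation, is omitted for brevity):

```python
from collections import deque

class ImageSequenceBuffer:
    """Steps S3031-S3032: keep the most recent `required_count` frames,
    oldest at index 0 (time t-n), newest at the end (time t)."""

    def __init__(self, required_count):
        # deque with maxlen automatically evicts the earliest frame when full
        self._frames = deque(maxlen=required_count)

    def push(self, frame):
        """Insert a newly acquired frame, replacing the oldest if full."""
        self._frames.append(frame)

    def full(self):
        return len(self._frames) == self._frames.maxlen

    def sequence(self):
        """Return the frames in time order (t-n ... t)."""
        return list(self._frames)
```

With `required_count=3`, pushing frames 0..4 leaves the buffer holding frames 2, 3, 4 — exactly the "each new frame replaces the earliest image data" behavior in the text.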
In this embodiment, an automatic positioning and tracking method for an ultrasonic fetal heart is provided, which can be used for assisting a doctor in fetal heart screening, and fig. 4 is a flowchart of the automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention, as shown in fig. 4, the flowchart includes the following steps:
step S401: ultrasound data of the fetal heart at the current time and at the historical time is acquired. Specifically, the further detailed description of this step is the same as the step S101, and will not be repeated here.
Step S402: and processing the ultrasonic data to obtain a current ultrasonic image and a historical ultrasonic image. Specifically, the further detailed description of this step is the same as the step S102 described above, and will not be repeated here.
Step S403: an image sequence is generated based on the historical ultrasound images. Specifically, the further detailed description of this step is the same as the step S103 described above, and will not be repeated here.
Step S404: judging whether the heart image in the current ultrasonic image is a heart standard section or not, and obtaining a judging result. Specifically, the further detailed description of this step is the same as the step S104 described above, and will not be repeated here.
Step S405: and positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image. Specifically, this step is the same as step S105, and will not be described in detail here.
Specifically, step S404 described above includes:
step S4041: a heart image is extracted from the current ultrasound image. Specifically, the heart image only occupies a part of positions in the ultrasonic image, so that the heart image can be extracted first when the standard section of the heart is identified.
Step S4042: and inputting the heart image into a preset classification model to obtain an output result, wherein the output result is the judgment result, and the output result includes a heart standard section and a non-heart standard section. Specifically, the preset classification model used to determine the heart standard section is not particularly limited here; for example, a time-series neural network such as an LSTM may be used to extract temporal information and features for comparative analysis.
Specifically, through the preset classification model, the heart image can be accurately identified and judged, and whether the heart image is a heart standard section or not is judged, so that adjustment is performed based on the result in the subsequent step, time is saved, and efficiency is improved.
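Since the patent leaves the classification model open, the interface can be illustrated with a from-scratch NumPy LSTM cell run over per-frame feature vectors. Everything here is an assumption for demonstration: the weights are untrained placeholders (in practice they would be learned), the feature vectors would come from an image encoder, and the 0.5 threshold is arbitrary.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gate order in the stacked weights: i, f, g, o."""
    z = W @ x + U @ h + b            # shape (4*H,)
    H = h.size
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    i = sig(z[0:H])                  # input gate
    f = sig(z[H:2 * H])              # forget gate
    g = np.tanh(z[2 * H:3 * H])      # candidate cell state
    o = sig(z[3 * H:4 * H])          # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def classify_sequence(features, W, U, b, w_out, b_out):
    """Run the LSTM over per-frame features and threshold the final state
    to decide heart standard section vs non-heart standard section."""
    H = b.size // 4
    h = np.zeros(H)
    c = np.zeros(H)
    for x in features:
        h, c = lstm_step(x, h, c, W, U, b)
    score = 1.0 / (1.0 + np.exp(-(w_out @ h + b_out)))
    return "standard" if score >= 0.5 else "non-standard"
```

This only shows the sequence-in, label-out shape of the classifier; a production model would be trained on labeled standard/non-standard sections.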
In this embodiment, an automatic positioning and tracking method for an ultrasonic fetal heart is provided, which can be used for assisting a doctor in fetal heart screening, and fig. 5 is a flowchart of the automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention, as shown in fig. 5, the flowchart includes the following steps:
step S501: ultrasound data of the fetal heart at the current time and at the historical time is acquired. Specifically, the further detailed description of this step is the same as the step S101, and will not be repeated here.
Step S502: and processing the ultrasonic data to obtain a current ultrasonic image and a historical ultrasonic image. Specifically, the further detailed description of this step is the same as the step S102 described above, and will not be repeated here.
Step S503: an image sequence is generated based on the historical ultrasound images. Specifically, the further detailed description of this step is the same as the step S103 described above, and will not be repeated here.
Step S504: judging whether the heart image in the current ultrasonic image is a heart standard section or not, and obtaining a judging result. Specifically, the further detailed description of this step is the same as the step S104 described above, and will not be repeated here.
Step S505: and positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image. Specifically, this step is the same as step S105, and will not be described in detail here.
Specifically, when the determination result is a non-heart standard slice, step S505 includes:
step S5051: and analyzing the vertical movement direction of the ultrasonic probe according to the image sequence. Specifically, the motion trend of the heart can be predicted through the image sequence, so that whether and how the ultrasonic probe needs to be adjusted in the vertical direction are analyzed.
Step S5052: and moving the ultrasonic probe according to the vertical movement direction. Specifically, this process may be automated, or the movement direction may be indicated to the doctor so that the doctor performs the movement manually.
Step S5053: and acquiring a current ultrasonic image after the ultrasonic probe moves, and returning to the step of judging whether the heart image in the current ultrasonic image is a heart standard section or not until the judging result is the heart standard section.
Specifically, by judging the vertical movement direction of the ultrasonic probe, the heart position is tracked, so that a doctor can more accurately and conveniently position the heart, and related screening and measuring work can be better performed.
In some alternative embodiments, step S5051 includes: and inputting the image sequence into a preset motion prediction model to obtain the vertical movement direction of the ultrasonic probe. In particular, the vertical movement direction includes a direction vertically approaching the heart and a direction vertically away from the heart. And judging the heart movement trend through a preset movement prediction model, so that the vertical movement direction of the ultrasonic probe is more accurately obtained, and the correct direction guide is fed back in time.
In some alternative embodiments, step S5051 includes:
step a1, extracting heart long axis data of each frame of image from the image sequence.
And a2, comparing the heart long axis data with preset long axis data to obtain a plurality of difference data.
And a3, carrying out average value processing on the difference data to obtain a data difference value.
And a4, if the data difference value is negative, moving the ultrasonic probe vertically to a direction approaching the heart.
And a5, if the data difference value is positive, moving the ultrasonic probe vertically to a direction far away from the heart.
Specifically, when the data difference is negative, the heart image in the picture is too small, and the ultrasonic probe needs to be moved in the direction close to the heart, so that the heart image occupies the preset size in the whole display interface, which is convenient for observation and measurement. Similarly, when the data difference is positive, the heart image in the picture is too large, and the ultrasonic probe needs to be moved in the direction away from the heart.
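Steps a1-a5 can be sketched directly. The function name, units, and "hold" return value for a zero difference are assumptions; the sign convention follows the text (measured minus preset, negative means the heart appears too small):

```python
def vertical_move_direction(long_axis_measurements, preset_long_axis):
    """Steps a2-a5: compare per-frame heart long-axis data with the preset
    value, average the differences, and decide the vertical probe direction."""
    # a2: difference data per frame; a3: mean of the differences
    diffs = [m - preset_long_axis for m in long_axis_measurements]
    mean_diff = sum(diffs) / len(diffs)
    if mean_diff < 0:
        return "toward heart"      # a4: heart appears too small in the image
    if mean_diff > 0:
        return "away from heart"   # a5: heart appears too large in the image
    return "hold"                  # exact match: no vertical adjustment
```

For example, measured long axes of 30 and 32 against a preset of 40 give a mean difference of -9, so the probe is moved vertically toward the heart.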
In this embodiment, an automatic positioning and tracking method for an ultrasonic fetal heart is provided, which can be used for assisting a doctor in fetal heart screening, and fig. 6 is a flowchart of the automatic positioning and tracking method for an ultrasonic fetal heart according to an embodiment of the present invention, as shown in fig. 6, the flowchart includes the following steps:
step S601: ultrasound data of the fetal heart at the current time and at the historical time is acquired. Specifically, the further detailed description of this step is the same as the step S101, and will not be repeated here.
Step S602: and processing the ultrasonic data to obtain a current ultrasonic image and a historical ultrasonic image. Specifically, the further detailed description of this step is the same as the step S102 described above, and will not be repeated here.
Step S603: an image sequence is generated based on the historical ultrasound images. Specifically, the further detailed description of this step is the same as the step S103 described above, and will not be repeated here.
Step S604: judging whether the heart image in the current ultrasonic image is a heart standard section or not, and obtaining a judging result. Specifically, the further detailed description of this step is the same as the step S104 described above, and will not be repeated here.
Step S605: and positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image. Specifically, this step is the same as step S105, and will not be described in detail here.
Specifically, when the determination result is the heart standard section, step S605 includes:
step S6051: and judging whether the heart image in the current ultrasonic image is entirely within the image area range of the current ultrasonic image. Specifically, the fetus and the probe may move during the examination, so that the heart image is not entirely within the observable range of the acquired ultrasonic image.
Step S6052: if the heart image is entirely within the image area, the heart position is directly refreshed. In particular, in this case the probe does not need to be moved, only the heart position needs to be re-identified.
Step S6053: if the heart image is not in the image area range, translating the ultrasonic probe according to the heart image and the ultrasonic image area range, so that the heart image is in the image area range.
Specifically, the heart position is positioned and tracked in time through image judgment, so that the heart image is completely in the image area range, the observation of a doctor is facilitated, the work interference of the doctor is reduced, and the doctor can more accurately and conveniently position the heart, thereby being capable of better performing related screening and measuring work.
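The branch in steps S6051-S6053 amounts to a bounding-box containment test on the heart positioning frame. A minimal sketch (function names and the `(x0, y0, x1, y1)` box convention are assumptions):

```python
def heart_fully_visible(box, width, height):
    """Step S6051: is the heart bounding box (x0, y0, x1, y1) entirely
    inside the image area [0, width] x [0, height]?"""
    x0, y0, x1, y1 = box
    return x0 >= 0 and y0 >= 0 and x1 <= width and y1 <= height

def track_action(box, width, height):
    """Steps S6052/S6053: refresh the heart position when fully visible,
    otherwise the probe must be translated."""
    if heart_fully_visible(box, width, height):
        return "refresh position"
    return "translate probe"
```

The "refresh position" branch involves no probe motion, matching the text: only the heart position is re-identified on the new frame.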
In some optional embodiments, translating the ultrasound probe according to the heart image and the ultrasound image region range in step S6053 includes:
And b1, analyzing the heart data which are not in the image area range according to the current ultrasonic image.
And b2, calculating the relative distance required to be moved by the ultrasonic probe according to the heart data and the image area range.
And b3, translating the ultrasonic probe towards the direction of the heart according to the relative distance. Specifically, the heart position can be identified by setting a heart positioning frame, selecting the maximum circumscribed rectangle of the heart structure and outputting a coordinate result. For example, a YOLO target detection network can be used: the Backbone part extracts high-dimensional features of the image by combining multiple convolution modules and residual modules; the Neck part combines the features through repeated up-sampling and down-sampling, fusing features of multiple granularities; and the Prediction part retains the most suitable positioning frame through non-maximum suppression. Non-maximum suppression removes other interfering results by calculating the overlap ratio of the candidate positioning frames against the final output, and retains the most reliable positioning frame.
Specifically, the relative distance that the ultrasonic probe needs to move is calculated analytically, so that automatic tracking is realized accurately and the image area exactly contains the heart. No manual operation is needed, the doctor does not have to trigger the function by hand, the work of adjusting the sampling frame is reduced, and the accuracy is high.
The embodiment also provides an automatic positioning and tracking device for an ultrasonic fetal heart, which is used to implement the above embodiments and preferred implementations; what has already been described will not be repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
The present embodiment provides an ultrasonic fetal heart automatic positioning and tracking device, as shown in fig. 7, comprising:
an acquisition module 701 is configured to acquire ultrasound data of the fetal heart at the current time and at the historical time.
The processing module 702 is configured to process the ultrasound data to obtain a current ultrasound image and a historical ultrasound image.
A generation module 703 for generating a sequence of images based on the historical ultrasound images.
And the judging module 704 is used for judging whether the heart image in the current ultrasonic image is a heart standard section or not to obtain a judging result.
And the positioning tracking module 705 is used for positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image.
In some alternative embodiments, the processing module 702 includes:
And the module calculating unit is used for calculating the complex signal module of the ultrasonic data at each moment to obtain a position signal.
And the compression unit is used for carrying out logarithmic compression on the position signal to obtain a compressed signal.
And the preprocessing unit is used for preprocessing the data image of the compressed signal to obtain image data.
And the conversion unit is used for carrying out coordinate conversion on the image data to obtain an ultrasonic image at each moment.
In some alternative embodiments, the generating module 703 includes:
and the sequencing unit is used for sequencing the historical ultrasonic images according to the time information of the historical ultrasonic images to obtain a sequence to be processed.
The selecting unit is used for intercepting ultrasonic images which are closest to the current moment from the sequence to be processed according to the preset required quantity and the preset time interval to obtain an image sequence.
In some alternative embodiments, the determining module 704 includes:
and the extraction unit is used for extracting the heart image from the current ultrasonic image.
The judging unit is used for inputting the heart image into a preset classification model to obtain an output result, wherein the output result is a judging result, and the output result comprises a heart standard section and a non-heart standard section.
In some alternative embodiments, when the determination result is a non-heart standard slice, the positioning tracking module 705 includes:
And the first analysis unit is used for analyzing the vertical movement direction of the ultrasonic probe according to the image sequence.
And the first moving unit is used for moving the ultrasonic probe according to the vertical moving direction.
And the circulation unit is used for acquiring the current ultrasonic image after the ultrasonic probe moves, and returning to the step of judging whether the heart image in the current ultrasonic image is a heart standard section or not until the judging result is the heart standard section.
In some alternative embodiments, the first analysis unit comprises:
and the direction judging subunit is used for inputting the image sequence into a preset motion prediction model to obtain the vertical movement direction of the ultrasonic probe.
In some alternative embodiments, when the determination result is the heart standard section, the positioning tracking module 705 includes:
and the range judging unit is used for judging whether the heart images in the current ultrasonic image are all within the image area range of the current ultrasonic image.
And the refreshing unit is used for directly refreshing the heart position if the heart image is in the image area range.
And the second moving unit is used for translating the ultrasonic probe according to the heart image and the ultrasonic image area range if the heart image is only partially within the image area range, so that the heart image lies entirely within the image area range.
In some alternative embodiments, the second mobile unit includes:
and the analysis subunit is used for analyzing the heart data which is not in the image area range according to the current ultrasonic image.
And the calculating subunit is used for calculating the relative distance required to be moved by the ultrasonic probe according to the heart data and the image area range.
And the moving subunit is used for translating the ultrasonic probe towards the direction of the heart according to the relative distance.
Further functional descriptions of the above respective modules and units are the same as those of the above corresponding embodiments, and are not repeated here.
The ultrasonic fetal heart automatic positioning tracking device in this embodiment is presented in the form of functional units, where a unit refers to an ASIC (Application Specific Integrated Circuit), a processor and memory executing one or more pieces of software or firmware, and/or other devices that can provide the above-described functionality.
Further functional descriptions of the above respective modules are the same as those of the above corresponding embodiments, and are not repeated here.
The embodiment of the invention also provides computer equipment, which is provided with the automatic positioning and tracking device for the ultrasonic fetal heart shown in the figure 7.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a computer device according to an alternative embodiment of the present invention, as shown in fig. 8, the computer device includes: one or more processors 10, memory 20, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are communicatively coupled to each other using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executing within the computer device, including instructions stored in or on memory to display graphical information of the GUI on an external input/output device, such as a display device coupled to the interface. In some alternative embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Also, multiple computer devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 10 is illustrated in fig. 8.
The processor 10 may be a central processor, a network processor, or a combination thereof. The processor 10 may further include a hardware chip, among others. The hardware chip may be an application specific integrated circuit, a programmable logic device, or a combination thereof. The programmable logic device may be a complex programmable logic device, a field programmable gate array, a general-purpose array logic, or any combination thereof.
Wherein the memory 20 stores instructions executable by the at least one processor 10 to cause the at least one processor 10 to perform the methods shown in the above embodiments.
The memory 20 may include a storage program area that may store an operating system, at least one application program required for functions, and a storage data area; the storage data area may store data created according to the use of the computer device, etc. In addition, the memory 20 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some alternative embodiments, memory 20 may optionally include memory located remotely from processor 10, which may be connected to the computer device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Memory 20 may include volatile memory, such as random access memory; the memory may also include non-volatile memory, such as flash memory, hard disk, or solid state disk; the memory 20 may also comprise a combination of the above types of memories.
The computer device may further comprise input means and output means. The processor 10, memory 20, input devices, and output devices may be connected by a bus or other means.
The input device may receive entered numeric or character information and generate key signal inputs related to user settings and function control of the computer device, such as a touch screen, keypad, mouse, trackpad, touchpad, pointer stick, one or more mouse buttons, trackball, joystick, and the like. The output means may include a display device, auxiliary lighting means (e.g., LEDs), tactile feedback means (e.g., vibration motors), and the like. Such display devices include, but are not limited to, liquid crystal displays, light-emitting diode displays, and plasma displays. In some alternative implementations, the display device may be a touch screen.
The computer device also includes a communication interface 30 for the computer device to communicate with other devices or communication networks.
The embodiments of the present invention also provide a computer-readable storage medium. The method according to the above embodiments of the present invention may be implemented in hardware or firmware, or realized as computer code that can be recorded on a storage medium, or as computer code originally stored in a remote storage medium or a non-transitory machine-readable storage medium, downloaded through a network and stored in a local storage medium, so that the method described herein can be processed by software stored on a storage medium using a general-purpose computer, a special-purpose processor, or programmable or special-purpose hardware. The storage medium can be a magnetic disk, an optical disk, a read-only memory, a random access memory, a flash memory, a hard disk, a solid-state disk or the like; further, the storage medium may also comprise a combination of memories of the kinds described above. It will be appreciated that the computer, processor, microprocessor controller, or programmable hardware includes a storage element that can store or receive software or computer code which, when accessed and executed by the computer, processor or hardware, implements the methods illustrated by the above embodiments.
Although embodiments of the present invention have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope of the invention as defined by the appended claims.

Claims (9)

1. The automatic positioning and tracking method for the ultrasonic fetal heart is characterized by comprising the following steps of:
acquiring ultrasonic data of the fetal heart at the current moment and the historical moment;
processing the ultrasonic data to obtain a current ultrasonic image and a historical ultrasonic image;
generating a sequence of images based on the historical ultrasound images;
judging whether a heart image in the current ultrasonic image is a heart standard section or not to obtain a judging result;
positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image;
when the judging result is a non-heart standard section, positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image, wherein the positioning and tracking comprises the following steps:
analyzing the vertical movement direction of the ultrasonic probe according to the image sequence; moving the ultrasonic probe according to the vertical movement direction; acquiring a current ultrasonic image after the ultrasonic probe moves, and returning to the step of judging whether a heart image in the current ultrasonic image is a heart standard section or not until the judging result is the heart standard section;
the analyzing the vertical movement direction of the ultrasonic probe according to the image sequence comprises: extracting heart long axis data of each frame of image from the image sequence; comparing the heart long axis data with preset long axis data to obtain a plurality of difference data; carrying out mean value processing on the difference data to obtain a data difference value; if the data difference is negative, the ultrasonic probe is vertically moved to the direction close to the heart; if the data difference is positive, the ultrasonic probe is vertically moved to a direction far away from the heart;
When the judging result is a heart standard section, positioning and tracking the fetal heart according to the judging result, the image sequence and the current ultrasonic image, wherein the positioning and tracking comprises the following steps:
judging whether all heart images in the current ultrasonic image are in the image area range of the current ultrasonic image; if the heart images are all in the image area range, directly refreshing the heart position; if the heart images are not all in the image area range, translating an ultrasonic probe according to the heart images and the ultrasonic image area range so that the heart images are all in the image area range;
translating an ultrasound probe according to the cardiac image and the ultrasound image region range, comprising: analyzing heart data which is not in the image area range according to the current ultrasonic image; calculating the relative distance to be moved by the ultrasonic probe according to the heart data and the image area range; translating the ultrasound probe in the direction of the heart according to the relative distance; the analyzing cardiac data not within the image region from the current ultrasound image includes: setting a heart positioning frame, selecting the largest circumscribed rectangle of a heart structure, identifying the heart position in a mode of outputting a coordinate result, and retaining the most suitable positioning frame through non-maximum suppression.
2. The method of claim 1, wherein processing the ultrasound data to obtain current ultrasound images and historical ultrasound images comprises:
performing complex signal modulo on the ultrasonic data at each moment to obtain a position signal;
carrying out logarithmic compression on the position signal to obtain a compressed signal;
carrying out data image preprocessing on the compressed signals to obtain image data;
and carrying out coordinate conversion on the image data to obtain an ultrasonic image at each moment.
3. The method of automatic positioning and tracking of an ultrasonic fetal heart of claim 2, wherein before performing data image preprocessing on the compressed signal, the method further comprises:
and performing space compounding processing on the compressed signals.
4. The method of claim 1, wherein the generating a sequence of images based on the historical ultrasound images comprises:
sequencing the historical ultrasonic images according to the time information of the historical ultrasonic images to obtain a sequence to be processed;
and intercepting ultrasonic images at the moment closest to the current moment from the sequence to be processed according to the preset required quantity and the preset time interval to obtain an image sequence.
5. The automatic positioning and tracking method of an ultrasonic fetal heart according to claim 1, wherein determining whether a heart image in the current ultrasonic image is a heart standard section, to obtain a determination result, comprises:
extracting a heart image from the current ultrasonic image;
and inputting the heart image into a preset classification model to obtain an output result, wherein the output result is a judgment result, and the output result comprises a heart standard section and a non-heart standard section.
6. The method of automatic positioning and tracking of an ultrasonic fetal heart according to claim 1, wherein analyzing a vertical movement direction of an ultrasonic probe from the image sequence comprises:
and inputting the image sequence into a preset motion prediction model to obtain the vertical movement direction of the ultrasonic probe.
7. An ultrasonic fetal heart automatic positioning and tracking device, comprising:
the acquisition module is used for acquiring ultrasonic data of the fetal heart at the current moment and the historical moment;
the processing module is used for processing the ultrasonic data to obtain a current ultrasonic image and a historical ultrasonic image;
a generation module for generating a sequence of images based on the historical ultrasound images;
the judging module is used for judging whether the heart image in the current ultrasonic image is a heart standard section, to obtain a judgment result;
the positioning and tracking module is used for positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image; when the judgment result is a non-heart standard section, positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image comprises:
analyzing the vertical movement direction of the ultrasonic probe according to the image sequence; moving the ultrasonic probe in the vertical movement direction; acquiring the current ultrasonic image after the ultrasonic probe has moved, and returning to the step of judging whether the heart image in the current ultrasonic image is a heart standard section, until the judgment result is a heart standard section;
analyzing the vertical movement direction of the ultrasonic probe according to the image sequence comprises: extracting heart long-axis data of each frame of image from the image sequence; comparing the heart long-axis data with preset long-axis data to obtain a plurality of difference data; averaging the difference data to obtain a data difference value; if the data difference value is negative, moving the ultrasonic probe vertically toward the heart; if the data difference value is positive, moving the ultrasonic probe vertically away from the heart;
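The vertical-direction rule of claim 7 reduces to the sign of the mean difference between measured and preset long-axis data. A short sketch — the function name, the return labels, and the behavior at exactly zero are illustrative, as the claim does not specify them:

```python
def vertical_move_direction(long_axis_per_frame, preset_long_axis):
    """Mean long-axis difference decides the vertical probe direction.

    Negative mean difference -> move the probe vertically toward the heart.
    Positive mean difference -> move the probe vertically away from the heart.
    """
    diffs = [measured - preset_long_axis for measured in long_axis_per_frame]
    mean_diff = sum(diffs) / len(diffs)
    if mean_diff < 0:
        return "toward_heart"
    if mean_diff > 0:
        return "away_from_heart"
    return "hold"
```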
when the judgment result is a heart standard section, positioning and tracking the fetal heart according to the judgment result, the image sequence and the current ultrasonic image comprises:
judging whether the heart image in the current ultrasonic image is entirely within the image area range of the current ultrasonic image; if the heart image is entirely within the image area range, directly refreshing the heart position; if the heart image is not entirely within the image area range, translating the ultrasonic probe according to the heart image and the ultrasonic image area range so that the heart image is entirely within the image area range;
translating the ultrasonic probe according to the heart image and the ultrasonic image area range comprises: analyzing, from the current ultrasonic image, the heart data not within the image area range; calculating, according to the heart data and the image area range, the relative distance the ultrasonic probe is to be moved; translating the ultrasonic probe toward the heart by the relative distance; analyzing, from the current ultrasonic image, the heart data not within the image area range comprises: setting a heart positioning frame, selecting the largest circumscribed rectangle of the heart structure, identifying the heart position by outputting a coordinate result, and retaining the most suitable positioning frame through non-maximum suppression.
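Retaining the "most suitable positioning frame" through non-maximum suppression, as recited in claim 7, corresponds to textbook greedy NMS over candidate boxes. A generic sketch — the `(x1, y1, x2, y2)` box format and the 0.5 overlap threshold are common conventions, not details from the patent:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: keep the best-scoring box, drop overlapping candidates."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thresh]
    return keep
```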
8. A computer device, comprising:
a memory and a processor, the memory and the processor being communicatively coupled to each other, the memory having stored therein computer instructions, the processor executing the computer instructions to perform the method of automatic positioning and tracking of an ultrasonic fetal heart of any of claims 1-6.
9. A computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of automatic positioning and tracking of an ultrasonic fetal heart according to any of claims 1-6.
CN202310943953.0A 2023-07-31 2023-07-31 Automatic positioning and tracking method, device, equipment and storage medium for ultrasonic fetal heart Active CN116687445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310943953.0A CN116687445B (en) 2023-07-31 2023-07-31 Automatic positioning and tracking method, device, equipment and storage medium for ultrasonic fetal heart


Publications (2)

Publication Number Publication Date
CN116687445A CN116687445A (en) 2023-09-05
CN116687445B true CN116687445B (en) 2024-01-30

Family

ID=87839461

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310943953.0A Active CN116687445B (en) 2023-07-31 2023-07-31 Automatic positioning and tracking method, device, equipment and storage medium for ultrasonic fetal heart

Country Status (1)

Country Link
CN (1) CN116687445B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003079627A (en) * 2001-09-14 2003-03-18 Aloka Co Ltd Cardiac wall movement evaluation apparatus
CN104203114A (en) * 2012-03-26 2014-12-10 中田雅彦 Ultrasound diagnostic apparatus
CN104797199A (en) * 2012-11-20 2015-07-22 皇家飞利浦有限公司 Automatic positioning of standard planes for real-time fetal heart evaluation
CN112472138A (en) * 2020-12-18 2021-03-12 深圳市德力凯医疗设备股份有限公司 Ultrasonic tracking method, system, storage medium and ultrasonic equipment
CN112773402A (en) * 2019-11-09 2021-05-11 无锡祥生医疗科技股份有限公司 Intelligent auxiliary guiding method, ultrasonic diagnosis device and storage medium



Similar Documents

Publication Publication Date Title
KR101906916B1 (en) Knowledge-based ultrasound image enhancement
US11950959B2 (en) Ultrasound system with automated dynamic setting of imaging parameters based on organ detection
US11238562B2 (en) Ultrasound system with deep learning network for image artifact identification and removal
US11593933B2 (en) Systems and methods for ultrasound image quality determination
JP7203823B2 (en) An ultrasound system that extracts image planes from volume data using touch interaction with the image
US11432806B2 (en) Information processing apparatus, information processing method, and storage medium
EP4041086A1 (en) Systems and methods for image optimization
JP7008713B2 (en) Ultrasound assessment of anatomical features
US11627936B2 (en) Systems and methods for ultrasound review and imaging
CN116687445B (en) Automatic positioning and tracking method, device, equipment and storage medium for ultrasonic fetal heart
CN112704517B (en) Method, system, equipment and storage medium for processing endometrium peristalsis ultrasonic image
US11890142B2 (en) System and methods for automatic lesion characterization
CN113768546A (en) Ultrasound elastic image generation and processing system and method
CN112955076A (en) Method and system for pulsed wave doppler signal tracking of anatomical structures over time based on multi-gate doppler signals
US11944501B2 (en) Systems and methods for automatic measurements of medical images
CN116616817B (en) Ultrasonic heart rate detection method and device, ultrasonic equipment and storage medium
CN116712101B (en) Ultrasound image generation method, device, computer equipment and storage medium
EP4108178A1 (en) Generation of m-mode data for detecting fetal cardiac activity
US20230157661A1 (en) Ultrasound image analysis apparatus, ultrasound diagnostic apparatus, and control method for ultrasound image analysis apparatus
US20230186477A1 (en) System and methods for segmenting images
CN116188483A (en) Method for processing myocardial reperfusion data and ultrasonic imaging system
CN116115256A (en) Method and system for dynamically adjusting imaging parameters during ultrasound scanning
JP5203614B2 (en) Ultrasonic diagnostic apparatus, ultrasonic tomographic image processing apparatus, and ultrasonic diagnostic program
CN112702955A (en) Method for determining inspection mode and ultrasonic equipment
CN115153642A (en) Ultrasonic imaging device, terminal device and ultrasonic inspection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant