CN116763346A - Ultrasonic image processing method, ultrasonic imaging device and readable storage medium - Google Patents
- Publication number: CN116763346A
- Application number: CN202210216699.XA
- Authority: CN (China)
- Prior art keywords: ultrasonic; ultrasonic probe; image; fetal head; analysis result
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0866—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/462—Displaying means of special interest characterised by constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30044—Fetus; Embryo
Abstract
The application relates to the technical field of ultrasound imaging, and discloses an ultrasound image processing method, an ultrasound imaging device, and a readable storage medium. The method comprises the following steps: acquiring an ultrasound image with an ultrasound probe; analyzing the ultrasound image to obtain an analysis result; providing adjustment guidance for the ultrasound probe according to the analysis result, so as to adjust the probe to the optimal acquisition position; acquiring a corresponding target ultrasound image in response to the probe being adjusted to the optimal acquisition position; performing imaging analysis on the target ultrasound image to obtain detection parameters; and displaying the detection parameters. In this way, medical staff can see the detection parameters directly, estimating the parameters by the traditional digital (finger) examination is avoided, the accuracy of the detection parameters is improved, and the infection risk to the mother is reduced.
Description
Technical Field
The present application relates to the field of ultrasound imaging technology, and in particular to an ultrasound image processing method, an ultrasound imaging apparatus, and a readable storage medium.
Background
During childbirth, labor progress must be monitored and clinically assessed before and after the mother enters the delivery room. Traditionally this is done by internal (digital) examination of cervical dilation and of the position and orientation of the presenting fetal head, a judgment that relies on the midwife's experience. The method is highly subjective, and frequent digital examinations increase the risk of infection and the mother's discomfort, reducing her compliance.
Disclosure of Invention
The present application mainly solves the technical problem of providing an ultrasound image processing method, an ultrasound imaging apparatus, and a readable storage medium that let medical staff see the detection parameters directly, avoid estimating these parameters by the traditional digital examination, improve the accuracy of the detection parameters, and reduce the infection risk to the mother.
In order to solve the above problems, the present application provides an ultrasound image processing method, which comprises: acquiring an ultrasound image with an ultrasound probe; analyzing the ultrasound image to obtain an analysis result; providing adjustment guidance for the ultrasound probe according to the analysis result, so as to adjust the probe to the optimal acquisition position; acquiring a corresponding target ultrasound image in response to the probe being adjusted to the optimal acquisition position; performing imaging analysis on the target ultrasound image to obtain detection parameters; and displaying the detection parameters.
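The steps above can be sketched as a simple acquisition loop. This is an illustrative sketch only; all function names and the `AnalysisResult` structure are placeholders, not anything defined by the application.

```python
from dataclasses import dataclass

@dataclass
class AnalysisResult:
    in_preset_region: bool   # feature lies in the preset region of the image
    meets_standard: bool     # feature matches the standard-view requirements

def acquisition_loop(acquire, analyze, guide, measure, display, max_tries=10):
    """Sketch of the claimed workflow: acquire -> analyze -> guide the
    operator -> re-acquire, until the probe reaches the optimal position,
    then measure and display the detection parameters."""
    for _ in range(max_tries):
        image = acquire()
        result = analyze(image)
        if result.in_preset_region and result.meets_standard:
            params = measure(image)  # imaging analysis -> detection parameters
            display(params)
            return params
        guide(result)                # adjustment guidance (voice or on-screen)
    return None
```

The callables would in practice wrap the probe driver, the segmentation model, and the display interface respectively.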
Wherein displaying the detection parameters includes: displaying the target ultrasound image on the display interface and marking the detection parameters on the target ultrasound image.
Wherein the detection parameters include at least one of a fetal head progress angle, a distance between the pubic symphysis and the fetal head, a fetal head progress distance, and a distance between the fetal head and the perineum. The method further comprises: forming a chart based on the fetal head progress angle and the distance between the fetal head and the perineum; and displaying the chart.
Wherein the method further comprises: based on the detected parameters, a delivery advice is displayed.
Wherein displaying the delivery advice based on the detection parameters includes: acquiring the fetal head progress angle from the detection parameters; and generating the delivery advice based on the fetal head progress angle, wherein the delivery advice includes vaginal delivery and caesarean delivery.
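As an illustration of such an advice rule: the application publishes no numeric cut-off, so the 120-degree value below is purely a placeholder drawn from commonly cited obstetric practice, not from this document.

```python
def delivery_advice(aop_deg, vaginal_cutoff_deg=120.0):
    """Illustrative decision rule only. A fetal head progress angle (AOP) at
    or above the assumed cut-off suggests vaginal delivery; below it,
    caesarean section. The cut-off value is an assumption, not the patent's."""
    if aop_deg >= vaginal_cutoff_deg:
        return "vaginal delivery"
    return "caesarean section"
```

A real system would presumably combine the AOP with the other detection parameters and clinical context rather than apply a single threshold.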
Wherein providing adjustment guidance for the ultrasonic probe according to the analysis result, so as to adjust the probe to the optimal acquisition position, includes: providing adjustment guidance for the ultrasonic probe if the analysis result does not meet the preset requirement.
Providing adjustment guidance for the ultrasonic probe if the analysis result does not meet the preset requirement includes: if the analysis result shows that the feature information is not in the preset region of the ultrasonic image, displaying a first guide mark on the display interface to guide the ultrasonic probe to move towards the feature information; and if the analysis result shows that the feature information does not meet the requirements of the standard feature information, displaying a second guide mark on the display interface to guide the ultrasonic probe to rotate and acquire an ultrasonic image from another angle.
In order to solve the above problems, another technical solution adopted by the present application is to provide an ultrasonic imaging apparatus, comprising: an ultrasonic probe; a transmitting circuit connected to the ultrasonic probe and used to transmit ultrasonic signals to target tissue through the probe; a receiving circuit connected to the ultrasonic probe and used to collect the ultrasonic echo signals reflected by the target tissue; a processor connected to the receiving circuit and used to generate an ultrasonic image from the ultrasonic echo signals, analyze the ultrasonic image to obtain an analysis result, provide adjustment guidance for the ultrasonic probe according to the analysis result so as to adjust the probe to the optimal acquisition position, acquire a corresponding target ultrasonic image in response to the probe being adjusted to the optimal acquisition position, and perform imaging analysis on the target ultrasonic image to obtain detection parameters; and a display used to display the detection parameters.
The display is also used for displaying the target ultrasonic image, wherein the target ultrasonic image is marked with the detection parameters.
The detection parameters comprise at least two of fetal head progress angle, distance between pubic symphysis and fetal head, fetal head progress distance and distance between fetal head and perineum; the processor is further configured to form a graph based on the angle of fetal head progress and the distance between the fetal head and the perineum; the display is also used to display a chart.
Wherein the processor is further configured to generate a delivery suggestion based on the detected parameter; the display is also used to display labor advice.
The processor is also used to acquire the fetal head progress angle from the detection parameters and to generate a delivery recommendation based on the fetal head progress angle, wherein the delivery recommendation includes vaginal delivery and caesarean section.
The processor is further used for adjusting and guiding the ultrasonic probe if the analysis result does not meet the preset requirement.
The processor is further used to generate a first guide mark when the analysis result indicates that the feature information is not in the preset region of the ultrasonic image, the display then showing the first guide mark to guide the ultrasonic probe to move towards the feature information; or the processor is further used to generate a second guide mark when the analysis result indicates that the feature information does not meet the requirements of the standard feature information, the display then showing the second guide mark to guide the ultrasonic probe to rotate and acquire an ultrasonic image from another angle.
In order to solve the above-mentioned problems, another technical solution adopted by the present application is to provide a computer-readable storage medium for storing a computer program which, when executed by a processor, implements the method provided in the above technical solution.
The beneficial effects of the present application are as follows. Unlike the prior art, the ultrasound image processing method provided by the present application comprises: acquiring an ultrasound image with an ultrasound probe; analyzing the ultrasound image to obtain an analysis result; providing adjustment guidance for the ultrasound probe according to the analysis result so as to adjust the probe to the optimal acquisition position; acquiring a corresponding target ultrasound image in response to the probe being adjusted to the optimal acquisition position; performing imaging analysis on the target ultrasound image to obtain detection parameters; and displaying the detection parameters. In this way, the operator can adjust the ultrasound probe according to the guidance so that it reaches the optimal acquisition position and acquires the corresponding image. This lowers the requirements on the operator's instrument-handling skill and on ultrasound and clinical expertise, removes the need for repeated blind manual adjustment of the probe, and improves acquisition efficiency and hence ultrasound diagnostic efficiency. Displaying the corresponding detection parameters lets medical staff see them directly during delivery, avoids estimating the parameters by the traditional digital examination, improves their accuracy, and reduces the mother's infection risk.
Drawings
FIG. 1 is a flow chart of an embodiment of a method for processing an ultrasound image according to the present application;
FIG. 2 is a schematic diagram of an embodiment of an ultrasound imaging apparatus provided by the present application;
FIG. 3 is a flow chart of another embodiment of a method for processing an ultrasound image according to the present application;
FIG. 4 is a flow chart of another embodiment of a method for processing an ultrasound image according to the present application;
FIG. 5 is a flow chart of another embodiment of a method for processing an ultrasound image according to the present application;
fig. 6 to 11 are schematic diagrams of application scenarios of the ultrasonic image processing method provided by the application;
FIG. 12 is a schematic diagram of a trend graph of the labor progress parameters provided by the present application;
FIG. 13 is a schematic illustration of a delivery probability indication map provided by the present application;
FIG. 14 is a schematic diagram of a display interface according to the present application;
FIG. 15 is a schematic view of an embodiment of an ultrasound imaging apparatus provided by the present application;
fig. 16 is a schematic structural diagram of an embodiment of a computer readable storage medium provided by the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present application are shown in the drawings. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first," "second," and the like in this disclosure are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
Referring to fig. 1, fig. 1 is a flowchart of an embodiment of a method for processing an ultrasound image according to the present application. The method comprises the following steps:
Step 11: an ultrasound image is acquired using an ultrasound probe.
In the present embodiment, an ultrasound image may be acquired using an ultrasound imaging apparatus. Referring to fig. 2, the ultrasonic imaging apparatus 100 includes an ultrasonic probe 101, a transmitting circuit 102, a receiving circuit 103, a transmission/reception selection switch 104, a processor 105, a display 106, and a memory 107, and the transmitting circuit 102 and the receiving circuit 103 can be connected to the ultrasonic probe 101 through the transmission/reception selection switch 104. In some embodiments, the transmitting circuit 102, the receiving circuit 103, and the transmitting/receiving selection switch 104 may be provided integrally with the ultrasound probe 101.
In the ultrasonic imaging process, the transmitting circuit 102 sends a delay-focused transmission pulse of a certain amplitude and polarity to the ultrasonic probe 101 through the transmission/reception selection switch 104, exciting the probe 101 to emit ultrasonic waves. After a certain delay, the receiving circuit 103 receives the ultrasonic echoes through the transmission/reception selection switch 104 to obtain ultrasonic echo signals, performs amplification, analog-to-digital conversion, beamforming and similar processing on them, and then sends the processed echo signals to the processor 105, which processes them to obtain the corresponding ultrasonic image.
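The receive path described above (per-channel delay followed by summation across channels) can be illustrated with a minimal delay-and-sum sketch. This is an idealized software model with integer sample delays, not the apparatus's actual circuitry.

```python
import numpy as np

def delay_and_sum(rf, delays_samples, weights=None):
    """Minimal receive-beamforming sketch. rf is (n_channels, n_samples)
    echo data; delays_samples gives the integer focusing delay per channel,
    in samples. Each channel is shifted by its delay, optionally weighted
    (apodized), and the channels are summed into one beamformed line."""
    n_ch, n_s = rf.shape
    if weights is None:
        weights = np.ones(n_ch)
    out = np.zeros(n_s)
    for ch, d in enumerate(delays_samples):
        # align channel ch by discarding its first d samples, then accumulate
        out[: n_s - d] += weights[ch] * rf[ch, d:]
    return out
```

With correct delays, the echoes from the focal point line up and add coherently, which is what produces the focused image line.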
The display 106 is connected to the processor 105; for example, the processor 105 may be connected to the display 106 through an external input/output port. The display 106 can detect user input, for example control instructions for the ultrasound transmit/receive timing, operation instructions for starting still image capture, dynamic clip capture, or dynamic image storage, or other instruction types. The input side may include one or more of a keyboard, mouse, scroll wheel, trackball, mobile input device (e.g., a mobile device with a touch screen, a mobile phone), multi-function knob, keys, and so on; the corresponding external input/output port may accordingly be a wireless communication module, a wired communication module, or a combination of both, and may also be implemented over USB, bus protocols such as CAN, and/or wired network protocols.
The display 106 also includes a display screen that can show the ultrasound images obtained by the processor 105. When displaying an ultrasound image, the screen can additionally provide a graphical human-computer interaction interface on which one or more controlled objects are arranged; the user controls them by entering operation instructions through the display 106, triggering the corresponding control operations. For example, icons displayed on the graphical interface can be operated with a human-computer interaction device to perform specific functions, such as capturing a still image or a dynamic clip while storing a dynamic image. In practical applications the display may be a touch-screen display, and the display in this embodiment may include one display screen or several.
In other embodiments of the present application, the processor 105 is further configured to receive an instruction to store the ultrasound images and, in response, store dynamic images, static images, or dynamic clips of the ultrasound images, so that a user (e.g., a physician) can later review them for diagnosis.
The ultrasonic imaging apparatus 100 may be of the amplitude-modulation (A-mode), point-scanning, or grayscale-modulation (B-mode) type.
In this embodiment, the ultrasonic imaging apparatus 100 is applied during delivery to assist the medical staff attending the mother. In step 11, ultrasound data may be acquired by placing a three-dimensional or two-dimensional ultrasound probe at the perineum or abdomen of the mother, so as to collect intrapartum ultrasound data for the corresponding region; imaging analysis may then be performed on the data to obtain an intrapartum ultrasound image. The intrapartum data acquired by the three-dimensional probe comprise at least one three-dimensional volume image, while the data acquired by the two-dimensional probe form a segment of intrapartum ultrasound video.
Step 12: and analyzing the ultrasonic image to obtain an analysis result.
In some embodiments, the ultrasound data may be subjected to imaging analysis to obtain a corresponding ultrasound image. The ultrasound image is then input into an image segmentation model to obtain feature information in the ultrasound image.
The image segmentation model is trained on sample images obtained by annotating standard ultrasound images. Specifically, the contour of the feature information in a standard ultrasound image and the category of the feature information can be annotated, and the image segmentation model can then be trained on these annotations.
In this embodiment, the image segmentation model may be built on a network architecture such as FCN (Fully Convolutional Network), SegNet, U-Net, DeepLab, or PSPNet.
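Training and evaluating such segmentation networks against the annotated contours is commonly driven by an overlap metric such as the Dice score. The application does not name its loss or metric, so the following is only an illustrative sketch of that standard measure.

```python
import numpy as np

def dice_score(pred_mask, gt_mask, eps=1e-6):
    """Dice overlap between a predicted mask and the annotated ground-truth
    mask. Both masks are binary arrays of the same shape; 1.0 means perfect
    overlap, 0.0 means none. eps guards against empty masks."""
    pred = np.asarray(pred_mask, bool)
    gt = np.asarray(gt_mask, bool)
    inter = np.logical_and(pred, gt).sum()
    return (2.0 * inter + eps) / (pred.sum() + gt.sum() + eps)
```

During training, 1 minus this quantity (the Dice loss) is often minimized alongside or instead of cross-entropy.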
The feature information is then analyzed to obtain an analysis result.
For example, whether the contour area of the feature information meets the preset contour area requirement can be analyzed. Thus, the analysis result may be a result of whether the contour area of the feature information meets a preset contour area requirement.
As another example, the types of feature information in the ultrasound image may be analyzed for compliance with the types in a standard ultrasound image. Thus, the analysis result may be a result of whether the type meets the type in the standard ultrasound image.
For another example, the sharpness of the feature information in the ultrasound image may be analyzed to determine if the sharpness is satisfactory. Thus, the analysis result may be a result of whether the sharpness is satisfactory.
The above analysis methods may be used individually or in any combination. When several are combined, a weight may be set for each analysis mode; each result is multiplied by its corresponding weight, the weighted results are summed to obtain a final score, and whether the requirement is met is judged from that final score.
Step 13: and adjusting and guiding the ultrasonic probe according to the analysis result so as to adjust the ultrasonic probe to the optimal acquisition position.
In some application scenarios, the adjustment guidance derived from the analysis result may be given by voice broadcast; for example, the voice prompt may be "please move 3 mm to the front right".
In some application scenarios, the adjustment guidance may instead be shown as an animation on the display interface: for example, the ultrasound probe is modeled and displayed virtually on the interface, and the required adjustment, such as a rotation or a movement, is marked on it.
In some embodiments, adjusting the ultrasound probe to the optimal acquisition position may be adjusting the position and/or orientation of the ultrasound probe.
Step 14: and responding to the adjustment of the ultrasonic probe to the optimal acquisition position, and acquiring a corresponding target ultrasonic image.
On receiving the adjustment guidance, the operator moves the ultrasonic probe as instructed so that it reaches the designated position and acquires ultrasound data in the designated orientation.
Steps 12 and 13 are then executed again to obtain an analysis result for the newly acquired ultrasound data; if the analysis result meets the preset requirement, the ultrasound image corresponding to these data is stored.
Step 15: and carrying out imaging analysis on the target ultrasonic image to obtain detection parameters.
The above-mentioned detection parameters may include at least one of the fetal head progress angle, the distance between the pubic symphysis and the fetal head, the fetal head progress distance, and the distance between the fetal head and the perineum.
In some embodiments, imaging analysis may be performed on the target ultrasound image to obtain the pubic symphysis contour and the fetal head contour, where the ultrasound images may be divided into longitudinal-section and transverse-section images.
From the longitudinal-section image and the determined pubic symphysis and fetal head contours, the angle of progression (AOP), the head-symphysis distance (HSD), and the progress distance (PD) can be calculated; from the transverse-section image and the determined contours, the head-perineum distance (HPD) can be calculated.
The AOP is the angle between the line through the long axis of the pubic symphysis and the tangent line drawn from the inferior border of the pubic symphysis to the fetal skull.
HSD is the distance between the lowest pubic symphysis and the fetal skull, and is an indicator of fetal head descent. Mainly used for the fetus in front of the pillow, and the HSD becomes shorter gradually along with the descent of the fetal head to the pelvis.
HPD is the shortest distance between the outer rim of the fetal skull and the pubic space.
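As a concrete illustration of the AOP definition above, the angle can be computed from three landmark points in the two-dimensional section image. The function below is a hedged sketch, not the patented implementation; the point names and the coordinate convention are assumptions.

```python
import math

def angle_of_progression(sym_top, sym_bottom, skull_tangent_pt):
    """Sketch: AOP as the angle at the inferior pubic endpoint between the
    symphysis long axis (sym_bottom -> sym_top) and the line from the
    inferior endpoint to the tangent point on the fetal skull."""
    ax, ay = sym_top[0] - sym_bottom[0], sym_top[1] - sym_bottom[1]
    bx, by = skull_tangent_pt[0] - sym_bottom[0], skull_tangent_pt[1] - sym_bottom[1]
    dot = ax * bx + ay * by    # cosine component
    cross = ax * by - ay * bx  # sine component
    return math.degrees(math.atan2(abs(cross), dot))
```

For example, with the symphysis axis running from (0, 0) to (0, 1) and a tangent point at (1, 0), the sketch yields 90 degrees.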
Step 16: displaying the detection parameters.
In some embodiments, step 16 may consist of displaying the target ultrasound image on the display interface and marking the detection parameters on it.
In the present embodiment, an ultrasonic image is acquired with an ultrasonic probe; the image is analyzed to obtain an analysis result; adjustment guidance is provided according to the analysis result so that the probe can be adjusted to the optimal acquisition position; a corresponding target ultrasonic image is acquired once the probe reaches that position; imaging analysis is performed on the target image to obtain detection parameters; and the detection parameters are displayed. This allows the operator to adjust the probe according to the guidance until it reaches the optimal acquisition position and acquires the corresponding image, reducing the demands on the operator's instrument-handling skill, ultrasound expertise and clinical expertise, and sparing the operator repeated blind manual adjustments, which improves acquisition efficiency and hence ultrasonic diagnosis efficiency. Displaying the detection parameters lets medical staff see them directly, so that during delivery they need not estimate these parameters with traditional digital (finger) examination; this improves the accuracy of the detection parameters and reduces the mother's risk of infection.
Based on the above detection parameters, the following procedure can be performed during labor to assist delivery. Referring to fig. 3, fig. 3 is a flow chart of another embodiment of the ultrasound image processing method of the present application. The method comprises the following steps:
Step 31: form a chart based on the fetal head progression angle and the distance between the fetal head and the perineum.
For example, a line chart or a bar chart can be made from the fetal head progression angles and head-perineum distances acquired at different times, with time on the abscissa and the specific values of the two parameters on the ordinate.
In some embodiments, the fetal head progression angle and the head-perineum distance can each be plotted as a separate trend chart and displayed on the display interface; alternatively, both can be plotted on the same trend chart, which is then displayed on the display interface.
The trend chart displays the fetal head progression angle and the head-perineum distance intuitively. Within a preset time, if the line representing the progression angle trends upward while the line representing the head-perineum distance trends downward, vaginal delivery can be considered likely, and a vaginal-delivery suggestion can be shown on the display interface. If the two lines do not diverge within the preset time, the probability of vaginal delivery is considered low, and a caesarean-section suggestion can be shown instead.
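The trend rule described above can be sketched as a small decision helper. The series representation, the strict monotonicity test and the return strings below are illustrative assumptions, not the patented logic.

```python
def delivery_trend_suggestion(aop_series, hpd_series):
    """Sketch of the trend rule: within the preset window, a rising fetal
    head progression angle together with a falling head-perineum distance
    suggests vaginal delivery; otherwise suggest caesarean section."""
    aop_rising = all(b > a for a, b in zip(aop_series, aop_series[1:]))
    hpd_falling = all(b < a for a, b in zip(hpd_series, hpd_series[1:]))
    if aop_rising and hpd_falling:
        return "suggest vaginal delivery"
    return "suggest caesarean section"
```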
Step 32: display the chart.
The chart may be displayed on the same screen as the ultrasound image.
In the above embodiment, displaying the fetal head progression angle and the head-perineum distance as a chart lets medical staff see the detection parameters intuitively, so that during delivery they need not estimate these parameters with traditional digital examination; this improves the accuracy of the detection parameters and reduces the mother's risk of infection.
In other embodiments, delivery advice based on the detection parameters may be displayed on the display interface during labor to assist delivery. Based on this, referring to fig. 4, fig. 4 is a flow chart of another embodiment of the ultrasound image processing method of the present application. The method comprises the following steps:
Step 41: acquire the fetal head progression angle from the detection parameters.
The fetal head progression angle may be determined from the pubic symphysis profile and the fetal head profile; in particular, it may be obtained directly when the detection parameters are marked on the ultrasound image.
Step 42: display delivery advice based on the fetal head progression angle, where the advice comprises vaginal delivery and caesarean delivery.
In some embodiments, if the fetal head progression angle is greater than 110°, the displayed delivery advice is vaginal delivery.
If the angle exceeds 110° within the preset time, vaginal delivery is highly probable, and a vaginal-delivery suggestion is displayed on the interface. The midwife can then make the corresponding preparations for a vaginal delivery.
In some embodiments, if the fetal head progression angle is less than 80°, the displayed delivery advice is caesarean section.
If the angle is below 80° within the preset time, the mother is most likely unable to deliver vaginally, and a caesarean-section suggestion is displayed on the interface. The midwife can then make the corresponding preparations for a caesarean section.
In some embodiments, if the fetal head progression angle is between 80° and 110°, the vaginal-delivery probability and an intervention opinion are displayed.
If the angle stays between 80° and 110° within the preset time, the vaginal-delivery probability can be calculated from historical delivery cases; specifically, it is computed in real time from the actual fetal head progression angle, and the probability and intervention opinion are displayed on the interface in real time so that the midwife can act on them. The intervention opinion may be, when the vaginal-delivery probability falls below a preset value, to recommend a caesarean section and ask the medical staff to prepare, or to recommend that the medical staff consult the mother and her family about whether to proceed with a caesarean section.
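The three AOP ranges above map directly to displayed advice. The sketch below uses the stated 80° and 110° thresholds; the linear stand-in for the statistics-derived vaginal-delivery probability is purely an assumption.

```python
def delivery_advice(aop_deg):
    """AOP thresholds from the text: > 110 deg -> vaginal delivery,
    < 80 deg -> caesarean section, otherwise display a probability and an
    intervention opinion (linear interpolation is used here only as a
    placeholder for the probability model fit to historical cases)."""
    if aop_deg > 110:
        return "vaginal delivery", 1.0
    if aop_deg < 80:
        return "caesarean section", 0.0
    probability = (aop_deg - 80) / 30.0
    return "show probability and intervention opinion", round(probability, 2)
```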
Referring to fig. 5, fig. 5 is a flowchart of another embodiment of an ultrasound image processing method according to the present application. The method comprises the following steps:
step 51: an ultrasound image is acquired using an ultrasound probe.
Step 52: analyze the ultrasonic image to obtain an analysis result.
In some embodiments, an image segmentation model may be used to extract features from the ultrasound image, and the resulting feature information is compared with the feature information of a standard ultrasound image to obtain the analysis result.
Specifically, the ultrasound image may be input into the image segmentation model corresponding to the current examination mode to obtain the feature information in the image.
In one application scenario, the current examination mode is the abdomen mode; the ultrasound image is then input into the image segmentation model corresponding to the abdomen mode to obtain feature information that includes at least one of the fetal eye contour, spine, cervical vertebrae and brain midline.
In another application scenario, the current examination mode is the perineum mode; the ultrasound image is then input into the image segmentation model corresponding to the perineum mode to obtain feature information that includes at least one of the pubic symphysis contour, fetal head contour and brain midline.
The feature information is compared with standard feature information, and the comparison result serves as the analysis result.
For example, it can be analyzed whether the contour area of the feature information meets a preset contour-area requirement, so that the analysis result records whether this requirement is met. For instance, the requirement may be that the ratio of the contour-area difference to the standard contour area is below five percent. Specifically, the contour of the feature information is compared with the standard contour and the area of their non-overlapping region is determined; this area represents the contour-area difference. The ratio of that difference to the standard contour area is then computed, and if it is below five percent, the contour feature information can be considered satisfactory.
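The five-percent area rule can be sketched with contour regions represented as pixel-coordinate sets; this representation and the symmetric-difference computation are illustrative assumptions.

```python
def contour_area_ok(region, standard_region, tolerance=0.05):
    """Area rule from the text: the non-overlapping area between the
    segmented contour region and the standard contour region, as a fraction
    of the standard contour area, must be below five percent. Regions are
    sets of (row, col) pixel coordinates."""
    non_overlap = len(region ^ standard_region)  # symmetric difference
    return non_overlap / len(standard_region) < tolerance
```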
In other embodiments, the contour feature information may be located within the region of the ultrasound image to obtain contour position information, and the difference between this and the standard contour position information is determined. If the difference is smaller than a preset value, the position of the contour feature information is considered satisfactory; if it is larger, the position is considered unsatisfactory.
As another example, it can be analyzed whether the types of feature information in the ultrasound image match the types in the standard ultrasound image, so that the analysis result records whether the types match. For example, if the preset types include the pubic symphysis and the fetal head, and only one of the two is present in the feature information, the feature information is unsatisfactory.
As a further example, the sharpness of the feature information in the ultrasound image can be analyzed to determine whether it is satisfactory, so that the analysis result records whether the sharpness requirement is met.
The above analyses can be carried out simultaneously or in any combination; the analysis result meets the preset requirement only when every applied requirement is satisfied.
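A minimal sketch of combining the checks: each analysis (area, position, type, sharpness) contributes a pass/fail flag, and the overall result meets the preset requirement only when every applied check passes. The dict-based interface is an assumption.

```python
def combine_analysis(checks):
    """Combine the individual analyses described above. `checks` maps a
    check name to its boolean outcome; returns the overall result and the
    names of the checks that failed."""
    failed = [name for name, ok in checks.items() if not ok]
    return len(failed) == 0, failed
```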
Step 53: if the analysis result does not meet the preset requirement, provide adjustment guidance for the ultrasonic probe so that it can be adjusted to the optimal acquisition position.
In some embodiments, if the analysis result indicates that the feature information is not in the preset region of the ultrasound image, a first guide identifier is displayed on the display interface to guide the ultrasonic probe to move toward the feature, thereby guiding the probe to the optimal acquisition position.
The first guide identifier can be a dynamic mark displayed on the interface, such as a dynamic arrow whose direction indicates the direction in which the probe should move. In some embodiments, the arrow may be drawn hollow, with its length representing the distance the probe should move; the hollow arrow fills up as the probe moves, and a fully filled arrow indicates that the probe has reached the target position. In other embodiments, the first guide identifier may be an arrow plus an indicator light whose color shows whether the probe has reached the position indicated by the arrow: red means it has not yet, green means it has.
Here the analysis result concerns the above feature information, namely that it does not appear in the preset region, which may be a region near the middle of the ultrasound image.
If the analysis result shows that the feature information is not in the preset region, the operator is prompted to move the ultrasonic probe toward the feature. For example, if the standard ultrasound image requires the feature to lie in the middle region of the image and the feature instead lies on the left side, the operator is prompted to move the probe in the direction corresponding to the left.
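The first guide identifier's direction can be derived from where the feature centroid lies relative to the preset central region. The sketch below is illustrative only; the 20% margin, the coordinate convention and the return strings are assumptions.

```python
def first_guide_direction(centroid, width, height, margin=0.2):
    """Sketch: if the feature centroid falls outside the central region of
    the image, prompt a probe movement toward the feature's side. The
    margin defining the central region is an assumed parameter."""
    x, y = centroid
    if x < width * margin:
        return "move toward the left"
    if x > width * (1 - margin):
        return "move toward the right"
    if y < height * margin:
        return "move toward the top"
    if y > height * (1 - margin):
        return "move toward the bottom"
    return "feature in preset region"
```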
In some embodiments, if the analysis result indicates that the feature information does not meet the requirements of the standard feature information, a second guide identifier is displayed on the display interface to guide rotation of the ultrasonic probe, so as to acquire an ultrasound image from another angle.
The second guide identifier may take the same form as the first guide identifier.
That is, when the feature information falls short of the standard feature information, the second guide identifier shown on the interface guides the probe to rotate and acquire ultrasonic data from another angle.
For example, if the area of the feature information does not match the standard contour area, and typically is smaller than it, the operator may be prompted to rotate the probe to acquire data from another angle. How to rotate can be decided from the profile of the feature information: if the left or right side of the profile deviates strongly from the standard profile, the operator is prompted to rotate the probe clockwise; if the upper or lower side deviates strongly, the operator is prompted to change the probe's angle of ultrasonic incidence.
Step 54: and responding to the adjustment of the ultrasonic probe to the optimal acquisition position, and acquiring a corresponding target ultrasonic image.
The position of the ultrasonic probe may be acquired in real time during step 54; when the probe is adjusted to the optimal acquisition position, a corresponding prompt, such as a voice prompt, is given. The corresponding target ultrasound image is then acquired with the probe at that position.
Step 55: and carrying out imaging analysis on the target ultrasonic image to obtain detection parameters.
Step 56: displaying the detection parameters.
In one application scenario, options for the corresponding acquisition modes may be displayed on the display interface. For intrapartum ultrasonic diagnosis, ultrasonic data are generally acquired at the abdomen and the perineum, so the abdomen mode or the perineum mode can be selected before data acquisition; that is, the body position of interest is chosen for scanning. The mode may be selected on the display interface, or via corresponding buttons on the ultrasonic probe: when the button for a mode is pressed, that mode is passed to the ultrasonic system connected to the probe, and the system works in the currently selected mode.
After the observed body position is selected, the operator can hold the ultrasonic probe, freely adjust its position and posture, and acquire ultrasound images; an acquisition is likewise performed for each adjusted position and posture. The ultrasonic probe may generally be a convex array probe or a 3D ultrasonic probe.
The ultrasonic data acquired in this process are analyzed and evaluated with a network model or an analysis module having the analysis function, the outcome is fed back as probe-adjustment feedback, and the adjustment guide is shown on the display interface of the display device.
The specific flow is as follows:
1. The ultrasonic data collected by the ultrasonic probe are first preprocessed: denoising, filtering, smoothing, edge enhancement, ESRI and similar operations are applied to improve the image quality of the imaged ultrasonic data.
2. The preprocessed image is passed to an AI (Artificial Intelligence) algorithm module for image processing. The AI algorithm module comprises two trained image segmentation models: an abdominal feature segmentation network model and a perineal feature segmentation network model. Which model is used is determined by the pre-selected mode: if the perineum mode was selected, the ultrasound images acquired in that mode are input to the perineal feature segmentation network model; if the abdomen mode was selected, they are input to the abdominal feature segmentation network model.
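The mode-to-model dispatch described above can be sketched as a simple lookup; the dictionary keys and placeholder model names below are assumptions, not the actual trained networks.

```python
SEGMENTATION_MODELS = {
    "abdomen": "abdominal_feature_segmentation_net",   # placeholder name
    "perineum": "perineal_feature_segmentation_net",   # placeholder name
}

def select_model(examination_mode):
    """Route the acquired ultrasound image to the segmentation model that
    matches the pre-selected examination mode, as described in the text."""
    try:
        return SEGMENTATION_MODELS[examination_mode]
    except KeyError:
        raise ValueError(f"unsupported examination mode: {examination_mode}")
```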
For ultrasonic images acquired in the abdomen mode, the trained abdominal feature segmentation network model segments the tissues in the image, extracting feature information such as the fetal eye contour, spine, cervical vertebrae and midline brain echo. For ultrasonic images acquired in the perineum mode, the trained perineal feature segmentation network model segments tissues such as the pubic symphysis, fetal head and midline brain echo. If a convex array probe is used, the two-dimensional section ultrasound image is segmented directly; if the data are volume data acquired by a 3D probe, the ultrasound images of all two-dimensional sections contained in the volume data are segmented.
In some embodiments, the dataset used to train the image segmentation models in the AI algorithm module consists of clinically acquired two-dimensional standard-section ultrasound images of term pregnant women. The ultrasound images are grouped by the maternal body position scanned by the probe, generally into two main classes: section ultrasound images acquired through the abdomen, and section ultrasound images acquired through the perineum. For the abdominal position, a large number of standard-section images containing features such as the fetal eye contour, spine, cervical vertebrae and midline brain echo are collected to train the abdominal feature segmentation network model. For the perineal position, standard-section images containing features such as the pubic symphysis, fetal head contour and midline brain echo are collected to train the perineal feature segmentation network model. These training images are acquired by experienced physicians; the acquired images are then screened, and each feature is annotated on the selected images that meet the requirements. The annotated images can further be transformed by mirroring, rotation, cropping, scaling and the like to obtain more training samples for data augmentation.
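The data-augmentation step (mirroring, rotation, cropping, scaling) can be sketched minimally. The list-of-lists image representation is an assumption, and only mirroring and one rotation are shown.

```python
def augment(image):
    """Data-augmentation sketch for the annotated training images:
    a horizontal mirror and a 90-degree clockwise rotation (cropping and
    scaling, also mentioned in the text, are omitted). `image` is a 2-D
    list of pixel values."""
    mirrored = [row[::-1] for row in image]             # horizontal mirror
    rotated = [list(col) for col in zip(*image[::-1])]  # 90 deg clockwise
    return [mirrored, rotated]
```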
In other embodiments, the datasets are not limited to the two classes of this embodiment and may be divided according to further positions and features of interest, such as abdominal ultrasound images of the pregnant woman lying on her side, or perineal ultrasound images in the lateral position. The network model used for training may be FCN, SegNet, U-Net, DeepLab or a similar segmentation network.
The feature information segmented by the network model is then analyzed and evaluated to determine whether the ultrasound image meets the requirements of a standard-section ultrasound image; when it does not, the operator is guided to adjust the ultrasonic probe, so that the optimal scanning state of the probe is located quickly and accurately and the optimal scanning-section data are acquired.
For example, for the abdomen mode, after the incoming ultrasound image is processed by the abdominal feature segmentation network model, the segmentation results for the various features in the image are obtained. For these results, preset requirements of the standard-section ultrasound image are defined in terms of feature type, completeness of the segmented feature region, feature position, feature direction and so on. The system compares the segmentation results with these preset requirements and, combined with biological experience, derives the guidance prompts for probe adjustment.
Specifically, adjustment according to the attributes of the segmented features proceeds as follows:
For example, as shown in fig. 6, the ultrasound image is formed from ultrasonic data acquired by scanning a cross section of the lower abdomen. The image is segmented with the abdominal feature segmentation network model, and feature regions such as the eye contour and nose contour are extracted, as shown in fig. 6. The type and completeness of the segmented features are then analyzed automatically; if the eye contour in the upper right corner of fig. 6 does not meet the contour-completeness requirement, the system, drawing on biological experience, prompts the operator to adjust the probe toward the upper right. Similarly, as shown in fig. 7, after the fetal spine features are segmented, the probe is prompted to adjust to the right according to the spine feature direction and biological big-data experience; the corresponding guide identifier can be displayed on the display interface accordingly.
Similarly, for the perineum mode, after the feature information is segmented by the perineal feature segmentation network model, the segmentation results for the various features in the ultrasound image are obtained. For these results, preset requirements of the standard-section ultrasound image are defined in terms of feature type, completeness of the segmented feature region, feature position, feature direction and so on. The system compares the segmentation results with these preset requirements and, combined with biological experience, derives the guidance prompts for probe adjustment.
The preset requirements of the standard-section ultrasound image, namely a complete, clear and comprehensive two-dimensional image of the target tissues or features, may be as follows. First, the segmented target features cover as many types as possible; specifically, the pubic symphysis, fetal head contour and fetal brain midline should all appear in the acquired image as far as possible. Second, the classification probability of the target features is highest, i.e. the pubic symphysis, fetal head contour, fetal brain midline and similar features in the image most closely resemble the standard features. Third, the feature regions such as the pubic symphysis, fetal head contour and fetal brain midline are as complete as possible, i.e. the contour area of each target feature is largest.
For example, referring to fig. 8, for an ultrasound image acquired through a perineal longitudinal section, the image is segmented with the perineal feature segmentation network model, ensuring that the pubic symphysis and the fetal head contour appear together in the segmentation result, that both shapes are closest to the standard feature shapes, and that both contours are as complete as possible, i.e. of maximum area. Likewise, as shown in fig. 9, the image is segmented with the perineal feature segmentation network model, ensuring that the fetal head contour and the midline brain echo appear together, that their shapes are closest to the standard feature shapes, and that both contours are as complete as possible, i.e. of maximum area.
Specifically, guiding probe adjustment according to the segmentation result and the preset requirements of the standard-section ultrasound image proceeds as follows:
As shown in fig. 10, for an ultrasound image acquired through a perineal cross section, the analysis system segments the incoming image with the perineal feature segmentation network model, compares the result with the preset requirements of the standard-section ultrasound image, and derives the adjustment for non-conforming images from biological experience. If fig. 10 does not meet the feature-completeness requirement, the system prompts the operator to adjust the probe upward along the longitudinal section at this position.
Based on the analysis of the segmentation results of the acquired images, the system display interface prompts probe-adjustment information, as shown in fig. 11. When the segmentation result meets the requirements, the OK indicator light on the display device turns green, indicating that the optimal scanning state has been found and that the images output in this state are optimal; the green OK indicator light can be regarded as equivalent to the fourth guide identifier in the above embodiment. If the segmentation result does not meet the requirements, the OK indicator light blinks red, indicating that the optimal scanning state and section have not yet been found, and the direction of probe adjustment is prompted. The adjustment guide includes the moving direction of the probe: as shown in fig. 11, four moving-direction indicator lights, such as "Up", "Down", "Left" and "Right", are arranged on the display interface, and the light for the required direction is lit. Two rotation-direction indicator lights, such as "A - clockwise rotation" and "B - counterclockwise rotation", are also provided, and the light for the required rotation direction is lit.
Specifically, the ultrasonic probe is adjusted according to the adjustment instructions until the system indicates that it has reached the required position, i.e. the optimal acquisition position, at which the optimal-section ultrasound image is acquired.
In other words, the operator continuously adjusts the probe according to the prompts of the guidance display interface shown in fig. 11 until the OK indicator light turns green; at that point the probe's data-acquisition state is optimal, and the ultrasound image it acquires is likewise optimal.
When the optimal scanning position and posture have been found, the image segmentation result of the ultrasound image acquired in this scanning state is passed into the system, and the system calculates the detection parameters. The calculation is based mainly on the definitions of the labor progress parameters: a two-dimensional coordinate model is built from the extracted features, and the values of the parameters in the ultrasound image are then computed. For example, the following labor progress parameters can be calculated from the segmented pubic symphysis features and fetal head contour:
A. The fetal head angle of progression (AOP) can be expressed as the angle between the long axis of the pubic symphysis and the tangent drawn from its inferior end to the fetal head.
B. The fetal head direction (HD) can be expressed as the angle between the longest identifiable axis of the fetal head in the image and the long axis of the pubic symphysis.
C. The fetal head progression distance (PD) can be expressed as the minimum distance between the infrapubic line, which passes through the inferior pubic endpoint perpendicular to the long axis of the pubic symphysis, and the parallel tangent to the fetal head.
D. The head-symphysis distance (HSD) can be expressed as the distance from the inferior pubic endpoint to the intersection of the infrapubic line with the fetal head.
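Once the features are placed in the two-dimensional coordinate model, definitions C and D reduce to plane geometry. The sketch below is a hedged illustration under simplifying assumptions (head contour as a point list; HSD approximated by the nearest head point's offset along the infrapubic line), not the patented computation.

```python
import math

def pd_and_hsd(sym_top, sym_bottom, head_points):
    """Sketch: with the symphysis long axis sym_bottom -> sym_top and the
    fetal head contour as 2-D points, PD is the largest signed advance of
    a head point past the infrapubic line (through sym_bottom and
    perpendicular to the axis), and HSD is approximated as the smallest
    offset of a head point from sym_bottom along that line."""
    ax, ay = sym_top[0] - sym_bottom[0], sym_top[1] - sym_bottom[1]
    n = math.hypot(ax, ay)
    ux, uy = ax / n, ay / n  # unit vector along the symphysis axis
    advance = [-((hx - sym_bottom[0]) * ux + (hy - sym_bottom[1]) * uy)
               for hx, hy in head_points]
    lateral = [abs((hx - sym_bottom[0]) * -uy + (hy - sym_bottom[1]) * ux)
               for hx, hy in head_points]
    return max(advance), min(lateral)  # (PD, approximate HSD)
```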
The parameters that can be calculated depend on the scanning site of the ultrasonic probe: parameters such as AOP, HD, PD and HSD can be calculated from transperineal ultrasound images. Beyond these, other labor progress parameters such as the head-perineum distance (HPD), the midline angle (MLA) and the fetal head station can be calculated; the calculation process is the same as in this embodiment and is not detailed here.
In order to provide more diagnostic and decision information, the following models can be built based on the results of the parameter calculation:
a) Process model.
The process model is established mainly to provide an auxiliary basis for diagnostic analysis. The approach is to derive diagnosis rules from statistics over a large number of samples; the parameter values calculated by the system are passed into the process model for comparative analysis, and an auxiliary diagnosis result is output. A specific embodiment of the process model may be as follows:
1. A large number of samples are analyzed and studied, and the delivery outcome and parameter values of each sample are recorded.
2. The statistical relation between the parameter values and the probability of vaginal delivery is analyzed to obtain an empirical rule, which is used as the preset parameter diagnosis threshold for the system's analysis and diagnosis. Taking the parameter AOP as an example, the empirical rule obtained from statistics over a large number of samples is: for AOP > 110°, vaginal delivery is suggested and information such as the vaginal delivery probability is prompted; for AOP < 80°, caesarean section is suggested and the vaginal delivery probability, other intervention information and the like are prompted; for parameter values in the range (80°, 110°), the vaginal delivery probability and intervention advice can be prompted. The same applies to parameters other than AOP.
3. During computer-aided diagnosis, the system automatically calculates the values of the detection parameters and gives actual delivery advice based on the parameter diagnosis thresholds and parameter diagnosis ranges.
4. Using data sharing and cloud computing, clinical sample data are collected and studied over a wider range by a cloud computer to obtain more accurate parameter diagnosis thresholds. Specifically, after each diagnosis, each independent terminal uploads the case information, delivery outcome, parameter information and the like to the cloud computer. On the basis of the existing research conclusions, the cloud computer performs further statistics and analysis with the newly received diagnosis samples to obtain more accurate parameter diagnosis thresholds and parameter diagnosis ranges. The latest thresholds are then shared with each terminal to update its parameter diagnosis thresholds.
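The empirical AOP rule in the processing model above amounts to a simple threshold lookup. A minimal sketch using the thresholds quoted in the text; the returned strings are illustrative assumptions, not the device's actual prompts:

```python
def delivery_advice(aop_degrees):
    """Map an AOP value (degrees) to a delivery recommendation using the
    empirical thresholds quoted in the text: > 110 degrees suggests vaginal
    delivery, < 80 degrees suggests caesarean section, otherwise report
    probabilities and intervention advice."""
    if aop_degrees > 110:
        return "vaginal delivery suggested"
    if aop_degrees < 80:
        return "caesarean section suggested"
    return "indeterminate: prompt vaginal delivery probability and intervention advice"
```

In the described system the thresholds themselves would be updated from the cloud rather than hard-coded.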
B) Trend graph model of the labor progress parameters.
Specifically, the values of the detection parameters calculated at different points in time may be connected to form a labor progress curve, from which the trend of labor can be analyzed and predicted. As shown in fig. 12, a trend graph of the labor progress parameters is constructed based on the AOP and HPD values calculated at different times.
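Connecting the measurements into a curve can be sketched as sorting timestamped values and pairing successive points into polyline segments; the (timestamp, value) tuple layout is an assumption for illustration, standing in for what the display of fig. 12 would draw:

```python
def labor_progress_curve(samples):
    """Sort (timestamp, value) measurements chronologically and return the
    successive polyline segments that a display module would connect into
    a labor progress curve."""
    pts = sorted(samples)
    # each segment joins one measurement to the next in time order
    return list(zip(pts, pts[1:]))
```

One such curve per parameter (AOP, HPD, ...) yields the multi-parameter trend graph.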
C) Delivery probability indication graph model.
The quantized values of indexes such as the vaginal delivery probability or the caesarean section probability are mapped to corresponding color intervals, and the relevant decision probabilities are then displayed through the corresponding colors, as shown in fig. 13.
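A minimal sketch of the probability-to-color mapping, assuming a three-band scheme with arbitrary hex colors; the actual color intervals used in fig. 13 are not specified in the text:

```python
def probability_color(p):
    """Map a delivery probability in [0, 1] to a color band for display.
    The three equal bands and hex values are illustrative assumptions."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must lie in [0, 1]")
    if p < 1 / 3:
        return "#d62728"   # low probability band: red
    if p < 2 / 3:
        return "#ff7f0e"   # medium probability band: orange
    return "#2ca02c"       # high probability band: green
```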
Besides the above models, two-dimensional dynamic models, three-dimensional dynamic models and the like may also be established to display the progress of labor from various aspects and help doctors conduct analysis and diagnosis intuitively.
The display device of the display interface may be a display attached to the ultrasound device, or any of various data-sharing display devices, such as a multi-department joint diagnosis display, a telemedicine display and the like. The display module is responsible for receiving, managing and displaying the image data transmitted by the intelligent image acquisition system and the intelligent image processing system, the segmentation feature and parameter schematic diagrams, the various model diagrams, and the diagnosis results output by the diagnosis model. As shown in fig. 14, fig. 14 shows information such as parameter diagrams, feature profiles, parameter values, vaginal delivery probability, caesarean section probability and intervention advice.
In addition to the information displayed above, an electronic partogram, a two-dimensional dynamic model diagram, a three-dimensional dynamic model diagram and the like may be added to the display interface.
In this way, the workflow can be greatly simplified, the requirements on the ultrasound expertise and clinical experience of medical staff can be reduced, and the efficiency, objectivity and accuracy of diagnosis can be improved.
Referring to fig. 15, fig. 15 is a schematic view of an ultrasonic imaging apparatus according to an embodiment of the present application. The ultrasound imaging apparatus 100 includes: an ultrasound probe 101, transmit circuitry 102, receive circuitry 103, a processor 105, and a display 106.
Wherein the transmitting circuit 102 is connected to the ultrasound probe 101 for transmitting an ultrasound signal to the target tissue through the ultrasound probe 101.
The receiving circuit 103 is connected to the ultrasound probe 101 for acquiring the ultrasound echo signals formed by reflection of the ultrasound waves from the target tissue.
The processor 105 is connected to the receiving circuit 103 and is used for generating an ultrasonic image according to the ultrasonic echo signals; analyzing the ultrasonic image to obtain an analysis result; adjusting and guiding the ultrasonic probe according to the analysis result to adjust the ultrasonic probe to the optimal acquisition position; responding to the adjustment of the ultrasonic probe to the optimal acquisition position, and acquiring a corresponding target ultrasonic image; and performing imaging analysis on the target ultrasonic image to obtain detection parameters.
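The processor's acquire-analyze-guide-measure workflow can be sketched as a control loop. Here `probe`, `analyze`, `guide` and `measure` are hypothetical callables standing in for the device's transmit/receive, analysis, guidance and parameter-calculation modules:

```python
def run_acquisition_loop(probe, analyze, guide, measure, max_iters=10):
    """Control-flow sketch of the claimed workflow: acquire an image,
    analyze it, guide probe adjustment until the analysis result meets the
    preset requirement, then compute detection parameters on the target
    ultrasound image. All four callables are illustrative assumptions."""
    for _ in range(max_iters):
        image = probe()                      # acquire an ultrasound image
        result = analyze(image)              # obtain the analysis result
        if result["meets_requirement"]:
            return measure(image)            # target image: compute parameters
        guide(result)                        # e.g. show a guide identifier
    raise RuntimeError("optimal acquisition position not reached")
```

The guidance step corresponds to displaying the first or second guide identifier described below.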
The display 106 is connected to the processor 105 for displaying the detected parameters.
In other embodiments, the display is further configured to display a target ultrasound image, wherein the target ultrasound image is marked with the detection parameter.
The detection parameters comprise at least two of the fetal head progress angle, the distance of the pubic symphysis from the fetal head, the fetal head progress distance and the distance between the fetal head and the perineum.
In other embodiments, the processor 105 is further configured to form a graph based on the fetal head progress angle and the distance between the fetal head and the perineum.
The display 106 is also used to display the graph.
In other embodiments, the processor 105 is further configured to generate delivery advice based on the detection parameters.
The display 106 is also used to display the delivery advice.
In other embodiments, the processor 105 is further configured to obtain the fetal head progress angle in the detection parameters; and generate a delivery recommendation based on the fetal head progress angle, wherein the delivery recommendation includes vaginal delivery and caesarean section.
In other embodiments, the processor 105 is further configured to adjust the ultrasound probe if the analysis result does not meet the preset requirement.
In other embodiments, the processor 105 is further configured to generate the first guide identifier when the analysis result indicates that the feature information is not in the preset region of the ultrasound image.
The display 106 is further configured to display a first guide identifier to guide the ultrasound probe to move in the direction of the characteristic information.
In other embodiments, the processor 105 is further configured to generate a second guide identifier when the analysis result indicates that the characteristic information does not meet the requirement of the standard characteristic information.
The display 106 is also configured to display the second guide identifier to guide rotation of the ultrasound probe to acquire an ultrasound image at another angle.
The ultrasound imaging apparatus 100 further comprises a memory (not shown) for storing a computer program.
It will be appreciated that the processor 105 is further configured to execute the computer program to implement the method of any of the above embodiments; for details, reference may be made to the above embodiments, which are not repeated here.
Referring to fig. 16, fig. 16 is a schematic diagram illustrating an embodiment of a computer-readable storage medium according to the present application. The computer-readable storage medium 160 is used for storing a computer program 71 which, when executed by a processor, implements the following method:
collecting an ultrasonic image by using an ultrasonic probe; analyzing the ultrasonic image to obtain an analysis result; adjusting and guiding the ultrasonic probe according to the analysis result to adjust the ultrasonic probe to the optimal acquisition position; responding to the adjustment of the ultrasonic probe to the optimal acquisition position, and acquiring a corresponding target ultrasonic image; imaging analysis is carried out on the target ultrasonic image to obtain detection parameters; displaying the detection parameters.
It will be appreciated that the computer program 71, when executed by a processor, is further used to implement the method of any of the above embodiments; for details, reference may be made to the above embodiments, which are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If implemented in the form of software functional units and sold or used as standalone products, the integrated units described above may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The foregoing description is only of embodiments of the present application, and is not intended to limit the scope of the application, and all equivalent structures or equivalent processes using the descriptions and the drawings of the present application or directly or indirectly applied to other related technical fields are included in the scope of the present application.
Claims (15)
1. A method of processing an ultrasound image, the method comprising:
collecting an ultrasonic image by using an ultrasonic probe;
analyzing the ultrasonic image to obtain an analysis result;
adjusting and guiding the ultrasonic probe according to the analysis result so as to adjust the ultrasonic probe to an optimal acquisition position;
in response to the ultrasonic probe being adjusted to the optimal acquisition position, acquiring a corresponding target ultrasonic image;
imaging analysis is carried out on the target ultrasonic image to obtain detection parameters;
and displaying the detection parameters.
2. The method of claim 1, wherein displaying the detection parameters comprises:
and displaying the target ultrasonic image on a display interface, and marking the detection parameters on the target ultrasonic image.
3. The method of claim 2, wherein the detection parameters include at least one of the fetal head progress angle, the distance of the pubic symphysis from the fetal head, the fetal head progress distance and the distance between the fetal head and the perineum; the method further comprises:
forming a graph based on the fetal head progress angle and the distance between the fetal head and the perineum;
and displaying the graph.
4. The method according to claim 1, wherein the method further comprises:
based on the detection parameters, a delivery suggestion is displayed.
5. The method of claim 4, wherein displaying the delivery suggestion based on the detection parameters comprises:
acquiring a fetal head progress angle in the detection parameters;
and generating the delivery suggestion based on the fetal head progress angle, wherein the delivery suggestion includes vaginal delivery and caesarean section.
6. The method of claim 1, wherein adjusting and guiding the ultrasonic probe according to the analysis result so as to adjust the ultrasonic probe to an optimal acquisition position comprises:
and if the analysis result does not meet the preset requirement, adjusting and guiding the ultrasonic probe.
7. The method of claim 6, wherein adjusting and guiding the ultrasonic probe if the analysis result does not meet the preset requirement comprises:
if the analysis result shows that the characteristic information is not in the preset area of the ultrasonic image, displaying a first guide mark on a display interface so as to guide the ultrasonic probe to move towards the direction of the characteristic information;
and if the analysis result shows that the characteristic information does not meet the requirement of the standard characteristic information, displaying a second guide mark on the display interface so as to guide the ultrasonic probe to rotate and acquire an ultrasonic image at another angle.
8. An ultrasound imaging apparatus, comprising:
an ultrasonic probe;
the transmitting circuit is connected with the ultrasonic probe and is used for transmitting ultrasonic signals to target tissues through the ultrasonic probe;
the receiving circuit is connected with the ultrasonic probe and is used for collecting ultrasonic echo signals reflected by the ultrasonic waves through the target tissue;
the processor is connected with the receiving circuit and used for generating an ultrasonic image according to the ultrasonic echo signal; analyzing the ultrasonic image to obtain an analysis result; adjusting and guiding the ultrasonic probe according to the analysis result so as to adjust the ultrasonic probe to an optimal acquisition position; and responsive to the ultrasound probe being adjusted to the optimal acquisition position, acquiring a corresponding target ultrasound image; imaging analysis is carried out on the target ultrasonic image to obtain detection parameters;
and the display is used for displaying the detection parameters.
9. The ultrasound imaging apparatus of claim 8, wherein the display is further configured to display the target ultrasound image with the detection parameters marked thereon.
10. The ultrasound imaging apparatus of claim 9, wherein the detection parameters include at least two of the fetal head progress angle, the distance of the pubic symphysis from the fetal head, the fetal head progress distance and the distance between the fetal head and the perineum;
the processor is further configured to form a graph based on the angle of fetal head progress and the distance between the fetal head and the perineum;
the display is also used for displaying the chart.
11. The ultrasound imaging apparatus of claim 8, wherein the processor is further configured to generate a delivery recommendation based on the detection parameter;
the display is also for displaying the delivery advice.
12. The ultrasound imaging apparatus of claim 11, wherein the processor is further configured to obtain the fetal head progress angle in the detection parameters; and generate the delivery recommendation based on the fetal head progress angle, wherein the delivery recommendation includes vaginal delivery and caesarean section.
13. The ultrasound imaging apparatus of claim 8, wherein the processor is further configured to adjust the guidance of the ultrasound probe if the analysis result does not meet a preset requirement.
14. The ultrasound imaging apparatus of claim 13, wherein the processor is further configured to generate a first guide identifier when the analysis result indicates that the characteristic information is not in a preset region of the ultrasound image;
the display is further used for displaying the first guide identifier so as to guide the ultrasonic probe to move towards the direction of the characteristic information;
or the processor is further configured to generate a second guide identifier when the analysis result indicates that the characteristic information does not meet the requirement of standard characteristic information;
the display is also used for displaying the second guide identifier so as to guide the ultrasonic probe to rotate to acquire an ultrasonic image at another angle.
15. A computer readable storage medium for storing a computer program for implementing the method according to any one of claims 1-7 when executed by a processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210216699.XA CN116763346A (en) | 2022-03-07 | 2022-03-07 | Ultrasonic image processing method, ultrasonic imaging device and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116763346A true CN116763346A (en) | 2023-09-19 |
Family
ID=87988277
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||