CN111860636A - Measurement information prompting method and ultrasonic training method - Google Patents

Measurement information prompting method and ultrasonic training method

Info

Publication number
CN111860636A
Authority
CN
China
Prior art keywords
measurement information
information
sample
ultrasonic
network model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010685946.1A
Other languages
Chinese (zh)
Inventor
莫若理
甘从贵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Chison Medical Technologies Co Ltd
Original Assignee
Wuxi Chison Medical Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Chison Medical Technologies Co Ltd filed Critical Wuxi Chison Medical Technologies Co Ltd
Priority to CN202010685946.1A priority Critical patent/CN111860636A/en
Publication of CN111860636A publication Critical patent/CN111860636A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30044Fetus; Embryo
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30048Heart; Cardiac
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30061Lung
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30068Mammography; Breast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30101Blood vessel; Artery; Vein; Vascular

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Biology (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention relates to the technical field of image processing, and in particular to a measurement information prompting method and an ultrasonic training method. The measurement information prompting method comprises the following steps: acquiring an ultrasonic image, wherein the ultrasonic image is an ultrasonic still image and/or an ultrasonic video; inputting the ultrasonic image into a personalized network model, the output of which is measurement information determined from the ultrasonic image; and displaying prompt information when the comparison of the measurement information with a preset threshold satisfies a prompt condition. The personalized network model is trained on sample information, where the sample information comprises n ultrasonic sample images historically acquired by a medical worker and the measurement information corresponding to each ultrasonic sample image, n being an integer greater than 1. This solves the problems of low efficiency and possible errors when measurement information is obtained by manual measurement in existing schemes, and achieves the effect of improving measurement efficiency and accuracy.

Description

Measurement information prompting method and ultrasonic training method
Technical Field
The invention relates to the technical field of image processing, in particular to a measurement information prompting method and an ultrasonic training method.
Background
Ultrasound equipment is one of the commonly used auxiliary devices for medical diagnosis because it is fast, painless, and harmless to the subject.
In existing schemes, after a medical worker scans an object to be examined, the medical worker manually measures the size of the object in the acquired ultrasound image. However, because the measurement is affected by the medical worker's mood on the day and how busy the work is, the measurement result can deviate to a certain extent, and the measurement accuracy is low.
Disclosure of Invention
In view of this, embodiments of the present invention provide a measurement information prompting method and an ultrasound training method to solve the problem that existing solutions may have low measurement accuracy.
According to a first aspect, an embodiment of the present invention provides a method for prompting measurement information, including:
acquiring an ultrasonic image, wherein the ultrasonic image is an ultrasonic still image and/or an ultrasonic video;
inputting the ultrasonic image into a personalized network model, wherein the output of the personalized network model is measurement information determined according to the ultrasonic image;
when the measurement information is compared with a preset threshold value and meets a prompt condition, displaying prompt information;
the personalized network model is trained according to sample information, the sample information comprises n ultrasonic sample images historically acquired by a medical worker and measurement information corresponding to each ultrasonic sample image, and n is an integer greater than 1.
Optionally, the measurement information includes at least two types, each type of measurement information is provided with a corresponding preset threshold, and displaying the prompt information when the measurement information compared with the preset threshold meets the prompt condition includes:
displaying the prompt information corresponding to each type of measurement information in a differentiated manner according to the magnitude relation between each type of measurement information and its corresponding preset threshold.
Optionally, displaying the prompt information corresponding to each type of measurement information in a differentiated manner includes:
displaying the prompt information corresponding to each type of measurement information in a differentiated manner using at least one of underlining, bold, italics, font color, and background color.
Optionally, the ultrasound image is information including a target object, and the target object includes at least one of a blood vessel, a fetus, a liver, a kidney, a heart, a thyroid, a carotid artery, and a breast.
Optionally, when the target object is a blood vessel, the measurement information includes at least one of an angle of bending of the blood vessel, an inner diameter of the blood vessel, a blood flow velocity, and a size of a plaque of the blood vessel;
when the target object is a fetus, the measurement information comprises the biparietal diameter, humerus length, femur length, abdominal circumference, and head circumference;
when the target object is a liver, the measurement information includes the size of liver cirrhosis;
when the target object is a kidney, the measurement information comprises at least one of a long diameter, a wide diameter, a thick diameter, a resistance index and a pulsation index of the kidney;
when the target object is a heart, the measurement information includes at least one of an inner diameter of an atrium, a thickness of an atrial wall, and a space between left and right atria;
when the target object is a thyroid gland, the measurement information includes a size and/or a shape of a thyroid nodule;
when the target object is a carotid artery, the measurement information comprises the size and/or shape of plaque;
when the target object is a breast, the measurement information includes an aspect ratio.
Optionally, the method further includes:
receiving a correction instruction for correcting the measurement information output by the personalized network model;
and correcting the measurement information according to the correction instruction.
Optionally, the method further includes:
after the measurement information is corrected, adding the ultrasonic image and the corrected measurement information to the sample information, and updating the personalized network model through the updated sample information;
alternatively,
submitting the ultrasonic image and the corrected measurement information to a training server, where the training server adds the ultrasonic image and the corrected measurement information to the sample information and updates the personalized network model with the updated sample information.
Optionally, the method further includes:
acquiring the sample information;
and training the initialized network according to the sample information to obtain the personalized network model.
In a second aspect, there is provided a method of ultrasound training, the method comprising:
acquiring sample information, wherein the sample information comprises n ultrasonic sample images historically acquired by a medical worker and measurement information corresponding to each ultrasonic sample image, and n is an integer greater than 1;
and training an initialization network according to the sample information to obtain a personalized network model, wherein the personalized network model is used for outputting the measurement information corresponding to an ultrasonic image acquired by the medical worker through imaging.
Optionally, the method further includes:
receiving the ultrasonic image and the corrected measurement information corresponding to the ultrasonic image;
adding the ultrasonic image and the corrected measurement information to the sample information, and updating the personalized network model according to the updated sample information.
In a third aspect, an ultrasound processing apparatus is provided, the apparatus comprising a memory and a processor, the memory having stored therein at least one program instruction, the at least one program instruction being loaded and executed by the processor to implement the method of the first or second aspect.
In a fourth aspect, there is provided a computer storage medium having stored therein at least one program instruction which is loaded and executed by a processor to implement a method according to the first or second aspect.
After the ultrasonic image is obtained, the measurement information is output by the pre-trained personalized network model of the medical worker, where the personalized network model is trained from that medical worker's sample information. This solves the problems of low efficiency and possible errors when measurement information is obtained by manual measurement in existing schemes, and achieves the effect of improving measurement efficiency and accuracy.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a method flow diagram of an ultrasound training method according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a first neural network according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a second neural network provided in an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a possible personalized network model according to an embodiment of the present invention.
Fig. 5 is a flowchart of a method of prompting measurement information according to an embodiment of the present invention.
Fig. 6 is a schematic hardware structure diagram of an ultrasound apparatus provided in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, which shows a flowchart of an ultrasound training method provided in an embodiment of the present application, the method includes:
Step 101, obtaining sample information, wherein the sample information includes n ultrasound sample images historically acquired by a medical worker and measurement information corresponding to each ultrasound sample image, and n is an integer greater than 1.
After the medical worker performs historical imaging, the target object in the acquired ultrasonic image is measured to obtain a measurement result; after the measurement result is confirmed to be accurate, the ultrasonic image and the corresponding measurement result can be added to the samples. Accordingly, the training server can obtain the samples.
In actual implementation, the training server can acquire the medical worker's historical measurement records from the medical worker's workstation, or the workstation can actively report the historical measurement records to the training server; the training server then uses the acquired historical measurement records as the sample information. Each historical measurement record comprises an ultrasonic image and its corresponding measurement result.
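Purely for illustration, the historical measurement records and sample information described above might be organized as follows; the field names and container classes are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass, field
from typing import Dict, List

import numpy as np


@dataclass
class MeasurementRecord:
    """One historical measurement record reported to the training server (hypothetical layout)."""
    ultrasound_image: np.ndarray      # the ultrasound image acquired by the medical worker
    target_type: str                  # e.g. "fetus", "thyroid", "carotid"
    measurements: Dict[str, float]    # e.g. {"biparietal_diameter_mm": 92.1, "femur_length_mm": 71.3}
    operator_id: str                  # identifies the medical worker whose model is being personalized


@dataclass
class SampleInformation:
    """The sample information: n records confirmed as accurate by the medical worker (n > 1)."""
    records: List[MeasurementRecord] = field(default_factory=list)

    def add(self, record: MeasurementRecord) -> None:
        self.records.append(record)
```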
Generally, n is a large value, for example greater than 1000, in order to ensure training accuracy.
The ultrasound sample image may be information obtained by scanning a target object, and the target object may be an organ such as a blood vessel, a fetus, a heart, a lung, a thyroid, a carotid artery, a breast, and the like, which is not limited in this embodiment.
When the target object is a blood vessel, the measurement information comprises at least one of the bending angle of the blood vessel, the inner diameter of the blood vessel, the blood flow speed and the size of a blood vessel plaque;
when the target object is a fetus, the measurement information may include the biparietal diameter, humerus length, femur length, abdominal circumference, and head circumference;
When the target object is a liver, the measurement information includes a size of liver cirrhosis;
when the target object is a kidney, the measurement information comprises at least one of a long diameter, a wide diameter, a thick diameter, a resistance index and a pulsation index of the kidney;
when the target object is a heart, the measurement information includes at least one of an inner diameter of an atrium, a thickness of an atrial wall, and a space between left and right atria;
when the target object is a thyroid gland, the measurement information includes a size and/or a shape of a thyroid nodule;
when the target object is a carotid artery, the measurement information comprises the size and/or shape of plaque;
when the target object is a breast, the measurement information includes an aspect ratio.
Step 102, training the initialization network according to the sample information to obtain a personalized network model, wherein the personalized network model is used for outputting the measurement information corresponding to an ultrasonic image.
After the sample information is acquired, the initialization network is trained according to it. Since only historical measurement records confirmed as accurate by the medical worker are used as sample information for training, the measurement information output by the trained personalized network model conforms to that medical worker's measurement standard.
In practical implementation, the initialization network comprises convolutional layers, pooling layers, activation function layers, fully connected layers, embedding layers, recurrent neural network layers, long short-term memory layers, and the like. The initialization network may consist of a single neural network or may include two or more neural networks. Because different target objects need to be measured in ultrasound images and different target objects are measured in different ways, in one possible implementation the initialization network includes a first neural network and a second neural network: the first neural network processes the ultrasound image to obtain the type of target object it contains, and the second neural network processes the ultrasound image containing that target object to obtain the measurement information. The types of target object may include blood vessels, fetuses, liver, kidney, heart, thyroid, carotid artery, breast, and so on, and the measurement information is the information measured for the corresponding blood vessel, fetus, liver, kidney, heart, thyroid, carotid artery, or breast.
Wherein:
Referring to fig. 2, the first neural network consists of convolutional and pooling layers; it processes the input ultrasound image and outputs the target type corresponding to the ultrasound image. Referring to fig. 3, the second neural network generally comprises convolutional layers, bilinear interpolation layers, deconvolution layers, and cross-layer (skip) connections; it processes an ultrasound image of the specified target type, extracts the envelope information of that target type, and regresses the corresponding measurement information. Of course, in practical implementation, the second neural network may also be a segmentation network, which is not limited in this embodiment.
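For illustration only, and not as part of the disclosure, the two networks described above might be sketched as follows (PyTorch assumed; the class names, channel widths, and layer counts are assumptions):

```python
import torch
import torch.nn as nn


class FirstNetwork(nn.Module):
    """Classifier built from convolution and pooling layers: ultrasound image -> target-object type."""
    def __init__(self, num_target_types: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_target_types)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


class SecondNetwork(nn.Module):
    """Encoder-decoder with bilinear upsampling, a deconvolution layer and a cross-layer (skip)
    connection, regressing the measurement values for one target-object type."""
    def __init__(self, num_measurements: int):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.MaxPool2d(2), nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec = nn.ConvTranspose2d(32 + 16, 16, 3, padding=1)   # skip connection from enc1
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_measurements))

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        d = self.dec(torch.cat([self.up(e2), e1], dim=1))          # envelope-like feature map
        return self.head(d)                                        # regressed measurement values
```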
When training the initialization network, the first neural network and the second neural network may be trained separately. For example, the first neural network is trained using the ultrasound sample images and the target object contained in each; after training, the first neural network can output the type of target object contained in an input ultrasound image. The second neural network is then trained using the ultrasound sample images of a given target type together with their measurement information; after training, the second neural network can output the measurement information for an input ultrasound image containing that target object.
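Under the same PyTorch assumptions, the separate training described above could be sketched as follows; the optimizer, learning rate, loss functions, and dataset wrappers are illustrative choices, not taken from the disclosure:

```python
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader


def train_first_network(model, loader: DataLoader, epochs: int = 10, lr: float = 1e-3):
    """Train the classifier on (ultrasound image, target-object type) pairs from the sample information."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for image, target_type in loader:
            loss = F.cross_entropy(model(image), target_type)
            opt.zero_grad(); loss.backward(); opt.step()


def train_second_network(model, loader: DataLoader, epochs: int = 10, lr: float = 1e-3):
    """Train the measurement regressor on (ultrasound image of one target type, measurement vector) pairs."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for image, measurements in loader:
            loss = F.l1_loss(model(image), measurements)
            opt.zero_grad(); loss.backward(); opt.step()
```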
In addition, to expand the measurement range of the trained personalized network, ultrasound images containing various target objects can be used as training samples, so that a second neural network corresponding to each target object is obtained through training. That is, in practical implementation, the personalized network model can include a plurality of second neural networks, each used to measure the measurement information of one target object. For example, referring to fig. 4, the personalized network model includes a first neural network for determining the type of target object in the ultrasound image and n second neural networks for measuring the sizes of different target objects, where n is an integer greater than or equal to 2.
The above only takes an initialization network comprising the first neural network and the second neural network as an example. Optionally, in another possible implementation, the initialization network may be a third neural network that takes the ultrasound image together with characteristic information carrying user characteristics as input and outputs personalized measurement information. Such a model generally consists of convolutional layers, bilinear interpolation layers, embedding layers, fully connected layers, long short-term memory layers, and the like, where the characteristic information carrying user characteristics may include features of different doctors, different hospitals, different regional distributions, different population classifications, and so on.
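A hedged sketch of such a third network follows, again assuming PyTorch; the embedding vocabularies, dimensions, and the simple concatenation-based fusion are assumptions, and the long short-term memory component for video input is omitted for brevity:

```python
import torch
import torch.nn as nn


class ThirdNetwork(nn.Module):
    """Single network taking the ultrasound image plus embedded user-characteristic features
    (doctor, hospital, region) and outputting personalized measurement values."""
    def __init__(self, num_doctors, num_hospitals, num_regions, num_measurements, emb_dim=8):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.doctor_emb = nn.Embedding(num_doctors, emb_dim)
        self.hospital_emb = nn.Embedding(num_hospitals, emb_dim)
        self.region_emb = nn.Embedding(num_regions, emb_dim)
        self.head = nn.Sequential(
            nn.Linear(32 + 3 * emb_dim, 64), nn.ReLU(),
            nn.Linear(64, num_measurements),
        )

    def forward(self, image, doctor_id, hospital_id, region_id):
        feats = torch.cat([
            self.backbone(image),
            self.doctor_emb(doctor_id),
            self.hospital_emb(hospital_id),
            self.region_emb(region_id),
        ], dim=1)
        return self.head(feats)
```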
It should be noted that, before training, each ultrasound sample image may be preprocessed, and the initialization network is trained on the preprocessed ultrasound sample images. The preprocessing referred to here may be normalization.
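As a minimal sketch of the normalization mentioned above (the intensity range and the per-image standardization are assumptions, not specified by the disclosure):

```python
import numpy as np


def preprocess(image: np.ndarray) -> np.ndarray:
    """Normalize one ultrasound sample image before training or inference."""
    image = image.astype(np.float32) / 255.0               # scale 8-bit intensities to [0, 1]
    return (image - image.mean()) / (image.std() + 1e-6)   # zero-mean, unit-variance per image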
Another point to note is that this embodiment only takes as an example the case where the sample information comprises the medical worker's own ultrasound sample images and corresponding measurement information, i.e. the personalized network is trained on that medical worker's own data. In practical implementation, the sample information may also include ultrasound images and corresponding measurement information from other medical workers, in which case the personalized network is trained on data from both the medical worker and the other medical workers. Fusing other medical workers' ultrasound images and corresponding measurement information into the training ensures the precision of the trained personalized network and further improves the accuracy of measuring a target object through it, avoiding the situation where a medical worker's own measurement errors are learned by the network and lower its measurement accuracy.
In summary, after the sample information is obtained, the initialization network is trained according to it to obtain a personalized network model, where the sample information includes n ultrasound sample images historically acquired by a medical worker and the measurement information corresponding to each ultrasound sample image. When the medical worker images again, the measurement information corresponding to the new ultrasonic image can be output directly by the personalized network model. This solves the problems of low efficiency and possible errors when measurement information is obtained by manual measurement in existing schemes, and achieves the effect of improving measurement efficiency and accuracy.
It should be noted that, after the personalized network model is obtained through training, the medical staff may modify the weight of each training parameter of the personalized network model according to personal needs, that is, the method may further include:
first, receiving an adjustment instruction for adjusting the weight of each training parameter in the personalized network model;
second, adjusting the weight of each training parameter in the personalized network model according to the adjustment instruction.
After the adjustment instruction is received, the weights of the training parameters are adjusted, and the personalized network model with the adjusted weights is used as the final trained personalized network model, which improves the accuracy of the trained personalized network model.
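The disclosure does not specify what form the "weight of each training parameter" takes; purely as an assumption, the sketch below treats it as a set of user-adjustable per-measurement loss weights consulted during retraining:

```python
from typing import Dict

# Hypothetical per-measurement loss weights; keys and defaults are assumptions, not from the disclosure.
training_weights: Dict[str, float] = {
    "biparietal_diameter": 1.0,
    "femur_length": 1.0,
    "head_circumference": 1.0,
}


def apply_adjustment_instruction(adjustments: Dict[str, float]) -> None:
    """Adjust the weight of individual training parameters in response to an adjustment instruction."""
    for name, weight in adjustments.items():
        if name in training_weights:
            training_weights[name] = weight


# Example: the medical worker emphasizes femur length during retraining.
apply_adjustment_instruction({"femur_length": 2.0})
```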
Referring to fig. 5, which shows a flowchart of a measurement information prompting method provided in an embodiment of the present application, the method includes:
step 201, acquiring an ultrasonic image;
the method comprises the steps of obtaining an ultrasonic image obtained by medical staff by drawing, wherein the ultrasonic image can be information acquired by the medical staff in real time through ultrasonic equipment or information acquired by the ultrasonic equipment in advance. The ultrasonic image is an ultrasonic image and/or an ultrasonic video.
The ultrasound image may be information obtained by scanning a target object, and the target object may be an organ such as a blood vessel, a fetus, a heart, a lung, a thyroid, a carotid artery, a breast, and the like, which is not limited in this embodiment.
Step 202, inputting the ultrasonic image into a personalized network model, wherein the output of the personalized network model is measurement information determined according to the ultrasonic image.
The personalized network model described in this embodiment is a model obtained by training in the embodiment shown in fig. 1.
In combination with the description of the above embodiment, the ultrasound image may be input to the first neural network in the personalized network to determine the type of target object it contains; the ultrasound image is then input to the second neural network used to measure that type of target object, which outputs the measurement information of the target object.
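Read together with the training sketches above, this inference step can be illustrated as routing the image from the first network to the second network matching the predicted target type; `FirstNetwork`, `SecondNetwork`, and the single-image assumption are illustrative, not part of the disclosure:

```python
import torch
import torch.nn as nn


class PersonalizedModel(nn.Module):
    """First network predicts the target-object type; the matching second network returns its measurements."""
    def __init__(self, first_network: nn.Module, second_networks: nn.ModuleDict, type_names: list):
        super().__init__()
        self.first_network = first_network
        self.second_networks = second_networks   # one measurement network per target-object type
        self.type_names = type_names             # index -> type name, e.g. ["fetus", "thyroid", ...]

    @torch.no_grad()
    def forward(self, image: torch.Tensor):
        # Assumes a batch of one image, as in the prompting workflow described above.
        type_index = self.first_network(image).argmax(dim=1).item()
        target_type = self.type_names[type_index]
        measurements = self.second_networks[target_type](image)
        return target_type, measurements
```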
The personalized network model is trained according to sample information, the sample information comprises n ultrasonic sample images historically acquired by the medical worker and measurement information corresponding to each ultrasonic sample image, and n is an integer greater than 1.
Optionally, after the personalized network model outputs the measurement information, the medical worker may judge subjectively whether the measurement information is accurate; when it is judged to be inaccurate, the medical worker may correct the measurement information. That is, the method may further include:
firstly, receiving a correction instruction for correcting the measurement information output by the personalized network model;
secondly, the measurement information is corrected according to the correction instruction.
A correction by the medical worker of the personalized network model's output indicates that the model needs optimization. In this case, to improve the accuracy of the personalized network model, the method may further include:
after the measurement information is corrected, adding the ultrasonic image and the corrected measurement information to the sample information, and updating the personalized network model through the updated sample information;
alternatively,
submitting the ultrasonic image and the corrected measurement information to a training server, where the training server adds the ultrasonic image and the corrected measurement information to the sample information and updates the personalized network model with the updated sample information.
By refreshing the sample information in time, the accuracy of the trained personalized network model is maintained, which in turn ensures the accuracy of the measurement information output by the personalized network model.
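Under the same assumptions as the earlier sketches, the update step could look like the following; `build_loader` is a hypothetical helper, the `operator_id` fallback is invented for illustration, and fine-tuning rather than full retraining is an assumption:

```python
def update_personalized_model(model, sample_info, image, corrected_measurements,
                              target_type, fine_tune_epochs: int = 3):
    """Add the corrected record to the sample information and refresh the personalized network model."""
    sample_info.add(MeasurementRecord(
        ultrasound_image=image,
        target_type=target_type,
        measurements=corrected_measurements,
        operator_id=sample_info.records[0].operator_id if sample_info.records else "unknown",
    ))
    # Hypothetical helper that wraps the updated sample information for this target type in a DataLoader.
    loader = build_loader(sample_info, target_type)
    train_second_network(model.second_networks[target_type], loader, epochs=fine_tune_epochs)
    return model
```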
Step 203, displaying prompt information when the comparison of the measurement information with a preset threshold meets the prompt condition.
After the measurement information is obtained, the prompt information corresponding to each type of measurement information is displayed in a differentiated manner according to the magnitude relation between each type of measurement information and its corresponding preset threshold.
When the measurement information includes multiple types, a corresponding threshold can be set for each type of measurement information; after the measurement information in the ultrasonic image is confirmed, each type of measurement information is compared with its corresponding preset threshold, and the corresponding prompt information is displayed in a differentiated manner according to the magnitude relation. The preset threshold corresponding to each type of measurement information may be a default value or a value customized by the medical worker, which is not limited in this embodiment. For example, taking plaque as an example, when the plaque size exceeds a × b, intervention treatment is required; for the case where the measurement information is plaque size, the corresponding preset threshold may therefore be set to a × b.
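A minimal sketch of this comparison against per-type preset thresholds, using the plaque example above; the threshold values and key names are assumptions:

```python
from typing import Dict, Tuple

# Hypothetical preset thresholds; a value triggers the prompt condition when it exceeds its threshold.
preset_thresholds: Dict[str, float] = {
    "plaque_area_mm2": 12.0,          # stands in for the a × b plaque-size threshold above
    "vessel_inner_diameter_mm": 2.0,
}


def check_prompt_conditions(measurements: Dict[str, float]) -> Dict[str, Tuple[float, bool]]:
    """Return each measurement value with a flag saying whether its prompt condition is satisfied."""
    results = {}
    for name, value in measurements.items():
        threshold = preset_thresholds.get(name)
        results[name] = (value, threshold is not None and value > threshold)
    return results


# Example: a 4 mm x 4 mm plaque (16 mm^2) exceeds the 12 mm^2 threshold and triggers a prompt.
print(check_prompt_conditions({"plaque_area_mm2": 16.0}))
```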
The differentiated display of the prompt information corresponding to each type of measurement information includes:
displaying the prompt information corresponding to each type of measurement information in a differentiated manner using at least one of underlining, bold, italics, font color, and background color.
For example, prompt information corresponding to different measurement information may be displayed with different background colors; as another example, it may be displayed underlined and in bold, which is not described again here.
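For illustration, the differentiated display could be rendered with simple text styling; the ANSI-escape approach below (reusing the `check_prompt_conditions` helper sketched above) is an assumption about the display layer, not part of the disclosure:

```python
BOLD, UNDERLINE, RED_BG, RESET = "\033[1m", "\033[4m", "\033[41m", "\033[0m"


def render_prompt(name: str, value: float, exceeds_threshold: bool) -> str:
    """Show out-of-range measurements bold, underlined and on a red background; others unstyled."""
    text = f"{name}: {value:.1f}"
    return f"{BOLD}{UNDERLINE}{RED_BG}{text}{RESET}" if exceeds_threshold else text


for name, (value, flagged) in check_prompt_conditions({"plaque_area_mm2": 16.0,
                                                       "vessel_inner_diameter_mm": 1.8}).items():
    print(render_prompt(name, value, flagged))
```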
In summary, after the ultrasound image is obtained, the measurement information is output by the pre-trained personalized network model of the medical worker, where the personalized network model is trained from that medical worker's sample information. This solves the problems of low efficiency and possible errors when measurement information is obtained by manual measurement in existing schemes, and achieves the effect of improving measurement efficiency and accuracy.
This embodiment also discloses an ultrasonic processing apparatus comprising a memory and a processor, wherein at least one program instruction is stored in the memory, and the processor loads and executes the at least one program instruction to implement the method described above.
This embodiment also discloses a computer storage medium in which at least one program instruction is stored, and the at least one program instruction is loaded by a processor to execute the method described above.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an ultrasound apparatus according to an alternative embodiment of the present invention. As shown in fig. 6, the ultrasound apparatus may include: at least one processor 61, such as a CPU (Central Processing Unit); at least one communication interface 63; a memory 64; and at least one communication bus 62. The communication bus 62 is used to implement connection and communication between these components. The communication interface 63 may include a display and a keyboard, and optionally may also include a standard wired interface and a standard wireless interface. The memory 64 may be a high-speed volatile random access memory (RAM) or a non-volatile memory, such as at least one disk memory; optionally, the memory 64 may be at least one storage device located remotely from the processor 61. An application program is stored in the memory 64, and the processor 61 calls the program code stored in the memory 64 to perform any of the above-mentioned method steps.
The communication bus 62 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The communication bus 62 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 6, but this is not intended to represent only one bus or type of bus.
The memory 64 may include a volatile memory, such as a random-access memory (RAM); the memory may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 64 may also comprise a combination of the above types of memory.
The processor 61 may be a Central Processing Unit (CPU), a Network Processor (NP), or a combination of CPU and NP.
The processor 61 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The aforementioned PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
Optionally, the memory 64 is also used to store program instructions. The processor 61 may call program instructions to implement the method as shown in the embodiments of fig. 1 and 5 of the present application.
Embodiments of the present invention further provide a non-transitory computer storage medium, where computer-executable instructions are stored, and the computer-executable instructions may execute the method in any of the above method embodiments. The storage medium may be a magnetic Disk, an optical Disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory (Flash Memory), a Hard Disk (Hard Disk Drive, abbreviated as HDD), a Solid State Drive (SSD), or the like; the storage medium may also comprise a combination of memories of the kind described above.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (10)

1. A method for prompting measurement information is characterized by comprising the following steps:
acquiring an ultrasonic image, wherein the ultrasonic image is an ultrasonic still image and/or an ultrasonic video;
inputting the ultrasonic image into a personalized network model, wherein the output of the personalized network model is measurement information determined according to the ultrasonic image;
when the measurement information is compared with a preset threshold value and meets a prompt condition, displaying prompt information;
the personalized network model is trained according to sample information, the sample information comprises n ultrasonic sample images historically acquired by a medical worker and measurement information corresponding to each ultrasonic sample image, and n is an integer greater than 1.
2. The method according to claim 1, wherein the measurement information includes at least two types, each type of measurement information is provided with a corresponding preset threshold, and when the measurement information meets a prompt condition compared with the preset threshold, displaying the prompt information includes:
and distinguishing and displaying the prompt information corresponding to each type of measurement information according to the magnitude relation between each type of measurement information and the corresponding preset threshold.
3. The method according to claim 2, wherein displaying the prompt information corresponding to each type of measurement information in a differentiated manner includes:
and distinguishing and displaying the prompt information corresponding to each type of measurement information using at least one of underlining, bold, italics, font color, and background color.
4. The method of claim 1, wherein the ultrasound image is information including a target object including at least one of a blood vessel, a fetus, a liver, a kidney, a heart, a thyroid, a carotid artery, and a breast.
5. The method of claim 4,
when the target object is a blood vessel, the measurement information comprises at least one of the bending angle of the blood vessel, the inner diameter of the blood vessel, the blood flow speed and the size of a blood vessel plaque;
when the target object is a fetus, the measurement information comprises the biparietal diameter, humerus length, femur length, abdominal circumference, and head circumference;
when the target object is a liver, the measurement information includes a size of liver cirrhosis;
when the target object is a kidney, the measurement information comprises at least one of a long diameter, a wide diameter, a thick diameter, a resistance index and a pulsation index of the kidney;
when the target object is a heart, the measurement information includes at least one of an inner diameter of an atrium, a thickness of an atrial wall, and a space between left and right atria;
when the target object is a thyroid gland, the measurement information includes a size and/or a shape of a thyroid nodule;
when the target object is a carotid artery, the measurement information comprises the size and/or shape of plaque;
when the target object is a breast, the measurement information includes an aspect ratio.
6. The method of claim 1, further comprising:
receiving a correction instruction for correcting the measurement information output by the personalized network model;
and correcting the measurement information according to the correction instruction.
7. The method of claim 6, further comprising:
after the measurement information is corrected, adding the ultrasonic image and the corrected measurement information to the sample information, and updating the personalized network model through the updated sample information;
alternatively,
submitting the ultrasonic image and the corrected measurement information to a training server, adding the ultrasonic image and the corrected measurement information to the sample information by the training server, and updating the personalized network model through the updated sample information.
8. The method of any of claims 1 to 7, further comprising:
acquiring the sample information;
and training the initialized network according to the sample information to obtain the personalized network model.
9. A method of ultrasound training, the method comprising:
acquiring sample information, wherein the sample information comprises n ultrasonic sample images acquired by medical staff through historical imaging and measurement information corresponding to each ultrasonic sample image, and n is an integer larger than 1;
and training an initialization network according to the sample information to obtain a personalized network model, wherein the personalized network model is used for outputting the measurement information corresponding to an ultrasonic image acquired by the medical worker through imaging.
10. The method of claim 9, further comprising:
receiving the ultrasonic image and the corrected measurement information corresponding to the ultrasonic image;
adding the ultrasonic image and the corrected measurement information to the sample information, and updating the personalized network model according to the updated sample information.
CN202010685946.1A 2020-07-16 2020-07-16 Measurement information prompting method and ultrasonic training method Pending CN111860636A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010685946.1A CN111860636A (en) 2020-07-16 2020-07-16 Measurement information prompting method and ultrasonic training method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010685946.1A CN111860636A (en) 2020-07-16 2020-07-16 Measurement information prompting method and ultrasonic training method

Publications (1)

Publication Number Publication Date
CN111860636A true CN111860636A (en) 2020-10-30

Family

ID=72983620

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010685946.1A Pending CN111860636A (en) 2020-07-16 2020-07-16 Measurement information prompting method and ultrasonic training method

Country Status (1)

Country Link
CN (1) CN111860636A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090024037A1 (en) * 2007-07-17 2009-01-22 Kabushiki Kaisha Toshiba Ultrasonic imaging apparatus and a method of acquiring ultrasonic images
CN109727243A (en) * 2018-12-29 2019-05-07 无锡祥生医疗科技股份有限公司 Breast ultrasound image recognition analysis method and system
CN110613480A (en) * 2019-01-14 2019-12-27 广州爱孕记信息科技有限公司 Fetus ultrasonic dynamic image detection method and system based on deep learning
CN110680399A (en) * 2019-10-25 2020-01-14 深圳度影医疗科技有限公司 Automatic measurement method of prenatal ultrasound image, storage medium and ultrasound equipment
CN111310851A (en) * 2020-03-03 2020-06-19 四川大学华西第二医院 Artificial intelligence ultrasonic auxiliary system and application thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘兴会 (Liu Xinghui) et al.: "产科临床诊疗流程" [Obstetric Clinical Diagnosis and Treatment Procedures], vol. 01, 30 September 2010, 人民军医出版社 [People's Military Medical Press], pages 289-294 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112331049A (en) * 2020-11-04 2021-02-05 无锡祥生医疗科技股份有限公司 Ultrasonic simulation training method and device, storage medium and ultrasonic equipment
CN113393456A (en) * 2021-07-13 2021-09-14 湖南大学 Automatic quality control method of early pregnancy fetus standard section based on multiple tasks

Similar Documents

Publication Publication Date Title
CN108701354B (en) Method and system for identifying contour of interest region in ultrasonic image
US10424067B2 (en) Image processing apparatus, image processing method and storage medium
CN111325759B (en) Vessel segmentation method, apparatus, computer device, and readable storage medium
CN110458837B (en) Image post-processing method and device, electronic equipment and storage medium
CN110310256A (en) Coronary stenosis detection method, device, computer equipment and storage medium
CN111383259B (en) Image analysis method, computer device, and storage medium
CN111860636A (en) Measurement information prompting method and ultrasonic training method
EP3722996A2 (en) Systems and methods for processing 3d anatomical volumes based on localization of 2d slices thereof
CN113066090A (en) Training method and device, application method and device of blood vessel segmentation model
JP7205034B2 (en) Method, image processing device and storage medium for determining midsagittal plane in magnetic resonance images
CN112052896A (en) Image processing method and device, and classification model training method and device
CN111863204A (en) Mammary gland disease AI auxiliary diagnosis method and system based on molybdenum target X-ray photographic examination
CN115526833A (en) Ultrasonic image-based ejection fraction calculation device, method, medium, and apparatus
CN113469963B (en) Pulmonary artery image segmentation method and device
CN111383236B (en) Method, apparatus and computer-readable storage medium for labeling regions of interest
CN110739050B (en) Left ventricle full-parameter and confidence coefficient quantification method
US20220301177A1 (en) Updating boundary segmentations
CN109767468B (en) Visceral volume detection method and device
CN116524158A (en) Interventional navigation method, device, equipment and medium based on image registration
CN114445391B (en) Blood vessel segmentation method and device, electronic device and computer readable storage medium
CN116130090A (en) Ejection fraction measuring method and device, electronic device, and storage medium
CN115775233A (en) Processing method and device for measuring characteristic dimension based on cardiac ultrasound video
CN111820950A (en) Personalized information determination device and ultrasonic training method
CN114565623A (en) Pulmonary vessel segmentation method, device, storage medium and electronic equipment
CN112426170A (en) Placenta thickness determination method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination