CN116458917A - Ultrasound imaging system and auxiliary display method of ultrasound image - Google Patents

Ultrasound imaging system and auxiliary display method of ultrasound image

Info

Publication number: CN116458917A
Application number: CN202210027817.2A
Authority: CN (China)
Prior art keywords: standard, image, ultrasonic, ultrasonic image, ultrasound
Legal status: Pending (the status listed is an assumption and not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventors: 许梦玲, 裴海军, 温博, 赵雅轩, 刘硕, 林穆清
Current and original assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority to: CN202210027817.2A
Publication of: CN116458917A

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 - Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 - Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 - Detecting organic movements or changes for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/44 - Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444 - Constructional features related to the probe
    • A61B 8/46 - Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 - Displaying means of special interest
    • A61B 8/52 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 - Devices using data or image processing involving processing of medical diagnostic data
    • A61B 8/5269 - Devices using data or image processing involving detection or reduction of artifacts
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Vascular Medicine (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An ultrasound imaging system and an auxiliary display method of ultrasound images are provided. When the ultrasound imaging system scans a tissue to be examined, the transmission control circuit excites the ultrasound probe to transmit ultrasonic waves to the tissue; the reception control circuit receives, from the probe, the echoes of the ultrasonic waves returned by the tissue, thereby obtaining ultrasonic echo signals. The processor processes the echo signals to obtain an ultrasound image of the tissue, determines the standard section type to which the image belongs and which contains specific feature structures, identifies in the image the target feature structures corresponding to those specific feature structures, determines a standard section structure schematic diagram according to the section type, and controls the display to show the ultrasound image together with the schematic diagram, in which the target feature structures are highlighted. From the schematic diagram, the operator can intuitively and quickly see which specific feature structures are present in the current ultrasound image, which improves the efficiency of acquiring standard sections.

Description

Ultrasound imaging system and auxiliary display method of ultrasound image
Technical Field
The application relates to the technical field of ultrasound, in particular to an ultrasound imaging system and an auxiliary display method of an ultrasound image.
Background
In ultrasound examination and in ultrasound clinical teaching or training, an operator of an ultrasound imaging system (referred to herein simply as an operator) may scan the tissue site to be examined of the person under examination with the ultrasound imaging system, and when an ultrasound image of a standard section type is obtained, that image can be saved and used for ultrasound description or measurement. Taking clinical cardiac ultrasound examination as an example, standard sections are generally needed to support an accurate assessment of the patient's heart; these typically include the parasternal long-axis, parasternal short-axis, apical four-chamber and subxiphoid inferior vena cava sections, and so on. The doctor scans the patient's heart with the ultrasound imaging system and, once a standard section has been scanned, saves the corresponding ultrasound image.
However, during scanning the operator has to keep moving the probe while observing in real time whether the dynamically changing ultrasound image is the required target image, so the efficiency of acquiring standard sections is not high.
Disclosure of Invention
The main technical problem addressed by this application is the low efficiency of acquiring standard sections.
According to a first aspect, there is provided in one embodiment an ultrasound imaging system comprising:
an ultrasonic probe for transmitting ultrasonic waves and receiving echo signals of the ultrasonic waves;
a transmission and reception control circuit for controlling the ultrasonic probe to transmit ultrasonic waves and to receive echo signals of the ultrasonic waves;
a processor for generating an ultrasound image from the echo signals of the ultrasonic waves and determining, from the ultrasound image, the standard section type to which it belongs, the standard section type containing specific feature structures; for identifying, in the ultrasound image, the target feature structures corresponding to the specific feature structures; and for determining, according to the standard section type, a corresponding standard section structure schematic diagram, the schematic diagram containing structure regions corresponding to the specific feature structures; and
a display for displaying the ultrasound image and the standard section structure schematic diagram, wherein the structure region corresponding to the target feature structure in the schematic diagram is highlighted.
According to a second aspect, in one embodiment, there is provided an auxiliary display method of an ultrasound image, including:
Transmitting ultrasonic waves to a target tissue;
receiving echo signals of the ultrasonic waves;
generating an ultrasonic image according to the echo signal of the ultrasonic wave;
determining a standard section type to which the ultrasonic image belongs according to the ultrasonic image, wherein the standard section type comprises a specific characteristic structure;
identifying a target feature structure corresponding to the specific feature structure in the ultrasonic image according to the ultrasonic image;
determining a corresponding standard section structure diagram according to the standard section type, wherein the standard section structure diagram comprises a structure area corresponding to the specific characteristic structure;
and displaying the ultrasonic image and the standard section structure schematic diagram, wherein a structure area corresponding to the target characteristic structure in the standard section structure schematic diagram is highlighted.
According to a third aspect, an embodiment provides a computer readable storage medium having stored thereon a program executable by a processor to implement a method as in any of the second aspects above.
According to the ultrasound imaging system, the auxiliary display method of an ultrasound image and the readable storage medium of the above embodiments, each ultrasound image acquired while scanning the tissue to be examined can be processed as follows: the transmission control circuit excites the ultrasound probe to transmit ultrasonic waves to the tissue; the reception control circuit receives from the probe the echoes of the ultrasonic waves returned by the tissue, obtaining ultrasonic echo signals; the processor processes the echo signals to obtain an ultrasound image of the tissue, determines the standard section type to which the image belongs and which contains specific feature structures, identifies in the image the target feature structures corresponding to those specific feature structures, determines the standard section structure schematic diagram according to the section type, and controls the display to show the ultrasound image together with the schematic diagram, in which the target feature structures identified from the image are highlighted. During scanning the ultrasound image changes in real time with the position and angle of the probe, so the schematic diagram displayed with it also changes in real time. Because the schematic diagram is clear, intuitive and concise, the operator can intuitively and quickly determine from it whether the currently scanned section is a standard section and which specific feature structures it already contains; this helps the operator acquire the standard section quickly and obtain the related information from it quickly, improves the efficiency of acquiring standard sections, and facilitates quality-control review.
Drawings
Fig. 1 is a schematic structural diagram of an ultrasound imaging system according to an embodiment of the present application;
Fig. 2A is an ultrasound image of a liver-kidney section according to an embodiment of the present application;
Fig. 2B is a schematic diagram of the structure of a liver-kidney section according to an embodiment of the present application;
Fig. 3A is an ultrasound image of a second portal section according to an embodiment of the present application;
Fig. 3B is a schematic diagram of the structure of a second portal section according to an embodiment of the present application;
Fig. 4A is a display interface showing an ultrasound image and its schematic diagram according to an embodiment of the present application;
Fig. 4B is a display interface showing another ultrasound image and its schematic diagram according to an embodiment of the present application;
Fig. 5A is an ultrasound image of a parasternal long-axis section according to an embodiment of the present application;
Fig. 5B is a schematic diagram of the structure of a parasternal long-axis section according to an embodiment of the present application;
Fig. 5C is an ultrasound image of a parasternal short-axis section according to an embodiment of the present application;
Fig. 5D is a schematic diagram of the structure of a parasternal short-axis section according to an embodiment of the present application;
Fig. 6A is a schematic diagram of a standard-grade display in the form of a bar provided in the present application;
Fig. 6B is a schematic diagram of a standard-grade display in the form of a five-pointed star provided in the present application;
Fig. 7 is a schematic diagram of another ultrasound image display interface provided in the present application;
Fig. 8 is a flowchart of an auxiliary display method of an ultrasound image according to an embodiment of the present application.
Detailed Description
The present application is described in further detail below with reference to the accompanying drawings by way of specific embodiments, in which like elements in different embodiments carry associated like numerals. In the following embodiments, numerous specific details are set forth to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of these features may be omitted in different situations, or be replaced by other elements, materials or methods. In some instances, certain operations related to the present application are not shown or described in the specification in order to avoid obscuring the core of the present application; a detailed description of these operations is not necessary, since those skilled in the art can fully understand them from the description herein and from their general knowledge.
Furthermore, the described features, operations, or characteristics of the description may be combined in any suitable manner in various embodiments. Also, various steps or acts in the method descriptions may be interchanged or modified in a manner apparent to those of ordinary skill in the art. Thus, the various orders in the description and drawings are for clarity of description of only certain embodiments, and are not meant to be required orders unless otherwise indicated.
The numbering of components, e.g. "first", "second", etc., is used herein merely to distinguish the described objects and has no sequential or technical meaning. The terms "connected" and "coupled", as used herein, cover both direct and indirect connection (coupling), unless otherwise indicated.
As shown in fig. 1, the ultrasound imaging system provided in this embodiment may include, but is not limited to: an ultrasound probe 30, transmit and receive control circuitry 40, a beam forming module 50, an IQ demodulation module 60, a processor 20, a display 70, and a memory 80.
The ultrasonic probe 30 is used to transmit ultrasonic waves and to receive their echo signals. The ultrasonic probe 30 may include a transducer (not shown) composed of a plurality of array elements arranged in an array; the elements may be arranged in a row to form a linear array, in a two-dimensional matrix to form an area array, or in a convex arrangement to form a convex array. The array elements transmit ultrasonic beams in response to excitation electrical signals, or convert received ultrasonic beams into electrical signals. Each element can therefore convert between electrical pulse signals and ultrasonic beams, transmitting ultrasonic waves into the tissue to be examined (for example an organ, tissue, blood vessel or region of interest such as a fetus in a human or animal body) and receiving the ultrasonic echoes reflected by that tissue. During ultrasonic detection, the transmission control circuit 410 and the reception control circuit 420 control which elements are used to transmit ultrasonic beams and which are used to receive them, or control the elements to be used, in time slots, for transmitting ultrasonic beams or receiving their echoes. The elements participating in transmission may be excited by electrical signals simultaneously, so that they emit ultrasonic waves at the same time, or they may be excited by several electrical signals separated by a certain time interval, so that ultrasonic waves are emitted continuously with that interval.
The array elements, for example, employ piezoelectric crystals that convert electrical signals into ultrasound signals in accordance with a transmit sequence transmitted by the transmit control circuit 410, which may include one or more scan pulses, one or more reference pulses, one or more push pulses, and/or one or more doppler pulses, depending on the application. Depending on the morphology of the wave, the ultrasonic signal includes a focused wave and a plane wave.
The user selects a suitable position and angle by moving the ultrasonic probe 30 so as to transmit ultrasonic waves to the tissue to be examined and receive the echoes it returns; the probe outputs ultrasonic echo signals. The echo signals are analog electrical signals organized into channels, each receiving array element forming one channel, and they carry amplitude information, frequency information and time information.
The transmission and reception control circuit 40 is for controlling the ultrasonic probe to perform transmission of ultrasonic waves and reception of echo signals of the ultrasonic waves. The transmission and reception control circuit 40 may include a transmission control circuit 410 and a reception control circuit 420, among others.
The transmission control circuit 410 is configured to generate a transmission sequence under the control of the processor 20. The transmission sequence controls some or all of the array elements to transmit ultrasonic waves to the target tissue; its parameters include the positions of the transmitting elements, the number of elements, and the ultrasonic beam transmission parameters (such as amplitude, frequency, number of transmissions, transmission interval, transmission angle, waveform and focal position). In some cases the transmission control circuit 410 also applies phase delays to the transmitted beams, so that different transmitting elements emit ultrasonic waves at different times and each transmitted beam can be focused at a predetermined region of interest. The transmission sequence parameters may differ between operating modes, such as B-image mode, C-image mode and D-image mode (Doppler mode); after the echo signals are received by the reception control circuit 420 and processed by the subsequent modules and corresponding algorithms, a B image reflecting the anatomy of the tissue, a C image reflecting both anatomy and blood-flow information, or a D image reflecting a Doppler spectrum can be generated.
The reception control circuit 420 is configured to receive ultrasonic echo signals from the ultrasonic probe and to process them. The reception control circuit 420 may include one or more amplifiers, analog-to-digital converters (ADCs), and the like. The amplifier amplifies the received echo signals with appropriate gain compensation; the ADC samples the analog echo signals at predetermined time intervals to convert them into digitized signals, which still retain amplitude information, frequency information and phase information. The data output by the reception control circuit 420 may be sent to the beam forming module 50 for processing, or to the memory 80 for storage.
The beam forming module 50 is in signal connection with the reception control circuit 420 and performs beam forming processing such as delay and weighted summation on the echo signals. Because the distances from an ultrasonic receiving point in the tissue to the different receiving elements differ, the channel data for the same receiving point output by different elements have delay differences; delay processing is therefore applied to align the phases, and the different channel data for the same receiving point are weighted and summed to obtain beam-formed ultrasonic image data. The ultrasonic image data output by the beam forming module 50 is also referred to as radio-frequency data (RF data). The beam forming module 50 outputs the RF data to the IQ demodulation module 60. In some embodiments, the beam forming module 50 may also output the RF data to the memory 80 for buffering or saving, or directly to the processor 20 for image processing.
The beam forming module 50 may carry out the above functions in hardware, firmware or software. For example, the beam forming module 50 may include a central processing unit (CPU), one or more microprocessor chips, or any other electronic component capable of processing input data according to specific logic instructions; when implemented in software, it may execute instructions stored on a tangible, non-transitory computer-readable medium (e.g., the memory) to perform the beam forming calculations using any suitable beam forming method.
In some embodiments, the beam forming module 50 is not required.
The IQ demodulation module 60 removes the signal carrier by IQ demodulation, extracts the tissue structure information contained in the signal, and performs filtering to remove noise, and the signal obtained at this time is referred to as a baseband signal (IQ data pair). The IQ demodulation module 60 outputs IQ data pairs to the processor 20 for image processing.
In some embodiments, the IQ demodulation module 60 also outputs IQ data pairs to the memory 80 for buffering or saving so that the processor 20 reads the data from the memory 80 for subsequent image processing.
The IQ demodulation module 60 may also perform the above functions in hardware, firmware or software, and in some embodiments, the IQ demodulation module 60 may also be integrated with the beam forming module 50 in a single chip.
In some embodiments, IQ demodulation module 60 is not required.
The processor 20 is arranged to generate an ultrasound image from echo signals of said ultrasound waves. For example, the processor 20 processes the ultrasound data to generate a gray scale image of the signal intensity variations over the scan range that reflects the anatomy inside the tissue, referred to as the B image. The processor 20 may output the B-image to the display 70 for display.
The processor 20 may be a central processing unit (CPU), one or more microprocessors, a graphics processing unit (GPU), or any other electronic component capable of processing input data according to specific logic instructions. According to the input or predetermined instructions, it may control peripheral electronic components, read from and/or write to the memory 80, and process the input data by executing programs in the memory 80, for example by performing one or more processing operations on the acquired ultrasound data according to one or more operating modes. These operations include, but are not limited to, adjusting or defining the form of the ultrasonic waves emitted by the ultrasound probe 30, generating image frames for subsequent display on the display 70, and adjusting or defining the content and form displayed on the display 70, including one or more display settings of the displayed items (e.g., ultrasound images, interface components, schematic diagrams).
The acquired ultrasound data may be processed by the processor 20 in real time during scanning or therapy as the echo signals are received, or may be temporarily stored on the memory 80 and processed in near real time in an on-line or off-line operation.
In this embodiment, the processor 20 controls the operations of the transmission control circuit 410 and the reception control circuit 420, for example, controls the transmission control circuit 410 and the reception control circuit 420 to alternately operate or simultaneously operate. The processor 20 may also determine an appropriate operation mode according to a user's selection or a program setting, form a transmission sequence corresponding to the current operation mode, and send the transmission sequence to the transmission control circuit 410, so that the transmission control circuit 410 controls the ultrasound probe 30 to transmit ultrasound waves using the appropriate transmission sequence.
The display 70 may be a touch display screen, a liquid crystal display screen, or the like, or may be an independent display device such as a liquid crystal display, a television, or the like, which is independent of the ultrasound imaging system 10, or may be a display screen on an electronic device such as a mobile phone, a tablet computer, or the like.
The memory 80 may be a volatile memory, such as a random-access memory (RAM); or a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); or a combination of the above types of memory, and it provides instructions and data to the processor.
In some embodiments, memory 80 is not required.
Optionally, the ultrasound imaging system provided in this embodiment may further include: and a man-machine interaction device.
The human-machine interaction device is used for human-machine interaction, that is, for receiving user input and outputting visual information. Its input device detects the user's input, which may be, for example, a control instruction for the ultrasonic transmit/receive timing, an operation instruction for editing or annotating an ultrasound image, or another type of instruction. The input device may include at least one of a keyboard, a mouse, a scroll wheel, a trackball, operating buttons and a mobile input device (e.g., a mobile device with a touch screen, a mobile phone, etc.), or a touch screen integrated with the display 70 may be used; the output may use the display 70. The human-machine interaction device may also include an output device such as a printer, for example for printing ultrasound reports.
It will be appreciated that the components of the ultrasound imaging system shown in fig. 1 are illustrative only; the system may include more or fewer components, and the invention is not limited in this respect.
The processor 20 may be implemented in software, hardware, firmware or a combination thereof, and may use at least one of a circuit, a single or multiple application-specific integrated circuits (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a central processing unit (CPU), a controller, a microcontroller and a microprocessor, or any combination thereof, so that the processor 20 can perform some or all of the steps of the auxiliary display method of an ultrasound image in the various embodiments of the present application.
With the ultrasound imaging system and the auxiliary display method of the above embodiments, while the ultrasound imaging device scans the tissue, the ultrasound image and the structure schematic diagram of the standard section type to which it belongs (a type containing specific feature structures) are displayed in real time, and the target feature structures identified in the ultrasound image are highlighted in the schematic diagram. Because the schematic diagram is clear, intuitive and concise, the operator can intuitively and quickly determine from it whether the currently scanned section is a standard section and which specific feature structures it already contains. This helps the operator acquire the standard section and the related information quickly, improves the efficiency of acquiring standard sections, and facilitates quality-control assessment.
The technical solution of the present application is described in detail below.
In the ultrasound imaging system provided in this embodiment, the transmission and reception control circuit 40 controls the ultrasonic probe 30 to transmit ultrasonic waves and to receive echo signals of the ultrasonic waves.
The processor 20 generates an ultrasound image from the echo signals of the ultrasound waves.
The ultrasound image may be a real-time image acquired during real-time scanning, or an offline image previously acquired by the ultrasound imaging device. It may be two-dimensional or three-dimensional, and may be dynamic video data or a single image frame. Ultrasound images of the corresponding imaging modes, such as B-mode or C-mode images, are acquired according to the type of standard section to be identified.
The processor 20 determines, from the ultrasound image, the standard section type to which the ultrasound image belongs.
The standard section type to which the ultrasonic image belongs is the type of the standard section closest to the ultrasonic image. In an ultrasound examination, an operator obtains an ultrasound image of a standard section by scanning.
Each type of ultrasound examination has several corresponding standard sections. For example, the standard sections in an abdominal ultrasound examination of the liver, gallbladder, pancreas and spleen may generally include: the first portal section, the second portal section, the longitudinal section of the left liver lobe through the abdominal aorta, the liver-kidney section, the subxiphoid transverse section, the sagittal section of the portal vein, and the long-axis sections of the gallbladder, pancreas and spleen. Likewise, a bedside cardiac ultrasound examination typically requires certain standard sections to support an accurate assessment of the patient's heart, generally including: the parasternal long-axis section, the parasternal short-axis section, the apical four-chamber section, the subxiphoid four-chamber section, the subxiphoid inferior vena cava section, and so on.
A standard section type contains specific feature structures. Each section image generally contains a variety of feature structures; an ultrasound image of a given standard section type contains certain specific feature structures, which belong to those various structures and indicate the standard section type. The specific feature structures in a standard section are the key basis for identifying the standard section type and are the structures that users pay the most attention to clinically. Which specific feature structures each standard section contains can be determined by considering both the relevant clinical knowledge and the feasibility of algorithmic identification.
For example, the specific feature structures contained in the second portal section are the middle hepatic vein, the right hepatic vein and the left hepatic vein; if the ultrasound image contains the middle, right and left hepatic veins, it is considered to be an ultrasound image of the second portal section. As another example, the specific feature structures contained in the parasternal short-axis view include the right ventricle, the mitral valve and the left ventricle; if the ultrasound image contains them, it is considered to be an ultrasound image of the parasternal short-axis view.
The processor 20 identifies, from the ultrasound image, the target feature structures corresponding to the specific feature structures.
A target feature structure is a specific feature structure identified in the current ultrasound image; there may be one or several, and they may be some or all of the specific feature structures of the standard section type to which the image belongs. During scanning, the operator usually has to keep adjusting the position and angle of the ultrasonic probe 30 while the ultrasound imaging device acquires images in real time, until an ultrasound image of a standard section is obtained; whether the current image is such an image has to be judged by the operator. An image acquired in real time during scanning is not necessarily an image of a standard section, that is, not all specific feature structures of the standard section are necessarily shown clearly. The ultrasound image can therefore be analyzed to determine which specific feature structures it actually shows; the specific feature structures identified in the image are the target feature structures.
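Purely as an illustrative sketch of this selection logic (in Python; the detector detect_structures and the feature table below are hypothetical placeholders, not components defined in this disclosure), the target feature structures can be viewed as the intersection of the structures detected in the image with the specific feature structures of the identified section type:

    # Hypothetical lookup of specific feature structures per standard section type,
    # following the examples given in this description.
    SPECIFIC_FEATURES = {
        "second_portal_section": {"middle_hepatic_vein", "right_hepatic_vein", "left_hepatic_vein"},
        "liver_kidney_section": {"liver", "kidney"},
    }

    def identify_target_features(image, section_type, detect_structures):
        """Return the specific feature structures actually visible in `image`.

        `detect_structures(image)` stands for an assumed, pre-trained structure
        detector that returns the set of structure labels it finds in the image.
        """
        detected = set(detect_structures(image))
        specific = SPECIFIC_FEATURES.get(section_type, set())
        # Target features may be part or all of the specific features of the type.
        return detected & specific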
The processor 20 determines a corresponding standard section structure schematic diagram according to the standard section type.
The standard section structure schematic diagram contains structure regions corresponding to the specific feature structures.
The standard section structure schematic diagram schematically shows the various feature structures in the standard section, and different feature structures can be distinguished by lines, colors and so on.
The ultrasound image shown in fig. 2A and the corresponding standard section structure schematic diagram shown in fig. 2B are taken as an example.
Referring to fig. 2A and fig. 2B, fig. 2A is an ultrasound image of a liver-kidney section provided in an embodiment of the present application, and fig. 2B is a schematic diagram of the structure of the liver-kidney section. Fig. 2B represents the structural features of the actual ultrasound image of fig. 2A schematically with lines and color blocks; note that colors are not reproduced in fig. 2B as shown here, although in practical applications fig. 2B may be colored. The specific feature structures in the liver-kidney section are the liver and the kidney, and the schematic diagram accordingly contains a structure region 21 corresponding to the liver and a structure region 22 corresponding to the kidney. The regions outside the structure regions 21 and 22 also contain some structural features, but these are not the specific feature structures.
Optionally, the standard section structure schematic diagrams may be stored in the memory 80 in advance, so that the processor 20 retrieves from the memory 80 the schematic diagram corresponding to the standard section type of the acquired ultrasound image.
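As a minimal sketch of this lookup (Python; the directory layout, file names and the use of per-region mask images are assumptions made only for illustration):

    from pathlib import Path

    # Assumed storage: one pre-drawn schematic per standard section type, plus one
    # mask image per specific feature structure region within that schematic.
    SCHEMATIC_DIR = Path("schematics")  # hypothetical location in memory 80 / storage

    SCHEMATICS = {
        "second_portal_section": {
            "image": "second_portal.png",
            "regions": {
                "middle_hepatic_vein": "second_portal_mhv_mask.png",
                "right_hepatic_vein": "second_portal_rhv_mask.png",
                "left_hepatic_vein": "second_portal_lhv_mask.png",
            },
        },
    }

    def load_schematic(section_type):
        """Return the schematic image path and its region-mask paths for a type."""
        entry = SCHEMATICS[section_type]
        masks = {name: SCHEMATIC_DIR / f for name, f in entry["regions"].items()}
        return SCHEMATIC_DIR / entry["image"], masks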
The display 70 displays the ultrasound image and the standard section structure schematic diagram.
The structure region corresponding to the target feature structure in the standard section structure schematic diagram is highlighted.
Highlighting means that the structural region corresponding to the target feature structure is distinguished and displayed by brightness, gray scale, color, and the like compared with other regions, so that the structural region corresponding to the target feature structure is more striking in the schematic diagram, wherein the other regions are regions except for the structural region corresponding to the target feature structure.
Optionally, the manner of highlighting may be preset. It can be set separately for each standard section structure schematic diagram, or a uniform highlighting manner can be set for the different schematic diagrams.
In practical applications, whether in teaching and training or in clinical use, when the ultrasound imaging system scans the tissue to be examined the operator has to keep adjusting the position and angle of the ultrasonic probe 30 in order to obtain an ultrasound image of a standard section, while the device acquires images in real time. Each acquired ultrasound image can be processed as follows: the transmission control circuit 410 excites the ultrasonic probe 30 to transmit ultrasonic waves to the tissue; the reception control circuit 420 receives from the probe the echoes returned by the tissue, obtaining ultrasonic echo signals; the processor 20 processes the echo signals to obtain an ultrasound image of the tissue, determines the standard section type to which the image belongs and which contains specific feature structures, identifies in the image the target feature structures corresponding to those specific feature structures, determines the standard section structure schematic diagram according to the section type, and controls the display 70 to show the ultrasound image together with the schematic diagram, in which the identified target feature structures are highlighted.
In this embodiment, the ultrasound image changes in real time during scanning as the position and angle of the ultrasonic probe 30 change, so the standard section structure schematic diagram displayed on the display 70 changes in real time with it. Because the schematic diagram is clear, intuitive and concise, the operator can intuitively and quickly determine from it whether the currently scanned section is a standard section and which specific feature structures it contains; this helps the operator acquire the standard section quickly, obtain the related information from it quickly, improves the efficiency of acquiring standard sections, and facilitates quality-control assessment.
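The per-frame workflow described above can be summarized, purely as an illustrative sketch, as follows (Python; every helper on the system object is a hypothetical placeholder rather than an interface defined in this disclosure):

    def process_frame(echo_signals, system):
        """One pass of the auxiliary display flow for a single acquired frame."""
        # 1. Generate the ultrasound image from the ultrasonic echo signals.
        image = system.beamform_and_demodulate(echo_signals)

        # 2. Determine the standard section type to which the image belongs.
        section_type = system.classify_section(image)

        # 3. Identify the target feature structures present in the image.
        targets = system.identify_target_features(image, section_type)

        # 4. Retrieve the pre-stored standard section structure schematic diagram.
        schematic, region_masks = system.load_schematic(section_type)

        # 5. Display image and schematic, highlighting the target feature regions.
        system.render(image, schematic,
                      highlight={name: region_masks[name] for name in targets})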
In some embodiments, in the standard section structure schematic diagram displayed by the display 70, the structure region corresponding to the target feature structure is highlighted, which may include, but is not limited to, one or a combination of the following display modes:
Display mode one: the brightness value of the target feature structure in the standard section structure schematic diagram is greater than that of the other regions;
Display mode two: the gray value of the target feature structure in the standard section structure schematic diagram is lower than that of the other regions;
Display mode three: the color of the target feature structure in the standard section structure schematic diagram is different from the color of the other regions.
The other regions are the regions of the standard section structure schematic diagram other than the structure region where the target feature structure is located.
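For illustration only, the three display modes could be realized by re-weighting or recoloring the pixels of a structure region in the schematic, as in the following sketch (Python with NumPy; the mask representation and the particular gain, scale and color values are assumptions):

    import numpy as np

    def highlight_region(schematic_rgb, region_mask, mode="color",
                         color=(120, 200, 255), brightness_gain=1.5, gray_scale=0.5):
        """Return a copy of the schematic with one structure region highlighted.

        schematic_rgb: H x W x 3 uint8 schematic image.
        region_mask:   H x W boolean mask of the target feature structure region.
        """
        out = schematic_rgb.astype(np.float32)
        if mode == "brightness":
            # Display mode one: target region brighter than the other regions.
            out[region_mask] = np.clip(out[region_mask] * brightness_gain, 0, 255)
        elif mode == "gray":
            # Display mode two: target region with a lower gray value than the rest.
            out[region_mask] = out[region_mask] * gray_scale
        elif mode == "color":
            # Display mode three: target region in a color different from the rest.
            out[region_mask] = np.array(color, dtype=np.float32)
        return out.astype(np.uint8)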
In these embodiments the target feature structures are highlighted in different ways, so that during image acquisition and review the operator can quickly see from the standard section structure schematic diagram which target feature structures have already been obtained in the ultrasound image, which improves efficiency.
In some embodiments, the other regions may include a first region and a second region, wherein the first region is a region corresponding to a structural feature other than the specific structural feature, and the second region is a region corresponding to a specific structural feature other than the target feature. It will be appreciated that the manner of display between the first region and the second region may also be a differential display, i.e. the brightness, grey scale or colour of the first region and the second region may be different.
By displaying the first region and the second region differently, the operator can see which specific feature structures have not yet been obtained in the current ultrasound image, which helps the operator complete the acquisition of the standard section more quickly, improves acquisition efficiency, and facilitates quality-control review.
In some embodiments, the display 70 may display the standard section structure schematic diagram in a variety of ways. Some possible display manners of the display 70 are described below.
In one possible implementation, the display 70 may include two screens that display the standard section structure schematic diagram and the ultrasound image respectively, or the display 70 may divide one screen into two portions to display them respectively; in this case the displayed ultrasound image and schematic diagram are equal in size.
In another possible implementation, the display 70 displays the ultrasound image and the standard section structure schematic diagram simultaneously; the two do not overlap, and the display size of the schematic diagram is smaller than that of the ultrasound image.
Since the operator primarily views the ultrasound image, the display 70 displays the ultrasound image at a size larger than that of the standard section structure schematic diagram.
Further, the display size of the standard section structure schematic diagram may be in a preset ratio to the display size of the ultrasound image.
The ultrasound image may be displayed in the middle of the display interface, i.e. in the same position as on a conventional ultrasound imaging system. The positional relationship between the schematic diagram and the ultrasound image can take various forms; considering that the informative area of the ultrasound image is fan-shaped, the schematic diagram can be displayed where it does not overlap the ultrasound image, such as the upper-left, upper-right, lower-left or lower-right corner of the display interface, so that it does not interfere with the operator's observation of the ultrasound image area.
By displaying the ultrasound image and the standard section structure schematic diagram at the same time, without overlap and with the schematic diagram smaller than the ultrasound image, the operator can quickly obtain the required key information from the schematic diagram without the image acquisition and review process being disturbed, which improves efficiency.
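One way to realize the non-overlapping corner placement at a preset ratio is sketched below (Python; the 25% ratio, the margin and the choice of the upper-left corner are arbitrary example values, not values fixed by this description):

    def schematic_rect(image_rect, ratio=0.25, margin=16):
        """Place the schematic in the upper-left corner at `ratio` of the image size.

        image_rect: (x, y, w, h) of the displayed ultrasound image.
        Returns (x, y, w, h) for the schematic, or None if it would overlap the image.
        """
        ix, iy, iw, ih = image_rect
        w, h = int(iw * ratio), int(ih * ratio)
        x, y = margin, margin  # upper-left corner of the display interface
        overlaps = not (x + w <= ix or y + h <= iy or x >= ix + iw or y >= iy + ih)
        return None if overlaps else (x, y, w, h)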
The following will take fig. 3A, fig. 3B, fig. 4A and fig. 4B as examples to illustrate a standard section structure schematic diagram and a display manner of an ultrasound image interface according to an embodiment of the present application.
Referring to fig. 3A and fig. 3B, fig. 3A is an ultrasound image of the second portal section provided in an embodiment of the present application, and fig. 3B is a schematic diagram of the structure of the second portal section. The specific feature structures of the second portal section shown in fig. 3B are: the structure region 31 corresponding to the middle hepatic vein, the structure region 32 corresponding to the right hepatic vein, the structure region 33 corresponding to the left hepatic vein, and the structure region 34 corresponding to the confluence.
Referring to fig. 4A and fig. 4B, each figure shows a display interface containing an ultrasound image and its schematic diagram according to an embodiment of the present application. Both display interfaces show the ultrasound image and the standard section structure schematic diagram, with the schematic diagram displayed in the upper-left corner of the interface at a preset size ratio to the ultrasound image.
During acquisition of the second portal section, as shown in fig. 4A, only the right hepatic vein of the second portal section has so far been captured in the ultrasound image, and the structure region corresponding to the right hepatic vein is highlighted in the corresponding standard section structure schematic diagram 41. The region corresponding to the right hepatic vein may be displayed with higher brightness, a lower gray level and/or a different color than the other regions (color is not shown in the schematic diagram 41), while the regions corresponding to the other specific feature structures, the middle hepatic vein and the left hepatic vein, may be displayed with lower brightness, a higher gray level and/or a different color than the identified right-hepatic-vein region (again, color is not shown). For example, in fig. 4A the region corresponding to the right hepatic vein may be shown in a highly saturated light blue, and the regions corresponding to the middle and left hepatic veins in black.
From the current schematic diagram 41 the operator can quickly see which target feature structure, namely the right hepatic vein, has already been captured in the current ultrasound image, and can keep adjusting the ultrasonic probe to acquire the standard section. When the ultrasound image shown in fig. 4B is acquired, the schematic diagram 42 shows that all the specific feature structures of the second portal section are now present in the current ultrasound image, and the operator can then, for example, save the current ultrasound image on the basis of the schematic diagram 42.
In addition, the second region may be displayed with low brightness, a high gray level and/or a dark color relative to the target feature structures; in fig. 4A, for example, the second region may differ in gray level, brightness or color from the regions corresponding to the specific feature structures. The display form of the second region may be the same across schematic diagrams of the same standard section type; for example, the schematic diagram 41 in fig. 4A and the schematic diagram 42 in fig. 4B both belong to the second portal section, and their second regions are displayed in the same form.
In some embodiments, on the basis of any of the above embodiments, the display 70 is further configured to display the name of the standard section type to which the ultrasound image belongs.
The names of the standard section types can be Chinese or English abbreviations or other forms, and the application is not limited.
In some embodiments, the display 70 is also used to display the name of the target feature of the ultrasound image.
The names of the target feature structures may be chinese or english abbreviations or other forms, which are not limited in this application.
The name of a target feature structure of the ultrasound image can be labeled in various positions.
In one possible implementation, the names of the target feature structures of the ultrasound image are labeled beside the standard section structure schematic diagram. Referring to fig. 2B or fig. 3B, for each target feature structure a line may connect its name to the corresponding structure region in the schematic diagram, so that the name is labeled beside the diagram.
Further, the names of all specific feature structures in the standard section structure schematic diagram may be labeled, in a similar way to the target feature structures, that is, beside the schematic diagram.
In another possible implementation, the name of a target feature structure of the ultrasound image is displayed inside the corresponding structure region of the standard section structure schematic diagram.
Further, the names of the target feature structures of the ultrasound image may also be displayed inside the corresponding structure regions of the ultrasound image itself.
Further, the names of all specific feature structures in the standard section structure schematic diagram may be labeled and displayed, in a similar way to the target feature structures, that is, inside the structure regions corresponding to the specific feature structures in the schematic diagram.
Further, the names of all specific feature structures of the ultrasound image may be displayed inside the corresponding structure regions of the ultrasound image.
In these embodiments the labeled names serve as prompts that help the operator acquire an ultrasound image of the standard section, improving efficiency.
The manner in which the above-described names are labeled will be described below with reference to fig. 5A to 5D.
Referring to fig. 5A to 5D, fig. 5A is an ultrasound image of a parasternal long-axis section provided in an embodiment of the present application, and fig. 5B is the corresponding schematic diagram of the parasternal long-axis section structure. Fig. 5C is an ultrasound image of a parasternal short-axis section, and fig. 5D is the corresponding schematic diagram of the parasternal short-axis section structure. The names of the specific feature structures in the figures are labeled as English abbreviations: RVOT denotes the right ventricular outflow tract, LV the left ventricle, RV the right ventricle, MV the mitral valve, LA the left atrium, DA the descending aorta, and AV the aortic valve. In the schematic diagrams shown in fig. 5B and 5D, the name of a specific feature structure may be displayed in the region corresponding to that structure; likewise, in the ultrasound images shown in fig. 5A and 5C, the name of a target feature structure may be displayed in the region corresponding to that structure.
In some embodiments, the processor 20 determines the standard section type to which the ultrasound image belongs as follows:
the processor 20 obtains the standard section type to which the ultrasound image belongs based on the ultrasound image and a section type recognition model.
The section type recognition model is used to identify whether the ultrasound image belongs to a standard section and, if so, which standard section type it belongs to.
The section type recognition model may be a model for image classification based on deep learning or on other machine learning.
Where the section type recognition model is a deep-learning image classification model, it may use various network architectures, such as EfficientNet, MobileNet, VGG, ResNet, DenseNet and AlexNet, which this application does not limit.
The section type recognition model is trained in advance. The training may be carried out by the ultrasound imaging system itself or by other equipment; in the latter case the trained model is stored in the ultrasound imaging system after training is completed. To train the model, a database of ultrasound images (referred to as the first training set) is first built, and each image in the first training set is annotated with whether it is a standard section and, if so, with the standard section type to which it belongs. The network model is then trained on the first training set to obtain the trained section type recognition model. Given an input ultrasound image, the trained model outputs whether the image belongs to a standard section type and, if so, which one; it may further output the probability that the image is of that standard section type.
In this embodiment, because deep-learning-based image classification achieves high accuracy and is easy to implement, using such a model as the section type identification model makes section type identification both accurate and easy to implement.
In the case where the section type identification model is an image classification model based on other machine learning methods, it may adopt various model structures and may sequentially include a feature extraction module and a classification module.
The feature extraction module extracts features from the ultrasound image or from a region of interest in the ultrasound image; the features may be traditional descriptors such as PCA, LDA, HOG, Haar, LBP or SIFT, or features extracted by a neural network.
The classification module matches the extracted features of the ultrasound image against the features of each type of ultrasound image in a database of standard-section ultrasound images, so as to obtain the standard section category to which the ultrasound image belongs. The classification module may be implemented with a classification algorithm such as KNN, SVM, decision tree, random forest or a neural network.
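For illustration only, the following sketch shows one way the feature extraction module and the classification module could be chained, here with HOG features and an SVM via scikit-image and scikit-learn; the feature choice, kernel and function names are assumptions, not the patent's fixed design.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def extract_features(image):
    """HOG descriptor of a 2-D grayscale ultrasound frame (one possible traditional feature)."""
    return hog(image, pixels_per_cell=(16, 16), cells_per_block=(2, 2))

def fit_classifier(train_images, train_labels):
    """Fit the classification module on the standard-section image database."""
    features = np.stack([extract_features(im) for im in train_images])
    classifier = SVC(kernel="rbf", probability=True)
    classifier.fit(features, train_labels)
    return classifier

def classify(classifier, image):
    """Return the predicted standard section category and its probability."""
    x = extract_features(image).reshape(1, -1)
    return classifier.predict(x)[0], classifier.predict_proba(x)[0].max()
```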
The implementation principle of the classification module is described below by taking KNN, SVM, decision tree and random forest as examples.
The KNN algorithm computes the distance (for example Euclidean or Hamming distance) between the features extracted from the input ultrasound image and the features extracted from the ultrasound images in the first training set, selects the K ultrasound images with the smallest feature distances (K being an integer greater than 1), and assigns the input ultrasound image to the category that occurs most often among those K images.
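For illustration only, the KNN rule described above can be sketched as follows; the value of K and the helper names are assumptions.

```python
import numpy as np
from collections import Counter

def knn_classify(query_features, train_features, train_labels, k=5):
    """Majority vote among the K training images with the smallest feature distance."""
    distances = np.linalg.norm(train_features - query_features, axis=1)  # Euclidean distance
    nearest = np.argsort(distances)[:k]                                  # K smallest distances
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]                                    # most frequent category
```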
The SVM is essentially a binary classification model: a maximum-margin classifier defined in feature space, trained to find the optimal separating hyperplane between two classes of data. An important property of the SVM is that, after training, most training samples need not be retained; the model depends only on the support vectors. The SVM can be extended to multi-class classification tasks, for example by combining multiple SVM classifiers.
The decision tree simulates a human decision process in the form of a binary or multi-way tree. Taking echocardiography as an example, a tree model may be built for each echocardiographic category, with each node of the tree corresponding to a feature. In practice, the ultrasound image of the tissue to be examined is fed to a decision tree, which checks in turn whether the image contains the feature corresponding to each node. If a feature is missing, the image does not belong to the standard section category corresponding to that tree; if the feature is present, the next feature is checked; and if all features on the tree are present, the image belongs to the standard section category corresponding to that tree.
A random forest is an ensemble-learning algorithm that integrates multiple decision trees; for classification problems, the output class is the mode of the classes output by the individual trees.
In some embodiments, the processor 20 identifies a target feature in the ultrasound image that corresponds to a particular feature from the ultrasound image by:
the processor 20 obtains a target feature corresponding to a particular feature in the ultrasound image from the ultrasound image and the feature recognition model.
The characteristic structure recognition model is used for recognizing an input ultrasonic image to obtain a specific characteristic structure in the ultrasonic image. The feature recognition model may employ a model of object detection or image segmentation based on deep learning or other machine learning.
If the feature structure recognition model is a deep-learning-based model, inputting the ultrasound image to be examined into the model yields whether a given target (specific feature structure) is present in the image and, if it is present, the target's class, a predicted probability value, and the position and range of the corresponding region; the range may be represented by a box enclosing the region. The feature structure recognition model may adopt various network structures, such as detectors like Faster R-CNN, YOLO, SSD, RetinaNet, EfficientDet, FCOS and CenterNet.
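For illustration only, the following sketch shows how detector inference could expose the class, probability value and box of each target region, using a torchvision Faster R-CNN as a stand-in; in practice the detector would be trained on annotated ultrasound data rather than loaded with generic pretrained weights, and the score threshold is an assumption.

```python
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

# Stand-in detector; a real system would load weights trained on annotated ultrasound data.
model = fasterrcnn_resnet50_fpn(weights=FasterRCNN_ResNet50_FPN_Weights.DEFAULT)
model.eval()

def detect_feature_structures(image_tensor, score_threshold=0.5):
    """image_tensor: (3, H, W) float tensor in [0, 1]; returns boxes, classes and scores."""
    with torch.no_grad():
        output = model([image_tensor])[0]
    keep = output["scores"] >= score_threshold
    # boxes give the position and range of each target region; labels give its class
    return output["boxes"][keep], output["labels"][keep], output["scores"][keep]
```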
The feature structure recognition model is trained in advance; before it is used, it must be trained to obtain the trained model. It can be understood that the training may be performed by the ultrasound imaging system or by other equipment; when performed by other equipment, the trained feature structure recognition model is then stored in the ultrasound imaging system.
A method of training the feature structure recognition model is described below. A database of ultrasound images with target region (i.e. specific feature structure) labels is constructed (referred to as a second training set). Each ultrasound image in the second training set is labeled with the target regions it contains, which may be represented by bounding boxes that tightly enclose the regions. The information of each box includes the class of the target object in the box (which specific feature structure it is) and the coordinates of the box (where the target object is). The network model is trained with this second training set to obtain the trained feature structure recognition model.
For image segmentation based on deep learning or other machine learning, a database of ultrasound images with target region labels is likewise required, each image being labeled with whether a feature structure is present, its type if present, and its specific boundary range; the boundary here is no longer a rectangular bounding box as in object detection. The deep-learning segmentation model may be chosen from network models such as FCN, U-Net, SegNet, DeepLab and Mask R-CNN, and is trained with the constructed ultrasound image database. After training, given an input image to be segmented, the network outputs an image of the same size as the input indicating whether a given feature structure is contained and, if so, its type and specific boundary range.
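For illustration only, the following sketch shows how a segmentation network's per-pixel output could be converted into the presence, type and boundary range of each feature structure; the output shape, background index and function names are assumptions.

```python
import torch

def segment_feature_structures(model, image_tensor, class_names):
    """Turn per-pixel logits into (feature-structure name, boundary mask) pairs.

    image_tensor: (1, C, H, W); the model is assumed to return (1, num_classes, H, W) logits,
    with class index 0 treated as background.
    """
    model.eval()
    with torch.no_grad():
        logits = model(image_tensor)
    mask = logits.argmax(dim=1)[0]            # per-pixel class index, same size as the input
    present = []
    for index, name in enumerate(class_names):
        if index == 0:
            continue                          # skip background
        region = (mask == index)
        if region.any():
            present.append((name, region))    # type plus its boundary range as a boolean mask
    return present
```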
In this embodiment, because deep-learning-based object detection and image segmentation achieve high accuracy and are easy to implement, using such a model as the feature structure recognition model makes feature structure recognition both accurate and easy to implement.
If the feature structure recognition model is based on other machine learning methods, it may adopt one of a variety of model structures, the principles of which are described below.
In one possible implementation, the feature structure recognition model obtains a set of candidate regions of interest (also called candidate box regions, which may be rectangular boxes) in the input ultrasound image by a sliding-window or selective-search method; features are extracted from each candidate box region, either traditional descriptors such as PCA, LDA, HOG, Haar, LBP and SIFT or features extracted by a neural network; the extracted features are matched against the features extracted from the labeled feature structure regions in the image database and classified with a linear classifier, an SVM, a simple neural network or another classifier, so as to determine whether the current candidate box region contains a specific feature structure and, if so, the type of the specific feature structure it contains.
In another possible implementation, the feature structure recognition model pre-segments the image with an image processing method such as threshold segmentation, snakes, level sets or GraphCut to obtain a set of candidate target boundary ranges, which may be called candidate boundary ranges; features are then extracted from the region enclosed by each candidate boundary range, either traditional descriptors such as PCA, LDA, HOG, Haar, LBP and SIFT or features extracted by a neural network; the extracted features are matched against the features extracted from the labeled boundary ranges of the specific feature structures in the database and classified with a linear classifier, an SVM, a simple neural network or another classifier, so as to determine whether the current candidate boundary range contains a specific feature structure and, if so, the type of the specific feature structure it contains.
It should be noted that, the processor 20 may also use a section type and feature structure identification model to determine the standard section type to which the ultrasound image belongs and identify the target feature structure. The section type and feature structure identification model is used for identifying an input ultrasonic image to obtain a standard section type to which the ultrasonic image belongs and a target feature structure corresponding to a specific feature structure.
The section type and feature structure recognition model may be a deep-learning-based model; it may therefore comprise the section type recognition model and the feature structure recognition model, and it may be trained through the loss functions of the two tasks.
By adopting a model that combines multiple tasks, the different tasks can promote one another, improving the performance of each task and the overall accuracy.
In addition, when determining the standard section type to which the ultrasound image belongs, the processor 20 may determine it based on the image classification method described above; or, since the specific feature structures are the key basis for identifying a standard section type, the processor 20 may identify the specific feature structures contained in the image (the target feature structures) based on a segmentation or detection method and determine the standard section type from them; or the two approaches may be combined to determine the standard section type, for example by assigning respective weights to the image classification result and the feature structure recognition result and comprehensively evaluating the standard section type to which the ultrasound image belongs.
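For illustration only, the weighted combination mentioned above could look like the following sketch, in which the image classification probability and a score derived from the detected feature structures are fused; the weights and data structures are assumptions.

```python
def fuse_section_type(class_probs, detected_structures, required_structures,
                      weight_cls=0.6, weight_det=0.4):
    """Weighted fusion of the classification result and the feature-structure result.

    class_probs: {section_type: probability} from image classification.
    detected_structures: set of structure names found by detection/segmentation.
    required_structures: {section_type: set of structures that define that section}.
    """
    scores = {}
    for section, probability in class_probs.items():
        needed = required_structures[section]
        detection_score = len(detected_structures & needed) / len(needed)
        scores[section] = weight_cls * probability + weight_det * detection_score
    best = max(scores, key=scores.get)
    return best, scores[best]
```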
In some cases, such as during scanning, the operator needs to acquire an ultrasound image as close as possible to the standard section, that is, an ultrasound image of a high standard grade. However, the operator often cannot judge well how standard the current ultrasound image is, so the acquired standard-section ultrasound image may not be very standard. In view of this, the ultrasound imaging system provided in the embodiment of the present application displays the standard grade of the ultrasound image in real time on the display 70, which effectively assists the operator in acquiring the standard section and improves the efficiency of acquiring it. Specific embodiments are described in detail below.
The processor 20 determines the standard grade of the ultrasound image at least according to the standard degree of the ultrasound image.
The processor 20 may determine the standard degree of the ultrasound image based on how well the ultrasound image meets the definition of the standard section, and then determine the standard grade of the ultrasound image from the standard degree and a standard grade division method.
The standard degree of the ultrasonic image refers to the degree of proximity to the definition of a standard section when the ultrasonic image belongs to a certain standard section type.
The standard grade of the ultrasound image may be one of a plurality of preset standard grades. For example, the standard degree of the ultrasound image may be represented by a value within a certain value range, the range may be divided into a plurality of value intervals each corresponding to a standard grade, and the standard grade of the ultrasound image is the grade corresponding to the interval into which the value falls.
Alternatively, the grading scheme may be given by comprehensively considering the influence of a plurality of factors based on clinical knowledge of the standard degree of each standard section. For example, the processor 20 may determine the standard degree of the ultrasound image according to at least one of the structural similarity between the ultrasound image and the standard section corresponding to the standard section type to which it belongs, the proportion of the feature structures, the clarity of the feature structures, and the clarity of the blood vessels.
For example, for the second hepatic portal section of the abdominal liver, the standard degree of an ultrasound image belonging to that section type may be determined by considering factors such as whether the three hepatic veins are shown running through (structural similarity), the displayed length of the hepatic veins (proportion of feature structures), the clarity of the vessels (vessel clarity), and whether the liver parenchyma region is clear (clarity of feature structures).
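For illustration only, the following sketch scores the standard degree of a second-hepatic-portal-section image from such factors; the factor weights, and the assumption that each factor has already been normalised to [0, 1], are illustrative rather than values fixed by this application.

```python
def second_hepatic_portal_standard_degree(structural_similarity, vein_length_ratio,
                                          vessel_clarity, parenchyma_clarity):
    """Weighted combination of the factors named above; each input assumed in [0, 1]."""
    weights = {"similarity": 0.4, "vein_length": 0.2, "vessel": 0.2, "parenchyma": 0.2}
    return (weights["similarity"] * structural_similarity
            + weights["vein_length"] * vein_length_ratio
            + weights["vessel"] * vessel_clarity
            + weights["parenchyma"] * parenchyma_clarity)
```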
The display 70 displays the standard grade of the ultrasound image.
Processor 20 controls display 70 to display a standard grade of ultrasound image. The manner in which the standard grade of ultrasound image is displayed may take a variety of forms.
In this way, the standard grade of the ultrasound image is determined from its standard degree and then displayed, presenting how standard the ultrasound image is in a visual and intuitive manner, effectively assisting the operator in acquiring the standard section and improving the efficiency of acquiring it.
Further, a standard-class display mode of an ultrasound image provided in the present application is described below.
The processor 20 determines the number of corresponding target graphics and/or the color of the target graphics based on the standard grade of the ultrasound image.
Wherein the target graphic may include, but is not limited to: ring, bar, pie, or five-pointed star.
The correspondence between the plurality of standard levels and the number of target graphics and/or the color of the target graphics, respectively, may be predetermined, so that the number of corresponding target graphics and/or the color of the target graphics is determined according to the standard levels of the ultrasound image.
The display 70 displays the target graphic in the number of target graphics and/or the color of the target graphic.
In one possible implementation, processor 20 may determine the number of corresponding target graphics based on the standard grade of the ultrasound image, thereby controlling display 70 to display the target graphics in the number of target graphics.
For example, N standard grades (N being an integer greater than 1) may be divided in advance, each corresponding to a number of target graphics, and the display 70 displays the number of target graphics corresponding to the standard grade of the ultrasound image. For example, if the target graphic is a five-pointed star, 5 standard grades are divided in advance, and the standard grade of the ultrasound image is grade 3, the corresponding number of stars is 3, and the display 70 displays 3 five-pointed stars to indicate that the standard grade of the ultrasound image is grade 3.
In another possible implementation, the processor 20 may determine the color of the corresponding target graphic based on the standard grade of the ultrasound image, thereby controlling the display 70 to display the target graphic in the color of the target graphic.
For example, N standard grades (N being an integer greater than 1) may be divided in advance, each corresponding to a proportion of the target graphic that is displayed in the target color, and the display 70 displays that proportion of the target graphic in the target color according to the standard grade of the ultrasound image.
For example, if the target graphic is a circle, the percentage of the circle to be displayed in the target color is determined according to the standard grade of the ultrasound image, and that percentage of the circle is then displayed in the target color.
In yet another possible implementation, processor 20 may determine the number and color of corresponding target graphics based on the standard grade of the ultrasound image, thereby controlling display 70 to display the target graphics in the number and color of target graphics.
For example, N standard grades (N being an integer greater than 1) may be divided in advance, each corresponding to a number of target graphics to be shown in the target color; the display 70 displays that number of target graphics in the target color according to the standard grade of the ultrasound image, and the remaining target graphics may be displayed in a default color.
For example, if the target graphic is a five-pointed star, the default color is black, the target color is yellow, 5 standard grades are divided in advance, and the standard grade of the ultrasound image is grade 3, the display 70 displays 3 yellow stars and 2 black stars to indicate that the standard grade of the ultrasound image is grade 3.
Further, the color of the highlighted target graphics may also be determined according to the standard grade of the ultrasound image: when the standard grade meets the requirement, the target graphics corresponding to the standard grade may be displayed in a first color, such as green; when the standard grade does not meet the requirement, they may be displayed in a second color, such as red. The first color and the second color are preset, and this application does not limit which colors are used.
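For illustration only, the mapping from a standard grade to the number and color of the displayed target graphics could be sketched as follows; the five-grade split, the pass threshold and the specific colors are assumptions.

```python
def grade_display(grade, max_grade=5, pass_grade=3,
                  first_color="green", second_color="red", default_color="black"):
    """Return the color of each five-pointed star for a given standard grade."""
    lit_color = first_color if grade >= pass_grade else second_color
    return [lit_color] * grade + [default_color] * (max_grade - grade)

# grade_display(3) -> ['green', 'green', 'green', 'black', 'black']
# grade_display(2) -> ['red', 'red', 'black', 'black', 'black']
```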
The standard-class display form will be described below by taking the bar form shown in fig. 6A and the five-pointed star form shown in fig. 6B as examples.
Referring to fig. 6A, fig. 6A is a schematic diagram of a standard grade display in the form of bars. Fig. 6A graphically represents the standard grade of the current ultrasound image with bars. Assuming there are 10 standard grades, they can be represented by 10 bars of different lengths; if the standard grade of the current ultrasound image is the sixth grade, the bottom 6 of the 10 bars are lit, that is, those 6 bars are higher in brightness or different in color (color not shown in the figure) from the top 4 bars.
Referring to fig. 6B, fig. 6B is a schematic diagram of a standard grade display in the form of five-pointed stars provided in the present application. Fig. 6B graphically represents the standard grade of the current ultrasound image with five-pointed stars. Assuming there are 5 standard grades in total, 5 stars can be used to represent them. If the standard grade of the current ultrasound image is the fourth grade, the 4 stars on the left are lit, that is, those 4 stars are higher in brightness or different in color (color not shown in the figure) from the 1 star on the right.
In some embodiments, the display 70 may display the ultrasound image, the standard cut plane structure schematic, and the standard grade simultaneously.
The ultrasonic image, the standard section structure schematic diagram and the standard grade are not overlapped, and the display size of the standard grade is smaller than that of the ultrasonic image.
Further, the standard cut structure map and the standard grade may be placed on the same side of the ultrasound image, for example, both the standard cut structure map and the standard grade may be displayed in the upper left corner position of the ultrasound image.
In this embodiment, the display 70 may display the ultrasonic image, the standard section structure schematic diagram and the standard grade simultaneously, and the ultrasonic image, the standard section structure schematic diagram and the standard grade are not overlapped, and the display size of the standard grade is smaller than the display size of the ultrasonic image, so that the operator can conveniently check the ultrasonic image and the standard grade, and the operator can quickly obtain the required key information from the standard section structure schematic diagram and the standard grade, thereby improving the efficiency.
A display interface for an ultrasound image is described below in conjunction with the display interface schematic diagram shown in fig. 7.
Referring to fig. 7, fig. 7 is a schematic view of a display interface of another ultrasound image provided in the present application, and fig. 7 is an example of a display interface when a heart ultrasound image is identified as a standard section type (apex four-chamber heart), wherein the ultrasound image, the standard section structure schematic view 72 and the standard class 71 are displayed simultaneously.
In some scenarios, the imaging quality of the ultrasound image itself, such as its contrast, noise and whether it is blurred, also affects the recognition of the ultrasound image and the diagnosis of diseases. Therefore, when determining the standard grade of the ultrasound image, the image quality of the ultrasound image may be considered in addition to its standard degree. Specific embodiments are described in detail below.
The processor 20 determines a standard grade of the ultrasound image based on the standard grade of the ultrasound image and the image quality of the ultrasound image.
Further, the processor 20 may perform weighted averaging on the standard degree of the ultrasound image and the image quality of the ultrasound image to obtain a standard score of the ultrasound image;
the processor 20 obtains a standard grade of the ultrasound image based on the standard score and the preset interval of the ultrasound image.
Alternatively, the processor 20 may determine the image quality of the ultrasound image based on at least one of variance, image entropy, spatial frequency, contrast, and average gradient of the ultrasound image.
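For illustration only, the following sketch computes several of the no-reference measures listed above and combines the standard degree and image quality into a standard grade by weighted averaging and preset intervals; the scaling constants, weight and interval boundaries are assumptions.

```python
import numpy as np

def image_quality(img):
    """No-reference quality measures for a grayscale image scaled to [0, 1]."""
    variance = img.var()
    hist, _ = np.histogram(img, bins=256, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    entropy = -np.sum(p * np.log2(p))                         # image entropy
    row_freq = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
    col_freq = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
    spatial_frequency = np.sqrt(row_freq ** 2 + col_freq ** 2)
    gy, gx = np.gradient(img)
    average_gradient = np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2))
    contrast = img.max() - img.min()
    # fold the measures into a single [0, 1] score; the scaling constants are assumptions
    raw = 0.2 * (variance * 10 + entropy / 8 + spatial_frequency * 10
                 + average_gradient * 10 + contrast)
    return float(np.clip(raw, 0.0, 1.0))

def standard_grade(standard_degree, quality, weight=0.7, intervals=(0.2, 0.4, 0.6, 0.8)):
    """Weighted-average standard score mapped to a grade of 1..5 via preset intervals."""
    score = weight * standard_degree + (1 - weight) * quality
    return 1 + sum(score >= t for t in intervals)
```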
In some embodiments, the standard grade of the ultrasound image may be obtained from its standard degree in a plurality of ways, for example with a conventional image quality evaluation or aesthetic evaluation method, or with a quality evaluation or aesthetic evaluation method based on deep learning.
In one possible implementation, the standard grade of the ultrasound image is obtained with a standard grade evaluation model based on a deep-learning quality evaluation or aesthetic evaluation method. The standard grade evaluation model evaluates the input ultrasound image to obtain its standard grade, and may be chosen from various network models, such as EfficientNet, MobileNet, VGG, ResNet and AlexNet.
The standard grade evaluation model is trained in advance. It can be understood that the training may be completed by the ultrasound imaging system or by other equipment; when completed by other equipment, the trained standard grade evaluation model is then stored in the ultrasound imaging system. To train the standard grade evaluation model, an ultrasound image database (referred to as a third training set) may be constructed containing ultrasound images of various standard section types. Based on clinical knowledge about the standard sections, each ultrasound image in the third training set is labeled with a discrete standard grade or a continuous standard score for the standard section type to which it belongs.
In another possible implementation, the standard grade of the ultrasound image is obtained with a conventional image quality evaluation method, which may include, but is not limited to: full-reference, partial-reference and no-reference image quality evaluation, subjective image quality evaluation, machine-learning image quality evaluation, and the like. The subjective image quality evaluation method relies mainly on human observers: several observers rate the quality of the images and the ratings are averaged. Full-reference image quality evaluation provides a standard reference image, computes the distance/error between the image to be evaluated and the reference image (such as signal-to-noise ratio, mean squared error or structural similarity), and obtains the quality of the image to be evaluated by analysing that distance/error. Partial-reference image quality evaluation uses characteristic information of the images, obtaining the quality of the image to be evaluated by comparing key characteristic information between the reference image and the image to be evaluated. No-reference image quality evaluation evaluates characteristics of the image to be evaluated (variance, image entropy, spatial frequency, contrast, average gradient) and comprehensively analyses its quality. The machine-learning image quality evaluation method usually builds a classification model with an SVM, classifies the images, and then regresses the quality of the image to be evaluated to obtain its quality value.
The quality predicted by the deep-learning standard grade evaluation model in the first implementation and the quality obtained with the conventional image quality evaluation method in the second implementation consider different aspects of image quality, so in another possible implementation the two can be combined into a comprehensive standard score, from which the standard grade of the ultrasound image is obtained.
In another possible implementation, the standard grade of the ultrasound image may be obtained comprehensively from the standard score obtained by the above methods together with the determined standard section type of the ultrasound image and/or the identified target feature structures. For example, the standard score may be multiplied by the probability value of the standard section type to which the ultrasound image belongs, and the standard grade of the ultrasound image is then obtained from the product according to the standard grade division.
It should be noted that the processor 20 may also use a single recognition and evaluation model to determine the standard grade of the ultrasound image, determine the standard section type to which it belongs and identify the target feature structures. The recognition and evaluation model identifies and evaluates the input ultrasound image to obtain the standard section type to which it belongs, the target feature structures corresponding to the specific feature structures, and the standard grade. For example, the recognition and evaluation model may be built with a deep-learning approach and may comprise three network modules that respectively implement image classification (standard section type recognition), object detection or segmentation (specific feature structure recognition) and quality or aesthetic evaluation (standard grade evaluation); the recognition and evaluation model can be trained through the loss functions of the three tasks. By adopting a model that combines multiple tasks, the different tasks can promote one another, improving the performance of each task and the overall accuracy.
In some scenarios, ultrasound images are acquired continuously in real time during scanning. When the operator stores images manually, picking out the better standard-section ultrasound images is often time-consuming and labor-intensive, and the standard-section ultrasound images that are finally stored are sometimes not good. In view of this, this embodiment automatically stores the better ultrasound images that are identified, as described in detail below with specific embodiments.
This embodiment is based on any one of the above embodiments, further, the processor 20 is further configured to: and storing the ultrasonic image when the preset condition is met.
The preset condition indicates whether the current ultrasound image meets a certain storage standard; when the ultrasound image meets the preset condition, it is close to a certain standard section and satisfies the conditions for acquiring the standard section.
The preset conditions may include, but are not limited to, at least one of the following:
condition one: the standard grade is greater than or equal to a preset grade.
Condition II: the probability value of the ultrasonic image being the standard section type is larger than or equal to a preset probability threshold value.
By combining the obtained standard section type, its probability value and the determined standard grade of the ultrasound image, and checking whether they meet the preset condition, for example whether the probability value reaches the preset probability threshold and/or whether the standard grade reaches the preset grade, the quality of the ultrasound image can be determined, so that for each standard section type, one ultrasound image, several ultrasound images, or a clip of ultrasound images that is relatively better for each identified standard section during the current examination is automatically stored.
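For illustration only, the automatic storing logic could be sketched as follows, keeping the best frame per identified standard section type once at least one preset condition is met; the threshold values and the scoring rule are assumptions.

```python
best_frames = {}  # section_type -> (score, image): best frame stored per standard section type

def maybe_store(section_type, probability, grade, image,
                preset_grade=3, preset_probability=0.8):
    """Store the frame if at least one preset condition is met and it beats the stored one."""
    meets_grade = grade >= preset_grade                      # condition one
    meets_probability = probability >= preset_probability    # condition two
    if not (meets_grade or meets_probability):
        return
    score = grade * probability
    if section_type not in best_frames or score > best_frames[section_type][0]:
        best_frames[section_type] = (score, image)
```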
According to this embodiment, the ultrasound image is stored automatically when the preset condition is met, which improves the efficiency of acquiring standard-section ultrasound images and yields better standard-section images.
An auxiliary display method for an ultrasound image according to an embodiment of the present application will be described below with reference to fig. 8.
Referring to fig. 8, the method of the present embodiment is performed by an ultrasound imaging system, which may be the ultrasound imaging system shown in fig. 1 and described above, and the present embodiment is described in connection with the ultrasound imaging system shown in fig. 1. The method of the present embodiment includes the steps of:
step 810: ultrasound waves are emitted toward the target tissue.
Step 820: an echo signal of the ultrasonic wave is received.
Step 830: and generating an ultrasonic image according to the echo signal of the ultrasonic wave.
Step 840: and determining the standard section type to which the ultrasonic image belongs according to the ultrasonic image.
Wherein the standard facet type contains a specific feature.
Step 850: and identifying a target feature structure corresponding to the specific feature structure in the ultrasonic image according to the ultrasonic image.
Step 860: and determining a corresponding standard section structure schematic diagram according to the standard section type.
Wherein the standard tangent plane structure schematic diagram comprises a structure area corresponding to the specific characteristic structure;
step 870: and displaying the ultrasonic image and the standard section structure schematic diagram.
Wherein a structural region corresponding to the target feature in the standard tangent plane structural schematic is highlighted.
It is to be understood that step 840 and step 850 are not required to be performed in a fixed order: step 840 may be performed before step 850, step 850 may be performed before step 840, or the two steps may be performed simultaneously, which is not limited in this application.
In some embodiments, the structure region corresponding to the target feature in the standard tangent plane structure schematic diagram in step 870 is highlighted, which may include, but is not limited to, one or a combination of the following display modes:
display mode one: the brightness value of the target characteristic structure in the standard section structure schematic diagram is larger than that of other areas;
and a second display mode: the gray value of the target characteristic structure in the standard section structure schematic diagram is lower than that of other areas;
and a third display mode: the color of the target feature in the standard tangential plane structure diagram is different from the color of the other regions.
The other areas are areas except for the structural area where the target feature structure is located in the standard tangent plane structural schematic diagram.
Optionally, the other regions may include a first region and a second region, where the first region is a region corresponding to a structural feature other than the specific structural feature, and the second region is a region corresponding to a specific structural feature other than the target feature. It will be appreciated that the manner of display between the first region and the second region may also be a differential display, i.e. the brightness, grey scale or colour of the first region and the second region may be different.
In some embodiments, the step 870 may display the standard section structure in a variety of ways. Some possible display modes provided in this embodiment are described below.
In one possible implementation, the standard section structure diagram and the ultrasonic image are displayed separately, wherein the displayed ultrasonic image and the standard section structure diagram are equal in size.
In another possible implementation manner, the ultrasound image and the standard section structure diagram are displayed simultaneously, and the ultrasound image and the standard section structure diagram are not overlapped, and the display size of the standard section structure diagram is smaller than the display size of the ultrasound image.
Further, the display size of the standard section structure schematic may be in a preset ratio to the display size of the ultrasound image, where the preset ratio may be preset.
In some embodiments, on the basis of any one of the foregoing embodiments, the method provided in this embodiment further includes the following steps: and displaying the name of the standard section type to which the ultrasonic image belongs.
In some embodiments, the method provided in this embodiment further includes the following steps: the name of the target feature of the ultrasound image is displayed.
The name of the target feature structure of the ultrasound image may be labeled in various positions: it may be marked beside the standard section structure schematic diagram, or displayed in the structural region corresponding to the target feature structure in the standard section structure schematic diagram.
Further, the names of all specific feature structures in the standard section structure schematic diagram may also be marked and displayed.
Further, the name of the target feature structure of the ultrasound image may also be displayed in the structural region corresponding to the target feature structure in the ultrasound image.
Further, the names of the corresponding specific feature structures may also be displayed in the structural regions corresponding to all the specific feature structures of the ultrasound image.
In some embodiments, step 840 may be implemented by:
and according to the ultrasonic image and the section type identification model, obtaining the standard section type to which the ultrasonic image belongs.
In some embodiments, step 850 may be implemented by:
and obtaining a target characteristic structure corresponding to the specific characteristic structure in the ultrasonic image according to the ultrasonic image and the characteristic structure identification model.
Alternatively, step 840 and step 850 may be performed together, as follows:
and according to the section type and the characteristic structure identification model, obtaining the standard section type to which the ultrasonic image belongs and the target characteristic structure corresponding to the specific characteristic structure in the ultrasonic image.
The section type and feature structure recognition model is a trained network model based on deep learning.
On the basis of the foregoing embodiment, the method provided in this embodiment further includes the following steps:
Step 880: determining a standard grade of the ultrasonic image at least according to the standard degree of the ultrasonic image;
step 890: and displaying the standard grade of the ultrasonic image.
On the basis of the foregoing embodiment, the method provided in this embodiment further includes the following steps:
determining the number of corresponding target graphics and/or the colors of the target graphics according to the standard grade of the ultrasonic image, wherein the target graphics comprise: ring, bar, pie, or five-pointed star.
Accordingly, step 890 may include the steps of:
and displaying the target graphics according to the number of the target graphics and/or the color of the target graphics.
In one possible implementation, the number of corresponding target graphics may be determined according to a standard level of the ultrasound image, so that the target graphics are displayed in the number of target graphics.
In another possible implementation, the color of the corresponding target graphic may be determined according to the standard level of the ultrasound image, so that the target graphic is displayed in the color of the target graphic.
In yet another possible implementation, the number and color of the corresponding target graphics may be determined according to a standard level of the ultrasound image, so that the target graphics are displayed in the number and color of the target graphics.
In some embodiments, the ultrasound image, the standard tangent plane structure schematic, and the standard grade may be displayed simultaneously, and the ultrasound image, the standard tangent plane structure schematic, and the standard grade do not overlap, and a display size of the standard grade is smaller than a display size of the ultrasound image.
In some embodiments, step 880 may include the steps of:
and determining the standard grade of the ultrasonic image according to the standard degree of the ultrasonic image and the image quality of the ultrasonic image.
In some embodiments, step 880 may include the steps of:
carrying out weighted average on the standard degree of the ultrasonic image and the image quality of the ultrasonic image to obtain a standard score of the ultrasonic image;
and obtaining the standard grade of the ultrasonic image according to the standard score and the preset interval of the ultrasonic image.
In some embodiments, before step 880, the standard degree of the ultrasound image may be obtained by:
and determining the standard degree of the ultrasonic image according to at least one of the structural similarity, the proportion of the characteristic structure, the definition of the characteristic structure and the definition of the blood vessel between the ultrasonic image and the standard section corresponding to the standard section type to which the ultrasonic image belongs.
In some embodiments, step 880 may be preceded by obtaining the image quality of the ultrasound image by:
and determining the image quality of the ultrasonic image according to at least one of variance, image entropy, spatial frequency, contrast and average gradient of the ultrasonic image.
On the basis of any one of the foregoing embodiments, the method provided in this embodiment further includes the following steps:
and storing the ultrasonic image when a preset condition is met.
Wherein the preset conditions include at least one of the following conditions:
condition one: the standard grade is greater than or equal to a preset grade;
condition II: the probability value of the ultrasonic image for the standard section type is larger than or equal to a preset probability threshold.
The method provided in the embodiment of the present application is similar to the implementation principle and effect of the ultrasound imaging system provided in the embodiment of the system described above, and will not be described herein.
Embodiments of the present application provide a computer-readable storage medium having stored thereon a program executable by a processor to implement the auxiliary display method of an ultrasound image as described in any one of the above.
Those skilled in the art will appreciate that all or part of the functions of the methods in the above embodiments may be implemented by hardware or by a computer program. When all or part of the functions are implemented by a computer program, the program may be stored in a computer-readable storage medium, which may include read-only memory, random access memory, a magnetic disk, an optical disk, a hard disk, and the like; the functions are realized when the program is executed by a computer. For example, the program may be stored in the memory of the device, and all or part of the functions are realized when the program in the memory is executed by the processor. The program may also be stored in a storage medium such as a server, another computer, a magnetic disk, an optical disk, a flash disk or a removable hard disk, and downloaded or copied into the memory of the local device, or used to update the version of the local device's system; the functions of the above embodiments are then realized when the program in the memory is executed by the processor.
The foregoing description of specific examples has been presented only to aid in the understanding of the present application and is not intended to limit the present application. Several simple deductions, modifications or substitutions may also be made by the person skilled in the art to which the present application pertains, according to the idea of the present application.

Claims (16)

1. An ultrasound imaging system, comprising:
an ultrasonic probe for transmitting ultrasonic waves and receiving echo signals of the ultrasonic waves;
a transmission and reception control circuit for controlling the ultrasonic probe to perform transmission of ultrasonic waves and reception of echo signals of the ultrasonic waves;
the processor is used for generating an ultrasonic image according to the echo signal of the ultrasonic wave and determining a standard section type to which the ultrasonic image belongs according to the ultrasonic image, wherein the standard section type comprises a specific characteristic structure; identifying a target feature structure corresponding to the specific feature structure in the ultrasonic image according to the ultrasonic image; determining a corresponding standard section structure diagram according to the standard section type, wherein the standard section structure diagram comprises a structure area corresponding to the specific characteristic structure;
and the display is used for displaying the ultrasonic image and the standard section structure schematic diagram, wherein a structure area corresponding to the target characteristic structure in the standard section structure schematic diagram is highlighted.
2. The ultrasound imaging system of claim 1, wherein a structural region of the standard cut surface structural schematic diagram corresponding to the target feature is highlighted, comprising at least one of:
the brightness value of the target feature structure in the standard section structure diagram is larger than the brightness value of other areas, wherein the other areas are areas except the structural area where the target feature structure is located in the standard section structure diagram;
the gray value of the target feature structure in the standard section structure schematic diagram is lower than the gray value of the other areas;
the color of the target feature in the standard tangent plane structure schematic diagram is different from the color of the other region.
3. The ultrasound imaging system of claim 1, wherein the display is specifically configured to display the ultrasound image and the standard section structure schematic diagram simultaneously, and wherein the ultrasound image and the standard section structure schematic diagram do not overlap, and wherein a display size of the standard section structure schematic diagram is smaller than a display size of the ultrasound image.
4. The ultrasound imaging system of any of claims 1-3, wherein the display is further configured to display at least one of the following data:
The name of the standard section type to which the ultrasonic image belongs;
a name of a target feature of the ultrasound image;
the name of the target feature structure of the ultrasonic image is marked beside the standard section structure schematic diagram, or is displayed in a structure area corresponding to the target feature structure of the standard section structure schematic diagram.
5. The ultrasound imaging system of any of claims 1 to 3, wherein said processor is specifically configured to:
and obtaining the standard section type to which the ultrasonic image belongs and the target characteristic structure corresponding to the specific characteristic structure in the ultrasonic image according to the section type and the characteristic structure identification model, wherein the section type and the characteristic structure identification model are network models based on deep learning after training.
6. The ultrasound imaging system of any of claims 1 to 3, wherein the processor is further configured to: determining a standard grade of the ultrasonic image at least according to the standard degree of the ultrasonic image;
the display is also for: and displaying the standard grade of the ultrasonic image.
7. The ultrasound imaging system of claim 6, wherein the processor is further configured to:
Determining the number of corresponding target graphics and/or the colors of the target graphics according to the standard grade of the ultrasonic image, wherein the target graphics comprise: ring, bar, pie or five-pointed star;
the display is specifically used for: and displaying the target graphics according to the number of the target graphics and/or the color of the target graphics.
8. The ultrasound imaging system of claim 6, wherein the display is specifically configured to simultaneously display the ultrasound image, the standard tangent plane structure representation, and the standard grade, wherein the ultrasound image, the standard tangent plane structure representation, and the standard grade do not overlap, and wherein a display size of the standard grade is smaller than a display size of the ultrasound image.
9. The ultrasound imaging system of claim 6, wherein said determining a standard grade of said ultrasound image based at least on a standard grade of said ultrasound image comprises:
and determining the standard grade of the ultrasonic image according to the standard degree of the ultrasonic image and the image quality of the ultrasonic image.
10. The ultrasound imaging system of claim 9, wherein determining the standard grade of the ultrasound image based on the standard grade of the ultrasound image and the image quality of the ultrasound image comprises:
Carrying out weighted average on the standard degree of the ultrasonic image and the image quality of the ultrasonic image to obtain a standard score of the ultrasonic image;
and obtaining the standard grade of the ultrasonic image according to the standard score and the preset interval of the ultrasonic image.
11. The ultrasound imaging system of claim 10, wherein the processor is further configured to:
determining the standard degree of the ultrasonic image according to at least one of the structural similarity, the proportion of the characteristic structure, the definition of the characteristic structure and the definition of the blood vessel between the ultrasonic image and the standard section corresponding to the standard section type to which the ultrasonic image belongs;
and determining the image quality of the ultrasonic image according to at least one of variance, image entropy, spatial frequency, contrast and average gradient of the ultrasonic image.
12. The ultrasound imaging system of claim 6, wherein the processor is further configured to:
storing the ultrasound image when a preset condition is satisfied, wherein the preset condition includes at least one of the following conditions:
the standard grade is greater than or equal to a preset grade;
the probability value of the ultrasonic image for the standard section type is larger than or equal to a preset probability threshold.
13. An auxiliary display method of an ultrasonic image, comprising:
transmitting ultrasonic waves to a target tissue;
receiving echo signals of the ultrasonic waves;
generating an ultrasonic image according to the echo signal of the ultrasonic wave;
determining a standard section type to which the ultrasonic image belongs according to the ultrasonic image, wherein the standard section type comprises a specific characteristic structure;
identifying a target feature structure corresponding to the specific feature structure in the ultrasonic image according to the ultrasonic image;
determining a corresponding standard section structure diagram according to the standard section type, wherein the standard section structure diagram comprises a structure area corresponding to the specific characteristic structure;
and displaying the ultrasonic image and the standard section structure schematic diagram, wherein a structure area corresponding to the target characteristic structure in the standard section structure schematic diagram is highlighted.
14. The method of claim 13, wherein the structural region of the standard tangential plane structural representation corresponding to the target feature is highlighted, comprising at least one of:
the brightness value of the target feature structure in the standard section structure diagram is higher than the brightness value of other areas, wherein the other areas are areas except the area where the target feature structure is located in the standard section structure diagram;
The gray value of the target feature structure in the standard section structure schematic diagram is lower than the gray value of the other areas;
the color of the target feature in the standard tangent plane structure schematic diagram is different from the color of the other region.
15. The method of claim 13 or 14, wherein the method further comprises:
determining a standard grade of the ultrasonic image according to the standard degree of the ultrasonic image;
and displaying the standard grade of the ultrasonic image.
16. The method of claim 15, wherein the method further comprises:
storing the ultrasound image when a preset condition is satisfied, wherein the preset condition includes at least one of the following conditions:
the standard grade is greater than or equal to a preset grade;
the probability value of the ultrasonic image for the standard section type is larger than or equal to a preset probability threshold.
CN202210027817.2A · 2022-01-11 · Ultrasound imaging system and auxiliary display method of ultrasound image · Pending

Priority Applications (1)

CN202210027817.2A, priority date 2022-01-11, filing date 2022-01-11: Ultrasound imaging system and auxiliary display method of ultrasound image

Publications (1)

CN116458917A, published 2023-07-21

Family ID: 87175804

Country Status (1)

CN: CN116458917A (en)