CN118542692A - Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium


Info

Publication number: CN118542692A
Application number: CN202410669022.0A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 王勃, 刘硕, 黄云霞
Assignee (original and current): Shenzhen Mindray Bio Medical Electronics Co Ltd
Prior art keywords: lung, ultrasound, score, scoring, super

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis


Abstract

The application provides an ultrasound image analysis method, an ultrasound imaging system and a computer storage medium. The ultrasound image analysis method comprises the following steps: acquiring a lung ultrasound image; identifying ultrasound signs of each lung region in the lung ultrasound image; scoring each lung region according to the ultrasound signs to generate a scoring result; and displaying a lung super-score map on a display interface, wherein the lung super-score map comprises a lung graphic and identifiers displayed at each lung region of the lung graphic, the identifiers being used to represent the scoring results of the corresponding lung regions. By scoring each lung region and generating and displaying a lung super-score map based on the scoring results, the ultrasound image analysis scheme provided by the application allows lung diagnosis results to be displayed and compared intuitively, so that ultrasound diagnosis better meets clinical requirements.

Description

Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium
This application is a divisional application. The original application has international application number PCT/CN2019/115420 and international filing date 2019-11-04; its Chinese national application number is 201980100370.3, it entered the Chinese national phase in 2022, and it is entitled "Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium".
Technical Field
The present application relates to the field of ultrasound imaging technology, and more particularly, to an ultrasound image analysis method, an ultrasound imaging system, and a computer storage medium.
Background
In modern medical imaging, ultrasound has become one of the most widely applied and most frequently used examination techniques, and the one in which new methods spread into practice fastest, owing to its reliability, speed, convenience, real-time imaging and repeatability. The development of new ultrasound techniques has further promoted the application of ultrasound image examination in clinical diagnosis and treatment.
In recent years, in fields such as emergency and critical care, lung ultrasound imaging (abbreviated as lung ultrasound) has been increasingly widely used and valued, and the identification of ultrasound signs helps support rapid diagnosis. Rapidly and quantitatively evaluating the degree of pulmonary ventilation from ultrasound signs, and displaying and communicating the evaluation results, is of growing clinical significance. However, the traditional approach relies on manual statistics and evaluation, which is time-consuming and labor-intensive, and its results cannot be displayed intuitively, seriously hindering the adoption of lung ultrasound in emergency and critical care.
Disclosure of Invention
This summary introduces a selection of concepts in simplified form that are further described in the detailed description. It is not intended to identify key or essential features of the claimed subject matter, nor to be used as an aid in determining the scope of the claimed subject matter.
An embodiment of the present invention provides an ultrasound image analysis method, including:
Acquiring a lung ultrasonic image;
Identifying ultrasound signs of each lung region in the ultrasound image of the lung;
scoring each lung region according to the ultrasound signs to generate a scoring result;
and displaying a lung super-score map on a display interface, wherein the lung super-score map comprises a lung graphic and identifiers displayed at each lung region of the lung graphic, the identifiers being used to represent the scoring results of the corresponding lung regions.
A second aspect of an embodiment of the present invention provides an ultrasound image analysis method, the method including:
acquiring ultrasonic images of one or more areas of a measured object;
Identifying ultrasound signs in the ultrasound image of the one or more regions;
Scoring the ultrasound images of the one or more regions according to the ultrasound signs to generate scoring results;
And displaying an ultrasound score map on a display interface, wherein the ultrasound score map comprises a graphic of the measured object and identifiers displayed at respective regions of the graphic, the identifiers being used to represent the scoring results of the corresponding regions.
A third aspect of an embodiment of the present invention provides an ultrasound imaging method, the method comprising:
acquiring a historical ultrasound score map of a measured object, wherein the historical ultrasound score map comprises a graphic of the measured object and identifiers displayed at one or more regions of the graphic, the identifiers being used to represent historical scores;
generating a scanning indication for the one or more regions according to the historical scores;
transmitting ultrasonic waves to the one or more regions of the measured object for scanning according to the scanning indication to obtain ultrasonic echo signals; and
The ultrasound echo signals are processed to obtain a current ultrasound image of the one or more regions.
A fourth aspect of an embodiment of the present invention provides an ultrasound imaging system comprising:
An ultrasonic probe;
a transmitting/receiving control circuit for exciting the ultrasonic probe to transmit ultrasonic waves to a target object and controlling the ultrasonic probe to receive ultrasonic echoes returned from the target object to obtain ultrasonic echo signals;
a memory for storing a program executed by the processor;
A processor for:
processing the ultrasonic echo signals to obtain a lung ultrasonic image;
identifying ultrasound signs of one or more lung regions in the ultrasound image of the lung;
Scoring the one or more lung regions according to the ultrasound signature to generate a scoring result;
And a display for displaying a lung super-score map on a display interface, the lung super-score map comprising a lung graphic and identifiers displayed at each lung region of the lung graphic, the identifiers being used to represent the scoring result of the corresponding lung region.
A fifth aspect of the embodiments of the present invention provides a computer storage medium having stored thereon a computer program which, when executed by a computer or processor, implements the steps of the above-described ultrasound image analysis method.
According to the ultrasound image analysis method, ultrasound imaging system and computer storage medium of the embodiments, each lung region is scored, and a lung super-score map is generated and displayed based on the scoring results, so that lung diagnosis results can be displayed and compared intuitively and ultrasound diagnosis better meets clinical requirements.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort to a person skilled in the art.
In the drawings:
FIG. 1 shows a schematic block diagram of an ultrasound imaging system according to an embodiment of the invention;
FIG. 2 shows a schematic flow chart of an ultrasound image analysis method according to an embodiment of the invention;
FIG. 3a shows a lung super score graph according to an embodiment of the invention;
FIG. 3b shows a lung super score graph according to an embodiment of the invention;
FIG. 4 illustrates another lung super score graph, according to an embodiment of the present invention;
FIG. 5 shows a schematic diagram of a lung ultrasound report according to an embodiment of the invention;
FIG. 6 shows a schematic view of viewing a lung ultrasound image in a first form of a lung super score map, according to an embodiment of the present invention;
FIG. 7 shows a schematic diagram of displaying a plurality of lung super score graphs on a display interface according to an embodiment of the present invention;
FIG. 8 illustrates a schematic diagram of selecting multiple lung super-score maps for comparative evaluation, according to an embodiment of the invention;
FIG. 9 shows a schematic flow chart of an ultrasound image analysis method according to another embodiment of the invention;
Fig. 10 shows a schematic flow chart of an ultrasound imaging method according to a further embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments according to the present invention will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present invention and not all embodiments of the present invention, and it should be understood that the present invention is not limited by the example embodiments described herein. Based on the embodiments of the invention described in the present application, all other embodiments that a person skilled in the art would have without inventive effort shall fall within the scope of the invention.
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the invention may be practiced without one or more of these details. In other instances, well-known features have not been described in detail in order to avoid obscuring the invention.
It should be understood that the present invention may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
In order to provide a thorough understanding of the present invention, detailed steps and detailed structures will be presented in the following description in order to explain the technical solution presented by the present invention. Preferred embodiments of the present invention are described in detail below, however, the present invention may have other embodiments in addition to these detailed descriptions.
In the following, an ultrasound imaging system according to an embodiment of the invention is first described with reference to fig. 1, fig. 1 showing a schematic block diagram of an ultrasound imaging system 100 according to an embodiment of the invention.
As shown in fig. 1, the ultrasound imaging system 100 includes an ultrasound probe 110, transmit circuitry 112, receive circuitry 114, beam forming circuitry 116, a processor 118, a display 120, a transmit/receive selection switch 122, and a memory 124. Wherein the transmitting circuit 112 and the receiving circuit 114 may be connected to the ultrasound probe 110 through a transmitting/receiving selection switch 122.
The ultrasound probe 110 typically includes an array of a plurality of array elements. At each transmission of ultrasound, all or part of the array elements of the ultrasound probe 110 participate in the transmission of ultrasound. At this time, each or each part of the array elements participating in the ultrasonic transmission is excited by the transmission pulse and transmits ultrasonic waves respectively, and the ultrasonic waves transmitted by the array elements respectively are overlapped in the propagation process to form a synthetic ultrasonic beam transmitted to the target object (for example, human body), and the synthetic ultrasonic beam may be ultrasonic waves transmitted to the lung of the target object (for example, human body).
During ultrasound imaging, the transmit circuit 112 sends delay-focused transmit pulses of a certain amplitude and polarity to the ultrasound probe 110 through the transmit/receive selection switch 122. The ultrasound probe 110 is excited by the transmit pulses, transmits ultrasonic waves to the scanned target object, receives, after a certain delay, ultrasonic echoes carrying information about the scan target that are reflected and/or scattered back from the target area, and converts the ultrasonic echoes back into electrical signals. The receiving circuit 114 receives these electrical signals, obtains ultrasonic echo signals, and feeds them to the beam forming circuit 116. The beam forming circuit 116 performs focusing delay, weighting and channel summation on the ultrasonic echo signals, and then sends them to the processor 118 for related signal processing.
The transmit/receive selection switch 122 may also be referred to as a transmit/receive controller, which may include a transmit controller and a receive controller, the transmit controller for exciting the ultrasound probe 110 to transmit ultrasound waves to a target object (e.g., a human body) via the transmit circuit 112; the receive controller is used to receive ultrasound echoes returned from the target object by the ultrasound probe 110 via the receive circuit 114.
The processor 118 may process the ultrasound echo signals derived based on the ultrasound echoes to derive an ultrasound image of the target object. For example, the ultrasonic echo signals are subjected to beam forming processing by the beam forming circuit 116. The ultrasound images obtained by the processor 118 may be stored in the memory 124. Also, the ultrasound image may be displayed on the display 120. For a more detailed description, reference may be made to the following examples of the present specification.
The processor 118 may be a central processing unit (CPU), graphics processing unit (GPU), application-specific integrated circuit (ASIC), field-programmable gate array (FPGA), or other form of processing unit with data processing and/or instruction execution capabilities, and may control other components in the ultrasound imaging system to perform desired functions. For example, the processor 118 can include one or more embedded processors, processor cores, microprocessors, logic circuits, hardware finite state machines (FSMs), digital signal processors (DSPs), graphics processing units (GPUs), or combinations thereof.
The display 120 is connected to the processor 118, and the display 120 may be a touch display screen, a liquid crystal display screen, or the like; or the display 120 may be a stand-alone display device such as a liquid crystal display, television, or the like that is independent of the ultrasound imaging system 100; or the display 120 may be a display screen of an electronic device such as a smart phone, tablet, etc. Wherein the number of displays 120 may be one or more. The display 120 may display the ultrasound image and scoring results obtained by the processor 118. In addition, the display 120 may provide the user with a graphical interface for human-computer interaction while displaying the ultrasonic image, one or more controlled objects are provided on the graphical interface, and the user is provided with an operation instruction input by using the human-computer interaction device to control the controlled objects, so as to execute a corresponding control operation. For example, icons are displayed on the graphical interface and can be manipulated using a human-machine interaction device to perform a particular function, such as selecting a lung super score for comparison.
Alternatively, the ultrasound imaging system 100 may further include other man-machine interaction devices besides the display 120, which are connected to the processor 118, for example, the processor 118 may be connected to the man-machine interaction device through an external input/output port, which may be a wireless communication module, a wired communication module, or a combination of both. The external input/output ports may also be implemented based on USB, bus protocols such as CAN, and/or wired network protocols, among others.
The man-machine interaction device may include an input device for detecting input information of a user, where the input information may be, for example, a control instruction for transmitting/receiving an ultrasonic wave, an operation input instruction for editing and annotating an ultrasonic image, or may further include other instruction types. The input device may include one or more of a keyboard, mouse, scroll wheel, trackball, mobile input device (such as a mobile device with a touch display, cell phone, etc.), multi-function knob, etc. The human-machine interaction means may also comprise an output device such as a printer, for example for printing ultrasound reports.
Memory 124 may be used for storing instructions for execution by processor 118, for storing received ultrasound echo signals, for storing ultrasound images, and so forth. Memory 124 may be a flash memory card, solid state memory, hard disk, or the like. Which may be volatile memory and/or nonvolatile memory, removable memory and/or non-removable memory, and the like.
It should be understood that the components included in the ultrasound imaging system 100 shown in fig. 1 are illustrative only and may include more or fewer components. The invention is not limited in this regard.
Next, an ultrasonic image analysis method according to an embodiment of the present invention will be described with reference to fig. 2. Fig. 2 is a schematic flow chart of an ultrasound image analysis method 200 of an embodiment of the present invention.
As shown in fig. 2, the method 200 includes the steps of:
In step S210, an ultrasound image of the lung is acquired.
As one implementation, step S210 may include: a pre-stored ultrasound image of the lung is read from a storage medium. The process of analyzing the acquired ultrasound image of the lung may be performed at any time after the ultrasound image of the lung is acquired. The stored ultrasound images of the lungs may be read from a local storage medium (e.g., memory 124) or from a storage medium of another device via a wired or wireless network.
As another implementation, step S210 may include: and acquiring the lung ultrasonic image in real time.
Wherein, the step of acquiring the lung ultrasound image in real time may comprise: firstly, transmitting ultrasonic waves to the lung of a target object, and receiving ultrasonic echo based on the ultrasonic waves to obtain ultrasonic echo signals; and then, obtaining a lung ultrasonic image of the target object according to the ultrasonic echo signals. The target object may refer to a human body to be detected or a part of the human body to be detected, for example.
Specifically, in connection with fig. 1, the ultrasound probe 110 may be excited via the transmission/reception selection switch 122 to transmit ultrasonic waves to the lungs of a target object (e.g., a human body) via the transmission circuit 112; the ultrasound probe 110 then receives the ultrasonic echoes returned from the lungs via the reception circuit 114 and converts them into ultrasonic echo signals. Thereafter, beam forming may be performed by the beam forming circuit 116, and the beamformed ultrasonic echo signals fed to the processor 118 for related processing to obtain ultrasound images of the lungs. In addition, the lung ultrasound image in the embodiment of the present invention may be obtained by performing a series of signal processing steps on the ultrasonic echo signals, including, for example: analog-to-digital conversion, beam forming, IQ (in-phase/quadrature) demodulation, logarithmic compression, and gray-scale conversion.
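As an illustration of the final display-mapping step in this processing chain, the sketch below logarithmically compresses echo-envelope amplitudes into 8-bit gray levels. The 60 dB dynamic range and the function name are assumptions chosen for illustration, not details taken from the application.

```python
import math

def log_compress(envelope, dynamic_range_db=60.0, out_levels=255):
    """Map raw echo-envelope amplitudes to 8-bit gray levels by
    logarithmic compression (a common final step before display)."""
    peak = max(envelope)
    gray = []
    for a in envelope:
        if a <= 0:
            gray.append(0)
            continue
        db = 20.0 * math.log10(a / peak)    # 0 dB at the peak, negative below
        db = max(db, -dynamic_range_db)     # clip to the dynamic range
        gray.append(round((db + dynamic_range_db) / dynamic_range_db * out_levels))
    return gray
```

For example, the peak amplitude maps to 255 and anything 60 dB or more below it maps to 0.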
In one embodiment, one or more frames of ultrasound images of the lungs are acquired separately for each lung region of the lungs and stored in memory 124. Wherein the left and right lungs may be divided into at least 2 lung regions, respectively, e.g., the left and right lungs may be divided into 3, 4 or 6 lung regions, respectively, etc., one or more frames of lung ultrasound images may be acquired for each lung region during imaging.
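The division of each lung into regions can be represented with a simple enumeration. The sketch below assumes a hypothetical 12-zone layout (anterior, lateral and posterior walls, each with an upper and a lower zone, giving 6 regions per lung); the zone names are illustrative, not taken from the application.

```python
# Hypothetical 12-zone lung protocol; names are illustrative only.
SIDES = ("left", "right")
WALLS = ("anterior", "lateral", "posterior")
LEVELS = ("upper", "lower")

def lung_regions():
    """Enumerate region identifiers such as 'left-anterior-upper'."""
    return [f"{side}-{wall}-{level}"
            for side in SIDES for wall in WALLS for level in LEVELS]

# One or more acquired image frames can then be stored per region:
frames_by_region = {region: [] for region in lung_regions()}
```

A 3- or 4-region scheme per lung would simply use a different `WALLS`/`LEVELS` split.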
In step S220, ultrasound signs of individual lung regions in the ultrasound image of the lung are identified.
Here, ultrasound signs refer to features that characterize lung-specific findings. For example, ultrasound signs of the lung may include: the bat sign, lung sliding, the seashore sign, the stratosphere sign, comet-tail artifacts, the lung point, tissue-like (consolidation) signs, anechoic fluid areas, and the like. In this embodiment, a subset of these ultrasound signs may be identified for scoring, such as B-lines, lung consolidation, and pleural effusion.
A B-line (also referred to as a "comet tail" artifact) is a discrete vertical reverberation artifact that extends from the pleural line to the bottom of the screen without fading, and moves synchronously with lung sliding. The appearance of numerous B-lines in an ultrasound image is a sign of interstitial lung syndrome; their number increases as the air content decreases and the lung tissue density increases, so B-lines can be used to diagnose pulmonary edema and to assess the degree of pulmonary ventilation. In normal lung tissue, 0-2 isolated B-lines are sometimes visible in a single field of view. Specifically, because normal lung tissue is filled with air, sound waves are almost completely scattered, and only the pleural line and A-lines, i.e., several highly echogenic lines parallel to the pleural line, are visible under ultrasound. When pulmonary parenchymal disease (such as pulmonary edema, pneumonia or acute lung injury) raises hydrostatic pressure or increases capillary permeability, leading to interlobular septal thickening, the reduced air content means that exudate, transudate, collagen and blood increase the lung density, the acoustic mismatch between the lung and surrounding tissue decreases, and ultrasound can to some extent image deeper regions, producing vertical mixed echoes, i.e., the B-lines described above. According to the width of the B-line region, B-lines can be divided into isolated B-lines and diffuse (coalescent) B-lines.
When the air content in the lung decreases further, the lung tissue becomes consolidated, and on the sonogram it appears as solid tissue with echogenicity similar to that of the liver or spleen. Lung consolidation is a progressive consequence that can be caused by pulmonary embolism, cancer metastasis to the lungs, compressive or obstructive atelectasis, and pulmonary contusion. The presence of air or fluid bronchograms, or vascular signs, further suggests lung consolidation.
When pleural effusion is present, the pleural line separates from the lung surface, and together with the acoustic shadows of the ribs above and below it forms a quadrilateral; this quad sign can serve as a characteristic sign of many kinds of pleural effusion. In addition, the sinusoid sign is also a sign of pleural effusion: on M-mode scanning, the lung line moves toward the pleural line with the respiratory cycle, tracing a curve resembling a sine wave.
In embodiments of the present invention, automatic recognition, manual recognition, or a combination of automatic and manual recognition may be employed to identify ultrasound signs in ultrasound images of the lungs.
For example, when automatic identification is employed, a trained neural network model may be utilized to automatically identify ultrasound signs in lung ultrasound images. In particular, the neural network model may be trained so that the machine identifies ultrasound signs via an object detection algorithm, such as Faster R-CNN.
Illustratively, the step of training the neural network model includes: and marking ultrasonic signs in the lung ultrasonic image, and inputting the ultrasonic signs as a training sample into a neural network for training until the model converges, so as to obtain a trained neural network model. Then, the lung ultrasound image acquired in step S210 may be input into the neural network model, and the recognition result of the ultrasound symptom therein may be output.
In addition, conventional image processing methods may be used to identify ultrasound signs. For example, since a B-line is a discrete vertical reverberation artifact that extends from the pleural line to the bottom of the screen without fading, B-lines can be identified by detecting vertical linear features along the beam-line direction. Such linear features can be identified by template matching and similar methods.
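A minimal sketch of this idea, assuming the image is a 2-D grid of brightness values normalized to [0, 1] and the pleural-line row is already known: a column is flagged as a B-line candidate when it stays bright for nearly the entire depth below the pleural line. The thresholds and function name are illustrative assumptions.

```python
def detect_b_line_columns(image, pleura_row, brightness_thr=0.6, coverage_thr=0.9):
    """Flag columns that stay bright from the pleural line to the bottom
    of the image -- the vertical, non-fading signature of a B-line.

    `image` is a 2-D list of brightness values in [0, 1];
    `pleura_row` is the row index of the pleural line."""
    depth = len(image) - pleura_row
    b_columns = []
    for col in range(len(image[0])):
        bright = sum(1 for row in range(pleura_row, len(image))
                     if image[row][col] >= brightness_thr)
        if bright / depth >= coverage_thr:   # bright over ~the whole depth
            b_columns.append(col)
    return b_columns
```

A real implementation would work on the beam-line grid before scan conversion and group adjacent flagged columns into one B-line.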
In one embodiment, after the ultrasound signs are identified, they may also be quantitatively analyzed to obtain related parameters. As an example, when the identified sign is B-lines, the main calculated parameters include the number of B-lines, the B-line coverage percentage, the spacing between adjacent B-lines, and the like. The number of B-lines is the total number of identified B-lines; the coverage percentage is the percentage of the lung detection area occupied by B-lines; and the spacing between adjacent B-lines is the distance between B-lines at the pleural line.
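These parameters can be computed once the B-lines have been located. The sketch below assumes each B-line is represented by its pixel span at the pleural line and that the physical pixel spacing is known; the function name, the span representation and the 0.3 mm spacing are illustrative assumptions.

```python
def b_line_stats(spans, region_width, pixel_spacing_mm=0.3):
    """Summarise B-lines in one lung region.

    `spans` is a list of (start_col, end_col) pixel spans, one per B-line,
    measured at the pleural line; `region_width` is the region width in
    pixels at that depth."""
    count = len(spans)
    covered = sum(end - start + 1 for start, end in spans)
    coverage_pct = 100.0 * covered / region_width
    centers = sorted((start + end) / 2 for start, end in spans)
    gaps_mm = [(b - a) * pixel_spacing_mm          # adjacent-line spacing
               for a, b in zip(centers, centers[1:])]
    return {"count": count, "coverage_pct": coverage_pct, "gaps_mm": gaps_mm}
```

The returned dictionary corresponds to the three parameters named above: count, coverage percentage, and adjacent spacing.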
In embodiments of the present invention, the ultrasound signs may be identified fully automatically by the methods described above, identified and marked manually by the user, or identified by a combination of automatic and manual methods. For example, the relatively easy-to-identify B-lines can be identified automatically, while complex findings such as pulmonary fibrosis and pleural effusion can be marked manually.
After the lung ultrasound images of the respective lung regions are identified, the identification result of the most representative one or more lung ultrasound images may be selected as the final result based on the identification result. For example, the recognition result of one or more lung ultrasound images with the largest number of B lines or the recognition result of one or more lung ultrasound images with the largest percentage of B lines may be selected. The specific selection criteria may be set by the user.
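Selecting the most representative frame can be a simple maximum over a configurable criterion. In the sketch below the per-frame identification results are assumed to be dictionaries, and the default criterion is the B-line count; both are illustrative assumptions.

```python
def select_representative(frame_results, key="b_line_count"):
    """Pick the identification result that is most representative for a
    region -- by default, the frame with the most B-lines. The criterion
    (`key`) is user-configurable, e.g. 'coverage_pct'."""
    return max(frame_results, key=lambda result: result[key])
```

Passing `key="coverage_pct"` would instead select the frame with the largest B-line coverage percentage.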
In step S230, the lung regions are scored according to the ultrasound signs to generate a scoring result.
The score indicates the extent of injury to each lung region. As an example, the higher the score, the more severe the lung injury. In the embodiment of the invention, scoring may use criteria commonly used in clinic, such as the lung ultrasound aeration score, or newly defined scoring criteria.
In one embodiment, individual lung regions may be scored based on the number of B-lines, lung consolidation and pleural effusion identified in step S220.
For example, when 0-2 isolated B-lines are detected, indicating normal lung ventilation, a score of 0 is recorded. When multiple clearly separated B-lines are identified, i.e., 3 or more isolated B-lines are detected, indicating moderate loss of aeration, a score of 1 is recorded. When densely fused B-lines are identified, i.e., coalescent B-lines appear, indicating severe loss of aeration in the lung tissue, a score of 2 is recorded. When lung consolidation, or lung consolidation combined with pleural effusion, is identified, a score of 3 is recorded.
To facilitate quantitative analysis, the score format may be numeric, such as 0-3. The score may also take the form of letter codes such as N, B1, B2 and C, which correspond to different degrees of severity. In step S230, not only can the score of a single lung region be obtained from a single lung ultrasound image, but the scores of all lung regions can also be summed to obtain an overall lung score.
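The 0-3 scoring rules described above, and the summation into an overall lung score, can be sketched as follows. This is a simplified illustration, not a clinical tool, and the function names are assumptions.

```python
def score_region(n_isolated_b_lines, coalescent_b_lines=False, consolidation=False):
    """Per-region aeration score following the 0-3 rules above.
    Consolidation scores 3 whether or not pleural effusion accompanies it."""
    if consolidation:
        return 3            # lung consolidation (with or without effusion)
    if coalescent_b_lines:
        return 2            # densely fused B-lines: severe aeration loss
    if n_isolated_b_lines >= 3:
        return 1            # 3+ well-separated B-lines: moderate loss
    return 0                # 0-2 isolated B-lines: normal aeration

def total_score(region_scores):
    """Overall lung score: the sum of the per-region scores."""
    return sum(region_scores.values())
```

The numeric scores map directly onto the letter codes N, B1, B2 and C mentioned above.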
Illustratively, the scoring may be automatic, e.g., the system automatically converts the ultrasound sign identification results into scoring results, or, when a neural network model is used, the model directly outputs the scoring result. The scoring may also be manual, with the user scoring based on the ultrasound signs identified in step S220.
In one embodiment, after the score is generated, an operation interface may be provided to the user, enabling the user to confirm or modify the score of each lung region via the operation interface. After the user confirms the scores obtained in step S230, the method continues to step S240. In another embodiment, after the ultrasound signs are identified in step S220, an operation interface may likewise be provided, so that the user can confirm or delete the identified ultrasound signs through the operation interface. Of course, such an operation interface is optional, and the subsequent steps may be executed directly without user intervention.
In step S240, a lung super-score map is displayed on a display interface, the lung super-score map including a lung graphic and an identification displayed at each lung region of the lung graphic, the identification being used to characterize the scoring result for the corresponding lung region.
The user can quickly learn the scoring result of each lung region from the identifications displayed at the lung regions of the lung graphic, so that the degree of lung injury of the patient is visually presented to the doctor, facilitating both monitoring of the patient's health state and targeted follow-up treatment. The lung graphic may be a structural drawing showing the shape of the lung, which may be a two-dimensional schematic such as shown in fig. 3a, 3b and 4, or a three-dimensional perspective drawing; the structural drawing may be a line-model drawing as shown in fig. 3a, 3b and 4, or a rendered drawing. The lung graphic may also be another indicator having an equal or approximately proportional structural relationship to the lung, for example, a rectangular indicator divided in two. The invention does not limit the specific manner of presenting the lung graphic.
In one embodiment, step S240 includes: displaying a lung super score (Lung Ultrasound Score, LUS) map of a single scoring session on the display interface. That is, only one lung super score map, including only one lung graphic, is displayed on the display interface. Based on a single lung super score map, the injury condition of multiple lung regions can be displayed intuitively, and the user can quickly survey the overall lung condition and judge which part of the lung is more severely injured. In some examples, the injury condition of a single lung region may also be displayed based on the lung super score map. Which lung regions' scoring results to display may be determined based on user input, in a predetermined order within the system, or in combination with the scoring results of the lung regions. For example, the processor 118 controls the display to indicate the injury of one or more lung regions whose scoring results exceed a preset scoring threshold.
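The threshold-based selection in the last sentence can be sketched as follows (function and parameter names are assumptions for illustration):

```python
def regions_exceeding_threshold(region_scores: dict, threshold: int) -> list:
    """Return the lung regions whose scoring result exceeds a preset scoring
    threshold, i.e. the regions whose injury the display should indicate."""
    return [region for region, score in region_scores.items() if score > threshold]
```

With a threshold of 1, only the regions scored 2 or 3 (severe loss of aeration or consolidation) would be indicated.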
In one embodiment, the processor 118 may also control the display to dynamically display one or more lung regions. Referring to the schematic diagram of fig. 3a, the processor 118 may control the display to display the injury condition of the lung regions of the right lung and then the left lung in the order 1R lung, 2R lung, 3R lung, 4R lung …, or to alternately display the injury conditions of the right and left lungs in the order 1R lung, 1L lung, 2R lung, 2L lung …. Which lung region's injury is displayed may also be determined in combination with the scoring results; for example, the processor 118 controls the display to sequentially display the injury condition of one or more lung regions in order of their scoring results.
As an example, the identification may comprise a graphic. The graphic may characterize the scoring result as one or more of different colors, brightness, texture density, pattern density, fill area, or shape. For example, different colored graphics may be displayed at the location of the corresponding lung fields of the lung graphic according to the level of the score, and the user may quickly obtain the score of the corresponding lung fields according to the color of the graphic.
For ease of understanding, embodiments of the present invention present two visual implementations of lung super-score maps, as shown in fig. 3a, 3b and 4.
Referring first to fig. 3a, in the first form of lung super score map, identifications characterizing the scoring results are displayed at the lung region positions of the lung graphic, and a lung ultrasound image may be displayed after clicking on an identification. This form of lung super score map is simple and intuitive, and can quickly convey the scoring results to the user. The boundaries of the individual lung regions may be shown, as in fig. 3a, or may be omitted.
In one embodiment, the identifications may be color patches or color boxes of different colors, each color representing a score of the scoring result. The shape of the color patch or color box is not limited and may be, for example, a circle, a square, or a triangle. The identifications of different lung regions may be the same or different. The identification is illustrated in fig. 3a as a circular pattern with different textures, but it will be appreciated that in actual application the different textures may be replaced with different colors.
Depending on the score, the color of the identification may follow a consistent variation pattern to aid the user's understanding and memory; for example, the color may change from dark to light, or from one color family to another, as the score increases. As an example, a score of 0 may be represented by green, 1 by yellow, 2 by orange, and 3 by red.
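The example color scheme can be sketched as a simple lookup; the mapping values and the structure of the returned identification are illustrative assumptions:

```python
# Score-to-color mapping from the example: 0 green, 1 yellow, 2 orange, 3 red.
SCORE_COLORS = {0: "green", 1: "yellow", 2: "orange", 3: "red"}

def identification_for(region: str, score: int) -> dict:
    """Build the identification displayed at one lung region of the lung graphic:
    the region label, its score, and the color characterizing that score."""
    return {"region": region, "score": score, "color": SCORE_COLORS[score]}
```

A renderer would then draw, e.g., an orange circle labelled "1R" for a region scored 2.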
In one embodiment, the identification further includes the score of the scoring result displayed within the color patch or color box (not shown), thereby presenting the scoring result in both graphical and textual form. In other embodiments, the number of the corresponding lung region may also be displayed within the graphic, as shown in fig. 3a, where 1R, 2R, 3R and 4R represent the first to fourth lung regions on the right side, and 1L, 2L, 3L and 4L represent the first to fourth lung regions on the left side, respectively.
As described above, when this form of lung super score map is adopted, the map may also serve as a playback navigation interface: when a selection instruction for the identification of a certain lung region of the lung super score map is received, the lung ultrasound image corresponding to that lung region is displayed on the display interface. Specifically, referring to fig. 6, if the user clicks the identification of the 1R lung region on the lung super score map on the left side of the display interface (the identification may then be framed or highlighted), the lung ultrasound image corresponding to the 1R lung region is displayed on the right side of the display interface, so that the diagnostic image can be reviewed in detail.
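The playback-navigation behaviour can be sketched minimally as follows; the class and attribute names are assumptions, and the image entries stand in for frames stored in memory 124:

```python
class LungSuperScoreMap:
    """Playback navigation sketch: selecting a region's identification marks it
    as selected (framed/highlighted) and returns that region's stored image(s)."""

    def __init__(self, images_by_region: dict):
        self.images_by_region = images_by_region  # e.g. {"1R": ["1R_frame0.png", ...]}
        self.selected_region = None

    def select(self, region: str) -> list:
        self.selected_region = region             # identification is highlighted
        return self.images_by_region.get(region, [])
```

A real implementation would additionally drive the display; the sketch only captures the selection-to-image mapping.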
In one embodiment, each lung region may correspond to one or more frames of the ultrasound image of the lung. That is, each lung region may store one or more corresponding lung ultrasound images in memory 124. With continued reference to fig. 6, when each lung region correspondingly stores a plurality of frames of lung ultrasound images, a list 601 of candidate ultrasound images may be shown on a display interface (e.g., may be on a side of the lung ultrasound images) for selection by a user, and the currently displayed ultrasound image may also be framed or highlighted in the list 601.
In one embodiment, with continued reference to fig. 6, to show the change in the injury condition of each lung region at various stages of the treatment process, a scoring chart 602 of multiple scoring results for the selected lung region may be displayed on the display interface. The scoring chart 602 may take various forms such as a curve chart, line chart, histogram, or bar chart; it may also be a table in which the scoring results are recorded.
As an example, when the scoring graph 602 employs a graph or a line graph, the scoring results of multiple scores may be represented by respective points on the graph or line graph. In addition, the scoring results of the currently displayed ultrasound image of the lung may be highlighted in the scoring graph to facilitate the user's determination of the location of the current scoring results in the graph.
Further, the scoring results on the scoring chart 602 may have a mapping relationship with the lung ultrasound images corresponding to the selected lung region. When a selection instruction for a scoring result on the scoring chart 602 is received, i.e., when the user clicks a point or line segment on the chart representing that scoring result, the lung ultrasound image corresponding to that scoring result is displayed on the display interface; that is, the currently displayed lung ultrasound image is replaced by the lung ultrasound image corresponding to the selected scoring result, and the lung super score map is synchronously switched to the one corresponding to that scoring result.
Further, the scoring results or ultrasound sign identification results may also be shown as text 603 on the display interface. The scoring results include the scores; the ultrasound sign identification results include, for example, the number of B-lines and the B-line coverage in the lung ultrasound image.
Fig. 3b shows a lung super score map according to another embodiment of the invention. This lung super score map displays an identification characterizing the scoring result at each lung region position of the lung graphic, and the lung ultrasound image may be displayed after clicking on the identification. The boundaries of the individual lung regions may be shown, as in fig. 3b, or may be omitted.
The identification in fig. 3b is an overlay superimposed on each lung region. The overlay may characterize the scoring result with different colors, brightness, textures, texture densities, patterns, pattern densities and/or fill areas; fig. 3b illustrates the case of characterizing the score by different textures. Such an identification covering the entire lung region functions in the same way as the identification shown in fig. 3a: upon receiving a user selection, the identification itself may be highlighted, and the lung ultrasound image corresponding to the lung region may be called up and displayed. In an example not illustrated, the scoring result of the corresponding lung region may be displayed on the overlay.
Fig. 4 shows a second form of lung super score map according to another embodiment of the invention. When this form is adopted, the lung ultrasound image of the corresponding lung region is displayed at each lung region of the lung graphic, and the identification characterizing the scoring result is displayed in synchronization with the lung ultrasound image. The lung ultrasound image displayed at each lung region may be the highest-scoring frame in that lung region, or a frame manually specified by the user. With this form of lung super score map, the user can simultaneously and quickly view both the scoring result and the ultrasound image of each lung region.
As an example, when the identification is displayed in synchronization with the lung ultrasound image, the identification may be displayed on the lung ultrasound image; for example, it may be a border, a corner mark, or another graphic displayed on the image. Alternatively, the identification may be displayed alongside the lung ultrasound image, for example as a frame or an indicator strip displayed in parallel with the image. Or the lung ultrasound image may be displayed on the identification.
As an example, the identifications in this embodiment may have different colors, brightness, fill areas, patterns, shapes, or textures to characterize the score of the scoring result. For example, referring to fig. 4, a corner mark 402 with a different color (not shown) may be displayed in the lower right corner of the lung ultrasound image as the identification. The colors representing the different scores may be the same as in the previous embodiment, i.e., a score of 0 in green, 1 in yellow, 2 in orange, and 3 in red.
With continued reference to fig. 4, in one embodiment, when the identification is displayed in synchronization with the lung ultrasound image, one or more of the score (Score), the number of B-lines (B lines), and the B-line coverage percentage of the scoring result may also be displayed on or in synchronization with the lung ultrasound image. Further, the total score (LUS) over the lung regions may also be displayed on the lung super score map, from which the user can quickly learn the overall condition of the lung. In addition, the numbers of the corresponding lung regions may be displayed on the lung ultrasound images; for example, 1R, 2R and 3R displayed in the upper right corner represent the first, second and third lung regions of the right lung, and 1L, 2L and 3L represent the first, second and third lung regions of the left lung, respectively.
In one embodiment, when a selection instruction of the lung ultrasound image of a certain lung area of the lung super score map is received, the selected lung ultrasound image may be further displayed in an enlarged manner on the display interface, so that the user can view the selected lung ultrasound image in detail.
In one embodiment, referring to fig. 5, the method 200 further comprises: outputting an ultrasound report including the lung super score map. The first form of lung super score map shown in fig. 3 is concise and intuitive for presenting diagnostic results and is therefore well suited to serve as the evaluation result graph in the report. Of course, depending on the user's needs, the ultrasound report may instead adopt the second form of lung super score map shown in fig. 4. The lung super score map output to the ultrasound report may be consistent with the one displayed on the display interface. For reasons such as report printing, the lung super score map may also be adaptively adjusted when output: the identifications may be adjusted to a form suitable for printing, the lung ultrasound images of the lung regions may be removed so that only the identifications are displayed, and/or only the identifications whose scoring results exceed the preset scoring threshold may be output on the lung super score map. For example, the identifications of the lung super score map of fig. 3 are color patches, which may be converted directly to color boxes when output to an ultrasound report.
Specific details of displaying a single lung super score map on a display interface are described above by way of example. In another embodiment, referring to fig. 7, step S240 may further include: simultaneously displaying a plurality of windows on the display interface, each window displaying a lung super score map of a single scoring session. For example, four windows are displayed on the display interface in fig. 7, each displaying the lung super score map of one scoring session.
When the lung super scoring graphs scored for a plurality of times are simultaneously displayed on the display interface, a user can simultaneously check scoring results of various stages in the treatment process, so that treatment effects of various parts can be visually compared and evaluated. For example, if the identification of each lung field on multiple lung super score maps changes from red to orange or from orange to yellow or green as the treatment progresses, the improvement in lung ventilation is visually indicated.
The lung super score map displayed in each of the plurality of windows may be any of the forms of the lung super score map described above, but preferably the lung super score maps of the plurality of windows are in the same form for comparison. Because the first form of lung super score map described in connection with fig. 3 is relatively compact and intuitive, in one embodiment, when multiple lung super score maps are displayed simultaneously for comparison, a lung super score map of the first form described above may be employed.
As an example, when a plurality of lung super score maps are simultaneously displayed on the display interface, the acquisition time of each lung super score map is displayed in its window, and the lung super score maps are arranged in order of acquisition time for convenient viewing.
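Arranging the windows by acquisition time can be sketched as a simple sort; the field name `acquired` is an assumption, and ISO-format date strings are used here because they sort chronologically:

```python
def arrange_by_acquisition_time(score_maps: list) -> list:
    """Order lung super score maps by acquisition time for side-by-side display.
    Each map is assumed to carry its acquisition date as an ISO string."""
    return sorted(score_maps, key=lambda m: m["acquired"])
```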
In one embodiment, when an instruction to select an identification of a certain lung region of one of the windows is received while a plurality of lung super score maps are simultaneously displayed on the display interface, the identification at the corresponding lung region of the lung super score map of the other window or windows is automatically highlighted. That is, if the user selects an identifier within a window, for example, if the user selects the identifier of the 1L lung region of the lung super score map in the window at the upper right corner in fig. 7, the identifiers of the 1L lung regions in the lung super score maps in the other three windows are automatically highlighted or enlarged, so that the user is facilitated to compare the scores of the 1L lung regions at the respective time points.
In one embodiment, the method 200 further comprises: highlighting, on the lung super score map of a given scoring session, an indication that the scoring result of a certain lung region has changed relative to the scoring session immediately before or after it. That is, when the scoring result of a certain lung region in one scoring session differs from its result in the preceding or following session, an indication that the scoring result has changed is displayed in the lung super score map generated from that session, to prompt the user to pay attention to that lung region.
As an example, the indication that the scoring result has changed may be a graphic or text. It may be a fixed graphic or text that merely indicates that the scoring result changed, without indicating whether it improved or worsened; or different indications may be used to represent an increase or a decrease of the scoring result, respectively.
In one embodiment, the method 200 further comprises: obtaining the difference between the scoring results of each lung region at two adjacent acquisition times; and displaying the difference as a graphical element on the lung super score map in the window corresponding to the later time. The graphical element may have one or more of different colors, sizes, shapes, textures, patterns, fill areas, and pattern densities to characterize the magnitude of the difference, and may be displayed alongside or overlapping the identification characterizing the scoring result.
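The per-region difference between two adjacent scoring sessions can be sketched as follows (function and parameter names are assumptions; a negative value means the score decreased, i.e. improvement):

```python
def score_differences(earlier: dict, later: dict) -> dict:
    """Difference (later minus earlier) of each lung region's scoring result
    between two adjacent acquisition times; each non-zero entry would be shown
    as a graphical element on the later lung super score map."""
    return {region: later[region] - earlier[region]
            for region in later if region in earlier}
```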
For example, referring to fig. 7, the scoring results of the 4R and 4L lung regions in the April 5, 2018 lung super score map changed relative to those in the April 1, 2018 lung super score map: the scoring results of both lung regions decreased by 1. Accordingly, the same graphical element (not shown) may be displayed at the corresponding lung regions to characterize the decrease of 1 in the scoring result.
In one embodiment, when a selection of at least two of the lung super-score maps for comparative evaluation is received, at least two of the lung super-score maps and their corresponding lung ultrasound images are displayed in a plurality of windows of a display interface to allow a user to conduct comparative analysis of the results of two ultrasound diagnoses in combination with the lung super-score maps and the specific lung ultrasound images.
It should be noted that the user may select the two lung super score maps for comparative evaluation on a display interface showing multiple lung super score maps, or on two display interfaces each showing a single lung super score map.
Further, when an instruction to select at least two lung super score maps for comparative evaluation is received, at least two groups of windows are simultaneously displayed on the display interface, each group displaying one selected lung super score map and its corresponding lung ultrasound image. The user may click on any position of a lung super score map to select it for comparative evaluation, or may select a specific identification on a lung super score map to directly select the lung ultrasound image of the corresponding lung region for comparative evaluation.
For example, referring to fig. 7 and 8, when the lung super score maps of the two scoring sessions of April 1, 2018 and March 30, 2018 are selected in the display interface of fig. 7 for comparative evaluation, two groups of windows are displayed on the display interface: the first group, in the upper row, displays the April 1, 2018 lung super score map and the lung ultrasound image selected for comparative evaluation, here the image of the 1R lung region; the second group, in the lower row, displays the March 30, 2018 lung super score map and the lung ultrasound image of the same lung region, i.e., again the 1R lung region. As an example, the identification of the currently displayed lung ultrasound image may also be highlighted or enlarged in the lung super score map.
During comparative evaluation, when a selection is received to display the lung ultrasound image of a certain lung region of one of the lung super score maps, the lung ultrasound images at the corresponding lung region of the other lung super score map or maps under comparison are automatically displayed. For example, with continued reference to fig. 8, if the user changes the selected lung region from the 1R to the 1L lung region in the April 1, 2018 lung super score map, the first group of windows will display the lung ultrasound image of the 1L lung region of April 1, 2018; the processor 118 may further automatically change the selected lung region from 1R to 1L in the March 30, 2018 lung super score map, whereupon the second group of windows will display the lung ultrasound image of the 1L lung region of March 30, 2018. Such linked selection of lung regions further improves operational friendliness during comparative evaluation and reduces repetitive, unproductive work for the user.
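The linked selection across compared maps can be sketched as follows; the data layout (each map carrying an acquisition date and its images keyed by region) is an assumption for illustration:

```python
def linked_select(region: str, compared_maps: list) -> dict:
    """When the user selects a lung region in one map under comparison, select
    the same region in every compared map and return each map's image for it."""
    return {m["acquired"]: m["images"].get(region) for m in compared_maps}
```

Each group of windows would then display the image returned for its own acquisition date.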
In another example, the processor 118 may display a corresponding number of groups of windows based on the number of lung super score maps acquired; fig. 8 above shows the comparative evaluation in two groups of windows. For example, for the case of four lung super score maps in fig. 7, when an instruction for comparative evaluation of the lung super score maps is received, the display 120 may be controlled to display four groups of windows on the display interface, each group displaying one lung super score map and its corresponding lung ultrasound image.
In summary, the ultrasound image analysis method 200 of the embodiment of the present invention scores each lung region, and generates and displays the lung super-score map based on the scoring result, so as to intuitively display and compare the lung imaging result, thereby enabling the lung ultrasound examination to more satisfy the clinical requirement.
Next, an ultrasound image analysis method 900 according to another embodiment of the present application will be described with reference to fig. 9. As shown in fig. 9, the ultrasound image analysis method 900 may include the steps of:
in step S910, an ultrasound image of one or more regions of the object under test is acquired.
Wherein the object to be tested may be a human body, a fetus, or an animal, or may be a part to be examined of a human body, such as the chest or abdomen. When the object is a human body, the one or more regions include, for example, tissues and organs such as the heart, lungs, liver, gallbladder, spleen, and stomach. Specifically, the ultrasound imaging system shown in fig. 1 may be used to transmit ultrasound waves to the region to be measured of the object, and an ultrasound image of the region is obtained from the ultrasound echoes returned from that region.
In step S920, ultrasound signs in the ultrasound image of the one or more regions are identified.
Wherein the ultrasound images of different regions have different ultrasound signs. For example, ultrasound signs in lung ultrasound images include the B-line, lung consolidation, and pleural effusion; ultrasound signs of the intestinal tract include, for example, bowel dilatation, intraluminal gas accumulation, and the cockscomb and keyboard signs characterizing intestinal obstruction, or the pseudo-kidney sign and target ring sign characterizing intestinal tumors; ultrasound signs of the hepatobiliary region include, for example, the target sign characterizing a liver nodule, the anti-target sign, and the floating vessels sign characterizing liver lymphoma.
For ultrasound images of different regions, automatic recognition, manual recognition, or a combination of automatic and manual recognition may be employed to identify ultrasound signs therein. For example, neural network models may be separately trained for different regions for automatically identifying ultrasound signatures therein. The method for identifying the ultrasonic signs is not limited in the embodiment of the invention.
In step S930, the ultrasound images of the one or more regions are scored according to the ultrasound signs to generate a scoring result.
Wherein the scoring result can be used for representing the damage degree of each region of the tested object, for example, the higher the scoring is, the more serious the damage degree is. In the embodiment of the invention, the method of automatic, manual or a combination of automatic and manual can be adopted to score each area according to the ultrasonic symptoms.
At step S940, an ultrasound score map is displayed on a display interface, the ultrasound score map including a graphic of the object to be tested and identifications displayed at one or more regions of the graphic, each identification characterizing the scoring result of the corresponding region. The graphic of the object may be a structural drawing showing the appearance of the object, or another indicator graphic in corresponding proportion to the object. For example, when the object is a human lung, the graphic may be a lung graphic; when the object comprises multiple tissues/organs of the human body, the graphic may be a human body model diagram, on which structural drawings of the tissues/organs corresponding to the one or more regions are optionally displayed; when the object is a heart, the graphic may be a four-quadrant diagram corresponding to the four chambers of the heart.
Wherein the identification may include a graphic that characterizes the score of the scoring result in one or more of different colors, brightness, textures, texture densities, patterns, pattern densities, fill areas, or shapes.
In one embodiment, an ultrasound score map of a single scoring session is displayed on the display interface, which may take two forms. The first form is similar to the lung super score map shown in fig. 3: identifications characterizing the scoring results are displayed at the one or more regions of the graphic of the object. In this case, the ultrasound score map may serve as a playback navigation interface, i.e., when the identification of a region is clicked, the ultrasound image of the corresponding region is displayed on the display interface.
The second form of ultrasound score map is similar to the lung super score map shown in fig. 4: the identifications are displayed on the graphic of the object in synchronization with the ultrasound images. The graphic of the object depends on the type of object; for example, when the object is a human body, the graphic is a human shape or approximate human shape, and when the object is a specific tissue or organ, the graphic is an image of that tissue or organ. For example, when the object is a liver, the displayed image is a liver image.
In another embodiment, a plurality of windows are simultaneously displayed on the display interface, each window displaying a single scored ultrasound scoring graph. When the ultrasonic scoring graphs scored for a plurality of times are simultaneously displayed on the display interface, a user can simultaneously know scoring results of the patient at different stages, so that the change conditions of all the parts can be visually compared and evaluated.
Wherein the ultrasound score map displayed in each of the plurality of windows may be any of the above-described forms of ultrasound score maps, but preferably the ultrasound score maps of the plurality of windows are in the same form for comparison. Since the ultrasound score map of the first form described above is relatively simple and intuitive, in one embodiment, when a plurality of ultrasound score maps are displayed simultaneously for comparison, the ultrasound score map of the first form described above may be employed.
As an example, when a plurality of ultrasound score maps are simultaneously displayed on the display interface, the acquisition time of each ultrasound score map is displayed in its window, and the ultrasound score maps are arranged in order of acquisition time for convenient viewing.
In one embodiment, when a plurality of ultrasound score maps are simultaneously displayed on the display interface and an instruction to select the identification of a certain region in the ultrasound score map of one window is received, the identifications at the corresponding region in the ultrasound score maps of the other window or windows are automatically highlighted, making it easy for the user to compare the scores of that region at the respective time points.
In one embodiment, an identification indicating that the scoring result of a region has changed relative to the immediately preceding or following scoring may also be highlighted on the ultrasound score map of a given scoring. That is, when the scoring result of a certain region in one scoring differs from the scoring result at the time before or after that scoring, an identification indicating the change is displayed in the ultrasound score map generated from that scoring to prompt the user to pay attention to the region.
As an example, the indication that the scoring result has changed may be a graphic or text. The graphic or text may be fixed, i.e., it only indicates that the scoring result has changed, without indicating whether the score increased or decreased. Alternatively, different identifications can be used to indicate an increase or a decrease of the scoring result, respectively.
In one embodiment, the method 900 further comprises: obtaining the difference between the scoring results of each region at two adjacent acquisition times; and displaying the difference as a graphical element on the ultrasound score map in the window corresponding to the later time. The graphical element may use one or more of different colors, sizes, shapes, textures, patterns, fill areas, and pattern densities to characterize the magnitude of the difference. The graphical element characterizing the difference may be displayed in parallel with, or overlapping, the identification characterizing the scoring result.
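As a rough sketch of this difference computation, the following Python fragment compares the scores of each region at two adjacent acquisition times and maps the magnitude of each difference to a coarse glyph-size attribute. The region names, score values, and size mapping are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch: compute per-region score differences between two
# adjacent acquisitions and map each difference to a display attribute.

def score_differences(earlier: dict, later: dict) -> dict:
    """Return later-minus-earlier score for every region present in both maps."""
    return {region: later[region] - earlier[region]
            for region in earlier if region in later}

def element_size(diff: int) -> str:
    """Map the magnitude of a difference to a coarse glyph size."""
    magnitude = abs(diff)
    if magnitude == 0:
        return "none"      # no glyph: score unchanged
    return "small" if magnitude == 1 else "large"

# Illustrative scores for two adjacent acquisition times.
earlier = {"1R": 2, "2R": 1, "1L": 0}
later = {"1R": 3, "2R": 1, "1L": 2}
diffs = score_differences(earlier, later)
# diffs == {"1R": 1, "2R": 0, "1L": 2}
sizes = {region: element_size(d) for region, d in diffs.items()}
# sizes == {"1R": "small", "2R": "none", "1L": "large"}
```

A real implementation would render the glyphs in the window of the later acquisition; here only the attribute selection is shown.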
In one embodiment, when receiving a selection of at least two of the ultrasound score maps for comparative evaluation, at least two of the ultrasound score maps and their corresponding ultrasound images are displayed in a plurality of windows of a display interface to allow a user to perform comparative analysis on the results of two ultrasound diagnoses in combination with the ultrasound score maps and the specific ultrasound images.
It should be noted that, the user may select two ultrasound score maps on the display interface displaying a plurality of ultrasound score maps for comparison evaluation, or may select two ultrasound score maps on two display interfaces displaying a single ultrasound score map for comparison evaluation, respectively.
Further, when an instruction for selecting at least two ultrasonic scoring graphs for comparison and evaluation is received, at least two groups of windows are simultaneously displayed on a display interface, and a selected one of the ultrasonic scoring graphs and a corresponding ultrasonic image thereof are displayed in each group of windows. The user can click on any position of the ultrasonic scoring graph to select the corresponding ultrasonic scoring graph for comparison evaluation, or can select a specific identifier of the ultrasonic scoring graph to directly select an ultrasonic image of a corresponding region for comparison evaluation.
In another embodiment, when a selection of an ultrasound image of a region of one ultrasound score map is received for display, the ultrasound images at the corresponding regions of the other one or more ultrasound score maps are automatically displayed.
In summary, the ultrasound image analysis method 900 of the embodiment of the present invention scores each region of the measured object and generates an ultrasound score map from the scoring results for display, so that ultrasound diagnosis results can be intuitively displayed and compared and ultrasound diagnosis can better satisfy clinical requirements. The scheme for comparing and evaluating multiple scoring results makes reasonable use of the historical examination data of the same patient, so that a doctor can comprehensively understand the changes in the patient's condition over a period of time.
Next, an ultrasound imaging method 1000 according to still another embodiment of the present application will be described with reference to fig. 10. The ultrasonic imaging method can guide the current scanning of the tested object based on the historical scoring condition of the tested object. As shown in fig. 10, the ultrasound imaging method 1000 may include the steps of:
In step S1010, a historical ultrasound scoring graph of a measured object is obtained, the historical ultrasound scoring graph including a measured object graph and identifiers displayed in one or more regions of the measured object graph, the identifiers being used to characterize a historical score. The historical ultrasonic scoring graph can present the comprehensive scores of the tested object in the past period of time, can present the historical scores of the tested object in the last ultrasonic imaging examination, and can also present the scores of the tested object in the past period of time in parallel. Through the identification of one or more areas, a user can intuitively understand the history of each area and direct real-time scanning to be performed according to the history.
The identification may include a graphic. The graphic may characterize the score using one or more of different colors, brightnesses, textures, texture densities, patterns, pattern densities, fill areas, or shapes. For example, graphics with different colors can be displayed at each region of the measured object graphic according to the score, and the user can quickly read the score of the corresponding region from the color of the graphic. The ultrasound score map may display the identification characterizing the score in the manner of fig. 3; it may also, in the manner of fig. 4, simultaneously display in each region the ultrasound image corresponding to that region and the identification characterizing the score. For example, when the measured object is a human body, the one or more regions may be the thyroid, liver, gallbladder, kidney, spleen, ureter, and the like. The historical ultrasound score map may display the corresponding one or more historical scoring results in each region with color-differentiated identifications.
In step S1020, a scan indication for one or more regions is generated based on the scores of the historical scoring. The scan indication can be embedded into the scanning workflow, requiring the user to scan in real time according to the indication; it can also be provided as an interface prompt that reminds the user during the current real-time scan. In one embodiment, the scan order of the one or more regions may be generated based on the scores of the historical scoring. For example, a higher score generally represents more severe damage to a region, and the method may prompt the user to scan the regions in real time in order from higher score to lower score. In one embodiment, the processor compares the score of each region with a preset score threshold in the system; if the score exceeds the preset threshold, the region is considered one that the user currently needs to focus on, and such regions can be highlighted on the measured object graphic to guide the user to focus the scan on them. The preset threshold for each region may be the same or different. In one embodiment, the processor may analyze the change in the historical scores of the individual regions and set the scan order of one or more regions according to how much the score increased or how quickly it increased, or highlight as key regions those regions whose score increase exceeds a preset difference threshold or whose increase occurred in less than a preset time threshold.
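The ordering and threshold logic described for step S1020 could be sketched as follows. The region names, scores, and thresholds are illustrative assumptions; a real implementation would read them from the stored historical score map.

```python
# Hypothetical sketch of the scan-indication logic: order regions by historical
# score (higher score scanned first) and flag regions whose score exceeds a
# per-region preset threshold.

def scan_order(scores: dict) -> list:
    """Regions sorted from highest to lowest historical score."""
    return sorted(scores, key=scores.get, reverse=True)

def key_regions(scores: dict, thresholds: dict) -> set:
    """Regions whose score exceeds the preset threshold for that region."""
    return {r for r, s in scores.items() if s > thresholds.get(r, 0)}

historical = {"1R": 3, "2R": 0, "1L": 2, "2L": 1}
order = scan_order(historical)
# order == ["1R", "1L", "2L", "2R"]  -> scan most severely scored region first
focus = key_regions(historical, {"1R": 2, "1L": 2, "2R": 2, "2L": 2})
# focus == {"1R"}  -> highlight on the measured object graphic
```

The same pattern applies to the change-based variant: sort or flag by score increase instead of absolute score.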
In step S1030, ultrasound waves are transmitted to the one or more regions of the measured object for scanning according to the scan indication, so as to obtain ultrasound echo signals. As above, when the scan indication provides a scan order, ultrasound waves can be transmitted to the regions of the measured object sequentially in that order; when the scan indication identifies key regions, the corresponding key regions can be scanned preferentially or emphasized according to the prompt.
In step S1040, the ultrasound echo signals are processed to obtain a current ultrasound image of one or more regions.
The form of the historical ultrasound score map obtained in step S1010 may be similar to the two forms of lung super score map described in method 200 with reference to fig. 3 and fig. 4. For example, in the first form of ultrasound score map, an identification representing the score of the scoring result is displayed in each region of the measured object graphic, but no ultrasound image is displayed. In this form, the ultrasound score map may serve as a playback navigation interface: when a selection instruction for the identification of a certain region of the ultrasound score map is received, the ultrasound image corresponding to that region is displayed on the display interface.
In the second form of the historical ultrasound score map, the ultrasound score map may include ultrasound images of respective regions displayed at respective regions of the graphic of the object under test, the identification being displayed in synchronization with the ultrasound images. Specifically, the identification is displayed on the ultrasound image, or the identification is displayed in parallel with the ultrasound image.
In one embodiment, the scan order of the regions can be generated according to the order of the regions' scores in the historical ultrasound score map from high to low, i.e., regions with higher scores and more severe damage are scanned preferentially. Alternatively, the scan order can be generated according to the order of the scores from low to high. In another embodiment, regions whose scores have increased across multiple historical ultrasound score maps may also be scanned preferentially.
Illustratively, the ultrasonic probe 110 may be excited by the transmitting circuit 112 shown in fig. 1 to transmit ultrasonic waves to respective regions of a measured object, and receive ultrasonic echoes returned from the measured object through the receiving circuit 114 and the beam forming circuit 116 to obtain ultrasonic echo signals.
After that, in step S1040, the ultrasound echo signals are processed to obtain current ultrasound images of the respective regions. For example, the ultrasound echo signals may be processed by the processor 118 to obtain ultrasound images of various regions of the object under test.
In one embodiment, after acquiring the current ultrasound image of one or more regions, the ultrasound imaging method 1000 further comprises: identifying ultrasound signs of a current ultrasound image of the one or more regions; obtaining a current score for one or more regions from the ultrasound signature; and updating the historical ultrasonic scoring graph according to the current scores of the one or more areas to obtain a current ultrasonic scoring graph, wherein the current ultrasonic scoring graph comprises the tested object graph and identifiers of the one or more areas displayed on the tested object graph, and the identifiers are used for representing the current scores. Specific details of identifying ultrasound signatures, scoring from ultrasound signatures, and generating a current ultrasound scoring map from scores are described with respect to methods 200 and 900 and are not described in detail herein.
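The update step described above could be sketched as below; the region names and scores are illustrative assumptions. The set of changed regions is what an implementation could use to decide which identifications to highlight on the current ultrasound score map.

```python
# Hypothetical sketch: merge the current per-region scores into the historical
# score map and record which regions changed.

def update_score_map(historical: dict, current: dict) -> tuple:
    """Return the updated score map and the set of regions whose score changed."""
    changed = {r for r, s in current.items() if historical.get(r) != s}
    updated = {**historical, **current}   # current scores override history
    return updated, changed

historical = {"1R": 2, "2R": 1}
current = {"1R": 3, "2R": 1}
updated, changed = update_score_map(historical, current)
# updated == {"1R": 3, "2R": 1}; changed == {"1R"}
```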
In one embodiment, the identification of the region where the score changed may be highlighted on the current ultrasound score map when the current ultrasound score map is generated. That is, if the score of a certain region in the current ultrasound score map is changed from the score of the corresponding region in the history ultrasound score map acquired in step S1010, the identification of the region is highlighted.
In the ultrasound imaging method 1000 according to still another embodiment of the present application, guiding the scanning of the current ultrasound diagnosis based on the historical ultrasound scoring map can make the ultrasound diagnosis more targeted. For application scenes such as physical examination, the method is beneficial to reasonably utilizing historical physical examination data, and achieves the effects of long-term tracking and important attention.
Referring back now to fig. 1, embodiments of the present invention also provide an ultrasound imaging system 100, and the ultrasound imaging system 100 may be used to implement the above-described method 200, method 900, or method 1000. The ultrasound imaging system 100 may include some or all of the ultrasound probe 110, transmit circuitry 112, receive circuitry 114, beamforming circuitry 116, processor 118, display 120, transmit/receive selection switch 122, and memory 124, the relevant description of each of which may be found above.
Wherein the transmitting circuit 112 is used to excite the ultrasound probe 110 to transmit ultrasound waves to the target object. And a receiving circuit 114, configured to control the ultrasound probe 110 to receive the ultrasound echo returned from the target object, and obtain an ultrasound echo signal. The processor 118 is configured to: processing the ultrasonic echo signals to obtain a lung ultrasonic image; identifying ultrasound signs of each lung region in the ultrasound image of the lung; and scoring each lung region according to the ultrasonic symptoms to generate a scoring result. A memory 124 for storing programs executed by the processor 118. The display 120 is configured to display a lung super-score map on a display interface, the lung super-score map including a lung graphic and indicia displayed at each lung region of the lung graphic, the indicia being configured to characterize the scoring result for a corresponding lung region.
Wherein the processor 118 may perform steps S210 through S230 in the method 200 described above in connection with fig. 2, and the display 120 may perform step S240 in the method 200. Only the main functions of the ultrasound imaging system 100 are described below, and details that have been described above are omitted.
In one embodiment, the indicia in the lung super score graph displayed by display 120 includes graphics that characterize the score result as different colors, brightnesses, textures, texture densities, patterns, pattern densities, fill areas, and/or shapes.
Illustratively, the identification includes color patches or color boxes of different colors, each color patch or color box representing a score of the scoring result. Further, the identification may also include a score of the scoring result displayed in the color patch or color box.
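As a minimal sketch of such a score-to-color mapping, assuming a four-level scoring scheme (0-3): the palette and the compact label format are illustrative assumptions rather than ones specified by the disclosure.

```python
# Hypothetical palette: one color patch per score level; "gray" as a fallback.
SCORE_COLORS = {0: "green", 1: "yellow", 2: "orange", 3: "red"}

def patch_label(region: str, score: int, show_score: bool = True) -> str:
    """Compose the color patch for one region, optionally embedding the score."""
    color = SCORE_COLORS.get(score, "gray")
    return f"{region}: {color} ({score})" if show_score else f"{region}: {color}"

print(patch_label("1R", 3))                    # patch with score shown
print(patch_label("2L", 0, show_score=False))  # patch alone
```

With `show_score=True` the patch corresponds to the variant in which the score is displayed inside the color patch or color box.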
As an example, the embodiments of the present invention propose two forms of lung super score maps. A lung super score map, in the form of fig. 3, displays an identification on a lung graphic, where the lung super score map is a playback navigation interface, and when the processor 118 receives a selection instruction for the identification of a certain lung field of the lung super score map, the display 120 displays an ultrasound image of the lung corresponding to the lung field on the display interface. Wherein each lung field corresponds to one or more of the lung ultrasound images stored in memory 124.
Another form of lung super score graph is shown in fig. 4. The lung super-score map in this form includes lung ultrasound images of respective lung regions displayed at respective lung regions of the lung map, the identification being displayed in synchronism with the lung ultrasound images, the scoring results and the lung ultrasound images being viewable by a user simultaneously.
When the identification is displayed in synchronization with the lung ultrasound image, the identification may be displayed on the lung ultrasound image, or the identification may be displayed in parallel with the lung ultrasound image. As an example, the identification includes a corner mark or border with different colors, brightness, fill areas, patterns, shapes, or textures displayed on the lung ultrasound image.
In one embodiment, when the processor 118 receives a selection instruction for the lung ultrasound image of a certain lung region of the lung super score map, it controls the display 120 to display the selected lung ultrasound image in magnified form on the display interface for detailed viewing by the user.
In one embodiment, the display 120 is further configured to display an overall score for a plurality of lung fields on the lung super score map to enable the user to overview the overall condition of the lung.
The lung super score map displayed by the display 120 is exemplarily described above. The display 120 may display a single scored lung super score map, or the display 120 may simultaneously display multiple windows on the display interface, each window displaying a single scored lung super score map, as shown in fig. 7.
When the display 120 simultaneously displays multiple windows with one lung super score map in each window, in one embodiment, when the processor 118 receives a selection of the identification of a lung region of the lung super score map in one window, it controls the display 120 to automatically highlight the identification at the corresponding lung region of the lung super score map in the other window or windows for comparative analysis by the user.
As an example, the processor 118 is further configured to control the display 120 to display the time of acquisition of the respective lung super score map in each of the windows. The processor 118 may further control the display 120 to sequence the lung super score maps in order of the acquisition time for convenient viewing by the user.
In one embodiment, the processor 118 is further configured to control the display 120 to perform the following function: highlighting, on the lung super score map of a given scoring, an identification indicating that the scoring result of a certain lung field has changed relative to the time before or after that scoring. That is, when the scoring result of a certain lung field in one scoring differs from the scoring result at the time before or after that scoring, an identification indicating the change is displayed in the lung super score map generated from that scoring to prompt the user to pay attention to that lung field.
As an example, the indication of the change in the scoring result may be a graphic, which may be fixed, i.e., used only to indicate that the scoring result has changed. Alternatively, different graphics may be used to indicate a score increase or decrease, respectively.
In one embodiment, upon receiving a selection of at least two of the lung super-score maps for comparative evaluation, the processor 118 controls the display 120 to display the at least two of the lung super-score maps and their corresponding ultrasound images of the lungs in the plurality of windows of the display interface.
Further, at this point the processor 118 may control the display 120 to simultaneously display at least two sets of windows on the display interface and to display a selected one of the lung super score maps and its corresponding lung ultrasound image in each set of windows. As shown in fig. 8, if the user selects the 1R lung field in the two lung super score maps of fig. 7, scored on April 1, 2018 and March 30, 2018, for comparative evaluation, the processor 118 controls the display 120 to display the lung super score maps of April 1, 2018 and March 30, 2018, together with the lung ultrasound images of the 1R lung field therein, on the display interface.
In another embodiment, upon receiving a selection of a lung ultrasound image of a lung region of one lung super score map for display during the comparative evaluation, the processor 118 is further configured to control the display 120 to automatically display the lung ultrasound images at the corresponding lung region of the other one or more lung super score maps for the comparative evaluation. For example, when the user selects the lung ultrasound image of the 1R lung field of April 1, 2018 in fig. 8 for display, the processor 118 may control the display 120 to display, in another set of windows, the lung super score map of March 30, 2018 together with the lung ultrasound image of its 1R lung field.
Furthermore, according to an embodiment of the present application, there is also provided a computer storage medium on which program instructions are stored for performing the respective steps of the ultrasound image analysis method 200, 900 and the ultrasound imaging method 1000 of an embodiment of the present application when the program instructions are executed by a computer or a processor (such as the aforementioned processor 103 or 920). The storage medium may include, for example, a memory card of a smart phone, a memory component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the foregoing storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
Furthermore, according to an embodiment of the present application, there is also provided a computer program, which may be stored on a cloud or local storage medium. Which when executed by a computer or processor is adapted to carry out the respective steps of the ultrasound image analysis method of an embodiment of the present application.
Based on the above description, the ultrasound image analysis method, ultrasound imaging system, and computer storage medium according to the embodiments of the present application score each region of the target object and generate an ultrasound score map from the scoring results. The ultrasound score map can intuitively display and compare the scoring results, so that ultrasound diagnosis can better satisfy clinical requirements. In addition, the ultrasound score map may also be used for follow-up ultrasound diagnosis.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the application. All such changes and modifications are intended to be included within the scope of the present application as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in order to streamline the application and aid in understanding one or more of the various inventive aspects, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments of the application. However, the method of the present application should not be construed as reflecting the following intent: i.e., the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some of the modules according to embodiments of the present application may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present application can also be implemented as an apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names.
The foregoing description is merely illustrative of specific embodiments of the present application and the scope of the present application is not limited thereto, and any person skilled in the art can easily think about variations or substitutions within the scope of the present application. The protection scope of the application is subject to the protection scope of the claims.

Claims (67)

1. A method of ultrasound image analysis, the method comprising:
Acquiring a lung ultrasonic image;
Identifying ultrasound signs of each lung region in the ultrasound image of the lung;
scoring the individual lung fields according to the ultrasound signatures to generate a scoring result;
and displaying a lung super-score graph on a display interface, wherein the lung super-score graph comprises a lung graph and identifiers displayed at each lung region of the lung graph, and the identifiers are used for representing the scoring results of the corresponding lung regions.
2. The method of claim 1, wherein the indicia comprises a graphic that characterizes the scoring result as a different color, brightness, texture density, pattern density, fill area, and/or shape.
3. The method of claim 1, wherein the indicia comprises color patches or color boxes of different colors, each color patch or color box representing a score of the scoring result.
4. A method according to claim 3, wherein the identification further comprises a score of the scoring result displayed in the color patch or color box.
5. The method of one of claims 1-4, wherein the lung super score map is a playback navigation interface, the method further comprising:
and when a selection instruction of the identification of a certain lung area of the lung super score graph is received, displaying a lung ultrasonic image corresponding to the lung area on the display interface.
6. The method of claim 5, wherein each lung region corresponds to one or more frames of the ultrasound image of the lung.
7. The method of claim 5, wherein the method further comprises: displaying, on the lung ultrasound image, a scoring chart of multiple scoring results at the selected lung region.
8. The method of claim 7, wherein the scoring result on the scoring graph has a mapping relationship with the ultrasound image of the lung corresponding to the selected lung region, the method further comprising:
And when a selection instruction of the scoring result on the scoring chart is received, displaying a lung ultrasonic image corresponding to the scoring result on the display interface.
9. The method of claim 7, wherein the method further comprises: and highlighting the scoring result of the currently displayed lung ultrasonic image in the scoring chart.
10. The method of claim 1, wherein the lung super-score map comprises a lung ultrasound image of a respective lung region displayed at each lung region of the lung map, the identification being displayed in synchronization with the lung ultrasound image.
11. The method of claim 10, wherein the marker is displayed on the ultrasound lung image or the marker is displayed in juxtaposition to the ultrasound lung image.
12. The method of claim 10, wherein the identification comprises a corner mark or border of different colors, brightness, fill area, patterns, shapes, or textures displayed on the ultrasound image of the lung.
13. The method of claim 10, wherein the lung super-score map further comprises a score, a number of B-lines, and/or a percentage of B-line coverage of a scoring result displayed on the ultrasound image of the lung.
14. The method according to claim 10, wherein the method further comprises:
and when a selection instruction of the lung ultrasonic image of a certain lung area of the lung super score graph is received, magnifying and displaying the selected lung ultrasonic image on the display interface.
15. The method according to claim 1, wherein the method further comprises: displaying the total score of the plurality of lung regions on the lung super score map.
16. The method according to any one of claims 1 to 15, wherein the method further comprises: outputting an ultrasound report, the ultrasound report comprising the lung super score map.
17. The method of any one of claims 1-16, wherein displaying the lung super score map on the display interface comprises:
and displaying a single-scoring lung super-scoring graph on the display interface.
18. The method of any one of claims 1-16, wherein displaying the lung super score map on the display interface comprises:
simultaneously displaying a plurality of windows on the display interface, each window displaying the lung super score map of one scoring.
19. The method of claim 18, wherein the method further comprises: when a selection of the identification of a certain lung region of the lung super score map in one of the windows is received, automatically highlighting the identification at the corresponding lung region of the lung super score map in the other window or windows.
20. The method as recited in claim 18, further comprising: displaying the acquisition time of the lung super score map in each window.
21. The method as recited in claim 20, further comprising: highlighting, on the lung super score map of one scoring, an indication that the scoring result of a certain lung region has changed relative to the scoring at a preceding or following time.
22. The method as recited in claim 20, further comprising:
obtaining a difference between the scoring results of each lung region at two adjacent acquisition times;
displaying the difference as a graphical element on the lung super score map in the window of the lung super score map corresponding to the later time.
23. The method of claim 22, wherein the graphical elements have different colors, sizes, shapes, textures, patterns, fill areas, and/or pattern densities to characterize the magnitude of the difference.
24. The method of claim 18, wherein displaying the lung super score map on the display interface comprises: simultaneously displaying four windows on the display interface.
25. The method of claim 18, wherein the method further comprises: when a selection of at least two lung super score maps for comparative evaluation is received, displaying the at least two lung super score maps and their corresponding lung ultrasound images in a plurality of windows of the display interface.
26. The method of any one of claims 1-16, wherein displaying the lung super score map on the display interface when a selection of at least two of the lung super score maps for comparative evaluation is received comprises:
simultaneously displaying at least two groups of windows on the display interface, and displaying, in each group of windows, a selected one of the lung super score maps and its corresponding lung ultrasound image.
27. The method according to claim 25 or 26, wherein the method further comprises: when a lung ultrasound image of a certain lung region of a selected one of the lung super score maps is selected for display, automatically displaying the lung ultrasound images at the corresponding lung regions of the other one or more lung super score maps.
28. The method of claim 1, wherein the acquiring a lung ultrasound image comprises: acquiring the lung ultrasound image in real time or reading it from a storage medium.
29. The method of claim 1, wherein the identifying ultrasound signs in the ultrasound image of the lung comprises: automatic identification, manual identification, or a combination of automatic and manual identification.
30. The method of claim 29, wherein the automatic identification comprises: inputting the lung ultrasound image into a trained neural network model and outputting an identification result of the ultrasound signs.
31. An ultrasonic image analysis method, comprising:
acquiring ultrasound images of one or more regions of an object under test;
identifying ultrasound signs in the ultrasound images of the one or more regions;
scoring the ultrasound images of the one or more regions according to the ultrasound signs to generate scoring results;
displaying an ultrasound score map on a display interface, the ultrasound score map comprising a graphic of the object under test and identifications of the one or more regions displayed on the graphic of the object under test, the identifications characterizing the scoring results of the corresponding regions.
32. The method of claim 31, wherein the identification comprises a graphic that characterizes the scoring result by different colors, brightnesses, texture densities, pattern densities, fill areas, and/or shapes.
33. The method of claim 31, wherein the ultrasound score map is a playback navigation interface, and the method further comprises:
when a selection instruction for the identification of a certain region of the ultrasound score map is received, displaying an ultrasound image corresponding to the region on the display interface.
34. The method of claim 31, wherein the ultrasound score map comprises an ultrasound image of a respective region displayed at the one or more regions of the graphic of the object under test, the identification being displayed in synchronization with the ultrasound image.
35. The method of claim 34, wherein the identification is displayed on the ultrasound image or the identification is displayed in juxtaposition to the ultrasound image.
36. The method of claim 35, wherein the identification comprises a corner mark or border having a different color, brightness, fill area, pattern, shape, or texture displayed on the ultrasound image.
37. The method of any one of claims 31-36, wherein displaying the ultrasound score map on the display interface comprises:
displaying an ultrasound score map of a single scoring on the display interface.
38. The method of any one of claims 31-36, wherein displaying the ultrasound score map on the display interface comprises:
simultaneously displaying a plurality of windows on the display interface, each window displaying the ultrasound score map of the same region.
39. The method as recited in claim 38, further comprising: displaying the acquisition time of the ultrasound score map in each window, and arranging the ultrasound score maps in order of acquisition time.
40. The method as recited in claim 39, further comprising: highlighting, on the ultrasound score map of one scoring, an indication that the scoring result of a certain region has changed relative to the scoring at a preceding or following time.
41. The method as recited in claim 39, further comprising:
obtaining a difference between the scoring results of the one or more regions at two adjacent acquisition times;
displaying the difference as a graphical element on the ultrasound score map in the window of the ultrasound score map corresponding to the later time.
42. The method of claim 38, comprising displaying the ultrasound score map of the current scores of the plurality of regions in one of the windows, and displaying the ultrasound score map of one or more historical scores of the corresponding regions in the remaining one or more of the plurality of windows.
43. The method as recited in claim 42, further comprising: generating a trend graph according to the current scores and the historical scores.
44. The method of claim 43, further comprising: outputting an ultrasound report, the ultrasound report comprising the trend graph and/or the difference between the scoring results of each region at two adjacent acquisition times.
45. The method of claim 38, wherein the method further comprises: when at least two ultrasound score maps are selected for comparative evaluation, displaying the at least two ultrasound score maps and their corresponding ultrasound images in the plurality of windows of the display interface.
46. The method according to claim 45, wherein the method further comprises: when an ultrasound image of a certain region of one of the at least two ultrasound score maps under comparative evaluation is selected for display, automatically displaying the ultrasound images at the corresponding regions of the other one or more ultrasound score maps.
47. An ultrasound imaging system, comprising:
an ultrasound probe;
a transmitting circuit for exciting the ultrasound probe to transmit ultrasound waves to a target object;
a receiving circuit for controlling the ultrasound probe to receive ultrasound echoes returned from the target object to obtain ultrasound echo signals;
a processor for:
processing the ultrasonic echo signals to obtain a lung ultrasonic image;
identifying ultrasound signs of one or more lung regions in the ultrasound image of the lung;
scoring the one or more lung regions according to the ultrasound signs to generate a scoring result;
a memory for storing a program executed by the processor;
a display for displaying a lung super score map on a display interface, the lung super score map comprising a lung graphic and identifications displayed at each lung region of the lung graphic, the identifications characterizing the scoring result of the corresponding lung region.
48. The ultrasound imaging system of claim 47, wherein the identification comprises a graphic that characterizes the scoring result by a number of different colors, brightnesses, textures, texture densities, patterns, pattern densities, fill areas, and/or shapes.
49. The ultrasound imaging system of claim 47, wherein the identification comprises color patches or color boxes of different colors, each color patch or color box representing a score of the scoring result.
50. The ultrasound imaging system of claim 49, wherein the identification further comprises a score of the scoring result displayed in the color patch or color box.
51. The ultrasound imaging system of claim 47, wherein the identification comprises a cover layer superimposed on each of the one or more lung regions, the cover layer characterizing the scoring result by different colors, brightnesses, texture densities, pattern densities, and/or fill areas.
52. The ultrasound imaging system of any of claims 47-51, wherein the lung super score map is a playback navigation interface, and when the processor receives a selection instruction for the identification of a certain lung region of the lung super score map, the display displays the lung ultrasound image corresponding to that lung region on the display interface.
53. The ultrasound imaging system of claim 52, wherein each of the one or more lung regions correspondingly stores one or more frames of the lung ultrasound image.
54. The ultrasound imaging system of claim 47, wherein the lung super score map comprises lung ultrasound images of respective lung regions displayed at the one or more lung regions of the lung graphic, the identification being displayed in synchronization with the lung ultrasound images.
55. The ultrasound imaging system of claim 54, wherein the identification is displayed on the lung ultrasound image, or the identification is displayed in juxtaposition to the lung ultrasound image, or the lung ultrasound image is displayed on the identification.
56. The ultrasound imaging system of claim 55, wherein the identification comprises a corner mark or border with different colors, brightnesses, fill areas, patterns, shapes, or textures displayed on the lung ultrasound image.
57. The ultrasound imaging system of claim 54, wherein when the processor receives a selection instruction for a lung ultrasound image of a lung region of the lung super score map, the processor causes the display to magnify the selected lung ultrasound image on the display interface.
58. The ultrasound imaging system of claim 47, wherein the display is further configured to: display the total score of the one or more lung regions on the lung super score map.
59. The ultrasound imaging system of any of claims 47-58, wherein said displaying a lung super score map on a display interface comprises:
simultaneously displaying a plurality of windows on the display interface, each window displaying the lung super score map of one scoring.
60. The ultrasound imaging system of claim 59, wherein the processor is further configured to: when a selection of the identification of a certain lung region of the lung super score map in one of the windows is received, control the display to automatically highlight the identification at the corresponding lung region of the lung super score map in the other window or windows.
61. The ultrasound imaging system of claim 59, wherein the processor is further configured to control the display to: display the acquisition time of the lung super score map in each window, and arrange the lung super score maps in order of acquisition time.
62. The ultrasound imaging system of claim 61, wherein the processor is further configured to control the display to: highlight, on the lung super score map of one scoring, an indication that the scoring result of a certain lung region has changed relative to the scoring at a preceding or following time.
63. The ultrasound imaging system of claim 59, wherein when a selection of at least two of the lung super score maps for comparative evaluation is received, the processor controls the display to display the at least two lung super score maps and their corresponding lung ultrasound images in the plurality of windows of the display interface.
64. The ultrasound imaging system of any of claims 47-58, wherein said displaying a lung super score map on a display interface, when a selection of at least two of said lung super score maps for comparative evaluation is received, comprises:
simultaneously displaying at least two groups of windows on the display interface, and displaying, in each group of windows, a selected one of the lung super score maps and its corresponding lung ultrasound image.
65. The ultrasound imaging system of claim 63 or 64, wherein when a lung ultrasound image of a certain lung region of one of the at least two lung super score maps selected for comparative evaluation is selected for display, the processor is further configured to control the display to: automatically display the lung ultrasound images at the corresponding lung regions of the other one or more lung super score maps.
66. An ultrasound imaging system, comprising:
an ultrasound probe;
a transmitting circuit for exciting the ultrasound probe to transmit ultrasound waves to one or more regions of an object under test;
a receiving circuit for controlling the ultrasound probe to receive ultrasound echoes returned from the one or more regions to obtain ultrasound echo signals;
a processor for:
processing the ultrasound echo signals to obtain ultrasound images of the one or more regions;
identifying ultrasound signs in the ultrasound images of the one or more regions;
scoring the ultrasound images of the one or more regions according to the ultrasound signs to generate scoring results;
a memory for storing a program executed by the processor;
a display for displaying an ultrasound score map on a display interface, the ultrasound score map comprising a graphic of the object under test and identifications of the one or more regions displayed on the graphic of the object under test, the identifications characterizing the scoring results of the corresponding regions.
67. A computer storage medium having stored thereon a computer program which, when executed by a computer or processor, performs the steps of the method of any one of claims 1 to 46.
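The scoring logic recited above — mapping the ultrasound signs identified in each lung region to a score (claim 1), totaling the scores (claim 15), and computing the per-region difference between two adjacent acquisition times for display in the later window (claims 22 and 41) — can be sketched as follows. This is an illustrative sketch only: the 0-3 scoring rule is borrowed from a common lung-ultrasound aeration convention and is not specified by the claims, and every function, parameter, and region name here is hypothetical.

```python
# Illustrative sketch of per-region scoring and score-difference computation.
# The 0-3 rule and all names are assumptions, not taken from the patent.

def score_region(b_line_count: int, consolidation: bool) -> int:
    """Map identified ultrasound signs in one lung region to a 0-3 score."""
    if consolidation:          # consolidated tissue: worst score
        return 3
    if b_line_count >= 3:      # multiple B-lines: moderate aeration loss
        return 2
    if b_line_count >= 1:      # isolated B-lines: mild aeration loss
        return 1
    return 0                   # A-lines only: normal aeration


def score_map(regions: dict) -> dict:
    """Build a score map {region name: score} for one acquisition."""
    return {name: score_region(*signs) for name, signs in regions.items()}


def score_diff(earlier: dict, later: dict) -> dict:
    """Per-region difference between two adjacent acquisition times,
    shown as a graphical element in the window of the later time."""
    return {name: later[name] - earlier[name] for name in later}


# Two acquisitions: region -> (B-line count, consolidation present).
t0 = score_map({"L1": (0, False), "L2": (4, False), "R1": (1, True)})
t1 = score_map({"L1": (2, False), "L2": (1, False), "R1": (0, False)})

total_t1 = sum(t1.values())    # total score displayed on the map
diff = score_diff(t0, t1)      # e.g. negative values mark improved regions
```

Under this assumed convention a higher score means greater aeration loss, so a negative difference marks a region that improved between the two acquisitions; the magnitude of the difference is what claims 23 and similar dependent claims propose encoding in the graphical element's color, size, or fill.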
CN202410669022.0A 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium Pending CN118542692A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410669022.0A CN118542692A (en) 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202410669022.0A CN118542692A (en) 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium
PCT/CN2019/115420 WO2021087687A1 (en) 2019-11-04 2019-11-04 Ultrasonic image analyzing method, ultrasonic imaging system and computer storage medium
CN201980100370.3A CN114375179B (en) 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201980100370.3A Division CN114375179B (en) 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium

Publications (1)

Publication Number Publication Date
CN118542692A true CN118542692A (en) 2024-08-27

Family

ID=75848722

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410669022.0A Pending CN118542692A (en) 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium
CN201980100370.3A Active CN114375179B (en) 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201980100370.3A Active CN114375179B (en) 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium

Country Status (2)

Country Link
CN (2) CN118542692A (en)
WO (1) WO2021087687A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113763353A (en) * 2021-09-06 2021-12-07 杭州类脑科技有限公司 Lung ultrasonic image detection system
CN116521912B (en) * 2023-07-04 2023-10-27 广东恒腾科技有限公司 Ultrasonic data storage management system and method based on artificial intelligence

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101162599B1 (en) * 2010-08-18 2012-07-05 인하대학교 산학협력단 An automatic detection method of Cardiac Cardiomegaly through chest radiograph analyses and the recording medium thereof
JP5600285B2 (en) * 2010-11-16 2014-10-01 日立アロカメディカル株式会社 Ultrasonic image processing device
CN103778600B (en) * 2012-10-25 2019-02-19 北京三星通信技术研究有限公司 Image processing system
DK2973402T3 (en) * 2013-03-13 2019-09-16 Fdna Inc SYSTEMS, PROCEDURES AND COMPUTER READABLE MEDIA TO IDENTIFY WHEN IT IS LIKELY TO HAVE AN INDIVIDUAL IN A MEDICAL CONDITION
JP5924296B2 (en) * 2013-03-19 2016-05-25 コニカミノルタ株式会社 Ultrasound diagnostic imaging equipment
JP2015061592A (en) * 2013-08-21 2015-04-02 コニカミノルタ株式会社 Ultrasonic diagnostic equipment, ultrasonic image processing method, and computer-readable non-temporary recording medium
US20160206291A1 (en) * 2015-01-16 2016-07-21 General Electric Company Live ultrasound image and historical ultrasound image frame overlapping
US20180125446A1 (en) * 2015-06-04 2018-05-10 Koninklijke Philips N.V. System and method for precision diagnosis and therapy augmented by cancer grade maps
US20170086790A1 (en) * 2015-09-29 2017-03-30 General Electric Company Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan
US10667793B2 (en) * 2015-09-29 2020-06-02 General Electric Company Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting B lines and scoring images of an ultrasound scan
US20170322684A1 (en) * 2016-05-03 2017-11-09 Siemens Healthcare Gmbh Automation Of Clinical Scoring For Decision Support
JP6841907B2 (en) * 2016-09-29 2021-03-10 ゼネラル・エレクトリック・カンパニイ Methods, systems and non-transient computer-readable media for improved visualization and selection of representative ultrasound images by automatically detecting B-lines and scoring ultrasound scan images.
EP3482689A1 (en) * 2017-11-13 2019-05-15 Koninklijke Philips N.V. Detection, presentation and reporting of b-lines in lung ultrasound
CN107563123A (en) * 2017-09-27 2018-01-09 百度在线网络技术(北京)有限公司 Method and apparatus for marking medical image
CN107616812A (en) * 2017-10-27 2018-01-23 飞依诺科技(苏州)有限公司 The rapid comparison method and system of ultrasonoscopy during ultrasonic scanning in real time
BR112020009982A2 (en) * 2017-11-22 2020-11-03 Koninklijke Philips N.V. ultrasound system, ultrasound imaging system, non-transitory computer-readable method and media
CN108573490B (en) * 2018-04-25 2020-06-05 王成彦 Intelligent film reading system for tumor image data
CN108846840B (en) * 2018-06-26 2021-11-09 张茂 Lung ultrasonic image analysis method and device, electronic equipment and readable storage medium
CN109727243A (en) * 2018-12-29 2019-05-07 无锡祥生医疗科技股份有限公司 Breast ultrasound image recognition analysis method and system
CN110269641B (en) * 2019-06-21 2022-09-30 深圳开立生物医疗科技股份有限公司 Ultrasonic imaging auxiliary guiding method, system, equipment and storage medium

Also Published As

Publication number Publication date
WO2021087687A1 (en) 2021-05-14
CN114375179A (en) 2022-04-19
CN114375179B (en) 2024-07-23

Similar Documents

Publication Publication Date Title
EP3518771B1 (en) Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan
JP4476400B2 (en) Ultrasonic diagnostic equipment
TWI473598B (en) Breast ultrasound image scanning and diagnostic assistance system
US20170086790A1 (en) Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan
US10799215B2 (en) Ultrasound systems, methods and apparatus for associating detection information of the same
US9342922B2 (en) Medical imaging apparatus and method of constructing medical images
CN105828723B (en) Ultrasound imaging assembly and method for displaying ultrasound images
CN104334086A (en) Method for setting regions of interest and ultrasonic diagnostic device
KR20140024190A (en) Method for managing and displaying ultrasound image, and apparatus thereto
RU2662868C2 (en) Support apparatus for supporting user in diagnosis process
CN111973220B (en) Method and system for ultrasound imaging of multiple anatomical regions
CN114375179B (en) Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium
CN111214254A (en) Ultrasonic diagnostic equipment and section ultrasonic image acquisition method and readable storage medium thereof
CN112842394A (en) Ultrasonic imaging system, ultrasonic imaging method and storage medium
CN111493932A (en) Ultrasonic imaging method and system
CN112568933B (en) Ultrasonic imaging method, apparatus and storage medium
US8900144B2 (en) Diagnosis apparatus and method of operating the same
CN114007513A (en) Ultrasonic imaging equipment, method and device for detecting B line and storage medium
CN112294360A (en) Ultrasonic imaging method and device
CN114680937A (en) Mammary gland ultrasonic scanning method, mammary gland machine and storage medium
CN113545806A (en) Prostate elastography method and ultrasound elastography system
KR101611450B1 (en) Method and device for guiding measure in ultrasound diagnosis apparatus, storage medium thereof
US20210251608A1 (en) Ultrasound image processing device and method, and computer-readable storage medium
WO2022040878A1 (en) Ultrasound imaging system and ultrasound image analysis method
KR101511501B1 (en) Ultrasound system for measuring thickness of image and method for operating ultrasound system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination