CN114375179A - Ultrasonic image analysis method, ultrasonic imaging system, and computer storage medium - Google Patents


Info

Publication number
CN114375179A
Authority
CN
China
Prior art keywords
lung
ultrasound
scoring
score
ultrasonic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980100370.3A
Other languages
Chinese (zh)
Inventor
王勃
刘硕
黄云霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd filed Critical Shenzhen Mindray Bio Medical Electronics Co Ltd
Publication of CN114375179A publication Critical patent/CN114375179A/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Abstract

The application provides an ultrasound image analysis method, an ultrasound imaging system, and a computer storage medium. The ultrasound image analysis method includes: acquiring a lung ultrasound image; identifying ultrasound signs for respective lung regions in the lung ultrasound image; scoring the respective lung regions according to the ultrasound signs to generate scoring results; and displaying a lung ultrasound score map on a display interface, where the lung ultrasound score map includes a lung graphic and identifiers displayed at each lung region of the lung graphic, the identifiers representing the scoring result of the corresponding lung region. By scoring each lung region and generating and displaying a lung ultrasound score map based on the scoring results, this ultrasound image analysis scheme allows lung diagnosis results to be displayed and compared visually, so that ultrasound diagnosis can better meet clinical requirements.

Description

Ultrasonic image analysis method, ultrasonic imaging system, and computer storage medium
Description
Technical Field
The present application relates to the field of ultrasound imaging technology, and more particularly, to an ultrasound image analysis method, an ultrasound imaging system, and a computer storage medium.
Background
In modern medical image examination, ultrasound has become one of the most widely and frequently used examination modalities owing to its reliability, speed, convenience, real-time imaging, and repeatability. The development of new ultrasound technologies further promotes the application of ultrasound image examination in clinical diagnosis and treatment.
In recent years, in fields such as critical and emergency care, lung ultrasound imaging (lung ultrasound for short) has been increasingly widely applied and studied, as identifying ultrasound signs assists rapid diagnosis. Rapidly and quantitatively evaluating the degree of lung aeration from ultrasound signs, and displaying and communicating the evaluation results, is of growing clinical significance. However, the traditional approach relies on manual statistics and evaluation, which is time-consuming and labor-intensive and cannot visually display the evaluation results, seriously hindering the adoption of lung ultrasound in critical and emergency care.
Disclosure of Invention
In this summary, concepts in a simplified form are introduced that are further described in the detailed description. This summary of the invention is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
A first aspect of an embodiment of the present invention provides an ultrasound image analysis method, where the method includes:
acquiring a lung ultrasonic image;
identifying ultrasound signs for respective lung regions in the lung ultrasound image;
scoring the respective lung regions according to the ultrasound signs to generate scoring results;
displaying a lung ultrasound score map on a display interface, wherein the lung ultrasound score map includes a lung graphic and identifiers displayed at each lung region of the lung graphic, the identifiers representing the scoring result of the corresponding lung region.
A second aspect of the embodiments of the present invention provides an ultrasound image analysis method, where the method includes:
acquiring ultrasonic images of one or more areas of a measured object;
identifying ultrasound signs in the ultrasound images of the one or more regions;
scoring the ultrasound images of the one or more regions according to the ultrasound signs to generate scoring results;
displaying an ultrasound score map on a display interface, wherein the ultrasound score map includes a graphic of the measured object and identifiers displayed at respective regions of the graphic, the identifiers representing the scoring results of the corresponding regions.
A third aspect of embodiments of the present invention provides an ultrasound imaging method, including:
acquiring a historical ultrasound score map of a measured object, wherein the historical ultrasound score map includes a graphic of the measured object and identifiers displayed in one or more regions of the graphic, the identifiers representing historical scores;
generating a scanning indication for the one or more regions according to the historical scores;
sending ultrasonic waves to one or more areas of the measured object according to the scanning indication to perform scanning so as to obtain ultrasonic echo signals; and
processing the ultrasound echo signals to obtain current ultrasound images of the one or more regions.
A fourth aspect of an embodiment of the present invention provides an ultrasound imaging system, including:
an ultrasonic probe;
the transmitting/receiving control circuit is used for exciting the ultrasonic probe to transmit ultrasonic waves to a target object and controlling the ultrasonic probe to receive ultrasonic echoes returned from the target object to obtain ultrasonic echo signals;
a memory for storing a program executed by the processor;
a processor to:
processing the ultrasonic echo signal to obtain a lung ultrasonic image;
identifying ultrasound signs of one or more lung regions in the lung ultrasound image;
scoring the one or more lung regions according to the ultrasound signs to generate scoring results;
a display for displaying a lung ultrasound score map on a display interface, wherein the lung ultrasound score map includes a lung graphic and identifiers displayed at each lung region of the lung graphic, the identifiers representing the scoring result of the corresponding lung region.
A fifth aspect of the embodiments of the present invention provides a computer storage medium having a computer program stored thereon, where the computer program is executed by a computer or a processor to implement the steps of the ultrasound image analysis method.
According to the ultrasound image analysis methods, the ultrasound imaging system and method, and the computer storage medium above, each lung region is scored, and a lung ultrasound score map is generated and displayed based on the scoring results, so that lung imaging results can be displayed and compared visually and ultrasound imaging can better meet clinical requirements.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below represent only some embodiments of the present invention; those skilled in the art can obtain other drawings based on them without inventive labor.
In the drawings:
FIG. 1 shows a schematic block diagram of an ultrasound imaging system according to an embodiment of the present invention;
FIG. 2 shows a schematic flow diagram of a method of ultrasound image analysis in accordance with an embodiment of the present invention;
FIG. 3a shows a lung ultrasound score map according to an embodiment of the invention;
FIG. 3b shows a lung ultrasound score map according to an embodiment of the invention;
FIG. 4 shows another lung ultrasound score map according to an embodiment of the invention;
FIG. 5 shows a schematic diagram of a lung ultrasound report according to an embodiment of the invention;
FIG. 6 shows a schematic view of a lung ultrasound image viewed from a first form of the lung ultrasound score map, according to an embodiment of the invention;
FIG. 7 shows a schematic diagram of a plurality of lung ultrasound score maps displayed on a display interface, according to an embodiment of the invention;
FIG. 8 shows a schematic diagram of selecting a plurality of lung ultrasound score maps for comparative evaluation, according to an embodiment of the invention;
FIG. 9 shows a schematic flow diagram of a method of ultrasound image analysis in accordance with another embodiment of the present invention;
fig. 10 shows a schematic flow diagram of an ultrasound imaging method according to a further embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention more apparent, exemplary embodiments of the present invention are described in detail below with reference to the accompanying drawings. The described embodiments are merely a subset of the embodiments of the invention, not all of them, and the invention is not limited to the example embodiments described herein. All other embodiments obtained by a person skilled in the art from the embodiments described herein without inventive effort shall fall within the scope of protection of the invention.
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present invention. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without one or more of these specific details. In other instances, well-known features have not been described in order to avoid obscuring the invention.
It is to be understood that the present invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
In order to provide a thorough understanding of the present invention, detailed steps and detailed structures will be set forth in the following description in order to explain the present invention. The following detailed description of the preferred embodiments of the invention, however, the invention is capable of other embodiments in addition to those detailed.
Next, an ultrasound imaging system according to an embodiment of the present invention is first described with reference to fig. 1, and fig. 1 shows a schematic structural block diagram of an ultrasound imaging system 100 according to an embodiment of the present invention.
As shown in fig. 1, the ultrasound imaging system 100 includes an ultrasound probe 110, a transmit circuit 112, a receive circuit 114, a beam forming circuit 116, a processor 118, a display 120, a transmit/receive select switch 122, and a memory 124. The transmitting circuit 112 and the receiving circuit 114 may be connected to the ultrasound probe 110 through a transmitting/receiving selection switch 122.
The ultrasound probe 110 typically includes an array of array elements. In each transmission, all or part of the elements of the ultrasound probe 110 participate in transmitting the ultrasound wave. Each participating element is excited by the transmit pulse and transmits an ultrasound wave; the waves transmitted by the individual elements superimpose during propagation to form a synthesized ultrasound beam directed at the target object, for example toward the lungs of the target object (e.g., a human body).
In the ultrasound imaging process, the transmit circuit 112 sends delay-focused transmit pulses of a certain amplitude and polarity to the ultrasound probe 110 through the transmit/receive select switch 122. Excited by the transmit pulses, the ultrasound probe 110 transmits ultrasound waves to the scan target, receives, after a certain delay, ultrasound echoes carrying information about the scan target reflected and/or scattered back from the target region, and converts the echoes into electrical signals. The receiving circuit 114 receives these electrical signals to obtain ultrasound echo signals and sends them to the beam forming circuit 116, which performs focusing delay, weighting, and channel summation before sending the ultrasound echo signals to the processor 118 for related signal processing.
The transmit/receive select switch 122 may also be referred to as a transmit/receive controller, which may include a transmit controller and a receive controller. The transmit controller excites the ultrasound probe 110 to transmit ultrasound waves to the target object (e.g., a human body) via the transmit circuit 112; the receive controller receives, via the receive circuit 114, the ultrasound echoes returned from the target object through the ultrasound probe 110.
The processor 118 may process the ultrasound echo signals obtained based on the ultrasound echoes to obtain an ultrasound image of the target object. For example, the ultrasonic echo signals are subjected to beamforming processing by the beamforming circuit 116. The ultrasound images obtained by the processor 118 may be stored in the memory 124. Also, the ultrasound image may be displayed on the display 120. For a more detailed description, reference may be made to the following examples of the present specification.
The processor 118 may be a central processing unit (CPU), graphics processing unit (GPU), application-specific integrated circuit (ASIC), field-programmable gate array (FPGA), or another form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the ultrasound imaging system to perform desired functions. For example, the processor 118 may include one or more embedded processors, processor cores, microprocessors, logic circuits, hardware finite state machines (FSMs), digital signal processors (DSPs), graphics processing units (GPUs), or a combination thereof.
The display 120 is connected to the processor 118. The display 120 may be a touch screen, a liquid crystal display, or the like; it may also be a display device independent of the ultrasound imaging system 100, such as a liquid crystal display or a television, or the display screen of an electronic device such as a smartphone or tablet computer. There may be one or more displays 120. The display 120 may display the ultrasound images and scoring results obtained by the processor 118. In addition, while displaying the ultrasound image, the display 120 may provide a graphical interface for human-computer interaction, on which one or more controlled objects are disposed, so that the user can input operation instructions through a human-computer interaction device to control these objects and thereby perform the corresponding control operations. For example, an icon displayed on the graphical interface can be operated through the human-computer interaction device to execute a specific function, such as selecting lung ultrasound score maps for comparison.
Optionally, the ultrasound imaging system 100 may further include a human-computer interaction device other than the display 120, which is connected to the processor 118, for example, the processor 118 may be connected to the human-computer interaction device through an external input/output port, which may be a wireless communication module, a wired communication module, or a combination thereof. The external input/output port may also be implemented based on USB, bus protocols such as CAN, and/or wired network protocols, etc.
The human-computer interaction device may include an input device for detecting input information of a user, where the input information may be, for example, a control instruction for transmitting/receiving timing of the ultrasound wave, an operation input instruction for editing and labeling the ultrasound wave, or other instruction types. The input device may include one or more of a keyboard, mouse, scroll wheel, trackball, mobile input device (such as a mobile device with a touch screen display, cell phone, etc.), multi-function knob, and the like. The human interaction device may also include an output device such as a printer, for example, for printing ultrasound reports.
The memory 124 may be used to store instructions executed by the processor 118, received ultrasound echo signals, ultrasound images, and so forth. The memory 124 may be a flash memory card, solid-state memory, a hard disk, or the like, and may be volatile and/or non-volatile, removable and/or non-removable.
It should be understood that the components included in the ultrasound imaging system 100 shown in fig. 1 are merely illustrative and that more or fewer components may be included. The invention is not limited in this regard.
Next, an ultrasound image analysis method according to an embodiment of the present invention will be described with reference to fig. 2. FIG. 2 is a schematic flow chart diagram of a method 200 for ultrasound image analysis in accordance with an embodiment of the present invention.
As shown in fig. 2, the method 200 includes the steps of:
in step S210, a lung ultrasound image is acquired.
As one implementation, step S210 may include: the pre-stored ultrasound images of the lungs are read from the storage medium. The analysis of the acquired ultrasound images of the lungs may be performed at any time after the acquisition of the ultrasound images of the lungs. The stored lung ultrasound images may be read from a local storage medium (e.g., memory 124) or may be read from a storage medium of another device via a wired or wireless network.
As another implementation manner, step S210 may include: and acquiring the lung ultrasonic image in real time.
Wherein, the step of acquiring the lung ultrasound image in real time may comprise: firstly, transmitting ultrasonic waves to the lung of a target object, and receiving ultrasonic echoes based on the ultrasonic waves to obtain ultrasonic echo signals; and then, obtaining a lung ultrasonic image of the target object according to the ultrasonic echo signal. Exemplarily, the target object may refer to a human body to be detected or a part of the human body to be detected.
Specifically, in conjunction with fig. 1, the ultrasound probe 110 may be excited through the transmit/receive select switch 122 to transmit ultrasound waves to the lungs of a target object (e.g., a human body) via the transmit circuit 112; the ultrasound echoes returned from the lungs are received by the ultrasound probe 110 via the receive circuit 114 and converted into ultrasound echo signals. The ultrasound echo signals may then be beamformed by the beam forming circuit 116 and further processed by the processor 118 to produce an ultrasound image of the lung. In general, the lung ultrasound image in the embodiment of the present invention is obtained by a series of signal processing operations on the ultrasound echo signals, including analog-to-digital conversion, beamforming, IQ (in-phase/quadrature) demodulation, logarithmic compression, grayscale conversion, and the like.
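The post-beamforming part of the chain just listed (IQ demodulation, logarithmic compression, grayscale conversion) can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the function name and parameters are hypothetical, and the demodulation omits the low-pass filtering a real system would apply.

```python
import numpy as np

def rf_to_grayscale(rf_lines, fc, fs, dynamic_range_db=60.0):
    """Hypothetical sketch: beamformed RF (samples x beam lines) -> 8-bit B-mode.
    fc: carrier frequency (Hz), fs: sampling rate (Hz)."""
    n = rf_lines.shape[0]
    t = np.arange(n) / fs
    # IQ demodulation: mix each beam line down to baseband (filtering omitted)
    iq = rf_lines * np.exp(-2j * np.pi * fc * t)[:, None]
    envelope = np.abs(iq)
    # Logarithmic compression into a fixed dynamic range, 0 dB at the maximum
    env = envelope / (envelope.max() + 1e-12)
    db = np.clip(20.0 * np.log10(env + 1e-12), -dynamic_range_db, 0.0)
    # Grayscale conversion to 8-bit pixel values
    return np.uint8(np.round(255.0 * (db + dynamic_range_db) / dynamic_range_db))
```

The brightest echo maps to pixel value 255 and anything below the dynamic-range floor maps to 0, which is the usual convention for B-mode display.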
In one embodiment, one or more frames of ultrasound images of the lungs are acquired separately for each lung region of the lungs and stored in memory 124. The left lung and the right lung may be divided into at least 2 lung regions, for example, the left lung and the right lung may be divided into 3, 4, or 6 lung regions, and the like, and one or more frames of lung ultrasound images are acquired for each lung region during the imaging process.
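The region division above can be represented by a simple enumeration of labeled zones. The label scheme ("L1", "R1", ...) is purely illustrative, not taken from the patent:

```python
def lung_regions(zones_per_lung=6):
    """Hypothetical zone labels for one of the protocols mentioned in the
    text (2, 3, 4, or 6 regions per lung); L = left lung, R = right lung."""
    if zones_per_lung not in (2, 3, 4, 6):
        raise ValueError("unsupported zoning protocol")
    return [f"{side}{i}" for side in ("L", "R") for i in range(1, zones_per_lung + 1)]
```

One or more image frames would then be acquired and stored per label during the examination.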
In step S220, ultrasound signs of the respective lung regions in the lung ultrasound image are identified.
Here, ultrasound signs are image features characteristic of the lung; for example, ultrasound signs of the lung may include the bat sign, lung sliding sign, seashore sign, stratosphere sign, comet-tail sign, lung point sign, tissue-like sign, anechoic fluid areas, and the like. In this embodiment, a subset of these ultrasound signs may be identified for scoring, such as B-lines, lung consolidation, and pleural effusion.
The B-line (also called the "comet tail" artifact) is a discrete vertical reverberation artifact that extends from the pleural line to the bottom of the screen without fading and moves in synchrony with lung sliding. The appearance of a large number of B-lines in an ultrasound image is a sign of interstitial lung syndrome, and the number of B-lines increases as air content decreases and lung tissue density increases, so B-lines can be used to diagnose pulmonary edema and judge the degree of lung aeration. Normal lung tissue can sometimes show 0-2 isolated B-lines in the same field of view. Specifically, normal lung tissue is filled with gas and scatters the sound waves completely, so only the pleural line and A-lines, i.e., several repeated hyperechoic lines parallel to the pleural line, are visible under ultrasound. When lung parenchymal diseases (such as pulmonary edema, pneumonia, or acute lung injury) raise hydrostatic pressure or capillary permeability and widen the interlobular septa, exudates, collagen, blood, and the like increase lung density as air content decreases, the echo drop-off between the lung and surrounding tissue is reduced, and the ultrasound can to some extent image deeper regions, producing vertical mixed echoes, namely B-lines. According to the width of the B-line area, B-lines can be divided into single B-lines and diffuse (confluent) B-lines.
When the pulmonary gas content decreases further, the lung tissue becomes consolidated, and its acoustic image resembles solid tissue with echoes similar to those of the liver and spleen. Lung consolidation is a progressive result that can be caused by pulmonary embolism, metastasis of cancer to the lung, compressive or obstructive atelectasis, and pulmonary contusion. The presence of consolidation of peripheral tissue, of air and fluid, or of vascular fusion may further suggest lung consolidation.
When pleural effusion is present, the pleural line and the lung surface separate, and together with the acoustic shadows of the ribs above and below they form a quadrilateral; this quadrilateral can serve as a characteristic sign of various pleural effusions. The sinusoid sign is also a sign of pleural effusion: during M-mode scanning, the lung surface line moves toward the pleural line with respiration, showing a sinusoid-like variation.
In embodiments of the present invention, the ultrasound signs in the lung ultrasound image may be identified by automatic identification, manual identification, or a combination of the two.
For example, when automatic identification is employed, a trained neural network model can be used to automatically identify ultrasound signs in lung ultrasound images. In particular, the neural network model may be trained with an object detection algorithm, such as Faster R-CNN, so that the machine can recognize ultrasound signs.
Illustratively, training the neural network model includes labeling the ultrasound signs in lung ultrasound images and feeding them as training samples into the neural network until the model converges, thereby obtaining a trained model. Thereafter, the lung ultrasound image obtained in step S210 can be input into the neural network model, which outputs the recognition results for the ultrasound signs in it.
In addition, conventional image processing methods may also be used to identify ultrasound signs. For example, the B-line is a discrete vertical reverberation artifact that extends from the pleural line to the bottom of the screen without fading. Based on this feature, B-lines can be identified by detecting vertical linear features along the sound beam direction, for instance by template matching.
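The vertical-line heuristic described above can be sketched directly. This is an illustrative sketch, not the patent's algorithm: on a scan-converted grayscale image it marks an image column as a B-line candidate when most pixels beneath the pleural line are bright, i.e. the artifact reaches the bottom of the image without fading. The thresholds are assumed values.

```python
import numpy as np

def detect_b_line_columns(image, pleural_row, brightness_thresh=100, fill_ratio=0.8):
    """Hypothetical sketch: return column indices whose pixels below the
    pleural line are bright for at least `fill_ratio` of the depth."""
    below = image[pleural_row:, :].astype(float)
    # Fraction of bright pixels per column beneath the pleural line
    bright = (below > brightness_thresh).mean(axis=0)
    return np.flatnonzero(bright >= fill_ratio)
```

A production system would work on pre-scan-converted data along actual beam lines and add smoothing, but the core idea, a persistence test along the vertical direction, is the same.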
In one embodiment, after an ultrasound sign is identified, it may also be analyzed quantitatively to obtain parameters related to the sign. For example, when the identified sign is the B-line, the main computed parameters include the number of B-lines, the percentage of B-line coverage, the spacing between adjacent B-lines, and so on. The number of B-lines is the total count of identified B-lines; the coverage percentage is the percentage of the lung detection region occupied by B-lines; and the spacing between adjacent B-lines is their distance at the pleural line.
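Given detected B-line column positions at the pleural-line depth, the three parameters just listed can be computed as follows. The function and its pixel-to-millimeter scale are illustrative assumptions; adjacent columns are merged into one B-line.

```python
def b_line_parameters(columns, region_width, px_to_mm=0.2):
    """Hypothetical sketch: count, coverage percentage of the detection
    region, and adjacent-line spacing from B-line column indices."""
    lines = []  # each B-line as a run of adjacent columns
    for c in sorted(columns):
        if lines and c == lines[-1][-1] + 1:
            lines[-1].append(c)
        else:
            lines.append([c])
    centers = [sum(run) / len(run) for run in lines]
    spacing_mm = [(b - a) * px_to_mm for a, b in zip(centers, centers[1:])]
    return {
        "count": len(lines),                                  # number of B-lines
        "coverage_pct": 100.0 * len(columns) / region_width,  # % of region covered
        "spacing_mm": spacing_mm,                             # gaps at pleural line
    }
```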
In embodiments of the present invention, ultrasound signs may be identified fully automatically by the methods described above, identified and marked manually by the user, or identified by a combination of automatic and manual methods. For example, relatively easy-to-identify B-lines can be identified automatically, while complex conditions such as lung consolidation and pleural effusion are marked manually.
After the lung ultrasound images of the respective lung regions are identified, the most representative identification result among the one or more frames can be selected as the final result. For example, the identification result of the frame(s) with the largest number of B-lines, or with the largest B-line coverage percentage, may be selected. The specific selection criterion may be set by the user.
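This frame selection reduces to taking the maximum of the per-frame results under a user-chosen criterion. A minimal sketch, with an illustrative per-frame dictionary layout (the key names are assumptions, not the patent's data model):

```python
def most_representative(frames, key="b_line_count"):
    """Hypothetical sketch: pick the frame result maximizing the chosen
    criterion, e.g. 'b_line_count' or 'b_line_coverage_pct'."""
    return max(frames, key=lambda f: f[key])
```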
In step S230, the respective lung regions are scored according to the ultrasound signs to generate scoring results.
Here, the score indicates the degree of injury of each lung region; for example, the higher the score, the more severe the lung injury. In embodiments of the invention, scoring may follow scoring standards commonly used in clinic, such as lung ultrasound aeration scoring, or a new scoring standard may be used.
In one embodiment, each lung region may be scored according to the number of B-lines, lung consolidation, and pleural effusion identified in step S220.
For example, when 0-2 isolated B-lines are detected, indicating normal lung aeration, the score is 0. When multiple clearly spaced B-lines are identified, i.e., 3 or more single B-lines are detected, indicating moderate loss of lung aeration, the score is 1. When densely fused B-lines are identified, i.e., confluent B-lines appear, indicating severe loss of lung aeration, the score is 2. When lung consolidation, or lung consolidation combined with pleural effusion, is identified, the score is 3.
To facilitate quantitative analysis, the score format may be numeric, such as 0-3. The format may also be textual, such as N, B1, B2, and C, corresponding to the different degrees of severity. In step S230, not only can the score of a single lung region be obtained from a single lung ultrasound image, but the scores of the lung regions can also be summed to obtain an overall lung score.
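The scoring rules and formats described in the last two paragraphs can be sketched as one small function plus a label table. The parameter names are illustrative; the rule order follows the text (consolidation dominates, then confluent B-lines, then 3 or more single B-lines).

```python
def score_region(n_b_lines, confluent=False, consolidation=False):
    """Sketch of the 0-3 rules in the text: 0 = 0-2 isolated B-lines,
    1 = 3+ spaced B-lines, 2 = confluent B-lines, 3 = consolidation
    (with or without pleural effusion)."""
    if consolidation:
        return 3
    if confluent:
        return 2
    if n_b_lines >= 3:
        return 1
    return 0

# Textual format N/B1/B2/C corresponding to numeric scores 0-3
SCORE_LABELS = {0: "N", 1: "B1", 2: "B2", 3: "C"}

def total_lung_score(region_scores):
    """Overall lung score as the sum of per-region scores."""
    return sum(region_scores.values())
```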
Illustratively, the scoring may be automatic, e.g., the system automatically converts the ultrasound sign recognition results into scores, or a trained neural network model directly outputs the scores. The scoring may also be performed manually by the user based on the ultrasound signs identified in step S220.
In one embodiment, after the scores are generated, an operation interface may be provided so that the user can confirm or modify the score of each lung region through it. After the user confirms the scores obtained in step S230, step S240 is executed. In another embodiment, after the ultrasound signs are identified in step S220, an operation interface may likewise be provided so that the user can confirm or delete the identified signs. Of course, the operation interface is not essential, and the subsequent steps may be executed directly without user intervention.
In step S240, a lung ultrasound score map is displayed on a display interface, where the lung ultrasound score map includes a lung graphic and identifiers displayed at each lung region of the lung graphic, the identifiers representing the scoring results of the corresponding lung regions.
The user can quickly learn the scoring result of each lung region from the identifier displayed at that region of the lung graphic; the degree of lung injury is thus presented visually to the doctor, which facilitates monitoring the patient's condition and the doctor's subsequent targeted treatment. The lung graphic may be a structural diagram representing the shape of the lungs, for example a two-dimensional schematic diagram as shown in figs. 3a, 3b and 4, or a three-dimensional perspective diagram; the structural diagram may be a line model diagram as shown in figs. 3a, 3b and 4, or a rendered diagram. The lung graphic may also be another indicative graphic having an equal or approximately equal structural relationship with the lungs, such as a bisected rectangular indicative graphic. The present invention does not limit the specific manner in which the lung graphic is presented.
In one embodiment, step S240 includes: displaying a lung ultrasound score (LUS) map of a single scoring on the display interface. That is, only one lung ultrasound score map, including only one lung graphic, is displayed on the display interface. Based on a single lung ultrasound score map, the injury conditions of a plurality of lung regions can be displayed visually, and the user can quickly survey the overall situation of the lungs and judge which part of the lungs is more severely injured. In some examples, the injury condition of a single lung region may also be displayed based on the lung ultrasound score map. Which lung region's scoring result is displayed may be determined according to user input, or according to a sequence preset in the system. The scoring results of the lung regions may also be combined to determine which lung region or regions are shown; for example, the processor 118 controls the display to display the injury condition of one or more lung regions whose scores exceed a preset score threshold.
In one embodiment, the processor 118 may also control the display to provide a dynamic display effect for one or more lung regions. With reference to the schematic diagram of fig. 3a, the processor 118 may control the display to display the injury condition of each lung region of the right lung in the order of the 1R, 2R, 3R and 4R lung regions, or to alternately display the injury conditions of the right and left lungs in the order of the 1R, 1L, 2R, 2L … lung regions. The scoring results of the lung regions may also be combined to determine the display order; for example, the processor 118 controls the display to display the injury conditions of one or more lung regions in descending order of their scoring results.
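The selection and ordering strategies just described — side-by-side alternation, ranking by score, and a threshold filter — can be sketched as follows (region labels and function names are illustrative assumptions, not from the patent):

```python
# Illustrative sketch of the display orders described above.
# Region labels and function names are hypothetical.

RIGHT = ["1R", "2R", "3R", "4R"]
LEFT = ["1L", "2L", "3L", "4L"]

def alternating_order(right=RIGHT, left=LEFT):
    """Alternate right and left lung regions: 1R, 1L, 2R, 2L, ..."""
    return [r for pair in zip(right, left) for r in pair]

def by_score_order(scores):
    """Order lung regions from the highest to the lowest scoring result."""
    return sorted(scores, key=scores.get, reverse=True)

def above_threshold(scores, threshold):
    """Lung regions whose score exceeds a preset score threshold."""
    return [r for r, s in scores.items() if s > threshold]

scores = {"1R": 0, "2R": 3, "3R": 1, "4R": 2}
```

Any of the three lists can then drive the sequential or filtered display performed by the processor.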
As an example, the identifier may comprise a graphic. The graphic can represent the score of the scoring result with one or more of different colors, brightness, textures, texture densities, patterns, pattern densities, filling areas or shapes. For example, graphics of different colors may be displayed at the positions of the corresponding lung regions of the lung graphic according to the scores, and the user can quickly read off the score of a lung region from the color of its graphic.
For ease of understanding, two visualization implementations of the lung ultrasound score map are presented in the embodiments of the present invention, as shown in figs. 3a, 3b and 4.
Referring first to fig. 3a, in the first form of the lung ultrasound score map, identifiers characterizing the scoring results are displayed at the respective lung region positions of the lung graphic, and the corresponding lung ultrasound image can be displayed after an identifier is clicked. A lung ultrasound score map in this form is simple and intuitive, and can quickly convey the scoring results to the user. The boundaries of the individual lung regions may be displayed or, as shown in fig. 3a, may be omitted.
In one embodiment, the identifiers may be color blocks or color boxes of different colors, each color representing a score of the scoring result. The shape of the color block or color box is not limited and may be, for example, a circle, a square or a triangle. The identifiers of different lung regions may be the same or different. The identifiers are illustrated in fig. 3a as circular figures with different textures, but it will be appreciated that in practice the different textures may be replaced with different colors.
The color of the identifier may vary with the score according to a rule that is easy for the user to understand and remember; for example, the color may change from dark to light, or from one color family to another, as the score increases. For example, green may represent a score of 0, yellow a score of 1, orange a score of 2, and red a score of 3.
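The green/yellow/orange/red convention above is a small lookup table. A minimal sketch follows; the patent only names the colors, so the concrete RGB values here are illustrative assumptions:

```python
# Illustrative mapping from a 0-3 score to the identifier color described
# above. The RGB triples are assumed values, not specified by the patent.

SCORE_COLORS = {
    0: ("green",  (0, 170, 0)),
    1: ("yellow", (255, 210, 0)),
    2: ("orange", (255, 140, 0)),
    3: ("red",    (220, 0, 0)),
}

def identifier_color(score):
    """Return (color name, rgb) for the color block/box of a scoring result."""
    return SCORE_COLORS[score]
```

The same table could equally map scores to brightness levels, textures or pattern densities, as the embodiment allows.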
In one embodiment, the identifier further includes the score of the scoring result displayed inside the color block or color box (not shown), so that the scoring result is presented both graphically and textually. It should be noted that in other embodiments, the numbers of the corresponding lung regions may be displayed inside the graphics, as shown in fig. 3a, where 1R, 2R, 3R and 4R represent the right 1-4 lung regions, and 1L, 2L, 3L and 4L represent the left 1-4 lung regions, respectively.
As described above, when this form of lung ultrasound score map is adopted, the map may also serve as a playback navigation interface: when a selection instruction for the identifier of a certain lung region on the map is received, the lung ultrasound image corresponding to that lung region is displayed on the display interface. Specifically, referring to fig. 6, if the user clicks the identifier of the 1R lung region on the lung ultrasound score map on the left side of the display interface (at which time the identifier may be framed or highlighted), the lung ultrasound image corresponding to the 1R lung region is displayed on the right side of the display interface, so that the diagnostic image can be reviewed in detail.
In one embodiment, each lung region may correspond to one or more frames of lung ultrasound images; that is, one or more frames of lung ultrasound images may be stored in the memory 124 for each lung region. With continued reference to fig. 6, when multiple frames are stored for a lung region, a list 601 of candidate ultrasound images may be shown on the display interface (e.g., beside the lung ultrasound image) for selection by the user, and the currently displayed ultrasound image may be framed or highlighted in the list 601.
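The one-to-many relationship between a lung region and its stored frames, with one frame currently displayed, can be modeled as a simple mapping. A sketch under assumed names (the class and methods are illustrative, not from the patent):

```python
# Illustrative sketch: each lung region keys a list of stored lung
# ultrasound frames; one index marks the currently displayed frame.
# All names are hypothetical.

class RegionFrames:
    def __init__(self):
        self.frames = {}   # region -> list of stored frame identifiers
        self.current = {}  # region -> index of the displayed frame

    def store(self, region, frame_id):
        """Store another frame for a lung region; default to showing the first."""
        self.frames.setdefault(region, []).append(frame_id)
        self.current.setdefault(region, 0)

    def select(self, region, index):
        """User picks a frame from the candidate list (list 601)."""
        self.current[region] = index

    def displayed(self, region):
        return self.frames[region][self.current[region]]

rf = RegionFrames()
rf.store("1R", "frame_001")
rf.store("1R", "frame_002")
rf.select("1R", 1)   # user picks the second candidate frame
```

Highlighting the current entry in list 601 then reduces to marking `current[region]`.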
In one embodiment, with continued reference to fig. 6, to display the change in the injury condition of each lung region at various stages of the treatment process, a scoring chart 602 of the multiple scoring results for the selected lung region may be displayed on the display interface. The scoring chart 602 may take various forms, such as a graph, a line chart, a histogram or a bar chart; the scoring chart 602 may also be a table in which each scoring result is recorded, which is not limited by the present invention.
As an example, when the scoring chart 602 adopts a graph or a line chart, the multiple scoring results may be represented by respective points on the graph or line. In addition, the scoring result of the currently displayed lung ultrasound image may be highlighted in the scoring chart to help the user locate the current result in the chart.
Further, the scoring results on the scoring chart 602 may have a mapping relationship with the lung ultrasound images of the selected lung region. When a selection instruction for a scoring result on the scoring chart 602 is received, i.e., when the user clicks a point or line segment on the chart representing that scoring result, the lung ultrasound image corresponding to that scoring result is displayed on the display interface; that is, the currently displayed lung ultrasound image is replaced with the one corresponding to the selected scoring result, and the lung ultrasound score map is synchronously switched to the one corresponding to that scoring result.
Further, the scoring result or the ultrasound sign recognition result may be shown as text 603 on the display interface. The scoring result includes the score value, and the ultrasound sign recognition result includes, for example, the number of B-lines and the coverage rate of the B-lines in the lung ultrasound image.
Fig. 3b shows a lung ultrasound score map according to another embodiment of the invention. This map likewise displays, at the respective lung region positions of the lung graphic, identifiers characterizing the scoring results, and the corresponding lung ultrasound image can be displayed after an identifier is clicked. The dividing lines of the respective lung regions may or may not be displayed, as shown in fig. 3b.
The identifiers of fig. 3b are overlays superimposed on the respective lung regions. An overlay may represent the score of the scoring result with different colors, brightness, textures, texture densities, patterns, pattern densities and/or filling areas; fig. 3b illustrates characterizing the score by different textures. An identifier covering the entire lung region functions the same as the identifier shown in fig. 3a: upon receiving a user selection, the identifier itself may be highlighted, and the display of the lung ultrasound image corresponding to that lung region may be invoked. In an example not illustrated, the scoring result of the corresponding lung region may be displayed on the overlay.
Fig. 4 illustrates the second form of the lung ultrasound score map according to another embodiment of the present invention. When this form is adopted, the lung ultrasound image of each lung region is displayed at that region of the lung graphic, and the identifier representing the scoring result is displayed in synchronization with the lung ultrasound image. The lung ultrasound image displayed at each lung region may be the highest-scoring frame of that region, or a frame manually designated by the user. With this form of map, the user can not only view the scoring result of each lung region but also immediately view the ultrasound image of each lung region.
By way of example, when the identifier is displayed in synchronization with the lung ultrasound image, it may be displayed on the image, e.g., as a border, a corner mark or another graphic on the lung ultrasound image. Alternatively, the identifier may be displayed alongside the lung ultrasound image, e.g., as a frame or an indicator bar beside it; or the lung ultrasound image may be displayed on the identifier.
As an example, the identifier in this embodiment may have different colors, brightness, filling areas, patterns, shapes or textures to represent the scores of the scoring results. For example, referring to fig. 4, a corner mark 402 of a different color (not shown) may be displayed in the lower right corner of each lung ultrasound image as the identifier. The colors representing the different scores may be the same as in the previous embodiment, i.e., green may represent a score of 0, yellow a score of 1, orange a score of 2, and red a score of 3.
With continued reference to fig. 4, in one embodiment, when the identifier is displayed in synchronization with the lung ultrasound image, one or more of the score (Score), the number of B-lines (B lines) and the B-line coverage percentage (Percent) of the scoring result may also be displayed on, or in synchronization with, the lung ultrasound image. Further, the total score (LUS) over all lung regions may also be displayed on the lung ultrasound score map, from which the user can quickly grasp the overall condition of the lungs. In addition, the number of the corresponding lung region may be displayed on the lung ultrasound image; for example, 1R, 2R and 3R displayed in the upper right corner of the lung ultrasound images respectively represent the first, second and third lung regions of the right lung, and 1L, 2L and 3L respectively represent the first, second and third lung regions of the left lung.
In one embodiment, when a selection instruction for the lung ultrasound image of a certain lung region of the lung ultrasound score map is received, the selected lung ultrasound image may also be displayed in an enlarged manner on the display interface to facilitate viewing it in detail.
In one embodiment, referring to fig. 5, the method 200 further comprises: outputting an ultrasound report including the lung ultrasound score map. The first form of the map shown in fig. 3 presents the diagnosis result very concisely and intuitively, and is therefore well suited to being placed into a report as an evaluation result map; of course, depending on the needs of the user, the ultrasound report may also employ the second form shown in fig. 4. The lung ultrasound score map output into the ultrasound report may be consistent with the one displayed on the display interface. For reasons such as report printing, adaptive adjustments may also be made when outputting the map: the identifiers may be adjusted to a form suitable for printing, the lung ultrasound images of the lung regions may be removed so that only the identifiers are displayed, and/or only the identifiers whose scoring results exceed a preset score threshold may be output, and so on. For example, the identifiers of the map of fig. 3 are color blocks, which may be directly converted into color boxes when output into the ultrasound report.
Specific details of displaying a single lung ultrasound score map on the display interface are described above by way of example. In another embodiment, referring to fig. 7, step S240 may further include: simultaneously displaying a plurality of windows on the display interface, with the lung ultrasound score map of one scoring displayed in each window. For example, four windows are displayed on the display interface in fig. 7, each containing the lung ultrasound score map of one scoring.
When the lung ultrasound score maps of multiple scorings are displayed on the display interface, the user can view the scoring results of all stages of the treatment process at once, and can thus visually compare and evaluate the treatment effect on each part. For example, if the identifiers of the lung regions on successive maps turn from red to orange, or from orange to yellow or green, as treatment progresses, an improvement in pulmonary ventilation is indicated visually.
The lung ultrasound score map displayed in each of the plurality of windows may take any of the forms described above; preferably, however, the maps in the plurality of windows take the same form for ease of comparison. Since the first form described in connection with fig. 3 is relatively compact and intuitive, in one embodiment it may, for example, be used when a plurality of lung ultrasound score maps are displayed simultaneously for comparison.
As an example, when a plurality of lung ultrasound score maps is displayed on the display interface at the same time, the acquisition time of each map is displayed in its window, and the maps are arranged in order of acquisition time for convenient viewing by the user.
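Arranging the windows by acquisition time is a plain chronological sort. A minimal sketch, in which the record structure and the concrete dates are illustrative assumptions:

```python
from datetime import date

# Illustrative: order the stored score maps chronologically for the
# multi-window display described above. Record fields and dates are
# hypothetical.

score_maps = [
    {"acquired": date(2018, 4, 5), "scores": {"1R": 2}},
    {"acquired": date(2018, 3, 30), "scores": {"1R": 3}},
    {"acquired": date(2018, 4, 1), "scores": {"1R": 3}},
]

# Each window then shows one entry of this list, earliest first.
chronological = sorted(score_maps, key=lambda m: m["acquired"])
```

The acquisition time shown in each window is simply the `acquired` field of that window's entry.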
In one embodiment, when a plurality of lung ultrasound score maps is displayed simultaneously and an instruction is received selecting the identifier of a certain lung region on the map in one of the windows, the identifiers at the corresponding lung region on the maps in the other window or windows are automatically highlighted. That is, if the user selects an identifier in one window, for example the identifier of the 1L lung region on the map in the upper-right window in fig. 7, the identifiers of the 1L lung region on the maps in the other three windows are automatically highlighted or enlarged, so that the user can compare the scores of the 1L lung region at the various time points.
In one embodiment, the method 200 further comprises: highlighting, on the lung ultrasound score map of one scoring, an identifier of a change of the scoring result of a certain lung region relative to the map of the previous or the next scoring. That is, when the score of a certain lung region in one scoring differs from its score in the previous or next scoring, an identifier indicating the change is displayed in the lung ultrasound score map generated from that scoring, to prompt the user to focus on that lung region.
As an example, the identifier indicating that the scoring result has changed may be a graphic or text. It may be fixed, i.e., only indicating that the scoring result has changed without indicating whether it increased or decreased; alternatively, different identifiers may be used to indicate an increase or a decrease of the scoring result, respectively.
In one embodiment, the method 200 further comprises: obtaining the difference of the scoring results of each lung region between two adjacent acquisition times; and displaying the difference as a graphical element on the lung ultrasound score map in the window corresponding to the later time. The graphical elements may have one or more of different colors, sizes, shapes, textures, patterns, filling areas and pattern densities to characterize the magnitude of the difference, and may be displayed alongside, or superimposed on, the identifiers characterizing the scoring results.
For example, referring to fig. 7, the scoring results of the 4R lung region and the 4L lung region in the lung ultrasound score map of April 5, 2018 have changed relative to the previous scoring, each decreasing by 1 point; therefore, the same graphical element (not shown) may be displayed at the two corresponding lung regions to represent that the scoring result has decreased by 1 point.
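The per-region difference between two adjacent scorings reduces to an element-wise subtraction. A sketch with illustrative names and example scores (the values mirror the two-region, minus-1-point example above but are assumptions):

```python
# Illustrative: difference of each lung region's score between two
# adjacent acquisitions, to be shown in the later map's window.
# Function name and example scores are hypothetical.

def score_diffs(earlier, later):
    """Return region -> (later - earlier) for regions present in both."""
    return {r: later[r] - earlier[r] for r in earlier if r in later}

prev_scores = {"4R": 2, "4L": 2, "1R": 1}
curr_scores = {"4R": 1, "4L": 1, "1R": 1}

diffs = score_diffs(prev_scores, curr_scores)
changed = {r: d for r, d in diffs.items() if d != 0}  # regions to flag
```

The `changed` mapping is what the graphical elements would encode, e.g. by size or color intensity proportional to the magnitude of the difference.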
In one embodiment, when a selection of at least two lung ultrasound score maps for comparative evaluation is received, the at least two maps and their corresponding lung ultrasound images are displayed in a plurality of windows of the display interface, allowing the user to comparatively analyze the results of two ultrasound diagnoses in conjunction with both the score maps and the specific lung ultrasound images.
It should be noted that the user may select the two lung ultrasound score maps for comparative evaluation on a display interface displaying a plurality of maps, or on two display interfaces each displaying a single map.
Further, when an instruction selecting at least two lung ultrasound score maps for comparative evaluation is received, at least two groups of windows are displayed on the display interface simultaneously, each group displaying one selected map and its corresponding lung ultrasound image. The user may click any position of a map to select it for comparative evaluation, or may select a specific identifier on a map to directly select the lung ultrasound image of the corresponding lung region for comparative evaluation.
For example, referring to figs. 7 and 8, when the lung ultrasound score maps of the two scorings of April 1, 2018 and March 30, 2018 are selected for comparative evaluation on the display interface shown in fig. 7, two groups of windows are displayed on the display interface: the first group, in the upper row, displays the map of April 1, 2018 together with the lung ultrasound image selected for comparative evaluation, here the image of the 1R lung region; the second group, in the lower row, displays the map of March 30, 2018 together with the lung ultrasound image of the same lung region, i.e., also the 1R lung region. By way of example, the identifier of the currently displayed lung ultrasound image may also be emphasized on the map, e.g., highlighted or enlarged.
During comparative evaluation, when an instruction to display the lung ultrasound image of a certain lung region of one map is received, the lung ultrasound images at the corresponding lung region of the other map or maps under comparison are displayed automatically. For example, with continued reference to fig. 8, if the user changes the selected lung region from the 1R lung region to the 1L lung region in the map of April 1, 2018, the first group of windows displays the lung ultrasound image of the 1L lung region of April 1, 2018, and the processor 118 may further control the selected lung region in the map of March 30, 2018 to change automatically from the 1R lung region to the 1L lung region, whereupon the second group of windows displays the lung ultrasound image of the 1L lung region of March 30, 2018. This linked selection of lung regions further improves operational friendliness during comparative evaluation and spares the user repetitive, unproductive work.
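The linked selection described above — choosing a lung region in one comparison window propagates the choice to every window under comparison — can be sketched as a tiny shared-state pattern (class, method names and window labels are illustrative assumptions):

```python
# Illustrative sketch of linked lung-region selection across the
# comparison windows. All names are hypothetical.

class ComparisonView:
    def __init__(self, windows):
        self.windows = windows      # one label per score map under comparison
        self.selected_region = None

    def select_region(self, region):
        """Selecting a region in any window propagates to all windows."""
        self.selected_region = region
        # Each window should now display this region's ultrasound image.
        return {w: region for w in self.windows}

view = ComparisonView(["2018-04-01", "2018-03-30"])
shown = view.select_region("1L")  # user switches from 1R to 1L in one window
```

The returned mapping states, for each window, which lung region's image to display, which is exactly the synchronized behavior of fig. 8.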
While fig. 8 shows comparative evaluation in two groups of windows, in another example the processor 118 may show a corresponding number of groups of windows based on the number of acquired lung ultrasound score maps. For example, corresponding to the four maps provided in fig. 7, upon receiving an instruction to perform comparative evaluation on them, the display 120 may be controlled to display four groups of windows on the display interface, each group displaying one lung ultrasound score map and its corresponding lung ultrasound image.
In summary, the ultrasound image analysis method 200 of the embodiment of the present invention scores each lung region, and generates and displays a lung ultrasound score map based on the scoring results, so that the lung imaging results can be displayed and compared visually and the lung ultrasound examination can better meet clinical requirements.
Next, an ultrasound image analysis method 900 according to another embodiment of the present application will be described with reference to fig. 9. As shown in fig. 9, the ultrasound image analysis method 900 may include the steps of:
in step S910, ultrasound images of one or more regions of the object are acquired.
The measured object can be a human body, a fetus or an animal. Alternatively, the object to be measured may be a part to be measured of a human body, such as a chest and an abdomen of the human body. When the measured object is a human body, one or more regions of the measured object include, for example, tissues and organs such as heart, lung, liver and gall, spleen and stomach, etc. Specifically, an ultrasonic wave may be emitted to a region to be measured of the object to be measured based on the ultrasonic imaging system shown in fig. 1, and an ultrasonic image of the region may be obtained according to an ultrasonic echo returned from the region to be measured.
In step S920, ultrasound signs in the ultrasound images of the one or more regions are identified.
Wherein the ultrasound images of different regions have different ultrasound signs. For example, ultrasound signs in ultrasound images of the lungs include B-lines, lung consolidation, pleural effusion, etc.; ultrasound signs of the intestinal tract include, for example, intestinal dilatation, intraluminal pneumatosis, the cockscomb sign and the piano-key sign, which characterize intestinal obstruction, or the pseudo-kidney sign and the target ring sign, which characterize intestinal tumors; ultrasound signs of the hepatobiliary region include, for example, the target sign and anti-target sign, which characterize liver nodules, and the floating vessel sign, which characterizes hepatic lymphoma.
For the ultrasonic images of different areas, the ultrasonic signs can be identified by adopting automatic identification, manual identification or a combination of the automatic identification and the manual identification. For example, neural network models may be trained separately for different regions for automatic recognition of ultrasound signatures therein. The embodiment of the invention does not limit the identification method of the ultrasonic signs.
In step S930, the ultrasound images of the one or more regions are scored according to the ultrasound signs to generate scoring results.
The scoring result can represent the damage degree of each region of the tested object, for example, the higher the score is, the more serious the damage degree is. In embodiments of the present invention, the regions may be scored based on ultrasound signs, automatically, manually, or a combination of automatically and manually.
In step S940, an ultrasound score map is displayed on a display interface, where the ultrasound score map includes a graphic of the measured object and identifiers displayed at one or more regions of that graphic, the identifiers representing the scoring results of the corresponding regions. The graphic of the measured object may be a structural diagram representing the shape of the measured object, or another indicative graphic having a corresponding structural relationship with it. For example, when the measured object is a human lung, the graphic may be a lung graphic; when the measured object is a plurality of tissues/organs of a human body, the graphic may be a human body model diagram, optionally with structural diagrams of the tissues/organs corresponding to the one or more regions displayed on it; and when the measured object is a heart, the graphic may be a four-quadrant diagram corresponding to the four chambers of the heart.
Wherein the identification may comprise a graph, and the graph represents the score of the scoring result in one or more of different colors, brightness, textures, texture densities, patterns, pattern densities, filling areas or shapes.
In one embodiment, a single-scoring ultrasound score map is displayed on the display interface, which may take two forms. The first form of ultrasound score map is similar to the lung ultrasound score map shown in fig. 3: identifiers representing the scoring results are displayed at one or more regions of the graphic of the measured object, and the ultrasound score map can serve as a playback navigation interface, i.e., when the identifier of a region is clicked, the ultrasound image of the corresponding region is displayed on the display interface.
The second form of ultrasound score map is similar to the lung ultrasound score map shown in fig. 4: the ultrasound images are displayed at the regions of the graphic of the measured object, with the identifiers displayed in synchronization with them. The graphic of the measured object depends on its type; for example, when the measured object is a human body, the graphic is a human shape or an approximately human shape, and when the measured object is a specific tissue or organ, the graphic is an image of the corresponding tissue or organ. For example, when the measured object is a liver, the displayed graphic is a liver graphic.
In another embodiment, a plurality of windows is displayed on the display interface simultaneously, with the ultrasound score map of one scoring displayed in each window. When the ultrasound score maps of multiple scorings are displayed, the user can view the patient's scoring results at different stages at once, and can thus visually compare and evaluate the changes in each part.
The ultrasound score map displayed in each of the plurality of windows may be any one of the above forms of ultrasound score maps, but preferably, the ultrasound score maps of the plurality of windows are in the same form for comparison. Since the ultrasound score map of the first form is relatively compact and intuitive, in one embodiment, when a plurality of ultrasound score maps are displayed simultaneously for comparison, the ultrasound score map of the first form may be used, for example.
As an example, when a plurality of ultrasound score maps are simultaneously displayed on a display interface, the obtaining time of the ultrasound score maps is displayed in each window, and the ultrasound score maps are arranged in the sequence of the obtaining time so as to be conveniently viewed by a user.
In one embodiment, when a plurality of ultrasound score maps is displayed simultaneously and an instruction is received selecting the identifier of a certain region on the map in one of the windows, the identifiers at the corresponding region on the ultrasound score maps in the other window or windows are automatically highlighted, facilitating the user's comparison of that region's scores at the various points in time.
In one embodiment, an identifier of a change of the scoring result of a certain region relative to the ultrasound score map of the previous or the next scoring may also be highlighted on the ultrasound score map of one scoring. That is, when the score of a certain region in one scoring differs from its score in the previous or next scoring, an identifier indicating the change is displayed in the ultrasound score map generated from that scoring, to prompt the user to focus on that region.
As an example, the identifier indicating that the scoring result has changed may be a graphic or text. It may be fixed, i.e., only indicating that the scoring result has changed without indicating whether it increased or decreased; alternatively, different identifiers may be used to indicate an increase or a decrease of the scoring result, respectively.
In one embodiment, the method 900 further comprises: obtaining the difference between the scoring results of each region at two adjacent acquisition times; and displaying the differences as graphical elements on the ultrasound score map in the window corresponding to the later time. The graphical elements may use one or more of different colors, sizes, shapes, textures, patterns, fill areas, and pattern densities to characterize the magnitude of the difference. The graphical elements characterizing the differences may be displayed alongside the identifiers characterizing the scoring results, or superimposed on them.
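A minimal sketch of the difference computation and its mapping to graphical elements is given below. The region identifiers, the colors, and the `size` attribute are illustrative assumptions, not choices specified by the disclosure.

```python
def score_differences(earlier, later):
    """Per-region difference between two adjacent scorings:
    later score minus earlier score."""
    return {region: later[region] - earlier[region] for region in later}

def difference_element(diff):
    """Map a difference value to illustrative graphical attributes
    (color encodes direction, size encodes magnitude)."""
    if diff > 0:
        return {"color": "red", "size": diff}      # score increased
    if diff < 0:
        return {"color": "green", "size": -diff}   # score decreased
    return {"color": "gray", "size": 0}            # unchanged

prev_scores = {"1R": 1, "2R": 0, "1L": 2}
curr_scores = {"1R": 3, "2R": 0, "1L": 1}
diffs = score_differences(prev_scores, curr_scores)
# diffs == {"1R": 2, "2R": 0, "1L": -1}
```

The elements produced for each region would then be drawn in the window of the later scoring, alongside or superimposed on the score identifiers.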
In one embodiment, when a selection of at least two ultrasound score maps for comparative evaluation is received, the at least two ultrasound score maps and their corresponding ultrasound images are displayed in a plurality of windows of the display interface, allowing the user to comparatively analyze the results of the two ultrasound diagnoses in combination with the score maps and the specific ultrasound images.
It should be noted that a user may select two ultrasound score maps for comparative evaluation either on a display interface displaying a plurality of ultrasound score maps, or on two display interfaces each displaying a single ultrasound score map.
Further, when an instruction selecting at least two ultrasound score maps for comparative evaluation is received, at least two groups of windows are displayed simultaneously on the display interface, and each group of windows displays one selected ultrasound score map and its corresponding ultrasound image. The user may click any position of an ultrasound score map to select it for comparative evaluation, or may select a specific identifier on an ultrasound score map to directly select the ultrasound image of the corresponding region for comparative evaluation.
In another embodiment, when a selection of the ultrasound image of a region of one ultrasound score map for display is received, the ultrasound images at the corresponding regions of the other ultrasound score map or maps are automatically displayed.
In summary, the ultrasound image analysis method 900 according to the embodiment of the present invention scores each region of the object under test and generates an ultrasound score map from the scoring results for display, so that ultrasound diagnosis results can be visually displayed and compared, enabling ultrasound diagnosis to better meet clinical requirements. The scheme of comparatively evaluating the results of multiple scorings makes reasonable use of a patient's historical examination data and helps a doctor comprehensively understand changes in the patient's condition over a period of time.
Next, an ultrasound imaging method 1000 according to yet another embodiment of the present application will be described with reference to fig. 10. The ultrasound imaging method can guide the current scan of the object under test based on its historical scores. As shown in fig. 10, the ultrasound imaging method 1000 may include the following steps:
In step S1010, a historical ultrasound score map of the object under test is obtained, where the historical ultrasound score map includes a graphic of the object under test and identifiers displayed in one or more regions of the graphic, the identifiers characterizing historical scores. The historical ultrasound score map may present a composite score of the object over a past period, the historical score from the most recent ultrasound examination, or multiple scorings over a past period displayed in parallel. Through the identifiers of the one or more regions, the user can intuitively grasp the historical condition of each region and guide the real-time scan accordingly.
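The three presentations just described (a composite over a period, the most recent scoring, or multiple scorings in parallel) could be backed by a simple summary over the stored scorings. The following Python sketch assumes a list-of-scorings structure and uses the latest and mean scores as illustrative composites; none of these structures or names appear in the disclosure.

```python
from statistics import mean

def summarize_history(history):
    """history: list of (acquisition_time, {region: score}) pairs in
    chronological order. Returns, per region, the most recent score and
    the mean score over the period."""
    regions = {r for _, scores in history for r in scores}
    return {
        r: {"latest": [s[r] for _, s in history if r in s][-1],
            "mean": mean(s[r] for _, s in history if r in s)}
        for r in regions
    }

history = [
    ("2018-03-30", {"1R": 1, "2R": 2}),
    ("2018-04-01", {"1R": 3, "2R": 2}),
]
summary = summarize_history(history)
# summary["1R"] == {"latest": 3, "mean": 2}
```

The "latest" value would drive the last-examination presentation, while the "mean" (or another aggregate) could drive the composite presentation.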
The identifier may comprise a graphic. The graphic may characterize the score in one or more of different colors, intensities, textures, texture densities, patterns, pattern densities, fill areas, or shapes. For example, graphics of different colors can be displayed at each region of the graphic of the object under test according to the scores; from the color of the graphic, the user can quickly read the score of the corresponding region. The ultrasound score map may display the identifiers characterizing the scores in the manner of fig. 3, or may synchronously display the ultrasound image of each region together with the identifier characterizing its score in the manner of fig. 4. For example, when the object under test is a human body, the one or more regions may be the thyroid, liver and gallbladder, kidneys, spleen, ureters, and the like. The historical ultrasound score map may display the corresponding one or more historical scoring results in each region in a color-differentiated representation.
At step S1020, a scanning indication for one or more regions is generated based on the historical scores. The scanning indication may be embedded in the scanning workflow, requiring the user to scan in real time according to the indication; alternatively, an interface indication may be provided to prompt the user during the current real-time scan. In one embodiment, a scanning order for the one or more regions may be generated based on the historical scores. For example, since a higher score generally indicates a more severe lesion in a region, the method may prompt the user to scan the regions in real time in descending order of score. In one embodiment, the processor compares the score of each region with a preset score threshold in the system; if a region's score exceeds the threshold, the region is regarded as requiring the user's particular attention in the current examination and may be highlighted on the graphic of the object under test, guiding the user to scan it. The preset threshold may be the same or different for each region. In one embodiment, the processor may analyze changes in the historical scores of the regions, set the scanning order of one or more regions according to the magnitude of the score increase or the time over which it occurred, or highlight as requiring particular attention any region whose score increase exceeds a preset difference threshold or occurred in less than a preset time threshold.
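The two indication mechanisms of step S1020 (a score-ordered scan sequence and a threshold-based key-region highlight) can be sketched as follows. The scoring scale, threshold values, and region identifiers are illustrative assumptions rather than values from the disclosure.

```python
def scan_order(historical_scores):
    """Regions in descending order of historical score: higher scores
    (generally more severe lesions) are scanned first."""
    return sorted(historical_scores,
                  key=lambda region: historical_scores[region],
                  reverse=True)

def key_regions(historical_scores, thresholds, default_threshold=2):
    """Regions whose historical score exceeds the preset (possibly
    per-region) threshold; these would be highlighted on the graphic
    of the object under test."""
    return {region for region, score in historical_scores.items()
            if score > thresholds.get(region, default_threshold)}

scores = {"1R": 3, "2R": 0, "1L": 2, "2L": 1}
order = scan_order(scores)               # ["1R", "1L", "2L", "2R"]
focus = key_regions(scores, {"1L": 1})   # {"1R", "1L"}
```

Here region 1L uses its own threshold of 1, while the others fall back to the assumed default of 2, illustrating the "same or different" per-region thresholds mentioned above.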
In step S1030, ultrasonic waves are transmitted to one or more regions of the object under test according to the scanning indication to perform scanning and obtain ultrasonic echo signals. As described above, when the scanning indication provides a scanning order, ultrasonic waves may be transmitted sequentially to each region of the object in that order; when the scanning indication provides a key-region prompt, the key regions may be scanned preferentially, or only the key regions may be scanned, according to the prompt.
In step S1040, the ultrasound echo signals are processed to obtain current ultrasound images of one or more regions.
The form of the historical ultrasound score map acquired in step S1010 may be similar to the two forms of lung ultrasound score map described with reference to figs. 3 and 4 in the method 200. For example, in the first form, an identifier characterizing the score is displayed at each region of the graphic of the object under test, but no ultrasound image is displayed. In this form, the ultrasound score map may serve as a playback navigation interface: when a selection instruction for the identifier of a region is received, the ultrasound image corresponding to that region is displayed on the display interface.
In the second form of the historical ultrasound score map, the score map may comprise the ultrasound image of each region displayed at the corresponding region of the graphic of the object under test, the identifier being displayed in synchronization with the ultrasound image. Specifically, the identifier is displayed on the ultrasound image, or the identifier and the ultrasound image are displayed side by side.
In one embodiment, the scanning order of the regions may be generated in descending order of their scores in the historical ultrasound score map, i.e., regions with higher scores and more severe damage are scanned first. Alternatively, the scanning order may be generated in ascending order of the scores. In another embodiment, regions whose scores have increased across multiple historical ultrasound score maps may be scanned preferentially.
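The increased-score priority mentioned above could, for example, compare the earliest and latest historical scorings. The following sketch is one illustrative way to do so, under assumed data structures; it is not the disclosed implementation.

```python
def increased_first(history):
    """history: list of {region: score} dicts in chronological order.
    Regions whose score rose from the first to the last scoring are
    scanned before the remaining regions."""
    first, last = history[0], history[-1]
    increased = [r for r in last if r in first and last[r] > first[r]]
    others = [r for r in last if r not in increased]
    return increased + others

history = [{"1R": 1, "2R": 2, "1L": 0},
           {"1R": 3, "2R": 2, "1L": 1}]
order = increased_first(history)
# order == ["1R", "1L", "2R"]
```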
For example, the ultrasonic probe 110 shown in fig. 1 may be excited by the transmitting circuit 112 to transmit ultrasonic waves to various regions of the object to be measured, and receive ultrasonic echoes returned from the object to be measured through the receiving circuit 114 and the beam forming circuit 116 to obtain ultrasonic echo signals.
Then, in step S1040, the ultrasound echo signals are processed to obtain current ultrasound images of each region. For example, the ultrasonic echo signals may be processed by the processor 118 to obtain ultrasonic images of various regions of the object.
In one embodiment, after acquiring the current ultrasound image of one or more regions, the ultrasound imaging method 1000 further comprises: identifying ultrasound signs of a current ultrasound image of the one or more regions; obtaining a current score for one or more regions from the ultrasound signature; and updating the historical ultrasonic scoring map according to the current scores of the one or more regions to obtain a current ultrasonic scoring map, wherein the current ultrasonic scoring map comprises the measured object graph and the identifiers of the one or more regions displayed on the measured object graph, and the identifiers are used for representing the current scores. The specific details of identifying the ultrasound signs, scoring based on the ultrasound signs, and generating the current ultrasound score map based on the scoring are described with reference to the associated description in method 200 and method 900 and will not be described in detail herein.
In one embodiment, when generating the current ultrasound score map, the identifiers of regions with changed scores may be highlighted on the current ultrasound score map. That is, if the score of a region in the current ultrasound score map differs from the score of the corresponding region in the historical ultrasound score map acquired in step S1010, the identifier of that region is highlighted.
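The update-and-highlight logic of the two paragraphs above can be sketched as follows, assuming per-region integer scores (an illustration, not the disclosed implementation).

```python
def update_score_map(historical, current):
    """Merge the current per-region scores into a current score map,
    flagging each region whose score differs from the historical one
    so that its identifier can be highlighted."""
    return {
        region: {"score": score,
                 "changed": region in historical
                            and historical[region] != score}
        for region, score in current.items()
    }

historical = {"1R": 1, "2R": 0}
current = {"1R": 2, "2R": 0}
updated = update_score_map(historical, current)
# updated["1R"] == {"score": 2, "changed": True}
# updated["2R"] == {"score": 0, "changed": False}
```

The `changed` flag is what would drive the highlighting of the corresponding identifier when rendering the current ultrasound score map.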
In the ultrasound imaging method 1000 according to yet another embodiment of the present application, guiding the scan of the current ultrasound diagnosis based on the historical ultrasound score map makes the diagnosis more targeted. For application scenarios such as physical examinations, the method helps make reasonable use of historical examination data, achieving long-term tracking and focused attention.
Referring now back to fig. 1, embodiments of the present invention also provide an ultrasound imaging system 100, which may be used to implement the method 200, the method 900, or the method 1000 described above. The ultrasound imaging system 100 may include some or all of the following components: an ultrasound probe 110, a transmitting circuit 112, a receiving circuit 114, a beam-forming circuit 116, a processor 118, a display 120, a transmit/receive selection switch 122, and a memory 124; the relevant description of these components may be found above.
The transmitting circuit 112 is used to excite the ultrasound probe 110 to transmit ultrasonic waves to the target object. The receiving circuit 114 is used to control the ultrasound probe 110 to receive the ultrasonic echoes returned from the target object, so as to obtain ultrasonic echo signals. The processor 118 is configured to: process the ultrasonic echo signals to obtain a lung ultrasound image; identify ultrasound signs of respective lung regions in the lung ultrasound image; and score each lung region according to the ultrasound signs to generate scoring results. The memory 124 stores the programs executed by the processor 118. The display 120 is configured to display a lung super-score map on a display interface, where the lung super-score map includes a lung graphic and identifiers displayed at the lung regions of the lung graphic, the identifiers characterizing the scoring results of the corresponding lung regions.
Wherein the processor 118 may perform steps S210 to S230 of the method 200 described above in connection with fig. 2, and the display 120 may perform step S240 of the method 200. Only the main functions of the ultrasound imaging system 100 will be described below, and details that have been described above will be omitted.
In one embodiment, the indicia in the lung score map displayed by the display 120 comprise a graph that characterizes the score of the scoring result in different colors, intensities, textures, texture densities, patterns, pattern densities, fill areas, and/or shapes.
Illustratively, the identification comprises color patches or color boxes of different colors, each color patch or color box representing a score of one of said scoring results. Further, the identification may also include the score of the scoring result displayed in the color block or frame.
By way of example, embodiments of the present invention propose two forms of the lung super-score map. One form is shown in fig. 3, which displays the identifiers on the lung graphic; here the lung super-score map is a playback navigation interface, and when the processor 118 receives a selection instruction for the identifier of a certain lung region, the display 120 displays the lung ultrasound image corresponding to that lung region on the display interface. Each lung region corresponds to one or more frames of the lung ultrasound images stored in the memory 124.
Another form of the lung super-score map is shown in fig. 4. In this form, the map comprises the lung ultrasound images of the respective lung regions displayed at the corresponding lung regions of the lung graphic, with the identifiers displayed in synchronization with the lung ultrasound images, so that the user can view the scoring results and the lung ultrasound images simultaneously.
When the marker is displayed in synchronization with the lung ultrasound image, the marker may be displayed on the lung ultrasound image, or side by side with it. As examples, the marker comprises a corner mark or border of different color, brightness, fill area, pattern, shape, or texture displayed on the lung ultrasound image.
In one embodiment, when the processor 118 receives a selection instruction for a lung ultrasound image of a lung region of the lung super-score map, it causes the display 120 to display the selected lung ultrasound image enlarged on the display interface for detailed viewing by the user.
In one embodiment, the display 120 is further configured to display the total score of the plurality of lung regions on a lung hyperscoring graph to enable a user to have an overview of the overall condition of the lungs.
The lung super-score map displayed by the display 120 has been exemplarily described above. The display 120 may display a single-scoring lung super-score map, or may simultaneously display a plurality of windows on the display interface, each window displaying the lung super-score map of one scoring, as shown in fig. 7.
When the display 120 displays multiple windows simultaneously, with one lung super-score map in each window, in one embodiment, when the processor 118 receives a selection of the identifier of a lung region of the lung super-score map in one of the windows, the display 120 is controlled to automatically highlight the identifiers at the corresponding lung regions of the lung super-score maps in the other window or windows for comparative analysis by the user.
Illustratively, the processor 118 is further configured to control the display 120 to display the acquisition time of the respective lung super-score map in each window. The processor 118 may further control the display 120 to arrange the lung super-score maps in chronological order of acquisition for easy viewing by the user.
In one embodiment, the processor 118 is further configured to control the display 120 to highlight, on the lung super-score map of a given scoring, an identifier indicating that the scoring result of a lung region has changed relative to the preceding or following scoring. That is, when the score of a certain lung region in one scoring differs from its score at the preceding or following time, an identifier indicating the change is displayed in the lung super-score map generated from that scoring, prompting the user to focus on that lung region.
As an example, the identifier indicating that the scoring result has changed may be a graphic, which may be a fixed graphic that merely signals the change. Alternatively, different graphics may be used to indicate an increase or a decrease in the scoring result, respectively.
In one embodiment, upon receiving a selection of at least two of the lung hyper-score maps for comparative evaluation, the processor 118 controls the display 120 to display the at least two of the lung hyper-score maps and their corresponding lung ultrasound images in the plurality of windows of the display interface.
Further, the processor 118 may control the display 120 to simultaneously display at least two groups of windows on the display interface, each group displaying one selected lung super-score map and its corresponding lung ultrasound image. As shown in fig. 8, if the user selects the 1R lung region in the two lung super-score maps of April 1, 2018 and March 30, 2018 in fig. 7 for comparative evaluation, the processor 118 controls the display 120 to display the lung super-score maps of April 1, 2018 and March 30, 2018 together with the lung ultrasound images of the 1R lung region on the display interface.
In another embodiment, the processor 118 is further configured so that, when a selection for display of the lung ultrasound image of a lung region of one of the lung super-score maps under comparative evaluation is received, the display 120 is controlled to automatically display the lung ultrasound images at the corresponding lung regions of the other lung super-score map or maps under comparative evaluation. For example, when the user selects the lung ultrasound image of the 1R lung region of April 1, 2018 for display in fig. 8, the processor 118 may control the display 120 to display the lung super-score map of March 30, 2018 and its lung ultrasound image of the 1R lung region in another group of windows.
Further, according to an embodiment of the present application, there is also provided a computer storage medium having stored thereon program instructions for executing the respective steps of the ultrasound image analysis method 200, 900 and the ultrasound imaging method 1000 of the embodiment of the present application when the program instructions are executed by a computer or a processor (such as the aforementioned processor 103 or processor 920). The storage medium may include, for example, a memory card of a smart phone, a storage component of a tablet computer, a hard disk of a personal computer, a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a portable compact disc read only memory (CD-ROM), a USB memory, or any combination of the above storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
In addition, according to an embodiment of the present application, a computer program is further provided, which may be stored on a storage medium in the cloud or locally. When executed by a computer or a processor, the computer program performs the corresponding steps of the ultrasound image analysis method of the embodiments of the present application.
Based on the above description, the ultrasound image analysis method, ultrasound imaging system, and computer storage medium according to the embodiments of the present application score each region of the target object and generate an ultrasound score map from the scoring results; the ultrasound score map can visually display and compare the scoring results, enabling ultrasound diagnosis to better meet clinical requirements. In addition, the ultrasound score map can also be used to guide subsequent ultrasound diagnosis.
Although the example embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above-described example embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present application. All such changes and modifications are intended to be included within the scope of the present application as claimed in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the present application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, this method of disclosure is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some of the modules according to embodiments of the present application. The present application may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiments of the present application or the description thereof, and the protection scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope disclosed in the present application, and shall be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (71)

  1. A method of ultrasound image analysis, the method comprising:
    acquiring a lung ultrasonic image;
    identifying ultrasound signs for respective lung regions in the lung ultrasound image;
    scoring the respective lung regions according to the ultrasound signs to generate scoring results;
    displaying a lung super-score map on a display interface, wherein the lung super-score map comprises a lung graph and marks displayed at each lung area of the lung graph, and the marks are used for representing the scoring result of the corresponding lung area.
  2. The method of claim 1, wherein the identifier comprises a graph that characterizes the score of the scoring result in different colors, intensities, textures, texture densities, patterns, pattern densities, fill areas, and/or shapes.
  3. The method of claim 1, wherein the identifier comprises color patches or color boxes of different colors, each color patch or color box representing a score of one of the scoring results.
  4. The method of claim 3, wherein the identification further comprises a score of the scoring result displayed in the color block or box.
  5. The method of any one of claims 1-4, wherein the lung score map is a playback navigation interface, the method further comprising:
    when a selection instruction of the identification of a certain lung area of the lung hyperscoring graph is received, displaying a lung ultrasound image corresponding to the lung area on the display interface.
  6. The method of claim 5, wherein each lung region corresponds to one or more frames of the ultrasound image of the lungs.
  7. The method of claim 5, further comprising: displaying a scoring chart of the multiple scoring results at the selected lung region on the lung ultrasound image.
  8. The method of claim 7, wherein the scoring result on the scoring graph has a mapping relationship with the ultrasound image of the lung corresponding to the selected lung region, the method further comprising:
    when a selection instruction of the scoring result on the scoring chart is received, displaying the lung ultrasonic image corresponding to the scoring result on the display interface.
  9. The method of claim 7, further comprising: and highlighting the scoring result of the currently displayed lung ultrasonic image in the scoring chart.
  10. The method of claim 1, wherein the lung hyper-score map comprises a lung ultrasound image of a respective lung region displayed at each lung region of the lung graphic, the indication being displayed in synchronization with the lung ultrasound image.
  11. The method of claim 10, wherein the marker is displayed on the ultrasound lung image or the marker is displayed in juxtaposition with the ultrasound lung image.
  12. The method of claim 10, wherein the indication comprises a corner mark or border of different color, brightness, fill area, pattern, shape, or texture displayed on the ultrasound image of the lung.
  13. The method of claim 10, wherein the lung hyperscoring map further comprises scores, B-line numbers, and/or B-line coverage percentages of scoring results displayed on the lung ultrasound image.
  14. The method of claim 10, further comprising:
    when a selection instruction of a lung ultrasonic image of a certain lung area of the lung hyperscoring map is received, the selected lung ultrasonic image is displayed in an enlarged mode on the display interface.
  15. The method of claim 1, further comprising: displaying a total score for a plurality of lung regions on the lung hyperscore map.
  16. The method of any one of claims 1 to 15, further comprising: outputting an ultrasound report including the lung ultrasound score map.
  17. The method of any one of claims 1-16, wherein displaying the lung ultrasound score map on the display interface comprises:
    displaying the lung ultrasound score map of a single scoring on the display interface.
  18. The method of any one of claims 1-16, wherein displaying the lung ultrasound score map on the display interface comprises:
    simultaneously displaying a plurality of windows on the display interface, each window displaying a lung ultrasound score map of one scoring.
  19. The method of claim 18, further comprising: when a selection of the marker at a lung region of the lung ultrasound score map in one of the windows is received, automatically highlighting the marker at the corresponding lung region of the lung ultrasound score map in each of the other windows.
  20. The method of claim 18, further comprising: displaying the acquisition time of the lung ultrasound score map in each of the windows.
  21. The method of claim 20, further comprising: highlighting, on the lung ultrasound score map of one scoring, the marker of any lung region whose scoring result has changed relative to the immediately preceding or following scoring.
  22. The method of claim 20, further comprising:
    obtaining the difference between the scoring results of each lung region at two adjacent acquisition times;
    displaying the difference as a graphical element on the lung ultrasound score map in the window corresponding to the later of the two acquisition times.
  23. The method of claim 22, wherein the graphical elements have different colors, sizes, shapes, textures, patterns, fill areas, and/or pattern densities to characterize the magnitude of the difference.
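The differencing of claims 22-23 can be sketched as follows. This is an illustrative Python sketch, not part of the claims; the region keys ("R1"…), the dict layout, and the element-size scale are assumptions:

```python
# Illustrative sketch of claims 22-23: per-region score differences
# between two adjacent acquisitions, mapped to a graphical-element size.
# Region keys and the size scale are assumptions, not claim language.

def score_differences(earlier, later):
    """Return {region: later_score - earlier_score} for shared regions."""
    return {r: later[r] - earlier[r] for r in later if r in earlier}

def element_size(diff, base=8, step=4):
    """A larger absolute difference yields a larger on-screen element;
    the sign of the difference can drive a color choice elsewhere."""
    return base + step * abs(diff)

prev_scores = {"R1": 0, "R2": 2, "R3": 3}
curr_scores = {"R1": 1, "R2": 2, "R3": 1}
diffs = score_differences(prev_scores, curr_scores)
print(diffs)                      # {'R1': 1, 'R2': 0, 'R3': -2}
print(element_size(diffs["R3"]))  # 16
```

The difference map would be drawn only in the window of the later acquisition, as claim 22 requires.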
  24. The method of claim 18, wherein displaying the lung ultrasound score map on the display interface comprises: simultaneously displaying four windows on the display interface.
  25. The method of claim 18, further comprising: when a selection of at least two lung ultrasound score maps for comparative evaluation is received, displaying the at least two lung ultrasound score maps and their corresponding lung ultrasound images in the plurality of windows of the display interface.
  26. The method of any one of claims 1-16, wherein, when a selection of at least two lung ultrasound score maps for comparative evaluation is received, displaying the lung ultrasound score map on the display interface comprises:
    simultaneously displaying at least two groups of windows on the display interface, each group of windows displaying a selected one of the lung ultrasound score maps and its corresponding lung ultrasound image.
  27. The method of claim 25 or 26, further comprising: when a selection to display the lung ultrasound image of a lung region of one lung ultrasound score map is received, automatically displaying the lung ultrasound images of the corresponding lung regions of the one or more other lung ultrasound score maps.
  28. The method of claim 1, wherein obtaining the lung ultrasound image comprises: acquiring the lung ultrasound image in real time, or reading the lung ultrasound image from a storage medium.
  29. The method of claim 1, wherein identifying the ultrasound signs in the lung ultrasound image comprises: automatic identification, manual identification, or a combination of automatic and manual identification.
  30. The method of claim 29, wherein the automatic identification comprises: inputting the lung ultrasound image into a trained neural network model, which outputs the identification result of the ultrasound signs.
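The claims leave the sign-to-score mapping open. A widely used lung ultrasound scoring convention (an assumption here, not recited in the claims, with illustrative sign keys) assigns 0 for A-lines, 1 for three or more separated B-lines, 2 for coalescent B-lines, and 3 for consolidation:

```python
def score_region(signs):
    """Map identified ultrasound signs of one lung region to a 0-3 score.

    `signs` is a dict such as {"b_lines": 4, "coalescent": False,
    "consolidation": False}; the key names are illustrative assumptions,
    and the 0-3 scale follows a common lung ultrasound convention rather
    than anything mandated by the claims.
    """
    if signs.get("consolidation"):
        return 3  # subpleural consolidation
    if signs.get("coalescent"):
        return 2  # coalescent B-lines
    if signs.get("b_lines", 0) >= 3:
        return 1  # three or more well-separated B-lines
    return 0      # A-lines, or fewer than three B-lines

print(score_region({"b_lines": 4}))           # 1
print(score_region({"consolidation": True}))  # 3
```

In the automatic path of claim 30, `signs` would be produced by the trained neural network; in the manual path of claim 29, by operator annotation.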
  31. An ultrasound image analysis method, comprising:
    acquiring ultrasound images of one or more regions of a measured object;
    identifying ultrasound signs in the ultrasound images of the one or more regions;
    scoring the ultrasound images of the one or more regions according to the ultrasound signs to generate scoring results;
    displaying an ultrasound score map on a display interface, the ultrasound score map comprising a graphic of the measured object and markers of the one or more regions displayed on the graphic, the markers characterizing the scoring results of the corresponding regions.
  32. The method of claim 31, wherein the marker comprises a graphic that characterizes the level of the score of the scoring result in different colors, intensities, textures, texture densities, patterns, pattern densities, fill areas, and/or shapes.
  33. The method of claim 31, wherein the ultrasound score map is a playback navigation interface, the method further comprising:
    when a selection instruction for the marker of a region of the ultrasound score map is received, displaying the ultrasound image corresponding to that region on the display interface.
  34. The method of claim 31, wherein the ultrasound score map comprises, at the one or more regions of the graphic of the measured object, the ultrasound images of the corresponding regions, the markers being displayed in synchronization with the ultrasound images.
  35. The method of claim 34, wherein the marker is displayed on the ultrasound image, or the marker is displayed alongside the ultrasound image.
  36. The method of claim 35, wherein the marker comprises a corner mark or border displayed on the ultrasound image and distinguished by color, brightness, fill area, pattern, shape, or texture.
  37. The method of any one of claims 31-36, wherein displaying the ultrasound score map on the display interface comprises:
    displaying the ultrasound score map of a single scoring on the display interface.
  38. The method of any one of claims 31-36, wherein displaying the ultrasound score map on the display interface comprises:
    simultaneously displaying a plurality of windows on the display interface, each window displaying an ultrasound score map of one scoring of the same regions.
  39. The method of claim 38, further comprising: displaying the acquisition time of the ultrasound score map in each window, and arranging the ultrasound score maps in order of acquisition time.
  40. The method of claim 39, further comprising: highlighting, on the ultrasound score map of one scoring, the marker of any region whose scoring result has changed relative to the immediately preceding or following scoring.
  41. The method of claim 39, further comprising:
    obtaining the differences between the scoring results of the one or more regions at two adjacent acquisition times;
    displaying the differences as graphical elements on the ultrasound score map in the window corresponding to the later of the two acquisition times.
  42. The method of claim 38, comprising displaying the ultrasound score map of the current scoring of the plurality of regions in one of the windows, and displaying the ultrasound score maps of one or more historical scorings of the corresponding regions in the remaining one or more of the plurality of windows.
  43. The method of claim 42, further comprising: generating a trend graph from the current scores and the historical scores.
  44. The method of claim 43, further comprising: outputting an ultrasound report comprising the trend graph and/or the differences between the scoring results of each region at two adjacent acquisition times.
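For claims 43-44, the trend graph can be derived from per-acquisition totals. A minimal sketch, where the timestamps and region names are illustrative assumptions:

```python
def score_trend(history):
    """history: list of (acquisition_time, {region: score}) pairs in
    chronological order. Returns parallel lists of times and total
    scores, ready to plot as a trend line in the ultrasound report."""
    times = [t for t, _ in history]
    totals = [sum(scores.values()) for _, scores in history]
    return times, totals

times, totals = score_trend([
    ("09:00", {"R1": 1, "R2": 2}),
    ("12:00", {"R1": 0, "R2": 1}),
    ("15:00", {"R1": 0, "R2": 0}),
])
print(totals)  # [3, 1, 0]
```

A falling total across acquisitions, as here, would read as improvement on the trend graph; per-region differences between adjacent times (claim 44) come from the same `history` structure.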
  45. The method of claim 38, further comprising: when at least two ultrasound score maps are selected for comparative evaluation, displaying the at least two ultrasound score maps and their corresponding ultrasound images in the plurality of windows of the display interface.
  46. The method of claim 45, further comprising: when, among the at least two ultrasound score maps under comparative evaluation, the ultrasound image of a region of one ultrasound score map is selected for display, automatically displaying the ultrasound images at the corresponding regions of the one or more other ultrasound score maps.
  47. An ultrasound imaging method, comprising:
    acquiring a historical ultrasound score map of a measured object, the historical ultrasound score map comprising a graphic of the measured object and markers displayed at one or more regions of the graphic, the markers characterizing historical scores;
    generating a scan indication for the one or more regions according to the historical scores;
    transmitting ultrasound waves to the one or more regions of the measured object according to the scan indication to perform scanning and obtain ultrasound echo signals; and
    processing the ultrasound echo signals to obtain current ultrasound images of the one or more regions.
  48. The method of claim 47, wherein generating the scan indication for the one or more regions according to the historical scores comprises: generating a scan order of the one or more regions according to the scores; or, according to the relationship between the scores and a preset threshold, highlighting any region whose score exceeds the preset threshold.
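Both scan-indication options of claim 48 can be sketched together; this is an illustrative Python sketch (region names and the threshold value are assumptions), ordering regions worst-first and flagging those above the preset threshold:

```python
def scan_indication(historical_scores, threshold=2):
    """Return (scan_order, highlighted): regions sorted by historical
    score, worst first, plus the regions whose score exceeds the
    preset threshold. Region names and threshold are illustrative."""
    order = sorted(historical_scores,
                   key=historical_scores.get, reverse=True)
    highlighted = [r for r in order if historical_scores[r] > threshold]
    return order, highlighted

order, highlighted = scan_indication({"R1": 0, "R2": 3, "R3": 2})
print(order)        # ['R2', 'R3', 'R1']
print(highlighted)  # ['R2']
```

The idea is that follow-up scanning revisits the previously worst regions first, and the display can highlight them before the probe is even placed.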
  49. The method of claim 47, further comprising:
    identifying ultrasound signs in the current ultrasound images of the one or more regions;
    obtaining current scores of the one or more regions from the ultrasound signs; and
    updating the historical ultrasound score map according to the current scores of the one or more regions to obtain a current ultrasound score map, the current ultrasound score map comprising the graphic of the measured object and the markers of the one or more regions displayed on the graphic, the markers characterizing the current scores.
  50. The method of claim 49, further comprising: highlighting, on the current ultrasound score map, the marker of any region whose score has changed.
  51. An ultrasound imaging system, comprising:
    an ultrasound probe;
    a transmitting circuit for exciting the ultrasound probe to transmit ultrasound waves to a target object;
    a receiving circuit for controlling the ultrasound probe to receive ultrasound echoes returned from the target object to obtain ultrasound echo signals;
    a processor configured to:
    process the ultrasound echo signals to obtain a lung ultrasound image;
    identify ultrasound signs of one or more lung regions in the lung ultrasound image; and
    score the one or more lung regions according to the ultrasound signs to generate scoring results;
    a memory for storing a program executed by the processor; and
    a display for displaying a lung ultrasound score map on a display interface, the lung ultrasound score map comprising a lung graphic and markers displayed at each lung region of the lung graphic, the markers characterizing the scoring results of the corresponding lung regions.
  52. The ultrasound imaging system of claim 51, wherein the marker comprises a graphic that characterizes the level of the score of the scoring result in different colors, intensities, textures, texture densities, patterns, pattern densities, fill areas, and/or shapes.
  53. The ultrasound imaging system of claim 51, wherein the markers comprise color blocks or color boxes of different colors, each color block or color box representing a score of the scoring result.
  54. The ultrasound imaging system of claim 53, wherein the marker further comprises the score of the scoring result displayed in the color block or color box.
  55. The ultrasound imaging system of claim 51, wherein the marker comprises a cover layer superimposed on each of the one or more lung regions, the cover layer characterizing the level of the score of the scoring result in different colors, intensities, textures, texture densities, patterns, pattern densities, and/or fill areas.
  56. The ultrasound imaging system of any one of claims 51 to 55, wherein the lung ultrasound score map is a playback navigation interface, and wherein, when the processor receives a selection instruction for the marker of a lung region of the lung ultrasound score map, the display displays the lung ultrasound image corresponding to that lung region on the display interface.
  57. The ultrasound imaging system of claim 56, wherein one or more frames of lung ultrasound images are stored for each of the one or more lung regions.
  58. The ultrasound imaging system of claim 51, wherein the lung ultrasound score map comprises, at the one or more lung regions of the lung graphic, the lung ultrasound images of the corresponding lung regions, the markers being displayed in synchronization with the lung ultrasound images.
  59. The ultrasound imaging system of claim 58, wherein the marker is displayed on the lung ultrasound image, or the marker is displayed alongside the lung ultrasound image, or the lung ultrasound image is displayed on the marker.
  60. The ultrasound imaging system of claim 59, wherein the marker comprises a corner mark or border displayed on the lung ultrasound image and distinguished by color, brightness, fill area, pattern, shape, or texture.
  61. The ultrasound imaging system of claim 58, wherein, when the processor receives an instruction selecting the lung ultrasound image of a lung region of the lung ultrasound score map, the processor causes the display to display the selected lung ultrasound image in an enlarged manner on the display interface.
  62. The ultrasound imaging system of claim 51, wherein the display is further configured to display the total score of the one or more lung regions on the lung ultrasound score map.
  63. The ultrasound imaging system of any one of claims 51 to 62, wherein displaying the lung ultrasound score map on the display interface comprises:
    simultaneously displaying a plurality of windows on the display interface, each window displaying a lung ultrasound score map of one scoring.
  64. The ultrasound imaging system of claim 63, wherein the processor is further configured to: when a selection of the marker at a lung region of the lung ultrasound score map in one of the windows is received, control the display to automatically highlight the marker at the corresponding lung region of the lung ultrasound score map in each of the other windows.
  65. The ultrasound imaging system of claim 63, wherein the processor is further configured to control the display to: display the acquisition time of the lung ultrasound score map in each window, and arrange the lung ultrasound score maps in order of acquisition time.
  66. The ultrasound imaging system of claim 65, wherein the processor is further configured to control the display to: highlight, on the lung ultrasound score map of one scoring, the marker of any lung region whose scoring result has changed relative to the immediately preceding or following scoring.
  67. The ultrasound imaging system of claim 63, wherein, upon receiving a selection of at least two of the lung ultrasound score maps for comparative evaluation, the processor controls the display to display the at least two lung ultrasound score maps and their corresponding lung ultrasound images in the plurality of windows of the display interface.
  68. The ultrasound imaging system of any one of claims 51 to 62, wherein, upon receiving a selection of at least two of the lung ultrasound score maps for comparative evaluation, displaying the lung ultrasound score map on the display interface comprises:
    simultaneously displaying at least two groups of windows on the display interface, each group of windows displaying a selected one of the lung ultrasound score maps and its corresponding lung ultrasound image.
  69. The ultrasound imaging system of claim 67 or 68, wherein, when the lung ultrasound image of a lung region of one of the at least two lung ultrasound score maps under comparative evaluation is selected for display, the processor is further configured to control the display to automatically display the lung ultrasound images at the corresponding lung regions of the one or more other lung ultrasound score maps.
  70. An ultrasound imaging system, comprising:
    an ultrasound probe;
    a transmitting circuit for exciting the ultrasound probe to transmit ultrasound waves to one or more regions of a measured object;
    a receiving circuit for controlling the ultrasound probe to receive ultrasound echoes returned from the one or more regions to obtain ultrasound echo signals;
    a processor configured to:
    process the ultrasound echo signals to obtain ultrasound images of the one or more regions;
    identify ultrasound signs in the ultrasound images of the one or more regions; and
    score the ultrasound images of the one or more regions according to the ultrasound signs to generate scoring results;
    a memory for storing a program executed by the processor; and
    a display for displaying an ultrasound score map on a display interface, the ultrasound score map comprising a graphic of the measured object and markers of the one or more regions displayed on the graphic, the markers characterizing the scoring results of the corresponding regions.
  71. A computer storage medium having a computer program stored thereon, the computer program, when executed by a computer or a processor, implementing the steps of the method of any one of claims 1 to 50.
CN201980100370.3A 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system, and computer storage medium Pending CN114375179A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/115420 WO2021087687A1 (en) 2019-11-04 2019-11-04 Ultrasonic image analyzing method, ultrasonic imaging system and computer storage medium

Publications (1)

Publication Number Publication Date
CN114375179A (en)

Family

ID=75848722

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980100370.3A Pending CN114375179A (en) 2019-11-04 2019-11-04 Ultrasonic image analysis method, ultrasonic imaging system, and computer storage medium

Country Status (2)

Country Link
CN (1) CN114375179A (en)
WO (1) WO2021087687A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113763353A (en) * 2021-09-06 2021-12-07 杭州类脑科技有限公司 Lung ultrasonic image detection system
CN116521912B (en) * 2023-07-04 2023-10-27 广东恒腾科技有限公司 Ultrasonic data storage management system and method based on artificial intelligence

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
KR101162599B1 (en) * 2010-08-18 2012-07-05 인하대학교 산학협력단 An automatic detection method of Cardiac Cardiomegaly through chest radiograph analyses and the recording medium thereof
CN103778600B (en) * 2012-10-25 2019-02-19 北京三星通信技术研究有限公司 Image processing system
JP5924296B2 (en) * 2013-03-19 2016-05-25 コニカミノルタ株式会社 Ultrasound diagnostic imaging equipment
JP2015061592A (en) * 2013-08-21 2015-04-02 コニカミノルタ株式会社 Ultrasonic diagnostic equipment, ultrasonic image processing method, and computer-readable non-temporary recording medium
EP3518771B1 (en) * 2016-09-29 2020-09-02 General Electric Company Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan
CN107563123A (en) * 2017-09-27 2018-01-09 百度在线网络技术(北京)有限公司 Method and apparatus for marking medical image
CN108573490B (en) * 2018-04-25 2020-06-05 王成彦 Intelligent film reading system for tumor image data
CN108846840B (en) * 2018-06-26 2021-11-09 张茂 Lung ultrasonic image analysis method and device, electronic equipment and readable storage medium
CN109727243A (en) * 2018-12-29 2019-05-07 无锡祥生医疗科技股份有限公司 Breast ultrasound image recognition analysis method and system

Also Published As

Publication number Publication date
WO2021087687A1 (en) 2021-05-14

Similar Documents

Publication Publication Date Title
JP6367425B2 (en) Ultrasonic diagnostic equipment
US20210177373A1 (en) Ultrasound system with an artificial neural network for guided liver imaging
EP3518771B1 (en) Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan
US10799215B2 (en) Ultrasound systems, methods and apparatus for associating detection information of the same
US20170086790A1 (en) Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan
JP2014036863A (en) Method for management of ultrasonic image, method for display and device therefor
CN101448461A (en) Ultrasonographic device and ultrasonographic method
CN114375179A (en) Ultrasonic image analysis method, ultrasonic imaging system, and computer storage medium
CN111493932B (en) Ultrasonic imaging method and system
CN112568933B (en) Ultrasonic imaging method, apparatus and storage medium
US8900144B2 (en) Diagnosis apparatus and method of operating the same
US20120078101A1 (en) Ultrasound system for displaying slice of object and method thereof
JP7346266B2 (en) Ultrasonic imaging system and method for displaying target object quality level
CN112842394A (en) Ultrasonic imaging system, ultrasonic imaging method and storage medium
CN114007513A (en) Ultrasonic imaging equipment, method and device for detecting B line and storage medium
US11250564B2 (en) Methods and systems for automatic measurement of strains and strain-ratio calculation for sonoelastography
CN112294360A (en) Ultrasonic imaging method and device
JP7159025B2 (en) Diagnostic device and diagnostic method
CN115666400A (en) Assisting a user in performing a medical ultrasound examination
CN113545806A (en) Prostate elastography method and ultrasound elastography system
US20210251608A1 (en) Ultrasound image processing device and method, and computer-readable storage medium
CN114569154A (en) Ultrasound imaging system and method for assessing tissue elasticity
CN113040822A (en) Method for measuring endometrial peristalsis and device for measuring endometrial peristalsis
CN114680937A (en) Mammary gland ultrasonic scanning method, mammary gland machine and storage medium
CN116096297A (en) Puncture guiding method based on ultrasonic imaging and ultrasonic imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination