WO2021003711A1 - Ultrasound imaging device and method, apparatus, and storage medium for detecting B-lines - Google Patents
Ultrasound imaging device and method, apparatus, and storage medium for detecting B-lines
- Publication number
- WO2021003711A1 (PCT/CN2019/095473)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- line
- lung
- ultrasound
- processor
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- The invention relates to the technical field of ultrasound imaging, and in particular to ultrasound imaging equipment and a method, an apparatus, and a storage medium for detecting B-lines.
- Ultrasound imaging is a medical imaging technology used to image organs and soft tissues in the human body, and it plays an important role in clinical medicine.
- Lung ultrasound has great application value in identifying and diagnosing pulmonary exudative lesions. It offers good sensitivity and specificity for diagnosing various lung diseases and can even substitute for chest CT examination.
- In emergency and intensive care medicine, diagnosing lung diseases with ultrasound can save time and cost in clinical practice and help save patients' lives in time.
- In most cases, lung ultrasound imaging does not reflect direct images of lung tissue but a series of artifacts, which can be defined according to the display characteristics of lung ultrasound images.
- When a normally inflated lung is examined, ultrasound can only detect the pleura. However, as the air content decreases, the echo loss between the lung and surrounding tissues is reduced, ultrasound can reflect deeper regions to a certain extent, and typical image signs result.
- Lung ultrasound has therefore been widely used and valued in critical and emergency care; recognizing typical image signs helps medical staff diagnose lung diseases rapidly.
- However, current lung ultrasound examination relies mainly on manual measurement, which is time-consuming and laborious. Examination results largely depend on the operator's experience level, and the degree of intelligence is low.
- The present invention mainly provides ultrasound imaging equipment and a method, an apparatus, and a storage medium for detecting B-lines, so as to improve the intelligence of lung ultrasound examination.
- an ultrasound imaging device including:
- a receiving circuit and a beam combining module for receiving the echo of the ultrasonic beam to obtain an ultrasonic echo signal;
- a processor configured to process the ultrasound echo signal to obtain at least one frame of lung ultrasound image; the processor is also configured to identify image signs of the lung ultrasound image, the image signs including at least the B-line, and to calculate parameter information of a lung ultrasound image whose image sign is the B-line, the parameter information including at least one of the B-line coverage percentage and the B-line interval, where the B-line coverage percentage is the percentage of the lung detection area occupied by the B-lines and the B-line interval is the distance between adjacent B-lines;
- a human-computer interaction device connected to the processor, used to detect user input information and display the detection result, the detection result including the parameter information.
- an embodiment provides a method for automatically detecting the B-line in a lung ultrasound image, including:
- the parameter information includes at least one of the B-line coverage percentage and the B-line interval, where the B-line coverage percentage is the percentage of the lung detection area occupied by the B-lines and the B-line interval is the distance between adjacent B-lines;
- an embodiment provides an apparatus for automatically detecting B-line in a lung ultrasound image, including:
- an acquisition module for acquiring at least one frame of lung ultrasound images;
- An identification module configured to identify image signs corresponding to an ultrasound image of the lungs in the lung detection area, where the image signs include at least a B line;
- a calculation module for calculating parameter information of a lung ultrasound image whose image sign is the B-line, the parameter information including at least one of the B-line coverage percentage and the B-line interval, where the B-line coverage percentage is the percentage of the lung detection area occupied by the B-lines and the B-line interval is the distance between adjacent B-lines;
- the display module is used to display the parameter information.
- an embodiment provides a computer-readable storage medium that includes a program that can be executed by a processor to implement the method as described above.
- the ultrasound imaging equipment can automatically detect the image signs in lung ultrasound images, the image signs including at least the B-line, and can automatically calculate the percentage of the lung detection area occupied by the B-lines and/or the distance between adjacent B-lines, realizing quantitative analysis of the B-line and improving the intelligence of lung ultrasound examination.
- Figure 1 is a schematic structural diagram of an ultrasonic imaging device in an embodiment of the present invention
- FIG. 2 is a flowchart of a method for automatically detecting B-line in an ultrasound image of a lung in an embodiment of the present invention
- FIG. 3 is a flowchart of a method for automatically detecting B-line in an ultrasound image of the lung in a specific embodiment of the present invention
- FIG. 4 is a schematic structural diagram of a processor in a specific embodiment of the present invention.
- FIG. 5 is a schematic diagram of the display effect of displaying the target image and the quantitative result in a specific embodiment of the present invention
- FIG. 6 is a schematic structural diagram of an apparatus for automatically detecting B-line in an ultrasound image of a lung in an embodiment of the present invention
- FIG. 7 is a schematic structural diagram of another device for automatically detecting the B-line in the lung ultrasound image in an embodiment of the present invention.
- Fig. 8 is a schematic structural diagram of another device for automatically detecting the B-line in the lung ultrasound image in an embodiment of the present invention.
- The terms "connection" and "connected" mentioned in this application include both direct and indirect connection unless otherwise specified.
- An available method is, for example, manually counting the number of B-lines.
- When the number of B-lines is small, this poses no great problem. However, as the number increases, the B-lines merge with each other and become difficult to distinguish, so the number of B-lines can no longer be used to quantitatively evaluate them.
- manual measurement is time-consuming and laborious, and relies on operator experience, which is very unfavorable to the promotion and application of lung ultrasound in the field of acute and severe diseases.
- Intelligent B-line automatic identification and quantitative analysis tools that support multiple parameters are expected to improve the efficiency and accuracy of B-line quantitative analysis, and promote lung ultrasound to play a greater role in the field of acute and severe diseases.
- The ultrasound imaging device recognizes the image signs of the acquired lung ultrasound images, the image signs including at least the B-line, and then calculates the number of B-lines, the percentage of the lung detection area occupied by the B-lines, and/or the distance between B-lines, completing automatic detection and quantitative analysis of the B-line.
- FIG. 1 is a schematic structural diagram of an ultrasonic imaging device provided by an embodiment of the present invention.
- The ultrasonic imaging device includes an ultrasonic probe 01, a transmitting circuit 02, a receiving circuit 03, a beam combining module 04, a processor 05, and a human-computer interaction device 06. The transmitting circuit 02 and the receiving circuit 03 can be connected to the ultrasonic probe 01 through the transmitting/receiving selection switch 07.
- The transmitting circuit 02 sends delay-focused transmission pulses with a certain amplitude and polarity to the ultrasound probe 01 through the transmit/receive selection switch 07 to excite the ultrasound probe 01 to emit ultrasonic beams toward target tissues (for example, organs, tissues, and blood vessels in a human or animal body). In this embodiment, the ultrasonic beams are emitted toward the lungs.
- The receiving circuit 03 receives the echo of the ultrasonic beam through the transmit/receive selection switch 07, obtains the ultrasonic echo signal, and sends it to the beam synthesis module 04. The beam synthesis module 04 performs focusing delay, weighting, and channel summation on the ultrasonic echo signal to obtain the beam-synthesized ultrasonic echo signal, which is then sent to the processor 05 for related processing to obtain the desired ultrasound image or a video file composed of ultrasound images.
- The ultrasound probe 01 usually includes an array of multiple array elements. Each time an ultrasonic wave is transmitted, all of the array elements, or a part of them, participate in the transmission. Each participating array element is excited by the transmission pulse and emits an ultrasonic wave, and the waves emitted by these elements superimpose during propagation to form the synthetic ultrasonic beam directed at the scanning target. In the embodiment of the present invention, the synthetic ultrasonic beam is the ultrasonic beam emitted toward the lungs.
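The focusing-delay, weighting, and channel-summation processing attributed to the beam combining module 04 is classic delay-and-sum beamforming. As a minimal illustrative sketch, not the patent's implementation (function and parameter names are invented, and the per-channel delays are assumed precomputed in integer samples):

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples, weights):
    """Apply per-channel focusing delays and apodization weights,
    then sum across channels.

    channel_data   : (n_channels, n_samples) raw echo samples
    delays_samples : (n_channels,) integer focusing delay per channel
    weights        : (n_channels,) apodization weights
    """
    n_ch, n_s = channel_data.shape
    out = np.zeros(n_s)
    for c in range(n_ch):
        d = int(delays_samples[c])
        # shift channel c earlier by its focusing delay, weight, accumulate
        out[: n_s - d] += weights[c] * channel_data[c, d:]
    return out
```

With matching delays, echoes from the focal point add coherently: a pulse arriving two samples later on one channel aligns with the other once delayed, doubling the summed amplitude.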
- the human-computer interaction device 06 is connected to the processor 05.
- the processor 05 can be connected to the human-computer interaction device 06 through an external input/output port.
- The human-computer interaction device 06 can detect the user's input information. The input information may be, for example, a control instruction for the timing of ultrasound transmission and reception, an operation input instruction for editing or marking the ultrasound image, or another instruction type. The operation instructions obtained when the user edits, marks, or measures the ultrasound image are used for the measurement of the target tissue.
- The human-computer interaction device 06 may include one or a combination of a keyboard, a mouse, a scroll wheel, a trackball, a mobile input device (such as a mobile device with a touch display screen, a mobile phone, etc.), a multi-function knob, and the like, and the corresponding external input/output port can be a wireless communication module, a wired communication module, or a combination of the two.
- the external input/output ports can also be implemented based on USB, bus protocols such as CAN, and/or wired network protocols.
- the human-computer interaction device 06 also includes a display, which can display the ultrasound image obtained by the processor 05.
- the display can also provide the user with a graphical interface for human-computer interaction while displaying the ultrasound image.
- One or more controlled objects are set on the graphical interface for the user to control by inputting operating instructions through the human-computer interaction device 06, thereby performing the corresponding control operations.
- an icon is displayed on a graphical interface, and the icon can be operated using a human-computer interaction device to perform a specific function, such as the function of marking ultrasound images.
- the display may be a touch screen display.
- the display in this embodiment may include one display or multiple displays.
- The processor 05 is used to process the ultrasound echo signal obtained by the beam synthesis module 04 to obtain at least one frame of lung ultrasound image. The processor 05 is also used to identify the image signs of the lung ultrasound image, the image signs including at least the B-line, and to calculate the parameter information of the lung ultrasound image whose image sign is the B-line.
- the parameter information includes at least one of the number of the B-line, the coverage percentage of the B-line, and the B-line interval.
- The B-line coverage percentage is the percentage of the lung detection area occupied by the B-lines, and the B-line interval is the distance between adjacent B-lines.
- the human-computer interaction device 06 displays the detection result through the display, and the detection result includes the parameter information calculated by the processor 05.
- the embodiment of the present invention also provides a method for automatically detecting the B-line in the lung ultrasound image.
- the flowchart can be seen in FIG. 2, and the method may include the following steps:
- Step 101 Acquire at least one frame of lung ultrasound images, for example, acquire a video file including one or more frames of ultrasound images.
- the processor 05 acquires at least one frame of lung ultrasound images collected by the ultrasound probe 01 in real time, or the processor 05 may also read at least one frame of lung ultrasound images from a storage device.
- Step 102 Recognize the image signs.
- After acquiring at least one frame of lung ultrasound images, the processor 05 recognizes the image signs of each frame, the image signs including at least the B-line. In one embodiment, the processor 05 may identify only the lung ultrasound images with B-lines from the acquired images based on a target detection algorithm.
- Step 103 Calculate parameter information of line B.
- After the processor 05 recognizes the B-line in the lung ultrasound image, it calculates the parameter information of the lung ultrasound image whose image sign is the B-line.
- the parameter information includes at least one of the coverage percentage of the B-line and the interval of the B-line.
- The B-line coverage percentage is the percentage of the lung detection area occupied by the B-lines, and the B-line interval is the distance between adjacent B-lines.
- For the B-line coverage percentage, the lung detection area of the frame can be determined first, and then the percentage of that area occupied by the recognized B-lines can be calculated. The lung detection area can be determined by a deep learning method, or the position of the pleural line can be identified first and the far-field region beyond the pleural line taken as the lung detection area of the lung ultrasound image.
- The area occupied by the B-lines refers to the area of all B-lines in the lung ultrasound image, which can be determined during the B-line recognition process. For example, the processor 05 can determine the identified pixels belonging to the B-lines and take the region defined by those pixels as the area occupied by the B-lines.
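Once a recognizer has produced pixel masks, the coverage computation described above reduces to counting pixels. A hypothetical sketch, assuming boolean NumPy masks for the B-line pixels and the lung detection area (how those masks are obtained is left to the recognition step):

```python
import numpy as np

def b_line_coverage_percent(b_line_mask, lung_region_mask):
    """B-line coverage percentage: the area occupied by B-line pixels
    as a percentage of the lung detection area. Both masks are boolean
    arrays of the same shape."""
    region_px = np.count_nonzero(lung_region_mask)
    if region_px == 0:
        return 0.0
    # only count B-line pixels that fall inside the lung detection area
    b_px = np.count_nonzero(b_line_mask & lung_region_mask)
    return 100.0 * b_px / region_px
```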
- For the B-line interval, the distance between adjacent B-lines at the position of the pleural line can be calculated.
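The interval computation can be sketched similarly; `b_line_columns` and `mm_per_pixel` are assumed inputs here, the former being the pixel columns where each detected B-line crosses the pleural line:

```python
def b_line_intervals(b_line_columns, mm_per_pixel):
    """Distances between adjacent B-lines measured along the pleural
    line row, converted from pixels to millimetres."""
    cols = sorted(b_line_columns)
    return [(b - a) * mm_per_pixel for a, b in zip(cols, cols[1:])]
```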
- Step 104 Display parameter information.
- the processor 05 sends the calculated parameter information to the human-computer interaction device 06 for display.
- The ultrasound imaging equipment and the method for automatically detecting the B-line in lung ultrasound images provided by the embodiments of the present invention can automatically detect the image signs of lung ultrasound images and calculate the B-line coverage percentage and/or B-line interval of images whose image sign is the B-line, realizing intelligent B-line identification and multi-parameter quantitative analysis and improving the intelligence of lung ultrasound examination.
- Calculating the B-line coverage percentage avoids the problem that the B-line count cannot quantitatively evaluate the B-lines when they are numerous and merge with each other, and it overcomes the low efficiency of manual measurement and the susceptibility of detection results to human factors, improving measurement efficiency and the accuracy of detection results.
- FIG. 3 provides a specific method of automatically detecting the B-line in the lung ultrasound image.
- the structure of the processor 05 can be seen in FIG. 4, which may include an acquisition unit 51, an image selection unit 52, an image analysis unit 53, a result selection unit 54 and a scoring unit 55.
- the method may include the following steps:
- Step 201 Acquire at least one frame of lung ultrasound images.
- the processor 05 obtains at least one frame of lung ultrasound images collected by the ultrasound probe 01 in real time through the obtaining unit 51, or the obtaining unit 51 may also read at least one frame of lung ultrasound images from a storage device.
- Step 202 Filter out images to be analyzed.
- After the processor 05 obtains at least one frame of lung ultrasound images, it inputs them to the image selection unit 52, which screens out the images to be analyzed. The images to be analyzed are lung ultrasound images with pathological features. Image screening can eliminate useless images, such as images without image information, images with non-pulmonary signs, and blurred images, so as to improve detection efficiency and reduce false detections.
- Step 203 Identify the image signs of the image to be analyzed.
- The image selection unit 52 inputs the selected images to be analyzed to the image analysis unit 53, which identifies the image signs of each frame; the identified image signs contain at least the B-line.
- the image analysis unit 53 first determines the lung detection area of each frame of the image to be analyzed.
- The image analysis unit 53 can determine the lung detection area of each frame of the image to be analyzed by a deep learning method; that is, a large number of lung areas can be calibrated in advance, and a machine can then be trained through a target detection algorithm to recognize this area.
- The target detection algorithm can be, for example, the Faster RCNN (faster region-based convolutional neural network) algorithm.
- The image analysis unit 53 may also determine the lung detection area of each frame by image processing; for example, it may first identify the position of the pleural line of the image to be analyzed and then take the far-field region beyond the pleural line as the lung detection area. The position of the pleural line can be determined from the recognized near-field bright horizontal line feature, or by deep learning.
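The "near-field bright horizontal line" heuristic mentioned here could be sketched as picking the row of maximum mean brightness in the upper part of the image. This is a crude illustrative stand-in, not the patent's method; the `near_field_fraction` parameter is an invented assumption:

```python
import numpy as np

def find_pleural_line_row(image, near_field_fraction=0.5):
    """Locate the pleural line as the brightest horizontal line in the
    near field: the row with the highest mean intensity within the top
    fraction of a 2-D grayscale image."""
    limit = max(1, int(image.shape[0] * near_field_fraction))
    row_means = image[:limit].mean(axis=1)
    return int(np.argmax(row_means))
```

The far-field region `image[find_pleural_line_row(image):]` would then serve as the lung detection area, as the text describes.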
- The bat sign includes the pleural line. Therefore, the image analysis unit 53 can also use the bat sign of the image to be analyzed to determine the lung detection area; that is, when recognizing the bat sign, the position of the pleural line can be determined from it.
- After the image analysis unit 53 determines the lung detection area, it recognizes image signs within that area.
- Image signs in lung ultrasound images include the A-line, B-line, bat sign, seashore sign, lung sliding sign, shred sign, etc.
- the detected image signs can include B-line, lung consolidation, etc.
- The image analysis unit 53 can recognize image signs in the lung detection area by a deep learning method, that is, by calibrating a large number of disease samples and then training a machine to recognize them through a target detection algorithm. The target detection algorithm may be, for example, the Faster RCNN algorithm.
- The image analysis unit 53 can also use image processing methods to identify image signs in the lung detection area. Since a B-line is a discrete vertical reverberation artifact that extends from the pleural line to the bottom of the image display screen without attenuation loss, vertical linear features along the sound beam direction can be detected in the lung detection area to obtain the B-lines; the linear features can be recognized by template matching and other methods. In practical applications, B-lines can be divided into single B-lines and diffuse B-lines according to their width.
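The vertical-linear-feature detection described above can be sketched as a column-brightness test below the pleural line. Real systems would use template matching or deep learning as the text notes, so the threshold and the grouping of adjacent columns here are illustrative assumptions:

```python
import numpy as np

def detect_b_line_columns(image, pleural_row, min_mean=0.5):
    """Return the centre column of each candidate B-line: columns whose
    mean brightness from the pleural line to the bottom of the image
    exceeds `min_mean` (B-lines extend to the bottom without loss).
    Runs of adjacent bright columns are grouped into one B-line."""
    far_field = image[pleural_row:]
    col_means = far_field.mean(axis=0)
    bright = col_means >= min_mean
    centers, start = [], None
    for x, b in enumerate(bright):
        if b and start is None:
            start = x                      # run of bright columns begins
        elif not b and start is not None:
            centers.append((start + x - 1) // 2)  # centre of the run
            start = None
    if start is not None:
        centers.append((start + len(bright) - 1) // 2)
    return centers
```

The returned columns could feed directly into the interval computation of step 103.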
- Step 204 Determine the lung ultrasound image with B-line.
- After the image analysis unit 53 recognizes the image signs of each frame of the image to be analyzed, it determines the lung ultrasound images with B-lines from the images to be analyzed according to the image signs.
- Step 205 Calculate the parameter information of the B line.
- After the image analysis unit 53 determines the lung ultrasound images with B-lines, it calculates, for each such frame, the B-line coverage percentage and/or the B-line interval according to the recognized B-lines, realizing quantitative analysis of the B-line.
- the image analysis unit 53 may first determine the area occupied by the B line, and then calculate the percentage of the area occupied by the B line in the corresponding lung detection area.
- The area occupied by the B-lines refers to the area occupied by all B-lines in the lung ultrasound image, which can be taken as the region defined by the positions where B-line pixels appear; these pixels can be learned during the B-line recognition process. On this basis, when determining the area occupied by the B-lines, the image analysis unit 53 can first determine the identified B-line pixels in the lung detection area and take the region they belong to as the area occupied by the B-lines.
- The image analysis unit 53 may first determine the position of the pleural line of the lung ultrasound image and then calculate, according to the recognized B-lines, the distance between adjacent B-lines at the pleural line position, or at a preset distance from the pleural line position. Calculating the B-line interval can assist medical staff in assessing the condition of the lungs.
- the image analysis unit 53 may also calculate the number of B lines based on the identified B lines.
- When evaluating the number of B-lines: when there are few B-lines, each can be clearly distinguished; as the number increases, the B-lines merge and become difficult to distinguish, and a more accurate quantification is then obtained from the B-line coverage percentage.
- Accordingly, after the image analysis unit 53 recognizes the B-lines, it can calculate the number of B-lines when each can be clearly distinguished (the number is small), calculate the B-line coverage percentage, or calculate both at the same time; when the B-lines cannot be clearly distinguished (they are so numerous that they merge), the B-line coverage percentage can be calculated and used to reflect the number of B-lines.
- Step 206 Determine the target image.
- the result selection unit 54 of the processor 05 may select at least one frame of images from the lung ultrasound images as the target image according to a preset rule.
- the preset rule may be the largest number of B-lines or the largest coverage percentage of the B-line.
- The result selection unit 54 may select, from the lung ultrasound images, the frame with the largest number of B-lines as the target image, or select the frame with the largest B-line coverage percentage as the target image.
- the selection criteria for selecting the target image can be set by the user.
- Step 207 Score the target image.
- the scoring unit 55 scores the selected at least one frame of the target image to obtain a scoring result for each frame of the target image, and the scoring result reflects the correlation between the target image and the associated disease.
- The scoring form for the target image may be any combination of numbers, letters, and text, and the score may be determined according to one or more items of the calculated parameter information.
- For example, a scoring rule can be: when there is no abnormality in the target image or there are fewer than 2 clear B-lines, the scoring result is N or 0; when there are more than 3 clear B-lines, the scoring result is B1 or 1; when the B-line interval is less than a preset value (diffuse B-lines), the scoring result is B2 or 2; when the image signs of the target image indicate lung consolidation, the scoring result is C or 3.
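The scoring rule quoted above can be written down directly. Note that the translated text leaves 2 to 3 clear B-lines unspecified, and this sketch maps that gap to N/0; the priority ordering (most severe condition first) and the 7 mm default for the "preset value" are invented assumptions:

```python
def score_target_image(num_clear_b_lines, min_interval_mm,
                       has_consolidation, diffuse_threshold_mm=7.0):
    """Return the (letter, numeric) score for one target image:
    consolidation -> ("C", 3); interval below preset (diffuse B-lines)
    -> ("B2", 2); more than 3 clear B-lines -> ("B1", 1); otherwise,
    including fewer than 2 clear B-lines -> ("N", 0)."""
    if has_consolidation:
        return ("C", 3)
    if min_interval_mm is not None and min_interval_mm < diffuse_threshold_mm:
        return ("B2", 2)
    if num_clear_b_lines > 3:
        return ("B1", 1)
    return ("N", 0)  # also covers the 2-3 B-line case the text leaves open
```

The whole-lung score described in step 207 would then be the sum of the numeric scores over all scan positions, e.g. `sum(score_target_image(*p)[1] for p in positions)`.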
- The scoring unit 55 can not only obtain the scoring result for a single lung position but can also calculate the sum of the scores of the target images corresponding to each scan position, that is, add the scores of all scan positions to obtain the scoring result for the entire lung.
- Step 208 Display the target image and the quantitative result.
- the processor 05 may send the selected target image and its corresponding parameter information and scoring result to the human-computer interaction device 06 for display.
- The human-computer interaction device 06 may also highlight at least one of the lung detection area, the B-lines, and the marker lines of the B-line interval in the target image.
- Figure 5 is a schematic diagram of the display effect, where the dotted line indicates the lung detection area, the vertical solid line in this area is the detected B line, and the upper right area shows the quantitative analysis result of the target image.
- the analysis results can include scoring results, the number of B lines and the coverage percentage of B lines.
- the table in the lower right area indicates the B line interval.
- The method for automatically detecting the B-line in lung ultrasound images provided in this embodiment first selects the images to be analyzed, those with pathological features, from the acquired lung ultrasound images, eliminating useless images, improving detection efficiency, and reducing the false detection rate. It then determines the lung detection area of each frame, identifies the image signs within it (the image signs including at least the B-line), determines the lung ultrasound images with B-lines according to the recognized image signs, and finally calculates at least one of the B-line coverage percentage, B-line interval, and B-line number for each of these images, realizing automatic detection and quantitative analysis of the B-line.
- At least one frame of the lung ultrasound images can be selected as the target image according to the calculated B-line parameter information, the target image can be scored, and finally the target image with its corresponding parameter information and scoring result can be displayed.
- This method overcomes the low efficiency of manual measurement, avoids the influence of human factors on detection results, improves the intelligence of lung ultrasound examination, and also improves measurement efficiency and the accuracy of detection results.
- In the foregoing, when performing image screening, the processor 05 screens out the images to be analyzed that have pathological features, then recognizes their image signs, and then determines the lung ultrasound images with B-lines according to the recognized image signs, thereby obtaining lung ultrasound images with B-lines.
- Alternatively, the processor 05 can filter out only the images with B-lines when performing image screening, that is, directly identify the lung ultrasound images with B-lines from the acquired at least one frame of lung ultrasound images; for example, the lung ultrasound images with B-lines can be identified from the acquired images based on a target detection algorithm.
- the foregoing embodiment is an example for determining the target image and scoring the target image.
- the human-computer interaction device 06 displays the target image and its corresponding parameter information and scoring result. In practical applications, the target image may not be scored.
- The processor calculates the parameter information and selects at least one frame of the lung ultrasound images as the target image according to preset rules, and the human-computer interaction device 06 directly displays the target image and its corresponding parameter information.
- In the foregoing, detection is realized by a fully automatic method. In practical applications, it can also be realized by a combination of automatic and manual methods, for example, automatically identifying B-lines and manually marking lung ultrasound images without B-lines.
- The image analysis unit 53 of the processor 05 can determine, from the images to be analyzed and based on their recognized image signs, the lung ultrasound images whose image signs are not the B-line. The human-computer interaction device 06 detects the user's operation instruction for marking such a non-B-line lung ultrasound image and sends the instruction to the processor 05, and the processor 05 marks the lung ultrasound image whose image sign is not the B-line accordingly.
- a device for automatically detecting the B-line in the lung ultrasound image includes an acquisition module 61, a determination module 62, an identification module 63, a calculation module 64 and a display module 65.
- the acquiring module 61 is used to acquire at least one frame of lung ultrasound images.
- the determination module 62 is used to determine the lung detection area of the lung ultrasound image acquired by the acquisition module 61; for example, the determination module 62 may determine the lung detection area of the lung ultrasound image according to a deep learning method, or the determination module 62 may also first Identify the position of the pleural line of the lung ultrasound image, and then use the far-field area of the pleural line position as the lung detection area of the frame of the lung ultrasound image.
- the identification module 63 is configured to identify image signs corresponding to the lung ultrasound images in the lung detection area determined by the determining module 62, and the image signs include at least the B-line.
- the calculation module 64 is used to calculate the parameter information of lung ultrasound images whose image sign is the B-line; the parameter information includes at least one of the B-line coverage percentage and the B-line interval, where the B-line coverage percentage is the percentage of the lung detection area occupied by B-lines, and the B-line interval is the distance between adjacent B-lines.
- the calculation module 64 is specifically configured to determine the pleural line position of a lung ultrasound image whose image sign is the B-line, and then to calculate, according to the recognized B-lines, the distance between adjacent B-lines at the pleural line position.
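As a minimal sketch of the B-line interval calculation, assuming the lateral positions where each recognized B-line crosses the pleural line are already available (the positions below are made-up values for illustration):

```python
def b_line_intervals(crossings_mm):
    """Distances between adjacent B-lines at the pleural line.

    `crossings_mm` holds the lateral positions (in mm, hypothetical
    values) where each recognized B-line crosses the pleural line.
    """
    xs = sorted(crossings_mm)
    return [b - a for a, b in zip(xs, xs[1:])]

intervals = b_line_intervals([4.0, 11.5, 19.0])  # two adjacent gaps
```

With fewer than two B-lines the function simply returns an empty list, i.e. no interval is defined.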
- the calculation module 64 is used to calculate, according to the recognized B-lines, the percentage of the corresponding lung detection area occupied by the B-lines, obtaining the B-line coverage percentage; specifically, the calculation module 64 can first determine the recognized regions belonging to B-lines, take those regions as the area occupied by the B-lines, and then calculate the percentage of the corresponding lung detection area that this area occupies.
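The coverage-percentage step reduces to a ratio of areas. A sketch, assuming the recognized B-line regions are given as a boolean mask over the lung detection area (the mask layout here is invented for illustration):

```python
import numpy as np

def b_line_coverage_percent(b_line_mask: np.ndarray) -> float:
    """Percentage of the lung detection area occupied by B-lines.

    `b_line_mask` is a boolean mask over the detection area, True where
    a pixel was recognized as belonging to a B-line.
    """
    return 100.0 * float(b_line_mask.mean())

# 10x10 detection area with two full-height B-line columns (20 of 100 pixels).
mask = np.zeros((10, 10), dtype=bool)
mask[:, 2:4] = True
coverage = b_line_coverage_percent(mask)
```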
- the display module 65 is used to display the parameter information calculated by the calculation module 64.
- the device includes an acquisition module 61, a determination module 62, an identification module 63, a calculation module 64, a display module 65, a selection module 66 and a scoring module 67.
- the functions of the acquiring module 61, the determining module 62, the identifying module 63 and the calculating module 64 are the same as those of their counterparts in FIG. 6.
- the selection module 66 is configured to select, according to preset rules, at least one frame of image from the lung ultrasound images acquired by the acquisition module 61 as the target image; for example, a single frame may be selected as the target image, and the calculation module 64 may additionally calculate the number of B-lines.
- the preset rule may be that the number of B-lines is the largest, or that the B-line coverage percentage is the largest.
- according to the calculation result of the calculation module 64, the selection module 66 can select, from the lung ultrasound images acquired by the acquisition module 61, the image with the largest number of B-lines as the target image, or the image with the largest B-line coverage percentage as the target image.
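The preset-rule selection can be sketched as a simple argmax over per-frame results; the dict field names and rule names below are illustrative assumptions, not identifiers from the patent:

```python
def select_target_frame(frames, rule="max_b_line_count"):
    """Pick the target frame under a preset rule: most B-lines or
    largest B-line coverage percentage."""
    key = "count" if rule == "max_b_line_count" else "coverage"
    return max(frames, key=lambda f: f[key])

# Hypothetical per-frame B-line results.
frames = [
    {"index": 0, "count": 2, "coverage": 8.0},
    {"index": 1, "count": 5, "coverage": 14.0},
    {"index": 2, "count": 4, "coverage": 21.0},
]
by_count = select_target_frame(frames, "max_b_line_count")
by_coverage = select_target_frame(frames, "max_coverage")
```

Note that the two rules can pick different frames, as in this example: the frame with the most B-lines need not be the one with the largest coverage.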
- the display module 65 is used to display the target image determined by the selection module 66 and its corresponding parameter information.
- the scoring module 67 can also score the selected at least one frame of target image to obtain a scoring result, which reflects the correlation between the target image and the associated condition.
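The text does not specify the scoring function, only that the score reflects correlation with the associated condition and that per-scan-position scores can be summed for the whole lung (cf. the claims). A hypothetical count-based scoring sketch, with a made-up 0-3 banding:

```python
def frame_score(b_line_count: int) -> int:
    """Map a frame's B-line count to a 0-3 severity score.

    The banding thresholds are hypothetical assumptions; the patent
    only states that the score reflects correlation with the
    associated condition.
    """
    if b_line_count == 0:
        return 0
    if b_line_count <= 2:
        return 1
    if b_line_count <= 5:
        return 2
    return 3

def whole_lung_score(counts_per_position):
    """Sum the per-scan-position target-image scores to obtain a
    scoring result for the whole lung."""
    return sum(frame_score(c) for c in counts_per_position)

total = whole_lung_score([0, 2, 6, 4])  # per-position scores 0, 1, 3, 2
```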
- the display module 65 can synchronously display the target image and its corresponding parameter information and scoring result.
- the display module 65 may also highlight in the target image at least one of the lung detection area, the B-lines, and the marker lines of the B-line intervals.
- the device includes an acquisition module 61, a determination module 62, an identification module 63, a calculation module 64, a display module 65, a screening module 68, a non-B-line determination module 69 and a marking module 60.
- the acquiring module 61 is used to acquire at least one frame of lung ultrasound images.
- the screening module 68 is configured to screen out images to be analyzed from the lung ultrasound images acquired by the acquiring module 61, the images to be analyzed being lung ultrasound images with pathological features. Different from the device in FIG. 6, here the determining module 62 is used to determine the lung detection area of the images to be analyzed screened out by the screening module 68, and the identification module 63 is used to identify the image signs of the images to be analyzed within the lung detection area and, according to those image signs, to determine the lung ultrasound images with B-lines from the images to be analyzed.
- the function of the calculation module 64 is the same as that in the device of FIG. 6.
- the non-B-line determination module 69 is configured to determine, from the images to be analyzed and according to the image signs recognized by the recognition module 63, the lung ultrasound images whose image sign is non-B-line.
- the marking module 60 is configured to receive an operation instruction for marking a lung ultrasound image whose image sign is non-B-line, and to mark, according to the operation instruction, the lung ultrasound images determined by the non-B-line determination module 69 to have a non-B-line image sign.
- the device may also include the selection module 66 and the scoring module 67 of the device shown in FIG. 7; in that case, the selection module 66 is used to select, according to preset rules, at least one frame of image from the images to be analyzed screened out by the screening module 68 as the target image.
- any tangible, non-transitory computer-readable storage medium can be used, including magnetic storage devices (hard disks, floppy disks, etc.), optical storage devices (CD-ROMs, DVDs, Blu-ray discs, etc.), flash memory, and/or the like.
- These computer program instructions can be loaded onto a general-purpose computer, a special-purpose computer, or other programmable data processing equipment to form a machine, so that the instructions executed on the computer or other programmable data processing device produce means for realizing the specified functions.
- Computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to operate in a specific manner, so that the instructions stored in the computer-readable memory form an article of manufacture including means for realizing the specified functions.
- Computer program instructions can also be loaded onto a computer or other programmable data processing equipment so that a series of operational steps is executed on the computer or other programmable equipment to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable equipment provide steps for implementing the specified functions.
- "Coupled" refers to a physical, electrical, magnetic, optical, communicative, functional, and/or any other connection.
Claims (42)
- 1. An ultrasound imaging device, characterized by comprising: an ultrasound probe; a transmitting circuit for exciting the ultrasound probe to transmit an ultrasound beam toward the lung; a receiving circuit and a beamforming module for receiving echoes of the ultrasound beam to obtain an ultrasound echo signal; a processor for processing the ultrasound echo signal to obtain at least one frame of lung ultrasound image, the processor being further configured to identify image signs of the lung ultrasound image, the image signs including at least the B-line, and to calculate parameter information of a lung ultrasound image whose image sign is the B-line, the parameter information including at least one of a B-line coverage percentage and a B-line interval, the B-line coverage percentage being the percentage of the lung detection area occupied by B-lines, and the B-line interval being the distance between adjacent B-lines; and a human-computer interaction device, connected to the processor, for detecting input information of a user and displaying a detection result, the detection result including the parameter information.
- 2. The ultrasound imaging device of claim 1, wherein the parameter information includes the B-line interval, and in calculating the B-line interval the processor is configured to: determine the position of the pleural line; and calculate, according to the recognized B-lines, the distance between adjacent B-lines at the pleural line position.
- 3. The ultrasound imaging device of claim 1, wherein the processor is configured to screen out images to be analyzed from the at least one frame of lung ultrasound image, the images to be analyzed being lung ultrasound images with pathological features, to identify image signs of the images to be analyzed, and to determine, according to the image signs, the lung ultrasound images with B-lines from the images to be analyzed.
- 4. The ultrasound imaging device of claim 3, wherein the processor further determines, from the images to be analyzed and according to the recognized image signs, lung ultrasound images whose image sign is non-B-line; the human-computer interaction device is further configured to detect the user's operation instruction for marking a lung ultrasound image whose image sign is non-B-line and to send the operation instruction to the processor; and the processor is further configured to mark the lung ultrasound image whose image sign is non-B-line according to the operation instruction.
- 5. The ultrasound imaging device of claim 1, wherein the processor is configured to identify the lung ultrasound images with B-lines from the at least one frame of lung ultrasound image.
- 6. The ultrasound imaging device of any one of claims 1 to 5, wherein in identifying the image signs of a lung ultrasound image the processor is configured to: determine the lung detection area of the lung ultrasound image; and identify the image signs within the lung detection area.
- 7. The ultrasound imaging device of claim 6, wherein the processor determines the lung detection area of the lung ultrasound image according to a deep learning method.
- 8. The ultrasound imaging device of claim 6, wherein in determining the lung detection area of the lung ultrasound image the processor is configured to: identify the pleural line position of the lung ultrasound image and take the far-field region of the pleural line position as the lung detection area of that lung ultrasound image.
- 9. The ultrasound imaging device of claim 6, wherein the parameter information includes the B-line coverage percentage, and in calculating the B-line coverage percentage the processor is configured to: calculate, according to the recognized B-lines, the percentage of the corresponding lung detection area occupied by the B-lines.
- 10. The ultrasound imaging device of claim 9, wherein the processor is configured to determine the recognized regions belonging to B-lines as the area occupied by the B-lines.
- 11. The ultrasound imaging device of claim 6, wherein the processor identifies the image signs within the lung detection area according to a deep learning method.
- 12. The lung ultrasound imaging device of claim 6, wherein the image sign is the B-line, and the processor is configured to detect, within the lung detection area, vertical line-shaped features along the sound-beam-line direction to obtain the corresponding image sign.
- 13. The ultrasound imaging device of claim 1, wherein after calculating the parameter information the processor is further configured to select, according to preset rules, at least one frame of image from the lung ultrasound images as a target image; and the human-computer interaction device is configured to display the target image and its corresponding parameter information.
- 14. The ultrasound imaging device of claim 1, wherein after calculating the parameter information the processor is further configured to select, according to preset rules, at least one frame of image from the lung ultrasound images as a target image, and to score the selected at least one frame of target image to obtain a scoring result, the scoring result reflecting the correlation between the target image and an associated condition.
- 15. The ultrasound imaging device of claim 13 or 14, wherein the parameter information further includes a number of B-lines, the target image is one frame of image, and the preset rules include: the number of B-lines is the largest, or the B-line coverage percentage is the largest.
- 16. The ultrasound imaging device of claim 14, wherein the processor is further configured to calculate the sum of the scores of the target images corresponding to the scan positions of the lung to obtain a scoring result for the whole lung.
- 17. The ultrasound imaging device of claim 14, wherein the human-computer interaction device is configured to display the target image together with its corresponding parameter information and scoring result.
- 18. The ultrasound imaging device of claim 13 or 14, wherein the human-computer interaction device is further configured to highlight in the target image at least one of the lung detection area, the B-lines and the marker lines of the B-line intervals.
- 19. The ultrasound imaging device of claim 1, wherein the processor is further configured to read at least one frame of lung ultrasound image from a storage device.
- 20. A method for automatically detecting B-lines in lung ultrasound images, characterized by comprising: acquiring at least one frame of lung ultrasound image; identifying image signs of the lung ultrasound image, the image signs including at least the B-line; calculating parameter information of a lung ultrasound image whose image sign is the B-line, the parameter information including at least one of a B-line coverage percentage and a B-line interval, the B-line coverage percentage being the percentage of the lung detection area occupied by B-lines, and the B-line interval being the distance between adjacent B-lines; and displaying the parameter information.
- 21. The method of claim 20, wherein the parameter information includes the B-line interval, and calculating the parameter information of a lung ultrasound image whose image sign is the B-line comprises: determining the pleural line position of the lung ultrasound image whose image sign is the B-line; and calculating, according to the recognized B-lines, the distance between adjacent B-lines at the pleural line position.
- 22. The method of claim 20, wherein identifying the image signs of the lung ultrasound image comprises: screening out images to be analyzed from the lung ultrasound images, the images to be analyzed being lung ultrasound images with pathological features; and identifying image signs of the images to be analyzed, and determining, according to the image signs, the lung ultrasound images with B-lines from the images to be analyzed.
- 23. The method of claim 22, further comprising: determining, from the images to be analyzed and according to the recognized image signs, lung ultrasound images whose image sign is non-B-line; and receiving an operation instruction for marking a lung ultrasound image whose image sign is non-B-line, and marking the lung ultrasound image whose image sign is non-B-line according to the operation instruction.
- 24. The method of claim 20, wherein identifying the image signs of the lung ultrasound image comprises: identifying the lung ultrasound images with B-lines from the lung ultrasound images based on an object detection algorithm.
- 25. The method of any one of claims 20 to 24, wherein identifying the image signs of a lung ultrasound image comprises: determining the lung detection area of the lung ultrasound image; and identifying the image signs within the lung detection area.
- 26. The method of claim 25, wherein determining the lung detection area of the lung ultrasound image comprises: determining the lung detection area of the lung ultrasound image according to a deep learning method; or identifying the pleural line position of the lung ultrasound image and taking the far-field region of the pleural line position as the lung detection area of that frame of lung ultrasound image.
- 27. The method of claim 25, wherein the parameter information includes the B-line coverage percentage, and calculating the parameter information of a lung ultrasound image whose image sign is the B-line comprises: calculating, according to the recognized B-lines, the percentage of the corresponding lung detection area occupied by the B-lines.
- 28. The method of claim 27, wherein calculating, according to the recognized B-lines, the percentage of the corresponding lung detection area occupied by the B-lines comprises: determining the recognized regions belonging to B-lines and taking them as the area occupied by the B-lines; and calculating the percentage of the corresponding lung detection area occupied by the B-lines.
- 29. The method of claim 20, wherein after calculating the parameter information of the lung ultrasound image whose image sign is the B-line, the method further comprises: selecting, according to preset rules, at least one frame of image from the lung ultrasound images as a target image; and displaying the target image and its corresponding parameter information.
- 30. The method of claim 20, wherein after calculating the parameter information of the lung ultrasound image whose image sign is the B-line, the method further comprises: scoring the selected at least one frame of target image to obtain a scoring result, the target image being selected from the lung ultrasound images according to preset rules, and the scoring result reflecting the correlation between the target image and an associated condition.
- 31. The method of claim 29 or 30, wherein the parameter information further includes a number of B-lines, the target image is one frame of image, and the preset rules include: the number of B-lines is the largest, or the B-line coverage percentage is the largest.
- 32. The method of claim 30, further comprising: calculating the sum of the scores of the target images corresponding to the scan positions of the lung to obtain a scoring result for the whole lung.
- 33. The method of claim 30, wherein when displaying the parameter information, the method further comprises: synchronously displaying the target image and its corresponding scoring result.
- 34. The method of claim 29 or 30, further comprising: highlighting in the target image at least one of the lung detection area, the B-lines and the marker lines of the B-line intervals.
- 35. An ultrasound imaging device, characterized by comprising: an ultrasound probe; a transmitting circuit for exciting the ultrasound probe to transmit an ultrasound beam toward the lung; a receiving circuit and a beamforming module for receiving echoes of the ultrasound beam to obtain an ultrasound echo signal; a processor for processing the ultrasound echo signal to obtain a video file comprising multiple frames of lung ultrasound images, the processor being further configured to determine the lung detection area of a lung ultrasound image and to identify, within the lung detection area, image signs of the lung ultrasound image, the image signs including at least the B-line, and the processor being further configured to determine quantitative parameter information of a lung ultrasound image whose image sign is the B-line, the quantitative parameter information including at least one of a number of B-lines, a B-line coverage percentage and a B-line interval; and a human-computer interaction device, connected to the processor, for detecting input information of a user and displaying a detection result, the detection result including the parameter information.
- 36. The ultrasound imaging device of claim 35, wherein the processor is configured to screen out images to be analyzed from the multiple frames of lung ultrasound images of the video file, the images to be analyzed being lung ultrasound images with pathological features, to identify image signs of the images to be analyzed, and to determine, according to the image signs, the lung ultrasound images with B-lines from the images to be analyzed.
- 37. The ultrasound imaging device of claim 35, wherein in determining the lung detection area of the lung ultrasound image the processor is configured to: identify the pleural line position of the lung ultrasound image and take the far-field region of the pleural line position as the lung detection area of that lung ultrasound image.
- 38. The ultrasound imaging device of claim 35, wherein the parameter information includes the B-line coverage percentage, and in determining the B-line coverage percentage the processor is configured to: calculate, according to the recognized B-lines, the percentage of the corresponding lung detection area occupied by the B-lines.
- 39. The ultrasound imaging device of claim 38, wherein the processor is configured to determine the recognized regions belonging to B-lines and take them as the area occupied by the B-lines.
- 40. The ultrasound imaging device of claim 35, wherein the parameter information includes the B-line interval, and in determining the B-line interval the processor is configured to: determine the position of the pleural line; and calculate, according to the recognized B-lines, the distance between adjacent B-lines at the pleural line position, or calculate, according to the recognized B-lines, the distance between adjacent B-lines at a preset distance from the pleural line.
- 41. The ultrasound imaging device of claim 35, wherein the parameter information includes the number of B-lines, and in determining the number of B-lines the processor is configured to: count the number of B-lines recognized within the lung detection area of the lung ultrasound image.
- 42. A computer-readable storage medium, characterized by comprising a program executable by a processor to implement the method of any one of claims 20 to 34.
Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201980097733.2A | 2019-07-10 | 2019-07-10 | Ultrasound imaging device, and method, apparatus and storage medium for detecting B-lines
PCT/CN2019/095473 (WO2021003711A1) | 2019-07-10 | 2019-07-10 | Ultrasound imaging device, and method, apparatus and storage medium for detecting B-lines

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
PCT/CN2019/095473 (WO2021003711A1) | 2019-07-10 | 2019-07-10 | Ultrasound imaging device, and method, apparatus and storage medium for detecting B-lines
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021003711A1 true WO2021003711A1 (zh) | 2021-01-14 |
Family
ID=74114313
Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
PCT/CN2019/095473 (WO2021003711A1) | Ultrasound imaging device, and method, apparatus and storage medium for detecting B-lines | 2019-07-10 | 2019-07-10

Country Status (2)

Country | Link
---|---
CN | CN114007513A
WO | WO2021003711A1
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN112819773A | 2021-01-28 | 2021-05-18 | 清华大学 (Tsinghua University) | Ultrasound image quantitative evaluation method
CN113763353A | 2021-09-06 | 2021-12-07 | 杭州类脑科技有限公司 | Lung ultrasound image detection system
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070167797A1 (en) * | 2003-11-07 | 2007-07-19 | Michalakis Averkiou | System and method for ultrasound perfusion imaging |
US20120302885A1 (en) * | 2011-05-27 | 2012-11-29 | Samsung Medison Co., Ltd. | Providing a measuring item candidate group for measuring size of a target object in an ultrasound system |
CN104116523A (zh) * | 2013-04-25 | 2014-10-29 | 深圳迈瑞生物医疗电子股份有限公司 | 一种超声影像分析系统及其分析方法 |
CN108038875A (zh) * | 2017-12-07 | 2018-05-15 | 浙江大学 | 一种肺部超声图像识别方法和装置 |
CN109310398A (zh) * | 2016-03-24 | 2019-02-05 | 皇家飞利浦有限公司 | 用于检测肺部滑动的超声系统和方法 |
US10324065B2 (en) * | 2014-01-06 | 2019-06-18 | Samsung Electronics Co., Ltd. | Ultrasound diagnostic apparatus, ultrasound image capturing method, and computer-readable recording medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11185311B2 (en) * | 2015-09-17 | 2021-11-30 | Koninklijke Philips N.V. | Distinguishing lung sliding from external motion |
US10667793B2 (en) * | 2015-09-29 | 2020-06-02 | General Electric Company | Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting B lines and scoring images of an ultrasound scan |
EP3518771B1 (en) * | 2016-09-29 | 2020-09-02 | General Electric Company | Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan |
EP3482689A1 (en) * | 2017-11-13 | 2019-05-15 | Koninklijke Philips N.V. | Detection, presentation and reporting of b-lines in lung ultrasound |
- 2019-07-10: CN application CN201980097733.2A filed (publication CN114007513A, pending)
- 2019-07-10: PCT application PCT/CN2019/095473 filed (WO2021003711A1, application filing)
Also Published As
Publication number | Publication date |
---|---|
CN114007513A (zh) | 2022-02-01 |
Similar Documents
Publication | Title
---|---
JP6841907B2 | Method, system and non-transitory computer-readable medium for enhanced visualization and selection of a representative ultrasound image by automatically detecting B-lines and scoring images of an ultrasound scan
EP3554380B1 | Target probe placement for lung ultrasound
US20170086790A1 | Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan
EP3463098B1 | Medical ultrasound image processing device
JP7022217B2 | Echo window artifact classification and visual indicators for ultrasound systems
CN107157515B | Ultrasound vascular detection system and method
JP7285826B2 | Detection, presentation and reporting of B-lines in lung ultrasound
US11403778B2 | Fetal development monitoring
US11931201B2 | Device and method for obtaining anatomical measurements from an ultrasound image
JP6648587B2 | Ultrasonic diagnostic apparatus
JP2020503099A | Prenatal ultrasound imaging
CN111511288A | Ultrasound lung assessment
WO2021003711A1 | Ultrasound imaging device, and method, apparatus and storage medium for detecting B-lines
KR20150000261A | Ultrasound system and method for providing a reference image corresponding to an ultrasound image
WO2021087687A1 | Ultrasound image analysis method, ultrasound imaging system and computer storage medium
CN116194048A | Ultrasound measurement method and system for the diaphragm
US20220361852A1 | Ultrasonic diagnostic apparatus and diagnosis assisting method
WO2020037673A1 | Ultrasound elastography device and method for processing elastography images
CN115299986A | Ultrasound imaging device and ultrasound examination method thereof
JP2009148499A | Ultrasonic diagnostic apparatus
US20230320694A1 | Graphical user interface for providing ultrasound imaging guidance
WO2021042242A1 | Ultrasound imaging device and method for processing ultrasound echo signals thereof
US11382595B2 | Methods and systems for automated heart rate measurement for ultrasound motion modes
JP2013052131A | Ultrasonic diagnostic apparatus and vascular stenosis improvement display program
JP7457571B2 | Ultrasonic diagnostic apparatus and diagnosis support method
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19936909; Country of ref document: EP; Kind code of ref document: A1
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 19936909; Country of ref document: EP; Kind code of ref document: A1
 | 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13/06/2022)