CN111528907A - Ultrasonic image pneumonia auxiliary diagnosis method and system - Google Patents

Ultrasonic image pneumonia auxiliary diagnosis method and system

Info

Publication number
CN111528907A
Authority
CN
China
Prior art keywords
patient
lung
image
ultrasonic image
rib
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010375817.2A
Other languages
Chinese (zh)
Inventor
郝文强
欧永红
陆林国
吴军
郑洪喆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wandong Yusheng Suzhou Medical Technology Co ltd
Original Assignee
Wandong Yusheng Suzhou Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wandong Yusheng Suzhou Medical Technology Co ltd filed Critical Wandong Yusheng Suzhou Medical Technology Co ltd
Priority to CN202010375817.2A
Publication of CN111528907A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 8/465 Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5223 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 8/5292 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves using additional data, e.g. patient information, image labeling, acquisition parameters

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Human Computer Interaction (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention discloses an ultrasound image pneumonia auxiliary diagnosis method, which comprises the following steps: importing an ultrasound image of the patient's chest; matching and displaying a healthy-sample ultrasound image, on which the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A/B line area are delineated for comparison; delineating the same four areas in sequence on the patient's chest ultrasound image; extracting edge features and texture features from the rib and rib-shadow areas and the hyperechoic pleural lines of the healthy sample and of the patient, respectively, and comparing the features; and dynamically displaying the lung sliding contour and the A/B line area on the images of the healthy sample and of the patient to judge the patient's condition. The invention also discloses an ultrasound image auxiliary diagnosis system. The invention trains a convolutional neural network with lung feature maps labeled by a large number of experts, so that accurate comparison can be ensured even when a less experienced sonographer operates the system; it provides an imaging reference for pneumonia diagnosis to doctors who are inexperienced or have little diagnostic experience.

Description

Ultrasonic image pneumonia auxiliary diagnosis method and system
Technical Field
The invention relates to an ultrasonic medical image processing technology, in particular to an ultrasonic image pneumonia auxiliary diagnosis method and system.
Background
In the treatment of pneumonia caused by the novel coronavirus that broke out in early 2020, medical image diagnosis, as an auxiliary method, laid a solid foundation for the definitive diagnosis of patients. Compared with plain chest X-ray films, ultrasound imaging as a pneumonia diagnosis method has higher sensitivity and specificity. Moreover, the application of lung ultrasound to emergency and critical patients has been widely studied. Lung ultrasound is convenient to use at the bedside, free of radiation hazard and good in real-time performance, characteristics that help reduce the need for bedside X-ray and chest CT scanning. Furthermore, lung ultrasound has proven superior to bedside chest X-ray in diagnosing many pleural and pulmonary lesions, and comparable to chest CT.
However, lung ultrasound image diagnosis depends closely on the experience of the imaging doctor. Generally, a doctor makes an ultrasound diagnosis according to the characteristics of the lesion region in the ultrasound gray-scale image (such as echo type and structural characteristics). However, interpretations of the same ultrasound gray-scale image at different time points or by different doctors are highly inconsistent, showing substantial inter-observer variability.
Therefore, faced with a large number of patients and a scarcity of ultrasound imaging experts, there is an urgent need to enable doctors without experience in the ultrasound imaging diagnosis of pneumonia to quickly grasp the ultrasound imaging characteristics of pneumonia.
Disclosure of Invention
In order to overcome the defects and shortcomings of the prior art, the invention aims to provide an ultrasound image pneumonia auxiliary diagnosis method for training doctors without pneumonia diagnosis experience to carry out ultrasound image auxiliary diagnosis.
Another aim of the invention is to provide an ultrasound image pneumonia auxiliary diagnosis system.
After a doctor manually delineates the lesion area in an ultrasound gray-scale image, the ultrasound image pneumonia auxiliary diagnosis system uses image processing techniques to comparatively analyze and mark the gray values of the lesion area, providing an objective visual image, reducing differences in doctors' interpretations, and enabling doctors without experience in pneumonia ultrasound image diagnosis to quickly master the ultrasound imaging characteristics of pneumonia; meanwhile, the system provides functions such as marking and viewing of the lesion area.
The technical scheme adopted by the invention for solving the technical problems is as follows:
An ultrasound image pneumonia auxiliary diagnosis method comprises the following steps:
importing an ultrasound image of the patient's chest;
matching and displaying a healthy-sample ultrasound image, on which the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A/B line area are delineated for comparison;
sequentially delineating the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A/B line area on the patient's chest ultrasound image;
extracting edge features and texture features from the rib and rib-shadow areas and the hyperechoic pleural lines of the healthy sample and of the patient, respectively, and comparing the features;
and observing the lung sliding contour and the dynamic image of the A/B line area of the healthy sample and of the patient to judge the patient's condition.
Furthermore, the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A/B line area of the healthy-sample lung ultrasound image can be automatically and correctly labeled by a trained, supervised convolutional neural network. Still further, the supervision process comprises: a lung ultrasound expert labels the relevant regions of healthy samples; the convolutional neural network is trained; the trained convolutional neural network labels the relevant regions of healthy samples again; and the lung ultrasound expert evaluates and confirms the labels until all labels are correct, at which point training is finished.
Further, edge features and texture features are extracted from the rib-shadow areas delineated for the healthy sample and by the doctor, and the features are compared; if both the edge features and the texture features are significant, step a is performed; otherwise the diagnostic opinion is: the patient may have tissue edema or thick subcutaneous fat;
a. edge features and texture features are extracted from the hyperechoic pleural line regions delineated for the healthy sample and by the doctor, and the features are compared; if both are significant, step b is performed; otherwise, follow-up observation of the dynamic image of the patient's lung sliding contour area is required;
b. observe whether the patient's dynamic lung ultrasound image shows a sine-wave sign:
if the sine-wave sign is present, the diagnostic opinion is: the patient has pleural effusion; consider pneumonia or atelectasis;
if the sine-wave sign is absent, the diagnostic opinion is: the patient has no apparent pleural effusion.
Further, the dynamic image of the patient's lung sliding contour area is observed first. If lung sliding is present, the dynamic ultrasound images of the A/B line areas of the patient and of the healthy sample are observed; if the two lines are obvious, the diagnostic opinion is: consider pulmonary edema or pneumonia; if the two lines are not apparent, consider pulmonary embolism or chronic obstructive pulmonary disease. If there is no lung sliding, the diagnostic opinion is: consider pneumothorax.
An ultrasound image pneumonia auxiliary diagnosis system comprises a login system, a patient list, region delineation, a feature module, supervised learning, common tools, a database and software configuration. The region delineation module is used for delineating and marking the ribs and rib shadows, hyperechoic pleural lines, lung sliding contours and A/B line areas; for this purpose the system calls the common tools module so the user can conveniently delineate and mark. The feature module is used for image texture extraction, edge extraction and feature comparison. The supervised learning module is used for training the convolutional neural network to automatically and correctly mark the relevant areas of healthy-sample lung ultrasound images. The common tools are used for importing or exporting images and for delineating and marking images. The database is used for storing imported patient images, labeled healthy images, labeled patient images and images to be labeled, and further comprises a health/patient matching sub-module.
Aiming at the current pneumonia situation, the invention provides a method flow and an implementation structure for the auxiliary diagnosis of pneumonia. The invention trains a convolutional neural network with lung feature maps labeled by a large number of experts, so that accurate comparison can be ensured when a less experienced sonographer uses the system. The invention provides an imaging reference for pneumonia diagnosis to doctors who are inexperienced or have little diagnostic experience. In addition, the ultrasound diagnosis of pneumonia is superior to chest X-ray imaging as an auxiliary diagnostic means and costs less than CT, making it suitable for rapid bedside diagnosis.
Drawings
FIG. 1 is a flow chart of an embodiment of the ultrasound image pneumonia auxiliary diagnosis method;
wherein the characteristic regions include: 1 - ribs and rib shadows, 2 - hyperechoic pleural line, 3 - lung sliding contour, 4 - A/B lines;
FIG. 2 is a system block diagram;
FIG. 3 is a diagram of software modules involved in the steps of the embodiment of FIG. 1;
FIG. 4 is a schematic diagram of a supervised learning module.
Detailed Description
The invention is further described below with reference to the accompanying drawings and examples.
The system of the embodiment comprises: login system, patient list, region delineation, feature module, supervised learning, common tools, database and software configuration.
As shown in fig. 2, the login system is composed of a system description and a system login page.
The patient list is composed of patient information, patient images, patient videos and analysis marks.
The region delineation is composed of operation guidance, an image panel, rib delineation, pleural line delineation, lung sliding contour delineation and A/B line delineation.
The feature module is composed of texture extraction, edge extraction, high echo point extraction, uniform feature extraction, echo type classification and feature comparison.
The supervised learning consists of a convolutional neural network and automatic labeling.
The common tools consist of import/export, display panel, zoom in/out, gripper tool, mouse tool, brush tool, text tool, and sketching display panel.
The database consists of imported patient images, health images after labeling, patient images after labeling, images to be labeled and a health/patient matching module.
The software configuration comprises a top navigation bar, version information, system configuration and user management.
Logging in to the system: first the system description is displayed, then the system login page. After the user inputs login information, the login system passes the information to the user management module in the software configuration; the user management module compares the input login information with the user information table, and when they match, login succeeds.
The common tools import patient images into the 'imported patient images' sub-library of the database. For each imported patient image, a matching healthy image is searched for in the 'labeled healthy images' sub-library using the health/patient matching module.
The user then uses the region delineation module to delineate the imported patient image with reference to the matched healthy image. After all areas (ribs, pleural line, lung sliding contour, A/B lines) are delineated, the system imports the delineated image into the 'labeled patient images' sub-library. The system feeds the rib and pleural line areas of the patient and of the corresponding healthy person into the feature module for feature extraction and comparison. The lung sliding contour and A/B line areas of the patient and of the corresponding healthy person are loaded into the display panel of the common tools for the user to judge (see the example of the ultrasound image pneumonia auxiliary diagnosis method below for how the diagnostic opinion is made).
Finally, the export tool of the common tools is called to export a diagnosis report containing the diagnostic opinions and the matching pictures.
In addition, the user can use the gripper tool, mouse tool, brush tool and text tool among the common tools to delineate and mark the image in the sketching display panel, which is convenient for producing customized diagnosis reports and for demonstration teaching. The user can also use the hyperechoic point extraction, uniform feature extraction and echo type classification sub-modules of the feature module to delineate and analyze features of the image in the sketching display panel.
Specifically, in the database of the system, the health/patient matching module matches by patient gender and age: the gender must be the same and the age within ±1 year (the more healthy samples collected before the formal clinical diagnosis, the better). In addition, the system provides normal lung ultrasound images of the same kind (including M-mode) as controls.
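As a minimal illustration of this matching rule (same gender, age within one year), the following Python sketch can be used; the Sample fields, the function name and the handling of the age tolerance are assumptions made for illustration, not the system's actual database schema:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Sample:
        pid: str      # illustrative identifier field
        gender: str   # e.g. "F" / "M"
        age: int      # age in years

    def match_healthy_sample(patient: Sample, healthy_library: List[Sample]) -> Optional[Sample]:
        """Return the first labeled healthy sample with the same gender and
        an age within one year of the patient's age, or None if no match."""
        for healthy in healthy_library:
            if healthy.gender == patient.gender and abs(healthy.age - patient.age) <= 1:
                return healthy
        return None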
When the user uses the system, the ultrasound image and information of a suspected pneumonia patient are imported, and the patient information shows the patient's basic information: patient name, gender, age, race and PID number.
Common tool modules for the present system include the following:
importing/exporting: and importing and exporting the patient image.
A display panel: and displaying the image of the patient.
Amplifying the image: clicking a function icon or dragging can enlarge the image.
Image reduction: clicking the function icon or dragging can reduce the image.
A gripper tool: and pressing the right mouse button can drag the image in the image panel.
Mouse tool: the mouse is switched from a non-mouse state to a mouse state.
A painting brush tool: pressing the right mouse button can draw lines freely on the image panel.
A text tool: a text edit box may be added to the image panel.
Sketching a display panel: compared with the display panel, the display panel has the following functions:
(1) cross amplification: when a user mouse is in a brush pen state, clicking intersection amplification, and when lines drawn by the brush pen intersect to form a closed loop, the intersection part can automatically enter the range of the drawing area.
(2) Intersection and decrement: when a user mouse is in a brush pen state, clicking intersection is reduced, and when lines drawn by the brush pen intersect to form a closed loop, the intersection part can be automatically removed from the range of the drawing area.
(3) Restoring the original image: when the user clicks to restore the original image, all previous operations on the image are cleared.
(4) Focal sketching: the user can draw any focus area within the image display range.
(5) Normal tissue delineation: the user can draw any normal tissue area within the image display range.
The feature type modules supported by the system are as follows:
extracting edge features: tools are provided for quantitatively analyzing edge features of lesions. The user can drag the scale to adjust the feature presentation.
Extracting texture features: tools are provided for quantitatively analyzing structural features of a lesion. The user can drag the scale to adjust the feature presentation.
High echo point extraction: tools are provided for quantitatively analyzing hyperechoic point features of lesions. The user can drag the scale to adjust the feature presentation.
Uniform feature extraction: tools are provided for quantitatively analyzing the uniformity characteristics of a lesion. The user can drag the scale to adjust the feature presentation.
And (3) classifying echo types: tools are provided for quantitatively analyzing echogenic type features of lesions. The user may modify the echo type feature presentation.
Among these, an embodiment of hyperechoic point extraction: (1) convert the delineated feature region into a grayscale map; (2) calculate the grayscale histogram of the region; (3) divide the pixels into high, medium and low echo areas according to thresholds set by the doctor; (4) surround the hyperechoic areas with red pixels to indicate them.
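A rough sketch of these four steps in Python with OpenCV/NumPy is given below; the two threshold values are placeholders for the doctor-adjustable settings and are not values fixed by the invention:

    import cv2
    import numpy as np

    def mark_hyperechoic(region_bgr: np.ndarray, low_thr: int = 85, high_thr: int = 170) -> np.ndarray:
        # (1) convert the delineated region to a grayscale map
        gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY)
        # (2) grayscale histogram of the region (kept for display / threshold tuning)
        hist = cv2.calcHist([gray], [0], None, [256], [0, 256])
        # (3) split pixels into low / medium / high echo classes by the two doctor-set thresholds
        levels = np.digitize(gray, [low_thr, high_thr])   # 0 = low, 1 = medium, 2 = high
        high_mask = (levels == 2).astype(np.uint8)
        # (4) surround the hyperechoic areas with red pixels (red = (0, 0, 255) in BGR)
        contours, _ = cv2.findContours(high_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        marked = region_bgr.copy()
        cv2.drawContours(marked, contours, -1, (0, 0, 255), 1)
        return marked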
An embodiment of uniform feature extraction: (1) convert the delineated feature region into a grayscale map; (2) normalize the grayscale pixel values to between 0 and 1; (3) divide the image into N x N image sub-blocks (e.g., 16 x 16) and calculate the variance of the pixel values in each sub-block; (4) sort all sub-block variances; (5) divide the image into uniform and non-uniform areas using the median of the sorted variances as the threshold.
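The following NumPy sketch illustrates these five steps; the 16 x 16 block size follows the example in the text, while the function and variable names are illustrative:

    import numpy as np

    def uniform_block_map(gray: np.ndarray, block: int = 16) -> np.ndarray:
        """Return a boolean map, one entry per block, marking uniform blocks."""
        # (2) normalize the pixel gray values to [0, 1]
        g = gray.astype(np.float64) / 255.0
        rows, cols = g.shape[0] // block, g.shape[1] // block
        variances = np.empty((rows, cols))
        # (3) variance of the pixel values in each N x N sub-block
        for r in range(rows):
            for c in range(cols):
                sub = g[r * block:(r + 1) * block, c * block:(c + 1) * block]
                variances[r, c] = sub.var()
        # (4)-(5) uniform / non-uniform split at the median of the sorted variances
        return variances <= np.median(variances)   # True = uniform block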
An embodiment of echo type classification: (1) convert the delineated feature region into a grayscale map; (2) calculate the grayscale histogram of the region; (3) divide the pixels into high, medium and low echo areas according to the upper and lower quartiles of the image gray values; (4) outline the high, medium and low echo areas in red, yellow and blue, respectively.
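A short OpenCV/NumPy sketch of this quartile-based classification follows; here the three classes are painted as solid color overlays rather than outlines, which is a simplification, while the red/yellow/blue convention of the text is kept:

    import cv2
    import numpy as np

    def classify_echo_types(region_bgr: np.ndarray) -> np.ndarray:
        # (1) convert the delineated region to a grayscale map
        gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY)
        # (3) upper and lower quartiles of the gray values as class boundaries
        #     (the explicit histogram of step (2) is implicit in the percentile computation)
        q1, q3 = np.percentile(gray, [25, 75])
        overlay = region_bgr.copy()
        overlay[gray >= q3] = (0, 0, 255)                    # high echo   -> red (BGR)
        overlay[(gray > q1) & (gray < q3)] = (0, 255, 255)   # medium echo -> yellow
        overlay[gray <= q1] = (255, 0, 0)                    # low echo    -> blue
        return overlay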
These three feature sub-modules are tools for displaying the basic features of the ultrasound image and can be called and displayed in the sketching display panel or the display panel sub-module.
Regarding the region delineation module: it is mainly used for delineating the rib, pleural line, lung sliding and A/B line regions; for this purpose the system calls the common tools module so the user can delineate conveniently.
Regarding the supervised learning module: before a doctor uses the system, a supervised convolutional neural network (CNN) has already been trained that can automatically and correctly mark the characteristic regions of normal lung ultrasound images, such as the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A and B lines, to serve as a contrast for the ultrasound images of pneumonia patients. The amount of training image data is large, and accuracy is guaranteed by lung ultrasound expert labeling. As shown in fig. 4, an embodiment of the supervised learning module: (1) retrieve the ribs and rib shadows, hyperechoic pleural line, lung sliding contour and A and B lines of healthy-sample lung ultrasound images labeled by a lung ultrasound expert; (2) train the convolutional neural network; (3) import unlabeled lung ultrasound images and label the features of the relevant regions with the trained CNN; (4) the lung ultrasound expert evaluates the CNN's labels, puts incorrectly labeled images back into the unlabeled data set and puts correctly labeled images into the correctly labeled data set; (5) training is finished when the unlabeled data set is empty.
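The iterative workflow of fig. 4 can be summarized in the Python skeleton below; train_cnn, cnn_label and expert_confirms are placeholders for the actual network training, inference and expert review steps, which the invention does not tie to any specific library:

    def supervised_labeling_loop(labeled, unlabeled, train_cnn, cnn_label, expert_confirms):
        """labeled: list of (image, annotation) pairs approved by the expert;
        unlabeled: list of lung ultrasound images still to be labeled."""
        while unlabeled:                                   # (5) stop when nothing is left unlabeled
            model = train_cnn(labeled)                     # (2) train on expert-approved labels
            still_unlabeled = []
            for image in unlabeled:                        # (3) label the relevant regions with the trained CNN
                annotation = cnn_label(model, image)
                if expert_confirms(image, annotation):     # (4) lung ultrasound expert evaluates each label
                    labeled.append((image, annotation))    # correct labels join the training set
                else:
                    still_unlabeled.append(image)          # incorrect labels go back to the unlabeled set
            if len(still_unlabeled) == len(unlabeled):     # safeguard added here to avoid an endless loop
                break
            unlabeled = still_unlabeled
        return labeled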
As shown in fig. 1 and fig. 3, an embodiment of the ultrasound image pneumonia auxiliary diagnosis method proceeds as follows:
(1) The doctor imports the ultrasound images (B-mode and M-mode) and the information of the patient's chest.
(2) The system displays the ultrasound images of the patient and of the corresponding part of the healthy sample (after the patient images are imported, a matching healthy sample is retrieved; healthy samples are stored, already delineated samples with identified features).
(3) On the ultrasound image of the healthy sample, the system automatically delineates the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A-line and B-line areas as a contrast.
(4) The doctor sequentially delineates the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A and B lines on the imported chest ultrasound image of the patient.
(5) The system extracts edge features and texture features from the rib-shadow areas delineated for the healthy sample and by the doctor and compares the features; if both the edge features and the texture features are significant, step (6) is performed, otherwise step (7).
(6) The system extracts edge features and texture features from the hyperechoic pleural line regions delineated for the healthy sample and by the doctor and compares the features; if both features are significant, step (8) is performed, otherwise step (9).
(7) A dialog box is displayed: the patient may have tissue edema or thick subcutaneous fat.
(8) The system displays the imported chest ultrasound image of the patient (M-mode) and displays the options: A. sine-wave sign present; B. no sine-wave sign. If the doctor selects A, step (10) is performed; if B, step (11).
(9) The system displays the lung sliding contour area on the imported patient image, together with the options: A. lung sliding present; B. no lung sliding. If the doctor selects A, step (12) is performed; if B, step (13). This step requires the doctor to observe and judge, because lung sliding is a dynamic image that cannot be compared automatically through edge and texture features; the doctor must follow the sliding track of the lung during breathing in the dynamic image.
(10) The system displays the diagnostic opinion: the patient has pleural effusion; consider pneumonia or atelectasis.
(11) The system displays the diagnostic opinion: the patient has no apparent pleural effusion.
(12) The system displays the A-line and B-line areas of the ultrasound images of the patient and of the corresponding part of the healthy sample, and displays the options: A. the two lines are evident; B. the two lines are not evident. If the doctor selects A, step (14) is performed; if B, step (15). The A line must be interpreted together with lung sliding to determine alveolar filling; the B line is a discrete vertical reverberation artifact that starts at the pleural line and extends to the bottom of the screen, moving in synchrony with lung sliding. Both determinations require the doctor to observe the dynamic image, and edge and texture features are not used.
(13) The system displays the diagnostic opinion: consider pneumothorax.
(14) The system displays the diagnostic opinion: consider pulmonary edema or pneumonia.
(15) The system displays the diagnostic opinion: consider pulmonary embolism or chronic obstructive pulmonary disease.
The doctor's selections in steps (8), (9) and (12), i.e. the manual selections, are required because the feature extraction method for continuously moving images (M-mode) differs from that for static images; letting the doctor judge reduces the technical difficulty of implementing the system and improves interpretation accuracy.
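For reference, the branching of steps (5) to (15) can be condensed into the Python sketch below; the boolean inputs stand for the automatic feature comparisons and the doctor's manual selections, and the returned strings only paraphrase the diagnostic opinions displayed by the system:

    def pneumonia_opinion(rib_features_significant: bool,
                          pleural_line_features_significant: bool,
                          sine_wave_present: bool,
                          lung_sliding_present: bool,
                          two_lines_evident: bool) -> str:
        if not rib_features_significant:                     # step (7)
            return "Patient may have tissue edema or thick subcutaneous fat."
        if pleural_line_features_significant:                # step (8): M-mode sine-wave check
            if sine_wave_present:                            # step (10)
                return "Pleural effusion; consider pneumonia or atelectasis."
            return "No apparent pleural effusion."           # step (11)
        if not lung_sliding_present:                         # step (13)
            return "Consider pneumothorax."
        if two_lines_evident:                                # step (14)
            return "Consider pulmonary edema or pneumonia."
        return "Consider pulmonary embolism or chronic obstructive pulmonary disease."  # step (15)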
Among them, an embodiment of edge feature extraction: (1) convert the delineated feature region into a grayscale map; (2) normalize the grayscale pixel values to between 0 and 1; (3) perform edge detection with a vertical Sobel operator and an automatically selected threshold; (4) perform edge detection with a horizontal Sobel operator and an automatically selected threshold; (5) perform edge detection with a 45-degree Sobel operator and an automatically selected threshold; (6) superimpose the edge maps detected at all angles; (7) equalize the histogram to obtain the edge map.
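An approximate OpenCV/NumPy sketch of this edge extraction follows; the 45-degree operator is written here as a hand-made kernel and Otsu's method stands in for the "automatically selected threshold", both of which are implementation assumptions not fixed by the text:

    import cv2
    import numpy as np

    def edge_feature_map(region_bgr: np.ndarray) -> np.ndarray:
        # (1)-(2) grayscale map, processed as float for filtering
        gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
        kernels = [
            np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], np.float32),   # (3) vertical Sobel
            np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], np.float32),   # (4) horizontal Sobel
            np.array([[0, 1, 2], [-1, 0, 1], [-2, -1, 0]], np.float32),   # (5) 45-degree Sobel
        ]
        combined = np.zeros(gray.shape, np.uint8)
        for kernel in kernels:
            response = np.abs(cv2.filter2D(gray, -1, kernel))
            response = cv2.normalize(response, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
            _, edges = cv2.threshold(response, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            combined = np.maximum(combined, edges)            # (6) superimpose the edge maps
        return cv2.equalizeHist(combined)                     # (7) histogram equalization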
An embodiment of texture feature extraction: (1) convert the delineated feature region into a grayscale map; (2) divide the image into N x N image sub-blocks (e.g., 16 x 16) and calculate the LBP value of each pixel in each sub-block; (3) compute a histogram for each sub-block, obtaining the histograms of the N x N image sub-blocks; (4) normalize the histograms of all image sub-blocks; (5) concatenate the normalized histograms of all sub-blocks to obtain the texture features of the delineated feature region.
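A sketch of this LBP-based texture descriptor, using scikit-image's local_binary_pattern, is shown below; the block size and the LBP parameters (P = 8, R = 1) are example values, not values prescribed by the text:

    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_texture_feature(gray: np.ndarray, block: int = 16, p: int = 8, r: int = 1) -> np.ndarray:
        # (2) LBP value of every pixel ('uniform' mapping gives p + 2 possible codes)
        lbp = local_binary_pattern(gray, p, r, method="uniform")
        n_bins = p + 2
        histograms = []
        h, w = gray.shape
        for row in range(0, h - block + 1, block):
            for col in range(0, w - block + 1, block):
                sub = lbp[row:row + block, col:col + block]
                hist, _ = np.histogram(sub, bins=n_bins, range=(0, n_bins))   # (3) per-block histogram
                histograms.append(hist / max(hist.sum(), 1))                  # (4) normalize each histogram
        return np.concatenate(histograms)                                     # (5) concatenated texture feature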
An embodiment of feature comparison (perceptual hash algorithm): (1) reduce the obtained texture feature map or edge feature map to 8 x 8, i.e. 64 pixels in total; (2) convert the reduced image to 64 gray levels; (3) calculate the average gray value of all 64 pixels; (4) compare the gray value of each pixel with the average, recording 1 if it is greater than or equal to the average and 0 if it is smaller; (5) combine the comparison results of the previous step into a 64-bit integer, which is the fingerprint of the image; (6) compare the two 64-bit integers computed from the two feature maps to be compared: if the number of differing bits is not more than 5, the difference between the two images is not obvious; if it is greater than 10, the difference is significant.
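The following OpenCV/NumPy sketch implements the 64-bit perceptual hash comparison described above; how distances between 5 and 10 should be interpreted is left open here, mirroring the text:

    import cv2
    import numpy as np

    def phash(feature_map: np.ndarray) -> int:
        small = cv2.resize(feature_map, (8, 8), interpolation=cv2.INTER_AREA)   # (1) shrink to 8 x 8
        if small.ndim == 3:
            small = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
        gray64 = (small // 4).astype(np.uint8)          # (2) quantize to 64 gray levels
        mean = gray64.mean()                            # (3) average gray value of the 64 pixels
        bits = (gray64.flatten() >= mean).astype(int)   # (4) 1 if >= mean, else 0
        fingerprint = 0
        for bit in bits:                                # (5) pack the 64 bits into one integer fingerprint
            fingerprint = (fingerprint << 1) | int(bit)
        return fingerprint

    def compare_fingerprints(f1: int, f2: int) -> str:
        distance = bin(f1 ^ f2).count("1")              # (6) Hamming distance between the fingerprints
        if distance <= 5:
            return "difference not obvious"
        if distance > 10:
            return "difference significant"
        return "borderline"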
The above description is only a preferred embodiment of the present invention, and should not be construed as limiting the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (6)

1. An ultrasound image pneumonia auxiliary diagnosis method, characterized by comprising the following steps:
importing an ultrasound image of the patient's chest;
matching and displaying a healthy-sample ultrasound image, on which the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A/B line area are delineated for comparison;
sequentially delineating the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A/B line area on the patient's chest ultrasound image;
extracting edge features and texture features from the rib and rib-shadow areas and the hyperechoic pleural lines of the healthy sample and of the patient, respectively, and comparing the features;
and observing the lung sliding contour and the dynamic image of the A/B line area of the healthy sample and of the patient to reach a conclusion.
2. The method as claimed in claim 1, wherein the ribs and rib shadows, the hyperechoic pleural line, the lung sliding contour and the A/B line area of the healthy-sample lung ultrasound image can be automatically and correctly labeled by a trained, supervised convolutional neural network.
3. The ultrasound image pneumonia auxiliary diagnosis method according to claim 2, wherein the supervision process comprises:
a lung ultrasound expert labels the relevant regions of healthy samples,
the convolutional neural network is trained,
the trained convolutional neural network labels the relevant regions of healthy samples again,
and the lung ultrasound expert evaluates and confirms the labels until all labels are correct, at which point training is finished.
4. The ultrasound image pneumonia auxiliary diagnosis method according to any one of claims 1 to 3, wherein
edge features and texture features are extracted from the rib-shadow areas delineated for the healthy sample and by the doctor and the features are compared; if both the edge features and the texture features are significant, step a is performed; otherwise the diagnostic opinion is: the patient may have tissue edema or thick subcutaneous fat;
a. edge features and texture features are extracted from the hyperechoic pleural line regions delineated for the healthy sample and by the doctor and the features are compared; if both are significant, step b is performed; otherwise, follow-up observation of the dynamic image of the patient's lung sliding contour area is required;
b. observe whether the patient's dynamic lung ultrasound image shows a sine-wave sign:
if the sine-wave sign is present, the diagnostic opinion is: the patient has pleural effusion; consider pneumonia or atelectasis;
if the sine-wave sign is absent, the diagnostic opinion is: the patient has no apparent pleural effusion.
5. The ultrasound image pneumonia auxiliary diagnosis method according to any one of claims 1 to 3, wherein
the dynamic image of the patient's lung sliding contour area is observed first,
and if lung sliding is present, the dynamic ultrasound images of the A/B line areas of the patient and of the healthy sample are observed: if the two lines are obvious, the diagnostic opinion is: consider pulmonary edema or pneumonia; if the two lines are not apparent, consider pulmonary embolism or chronic obstructive pulmonary disease;
and if there is no lung sliding, the diagnostic opinion is: consider pneumothorax.
6. An ultrasound image pneumonia auxiliary diagnosis system, characterized in that the system comprises a login system, a patient list, region delineation, a feature module, supervised learning, common tools, a database and software configuration,
wherein the region delineation module is used for delineating and marking the ribs and rib shadows, hyperechoic pleural lines, lung sliding contours and A/B line areas, for which the system calls the common tools module so the user can conveniently delineate and mark;
the feature module is used for image texture extraction, edge extraction and feature comparison;
the supervised learning module is used for training the convolutional neural network to automatically and correctly mark the relevant areas of healthy-sample lung ultrasound images;
the common tools are used for importing or exporting images and for delineating and marking images;
the database is used for storing imported patient images, labeled healthy images, labeled patient images and images to be labeled, and further comprises a health/patient matching sub-module.
CN202010375817.2A 2020-05-07 2020-05-07 Ultrasonic image pneumonia auxiliary diagnosis method and system Pending CN111528907A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010375817.2A CN111528907A (en) 2020-05-07 2020-05-07 Ultrasonic image pneumonia auxiliary diagnosis method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010375817.2A CN111528907A (en) 2020-05-07 2020-05-07 Ultrasonic image pneumonia auxiliary diagnosis method and system

Publications (1)

Publication Number Publication Date
CN111528907A true CN111528907A (en) 2020-08-14

Family

ID=71967774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010375817.2A Pending CN111528907A (en) 2020-05-07 2020-05-07 Ultrasonic image pneumonia auxiliary diagnosis method and system

Country Status (1)

Country Link
CN (1) CN111528907A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130184584A1 (en) * 2012-01-17 2013-07-18 Richard E. Berkey Systems and methods for computerized ultrasound image interpretation and labeling
US20160203263A1 (en) * 2015-01-08 2016-07-14 Imbio Systems and methods for analyzing medical images and creating a report
CN109583440A (en) * 2017-09-28 2019-04-05 北京西格码列顿信息技术有限公司 It is identified in conjunction with image and reports the medical image aided diagnosis method edited and system
CN109993733A (en) * 2019-03-27 2019-07-09 上海宽带技术及应用工程研究中心 Detection method, system, storage medium, terminal and the display system of pulmonary lesions

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112085729A (en) * 2020-09-18 2020-12-15 无锡祥生医疗科技股份有限公司 Pleural line region extraction method, storage medium, and ultrasound diagnostic apparatus
CN112801957A (en) * 2021-01-18 2021-05-14 华东师范大学 Pneumothorax automatic check out system based on ultrasonic strain formation of image
CN113487537A (en) * 2021-06-01 2021-10-08 北京大学人民医院 Information processing method, device and storage medium for breast cancer ultrasonic high-echo halo
WO2023215190A1 (en) * 2022-05-02 2023-11-09 Fujifilm Sonosite, Inc. Automated detection of lung slide to aid in diagnosis of pneumothorax

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200814)