CN114628011A - Human-computer interaction method of ultrasonic device, ultrasonic device and storage medium

Human-computer interaction method of ultrasonic device, ultrasonic device and storage medium

Info

Publication number
CN114628011A
CN114628011A
Authority
CN
China
Prior art keywords
line
pleural
pleural line
lung
editing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011442113.9A
Other languages
Chinese (zh)
Inventor
王曾科
龚栋梁
陈建军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chison Medical Technologies Co ltd
Original Assignee
Chison Medical Technologies Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chison Medical Technologies Co ltd filed Critical Chison Medical Technologies Co ltd
Priority to CN202011442113.9A priority Critical patent/CN114628011A/en
Priority to CN202110519851.7A priority patent/CN113053498B/en
Publication of CN114628011A publication Critical patent/CN114628011A/en
Withdrawn legal-status Critical Current

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0012 - Biomedical image inspection
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H15/00 - ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10132 - Ultrasound image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30004 - Biomedical image processing
    • G06T2207/30061 - Lung

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention relates to the technical field of image processing and provides a human-computer interaction method for an ultrasound device, comprising the following steps: acquiring a lung ultrasound image including a pleural line and a B-line; identifying the pleural line and the B-line in the lung ultrasound image, and drawing them in the image; when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the instruction; and measuring the edited pleural line and B-line to obtain the thickness-related parameters of the pleural line and the B-line spacing, and generating a lung examination report. The method can identify the pleural line and B-lines in a lung ultrasound image, accurately measure the pleural line thickness-related parameters and the B-line spacing by adjusting their positions, generate a lung examination report, and provide a sound basis for subsequent parameter measurement or disease diagnosis.

Description

Human-computer interaction method of ultrasonic device, ultrasonic device and storage medium
Technical Field
The invention relates to the technical field of image processing, and in particular to a human-computer interaction method for an ultrasound device.
Background
At present, when performing ultrasound diagnosis on a patient, accurate identification and editing of the pleural line region and the B-line region in a lung ultrasound image are crucial, as they support subsequent measurements such as pleural line thickness and the diagnosis of related lung diseases.
Disclosure of Invention
The invention aims to overcome the above deficiency in the prior art by providing a human-computer interaction method for an ultrasound device that can identify the pleural line and B-lines in a lung ultrasound image, accurately measure the various pleural line thicknesses by adjusting the positions of the pleural line and B-lines, generate a lung examination report, and provide a sound basis for subsequent parameter measurement or disease diagnosis.
In a first aspect, an embodiment of the present invention provides a human-computer interaction method for an ultrasound device, including:
acquiring a lung ultrasound image including a pleural line and a B-line;
identifying the pleural line and the B-line in the lung ultrasound image, and drawing them in the image;
when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction;
and measuring the edited pleural line and B-line to obtain the thickness-related parameters of the pleural line and the B-line spacing, and generating a lung examination report.
Further, the pleural line includes an upper pleural line and a lower pleural line, and the thickness-related parameters include the pleural line thickness and the pleural line thickness standard deviation, wherein the pleural line thickness is the distance between the upper pleural line and the lower pleural line and includes the average, maximum, and minimum pleural line thickness.
Further, the method also includes: inputting the editing instructions for the pleural line and the B-line on the lung ultrasound image through one or more of a keyboard, a touch screen, and a mouse.
Further, when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction includes:
receiving a move editing instruction for the B-line input by the user, and moving the B-line left or right accordingly, wherein the position of the B-line is adjusted within the X-axis coordinate range of the lower pleural line.
Further, when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction includes:
receiving an add editing instruction for the B-line input by the user, and adding a B-line at the middle position of the lower pleural line accordingly.
Further, when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction includes:
receiving a delete editing instruction for the B-line input by the user, and deleting the B-line accordingly.
Further, when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction includes:
receiving a button-click instruction input by the user, selecting the upper or lower pleural line according to the instruction, and adjusting the y-axis coordinate of the selected pleural line according to a preset rule.
Further, the adjusting of the y-axis coordinate of the upper or lower pleural line according to a preset rule includes:
obtaining a press instruction and drawing a target auxiliary line in the lung ultrasound image accordingly, wherein the target auxiliary line intersects the selected pleural line; modifying the y-axis coordinate of the selected pleural line with the first contact point of the intersection as the base point; and redrawing the corresponding pleural line according to the modified y-axis coordinate.
In a second aspect, an embodiment of the present invention provides an ultrasound apparatus, including:
a memory and a processor, the memory and the processor being communicatively coupled to each other, the memory having stored therein computer instructions, the processor performing the steps of the method as described above by executing the computer instructions.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium storing computer instructions for causing a computer to perform the steps of the method as described above.
Compared with the prior art, the method and device can identify the pleural line and B-lines in a lung ultrasound image, accurately measure the pleural line thickness-related parameters and the B-line spacing by adjusting the positions of the pleural line and B-lines, generate a lung examination report, and provide a sound basis for subsequent parameter measurement or disease diagnosis.
Drawings
FIG. 1 is a flow chart of a method according to an embodiment of the present invention.
FIG. 2 is a schematic illustration of a lung exam report in an embodiment of the present invention.
Fig. 3a is an original lung ultrasound image including a pleural line and B-lines according to an embodiment of the present invention.
Fig. 3b is a schematic diagram of operating on a B-line in the original lung ultrasound image according to an embodiment of the present invention.
Fig. 3c is a schematic diagram of operating on a pleural line in the original lung ultrasound image according to an embodiment of the present invention.
Fig. 4 is a schematic hardware structure diagram of an ultrasound apparatus provided in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In accordance with an embodiment of the present invention, an embodiment of a human-computer interaction method for an ultrasound device is provided. It is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one here.
As shown in fig. 1, the present embodiment provides a human-computer interaction method for an ultrasound device, the method including the steps of:
Step S1, acquiring a lung ultrasound image including a pleural line and a B-line;
When an operator uses the ultrasound device, the lung can be scanned with an ultrasound probe to obtain a lung ultrasound image containing a pleural line and B-lines;
Step S2, identifying the pleural line and the B-line in the lung ultrasound image, and drawing them in the image;
Step S3, when an editing instruction for the pleural line and the B-line input by the user is received, editing the pleural line and the B-line in the lung ultrasound image according to the instruction;
Step S4, measuring the edited pleural line and B-line to obtain the thickness-related parameters of the pleural line and the B-line spacing, and generating a lung examination report.
When all editing actions are completed, the measurements associated with the pleural line and the B-lines are recalculated.
Pleural thickening is an important factor in assessing lung disease. By identifying the pleural line and B-lines in a lung ultrasound image and adjusting their positions, the various pleural line thicknesses can be measured accurately, a lung examination report can be generated, and a sound basis can be provided for subsequent parameter measurement or disease diagnosis.
In some embodiments, the pleural line includes an upper pleural line and a lower pleural line, and the thickness-related parameters include the pleural line thickness and the pleural line thickness standard deviation, wherein the pleural line thickness is the distance between the upper and lower pleural lines and includes the average, maximum, and minimum pleural line thickness; the B-line spacing is the distance between two B-lines.
In figs. 3a-3c, the transverse curves F1 and F2 are the upper and lower pleural lines, respectively, and the vertical lines B1 and B2 are the two B-lines. In fig. 3a, the average pleural line thickness is 0.72 mm, the maximum pleural line thickness is 1.15 mm, the minimum pleural line thickness is 0.44 mm, the pleural line thickness standard deviation is 0.15, and the B-line spacing is 6.91 mm.
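For illustration, a minimal sketch of how these quantities could be computed, assuming the two pleural lines are stored as y-coordinate arrays sampled at the same x positions and that the pixel-to-millimeter scale is known; the function and parameter names are assumptions for this sketch, not taken from the patent:

```python
import numpy as np

def pleural_thickness_stats(upper_y, lower_y, mm_per_px):
    """Thickness-related parameters of the pleural line.

    upper_y, lower_y: y coordinates (pixels) of the upper and lower pleural
    lines, sampled at the same x positions (image y grows downward, so the
    lower line has the larger y values).
    mm_per_px: physical scale of the ultrasound image.
    """
    thickness = (np.asarray(lower_y) - np.asarray(upper_y)) * mm_per_px
    return {
        "mean_mm": float(thickness.mean()),  # average pleural line thickness
        "max_mm": float(thickness.max()),    # maximum pleural line thickness
        "min_mm": float(thickness.min()),    # minimum pleural line thickness
        "std": float(thickness.std()),       # thickness standard deviation
    }

def b_line_spacing(x1, x2, mm_per_px):
    """Distance between two B-lines, given their x positions in pixels."""
    return abs(x2 - x1) * mm_per_px
```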
In some embodiments, lung-function identification includes a real-time identification mode and a freeze identification mode. The real-time mode identifies and draws the pleural line and B-lines in the image during live imaging under the lung-preset B mode. The freeze mode, after the live image is frozen under the lung-preset B mode, identifies and draws the pleural line and B-lines in the current frame when the lung-function button is clicked, and measures and calculates the average pleural line thickness, the maximum pleural line thickness, the minimum pleural line thickness, the pleural line thickness standard deviation, and the B-line spacing.
In some embodiments, the editing instructions for the pleural line and the B-line are entered on the lung ultrasound image through one or more of a keyboard, a touch screen, and a mouse.
In some embodiments, when an editing instruction for the pleural line and the B-line input by the user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction includes:
receiving a move editing instruction for the B-line input by the user, and moving the B-line left or right accordingly, wherein the position of the B-line is adjusted within the X-axis coordinate range of the lower pleural line. The move editing instruction is an operation instruction for moving the B-line in a preset direction, which may be leftward or rightward; details are not repeated here.
For example, in fig. 3b, pressing the effective area of a B-line and dragging left or right adjusts the B-line position within the X-axis coordinate range of the lower pleural line F2. Specifically, when a finger or stylus presses on (or near) a B-line, the line changes color; as the finger moves left, the B-line moves left with it, and likewise to the right.
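A sketch of the clamping rule this implies, assuming the B-line is stored as a single x coordinate and the lower pleural line as a polyline of x coordinates (hypothetical names):

```python
def move_b_line(b_line_x, dx, lower_pleural_xs):
    """Shift a B-line horizontally by the drag displacement dx, keeping it
    within the X-axis coordinate range of the lower pleural line."""
    x_min, x_max = min(lower_pleural_xs), max(lower_pleural_xs)
    return max(x_min, min(b_line_x + dx, x_max))
```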
In some embodiments, when an editing instruction for the pleural line and the B-line input by the user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction includes:
receiving an add editing instruction for the B-line input by the user, and adding a B-line at the middle position of the lower pleural line accordingly. Optionally, the add editing instruction may be a selection instruction that selects the add option; after it is received, a click instruction at a target position is received, and a B-line can then be added at that position. Optionally, after the B-line is added, if the user needs to change its position, the added B-line can be moved to the desired position through a move editing instruction.
In some embodiments, when an editing instruction for the pleural line and the B-line input by the user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction includes:
receiving a delete editing instruction for the B-line input by the user, and deleting the B-line accordingly. To delete, the user first clicks to select a B-line and then clicks the delete button; clicking a non-button area of the image cancels the selection.
In some embodiments, when an editing instruction for the pleural line and the B-line input by the user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction includes:
receiving a button-click instruction input by the user, selecting the upper or lower pleural line according to the instruction, and adjusting the y-axis coordinate of the selected pleural line according to a preset rule.
In some embodiments, the adjusting of the y-axis coordinate of the upper or lower pleural line according to a preset rule includes:
obtaining a press instruction and drawing a target auxiliary line in the lung ultrasound image accordingly, wherein the target auxiliary line intersects the selected pleural line; modifying the y-axis coordinate of the selected pleural line with the first contact point of the intersection as the base point; and redrawing the corresponding pleural line according to the modified y-axis coordinate.
In fig. 3c, editing the pleural line means that the y-axis coordinate of the edited pleural line can be readjusted so that the drawn pleural line fits the actual pleura. The upper or lower pleural line is selected by clicking the edit button; when the selected line changes color, clicking the edit button again switches the selection, and clicking the image area cancels it. Pressing in the image area draws an auxiliary line that intersects the selected pleural line, and the y-axis coordinate of the selected line is redrawn using the relative y displacement of touch-move events, with the first contact point as the base point. Because the touch area need not be on the pleural line itself, the edited region is not blocked by the stylus or finger during the operation. As shown in fig. 3c, dragging the thicker points (the edit points) with a finger or stylus makes the pleural line follow the actual pleural contour more closely.
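One plausible reading of this gesture, sketched below under the same assumed representation: the first contact point anchors the drag, and each touch-move event applies the relative y displacement to the edit point under the auxiliary line (shifting the whole line instead would be a simpler variant). The point-wise behavior and all names are assumptions, not the patent's stated implementation:

```python
def drag_edit_pleural_line(line_ys, edit_index, start_y, current_y):
    """Apply the relative y displacement of a touch drag to one edit point
    of the selected pleural line; the caller redraws the line afterwards.

    line_ys: y coordinates of the line as of the first contact (pixels)
    edit_index: index of the edit point where the auxiliary line intersects
    start_y: y of the first contact point (the base point)
    current_y: y of the latest touch-move event
    """
    new_ys = list(line_ys)
    new_ys[edit_index] += current_y - start_y
    return new_ys
```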
In fig. 2, after editing during freeze analysis, clicking the save-image key saves a single picture together with the calculation results, and a lung examination report is provided when the report page is entered. The report treats the latest analysis result as the current examination; clicking the add button selects a comparison examination, so that the images can be compared side by side, the measurement results can be contrasted with their differences calculated automatically, and changes in the patient's condition can be identified at a glance.
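A sketch of the automatic difference calculation between the two examinations, assuming each examination's measurement results are stored as a name-to-value mapping (a hypothetical structure):

```python
def exam_diff(current, comparison):
    """Per-measurement difference between the current examination and the
    selected comparison examination, for display in the report page."""
    return {name: current[name] - comparison[name]
            for name in current if name in comparison}

# e.g. exam_diff({"mean_mm": 0.75}, {"mean_mm": 0.50}) -> {"mean_mm": 0.25}
```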
An embodiment of the present invention further provides an ultrasound device, whose structure is shown schematically in fig. 4. The ultrasound device may include: at least one processor 41, such as a CPU (Central Processing Unit); at least one communication interface 43; a memory 44; and at least one communication bus 42, which connects these components. The communication interface 43 may include a display and a keyboard, and optionally may also include standard wired and wireless interfaces. The memory 44 may be a high-speed volatile random access memory (RAM) or a non-volatile memory, such as at least one disk memory; alternatively, it may be at least one storage device located remotely from the processor 41. The memory 44 stores an application program, and the processor 41 calls the program code stored in the memory 44 to perform any of the above method steps.
The communication bus 42 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The communication bus 42 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 4, but this does not indicate only one bus or one type of bus.
The memory 44 may include volatile memory, such as random-access memory (RAM); it may also include non-volatile memory, such as flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 44 may also comprise a combination of the above kinds of memory.
The processor 41 may be a Central Processing Unit (CPU), a Network Processor (NP), or a combination of CPU and NP.
The processor 41 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
Optionally, the memory 44 is also used to store program instructions. The processor 41 may call program instructions to implement a human-computer interaction method of the ultrasound device as shown in the embodiment of fig. 1 of the present application.
The embodiment of the invention also provides a non-transitory computer storage medium, wherein the computer storage medium stores computer executable instructions, and the computer executable instructions can execute the human-computer interaction method of the ultrasonic equipment in any method embodiment. The storage medium may be a magnetic Disk, an optical Disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory (Flash Memory), a Hard Disk (Hard Disk Drive, abbreviated as HDD), a Solid State Drive (SSD), or the like; the storage medium may also comprise a combination of memories of the kind described above.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention has been described in detail with reference to examples, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention, which should be covered by the claims of the present invention.

Claims (10)

1. A human-computer interaction method for an ultrasound device, characterized by comprising the following steps:
acquiring a lung ultrasound image including a pleural line and a B-line;
identifying the pleural line and the B-line in the lung ultrasound image, and drawing them in the image;
when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction;
and measuring the edited pleural line and B-line to obtain the thickness-related parameters of the pleural line and the B-line spacing, and generating a lung examination report.
2. The human-computer interaction method of an ultrasound device of claim 1, wherein the pleural line comprises an upper pleural line and a lower pleural line, and the thickness-related parameters comprise a pleural line thickness and a pleural line thickness standard deviation, wherein the pleural line thickness is the distance between the upper pleural line and the lower pleural line and comprises an average, a maximum, and a minimum pleural line thickness.
3. The human-computer interaction method of an ultrasound device of claim 1, further comprising: inputting the editing instructions for the pleural line and the B-line on the lung ultrasound image through one or more of a keyboard, a touch screen, and a mouse.
4. The human-computer interaction method of the ultrasound device according to claim 1, wherein when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction comprises:
receiving a move editing instruction for the B-line input by the user, and moving the B-line left or right accordingly, wherein the position of the B-line is adjusted within the X-axis coordinate range of the lower pleural line.
5. The human-computer interaction method of the ultrasound device according to claim 1, wherein when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction comprises:
receiving an add editing instruction for the B-line input by the user, and adding a B-line at the middle position of the lower pleural line accordingly.
6. The human-computer interaction method of the ultrasound device according to claim 1, wherein when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction comprises:
receiving a delete editing instruction for the B-line input by the user, and deleting the B-line accordingly.
7. The human-computer interaction method of the ultrasound device according to claim 1, wherein when an editing instruction for the pleural line and the B-line input by a user is received, editing the pleural line and the B-line in the lung ultrasound image according to the editing instruction comprises:
receiving a button-click instruction input by the user, selecting the upper or lower pleural line according to the instruction, and adjusting the y-axis coordinate of the selected pleural line according to a preset rule.
8. The human-computer interaction method of the ultrasound device according to claim 7, wherein the adjusting of the y-axis coordinate of the upper or lower pleural line according to the preset rule comprises:
obtaining a press instruction and drawing a target auxiliary line in the lung ultrasound image accordingly, wherein the target auxiliary line intersects the selected pleural line; modifying the y-axis coordinate of the selected pleural line with the first contact point of the intersection as the base point; and redrawing the corresponding pleural line according to the modified y-axis coordinate.
9. An ultrasound device, comprising:
a memory and a processor communicatively coupled to each other, the memory having stored therein computer instructions, the processor performing the steps of the method of any one of claims 1-8 by executing the computer instructions.
10. A computer-readable storage medium, characterized in that it stores computer instructions for causing the computer to perform the steps of the method according to any one of claims 1-8.
CN202011442113.9A 2020-12-11 2020-12-11 Human-computer interaction method of ultrasonic device, ultrasonic device and storage medium Withdrawn CN114628011A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011442113.9A CN114628011A (en) 2020-12-11 2020-12-11 Human-computer interaction method of ultrasonic device, ultrasonic device and storage medium
CN202110519851.7A CN113053498B (en) 2020-12-11 2021-05-13 Man-machine interaction method of ultrasonic equipment, ultrasonic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011442113.9A CN114628011A (en) 2020-12-11 2020-12-11 Human-computer interaction method of ultrasonic device, ultrasonic device and storage medium

Publications (1)

Publication Number Publication Date
CN114628011A true CN114628011A (en) 2022-06-14

Family

ID=81896130

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011442113.9A Withdrawn CN114628011A (en) 2020-12-11 2020-12-11 Human-computer interaction method of ultrasonic device, ultrasonic device and storage medium
CN202110519851.7A Active CN113053498B (en) 2020-12-11 2021-05-13 Man-machine interaction method of ultrasonic equipment, ultrasonic equipment and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110519851.7A Active CN113053498B (en) 2020-12-11 2021-05-13 Man-machine interaction method of ultrasonic equipment, ultrasonic equipment and storage medium

Country Status (1)

Country Link
CN (2) CN114628011A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024105491A1 (en) * 2022-11-14 2024-05-23 Kimura Bruce Systems and methods for providing sensory output indicating lung thickening due to congestion, infection, inflammation, and/or fibrosis in a subject

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170086790A1 (en) * 2015-09-29 2017-03-30 General Electric Company Method and system for enhanced visualization and selection of a representative ultrasound image by automatically detecting b lines and scoring images of an ultrasound scan
US11839515B2 (en) * 2017-08-21 2023-12-12 Koninklijke Philips N.V. Detection, presentation and reporting of B-lines in lung ultrasound
EP3482689A1 (en) * 2017-11-13 2019-05-15 Koninklijke Philips N.V. Detection, presentation and reporting of b-lines in lung ultrasound
BR112020009982A2 (en) * 2017-11-22 2020-11-03 Koninklijke Philips N.V. ultrasound system, ultrasound imaging system, non-transitory computer-readable method and media


Also Published As

Publication number Publication date
CN113053498A8 (en) 2023-06-23
CN113053498B (en) 2023-08-11
CN113053498A (en) 2021-06-29

Similar Documents

Publication Publication Date Title
US10254113B2 (en) Inspection program editing environment providing user defined collision avoidance volumes
US11860602B2 (en) Inspection program editing environment with automatic transparency operations for occluded workpiece features
US20200033109A1 (en) Workpiece measurement device, workpiece measurement method and non-transitory computer readable medium recording a program
JP2021196705A (en) Image processing system, image processing method and program
CN110136153A (en) A kind of image processing method, equipment and storage medium
CN110942447A (en) OCT image segmentation method, device, equipment and storage medium
CN113053498B (en) Man-machine interaction method of ultrasonic equipment, ultrasonic equipment and storage medium
JP2932193B2 (en) Graphic processing unit
CN113467877A (en) Data display system and method
JP6246287B1 (en) Medical image processing apparatus and program
CN110554820B (en) GIS data editing method
CN111881049A (en) Acceptance method and device for application program interface and electronic equipment
CN116858102A (en) Weld joint size detection method, system, medium and equipment based on point cloud matching
JP7020636B2 (en) Optical interference tomography scan control
US11080875B2 (en) Shape measuring apparatus, shape measuring method, non-transitory computer readable medium storing program
JP2018049498A (en) Image processor, operation detection method, computer program, and storage medium
JP6802854B2 (en) Image processing part shape data creation system and image processing part shape data creation method
JP4221236B2 (en) Diagnostic imaging support program
JP2005025605A (en) System and method for generating physical data fitting coefficient
JP2019168251A (en) Shape measuring apparatus, shape measuring method, and program
JP2020071106A (en) Visual inspection method and program
CN115793893B (en) Touch writing handwriting generation method and device, electronic equipment and storage medium
JP7311099B2 (en) image display system
CN113805993B (en) Method for rapidly and continuously capturing images
JP6985157B2 (en) Image measuring machines, tool editing methods, and programs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220614