CN110269641B - Ultrasonic imaging auxiliary guiding method, system, equipment and storage medium - Google Patents

Ultrasonic imaging auxiliary guiding method, system, equipment and storage medium

Info

Publication number
CN110269641B
CN110269641B CN201910543045.6A
Authority
CN
China
Prior art keywords
user
image
information
scanning
grade
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910543045.6A
Other languages
Chinese (zh)
Other versions
CN110269641A (en)
Inventor
张佳民
贺勇庭
孙传景
孔维智
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sonoscape Medical Corp
Original Assignee
Sonoscape Medical Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sonoscape Medical Corp filed Critical Sonoscape Medical Corp
Priority to CN201910543045.6A priority Critical patent/CN110269641B/en
Publication of CN110269641A publication Critical patent/CN110269641A/en
Application granted granted Critical
Publication of CN110269641B publication Critical patent/CN110269641B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/467Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Primary Health Care (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Epidemiology (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The application provides an ultrasonic imaging auxiliary guidance method, system, device, and storage medium. The method comprises the following steps: acquiring user level information and imaging parameter information selected by a user, wherein the user level information represents the imaging quality level of the user in historical scanning operations, and the imaging parameter information comprises diagnostic part information, a feature image set corresponding to the diagnostic part, and an automated operation flow protocol; and outputting scanning operation guidance information that matches the user level and corresponds to the automated operation flow protocol. This guidance method can output different operation guidance information for users of different levels, thereby meeting diversified user needs.

Description

Ultrasonic imaging auxiliary guide method, system, equipment and storage medium
Technical Field
The present application relates to the field of ultrasound imaging technologies, and in particular, to an ultrasound imaging aided guiding method, system, device, and storage medium.
Background
Ultrasound imaging auxiliary guidance is a key tool for assisting ultrasound clinicians in scanning imaging and helping physicians improve scanning quality. When a physician operates the ultrasound imaging system to scan, the system synchronously outputs scanning guidance prompts so that the physician can perform accurate scanning operations according to the prompt information.
In practice, different physicians have different levels of accumulated experience and operating proficiency. Physicians with less experience are relatively dependent on scanning guidance and require detailed guidance to assist in completing scanning imaging, learning as they work. Physicians with rich experience and high operating proficiency may not need scanning guidance at all; for them, the scanning guidance prompts are useless information and may even become a burden on the scanning operation.
However, conventional ultrasound imaging auxiliary guidance does not consider the actual needs of physicians: it provides uniform preset guidance content according to a fixed scanning guidance program and obviously cannot meet diversified user needs.
Disclosure of Invention
In view of the defects and requirements of the prior art, the present application provides an ultrasonic imaging auxiliary guidance method, system, device, and storage medium, which can provide scanning operation guidance information matched with a user's level according to that level.
In order to achieve the above purpose, the following technical solutions are specifically proposed in the present application:
an ultrasound imaging assisted guidance method comprising:
acquiring user level information and imaging parameter information selected by a user; the user level information represents the imaging quality level of the user in historical scanning operations, and the imaging parameter information comprises diagnostic part information, a feature image set corresponding to the diagnostic part, and an automated operation flow protocol; and
outputting scanning operation guidance information that matches the user level and corresponds to the automated operation flow protocol.
In one embodiment, outputting the scanning operation guidance information that matches the user level and corresponds to the automated operation flow protocol comprises:
determining the level of the user according to the user level information, and determining the scanning operation to be executed according to the current scanning operation of the user and the automated operation flow protocol; and
outputting scanning operation guidance information that matches the level of the user and corresponds to the scanning operation to be executed, wherein the completeness of the scanning operation guidance information is negatively correlated with the level of the user.
In one embodiment, outputting the scanning operation guidance information that matches the level of the user and corresponds to the content of the scanning operation to be executed comprises:
when the level of the user is a first level, outputting scanning operation guidance information of a first completeness corresponding to the content of the scanning operation to be executed; and
when the level of the user is a second level, outputting scanning operation guidance information of a second completeness corresponding to the content of the scanning operation to be executed;
wherein the first level is lower than the second level and the first completeness is greater than the second completeness.
In one embodiment, the scan operation guidance information includes:
the text form and/or the voice form of the drawing operation guide information.
In one embodiment, the method further comprises:
and receiving a voice instruction of the user, and executing the map scanning operation corresponding to the voice instruction.
In one embodiment, the method further comprises:
after a scanned image is formed, comparing the scanned image with the images in the feature image set and scoring the scanned image; and
performing image recognition and target labeling on the highest-scoring image.
In one embodiment, the method further comprises:
and scoring the scanned image of the user, and updating the grade of the user according to the score of the scanned image of the user and the score of the historical scanned image of the user.
An ultrasound imaging assisted guidance system is also provided, comprising:
an information acquisition module, configured to acquire user level information and imaging parameter information selected by the user; the user level information represents the imaging quality level of the user in historical scanning operations, and the imaging parameter information comprises diagnostic part information, a feature image set corresponding to the diagnostic part, and an automated operation flow protocol; and
an imaging guidance module, configured to output scanning operation guidance information that matches the user level and corresponds to the automated operation flow protocol.
In one embodiment, when outputting the scanning operation guidance information that matches the user level and corresponds to the automated operation flow protocol, the imaging guidance module is specifically configured to:
determine the level of the user according to the user level information, and determine the scanning operation to be executed according to the current scanning operation of the user and the automated operation flow protocol; and
output scanning operation guidance information that matches the level of the user and corresponds to the scanning operation to be executed, wherein the completeness of the scanning operation guidance information is negatively correlated with the level of the user.
In one embodiment, the system further comprises:
and the image template database is used for storing standard section characteristic images corresponding to the diagnosis target.
An ultrasound imaging assisted guidance device is further provided, comprising:
a memory and a processor;
wherein the memory is connected with the processor and configured to store a program; and
the processor is configured to implement the above ultrasound imaging auxiliary guidance method by executing the program in the memory.
A storage medium is also provided, on which a computer program is stored; when executed by a processor, the computer program implements the ultrasound imaging assisted guidance method described above.
According to the ultrasonic imaging auxiliary guidance method of the present application, after the user level information and the imaging parameter information are acquired, scanning operation guidance information matched with the user level is output. The method can output different operation guidance information for users of different levels, thereby meeting diversified user needs.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only embodiments of the present application; for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a schematic flowchart of an ultrasound imaging aided guidance method provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of another ultrasound imaging-assisted guidance method provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of an ultrasound imaging auxiliary guidance system provided by an embodiment of the present application;
fig. 4 is a schematic structural diagram of an ultrasound imaging auxiliary guide apparatus provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments derived by a person skilled in the art from the given embodiments without creative effort shall fall within the protection scope of the present application.
The embodiment of the application provides an ultrasonic imaging auxiliary guiding method, and referring to fig. 1, the method includes:
s101, obtaining user grade information and imaging parameter information selected by a user; the user grade information represents imaging quality grade information of the user in historical scanning operation, and the imaging parameter information comprises diagnostic part information, a characteristic image set corresponding to the diagnostic part, and an automatic operation flow protocol;
specifically, when a clinician operates the ultrasound imaging system to perform scanning imaging, a user name is firstly input into the system to log in the system, and the system can obtain the grade information of the user according to the user name. The user grade information is quality grade information for scanning imaging in historical scanning operation of a user, and can be specifically evaluated from the aspects of scanning imaging quality, positioning characteristic image speed, overall diagnosis time and the like, and the grade is divided into 1-3 grades from low to high.
When a user logs into the ultrasonic imaging system with a user name, the user name and user level information are displayed on the system interface; a user logging in for the first time defaults to level 1.
Then, the user opens a system preset page and sets up a complete examination operation flow sequence. Specifically, the user enters the setting interface, opens the intelligent operation flow guidance preset interface, and selects a diagnostic part mode (such as a cardiac diagnosis mode or a thyroid diagnosis mode); selects the standard section feature image set of that diagnostic part mode from the image template database for comparison; and selects a supported automated operation flow protocol through the preset interface (the user can customize the combined configuration by selecting from the supported operation flow steps).
The automated operation flow protocol comprises two parts: the steps before the feature image is found and the steps after it is found. The pre-find step (background comparison of the real-time acquired image) is optional, and the post-find steps can be selected and ordered by the user, so that once a correct feature image is obtained by scanning imaging, subsequent processing such as automatically freezing back to the clearest frame, automatically measuring the feature image, automatic annotation, and body-surface marking can be performed. After the intelligent guidance function is enabled and the preset diagnosis mode is entered, the intelligent diagnosis operation flow function is started by default.
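The user-configurable protocol described above can be sketched as a simple data structure. This is an illustrative assumption only — the step names, the `build_protocol` helper, and the dictionary layout are not from the patent:

```python
# Illustrative sketch of a user-configurable automated operation flow
# protocol. All names (steps, helper, fields) are assumptions.

SUPPORTED_POST_FIND_STEPS = {
    "auto_freeze_clearest_frame",
    "auto_measure",
    "auto_annotate",
    "body_surface_mark",
}

def build_protocol(diagnosis_mode, post_find_steps, pre_find_compare=True):
    """Assemble one user-customized flow protocol for a diagnostic mode."""
    unknown = [s for s in post_find_steps if s not in SUPPORTED_POST_FIND_STEPS]
    if unknown:
        raise ValueError(f"unsupported steps: {unknown}")
    return {
        "mode": diagnosis_mode,
        # optional pre-find step: background comparison of the live image
        "pre_find_compare": pre_find_compare,
        # post-find steps keep the user's chosen order
        "post_find_steps": list(post_find_steps),
    }

protocol = build_protocol(
    "cardiac",
    ["auto_freeze_clearest_frame", "auto_measure", "auto_annotate"],
)
```

Rejecting unsupported steps up front mirrors the preset interface, which only lets the user combine steps the system actually supports.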
After the user completes login and setup, scanning imaging begins; correspondingly, the system executes the following step S102 to provide scanning operation guidance information for the user:
and S102, outputting the scanning operation guide information which is matched with the user grade information and corresponds to the automatic operation flow protocol.
Specifically, while the physician operates the probe to locate the feature image in real time, the system outputs guidance information matched to the physician's level; this is the scanning operation guidance information corresponding to the automated operation flow protocol the physician selected before scanning began.
For example, the scanning operation guidance information matched with the user level is displayed within a rectangular frame in the non-image area of the system display interface. For a level 1 physician, directional prompts such as "up, down, left, right, rotate by XX degrees" are given; mid-operation, if the real-time scanned image matches an image in the non-target standard section image library, a prompt such as "this is the XX section image" is given along with guidance to move the probe in the indicated direction, helping the physician deepen experiential memory of different section images while operating the probe in real time. For a level 2 physician, only a brief graphic prompt is given, e.g. "please find the XX part image"; after the target diagnostic part is found, directional prompts of "up, down, left, right, rotate by XX degrees" are given, which improves prompting efficiency for level 2 users. For a level 3 user, whose experience is rich, no intermediate graphic prompts are provided.
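The three level-dependent behaviors in this example (detailed directions for level 1, a brief prompt for level 2, none for level 3) can be sketched as below; the function name, the hint fields, and the exact wording are hypothetical:

```python
def guidance_for(level, hint):
    """Return the prompt text for one guidance step, matched to user level."""
    if level == 1:
        # level 1: full directional prompt plus the name of the current section
        return (f"Move probe {hint['direction']}, rotate {hint['angle']} degrees; "
                f"current view: {hint['section']} section")
    if level == 2:
        # level 2: brief prompt only
        return f"Please find the {hint['target']} image"
    return ""  # level 3: no intermediate prompts

# Example prompts for the same scanning step at each level
detailed = guidance_for(1, {"direction": "left", "angle": 15, "section": "four-chamber"})
brief = guidance_for(2, {"target": "thyroid"})
```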
As introduced above, the ultrasonic imaging auxiliary guidance method provided by the embodiment of the present application outputs scanning operation guidance information matched with the user level after the user level information and imaging parameter information are acquired. The method can output different operation guidance information for users of different levels, thereby meeting diversified user needs.
In another embodiment of the present application, the specific processing for outputting the scanning operation guidance information that matches the user level and corresponds to the automated operation flow protocol is disclosed. Referring to fig. 2, this processing includes:
s202, determining the grade of the user according to the user grade information, and determining the scanning operation to be executed according to the current scanning operation of the user and the automatic operation flow protocol;
specifically, the automation workflow protocol defines the execution sequence and specification of the user for performing the scan operation. The user should operate the probe to scan according to the automated operational flow protocol selected before the scan begins.
When the user scans according to the operation flow protocol, the system automatically identifies the user's operation of the probe, compares the user's current scanning operation with the flow and specification defined in the automated operation flow protocol, and determines the standard operation the user needs to execute next, i.e., the scanning operation to be executed.
Meanwhile, the level of the user can be determined by analyzing the user level information.
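Determining the scanning operation to be executed can be sketched as a lookup into the ordered protocol steps; the step names and the helper function are illustrative assumptions:

```python
def next_operation(current_op, protocol_steps):
    """Given the user's current operation, return the next step the
    protocol prescribes, or None when the flow is complete."""
    try:
        i = protocol_steps.index(current_op)
    except ValueError:
        # operation not part of the protocol yet: guide the user to step one
        return protocol_steps[0]
    return protocol_steps[i + 1] if i + 1 < len(protocol_steps) else None

# Hypothetical ordered steps from a user's configured protocol
steps = ["find_feature_image", "auto_freeze", "auto_measure", "auto_annotate"]
```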
S203: outputting scanning operation guidance information that matches the level of the user and corresponds to the scanning operation to be executed; the completeness of the scanning operation guidance information is negatively correlated with the level of the user.
Specifically, after the above information is determined, the system determines, according to the user level, the completeness of the guidance information that matches that level. The completeness of guidance information represents how specific the guidance is: the more specific the guidance information, the higher its completeness; conversely, the more concise the guidance information, the lower its completeness.
The embodiment of the present application sets the completeness of the scanning operation guidance information to be negatively correlated with the level of the user: when the user's level is lower, the matched scanning operation guidance information has higher completeness, i.e., is more specific; when the user's level is higher, the matched guidance information has lower completeness, i.e., is more concise.
After the user level, the completeness of the scanning operation guidance information, and the content of the scanning operation to be executed are determined, the system outputs scanning operation guidance information whose completeness matches the user's level and which corresponds to the content of the scanning operation to be executed.
Illustratively, when the level of the user is a first level (for example, level 1), scanning operation guidance information of a first completeness corresponding to the content of the scanning operation to be executed is output;
when the level of the user is a second level (for example, level 2), scanning operation guidance information of a second completeness corresponding to the content of the scanning operation to be executed is output;
wherein the first level is lower than the second level and the first completeness is greater than the second completeness.
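The negative correlation between guidance completeness and user level can be expressed as a small monotone mapping; the numeric tiers here are an assumption for illustration:

```python
def completeness_for(level, num_levels=3):
    """Completeness tier of the guidance text (higher = more detailed).
    Negatively correlated with the user's level, as required above."""
    if not 1 <= level <= num_levels:
        raise ValueError("level out of range")
    return num_levels + 1 - level
```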
Step S201 in the embodiment shown in fig. 2 corresponds to step S101 in the method embodiment shown in fig. 1; for the specific content, please refer to the method embodiment shown in fig. 1, which is not repeated here.
Optionally, another embodiment of the present application discloses that the scanning operation guidance information output by the system may be displayed in text form in a rectangular frame in the non-image area of the system display interface, output in voice form, or output as text together with voice.
Specifically, the embodiment of the present application sets that, in cooperation with the user's real-time probe scanning, the scanning operation guidance information can be output by voice, providing a voice assistant function for the user. For example, voice prompts such as "inhale, exhale, please keep still" are given, and prompts such as "please see the prompt operation of the blue graphic assistant area at the lower-left corner of the screen" help the user better accommodate the patient and find the best standard section image more quickly.
It should be noted that the completeness of the voice prompt content is also matched with the user's level; for the specific processing, refer to the description in the above embodiment.
Further, the embodiment of the present application also provides that the ultrasound imaging system can not only output the scanning operation guidance information by voice, but also receive a user voice instruction and execute the scanning operation corresponding to the received voice instruction.
Specifically, the ultrasonic imaging system can provide one-way voice prompts for the user and can also receive the user's voice instructions to control the machine. When the user's hands are away from the machine, the system's voice assistant receives a voice instruction from the user; for example, if the instruction is "freeze", the voice assistant translates it via voice recognition into a machine instruction that controls the machine to freeze automatically. When the user has frozen an image and is replaying and browsing it with the trackball and touch-screen keys, finding the best image frame normally requires frequent manual operation; instead, the user can issue voice playback commands such as "play to the 20th frame" or "previous frame", and the voice assistant translates them into the corresponding machine instructions according to the current usage scene to replay the image. This greatly reduces the user's manual operation and improves working efficiency.
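A minimal sketch of translating recognized voice text into machine instructions, using the "freeze" and playback phrases from this example; the command names, phrase patterns, and context flag are assumptions, not the patent's actual interface:

```python
import re

def translate_voice(text, context):
    """Map a recognized phrase to a (command, argument) machine instruction,
    taking the current usage scene (live scan vs. playback) into account."""
    if text == "freeze":
        return ("FREEZE", None)
    m = re.fullmatch(r"play to (?:the )?(\d+)(?:st|nd|rd|th)? frame", text)
    if m and context == "playback":
        return ("SEEK_FRAME", int(m.group(1)))
    if text == "previous frame" and context == "playback":
        return ("STEP_BACK", 1)
    return ("UNKNOWN", None)  # unrecognized phrase or wrong context
```

Dispatching on `context` reflects the description's point that the same assistant must interpret commands "according to the current machine use scene".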
Optionally, another embodiment of the present application further discloses that, after the user completes scanning imaging with the assistance of the scanning operation guidance, the system scores the scanned image by comparing it with the images in the feature image set, and performs image recognition and target labeling on the highest-scoring image.
Specifically, with the help of the scanning guidance, the user scans the real-time image, finds the standard section image, and freezes it. The system then automatically compares the stored real-time scanned images with the standard section image library to find the several frames closest to the frozen image, giving each frame a comparison score; the higher the score, the clearer and higher-quality the image. According to the comparison scores, the system automatically replays and jumps to the clearest frame, identifies the feature image through an image recognition algorithm, automatically traces the feature image, adds an annotation, automatically draws the measurement scale, and outputs the measurement result. The physician then judges according to personal experience, can fine-tune the annotation and recalculate the measurement scale, and clicks to confirm; the single frame image is automatically stored and the system jumps to the next automatic diagnosis interface.
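Selecting the clearest frame by comparison score can be sketched as follows; `similarity` stands in for whatever image-comparison measure the system uses against the standard section library, which the description does not specify:

```python
def best_frame(frames, similarity):
    """Score each stored real-time frame against the standard section
    library and return the index and score of the clearest (highest) one."""
    if not frames:
        raise ValueError("no frames to score")
    scores = [similarity(f) for f in frames]
    idx = max(range(len(scores)), key=scores.__getitem__)
    return idx, scores[idx]
```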
Further, another embodiment of the present application also discloses that, after the user finishes scanning imaging with the assistance of the scanning operation guidance, the user's scanned image is scored, and the level of the user is updated according to the score of the current scanned image and the scores of the user's historical scanned images.
Specifically, the embodiment of the present application further provides a complete user level updating mechanism: the system records and evaluates the scanning quality and scanning speed of the user over long-term operation of the system, so as to evaluate the user's level.
Each time the user completes a scan, the scanned image is scored, the user's scanning operation process is evaluated, and the user level is updated according to the evaluation.
For example, in the embodiment of the present application, the default user level at first login is level 1. As the user continues performing scanning operations, the system gives a comprehensive score based on the image quality of the standard sections found during the physician's real-time probe scanning, the time taken to locate the feature image, and the operation time of the whole diagnosis.
As an optional implementation, take as an example a Chinese physician who needs to examine on average 20 patients per day over two continuous weeks of work (10 working days, i.e., 200 consecutive cases): if the number of cases among the 200 with a comprehensive score exceeding 90 reaches 180, the physician's user level is promoted by one level from the current level; if it is less than 140, the level is demoted by one; if it is between 140 and 180, the current level is maintained.
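The promotion/demotion thresholds in this example translate directly into code. The function name and signature are assumptions; the rule itself (200 recent cases, comprehensive score above 90, the 180/140 cutoffs, levels 1 to 3) comes from the description above:

```python
def update_level(current_level, last_200_scores, min_level=1, max_level=3):
    """Apply the example rule: over the 200 most recent cases, count
    comprehensive scores exceeding 90; >= 180 promotes, < 140 demotes."""
    high = sum(1 for s in last_200_scores if s > 90)
    if high >= 180:
        return min(current_level + 1, max_level)  # promote one level
    if high < 140:
        return max(current_level - 1, min_level)  # demote one level
    return current_level  # between 140 and 180: keep the current level
```

Clamping at `min_level`/`max_level` is an assumed boundary behavior, since the description only defines the three-tier level range.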
Corresponding to the above ultrasound imaging aided guidance method, another embodiment of the present application further discloses an ultrasound imaging aided guidance system, which, referring to fig. 3, includes:
an information obtaining module 100, configured to obtain user level information and imaging parameter information selected by the user; the user grade information represents imaging quality grade information of the user in historical scanning operation, and the imaging parameter information comprises diagnostic part information, a characteristic image set corresponding to the diagnostic part, and an automatic operation flow protocol;
and the imaging guide module 110 is used for outputting scanning operation guide information which is matched with the user grade and corresponds to the automatic operation flow protocol.
Optionally, another embodiment of the present application discloses that, when outputting the scanning operation guidance information that matches the user level and corresponds to the automated operation flow protocol, the imaging guidance module is specifically configured to:
determine the level of the user according to the user level information, and determine the scanning operation to be executed according to the current scanning operation of the user and the automated operation flow protocol; and
output scanning operation guidance information that matches the level of the user and corresponds to the scanning operation to be executed, wherein the completeness of the scanning operation guidance information is negatively correlated with the level of the user.
Wherein the outputting of the scanning operation guide information which is matched with the grade of the user and corresponds to the scanning operation content to be executed comprises:
when the level of the user is a first level, outputting first-integrity sweeping operation guide information corresponding to the to-be-executed sweeping operation content;
when the level of the user is a second level, outputting scanning operation guide information of a second integrity corresponding to the scanning operation content to be executed;
wherein the first level is less than the second level and the first integrity is greater than the second integrity.
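The level-to-completeness matching described above can be sketched as follows. This is a minimal illustrative assumption — the function name, the level threshold, and the guidance text are not from the patent — showing only that a lower user grade receives more complete, step-by-step guidance:

```python
# Hypothetical guidance table: each scanning step has a "full" (novice)
# and a "brief" (expert) variant. Names and text are illustrative only.
GUIDANCE = {
    "place_probe": {
        "full": "Apply gel, place the probe on the right costal margin, "
                "tilt about 30 degrees toward the head, and sweep slowly.",
        "brief": "Scan the right costal margin.",
    },
}

def guidance_for(step: str, user_level: int, expert_level: int = 2) -> str:
    """Return guidance whose completeness decreases as the user level rises."""
    detail = "full" if user_level < expert_level else "brief"
    return GUIDANCE[step][detail]
```

A first-level (novice) user thus gets the full instructions, while a second-level or higher user gets only the brief prompt, matching the negative correlation the embodiment describes.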
Optionally, another embodiment of the present application further discloses that the system further includes:
and the voice control module is used for outputting the scanning operation guide information in a voice mode.
Optionally, another embodiment of the present application further discloses that the voice control module is further configured to:
receiving a voice instruction of the user, and executing the scanning operation corresponding to the voice instruction.
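As an illustration of how such a voice control module might map recognized commands to scanning operations — the command vocabulary and handlers below are hypothetical assumptions, not part of the disclosure:

```python
# Sketch of a voice-command dispatcher: a recognized command string is
# looked up and the bound scanning operation is executed. Unrecognized
# commands are ignored rather than acted upon.
def make_dispatcher():
    log = []  # records which operations were executed (for illustration)
    handlers = {
        "freeze": lambda: log.append("image frozen"),
        "next step": lambda: log.append("advanced to next protocol step"),
        "save image": lambda: log.append("image saved"),
    }

    def dispatch(command: str) -> bool:
        """Execute the scanning operation bound to a voice command."""
        handler = handlers.get(command.strip().lower())
        if handler is None:
            return False  # unknown command: do nothing
        handler()
        return True

    return dispatch, log
```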
Optionally, another embodiment of the present application further discloses that the system further includes:
the image template database is used for storing standard section characteristic images corresponding to the diagnostic targets;
optionally, another embodiment of the present application further discloses that the system further includes:
the image post-processing module is used for scoring the scan image, after imaging, by comparing it with the images in the characteristic image set, and performing image recognition and target labeling processing on the image with the highest score.
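The scoring-and-selection step could be sketched as below. The patent does not specify a similarity metric, so mean absolute difference over flattened grayscale frames is assumed purely for illustration; the highest-scoring frame would then be passed on for recognition and labeling:

```python
# Score each captured frame against a reference feature-image set and
# select the best frame. Frames and references are flattened grayscale
# pixel lists of equal length; the metric is an illustrative assumption.
def score_frame(frame, references):
    """Higher score = closer to the best-matching reference image (0..1]."""
    def similarity(a, b):
        mad = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
        return 1.0 / (1.0 + mad)  # identical images score 1.0
    return max(similarity(frame, ref) for ref in references)

def best_frame(frames, references):
    """Return (index, score) of the frame most similar to the reference set."""
    scores = [score_frame(f, references) for f in frames]
    idx = max(range(len(scores)), key=scores.__getitem__)
    return idx, scores[idx]
```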
The grade evaluation module is used for scoring the scan images of the user, and updating the grade of the user according to the score of the current scan image and the scores of the user's historical scan images.
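A hedged sketch of the grade update follows. The patent gives no concrete update rule, so averaging the current score with the historical scores and comparing against fixed thresholds is assumed here purely for illustration:

```python
# Combine the current scan score with historical scores and map the
# average onto a grade. Threshold values are illustrative assumptions.
def update_grade(current_score, history, thresholds=(60, 80)):
    """Return the new grade: 1 = novice, up to len(thresholds) + 1 = expert."""
    scores = list(history) + [current_score]
    avg = sum(scores) / len(scores)
    grade = 1
    for t in thresholds:
        if avg >= t:
            grade += 1
    return grade
```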
For the specific working content of each unit in the embodiments of the ultrasound imaging auxiliary guidance system, please refer to the corresponding method embodiments; details are not repeated here.
Optionally, in another embodiment of the present application, an ultrasound imaging auxiliary guiding apparatus is further disclosed, and as shown in fig. 4, the apparatus includes:
a memory 200 and a processor 210;
wherein, the memory 200 is connected to the processor 210 for storing programs;
the processor 210 is configured to implement the following functions by executing the program in the memory 200:
acquiring user grade information and imaging parameter information selected by a user; the user level information represents imaging quality level information of the user in historical scanning operation, and the imaging parameter information comprises diagnostic part information, a characteristic image set corresponding to the diagnostic part, and an automatic operation flow protocol;
and outputting the scanning operation guide information which is matched with the user level and corresponds to the automatic operation flow protocol.
Optionally, the outputting the scan operation guidance information matched with the user level and corresponding to the automation operation flow protocol includes:
determining the grade of the user according to the user grade information, and determining the scanning operation to be executed according to the current scanning operation of the user and the automatic operation flow protocol;
outputting scanning operation guide information which is matched with the grade of the user and corresponds to the scanning operation to be executed; wherein the completeness of the scanning operation guide information is negatively correlated with the grade of the user.
Optionally, the outputting of scanning operation guide information that matches the grade of the user and corresponds to the content of the scanning operation to be executed includes:
when the level of the user is a first level, outputting scanning operation guide information of a first integrity corresponding to the scanning operation content to be executed;
when the level of the user is a second level, outputting scanning operation guide information of a second integrity corresponding to the scanning operation content to be executed;
wherein the first level is less than the second level and the first integrity is greater than the second integrity.
Optionally, the scanning operation guide information includes:
scanning operation guide information in a text form and/or a voice form.
Optionally, the method further includes:
receiving a voice instruction of the user, and executing the scanning operation corresponding to the voice instruction.
Optionally, the method further includes:
after the scanning image is formed, the scanning image is compared with the images in the characteristic image set, and the scanning image is scored;
and carrying out image identification and target labeling processing on the image with the highest score.
Optionally, the method further includes:
scoring the scan images of the user, and updating the grade of the user according to the score of the current scan image and the scores of the user's historical scan images.
For the specific working content of each part in the embodiments of the ultrasound imaging auxiliary guidance device, please refer to the corresponding method embodiments; details are not repeated here.
Optionally, another embodiment of the present application further discloses a storage medium, where a computer program is stored, and when the computer program is executed by a processor, the method for ultrasound imaging assisted guidance according to any of the above embodiments is implemented.
While, for purposes of simplicity of explanation, the foregoing method embodiments are presented as a series of acts or combinations, it will be appreciated by those of ordinary skill in the art that the present application is not limited by the illustrated ordering of acts, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. For the device-like embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The steps in the method of the embodiments of the present application may be sequentially adjusted, combined, and deleted according to actual needs.
The modules and sub-modules in the device and the terminal in the embodiments of the application can be combined, divided and deleted according to actual needs.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal, apparatus and method may be implemented in other manners. For example, the above-described terminal embodiments are merely illustrative, and for example, the division of a module or a sub-module is only one logical division, and there may be other divisions when the terminal is actually implemented, for example, a plurality of sub-modules or modules may be combined or integrated into another module, or some features may be omitted or not executed. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be through some interfaces, indirect coupling or communication connection between devices or modules, and may be in an electrical, mechanical or other form.
The modules or sub-modules described as separate parts may or may not be physically separate, and parts that are modules or sub-modules may or may not be physical modules or sub-modules, may be located in one place, or may be distributed over a plurality of network modules or sub-modules. Some or all of the modules or sub-modules can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, each functional module or sub-module in the embodiments of the present application may be integrated into one processing module, or each module or sub-module may exist alone physically, or two or more modules or sub-modules may be integrated into one module. The integrated modules or sub-modules may be implemented in the form of hardware, or may be implemented in the form of software functional modules or sub-modules.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the components and steps of the various examples have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software unit executed by a processor, or in a combination of the two. The software unit may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (11)

1. An ultrasound imaging assisted guidance method, comprising:
acquiring user grade information and imaging parameter information selected by a user; the user grade information represents imaging quality grade information of the user in historical scanning operations, the imaging quality grade information being evaluated from scan imaging quality, speed of locating the characteristic image, and overall diagnosis time; the imaging parameter information comprises diagnostic part information, a characteristic image set corresponding to the diagnostic part, and an automatic operation flow protocol; the automatic operation flow protocol comprises two parts: before the characteristic image is found and after the characteristic image is found; the part before the characteristic image is found comprises finding a characteristic image through scan imaging, and the part after the characteristic image is found comprises automatically freezing and playing back to the clearest frame;
outputting map scanning operation guide information which is matched with the user level and corresponds to the automatic operation flow protocol;
scanning a real-time image, freezing the image after a standard section image is found, comparing the stored real-time scan images with a standard section image library to find a plurality of frames adjacent to the frozen frame, giving each frame a comparison score, and automatically playing back and jumping to the clearest frame according to the comparison scores;
and updating the grade of the user according to the score of the scan image and the scores of the user's historical scan images.
2. The method of claim 1, wherein outputting the scan operation guidance information corresponding to the user level and the automation operation flow protocol comprises:
determining the grade of the user according to the user grade information, and determining the scanning operation to be executed according to the current scanning operation of the user and the automatic operation flow protocol;
outputting scanning operation guide information which is matched with the grade of the user and corresponds to the scanning operation to be executed; wherein the completeness of the scanning operation guide information is negatively correlated with the grade of the user.
3. The method according to claim 2, wherein outputting the scan operation guidance information which matches with the user's grade and corresponds to the contents of the scan operation to be performed comprises:
when the level of the user is a first level, outputting scanning operation guide information of a first integrity corresponding to the scanning operation content to be executed;
when the level of the user is a second level, outputting scanning operation guide information of a second integrity corresponding to the scanning operation content to be executed;
wherein the first level is less than the second level and the first integrity is greater than the second integrity.
4. The method of claim 1, wherein the scanning operation guide information comprises:
scanning operation guide information in a text form and/or a voice form.
5. The method of claim 1, further comprising:
receiving a voice instruction of the user, and executing the scanning operation corresponding to the voice instruction.
6. The method of claim 1, further comprising:
after the scanning image is formed, the scanning image is compared with the images in the characteristic image set, and the scanning image is scored;
and carrying out image identification and target labeling processing on the image with the highest score.
7. An ultrasound imaging assisted guidance system, comprising:
the information acquisition module is used for acquiring user grade information and imaging parameter information selected by the user; the user grade information represents imaging quality grade information of the user in historical scanning operations, the imaging quality grade information being evaluated from scan imaging quality, speed of locating the characteristic image, and overall diagnosis time; the imaging parameter information comprises diagnostic part information, a characteristic image set corresponding to the diagnostic part, and an automatic operation flow protocol; the automatic operation flow protocol comprises two parts: before the characteristic image is found and after the characteristic image is found; the part before the characteristic image is found comprises finding a characteristic image through scan imaging, and the part after the characteristic image is found comprises automatically freezing and playing back to the clearest frame; the system is further used for scanning a real-time image, freezing the image after a standard section image is found, comparing the stored real-time scan images with a standard section image library to find a plurality of frames adjacent to the frozen frame, giving each frame a comparison score, and automatically playing back and jumping to the clearest frame according to the comparison scores;
the imaging guide module is used for outputting scanning operation guide information which is matched with the user grade and corresponds to the automatic operation flow protocol;
and the grade evaluation module is used for updating the grade of the user according to the score of the scan image and the scores of the user's historical scan images.
8. The system according to claim 7, wherein the imaging guidance module, when outputting the scanning operation guidance information that matches the user level and corresponds to the automated operation flow protocol, is specifically configured to:
determining the grade of the user according to the user grade information, and determining the scanning operation to be executed according to the current scanning operation of the user and the automatic operation flow protocol;
outputting scanning operation guide information which is matched with the grade of the user and corresponds to the scanning operation to be executed; wherein the completeness of the scanning operation guide information is negatively correlated with the grade of the user.
9. The system of claim 7, further comprising:
an image template database, used for storing standard section characteristic images corresponding to the diagnostic targets.
10. An ultrasound imaging assisted guidance apparatus, comprising:
a memory and a processor;
wherein the memory is connected with the processor and used for storing programs;
the processor is used for realizing the ultrasonic imaging auxiliary guiding method according to any one of claims 1 to 6 by executing the program in the memory.
11. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when executed by a processor, implements the ultrasound imaging assisted guidance method of any of claims 1 to 6.
CN201910543045.6A 2019-06-21 2019-06-21 Ultrasonic imaging auxiliary guiding method, system, equipment and storage medium Active CN110269641B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910543045.6A CN110269641B (en) 2019-06-21 2019-06-21 Ultrasonic imaging auxiliary guiding method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910543045.6A CN110269641B (en) 2019-06-21 2019-06-21 Ultrasonic imaging auxiliary guiding method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110269641A CN110269641A (en) 2019-09-24
CN110269641B true CN110269641B (en) 2022-09-30

Family

ID=67961439

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910543045.6A Active CN110269641B (en) 2019-06-21 2019-06-21 Ultrasonic imaging auxiliary guiding method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110269641B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114375179B (en) * 2019-11-04 2024-07-23 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic image analysis method, ultrasonic imaging system and computer storage medium
CN114072059B (en) * 2019-12-27 2024-09-06 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic imaging equipment and method for quickly setting ultrasonic automatic workflow

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6458081B1 (en) * 1999-04-23 2002-10-01 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus
JP2011072526A (en) * 2009-09-30 2011-04-14 Toshiba Corp Ultrasonic diagnostic apparatus

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6834207B2 (en) * 2001-02-08 2004-12-21 Kabushiki Kaisha Toshiba Operating guidance system for medical equipment
KR20100011669A (en) * 2008-07-25 2010-02-03 (주)메디슨 Method and device for providing customized interface in the ultrasound system
WO2012123943A1 (en) * 2011-03-17 2012-09-20 Mor Research Applications Ltd. Training, skill assessment and monitoring users in ultrasound guided procedures
KR101716421B1 (en) * 2013-06-21 2017-03-14 삼성전자주식회사 Method for providing information and medical diagnosis apparatus thereto
WO2015039302A1 (en) * 2013-09-18 2015-03-26 Shenzhen Mindray Bio-Medical Electronics Co., Ltd Method and system for guided ultrasound image acquisition
CN104680481B (en) * 2013-11-28 2018-09-11 深圳迈瑞生物医疗电子股份有限公司 A kind of ultrasonic wave added checking method and system
WO2015157666A1 (en) * 2014-04-11 2015-10-15 Wake Forest University Health Sciences Apparatus, methods, and systems for target-based assessment and training for ultrasound-guided procedures
US10709416B2 (en) * 2015-06-30 2020-07-14 Wisconsin Alumni Research Foundation Obstetrical imaging at the point of care for untrained or minimally trained operators
CN105138250A (en) * 2015-08-03 2015-12-09 科大讯飞股份有限公司 Human-computer interaction operation guide method, human-computer interaction operation guide system, human-computer interaction device and server
EP3574504A1 (en) * 2017-01-24 2019-12-04 Tietronix Software, Inc. System and method for three-dimensional augmented reality guidance for use of medical equipment
US20190076120A1 (en) * 2017-09-08 2019-03-14 General Electric Company Systems and methods for guiding ultrasound probe placement using haptic feedback
CN107679574A (en) * 2017-09-29 2018-02-09 深圳开立生物医疗科技股份有限公司 Ultrasonoscopy processing method and system
CN109044400A (en) * 2018-08-31 2018-12-21 上海联影医疗科技有限公司 Ultrasound image mask method, device, processor and readable storage medium storing program for executing
CN109259800A (en) * 2018-10-26 2019-01-25 深圳开立生物医疗科技股份有限公司 Ultrasonic imaging control system

Also Published As

Publication number Publication date
CN110269641A (en) 2019-09-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant