WO2022263269A1 - User scanning hand evaluation - Google Patents

User scanning hand evaluation

Info

Publication number
WO2022263269A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
user
scanning hand
usage
time period
Prior art date
Application number
PCT/EP2022/065640
Other languages
French (fr)
Inventor
Seyedali SADEGHI
Anup Agarwal
Claudia ERRICO
Hua Xie
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V.
Publication of WO2022263269A1

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the operation of medical equipment or devices for local operation
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/70 ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients

Definitions

  • the present disclosure pertains to systems and methods for determining the usage of an ultrasound user’s scanning hand in every ultrasound exam.
  • Particular implementations include systems configured to determine the movement of an ultrasound user’s scanning hand during a medical imaging examination and assess the efficiency of such movement by implementing an intelligent computer-based prediction system.
  • Work-related musculoskeletal disorders (WRMSDs) are painful conditions caused by workplace activities affecting the muscles, ligaments, and tendons.
  • WRMSDs often impose a substantial personal toll on those affected since they may no longer be able to perform personal tasks and routine activities of daily living.
  • WRMSDs develop gradually over time from repeated exposure to a variety of risk factors and may be painful during work and even at rest.
  • the present disclosure describes systems and methods for monitoring and reducing physical stress in the scanning hand of an ultrasound user.
  • the ultrasound user’s scanning hand moves the ultrasound probe over the target area of the patient while the non-scanning hand engages with a control panel of the ultrasound system.
  • the systems disclosed herein are configured to monitor, measure, and determine the usage of the ultrasound user’s scanning hand during an ultrasound exam and determine whether the measured usage deviates from the usage expected for the same ultrasound exam performed on a similar patient.
  • Disclosed systems can include graphical user interfaces configured to extract and display user-defined datasets regarding scanning hand usage and performance.
  • the graphical user interface, together with one or more additional processors, can display scanning hand performance metrics reflecting ultrasound user performance during a single exam, or across multiple exams performed over periods lasting weeks, months, or even years.
  • an ultrasound user performance evaluation system may include or be communicatively coupled with an image acquisition device configured to acquire images of a patient during an ultrasound exam.
  • the system can also include or be communicatively coupled with an electromagnetic tracking device.
  • the acquisition device and the tracking device may be coupled with one or more processors disclosed herein.
  • a control panel can also be included in or communicatively coupled with the system to adjust imaging parameters upon receiving input by the non-scanning hand of the ultrasound user performing a given exam.
  • One or more processors can be configured to receive a clinical context input by the ultrasound user and/or the clinical context input may be received from another processor or database.
  • the one or more processors can apply an intelligence system, which may include a neural network, to the clinical context.
  • the intelligence system can be configured to generate an expected usage of a scanning hand of the ultrasound user based on the received clinical context.
  • the one or more processors can also determine an actual usage of the scanning hand of the ultrasound user expended during the ultrasound exam, and compare the expected usage to the actual usage to generate a performance metric of the scanning hand of the ultrasound user.
  • the performance metric can then be displayed on a graphical user interface communicatively coupled with the processor(s).
  • an ultrasound user performance evaluation system can include one or more processors configured to determine a scanning hand performance metric indicative of a scanning hand usage of an ultrasound user over a time period in response to a user inquiry.
  • the scanning hand performance metric can be based on a comparison of an actual scanning hand usage to an expected scanning hand usage determined over the time period.
  • the system can also include a graphical user interface configured to receive the user inquiry, obtain the scanning hand performance metric, and display the scanning hand performance metric.
  • the one or more processors can be further configured to receive one or more clinical contexts.
  • Each clinical context can include a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof.
  • the one or more processors can be further configured to apply a neural network to each of the clinical contexts. The neural network can be configured to generate the expected scanning hand usage based on each of the clinical contexts.
  • the one or more processors can be further configured to determine the actual scanning hand usage expended during each ultrasound exam conducted over the time period. In some examples, the one or more processors can be configured to determine the actual scanning hand usage by tracking ultrasound transducer movement caused by the ultrasound user performing each ultrasound exam over the time period. In some examples, ultrasound transducer movement comprises an input of translational movement and rotational movement as detected by an electromagnetic tracking device.
  • the performance metric comprises an efficiency rating of the actual scanning hand usage. In some examples, the performance metric comprises efficiency ratings corresponding to the translational movement and/or rotational movement. In some examples, the graphical user interface is further configured to display the time period over which the performance metric is determined, the one or more clinical contexts, qualitative remarks describing the performance metric, or combinations thereof.
  • the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs. The training inputs can comprise a sample of clinical context datasets obtained from previously performed ultrasound exams, and the known outputs can comprise performance metrics of the scanning hand of ultrasound users who performed the previous ultrasound exams. In some examples, the graphical user interface can be further configured to adjust a depiction of the performance metric in response to a user input.
  • a method of evaluating and displaying ultrasound user performance can involve receiving a user inquiry regarding a scanning hand performance of an ultrasound user over a time period.
  • the method can also involve obtaining a scanning hand performance metric indicative of the scanning hand performance of the ultrasound user over the time period.
  • the scanning hand performance metric can be based on a comparison of an actual scanning hand usage to an expected scanning hand usage determined over the time period.
  • the method can also involve displaying the performance metric.
  • the method can also involve adjusting a depiction of the performance metric in response to a user input.
  • the expected scanning hand usage can be based on one or more clinical contexts.
  • Each clinical context can comprise a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof
  • Any of the methods described herein, or steps thereof, may be embodied in a non-transitory computer-readable medium comprising executable instructions, which when executed may cause one or more hardware processors to perform the method or steps embodied herein.
  • FIG. 1 is a schematic overview of a system configured to determine and display scanning hand usage and efficiency in accordance with embodiments of the present disclosure.
  • FIG. 2 is a schematic of two electromagnetic tracking systems utilized in accordance with embodiments of the present disclosure.
  • FIG. 3 is a schematic of a neural network implemented to determine an expected usage of an ultrasound user’s scanning hand during an ultrasound exam in accordance with embodiments of the present disclosure.
  • FIG. 4 is a graphical user interface configured to generate and display customizable health status reports in accordance with embodiments of the present disclosure.
  • FIG. 5 is a schematic of a processor utilized in accordance with embodiments of the present disclosure.
  • FIG. 6 is a flow diagram of a method performed in accordance with embodiments of the present disclosure.
  • the disclosed systems and methods overcome the lack of intelligent, systematic tools for monitoring, determining, and ultimately improving the health of the scanning hand of medical ultrasound users.
  • scanning hand usage may comprise translational and rotational hand movement.
  • the actual scanning hand usage can be compared to an expected scanning hand usage predicted by an artificial intelligence system trained to predict scanning hand usage levels based on clinical context.
  • the clinical context typically comprises the particular exam performed, the ultrasound machinery utilized, and various physical attributes of the current patient and his/her medical history.
  • the disclosed systems can identify deviations from the expected usage, thereby exposing potential inefficiencies related to scanning hand usage.
  • embodiments also include a graphical user interface configured to display the acquired information in customizable reports.
  • the graphical user interface can display the data directly to the ultrasound user and/or a lab manager overseeing the ultrasound user’s performance. The user interface can thus be displayed locally or across different locations.
  • the electromagnetic motion tracking devices disclosed herein can detect the pose (position and orientation) of an imaging device moveable in six degrees of freedom using an electromagnetic sensor attached to or integrated within the imaging device, along with an external device configured to establish a fixed reference frame.
  • the imaging device can be an ultrasound transducer or probe, but the embodiments disclosed herein are not limited to ultrasound imaging devices. Utilization of electromagnetic trackers in accordance with the disclosed embodiments enables the localization of small sensors in an electromagnetic field without line-of-sight restrictions, which can impede motion tracking efforts implemented using optical sensors.
  • electromagnetic tracking device includes a device or sensor integrated within or attached to an imaging device, such as an ultrasound transducer.
  • An “electromagnetic tracking system” includes the electromagnetic tracking device and an external device configured to provide the fixed reference frame.
  • the external device may comprise a field generator or transmitter configured to emit a low-intensity electromagnetic field through which the electromagnetic tracking device passes during an exam.
  • the electromagnetic field can generate a current within the sensor of the tracking device, which can be detected and converted into a trackable signal by one or more processors included in embodiments disclosed herein. While electromagnetic-based tracking is described herein, additional embodiments consistent with the present disclosure may utilize other hand motion tracking systems, such as camera-based tracking.
  • an infrared camera may be utilized for patient privacy.
  • tracking systems featuring accelerometers and gyroscopes may be utilized alone or in combination with other tracking systems. The use of such tracking systems may supplement, and thereby improve, the electromagnetic tracking systems described herein.
  • “usage” and “activity” may be used interchangeably and may include all motions and movements of the ultrasound user’s scanning hand.
  • the ultrasound user’s scanning hand translational and/or rotational movement can be encompassed within this definition of “usage” or “activity,” such that a greater amount and/or degree of translational and/or rotational movement of the ultrasound user’s scanning hand required to accomplish the same task increases the measured “usage” or “activity” levels.
  • the usage or activity of an ultrasound user’s scanning hand may be measured and/or displayed in the form of one or more performance metrics, which may include a usage or efficiency of total translational and/or rotational movement of the scanning hand, for example.
  • total linear hand motion encompasses the total translational hand distance in the X, Y, and Z directions during an ultrasound exam.
  • Total rotational hand motion encompasses the total rotational hand movement around the X, Y, and Z axes during an exam.
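  • A minimal sketch of how these two totals might be computed from a stream of tracker pose samples follows; the array layouts (positions in millimeters, orientations as unit quaternions in (w, x, y, z) order) are illustrative assumptions rather than the tracking system’s actual output format.

```python
import numpy as np

def total_translational_motion(positions: np.ndarray) -> float:
    """Sum of straight-line distances between consecutive probe positions (N x 3)."""
    deltas = np.diff(positions, axis=0)
    return float(np.linalg.norm(deltas, axis=1).sum())

def total_rotational_motion(quaternions: np.ndarray) -> float:
    """Sum of relative rotation angles (radians) between consecutive poses (N x 4)."""
    q = quaternions / np.linalg.norm(quaternions, axis=1, keepdims=True)
    dots = np.abs(np.sum(q[:-1] * q[1:], axis=1)).clip(0.0, 1.0)
    return float(np.sum(2.0 * np.arccos(dots)))

# Short synthetic trace of four pose samples (illustrative values only).
pos = np.array([[0, 0, 0], [10, 0, 0], [10, 5, 0], [12, 5, 1]], dtype=float)
quat = np.array([[1.0, 0, 0, 0], [0.996, 0.087, 0, 0], [0.996, 0.087, 0, 0], [0.984, 0.174, 0, 0]])
print(total_translational_motion(pos), total_rotational_motion(quat))
```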
  • “clinical context” and “upstream clinical context” may be used interchangeably and may include or be based on the type of exam being performed, e.g., pulmonary exam or cardiac exam, as well as patient-specific information, non-limiting examples of which may include patient weight, height, body mass index (BMI), age, underlying health condition(s), clinical history, and combinations thereof. Additional information that may be encompassed within the clinical context can include the reason(s) for performing an exam, e.g., to evaluate left ventricular function, along with the patient’s length of stay, e.g., inpatient and/or outpatient time. Information constituting the clinical context can be combined and/or displayed on the user interface in various ways.
  • the clinical context can include categorical descriptors of a patient’s body type and/or associated levels of exam difficulty, such as “easy,” “moderate,” or “difficult.”
  • the clinical information can also include the particular ultrasound model being used to perform the exam, as the scanning hand motion required to perform an ultrasound exam may differ for different models.
  • performance metrics can include total translational scanning hand movement and/or total rotational scanning hand movement, one or more of which may be compared to the expected total translational scanning hand movement and/or the expected total rotational scanning hand movement to generate efficiency scores or ratings.
  • the “health status” of an ultrasound user’s scanning hand can encompass the health of various anatomical parts associated with, comprising, or attached to the ultrasound user’s scanning hand, non-limiting examples of which may include the ultrasound user’s shoulder, elbow, forearm, wrist, hand, finger(s), or combinations thereof.
  • the “health status” can also embody an overall indication of an ultrasound user’s scanning hand health based on the total stresses incurred by each body part directly or indirectly connected to the scanning hand.
  • “expert” or “experienced” ultrasound users may include certified ultrasound users having at least a certain length of experience, such as at least one year, two years, three years, four years, five years, or longer. “Expert” or “experienced” ultrasound users may also include ultrasound users who have received or attained a form of recognized achievement or certification.
  • a variety of ultrasound-based exams are contemplated herein, non-limiting examples of which include diagnostic imaging, cardiac imaging, vascular imaging, lung imaging, and combinations thereof.
  • the particular exam being performed likely impacts the scanning hand activity of the ultrasound user, especially if an exam requires the acquisition of a greater number of images at a variety of depths and/or angles.
  • FIG. 1 depicts an overview of a scanning hand monitoring and evaluation system 100 implemented in accordance with embodiments described herein.
  • the techniques disclosed herein may reference ultrasound users such as sonographers. Further, the figures, including FIG. 1A, may make reference to a sonographer; however, such term should be seen as interchangeable with any ultrasound user, regardless of specific certification, unless otherwise indicated.
  • information constituting the clinical context 102 can include patient type, which in this particular example includes categories such as “easy,” “moderate,” and “difficult.”
  • the clinical context 102 can also be based on whether the ultrasound exam is being performed at an inpatient or outpatient facility.
  • the patient history, reason for exam, and model of the ultrasound system being used are further included in this example of a clinical context 102.
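  • Purely as an illustration of how such a clinical context might be passed to an intelligence system, the sketch below encodes a context record into a numeric feature vector; the field names, category lists, one-hot layout, and scaling are assumptions for the example, not the schema used by the disclosed system.

```python
from dataclasses import dataclass

# Illustrative category lists; an actual system may support different values.
PATIENT_TYPES = ["easy", "moderate", "difficult"]
EXAM_TYPES = ["cardiac", "pulmonary", "vascular", "abdominal"]
SETTINGS = ["inpatient", "outpatient"]
MACHINE_MODELS = ["Affiniti 70", "other"]

@dataclass
class ClinicalContext:
    patient_type: str     # e.g. "difficult"
    exam_type: str        # e.g. "cardiac"
    setting: str          # "inpatient" or "outpatient"
    machine_model: str    # e.g. "Affiniti 70"
    patient_bmi: float
    patient_age: float

def one_hot(value, categories):
    return [1.0 if value == c else 0.0 for c in categories]

def encode(ctx: ClinicalContext) -> list:
    """Concatenate one-hot categorical fields with crudely scaled numeric fields."""
    return (one_hot(ctx.patient_type, PATIENT_TYPES)
            + one_hot(ctx.exam_type, EXAM_TYPES)
            + one_hot(ctx.setting, SETTINGS)
            + one_hot(ctx.machine_model, MACHINE_MODELS)
            + [ctx.patient_bmi / 50.0, ctx.patient_age / 100.0])

features = encode(ClinicalContext("difficult", "cardiac", "outpatient", "Affiniti 70", 34.2, 61.0))
print(len(features), features)  # 13 features in this illustrative encoding
```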
  • the data constituting the clinical context 102 can be provided as input received by an artificial intelligence system represented in the illustrated embodiment as a neural network 104.
  • the neural network 104 can be trained to predict the expected usage 106 of an ultrasound user’s scanning hand, for example in the form of an expected electromagnetic tracker output, based on the clinical context 102.
  • the expected electromagnetic tracker output can include a total amount and/or degree of translational and/or rotational scanning hand movement. Changes to the clinical context may change the expected usage, even for the same exams performed with the same equipment.
  • the expected activity for the same pulmonary ultrasound exam performed using the same ultrasound machine can differ between two patients, especially if one patient is considered “difficult” and the other is considered “easy” based on their respective physical attributes, such as BMI.
  • Clinical context can also impact the expected duration of an ultrasound exam, such that a longer expected duration likely translates into higher expected usage levels.
  • Clinical information about a patient undergoing an ultrasound exam is therefore utilized by the neural network 104 to refine the expected usage 106 of the scanning hand during a given ultrasound exam.
  • the actual usage 108 of an ultrasound user’s scanning hand can be determined by mining and extracting the translational and rotational coordinates of the ultrasound transducer obtained from the electromagnetic tracking system during an ultrasound exam.
  • This data can also be analyzed by one or more processors of the system 100 in view of the relevant clinical context 102, such that actual usage levels are properly considered together with the corresponding clinical context.
  • a high activity level measured for an “easy” patient may indicate a likely inefficiency in the ultrasound user’s scanning hand usage, which may be confirmed by the system depending on the output 106 of the neural network 104.
  • Efficiency in this context may be determined by the amount of non-essential linear and/or rotational motion of the scanning hand, such that a greater amount of non-essential motion is less efficient than a lesser amount of non-essential motion.
  • Non-essential motion may increase if the amount of transducer twisting required to obtain a certain image, for example, is greater than that utilized by an experienced ultrasound user to obtain the same image.
  • the actual usage 108 obtained from the electromagnetic tracking device can then be compared to the expected usage 106 predicted by the neural network 104 to determine the adherence (or deviation) of the ultrasound user’s current performance relative to the performance expected for the same exam performed on a patient having similar attributes.
  • the result of this comparison can be generated and subsequently displayed on a graphical user interface 110 in the form of a customizable health status report 112, as further described below in connection with FIG. 4.
  • the total usage of the scanning hand determined by the disclosed systems from the movement detected by the electromagnetic tracking device can then be compared to the expected usage determined by the aforementioned intelligence system to identify performance deviations, which may reveal user inefficiencies.
  • FIG. 2 shows two examples of electromagnetic tracking systems that may be utilized herein.
  • the first system shown is an integrated tracking system 200, which includes an ultrasound probe or transducer 202 that contains an electromagnetic tracking device integrated within the device.
  • a processor 204 physically and/or communicatively coupled to the transducer 202 is configured to detect the position and orientation of the electromagnetic tracking device within the transducer throughout an ultrasound exam.
  • the second system shown is an attached tracking system 206, which features an external electromagnetic tracking device 208 attached to the ultrasound transducer 210.
  • An external field generator 212 is also shown, as is a processor 214 physically and/or communicatively coupled to the external electromagnetic tracking device 208 and configured to detect the position and orientation of the tracking device 208, and thus of the transducer 210.
  • Both ultrasound transducers 202, 210 are configured to acquire images of a target region within a patient during an ultrasound exam.
  • the ultrasound user performing an ultrasound exam can move and manipulate the transducer with his/her scanning hand, such that the scanning hand is moving longitudinally and/or laterally over the surface of the patient.
  • FIG. 3 is a depiction of a neural network that may be trained and implemented to generate an expected usage of an ultrasound user’s scanning hand based on a particular clinical context.
  • the neural network 300 may include an input layer 302 configured to receive a variety of discrete clinical context datasets.
  • the number of nodes or neurons in the input layer 302 may vary, and while only one neuron is depicted for illustrative purposes, non-limiting embodiments may include a number of neurons equal to the number of variables included in the clinical context training set(s), or the number of training set variables plus one.
  • the neural network 300 can be trained to receive a clinical context at the input layer 302 and generate an expected usage output based on the ground truth usage of experienced ultrasound users.
  • Embodiments of the neural network may be configured to implement an algorithmic regressive prediction model.
  • the output layer 304 of the neural network 300 can provide an expected usage, which can be parsed into individual activities, e.g., translational movement and rotational movement, represented in FIG. 3 as expected usage output neurons 304a and 304b, respectively.
  • the ground truth for the expected values of rotational and/or translational movement can be retrieved from usages previously employed by experienced ultrasound users for each given upstream clinical context.
  • the number of neurons in the output layer 304 may vary.
  • the output layer 304 may include one total neuron or one neuron for each movement constituting the total usage.
  • a custom score may be generated for each of the outputs.
  • a range may be assigned to each of the outputs, which may be based on uncertainty levels, so that if a ultrasound user’s actual usage falls within the defined usage range, that usage is deemed acceptable, normal, or efficient.
  • the risk of scanning hand overuse can be calculated based on averaging (or weighted averaging) the ratio of the ultrasound user’s output values to the expected values of the experienced ultrasound user for each output item. This ratio can be determined in some examples according to Equation 1.1:
  • Equation 1.1: risk ratio = [W1 · (actual translational usage / expected translational usage) + W2 · (actual rotational usage / expected rotational usage)] / (W1 + W2)
  • W1 and W2 represent the weight factors based on the assigned importance of each output item. For example, it is expected that the rotational scanning-hand motion for some tasks, such as cardiac imaging, in difficult patients is more complex than the translational hand motion. Therefore, a higher weight factor can be assigned to W2 compared to W1. Ideally, the risk ratio should be close to 1. If the ratio is much higher than 1, it can be flagged with additional remarks (e.g., overuse due to excessive scanning hand rotational motion), which can be displayed on a graphical user interface, as described in connection with FIG. 4.
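  • A minimal sketch of this weighted risk ratio and the subsequent flagging step is shown below; the default weights, the flagging threshold, and the remark wording are illustrative assumptions.

```python
def risk_ratio(actual_trans, expected_trans, actual_rot, expected_rot,
               w1=1.0, w2=2.0):
    """Weighted average of actual/expected ratios (Equation 1.1); w2 > w1 reflects
    the higher importance assigned to rotational motion for e.g. cardiac exams."""
    weighted = (w1 * (actual_trans / expected_trans)
                + w2 * (actual_rot / expected_rot))
    return weighted / (w1 + w2)

def overuse_remark(ratio, threshold=1.25):
    # Ideally the ratio is close to 1; a value well above 1 is flagged for the GUI.
    if ratio > threshold:
        return f"Scanning hand overuse suspected (risk ratio {ratio:.2f})"
    return f"Usage within the expected range (risk ratio {ratio:.2f})"

# Example with made-up totals: 1200 mm vs 1000 mm translation, 9.5 rad vs 6.0 rad rotation.
print(overuse_remark(risk_ratio(1200.0, 1000.0, 9.5, 6.0)))
```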
  • Between the input layer 302 and the output layer 304 can be one or more hidden layers 306 configured to assign and optimize weights associated with the clinical context inputs, for example via backpropagation, and apply the weighted inputs to an activation function, e.g., the rectified linear activation function.
  • the number of hidden layers and the number of neurons present therein may also vary. In some embodiments, the number of hidden neurons in a given hidden layer may equal the product of the total number of input neurons and output neurons, together multiplied by the number of datasets used to train the network.
  • the particular neural network(s) implemented in accordance with the disclosed embodiments may vary. The number of neurons may change, for example, along with their arrangement within the network. The size, width, depth, capacity, and/or architecture of the network may vary.
  • the neural network 300 may be hardware- (e.g., neurons are represented by physical components) or software-based (e.g., neurons and pathways implemented in a software application), and can use a variety of topologies and learning algorithms for training the neural network to produce the desired output.
  • a software-based neural network may be implemented using a processor (e.g., single- or multi-core CPU, a single GPU or GPU cluster, or multiple processors arranged for parallel-processing) configured to execute instructions, which may be stored in a computer-readable medium, and which when executed cause the processor to perform a machine-trained algorithm for receiving clinical context input(s) and generating an expected usage level of the scanning hand of a ultrasound user performing the ultrasound exam included within the input clinical context.
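  • One hedged sketch of such a software-based network, written here with PyTorch, maps an encoded clinical-context vector to the two expected usage outputs; the framework choice, layer sizes, and the 13-feature input width are assumptions for illustration only.

```python
import torch
from torch import nn

class ExpectedUsageNet(nn.Module):
    """Small regression network: clinical-context features in, expected usage out."""
    def __init__(self, n_context_features: int, n_hidden: int = 32):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_context_features, n_hidden),
            nn.ReLU(),                  # rectified linear activation, as noted above
            nn.Linear(n_hidden, n_hidden),
            nn.ReLU(),
            nn.Linear(n_hidden, 2),     # one output per movement: translational, rotational
        )

    def forward(self, context: torch.Tensor) -> torch.Tensor:
        return self.layers(context)

model = ExpectedUsageNet(n_context_features=13)
expected = model(torch.randn(1, 13))    # [[expected translational, expected rotational]]
```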
  • the neural network 300 may be implemented, at least in part, in a computer-readable medium comprising executable instructions, which when executed by a processor, may cause the processor to perform a machine-trained algorithm to output expected activity levels for the scanning hand of a ultrasound user performing a particular exam.
  • the neural network(s) may be trained using any of a variety of currently known or later developed machine learning techniques to obtain a neural network (e.g., a machine-trained algorithm or hardware-based system of nodes) configured to analyze input data in the form of patient- and exam-specific information collectively constituting a clinical context.
  • the ground truth used for training the network 300 can include documented activity levels of expert, experienced, or average ultrasound users exerted in the same or similar clinical context.
  • Supervised learning models can be trained on a comprehensive data set of clinical contexts and associated usages. The accuracy of the neural network can thus grow stronger over time as more data is input.
  • the model may be (partially) realized as an AI-based learning network.
  • the computer-implemented techniques utilized to generate the expected usage may vary, and may involve artificial intelligence-based processing, e.g., sophisticated supervised machine learning.
  • the neural network 300 can also be coupled to a training database 308.
  • the training database 308 may provide a large sample of clinical context data sets and corresponding usages used to train the neural network.
  • Communication between the training database 308 and the neural network 300 can be bidirectional, such that the training database 308 may provide usages obtained by experienced ultrasound users to the network for training purposes, and the neural network 300 can transmit new clinical context datasets for storage in the training database 308, thereby increasing the sample size of clinical contexts paired with usage outputs and further refining future output from the neural network.
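  • A supervised training loop over such context/usage pairs might look like the sketch below; the synthetic tensors stand in for rows fetched from a training database, and the loss choice, optimizer, and hyperparameters are assumptions rather than the disclosed training algorithm.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data standing in for encoded clinical contexts paired with the
# ground-truth usages recorded for experienced ultrasound users.
contexts = torch.randn(256, 13)
expert_usage = torch.rand(256, 2) * 1000.0   # [translational, rotational] targets

loader = DataLoader(TensorDataset(contexts, expert_usage), batch_size=32, shuffle=True)
model = nn.Sequential(nn.Linear(13, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()                        # regression on the two usage outputs

for epoch in range(20):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()                       # backpropagation, as described above
        optimizer.step()
```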
  • While neural networks (e.g., network 300) may be utilized to generate expected usages of an ultrasound user’s scanning hand, embodiments are not confined to neural networks, and a number of additional and alternative intelligence systems or components may be utilized, such as random forests or support vector machines.
  • FIG. 4 shows a graphical user interface (GUI) 400, which may also be considered a “dashboard” or “panel,” configured to display a customized health status report 402 corresponding to a particular ultrasound user, who can be selected from a list of ultrasound users included in a dropdown menu 404.
  • the GUI 400 can receive free text and/or search entries corresponding to specific ultrasound users.
  • the health status report 402 can be displayed via the GUI 400 to various personnel, including ultrasound users or clinical lab managers seeking periodic health status reports of scanning hand usage.
  • the report 402 can include selectable performance output 406, which can include options such as “average exam” or “single exam.” As the displayed options indicate, an average exam performance output causes the report 402 to show an average performance output determined based on two or more exams, whereas the single exam performance output causes the report 402 to show the performance output of a single exam.
  • the report 402 can also feature a selectable time period or date range 408 over which a given ultrasound user’s scanning hand usage is determined. The ultrasound user’s scanning hand performance may vary depending on the date range specified at the GUI 400.
  • a wide date range, e.g., one year or longer, may reveal approximately average efficiency levels, whereas a date range spanning the first six months of that same one-year period may show low-efficiency levels, and a date range spanning the second six months of the one-year period may show high-efficiency levels.
  • the date range can span less than one day and stretch as long as months or years.
  • the report 402 can thus provide detailed, customizable information to benchmark certain performance metrics and enable continuous performance improvement.
  • the report 402 can also include an exam number selection 410, which allows the user to view information regarding a specific exam performed on a specific day. In the illustrated example, the user selected exam number 3 performed on November 12, 2020.
  • an ultrasound user can thus query the system on demand to provide a health status report 402 based solely on a specific ultrasound exam, which may be the most recent exam performed, or on multiple exams performed over a defined period also specified by an ultrasound user. For example, an ultrasound user can query the system to provide a scanning hand health status report based on all exams performed within the previous week, month, or year. In this manner, the ultrasound user can determine how a particular exam or collection of exams has impacted the current health status of his/her scanning hand.
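  • One minimal sketch of the query behind such a report filters stored per-exam records by ultrasound user, date range, and optional exam number, then averages the metrics; the record fields and their units are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean

@dataclass
class ExamRecord:
    user: str
    exam_date: date
    exam_number: int
    translational_efficiency: float   # percent
    rotational_efficiency: float      # percent

def query_report(records, user, start, end, exam_number=None):
    """Return averaged efficiencies for the exams matching the user inquiry."""
    selected = [r for r in records
                if r.user == user and start <= r.exam_date <= end
                and (exam_number is None or r.exam_number == exam_number)]
    if not selected:
        return None
    return {
        "exams": len(selected),
        "translational_efficiency": mean(r.translational_efficiency for r in selected),
        "rotational_efficiency": mean(r.rotational_efficiency for r in selected),
    }

records = [ExamRecord("sonographer_01", date(2020, 11, 12), 3, 82.0, 61.0)]
print(query_report(records, "sonographer_01", date(2020, 11, 1), date(2020, 11, 30)))
```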
  • the ultrasound user selection (e.g., dropdown menu 404), selectable performance output 406, selectable date range 408, and/or exam number selection 410 can comprise all or part of one or more user inquiries or instructions input at the GUI 400.
  • the details section 412 of the report 402 provides the clinical context corresponding to the selected exam.
  • the clinical context of exam number 3 included a cardiac exam performed on a difficult outpatient using the Affiniti 70 ultrasound machine.
  • a qualitative remarks section 414 provides a summary of the ultrasound user’s scanning hand performance. The level of detail provided in the remarks section 414 can vary. The example shown indicates “scanning hand overuse, particularly extra rotational movement.” To provide additional, more specific information, the scanning hand efficiency graphic 416 parses out the efficiency of the ultrasound user’s total translational usage and total rotational usage.
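  • As an illustration of how the per-movement efficiencies in graphic 416 and the remark in section 414 could be derived, the sketch below expresses each efficiency as a capped expected-to-actual percentage and composes a short remark; the cap, threshold, and wording are assumptions.

```python
def efficiency(expected: float, actual: float) -> float:
    """Expected-to-actual ratio expressed as a percentage, capped at 100%."""
    return min(expected / actual, 1.0) * 100.0

def remarks(trans_eff: float, rot_eff: float, threshold: float = 80.0) -> str:
    issues = []
    if trans_eff < threshold:
        issues.append("extra translational movement")
    if rot_eff < threshold:
        issues.append("extra rotational movement")
    if issues:
        return "Scanning hand overuse, particularly " + " and ".join(issues)
    return "Scanning hand usage within the expected range"

# Example: efficient translation, inefficient rotation (made-up totals).
print(remarks(efficiency(1000.0, 1100.0), efficiency(6.0, 9.5)))
```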
  • Additional information included in the health status report 402 can include the health status of the ultrasound user, which may be based on one or more examinations, and the extent to which the ultrasound user’s performance metrics deviated from the expected metrics.
  • ultrasound users can query the health status of their scanning hand by clicking on a tag/button/other feature on the ultrasound imaging screen of the GUI 400 at the end of an ultrasound exam.
  • the disclosed systems can store and utilize ultrasound user-specific performance metrics to generate individualized health reports on a daily, weekly, biweekly, monthly, quarterly, and/or annual basis.
  • the GUI 400 can be configured to generate and display confidence levels depending on the number of exams included in a specified time period. For instance, confidence levels may be relatively low if a small number of exams is encompassed within a specified date range, whereas confidence levels may be relatively high for a larger number of exams.
  • a live pop-up message or alert 418 can be automatically generated and displayed on the GUI when abnormal (e.g., very large) translational and/or rotational movement is detected during an exam, as compared to the expected values forecasted by the predictive system for the same upstream clinical context. Enabling live alerts upon recognizing these abnormalities provides real-time guidance for ultrasound users, which may minimize potential overuse and injury risk to the scanning hand. In some examples, systems herein can connect an ultrasound user virtually to an expert who can recommend the implementation of any necessary adjustments.
  • the live alert 418 can be displayed on a health status report configured to monitor and display information regarding a current exam, instead of a previous exam.
  • the live alert 418 can also be displayed on the GUI 400, but not as a component of the health status report. In some examples, the live alert 418 can be displayed on a separate GUI or screen configured to display live ultrasound images obtained during the exam, and may be displayed in a manner that does not unnecessarily disrupt the current exam.
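  • A hedged sketch of that live check follows: the running actual usage for the current exam is compared against the predicted usage, and an alert is raised when either component exceeds a deviation threshold; the threshold value and the notify callback are assumptions.

```python
def check_live_alert(running_actual: dict, expected: dict,
                     threshold: float = 1.5, notify=print) -> bool:
    """running_actual/expected map 'translational' (mm) and 'rotational' (radians)."""
    for component in ("translational", "rotational"):
        ratio = running_actual[component] / expected[component]
        if ratio > threshold:
            notify(f"Alert 418: abnormal {component} scanning hand movement "
                   f"({ratio:.1f}x the level expected for this clinical context)")
            return True
    return False

# Example mid-exam check with made-up running totals.
check_live_alert({"translational": 900.0, "rotational": 11.0},
                 {"translational": 1000.0, "rotational": 6.0})
```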
  • the GUI 400 can be physically and/or communicatively coupled to one or more underlying processors 420 configured to generate and modify the displayed graphics in response to different user inputs received at the GUI 400, for example regarding the date range over which scanning hand efficiency is calculated and the qualitative remarks associated therewith.
  • One or more of the processors 420 can be configured to operate an intelligence system, such as neural network 300.
  • one or more of the processors 420 can be configured to determine the position and orientation of an electromagnetic sensor, and may therefore form part of the electromagnetic tracking system configured to determine scanning hand usages.
  • the one or more processors 420 can also be configured to compare actual scanning hand usages to expected scanning hand usages.
  • Scanning hand data acquired over time can be stored in one or more data storage devices 422, which may be coupled with the graphical user interface 400 and/or the one or more processors 420.
  • the GUI 400 can be configured to identify, extract, and/or receive the stored data required to generate a particular health status report 402 in accordance with user-specified parameters received at the GUI 400.
  • the GUI 400 and one or more processors 420 coupled therewith can mine the data storage device(s) 422 for one or more datasets regarding the scanning hand usage of a particular ultrasound user over a specific time period, along with any particular exams performed during that time period.
  • the GUI 400 can thus be configured to initiate and/or perform a variety of data extractions and analyses, receive data corresponding to such extractions and analyses, and generate graphical displays in the form of a health status report 402 customized in view of the same.
  • the GUI 400 can also be configured to generate and/or modify the graphics displayed on the health status report 402 in accordance with additional user inputs.
  • the scanning hand efficiency section 416 may be modified to remove the circular efficiency ratings and/or replace them with a linear efficiency display or numerical efficiency display.
  • the GUI 400 can also be configured to display absolute data regarding scanning hand usage, for example in terms of usage hours and the total translational and rotational movement accrued during certain periods.
  • the GUI 400 can be configured to selectively obtain/receive and display data corresponding to a particular clinical context.
  • the GUI 400 can obtain/receive and display data corresponding to a particular patient type (e.g., “difficult”), a particular model of ultrasound machine, a particular time of day (e.g., AM or PM), and a particular type of exam (e.g., cardiac).
  • Ultrasound users and lab managers may therefore use the GUI 400 to identify specific areas needing improvement, which may also be used to adjust ultrasound user scheduling.
  • the graphical user interface 400 can convey risk-enhancing activities undertaken by a particular ultrasound user, for example in the form of over-usage and inefficiencies itemized by activity type, e.g., rotational motion, thereby enabling the ultrasound user to minimize the risk of injury.
  • one or more of the systems disclosed herein may be configured to predict a future time at which the scanning hand of an ultrasound user is likely to develop an injury, such as carpal tunnel syndrome.
  • the systems disclosed herein can be configured to recommend certain actions based on this information. For example, embodiments can be configured to recommend that an ultrasound user perform less of a certain exam and/or that an ultrasound user receive additional training to improve his/her efficiency with respect to a specific exam.
  • FIG. 5 is a block diagram illustrating an example processor 500 according to principles of the present disclosure.
  • processors utilized to implement the disclosed embodiments may be configured the same as or similar to processor 500.
  • Processor 500 may be used to implement one or more processes described herein.
  • processor 500 may be configured to implement an artificial intelligence system configured to generate predicted usage levels, such as neural network 300.
  • the processor 500 can also be configured to receive a clinical context data set and determine an actual usage of the scanning hand of the ultrasound user expended during the ultrasound exam.
  • the same processor, or a different processor configured similarly, can also compare the expected usage to the actual usage to generate a performance metric of the scanning hand of the ultrasound user.
  • the processor 500 can also be configured as a graphics processor programmed to generate displays for the graphical user interfaces described herein.
  • the processor 500 can also be configured to determine an actual usage of the scanning hand of a ultrasound user during the ultrasound exam by determining the total translational movement and total rotational movement of the ultrasound transducer.
  • the processor 500, or a different processor configured similarly, can be physically and/or communicatively coupled with an electromagnetic tracking device according to embodiments disclosed herein.
  • Processor 500 may be any suitable processor type including, but not limited to, a microprocessor, a microcontroller, a digital signal processor (DSP), a field programmable gate array (FPGA) where the FPGA has been programmed to form a processor, a graphical processing unit (GPU), an application specific integrated circuit (ASIC) where the ASIC has been designed to form a processor, or a combination thereof.
  • the processor 500 may include one or more cores 502.
  • the core 502 may include one or more arithmetic logic units (ALU) 504.
  • the core 502 may include a floating point logic unit (FPLU) 506 and/or a digital signal processing unit (DSPU) 508 in addition to or instead of the ALU 504.
  • the processor 500 may include one or more registers 512 communicatively coupled to the core 502.
  • the registers 512 may be implemented using dedicated logic gate circuits (e.g., flip-flops) and/or any memory technology. In some examples, the registers 512 may be implemented using static memory.
  • the registers 512 may provide data, instructions, and addresses to the core 502.
  • processor 500 may include one or more levels of cache memory 510 communicatively coupled to the core 502.
  • the cache memory 510 may provide computer-readable instructions to the core 502 for execution.
  • the cache memory 510 may provide data for processing by the core 502.
  • the computer-readable instructions may have been provided to the cache memory 510 by a local memory, for example, local memory attached to the external bus 516.
  • the cache memory 510 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
  • the processor 500 may include a controller 514, which may control input to one or more processors included herein, e.g., processors 204, 214. Controller 514 may control the data paths in the ALU 504, FPLU 506 and/or DSPU 508. Controller 514 may be implemented as one or more state machines, data paths and/or dedicated control logic. The gates of controller 514 may be implemented as standalone gates, FPGA, ASIC or any other suitable technology.
  • the registers 512 and the cache memory 510 may communicate with controller 514 and core 502 via internal connections 520A, 520B, 520C and 520D.
  • Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology.
  • Inputs and outputs for the processor 500 may be provided via a bus 516, which may include one or more conductive lines.
  • the bus 516 may be communicatively coupled to one or more components of processor 500, for example the controller 514, cache 510, and/or register 512.
  • the bus 516 may be coupled to one or more components of the system.
  • the bus 516 may be coupled to one or more external memories.
  • the external memories may include Read Only Memory (ROM) 532.
  • ROM 532 may be a masked ROM, Electronically Programmable Read Only Memory (EPROM) or any other suitable technology.
  • the external memory may include Random Access Memory (RAM) 533.
  • RAM 533 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology.
  • the external memory may include Electrically Erasable Programmable Read Only Memory (EEPROM) 535.
  • the external memory may include Flash memory 534.
  • the external memory may include a magnetic storage device such as disc 536.
  • FIG. 6 is a flow diagram of a method of evaluating and displaying ultrasound user performance.
  • the example method 600 shows the steps that may be utilized, in any sequence, by the systems and/or apparatuses described herein.
  • although examples of the present system have been illustrated with particular reference to ultrasound imaging modalities, the present system can be extended to other medical imaging systems where one or more images are obtained in a systematic manner.
  • the method 600 may be performed with the aid of one or more imaging systems including but not limited to MRI or CT.
  • the method involves “receiving a user inquiry regarding a scanning hand performance of a ultrasound user over a time period.”
  • the method 600 involves “obtaining a scanning hand performance metric indicative of the scanning hand performance of the ultrasound user over the time period, the scanning hand performance metric based on a comparison of an actual scanning hand usage to an expected scanning hand usage determined over the time period.”
  • the method 600 involves “displaying the performance metric,” which may comprise a scanning hand efficiency rating.
  • the method 600 can further involve adjusting a depiction of the performance metric, for example in response to a user input, which may comprise an instruction or query.
  • the expected scanning hand usage can be based on one or more clinical contexts, for example the clinical context 102 shown in FIG. 1.
  • Non-limiting examples of clinical contexts can include a description of an ultrasound exam conducted over the selected time period, a model of ultrasound machine utilized over the selected time period, at least one attribute of at least one patient examined over the selected time period, and/or the medical history of at least one patient examined over the selected time period.
  • the actual scanning hand usage can be based on the rotational and/or translational movement of an ultrasound transducer caused by an ultrasound user while performing one or more ultrasound exams over the selected time period.
  • Transducer movement can be determined at least in part by an electromagnetic tracking device included within or attached to the transducer, but embodiments are not limited to electromagnetic tracking and may include additional tracking systems that include an accelerometer and/or gyroscope, for example.
  • the method can also involve generating and displaying qualitative remarks or messages describing the performance metric and/or recommended adjustments to be implemented by the selected ultrasound user over time.
  • One or more of the steps included in the method 600 may be implemented automatically by a disclosed system, such that user input is not required.
  • Display of the performance metric, for example, may be performed automatically by the GUI operating together with one or more processors.
  • Suggested adjustments for a given ultrasound user can also be updated automatically in response to the selection of different time periods and/or exams performed therein.
  • the storage media can provide the information and programs to the device, thus enabling the device to perform functions of the systems and/or methods described herein.
  • the computer could receive the information, appropriately configure itself and perform the functions of the various systems and methods outlined in the diagrams and flowcharts above to implement the various functions. That is, the computer could receive various portions of information from the disk relating to different elements of the above-described systems and/or methods, implement the individual systems and/or methods and coordinate the functions of the individual systems and/or methods described above.
  • processors described herein can be implemented in hardware, software, and firmware. Further, the various methods and parameters are included by way of example only and not in any limiting sense. In view of this disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and needed equipment to effect these techniques, while remaining within the scope of the invention.
  • the functionality of one or more of the processors described herein may be incorporated into a fewer number or a single processing unit (e.g., a CPU) and may be implemented using application specific integrated circuits (ASICs) or general-purpose processing circuits which are programmed responsive to executable instructions to perform the functions described herein.
  • the present system may be used to obtain and/or project image information related to, but not limited to renal, testicular, breast, ovarian, uterine, thyroid, hepatic, lung, musculoskeletal, splenic, and cardiac applications.
  • the present system may also include one or more programs which may be used with conventional imaging systems so that they may provide features and advantages of the present system. Certain additional advantages and features of this disclosure may be apparent to those skilled in the art upon studying the disclosure, or may be experienced by persons employing the novel system and method of the present disclosure. Another advantage of the present systems and method may be that conventional medical image systems can be easily upgraded to incorporate the features and advantages of the present systems, devices, and methods.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Data Mining & Analysis (AREA)
  • Primary Health Care (AREA)
  • Pathology (AREA)
  • Databases & Information Systems (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present disclosure describes systems configured to monitor and evaluate the performance of the scanning hand of ultrasound users. The systems can include a neural network trained to receive information regarding the clinical context of an ultrasound exam performed by an ultrasound user, and based on the clinical context, generate an expected usage of the ultrasound user's scanning hand. The systems can include a processor configured to determine an actual usage of the scanning hand of the ultrasound user expended during the ultrasound exam and compare the actual usage to the expected usage to generate a performance metric displayed on a user interface. The processor can determine the actual usage of the ultrasound user's scanning hand by determining the translational and rotational movement of the ultrasound transducer using an electromagnetic tracking system.

Description

USER SCANNING HAND EVALUATION
TECHNICAL FIELD
[001] The present disclosure pertains to systems and methods for determining the usage of an ultrasound user’s scanning hand in every ultrasound exam. Particular implementations include systems configured to determine the movement of an ultrasound user’s scanning hand during a medical imaging examination and assess the efficiency of such movement by implementing an intelligent computer-based prediction system.
BACKGROUND
[002] Work-related musculoskeletal disorders (WRMSDs) are painful conditions caused by workplace activities affecting the muscles, ligaments, and tendons. Aside from interfering with workers’ ability to perform work-related tasks, WRMSDs often impose a substantial personal toll on those affected since they may no longer be able to perform personal tasks and routine activities of daily living. Unlike acute injuries, WRMSDs develop gradually over time from repeated exposure to a variety of risk factors and may be painful during work and even at rest.
[003] Ultrasound users develop WRMSDs at an especially high rate. Despite improvements in the flexibility of ultrasound systems and exam tables, it has been reported that about 90% of clinical ultrasound users experience symptoms of WRMSDs, 20% of which suffer career-ending injuries. Studies have also shown that 60% of ultrasound users experience wrist/hand/finger discomfort in scanning and scanning hands caused by WRMSDs. Cross-sectional studies have revealed the high prevalence of WRMSD symptoms in the scanning hand of ultrasound users, particularly in the wrist (59.2%-76.9%) and the hand/fingers (55.3%-74%). Excessive twisting motion of the scanning hand during an ultrasound exam, for example, is positively related to symptoms of carpal tunnel syndrome. These findings highlight the extent and severity of scanning hand injuries commonly experienced by ultrasound users, which also limits patient access for those in need of ultrasound examination.
[004] Improved technologies are therefore needed to reduce the prevalence of WRMSDs caused by injuries to the scanning hand of ultrasound users.
SUMMARY
[005] The present disclosure describes systems and methods for monitoring and reducing physical stress in the scanning hand of an ultrasound user. During an ultrasound exam, the ultrasound user’s scanning hand moves the ultrasound probe over the target area of the patient while the non-scanning hand engages with a control panel of the ultrasound system. The systems disclosed herein are configured to monitor, measure, and determine the usage of the ultrasound user’s scanning hand during an ultrasound exam and determine whether the measured usage deviates from the usage expected for the same ultrasound exam performed on a similar patient. Disclosed systems can include graphical user interfaces configured to extract and display user-defined datasets regarding scanning hand usage and performance. The graphical user interface, together with one or more additional processors, can display scanning hand performance metrics reflecting ultrasound user performance during a single exam, or across multiple exams performed over periods lasting weeks, months, or even years.
[006] In accordance with some embodiments disclosed herein, an ultrasound user performance evaluation system may include or be communicatively coupled with an image acquisition device configured to acquire images of a patient during an ultrasound exam. The system can also include or be communicatively coupled with an electromagnetic tracking device. The acquisition device and the tracking device may be coupled with one or more processors disclosed herein. A control panel can also be included in or communicatively coupled with the system to adjust imaging parameters upon receiving input by the non-scanning hand of the ultrasound user performing a given exam. One or more processors can be configured to receive a clinical context input by the ultrasound user and/or the clinical context input may be received from another processor or database. The one or more processors can apply an intelligence system, which may include a neural network, to the clinical context. The intelligence system can be configured to generate an expected usage of a scanning hand of the ultrasound user based on the received clinical context. The one or more processors can also determine an actual usage of the scanning hand of the ultrasound user expended during the ultrasound exam, and compare the expected usage to the actual usage to generate a performance metric of the scanning hand of the ultrasound user. The performance metric can then be displayed on a graphical user interface communicatively coupled with the processor(s).
[007] In accordance with embodiments of the present disclosure, an ultrasound user performance evaluation system can include one or more processors configured to determine a scanning hand performance metric indicative of a scanning hand usage of an ultrasound user over a time period in response to a user inquiry. The scanning hand performance metric can be based on a comparison of an actual scanning hand usage to an expected scanning hand usage determined over the time period. The system can also include a graphical user interface configured to receive the user inquiry, obtain the scanning hand performance metric, and display the scanning hand performance metric.
[008] In some examples, the one or more processors can be further configured to receive one or more clinical contexts. Each clinical context can include a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof. In some examples, the one or more processors can be further configured to apply a neural network to each of the clinical contexts. The neural network can be configured to generate the expected scanning hand usage based on each of the clinical contexts.
[009] In some examples, the one or more processors can be further configured to determine the actual scanning hand usage expended during each ultrasound exam conducted over the time period. In some examples, the one or more processors can be configured to determine the actual scanning hand usage by tracking ultrasound transducer movement caused by the ultrasound user performing each ultrasound exam over the time period. In some examples, ultrasound transducer movement comprises an input of translational movement and rotational movement as detected by an electromagnetic tracking device.
[010] In some examples, the performance metric comprises an efficiency rating of the actual scanning hand usage. In some examples, the performance metric comprises efficiency ratings corresponding to the translational movement and/or rotational movement. In some examples, the graphical user interface is further configured to display the time period over which the performance metric is determined, the one or more clinical contexts, qualitative remarks describing the performance metric, or combinations thereof. In some examples, the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs. The training inputs can comprise a sample of clinical context datasets obtained from previously performed ultrasound exams, and the known outputs can comprise performance metrics of the scanning hand of ultrasound users who performed the previous ultrasound exams. In some examples, the graphical user interface can be further configured to adjust a depiction of the performance metric in response to a user input.
[011] In accordance with embodiments of the present disclosure, a method of evaluating and displaying ultrasound user performance can involve receiving a user inquiry regarding a scanning hand performance of an ultrasound user over a time period. The method can also involve obtaining a scanning hand performance metric indicative of the scanning hand performance of the ultrasound user over the time period. The scanning hand performance metric can be based on a comparison of an actual scanning hand usage to an expected scanning hand usage determined over the time period. The method can also involve displaying the performance metric.
[012] In some examples, the method can also involve adjusting a depiction of the performance metric in response to a user input. In some examples, the expected scanning hand usage can be based on one or more clinical contexts. Each clinical context can comprise a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof.
[013] Any of the methods described herein, or steps thereof, may be embodied in a non-transitory computer-readable medium comprising executable instructions, which when executed may cause one or more hardware processors to perform the method or steps embodied herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[014] FIG. 1 is a schematic overview of a system configured to determine and display scanning hand usage and efficiency in accordance with embodiments of the present disclosure.
[015] FIG. 2 is a schematic of two electromagnetic tracking systems utilized in accordance with embodiments of the present disclosure.
[016] FIG. 3 is a schematic of a neural network implemented to determine an expected usage of an ultrasound user's scanning hand during an ultrasound exam in accordance with embodiments of the present disclosure.
[017] FIG. 4 is a graphical user interface configured to generate and display customizable health status reports in accordance with embodiments of the present disclosure.
[018] FIG. 5 is a schematic of a processor utilized in accordance with embodiments of the present disclosure.
[019] FIG. 6 is a flow diagram of a method performed in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[020] The following description of certain embodiments is merely exemplary in nature and is in no way intended to limit the invention or its applications or uses. In the following detailed description of embodiments of the present systems and methods, reference is made to the accompanying drawings, which form a part hereof, and in which are shown, by way of illustration, specific embodiments in which the described systems and methods may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the presently disclosed systems and methods, and it is to be understood that other embodiments may be utilized and that structural and logical changes may be made without departing from the spirit and scope of the present system. Moreover, for the purpose of clarity, certain features will not be described in detail when they would be apparent to those with skill in the art so as not to obscure the description of the present system. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present system is defined only by the appended claims.
[021] The disclosed systems and methods overcome the lack of intelligent, systematic tools for monitoring, determining, and ultimately improving the health of the scanning hand of medical ultrasound users. By tracking the scanning hand motion using an electromagnetic tracking device for each ultrasound exam, embodiments of the disclosed systems determine scanning hand usage, which may comprise translational and rotational hand movement. The actual scanning hand usage can be compared to an expected scanning hand usage predicted by an artificial intelligence system trained to predict scanning hand usage levels based on clinical context. Among other factors, the clinical context typically comprises the particular exam performed, the ultrasound machinery utilized, and various physical attributes of the current patient and his/her medical history. By comparing the actual usage against the expected usage, the disclosed systems can identify deviations from the expected usage, thereby exposing potential inefficiencies related to scanning hand usage. Such inefficiencies may increase the likelihood of developing injuries to the scanning hand, especially if repeated over time. To convey the acquired information regarding scanning hand usage and efficiency in a meaningful way, embodiments also include a graphical user interface configured to display the acquired information in customizable reports. The graphical user interface can display the data directly to the ultrasound user and/or a lab manager overseeing the ultrasound user’s performance. The user interface can thus be displayed locally or across different locations.
[022] The electromagnetic motion tracking devices disclosed herein can detect the pose (position and orientation) of an imaging device moveable in six degrees of freedom using an electromagnetic sensor attached to or integrated within the imaging device, along with an external device configured to establish a fixed reference frame. The imaging device can be an ultrasound transducer or probe, but the embodiments disclosed herein are not limited to ultrasound imaging devices. Utilization of electromagnetic trackers in accordance with the disclosed embodiments enables the localization of small sensors in an electromagnetic field without line-of-sight restrictions, which can impede motion tracking efforts implemented using optical sensors.
[023] As used herein, “electromagnetic tracking device” includes a device or sensor integrated within or attached to an imaging device, such as an ultrasound transducer. An “electromagnetic tracking system” includes the electromagnetic tracking device and an external device configured to provide the fixed reference frame. The external device may comprise a field generator or transmitter configured to emit a low-intensity electromagnetic field through which the electromagnetic tracking device passes during an exam. The electromagnetic field can generate a current within the sensor of the tracking device, which can be detected and converted into a trackable signal by one or more processors included in embodiments disclosed herein. While electromagnetic-based tracking is described herein, additional embodiments consistent with the present disclosure may utilize other hand motion tracking systems, such as camera-based tracking. According to such embodiments, an infrared camera may be utilized for patient privacy. In addition or alternatively, tracking systems featuring accelerometers and gyroscopes may be utilized alone or in combination with other tracking systems. The use of such tracking systems may supplement, and thereby improve, the electromagnetic tracking systems described herein.
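Purely to illustrate the kind of data such a tracking system produces, a tracked pose could be represented with a simple record like the sketch below; the field names, units, and coordinate convention are assumptions for illustration, not the output format of any particular tracker.

```python
from dataclasses import dataclass

@dataclass
class ProbePose:
    """One tracker sample: probe position and orientation (hypothetical format)."""
    t: float    # time stamp in seconds
    x: float    # position in mm, relative to the field generator frame
    y: float
    z: float
    rx: float   # orientation as Euler angles in degrees
    ry: float
    rz: float
```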
[024] As used herein, the terms "usage" and "activity" may be used interchangeably and may include all motions and movements of the ultrasound user's scanning hand. The ultrasound user's scanning hand translational and/or rotational movement can be encompassed within this definition of "usage" or "activity," such that a greater amount and/or degree of translational and/or rotational movement of the ultrasound user's scanning hand required to accomplish the same task increases the measured "usage" or "activity" levels. The usage or activity of an ultrasound user's scanning hand may be measured and/or displayed in the form of one or more performance metrics, which may include a usage or efficiency of total translational and/or rotational movement of the scanning hand, for example.
[025] As used herein, “total linear hand motion” encompasses the total translational hand distance in the X, Y, and Z direction during an ultrasound exam. “Total rotational hand motion” encompasses the total rotational hand movement around the X, Y, and Z axes during an exam.
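As a minimal sketch of how these totals might be accumulated from a stream of tracked poses (for example, samples like the ProbePose records above), the following assumes positions in millimeters and orientations as Euler angles in degrees; the function name and sampling format are assumptions, not something specified by the disclosure.

```python
import numpy as np

def total_hand_motion(positions, orientations):
    """Accumulate total linear and rotational scanning hand motion.

    positions:    (N, 3) array of probe positions (x, y, z), e.g. in mm
    orientations: (N, 3) array of Euler angles around the x, y, z axes, in degrees
    Returns (total_linear, total_rotational_degrees).
    """
    positions = np.asarray(positions, dtype=float)
    orientations = np.asarray(orientations, dtype=float)

    # Total translational motion: sum of Euclidean distances between samples.
    total_linear = np.linalg.norm(np.diff(positions, axis=0), axis=1).sum()

    # Total rotational motion: sum of absolute angle changes around each axis,
    # unwrapped to avoid spurious jumps at the +/-180 degree boundary.
    unwrapped = np.unwrap(np.radians(orientations), axis=0)
    total_rotational = np.abs(np.diff(unwrapped, axis=0)).sum()

    return total_linear, np.degrees(total_rotational)
```

In a system like the one in FIG. 1, totals of this kind accumulated over a whole exam would correspond to the actual usage 108.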
[026] As used herein, “clinical context” and “upstream clinical context” may be used interchangeably and may include or be based on the type of exam being performed, e.g., pulmonary exam or cardiac exam, as well as patient-specific information, non-limiting examples of which may include patient weight, height, body mass index (BMI), age, underlying health condition(s), clinical history, and combinations thereof. Additional information that may be encompassed within the clinical context can include the reason(s) for performing an exam, e.g., to evaluate left ventricular function, along with the patient’s length of stay, e.g., inpatient and/or outpatient time. Information constituting the clinical context can be combined and/or displayed on the user interface in various ways. For example, the clinical context can include categorical descriptors of a patient’s body type and/or associated levels of exam difficulty, such as “easy,” “moderate,” or “difficult.” The clinical information can also include the particular ultrasound model being used to perform the exam, as the scanning hand motion required to perform an ultrasound exam may differ for different models.
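One plausible way to turn such a clinical context into a numeric input for a predictive model is a fixed-length feature vector; the category lists, field names, and normalization below are illustrative assumptions only (the "other" machine category is a placeholder, not a model named in the disclosure).

```python
PATIENT_TYPES = ["easy", "moderate", "difficult"]
EXAM_TYPES = ["cardiac", "pulmonary", "vascular", "other"]
MACHINE_MODELS = ["Affiniti 70", "other"]   # example categories only

def one_hot(value, categories):
    vec = [0.0] * len(categories)
    vec[categories.index(value)] = 1.0
    return vec

def encode_clinical_context(ctx):
    """Encode one clinical context dict into a flat numeric feature vector."""
    return (
        one_hot(ctx["patient_type"], PATIENT_TYPES)
        + one_hot(ctx["exam_type"], EXAM_TYPES)
        + one_hot(ctx["machine_model"], MACHINE_MODELS)
        + [1.0 if ctx["inpatient"] else 0.0]
        + [ctx["patient_bmi"] / 50.0,        # crude normalisation for illustration
           ctx["patient_age"] / 100.0]
    )
```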
[027] As used herein, "performance metrics" can include total translational scanning hand movement and/or total rotational scanning hand movement, one or more of which may be compared to the expected total translational scanning hand movement and/or the expected total rotational scanning hand movement to generate efficiency scores or ratings.
[028] As used herein, the "health status" of an ultrasound user's scanning hand can encompass the health of various anatomical parts associated with, comprising, or attached to the ultrasound user's scanning hand, non-limiting examples of which may include the ultrasound user's shoulder, elbow, forearm, wrist, hand, finger(s), or combinations thereof. The "health status" can also embody an overall indication of an ultrasound user's scanning hand health based on the total stresses incurred by each body part directly or indirectly connected to the scanning hand.
[029] As used herein, “expert” or “experienced” ultrasound users may include certified ultrasound users having at least a certain length of experience, such as at least one year, two years, three years, four years, five years, or longer. “Expert” or “experienced” ultrasound users may also include ultrasound users who have received or attained a form of recognized achievement or certification.
[030] Various ultrasound-based exams are contemplated herein, non-limiting examples of which include diagnostic imaging, cardiac imaging, vascular imaging, lung imaging, and combinations thereof. The particular exam being performed likely impacts the scanning hand activity of the ultrasound user, especially if an exam requires the acquisition of a greater number of images at a variety of depths and/or angles.
[031] FIG. 1 depicts an overview of a scanning hand monitoring and evaluation system 100 implemented in accordance with embodiments described herein. The techniques disclosed herein may reference ultrasound users such as sonographers. Further, the figures, including FIG. 1, may make reference to a sonographer; however, such term should be seen as interchangeable with any ultrasound user, regardless of specific certification, unless otherwise indicated. As shown, information constituting the clinical context 102 can include patient type, which in this particular example includes categories such as "easy," "moderate," and "difficult." The clinical context 102 can also be based on whether the ultrasound exam is being performed at an inpatient or outpatient facility. The patient history, reason for exam, and model of the ultrasound system being used are further included in this example of a clinical context 102.
[032] As further shown, the data constituting the clinical context 102 can be provided as input received by an artificial intelligence system represented in the illustrated embodiment as a neural network 104. The neural network 104 can be trained to predict the expected usage 106 of an ultrasound user's scanning hand, for example in the form of an expected electromagnetic tracker output, based on the clinical context 102. The expected electromagnetic tracker output can include a total amount and/or degree of translational and/or rotational scanning hand movement. Changes to the clinical context may change the expected usage, even for the same exams performed with the same equipment. For example, the expected activity for the same pulmonary ultrasound exam performed using the same ultrasound machine can differ between two patients, especially if one patient is considered "difficult" and the other is considered "easy" based on their respective physical attributes, such as BMI. Clinical context can also impact the expected duration of an ultrasound exam, such that a longer expected duration likely translates into higher expected usage levels. Clinical information about a patient undergoing an ultrasound exam is therefore utilized by the neural network 104 to refine the expected usage 106 of the scanning hand during a given ultrasound exam.
[033] The actual usage 108 of an ultrasound user's scanning hand can be determined by mining and extracting the translational and rotational coordinates of the ultrasound transducer obtained from the electromagnetic tracking system during an ultrasound exam.
[034] This data can also be analyzed by one or more processors of the system 100 in view of the relevant clinical context 102, such that actual usage levels are properly considered together with the corresponding clinical context. A high activity level measured for an “easy” patient, for example, may indicate a likely inefficiency in the ultrasound user’s scanning hand usage, which may be confirmed by the system depending on the output 106 of the neural network 104. Efficiency in this context may be determined by the amount of non-essential linear and/or rotational motion of the scanning hand, such that a greater amount of non-essential motion is less efficient than a lesser amount of non-essential motion. Non-essential motion may increase if the amount of transducer twisting required to obtain a certain image, for example, is greater than that utilized by an experienced ultrasound user to obtain the same image.
[035] The actual usage 108 obtained from the electromagnetic tracking device can then be compared to the expected usage 106 predicted by the neural network 104 to determine the adherence (or deviation) of the ultrasound user's current performance relative to the performance expected for the same exam performed on a patient having similar attributes. The result of this comparison can be generated and subsequently displayed on a graphical user interface 110 in the form of a customizable health status report 112, as further described below in connection with FIG. 4. The total usage of the scanning hand determined by the disclosed systems from the movement detected by the electromagnetic tracking device can then be compared to the expected usage determined by the aforementioned intelligence system to identify performance deviations, which may reveal user inefficiencies. In this manner, the electromagnetic tracker can provide real-time data regarding excessive scanning hand usage, for example indicating that an ultrasound user is struggling to find the right scanning angle necessary to obtain a specific image.
[036] FIG. 2 shows two examples of electromagnetic tracking systems that may be utilized herein. The first system shown is an integrated tracking system 200, which includes an ultrasound probe or transducer 202 that contains an electromagnetic tracking device integrated within the device. A processor 204 physically and/or communicatively coupled to the transducer 202 is configured to detect the position and orientation of the electromagnetic tracking device within the transducer throughout an ultrasound exam.
[037] The second system shown is an attached tracking system 206, which features an external electromagnetic tracking device 208 attached to the ultrasound transducer 210. An external field generator 212 is also shown, as is a processor 214 physically and/or communicatively coupled to the external electromagnetic tracking device 208 and configured to detect the position and orientation of the tracking device 208, and thus of the transducer 210. Both ultrasound transducers 202, 210 are configured to acquire images of a target region within a patient during an ultrasound exam. The ultrasound user performing an ultrasound exam can move and manipulate the transducer with his/her scanning hand, such that the scanning hand is moving longitudinally and/or laterally over the surface of the patient.
[038] FIG. 3 is a depiction of a neural network that may be trained and implemented to generate an expected usage of an ultrasound user's scanning hand based on a particular clinical context. As shown, the neural network 300 may include an input layer 302 configured to receive a variety of discrete clinical context datasets. The number of nodes or neurons in the input layer 302 may vary, and while only one neuron is depicted for illustrative purposes, non-limiting embodiments may include a number of neurons equal to the number of variables included in the clinical context training set(s), or the number of training set variables plus one. The neural network 300 can be trained to receive a clinical context at the input layer 302 and generate an expected usage output based on the ground truth usage of experienced ultrasound users. Embodiments of the neural network may be configured to implement an algorithmic regressive prediction model.
[039] The output layer 304 of the neural network 300 can provide an expected usage, which can be parsed into individual activities, e.g., translational movement and rotational movement, represented in FIG. 3 as expected usage output neurons 304a and 304b, respectively. The ground truth for the expected values of rotational and/or translational movement can be retrieved from usages previously employed by experienced ultrasound users for each given upstream clinical context.
[040] Like the input layer 302, the number of neurons in the output layer 304 may vary. For example, the output layer 304 may include one total neuron or one neuron for each movement constituting the total usage. In some embodiments, a custom score may be generated for each of the outputs. A range may be assigned to each of the outputs, which may be based on uncertainty levels, so that if an ultrasound user's actual usage falls within the defined usage range, that usage is deemed acceptable, normal, or efficient. In addition or alternatively, the risk of scanning hand overuse can be calculated based on averaging (or weighted averaging) the ratio of the ultrasound user's output values to the expected values of the experienced ultrasound user for each output item. This ratio can be determined in some examples according to Equation 1.1:
[041] Equation 1.1:

risk ratio = (W1 · (A1 / E1) + W2 · (A2 / E2)) / (W1 + W2)

where A1 and A2 are the ultrasound user's actual translational and rotational usage values, E1 and E2 are the corresponding expected values for an experienced ultrasound user, and W1 and W2 are the weight factors.
[042] W1 and W2 represent the weight factors based on the assigned importance of each output item. For example, it is expected that the rotational scanning-hand motion for some tasks, such as cardiac imaging in difficult patients, is more complex than the translational hand motion. Therefore, a higher weight factor can be assigned to W2 compared to W1. Ideally, the risk ratio should be close to 1. If the ratio is much higher than 1, it can be flagged with additional remarks (e.g., overuse due to excessive scanning hand rotational motion), which can be displayed on a graphical user interface, as described in connection with FIG. 4.
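A minimal sketch of this weighted risk ratio might look as follows, assuming the two output items are total translational and total rotational usage and that the weights are chosen by the implementer; nothing in the disclosure fixes these particular values.

```python
def overuse_risk_ratio(actual, expected, weights=(1.0, 2.0)):
    """Weighted average of actual-to-expected usage ratios (cf. Equation 1.1).

    actual, expected: per-item usages, here (translational, rotational)
    weights:          importance weights (W1, W2); rotational motion is
                      weighted higher in this example, as it might be for
                      cardiac exams on difficult patients.
    A ratio close to 1 indicates usage in line with expectation; a ratio
    well above 1 can be flagged (e.g., overuse due to excessive rotation).
    """
    weighted_sum = sum(w * a / e for w, a, e in zip(weights, actual, expected))
    return weighted_sum / sum(weights)

# Example: 20% more translation and 60% more rotation than expected.
risk = overuse_risk_ratio(actual=(1.2, 1.6), expected=(1.0, 1.0))
# risk == (1.0*1.2 + 2.0*1.6) / 3.0, roughly 1.47 -> flag for review
```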
[043] Operating between the input layer 302 and the output layer 304 can be one or more hidden layers 306 configured to assign and optimize weights associated with the clinical context inputs, for example via backpropagation, and apply the weighted inputs to an activation function, e.g., the rectified linear activation function. The number of hidden layers and the number of neurons present therein may also vary. In some embodiments, the number of hidden neurons in a given hidden layer may equal the product of the total number of input neurons and output neurons, together multiplied by the number of datasets used to train the network. The particular neural network(s) implemented in accordance with the disclosed embodiments may vary. The number of neurons may change, for example, along with their arrangement within the network. The size, width, depth, capacity, and/or architecture of the network may vary.
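For illustration only, a compact regression network reflecting the layer structure just described could be sketched as follows; the use of PyTorch and the layer sizes are assumptions, not requirements of the disclosure.

```python
import torch
import torch.nn as nn

class ExpectedUsageNet(nn.Module):
    """Maps an encoded clinical context to expected scanning hand usage.

    The two outputs correspond to neurons 304a and 304b: expected total
    translational movement and expected total rotational movement.
    """
    def __init__(self, n_features, n_hidden=32):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_features, n_hidden),   # input layer 302 -> hidden layer 306
            nn.ReLU(),                         # rectified linear activation
            nn.Linear(n_hidden, 2),            # output layer 304 (two usage items)
        )

    def forward(self, x):
        return self.layers(x)
```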
[044] The neural network 300 may be hardware- (e.g., neurons are represented by physical components) or software-based (e.g., neurons and pathways implemented in a software application), and can use a variety of topologies and learning algorithms for training the neural network to produce the desired output. For example, a software-based neural network may be implemented using a processor (e.g., single- or multi-core CPU, a single GPU or GPU cluster, or multiple processors arranged for parallel-processing) configured to execute instructions, which may be stored in a computer-readable medium, and which when executed cause the processor to perform a machine-trained algorithm for receiving clinical context input(s) and generating an expected usage level of the scanning hand of an ultrasound user performing the ultrasound exam included within the input clinical context. The neural network 300 may be implemented, at least in part, in a computer-readable medium comprising executable instructions, which when executed by a processor, may cause the processor to perform a machine-trained algorithm to output expected activity levels for the scanning hand of an ultrasound user performing a particular exam.
[045] In various examples, the neural network(s) may be trained using any of a variety of currently known or later developed machine learning techniques to obtain a neural network (e.g., a machine-trained algorithm or hardware-based system of nodes) configured to analyze input data in the form of patient- and exam-specific information collectively constituting a clinical context. As noted above, the ground truth used for training the network 300 can include documented activity levels of expert, experienced, or average ultrasound users exerted in the same or similar clinical context. Supervised learning models can be trained on a comprehensive data set of clinical contexts and associated usages. The accuracy of the neural network can thus grow stronger over time as more data is input. The model may be (partially) realized as an AI-based learning network. The computer-implemented techniques utilized to generate the expected usage may vary, and may involve artificial intelligence-based processing, e.g., sophisticated supervised machine learning.
[046] The neural network 300 can also be coupled to a training database 308. The training database 308 may provide a large sample of clinical context data sets and corresponding usages used to train the neural network. Communication between the training database 308 and the neural network 300 can be bidirectional, such that the training database 308 may provide usages obtained by experienced ultrasound users to the network for training purposes, and the neural network 300 can transmit new clinical context datasets for storage in the training database 308, thereby increasing the sample size of clinical contexts paired with usage outputs and further refining future output from the neural network.
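Training such a model against clinical contexts paired with the usages of experienced ultrasound users drawn from a training database could, under the same assumptions as the sketches above, follow a standard supervised regression loop; the batch size, learning rate, and epoch count below are arbitrary example values.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def train_expected_usage_model(model, contexts, usages, epochs=200, lr=1e-3):
    """contexts: (N, n_features) float tensor of encoded clinical contexts
    usages:   (N, 2) float tensor of ground-truth (translational, rotational)
              usage recorded for experienced ultrasound users."""
    loader = DataLoader(TensorDataset(contexts, usages), batch_size=32, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()

    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)   # regression against expert usage
            loss.backward()               # backpropagation of errors
            optimizer.step()
    return model
```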
[047] While one or more neural networks, e.g., network 300, may be utilized to generate expected usages of an ultrasound user's scanning hand, embodiments are not confined to neural networks, and a number of additional and alternative intelligence systems or components may be utilized, such as random forests or support vector machines.
[048] FIG. 4 shows a graphical user interface (GUI) 400, which may also be considered a "dashboard" or "panel," configured to display a customized health status report 402 corresponding to a particular ultrasound user, who can be selected from a list of ultrasound users included in a dropdown menu 404. In additional embodiments, the GUI 400 can receive free text and/or search entries corresponding to specific ultrasound users. The health status report 402 can be displayed via the GUI 400 to various personnel, including ultrasound users or clinical lab managers seeking periodic health status reports of scanning hand usage.
[049] The report 402 can include selectable performance output 406, which can include options such as “average exam” or “single exam.” As the displayed options indicate, an average exam performance output causes the report 402 to show an average performance output determined based on two or more exams, whereas the single exam performance output causes the report 402 to show the performance output of a single exam. The report 402 can also feature a selectable time period or date range 408 over which a given ultrasound user’s scanning hand usage is determined. The ultrasound user’s scanning hand performance may vary depending on the date range specified at the GUI 400. For example, a wide range, e.g., one year or longer, may reveal approximately average efficiency levels, whereas a date range spanning the first six months of that same one-year period may show low-efficiency levels, and a date range spanning the second six months of the one-year period may show high-efficiency levels. The date range can span less than one day and stretch as long as months or years. The report 402 can thus provide detailed, customizable information to benchmark certain performance metrics and enable continuous performance improvement.
[050] The report 402 can also include an exam number selection 410, which allows the user to view information regarding a specific exam performed on a specific day. In the illustrated example, the user selected exam number 3 performed on November 12, 2020. An ultrasound user can thus query the system on demand to provide a health status report 402 based solely on a specific ultrasound exam, which may be the most recent exam performed, or on multiple exams performed over a defined period also specified by an ultrasound user. For example, an ultrasound user can query the system to provide a scanning hand health status report based on all exams performed within the previous week, month, or year. In this manner, the ultrasound user can determine how a particular exam or collection of exams has impacted the current health status of his/her scanning hand. The ultrasound user selection (e.g., dropdown menu 404), selectable performance output 406, selectable date range 408, and/or exam number selection 410 can comprise all or part of one or more user inquiries or instructions input at the GUI 400.
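A query of this kind, forwarded from the GUI 400 to the underlying processors, might be served as in the following sketch; the record fields, function name, and efficiency keys are assumptions for illustration rather than elements of the disclosure.

```python
from datetime import date
from statistics import mean

def scanning_hand_report(records, user, start, end, exam_number=None):
    """records: iterable of dicts with assumed keys 'user', 'date',
    'exam_number', 'translational_eff', 'rotational_eff'.
    Returns single-exam or averaged efficiencies over the selected period."""
    selected = [r for r in records
                if r["user"] == user and start <= r["date"] <= end]
    if exam_number is not None:                       # "single exam" output
        selected = [r for r in selected if r["exam_number"] == exam_number]
    if not selected:
        return None
    return {
        "exams": len(selected),
        "translational_efficiency": mean(r["translational_eff"] for r in selected),
        "rotational_efficiency": mean(r["rotational_eff"] for r in selected),
    }

# e.g. scanning_hand_report(db, "sonographer_01",
#                           date(2020, 11, 1), date(2020, 11, 30), exam_number=3)
```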
[051] The details section 412 of the report 402 provides the clinical context corresponding to the selected exam. As shown, the clinical context of exam number 3 included a cardiac exam performed on a difficult outpatient using the Affiniti 70 ultrasound machine. A qualitative remarks section 414 provides a summary of the ultrasound user’s scanning hand performance. The level of detail provided in the remarks section 414 can vary. The example shown indicates “scanning hand overuse, particularly extra rotational movement.” To provide additional, more specific information, the scanning hand efficiency graphic 416 parses out the efficiency of the ultrasound user’s total translational usage and total rotational usage. Additional information included in the health status report 402 can include the health status of the ultrasound user, which may be based on one or more examinations, and the extent to which the ultrasound user’s performance metrics deviated from the expected metrics. Upon request, ultrasound users can query the health status of their scanning hand by clicking on a tag/button/other feature on the ultrasound imaging screen of the GUI 400 at the end of an ultrasound exam. The disclosed systems can store and utilize ultrasound user-specific performance metrics to generate individualized health reports on a daily, weekly, biweekly, monthly, quarterly, and/or annual basis. In some embodiments, the GUI 400 can be configured to generate and display confidence levels depending on the number of exams included in a specified time period. For instance, confidence levels may be relatively low if a small number of exams is encompassed within a specified date range, whereas confidence levels may be relatively high for a larger number of exams.
[052] Additionally, a live pop-up message or alert 418 can be automatically generated and displayed on the GUI when abnormal (e.g., very large) translational and/or rotational movement is detected during an exam, as compared to the expected values forecasted by the predictive system for the same upstream clinical context. Enabling live alerts upon recognizing these abnormalities provides real-time guidance for ultrasound users, which may minimize potential overuse and injury risk to the scanning hand. In some examples, systems herein can connect an ultrasound user virtually to an expert who can recommend the implementation of any necessary adjustments. The live alert 418 can be displayed on a health status report configured to monitor and display information regarding a current exam, instead of a previous exam. In some examples, the live alert 418 can also be displayed on the GUI 400, but not as a component of the health status report. In some examples, the live alert 418 can be displayed on a separate GUI or screen configured to display live ultrasound images obtained during the exam, and may be displayed in a manner that does not unnecessarily disrupt the current exam.
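The live alert 418 could be driven by a simple threshold check against the forecast values, as in this sketch; the 1.5x threshold and message format are arbitrary examples, not values prescribed by the disclosure.

```python
def live_usage_alerts(actual, expected, threshold=1.5):
    """Return alert messages for any usage item well above its forecast.

    actual, expected: dicts keyed by item, e.g.
    {"translational": mm_so_far, "rotational": deg_so_far}."""
    alerts = []
    for item, value in actual.items():
        if expected.get(item) and value > threshold * expected[item]:
            alerts.append(
                f"Abnormal {item} scanning hand movement detected "
                f"({value:.0f} vs expected {expected[item]:.0f})"
            )
    return alerts
```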
[053] The GUI 400 can be physically and/or communicatively coupled to one or more underlying processors 420 configured to generate and modify the displayed graphics in response to different user inputs received at the GUI 400, for example regarding the date range over which scanning hand efficiency is calculated and the qualitative remarks associated therewith. One or more of the processors 420 can be configured to operate an intelligence system, such as neural network 300. In addition or alternatively, one or more of the processors 420 can be configured to determine the position and orientation of an electromagnetic sensor, and may therefore form part of the electromagnetic tracking system configured to determine scanning hand usages. The one or more processors 420 can also be configured to compare actual scanning hand usages to expected scanning hand usages. Scanning hand data acquired over time can be stored in one or more data storage devices 422, which may be coupled with the graphical user interface 400 and/or the one or more processors 420. The GUI 400 can be configured to identify, extract, and/or receive the stored data required to generate a particular health status report 402 in accordance with user-specified parameters received at the GUI 400. For example, the GUI 400 and one or more processors 420 coupled therewith can mine the data storage device(s) 422 for one or more datasets regarding the scanning hand usage of a particular ultrasound user over a specific time period, along with any particular exams performed during that time period. The GUI 400 can thus be configured to initiate and/or perform a variety of data extractions and analyses, receive data corresponding to such extractions and analyses, and generate graphical displays in the form of a health status report 402 customized in view of the same.
[054] The GUI 400 can also be configured to generate and/or modify the graphics displayed on the health status report 402 in accordance with additional user inputs. For example, the scanning hand efficiency section 416 may be modified to remove the circular efficiency ratings and/or replace them with a linear efficiency display or numerical efficiency display. The GUI 400 can also be configured to display absolute data regarding scanning hand usage, for example in terms of usage hours and the total translational and rotational movement accrued during certain periods. In some examples, the GUI 400 can be configured to selectively obtain/receive and display data corresponding to a particular clinical context. For example, the GUI 400 can obtain/receive and display data corresponding to a particular patient type (e.g., “difficult”), a particular model of ultrasound machine, a particular time of day (e.g., AM or PM), and a particular type of exam (e.g., cardiac). Ultrasound users and lab managers may therefore use the GUI 400 to identify specific areas needing improvement, which may also be used to adjust ultrasound user scheduling.
[055] As noted above, individual hand movements in sonography are not necessarily harmful alone, but frequent repetition or prolonged duration of exposure, compounded with a pace that lacks sufficient time for recovery, can increase the risk of injury significantly. Ultrasound users who repeatedly perform the same type(s) of exams utilizing the same muscle groups are therefore more susceptible to injury. The graphical user interface 400 can convey risk-enhancing activities undertaken by a particular ultrasound user, for example in the form of over-usage and inefficiencies itemized by activity type, e.g., rotational motion, thereby enabling the ultrasound user to minimize the risk of injury.
[056] In additional embodiments, one or more of the systems disclosed herein may be configured to predict a future time at which the scanning hand of an ultrasound user is likely to develop an injury, such as carpal tunnel syndrome. The systems disclosed herein can be configured to recommend certain actions based on this information. For example, embodiments can be configured to recommend that an ultrasound user perform less of a certain exam and/or that an ultrasound user receive additional training to improve his/her efficiency with respect to a specific exam.
[057] FIG. 5 is a block diagram illustrating an example processor 500 according to principles of the present disclosure. One or more processors utilized to implement the disclosed embodiments may be configured the same as or similar to processor 500. Processor 500 may be used to implement one or more processes described herein. For example, processor 500 may be configured to implement an artificial intelligence system configured to generate predicted usage levels, such as neural network 300. Accordingly, the processor 500 can also be configured to receive a clinical context data set and determine an actual usage of the scanning hand of the ultrasound user expended during the ultrasound exam. The same processor, or a different processor configured similarly, can also compare the expected usage to the actual usage to generate a performance metric of the scanning hand of the ultrasound user. The processor 500 can also be configured as a graphics processor programmed to generate displays for the graphical user interfaces described herein. The processor 500 can also be configured to determine an actual usage of the scanning hand of an ultrasound user during the ultrasound exam by determining the total translational movement and total rotational movement of the ultrasound transducer. The processor 500, or a different processor configured similarly, can be physically and/or communicatively coupled with an electromagnetic tracking device according to embodiments disclosed herein.
[058] Processor 500 may be any suitable processor type including, but not limited to, a microprocessor, a microcontroller, a digital signal processor (DSP), a field programmable gate array (FPGA) where the FPGA has been programmed to form a processor, a graphics processing unit (GPU), an application specific integrated circuit (ASIC) where the ASIC has been designed to form a processor, or a combination thereof.
[059] The processor 500 may include one or more cores 502. The core 502 may include one or more arithmetic logic units (ALU) 504. In some examples, the core 502 may include a floating point logic unit (FPLU) 506 and/or a digital signal processing unit (DSPU) 508 in addition to or instead of the ALU 504.
[060] The processor 500 may include one or more registers 512 communicatively coupled to the core 502. The registers 512 may be implemented using dedicated logic gate circuits (e.g., flip-flops) and/or any memory technology. In some examples, the registers 512 may be implemented using static memory. The registers may provide data, instructions, and addresses to the core 502.
[061] In some examples, processor 500 may include one or more levels of cache memory 510 communicatively coupled to the core 502. The cache memory 510 may provide computer-readable instructions to the core 502 for execution. The cache memory 510 may provide data for processing by the core 502. In some examples, the computer-readable instructions may have been provided to the cache memory 510 by a local memory, for example, local memory attached to the external bus 516. The cache memory 510 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
[062] The processor 500 may include a controller 514, which may control input to one or more processors included herein, e.g., processors 204, 214. Controller 514 may control the data paths in the ALU 504, FPLU 506 and/or DSPU 508. Controller 514 may be implemented as one or more state machines, data paths and/or dedicated control logic. The gates of controller 514 may be implemented as standalone gates, FPGA, ASIC or any other suitable technology.
[063] The registers 512 and the cache memory 510 may communicate with controller 514 and core 502 via internal connections 520A, 520B, 520C and 520D. Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology.
[064] Inputs and outputs for the processor 500 may be provided via a bus 516, which may include one or more conductive lines. The bus 516 may be communicatively coupled to one or more components of processor 500, for example the controller 514, cache 510, and/or register 512. The bus 516 may be coupled to one or more components of the system.
[065] The bus 516 may be coupled to one or more external memories. The external memories may include Read Only Memory (ROM) 532. ROM 532 may be a masked ROM, Electronically Programmable Read Only Memory (EPROM) or any other suitable technology. The external memory may include Random Access Memory (RAM) 533. RAM 533 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology. The external memory may include Electrically Erasable Programmable Read Only Memory (EEPROM) 535. The external memory may include Flash memory 534. The external memory may include a magnetic storage device such as disc 536.
[066] FIG. 6 is a flow diagram of a method of evaluating and displaying ultrasound user performance. The example method 600 shows the steps that may be utilized, in any sequence, by the systems and/or apparatuses described herein. Although examples of the present system have been illustrated with particular reference to ultrasound imaging modalities, the present system can be extended to other medical imaging systems where one or more images are obtained in a systematic manner. For example, the method 600 may be performed with the aid of one or more imaging systems including but not limited to MRI or CT.
[067] At step 602, the method involves "receiving a user inquiry regarding a scanning hand performance of an ultrasound user over a time period." At step 604, the method 600 involves "obtaining a scanning hand performance metric indicative of the scanning hand performance of the ultrasound user over the time period, the scanning hand performance metric based on a comparison of an actual scanning hand usage to an expected scanning hand usage determined over the time period." At step 606, the method 600 involves "displaying the performance metric," which may comprise a scanning hand efficiency rating. The method 600 can further involve adjusting a depiction of the performance metric, for example in response to a user input, which may comprise an instruction or query. The expected scanning hand usage can be based on one or more clinical contexts, for example the clinical context 102 shown in FIG. 1. Non-limiting examples of clinical contexts can include a description of an ultrasound exam conducted over the selected time period, a model of ultrasound machine utilized over the selected time period, at least one attribute of at least one patient examined over the selected time period, and/or the medical history of at least one patient examined over the selected time period. As described herein, the actual scanning hand usage can be based on the rotational and/or translational movement of an ultrasound transducer caused by an ultrasound user while performing one or more ultrasound exams over the selected time period. Transducer movement can be determined at least in part by an electromagnetic tracking device included within or attached to the transducer, but embodiments are not limited to electromagnetic tracking and may include additional tracking systems that include an accelerometer and/or gyroscope, for example. The method can also involve generating and displaying qualitative remarks or messages describing the performance metric and/or recommended adjustments to be implemented by the selected ultrasound user over time. One or more of the steps included in the method 600 may be implemented automatically by a disclosed system, such that user input is not required. Display of the performance metric, for example, may be performed automatically by the GUI operating together with one or more processors. Suggested adjustments for a given ultrasound user can also be updated automatically in response to the selection of different time periods and/or exams performed therein.
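Tying the earlier sketches together, steps 602 through 606 could be composed roughly as follows; this reuses the hypothetical scanning_hand_report helper sketched above and is an assumption-laden outline, not the claimed method itself.

```python
def evaluate_and_display(records, inquiry, display):
    """Minimal composition of steps 602-606 under the earlier assumptions.

    inquiry: dict with assumed keys 'user', 'start', 'end', 'exam_number'
             (step 602: the received user inquiry)
    display: callable that renders the report, e.g. on a GUI (step 606)
    """
    report = scanning_hand_report(
        records, inquiry["user"], inquiry["start"],
        inquiry["end"], inquiry.get("exam_number"))   # step 604
    display(report)                                    # step 606
    return report
```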
[068] In various embodiments where components, systems and/or methods are implemented using a programmable device, such as a computer-based system or programmable logic, it should be appreciated that the above-described systems and methods can be implemented using any of various known or later developed programming languages, such as “C”, “C++”, “FORTRAN”, “Pascal”, “VHDL” and the like. Accordingly, various storage media, such as magnetic computer disks, optical disks, electronic memories and the like, can be prepared that can contain information that can direct a device, such as a computer, to implement the above-described systems and/or methods. Once an appropriate device has access to the information and programs contained on the storage media, the storage media can provide the information and programs to the device, thus enabling the device to perform functions of the systems and/or methods described herein. For example, if a computer disk containing appropriate materials, such as a source file, an object file, an executable file or the like, were provided to a computer, the computer could receive the information, appropriately configure itself and perform the functions of the various systems and methods outlined in the diagrams and flowcharts above to implement the various functions. That is, the computer could receive various portions of information from the disk relating to different elements of the above-described systems and/or methods, implement the individual systems and/or methods and coordinate the functions of the individual systems and/or methods described above.
[069] In view of this disclosure, it is noted that the various methods and devices described herein can be implemented in hardware, software and firmware. Further, the various methods and parameters are included by way of example only and not in any limiting sense. In view of this disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and needed equipment to effect these techniques, while remaining within the scope of the invention. The functionality of one or more of the processors described herein may be incorporated into a fewer number or a single processing unit (e.g., a CPU) and may be implemented using application specific integrated circuits (ASICs) or general-purpose processing circuits which are programmed responsive to executable instructions to perform the functions described herein.
[070] Accordingly, the present system may be used to obtain and/or project image information related to, but not limited to renal, testicular, breast, ovarian, uterine, thyroid, hepatic, lung, musculoskeletal, splenic, and cardiac applications. Further, the present system may also include one or more programs which may be used with conventional imaging systems so that they may provide features and advantages of the present system. Certain additional advantages and features of this disclosure may be apparent to those skilled in the art upon studying the disclosure, or may be experienced by persons employing the novel system and method of the present disclosure. Another advantage of the present systems and method may be that conventional medical image systems can be easily upgraded to incorporate the features and advantages of the present systems, devices, and methods.
[071] Of course, it is to be appreciated that any one of the examples, embodiments or processes described herein may be combined with one or more other examples, embodiments and/or processes or be separated and/or performed amongst separate devices or device portions in accordance with the present systems, devices and methods.
[072] Finally, the above-discussion is intended to be merely illustrative of the present system and should not be construed as limiting the appended claims to any particular embodiment or group of embodiments. Thus, while the present system has been described in particular detail with reference to exemplary embodiments, it should also be appreciated that numerous modifications and alternative embodiments may be devised by those having ordinary skill in the art without departing from the broader and intended spirit and scope of the present system as set forth in the claims that follow. Accordingly, the specification and drawings are to be regarded in an illustrative manner and are not intended to limit the scope of the appended claims.

Claims

CLAIMS

What is claimed is:
1. An ultrasound user performance evaluation system comprising:
one or more processors (204, 214, 420, 500) configured to:
determine a scanning hand performance metric (416) indicative of a scanning hand usage of an ultrasound user over a time period (408) in response to a user inquiry (404, 406, 410), the scanning hand performance metric based on a comparison of an input of an actual scanning hand usage (108) to an input of an expected scanning hand usage (106) determined over the time period; and
a graphical user interface (400) configured to:
receive the user inquiry;
obtain the scanning hand performance metric; and
display the scanning hand performance metric.
2. The ultrasound user performance evaluation system of claim 1, wherein the one or more processors are further configured to receive an input of one or more clinical contexts, each clinical context comprising: a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof.
3. The ultrasound user performance evaluation system of claim 2, wherein the one or more processors are further configured to apply a neural network to each of the clinical contexts, the neural network configured to generate the expected scanning hand usage based on each of the clinical contexts.
4. The ultrasound user performance evaluation system of claim 3, wherein the one or more processors are further configured to determine the actual scanning hand usage expended during each ultrasound exam conducted over the time period.
5. The ultrasound user performance evaluation system of claim 4, wherein the one or more processors are configured to determine the actual scanning hand usage by tracking ultrasound transducer movement caused by the ultrasound user performing each ultrasound exam over the time period.
6. The ultrasound user performance evaluation system of claim 5, wherein ultrasound transducer movement comprises an input of translational movement and rotational movement as detected by an electromagnetic tracking device.
7. The ultrasound user performance evaluation system of claim 1, wherein the performance metric comprises an efficiency rating of the actual scanning hand usage.
8. The ultrasound user performance evaluation system of claim 6, wherein the performance metric comprises efficiency ratings corresponding to the translational movement and/or rotational movement.
9. The ultrasound user performance evaluation system of claim 2, wherein the graphical user interface is further configured to display the time period over which the performance metric is determined, the one or more clinical contexts, qualitative remarks describing the performance metric, or combinations thereof.
10. The ultrasound user performance evaluation system of claim 3, wherein the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs, wherein the training inputs comprise a sample of clinical context datasets obtained from previously performed ultrasound exams, and the known outputs comprise performance metrics of the scanning hand of ultrasound users who performed the previous ultrasound exams.
11. The ultrasound user performance evaluation system of claim 1, wherein the graphical user interface is further configured to adjust a depiction of the performance metric in response to a user input.
12. A method of evaluating and displaying ultrasound user performance, the method comprising:
receiving (602) a user inquiry (404, 406, 410) regarding a scanning hand performance of an ultrasound user over a time period (408);
obtaining (604) a scanning hand performance metric (416) indicative of the scanning hand performance of the ultrasound user over the time period, the scanning hand performance metric based on a comparison of an input of an actual scanning hand usage (108) to an input of an expected scanning hand usage (106) determined over the time period; and
displaying (606) the performance metric.
13. The method of claim 12, further comprising adjusting a depiction of the performance metric in response to a user input.
14. The method of claim 12, wherein the expected scanning hand usage is based on an input of one or more clinical contexts, each clinical context comprising: a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof.
15. The method of claim 12, wherein the actual scanning hand usage is based on rotational and translational ultrasound transducer movement caused by the ultrasound user performing one or more ultrasound exams over the time period.
16. The method of claim 15, wherein the rotational and translational ultrasound transducer movement is determined at least in part by an electromagnetic tracking device.
17. The method of claim 12, wherein the performance metric comprises an efficiency rating of the actual scanning hand usage.
18. The method of claim 15, wherein the performance metric comprises efficiency ratings corresponding to the translational movement and/or rotational movement.
19. The method of claim 12, further comprising generating and displaying qualitative remarks describing the performance metric.
20. A non-transitory computer-readable medium comprising executable instructions, which when executed cause a processor of an ultrasound user performance evaluation system to perform any of the methods of claims 12-19.
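Illustrative sketch (not part of the claims): the comparison of actual to expected scanning hand usage recited in claims 12 and 15-18 could, for example, be computed from electromagnetic-tracker samples as shown below. This is a minimal Python sketch; the function names, the use of cumulative path length and summed per-axis angle increments as usage measures, and the ratio-based efficiency formula are assumptions of this illustration, not features prescribed by the disclosure.

```python
import numpy as np

def cumulative_translation(positions):
    """Total translational path length of the transducer, from EM-tracker
    position samples of shape (N, 3), e.g. in millimetres."""
    diffs = np.diff(np.asarray(positions, dtype=float), axis=0)
    return float(np.linalg.norm(diffs, axis=1).sum())

def cumulative_rotation(angles):
    """Total rotation swept by the transducer, simplified here as the sum of
    absolute per-axis angle increments from samples of shape (N, 3) in degrees."""
    diffs = np.diff(np.asarray(angles, dtype=float), axis=0)
    return float(np.abs(diffs).sum())

def efficiency_rating(actual, expected):
    """Ratio-style efficiency: 1.0 when actual usage does not exceed the
    expected usage, smaller when the scanning hand moved more than expected."""
    if actual <= 0:
        return 1.0  # no movement recorded; treated as maximally efficient in this sketch
    return min(1.0, expected / actual)

def scanning_hand_performance(positions, angles, expected_translation, expected_rotation):
    """Per-component efficiency ratings for one exam, plus a simple combined score."""
    trans_eff = efficiency_rating(cumulative_translation(positions), expected_translation)
    rot_eff = efficiency_rating(cumulative_rotation(angles), expected_rotation)
    return {"translational": trans_eff, "rotational": rot_eff,
            "overall": 0.5 * (trans_eff + rot_eff)}
```

Aggregating such per-exam ratings over the inquiry time period would yield the displayed performance metric; a production implementation would more likely use quaternion-based rotation distances than summed Euler-angle increments.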
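Similarly, the training arrangement recited in claim 10 (clinical-context datasets from previously performed exams as training inputs, with the corresponding scanning hand performance metrics as known outputs) could be sketched as a small regression setup. The feature encoding, the scikit-learn MLPRegressor model, the two-component output, and all numbers below are hypothetical choices made only for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical per-exam clinical-context features:
# [exam_type_code, machine_model_code, patient_age, patient_bmi, history_flag]
X_train = np.array([
    [0, 1, 54, 27.3, 0],
    [2, 0, 61, 31.8, 1],
    [1, 1, 45, 22.5, 0],
])

# Known outputs for those exams, e.g. measured scanning hand usage expressed as
# [translational usage in mm, rotational usage in degrees] (illustrative units).
y_train = np.array([
    [820.0, 340.0],
    [1150.0, 510.0],
    [640.0, 280.0],
])

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# For a new exam's clinical context, the prediction can serve as the expected
# scanning hand usage against which the tracked actual usage is compared.
expected_usage = model.predict(np.array([[0, 1, 58, 29.1, 0]]))
```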
PCT/EP2022/065640 (published as WO2022263269A1, en), priority date 2021-06-16, filing date 2022-06-09: User scanning hand evaluation

Applications Claiming Priority (2)

US202163211063P: priority date 2021-06-16, filing date 2021-06-16
US63/211,063: priority date 2021-06-16

Publications (1)

WO2022263269A1 (en): publication date 2022-12-22

Family

ID: 82196590

Family Applications (1)

PCT/EP2022/065640 (WO2022263269A1, en): User scanning hand evaluation, priority date 2021-06-16, filing date 2022-06-09

Country Status (1)

WO: WO2022263269A1 (en)

Patent Citations (1)

US 2017/0071468 A1 (Carestream Health, Inc.): Motion tracking method for sonographer, priority date 2015-09-11, published 2017-03-16 (* cited by examiner)

Legal Events

121: EP: The EPO has been informed by WIPO that EP was designated in this application (ref document number: 22733363; country of ref document: EP; kind code of ref document: A1)
NENP: Non-entry into the national phase (ref country code: DE)