WO2022263298A1 - User non-scanning hand evaluation - Google Patents

User non-scanning hand evaluation

Info

Publication number
WO2022263298A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
user
scanning hand
usage
scanning
Prior art date
Application number
PCT/EP2022/065783
Other languages
French (fr)
Inventor
Seyedali SADEGHI
Anup Agarwal
Claudia ERRICO
Hua Xie
Original Assignee
Koninklijke Philips N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2022263298A1 publication Critical patent/WO2022263298A1/en

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/20 ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/09 Supervised learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/048 Activation functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent

Definitions

  • the present disclosure pertains to systems and methods for determining the usage of an ultrasound user’s non-scanning hand during each ultrasound exam.
  • Particular implementations include systems configured to determine the activity and associated movement of an ultrasound user’s non-scanning hand during a medical imaging examination and to assess the efficiency of such movement by implementing an intelligent computer-based prediction system.
  • Work-related musculoskeletal disorders (WRMSDs) are painful conditions caused by workplace activities affecting the muscles, ligaments, and tendons.
  • WRMSDs often impose a substantial personal toll on those affected since they may no longer be able to perform personal tasks and routine activities of daily living.
  • WRMSDs develop gradually over time from repeated exposure to a variety of risk factors and may be painful during work and even at rest.
  • the present disclosure describes systems and methods for monitoring and reducing physical stress in the non-scanning hand of an ultrasound user.
  • the ultrasound user’s scanning hand moves the ultrasound probe over the target area of the patient.
  • the ultrasound user’s non-scanning hand engages with a control panel of the ultrasound system.
  • the control panel includes a variety of buttons and knobs collectively programmed to, among other things, adjust the scanning parameters of the ultrasound probe, acquire ultrasound images, and annotate the acquired images with information relevant to the exam being performed.
  • the systems disclosed herein are configured to monitor, measure, and determine the usage of the ultrasound user’s non-scanning hand during an ultrasound exam and to determine whether the measured usage deviates from the usage expected for the same ultrasound exam performed on a similar patient.
  • an ultrasound user performance evaluation system may include or be communicatively coupled with an image acquisition device configured to acquire images of a patient during an ultrasound exam.
  • the image acquisition device can include an ultrasound probe communicatively coupled with a control panel, which includes one or more user engagement features, such as physical buttons, touchscreen buttons, rotatable knobs, etc.
  • the control panel can be configured to adjust imaging parameters implemented by the image acquisition device upon receiving inputs from the ultrasound user performing the ultrasound exam.
  • the system can also include one or more processors in communication with the control panel.
  • the one or more processors can be configured to receive a clinical context input by the ultrasound user, and/or the clinical context input may be received from another processor or database.
  • the one or more processors can also be configured to apply an intelligence system, which may include a neural network, to the clinical context.
  • the intelligence system can be configured to generate an expected usage of a non-scanning hand of the ultrasound user based on the received clinical context.
  • the one or more processors can also determine an actual usage of the non-scanning hand of the ultrasound user expended during the ultrasound exam, and compare the expected usage to the actual usage to generate a performance metric of the non-scanning hand of the ultrasound user. The performance metric can then be displayed on a graphical user interface communicatively coupled with the processor(s).
  • an ultrasound user performance evaluation system includes one or more processors configured to determine a non-scanning hand performance metric indicative of the non-scanning hand usage of an ultrasound user over a time period in response to a user inquiry.
  • the non-scanning hand performance metric can be based on a comparison of an input of actual non-scanning hand usage to an input of expected non-scanning hand usage determined over the time period.
  • the system can also include a graphical user interface configured to receive the user inquiry, obtain the non-scanning hand performance metric, and display the non-scanning hand performance metric.
  • the one or more processors are further configured to receive an input of one or more clinical contexts.
  • Each clinical context can include a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof.
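The clinical context described above might be represented as a simple record passed to the prediction system. The field names and example values below are purely illustrative; the patent does not prescribe a schema.

```python
from dataclasses import dataclass

@dataclass
class ClinicalContext:
    """Illustrative container for the clinical context inputs described above.

    All field names are hypothetical, not taken from the patent.
    """
    exam_type: str           # e.g. "pulmonary" or "cardiac"
    machine_model: str       # e.g. "Affiniti 70"
    patient_bmi: float
    patient_age: int
    difficulty: str          # categorical: "easy", "moderate", "difficult"
    medical_history: tuple = ()

ctx = ClinicalContext("cardiac", "Affiniti 50G", 27.4, 63, "moderate",
                      ("hypertension",))
```

A record like this could then be flattened into numeric features for the neural network input layer.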
  • the one or more processors are further configured to apply a neural network to each of the clinical contexts.
  • the neural network can be configured to generate the expected usage of the non-scanning hand based on the clinical contexts.
  • the one or more processors are further configured to determine the actual usage of the non-scanning hand expended during the ultrasound exams.
  • the one or more processors are configured to determine the actual usage of the non-scanning hand based on inputs from one or more service log files obtained and recorded by the one or more processors during the ultrasound exams.
  • the one or more processors are communicatively coupled with one or more control panels, each configured to adjust imaging parameters implemented by an ultrasound image acquisition device in response to input received from the ultrasound user during an ultrasound exam.
  • the service log files include data indicative of the total number, chronology, and type of ultrasound user actions received at the control panel.
  • the one or more processors are configured to determine the actual usage of the non-scanning hand by determining a total linear motion and a total angular motion of the non-scanning hand based on a physical layout of the control panel and the total number, chronology, and type of ultrasound user actions received at the control panel.
  • the performance metric comprises an efficiency rating of the actual usage of the non-scanning hand.
  • the performance metric comprises efficiency ratings corresponding to the total linear motion, the total angular motion, the total number of ultrasound user inputs, or combinations thereof.
  • the ultrasound user actions comprise total button pushes and total knob usage.
  • the graphical user interface is further configured to generate and display qualitative remarks describing the performance metric.
  • the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs.
  • the training inputs can comprise a sample of clinical context datasets obtained from previously performed ultrasound exams, and the known outputs can comprise performance metrics of the non-scanning hands of the ultrasound users who performed those exams.
  • a method of evaluating and displaying ultrasound user performance involves receiving a user inquiry regarding the non-scanning hand performance of an ultrasound user over a time period.
  • the method may also involve obtaining a non-scanning hand performance metric indicative of the non-scanning hand performance of the ultrasound user over the time period.
  • the non-scanning hand performance metric can be based on a comparison of an actual non-scanning hand usage to an expected non-scanning hand usage determined over the time period.
  • the method may further involve displaying the performance metric.
  • the method further involves adjusting a depiction of the performance metric in response to a user input.
  • the expected non-scanning hand usage is based on one or more clinical contexts.
  • Each clinical context can include a description of one or more ultrasound exams conducted over the time period, one or more models of ultrasound machine utilized over the time period, at least one attribute of the patient, a medical history of the patient, or combinations thereof.
  • the actual usage of the non-scanning hand is based on service log files obtained and recorded during ultrasound exams performed during the time period.
  • the service log files can include data indicative of the total number, chronology, and type of ultrasound user actions received at a control panel of an ultrasound image acquisition device.
  • the performance metric comprises an efficiency rating of the actual usage of the non-scanning hand.
  • the method further involves generating and displaying qualitative remarks describing the performance metric.
  • FIG. 1 is a schematic overview of a system configured to determine and display non-scanning hand usage and efficiency in accordance with embodiments of the present disclosure.
  • FIG. 2 is a schematic of a control panel of an image acquisition system utilized in accordance with embodiments of the present disclosure.
  • FIG. 3 is a table showing average linear and angular hand movements accrued during an ultrasound exam performed using different ultrasound machines in accordance with embodiments of the present disclosure.
  • FIG. 4 is a diagram of system components utilized to determine the actual usage of an ultrasound user’s non-scanning hand during an ultrasound exam in accordance with embodiments of the present disclosure.
  • FIG. 5 is a schematic of a neural network implemented to determine an expected usage of an ultrasound user’s non-scanning hand during an ultrasound exam in accordance with embodiments of the present disclosure.
  • FIG. 6 is a graphical user interface configured to generate and display customizable health status reports in accordance with embodiments of the present disclosure.
  • FIG. 7 is a schematic of a processor utilized in accordance with embodiments of the present disclosure.
  • FIG. 8 is a flow diagram of a method performed in accordance with embodiments of the present disclosure.
  • the disclosed systems and methods overcome the lack of intelligent, systematic tools for monitoring, determining, and ultimately improving the health of medical ultrasound users’ non-scanning hands.
  • embodiments of the disclosed systems determine non-scanning hand usage, which may comprise linear and angular hand movement as well as user engagement with the various buttons and knobs included on the control panel of most ultrasound machines.
  • the actual non-scanning hand usage can be compared to an expected non-scanning hand usage predicted by an artificial intelligence system trained to predict non-scanning hand usage levels based on clinical context.
  • the clinical context typically comprises the particular exam performed, the ultrasound machinery utilized, and various physical attributes of the current patient and his/her medical history.
  • embodiments can include a graphical user interface configured to display the acquired information in customizable reports.
  • the graphical user interface can display the data directly to the ultrasound user and/or a lab manager overseeing the ultrasound user’s performance. The user interface can thus be displayed locally or across different locations.
  • “usage” and “activity” may be used interchangeably and may include all motions and movements of the ultrasound user’s non-scanning hand.
  • the ultrasound user’s engagement with the controls on (or displayed on) the control panel, e.g., button pushes and knob rotations, may all be encompassed within this definition of “usage” or “activity,” such that a greater number and/or degree of button pushes, knob rotations, and the lateral/longitudinal movement of the non-scanning hand required to accomplish them all impact the measured “usage” or “activity” levels.
  • the usage or activity of an ultrasound user’s non-scanning hand may be measured and/or displayed in the form of one or more performance metrics, which may include a usage or efficiency rating of total knob usage, for example.
  • “clinical context” and “upstream clinical context” may be used interchangeably and may include or be based on the type of exam being performed, e.g., pulmonary exam or cardiac exam, as well as patient-specific information, non-limiting examples of which may include patient weight, height, body mass index (BMI), age, underlying health condition(s), clinical history, and combinations thereof. Additional information that may be encompassed within the clinical context can include the reason(s) for performing an exam, e.g., to evaluate left ventricular function, along with the patient’s length of stay, e.g., inpatient and/or outpatient time. Information constituting the clinical context can be combined and/or displayed on the user interface in various ways.
  • the clinical context can include categorical descriptors of a patient’s body type and/or associated levels of exam difficulty, such as “easy,” “moderate,” or “difficult.”
  • the clinical information can also include the particular ultrasound model being used to perform the exam, as the control panels and resulting hand motion often differ for different models, as shown for example in FIG. 3.
  • performance metrics can include total button pushes, total knob usage, total linear hand motion, and/or total angular hand motion, one or more of which may be compared to expected total button pushes, expected total knob usage, expected total linear hand motion, and/or expected total angular hand motion to generate efficiency scores or ratings.
  • the “health status” of an ultrasound user’s non-scanning hand can encompass the health of the various anatomical parts associated with, comprising, or attached to the non-scanning hand, non-limiting examples of which may include the ultrasound user’s shoulder, elbow, forearm, wrist, hand, finger(s), or combinations thereof.
  • the “health status” can also embody an overall indication of an ultrasound user’s non-scanning hand health based on the total stresses incurred by each body part directly or indirectly connected to the non-scanning hand.
  • “expert” or “experienced” ultrasound users may include certified ultrasound users having at least a certain length of experience, such as at least one year, two years, three years, four years, five years, or longer. “Expert” or “experienced” ultrasound users may also include ultrasound users who have received or attained a form of recognized achievement or certification.
  • ultrasound-based exams are contemplated herein, non-limiting examples of which include diagnostic imaging, cardiac imaging, vascular imaging, lung imaging, and combinations thereof.
  • the particular exam being performed likely impacts the non-scanning hand activity of the ultrasound user, especially if the exam requires the acquisition of a greater number of images at a variety of depths and/or angles.
  • FIG. 1 depicts an overview of a non-scanning hand monitoring and evaluation system 100 implemented in accordance with embodiments described herein.
  • an ultrasound user may be referred to and shown herein as a sonographer, but the term is intended to include any ultrasound user, regardless of certification body or title, unless otherwise indicated.
  • information constituting the clinical context 102 can include patient type, which in this particular example includes categories such as “easy,” “moderate,” and “difficult.”
  • the clinical context 102 can also be based on whether the ultrasound exam is being performed at an inpatient or outpatient facility.
  • the patient history, reason for exam, and model of the ultrasound system being used are further included in this example of a clinical context 102.
  • the data constituting the clinical context 102 can be provided as input received by an artificial intelligence system represented in the illustrated embodiment as a neural network 104.
  • the neural network 104 can be trained to predict the expected usage 106 of an ultrasound user’s non-scanning hand, for example in the form of an expected service log file output, based on the clinical context 102.
  • the expected usage 106 output may include an expected log file imaging workflow sequence, which can include a total number and chronology of button pushes, knob usage, linear hand movement, and/or angular hand movement. Changes to the clinical context may change the expected usage, even for the same exams performed with the same equipment.
  • the expected activity for the same pulmonary ultrasound exam performed using the same ultrasound machine can differ between two patients, especially if one patient is considered “difficult” and the other is considered “easy” based on their respective physical attributes, such as BMI.
  • Clinical information about a patient undergoing an ultrasound exam is therefore utilized by the neural network 104 to refine the expected usage 106 of the non-scanning hand during a given ultrasound exam.
  • the actual usage 108 of an ultrasound user’s non-scanning hand can be determined by identifying and extracting specific data embodied within the service log files obtained and logged during an ultrasound exam.
  • This data can also be analyzed by one or more processors of the system 100 in view of the relevant clinical context 102, such that actual activity levels are properly considered together with the corresponding clinical context.
  • a high activity level measured for an “easy” patient may indicate a likely inefficiency in the ultrasound user’s non-scanning hand usage, which may be confirmed by the system depending on the output 106 of the neural network 104.
  • Efficiency in this context may be determined by the amount of non-essential linear and/or rotational motion of the non-scanning hand, such that a greater amount of non-essential motion is less efficient than a lesser amount of non-essential motion.
  • Non-essential motion may increase if the number of button presses required to obtain a certain imaging plane, for example, is higher than that utilized by an experienced ultrasound user to arrive at the same imaging plane.
  • the actual usage 108 embodied within the obtained service log files can then be compared to the expected usage 106 predicted by the neural network 104 to determine the adherence (or deviation) of the ultrasound user’s current performance relative to the performance expected for the same exam performed on a patient having similar attributes.
  • the result of this comparison can be generated and subsequently displayed on a graphical user interface 110 in the form of a customizable health status report 112, as further described below in connection with FIG. 6.
  • FIG. 2 shows an example of a control panel 200 of an ultrasound machine.
  • the control panel 200 may be physically and/or communicatively coupled to an ultrasound probe configured to acquire images of a target region within a patient.
  • the control panel 200 can include a display 202, which can be user-interactive and/or configured to display a live or previously acquired ultrasound image.
  • the control panel 200 can also include a variety of buttons 204, at least one rotatable knob 206, and/or one or more slidable adjustment controls 208, all of which may be collectively referred to as “controls” herein.
  • the ultrasound user performing an ultrasound exam can engage any or all of the illustrated features of the control panel 200 with his/her non-scanning hand, such that the non-scanning hand is pressing buttons, sliding controls, rotating knobs, and/or otherwise interacting with the panel during an ultrasound exam.
  • the linear and angular movement required to perform such actions also impacts the total usage of the non-scanning hand. For example, pressing a button 210 on the far left side of the control panel 200 and then pressing a button 212 on the far right side of the control panel 200 requires greater linear movement of the non-scanning hand than pressing button 210 and then pressing button 214. Similarly, repeated pressing of button 210 requires less linear movement, and thus usage, than repeated pressing of button 210 interleaved with button 214.
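The dependence of linear hand travel on control placement can be illustrated with a toy panel layout. The coordinates below are invented for the example; real panel geometries would come from the stored layout data the disclosure describes.

```python
import math

# Hypothetical (x, y) positions of three controls, in millimetres.
layout = {
    "button_210": (0.0, 0.0),     # far left of the panel
    "button_212": (400.0, 0.0),   # far right of the panel
    "button_214": (30.0, 0.0),    # adjacent to button 210
}

def travel(sequence, layout):
    """Total straight-line hand travel for a chronological press sequence."""
    total = 0.0
    for a, b in zip(sequence, sequence[1:]):
        (x1, y1), (x2, y2) = layout[a], layout[b]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

far = travel(["button_210", "button_212"], layout)            # 400.0 mm
near = travel(["button_210", "button_214"], layout)           # 30.0 mm
interleaved = travel(["button_210", "button_214"] * 3, layout)  # 150.0 mm
```

As the text above notes, the far-left-to-far-right sequence costs far more hand travel than pressing adjacent buttons, and interleaving two buttons accumulates travel with every alternation.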
  • FIG. 3 is a table 300 highlighting the impact of using different ultrasound machine models on average linear and angular hand movements accrued during the same ultrasound exam performed on the same patient.
  • the Affiniti 50, Affiniti 50G, Affiniti 70, and Affiniti 70G models of ultrasound machine, all sold by Koninklijke Philips N.V., are represented in the table, which shows experimentally derived data acquired using each of the models.
  • the Affiniti 70 model required the greatest average hand distance (over 51 meters) to perform the ultrasound exam relative to the other models.
  • the Affiniti 50G model only required 14.73 meters of linear hand movement to perform the same exam.
  • Average angular movement of the non-scanning hand was also the highest for the Affiniti 70 (over 14,600 degrees).
  • the Affiniti 50G only required about 3,900 degrees of angular motion to perform the same exam. Accordingly, the particular model of ultrasound machine may be an important factor of the clinical context utilized by the systems disclosed herein to predict usage levels of an ultrasound user’s non-scanning hand.
  • FIG. 4 is a diagram of various system components 400 utilized for determining the actual usage of an ultrasound user’s non-scanning hand during a given ultrasound exam.
  • the service log files 402 obtained during an ultrasound exam can be mined to extract and process various forms of information regarding the ultrasound user’s engagement with various features of a control panel throughout the exam.
  • the ultrasound user’s control panel engagement can be captured in the form of total button pushes for each of a variety of buttons, along with the chronology of those pushes, neither of which is directly shown in the service log files; rather, both are obtained via one or more post-processing techniques implemented by one or more underlying processors.
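Mining button-push counts and chronology out of raw log lines might look like the following sketch. The log format here is entirely invented, since the patent does not publish the actual service log schema; only the counting-and-ordering idea is taken from the text.

```python
import re
from collections import Counter

# Hypothetical log excerpt; real ultrasound service log formats differ.
log = """\
2022-06-10T09:01:03 UI_EVENT control=gain_knob action=rotate delta=5
2022-06-10T09:01:07 UI_EVENT control=freeze_btn action=press
2022-06-10T09:01:09 UI_EVENT control=freeze_btn action=press
2022-06-10T09:01:15 UI_EVENT control=depth_btn action=press
"""

pattern = re.compile(r"(\S+) UI_EVENT control=(\S+) action=(\S+)")

events = []  # chronology preserved: (timestamp, control, action)
for line in log.splitlines():
    m = pattern.match(line)
    if m:
        events.append(m.groups())

# Per-button press totals, derived (not read directly) from the log.
presses = Counter(c for _, c, a in events if a == "press")
# presses -> Counter({'freeze_btn': 2, 'depth_btn': 1})
```

The ordered `events` list supplies the chronology that, combined with a panel layout, lets the hand motion between consecutive controls be reconstructed.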
  • the user engagement information obtained from the service log files 402 can be utilized to determine the linear distance and angular motion of the ultrasound user’s non-scanning hand during an ultrasound exam based on the layout of the particular control panel 404 used to perform the exam.
  • Total linear hand motion can be determined based on the distance between buttons engaged during an exam, as well as the relative arrangement of the buttons. This information can be combined with the information mined from the service log files, e.g., the number and sequence of button pushes/knob rotation, to determine the linear hand motion. In some embodiments, the total linear hand motion can be determined using Equation 1.1:
  • Equation 1.1: total linear hand motion = Σᵢ √((xᵢ₊₁ − xᵢ)² + (yᵢ₊₁ − yᵢ)²), summed over the consecutively engaged control positions (xᵢ, yᵢ).
  • Total planar angular hand motion can be determined based on the angular motion in the x-y plane during an exam. In some embodiments, the total planar angular hand motion can be determined (in degrees) using Equation 1.2:
  • Equation 1.2: total angular hand motion = Σᵢ |arctan(Δyᵢ / Δxᵢ)|, summed over successive hand movements between controls.
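Equations 1.1 and 1.2 can be sketched in code. The summation forms below are inferred from the surrounding description (Euclidean distance between consecutively engaged controls, and accumulated arctan-based heading change), since the published renderings of both equations are garbled; treat them as an illustration, not the patent's exact formulas.

```python
import math

def total_linear_motion(points):
    """Eq. 1.1 (as inferred): sum of straight-line hops between
    consecutive hand positions (x, y) on the control panel."""
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

def total_angular_motion(points):
    """Eq. 1.2 (as inferred): accumulated absolute change of the
    heading angle arctan(dy/dx) between consecutive hops, in degrees."""
    headings = [math.degrees(math.atan2(y2 - y1, x2 - x1))
                for (x1, y1), (x2, y2) in zip(points, points[1:])]
    return sum(abs(b - a) for a, b in zip(headings, headings[1:]))

# Two hops: 100 mm to the right, then 50 mm "up" the panel.
path = [(0, 0), (100, 0), (100, 50)]
linear = total_linear_motion(path)    # 150.0 mm
angular = total_angular_motion(path)  # 90.0 degrees (one right-angle turn)
```

Feeding in the full chronological sequence of control positions recovered from the service log files would yield the per-exam totals the disclosure compares against expected values.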
  • All workflow information, ordered chronologically, can thus be derived indirectly from the service log files and associated control panel layout, such that a user (e.g., a lab manager or ultrasound user) can identify which controls the ultrasound user pushed/rotated/slid during an exam, the number of times those controls were pushed/rotated/slid, and the overall hand motion required to perform such activities.
  • Transforming the data gleaned from the service log files into linear and angular hand motion requires detailed information regarding the layout of a variety of ultrasound control panels. Such detailed information includes the particular distance between each of the user engagement features included thereon, along with the angles between such features.
  • the detailed information may also include the sensitivity of any rotatable knobs, which may impact the degree of angular motion required to implement desired imaging adjustments. Highly sensitive knobs, for example, may require less angular twisting motion than less sensitive knobs to implement the same imaging adjustment.
  • an action implemented by one control panel may require the ultrasound user to push a series of buttons or to push one button and rotate one knob, while the same action implemented by another control panel may simply require the push of a single button.
  • the systems disclosed herein are configured to store and filter such information to ultimately obtain the information necessary to accurately determine the activity of an ultrasound user’s non-scanning hand.
  • the total usage of the non-scanning hand determined by the disclosed systems in view of the service log files 402 and stored layout of the control panel 404 is then compared to the expected usage determined by the aforementioned intelligence system to identify performance deviations 406, which may reveal user inefficiencies.
  • the service log files 402 can provide real-time data regarding excessive non-scanning hand usage, for example indicating that an ultrasound user is struggling to find the right acoustic imaging parameters, e.g., gain, focus, changing modes, etc.
  • the service log files 402 are mined and utilized by the disclosed systems in a manner not conceived or achievable using pre-existing ultrasound systems.
  • the service log files provide an enhanced set of attributes not available in the radiological information system (RIS) or picture archiving and communication system (PACS) communicatively coupled with most ultrasound systems. If queried and interpreted correctly, service log files can provide the entire narrative of an ultrasound user’s workflow and the imaging process implemented to perform a given ultrasound exam. Insightful information related to ultrasound scanning sessions can thus be retrieved from the service log files.
  • systems disclosed herein can identify this problem by pinpointing and selectively extracting certain information from the service log files and subsequently utilizing such information to identify performance inefficiencies based on the corresponding control panel utilized and the output generated by a neural network or other artificial intelligence models.
  • FIG. 5 is a depiction of a neural network that may be trained and implemented to generate an expected usage of an ultrasound user’s non-scanning hand based on a particular clinical context.
  • the neural network 500 may include an input layer 502 configured to receive a variety of discrete clinical context datasets.
  • the number of nodes or neurons in the input layer 502 may vary, and while only one neuron is depicted for illustrative purposes, embodiments may include a number of neurons equal to the number of variables included in the clinical context training set(s), or the number of training set variables plus one.
  • the neural network 500 can be trained to receive a clinical context at the input layer 502 and generate an expected usage output based on the ground truth usage of experienced ultrasound users.
  • Embodiments of the neural network may be configured to implement an algorithmic regressive prediction model.
  • the output layer 504 of the neural network 500 can provide an expected usage, which can be parsed into individual activities, e.g., total knob usage, button pushes, angular hand movement, etc., represented in FIG. 5 as expected usage output neurons 504a, 504b, 504c, respectively.
  • the model may be (partially) realized as an AI-based learning network.
  • the ground truth for the expected values of button presses, knob usage, total linear hand motion, and total angular hand motion is collected from experienced ultrasound users for each given upstream clinical context.
  • the accuracy of an AI-based learning network can improve over time as more data are added to it (as in a self-learning algorithm).
  • the computer-implemented techniques utilized to generate the expected usage may vary, and may involve artificial intelligence-based processing, e.g., sophisticated supervised machine learning. Supervised learning models can be trained on a comprehensive dataset of clinical contexts and associated usages.
  • the number of neurons in the output layer 504 may vary.
  • the output layer 504 may include one total neuron or one neuron for each activity constituting a total usage.
  • a custom score may be generated for each of the outputs.
  • a range may be assigned to each of the outputs, which may be based on uncertainty levels, so that if an ultrasound user’s actual usage falls within the defined usage range, that usage is deemed acceptable, normal, or efficient.
  • the risk of non-scanning hand overuse can be calculated by averaging (or weighted averaging) the ratios of the ultrasound user’s output values to the expected values of the experienced ultrasound user for each output item.
  • This ratio can be determined in some examples according to Equation 2.1:
  • Equation 2.1: ratio = w₁·(total button pushes / expected button pushes) + w₂·(total knob usage / expected knob usage) + w₃·(total linear hand distance / expected linear hand distance) + w₄·(total angular hand motion / expected angular hand motion)
  • The weights w₁–w₄ represent weight factors based on the importance of each output item. For example, the hand motion for some tasks, such as rotating a knob, is expected to be more complex than pressing a push-button; therefore, a higher weight factor can be assigned to w₂ than to the others. Ideally, the ratio should be close to 1. If the ratio is much higher than 1, it can be flagged with additional remarks (e.g., overuse due to excessive non-scanning hand rotational motion) for later display.
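The weighted overuse ratio of Equation 2.1 is straightforward to compute once the four actual and expected totals are known. The item names, values, and weights below are illustrative only; per the discussion above, knob usage is given the largest weight.

```python
def overuse_ratio(actual, expected, weights):
    """Weighted average of actual/expected per usage item (Equation 2.1).

    `actual` and `expected` map item name -> measured value; `weights`
    should sum to 1 so a perfectly typical exam scores about 1.0.
    """
    return sum(weights[k] * actual[k] / expected[k] for k in weights)

# Hypothetical exam totals versus the expected values for the same context.
actual   = {"button_pushes": 120, "knob_usage": 40,
            "linear_mm": 18_000, "angular_deg": 5_200}
expected = {"button_pushes": 100, "knob_usage": 40,
            "linear_mm": 15_000, "angular_deg": 4_000}
weights  = {"button_pushes": 0.2, "knob_usage": 0.4,   # knob weighted highest
            "linear_mm": 0.2, "angular_deg": 0.2}

r = overuse_ratio(actual, expected, weights)  # ~1.14 here
remark = "overuse flagged" if r > 1.0 else "within expected usage"
```

A ratio near 1 indicates usage in line with the experienced-user baseline; a markedly higher value would be flagged with remarks for display.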
  • between the input layer 502 and the output layer 504 can be one or more hidden layers 506 configured to assign and optimize weights associated with the clinical context inputs, for example via backpropagation, and apply the weighted inputs to an activation function, e.g., the rectified linear activation function.
  • the number of hidden layers and the number of neurons present therein may also vary. In some embodiments, the number of hidden neurons in a given hidden layer may equal the product of the total number of input neurons and output neurons, together multiplied by the number of datasets used to train the network.
  • the particular neural network(s) implemented in accordance with the disclosed embodiments may vary. The number of neurons may change, for example, along with their arrangement within the network. The size, width, depth, capacity, and/or architecture of the network may vary.
  • the neural network 500 may be hardware- (e.g., neurons are represented by physical components) or software-based (e.g., neurons and pathways implemented in a software application), and can use a variety of topologies and learning algorithms for training the neural network to produce the desired output.
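A software-based network of this kind can be illustrated with a bare-bones forward pass in plain Python. Everything here is a sketch under assumptions: the clinical-context encoding, the layer sizes, and the random (untrained) weights are illustrative only, with the rectified linear activation used as described above:

```python
import random

def relu(x):
    """Rectified linear activation applied at the hidden layer."""
    return max(0.0, x)

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """One forward pass: clinical-context features -> expected usage outputs."""
    h = [relu(sum(wi * xi for wi, xi in zip(row, x)) + b)
         for row, b in zip(w_hidden, b_hidden)]
    return [sum(wi * hi for wi, hi in zip(row, h)) + b
            for row, b in zip(w_out, b_out)]

# Hypothetical numeric encoding of one clinical context:
# [exam type, machine model, patient difficulty, outpatient flag]
x = [1.0, 0.5, 1.0, 0.0]

random.seed(0)
# Sizes are illustrative; the description suggests the hidden-neuron count
# may instead equal n_in * n_out * number of training datasets.
n_in, n_hidden, n_out = 4, 8, 4
w_hidden = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
b_hidden = [0.0] * n_hidden
w_out = [[random.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(n_out)]
b_out = [0.0] * n_out

# Four outputs: expected button presses, knob usage, linear and angular
# motion. The weights are untrained here, so the values are meaningless
# until optimized (e.g., via backpropagation against experienced-user
# ground truth from a training database).
y = forward(x, w_hidden, b_hidden, w_out, b_out)
print(len(y))  # 4
```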
  • a software-based neural network may be implemented using a processor (e.g., single- or multi-core CPU, a single GPU or GPU cluster, or multiple processors arranged for parallel-processing) configured to execute instructions, which may be stored in a computer-readable medium, and which when executed cause the processor to perform a machine-trained algorithm for receiving clinical context input(s) and generating an expected activity level of the non-scanning hand of an ultrasound user performing the ultrasound exam included within the input clinical context.
  • the neural network 500 may be implemented, at least in part, in a computer-readable medium comprising executable instructions, which when executed by a processor, may cause the processor to perform a machine-trained algorithm to output expected activity levels for the non-scanning hand of an ultrasound user performing a particular exam.
  • the neural network(s) may be trained using any of a variety of currently known or later developed machine learning techniques to obtain a neural network (e.g., a machine-trained algorithm or hardware-based system of nodes) that is configured to analyze input data in the form of patient- and exam-specific information collectively constituting a clinical context.
  • the ground truth used for training the network 500 can include documented activity levels of expert, experienced, or average ultrasound users exerted in the same or similar clinical context. The accuracy of the neural network can improve over time as more data is input.
  • the neural network 500 can also be coupled to a training database 508.
  • the training database 508 may provide a large sample of clinical context data sets and corresponding usages used to train the neural network.
  • Communication between the training database 508 and the neural network 500 can be bidirectional, such that the training database 508 may provide usages obtained by experienced ultrasound users to the network for training purposes, and the neural network 500 can transmit new clinical context datasets for storage in the training database 508, thereby increasing the sample size of clinical contexts paired with usage outputs and further refining future output from the neural network.
  • while neural networks, e.g., network 500, may be utilized to generate expected usages of an ultrasound user’s non-scanning hand, embodiments are not confined to neural networks, and a number of additional and alternative intelligence systems or components may be utilized, such as random forests or support vector machines.
  • FIG. 6 shows a graphical user interface (GUI) 600, which may also be considered a “dashboard” or “panel,” configured to display a health status report 602 corresponding to a particular ultrasound user, who can be selected from a list of ultrasound users included in a dropdown menu 604.
  • the GUI 600 can receive free text and/or search entries corresponding to specific ultrasound users.
  • the health status report 602 can be displayed via the GUI 600 to various personnel, including ultrasound users or clinical lab managers seeking periodic health status reports of non-scanning hand usage.
  • the report 602 can include selectable performance output 606, which can include options such as “average exam” or “single exam.” As the displayed options indicate, an average exam performance output causes the report 602 to show an average performance output determined based on two or more exams, whereas the single exam performance output causes the report 602 to show the performance output of a single exam.
  • the report 602 can also feature a selectable time period or date range 608 over which a given ultrasound user’s non-scanning hand usage is determined. The ultrasound user’s non-scanning hand performance may vary depending on the date range specified at the GUI 600.
  • a wide date range, e.g., one year or longer, may reveal approximately average efficiency levels, whereas a date range spanning the first six months of that same one-year period may show low-efficiency levels, and a date range spanning the second six months of the one-year period may show high-efficiency levels.
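The effect of the selected date range on the reported efficiency can be sketched as follows; the per-exam efficiency values and dates are hypothetical:

```python
from datetime import date

# Hypothetical per-exam non-scanning hand efficiency records for one user.
exams = [
    (date(2020, 2, 10), 0.55),
    (date(2020, 5, 3), 0.60),
    (date(2020, 8, 21), 0.85),
    (date(2020, 11, 12), 0.90),
]

def average_efficiency(exams, start, end):
    """Mean efficiency over exams falling within [start, end], or None."""
    vals = [eff for d, eff in exams if start <= d <= end]
    return sum(vals) / len(vals) if vals else None

# The same user can look average over the full year yet show low efficiency
# early on and high efficiency later, depending on the range selected.
full_year = average_efficiency(exams, date(2020, 1, 1), date(2020, 12, 31))
first_half = average_efficiency(exams, date(2020, 1, 1), date(2020, 6, 30))
second_half = average_efficiency(exams, date(2020, 7, 1), date(2020, 12, 31))
```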
  • the date range can span less than one day and stretch as long as months or years.
  • the report 602 can thus provide detailed, customizable information to benchmark certain performance metrics and enable continuous performance improvement.
  • the report 602 can also include an exam number selection 610, which allows the user to view information regarding a specific exam performed on a specific day. In the illustrated example, the user selected exam number 3 performed on November 12, 2020.
  • an ultrasound user can thus query the system on-demand to provide a health status report 602 based solely on a specific ultrasound exam, which may be the most recent exam performed, or on multiple exams performed over a defined period also specified by the ultrasound user. For example, an ultrasound user can query the system to provide a non-scanning hand health status report based on all exams performed within the previous week, month, or year. In this manner, the ultrasound user can determine how a particular exam or collection of exams has impacted the current health status of his/her non-scanning hand.
  • the details section 612 of the report 602 provides the clinical context corresponding to the selected exam.
  • the clinical context of exam number 3 included a cardiac exam performed on a difficult outpatient using the Affiniti 50 ultrasound machine.
  • a qualitative remarks section 614 provides a summary of the ultrasound user’s non-scanning hand performance.
  • the level of detail provided in the remarks section 614 can vary. The example shown indicates “non-scanning hand overuse, particularly extra linear and angular movement.”
  • the non-scanning hand efficiency graphic 616 parses out the efficiency of the ultrasound user’s linear motion, angular motion, total button usage, and total knob usage.
  • Additional information included in the health status report 602 can include the health status of the ultrasound user, which may be based on one or more examinations, and the extent to which the ultrasound user’s performance metrics deviated from the expected metrics.
  • ultrasound users can query the health status of their non-scanning hand by clicking on a tag/button/other feature on the ultrasound imaging screen of the GUI 600 at the end of an ultrasound exam.
  • the disclosed systems can store and utilize ultrasound user-specific performance metrics to generate individualized health reports on a daily, weekly, biweekly, monthly, quarterly, and/or annual basis.
  • the GUI 600 can be configured to generate and display confidence levels depending on the number of exams included in a specified time period. For instance, confidence levels may be relatively low if a small number of exams is encompassed within a specified date range, whereas confidence levels may be relatively high for a larger number of exams.
  • a live pop-up message or alert 618 can be automatically generated and displayed on the GUI when abnormal (e.g., very large) linear movement, angular motion, and/or button pushes are detected during an exam, as compared to the expected values forecast by the predictive system for the same upstream clinical context. Enabling live alerts upon recognizing these abnormalities provides real-time guidance for ultrasound users, which may minimize potential overuse and injury risk to the non-scanning hand. In some examples, systems herein can connect an ultrasound user virtually to an expert who can recommend the implementation of any necessary adjustments.
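A live-alert check of this kind might be sketched as below; the metric names, running totals, and the 1.5x tolerance threshold are assumptions for illustration, not values specified in this disclosure:

```python
def check_live_alert(actual, expected, tolerance=1.5):
    """Return alert messages for metrics exceeding expected values.

    `actual` holds running totals observed mid-exam; `expected` holds the
    values forecast for the same upstream clinical context. A metric
    triggers an alert when it exceeds `tolerance` times its forecast
    (the threshold here is an assumed example).
    """
    alerts = []
    for metric, forecast in expected.items():
        if actual.get(metric, 0.0) > tolerance * forecast:
            alerts.append(f"abnormal {metric.replace('_', ' ')} detected")
    return alerts

# Forecast values vs. mid-exam running totals (hypothetical numbers).
expected = {"linear_movement": 10.0, "angular_movement": 4.0, "button_pushes": 200}
actual = {"linear_movement": 9.0, "angular_movement": 7.0, "button_pushes": 180}

print(check_live_alert(actual, expected))
# ['abnormal angular movement detected']
```

In a deployed system the returned messages would drive the live alert 618 rather than a print statement, and would be displayed so as not to unnecessarily disrupt the current exam.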
  • the live alert 618 can be displayed on a health status report configured to monitor and display information regarding a current exam, instead of a previous exam.
  • the live alert 618 can be displayed on GUI 600, but not as a component of the health status report. In some examples, the live alert 618 can be displayed on a separate GUI or screen configured to display live ultrasound images obtained during the exam, and may be displayed in a manner that does not unnecessarily disrupt the current exam.
  • the GUI 600 can be physically and/or communicatively coupled to one or more underlying processors 620 configured to generate and modify the displayed graphics in response to different user inputs received at the GUI 600, for example regarding the date range over which non-scanning hand efficiency is calculated and the qualitative remarks associated therewith.
  • One or more of the processors 620 can be configured to operate an intelligence system, such as neural network 500.
  • one or more of the processors 620 can be configured to mine service log files after one or more ultrasound exams, and may be configured to determine non-scanning hand usages.
  • the one or more processors 620 can also be configured to compare actual non-scanning hand usages to expected non-scanning hand usages.
  • Non-scanning hand data acquired over time can be stored in one or more data storage devices 622, which may be coupled with the graphical user interface 600 and/or the one or more processors 620.
  • the GUI 600 can be configured to identify, extract, and/or receive the stored data required to generate a particular health status report 602 in accordance with user-specified parameters received at the GUI 600.
  • the GUI 600 and one or more processors 620 coupled therewith can mine the data storage device(s) 622 for one or more datasets regarding the non-scanning hand usage of a particular ultrasound user over a specific time period, along with any particular exams performed during that time period.
  • the GUI 600 can thus be configured to initiate and/or perform a variety of data extractions and analyses, receive data responsive to such extractions and analyses, and/or generate graphical displays in the form of a health status report 602 customized in view of the same.
  • the GUI 600 can also be configured to generate and/or modify the graphics displayed on the health status report 602 in accordance with additional user inputs.
  • the non-scanning hand efficiency section 616 may be modified to remove the circular efficiency ratings and/or replace them with a linear efficiency display or numerical efficiency display.
  • the GUI 600 can also be configured to display absolute data regarding non-scanning hand usage, for example in terms of usage hours and the total linear and angular movement accrued during certain periods.
  • the GUI 600 can be configured to selectively obtain/receive and display data corresponding to a particular clinical context.
  • the GUI 600 can obtain/receive and display data corresponding to a particular patient type (e.g., “difficult”), a particular model of ultrasound machine, a particular time of day (e.g., AM or PM), and a particular type of exam (e.g., cardiac). Ultrasound users and lab managers may therefore use the GUI 600 to identify specific areas needing improvement, which may also be used to adjust ultrasound user scheduling.
  • the graphical user interface 600 can convey risk-enhancing activities undertaken by a particular ultrasound user, for example in the form of over-usage and inefficiencies itemized by activity type, e.g., knob rotation, thereby enabling the ultrasound user to minimize the risk of injury.
  • one or more of the systems disclosed herein may be configured to predict a future time at which the non-scanning hand of an ultrasound user is likely to develop an injury, such as carpal tunnel syndrome.
  • the systems disclosed herein can be configured to recommend certain actions or adjustments based on this information. For example, embodiments can be configured to recommend that an ultrasound user perform less of a certain exam and/or that an ultrasound user receive additional training to improve his/her efficiency with respect to a specific exam.
  • FIG. 7 is a block diagram illustrating an example processor 700 according to principles of the present disclosure.
  • One or more processors utilized to implement the disclosed embodiments may be configured the same as or similar to processor 700.
  • Processor 700 may be used to implement one or more processes described herein.
  • processor 700 may be configured to implement an artificial intelligence system configured to generate predicted usage levels, such as neural network 500.
  • the processor 700 can also be configured to receive a clinical context data set and determine an actual usage of the non-scanning hand of the ultrasound user expended during the ultrasound exam.
  • the same processor, or a different processor configured similarly, can also compare the expected usage to the actual usage to generate a performance metric of the non-scanning hand of the ultrasound user.
  • the processor 700 can also be configured as a graphics processor programmed to generate displays for the graphical user interfaces described herein.
  • the processor 700 can also be configured to determine the actual usage of the non-scanning hand during an ultrasound exam by mining service log files, which may be obtained and recorded by the same processor or a different processor configured similarly. In some examples, the processor 700 can be configured to determine an actual usage of the non-scanning hand during an ultrasound exam by determining a total linear motion and a total angular motion of the non-scanning hand based on a physical layout of the control panel and the total number, chronology, and type of ultrasound user actions received at the control panel.
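One way to sketch this computation: given a control-panel layout and a chronological action log, linear motion can be approximated by summing the straight-line distances between consecutively used controls, and angular motion by summing knob rotations. The layout coordinates, the log format, and the straight-line-path simplification are all illustrative assumptions:

```python
import math

# Hypothetical control-panel layout: control name -> (x, y) position in cm.
PANEL_LAYOUT = {
    "freeze": (0.0, 0.0),
    "acquire": (6.0, 0.0),
    "depth_knob": (3.0, 8.0),
}

# Chronological service-log actions: (control, kind, rotation in degrees).
log = [
    ("freeze", "button", 0.0),
    ("depth_knob", "knob", 90.0),
    ("acquire", "button", 0.0),
    ("freeze", "button", 0.0),
]

def total_motion(log, layout):
    """Sum travel between consecutive controls (linear motion, cm) and knob
    rotation (angular motion, degrees). Straight-line hand paths are a
    simplifying assumption; the actual path is not recorded in the log."""
    linear = angular = 0.0
    prev = None
    for control, kind, amount in log:
        pos = layout[control]
        if prev is not None:
            linear += math.dist(prev, pos)
        if kind == "knob":
            angular += amount
        prev = pos
    return linear, angular

linear_cm, angular_deg = total_motion(log, PANEL_LAYOUT)
```

The resulting totals could then be compared against the expected values generated for the same clinical context, as described above.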
  • Processor 700 may be any suitable processor type including, but not limited to, a microprocessor, a microcontroller, a digital signal processor (DSP), a field programmable gate array (FPGA) where the FPGA has been programmed to form a processor, a graphical processing unit (GPU), an application specific integrated circuit (ASIC) where the ASIC has been designed to form a processor, or a combination thereof.
  • the processor 700 may include one or more cores 702.
  • the core 702 may include one or more arithmetic logic units (ALU) 704.
  • the core 702 may include a floating point logic unit (FPLU) 706 and/or a digital signal processing unit (DSPU) 708 in addition to or instead of the ALU 704.
  • the processor 700 may include one or more registers 712 communicatively coupled to the core 702.
  • the registers 712 may be implemented using dedicated logic gate circuits (e.g., flip- flops) and/or any memory technology. In some examples, the registers 712 may be implemented using static memory.
  • the registers 712 may provide data, instructions, and addresses to the core 702.
  • processor 700 may include one or more levels of cache memory 710 communicatively coupled to the core 702.
  • the cache memory 710 may provide computer-readable instructions to the core 702 for execution.
  • the cache memory 710 may provide data for processing by the core 702.
  • the computer-readable instructions may have been provided to the cache memory 710 by a local memory, for example, local memory attached to the external bus 716.
  • the cache memory 710 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
  • the processor 700 may include a controller 714, which may control input to one or more processors included in the systems disclosed herein. Controller 714 may control the data paths in the ALU 704, FPLU 706 and/or DSPU 708. Controller 714 may be implemented as one or more state machines, data paths and/or dedicated control logic. The gates of controller 714 may be implemented as standalone gates, FPGA, ASIC or any other suitable technology.
  • the registers 712 and the cache memory 710 may communicate with controller 714 and core 702 via internal connections 720A, 720B, 720C and 720D.
  • Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology.
  • Inputs and outputs for the processor 700 may be provided via a bus 716, which may include one or more conductive lines.
  • the bus 716 may be communicatively coupled to one or more components of processor 700, for example the controller 714, cache 710, and/or register 712.
  • the bus 716 may be coupled to one or more components of the system.
  • the bus 716 may be coupled to one or more external memories.
  • the external memories may include Read Only Memory (ROM) 732.
  • ROM 732 may be a masked ROM, Erasable Programmable Read Only Memory (EPROM) or any other suitable technology.
  • the external memory may include Random Access Memory (RAM) 733.
  • RAM 733 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology.
  • the external memory may include Electrically Erasable Programmable Read Only Memory (EEPROM) 735.
  • the external memory may include Flash memory 734.
  • the external memory may include a magnetic storage device such as disc 736.
  • FIG. 8 is a flow diagram of a method of evaluating and displaying ultrasound user performance.
  • the example method 800 shows the steps that may be utilized, in any sequence, by the systems and/or apparatuses described herein. Although examples of the present system have been illustrated with particular reference to ultrasound imaging modalities, the present system can be extended to other medical imaging systems where one or more images are obtained in a systematic manner. For example, the method 800 may be performed with the aid of one or more imaging systems including but not limited to MRI or CT.
  • the method 800 begins at block 802 by “receiving a user inquiry regarding a non-scanning hand performance of an ultrasound user over a time period.”
  • the method involves “obtaining a non-scanning hand performance metric indicative of the non-scanning hand performance of the ultrasound user over the time period, the non-scanning hand performance metric based on a comparison of an actual non-scanning hand usage to an expected non-scanning hand usage determined over the time period.”
  • the method involves “displaying the performance metric,” which may comprise a non-scanning hand efficiency rating.
  • the method 800 can also involve adjusting a depiction of the performance metric in response to a user input, which may relate to the time period over which the performance metric is determined, the ultrasound user being evaluated, and/or the exams included in the time period.
  • the expected non-scanning hand usage can be based on one or more clinical context inputs.
  • Each clinical context input can include a description of one or more ultrasound exams conducted over the time period, one or more models of ultrasound machine utilized over the time period to perform the exam(s), at least one attribute of the patient(s), and/or a medical history of the patient(s).
  • the actual usage of the non-scanning hand can be based on service log files obtained and recorded during the ultrasound exam(s) performed during the time period.
  • the service log files can include data indicative of the total number, chronology, and type of ultrasound user actions received at the control panel of an ultrasound image acquisition device.
  • Qualitative remarks or messages describing the performance metric and/or one or more recommended adjustments to be implemented by an ultrasound user can also be generated and displayed.
  • One or more of the steps included in the method 800 may be implemented automatically by a disclosed system, such that user input is not required. Display of the performance metric, for example, may be performed automatically by the GUI operating together with one or more processors. Suggested adjustments for a given ultrasound user can also be updated automatically in response to the selection of different time periods and/or exams performed therein.
  • the storage media can provide the information and programs to the device, thus enabling the device to perform functions of the systems and/or methods described herein.
  • the computer could receive the information, appropriately configure itself and perform the functions of the various systems and methods outlined in the diagrams and flowcharts above to implement the various functions. That is, the computer could receive various portions of information from the disk relating to different elements of the above-described systems and/or methods, implement the individual systems and/or methods and coordinate the functions of the individual systems and/or methods described above.
  • processors described herein can be implemented in hardware, software and firmware. Further, the various methods and parameters are included by way of example only and not in any limiting sense. In view of this disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and needed equipment to affect these techniques, while remaining within the scope of the invention.
  • the functionality of one or more of the processors described herein may be incorporated into a fewer number or a single processing unit (e.g., a CPU) and may be implemented using application specific integrated circuits (ASICs) or general purpose processing circuits which are programmed responsive to executable instruction to perform the functions described herein.
  • the present system may be used to obtain and/or project image information related to, but not limited to renal, testicular, breast, ovarian, uterine, thyroid, hepatic, lung, musculoskeletal, splenic, and cardiac applications. Further, the present system may also include one or more programs which may be used with conventional imaging systems so that they may provide features and advantages of the present system. Certain additional advantages and features of this disclosure may be apparent to those skilled in the art upon studying the disclosure, or may be experienced by persons employing the novel system and method of the present disclosure. Another advantage of the present systems and method may be that conventional medical image systems can be easily upgraded to incorporate the features and advantages of the present systems, devices, and methods.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Primary Health Care (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Public Health (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Epidemiology (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present disclosure describes systems configured to monitor and evaluate the performance of the non-scanning hand of ultrasound users. The systems can include a neural network trained to receive information regarding the clinical context of an ultrasound exam performed by an ultrasound user and, based on the clinical context, generate an expected usage of the ultrasound user's non-scanning hand. The systems can include a processor configured to determine an actual usage of the non-scanning hand of the ultrasound user expended during the ultrasound exam and compare the actual usage to the expected usage to generate a performance metric displayed on a user interface. The processor can determine the actual usage of the ultrasound user's non-scanning hand by mining service log files corresponding to the ultrasound exam. The linear and angular motion of the non-scanning hand can also be determined based on the physical layout of the control panel used to perform the ultrasound exam.

Description

USER NON-SCANNING HAND EVALUATION
TECHNICAL FIELD
[001] The present disclosure pertains to systems and methods for determining the usage of an ultrasound user’s non-scanning hand during ultrasound exams. Particular implementations include systems configured to determine the activity and associated movement of an ultrasound user’s non-scanning hand during a medical imaging examination and assess the efficiency of such movement by implementing an intelligent computer-based prediction system.
BACKGROUND
[002] Work-related musculoskeletal disorders (WRMSDs) are painful conditions caused by workplace activities affecting the muscles, ligaments, and tendons. Aside from interfering with workers’ ability to perform work-related tasks, WRMSDs often impose a substantial personal toll on those affected since they may no longer be able to perform personal tasks and routine activities of daily living. Unlike acute injuries, WRMSDs develop gradually over time from repeated exposure to a variety of risk factors and may be painful during work and even at rest.
[003] Ultrasound users develop WRMSDs at an especially high rate. Despite improvements in the flexibility of ultrasound systems and exam tables, it has been reported that about 90% of clinical ultrasound users experience symptoms of WRMSDs, 20% of whom suffer career-ending injuries. Studies have also shown that 60% of ultrasound users experience wrist/hand/finger discomfort in scanning and non-scanning hands caused by WRMSDs. Cross-sectional studies have revealed the high prevalence of WRMSD symptoms in the non-scanning hand of ultrasound users, particularly in the wrist (23.1%-40.8%) and the hand/fingers (26%-44.7%). Symptoms attributed to WRMSDs often begin after only about six months of work experience, after which the prevalence of such symptoms usually continues to increase. These findings highlight the extent and severity of non-scanning hand injuries commonly experienced by ultrasound users, which also limit patient access for those in need of ultrasound examination.
[004] Improved technologies are therefore needed to reduce the prevalence of WRMSDs caused by injuries to the non-scanning hand of ultrasound users.
SUMMARY
[005] The present disclosure describes systems and methods for monitoring and reducing physical stress in the non-scanning hand of an ultrasound user. During an ultrasound exam, the ultrasound user’s scanning hand moves the ultrasound probe over the target area of the patient. At the same time, the ultrasound user’s non-scanning hand engages with a control panel of the ultrasound system. The control panel includes a variety of buttons and knobs collectively programmed to, among other things, adjust the scanning parameters of the ultrasound probe, acquire ultrasound images, and annotate the acquired images with information relevant to the exam being performed. The systems disclosed herein are configured to monitor, measure, and determine the usage of the ultrasound user’s non-scanning hand during an ultrasound exam and determine whether the measured usage deviates from the usage expected for the same ultrasound exam performed on a similar patient.
[006] In accordance with some embodiments disclosed herein, an ultrasound user performance evaluation system may include or be communicatively coupled with an image acquisition device configured to acquire images of a patient during an ultrasound exam. The image acquisition device can include an ultrasound probe communicatively coupled with a control panel, which includes one or more user engagement features, such as physical buttons, touchscreen buttons, rotatable knobs, etc. The control panel can be configured to adjust imaging parameters implemented by the image acquisition device upon receiving inputs from the ultrasound user performing the ultrasound exam. The system can also include one or more processors in communication with the control panel. The one or more processors can be configured to receive a clinical context input by the ultrasound user, and/or the clinical context input may be received from another processor or database. The one or more processors can also be configured to apply an intelligence system, which may include a neural network, to the clinical context. The intelligence system can be configured to generate an expected usage of a non-scanning hand of the ultrasound user based on the received clinical context. The one or more processors can also determine an actual usage of the non-scanning hand of the ultrasound user expended during the ultrasound exam, and compare the expected usage to the actual usage to generate a performance metric of the non-scanning hand of the ultrasound user. The performance metric can then be displayed on a graphical user interface communicatively coupled with the processor(s).
[007] In accordance with some embodiments disclosed herein, an ultrasound user performance evaluation system includes one or more processors configured to determine a non-scanning hand performance metric indicative of the non-scanning hand usage of an ultrasound user over a time period in response to a user inquiry.
The non-scanning hand performance metric can be based on a comparison of an actual non-scanning hand usage input to an expected non-scanning hand usage input determined over the time period. The system can also include a graphical user interface configured to receive the user inquiry, obtain the non-scanning hand performance metric, and display the non-scanning hand performance metric.
[008] In some examples, the one or more processors are further configured to receive an input of one or more clinical contexts. Each clinical context can include a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof. In some examples, the one or more processors are further configured to apply a neural network to each of the clinical contexts. The neural network can be configured to generate the expected usage of the non-scanning hand based on the clinical contexts. In some examples, the one or more processors are further configured to determine the actual usage of the non-scanning hand expended during the ultrasound exams.
[009] In some examples, the one or more processors are configured to determine the actual usage of the non-scanning hand based on inputs from one or more service log files obtained and recorded by the one or more processors during the ultrasound exams. In some examples, the one or more processors are communicatively coupled with one or more control panels, each control panel configured to adjust imaging parameters implemented by an ultrasound image acquisition device in response to input received from the ultrasound user during an ultrasound exam. In some examples, the service log files include data indicative of a total number, chronology, and type of ultrasound user actions received at the control panel. In some examples, the one or more processors are configured to determine the actual usage of the non-scanning hand by determining a total linear motion and a total angular motion of the non-scanning hand based on a physical layout of the control panel and the total number, chronology, and type of ultrasound user actions received at the control panel.
[010] In some examples, the performance metric comprises an efficiency rating of the actual usage of the non-scanning hand. In some examples, the performance metric comprises efficiency ratings corresponding to the total linear motion, the total angular motion, the total number of ultrasound user inputs, or combinations thereof. In some examples, the ultrasound user actions comprise total button pushes and total knob usage. In some examples, the graphical user interface is further configured to generate and display qualitative remarks describing the performance metric. In some examples, the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs. The training inputs can comprise a sample of clinical context datasets obtained from previously performed ultrasound exams, and the known outputs can comprise performance metrics of the non-scanning hands of ultrasound users who performed the previous ultrasound exams.
[011] In accordance with some embodiments of the present disclosure, a method of evaluating and displaying ultrasound user performance involves receiving a user inquiry regarding a non-scanning hand performance of an ultrasound user over a time period. The method may also involve obtaining a non-scanning hand performance metric indicative of the non-scanning hand performance of the ultrasound user over the time period. The non-scanning hand performance metric can be based on a comparison of an actual non-scanning hand usage to an expected non-scanning hand usage determined over the time period. The method may further involve displaying the performance metric.
[012] In some examples, the method further involves adjusting a depiction of the performance metric in response to a user input. In some examples, the expected non-scanning hand usage is based on one or more clinical contexts. Each clinical context can include a description of one or more ultrasound exams conducted over the time period, one or more models of ultrasound machine utilized over the time period, at least one attribute of the patient, a medical history of the patient, or combinations thereof. In some examples, the actual usage of the non-scanning hand is based on service log files obtained and recorded during ultrasound exams performed during the time period. The service log files can include data indicative of a total number, chronology, and type of ultrasound user actions received at a control panel of an ultrasound image acquisition device. In some examples, the performance metric comprises an efficiency rating of the actual usage of the non-scanning hand. In some examples, the method further involves generating and displaying qualitative remarks describing the performance metric.

[013] Any of the methods described herein, or steps thereof, may be embodied in a non-transitory computer-readable medium comprising executable instructions, which when executed may cause one or more hardware processors to perform the method or steps embodied herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[014] FIG. 1 is a schematic overview of a system configured to determine and display non-scanning hand usage and efficiency in accordance with embodiments of the present disclosure.
[015] FIG. 2 is a schematic of a control panel of an image acquisition system utilized in accordance with embodiments of the present disclosure.
[016] FIG. 3 is a table showing average linear and angular hand movements accrued during an ultrasound exam performed using different ultrasound machines in accordance with embodiments of the present disclosure.
[017] FIG. 4 is a diagram of system components utilized to determine the actual usage of an ultrasound user’s non-scanning hand during an ultrasound exam in accordance with embodiments of the present disclosure.
[018] FIG. 5 is a schematic of a neural network implemented to determine an expected usage of an ultrasound user’s non-scanning hand during an ultrasound exam in accordance with embodiments of the present disclosure.
[019] FIG. 6 is a graphical user interface configured to generate and display customizable health status reports in accordance with embodiments of the present disclosure.
[020] FIG. 7 is a schematic of a processor utilized in accordance with embodiments of the present disclosure.
[021] FIG. 8 is a flow diagram of a method performed in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[022] The following description of certain embodiments is merely exemplary in nature and is in no way intended to limit the invention or its applications or uses. In the following detailed description of embodiments of the present systems and methods, reference is made to the accompanying drawings, which form a part hereof, and in which are shown, by way of illustration, specific embodiments in which the described systems and methods may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the presently disclosed systems and methods, and it is to be understood that other embodiments may be utilized and that structural and logical changes may be made without departing from the spirit and scope of the present system. Moreover, for the purpose of clarity, detailed descriptions of certain features will not be provided when they would be apparent to those with skill in the art, so as not to obscure the description of the present system. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present system is defined only by the appended claims.
[023] The disclosed systems and methods overcome the lack of intelligent, systematic tools for monitoring, determining, and ultimately improving the health of medical ultrasound users’ non-scanning hands. By pinpointing, extracting, and analyzing specific information acquired and stored in service log files for each ultrasound exam, embodiments of the disclosed systems determine non-scanning hand usage, which may comprise linear and angular hand movement as well as user engagement with various buttons and knobs included on the control panel of most ultrasound machines. The actual non-scanning hand usage can be compared to an expected non-scanning hand usage predicted by an artificial intelligence system trained to predict non-scanning hand usage levels based on clinical context. Among other factors, the clinical context typically comprises the particular exam performed, the ultrasound machinery utilized, and various physical attributes of the current patient and his/her medical history. By comparing the actual usage against the expected usage, the disclosed systems can identify deviations from the expected usage, thereby exposing potential inefficiencies related to non-scanning hand usage. Such inefficiencies may increase the likelihood of developing injuries to the non-scanning hand, especially if repeated over time. To convey the acquired information regarding non-scanning hand usage and efficiency in a meaningful way, embodiments also include a graphical user interface configured to display the acquired information in customizable reports. The graphical user interface can display the data directly to the ultrasound user and/or a lab manager overseeing the ultrasound user’s performance. The user interface can thus be displayed locally or across different locations.
[024] As used herein, the terms “usage” and “activity” may be used interchangeably and may include all motions and movements of the ultrasound user’s non-scanning hand. The ultrasound user’s engagement with the controls on (or displayed on) the control panel, e.g., button pushes and knob rotations, may all be encompassed within this definition of “usage” or “activity,” such that a greater number and/or degree of button pushes, knob rotations, and the lateral/longitudinal movement of the ultrasound user’s non-scanning hand required to accomplish the same all impact the measured “usage” or “activity” levels. The usage or activity of an ultrasound user’s non-scanning hand may be measured and/or displayed in the form of one or more performance metrics, which may include a usage or efficiency of total knob usage, for example.
[025] As used herein, “clinical context” and “upstream clinical context” may be used interchangeably and may include or be based on the type of exam being performed, e.g., pulmonary exam or cardiac exam, as well as patient-specific information, non-limiting examples of which may include patient weight, height, body mass index (BMI), age, underlying health condition(s), clinical history, and combinations thereof. Additional information that may be encompassed within the clinical context can include the reason(s) for performing an exam, e.g., to evaluate left ventricular function, along with the patient’s length of stay, e.g., inpatient and/or outpatient time. Information constituting the clinical context can be combined and/or displayed on the user interface in various ways. For example, the clinical context can include categorical descriptors of a patient’s body type and/or associated levels of exam difficulty, such as “easy,” “moderate,” or “difficult.” The clinical information can also include the particular ultrasound model being used to perform the exam, as the control panels and resulting hand motion often differ for different models, as shown for example in FIG. 3.
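Purely as an illustration, the clinical context described above might be gathered into a single record; the field names and categories below are hypothetical and are not part of this disclosure:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ClinicalContext:
    """Hypothetical container for the upstream clinical context of one exam."""
    exam_type: str            # e.g., "cardiac" or "pulmonary"
    machine_model: str        # e.g., "Affiniti 70"
    patient_difficulty: str   # "easy", "moderate", or "difficult"
    patient_bmi: float
    inpatient: bool           # inpatient vs. outpatient facility
    exam_reason: str = ""
    clinical_history: List[str] = field(default_factory=list)


# Example record for a single exam:
ctx = ClinicalContext(
    exam_type="cardiac",
    machine_model="Affiniti 70",
    patient_difficulty="difficult",
    patient_bmi=34.2,
    inpatient=True,
    exam_reason="evaluate left ventricular function",
)
```

A record of this shape could then be serialized, stored alongside the exam, or encoded as input to the prediction model described below.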
[026] As used herein, “performance metrics” can include total button pushes, total knob usage, total linear hand motion, and/or total angular hand motion, one or more of which may be compared to expected total button pushes, expected total knob usage, expected total linear hand motion, and/or expected total angular hand motion to generate efficiency scores or ratings.
[027] As used herein, the “health status” of an ultrasound user’s non-scanning hand can encompass the health of various anatomical parts associated with, comprising, or attached to the ultrasound user’s non-scanning hand, non-limiting examples of which may include the ultrasound user’s shoulder, elbow, forearm, wrist, hand, finger(s), or combinations thereof. The “health status” can also embody an overall indication of an ultrasound user’s non-scanning hand health based on the total stresses incurred by each body part directly or indirectly connected to the non-scanning hand.
[028] As used herein, “expert” or “experienced” ultrasound users may include certified ultrasound users having at least a certain length of experience, such as at least one year, two years, three years, four years, five years, or longer. “Expert” or “experienced” ultrasound users may also include ultrasound users who have received or attained a form of recognized achievement or certification.
[029] Various ultrasound-based exams are contemplated herein, non-limiting examples of which include diagnostic imaging, cardiac imaging, vascular imaging, lung imaging, and combinations thereof. The particular exam being performed likely impacts the non-scanning hand activity of the ultrasound user, especially if an exam requires the acquisition of a greater number of images at a variety of depths and/or angles.
[030] FIG. 1 depicts an overview of a non-scanning hand monitoring and evaluation system 100 implemented in accordance with embodiments described herein. As used in FIG. 1 and throughout this application, the term “ultrasound user” may include, and be shown as, “sonographer,” but is intended to include any ultrasound user regardless of certification body or title unless otherwise indicated. As shown, information constituting the clinical context 102 can include patient type, which in this particular example includes categories such as “easy,” “moderate,” and “difficult.” The clinical context 102 can also be based on whether the ultrasound exam is being performed at an inpatient or outpatient facility. The patient history, reason for exam, and model of the ultrasound system being used are further included in this example of a clinical context 102.
[031] As further shown, the data constituting the clinical context 102 can be provided as input received by an artificial intelligence system, represented in the illustrated embodiment as a neural network 104. The neural network 104 can be trained to predict the expected usage 106 of an ultrasound user’s non-scanning hand, for example in the form of an expected service log file output, based on the clinical context 102. The expected usage 106 output may include an expected log file imaging workflow sequence, which can include a total number and chronology of button pushes, knob usage, linear hand movement, and/or angular hand movement. Changes to the clinical context may change the expected usage, even for the same exams performed with the same equipment. For example, the expected activity for the same pulmonary ultrasound exam performed using the same ultrasound machine can differ between two patients, especially if one patient is considered “difficult” and the other is considered “easy” based on their respective physical attributes, such as BMI. Clinical information about a patient undergoing an ultrasound exam is therefore utilized by the neural network 104 to refine the expected usage 106 of the non-scanning hand during a given ultrasound exam.

[032] The actual usage 108 of an ultrasound user’s non-scanning hand can be determined by identifying and extracting specific data embodied within the service log files obtained and logged during an ultrasound exam. This data can also be analyzed by one or more processors of the system 100 in view of the relevant clinical context 102, such that actual activity levels are properly considered together with the corresponding clinical context. A high activity level measured for an “easy” patient, for example, may indicate a likely inefficiency in the ultrasound user’s non-scanning hand usage, which may be confirmed by the system depending on the output 106 of the neural network 104.
Efficiency in this context may be determined by the amount of non-essential linear and/or rotational motion of the non-scanning hand, such that a greater amount of non-essential motion is less efficient than a lesser amount of non-essential motion. Non-essential motion may increase if the number of button presses required to obtain a certain imaging plane, for example, is higher than the number utilized by an experienced ultrasound user to arrive at the same imaging plane.
[033] The actual usage 108 embodied within the obtained service log files can then be compared to the expected usage 106 predicted by the neural network 104 to determine the adherence (or deviation) of the ultrasound user’s current performance relative to the performance expected for the same exam performed on a patient having similar attributes. The result of this comparison can be generated and subsequently displayed on a graphical user interface 110 in the form of a customizable health status report 112, as further described below in connection with FIG. 6.
[034] FIG. 2 shows an example of a control panel 200 of an ultrasound machine. The control panel 200 may be physically and/or communicatively coupled to an ultrasound probe configured to acquire images of a target region within a patient. The control panel 200 can include a display 202, which can be user-interactive and/or configured to display a live or previously acquired ultrasound image. The control panel 200 can also include a variety of buttons 204, at least one rotatable knob 206, and/or one or more slidable adjustment controls 208, all of which may be collectively referred to as “controls” herein. The ultrasound user performing an ultrasound exam can engage any or all of the illustrated features of the control panel 200 with his/her non-scanning hand, such that the non-scanning hand is pressing buttons, sliding controls, rotating knobs, and/or otherwise interacting with the panel during an ultrasound exam. In addition to direct user engagement with each of these features, the linear and angular movement required to perform such actions also impacts the total usage of the non-scanning hand. For example, pressing a button 210 on the far left side of the control panel 200 and then pressing a button 212 on the far right side of the control panel 200 requires greater linear movement of the non-scanning hand than pressing button 210 and then pressing button 214. Similarly, repeated pressing of button 210 requires less linear movement, and thus usage, than repeated pressing of button 210 interleaved with button 214.
[035] FIG. 3 is a table 300 highlighting the impact of using different ultrasound machine models on average linear and angular hand movements accrued during the same ultrasound exam performed on the same patient. The Affiniti 50, Affiniti 50G, Affiniti 70, and Affiniti 70G models of ultrasound machine, all sold by Koninklijke Philips N.V., are represented in the table, which shows experimentally derived data acquired using each of the models. As shown, the Affiniti 70 model required the greatest average hand distance (over 51 meters) to perform the ultrasound exam relative to the other models. By comparison, the Affiniti 50G model only required 14.73 meters of linear hand movement to perform the same exam. Average angular movement of the non-scanning hand was also the highest for the Affiniti 70 (over 14,600 degrees). The Affiniti 50G only required about 3,900 degrees of angular motion to perform the same exam. Accordingly, the particular model of ultrasound machine may be an important factor of the clinical context utilized by the systems disclosed herein to predict usage levels of an ultrasound user’s non-scanning hand.
[036] FIG. 4 is a diagram of various system components 400 utilized for determining the actual usage of an ultrasound user’s non-scanning hand during a given ultrasound exam. As shown, the service log files 402 obtained during an ultrasound exam can be mined to extract and process various forms of information regarding the ultrasound user’s engagement with various features of a control panel throughout the exam. The ultrasound user’s control panel engagement can be captured in the form of total button pushes for each of a variety of buttons, along with the chronology of the button pushes, neither of which is directly shown in the service log files, but rather obtained via one or more post-processing techniques implemented by one or more underlying processors. The service log files 402 depicted in FIG. 4, for example, reveal the numeric engagement totals for a variety of control panel features, e.g., buttons, knobs, etc., along with the qualitative descriptions corresponding thereto, such as imaging depth changes and the annotation of acquired images. Total button pushes can also include the frequency of button pushes during an exam. Total knob usage includes the frequency count of knob engagements during an exam. Knob engagement contributing to the total knob count can include knob-driven adjustments to the focus, depth, etc. of the ultrasound probe.

[037] The user engagement information obtained from the service log files 402 can be utilized to determine the linear distance and angular motion of the ultrasound user’s non-scanning hand during an ultrasound exam based on the layout of the particular control panel 404 used to perform the exam. An example of the chronology, linear distance, and angular motion involved in engaging with various control features is represented by the sequential linear movements depicted on control panel 404, numbered 1-5.
Total linear hand motion can be determined based on the distance between buttons engaged during an exam, as well as the relative arrangement of the buttons. This information can be combined with the information mined from the service log files, e.g., the number and sequence of button pushes/knob rotation, to determine the linear hand motion. In some embodiments, the total linear hand motion can be determined using Equation 1.1:
[038] Equation 1.1:

$$\sum_{i=1}^{N-1} \sqrt{(x_{i+1}-x_i)^2 + (y_{i+1}-y_i)^2 + (z_{i+1}-z_i)^2}$$
[039] Total planar angular hand motion can be determined based on the angular motion in the x-y plane during an exam. In some embodiments, the total planar angular hand motion can be determined (in degrees) using Equation 1.2:
[040] Equation 1.2:

$$\sum_{i=1}^{N-1} \left| \arctan\!\left(\frac{y_{i+1}-y_i}{x_{i+1}-x_i}\right) \right|$$
[041] In both equations, $(x_i, y_i, z_i)$ and $N$ represent the buttons’ coordinates and the total number of button movements, respectively.
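As a minimal sketch of Equations 1.1 and 1.2, assuming the chronological button coordinates have already been recovered from the panel layout, the two totals might be computed as follows. `atan2` is used in place of `arctan` so that purely vertical moves are handled, and absolute values are accumulated; the coordinates are illustrative, not measured:

```python
import math


def total_linear_motion(coords):
    """Equation 1.1: sum of straight-line distances between consecutively
    engaged controls; coords is a chronological list of (x, y, z) button
    positions in meters."""
    return sum(math.dist(coords[i], coords[i + 1])
               for i in range(len(coords) - 1))


def total_angular_motion(coords):
    """Equation 1.2: sum of absolute in-plane direction angles between
    consecutive controls, in degrees, using only the x-y positions."""
    total = 0.0
    for i in range(len(coords) - 1):
        dx = coords[i + 1][0] - coords[i][0]
        dy = coords[i + 1][1] - coords[i][1]
        total += abs(math.degrees(math.atan2(dy, dx)))
    return total


# A hypothetical five-press sequence on a 40 cm x 20 cm panel:
presses = [(0.00, 0.00, 0.0), (0.30, 0.00, 0.0), (0.30, 0.15, 0.0),
           (0.05, 0.15, 0.0), (0.05, 0.02, 0.0)]
print(round(total_linear_motion(presses), 2))  # → 0.83
```

Summing per-segment distances in this way reproduces the table in FIG. 3 in kind: longer panel traversals and more interleaved presses directly increase the accumulated linear and angular totals.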
[042] All workflow information, ordered chronologically, can thus be derived indirectly from the service log files and associated control panel layout, such that a user (e.g., lab manager or ultrasound user) can identify which controls the ultrasound user has pushed/rotated/slid during an exam, the number of times those controls were pushed/rotated/slid, and the overall hand motion of the ultrasound user required to perform such activities. Transforming the data gleaned from the service log files into linear and angular hand motion requires detailed information regarding the layout of a variety of ultrasound control panels. Such detailed information includes the particular distance between each of the user engagement features included thereon, along with the angles between such features. The detailed information may also include the sensitivity of any rotatable knobs, which may impact the degree of angular motion required to implement desired imaging adjustments. Highly sensitive knobs, for example, may require less angular twisting motion than less sensitive knobs to implement the same imaging adjustment. Similarly, an action implemented by one control panel may require the ultrasound user to push a series of buttons or to push one button and rotate one knob, while the same action implemented by another control panel may simply require the pushing of a single button. The systems disclosed herein are configured to store and filter such information to ultimately obtain the information necessary to accurately determine the activity of an ultrasound user’s non-scanning hand.
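The transformation described above might be sketched as follows, with a hypothetical per-model layout table mapping logged control identifiers to panel coordinates; the control names, positions, and model key are illustrative only:

```python
import math

# Hypothetical per-model layouts: control ID → (x, y) position in meters.
PANEL_LAYOUTS = {
    "Affiniti 50G": {
        "gain":   (0.10, 0.05),
        "depth":  (0.25, 0.05),
        "freeze": (0.18, 0.12),
    },
}


def press_sequence_to_distance(model, control_ids):
    """Convert a chronological sequence of logged control IDs into total
    linear hand travel for the given panel layout."""
    layout = PANEL_LAYOUTS[model]
    pts = [layout[c] for c in control_ids]
    return sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))


# Example: a gain adjustment, then a depth change, then gain again.
d = press_sequence_to_distance("Affiniti 50G", ["gain", "depth", "gain"])
print(round(d, 2))  # → 0.3
```

Keeping one layout table per machine model mirrors the observation above that the same logged action sequence can imply very different hand motion on different panels.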
[043] The total usage of the non-scanning hand determined by the disclosed systems in view of the service log files 402 and stored layout of the control panel 404 is then compared to the expected usage determined by the aforementioned intelligence system to identify performance deviations 406, which may reveal user inefficiencies. In this manner, the service log files 402 can provide real-time data regarding excessive non-scanning hand usage, for example indicating that an ultrasound user is struggling to find the right acoustic imaging parameters, e.g., gain, focus, changing modes, etc.
[044] The service log files 402 are mined and utilized by the disclosed systems in a manner not conceived or achievable using pre-existing ultrasound systems. The service log files provide an enhanced set of attributes not available in the radiological information system (RIS) or picture archiving and communication system (PACS) communicatively coupled with most ultrasound systems. If queried and interpreted correctly, service log files can provide the entire narrative related to an ultrasound user’s workflow and imaging process implemented to perform a given ultrasound exam. Insightful information related to ultrasound scanning sessions can thus be retrieved from the service log files. For instance, if a user is struggling to find the necessary acoustic imaging parameters during an exam (e.g., the need to switch probe or tissue-specific pre-set, change in gain, etc.), systems disclosed herein can identify this problem by pinpointing and selectively extracting certain information from the service log files and subsequently utilizing such information to identify performance inefficiencies based on the corresponding control panel utilized and the output generated by a neural network or other artificial intelligence models.
[045] FIG. 5 is a depiction of a neural network that may be trained and implemented to generate an expected usage of an ultrasound user’s non-scanning hand based on a particular clinical context. As shown, the neural network 500 may include an input layer 502 configured to receive a variety of discrete clinical context datasets. The number of nodes or neurons in the input layer 502 may vary, and while only one neuron is depicted for illustrative purposes, embodiments may include a number of neurons equal to the number of variables included in the clinical context training set(s), or the number of training set variables plus one. The neural network 500 can be trained to receive a clinical context at the input layer 502 and generate an expected usage output based on the ground truth usage of experienced ultrasound users. Embodiments of the neural network may be configured to implement an algorithmic regressive prediction model.
[046] The output layer 504 of the neural network 500 can provide an expected usage, which can be parsed into individual activities, e.g., total knob usage, button pushes, angular hand movement, etc., represented in FIG. 5 as expected usage output neurons 504a, 504b, 504c, respectively. The model may be (partially) realized as an AI-based learning network. The ground truth for the expected values of button presses, knob usage, total linear hand motion, and total angular hand motion is collected from experienced ultrasound users for each given upstream clinical context. The accuracy of an AI-based learning network can improve over time as more data is added, such as via a self-learning algorithm. The computer-implemented techniques utilized to generate the expected usage may vary, and may involve artificial intelligence-based processing, e.g., sophisticated supervised machine learning. Supervised learning models can be trained on a comprehensive dataset of clinical contexts and associated usages.
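Purely as an illustration of how a clinical context could be presented to the input layer 502, the categorical and numeric fields might be encoded into a fixed-length feature vector; the field names, category lists, and normalization below are hypothetical:

```python
def encode_context(context):
    """Encode a clinical context dict into a numeric feature vector
    suitable for a regression model (field names hypothetical)."""
    difficulty = {"easy": 0, "moderate": 1, "difficult": 2}
    exam_types = ["cardiac", "pulmonary", "vascular"]

    features = [float(difficulty[context["patient_difficulty"]])]
    # One-hot encoding of the exam type:
    features += [1.0 if context["exam_type"] == t else 0.0 for t in exam_types]
    # Binary inpatient/outpatient flag:
    features.append(1.0 if context["inpatient"] else 0.0)
    # Crudely normalized BMI:
    features.append(context["bmi"] / 50.0)
    return features


vec = encode_context({"patient_difficulty": "difficult",
                      "exam_type": "pulmonary",
                      "inpatient": True,
                      "bmi": 30.0})
print(vec)  # → [2.0, 0.0, 1.0, 0.0, 1.0, 0.6]
```

A regression model trained on such vectors, with experienced users’ usage totals as targets, corresponds to the supervised scheme described above.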
[047] Like the input layer 502, the number of neurons in the output layer 504 may vary. For example, the output layer 504 may include one total neuron or one neuron for each activity constituting a total usage. In some embodiments, a custom score may be generated for each of the outputs. A range may be assigned to each of the outputs, which may be based on uncertainty levels, so that if an ultrasound user’s actual usage falls within the defined usage range, that usage is deemed acceptable, normal, or efficient. In addition or alternatively, the risk of non-scanning hand overuse can be calculated based on averaging (or weighted averaging) the ratio of the ultrasound user’s output values to the expected values of the experienced ultrasound user for each output item. This ratio can be determined in some examples according to Equation 2.1:

[048] Equation 2.1:

$$w_1 \cdot \frac{\text{total button pushes}}{\text{expected button pushes}} + w_2 \cdot \frac{\text{total knob usage}}{\text{expected knob usage}} + w_3 \cdot \frac{\text{total linear hand distance}}{\text{expected linear hand distance}} + w_4 \cdot \frac{\text{total angular hand distance}}{\text{expected total angular hand distance}}$$
[049] The weights $w_{1,2,3,4}$ represent weight factors based on the importance of each output item. For example, the hand motion for some tasks, such as rotating a knob, is expected to be more complex than that for pressing a push-button; therefore, a higher weight factor can be assigned to $w_2$ than to the others. Ideally, the ratio should be close to 1. If the ratio is much higher than 1, it can be flagged with additional remarks (e.g., overuse due to excessive non-scanning hand rotational motion) for later display.

[050] Operating between the input layer 502 and the output layer 504 can be one or more hidden layers 506 configured to assign and optimize weights associated with the clinical context inputs, for example via backpropagation, and apply the weighted inputs to an activation function, e.g., the rectified linear activation function. The number of hidden layers and the number of neurons present therein may also vary. In some embodiments, the number of hidden neurons in a given hidden layer may equal the product of the total number of input neurons and output neurons, together multiplied by the number of datasets used to train the network. The particular neural network(s) implemented in accordance with the disclosed embodiments may vary. The number of neurons may change, for example, along with their arrangement within the network. The size, width, depth, capacity, and/or architecture of the network may vary.
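The weighted overuse ratio of Equation 2.1 might be sketched as below. The division by the weight sum is an assumption made so that an exactly expected performance still yields a ratio of 1; the usage totals and weights are illustrative:

```python
def overuse_ratio(actual, expected, weights):
    """Weighted average of actual-to-expected usage ratios across the
    four output items of Equation 2.1; a result near 1 is ideal."""
    total_weight = sum(weights.values())
    score = sum(weights[k] * actual[k] / expected[k] for k in weights)
    return score / total_weight


actual = {"button_pushes": 120, "knob_usage": 40,
          "linear_m": 20.0, "angular_deg": 5000.0}
expected = {"button_pushes": 100, "knob_usage": 30,
            "linear_m": 15.0, "angular_deg": 4000.0}
# Knob rotation is weighted more heavily, as suggested above.
weights = {"button_pushes": 1.0, "knob_usage": 2.0,
           "linear_m": 1.0, "angular_deg": 1.0}

r = overuse_ratio(actual, expected, weights)
if r > 1.0:
    # Flag for later display with a qualitative remark.
    print(f"possible overuse (ratio {r:.2f})")
```

A threshold well above 1 (rather than exactly 1) could be used in practice to absorb the uncertainty ranges described above before flagging an exam.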
[051] The neural network 500 may be hardware-based (e.g., neurons are represented by physical components) or software-based (e.g., neurons and pathways implemented in a software application), and can use a variety of topologies and learning algorithms for training the neural network to produce the desired output. For example, a software-based neural network may be implemented using a processor (e.g., single- or multi-core CPU, a single GPU or GPU cluster, or multiple processors arranged for parallel processing) configured to execute instructions, which may be stored in a computer-readable medium, and which when executed cause the processor to perform a machine-trained algorithm for receiving clinical context input(s) and generating an expected activity level of the non-scanning hand of an ultrasound user performing the ultrasound exam included within the input clinical context. The neural network 500 may be implemented, at least in part, in a computer-readable medium comprising executable instructions, which when executed by a processor, may cause the processor to perform a machine-trained algorithm to output expected activity levels for the non-scanning hand of an ultrasound user performing a particular exam.
[052] In various examples, the neural network(s) may be trained using any of a variety of currently known or later developed machine learning techniques to obtain a neural network (e.g., a machine-trained algorithm or hardware-based system of nodes) that is configured to analyze input data in the form of patient- and exam-specific information collectively constituting a clinical context. The ground truth used for training the network 500 can include documented activity levels of expert, experienced, or average ultrasound users exerted in the same or similar clinical context. The accuracy of the neural network can grow stronger over time as more data is input.

[053] The neural network 500 can also be coupled to a training database 508. The training database 508 may provide a large sample of clinical context data sets and corresponding usages used to train the neural network. Communication between the training database 508 and the neural network 500 can be bidirectional, such that the training database 508 may provide usages obtained by experienced ultrasound users to the network for training purposes, and the neural network 500 can transmit new clinical context datasets for storage in the training database 508, thereby increasing the sample size of clinical contexts paired with usage outputs and further refining future output from the neural network.
[054] While one or more neural networks, e.g., network 500, may be utilized to generate expected usages of an ultrasound user’s non-scanning hand, embodiments are not confined to neural networks, and a number of additional and alternative intelligence systems or components may be utilized, such as random forests or support vector machines.
[055] FIG. 6 shows a graphical user interface (GUI) 600, which may also be considered a “dashboard” or “panel,” configured to display a health status report 602 corresponding to a particular ultrasound user, who can be selected from a list of ultrasound users included in a dropdown menu 604. In additional embodiments, the GUI 600 can receive free text and/or search entries corresponding to specific ultrasound users. The health status report 602 can be displayed via the GUI 600 to various personnel, including ultrasound users or clinical lab managers seeking periodic health status reports of non-scanning hand usage.
[056] As shown, the report 602 can include selectable performance output 606, which can include options such as “average exam” or “single exam.” As the displayed options indicate, an average exam performance output causes the report 602 to show an average performance output determined based on two or more exams, whereas the single exam performance output causes the report 602 to show the performance output of a single exam. The report 602 can also feature a selectable time period or date range 608 over which a given ultrasound user’s non-scanning hand usage is determined. The ultrasound user’s non-scanning hand performance may vary depending on the date range specified at the GUI 600. For example, a wide range, e.g., one year or longer, may reveal approximately average efficiency levels, whereas a date range spanning the first six months of that same one-year period may show low-efficiency levels, and a date range spanning the second six months of the one-year period may show high-efficiency levels. The date range can span less than one day and stretch as long as months or years. The report 602 can thus provide detailed, customizable information to benchmark certain performance metrics and enable continuous performance improvement.
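The date-range averaging behind the “average exam” option can be illustrated with a short sketch (the exam record fields and numbers are hypothetical, not from the patent):

```python
from datetime import date

# Hypothetical per-exam efficiency records for one ultrasound user.
exams = [
    {"date": date(2020, 11, 12), "efficiency": 0.62},
    {"date": date(2021, 2, 3),   "efficiency": 0.81},
    {"date": date(2021, 5, 20),  "efficiency": 0.90},
]

def average_efficiency(exams, start, end):
    """Average non-scanning hand efficiency over exams in [start, end]."""
    vals = [e["efficiency"] for e in exams if start <= e["date"] <= end]
    return sum(vals) / len(vals) if vals else None

# Narrowing the date range changes the reported level, as the text notes:
print(average_efficiency(exams, date(2020, 11, 1), date(2021, 4, 30)))  # prints roughly 0.715
```

Widening `start`/`end` to the full year would blend the low- and high-efficiency halves into an approximately average rating.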
[057] The report 602 can also include an exam number selection 610, which allows the user to view information regarding a specific exam performed on a specific day. In the illustrated example, the user selected exam number 3, performed on November 12, 2020. An ultrasound user can thus query the system on-demand to provide a health status report 602 based solely on a specific ultrasound exam, which may be the most recent exam performed, or on multiple exams performed over a defined period also specified by the ultrasound user. For example, an ultrasound user can query the system to provide a non-scanning hand health status report based on all exams performed within the previous week, month, or year. In this manner, the ultrasound user can determine how a particular exam or collection of exams has impacted the current health status of his/her non-scanning hand.
[058] The details section 612 of the report 602 provides the clinical context corresponding to the selected exam. As shown, the clinical context of exam number 3 included a cardiac exam performed on a difficult outpatient using the Affiniti 50 ultrasound machine. A qualitative remarks section 614 provides a summary of the ultrasound user’s non-scanning hand performance. The level of detail provided in the remarks section 614 can vary. The example shown indicates “non-scanning hand overuse, particularly extra linear and angular movement.” To provide additional, more specific information, the non-scanning hand efficiency graphic 616 parses out the efficiency of the ultrasound user’s linear motion, angular motion, total button usage, and total knob usage. Additional information included in the health status report 602 can include the health status of the ultrasound user, which may be based on one or more examinations, and the extent to which the ultrasound user’s performance metrics deviated from the expected metrics. Upon request, ultrasound users can query the health status of their non-scanning hand by clicking on a tag/button/other feature on the ultrasound imaging screen of the GUI 600 at the end of an ultrasound exam. The disclosed systems can store and utilize ultrasound user-specific performance metrics to generate individualized health reports on a daily, weekly, biweekly, monthly, quarterly, and/or annual basis. In some embodiments, the GUI 600 can be configured to generate and display confidence levels depending on the number of exams included in a specified time period. For instance, confidence levels may be relatively low if a small number of exams is encompassed within a specified date range, whereas confidence levels may be relatively high for a larger number of exams.
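One simple way to realize the sample-size-dependent confidence levels described above is a tiered cutoff; the cutoff values and tier names here are illustrative assumptions, not values given in the disclosure:

```python
def report_confidence(num_exams, low_cutoff=5, high_cutoff=20):
    """Tier a health report's confidence by the number of exams it covers.

    Cutoffs are illustrative: few exams yield a low-confidence report,
    many exams a high-confidence one, per the sample-size rationale above.
    """
    if num_exams < low_cutoff:
        return "low"
    if num_exams < high_cutoff:
        return "medium"
    return "high"

print(report_confidence(3))   # low
print(report_confidence(25))  # high
```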
[059] Additionally, a live pop-up message or alert 618 can be automatically generated and displayed on the GUI when abnormal (e.g., very large) linear movement, angular motion, and/or button pushes are detected during an exam, as compared to the expected values forecasted by the predictive system for the same upstream clinical context. Enabling live alerts upon recognizing these abnormalities provides real-time guidance for ultrasound users, which may minimize potential overuse and injury risk to the non-scanning hand. In some examples, systems herein can connect an ultrasound user virtually to an expert who can recommend the implementation of any necessary adjustments. The live alert 618 can be displayed on a health status report configured to monitor and display information regarding a current exam, instead of a previous exam. In some examples, the live alert 618 can be displayed on GUI 600, but not as a component of the health status report. In some examples, the live alert 618 can be displayed on a separate GUI or screen configured to display live ultrasound images obtained during the exam, and may be displayed in a manner that does not unnecessarily disrupt the current exam.
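The abnormality check that triggers alert 618, comparing measured components against the forecast for the same clinical context, can be sketched as follows. The component names and the tolerance factor are illustrative assumptions:

```python
def live_alerts(actual, expected, tolerance=1.5):
    """Return the usage components that exceed their forecast.

    `actual` and `expected` map component names (e.g. "linear", "angular",
    "button_pushes") to measured vs. forecast values for the same clinical
    context; `tolerance` sets how far above forecast counts as abnormal.
    """
    return [name for name, value in actual.items()
            if value > tolerance * expected.get(name, float("inf"))]

alerts = live_alerts(
    actual={"linear": 95.0, "angular": 40.0, "button_pushes": 310},
    expected={"linear": 50.0, "angular": 38.0, "button_pushes": 250},
)
print(alerts)  # ['linear'] -- only linear movement is abnormally large
```

A GUI layer would then render a pop-up naming the flagged components rather than, as here, simply printing the list.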
[060] The GUI 600 can be physically and/or communicatively coupled to one or more underlying processors 620 configured to generate and modify the displayed graphics in response to different user inputs received at the GUI 600, for example regarding the date range over which non-scanning hand efficiency is calculated and the qualitative remarks associated therewith. One or more of the processors 620 can be configured to operate an intelligence system, such as neural network 500. In addition or alternatively, one or more of the processors 620 can be configured to mine service log files after one or more ultrasound exams, and may be configured to determine non-scanning hand usages. The one or more processors 620 can also be configured to compare actual non-scanning hand usages to expected non-scanning hand usages. Non-scanning hand data acquired over time can be stored in one or more data storage devices 622, which may be coupled with the graphical user interface 600 and/or the one or more processors 620. The GUI 600 can be configured to identify, extract, and/or receive the stored data required to generate a particular health status report 602 in accordance with user-specified parameters received at the GUI 600. For example, the GUI 600 and one or more processors 620 coupled therewith can mine the data storage device(s) 622 for one or more datasets regarding the non-scanning hand usage of a particular ultrasound user over a specific time period, along with any particular exams performed during that time period. The GUI 600 can thus be configured to initiate and/or perform a variety of data extractions and analyses, receive data responsive to such extractions and analyses, and/or generate graphical displays in the form of a health status report 602 customized in view of the same.
[061] The GUI 600 can also be configured to generate and/or modify the graphics displayed on the health status report 602 in accordance with additional user inputs. For example, the non-scanning hand efficiency section 616 may be modified to remove the circular efficiency ratings and/or replace them with a linear efficiency display or numerical efficiency display. The GUI 600 can also be configured to display absolute data regarding non-scanning hand usage, for example in terms of usage hours and the total linear and angular movement accrued during certain periods. In some examples, the GUI 600 can be configured to selectively obtain/receive and display data corresponding to a particular clinical context. For example, the GUI 600 can obtain/receive and display data corresponding to a particular patient type (e.g., “difficult”), a particular model of ultrasound machine, a particular time of day (e.g., AM or PM), and a particular type of exam (e.g., cardiac). Ultrasound users and lab managers may therefore use the GUI 600 to identify specific areas needing improvement, which may also be used to adjust ultrasound user scheduling.
[062] As noted above, individual hand movements and activities in sonography are not necessarily harmful alone, but frequent repetition or prolonged duration of exposure, compounded with a pace that lacks sufficient time for recovery, can increase the risk of injury significantly. Ultrasound users who repeatedly perform the same type(s) of exams utilizing the same muscle groups are therefore more susceptible to injury. The graphical user interface 600 can convey risk-enhancing activities undertaken by a particular ultrasound user, for example in the form of over-usage and inefficiencies itemized by activity type, e.g., knob rotation, thereby enabling the ultrasound user to minimize the risk of injury.
[063] In additional embodiments, one or more of the systems disclosed herein may be configured to predict a future time at which the non-scanning hand of an ultrasound user is likely to develop an injury, such as carpal tunnel syndrome. The systems disclosed herein can be configured to recommend certain actions or adjustments based on this information. For example, embodiments can be configured to recommend that an ultrasound user perform less of a certain exam and/or that an ultrasound user receive additional training to improve his/her efficiency with respect to a specific exam.

[064] FIG. 7 is a block diagram illustrating an example processor 700 according to principles of the present disclosure. One or more processors utilized to implement the disclosed embodiments may be configured the same as or similar to processor 700. Processor 700 may be used to implement one or more processes described herein. For example, processor 700 may be configured to implement an artificial intelligence system configured to generate predicted usage levels, such as neural network 500. Accordingly, the processor 700 can also be configured to receive a clinical context data set and determine an actual usage of the non-scanning hand of the ultrasound user expended during the ultrasound exam. The same processor, or a different processor configured similarly, can also compare the expected usage to the actual usage to generate a performance metric of the non-scanning hand of the ultrasound user. The processor 700 can also be configured as a graphics processor programmed to generate displays for the graphical user interfaces described herein. The processor 700 can also be configured to determine the actual usage of the non-scanning hand during an ultrasound exam by mining service log files, which may be obtained and recorded by the same processor or a different processor configured similarly.
In some examples, the processor 700 can be configured to determine an actual usage of the non-scanning hand during an ultrasound exam by determining a total linear motion and a total angular motion of the non-scanning hand based on a physical layout of the control panel and the total number, chronology, and type of ultrasound user actions received at the control panel.
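The computation just described, deriving total linear and angular motion from the panel layout plus the chronological action log, can be sketched as follows. The control names, coordinates, and log tuple format are hypothetical stand-ins for whatever the service log files actually record:

```python
import math

# Hypothetical control-panel layout: control name -> (x, y) position in cm.
PANEL = {"freeze": (0.0, 0.0), "depth_knob": (12.0, 5.0), "gain_knob": (20.0, 5.0)}

def hand_motion(actions, panel=PANEL, start=(0.0, 0.0)):
    """Total linear travel (cm) and knob rotation (degrees) for one session.

    `actions` is the chronological log mined from the service log files:
    ("press", control) or ("turn", control, degrees). Linear motion is the
    sum of straight-line hops between successive controls; angular motion
    is the sum of absolute knob rotations.
    """
    linear, angular, pos = 0.0, 0.0, start
    for action in actions:
        kind, control = action[0], action[1]
        x, y = panel[control]
        linear += math.hypot(x - pos[0], y - pos[1])
        pos = (x, y)
        if kind == "turn":
            angular += abs(action[2])
    return linear, angular

log = [("press", "freeze"), ("turn", "depth_knob", 30.0), ("turn", "gain_knob", -15.0)]
linear, angular = hand_motion(log)
print(round(linear, 2), angular)  # 21.0 45.0
```

Straight-line hops are a simplifying assumption; a finer model could follow actual hand trajectories, but the total number, chronology, and type of actions are exactly the inputs the text names.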
[065] Processor 700 may be any suitable processor type including, but not limited to, a microprocessor, a microcontroller, a digital signal processor (DSP), a field programmable gate array (FPGA) where the FPGA has been programmed to form a processor, a graphical processing unit (GPU), an application specific integrated circuit (ASIC) where the ASIC has been designed to form a processor, or a combination thereof.
[066] The processor 700 may include one or more cores 702. The core 702 may include one or more arithmetic logic units (ALU) 704. In some examples, the core 702 may include a floating point logic unit (FPLU) 706 and/or a digital signal processing unit (DSPU) 708 in addition to or instead of the ALU 704.
[067] The processor 700 may include one or more registers 712 communicatively coupled to the core 702. The registers 712 may be implemented using dedicated logic gate circuits (e.g., flip-flops) and/or any memory technology. In some examples, the registers 712 may be implemented using static memory. The registers 712 may provide data, instructions, and addresses to the core 702.

[068] In some examples, processor 700 may include one or more levels of cache memory 710 communicatively coupled to the core 702. The cache memory 710 may provide computer-readable instructions to the core 702 for execution. The cache memory 710 may provide data for processing by the core 702. In some examples, the computer-readable instructions may have been provided to the cache memory 710 by a local memory, for example, local memory attached to the external bus 716. The cache memory 710 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
[069] The processor 700 may include a controller 714, which may control input to one or more processors included in the systems disclosed herein. Controller 714 may control the data paths in the ALU 704, FPLU 706 and/or DSPU 708. Controller 714 may be implemented as one or more state machines, data paths and/or dedicated control logic. The gates of controller 714 may be implemented as standalone gates, FPGA, ASIC or any other suitable technology.
[070] The registers 712 and the cache memory 710 may communicate with controller 714 and core 702 via internal connections 720A, 720B, 720C and 720D. Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology.
[071] Inputs and outputs for the processor 700 may be provided via a bus 716, which may include one or more conductive lines. The bus 716 may be communicatively coupled to one or more components of processor 700, for example the controller 714, cache 710, and/or register 712. The bus 716 may be coupled to one or more components of the system.
[072] The bus 716 may be coupled to one or more external memories. The external memories may include Read Only Memory (ROM) 732. ROM 732 may be a masked ROM, Electronically Programmable Read Only Memory (EPROM) or any other suitable technology. The external memory may include Random Access Memory (RAM) 733. RAM 733 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology. The external memory may include Electrically Erasable Programmable Read Only Memory (EEPROM) 735. The external memory may include Flash memory 734. The external memory may include a magnetic storage device such as disc 736.

[073] FIG. 8 is a flow diagram of a method of evaluating and displaying ultrasound user performance. The example method 800 shows the steps that may be utilized, in any sequence, by the systems and/or apparatuses described herein. Although examples of the present system have been illustrated with particular reference to ultrasound imaging modalities, the present system can be extended to other medical imaging systems where one or more images are obtained in a systematic manner. For example, the method 800 may be performed with the aid of one or more imaging systems including but not limited to MRI or CT.
[074] In the embodiment shown, the method 800 begins at block 802 by “receiving a user inquiry regarding a non-scanning hand performance of an ultrasound user over a time period.” At block 804, the method involves “obtaining a non-scanning hand performance metric indicative of the non-scanning hand performance of the ultrasound user over the time period, the non-scanning hand performance metric based on a comparison of an actual non-scanning hand usage to an expected non-scanning hand usage determined over the time period.” At block 806, the method involves “displaying the performance metric,” which may comprise a non-scanning hand efficiency rating. The method 800 can also involve adjusting a depiction of the performance metric in response to a user input, which may relate to the time period over which the performance metric is determined, the ultrasound user being evaluated, and/or the exams included in the time period. The expected non-scanning hand usage can be based on one or more clinical context inputs. Each clinical context input can include a description of one or more ultrasound exams conducted over the time period, one or more models of ultrasound machine utilized over the time period to perform the exam(s), at least one attribute of the patient(s), and/or a medical history of the patient(s). The actual usage of the non-scanning hand can be based on service log files obtained and recorded during the ultrasound exam(s) performed during the time period. As disclosed herein, the service log files can include data indicative of the total number, chronology, and type of ultrasound user actions received at the control panel of an ultrasound image acquisition device. Qualitative remarks or messages describing the performance metric and/or one or more recommended adjustments to be implemented by an ultrasound user can also be generated and displayed.
One or more of the steps included in the method 800 may be implemented automatically by a disclosed system, such that user input is not required. Display of the performance metric, for example, may be performed automatically by the GUI operating together with one or more processors. Suggested adjustments for a given ultrasound user can also be updated automatically in response to the selection of different time periods and/or exams performed therein.
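The core of blocks 804 and 806, comparing actual to expected usage and rendering a rating with a qualitative remark, can be sketched as below. The efficiency formula (expected over actual, capped at 1.0), the remark threshold, and the report string are illustrative assumptions; the disclosure does not fix a formula:

```python
def performance_metric(actual, expected):
    """Efficiency rating from comparing actual to expected usage.

    Illustrative definition: expected / actual, capped at 1.0, so using
    more motion than forecast lowers the rating below 1.0.
    """
    return min(1.0, expected / actual) if actual > 0 else 1.0

def render_report(user, actual, expected):
    """Format a one-line health status summary (block 806 stand-in)."""
    rating = performance_metric(actual, expected)
    remark = ("non-scanning hand overuse" if rating < 0.8
              else "usage within expected range")
    return f"{user}: efficiency {rating:.0%} - {remark}"

print(render_report("User A", actual=125.0, expected=100.0))
# User A: efficiency 80% - usage within expected range
```

A fuller implementation would compute one rating per component (linear, angular, buttons, knobs), matching the itemized graphic 616.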
[075] In various embodiments where components, systems and/or methods are implemented using a programmable device, such as a computer-based system or programmable logic, it should be appreciated that the above-described systems and methods can be implemented using any of various known or later developed programming languages, such as “C”, “C++”, “FORTRAN”, “Pascal”, “VHDL” and the like. Accordingly, various storage media, such as magnetic computer disks, optical disks, electronic memories and the like, can be prepared that can contain information that can direct a device, such as a computer, to implement the above-described systems and/or methods. Once an appropriate device has access to the information and programs contained on the storage media, the storage media can provide the information and programs to the device, thus enabling the device to perform functions of the systems and/or methods described herein. For example, if a computer disk containing appropriate materials, such as a source file, an object file, an executable file or the like, were provided to a computer, the computer could receive the information, appropriately configure itself and perform the functions of the various systems and methods outlined in the diagrams and flowcharts above to implement the various functions. That is, the computer could receive various portions of information from the disk relating to different elements of the above-described systems and/or methods, implement the individual systems and/or methods and coordinate the functions of the individual systems and/or methods described above.
[076] In view of this disclosure, it is noted that the various methods and devices described herein can be implemented in hardware, software and firmware. Further, the various methods and parameters are included by way of example only and not in any limiting sense. In view of this disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and needed equipment to effect these techniques, while remaining within the scope of the invention. The functionality of one or more of the processors described herein may be incorporated into a fewer number or a single processing unit (e.g., a CPU) and may be implemented using application specific integrated circuits (ASICs) or general purpose processing circuits which are programmed responsive to executable instructions to perform the functions described herein.
[077] Accordingly, the present system may be used to obtain and/or project image information related to, but not limited to, renal, testicular, breast, ovarian, uterine, thyroid, hepatic, lung, musculoskeletal, splenic, and cardiac applications. Further, the present system may also include one or more programs which may be used with conventional imaging systems so that they may provide features and advantages of the present system. Certain additional advantages and features of this disclosure may be apparent to those skilled in the art upon studying the disclosure, or may be experienced by persons employing the novel system and method of the present disclosure. Another advantage of the present systems and method may be that conventional medical image systems can be easily upgraded to incorporate the features and advantages of the present systems, devices, and methods.
[078] Of course, it is to be appreciated that any one of the examples, embodiments or processes described herein may be combined with one or more other examples, embodiments and/or processes or be separated and/or performed amongst separate devices or device portions in accordance with the present systems, devices and methods.
[079] Finally, the above-discussion is intended to be merely illustrative of the present system and should not be construed as limiting the appended claims to any particular embodiment or group of embodiments. Thus, while the present system has been described in particular detail with reference to exemplary embodiments, it should also be appreciated that numerous modifications and alternative embodiments may be devised by those having ordinary skill in the art without departing from the broader and intended spirit and scope of the present system as set forth in the claims that follow. Accordingly, the specification and drawings are to be regarded in an illustrative manner and are not intended to limit the scope of the appended claims.

CLAIMS

What is claimed is:
1. An ultrasound user performance evaluation system comprising: one or more processors (620, 700) configured to: determine a non-scanning hand performance metric (616) indicative of a non-scanning hand usage of an ultrasound user over a time period (608) in response to a user inquiry (604, 606, 610), the non-scanning hand performance metric based on a comparison of an input of an actual non-scanning hand usage (108) to an input of an expected non-scanning hand usage (106) determined over the time period; and a graphical user interface (600) configured to: receive the user inquiry; obtain the non-scanning hand performance metric; and display the non-scanning hand performance metric.
2. The ultrasound user performance evaluation system of claim 1, wherein the one or more processors are further configured to receive an input of one or more clinical contexts, each clinical context comprising: a description of an ultrasound exam conducted over the time period, a model of ultrasound machine utilized over the time period, at least one attribute of a patient examined over the time period, a medical history for the patient, or combinations thereof.
3. The ultrasound user performance evaluation system of claim 2, wherein the one or more processors are further configured to apply a neural network to each of the clinical contexts, the neural network configured to generate the expected usage of the non-scanning hand based on the clinical contexts.
4. The ultrasound user performance evaluation system of claim 3, wherein the one or more processors are further configured to determine the actual usage of the non-scanning hand expended during the ultrasound exams.
5. The ultrasound user performance evaluation system of claim 4, wherein the one or more processors are configured to determine the actual usage of the non-scanning hand based on inputs from one or more service log files obtained and recorded by the one or more processors during the ultrasound exams.
6. The ultrasound user performance evaluation system of claim 5, wherein the one or more processors are communicatively coupled with one or more control panels, each control panel configured to adjust imaging parameters implemented by an ultrasound image acquisition device in response to input received from the ultrasound user during an ultrasound exam.
7. The ultrasound user performance evaluation system of claim 6, wherein the service log files include data indicative of a total number, chronology, and type of ultrasound user actions received at the control panel.
8. The ultrasound user performance evaluation system of claim 7, wherein the one or more processors are configured to determine the actual usage of the non-scanning hand by determining a total linear motion and a total angular motion of the non-scanning hand based on a physical layout of the control panel and the total number, chronology, and type of ultrasound user actions received at the control panel.
9. The ultrasound user performance evaluation system of claim 1, wherein the performance metric comprises an efficiency rating of the actual usage of the non-scanning hand.
10. The ultrasound user performance evaluation system of claim 8, wherein the performance metric comprises efficiency ratings corresponding to the total linear motion, the total angular motion, the total number of ultrasound user inputs, or combinations thereof.
11. The ultrasound user performance evaluation system of claim 7, wherein the ultrasound user actions comprise total button pushes and total knob usage.
12. The ultrasound user performance evaluation system of claim 1, wherein the graphical user interface is further configured to generate and display qualitative remarks describing the performance metric.
13. The ultrasound user performance evaluation system of claim 3, wherein the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs, wherein the training inputs comprise a sample of clinical context datasets obtained from previously performed ultrasound exams, and the known outputs comprise performance metrics of the non-scanning hand of ultrasound users who performed the previous ultrasound exams.
14. A method of evaluating and displaying ultrasound user performance, the method comprising: receiving (802) a user inquiry (604, 606, 610) regarding a non-scanning hand performance of an ultrasound user over a time period (608); obtaining (804) a non-scanning hand performance metric (616) indicative of the non-scanning hand performance of the ultrasound user over the time period, the non-scanning hand performance metric based on a comparison of an input of an actual non-scanning hand usage (108) to an input of an expected non-scanning hand usage (106) determined over the time period; and displaying (806) the performance metric.
15. The method of claim 14, further comprising adjusting a depiction of the performance metric in response to a user input.
16. The method of claim 14, wherein the expected non-scanning hand usage is based on an input of one or more clinical contexts, each clinical context comprising: a description of one or more ultrasound exams conducted over the time period, one or more models of ultrasound machine utilized over the time period, at least one attribute of the patient, a medical history of the patient, or combinations thereof.
17. The method of claim 14, wherein the actual usage of the non-scanning hand is based on service log files obtained and recorded during ultrasound exams performed during the time period, wherein the service log files include data indicative of a total number, chronology, and type of ultrasound user actions received at a control panel of an ultrasound image acquisition device.
18. The method of claim 14, wherein the performance metric comprises an efficiency rating of the actual usage of the non-scanning hand.
19. The method of claim 14, further comprising generating and displaying qualitative remarks describing the performance metric.
20. A non-transitory computer-readable medium comprising executable instructions, which when executed cause a processor of an ultrasound user performance evaluation system to perform any of the methods of claims 14-19.
PCT/EP2022/065783 2021-06-16 2022-06-10 User non-scanning hand evaluation WO2022263298A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163211060P 2021-06-16 2021-06-16
US63/211,060 2021-06-16

Publications (1)

Publication Number Publication Date
WO2022263298A1 true WO2022263298A1 (en) 2022-12-22



Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100203487A1 (en) * 2009-02-12 2010-08-12 American Registry for Diagnostic Medical Sonography, Inc. Systems and methods for assessing a medical ultrasound imaging operator's competency
US20190340956A1 (en) * 2018-05-05 2019-11-07 Mentice Inc. Simulation-based training and assessment systems and methods


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 22733587
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE