WO2022263215A1 - User load balancing - Google Patents

User load balancing

Info

Publication number
WO2022263215A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
ultrasound
exam
users
exams
Prior art date
Application number
PCT/EP2022/065283
Other languages
French (fr)
Inventor
Seyedali SADEGHI
Anup Agarwal
Claudia ERRICO
Hua Xie
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2022263215A1 publication Critical patent/WO2022263215A1/en

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology

Definitions

  • the present disclosure pertains to systems and methods for minimizing physical stress sustained by users performing ultrasound exams over various lengths of time. Implementations include systems configured to generate and display user- and patient-specific estimations of ultrasound exam complexity. Systems are further configured to generate and display individualized exam schedules based on the estimated complexities.
  • Work-related musculoskeletal disorders (WRMSDs) are painful conditions caused by workplace activities affecting the muscles, ligaments, and tendons.
  • WRMSDs often impose a substantial personal toll on those affected since they may no longer be able to perform personal tasks and routine activities of daily living.
  • WRMSDs develop gradually over time from repeated exposure to a variety of risk factors and may be painful during work and even at rest.
  • the present disclosure describes systems and methods for monitoring, predicting, and reducing the physical stress sustained by ultrasound users, such as sonographers.
  • a user's scanning hand moves an ultrasound probe over the target area of a patient while the non-scanning hand engages with a control panel of the ultrasound system.
  • User injuries are often caused by prolonged, repetitious overuse of the scanning hand, the non-scanning hand, and/or one or both shoulders.
  • Systems disclosed herein can reduce the risk of injury by balancing the workload placed on the hands, wrists, arms, and/or shoulders of users.
  • Embodiments can include graphical user interfaces configured to extract and display user-defined datasets reflecting anticipated ultrasound exam complexities specific to one or more users.
  • the graphical user interface can also display the exam complexities expected for one or more users over user-defined time frames and optimize exam scheduling in view of the same to reduce the predicted burden placed on user scanning and non-scanning hands, thereby minimizing the likelihood of injury.
  • one or more processors can be configured to receive clinical context information input by a user, a lab manager, and/or received from another processor or database. Predicted usages of a user’s scanning and non-scanning hand can also be generated and/or received, along with information regarding the experience and/or skill level of a particular user and additional patient data impacting the complexity of one or more ultrasound exams.
  • the processor(s) can apply an intelligence system, which may include one or more neural networks, to the received information to determine an estimated complexity rating specific to each exam.
  • a user scheduling system can include one or more processors configured to predict one or more ultrasound exam complexity levels, each of the complexity levels corresponding to an ultrasound exam assigned to a user.
  • the system may also include a graphical user interface configured to receive a user inquiry regarding one or more users and one or more ultrasound exams assigned thereto, may distribute the ultrasound exams over a time period based on the complexity levels associated therewith, and may generate a user exam schedule displaying the distributed complexity levels.
  • each of the complexity levels can be based on the expected scanning hand usage required to perform the ultrasound exam, the expected non-scanning hand usage required to perform the ultrasound exam, and the skill level of the user.
  • the expected scanning hand usage can include an expected translational and rotational movement of the scanning hand.
  • the expected non-scanning hand usage can include a total linear and angular motion of the non-scanning hand and a total number and type of user actions received at a control panel of an ultrasound machine utilized to perform the ultrasound exam.
  • each of the complexity levels can be further based on a physical attribute and/or clinical history of a patient to be examined.
  • the one or more processors can be configured to predict the complexity levels by applying a neural network to the expected scanning hand usage, the expected non-scanning hand usage, and the skill level of the user.
  • the graphical user interface is further configured to receive at least one user input specifying the time period over which the user exam schedule should span, along with the users to be included in the exam schedule.
  • the graphical user interface is configured to distribute the ultrasound exams over the time period by maximizing the time between ultrasound exams having high complexities assigned to each of the users.
  • the time period spans one day and the graphical user interface is configured to assign ultrasound exams having high complexities earlier in the day for one or more of the users.
  • the graphical user interface is further configured to receive a user input specifying one or more scheduling conditions used to distribute the ultrasound exams over the time period.
  • the one or more scheduling conditions are user-specific.
  • the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs.
  • the training inputs can include a sample of non-scanning hand usages, scanning hand usages, and user skill levels obtained from previously performed ultrasound exams.
  • the known outputs can include complexity ratings reported by users who performed the previous ultrasound exams.
  • a method of generating and displaying a user exam schedule can involve receiving a user inquiry regarding one or more users and one or more ultrasound exams assigned thereto, obtaining complexity levels corresponding to the ultrasound exams assigned to the users, distributing the ultrasound exams over a time period based on the complexity levels associated therewith, and generating a user exam schedule displaying the distributed complexity levels.
  • the method further involves receiving a user input specifying the time period and the users included in the user exam schedule. In some examples, the method further involves receiving a user input specifying one or more scheduling conditions used to distribute the ultrasound exams over the time period. In some examples, distributing the ultrasound exams over the time period comprises maximizing a time between ultrasound exams having high complexities assigned to each of the one or more users. In some examples, the time period spans one day and distributing the ultrasound exams comprises assigning ultrasound exams having high complexities earlier in the day for one or more of the users.
  • each of the complexity levels can be based on the expected scanning hand usage required to perform one of the ultrasound exams, the expected non-scanning hand usage required to perform one of the ultrasound exams, and the skill level of the user.
  • the complexity levels are predicted by applying a neural network to the expected scanning hand usage, the expected non-scanning hand usage, and the skill level of the user.
  • Any of the methods described herein, or steps thereof, may be embodied in a non-transitory computer-readable medium comprising executable instructions, which when executed may cause one or more hardware processors to perform the method or steps embodied herein.
  • FIG. 1A is a schematic overview of an exam complexity prediction system implemented in accordance with embodiments of the present disclosure.
  • FIG. 1B is a schematic of a graphical user interface displaying a customized user exam schedule in accordance with embodiments of the present disclosure.
  • FIG. 2 is a schematic of a neural network implemented to determine expected exam complexities in accordance with embodiments of the present disclosure.
  • FIG. 3 is a schematic of a processor utilized in accordance with embodiments of the present disclosure.
  • FIG. 4 is a flow diagram of a method performed in accordance with embodiments of the present disclosure.
  • Embodiments of the disclosed systems are configured to utilize expected usage levels of a user’s scanning and non-scanning hands required to perform a variety of scheduled ultrasound exams.
  • Systems are also configured to utilize a user’s skill level and the body type of each patient to generate an individualized complexity rating for each forthcoming ultrasound exam.
  • Predicted complexity ratings can be determined for multiple exams scheduled for multiple users over adjustable time frames.
  • the disclosed systems can optimize each user’s exam schedule in a manner that balances the user’s workload and minimizes the physical stress placed on the scanning hand, non-scanning hand, and/or one or both shoulders.
  • the terms “usage” and “activity” may be used interchangeably and may include all motions, movements, and activities of a user’s scanning hand, non-scanning hand, and one or both shoulders.
  • Scanning hand usage can be based on the total translational and/or rotational movement of a user’s scanning hand necessary to perform a given exam.
  • Non-scanning hand usage can encompass a user’s engagement with the controls on (or displayed on) the control panel of an ultrasound system. Such engagement can include button pushes, knob rotations, and the lateral/longitudinal movement of the non-scanning hand required to accomplish the same.
  • Scanning and non-scanning hand usages can be directly related and/or used interchangeably with scanning and non-scanning hand “complexity.”
  • total linear hand motion encompasses the total translational hand distance traveled by the scanning hand in the X, Y, and Z direction during an ultrasound exam.
  • Total rotational hand motion encompasses the total rotational hand movement of the scanning hand in the X, Y, and Z oriented axes during an exam.
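As a minimal sketch, the total linear and rotational motion defined above could be accumulated from tracked probe poses as follows. The function names, the sampling format, and the per-axis summation of rotation deltas are illustrative assumptions, not specified by the disclosure:

```python
import math

def total_linear_motion(positions):
    """Total translational distance traveled, summed over consecutive (x, y, z) samples."""
    return sum(math.dist(p, q) for p, q in zip(positions, positions[1:]))

def total_rotational_motion(orientations):
    """Total rotational movement, summed per axis over consecutive (rx, ry, rz) samples."""
    return sum(
        sum(abs(a - b) for a, b in zip(o1, o2))
        for o1, o2 in zip(orientations, orientations[1:])
    )

# Probe moves 1 cm along X, then 1 cm along Y: total linear motion is 2 cm.
path = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
print(total_linear_motion(path))  # 2.0
```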
  • the terms “clinical context” and “upstream clinical context” may be used interchangeably and may include or be based on the type of exam being performed, e.g., pulmonary exam or cardiac exam, as well as patient-specific information, non-limiting examples of which may include patient weight, height, body mass index (BMI), age, underlying health condition(s), clinical history, and combinations thereof. Additional information that may be included within a given clinical context can include the reason(s) for performing a particular exam, e.g., to evaluate left ventricular function, along with the patient’s length of stay, e.g., inpatient and/or outpatient time. Information constituting the clinical context can be combined and/or displayed on the user interface in various ways.
  • the clinical context can include categorical descriptors of a patient’s body type and/or associated levels of exam difficulty, such as “easy,” “moderate,” or “difficult.”
  • the clinical information can also include the particular ultrasound model being used to perform the exam, as the hand and body motion required to perform an ultrasound exam may differ for different models depending for example on the layout of the associated control panel(s).
  • the “health status” of a user’s scanning hand, non-scanning hand, and one or both shoulders can encompass the health of various anatomical parts associated with, including, or attached to the user’s shoulder(s), elbow(s), forearm(s), wrist(s), hand(s), finger(s), or combinations thereof.
  • “expert” or “experienced” users may include certified users such as certified sonographers having at least a certain length of experience, such as at least one year, two years, three years, four years, five years, or longer. “Expert” or “experienced” users such as sonographers may also include users such as sonographers who have received or attained a form of recognized achievement or certification.
  • FIG. 1A depicts an overview of an exam complexity prediction system 100 implemented in accordance with embodiments described herein.
  • the Figures, including FIG. 1A, may reference users of ultrasound such as a sonographer; however, the techniques disclosed herein are not limited to sonographers alone and include many different types of users of ultrasound technology.
  • the inputs 102 utilized by the system 100 for a particular exam can include an expected non-scanning hand complexity 104 and an expected scanning hand complexity 106, along with patient and user information 108.
  • the inputs 102 can be received by a neural network 110 or other intelligence system configured to generate an expected complexity rating 112 for a forthcoming ultrasound exam that is specific to the patient being examined and the user scheduled to perform the exam.
  • the output(s) 112 generated by the neural network 110 can be used by an intelligent scheduling system to optimize user scheduling, as further shown in FIG. 1B.
  • the neural network 110 can be configured to receive inputs 102 and generate outputs 112 successively. Additionally or alternatively, multiple neural networks 110 can be configured to operate in parallel to process sets of inputs 102 simultaneously.
  • the expected non-scanning hand complexity 104 and expected scanning hand complexity 106 can be predicted by one or more artificial intelligence systems, e.g., one or more neural networks, trained to predict non-scanning hand and scanning hand usage levels for a variety of exams based on the clinical context relevant to each exam.
  • clinical context typically includes the particular exam performed, the ultrasound machinery utilized, and various physical attributes of the current patient and his/her medical history. The clinical context thus directly impacts the expected non-scanning hand usage and scanning hand usage for each exam.
  • the expected non-scanning hand complexity 104 for each ultrasound exam can be provided in the form of predicted service log file output for each ultrasound exam.
  • the expected service log file output can include an expected log file imaging workflow sequence, which can include a total number and chronology of button pushes and knob rotations received by an ultrasound machine control panel, along with the total linear and angular movement required to implement such button pushes and knob rotations.
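The total number of button pushes and knob rotations in such a log file imaging workflow sequence could be tallied with a simple counter; the log entry format and control names below are hypothetical:

```python
from collections import Counter

# Hypothetical log-file workflow sequence; "action:control" entry format is an assumption.
log_sequence = [
    "button:freeze", "knob:gain", "knob:gain",
    "button:depth", "knob:tgc", "button:freeze",
]

# Count each action type across the sequence.
actions = Counter(entry.split(":")[0] for entry in log_sequence)
print(actions["button"], actions["knob"])  # 3 3
```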
  • the predicted linear and angular movement of the non-scanning hand can be determined based on the layout of the control panel used for each exam, which can also be included in the exam-specific clinical context. For example, the particular distance between each of the controls, along with the angles between such controls, often varies between control panels included in different ultrasound machines.
  • the sensitivity of any rotatable knobs, which may impact the degree of angular motion required to implement desired imaging adjustments, can also vary from one control panel to the next.
  • the neural network or other intelligence system implemented to generate expected non-scanning hand usages can be trained to receive a clinical context for each planned exam and generate an expected non-scanning hand usage output based on the ground truth usage of experienced users.
  • the expected usage of the non-scanning hand for an ultrasound exam can be parsed into individual activities, e.g., total knob usage, button pushes, linear hand movement, and/or angular hand movement.
  • systems disclosed herein can combine data regarding expected non-scanning hand activities into an overall complexity rating 104 specific to the non-scanning hand.
  • the expected non-scanning hand complexity 104 will increase as the expected button pushes, knob usage, linear movement, and/or angular movement increase.
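A minimal sketch of combining the parsed non-scanning hand activities into a single rating that increases with each activity, as described above. The weights and the normalization caps are illustrative assumptions; the disclosure does not specify a particular combination formula:

```python
def non_scanning_complexity(button_pushes, knob_rotations, linear_cm, angular_deg,
                            weights=(0.3, 0.3, 0.2, 0.2),
                            caps=(200, 100, 500, 2000)):
    """Weighted, capped-normalized combination of non-scanning hand activities,
    mapped onto a 1-5 rating. Weights and caps are illustrative assumptions."""
    features = (button_pushes, knob_rotations, linear_cm, angular_deg)
    score = sum(w * min(f / c, 1.0) for w, f, c in zip(weights, features, caps))
    return round(1 + 4 * score, 1)

print(non_scanning_complexity(0, 0, 0, 0))           # 1.0 (lowest complexity)
print(non_scanning_complexity(200, 100, 500, 2000))  # 5.0 (highest complexity)
```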
  • the expected scanning hand complexity 106 can include the translational and/or rotational movement of the scanning hand predicted for any given ultrasound exam.
  • the expected translational and rotational movement can be embodied in an expected electromagnetic tracking system output indicating the total amount and/or degree of translational and/or rotational scanning hand movement required for a particular ultrasound exam.
  • An electromagnetic tracking system can include an electromagnetic tracking device, e.g., a device or sensor integrated within or attached to an imaging device such as an ultrasound probe, and an external device configured to emit a low-intensity electromagnetic field through which the electromagnetic tracking device passes during an exam. Expected output of an electromagnetic tracking system can thus be based on anticipated movement of the electromagnetic tracking device during an ultrasound exam.
  • the neural network or other intelligence system implemented to generate expected scanning hand usages can be trained to receive a clinical context for each planned exam and generate an expected scanning hand usage output based on the ground truth usage of experienced users previously measured by an electromagnetic tracking system.
  • the expected usage of the scanning hand for an ultrasound exam can be parsed into individual activities, e.g., translational movement and rotational movement.
  • systems can combine data regarding expected activities into an overall complexity rating 106 specific to the scanning hand.
  • the expected scanning hand complexity 106 will increase as the expected translational and/or rotational movement also increase.
  • patient and user information 108 can also be input into the neural network 110.
  • Such input 108 can include the skill level, e.g., novice, junior, intermediate, and/or expert, of a user scheduled to perform an ultrasound exam.
  • User skill level can be based on years of experience, the number of exams performed, and/or the variety of exams performed.
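One illustrative way a skill tier could be derived from years of experience, exam count, and exam variety; the thresholds and feature caps below are assumptions, not taken from the disclosure:

```python
def skill_level(years_experience, exams_performed, exam_variety):
    """Rule-based tiering of user skill; thresholds and caps are illustrative assumptions."""
    score = (min(years_experience / 5, 1.0)
             + min(exams_performed / 1000, 1.0)
             + min(exam_variety / 10, 1.0))
    if score >= 2.5:
        return "expert"
    if score >= 1.5:
        return "intermediate"
    if score >= 0.75:
        return "junior"
    return "novice"

print(skill_level(6, 2000, 12))  # expert
print(skill_level(0.5, 40, 2))   # novice
```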
  • Patient information included in the input 108 can reflect the complexity, e.g., easy, moderate, or difficult, associated with imaging a patient. This information can depend at least in part on a patient’s physical attributes, e.g., height, weight, and/or BMI, such that a patient having a higher BMI, for example, may be more difficult to examine than a patient having a lower BMI.
  • a patient’s clinical history, which may include underlying medical conditions and/or the reason for performing a particular exam, can also be included in input 108.
  • the estimated complexity rating 112 generated by the neural network 110 can include a numerical rating ranging from 1 to 5, with 5 indicating high complexity (or “difficult” rating) and 1 indicating low complexity (or “easy” rating).
  • the complexity scale can vary, ranging anywhere from 1 to 10, 1 to 50, or 1 to 100, for example.
  • Additional embodiments can include letter ratings, color ratings, and/or qualitative ratings.
  • Qualitative complexity ratings can include a short description of the expected complexity, which may include predicted complexities associated with usage of the scanning and/or non-scanning hand. The disclosed embodiments are not limited to a particular rating format.
  • FIG. 1B shows a graphical user interface (GUI) 114 (which may also be referred to as a “dashboard” or “panel”) configured to display a user exam schedule 116 determined by an optimization system 118, which may include one or more underlying processors that form a component of, are included within, or are communicatively coupled with the GUI 114.
  • the optimization system 118 can receive complexity ratings 112 generated by the neural network 110, each complexity rating corresponding to an ultrasound exam scheduled to be performed on a specific patient by a particular user.
  • the optimization system 118 can distribute the exam assignments based on their associated complexity ratings 112 in a manner that balances the physical stresses sustained by each user over a particular time frame spanning one or more days, weeks, or months.
  • the methodology implemented by the optimization system 118 for distributing exam assignments can vary and may be different for different users.
  • the optimization system 118 can be configured to maximize the time between high-complexity exams.
  • the user exam schedule 116 for a single day was generated by an optimization system 118 configured to maximize the separation between the most complex exams, such that the two exams having a complexity rating of 5 were arranged first and last in the day for User A.
  • the same methodology was applied to the exam assignments for User B and User C.
  • Intervening exams can be arranged such that the most complex exams are evenly distributed, for example as shown for User A.
  • the exams having the second-highest complexity ratings were front-loaded such that the users were scheduled to perform the majority of complex exams earlier in the day, when the users may be the most fresh.
  • the scheduling conditions applied by the optimization system 118 can be user-specific, such that scheduling conditions may be different from one user to the next. For example, some users may prefer to perform the most complex exams earlier in the day, while others may prefer an even distribution of complex exams throughout the day to maximize the recovery time therebetween. These preferences can be stored and factored into the methodology applied by the optimization system 118.
  • the conditions applied by the optimization system 118 for a particular user may be defined by a user, such as a lab manager responsible for exam assignments. Users who exhibit reduced performance efficiency later in the day may be assigned to higher-complexity exams in the morning, for example.
  • the optimization system 118 can apply conditions requiring more complex exams to be separated by the maximum length of time and further requiring that each user be assigned exams with fair hand motion complexity, for example having an average rating of 3.
  • the optimization system 118 can require an even distribution of complex exams across a group of users operating in the same facility. Together with the GUI 114, the optimization system 118 can thus ensure a fair and well-balanced assignment of exams to users in a manner that minimizes the risk of hand injury caused by the continuous and/or consecutive assignment of prolonged and complex exams to certain users.
  • the optimization system 118 can be configured to recognize and/or parse free-form text and utilize the recognized text to distribute the exams accordingly.
  • words or phrases reflecting the anticipated duration, difficulty, and/or hand usage(s) can be identified and used to rank a set of scheduled exams from least complex to most complex. The ranking can then be utilized by the optimization system 118 to distribute the exams in accordance with a particular set of predefined conditions.
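The single-day arrangement described above (the two most complex exams placed first and last, remaining exams front-loaded by descending complexity) can be sketched as a simple ordering rule. The function name and the specific condition set are illustrative assumptions; the optimization system may apply different or additional conditions per user:

```python
def distribute_exams(exams):
    """Given (exam_id, complexity) pairs for one user and one day, place the two
    most complex exams first and last, and front-load the remainder by
    descending complexity."""
    ranked = sorted(exams, key=lambda e: e[1], reverse=True)
    if len(ranked) < 2:
        return ranked
    # Highest-rated exam opens the day, second-highest closes it;
    # the rest are scheduled earlier-to-later in descending complexity.
    return [ranked[0]] + ranked[2:] + [ranked[1]]

day = [("thyroid", 3), ("cardiac", 5), ("abdominal", 4), ("ob", 5), ("msk", 2)]
print([c for _, c in distribute_exams(day)])  # [5, 4, 3, 2, 5]
```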
  • the user exam schedule 116 illustrated in FIG. 1B is a daily schedule.
  • the exam schedule can span only a portion of one day or more than one day, e.g., one or more weeks or months.
  • Embodiments can generate exam schedules for any future time point, such as a future day or week. Accordingly, the time period 117 defined by a user for a particular user exam schedule may vary.
  • the GUI 114 can thus generate customized user exam schedules based on a variety of factors and display the schedules directly to a user such as a sonographer and/or a lab manager overseeing the user’s work assignments and performance.
  • the GUI 114 can also receive free text and/or search entries corresponding to specific users.
  • the schedule 116 can also feature a selectable date range over which a user schedule spans.
  • the GUI 114 can be physically and/or communicatively coupled to one or more underlying processors 120 configured to generate and modify the displayed graphics in response to different user inputs received at the GUI 114, for example regarding the users included in a given schedule or the date range encompassed by a given schedule.
  • One or more of the processors 120 can be configured to operate an intelligence system, such as neural network 200.
  • User data and scheduling data acquired over time can be stored in one or more data storage devices 122, which may be coupled with the GUI 114 and/or the one or more processors 120.
  • the GUI 114 can be configured to identify, extract, and/or receive the stored data required to generate a particular user exam schedule 116 in accordance with user-specified parameters received at the GUI 114.
  • the GUI 114 and one or more processors 120 coupled therewith can mine the data storage device(s) 122 for one or more datasets regarding the exams previously and/or currently scheduled for one or more users.
  • the GUI 114 can also mine the data storage device(s) 122 for one or more datasets corresponding to the current skill level of a user added to a schedule by a user.
  • the GUI 114 can also mine the data storage device(s) 122 for one or more datasets corresponding to the physical attributes and clinical history of a scheduled patient, along with the expected scanning and non-scanning hand usages associated with a particular exam based on its surrounding clinical context.
  • Each user inquiry 119 received at the GUI 114 can initiate the exam complexity prediction system 100 and subsequent implementation of the optimization system 118, thereby automatically generating and/or modifying the user exam schedule 116.
  • the GUI 114 can thus be configured to initiate and/or perform a variety of data extractions and analyses, receive data corresponding to such extractions and analyses, and generate graphical displays in the form of a user schedule 116 customized in view of the same.
  • the GUI 114 can also be configured to generate and/or modify the graphics displayed on the user exam schedule 116 in accordance with additional user inputs. For example, the GUI 114 can automatically rearrange the exam schedule for one or more users upon receiving a patient cancellation and/or upon the addition or removal of one or more users from an existing schedule.
  • FIG. 2 is a depiction of a neural network 200 that may be trained and implemented to generate expected exam complexity ratings based on expected scanning hand usage, non-scanning hand usage, user skill level, and patient body type.
  • the neural network 200 may include an input layer 202 configured to receive inputs including but not limited to an expected scanning hand usage 202a, an expected non-scanning hand usage 202b, and patient- and user-specific information 202c.
  • the number of nodes or neurons in the input layer 202 may vary, and in some embodiments may equal the number of input categories, or the number of input categories plus one.
  • the neural network 200 can be trained to receive the inputs 202a/b/c and generate an expected complexity rating based on the ground truth rating of experienced users provided after performing an exam.
  • Embodiments of the neural network 200 may be configured to implement an algorithmic regressive prediction model.
  • the output layer or node 204 of the neural network 200 can provide the expected complexity rating, which can be numerical, letter-based, and/or qualitative text-based. Like the input layer 202, the number of neurons in the output layer 204 may vary. For example, the output layer 204 may include one total neuron or one neuron for each factor constituting the overall complexity rating, such as non-scanning hand complexity.
  • Operating between the input layer 202 and the output layer 204 can be one or more hidden layers 206 configured to assign and optimize weights associated with the inputs 202a/b/c, for example via backpropagation, and apply the weighted inputs to an activation function, e.g., the rectified linear activation function.
  • the number of hidden layers 206 and the number of neurons present therein may also vary. In some embodiments, the number of hidden neurons in a given hidden layer may equal the product of the total number of input neurons and output neurons, together multiplied by the number of datasets used to train the network.
  • the particular neural network(s) implemented in accordance with the disclosed embodiments may vary. The number of neurons may change, for example, along with their arrangement within the network. The size, width, depth, capacity, and/or architecture of the network may vary.
  • the neural network 200 may be hardware- (e.g., neurons are represented by physical components) or software-based (e.g., neurons and pathways implemented in a software application), and can use a variety of topologies and learning algorithms for training the neural network to produce the desired output.
  • a software-based neural network may be implemented using a processor (e.g., single- or multi-core CPU, a single GPU or GPU cluster, or multiple processors arranged for parallel-processing) configured to execute instructions, which may be stored in a computer-readable medium, and which when executed cause the processor to perform a machine-trained algorithm for receiving input(s) 202a/b/c and generating an expected complexity rating 204 in view of the same.
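A minimal forward-pass sketch of such a software-based network is shown below. The weights are random and untrained, and the layer sizes and [0, 1] feature normalization are illustrative assumptions; a deployed system would learn the weights from the training data described herein, e.g., via backpropagation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Input features: expected scanning hand usage, expected non-scanning hand usage,
# user skill level, and patient difficulty, each assumed pre-normalized to [0, 1].
W1 = rng.normal(size=(4, 8))   # input layer -> hidden layer
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # hidden layer -> single output node
b2 = np.zeros(1)

def predict_complexity(x):
    """Forward pass: rectified linear hidden layer, sigmoid output rescaled to 1-5."""
    h = np.maximum(0.0, x @ W1 + b1)          # rectified linear activation
    y = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # squash to (0, 1)
    return float(1.0 + 4.0 * y[0])            # map onto the 1-5 complexity scale

rating = predict_complexity(np.array([0.7, 0.4, 0.5, 0.8]))
print(round(rating, 2))
```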
  • the neural network 200 may be implemented, at least in part, in a computer-readable medium including executable instructions, which when executed by a processor, may cause the processor to perform a machine-trained algorithm to output the expected complexity rating(s).
  • the neural network(s) may be trained using any of a variety of currently known or later developed machine learning techniques to obtain a neural network (e.g., a machine-trained algorithm or hardware-based system of nodes) configured to analyze input data in the form of patient- and exam-specific information collectively constituting a clinical context.
  • the ground truth used for training the network 200 can include documented complexity ratings provided by expert, experienced, or average users after performing the same or similar exams on similar patients.
  • Supervised learning models can be trained on a comprehensive data set of exam-specific non-scanning hand complexities, scanning hand complexities, and patient- and user-specific information. The accuracy of the neural network can therefore grow stronger over time as more data is input.
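As a toy illustration of the supervised setup described above — feature vectors paired with expert-reported complexity ratings — a nearest-neighbour stand-in is sketched below; the feature values and ratings are entirely hypothetical and this is not the network of FIG. 2:

```python
from math import dist

# Toy training set: (scanning usage, non-scanning usage, user skill) -> expert rating
training = [
    ((0.9, 0.8, 1), 5),  # high hand usage, novice user -> difficult
    ((0.2, 0.1, 4), 1),  # low hand usage, expert user -> easy
    ((0.5, 0.5, 2), 3),  # moderate usage and skill -> moderate
]

def predict(features):
    """Return the expert rating of the most similar historical exam."""
    return min(training, key=lambda t: dist(t[0], features))[1]
```

As more (features, rating) pairs accumulate in the training set, predictions for unseen exams become more reliable, which mirrors the growth-over-time behavior described above.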
  • the model may be (partially) realized as an Al-based learning network.
  • the computer-implemented techniques utilized to generate the expected complexity ratings may vary, and may involve artificial intelligence-based processing, e.g., sophisticated supervised machine learning.
  • the neural network 200 can also be coupled to a training database 208.
  • the training database 208 may provide a large sample of non-scanning hand usages, scanning hand usages, and patient- and user-specific datasets used to train the neural network.
  • Communication between the training database 208 and the neural network 200 can be bidirectional, such that the training database 208 may provide sample inputs and associated complexity ratings provided by experienced users to the network for training purposes, and the neural network 200 can transmit new training datasets for storage in the training database 208, thereby increasing the sample size of datasets embodying predicted scanning hand usage, non-scanning hand usage, and patient- and user- specific information, all paired with complexity outputs, to further refine future output from the neural network.
  • neural networks, e.g., network 200, may be utilized to generate expected complexity ratings
  • embodiments are not confined to neural networks, and a number of additional and alternative intelligence systems or components may be utilized, such as random forests or support vector machines.
  • exam complexity and the associated hand usages in sonography are not necessarily harmful alone, but frequent repetition or prolonged duration of exposure, compounded with a pace that lacks sufficient time for recovery, can increase the risk of injury significantly. Users who repeatedly perform the same type(s) of exams utilizing the same muscle groups are therefore more susceptible to injury, especially if such exams are moderately or highly complex.
  • the graphical user interface 114 can minimize risk-enhancing activities undertaken by one or more users by distributing ultrasound exams in a manner that reduces the number of complex exams performed by a user and/or distributes the complex exams in a manner that reduces their impact on user exertion.
  • FIG. 3 is a block diagram illustrating an example processor 300 according to principles of the present disclosure.
  • processors utilized to implement the disclosed embodiments may be configured the same as or similar to processor 300.
  • Processor 300 may be used to implement one or more processes described herein.
  • processor 300 may be configured to implement an artificial intelligence system configured to generate predicted complexity ratings, such as neural network 200.
  • the processor 300 can also be configured to receive expected non-scanning hand usages, scanning hand usages, and patient- and user-specific information, and subsequently determine an expected complexity rating corresponding to a scheduled ultrasound exam that is specific to the patient being examined and the user performing the exam.
  • the processor 300 can also be configured to receive multiple complexity ratings and distribute them in a manner that reduces the physical stress sustained by one or more users.
  • the processor 300, or a different processor configured similarly can also be configured as a graphics processor programmed to generate displays for the graphical user interfaces described herein.
  • the processor 300 may be any suitable processor type including, but not limited to, a microprocessor, a microcontroller, a digital signal processor (DSP), a field programmable gate array (FPGA) where the FPGA has been programmed to form a processor, a graphical processing unit (GPU), an application specific integrated circuit (ASIC) where the ASIC has been designed to form a processor, or a combination thereof.
  • the processor 300 may include one or more cores 302.
  • the core 302 may include one or more arithmetic logic units (ALU) 304.
  • the core 302 may include a floating point logic unit (FPLU) 306 and/or a digital signal processing unit (DSPU) 308 in addition to or instead of the ALU 304.
  • the processor 300 may include one or more registers 312 communicatively coupled to the core 302.
  • the registers 312 may be implemented using dedicated logic gate circuits (e.g., flip- flops) and/or any memory technology. In some examples, the registers 312 may be implemented using static memory.
  • the register may provide data, instructions and addresses to the core 302.
  • processor 300 may include one or more levels of cache memory 310 communicatively coupled to the core 302.
  • the cache memory 310 may provide computer-readable instructions to the core 302 for execution.
  • the cache memory 310 may provide data for processing by the core 302.
  • the computer-readable instructions may have been provided to the cache memory 310 by a local memory, for example, local memory attached to the external bus 316.
  • the cache memory 310 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
  • the processor 300 may include a controller 314, which may control input to one or more processors included herein, e.g., processor 120. Controller 314 may control the data paths in the ALU 304, FPLU 306 and/or DSPU 308. Controller 314 may be implemented as one or more state machines, data paths and/or dedicated control logic. The gates of controller 314 may be implemented as standalone gates, FPGA, ASIC or any other suitable technology.
  • the registers 312 and the cache memory 310 may communicate with controller 314 and core 302 via internal connections 320A, 320B, 320C and 320D.
  • Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology.
  • Inputs and outputs for the processor 300 may be provided via a bus 316, which may include one or more conductive lines.
  • the bus 316 may be communicatively coupled to one or more components of processor 300, for example the controller 314, cache 310, and/or register 312.
  • the bus 316 may be coupled to one or more components of the system.
  • the bus 316 may be coupled to one or more external memories.
  • the external memories may include Read Only Memory (ROM) 332.
  • ROM 332 may be a masked ROM, Electronically Programmable Read Only Memory (EPROM) or any other suitable technology.
  • the external memory may include Random Access Memory (RAM) 333.
  • RAM 333 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology.
  • the external memory may include Electrically Erasable Programmable Read Only Memory (EEPROM) 335.
  • the external memory may include Flash memory 334.
  • the external memory may include a magnetic storage device such as disc 336.
  • FIG. 4 is a flow diagram of a method 400 of predicting ultrasound exam complexities and generating optimized exam schedules in view of the same.
  • the example method 400 shows the steps that may be utilized, in any sequence, by the systems, processors, graphical user interfaces, and/or apparatuses described herein.
  • although examples of the present system have been illustrated with particular reference to ultrasound imaging modalities, the present system can be extended to other medical imaging systems where one or more images are obtained in a systematic manner.
  • the method 400 may be performed with the aid of one or more imaging systems including but not limited to MRI or CT.
  • the method involves “receiving a user inquiry regarding one or more users and one or more ultrasound exams assigned thereto.”
  • the method 400 involves “obtaining complexity levels corresponding to the ultrasound exams assigned to the users.”
  • the method 400 involves “distributing the ultrasound exams over a time period based on the complexity levels associated therewith.”
  • the method involves “generating a user exam schedule displaying the distributed complexity levels.”
  • the storage media can provide the information and programs to the device, thus enabling the device to perform functions of the systems and/or methods described herein.
  • the computer could receive the information, appropriately configure itself and perform the functions of the various systems and methods outlined in the diagrams and flowcharts above to implement the various functions. That is, the computer could receive various portions of information from the disk relating to different elements of the above-described systems and/or methods, implement the individual systems and/or methods and coordinate the functions of the individual systems and/or methods described above.
  • processors described herein can be implemented in hardware, software, and firmware. Further, the various methods and parameters are included by way of example only and not in any limiting sense. In view of this disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and needed equipment to effect these techniques, while remaining within the scope of the invention.
  • the functionality of one or more of the processors described herein may be incorporated into a fewer number or a single processing unit (e.g., a CPU) and may be implemented using application specific integrated circuits (ASICs) or general purpose processing circuits which are programmed responsive to executable instructions to perform the functions described herein.
  • the present system may also include one or more programs which may be used with conventional imaging systems so that they may provide features and advantages of the present system. Certain additional advantages and features of this disclosure may be apparent to those skilled in the art upon studying the disclosure, or may be experienced by persons employing the novel system and method of the present disclosure. Another advantage of the present systems and method may be that conventional medical image systems can be easily upgraded to incorporate the features and advantages of the present systems, devices, and methods.

Abstract

The present disclosure describes systems configured to minimize physical stress sustained by users by predicting ultrasound exam complexities and assigning exams to users in view of the same. The systems include processors configured to predict ultrasound exam complexity levels based on expected scanning hand usage, non-scanning hand usage, and user skill level. The complexity levels can be predicted by implementing a neural network trained to output complexity levels based on exam- and patient-specific complexity ratings reported by expert users. Systems also include a graphical user interface configured to receive user inquiries regarding particular users and the ultrasound exams assigned thereto. The graphical user interface can distribute the ultrasound exams over various lengths of time based on the predicted complexity levels, and generate a customized user exam schedule displaying the distributed complexity levels for viewing and modification.

Description

USER LOAD BALANCING
TECHNICAL FIELD
[001] The present disclosure pertains to systems and methods for minimizing physical stress sustained by users performing ultrasound exams over various lengths of time. Implementations include systems configured to generate and display user- and patient-specific estimations of ultrasound exam complexity. Systems are further configured to generate and display individualized exam schedules based on the estimated complexities.
BACKGROUND
[002] Work-related musculoskeletal disorders (WRMSDs) are painful conditions caused by workplace activities affecting the muscles, ligaments, and tendons. Aside from interfering with workers’ ability to perform work-related tasks, WRMSDs often impose a substantial personal toll on those affected since they may no longer be able to perform personal tasks and routine activities of daily living. Unlike acute injuries, WRMSDs develop gradually over time from repeated exposure to a variety of risk factors and may be painful during work and even at rest.
[003] Ultrasound users develop WRMSDs at an especially high rate. Despite improvements in the flexibility of ultrasound systems and exam tables, it has been reported that about 90% of clinical users experience symptoms of WRMSDs, 20% of whom suffer career-ending injuries. Studies have also shown that 60% of sonographers experience wrist/hand/finger discomfort in scanning and non-scanning hands caused by WRMSDs. Ultrasound users such as sonographers who perform the same type(s) of exams repeatedly, or exams using similar muscle activity, have increased exposure to risk factors associated with repetition. These findings highlight the extent and severity of injuries commonly experienced by frequent users of sonographic equipment such as sonographers, which also limit patient access for those in need of ultrasound examination.
[004] Improved technologies are therefore needed to reduce the prevalence of WRMSDs caused by injuries to users like sonographers.
SUMMARY
[005] The present disclosure describes systems and methods for monitoring, predicting, and reducing the physical stress sustained by ultrasound users, such as sonographers. During an ultrasound exam, a user's scanning hand moves an ultrasound probe over the target area of a patient while the non-scanning hand engages with a control panel of the ultrasound system. User injuries are often caused by prolonged, repetitious overuse of the scanning hand, the non-scanning hand, and/or one or both shoulders. Systems disclosed herein can reduce the risk of injury by balancing the workload placed on the hands, wrists, arms, and/or shoulders of users. Embodiments can include graphical user interfaces configured to extract and display user-defined datasets reflecting anticipated ultrasound exam complexities specific to one or more users. Together with one or more additional processors, the graphical user interface can also display the exam complexities expected for one or more users over user-defined time frames and optimize exam scheduling in view of the same to reduce the predicted burden placed on user scanning and non-scanning hands, thereby minimizing the likelihood of injury.
[006] In accordance with embodiments disclosed herein, one or more processors can be configured to receive clinical context information input by a user, a lab manager, and/or received from another processor or database. Predicted usages of a user’s scanning and non-scanning hand can also be generated and/or received, along with information regarding the experience and/or skill level of a particular user and additional patient data impacting the complexity of one or more ultrasound exams. The processor(s) can apply an intelligence system, which may include one or more neural networks, to the received information to determine an estimated complexity rating specific to each exam.
[007] In accordance with embodiments of the present disclosure, a user scheduling system can include one or more processors configured to predict one or more ultrasound exam complexity levels, each of the complexity levels corresponding to an ultrasound exam assigned to a user. The system may also include a graphical user interface configured to receive a user inquiry regarding one or more users and one or more ultrasound exams assigned thereto, may distribute the ultrasound exams over a time period based on the complexity levels associated therewith, and may generate a user exam schedule displaying the distributed complexity levels.
[008] In some examples, each of the complexity levels can be based on the expected scanning hand usage required to perform the ultrasound exam, the expected non-scanning hand usage required to perform the ultrasound exam, and the skill level of the user. In some examples, the expected scanning hand usage can include an expected translational and rotational movement of the scanning hand. In some examples, the expected non-scanning hand usage can include a total linear and angular motion of the non-scanning hand and a total number and type of user actions received at a control panel of an ultrasound machine utilized to perform the ultrasound exam. In some examples, each of the complexity levels can be further based on a physical attribute and/or clinical history of a patient to be examined. In some examples, the one or more processors can be configured to predict the complexity levels by applying a neural network to the expected scanning hand usage, the expected non-scanning hand usage, and the skill level of the user.
[009] In some examples, the graphical user interface is further configured to receive at least one user input specifying the time period over which the user exam schedule should span, along with the users to be included in the exam schedule. In some examples, the graphical user interface is configured to distribute the ultrasound exams over the time period by maximizing the time between ultrasound exams having high complexities assigned to each of the users. In some examples, the time period spans one day and the graphical user interface is configured to assign ultrasound exams having high complexities earlier in the day for one or more of the users. In some examples, the graphical user interface is further configured to receive a user input specifying one or more scheduling conditions used to distribute the ultrasound exams over the time period. In some examples, the one or more scheduling conditions are user-specific. In some examples, the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs. The training inputs can include a sample of non-scanning hand usages, scanning hand usages, and user skill levels obtained from previously performed ultrasound exams. The known outputs can include complexity ratings reported by users who performed the previous ultrasound exams.
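The one-day distribution rule described above — assigning high-complexity exams earlier in the day — could be sketched as below; the rating threshold and the (exam, rating) tuple layout are assumptions made for illustration:

```python
def morning_first(exams, threshold=4):
    """One-day scheduling variant from the text: place high-complexity
    exams (rating >= threshold) earliest, easier exams later in the day."""
    high = sorted((e for e in exams if e[1] >= threshold),
                  key=lambda e: e[1], reverse=True)
    low = sorted((e for e in exams if e[1] < threshold),
                 key=lambda e: e[1])
    return high + low
```

A user-specific scheduling condition could be expressed by varying `threshold` per user, e.g., lowering it for a user recovering from wrist discomfort.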
[010] In accordance with embodiments of the present disclosure, a method of generating and displaying a user exam schedule can involve receiving a user inquiry regarding one or more users and one or more ultrasound exams assigned thereto, obtaining complexity levels corresponding to the ultrasound exams assigned to the users, distributing the ultrasound exams over a time period based on the complexity levels associated therewith, and generating a user exam schedule displaying the distributed complexity levels.
[011] In some examples, the method further involves receiving a user input specifying the time period and the users included in the user exam schedule. In some examples, the method further involves receiving a user input specifying one or more scheduling conditions used to distribute the ultrasound exams over the time period. In some examples, distributing the ultrasound exams over the time period comprises maximizing a time between ultrasound exams having high complexities assigned to each of the one or more users. In some examples, the time period spans one day and distributing the ultrasound exams comprises assigning ultrasound exams having high complexities earlier in the day for one or more of the users.
[012] In some examples, each of the complexity levels can be based on the expected scanning hand usage required to perform one of the ultrasound exams, the expected non-scanning hand usage required to perform one of the ultrasound exams, and the skill level of the user. In some examples, the complexity levels are predicted by applying a neural network to the expected scanning hand usage, the expected non-scanning hand usage, and the skill level of the user.
[013] Any of the methods described herein, or steps thereof, may be embodied in a non-transitory computer-readable medium comprising executable instructions, which when executed may cause one or more hardware processors to perform the method or steps embodied herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[014] FIG. 1A is a schematic overview of an exam complexity prediction system implemented in accordance with embodiments of the present disclosure.
[015] FIG. 1B is a schematic of a graphical user interface displaying a customized user exam schedule in accordance with embodiments of the present disclosure.
[016] FIG. 2 is a schematic of a neural network implemented to determine expected exam complexities in accordance with embodiments of the present disclosure.
[017] FIG. 3 is a schematic of a processor utilized in accordance with embodiments of the present disclosure.
[018] FIG. 4 is a flow diagram of a method performed in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[019] The following description of certain embodiments is merely exemplary in nature and is in no way intended to limit the invention or its applications or uses. In the following detailed description of embodiments of the present systems and methods, reference is made to the accompanying drawings, which form a part hereof and in which are shown, by way of illustration, specific embodiments in which the described systems and methods may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the presently disclosed systems and methods, and it is to be understood that other embodiments may be utilized and that structural and logical changes may be made without departing from the spirit and scope of the present system. Moreover, for the purpose of clarity, detailed descriptions of certain features will not be discussed when they would be apparent to those with skill in the art so as not to obscure the description of the present system. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present system is defined only by the appended claims.
[020] The disclosed systems and methods overcome the lack of intelligent, systematic tools for monitoring, determining, and ultimately improving the physical health of medical users. Embodiments of the disclosed systems are configured to utilize expected usage levels of a user’s scanning and non-scanning hands required to perform a variety of scheduled ultrasound exams. Systems are also configured to utilize a user’s skill level and the body type of each patient to generate an individualized complexity rating for each forthcoming ultrasound exam. Predicted complexity ratings can be determined for multiple exams scheduled for multiple users over adjustable time frames. Using the complexity ratings, the disclosed systems can optimize each user’s exam schedule in a manner that balances the user’s workload and minimizes the physical stress placed on the scanning hand, non-scanning hand, and/or one or both shoulders.
[021] As used herein, the terms “usage” and “activity” may be used interchangeably and may include all motions, movements, and activities of a user’s scanning hand, non-scanning hand, and one or both shoulders. Scanning hand usage can be based on the total translational and/or rotational movement of a user’s scanning hand necessary to perform a given exam. Non-scanning hand usage can encompass a user’s engagement with the controls on (or displayed on) the control panel of an ultrasound system. Such engagement can include button pushes, knob rotations, and the lateral/longitudinal movement of the non-scanning hand required to accomplish the same. Scanning and non-scanning hand usages can be directly related and/or used interchangeably with scanning and non-scanning hand “complexity.”
[022] As used herein, “total linear hand motion” encompasses the total translational hand distance traveled by the scanning hand in the X, Y, and Z directions during an ultrasound exam. “Total rotational hand motion” encompasses the total rotational hand movement of the scanning hand about the X, Y, and Z axes during an exam.
[023] As used herein, “clinical context” and “upstream clinical context” may be used interchangeably and may include or be based on the type of exam being performed, e.g., pulmonary exam or cardiac exam, as well as patient-specific information, non-limiting examples of which may include patient weight, height, body mass index (BMI), age, underlying health condition(s), clinical history, and combinations thereof. Additional information that may be included within a given clinical context can include the reason(s) for performing a particular exam, e.g., to evaluate left ventricular function, along with the patient’s length of stay, e.g., inpatient and/or outpatient time. Information constituting the clinical context can be combined and/or displayed on the user interface in various ways. For example, the clinical context can include categorical descriptors of a patient’s body type and/or associated levels of exam difficulty, such as “easy,” “moderate,” or “difficult.” The clinical information can also include the particular ultrasound model being used to perform the exam, as the hand and body motion required to perform an ultrasound exam may differ for different models depending, for example, on the layout of the associated control panel(s).
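The “total linear hand motion” defined above reduces to a path-length computation over tracked 3-D hand positions; the sample coordinates below are hypothetical tracker readings:

```python
from math import dist

def total_linear_motion(positions):
    """Sum the straight-line translational distance between consecutive
    tracked (x, y, z) hand positions recorded during an exam."""
    return sum(dist(a, b) for a, b in zip(positions, positions[1:]))

# Two straight moves, (0,0,0)->(3,4,0) then (3,4,0)->(3,4,12),
# cover 5 + 12 = 17 distance units in total.
```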
[024] As used herein, the “health status” of a user’s scanning hand, non-scanning hand, and one or both shoulders can encompass the health of various anatomical parts associated with, including, or attached to the user’s shoulder(s), elbow(s), forearm(s), wrist(s), hand(s), finger(s), or combinations thereof.
[025] As used herein, “expert” or “experienced” users may include certified users such as certified sonographers having at least a certain length of experience, such as at least one year, two years, three years, four years, five years, or longer. “Expert” or “experienced” users such as sonographers may also include users such as sonographers who have received or attained a form of recognized achievement or certification.
[026] Various ultrasound-based exams are contemplated herein, non-limiting examples of which include diagnostic imaging, cardiac imaging, vascular imaging, lung imaging, and combinations thereof. The particular exam being performed likely impacts the usage of a user’s scanning hand, non-scanning hand, and/or one or both shoulders, especially if an exam requires the acquisition of a greater number of images at a variety of depths and/or angles. For example, certain cardiac scans can be lengthy and could add burden on scanning hands, while certain OB/GYN exams can place an extra burden on non-scanning hands due to the biometric measurements required to assess the health of the fetus.
[027] FIG. 1A depicts an overview of an exam complexity prediction system 100 implemented in accordance with embodiments described herein. The figures, including FIG. 1A, may reference users of ultrasound such as a sonographer; however, the techniques disclosed here are not limited to sonographers alone and include many different types of users of ultrasound technology. As shown, the inputs 102 utilized by the system 100 for a particular exam can include an expected non-scanning hand complexity 104 and an expected scanning hand complexity 106, along with patient and user information 108. The inputs 102 can be received by a neural network 110 or other intelligence system configured to generate an expected complexity rating 112 for a forthcoming ultrasound exam that is specific to the patient being examined and the user scheduled to perform the exam. By determining a complexity rating corresponding to each exam scheduled to be performed by one or more users over a defined period, the output(s) 112 generated by the neural network 110 can be used by an intelligent scheduling system to optimize user scheduling, as further shown in FIG. 1B. The neural network 110 can be configured to receive inputs 102 and generate outputs 112 successively.
Additionally or alternatively, multiple neural networks 110 can be configured to operate in parallel to process sets of inputs 102 simultaneously.
[028] The expected non-scanning hand complexity 104 and expected scanning hand complexity 106 can include or be directly related to expected non-scanning hand usage and expected scanning hand usage, respectively. The expected non-scanning hand complexity 104 and expected scanning hand complexity 106 can be predicted by one or more artificial intelligence systems, e.g., one or more neural networks, trained to predict non-scanning hand and scanning hand usage levels for a variety of exams based on the clinical context relevant to each exam. Among other factors, clinical context typically includes the particular exam performed, the ultrasound machinery utilized, and various physical attributes of the current patient and his/her medical history. The clinical context thus directly impacts the expected non-scanning hand usage and scanning hand usage for each exam.
[029] The expected non-scanning hand complexity 104 for each ultrasound exam can be provided in the form of predicted service log file output for each ultrasound exam. In embodiments, the expected service log file output can include an expected log file imaging workflow sequence, which can include a total number and chronology of button pushes and knob rotations received by an ultrasound machine control panel, along with the total linear and angular movement required to implement such button pushes and knob rotations. The predicted linear and angular movement of the non-scanning hand can be determined based on the layout of the control panel used for each exam, which can also be included in the exam-specific clinical context. For example, the particular distance between each of the controls, along with the angles between such controls, often varies between control panels included in different ultrasound machines. The sensitivity of any rotatable knobs, which may impact the degree of angular motion required to implement desired imaging adjustments, can also vary from one control panel to the next.
[030] The neural network or other intelligence system implemented to generate expected non-scanning hand usages can be trained to receive a clinical context for each planned exam and generate an expected non-scanning hand usage output based on the ground truth usage of experienced users. In some examples, the expected usage of the non-scanning hand for an ultrasound exam can be parsed into individual activities, e.g., total knob usage, button pushes, linear hand movement, and/or angular hand movement. Additionally or alternatively, systems disclosed herein can combine data regarding expected non-scanning hand activities into an overall complexity rating 104 specific to the non-scanning hand. Generally, the expected non-scanning hand complexity 104 will increase as the expected button pushes, knob usage, linear movement, and/or angular movement increase.
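A roll-up of individual non-scanning hand activities into a single complexity rating might look like the sketch below; the weights, normalizing constants, and the mapping onto a 1-to-5 scale are illustrative assumptions, not values from the disclosure:

```python
def nonscanning_complexity(buttons, knob_turns, linear_mm, angular_deg,
                           weights=(0.3, 0.3, 0.2, 0.2)):
    """Weighted roll-up of the non-scanning-hand activities named in the
    text (button pushes, knob rotations, linear and angular movement).
    Normalizers cap each activity's contribution at 1.0."""
    parts = (buttons / 100, knob_turns / 50, linear_mm / 1000, angular_deg / 360)
    score = sum(w * min(p, 1.0) for w, p in zip(weights, parts))
    return round(1 + 4 * score)  # map the 0-1 score onto a 1-to-5 rating
```

As the text states, the rating grows monotonically with each expected activity until the normalizer caps are reached.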
[031] The expected scanning hand complexity 106 can include the translational and/or rotational movement of the scanning hand predicted for any given ultrasound exam. In embodiments, the expected translational and rotational movement can be embodied in an expected electromagnetic tracking system output indicating the total amount and/or degree of translational and/or rotational scanning hand movement required for a particular ultrasound exam. An electromagnetic tracking system can include an electromagnetic tracking device, e.g., a device or sensor integrated within or attached to an imaging device such as an ultrasound probe, and an external device configured to emit a low-intensity electromagnetic field through which the electromagnetic tracking device passes during an exam. Expected output of an electromagnetic tracking system can thus be based on anticipated movement of the electromagnetic tracking device during an ultrasound exam.
[032] The neural network or other intelligence system implemented to generate expected scanning hand usages can be trained to receive a clinical context for each planned exam and generate an expected scanning hand usage output based on the ground truth usage of experienced users previously measured by an electromagnetic tracking system. In some examples, the expected usage of the scanning hand for an ultrasound exam can be parsed into individual activities, e.g., translational movement and rotational movement. Additionally or alternatively, systems can combine data regarding expected activities into an overall complexity rating 106 specific to the scanning hand. Generally, the expected scanning hand complexity 106 will increase as the expected translational and/or rotational movement also increase.
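As a hedged illustration of how tracked scanning-hand movement might be totaled, the sketch below sums translational and rotational movement between consecutive tracker samples. The sample format (x, y, z position in millimeters plus a single yaw angle in degrees) is a simplifying assumption; a real electromagnetic tracking system reports a full 3D orientation.

```python
# Illustrative sketch: totaling scanning-hand movement from a sequence of
# electromagnetic-tracker samples. Each sample is assumed to be
# (x_mm, y_mm, z_mm, yaw_deg); real trackers report richer orientations.
import math

def total_movement(samples):
    """Sum translational (mm) and rotational (deg) movement across samples."""
    translation = rotation = 0.0
    for (x0, y0, z0, a0), (x1, y1, z1, a1) in zip(samples, samples[1:]):
        translation += math.dist((x0, y0, z0), (x1, y1, z1))
        rotation += abs(a1 - a0)
    return translation, rotation

path = [(0, 0, 0, 0), (3, 4, 0, 15), (3, 4, 12, 15)]
trans, rot = total_movement(path)  # trans = 5.0 + 12.0 = 17.0, rot = 15.0
```

Totals of this kind could serve as the per-exam ground truth against which a predictive model is trained.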
[033] As further shown, patient and user information 108 can also be input into the neural network 110. Such input 108 can include the skill level, e.g., novice, junior, intermediate, and/or expert, of a user scheduled to perform an ultrasound exam. User skill level can be based on years of experience, the number of exams performed, and/or the variety of exams performed. Patient information included in the input 108 can reflect the complexity, e.g., easy, moderate, or difficult, associated with imaging a patient. This information can depend at least in part on a patient’s physical attributes, e.g., height, weight, and/or BMI, such that a patient having a higher BMI, for example, may be more difficult to examine than a patient having a lower BMI. A patient’s clinical history, which may include underlying medical conditions and/or the reason for performing a particular exam, can also be included in input 108.
[034] In embodiments, the estimated complexity rating 112 generated by the neural network 110 can include a numerical rating ranging from 1 to 5, with 5 indicating high complexity (or “difficult” rating) and 1 indicating low complexity (or “easy” rating). The complexity scale can vary, ranging anywhere from 1 to 10, 1 to 50, or 1 to 100, for example. Additional embodiments can include letter ratings, color ratings, and/or qualitative ratings. Qualitative complexity ratings can include a short description of the expected complexity, which may include predicted complexities associated with usage of the scanning and/or non-scanning hand. The disclosed embodiments are not limited to a particular rating format.
[035] FIG. 1B shows a graphical user interface (GUI) 114 (which may also be referred to as a
“dashboard” or “panel”) configured to display a user exam schedule 116 determined by an optimization system 118, which may include one or more underlying processors that form a component of, are included within, or are communicatively coupled with the GUI 114. The optimization system 118 can receive complexity ratings 112 generated by the neural network 110, each complexity rating corresponding to an ultrasound exam scheduled to be performed on a specific patient by a particular user. The optimization system 118 can distribute the exam assignments based on their associated complexity ratings 112 in a manner that balances the physical stresses sustained by each user over a particular time frame spanning one or more days, weeks, or months.
[036] The methodology implemented by the optimization system 118 for distributing exam assignments can vary and may be different for different users. In some examples, the optimization system 118 can be configured to maximize the time between high-complexity exams. In FIG. 1B, for instance, the user exam schedule 116 for a single day was generated by an optimization system 118 configured to maximize the separation between the most complex exams, such that the two exams having a complexity rating of 5 were arranged first and last in the day for User A. The same methodology was applied to the exam assignments for User B and User C. Intervening exams can be arranged such that the most complex exams are evenly distributed, for example as shown for User A. For Users B and C, the exams having the second-highest complexity ratings were front-loaded such that the users were scheduled to perform the majority of complex exams earlier in the day, when the users may be freshest.
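A minimal sketch of one possible separation strategy of the kind just described follows: the two most complex exams are placed in the first and last slots, and the interior is filled by alternating low and high complexities so that high ratings stay spread out. This is one illustrative heuristic, not the disclosed optimization algorithm.

```python
# Hedged sketch: order a day's exam complexity ratings so the two most
# complex exams bookend the day and the remainder alternate to avoid
# clustering. Assumes at least two exams are scheduled.

def separate_complex(ratings):
    """Order complexity ratings to separate the most complex exams."""
    ordered = sorted(ratings, reverse=True)
    first, last = ordered[0], ordered[1]   # most complex go first and last
    rest = ordered[2:]
    middle, low, high = [], 0, len(rest) - 1
    while low <= high:                      # alternate lowest/highest
        middle.append(rest[high]); high -= 1
        if low <= high:
            middle.append(rest[low]); low += 1
    return [first] + middle + [last]

day = separate_complex([5, 2, 5, 3, 1, 4])  # e.g. [5, 1, 4, 2, 3, 5]
```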
[037] As noted above, the scheduling conditions applied by the optimization system 118 can be user-specific, such that scheduling conditions may be different from one user to the next. For example, some users may prefer to perform the most complex exams earlier in the day, while others may prefer an even distribution of complex exams throughout the day to maximize the recovery time therebetween. These preferences can be stored and factored into the methodology applied by the optimization system 118. In some examples, the conditions applied by the optimization system 118 for a particular user may be defined by a user, such as a lab manager responsible for exam assignments. Users who exhibit reduced performance efficiency later in the day may be assigned to higher-complexity exams in the morning, for example. In one embodiment, the optimization system 118 can apply conditions requiring more complex exams to be separated by the maximum length of time and further requiring that each user be assigned exams with fair hand motion complexity, for example having an average rating of 3. In addition or alternatively, the optimization system 118 can require an even distribution of complex exams across a group of users operating in the same facility. Together with the GUI 114, the optimization system 118 can thus ensure a fair and well-balanced assignment of exams to users in a manner that minimizes the risk of hand injury caused by the continuous and/or consecutive assignment of prolonged and complex exams to certain users.

[038] In embodiments featuring qualitative complexity ratings 112, the optimization system 118 can be configured to recognize and/or parse free-form text and utilize the recognized text to distribute the exams accordingly. For example, words or phrases reflecting the anticipated duration, difficulty, and/or hand usage(s) can be identified and used to rank a set of scheduled exams from least complex to most complex.
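The condition of distributing complex exams evenly across a group of users can be illustrated with a simple greedy heuristic: exams are taken in descending complexity and each is assigned to whichever user currently carries the lowest total. The function name and the greedy strategy are illustrative assumptions; the disclosed optimization system may balance workloads differently.

```python
# Hedged sketch of one balancing condition: greedily assign exams
# (highest complexity first) to the user with the lowest running total,
# keeping per-user cumulative complexity roughly even across a facility.

def balance_exams(ratings, users):
    """Distribute complexity ratings across users to even out workloads."""
    loads = {u: [] for u in users}
    for r in sorted(ratings, reverse=True):
        lightest = min(users, key=lambda u: sum(loads[u]))
        loads[lightest].append(r)
    return loads

loads = balance_exams([5, 5, 4, 3, 3, 2, 2, 1], ["A", "B", "C", "D"])
```

With the sample ratings above, every user's cumulative complexity ends within one point of every other user's.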
The ranking can then be utilized by the optimization system 118 to distribute the exams in accordance with a particular set of predefined conditions.
[039] The user exam schedule 116 illustrated in FIG. 1B is a daily schedule. In additional embodiments, the exam schedule can span only a portion of one day or more than one day, e.g., one or more weeks or months. Embodiments can generate exam schedules for any future time point, such as a future day or week. Accordingly, the time period 117 defined by a user for a particular user exam schedule may vary.
[040] The GUI 114 can thus generate customized user exam schedules based on a variety of factors and display the schedules directly to a user such as a sonographer and/or a lab manager overseeing the user’s work assignments and performance. To add users to the schedule 116, the GUI 114 can also receive free text and/or search entries corresponding to specific users. The schedule 116 can also feature a selectable date range over which a user schedule spans.
[041] The GUI 114 can be physically and/or communicatively coupled to one or more underlying processors 120 configured to generate and modify the displayed graphics in response to different user inputs received at the GUI 114, for example regarding the users included in a given schedule or the date range encompassed by a given schedule. One or more of the processors 120 can be configured to operate an intelligence system, such as neural network 200. User data and scheduling data acquired over time can be stored in one or more data storage devices 122, which may be coupled with the GUI 114 and/or the one or more processors 120. The GUI 114 can be configured to identify, extract, and/or receive the stored data required to generate a particular user exam schedule 116 in accordance with user-specified parameters received at the GUI 114. For example, the GUI 114 and one or more processors 120 coupled therewith can mine the data storage device(s) 122 for one or more datasets regarding the exams previously and/or currently scheduled for one or more users. The GUI 114 can also mine the data storage device(s) 122 for one or more datasets corresponding to the current skill level of a user added to a schedule by a user. The GUI 114 can also mine the data storage device(s) 122 for one or more datasets corresponding to the physical attributes and clinical history of a scheduled patient, along with the expected scanning and non-scanning hand usages associated with a particular exam based on its surrounding clinical context. Each user inquiry 119 received at the GUI 114 can initiate the exam complexity prediction system 100 and subsequent implementation of the optimization system 118, thereby automatically generating and/or modifying the user exam schedule 116. 
The GUI 114 can thus be configured to initiate and/or perform a variety of data extractions and analyses, receive data corresponding to such extractions and analyses, and generate graphical displays in the form of a user exam schedule 116 customized in view of the same. The GUI 114 can also be configured to generate and/or modify the graphics displayed on the user exam schedule 116 in accordance with additional user inputs. For example, the GUI 114 can automatically rearrange the exam schedule for one or more users upon receiving a patient cancellation and/or upon the addition or removal of one or more users from an existing schedule.
[042] FIG. 2 is a depiction of a neural network 200 that may be trained and implemented to generate expected exam complexity ratings based on expected scanning hand usage, non-scanning hand usage, user skill level, and patient body type. As shown, the neural network 200 may include an input layer 202 configured to receive inputs including but not limited to an expected scanning hand usage 202a, an expected non-scanning hand usage 202b, and patient- and user-specific information 202c. The number of nodes or neurons in the input layer 202 may vary, and in some embodiments may equal the number of input categories, or the number of input categories plus one. The neural network 200 can be trained to receive the inputs 202a/b/c and generate an expected complexity rating based on the ground truth rating of experienced users provided after performing an exam. Embodiments of the neural network 200 may be configured to implement an algorithmic regressive prediction model.
[043] The output layer or node 204 of the neural network 200 can provide the expected complexity rating, which can be numerical, letter-based, and/or qualitative text-based. Like the input layer 202, the number of neurons in the output layer 204 may vary. For example, the output layer 204 may include one total neuron or one neuron for each factor constituting the overall complexity rating, such as non-scanning hand complexity.
[044] Operating between the input layer 202 and the output layer 204 can be one or more hidden layers 206 configured to assign and optimize weights associated with the inputs 202a/b/c, for example via backpropagation, and apply the weighted inputs to an activation function, e.g., the rectified linear activation function. The number of hidden layers 206 and the number of neurons present therein may also vary. In some embodiments, the number of hidden neurons in a given hidden layer may equal the product of the total number of input neurons and output neurons, together multiplied by the number of datasets used to train the network. The particular neural network(s) implemented in accordance with the disclosed embodiments may vary. The number of neurons may change, for example, along with their arrangement within the network. The size, width, depth, capacity, and/or architecture of the network may vary.
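A toy forward pass through a network of the general shape described above, with three inputs (scanning-hand usage, non-scanning-hand usage, and a combined patient/user score), one ReLU hidden layer, and a single output neuron, is sketched below. All weights are random placeholders rather than trained values, and the clipping of the output to the 1-to-5 rating scale is an illustrative assumption.

```python
# Minimal sketch of network 200's shape: 3 inputs -> ReLU hidden layer ->
# 1 output (complexity rating). Weights are untrained placeholders.
import numpy as np

rng = np.random.default_rng(seed=0)
W1 = rng.normal(size=(3, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def predict_complexity(x):
    """Forward pass: ReLU hidden layer, linear output, clipped to 1-5."""
    h = np.maximum(0, x @ W1 + b1)        # rectified linear activation
    raw = (h @ W2 + b2).item()
    return float(np.clip(raw, 1.0, 5.0))  # keep within the rating scale

rating = predict_complexity(np.array([0.7, 0.4, 0.9]))
```

In a trained embodiment, backpropagation would adjust `W1`, `b1`, `W2`, and `b2` against the ground-truth ratings supplied by experienced users.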
[045] The neural network 200 may be hardware- (e.g., neurons are represented by physical components) or software-based (e.g., neurons and pathways implemented in a software application), and can use a variety of topologies and learning algorithms for training the neural network to produce the desired output. For example, a software-based neural network may be implemented using a processor (e.g., single- or multi-core CPU, a single GPU or GPU cluster, or multiple processors arranged for parallel-processing) configured to execute instructions, which may be stored in a computer-readable medium, and which when executed cause the processor to perform a machine-trained algorithm for receiving input(s) 202a/b/c and generating an expected complexity rating 204 in view of the same. The neural network 200 may be implemented, at least in part, in a computer-readable medium including executable instructions, which when executed by a processor, may cause the processor to perform a machine-trained algorithm to output the expected complexity rating(s).
[046] In various examples, the neural network(s) may be trained using any of a variety of currently known or later developed machine learning techniques to obtain a neural network (e.g., a machine-trained algorithm or hardware-based system of nodes) configured to analyze input data in the form of patient- and exam-specific information collectively constituting a clinical context. As noted above, the ground truth used for training the network 200 can include documented complexity ratings provided by expert, experienced, or average users after performing the same or similar exams on similar patients. Supervised learning models can be trained on a comprehensive data set of exam-specific non-scanning hand complexities, scanning hand complexities, and patient- and user-specific information. The accuracy of the neural network can therefore grow stronger over time as more data is input. The model may be (partially) realized as an AI-based learning network. The computer-implemented techniques utilized to generate the expected complexity ratings may vary, and may involve artificial intelligence-based processing, e.g., sophisticated supervised machine learning.

[047] The neural network 200 can also be coupled to a training database 208. The training database 208 may provide a large sample of non-scanning hand usages, scanning hand usages, and patient- and user-specific datasets used to train the neural network.
Communication between the training database 208 and the neural network 200 can be bidirectional, such that the training database 208 may provide sample inputs and associated complexity ratings provided by experienced users to the network for training purposes, and the neural network 200 can transmit new training datasets for storage in the training database 208, thereby increasing the sample size of datasets embodying predicted scanning hand usage, non-scanning hand usage, and patient- and user- specific information, all paired with complexity outputs, to further refine future output from the neural network.
[048] While one or more neural networks, e.g., network 200, may be utilized to generate expected complexity ratings, embodiments are not confined to neural networks, and a number of additional and alternative intelligence systems or components may be utilized, such as random forests or support vector machines.
[049] As noted above, exam complexity and the associated hand usages in sonography are not necessarily harmful alone, but frequent repetition or prolonged duration of exposure, compounded with a pace that lacks sufficient time for recovery, can increase the risk of injury significantly. Users who repeatedly perform the same type(s) of exams utilizing the same muscle groups are therefore more susceptible to injury, especially if such exams are moderately or highly complex. The graphical user interface 114 can minimize risk-enhancing activities undertaken by one or more users by distributing ultrasound exams in a manner that reduces the number of complex exams performed by a user and/or distributes the complex exams in a manner that reduces their impact on user exertion.
[050] FIG. 3 is a block diagram illustrating an example processor 300 according to principles of the present disclosure. One or more processors utilized to implement the disclosed embodiments may be configured the same as or similar to processor 300. Processor 300 may be used to implement one or more processes described herein. For example, processor 300 may be configured to implement an artificial intelligence system configured to generate predicted complexity ratings, such as neural network 200. Accordingly, the processor 300 can also be configured to receive expected non-scanning hand usages, scanning hand usages, and patient- and user-specific information, and subsequently determine an expected complexity rating corresponding to a scheduled ultrasound exam that is specific to the patient being examined and the user performing the exam. The processor 300, or a different processor configured similarly, can also be configured to receive multiple complexity ratings and distribute them in a manner that reduces the physical stress sustained by one or more users. The processor 300, or a different processor configured similarly, can also be configured as a graphics processor programmed to generate displays for the graphical user interfaces described herein.
[051] The processor 300 may be any suitable processor type including, but not limited to, a microprocessor, a microcontroller, a digital signal processor (DSP), a field programmable gate array (FPGA) where the FPGA has been programmed to form a processor, a graphical processing unit (GPU), an application specific integrated circuit (ASIC) where the ASIC has been designed to form a processor, or a combination thereof.
[052] The processor 300 may include one or more cores 302. The core 302 may include one or more arithmetic logic units (ALU) 304. In some examples, the core 302 may include a floating point logic unit (FPLU) 306 and/or a digital signal processing unit (DSPU) 308 in addition to or instead of the ALU 304.
[053] The processor 300 may include one or more registers 312 communicatively coupled to the core 302. The registers 312 may be implemented using dedicated logic gate circuits (e.g., flip- flops) and/or any memory technology. In some examples, the registers 312 may be implemented using static memory. The register may provide data, instructions and addresses to the core 302.
[054] In some examples, processor 300 may include one or more levels of cache memory 310 communicatively coupled to the core 302. The cache memory 310 may provide computer-readable instructions to the core 302 for execution. The cache memory 310 may provide data for processing by the core 302. In some examples, the computer-readable instructions may have been provided to the cache memory 310 by a local memory, for example, local memory attached to the external bus 316. The cache memory 310 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
[055] The processor 300 may include a controller 314, which may control input to one or more processors included herein, e.g., processor 120. Controller 314 may control the data paths in the ALU 304, FPLU 306 and/or DSPU 308. Controller 314 may be implemented as one or more state machines, data paths and/or dedicated control logic. The gates of controller 314 may be implemented as standalone gates, FPGA, ASIC or any other suitable technology.
[056] The registers 312 and the cache memory 310 may communicate with controller 314 and core 302 via internal connections 320A, 320B, 320C and 320D. Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology.
[057] Inputs and outputs for the processor 300 may be provided via a bus 316, which may include one or more conductive lines. The bus 316 may be communicatively coupled to one or more components of processor 300, for example the controller 314, cache 310, and/or register 312. The bus 316 may be coupled to one or more components of the system.
[058] The bus 316 may be coupled to one or more external memories. The external memories may include Read Only Memory (ROM) 332. ROM 332 may be a masked ROM, Electronically Programmable Read Only Memory (EPROM) or any other suitable technology. The external memory may include Random Access Memory (RAM) 333. RAM 333 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology. The external memory may include Electrically Erasable Programmable Read Only Memory (EEPROM) 335. The external memory may include Flash memory 334. The external memory may include a magnetic storage device such as disc 336.
[059] FIG. 4 is a flow diagram of a method 400 of predicting ultrasound exam complexities and generating optimized exam schedules in view of the same. The example method 400 shows the steps that may be utilized, in any sequence, by the systems, processors, graphical user interfaces, and/or apparatuses described herein. Although examples of the present system have been illustrated with particular reference to ultrasound imaging modalities, the present system can be extended to other medical imaging systems where one or more images are obtained in a systematic manner. For example, the method 400 may be performed with the aid of one or more imaging systems including but not limited to MRI or CT.
[060] At step 402, the method involves “receiving a user inquiry regarding one or more users and one or more ultrasound exams assigned thereto.” At step 404, the method 400 involves “obtaining complexity levels corresponding to the ultrasound exams assigned to the users.” At step 406, the method 400 involves “distributing the ultrasound exams over a time period based on the complexity levels associated therewith.” At step 408, the method involves “generating a user exam schedule displaying the distributed complexity levels.”
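The four steps recited in method 400 can be sketched as a minimal pipeline. The inquiry format (a mapping of users to assigned exams), the complexity lookup, and the descending-complexity ordering used at step 406 are all illustrative assumptions standing in for the disclosed optimization system.

```python
# Hedged sketch of method 400: receive inquiry (402), obtain complexity
# levels (404), distribute exams (406), and produce displayable rows (408).
# The ordering rule here is a placeholder for the optimization system 118.

def generate_user_exam_schedule(inquiry, complexity_of):
    """Steps 402-408 as one pipeline over a {user: [exam, ...]} inquiry."""
    # Steps 402/404: pair each requested user's exams with complexity levels.
    rated = {user: [(exam, complexity_of(exam)) for exam in exams]
             for user, exams in inquiry.items()}
    # Step 406: distribute by descending complexity within the time period.
    schedule = {user: sorted(pairs, key=lambda p: -p[1])
                for user, pairs in rated.items()}
    # Step 408: return rows ready for display on the GUI.
    return schedule

demo = generate_user_exam_schedule(
    {"User A": ["cardiac", "abdominal"]},
    {"cardiac": 5, "abdominal": 2}.get)
```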
[061] In various embodiments where components, systems and/or methods are implemented using a programmable device, such as a computer-based system or programmable logic, it should be appreciated that the above-described systems and methods can be implemented using any of various known or later developed programming languages, such as “C”, “C++”, “FORTRAN”, “Pascal”, “VHDL” and the like. Accordingly, various storage media, such as magnetic computer disks, optical disks, electronic memories and the like, can be prepared that can contain information that can direct a device, such as a computer, to implement the above-described systems and/or methods. Once an appropriate device has access to the information and programs contained on the storage media, the storage media can provide the information and programs to the device, thus enabling the device to perform functions of the systems and/or methods described herein. For example, if a computer disk containing appropriate materials, such as a source file, an object file, an executable file or the like, were provided to a computer, the computer could receive the information, appropriately configure itself and perform the functions of the various systems and methods outlined in the diagrams and flowcharts above to implement the various functions. That is, the computer could receive various portions of information from the disk relating to different elements of the above-described systems and/or methods, implement the individual systems and/or methods and coordinate the functions of the individual systems and/or methods described above.
[062] In view of this disclosure, it is noted that the various methods and devices described herein can be implemented in hardware, software and firmware. Further, the various methods and parameters are included by way of example only and not in any limiting sense. In view of this disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and needed equipment to effect these techniques, while remaining within the scope of the invention. The functionality of one or more of the processors described herein may be incorporated into a fewer number or a single processing unit (e.g., a CPU) and may be implemented using application specific integrated circuits (ASICs) or general purpose processing circuits which are programmed responsive to executable instructions to perform the functions described herein.
[063] The present system may also include one or more programs which may be used with conventional imaging systems so that they may provide features and advantages of the present system. Certain additional advantages and features of this disclosure may be apparent to those skilled in the art upon studying the disclosure, or may be experienced by persons employing the novel system and method of the present disclosure. Another advantage of the present systems and method may be that conventional medical image systems can be easily upgraded to incorporate the features and advantages of the present systems, devices, and methods.
[064] Of course, it is to be appreciated that any one of the examples, embodiments or processes described herein may be combined with one or more other examples, embodiments and/or processes or be separated and/or performed amongst separate devices or device portions in accordance with the present systems, devices and methods.
[065] Finally, the above discussion is intended to be merely illustrative of the present system and should not be construed as limiting the appended claims to any particular embodiment or group of embodiments. Thus, while the present system has been described in particular detail with reference to exemplary embodiments, it should also be appreciated that numerous modifications and alternative embodiments may be devised by those having ordinary skill in the art without departing from the broader and intended spirit and scope of the present system as set forth in the claims that follow. Accordingly, the specification and drawings are to be regarded in an illustrative manner and are not intended to limit the scope of the appended claims.

Claims

What is claimed is:
1. A user scheduling system comprising:
one or more processors (120, 300) configured to predict one or more ultrasound exam complexity levels (112), each of the complexity levels corresponding to an ultrasound exam assigned to a user; and
a graphical user interface (114) configured to:
receive a user inquiry (119, 402) regarding one or more users and one or more ultrasound exams assigned thereto;
distribute the ultrasound exams (406) over a time period (117) based on the complexity levels associated therewith; and
generate a user exam schedule (116, 408) displaying the distributed complexity levels.
2. The user scheduling system of claim 1, wherein each of the complexity levels is based on: an expected scanning hand usage required to perform the ultrasound exam; an expected non-scanning hand usage required to perform the ultrasound exam; and a skill level of the user.
3. The user scheduling system of claim 2, wherein the expected scanning hand usage comprises an expected translational and rotational movement of the scanning hand.
4. The user scheduling system of claim 2, wherein the expected non-scanning hand usage comprises a total linear and angular motion of the non-scanning hand and a total number and type of user actions received at a control panel of an ultrasound machine utilized to perform the ultrasound exam.
5. The user scheduling system of claim 2, wherein each of the complexity levels is further based on a physical attribute and/or clinical history of a patient to be examined.
6. The user scheduling system of claim 2, wherein the one or more processors are configured to predict the complexity levels by applying a neural network to the expected scanning hand usage, the expected non-scanning hand usage, and the skill level of the user.
7. The user scheduling system of claim 1, wherein the graphical user interface is further configured to receive a user input specifying the time period and the one or more users included in the user exam schedule.
8. The user scheduling system of claim 1, wherein the graphical user interface is configured to distribute the ultrasound exams over the time period by maximizing a time between ultrasound exams having high complexities assigned to each of the one or more users.
9. The user scheduling system of claim 1, wherein the time period spans one day and wherein the graphical user interface is configured to assign ultrasound exams having high complexities earlier in the day for one or more of the users.
10. The user scheduling system of claim 1, wherein the graphical user interface is further configured to receive a user input specifying one or more scheduling conditions used to distribute the ultrasound exams over the time period.
11. The user scheduling system of claim 10, wherein the one or more scheduling conditions are user-specific.
12. The user scheduling system of claim 6, wherein the neural network is operatively associated with a training algorithm configured to receive an array of training inputs and known outputs, wherein the training inputs comprise a sample of non-scanning hand usages, scanning hand usages, and user skill levels obtained from previously performed ultrasound exams, and the known outputs comprise complexity ratings reported by users who performed the previous ultrasound exams.
13. A method of generating and displaying a user exam schedule, the method comprising:
receiving a user inquiry (119, 402) regarding one or more users and one or more ultrasound exams assigned thereto;
obtaining complexity levels (112, 404) corresponding to the ultrasound exams assigned to the users;
distributing the ultrasound exams (406) over a time period (117) based on the complexity levels associated therewith; and
generating a user exam schedule (116, 408) displaying the distributed complexity levels.
14. The method of claim 13, further comprising receiving a user input specifying the time period and the one or more users included in the user exam schedule.
15. The method of claim 13, further comprising receiving a user input specifying one or more scheduling conditions used to distribute the ultrasound exams over the time period.
16. The method of claim 13, wherein distributing the ultrasound exams over the time period comprises maximizing a time between ultrasound exams having high complexities assigned to each of the one or more users.
17. The method of claim 13, wherein the time period spans one day and wherein distributing the ultrasound exams comprises assigning ultrasound exams having high complexities earlier in the day for one or more of the users.
18. The method of claim 13, wherein each of the complexity levels is based on: an expected scanning hand usage required to perform one of the ultrasound exams; an expected non-scanning hand usage required to perform one of the ultrasound exams; and a skill level of the user.
19. The method of claim 18, wherein the complexity levels are predicted by applying a neural network to the expected scanning hand usage, the expected non-scanning hand usage, and the skill level of the user.
20. A non-transitory computer-readable medium comprising executable instructions, which when executed cause a processor of a user scheduling system to perform any of the methods of claims 13-19.
PCT/EP2022/065283 2021-06-16 2022-06-06 User load balancing WO2022263215A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163211051P 2021-06-16 2021-06-16
US63/211,051 2021-06-16

Publications (1)

Publication Number Publication Date
WO2022263215A1 true WO2022263215A1 (en) 2022-12-22

Family

ID=82163513

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/065283 WO2022263215A1 (en) 2021-06-16 2022-06-06 User load balancing

Country Status (1)

Country Link
WO (1) WO2022263215A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150327841A1 (en) * 2014-05-13 2015-11-19 Kabushiki Kaisha Toshiba Tracking in ultrasound for imaging and user interface
US20200337673A1 (en) * 2017-12-19 2020-10-29 Koninklijke Philips N.V. Combining image based and inertial probe tracking
US20210098120A1 (en) * 2019-09-27 2021-04-01 Hologic, Inc. AI System for Predicting Reading Time and Reading Complexity for Reviewing 2D/3D Breast Images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 22732984
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE