WO2024083690A1 - Interactive training and education - Google Patents
- Publication number
- WO2024083690A1 (PCT/EP2023/078549)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- parameters
- values
- synthetic
- angiogram
- processor
- Prior art date
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G09B23/286—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for scanning or photography techniques, e.g. X-rays, ultrasonics
Definitions
- Intra-observer and inter-observer variability among cardiologists is a known problem when assigning standardized clinical scores to individual patient records and images.
- Clinical scores are designed to describe and characterize certain markers of a disease. Typical examples for cardiovascular disease include the Gensini score used to quantify stenotic lesions, the Medina classification for classifying bifurcations, and the Syntax II score for atherosclerotic burden or disease severity. Some scores may make use of anatomical descriptors as derived from image data, but such scores often extend to more general patient records such as patient demographics, medical history, or comorbidities.
- an interactive system for medical training includes a memory and a processor.
- the processor is configured to: obtain actual values of parameters associated with a medical scenario; generate a sequence of customized synthetic angiogram images of the medical scenario based on the actual values of the parameters; display, to a user, the sequence of customized synthetic angiogram images; interactively request the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images; compare the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and provide to the user feedback with respect to the measure of difference.
- a method for medical training includes obtaining actual values of parameters associated with a medical scenario; generating a sequence of customized synthetic angiogram images of the medical scenario based on the actual values of the parameters; displaying, to a user, the sequence of customized synthetic angiogram images; interactively requesting the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images; comparing the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and providing to the user feedback with respect to the measure of difference.
- a non-transitory computer-readable storage medium has stored a computer program comprising instructions. When executed by a processor, the instructions cause the processor to: obtain actual values of parameters associated with a medical scenario; generate a sequence of customized synthetic angiogram images of the medical scenario based on the actual values of the parameters; display, to a user, the sequence of customized synthetic angiogram images; interactively request the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images; compare the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and provide to the user feedback with respect to the measure of difference.
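- As an illustration of the loop summarized in the system, method, and storage-medium embodiments above, the following Python sketch walks through the obtain/generate/display/estimate/compare/feedback steps. The helper functions (generate_synthetic_angiogram, prompt_user_estimates) and the example parameter names are hypothetical placeholders, not part of the disclosure.

```python
# Minimal sketch of the training loop described above (system / method / program claims).
# The helper names and the parameter set are illustrative placeholders, not taken from the patent.

def generate_synthetic_angiogram(actual_values: dict) -> list:
    """Placeholder: would return a sequence of customized synthetic angiogram frames."""
    return [f"frame_{i}" for i in range(3)]  # stand-in for rendered images

def prompt_user_estimates(frames: list, parameter_names: list) -> dict:
    """Placeholder: a real system would display the frames and collect user input."""
    return {name: 0.0 for name in parameter_names}  # stand-in for user input

def training_round(actual_values: dict, tolerance: float = 0.1) -> dict:
    # 1. Generate and (conceptually) display the customized synthetic angiogram sequence.
    frames = generate_synthetic_angiogram(actual_values)

    # 2. Interactively request estimates for the same parameters.
    estimates = prompt_user_estimates(frames, list(actual_values))

    # 3. Target values are defined from the actual values (here: identical).
    targets = dict(actual_values)

    # 4. Measure of difference between estimated and target values.
    errors = {p: estimates[p] - targets[p] for p in targets}

    # 5. Feedback with respect to the measure of difference.
    feedback = {p: ("within tolerance" if abs(e) <= tolerance else f"off by {e:+.2f}")
                for p, e in errors.items()}
    return {"errors": errors, "feedback": feedback}

if __name__ == "__main__":
    actual = {"stenosis_severity": 0.75, "lesion_length_mm": 12.0}
    print(training_round(actual))
```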
- FIG. 1 illustrates a system for interactive training and education, in accordance with a representative embodiment.
- FIG. 2 illustrates a flow for interactive training and education, in accordance with a representative embodiment.
- FIG. 3 illustrates another flow for interactive training and education, in accordance with a representative embodiment.
- FIG. 4 illustrates a user interface for interactive training and education, in accordance with a representative embodiment.
- FIG. 5A illustrates a method for interactive training and education, in accordance with a representative embodiment.
- FIG. 5B illustrates another method for interactive training and education, in accordance with a representative embodiment.
- FIG. 5C illustrates another method for interactive training and education, in accordance with a representative embodiment.
- FIG. 5D illustrates another method for interactive training and education, in accordance with a representative embodiment.
- FIG. 6 illustrates a computer system, on which a method for interactive training and education is implemented, in accordance with another representative embodiment.
- embodiments provide for customized synthetic angiogram images that are generated by modelling an anatomical structure using controlled parameter values.
- the synthetic angiogram images may be used for a variety of medical training and education purposes in an interactive context.
- the teachings herein provide a flexible and lightweight simulation framework for the generation of angiograms with control over the parameters defining the angiogram characteristics. By modelling the morphology and physiology of vascular trees and subsequently simulating the imaging process using controlled parameter values, embodiments provide fine-grained control over the appearance of the resulting angiograms.
- FIG. 1 illustrates a system 100 for interactive training and education, in accordance with a representative embodiment.
- the system 100 in FIG. 1 is a system for interactive training and education and includes components that may be provided together or that may be distributed.
- the system 100 includes an imaging device 101, a computer 140 and a display 180.
- the computer 140 includes a controller 150, and the controller 150 includes at least a memory 151 that stores instructions and a processor 152 that executes the instructions.
- a computer that can be used to implement the computer 140 is depicted in FIG. 6, though a computer 140 may include more or fewer elements than depicted in FIG. 1 or FIG. 6.
- the imaging device 101 may be configured to generate X-ray images.
- angiography is a type of X-ray imaging used to check blood vessels.
- a contrast medium may be injected into blood for angiography.
- the contrast medium is radiopaque at X-ray wavelengths and allows highlighting of blood vessels in the resultant X-ray images.
- X-ray images obtained by angiography, such as via the imaging device 101, are known as angiograms.
- the imaging device 101 may be provided together with or separate from the computer 140.
- the imaging device 101 may also be representative of multiple imaging devices including X-ray imaging devices distributed among multiple locations.
- the computer 140 may be configured to generate customized synthetic angiogram images.
- the customized synthetic angiogram images may be generated to model a medical scenario (such as a particular disease condition of a vasculature).
- the synthetic angiogram images may be created based on actual angiogram images from the imaging device 101.
- the computer 140 may be configured to obtain values for parameters associated with a medical scenario and input the parameter values into a model configured to generate the customized angiogram images based on the parameter values, and then provide the synthetic angiogram images for display on the display 180.
- These parameter values are referred to herein as “actual values” as they are the values of the parameters actually used to generate the synthetic angiogram images.
- the parameters may include, for example, parameters associated with anatomical characteristics (e.g., clinical scores for such anatomical characteristics) in the synthetic angiogram images.
- anatomical characteristics may include presence, location, dimensions, severity, and/or other features of a condition in an anatomy.
- the parameters may be characteristics associated with a disease condition (e.g., stenotic lesion) in a vasculature, and the actual values set for the parameters define the dimensions, location, and severity of the disease condition to be generated in the synthetic angiogram images.
- the computer 140 may also be configured to interactively request (e.g., prompt) a user to estimate values for the parameters based on viewing the displayed synthetic angiogram images, and to accept the estimated values.
- the estimates may be interactively requested from the user visually, for example, via the display 180, or audibly via an audio output, and the estimates may be taken as a form of training or as a form of education with respect to the medical scenario.
- the synthetic angiogram images may include a stenotic lesion and the user may be interactively requested to estimate a value of the severity of the stenotic lesion (e.g., score the severity of the stenotic lesion).
- the computer 140 may be configured to compare the estimated values of the parameters to target values defined for the parameters.
- the target values for the parameters may be the actual values of the parameters, or other values (e.g., acceptable ranges) derived from the actual values of the parameters.
- the computer may compare the user’s estimated value (e.g., clinical score) for the severity of the lesion to the actual value of the severity of the lesion used to generate the angiogram images.
- the computer 140 may be configured to determine a measure of difference between the estimated values of the parameters and the target values of the parameters, and then output feedback to the user with respect to the measure of the difference.
- the computer 140 may be configured to output a measure of the difference as feedback visually via the display 180.
- the display 180 may be local to the computer 140 or may be remotely connected to the computer 140.
- the display 180 may be connected to the computer 140 via a local wired interface such as an Ethernet cable or via a local wireless interface such as a Wi-Fi connection.
- the display 180 may be interfaced with other user input devices by which users can input instructions, including mice, keyboards, thumbwheels and so on.
- the display 180 may be a monitor such as a computer monitor, a display on a mobile device, an augmented reality display, a television, an electronic whiteboard, or another screen configured to display electronic imagery.
- the display 180 may also include one or more input interface(s) such as those noted above that may connect to other elements or components, as well as an interactive touch screen configured to display prompts or other interactive interfaces to users and collect touch input from users.
- the controller 150 may also include interfaces, such as a first interface, a second interface, a third interface, and a fourth interface.
- One or more of the interfaces may include ports, disk drives, wireless antennas, or other types of receiver circuitry that connect the controller 150 to other electronic elements.
- One or more of the interfaces may also include user interfaces such as buttons, keys, a mouse, a microphone, a speaker, a display separate from the display 180, or other elements that users can use to interact with the controller 150 such as to enter instructions and receive output.
- the controller 150 may perform some of the operations described herein directly and may implement other operations described herein indirectly.
- the controller 150 may indirectly control operations such as by generating and transmitting content to be displayed on the display 180.
- the controller 150 may directly control other operations such as logical operations performed by the processor 152 executing instructions from the memory 151 based on input received from electronic elements and/or users via the interfaces. Accordingly, the processes implemented by the controller 150 when the processor 152 executes instructions from the memory 151 may include steps not directly performed by the controller 150.
- FIG. 2 illustrates a flow for interactive training and education, in accordance with a representative embodiment.
- the flow in FIG. 2 starts with inputting an input sample and/or ground truth to a synthetic angiogram generator at S210 for modelling a medical scenario.
- the synthetic angiogram generator may be the computer 140, and may be implemented using a computer program executed by the processor 152 of the controller 150.
- a computer program may be opened so that a user can provide input to a model consisting of one or more algorithms included in the computer program.
- the user may input boundary conditions from which a score and input parameters may be randomly sampled.
- An example of an input boundary condition is a user providing instructions to generate synthetic angiogram images corresponding to scores in a range between a low score of A and a high score of B.
- Examples of the input sample and/or ground truth may include one or more score values.
- a first user may input the score values as the actual values of parameters associated with a medical scenario, and these actual values may be used to generate a customized plurality of synthetic angiogram images for presentation to a second user, such as a clinical expert.
- the synthetic angiogram generator may output an angiography sequence of two-dimensional X-ray images based on a set of the actual values of the parameters, whether input directly by a user or generated based on input boundary conditions.
- the two-dimensional X-ray images may comprise a sequence of synthetic angiogram images simulating movement of a vasculature vessel tree.
- a model implemented by the computer 140 to generate angiogram images may be rules-based, but may also include data-driven components such as a neural network trained to generate synthetic backgrounds, or simply a statistical principal component analysis (PCA) model which defines spatial eigenmodes of the heart.
- a rules-based scoring model may be employed as a forward model to convert scores into input parameters.
- An optimization may be employed subsequently to optimize parameters under boundary conditions that vary case-by-case until the score computed by the forward model fits the desired target score.
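- As a hedged illustration of this forward-model fitting step, the sketch below uses a made-up toy score and a simple random search to adjust parameters within case-by-case boundary conditions until the computed score approaches the target; it is not the actual rules-based clinical scoring model.

```python
import random

# Toy forward model: a made-up "score" that increases with stenosis severity and lesion length.
# It stands in for a rules-based clinical scoring model; it is not any real clinical score.
def forward_score(params: dict) -> float:
    return 10.0 * params["severity"] + 0.5 * params["length_mm"]

def fit_parameters_to_score(target_score: float, bounds: dict,
                            iters: int = 5000, seed: int = 0) -> dict:
    """Random-search optimization of parameters under boundary conditions until the
    forward-model score matches the target score as closely as possible."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(iters):
        candidate = {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
        err = abs(forward_score(candidate) - target_score)
        if err < best_err:
            best, best_err = candidate, err
    return {"params": best, "score": forward_score(best), "residual": best_err}

if __name__ == "__main__":
    bounds = {"severity": (0.2, 0.95), "length_mm": (2.0, 30.0)}  # case-by-case boundary conditions
    print(fit_parameters_to_score(target_score=12.0, bounds=bounds))
```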
- a “model” as the term is used herein may include multiple models including models of different types.
- the parameters may be anatomical characteristics and the rules-based scoring model may be for quantifying anatomical characteristics of disease as the actual values of the parameters.
- the actual values of the parameters are input to a bias identifier.
- the input sample and/or ground truth may be used as input to a bias identifier.
- the bias identifier may be the computer 140, and may be implemented using the same or a different computer program than the computer program used for the synthetic angiogram generator.
- the parameters include properties and parameters which are relevant for clinical scores such as lesions (length, shape, stenosis significance, calcifications, eccentricity ... ), bifurcations, view parameters etc.
- the synthetic angiogram generator generates a synthetic angiogram.
- the synthetic angiogram may be displayed on the display 180.
- the synthetic angiogram may be displayed as a form of prompting to a clinical expert.
- the synthetic angiogram generator generates synthetic angiogram images based on the actual values for a set of input parameters.
- These input parameters may be morphological parameters, physiological parameters, procedural parameters, or imaging parameters, and may govern the appearance of the resulting image sequence of synthetic angiogram images.
- the input parameters also may form the ground truth for the scoring output prompted from an individual. For example, the parameters are associated with a vasculature and the actual values are set to generate the synthetic angiogram to provide the vasculature in a particular disease state (e.g., with lesions of a certain severity).
- the synthetic angiogram generator may start from a score value and use the score value to generate a set of plausible parameters which yield the score value.
- the score value may be a severity level for lesion(s) in a vasculature, and the synthetic angiogram generator defines values for parameters associated with the vasculature which generate the synthetic angiogram with lesion(s) of the severity level corresponding to the score value.
- one or more angiogram image(s) are provided to the clinical expert to score and/or estimate values for the parameters. The clinical expert is prompted to enter a score or other value(s) for the parameters for each angiogram image.
- the clinical expert may be prompted to estimate the score or the actual values of the parameters used to generate the angiogram image and resulting in the score. For example, the user may be prompted to enter a score for lesion(s) in the synthetic angiogram that was generated based on setting actual values for certain parameters of a vasculature.
- the clinical expert assigns estimates based on viewing the synthetic angiogram image.
- the estimates may be estimates of the parameters, or estimates of other parameters relating to the synthetic angiogram and derived from the parameters used to generate the synthetic angiogram image.
- the estimates may be a score reflecting the combination of parameters, such as a score indicating a grading of the parameters of the synthetic angiogram image as to the severity of a medical condition (e.g., quantify severity of stenotic lesions or severity of other disease condition).
- the estimated values of the parameters may be input to the computer 140, and may include estimates of a score value or score input parameters corresponding to the synthetic angiogram.
- the angiogram image may be used as an interactive scoring tool so that the clinical expert may read back the relevant input parameters for the score from the angiogram image or the score directly.
- the estimated values of the parameters are input to the bias identifier.
- the bias identifier may compare them with target values of the parameters determined based on the actual values of the parameters from S215, and output interactive feedback to the clinical expert at S260.
- the target values may comprise the actual values, or may comprise other values of the parameters relating to the angiogram image and derived from the actual values of the parameters.
- the resulting set of values and/or the score value estimated by the individual is compared with the target values of the parameters, or with the corresponding reference which served as an input to the synthetic angiogram generator, to compute the user error and give feedback on how to achieve more consistency.
- the flow of FIG. 2 may be a general overview of a scoring trainer system.
- Score input parameters may be sampled from distributions or generated from a score input value.
- the score input parameters are input into a synthetic angiogram generator, which produces a coronary angiogram to be scored by the clinical expert.
- the clinical expert estimates a score for the image based on the parameters and hence assigns a score to the image or parts of the image such as individual stenoses.
- the score estimated by the human is compared to the actual score for the parameters, or to another target score based on the actual score, and the estimation error is fed back to the individual as a reflection of bias to improve their scoring.
- the feedback may be provided after each input of an estimated score of an image or after scoring several images, so as to identify trends based on scoring several images.
- values of parameters may be input to the synthetic angiogram generator at S210, and then a score may be prompted from the individual at S240.
- a score may be input to the synthetic angiogram generator at S210, and estimated values for parameters associated with the score may be prompted from the individual at S240.
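- The following sketch illustrates, under assumed distributions and a toy score, how score input parameters might be sampled within user-given boundary conditions (e.g., keeping only cases whose score falls between a low bound A and a high bound B); the parameter names, distributions, and the toy score are illustrative assumptions.

```python
import random

# Sketch of sampling score input parameters from distributions within user-given
# boundary conditions. Distribution shapes, parameter names and the toy score are
# illustrative assumptions, not taken from the patent.
def sample_case(rng):
    params = {
        "severity": rng.betavariate(2, 2),           # relative diameter reduction, 0..1
        "length_mm": rng.uniform(3.0, 25.0),          # lesion length
        "eccentricity": rng.uniform(0.0, 1.0),        # 0 = concentric, 1 = eccentric
        "view_angle_deg": rng.uniform(-30.0, 30.0),   # procedural / imaging parameter
    }
    params["score"] = 10.0 * params["severity"] + 0.4 * params["length_mm"]  # toy score
    return params

def sample_cases_in_score_range(n, score_range=(4.0, 20.0), seed=1):
    """Rejection sampling: keep cases whose score lies between the low (A) and high (B) bound."""
    rng = random.Random(seed)
    cases = []
    while len(cases) < n:
        case = sample_case(rng)
        if score_range[0] <= case["score"] <= score_range[1]:
            cases.append(case)
    return cases

if __name__ == "__main__":
    for case in sample_cases_in_score_range(3):
        print({k: round(v, 2) for k, v in case.items()})
```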
- FIG. 3 illustrates another flow for interactive training and education, in accordance with a representative embodiment.
- the flow in FIG. 3 may be performed by the computer 140 in FIG. 1 for synthetic angiogram generation.
- a hybrid modelling approach may be used to generate synthetic angiography sequences with a contrast-filled vessel tree in the foreground and typical attenuation patterns in the background.
- the flow in FIG. 3 starts with providing information of a catheter tip position at S311 for non-contrast frames.
- the non-contrast frames are provided at S312 for generating a synthetic angiogram and labels at S351.
- the information provided at S311 is provided to generate a vessel tree at S321.
- the vessel tree is generated at S321 as a three-dimensional representation such as a triangular mesh.
- This morphological description and a physiological description of the vessel tree on an epicardial surface mesh are provided to a contrast propagator at S331.
- the morphological tree description may include vessel labels and/or stenoses.
- the physiological tree description may include volume flow and/or velocity.
- the morphological tree description may include descriptions of morphological parameters, and the physiological description of the vessel tree may include physiological parameters.
- the 3D representation is voxelized into one or more discretized image volumes.
- the morphological description of the vascular tree is either generated at S331 from scratch or taken from some other three-dimensional imaging modality such as computed tomography.
- the morphological description defines the general anatomy of the vessel tree with a centerline and a lumen hull.
- a sample from a statistical atlas may control the results, including parameters such as: vessel lengths, vessel diameters, branching point locations and angles, lumen contour shape at each centerline point, and also abnormalities such as vessel tortuosity or stenoses.
- the morphological description may have a time factor to allow for heart motion.
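- A minimal sketch of such a morphological description follows: a three-dimensional centerline with a circular lumen contour constructed in the cross-sectional (u, v) frame at each centerline point. The geometry and dimensions are illustrative stand-ins for values that a statistical atlas would provide.

```python
import numpy as np

# Sketch of a morphological vessel description: a centerline plus a lumen contour per
# centerline point. The specific geometry (a gently curved tube with assumed length and
# diameter values) is illustrative, not taken from the patent.
def make_centerline(n_points: int = 50, length_mm: float = 60.0) -> np.ndarray:
    t = np.linspace(0.0, 1.0, n_points)
    # gently curved centerline in 3D
    return np.stack([length_mm * t, 8.0 * np.sin(np.pi * t), 4.0 * t], axis=1)

def lumen_contours(centerline: np.ndarray, radius_mm: float = 1.6,
                   n_contour_pts: int = 16) -> np.ndarray:
    """Return (n_points, n_contour_pts, 3) lumen contour points around the centerline."""
    tangents = np.gradient(centerline, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    contours = np.empty((len(centerline), n_contour_pts, 3))
    ref = np.array([0.0, 0.0, 1.0])
    angles = np.linspace(0.0, 2.0 * np.pi, n_contour_pts, endpoint=False)
    for i, (c, t) in enumerate(zip(centerline, tangents)):
        u = np.cross(t, ref); u /= np.linalg.norm(u)     # cross-sectional axes (u, v)
        v = np.cross(t, u)
        contours[i] = c + radius_mm * (np.outer(np.cos(angles), u) + np.outer(np.sin(angles), v))
    return contours

if __name__ == "__main__":
    cl = make_centerline()
    print(lumen_contours(cl).shape)  # (50, 16, 3)
```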
- the physiological parameters are either sampled at S331 from empirical distributions, such as the blood volume flow in the left main artery, or computed based on the existing morphological description. Examples of parameters computed from the existing morphological description include volume flow after branching, bolus front wave velocity based on cross-sectional slab volume, time of contrast arrival, etc.
- these physiological parameters are set at S331 by a simulation module, such as a lumped one-dimensional model (e.g., the Windkessel model).
- a lumped one-dimensional model starts from boundary conditions such as the aortic pressure and microvascular resistances in the myocardium and simulates the remaining parameters such as volume flow at each vessel segment based on the underlying model which also captures the tree morphology.
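- The sketch below illustrates a lumped model of this kind with a two-element Windkessel, assuming a toy periodic aortic pressure as the boundary condition and deriving the inflow that sustains it; the resistance and compliance values are illustrative, not patient-derived.

```python
import numpy as np

# Minimal sketch of a lumped Windkessel-type model as mentioned above: a two-element
# Windkessel driven by an assumed aortic pressure waveform. R and C are illustrative.
def aortic_pressure(t: float) -> float:
    """Toy periodic aortic pressure in mmHg (assumed boundary condition, ~80-120 mmHg, 1 Hz)."""
    return 80.0 + 40.0 * max(0.0, np.sin(2.0 * np.pi * t))

def windkessel_inflow(R: float = 1.0, C: float = 1.5, t_end: float = 2.0, dt: float = 1e-3):
    """Two-element Windkessel: Q_in(t) = P(t)/R + C * dP/dt. With the pressure waveform
    prescribed as a boundary condition, the inflow required to sustain it follows directly;
    the flow delivered through the downstream resistance is P/R."""
    times = np.arange(0.0, t_end, dt)
    p = np.array([aortic_pressure(t) for t in times])
    q_in = p / R + C * np.gradient(p, dt)
    return times, q_in

if __name__ == "__main__":
    t, q = windkessel_inflow()
    print(f"mean simulated inflow: {q.mean():.1f} (arbitrary units)")
```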
- the vascular tree description is then voxelized in three dimensions into a discretized volume block by marking volume grid points lying within the vascular mesh.
- Each cross-sectional slab along the centerline may be given a unique integer value in the volume, so that the propagator module can assign the contrast filling status for any specific time point using look-up tables generated from the physiological description.
- the volume-indexed three-dimensional rendering result may be reused for any time point of the contrast propagation.
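- The following sketch illustrates the slab-indexing idea: each cross-sectional slab of a voxelized (here, straight) vessel receives a unique integer label, and a per-slab contrast arrival time serves as the look-up table that yields the filling status at any requested time point. Geometry and arrival times are illustrative.

```python
import numpy as np

# Sketch of the slab-indexed voxelization and look-up-table filling described above.
# The same labeled volume is reused for every time point; only the look-up changes.
def voxelize_straight_vessel(shape=(64, 16, 16), radius_vox=4.0) -> np.ndarray:
    """Label volume: 0 = background, k = k-th slab along the vessel axis (axis 0)."""
    labels = np.zeros(shape, dtype=np.int32)
    cy, cz = (shape[1] - 1) / 2.0, (shape[2] - 1) / 2.0
    yy, zz = np.meshgrid(np.arange(shape[1]), np.arange(shape[2]), indexing="ij")
    inside = (yy - cy) ** 2 + (zz - cz) ** 2 <= radius_vox ** 2
    for k in range(shape[0]):
        labels[k][inside] = k + 1          # unique slab index per centerline position
    return labels

def contrast_mask(labels: np.ndarray, arrival_time: np.ndarray, t: float) -> np.ndarray:
    """Boolean volume of contrast-filled voxels at time t, via a per-slab look-up table."""
    filled_slabs = np.flatnonzero(arrival_time <= t) + 1      # slab indices filled by time t
    return np.isin(labels, filled_slabs)

if __name__ == "__main__":
    labels = voxelize_straight_vessel()
    arrival = np.linspace(0.0, 1.0, labels.max())  # stand-in for bolus front propagation times
    for t in (0.25, 0.5, 1.0):
        print(t, int(contrast_mask(labels, arrival, t).sum()), "filled voxels")
```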
- the contrast propagator provides a dynamic description to an X-ray projector.
- a contrast medium is propagated through the three-dimensional rendering of the vessel tree with a given arterial input function and frame rate.
- the contrast propagator outputs a dynamic description to the X-ray projector.
- the three-dimensional voxelization of the vessel tree generated at S321 may be provided as a sequence of three-dimensional images for a dynamic volume-based description from the contrast propagator which is then passed through the X-ray projector at S341.
- the sequence of three-dimensional images is not limited only to the vasculature, but may also include surrounding objects which may be modelled by input parameters but which are not the main clinical focus of the estimates based on parameters prompted according to the teachings herein.
- the surrounding objects may include diaphragm, heart shadow and more which move in the coronary context.
- contrast medium contained in the vessel will move in sequences displaying physiological properties of the tree.
- the X-ray projector may comprise a projector module implemented by the computer 140.
- the X-ray projector may generate a sequence of two-dimensional foreground projections including images and labels.
- the flow of FIG. 3 may include adding a background to the foreground.
- the background may originate from a data-driven path.
- the computer may either draw real images from a clinical data pool or may generate synthetic backgrounds from a data distribution.
- a data distribution may be modelled as a generative adversarial network (GAN) to result in the synthetic backgrounds added to the foregrounds in FIG. 3.
- the X-ray projector outputs synthetic angiogram images and labels.
- the X-ray projector outputs the synthetic angiogram images and labels based on the sequence of two-dimensional foreground projections and based on the non-contrast frames from S312.
- a sequence of three-dimensional volumes shows partially contrast-filled vascular trees depending on the related time index.
- the sequence may be passed to a projection module, projecting the three-dimensional volume onto the two-dimensional detector plane.
- the flow of FIG. 3 is again governed by procedural parameters such as the position of a C-arm isocenter relative to the vascular tree or the viewing angles.
- a background image is added to the foreground. This background can either be taken from an available pool of real backgrounds based on angiograms without visible contrast medium, or may be generated by a generative model which has learned from real world data to generate plausible looking angiogram backgrounds.
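- As a hedged illustration of the projection and compositing steps, the sketch below projects a contrast-filled voxel volume onto a two-dimensional detector plane using a simple parallel-ray line integral (a stand-in for the cone-beam geometry of a real C-arm) and composites it with a background image by multiplying transmissions; all intensities are arbitrary.

```python
import numpy as np

# Sketch of the projection and compositing step described above. Parallel projection along
# one axis is used as a simplification; intensities are arbitrary illustrative units.
def project_parallel(volume: np.ndarray, mu_contrast: float = 0.05) -> np.ndarray:
    """Line integrals of contrast attenuation along axis 0, converted to X-ray intensities."""
    path_length = volume.sum(axis=0)                 # voxels of contrast along each ray
    return np.exp(-mu_contrast * path_length)        # Beer-Lambert attenuation

def composite_with_background(foreground: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Multiply transmissions: attenuation by vessel and background tissue accumulates."""
    return foreground * background

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vessel_volume = np.zeros((64, 128, 128))
    vessel_volume[:, 60:68, :] = 1.0                 # crude contrast-filled vessel slab
    frame = project_parallel(vessel_volume)
    background = 0.7 + 0.2 * rng.random((128, 128))  # stand-in for a real or GAN-generated background
    print(composite_with_background(frame, background).shape)  # (128, 128)
```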
- FIG. 4 illustrates a user interface for interactive training and education, in accordance with a representative embodiment.
- the user interface 481 in FIG. 4 shows stenosis scoring and modelling.
- a definition of a Gensini score may be provided on the lower left.
- the Gensini score (GS) is a tool for assessing the severity and complexity of coronary artery disease.
- a Gensini score is used to quantify angiographic atherosclerosis.
- the definition of the Gensini score may include a graphic with two columns.
- a left column may include a stack of icons and corresponding percentages for a reduction of lumen diameter.
- the right column may include a stack of icons and corresponding scores for severity of eccentric plaque.
- a zero Gensini score indicates absence of atherosclerotic disease.
- Stenoses are rated by a Gensini score according to their location in the vessel and their concentricity or eccentricity.
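- As an illustration of Gensini-style scoring, the sketch below multiplies severity points (roughly doubling with each step of diameter reduction) by a segment-dependent location multiplier. The numeric weights follow commonly cited Gensini values but are included here only as illustrative assumptions, not as a definition from this disclosure.

```python
# Sketch of Gensini-style scoring as summarized above. The thresholds, points and
# segment multipliers below are commonly cited values used purely for illustration.

SEGMENT_MULTIPLIER = {        # illustrative location weights
    "left_main": 5.0,
    "prox_LAD": 2.5,
    "mid_LAD": 1.5,
    "prox_LCx": 2.5,
    "prox_RCA": 1.0,
    "other": 1.0,
}

def severity_points(narrowing_pct):
    """Points roughly doubling per severity step of lumen diameter reduction."""
    for threshold, points in [(100, 32), (99, 16), (90, 8), (75, 4), (50, 2), (25, 1)]:
        if narrowing_pct >= threshold:
            return points
    return 0

def gensini_score(lesions):
    """lesions: iterable of (segment name, percent diameter reduction)."""
    return sum(severity_points(pct) * SEGMENT_MULTIPLIER.get(seg, 1.0) for seg, pct in lesions)

if __name__ == "__main__":
    print(gensini_score([("prox_LAD", 75), ("prox_RCA", 50)]))  # 4 * 2.5 + 2 * 1.0 = 12.0
```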
- a morphological mesh model of the vascular tree is provided.
- the vascular tree is defined by a centerline and a pair of cross-sectional vectors (u, v) orthogonal to the centerline.
- the pair of cross-sectional vectors span a cross-sectional coordinate system in which the lumen contour is defined as dots.
- the narrowing may be adjusted by the average distance between contour and centerline point over multiple cross-sections, and the eccentricity may be modelled by the centerline point position within the closed lumen contour.
- An example implementation for introducing stenoses may use spatial eigenmodes added on top of the cylindrical vessel mesh.
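- A minimal sketch of this approach follows: the narrowing is introduced by subtracting a smooth Gaussian bump (a stand-in for a spatial eigenmode) from the lumen radius along the centerline, and the eccentricity by offsetting the centerline point within the lumen contour; severity, width, and radius values are illustrative.

```python
import numpy as np

# Sketch of introducing a stenosis into a cylindrical vessel model as described above.
# All numeric values are illustrative assumptions.
def stenosed_radius(s: np.ndarray, base_radius: float = 1.6,
                    severity: float = 0.75, center: float = 0.5, width: float = 0.05) -> np.ndarray:
    """Lumen radius along the normalized centerline coordinate s in [0, 1]."""
    bump = np.exp(-0.5 * ((s - center) / width) ** 2)   # smooth spatial mode along the vessel
    return base_radius * (1.0 - severity * bump)        # severity = relative diameter reduction

def eccentric_offset(s: np.ndarray, radius: np.ndarray, eccentricity: float = 0.8) -> np.ndarray:
    """Offset of the centerline point within the lumen contour (0 = concentric plaque)."""
    base_radius = radius.max()
    return eccentricity * (base_radius - radius)         # plaque grows from one wall only

if __name__ == "__main__":
    s = np.linspace(0.0, 1.0, 200)
    r = stenosed_radius(s, severity=0.75)
    print(f"minimum lumen radius: {r.min():.2f} mm "
          f"({100 * (1 - r.min() / r.max()):.0f}% diameter reduction)")
    print(f"max eccentric offset: {eccentric_offset(s, r).max():.2f} mm")
```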
- image A and image B may show a left CA and a right CA.
- Metrics shown below image A and Image B include LAD scores, LCx scores, RCA scores and a final score.
- the information shown in FIG. 4 may be part of the actual values for parameters for synthetic angiogram images corresponding to image A and image B, though more and different values and/or parameters than shown may be required as input to a model to generate the angiogram image. That is, the number and type of parameters input to the generator may be more numerous and varied than those set by a user. For example, a user may set a desired range, and the generator may select values within the range to generate an angiogram image.
- the user does not set input parameters, and instead the generator may randomly sample input parameters from ranges so that the sampled input parameters or corresponding scores are the targets for the clinical expert to estimate.
- Image 1 underneath the MLCA/LAD label may be a standardized image provided as an icon to guide the user.
- Image 2 underneath the CFx label may also be a standardized image provided as an icon to guide the user.
- Image 3 underneath the RCA label may also be a standardized image provided as an icon to guide the user.
- FIG. 5A illustrates a method for interactive training and education, in accordance with a representative embodiment.
- actual values of the parameters are input.
- the actual values may be input to the computer 140 in order to start a process for measuring biases of professionals in a manner that will help train the professionals to reduce or eliminate such biases.
- the actual values may be parameters used to generate one or more angiogram image.
- the actual values may include, for example, clinical scores for anatomical characteristics in the synthetic angiogram images.
- the anatomical characteristics may include, for example, presence, location, and dimensions of an anatomical characteristic.
- a three-dimensional angiogram image is generated based on the actual values of the parameters.
- the parameters may comprise anatomical characteristics of disease and a rules-based scoring model may be used for quantifying the anatomical characteristics of disease as the actual values of the parameters, though the model used to generate the three-dimensional angiogram images may also include data-driven components.
- the three-dimensional angiogram image from S510 is projected onto a two-dimensional plane.
- Synthetic angiogram images used for the training and education comparisons and scoring may each comprise a synthetic two-dimensional X-ray image, and may be generated based on at least one synthetic three-dimensional angiogram image from S510.
- the projection at S515 may result in generating a corresponding synthetic angiogram image among the synthetic angiogram images for each synthetic three-dimensional angiogram image projected onto the two-dimensional detector plane.
- a synthetic foreground angiogram image is generated based on the projection at S515.
- the synthetic foreground angiogram image is generated as a two-dimensional image.
- a background angiogram image is added.
- the background angiogram image is added to the two-dimensional synthetic foreground angiogram image generated at S520.
- the two-dimensional synthetic angiogram images may comprise a sequence of synthetic angiogram images simulating movement of a vasculature vessel tree.
- a determination is made as to whether the generation of two-dimensional angiogram images is complete.
- estimated parameters are input.
- the estimated parameters may be input by a clinician or other individual based on the image(s) displayed at S535. For example, images may be displayed one at a time at S535, and the individual may be prompted to enter estimated parameters corresponding to the displayed images at S540. Accordingly, S535 and S540 may be performed as a loop until the estimated parameters are input for the last image.
- a human reader may be prompted to assign a score to the synthetically generated angiogram image at S540.
- the individual may determine the relevant set of parameters from the image sequence, and this may include measuring/estimating, for example, lengths and angles from the image sequence which can be enabled by interactive tooling.
- the score and the input parameters for the score may then be passed to a feedback module.
- the estimated values of parameters for each image are compared to target values for the parameters determined based on the actual values for the parameters for each image.
- Target values may comprise the actual values, or may be derived from the actual values used to generate each synthetic angiogram image.
- differences between the estimated values and the actual values of the parameters are identified.
- the differences may reflect a bias insofar as the actual values may be considered a ground truth and the estimated values may be considered subjective estimates that will vary for different individuals.
- a measure is output.
- the measure may be a metric of bias identified based on the differences between the actual values and the estimated values of the parameters for one or more images.
- the measure may be output at S555 via a feedback module.
- the computer 140 may include a program that takes the estimated values and the actual values of the parameters and computes an error with respect to the ground truth reference which was used to generate the synthetic image sequence. Based on this error on the score and its parameters, feedback is provided as a measure output via, for example, the display 180.
- the measure may include a visualization of errors, and may include simple identification of errors or may include a measure of the extent of the errors.
- the generation of new angiograms may be influenced by these error statistics so as to focus higher variation on the parameters whose estimates show the greatest discrepancy from ground truth.
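- The sketch below illustrates such a feedback module under assumed data: per-parameter errors between estimated and ground-truth values are aggregated into a systematic bias and spread, and the parameters with the largest mean absolute error receive higher sampling weight for the next generated angiograms.

```python
import statistics

# Sketch of the feedback module described above. The cases and parameter names are
# illustrative assumptions.
def error_statistics(cases):
    """cases: list of (estimated, actual) dicts sharing the same parameter keys."""
    stats = {}
    for name in cases[0][1]:
        errors = [est[name] - act[name] for est, act in cases]
        stats[name] = {
            "bias": statistics.mean(errors),                          # systematic over/under-estimation
            "spread": statistics.pstdev(errors),                      # consistency of the reader
            "mean_abs_error": statistics.mean(abs(e) for e in errors),
        }
    return stats

def sampling_weights(stats):
    """Weight the next cases towards parameters with the largest discrepancy from ground truth."""
    total = sum(s["mean_abs_error"] for s in stats.values()) or 1.0
    return {name: s["mean_abs_error"] / total for name, s in stats.items()}

if __name__ == "__main__":
    cases = [({"severity": 0.8, "length_mm": 10.0}, {"severity": 0.7, "length_mm": 12.0}),
             ({"severity": 0.6, "length_mm": 18.0}, {"severity": 0.5, "length_mm": 20.0})]
    stats = error_statistics(cases)
    print(stats)
    print(sampling_weights(stats))
```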
- FIG. 5B illustrates another method for interactive training and education, in accordance with a representative embodiment.
- the method of FIG. 5B is for an embodiment consistent with the method of FIG. 5A.
- S505, S510 and S515 of FIG. 5A may be performed before the synthetic foreground angiogram image is generated at S520.
- a corresponding score is generated at S522 based on the synthetic foreground angiogram image generated at S520.
- the background image is added at S535 and the process between S525 and S550 in FIG. 5A is performed.
- subjectivity is determined based on the differences identified at S550, and at S555 a measure of the subjectivity is output as feedback to the individual estimating the parameters at S540.
- the measure of subjectivity output at S555 in FIG. 5B is provided as interactive feedback to train the individual to remedy the subjectivity exhibited by their estimates at S540.
- the method of FIG. 5B may be provided as an on-demand service over the internet so as to provide bias training to clinical experts.
- the on-demand service may determine subjectivity reflected in the estimates based on the actual values of the parameters, and may help the clinical experts avoid the expense of having to procure their own hardware and software for such training.
- individual systematic biases on certain parameters may be used by the system 100 for calibration.
- the system 100 may incorporate the individual systematic biases once identified to achieve consensus via consolidating self-consistency for each expert and applying a bias term based on individual user profiles before computing the score.
- a consensus may be developed by the system 100 in the context of a clinical team, so that the consensus may approach the intended score definitions without necessarily forcing a match with the intended score definitions.
- This extension may be used when a group develops a consensus as to score definitions which differs from the intended score definitions.
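- The following sketch illustrates one way such calibration might work, under assumed data: each expert's systematic bias is estimated as the mean signed error against the reference and removed from their current estimate before the debiased scores are averaged into a team consensus.

```python
import statistics

# Sketch of the per-user bias calibration described above. Histories and estimates are
# illustrative assumptions.
def user_bias(history):
    """history: list of (estimated_score, reference_score) pairs for one expert."""
    return statistics.mean(est - ref for est, ref in history)

def consensus_score(current_estimates, histories):
    """Subtract each expert's bias term, then average the debiased scores."""
    debiased = [current_estimates[user] - user_bias(histories[user]) for user in current_estimates]
    return statistics.mean(debiased)

if __name__ == "__main__":
    histories = {
        "expert_A": [(14.0, 12.0), (9.0, 8.0)],   # tends to over-score
        "expert_B": [(10.0, 12.0), (7.0, 8.0)],   # tends to under-score
    }
    estimates = {"expert_A": 18.0, "expert_B": 14.0}
    print(f"consensus score: {consensus_score(estimates, histories):.1f}")
```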
- procedural parameters and viewing angles may be made a function of anatomical parameters. Scoring for certain vessels is often dependent on views. For example, a proximal lesion in the LAD may be scored more easily from one viewpoint than from another, as foreshortening effects have less impact on, for example, angle or length parameter estimates. Using this dependency, the degree to which the parameters can potentially be read from the two-dimensional projection can be controlled, such as based on an imperfect pose of relevant vascular segments with respect to the projection direction. An individual may receive feedback as to the likelihood that such an imperfect view has hampered their scoring, and may learn to compensate for some deviations from optimal views.
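- As a simple geometric illustration of this view dependency, the sketch below computes a foreshortening factor (projected length divided by true length under parallel projection) for an assumed vessel segment direction and two candidate viewing directions.

```python
import numpy as np

# Sketch of the foreshortening effect described above under simple parallel projection.
# The segment direction and viewing directions are illustrative assumptions.
def foreshortening_factor(segment_direction, view_direction):
    """Ratio of projected length to true length under parallel projection along view_direction."""
    d = np.asarray(segment_direction, dtype=float)
    v = np.asarray(view_direction, dtype=float)
    d /= np.linalg.norm(d)
    v /= np.linalg.norm(v)
    cos_angle = abs(np.dot(d, v))                 # 1 when the vessel lies along the ray
    return float(np.sqrt(1.0 - cos_angle ** 2))   # projected length / true length

if __name__ == "__main__":
    segment = [1.0, 0.2, 0.1]                      # assumed proximal segment direction
    for name, view in [("view 1", [0.0, 0.0, 1.0]), ("view 2", [1.0, 0.0, 0.0])]:
        f = foreshortening_factor(segment, view)
        print(f"{name}: projected length is {100 * f:.0f}% of the true length")
```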
- medical learning scenarios or challenges may be selected as groups from pre-set groups or may be generated dynamically according to specific rules given a set of learning goals.
- Human response to challenges presented by an educational tool may be compared to ground truth in a gaming context.
- different clinicians may address the same challenge in parallel, such as in a simultaneous education session or a gaming context wherein the different clinicians compete against one another.
- Feedback may be presented to clinicians and patients individually and/or as a group, or not at all. In some contexts, feedback may not be presented, such as when a patient requests information on a specific condition, and responsive information may be presented directly without a challenges block.
- FIG. 5C illustrates another method for interactive training and education, in accordance with a representative embodiment.
- the method of FIG. 5C is for another embodiment consistent with the method of FIG. 5A.
- the method of FIG. 5C provides an educational tool which may generate different medical learning scenarios with full user control in order to address specific user needs.
- the method of FIG. 5C starts with accepting a selection of a group at S502.
- a group may be selected from a plurality of predetermined groups of sets of parameters, such as for an educational session.
- the method of FIG. 5C may be provided in parallel for multiple students in an educational session over a computer network.
- the method of FIG. 5C then includes the steps from S505 through inputting estimated parameters at S540 in FIG. 5A.
- At S545, a comparison between the estimated values of the parameters and the target values of the parameters (e.g., the actual values of the parameters) is made, and at S550, differences are identified.
- An example of a specific medical learning scenario for FIG. 5C may be the best acquisition parameters to use to maximize visualization coverage of the coronary tree for a specific lesion.
- a prompted input is accepted at S552, and at S554 a grade is assigned to the prompted input.
- the method of FIG. 5C may be used to grade the estimates based on the actual values for the parameters input by a clinical expert in an educational context.
- a measure of the grade is output.
- the method of FIG. 5C may be an embodiment for educating an individual based on their estimates based on parameters input at S540. Through interactive challenges and feedback, the education tool provided by the method of FIG. 5C may be turned into a gaming activity.
- FIG. 5D illustrates another method for interactive training and education, in accordance with a representative embodiment.
- the method of FIG. 5D also provides an educational tool which may generate different medical learning scenarios with full user control in order to address specific user needs.
- the method of FIG. 5D may include the method of FIG. 5A through the display at S535.
- the prompt(s) for input may be provided via the display 180, and may include test questions such as multiple choice questions, questions prompting for free-form input, true/false questions, or other types of prompt(s).
- response(s) to the prompt(s) are obtained, such as via a keyboard, microphone, or touchscreen with softkeys.
- the response(s) are graded, and at S555 a measure of the grading is output.
- the method of FIG. 5D may be provided in an educational context where individuals are being educated using interactive feedback based on angiogram images.
- the education tool provided by the method of FIG. 5D may also be turned into a gaming activity through interactive challenges and feedback.
- an individual may be prompted to label an image with general tags, classifications, or properties.
- the labels may not be identical to the parameters, and need not be expected to be identical to the parameters.
- the labels may be an explicit subset of the parameters used to generate the angiogram images. The grading may simply reflect whether the estimates of the parameter values meet expectations.
- FIG. 5C and FIG. 5D may result in improved education for both clinicians and patients.
- the methods of FIG. 5C and FIG. 5D provide flexible education tools which are not bound to specific facilities such as clinics, and which may be used for continuous training for both clinicians and patients.
- a learner may be provided guidance for orientation through a comparison of a labeled three-dimensional coronary tree and the corresponding two-dimensional angiogram images for different projection scenarios.
- the learner may be challenged to identify the correct labels on two-dimensional angiogram images when viewing the three-dimensional coronary tree or a three-dimensional computerized tomography surrogate.
- a learner may be presented with disease simulations, such as through flow simulation (e.g., stenosis), (pathological) shape simulation, or other types of physiological simulations.
- a learner may be presented with heart motion, including motion of a diseased heart.
- the system 100 may also generate synthetic models in which stenosis is difficult to see under standard projections. In other words, synthetic models in which stenosis is not visible under a certain projection direction may be presented to a learner in an educational scenario.
- Learners may also be presented with random coronary trees, such as with random lesions. The learner may be presented with N projections by the system 100, and challenged to check visibility of lesions.
- the system may generate three-dimensional models of vessels and N projections, and the learner may be challenged to estimate the percentage of the coronary tree which has been imaged within the N projections (score) including whether, for example, one or more specific anatomical feature has been missed.
- the system 100 may guide the learner towards projection parameters to ensure that the three-dimensional models will include 100 % of the coronary tree.
- the learner may be challenged to identify locations of blank spots during a simulated procedure.
- overlap may exist in the two-dimensional angiogram images such that a stenosis present in the three-dimensional image does not appear in the two-dimensional angiogram images because of overlap in multiple projections, and the learner may be challenged to identify the location of the stenosis.
- the learner may be challenged to estimate the number of projections needed to image the entirety of the coronary tree.
- the system 100 may offer projections that will assist the learner to assemble a three-dimensional shape of the coronary tree, and prompt the learner to assemble the coronary tree.
- the learner may be scored based on how much of the coronary tree is imaged.
- the above examples are illustrative of how the system 100 may be used to educate clinicians and patients using parameters based on preset groups or dynamic selections, so that familiarity with angiogram images in context may be improved.
- the system 100 may not particularly require any hardware phantoms or imaging devices, and instead the framework of the system 100 may generate images synthetically for medical educational scenarios.
- the unique link between underlying parameters and the appearance of angiogram images may directly incorporate guideline and score definitions without requiring examples from clinical experts or other empirical data evaluation to be initiated.
- FIG. 6 illustrates a computer system, on which a method for interactive training and education is implemented, in accordance with another representative embodiment.
- the computer system 600 includes a set of software instructions that can be executed to cause the computer system 600 to perform any of the methods or computer- based functions disclosed herein.
- the computer system 600 may operate as a standalone device or may be connected, for example, using a network 601, to other computer systems or peripheral devices.
- a computer system 600 performs logical processing based on digital signals received via an analog-to-digital converter.
- the computer system 600 operates in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment.
- the computer system 600 can also be implemented as or incorporated into various devices, such as a workstation that includes a controller, a stationary computer, a mobile computer, a personal computer (PC), a laptop computer, a tablet computer, or any other machine capable of executing a set of software instructions (sequential or otherwise) that specify actions to be taken by that machine.
- the computer system 600 can be incorporated as or in a device that in turn is in an integrated system that includes additional devices.
- the computer system 600 can be implemented using electronic devices that provide voice, video, or data communication. Further, while the computer system 600 is illustrated in the singular, the term “system” shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of software instructions to perform one or more computer functions.
- the computer system 600 includes a processor 610.
- the processor 610 may be considered a representative example of a processor of a controller and executes instructions to implement some or all aspects of methods and processes described herein.
- the processor 610 is tangible and non-transitory.
- non-transitory is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period.
- non-transitory specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time.
- the processor 610 is an article of manufacture and/or a machine component.
- the processor 610 is configured to execute software instructions to perform functions as described in the various embodiments herein.
- the processor 610 may be a general-purpose processor or may be part of an application specific integrated circuit (ASIC).
- the processor 610 may also be a microprocessor, a microcomputer, a processor chip, a controller, a microcontroller, a digital signal processor (DSP), a state machine, or a programmable logic device.
- the processor 610 may also be a logical circuit, including a programmable gate array (PGA), such as a field programmable gate array (FPGA), or another type of circuit that includes discrete gate and/or transistor logic.
- the processor 610 may be a central processing unit (CPU), a graphics processing unit (GPU), or both. Additionally, any processor described herein may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
- processor encompasses an electronic component able to execute a program or machine executable instruction.
- references to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor.
- a processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems.
- the term computing device should also be interpreted to include a collection or network of computing devices each including a processor or processors. Programs have software instructions performed by one or multiple processors that may be within the same computing device or which may be distributed across multiple computing devices.
- the computer system 600 further includes a main memory 620 and a static memory 630, where memories in the computer system 600 communicate with each other and the processor 610 via a bus 608.
- main memory 620 and static memory 630 may be considered representative examples of a memory of a controller, and store instructions used to implement some or all aspects of methods and processes described herein.
- Memories described herein are tangible storage mediums for storing data and executable software instructions and are non-transitory during the time software instructions are stored therein. As used herein, the term “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period.
- the term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time.
- the main memory 620 and the static memory 630 are articles of manufacture and/or machine components.
- the main memory 620 and the static memory 630 are computer-readable mediums from which data and executable software instructions can be read by a computer (e.g., the processor 610).
- Each of the main memory 620 and the static memory 630 may be implemented as one or more of random access memory (RAM), read only memory (ROM), flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, blu-ray disk, or any other form of storage medium known in the art.
- the memories may be volatile or non-volatile, secure and/or encrypted, unsecure and/or unencrypted.
- Memory is an example of a computer-readable storage medium.
- Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to RAM memory, registers, and register files. References to “computer memory” or “memory” should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
- the computer system 600 further includes a video display unit 650, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid-state display, or a cathode ray tube (CRT), for example.
- the computer system 600 includes an input device 660, such as a keyboard/virtual keyboard or touch-sensitive input screen or speech input with speech recognition, and a cursor control device 670, such as a mouse or touch-sensitive input screen or pad.
- the computer system 600 also optionally includes a disk drive unit 680, a signal generation device 690, such as a speaker or remote control, and/or a network interface device 640.
- the disk drive unit 680 includes a computer-readable medium 682 in which one or more sets of software instructions 684 (software) are embedded.
- the sets of software instructions 684 are read from the computer-readable medium 682 to be executed by the processor 610.
- the software instructions 684 when executed by the processor 610, perform one or more steps of the methods and processes as described herein.
- the software instructions 684 reside all or in part within the main memory 620, the static memory 630 and/or the processor 610 during execution by the computer system 600.
- the computer-readable medium 682 may include software instructions 684 or receive and execute software instructions 684 responsive to a propagated signal, so that a device connected to a network 601 communicates voice, video, or data over the network 601.
- the software instructions 684 may be transmitted or received over the network 601 via the network interface device 640.
- dedicated hardware implementations such as application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays and other hardware components, are constructed to implement one or more of the methods described herein.
- One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules. Accordingly, the present disclosure encompasses software, firmware, and hardware implementations. Nothing in the present application should be interpreted as being implemented or implementable solely with software and not hardware such as a tangible non-transitory processor and/or memory.
- the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing may implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
- interactive training and education provides a framework for generating angiogram images for a given set of score input parameters or for a set of scores directly, which can then be scored by an individual.
- the individual receives feedback on performance and on specific deviations from the known ground truth, so as to stepwise learn how to improve and to become more consistent with the score definition.
- the framework described herein may be fully digital and may be used without requiring a cathlab or even a clinical institute.
- the computer 140 may be provided via the internet or as a cloud-based service to minimize device requirements.
- the system 100 may use a stand-alone software package to provide education for clinicians to improve proficiency in the analysis and interpretation of two-dimensional angiogram images.
- the educational aspects of the teachings herein may also be used by patients to increase awareness of patient conditions.
- An educational tool consistent with the teachings herein may also be provided via an internet website or as a cloud service so that usage is not restricted to a particular device, but can be appreciated from a variety of different devices via a browser or application.
- the educational tool may also or alternatively be provided as an element of one or more software suites associated with a particular device, or in a cathlab, or in particular imaging equipment. In this way, the software may be tailored to the equipment at hand and may make use of real device parameters such as the X-ray imaging geometry to train clinicians on images which are as close as possible to the real-world medical scenarios they will encounter using their own medical equipment.
- inventions of the disclosure may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any particular invention or inventive concept.
- While specific embodiments have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar purpose may be substituted for the specific embodiments shown.
- This disclosure is intended to cover any and all subsequent adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Computational Mathematics (AREA)
- Mathematical Optimization (AREA)
- Medical Informatics (AREA)
- Medicinal Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Algebra (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Mathematical Analysis (AREA)
- General Health & Medical Sciences (AREA)
- Mathematical Physics (AREA)
- Pure & Applied Mathematics (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Theoretical Computer Science (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
An interactive system for medical training includes a memory and a processor. The processor is configured to obtain actual values of parameters associated with a medical scenario; generate a sequence of customized synthetic angiogram images based on the actual values of the parameters; display, to a user, the sequence of customized synthetic angiogram images; interactively request the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images; compare the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, where the target values are defined based on the actual values of the parameters; and provide to the user feedback with respect to the measure of difference.
Description
INTERACTIVE TRAINING AND EDUCATION
BACKGROUND
[0001] Intra-observer variability and inter-observer variability among cardiologists are a known problem when assigning standardized clinical scores to individual patient records and images. [0002] Clinical scores are designed to describe and characterize certain markers of a disease. Typical examples for cardio-vascular disease include a Gensini score used to quantify stenotic lesions, a Medina classification for classifying bifurcations, and a Syntax II score for atherosclerotic burden or disease severity. Some scores may make use of anatomical descriptors as derived from image data, but such scores often extend to more general patient records such as patient demographics, medical history, or comorbidities. Clinical studies link these pooled descriptors to different clinical endpoints such as the risk for major adverse cardiac events (MACE) or mortality. With this established correlation, scores can be used to reason about patient outcome or guide treatment decisions as part of guidelines. As an example, the SYNTAX score is used to decide between CABG and PCI treatment in patients based on their individual mortality risk as described by the score. The validity of such reasoning, however, depends not only on the imaging quality but also on the individual bias of the human reader.
[0003] Interpretation of images such as a coronary angiogram typically suffers from high variance and user bias. To increase consistency with the score definitions and approach a consensus, group trainings including domain experts have proven beneficial, but are not always an efficient solution in clinical practice. It has been shown that a basic tutorial is insufficient to increase the agreement across clinicians, but that consistency can be achieved e.g. via a 6-hour extensive review session with a highly experienced angiographic core lab team. Such a team is not always available, or the necessary human, lab space, or time resources cannot easily be procured. [0004] Therefore, there is a high clinical need for alternative ways to increase consistency with the score definitions so that individuals may approach consensus in a manner that reduces or eliminates bias. Similarly, an interactive feedback system for angiogram images may be used for education independent of identifying and reducing bias.
[0005] Additionally, educational tools have been specifically developed for angiograms. Such tools are aimed at teaching relationships between three-dimensional coronary trees and projected
angiograms. However, the coronary trees used in these tools are datasets which are not readily modified or adapted to specific learning needs. Additionally, such tools do not include an underlying framework that allows manipulation or inclusion of functional/adverse events.
[0006] Accordingly, there is a need for educational tools that can be used in learning aspects of cardiology, such as the technical details relating to generation of three-dimensional coronary trees.
SUMMARY
[0007] According to an aspect of the present disclosure, an interactive system for medical training includes a memory and a processor. The processor is configured to: obtain actual values of parameters associated with a medical scenario; generate a sequence of customized synthetic angiogram images of the medical scenario based on the actual values of the parameters; display, to a user, the sequence of customized synthetic angiogram images; interactively request the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images; compare the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and provide to the user feedback with respect to the measure of difference.
[0008] According to another aspect of the present disclosure, a method for medical training includes obtaining actual values of parameters associated with a medical scenario; generating a sequence of customized synthetic angiogram images of the medical scenario based on the actual values of the parameters; displaying, to a user, the sequence of customized synthetic angiogram images; interactively requesting the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images; comparing the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and providing to the user feedback with respect to the measure of difference.
[0009] According to another aspect of the present disclosure, a non-transitory computer-readable storage medium has stored a computer program comprising instructions. When executed by a processor, the instructions cause the processor to: obtain actual values of parameters associated with a medical scenario; generate a sequence of customized synthetic angiogram images of the
medical scenario based on the actual values of the parameters; display, to a user, the sequence of customized synthetic angiogram images; interactively request the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images; compare the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and provide to the user feedback with respect to the measure of difference.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The example embodiments are best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that the various features are not necessarily drawn to scale. In fact, the dimensions may be arbitrarily increased or decreased for clarity of discussion. Wherever applicable and practical, like reference numerals refer to like elements.
[0011] FIG. 1 illustrates a system for interactive training and education, in accordance with a representative embodiment.
[0012] FIG. 2 illustrates a flow for interactive training and education, in accordance with a representative embodiment.
[0013] FIG. 3 illustrates another flow for interactive training and education, in accordance with a representative embodiment.
[0014] FIG. 4 illustrates a user interface for interactive training and education, in accordance with a representative embodiment.
[0015] FIG. 5A illustrates a method for interactive training and education, in accordance with a representative embodiment.
[0016] FIG. 5B illustrates another method for interactive training and education, in accordance with a representative embodiment.
[0017] FIG. 5C illustrates another method for interactive training and education, in accordance with a representative embodiment.
[0018] FIG. 5D illustrates another method for interactive training and education, in accordance with a representative embodiment.
[0019] FIG. 6 illustrates a computer system, on which a method for interactive training and
education is implemented, in accordance with another representative embodiment.
DETAILED DESCRIPTION
[0020] In the following detailed description, for the purposes of explanation and not limitation, representative embodiments disclosing specific details are set forth in order to provide a thorough understanding of embodiments according to the present teachings. However, other embodiments consistent with the present disclosure that depart from specific details disclosed herein remain within the scope of the appended claims. Descriptions of known systems, devices, materials, methods of operation and methods of manufacture may be omitted so as to avoid obscuring the description of the representative embodiments. Nonetheless, systems, devices, materials, and methods that are within the purview of one of ordinary skill in the art are within the scope of the present teachings and may be used in accordance with the representative embodiments. It is to be understood that the terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. Definitions and explanations for terms herein are in addition to the technical and scientific meanings of the terms as commonly understood and accepted in the technical field of the present teachings.
[0021] It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements or components, these elements or components should not be limited by these terms. These terms are only used to distinguish one element or component from another element or component. Thus, a first element or component discussed below could be termed a second element or component without departing from the teachings of the inventive concept. [0022] As used in the specification and appended claims, the singular forms of terms ‘a’, ‘an’ and ‘the’ are intended to include both singular and plural forms, unless the context clearly dictates otherwise. Additionally, the terms "comprises", and/or "comprising," and/or similar terms when used in this specification, specify the presence of stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
[0023] Unless otherwise noted, when an element or component is said to be “connected to”, “coupled to”, or “adjacent to” another element or component, it will be understood that the element or component can be directly connected or coupled to the other element or component,
or intervening elements or components may be present. That is, these and similar terms encompass cases where one or more intermediate elements or components may be employed to connect two elements or components. However, when an element or component is said to be “directly connected” to another element or component, this encompasses only cases where the two elements or components are connected to each other without any intermediate or intervening elements or components.
[0024] The present disclosure, through one or more of its various aspects, embodiments and/or specific features or sub-components, is thus intended to bring out one or more of the advantages as specifically noted below.
[0025] As described herein, embodiments provide for customized synthetic angiogram images that are generated by modelling an anatomical structure using controlled parameter values. The synthetic angiogram images may be used for a variety of medical training and education purposes in an interactive context. The teachings herein provide a flexible and light-weight simulation framework for generation of angiograms with control over the parameters defining the angiogram characteristics. By modelling the morphology and physiology of vascular trees and subsequently simulating the imaging process using controlled parameter values, embodiments provide control over the angiogram appearance.
[0026] FIG. 1 illustrates a system 100 for interactive training and education, in accordance with a representative embodiment.
[0027] The system 100 in FIG. 1 is a system for interactive training and education and includes components that may be provided together or that may be distributed. The system 100 includes an imaging device 101, a computer 140 and a display 180. The computer 140 includes a controller 150, and the controller 150 includes at least a memory 151 that stores instructions and a processor 152 that executes the instructions. A computer that can be used to implement the computer 140 is depicted in FIG. 6, though a computer 140 may include more or fewer elements than depicted in FIG. 1 or FIG. 6.
[0028] The imaging device 101 may be configured to generate X-ray images. As is known, angiography is a type of X-ray imaging used to check blood vessels. Insofar as blood vessels may not show clearly on a normal X-ray, a contrast medium may be injected into blood for angiography. The contrast medium is radiopaque at X-ray wavelengths and allows highlighting of blood vessels in the resultant X-ray images. Angiogram images obtained from X-ray such as
via the imaging device 101 may be known as angiograms. The imaging device 101 may be provided together with or separate from the computer 140. The imaging device 101 may also be representative of multiple imaging devices including X-ray imaging devices distributed among multiple locations.
[0029] The computer 140 may be configured to generate customized synthetic angiogram images. In some embodiments, the customized synthetic angiogram images may be generated to model a medical scenario (such as a particular disease condition of a vasculature). The synthetic angiogram images may be created based on actual angiogram images from the imaging device 101. The computer 140 may be configured to obtain values for parameters associated with a medical scenario and input the parameter values into a model configured to generate the customized angiogram images based on the parameter values, and then provide the synthetic angiogram images for display on the display 180. These parameter values are referred to herein as “actual values” as they are the values of the parameters actually used to generate the synthetic angiogram images. The parameters may include, for example, parameters associated with anatomical characteristics (e.g., for clinically scoring such anatomical characteristics) in the synthetic angiogram images. Such anatomical characteristics may include presence, location, dimensions, severity, and/or other features of a condition in an anatomy. For example, the parameters may be characteristics associated with a disease condition (e.g., a stenotic lesion) in a vasculature, and the actual values set for the parameters define the dimensions, location, and severity of the disease condition to be generated in the synthetic angiogram images.
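Purely as an illustration of the kind of parameter set described in the preceding paragraph, a minimal sketch in Python follows; the class names, fields, and example values are hypothetical and are not taken from the disclosure.

```python
# Minimal illustrative sketch (names and fields are hypothetical, not from the
# disclosure): one way to organize the "actual values" of parameters for a
# stenotic-lesion scenario before passing them to a synthetic angiogram generator.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class LesionParameters:
    vessel_segment: str            # e.g. "proximal LAD"
    position_mm: float             # location along the vessel centerline
    length_mm: float               # lesion length
    diameter_stenosis_pct: float   # percent reduction of lumen diameter
    eccentricity: float            # 0 = concentric, 1 = fully eccentric

@dataclass
class ScenarioParameters:
    lesions: List[LesionParameters] = field(default_factory=list)
    frame_rate_fps: float = 15.0                          # imaging frame rate
    view_angles_deg: Tuple[float, float] = (30.0, 20.0)   # C-arm angulation

# Example "actual values" used to generate one customized synthetic sequence:
scenario = ScenarioParameters(
    lesions=[LesionParameters("proximal LAD", 12.0, 8.0, 70.0, 0.4)]
)
print(scenario)
```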
[0030] The computer 140 may also be configured to interactively request (e.g., prompt) a user to estimate values for the parameters based on viewing the displayed synthetic angiogram images, and to accept the estimated values. The estimates may be interactively requested from the user visually, for example, via the display 180, or audibly via an audio output, and the estimates may be taken as a form of training or as a form of education with respect to the medical scenario. For example, the synthetic angiogram images may include a stenotic lesion and the user may be interactively requested to estimate a value of the severity of the stenotic lesion (e.g., score the severity of the stenotic lesion).
[0031] The computer 140 may be configured to compare the estimated values of the parameters to target values defined for the parameters. The target values for the parameters may be the actual values of the parameters, or other values (e.g., acceptable ranges) derived from the actual
values of the parameters. For example, if the synthetic angiogram images include a stenotic lesion, the computer may compare the user’s estimated value (e.g., clinical score) for the severity of the lesion to the actual value of the severity of the lesion used to generate the angiogram images. The computer 140 may be configured to determine a measure of difference between the actual values of the parameters and the target values of the parameters, and then output feedback to the user with respect to the measure of the difference. For example, the computer 140 may be configured to output a measure of the difference as feedback visually via the display 180.
[0032] The display 180 may be local to the computer 140 or may be remotely connected to the computer 140. The display 180 may be connected to the computer 140 via a local wired interface such as an Ethernet cable or via a local wireless interface such as a Wi-Fi connection. The display 180 may be interfaced with other user input devices by which users can input instructions, including mouses, keyboards, thumbwheels and so on. The display 180 may be a monitor such as a computer monitor, a display on a mobile device, an augmented reality display, a television, an electronic whiteboard, or another screen configured to display electronic imagery. The display 180 may also include one or more input interface(s) such as those noted above that may connect to other elements or components, as well as an interactive touch screen configured to display prompts or other interactive interfaces to users and collect touch input from users.
[0033] The controller 150 may also include interfaces, such as a first interface, a second interface, a third interface, and a fourth interface. One or more of the interfaces may include ports, disk drives, wireless antennas, or other types of receiver circuitry that connect the controller 150 to other electronic elements. One or more of the interfaces may also include user interfaces such as buttons, keys, a mouse, a microphone, a speaker, a display separate from the display 180, or other elements that users can use to interact with the controller 150 such as to enter instructions and receive output.
[0034] The controller 150 may perform some of the operations described herein directly and may implement other operations described herein indirectly. For example, the controller 150 may indirectly control operations such as by generating and transmitting content to be displayed on the display 180. The controller 150 may directly control other operations such as logical operations performed by the processor 152 executing instructions from the memory 151 based on input received from electronic elements and/or users via the interfaces. Accordingly, the
processes implemented by the controller 150 when the processor 152 executes instructions from the memory 151 may include steps not directly performed by the controller 150.
[0035] FIG. 2 illustrates a flow for interactive training and education, in accordance with a representative embodiment.
[0036] The flow in FIG. 2 starts with inputting an input sample and/or ground truth to a synthetic angiogram generator at S210 for modelling a medical scenario. The synthetic angiogram generator may be the computer 140, and may be implemented using a computer program executed by the processor 152 of the controller 150. For example, a computer program may be opened so that a user can provide input to a model consisting of one or more algorithms included in the computer program. For example, the user may input boundary conditions from which a score and input parameters may be randomly sampled. An example of an input boundary condition is a user providing instructions to generate synthetic angiogram images corresponding to scores in a range between a low score of A and a high score of B. Examples of the input sample and/or ground truth may include one or more score values. In other embodiments, a first user may input the score values as the actual values of parameters associated with a medical scenario, and these actual values may be used to generate a customized plurality of synthetic angiogram images for presentation to a second user, such as a clinical expert. The synthetic angiogram generator may output an angiography sequence of two-dimensional X-ray images based on a set of the actual values of the parameters, whether input directly by a user or generated based on input boundary conditions. The two-dimensional X-ray images may comprise a sequence of synthetic angiogram images simulating movement of a vasculature vessel tree. [0037] A model implemented by the computer 140 to generate angiogram images may be rules-based, but may also include data-driven components such as a neural network trained to generate synthetic backgrounds or simply a statistical principal component analysis (PCA) model which defines spatial eigenmodes of the heart. In some embodiments, a rules-based scoring model may be employed as a forward model to convert scores into input parameters. An optimization may be employed subsequently to optimize parameters under boundary conditions that vary case-by-case until the score computed by the forward model fits the desired target score. Accordingly, a “model” as the term is used herein may include multiple models including models of different types. For example, the parameters may be anatomical characteristics and the rules-based scoring model may be for quantifying anatomical characteristics of disease as the actual values of the
parameters.
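The forward-model-and-optimization idea in paragraph [0037] might be sketched as follows; the scoring rule, parameter names, and boundary conditions are illustrative assumptions rather than an actual clinical score definition.

```python
# Rough sketch of the forward-model idea above (the scoring rule and bounds are
# illustrative placeholders, not an actual clinical score definition): sample
# parameters within boundary conditions until the rules-based forward score
# matches the desired target score.
import random

def forward_score(stenosis_pct: float, lesion_count: int) -> int:
    """Toy rules-based score: tighter stenoses are weighted more heavily."""
    per_lesion = 1 if stenosis_pct < 50 else 2 if stenosis_pct < 75 else 4
    return per_lesion * lesion_count

def fit_parameters_to_score(target_score, bounds, max_iters=100000):
    """Random search over the boundary conditions until the forward score fits."""
    for _ in range(max_iters):
        stenosis_pct = random.uniform(*bounds["stenosis_pct"])
        lesion_count = random.randint(*bounds["lesion_count"])
        if forward_score(stenosis_pct, lesion_count) == target_score:
            return {"stenosis_pct": stenosis_pct, "lesion_count": lesion_count}
    raise ValueError("no parameter set found within the given boundary conditions")

params = fit_parameters_to_score(
    target_score=8,
    bounds={"stenosis_pct": (30.0, 95.0), "lesion_count": (1, 3)},
)
print(params)  # e.g. {'stenosis_pct': 82.4, 'lesion_count': 2}
```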
[0038] At S215, the actual values of the parameters are input to a bias identifier. For example, the input sample and/or ground truth may be used as input to a bias identifier. The bias identifier may be the computer 140, and may be implemented using the same or a different computer program than the computer program used for the synthetic angiogram generator. Examples of the parameters include properties and parameters which are relevant for clinical scores such as lesions (length, shape, stenosis significance, calcifications, eccentricity ... ), bifurcations, view parameters etc.
[0039] At S220, the synthetic angiogram generator generates a synthetic angiogram. The synthetic angiogram may be displayed on the display 180. The synthetic angiogram may be displayed as a form of prompting to a clinical expert. The synthetic angiogram generator generates synthetic angiogram images based on the actual values for a set of input parameters. These input parameters may be morphological parameters, physiological parameters, procedural parameters, or imaging parameters, and may govern the appearance of the resulting image sequence of synthetic angiogram images. The input parameters also may form the ground truth for the scoring output prompted from an individual. For example, the parameters may be associated with a vasculature, and the actual values are set to generate the synthetic angiogram showing the vasculature in a particular disease state (e.g., with lesions of a certain severity).
[0040] In some embodiments, the synthetic angiogram generator may start from a score value and use the score value to generate a set of plausible parameters which yield the score value. For example, the score value may be a severity level for lesion(s) in a vasculature and the synthetic angiogram generator defines values for parameters associated with the vasculature which generate the synthetic angiogram with lesion(s) of the severity level corresponding to the score value. [0041] At S230, one or more angiogram image(s) are provided to the clinical expert to score and/or estimate values for the parameters. The clinical expert is prompted to enter a score or other value(s) for the parameters for each angiogram image. In some embodiments, the clinical expert may be prompted to estimate the score or actual values of the parameters used to generate the angiogram image and resulting in the score. For example, the user may be prompted to enter a score for lesion(s) in the synthetic angiogram that was generated based on setting actual values for certain parameters of a vasculature.
[0042] At S240, the clinical expert assigns estimates based on viewing the synthetic angiogram
image. The estimates may be estimates of the parameters, or estimates of other parameters relating to the synthetic angiogram and derived from the parameters used to generate the synthetic angiogram image. The estimates may be a score reflecting the combination of parameters, such as a score indicating a grading of the parameters of the synthetic angiogram image as to the severity of a medical condition (e.g., quantify severity of stenotic lesions or severity of other disease condition). The estimated values of the parameters may be input to the computer 140, and may include estimates of a score value or score input parameters corresponding to the synthetic angiogram. The angiogram image may be used as an interactive scoring tool so that the clinical expert may read back the relevant input parameters for the score from the angiogram image or the score directly.
[0043] At S250, the estimated values of the parameters are input to the bias identifier. The bias identifier may compare them with target values of the parameters determined based on the actual values of the parameters from S215, and output interactive feedback to the clinical expert at S260. The target values may comprise the actual values, or may comprise other values of the parameters relating to the angiogram image and derived from the actual values of the parameters. The resulting set of values and/or the score value from the estimates provided by the individual are compared with the target values of the parameters or the corresponding reference which served as an input to the synthetic angiogram generator to compute the user error and give feedback on how to achieve more consistency.
[0044] The flow of FIG. 2 may be a general overview of a scoring trainer system. Score input parameters may be sampled from distributions or generated from a score input value. The score input parameters are input into a synthetic angiogram generator, which produces a coronary angiogram to be scored by the clinical expert. The clinical expert estimates a score for the image based on the parameters and hence assigns a score to the image or parts of the image such as individual stenoses. The score estimated by the human is compared to the actual score of the parameters or another target score based on the actual score, and the error in the estimation is fed back to the individual as a reflection of bias to improve their scoring. The feedback may be provided after each input of an estimated score of an image or after scoring several images, so as to identify trends based on scoring several images.
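A minimal sketch of the score-level feedback loop described above could look like the following; the trend rule and the example numbers are assumptions for illustration only.

```python
# Minimal sketch (assumed structure, not the disclosed implementation) of the
# score-level feedback described above: per-image error plus a trend over
# several scored images, where a consistent signed error suggests a bias.
def score_errors(estimated_scores, target_scores):
    return [est - tgt for est, tgt in zip(estimated_scores, target_scores)]

def bias_feedback(estimated_scores, target_scores):
    errors = score_errors(estimated_scores, target_scores)
    mean_error = sum(errors) / len(errors)
    trend = ("over-scoring" if mean_error > 0
             else "under-scoring" if mean_error < 0 else "no systematic bias")
    return {"per_image_error": errors, "mean_error": mean_error, "trend": trend}

# Example: feedback after three images; the reader over-scores by ~2 points.
print(bias_feedback(estimated_scores=[10, 7, 14], target_scores=[8, 6, 11]))
```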
[0045] In some embodiments, values of parameters may be input to the synthetic angiogram generator at S210, and then a score may be prompted from the individual at S240. In alternative
embodiments, a score may be input to the synthetic angiogram generator at S210, and estimated values for parameters associated with the score may be prompted from the individual at S240. [0046] FIG. 3 illustrates another flow for interactive training and education, in accordance with a representative embodiment.
[0047] The flow in FIG. 3 may be performed by the computer 140 in FIG. 1 for synthetic angiogram generation. A hybrid modelling approach may be used to generate synthetic angiography sequences with a contrast-filled vessel tree in the foreground and typical attenuation patterns in the background. The flow in FIG. 3 starts with providing information of a catheter tip position at S311 for non-contrast frames. The non-contrast frames are provided at S312 for generating a synthetic angiogram and labels at S351. The information provided at S311 is provided to generate a vessel tree at S321. The vessel tree is generated at S321 as a three-dimensional representation such as a triangular mesh. This morphological description and a physiological description of the vessel tree on an epicardial surface mesh are provided to a contrast propagator at S331. The morphological tree description may include vessel labels and/or stenoses. The physiological tree description may include volume flow and/or velocity. The morphological tree description may include descriptions of morphological parameters, and the physiological description of the vessel tree may include physiological parameters. At the contrast propagator the 3D representation is voxelized into one or more discretized image volumes. [0048] The morphological description of the vascular tree is either generated at S331 from scratch or taken from some other three-dimensional imaging modality such as computed tomography. The morphological description defines, with a centerline and a lumen hull, the general anatomy of the vessel tree. When synthetically grown, a sample from a statistical atlas may control the results including parameters such as: vessel lengths, vessel diameters, branching point locations and angles, lumen contour shape at each centerline point, and also abnormalities such as vessel tortuosity or stenoses. The morphological description may have a time factor to allow for heart motion. The physiological parameters either are sampled at S331 from empirical distributions such as the blood volume flow in the left main artery, or computed based on the existing morphological description. Examples of parameters computed based on the existing morphological description might include volume flow after branching, bolus front wave velocity based on cross-sectional slab volume, time of contrast arrival, etc. In another embodiment, these physiological parameters are set at S331 by a simulation module such as a lumped one-dimensional model, for example the
Windkessel model. A lumped one-dimensional model starts from boundary conditions such as the aortic pressure and microvascular resistances in the myocardium and simulates the remaining parameters such as volume flow at each vessel segment based on the underlying model which also captures the tree morphology.
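As a strongly simplified illustration of such a lumped model, the following sketch treats each branch as a single lumped resistance driven by the aortic-to-venous pressure drop; the resistance values and units are arbitrary assumptions, not a physiological fit or the disclosed simulation module.

```python
# Highly simplified sketch of the lumped-model idea (illustrative values, not a
# physiological fit): from an aortic pressure boundary condition and lumped
# resistances per branch (epicardial segment plus downstream microvasculature),
# estimate a volume flow per vessel segment with an Ohm's-law analogue.
def segment_flows(aortic_pressure_mmHg, venous_pressure_mmHg, branch_resistances):
    """Flow per branch = pressure drop / lumped resistance of that branch."""
    dp = aortic_pressure_mmHg - venous_pressure_mmHg
    return {name: dp / r for name, r in branch_resistances.items()}

flows = segment_flows(
    aortic_pressure_mmHg=90.0,
    venous_pressure_mmHg=5.0,
    branch_resistances={"LAD": 1.0, "LCx": 1.3, "RCA": 1.1},  # arbitrary units
)
print(flows)  # e.g. {'LAD': 85.0, 'LCx': 65.4, 'RCA': 77.3}
```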
[0049] The vascular tree description is then voxelized in three dimensions into a discretized volume block by marking volume grid points lying within the vascular mesh. Each cross-sectional slab along the centerline may be given a unique integer value in the volume, so that the propagator module can assign the contrast filling status for any specific time point using look-up tables generated from the physiological description. As a result, the volume-indexed three-dimensional rendering result may be reused for any time point of the contrast propagation. Necessary input parameters may include the arterial input function (AIF) of the injected contrast bolus, the imaging frame rate (e.g., 15 frames/s) and the information at which time the contrast front wave arrives at each individual slab after it has passed the tree root (t = 0 s) as computed from the physiological parameters.
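The slab-indexed look-up-table idea might be sketched as follows; the arterial input function, slab count, and timing values are assumptions chosen only to illustrate how a pre-computed table lets the same voxelization be reused for every frame.

```python
# Hedged sketch of the look-up-table idea: each centerline slab carries an
# integer label in the voxelized volume and a pre-computed contrast arrival
# time; for a given frame time, the slab concentration is the arterial input
# function (AIF) delayed by that arrival time. Shapes and the AIF are assumptions.
import numpy as np

def contrast_lookup(arrival_times_s, aif, frame_rate_fps, n_frames):
    """Return an (n_frames, n_slabs) table of contrast concentration per slab."""
    table = np.zeros((n_frames, len(arrival_times_s)))
    for f in range(n_frames):
        t = f / frame_rate_fps                  # frame time after t = 0 s at the tree root
        for s, t_arrive in enumerate(arrival_times_s):
            if t >= t_arrive:
                table[f, s] = aif(t - t_arrive) # delayed arterial input function
    return table

aif = lambda t: (1 - np.exp(-t / 0.3)) * np.exp(-t / 4.0)   # toy wash-in / wash-out
arrival = np.linspace(0.0, 1.2, 25)                         # distal slabs fill later
table = contrast_lookup(arrival, aif, frame_rate_fps=15, n_frames=45)
# Rendering a frame then only maps slab index -> table[frame, slab] inside the
# volume-indexed voxelization, so the three-dimensional rendering can be reused.
```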
[0050] At S341, the contrast propagator provides a dynamic description to an X-ray projector. A contrast medium is propagated through the three-dimensional rendering of the vessel tree with a given arterial input function and frame rate. The contrast propagator outputs a dynamic description to the X-ray projector. For example, the three-dimensional voxelization of the vessel tree generated at S321 may be provided as a sequence of three-dimensional images for a dynamic volume-based description from the contrast propagator which is then passed through the X-ray projector at S341. The sequence of three-dimensional images is not limited only to the vasculature, but may also include surrounding objects which may be modelled by input parameters but which are not the main clinical focus of the estimates based on parameters prompted according to the teachings herein. The surrounding objects may include the diaphragm, the heart shadow, and other structures which move in the coronary context. Contrast medium contained in the vessels will also move across the sequence, displaying physiological properties of the tree. The X-ray projector may comprise a projector module implemented by the computer 140. The X-ray projector may generate a sequence of two-dimensional foreground projections including images and labels. In a final step, the flow of FIG. 3 may include adding a background to the foreground. The background may originate from a data-driven path. For example, the computer may either draw real images from a clinical data pool or may generate synthetic backgrounds from a data
distribution. For example, a data distribution may be modelled as a generative adversarial network (GAN) to result in the synthetic backgrounds added to the foregrounds in FIG. 3. [0051] At S351, the X-ray projector outputs synthetic angiogram images and labels. The X-ray projector outputs the synthetic angiogram and images based on the sequence of two-dimensional foreground projections from the X-ray projector and based on the non-contrast frames from S312.
[0052] In FIG. 3, a sequence of three-dimensional volumes shows partially contrast-filled vascular trees depending on the related time index. To obtain the final two-dimensional angiogram foreground, the sequence may be passed to a projection module, projecting the three-dimensional volume onto the two-dimensional detector plane. The flow of FIG. 3 is again governed by procedural parameters such as the position of a C-arm isocenter relative to the vascular tree or the viewing angles. A background image is added to the foreground. This background can either be taken from an available pool of real backgrounds based on angiograms without visible contrast medium, or may be generated by a generative model which has learned from real-world data to generate plausible-looking angiogram backgrounds.
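A much-simplified sketch of the projection and background-composition step follows; it uses a parallel-beam sum through a rotated volume rather than the cone-beam geometry of a real C-arm, so it illustrates the data flow only.

```python
# Simplified sketch of the projection and composition step (a parallel-beam sum
# through the rotated volume; real C-arm geometry is a cone-beam projection
# about an isocenter, so this only illustrates the data flow, not the physics).
import numpy as np
from scipy.ndimage import rotate

def project_volume(volume, viewing_angle_deg):
    """Rotate the volume by the viewing angle and integrate along the beam axis."""
    rotated = rotate(volume, viewing_angle_deg, axes=(0, 2), reshape=False, order=1)
    return rotated.sum(axis=0)                 # line integrals -> 2-D attenuation image

def compose_frame(foreground, background, contrast_weight=1.0):
    """Add the synthetic vessel foreground onto a real or generated background."""
    return background + contrast_weight * foreground

volume = np.zeros((64, 64, 64))
volume[20:44, 30:34, 30:34] = 1.0              # stand-in for a contrast-filled segment
frame = compose_frame(project_volume(volume, viewing_angle_deg=30.0),
                      background=np.full((64, 64), 0.1))
print(frame.shape)  # (64, 64)
```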
[0053] FIG. 4 illustrates a user interface for interactive training and education, in accordance with a representative embodiment.
[0054] The user interface 481 in FIG. 4 shows stenosis scoring and modelling. For example, on the lower left a definition of a Gensini score may be provided. The Gensini score (GS) is a tool for assessing the severity and complexity of coronary artery diseases. A Gensini score is used to quantify angiographic atherosclerosis. The definition of the Gensini score may include a graphic with two columns. A left column may include a stack of icons and corresponding percentages for a reduction of lumen diameter. The right column may include a stack of icons and corresponding scores for severity of eccentric plaque. For example, a zero Gensini score indicates absence of atherosclerotic disease. Stenoses are rated by a Gensini score according to their location in the vessel, their concentricity or eccentricity. On the right, a morphological mesh model of the vascular tree is provided. The vascular tree is defined by a centerline and a pair of cross-sectional vectors (u, v) orthogonal to the centerline. The pair of cross-sectional vectors spans a cross-sectional coordinate system in which the lumen contour is defined as dots. The narrowing may be adjusted by the average distance between contour and centerline point over multiple cross-sections, and the eccentricity may be modelled by the centerline point position within the closed
lumen contour. An example implementation for introducing stenoses may use spatial eigenmodes added on top of the cylindrical vessel mesh.
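One way to sketch the narrowing and eccentricity controls described above is shown below; the Gaussian narrowing profile and the numeric defaults are illustrative assumptions, not the disclosed eigenmode implementation.

```python
# Illustrative sketch (assumptions, not the disclosed implementation): lumen
# contours in the (u, v) cross-sectional plane, narrowed by a Gaussian profile
# along the centerline to model a stenosis, with eccentricity introduced as an
# offset of the contour centre relative to the centerline point.
import numpy as np

def lumen_contours(n_sections=50, n_points=32, nominal_radius=1.5,
                   stenosis_centre=25, stenosis_width=5.0,
                   severity=0.7, eccentricity=0.5):
    """Return an (n_sections, n_points, 2) array of contour points in (u, v)."""
    phi = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    sections = []
    for s in range(n_sections):
        # Gaussian narrowing: radius reduced by `severity` at the stenosis centre.
        narrowing = severity * np.exp(-0.5 * ((s - stenosis_centre) / stenosis_width) ** 2)
        radius = nominal_radius * (1.0 - narrowing)
        # Eccentric plaque: shift the contour centre away from the centerline point.
        offset_u = eccentricity * (nominal_radius - radius)
        sections.append(np.stack([offset_u + radius * np.cos(phi),
                                  radius * np.sin(phi)], axis=-1))
    return np.array(sections)

contours = lumen_contours()
# Read back percent diameter stenosis from the mean contour-to-centre distance:
radii = np.linalg.norm(contours - contours.mean(axis=1, keepdims=True), axis=-1).mean(axis=1)
print(f"diameter stenosis ~ {100 * (1 - radii.min() / radii.max()):.0f} %")  # ~70 %
```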
[0055] In FIG. 4, image A and image B may show a left CA and a right CA. Metrics shown below image A and Image B include LAD scores, LCx scores, RCA scores and a final score. The information shown in FIG. 4 may be part of the actual values for parameters for synthetic angiogram images corresponding to image A and image B, though more and different values and/or parameters than shown may be required as input to a model to generate the angiogram image. That is, the number and type of parameters input to the generator may be more numerous and varied than those set by a user. For example, a user may set a desired range, and the generator may select values within the range to generate an angiogram image. In some embodiments, the user does not set input parameters, and instead the generator may randomly sample input parameters from ranges so that the sampled input parameters or corresponding scores are the targets for the clinical expert to estimate. Also in FIG. 4, Image 1 underneath the MLCA/LAD label may be a standardized image provided as an icon to guide the user. Image 2 underneath the CFx label may also be a standardized image provided as an icon to guide the user. Image 3 underneath the RCA label may also be a standardized image provided as an icon to guide the user.
[0056] FIG. 5A illustrates a method for interactive training and education, in accordance with a representative embodiment.
[0057] At S505, actual values of the parameters are input. For example, the actual values may be input to the computer 140 in order to start a process for measuring biases of professionals in a manner that will help train the professionals to reduce or eliminate such biases. The actual values may be parameters used to generate one or more angiogram image. The actual values may include, for example, clinical scores for anatomical characteristics in the synthetic angiogram images. The anatomical characteristics may include, for example, presence, location, and dimensions of an anatomical characteristic.
[0058] At S510, a three-dimensional angiogram image is generated based on the actual values of the parameters. For example, the parameters may comprise anatomical characteristics of disease and a rules-based scoring model may be used for quantifying the anatomical characteristics of disease as the actual values of the parameters, though the model used to generate the three-dimensional angiogram images may also include data-driven components.
[0059] At S515, the three-dimensional angiogram image from S510 is projected onto a two-dimensional plane. Synthetic angiogram images used for the training and education comparisons and scoring may each comprise a synthetic two-dimensional X-ray image, and may be generated based on at least one synthetic three-dimensional angiogram image from S510. In other words, the projection at S515 may result in generating a corresponding synthetic angiogram image among the synthetic angiogram images for each synthetic three-dimensional angiogram image projected onto the two-dimensional detector plane.
[0060] At S520, a synthetic foreground angiogram image is generated based on the projection at S515. The synthetic foreground angiogram image is generated as a two-dimensional image.
[0061] At S525, a background angiogram image is added. The background angiogram image is added to the two-dimensional synthetic foreground angiogram image generated at S520.
[0062] The two-dimensional synthetic angiogram images may comprise a sequence of synthetic angiogram images simulating movement of a vasculature vessel tree. At S530, a determination is made as to whether the generation of two-dimensional angiogram images is complete.
[0063] If the image(s) are not complete (S530 = No), at S537 the parameters and/or actual values of the parameters are updated and the process returns to S510.
[0064] If the image(s) are complete (S530 = Yes), at S535, the image(s) are displayed. Multiple images may be displayed simultaneously or sequentially.
[0065] At S540, estimated parameters are input. The estimated parameters may be input by a clinician or other individual based on the image(s) displayed at S535. For example, images may be displayed one at a time at S535, and the individual may be prompted to enter estimated parameters corresponding to the displayed images at S540. Accordingly, S535 and S540 may be performed as a loop until the estimated parameters are input for the last image.
[0066] A human reader may be prompted to assign a score to the synthetically generated angiogram image at S540. The individual may determine the relevant set of parameters from the image sequence, and this may include measuring/estimating, for example, lengths and angles from the image sequence which can be enabled by interactive tooling. The score and the input parameters for the score may then be passed to a feedback module.
[0067] At S545, the estimated values of parameters for each image are compared to target values for the parameters determined based on the actual values for the parameters for each image. Target values may comprise the actual values, or may be derived from the actual values used to
generate each synthetic angiogram image.
[0068] At S550, differences between the estimated values and the actual values of the parameters are identified. The differences may reflect a bias insofar as the actual values may be considered a ground truth and the estimated values may be considered subjective estimates that will vary for different individuals.
[0069] At S555, a measure is output. For example, the measure may be a metric of bias identified based on the differences between the actual values and the estimated values of the parameters for one or more images. The measure may be output at S555 via a feedback module. For example, the computer 140 may include a program that takes the estimated values and the actual values of the parameters and computes an error with respect to the ground truth reference which was used to generate the synthetic image sequence. Based on this error on the score and its parameters, feedback is provided as a measure output via, for example, the display 180. The measure may include a visualization of errors, and may include simple identification of errors or may include a measure of the extent of the errors. In some embodiments, the generation of new angiograms may be influenced by these error statistics so as to focus additional variation on the parameters with the greatest discrepancy from ground truth.
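A minimal sketch of such error-driven feedback and case selection might look like the following; the parameter names and the simple normalization used for the sampling weights are assumptions for illustration.

```python
# Hedged sketch of the adaptive idea above: per-parameter errors against the
# ground-truth values, plus sampling weights for the next batch of synthetic
# angiograms that emphasise the parameters with the largest discrepancy.
# Parameter names are illustrative assumptions.
def parameter_errors(estimated, actual):
    return {name: estimated[name] - actual[name] for name in actual}

def sampling_weights(errors):
    """Weight future case generation towards the parameters with the largest error."""
    magnitudes = {name: abs(err) for name, err in errors.items()}
    total = sum(magnitudes.values()) or 1.0
    return {name: mag / total for name, mag in magnitudes.items()}

errors = parameter_errors(
    estimated={"stenosis_pct": 55.0, "lesion_length_mm": 9.0, "bifurcation_angle_deg": 70.0},
    actual={"stenosis_pct": 70.0, "lesion_length_mm": 8.0, "bifurcation_angle_deg": 65.0},
)
print(errors)                    # the reader under-estimates stenosis severity by 15
print(sampling_weights(errors))  # next cases emphasise stenosis severity
```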
[0070] FIG. 5B illustrates another method for interactive training and education, in accordance with a representative embodiment.
[0071] The method of FIG. 5B is for an embodiment consistent with the method of FIG. 5A. For example, S505, S510 and S515 of FIG. 5A may be performed before the synthetic foreground angiogram image is generated at S520.
[0072] In the method of FIG. 5B, a corresponding score is generated at S522 based on the synthetic foreground angiogram image generated at S520. The background image is added at S525 and the process between S525 and S550 in FIG. 5A is performed. However, at S551, subjectivity is determined based on the differences identified at S550, and at S555 a measure of the subjectivity is output as feedback to the individual estimating the parameters at S540. The measure of subjectivity output at S555 in FIG. 5B is provided as interactive feedback to train the individual to remedy the subjectivity exhibited by their estimates at S540.
[0073] The method of FIG. 5B may be provided as an on-demand service over the internet so as to provide bias training to clinical experts. The on-demand service may determine subjectivity reflected in the estimates based on the actual values of the parameters, and may help the clinical
experts avoid the expense of having to procure their own hardware and software for such training.
[0074] In an extension of the method of FIG. 5B, individual systematic biases on certain parameters may be used by the system 100 for calibration. The system 100 may incorporate the individual systematic biases once identified to achieve consensus via consolidating self-consistency for each expert and applying a bias term based on individual user profiles before computing the score. A consensus may be developed by the system 100 in the context of a clinical team, so that the consensus may approach the intended score definitions without necessarily forcing a match with the intended score definitions. This extension may be used when a group develops a consensus as to score definitions which differs from the intended score definitions.
[0075] In another extension of the method of FIG. 5B, procedural parameters and viewing angles may be made a function of anatomical parameters. Scoring for certain vessels is often dependent on views. For example, a proximal lesion in the LAD may be scored more easily from one viewpoint than another, as foreshortening effects have less impact on, for example, angle or length parameter estimates. Using this dependency, the degree of how well the parameters can potentially be read from the two-dimensional projection can be controlled, such as based on an imperfect pose of relevant vascular segments with respect to the projection direction. An individual may receive feedback as to the likelihood that such an imperfect view has hampered their scoring, and may learn to compensate for some deviations from optimal views.
[0076] The methods of FIG. 5C and FIG. 5D are explained next. According to embodiments based on these methods, medical learning scenarios or challenges may be selected as groups from pre-set groups or may be generated dynamically according to specific rules given a set of learning goals. Human response to challenges presented by an educational tool may be compared to ground truth in a gaming context. Additionally, different clinicians may address the same challenge in parallel, such as in a simultaneous education session or a gaming context wherein the different clinicians compete against one another. Feedback may be presented to clinicians and patients individually and/or as a group, or not at all. In some contexts, feedback may not be presented, such as when a patient requests information on a specific condition, and responsive information may be presented directly without a challenges block.
[0077] FIG. 5C illustrates another method for interactive training and education, in accordance
with a representative embodiment.
[0078] The method of FIG. 5C is for another embodiment consistent with the method of FIG. 5A. The method of FIG. 5C provides an educational tool which may generate different medical learning scenarios with full user control in order to address specific user needs. The method of FIG. 5C starts with accepting a selection of a group at S502. A group may be selected from a plurality of predetermined groups of sets of parameters, such as for an educational session. For example, the method of FIG. 5C may be provided in parallel for multiple students in an educational session over a computer network.
[0079] The method of FIG. 5C then includes the steps from S505 through inputting estimated parameters at S540 in FIG. 5A. At S545, a comparison between the estimated values of the parameters and target values of the parameters (e.g., the actual values of the parameters) is made, and at S550, differences are identified. An example of a specific medical learning scenario for FIG. 5C may be identifying the best acquisition parameters to use to maximize visualization coverage of the coronary tree for a specific lesion.
[0080] However, in the method of FIG. 5C, a prompted input is accepted at S552, and at S554 a grade is assigned to the prompted input. The method of FIG. 5C may be used to grade the estimates based on the actual values for the parameters input by a clinical expert in an educational context. At S555, a measure of the grade is output.
[0081] The method of FIG. 5C may be an embodiment for educating an individual based on their estimates based on parameters input at S540. Through interactive challenges and feedback, the education tool provided by the method of FIG. 5C may be turned into a gaming activity.
[0082] FIG. 5D illustrates another method for interactive training and education, in accordance with a representative embodiment.
[0083] The method of FIG. 5D is for another embodiment consistent with the method of FIG.
5A. The method of FIG. 5D also provides an educational tool which may generate different medical learning scenarios with full user control in order to address specific user needs. The method of FIG. 5D may include the method of FIG. 5A through the display at S535.
[0084] At S537, one or more prompt(s) for input are provided. The prompt(s) for input may be provided via the display 180, and may include test questions such as multiple choice questions, questions prompting for free-form input, true/false questions, or other types of prompt(s).
[0085] At S542, response(s) to the prompt(s) are obtained, such as via a keyboard, microphone,
or touchscreen with softkeys. At S552, the response(s) are graded, and at S555 a measure of the grading is output. In other words, the method of FIG. 5D may be provided in an educational context where individuals are being educated using interactive feedback based on angiogram images. The education tool provided by the method of FIG. 5D may also be turned into a gaming activity through interactive challenges and feedback.
[0086] As an example of the embodiments of FIG. 5C and FIG. 5D, an individual may be prompted to label an image with general tags, classifications, or properties. The labels may not be identical to the parameters, but also may not be expected to be identical to the parameters. Alternatively, the labels may be an explicit subset of the parameters used to generate the angiogram images. The grading may simply reflect whether the estimates of the parameter values meet expectations.
[0087] Insofar as developing proficiency in interpreting angiogram images is typically a slow process, the educational aspects of the methods in FIG. 5C and FIG. 5D may result in improved education for both clinicians and patients. The methods of FIG. 5C and FIG. 5D provide flexible education tools which are not bound to specific facilities such as clinics, and which may be used for continuous training for both clinicians and patients.
[0088] As an example embodiment for medical educational scenarios provided by the methods of FIG. 5C and FIG. 5D, a learner may be provided guidance for orientation through a comparison of a labeled three-dimensional coronary tree and the corresponding two-dimensional angiogram images for different projection scenarios. The learner may be challenged to identify the correct labels on two-dimensional angiogram images when viewing the three-dimensional coronary tree or a three-dimensional computerized tomography surrogate.
[0089] As another set of example embodiments, a learner may be presented with disease simulations, such as through flow simulation (e.g., stenosis), (pathological) shape simulation, or other types of physiological simulations.
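For the flow-simulation example, one simple stand-in (assumed here, not specified by the disclosure) is a Poiseuille-type resistance model, in which the resistance of a vessel segment scales with the inverse fourth power of its radius. The segment dimensions and blood viscosity below are illustrative assumptions only.

```python
# Illustrative Poiseuille resistance for a stenotic segment (resistance ~ 1/r^4).
# Dimensions and viscosity are assumed teaching values, not disclosed parameters.
import math

def poiseuille_resistance(length_m: float, radius_m: float, viscosity_pa_s: float = 3.5e-3) -> float:
    return 8.0 * viscosity_pa_s * length_m / (math.pi * radius_m ** 4)

healthy = poiseuille_resistance(length_m=0.02, radius_m=1.5e-3)
stenotic = poiseuille_resistance(length_m=0.02, radius_m=0.75e-3)  # 50% diameter stenosis
print(f"resistance increase: x{stenotic / healthy:.0f}")  # ~x16 for half the radius
```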
[0090] In another set of example embodiments, a learner may be presented with heart motion, including motion of a diseased heart.
[0091] The system 100 may also generate synthetic models in which stenosis is difficult to see under standard projections. In other words, synthetic models in which stenosis is not visible under a certain projection direction may be presented to a learner in an educational scenario.
[0092] Learners may also be presented with random coronary trees, such as with random lesions. The learner may be presented with N projections by the system 100, and challenged to check the visibility of the lesions.
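As an illustrative, non-limiting way of checking lesion visibility across projections, the sketch below scores the foreshortening of a randomly placed lesion segment for each view direction. The 0.8 threshold and the visibility criterion are assumptions, since the disclosure does not define a specific visibility test.

```python
# Illustrative foreshortening test for whether a random lesion segment is likely
# visible in a given projection (criterion and threshold are assumptions).
import numpy as np

rng = np.random.default_rng(0)

def foreshortening(p0: np.ndarray, p1: np.ndarray, view_dir: np.ndarray) -> float:
    """Ratio of projected segment length to true length (1.0 = no foreshortening)."""
    seg = p1 - p0
    view_dir = view_dir / np.linalg.norm(view_dir)
    projected = seg - np.dot(seg, view_dir) * view_dir   # component in detector plane
    return np.linalg.norm(projected) / np.linalg.norm(seg)

# Random lesion segment on a toy centerline and N random view directions.
lesion_start = rng.uniform(-20, 20, size=3)
lesion_end = lesion_start + rng.uniform(-5, 5, size=3)
views = rng.normal(size=(5, 3))
visible = [foreshortening(lesion_start, lesion_end, v) > 0.8 for v in views]
print(visible)  # the learner could be challenged to pick the views where this is True
```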
[0093] In some embodiments, the system may generate three-dimensional models of vessels and N projections, and the learner may be challenged to estimate the percentage of the coronary tree which has been imaged within the N projections (a coverage score), including whether, for example, one or more specific anatomical features have been missed. The system 100 may guide the learner towards projection parameters to ensure that the three-dimensional models will include 100% of the coronary tree. For example, the learner may be challenged to identify locations of blank spots during a simulated procedure. As an example, overlap may exist in multiple two-dimensional projections such that a stenosis present in the three-dimensional image is not visible in the two-dimensional angiogram images, and the learner may be challenged to identify the location of the stenosis.
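A minimal sketch of such a coverage score is shown below: the fraction of coronary-tree centerline points that land inside the detector field of view in at least one of the N projections. The parallel projection, detector size, and the absence of overlap or occlusion handling are simplifying assumptions for illustration only.

```python
# Illustrative coverage score: fraction of centerline points inside the detector
# field of view in at least one projection (projection model and FOV assumed).
import numpy as np

def coverage_score(centerline: np.ndarray, rotations: list,
                   detector_half_width_mm: float = 60.0) -> float:
    covered = np.zeros(len(centerline), dtype=bool)
    for rot in rotations:
        uv = (centerline @ rot.T)[:, :2]          # parallel projection per view
        in_fov = np.all(np.abs(uv) <= detector_half_width_mm, axis=1)
        covered |= in_fov
    return covered.mean()

# Example: three axis-aligned views of a toy centerline point cloud.
centerline = np.random.default_rng(1).uniform(-80, 80, size=(200, 3))
views = [np.eye(3), np.eye(3)[[2, 1, 0]], np.eye(3)[[0, 2, 1]]]
print(f"coverage: {coverage_score(centerline, views):.0%}")
```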
[0094] In an additional set of embodiments, the learner may be challenged to estimate the number of projections needed to image the entirety of the coronary tree. The system 100 may offer projections that will assist the learner to assemble a three-dimensional shape of the coronary tree, and prompt the learner to assemble the coronary tree. The learner may be scored based on how much of the coronary tree is imaged.
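One plausible (assumed, not disclosed) way the system 100 could suggest how many projections are needed is a greedy selection over candidate views: repeatedly add the view that newly covers the most centerline points until the whole tree is covered. The candidate views and the per-view visibility matrix below are hypothetical stand-ins.

```python
# Illustrative greedy estimate of the number of projections needed for full coverage.
# visibility[v, p] is True if centerline point p is visible in candidate view v.
import numpy as np

def greedy_projection_count(visibility: np.ndarray) -> int:
    covered = np.zeros(visibility.shape[1], dtype=bool)
    count = 0
    while not covered.all():
        gains = (visibility & ~covered).sum(axis=1)
        if gains.max() == 0:          # remaining points not coverable by any view
            break
        covered |= visibility[gains.argmax()]
        count += 1
    return count

vis = np.random.default_rng(2).random((8, 100)) > 0.4   # toy visibility matrix
print(greedy_projection_count(vis))
```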
[0095] The above examples are illustrative of how the system 100 may be used to educate clinicians and patients using parameters based on preset groups or dynamic selections, so that familiarity with angiogram images in context may be improved. The system 100 does not necessarily require any hardware phantoms or imaging devices; instead, the framework of the system 100 may generate images synthetically for medical educational scenarios. The unique link between underlying parameters and the appearance of angiogram images may directly incorporate guideline and score definitions without requiring examples from clinical experts or other empirical data evaluation.
[0096] FIG. 6 illustrates a computer system, on which a method for interactive training and education is implemented, in accordance with another representative embodiment.
[0097] Referring to FIG. 6, the computer system 600 includes a set of software instructions that can be executed to cause the computer system 600 to perform any of the methods or computer-based functions disclosed herein. The computer system 600 may operate as a standalone device or may be connected, for example, using a network 601, to other computer systems or peripheral
devices. In embodiments, a computer system 600 performs logical processing based on digital signals received via an analog-to-digital converter.
[0098] In a networked deployment, the computer system 600 operates in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The computer system 600 can also be implemented as or incorporated into various devices, such as a workstation that includes a controller, a stationary computer, a mobile computer, a personal computer (PC), a laptop computer, a tablet computer, or any other machine capable of executing a set of software instructions (sequential or otherwise) that specify actions to be taken by that machine. The computer system 600 can be incorporated as or in a device that in turn is in an integrated system that includes additional devices. In an embodiment, the computer system 600 can be implemented using electronic devices that provide voice, video, or data communication. Further, while the computer system 600 is illustrated in the singular, the term “system” shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of software instructions to perform one or more computer functions.
[0099] As illustrated in FIG. 6, the computer system 600 includes a processor 610. The processor 610 may be considered a representative example of a processor of a controller and executes instructions to implement some or all aspects of methods and processes described herein. The processor 610 is tangible and non-transitory. As used herein, the term “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period. The term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time. The processor 610 is an article of manufacture and/or a machine component. The processor 610 is configured to execute software instructions to perform functions as described in the various embodiments herein. The processor 610 may be a general-purpose processor or may be part of an application specific integrated circuit (ASIC). The processor 610 may also be a microprocessor, a microcomputer, a processor chip, a controller, a microcontroller, a digital signal processor (DSP), a state machine, or a programmable logic device. The processor 610 may also be a logical circuit, including a programmable gate array (PGA), such as a field programmable gate array (FPGA), or another type of circuit that includes discrete gate and/or transistor logic. The processor 610 may be a central processing unit (CPU), a
graphics processing unit (GPU), or both. Additionally, any processor described herein may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
[00100] The term “processor” as used herein encompasses an electronic component able to execute a program or machine executable instruction. References to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems. The term computing device should also be interpreted to include a collection or network of computing devices each including a processor or processors. Programs have software instructions performed by one or multiple processors that may be within the same computing device or which may be distributed across multiple computing devices.
[00101] The computer system 600 further includes a main memory 620 and a static memory 630, where memories in the computer system 600 communicate with each other and the processor 610 via a bus 608. Either or both of the main memory 620 and the static memory 630 may be considered representative examples of a memory of a controller, and store instructions used to implement some or all aspects of methods and processes described herein. Memories described herein are tangible storage mediums for storing data and executable software instructions and are non-transitory during the time software instructions are stored therein. As used herein, the term “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period. The term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time. The main memory 620 and the static memory 630 are articles of manufacture and/or machine components. The main memory 620 and the static memory 630 are computer-readable mediums from which data and executable software instructions can be read by a computer (e.g., the processor 610). Each of the main memory 620 and the static memory 630 may be implemented as one or more of random access memory (RAM), read only memory (ROM), flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, Blu-ray disk, or any other form of storage medium
known in the art. The memories may be volatile or non-volatile, secure and/or encrypted, unsecure and/or unencrypted.
[00102] “Memory” is an example of a computer-readable storage medium. Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to RAM memory, registers, and register files. References to “computer memory” or “memory” should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
[00103] As shown, the computer system 600 further includes a video display unit 650, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid-state display, or a cathode ray tube (CRT), for example. Additionally, the computer system 600 includes an input device 660, such as a keyboard/virtual keyboard or touch-sensitive input screen or speech input with speech recognition, and a cursor control device 670, such as a mouse or touch-sensitive input screen or pad. The computer system 600 also optionally includes a disk drive unit 680, a signal generation device 690, such as a speaker or remote control, and/or a network interface device 640.
[00104] In an embodiment, as depicted in FIG. 6, the disk drive unit 680 includes a computer-readable medium 682 in which one or more sets of software instructions 684 (software) are embedded. The sets of software instructions 684 are read from the computer-readable medium 682 to be executed by the processor 610. Further, the software instructions 684, when executed by the processor 610, perform one or more steps of the methods and processes as described herein. In an embodiment, the software instructions 684 reside all or in part within the main memory 620, the static memory 630 and/or the processor 610 during execution by the computer system 600. Further, the computer-readable medium 682 may include software instructions 684 or receive and execute software instructions 684 responsive to a propagated signal, so that a device connected to a network 601 communicates voice, video, or data over the network 601. The software instructions 684 may be transmitted or received over the network 601 via the network interface device 640.
[00105] In an embodiment, dedicated hardware implementations, such as application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays
and other hardware components, are constructed to implement one or more of the methods described herein. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules. Accordingly, the present disclosure encompasses software, firmware, and hardware implementations. Nothing in the present application should be interpreted as being implemented or implementable solely with software and not hardware such as a tangible non-transitory processor and/or memory.
[00106] In accordance with various embodiments of the present disclosure, the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing may implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
[00107] Accordingly, interactive training and education provides a framework for generating angiogram images for a given set of score input parameters or for a set of scores directly, which can then be scored by an individual. The individual receives feedback on performance and on specific deviations from the known ground truth, so as to stepwise learn how to improve and to be more consistent with the score definition. The framework described herein may be fully digital and may be used without requiring a cathlab or even a clinical institute. The computer 140 may be provided via the internet or as a cloud-based service to minimize device requirements. When implemented for education, the system 100 may use a stand-alone software package to provide education for clinicians to improve proficiency in the analysis and interpretation of two-dimensional angiogram images. The educational aspects of the teachings herein may also be used by patients to increase awareness of patient conditions. An educational tool consistent with the teachings herein may also be provided via an internet website or as a cloud service so that usage is not restricted to a particular device, but can be appreciated from a variety of different devices via a browser or application. The educational tool may also or alternatively be provided as an element of one or more software suites associated with a particular device, or in a cathlab, or in particular imaging equipment. In this way, the software may be tailored to the equipment at hand and may make use of real device parameters such as the
X-ray imaging geometry to train clinicians on images which are as close as possible to the real-world medical scenarios they will encounter using their own medical equipment.
[00108] Although interactive training and education has been described with reference to several exemplary embodiments, it is understood that the words that have been used are words of description and illustration, rather than words of limitation. Changes may be made within the purview of the appended claims, as presently stated, and as amended, without departing from the scope and spirit of interactive training and education in its aspects. Although interactive training and education has been described with reference to particular means, materials and embodiments, interactive training and education is not intended to be limited to the particulars disclosed; rather interactive training and education extends to all functionally equivalent structures, methods, and uses such as are within the scope of the appended claims.
[00109] The illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments. The illustrations are not intended to serve as a complete description of all of the elements and features of the disclosure described herein. Many other embodiments may be apparent to those of skill in the art upon reviewing the disclosure. Other embodiments may be utilized and derived from the disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the disclosure. Additionally, the illustrations are merely representational and may not be drawn to scale. Certain proportions within the illustrations may be exaggerated, while other proportions may be minimized. Accordingly, the disclosure and the figures are to be regarded as illustrative rather than restrictive.
[00110] One or more embodiments of the disclosure may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any particular invention or inventive concept. Moreover, although specific embodiments have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all subsequent adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.
[00111] The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b) and is
submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, various features may be grouped together or described in a single embodiment for the purpose of streamlining the disclosure. This disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter may be directed to less than all of the features of any of the disclosed embodiments. Thus, the following claims are incorporated into the Detailed Description, with each claim standing on its own as defining separately claimed subject matter.
[00112] The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to practice the concepts described in the present disclosure. As such, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present disclosure. Thus, to the maximum extent allowed by law, the scope of the present disclosure is to be determined by the broadest permissible interpretation of the following claims and their equivalents and shall not be restricted or limited by the foregoing detailed description.
Claims
1. An interactive system for medical training, the system comprising:
a processor and memory, the processor configured to:
obtain actual values of parameters associated with a medical scenario;
generate a sequence of customized synthetic angiogram images of the medical scenario based on the actual values of the parameters;
display, to a user, the sequence of customized synthetic angiogram images;
interactively request the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images;
compare the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and
provide to the user feedback with respect to the measure of difference.
2. The system of claim 1, wherein the actual values, estimated values, and target values comprise clinical scores.
3. The system of claim 1, wherein the processor is further configured to generate an appearance of the customized synthetic angiogram images based on the actual values of the parameters.
4. The system of claim 1, wherein the parameters comprise at least one of morphological parameters of an anatomy, physiological parameters of an anatomy, and imaging parameters.
5. The system of claim 1, wherein the parameters represent anatomical characteristics of a vasculature and the actual values quantify the anatomical characteristics to provide the medical scenario.
6. The system of claim 5, wherein the customized synthetic angiogram images comprise a vascular tree and the processor is configured to model morphology and physiology of the
vascular tree in the customized synthetic angiogram images based on the parameters and the actual values of the parameters.
7. The system of claim 1, wherein the processor is configured to generate the sequence of customized synthetic angiogram images to simulate movement of the vascular tree.
8. The system of claim 5, wherein the anatomical characteristics of a vasculature represented by the parameters include at least one of lesion length, lesion shape, lesion significance, lesion severity, lesion calcifications, lesion eccentricity, bifurcations, vessel length, vessel diameter, vessel branching point location, vessel branching point angle, lumen contour shape, vessel tortuosity, vessel stenosis, blood volume flow, blood velocity, and view parameters.
9. The system of claim 5, wherein the actual values of the parameters are set such that the anatomical characteristics in the customized synthetic angiogram images represent a diseased state of the vasculature.
10. The system of claim 1, wherein the synthetic angiogram images comprise two-dimensional synthetic X-ray images.
11. The system of claim 1, wherein the processor is further configured to:
generate at least one three-dimensional image based on the actual values of the parameters;
project each three-dimensional image onto a two-dimensional detector plane; and
generate a synthetic angiogram image for each three-dimensional image projected onto the two-dimensional detector plane.
12. The system of claim 9, wherein the processor is further configured to:
add a background image to a foreground of each synthetic angiogram image.
13. The system of claim 1, wherein the processor is further configured to:
grade the estimated values of the parameters based on the target values of the parameters; and
output the measure of the difference based on the grade.
14. The system of claim 1, wherein the processor is further configured to select the parameters as a group from among a plurality of predetermined groups of parameters corresponding to medical scenarios.
15. The system of claim 1, wherein the processor is further configured to:
display each of the sequence of synthetic angiogram images in parallel.
16. A method for medical training, the method comprising:
obtaining actual values of parameters associated with a medical scenario;
generating a sequence of customized synthetic angiogram images of the medical scenario based on the actual values of the parameters;
displaying, to a user, the sequence of customized synthetic angiogram images;
interactively requesting the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images;
comparing the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and
providing to the user feedback with respect to the measure of difference.
17. The method of claim 16, wherein the actual values, estimated values, and target values comprise clinical scores.
18. The method of claim 16, further comprising: generating an appearance of the customized synthetic angiogram images based on the actual values of the parameters.
19. The method of claim 16, wherein the customized synthetic angiogram images comprise a vascular tree and the method further comprises modeling morphology and physiology of the vascular tree in the customized synthetic angiogram images based on the parameters and the actual values of the parameters.
20. A non-transitory computer-readable storage medium having stored thereon a computer program comprising instructions which, when executed by a processor, cause the processor to:
obtain actual values of parameters associated with a medical scenario;
generate a sequence of customized synthetic angiogram images of the medical scenario based on the actual values of the parameters;
display, to a user, the sequence of customized synthetic angiogram images;
interactively request the user to estimate values for the parameters based on the displayed sequence of customized synthetic angiogram images;
compare the estimated values for the parameters to target values for the parameters to determine a measure of difference between the estimated values and the target values, wherein the target values are defined based on the actual values of the parameters; and
provide to the user feedback with respect to the measure of difference.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
RU2022127314 | 2022-10-20 | ||
RU2022127314 | 2022-10-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024083690A1 (en) | 2024-04-25 |
Family
ID=88417395
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2023/078549 WO2024083690A1 (en) | Interactive training and education | 2022-10-20 | 2023-10-13 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024083690A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040120557A1 (en) * | 2002-12-18 | 2004-06-24 | Sabol John M. | Data processing and feedback method and system |
US20070134637A1 (en) * | 2005-12-08 | 2007-06-14 | Simbionix Ltd. | Medical simulation device with motion detector |
US20070232868A1 (en) * | 2006-01-30 | 2007-10-04 | Bruce Reiner | Method and apparatus for generating a radiologist quality assurance scorecard |
US20100311034A1 (en) * | 2007-09-04 | 2010-12-09 | Koninklijke Philips Electronics N.V. | Receiver operating characteristic-based training |
US20180182262A1 (en) * | 2015-06-25 | 2018-06-28 | Koninklijke Philips N.V. | Interactive intravascular procedure training and associated devices, systems, and methods |
US20200008772A1 (en) * | 2017-02-24 | 2020-01-09 | Bayer Healthcare Llc | Systems and methods for generating simulated computed tomography (ct) images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23790291; Country of ref document: EP; Kind code of ref document: A1 |