US20170200270A1 - Computer-aided analysis and rendering of medical images - Google Patents

Computer-aided analysis and rendering of medical images

Info

Publication number
US20170200270A1
US20170200270A1 (application US15/469,296)
Authority
US
United States
Prior art keywords
medical
images
series
rules
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/469,296
Other versions
US9934568B2 (en)
Inventor
Murray A. Reicher
Michael Trambert
Evan K. Fram
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Merative US LP
Original Assignee
DR Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DR Systems Inc filed Critical DR Systems Inc
Priority to US15/469,296, granted as US9934568B2
Assigned to D.R. SYSTEMS, INC. reassignment D.R. SYSTEMS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: REICHER, MURRAY A., FRAM, EVAN K., TRAMBERT, MICHAEL
Publication of US20170200270A1
Application granted granted Critical
Publication of US9934568B2
Assigned to MERGE HEALTHCARE SOLUTIONS INC. reassignment MERGE HEALTHCARE SOLUTIONS INC. NUNC PRO TUNC ASSIGNMENT (SEE DOCUMENT FOR DETAILS). Assignors: D.R. SYSTEMS, INC.
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION reassignment INTERNATIONAL BUSINESS MACHINES CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MERGE HEALTHCARE SOLUTIONS INC.
Assigned to MERATIVE US L.P. reassignment MERATIVE US L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: INTERNATIONAL BUSINESS MACHINES CORPORATION
Legal status: Active

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271 Specific aspects of physiological measurement analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/08 Volume rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/10 Protocols in which an application is distributed across nodes in the network
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00 Medical imaging apparatus involving image processing or analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2101/00 Indexing scheme relating to the type of digital function generated
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/41 Medical
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards

Definitions

  • This disclosure relates to medical image analysis, medical image processing, medical image rendering, medical image transformation, medical image transferring, and/or medical image viewing.
  • Imaging is increasingly moving into the digital realm. This includes imaging techniques that were traditionally analog, such as mammography, x-ray imaging, angiography, endoscopy, and pathology, where information can now be acquired directly using digital sensors, or by digitizing information that was acquired in analog form.
  • Many imaging modalities are inherently digital, such as computed radiography (CR), digital radiography (DR), MRI (magnetic resonance imaging), CT (computed tomography), PET (positron emission tomography), NM (nuclear medicine scanning), FFDM (full-field digital mammography), and US (ultrasound), often yielding hundreds or even thousands of images per examination.
  • CR computed radiography
  • DR digital radiography
  • MRI magnetic resonance imaging
  • CT computed tomography
  • PET positron emission tomography
  • NM nuclear medicine scanning
  • FFDM full-field digital mammography
  • US ultrasound
  • a method of selecting medical images of an image series for transfer comprises, by a computing device executing software, performing operations comprising, accessing transfer rules associated with a first computing device, the transfer rules indicating thin slice criteria for determining whether respective medical images of an image series are classified as thin slices, wherein the thin slice criteria include at least one of a maximum thickness of a particular medical image for classification of the particular image as a thin slice image or a maximum quantity of medical images in the image series for classification of the medical images of the image series as thin slice images, selecting a first group of medical images of the image series that are thin slice images as indicated by the thin slice criteria, initiating transfer of the first group of medical images to the first computing device, wherein medical images of the image series that are not in the first group are not transferred to the first computing device.
  • a method comprises, by a computing device comprising hardware, performing operations comprising accessing thin slice criteria for determining whether respective medical images of an image series are classified as thin slices, selecting a thin slice series of medical images that includes one or more medical images that are classified as thin slices based on the thin slice criteria, selecting a non-thin slice series of medical images that includes one or more medical images that are not thin slice images based on the thin slice criteria, and transferring or displaying the thin slice series in a different manner than the non-thin slice series, such that either the images of the thin slice series are selectively not transferred, selectively transferred with lower priority, selectively transferred but not displayed, or selectively transferred but otherwise displayed or processed in a selectively different manner.
  • a tangible computer readable medium has instructions stored thereon, wherein the instructions are configured for reading by a computing device in order to cause the computing device to perform operations comprising accessing transfer rules associated with a first computing device and a first user, the transfer rules indicating thin slice criteria for determining whether respective medical images of an image series are classified as thin slice images, wherein the thin slice criteria include at least one of a maximum thickness of a particular medical image for classification of the particular image as a thin slice image or a maximum quantity of medical images in the image series for classification of the medical images of the image series as thin slice images, selecting a first group of medical images of an image series that are not thin slices as indicated by the thin slice criteria, and initiating transfer of the first group of medical images to the first viewing environment, wherein medical images of the image series that are not in the first group are not transferred to the first viewing environment.
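The thin-slice criteria recited above (a per-image maximum thickness and/or a maximum image count for the whole series) can be sketched roughly as follows. This is an illustrative interpretation only; the names `Image`, `TransferRules`, `classify_thin_slices`, and `select_for_transfer` are hypothetical and not taken from the disclosed implementation. Note that different embodiments transfer either the thin-slice group or the non-thin-slice group, so classification is kept separate from selection:

```python
# Hedged sketch of thin-slice classification per the transfer rules above.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Image:
    slice_thickness_mm: float  # thickness of this reconstructed slice

@dataclass
class TransferRules:
    max_thickness_mm: Optional[float] = None  # per-image thin-slice cutoff
    max_series_count: Optional[int] = None    # whole-series image-count cutoff

def classify_thin_slices(series: List[Image], rules: TransferRules) -> List[bool]:
    """Per-image flag: True if the image is classified as a thin slice."""
    # Series-level criterion: a series exceeding the image-count cutoff
    # has all of its images classified as thin slices.
    if rules.max_series_count is not None and len(series) > rules.max_series_count:
        return [True] * len(series)
    # Per-image criterion: at or below the thickness cutoff -> thin slice.
    if rules.max_thickness_mm is not None:
        return [img.slice_thickness_mm <= rules.max_thickness_mm for img in series]
    return [False] * len(series)

def select_for_transfer(series: List[Image], rules: TransferRules) -> List[Image]:
    """Select the non-thin-slice images (thin slices withheld), as in the
    embodiment that does not transfer thin slices to the viewing environment."""
    flags = classify_thin_slices(series, rules)
    return [img for img, thin in zip(series, flags) if not thin]
```

The same `classify_thin_slices` result could equally drive the opposite selection (transfer only the thin slices) or a lower-priority transfer queue, as other embodiments describe.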
  • a method comprises by a computing device configured for displaying medical images, determining a storage location of a dataset, determining an availability of a processing server to process the dataset, in response to determining that the dataset is locally available to a processing server configured to process the dataset, requesting processing of the dataset by the processing server and accessing the processed dataset from the processing server, and in response to determining that the dataset is not locally available to the processing server and/or the processing server is not available to process the dataset, requesting transmission of at least some of the dataset to the computing device and processing at least some of the dataset at the computing device.
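The processing-location decision in this embodiment reduces to a small decision table: process on the server when the dataset is local to an available processing server, otherwise transfer (at least some of) the dataset and process on the viewing device. A minimal sketch, with hypothetical names:

```python
# Illustrative decision logic for where to process a medical image dataset.
def choose_processing_site(dataset_local_to_server: bool,
                           server_available: bool) -> str:
    """Return "server" to request server-side processing and fetch the
    processed result, or "client" to request transmission of the dataset
    and process it on the viewing device."""
    if dataset_local_to_server and server_available:
        return "server"
    return "client"
```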
  • a computer-implemented method of rendering medical images comprising: by one or more processors executing program instructions: accessing a set of medical image data; analyzing the set of medical image data to determine a value of an attribute of the set of medical image data; accessing a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and applying the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • the method further comprises: by the one or more processors executing program instructions: automatically initiating display of the series of medical images at a computing device.
  • the set of medical image data comprises an original series of medical images, and wherein the medical images comprise two-dimensional medical images.
  • applying the rule comprises: by the one or more processors executing program instructions: accessing a three-dimensional set of medical image data from which the original series was derived; and rendering the series of medical images from the three-dimensional set of medical image data.
  • applying the rule further comprises: by the one or more processors executing program instructions: selecting a group of medical images of the original series of medical images, wherein each medical image of the group of medical images has an image slice thickness greater or less than the desired image slice thickness; and reformatting each of the medical images of the group of medical images to have the desired image slice thickness.
  • applying the rule further comprises: by the one or more processors executing program instructions: determining that one or more medical images of the original series of medical images have an image slice thickness greater or less than the desired image slice thickness; reformatting the original series of medical images to include a plurality of reformatted medical images having image slice thicknesses matching the desired image slice thickness, wherein the original series of medical images is reformatted from a volumetric data set associated with the original series of medical images; and initiating display of the original series of medical images including the plurality of reformatted medical images at a computing device.
  • the set of medical image data comprises three-dimensional medical image data.
  • the rule further indicates an association between the value of the attribute and a desired image slice increment.
  • applying the rule comprises: by the one or more processors executing program instructions: rendering the series of medical images from the three-dimensional set of medical image data, wherein the medical images of the series match both the desired slice thickness and the desired slice increment.
  • applying the rule comprises: by the one or more processors executing program instructions: generating each successive image slice of the series from the three-dimensional set of medical image data, where each successive image slice has the desired image thickness and is the desired image slice increment apart.
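Generating successive slices at a rule-specified thickness and increment from a volumetric data set could look like the following sketch. Averaging adjacent voxel planes into a thick slab is one simple rendering choice (a mean-intensity projection); the disclosure does not mandate this particular method, and all names are illustrative:

```python
# Sketch: render a 2D series from a 3D volume at a desired slice
# thickness and slice increment, assuming isotropic z spacing.
import numpy as np

def render_series(volume: np.ndarray, voxel_mm: float,
                  thickness_mm: float, increment_mm: float) -> list:
    """volume: (z, y, x) array with voxel_mm spacing along z.
    Each rendered slice averages the voxel planes it spans; successive
    slices start increment_mm apart, so they may overlap or have gaps."""
    planes_per_slice = max(1, round(thickness_mm / voxel_mm))
    step = max(1, round(increment_mm / voxel_mm))
    slices = []
    for start in range(0, volume.shape[0] - planes_per_slice + 1, step):
        slices.append(volume[start:start + planes_per_slice].mean(axis=0))
    return slices
```

With a 1 mm voxel volume, a rule asking for 2 mm thickness at a 2 mm increment yields contiguous non-overlapping slabs; a 1 mm increment with 2 mm thickness yields overlapping slabs, matching the "desired image slice increment apart" language above.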
  • the method further comprises: by the one or more processors executing program instructions: automatically initiating display of the series of medical images at a computing device.
  • the value of the attribute is derived from a DICOM header file.
  • the attribute comprises at least one of: an image series view, a patient's age, a medical history, a risk factor, a tissue characteristic, or a modality.
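Matching a user-defined rule to an attribute value derived from image metadata (in practice a DICOM header; here a plain dict stands in so the example is self-contained) might be sketched as below. The rule table, field names, and thickness values are hypothetical:

```python
# Sketch: look up the desired slice thickness from user-defined rules
# keyed on an attribute value taken from image metadata.
USER_RULES = [
    {"attribute": "Modality", "value": "CT", "slice_thickness_mm": 5.0},
    {"attribute": "Modality", "value": "MR", "slice_thickness_mm": 3.0},
]

def desired_thickness(header: dict, rules=USER_RULES):
    """Return the rule-specified slice thickness for this image set,
    or None if no rule matches (fall back to acquisition defaults)."""
    for rule in rules:
        if header.get(rule["attribute"]) == rule["value"]:
            return rule["slice_thickness_mm"]
    return None
```

Per the claims, the keyed attribute could equally be an image series view, patient age, medical history, risk factor, tissue characteristic, or image plane rather than modality.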
  • the method further comprises: by the one or more processors executing program instructions: receiving a request from a user to view the series of medical images; and accessing the user-defined rule associated with the user.
  • the method further comprises: by the one or more processors executing program instructions: displaying the series of medical images on a computing device; receiving a reformat request indicating a new desired image slice thickness different from the desired image slice thickness indicated by the rule; and in response to receiving the reformat request: re-rendering the set of medical image data to generate a new series of medical images, wherein medical images of the new series match the new desired slice thickness.
  • a system comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the one or more processors to: access a set of medical image data; analyze the set of medical image data to determine a value of an attribute of the set of medical image data; access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • the set of medical image data comprises three-dimensional medical image data.
  • the one or more processors are configured to execute the program instructions to further cause the one or more processors to: render the series of medical images from the three-dimensional set of medical image data, wherein the medical images of the series match both the desired slice thickness and the desired slice increment.
  • the one or more processors are configured to execute the program instructions to further cause the one or more processors to: receive a request from a user to view the series of medical images; access the user-defined rule associated with the user; display the series of medical images on a computing device; receive a reformat request indicating a new desired image slice thickness different from the desired image slice thickness indicated by the rule; and in response to receiving the reformat request: re-render the set of medical image data to generate a new series of medical images, wherein medical images of the new series match the new desired slice thickness.
  • a computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to: access a set of medical image data; analyze the set of medical image data to determine a value of an attribute of the set of medical image data; access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • the program instructions are executable by one or more processors to further cause the one or more processors to: render the series of medical images from the three-dimensional set of medical image data, wherein the medical images of the series match both the desired slice thickness and the desired slice increment.
  • the program instructions are executable by one or more processors to further cause the one or more processors to: receive a request from a user to view the series of medical images; access the user-defined rule associated with the user; display the series of medical images on a computing device; receive a reformat request indicating a new desired image slice thickness different from the desired image slice thickness indicated by the rule; and in response to receiving the reformat request: re-render the set of medical image data to generate a new series of medical images, wherein medical images of the new series match the new desired slice thickness.
  • a system configured to analyze and display medical images comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the one or more processors to: access a series of medical images; analyze the series of medical images to determine a value of an attribute of the series of medical images, wherein the attribute comprises an image plane; access a user-defined rule indicating an association between the value of the attribute and a desired display preference; and apply the rule to the series of medical images to cause display of the series of medical images according to the desired display preference.
  • the desired display preference includes at least one of: an image rendering process, an image slice thickness, an image slice increment, a field of view, a window, a level, or a color setting.
  • the value of the attribute is derived from a DICOM header file.
  • the attribute further comprises at least one of: an image series view, a patient's age, a medical history, a risk factor, a tissue characteristic, or a modality.
  • applying the rule to the series of medical images comprises: rendering the series of medical images according to a user-defined rule associated with the value of the attribute.
  • the rule indicates a desired slice thickness.
  • the one or more processors are configured to execute the program instructions to further cause the one or more processors to: identify a previously displayed series of medical images having the attribute with a second value that matches the value associated with the series of medical images; determine one or more display parameters associated with display of the previously displayed series of medical images; and designating the one or more display parameters as the desired display preference.
  • identifying a previously displayed series of medical images comprises: by the one or more processors executing program instructions: determining that the previously displayed series of medical images was previously displayed to a same user that the series of medical images are to be displayed to.
  • a computer-implemented method of analyzing and displaying medical images comprising: by one or more processors executing program instructions: accessing a series of medical images; analyzing the series of medical images to determine a value of an attribute of the series of medical images, wherein the attribute comprises an image plane; accessing a user-defined rule indicating an association between the value of the attribute and a desired display preference; and applying the rule to the series of medical images to cause display of the series of medical images according to the desired display preference.
  • the desired display preference includes at least one of: an image rendering process, an image slice thickness, an image slice increment, a field of view, a window, a level, or a color setting.
  • the value of the attribute is derived from a DICOM header file.
  • the attribute further comprises at least one of: an image series view, a patient's age, a medical history, a risk factor, a tissue characteristic, or a modality.
  • applying the rule to the series of medical images comprises: by the one or more processors executing program instructions: rendering the series of medical images according to a user-defined rule associated with the value of the attribute.
  • the rule indicates a desired slice thickness.
  • a computer-implemented method of analyzing and displaying medical images comprising: by one or more processors executing program instructions: accessing a series of medical images; analyzing the series of medical images to determine a first value of an attribute of the series of medical images; identifying a previously displayed series of medical images having the attribute with a second value that matches the first value; determining one or more display parameters associated with display of the previously displayed series of medical images; and applying the one or more display parameters to the series of medical images to cause display of the series of medical images according to the one or more display parameters.
  • identifying a previously displayed series of medical images comprises: by the one or more processors executing program instructions: determining that the previously displayed series of medical images was previously displayed to a same user that the series of medical images are to be displayed to.
  • the series of medical images are automatically initially displayed to the user according to the one or more display parameters.
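The display-parameter reuse described in these embodiments (find a series the same user previously viewed whose attribute value matches, and carry its display parameters over) can be sketched as follows; the history data model is an assumption for illustration:

```python
# Sketch: reuse display parameters from a previously displayed series
# with a matching attribute value, scoped to the same user.
def reuse_display_params(history, user, attribute, value):
    """history: list of dicts with keys "user", "attrs", "display_params".
    Returns the most recent matching entry's display parameters, or None."""
    for entry in reversed(history):  # most recent entry last in the list
        if entry["user"] == user and entry["attrs"].get(attribute) == value:
            return entry["display_params"]
    return None
```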
  • a computer-implemented method of automated analysis of medical images comprising: by one or more processors executing program instructions: accessing a set of medical image data; automatically analyzing the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data; accessing a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and applying the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • the method further comprises: by one or more processors executing program instructions: accessing a plurality of user-defined CAP rules; identifying a CAP rule associated with the set of medical imaging data; and determining the CAP action indicated by the rule.
  • the CAP rule is associated with the set of medical imaging data based on at least one of: a modality, an anatomical region, or a medical indicator.
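Selecting the applicable CAP (computer-aided processing) rule for a study, based on modality, anatomical region, and/or medical indicator as recited above, might look like this sketch. A rule applies when every criterion it specifies matches the study; field names are assumptions:

```python
# Sketch: pick the CAP action whose rule criteria all match the study.
def match_cap_rule(study: dict, cap_rules: list):
    """Return the action of the first rule whose criteria are all
    satisfied by the study's metadata, or None if no rule applies."""
    for rule in cap_rules:
        criteria = rule["criteria"]  # e.g. {"modality": "MR", "region": "breast"}
        if all(study.get(k) == v for k, v in criteria.items()):
            return rule["action"]
    return None
```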
  • the attribute comprises at least one of: a tissue density, or a presence of an abnormality or a suspected abnormality.
  • the method further comprises: by one or more processors executing program instructions: identifying a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and fusing a group of medical images from the series of medical images in a region associated with the possible abnormality.
  • the method further comprises: by the one or more processors executing program instructions: automatically initiating display of the group of medical images of the series of medical images at a computing device.
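One simple reading of "fusing" a group of images in the region of a possible abnormality is a thick-slab average of the slices that intersect that region; the disclosure does not fix the fusion method, so this is only an assumed interpretation with hypothetical names:

```python
# Sketch: fuse the slices around a CAP-flagged index into one slab image.
import numpy as np

def fuse_region(series: list, center_idx: int, half_width: int) -> np.ndarray:
    """Average the slices within half_width of center_idx (clamped to the
    series bounds), producing a single fused image for that region."""
    lo = max(0, center_idx - half_width)
    hi = min(len(series), center_idx + half_width + 1)
    return np.mean(series[lo:hi], axis=0)
```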
  • the method further comprises: by one or more processors executing program instructions: determining a rendering location for rendering the series of medical images based on a second user-defined rule.
  • the second user-defined rule indicates at least one of: a user associated with the user-defined rule, a location at which the series of medical images are to be displayed, or a characteristic of a device upon which the series of medical images are to be displayed.
  • a system comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the one or more processors to: access a set of medical image data; automatically analyze the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data; access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • the one or more processors are configured to execute the program instructions to further cause the one or more processors to: access a plurality of user-defined CAP rules; identify a CAP rule associated with the set of medical imaging data; and determine the CAP action indicated by the rule.
  • the CAP rule is associated with the set of medical imaging data based on at least one of: a modality, an anatomical region, or a medical indicator.
  • the attribute comprises at least one of: a tissue density, or a presence of an abnormality or a suspected abnormality.
  • the one or more processors are configured to execute the program instructions to further cause the one or more processors to: identify a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and fuse a group of medical images from the series of medical images in a region associated with the possible abnormality.
  • the one or more processors are configured to execute the program instructions to further cause the one or more processors to: automatically initiate display of the group of medical images of the series of medical images at a computing device.
  • the one or more processors are configured to execute the program instructions to further cause the one or more processors to: determine a rendering location for rendering the series of medical images based on a second user-defined rule.
  • the second user-defined rule indicates at least one of: a user associated with the user-defined rule, a location at which the series of medical images are to be displayed, or a characteristic of a device upon which the series of medical images are to be displayed.
  • a computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to: access a set of medical image data; automatically analyze the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data; access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • the program instructions are executable by one or more processors to further cause the one or more processors to: access a plurality of user-defined CAP rules; identify a CAP rule associated with the set of medical imaging data; and determine the CAP action indicated by the rule.
  • the program instructions are executable by one or more processors to further cause the one or more processors to: identify a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and fuse a group of medical images from the series of medical images in a region associated with the possible abnormality.
  • the program instructions are executable by one or more processors to further cause the one or more processors to: automatically initiate display of the group of medical images of the series of medical images at a computing device.
  • systems and/or computer systems comprise a computer readable storage medium having program instructions embodied therewith, and one or more processors configured to execute the program instructions to cause the one or more processors to perform operations comprising one or more aspects of the above- and/or below-described embodiments (including one or more aspects of the appended claims).
  • computer program products comprising a computer readable storage medium are disclosed, wherein the computer readable storage medium has program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to perform operations comprising one or more aspects of the above- and/or below-described embodiments (including one or more aspects of the appended claims).
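As a hypothetical illustration only (not the patented implementation), the claimed flow above can be sketched as: a CAP action derives an attribute value from the image data, a user-defined rule maps that value to a desired slice thickness, and the series is rendered accordingly. The function names, threshold, and attribute values below are all invented for illustration.

```python
def cap_analyze(image_data):
    """Stand-in for a CAP action: derive an attribute value from the data.
    Here we simply flag high tissue density if the mean voxel value is large."""
    mean = sum(image_data) / len(image_data)
    return "high_density" if mean > 100 else "normal_density"

# User-defined rule: attribute value -> desired slice thickness (mm)
SLICE_THICKNESS_RULES = {
    "high_density": 0.5,    # render thin slices around dense tissue
    "normal_density": 3.0,  # thicker slices elsewhere
}

def render_series(image_data, rules=SLICE_THICKNESS_RULES):
    """Apply the user-defined rule to choose how the series is rendered."""
    value = cap_analyze(image_data)
    thickness = rules[value]
    # A real system would re-render the volume; here we just record the choice.
    return {"attribute": value, "slice_thickness_mm": thickness}

series = render_series([120, 130, 140])
```

In a real system the CAP action would be a full image-analysis routine and rendering would produce actual image slices; the sketch only shows the rule lookup that connects the two.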
  • FIG. 1A is a block diagram of an example medical imaging system including two PACS workstations in communication with various devices;
  • FIG. 1B is a block diagram of an example medical imaging system wherein PACS workstations are in communication with a PACS server via a separate network which may comprise a secured local area network, for example;
  • FIG. 2 illustrates a table of transfer and display rules, including example categories of properties/attributes on which transfer and display rules may be based for a user, site, or other group of viewing devices;
  • FIGS. 3A, 3B, and 3C illustrate example data structures having transfer and display rules of various types;
  • FIG. 4 is a flowchart illustrating an embodiment of a method of the present disclosure;
  • FIG. 5 is a block diagram of three sites that are configured to perform one or more of the operations discussed herein;
  • FIG. 6 illustrates additional example display rules of the present disclosure;
  • FIG. 7 is a flowchart illustrating an embodiment of another method of the present disclosure.
  • FIGS. 8A-8B show tables illustrating additional examples of rules of the present disclosure.
  • FIG. 9 illustrates various example attributes that may be associated with exams, image series, and images, according to embodiments of the present disclosure.
  • User includes an individual (or group of individuals) that interfaces with a computing device to, for example, access or view medical images.
  • Users may include, for example, physicians (including, for example, doctors, radiologists, etc.), hospital staff, and/or any other individuals (including persons not medically trained) involved in analysis, annotation, comparison, acquisition, storage, management, or other tasks related to medical images (or any other types of images) as described herein.
  • user preferences and/or rules (e.g., display rules and transfer rules);
  • user group preferences/rules (rules associated with groups of users);
  • site preferences/rules (rules associated with sites of users); and
  • system preferences/rules (default software preferences/rules).
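One natural way to combine the preference levels above, sketched here purely as a hypothetical illustration, is a precedence order in which more specific levels override more general ones (user over user group, over site, over system default). All keys and values below are invented for the example.

```python
def resolve_preference(key, user_prefs, group_prefs, site_prefs, system_prefs):
    """Return the value for `key` from the most specific level that defines it."""
    for prefs in (user_prefs, group_prefs, site_prefs, system_prefs):
        if key in prefs:
            return prefs[key]
    raise KeyError(key)

system = {"download_thin_slices": True, "default_window": "soft_tissue"}
site = {"download_thin_slices": False}  # site policy: save bandwidth
user = {"default_window": "bone"}       # an individual radiologist's preference

# The site policy overrides the system default for thin-slice downloads,
# while the user's own window preference overrides the system default.
thin = resolve_preference("download_thin_slices", user, {}, site, system)
window = resolve_preference("default_window", user, {}, site, system)
```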
  • Medical data is defined to include any data related to medical information, images, exams, image series, and/or other patient information.
  • medical data may include, but is not limited to, medical images, such as radiograph (e.g., an x-ray image, CR, DR, etc.), computed tomography (CT), magnetic resonance imaging (MRI), Ultrasound (US), mammogram, positron emission tomography scan (PET), nuclear scan (NM), full-field mammography (FFDM) (and other types of digital mammography), Tomosynthesis (e.g., breast tomosynthesis), and images related to gross and microscopic pathology, ophthalmology, endoscopy, or other medical images, as well as medical reports, such as text files containing reports, voice files with results summaries, and/or full digital dictation voice files for transcription.
  • Medical images may be reconstructed and/or rendered from 3D or volumetric image data using methods including multiplanar reformation/reconstruction (MPR), maximum intensity projection (MIP), and/or the like (including, e.g., any Computerized Advanced Processing (CAP), as described below).
  • FIG. 9 illustrates an example of a medical image 912 and possible attributes that may be associated with a medical image.
  • The term “medical image data” may be used herein to refer to medical data primarily associated with imaged or imaging data.
  • Image series (also referred to herein as a “series”), as used herein, includes any two or more images that are related. Images in a series typically share one or more common attributes, for example, a type of anatomic plane and/or an image orientation.
  • an image series may comprise two or more images of a particular patient that are acquired on a particular date, e.g., different x-ray projections of the chest.
  • a series of contiguous 3 mm axial CT scans of the chest is another example of an image series.
  • a brain MRI scan might include the following series: sagittal T1 weighted images, axial T1 weighted images, axial FLAIR images, axial T2 weighted images, as well as post contrast axial, sagittal and coronal T1 weighted series.
  • An image series of an exam may be identified by its “type” (also referred to herein as a “series type” and/or a “view type”).
  • series may be acquired using different pulse sequences, acquired in different anatomic planes (also referred to herein as “imaging planes,” e.g., axial, coronal, sagittal, etc.), and/or acquired before or after administration of intravenous contrast material.
  • An image series may be limited to images of a certain modality or may comprise images of multiple modalities.
  • FIG. 9 illustrates an example of an image series 908 , as well as example attributes that may be associated with an image series. As shown, the image series 908 includes multiple medical images, such as medical image 912 .
  • Medical imaging exam includes a collection of medical data related to an examination of a patient.
  • An exam may be specific to a particular time or time period.
  • an exam includes one or more medical images and/or image series, montages, reports, notes, graphs, measurements, annotations, videos, sounds or voice data, diagnoses, and/or other related information.
  • An exam may include multiple image series of multiple modalities, volumetric imaging data, reconstructed images and/or rendered images.
  • an exam of a patient may be the brain MRI scan mentioned above, and may include each of the image series obtained on a particular date including: sagittal T1 weighted images, axial T1 weighted images, axial FLAIR images, axial T2 weighted images, as well as post contrast axial, sagittal and coronal T1 weighted series.
  • Another example of an exam may be a dual-energy radiography exam, which may include image data including traditional x-ray images, bone subtracted (or “bone out”) x-ray images, and/or tissue subtracted (or “tissue out”) x-ray images.
  • FIG. 9 illustrates two example medical exams 902 and 904 . As shown, each medical exam 902 and 904 includes multiple image series, such as image series 908 which is a part of medical exam 904 .
  • Attributes include any characteristics associated with a data item (e.g., a data item such as a medical exam, an image series, a medical image, and/or the like). Attributes may be inherited in a hierarchical manner. For example, a medical image may inherit attributes of an image series of which it is a part, and an image series may inherit attributes of a medical exam of which it is a part. Attributes may be stored as part of an associated data item (e.g., as metadata, Digital Imaging and Communications in Medicine (“DICOM”) header data, etc.) and/or separately from an associated data item.
  • image attributes include, without limitation, image angle (e.g., an angle of an image with reference to a standard one or more planes of human anatomy; also referred to herein as “scan plane” or “imaging plane”), anatomical position (and/or location) (e.g., a location, with reference to a standard one or more planes of human anatomy, of the patient represented in a particular image), image orientation (e.g., an orientation of the image with reference to a standard one or more planes of human anatomy), image rotation (e.g., a rotation of the image with reference to a standard one or more planes of human anatomy), image field of view, slice thickness, image window and/or level (e.g., a contrast of the image, a brightness of the image, and/or the like), image color map, and/or the like.
  • one or more image attributes may be user defined and/or based on user preferences/rules.
  • An image attribute may refer to the physical anatomy of a patient from which the image was obtained. For example, a medical image may be obtained to show a particular slice of a patient at a particular location such that a diagnosis of the patient may be made.
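The hierarchical inheritance of attributes described above (an image inheriting from its series, a series from its exam) can be sketched with a simple parent-chain lookup. This is a hypothetical data model for illustration, not the patent's implementation; the attribute names are examples only.

```python
class Node:
    """An exam, series, or image carrying attributes, with optional parent."""
    def __init__(self, attributes, parent=None):
        self.attributes = attributes
        self.parent = parent

    def get_attribute(self, name):
        # Walk up the hierarchy: image -> series -> exam.
        node = self
        while node is not None:
            if name in node.attributes:
                return node.attributes[name]
            node = node.parent
        return None

exam = Node({"modality": "MRI", "patient_id": "12345"})
series = Node({"imaging_plane": "axial"}, parent=exam)
image = Node({"slice_thickness_mm": 1.0}, parent=series)
```

The image thus resolves its own slice thickness directly, inherits the imaging plane from its series, and inherits the modality from the exam. In practice many of these attributes would live in DICOM header data rather than in an in-memory structure like this.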
  • Computerized Advanced Processing includes any computerized image analysis, image analysis technique, and/or image processing technique discussed herein, and/or any similar computerized processing technique that is currently or later available.
  • CAP is described herein with regard to radiology images and other types of medical images, but CAP and the systems and methods described herein may be applied in other areas including, but not limited to, other types of medical images (for example, cardiology, dermatology, pathology and/or endoscopy, among others), computer generated images (for example, 3D images from virtual colonoscopy, 3D images of vessels from CTA, and the like), images from other fields (for example, surveillance imaging, satellite imaging, and the like), as well as non-imaging data including audio, text, and numeric data.
  • CAP may include, but is not limited to, volume rendering (including, for example, multiplanar reformation/reconstruction (MPR), minimum intensity projection (MinIP), maximum intensity pixel display (MIP), color slabs with various look-up tables (LUTs), full volumes, 3D volume rendering, and/or 3D surface rendering), graphical processing/reporting (e.g., automated identification and outlining of lesions, lumbar discs etc.), automated measurement of lesions or other anatomical features, other computer aided diagnosis (CAD), machine learning-based analysis, artificial intelligence-based analysis, Bayesian analytics, other image processing techniques, and/or the like. Examples of certain types of CAP rules are described below in reference to FIGS. 8A and 8B .
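One of the volume-rendering techniques listed above, maximum intensity projection (MIP), can be sketched in a few lines: for each in-plane position, the brightest voxel along the projection axis is kept. This toy version operates on nested lists rather than a real imaging volume.

```python
def mip(volume):
    """Maximum intensity projection along the slice axis.
    volume: a list of 2D slices (each a list of rows); returns one 2D image."""
    rows, cols = len(volume[0]), len(volume[0][0])
    return [
        [max(slice_[r][c] for slice_ in volume) for c in range(cols)]
        for r in range(rows)
    ]

volume = [
    [[0, 10], [20, 30]],  # slice 1
    [[5, 50], [15, 25]],  # slice 2
]
projection = mip(volume)  # [[5, 50], [20, 30]]
```

A minimum intensity projection (MinIP) is the same computation with `min` in place of `max`; production CAP implementations would of course use optimized array libraries and operate on full volumetric data.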
  • Slice generally refers to medical images obtained, rendered, constructed, reconstructed, etc., from medical imaging data.
  • the term “slab” generally refers to a relatively thick slice.
  • Slices and slabs, and series of slices and slabs may have any attributes of medical images and series as described herein.
  • a slice may be defined by its thickness (e.g., a thickness of the cross section of imaged tissue).
  • Series of slices may be defined by their anatomical positions, by an increment between each successive slice, by a number of slices in the series, etc.
  • the successive slices may overlap (e.g., contain overlapping medical image data) or not (e.g., in some cases, there may be gaps between the successive slices).
  • slices may be combined, or reformatted/re-rendered (e.g., to change the slices' thickness, etc.) by any CAP process and/or in response to a display or transfer rule, as described herein.
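The combining/reformatting of slices mentioned above can be sketched as grouping consecutive thin slices and averaging their pixel values to produce slabs of greater effective thickness. This is an illustrative sketch only; real reformatting would operate on the volumetric data and handle overlap, gaps, and partial groups.

```python
def combine_into_slabs(slices, group_size):
    """Combine consecutive groups of 2D slices (lists of rows) by averaging,
    yielding a shorter series of thicker slabs."""
    slabs = []
    for start in range(0, len(slices) - group_size + 1, group_size):
        group = slices[start:start + group_size]
        slab = [
            [sum(img[r][c] for img in group) / group_size
             for c in range(len(group[0][0]))]
            for r in range(len(group[0]))
        ]
        slabs.append(slab)
    return slabs

# Four thin (e.g., 1 mm) slices -> two thicker (e.g., 2 mm) slabs
thin = [[[0]], [[10]], [[20]], [[40]]]
slabs = combine_into_slabs(thin, 2)  # [[[5.0]], [[30.0]]]
```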
  • The term “thin slices” generally refers to any medical images, or related images (e.g., a series of images), having certain slice characteristics/attributes.
  • thin slices may be defined based on a number of slices, e.g., medical images, that are included in an image series.
  • “thin slices” might be defined to include images of any imaging series having more than 200 slices (or any other defined number of slices).
  • Medical images may also, or alternatively, be defined based on a thickness of the cross section of the imaged tissue (“slice thickness”) represented in the medical image.
  • “thin slices” might be defined to include images having a slice thickness of less than 1 mm (or any other defined slice thickness).
  • the definition of thin slices may be adjusted based on one or more of many rules or criteria, such as a particular user, site, imaging type, viewing environment (e.g., viewing device characteristic, bandwidth of viewing device) and any number of other related attributes.
  • thin slices for a particular user at a first site may be defined differently than thin slices for another user at a second viewing site.
  • thin slices for a first modality or exam type may be defined differently than thin slices for a second modality or exam type.
  • the definition of thin slices and how thin slices (as well as thick slices) are managed may vary in response to one or more of various attributes associated with the images.
  • thin slices are typically stored in order to ensure that the thin slices are later available for use, if necessary.
  • some or all of the thin slices may not be necessary to download/view in order to allow the viewer to accurately read an image series.
  • a viewing system such as a picture archiving and communication system (“PACS”) workstation, may download many thin slices, consuming valuable bandwidth and storage space, where the downloaded thin slices are never viewed by a viewer on the PACS workstation.
  • certain thin slices, such as images associated with particular exam types or modalities, may need to be viewed in order to allow the viewer to accurately read an image series.
  • Different display environments (e.g., a combination of one or more of a display device, bandwidth, connection speed, availability of thin slices locally, etc.) may call for different handling of thin slices. Some display environments may be suited to downloading thin slices for local viewing, other display environments may be better served by having another device, such as the PACS server 120 of FIG. 1, render thin slices for viewing via a thin client interface, while still other display environments may be better suited to not view or download thin slices at all, even via a thin client interface.
  • the description below describes systems and methods that allow rules to be defined based on one or more of several attributes, such as a particular user, site, or device, as well as whether individual images and/or image series are classified as thin slices, and applied to medical images in order to determine which images are downloaded, viewed, stored, and/or any number of other actions that might be performed with respect to particular images.
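A minimal sketch of that rule idea follows, using invented field names: each user/site/device combination carries its own definition of "thin slices" (by slice thickness or by series length) and its own transfer action. This is illustrative only, not the disclosed implementation.

```python
def is_thin(series, rule):
    """Classify a series as thin slices under a given rule's definition."""
    if "max_thickness_mm" in rule and series["slice_thickness_mm"] < rule["max_thickness_mm"]:
        return True
    if "min_slice_count" in rule and series["slice_count"] > rule["min_slice_count"]:
        return True
    return False

def should_transfer(series, rule):
    """Transfer unless the series is thin and the rule says to skip thin slices."""
    return not (is_thin(series, rule) and rule.get("skip_thin_slices", False))

# Workstation A: thin = slice thickness under 1 mm
rule_a = {"max_thickness_mm": 1.0, "skip_thin_slices": True}
# Workstation B: thin = series with more than 200 images
rule_b = {"min_slice_count": 200, "skip_thin_slices": True}

series = {"slice_thickness_mm": 0.6, "slice_count": 150}
```

Under these definitions the same series is "thin" (and skipped) for workstation A but transferred to workstation B, mirroring the per-device configurability described in the text.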
  • Data store includes any computer readable storage medium and/or device (or collection of data storage mediums and/or devices). Examples of data stores include, but are not limited to, optical disks (e.g., CD-ROM, DVD-ROM, etc.), magnetic disks (e.g., hard disks, floppy disks, etc.), memory circuits (e.g., solid state drives, random-access memory (RAM), etc.), and/or the like.
  • Database includes any data structure (and/or combinations of multiple data structures) for storing and/or organizing data, including, but not limited to, relational databases (e.g., Oracle databases, mySQL databases, etc.), non-relational databases (e.g., NoSQL databases, etc.), in-memory databases, spreadsheets, comma separated values (CSV) files, eXtendible markup language (XML) files, TeXT (TXT) files, flat files, spreadsheet files, and/or any other widely used or proprietary format for data storage.
  • Databases are typically stored in one or more data stores. Accordingly, each database referred to herein (e.g., in the description herein and/or the figures of the present application) is to be understood as being stored in one or more data stores.
  • FIG. 1A is a block diagram of an example medical imaging system 100 including two PACS workstations 170 A and 170 B in communication with various devices.
  • the PACS workstations 170 A and 170 B each include computing systems that are configured to view medical data, such as series of medical images.
  • the PACS workstations 170 and remote viewing device 180 receive medical data, including medical images, via a network 160 , which may comprise one or more of any suitable network, such as local area networks (LAN), wide area networks (WAN), personal area networks (PAN), and/or the Internet, for example.
  • the PACS workstations 170 (which includes workstations 170 A and 170 B) are illustrated in communication with a PACS server 120 via a separate network 165 which may comprise a secured local area network, for example.
  • the viewing devices such as the PACS workstations 170 and the remote viewing device 180 , are configured to access, download, and/or view medical images.
  • the viewing devices may each include a special-purpose computing device that is configured especially for viewing medical images or a general purpose computing device, such as a personal computer, that executes software for viewing medical images.
  • a number of entities/devices including the imaging center computing device(s) 130 , the hospital computing device(s) 140 , the electronic medical record (“EMR”) system 150 , and one or more Computer Aided Diagnosis (CAD) systems 105 may generate and/or store medical data, including medical images of varying types, formats, etc., from various imaging devices.
  • the imaging center computing device(s) 130 and the hospital computing device(s) 140 may each include multiple computing devices that are configured to generate, store, and/or transmit medical data, including medical images, to other computing devices, such as the EMR system 150 , the PACS server 120 , the CAD system 105 , and/or any number of processing or viewing devices, such as the PACS workstations 170 and/or remote viewing device 180 .
  • the CAD system 105 may generally perform various Computer Aided Processing (CAP) actions as described above and below (e.g., analyzing images, rendering images, re-rendering images, etc.).
  • the CAD system's functionality may reside within another computing system, e.g., the PACS server 120 , the EMR system 150 , or the like.
  • each of the viewing devices 170 and 180 may be configured to access, download, process and/or view thin slices in a different manner.
  • a user of the PACS workstation 170 A may configure his workstation to not download any thin slices to the PACS workstation 170 A, where thin slices are defined as images representing a slice thickness of less than 1 mm, while the user of the PACS workstation 170 B may also configure his workstation to not download any thin slices, but with thin slices defined as images of an image series having more than 200 images.
  • other criteria may be used to determine downloading/transfer/etc. of images.
  • each of the PACS workstations 170 may download and provide for viewing to viewers different images, such as different images that are classified as thin slices.
  • the remote viewing device 180 which is representative of a device that is not dedicated to viewing medical data, such as a home computing device of a doctor (e.g., a radiologist), may define thin slices in a different manner, in consideration of bandwidth and hardware limitations, for example.
  • the systems and methods described herein allow rules for downloading, viewing, processing, storage and/or other management of thin slices, as well as other medical images and types of medical data, based on one or more of a plurality of attributes that are referred to herein as “transfer rules” and/or “display rules.”
  • the transfer and display rules may not only include rules regarding transfer of data (e.g., images, image series, exams, etc.), but may also include rules associated with viewing, processing, storing, printing, and/or otherwise manipulating or managing medical data, such as medical images.
  • Transfer and display rules related to transfer of data may be implemented in a push and/or pull architecture.
  • transfer and display rules may be applied at a device that has local access to the image data for pushing of matching image data to a viewing device or transfer and display rules may be applied at a viewing device for requesting (pulling) matching image data from the device that has local access to the image data.
  • transfer and display rules may include criteria based on any attributes of images, series of images, exam descriptions, clinical indications, DICOM information, any other attribute, and/or any combination of attributes.
  • transfer and display rules for transferring medical images to the PACS workstation 170 A may be based not only on whether medical images qualify as thin slices, but also on one or more of client (or viewing device) properties, connection properties, site properties, user properties, exam properties, and/or temporal properties, for example.
  • transfer and display rules may be based on image attributes, as well as series attributes. For example, transfer and display rules may indicate that a first type of image should be transferred, but not a second type of image, even though the two image types might be within the same image series.
  • For example, in diffusion imaging of the brain, the user might want transfer of the isotropic diffusion images, but not the large number of anisotropic diffusion images that were used to create the isotropic diffusion images on the scanner. Thus, the user might want a rule indicating that anisotropic diffusion images of the brain are not transferred, burned to CD, or printed.
  • Transfer and display rules may also include series-based rules, such as to transfer image series of a particular modality while not transferring image series of other modalities.
  • As an example of series-based transfer rules, consider a CT spine imaging exam that includes two series, each of 1,000 images × 0.5 mm, where the first series is reconstructed in bone algorithm for detection of fractures and the second series is reconstructed in soft tissue algorithm. The viewer might want to routinely transfer the image series in bone algorithm, but not the image series in soft tissue algorithm.
  • the two series may not be distinguishable by number of slices or slice thicknesses, but other information in the DICOM headers of the images in the series (e.g., series description or reconstruction algorithm) may be accessed to determine how and/or when the image series are transferred, viewed, processed, etc.
  • images may be transferred, displayed, or otherwise processed, based on imaging plane.
  • transfer and display rules may indicate specific actions to be performed based on whether the images are axial, coronal, sagittal, or any other imaging plane.
  • images may be transferred or otherwise processed based on series attributes as well as series description.
  • a brain CTA might include two large series, a large series of 0.6 mm images of the brain in bone algorithm before contrast and a large series of 0.6 mm images after contrast.
  • the user might want the second (post-contrast) series transferred routinely to evaluate the vessels, but not the first series.
  • If the clinical history is “trauma,” the user might elect to view the first series (bone algorithm) to evaluate for fractures.
  • the two large series have similar slice thickness and image numbers but differ by other criteria, e.g., series description and administration of contrast.
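The brain CTA example above can be sketched as a display rule that distinguishes otherwise similar series by DICOM-header-style fields (series description, contrast administration) plus an exam-level attribute (clinical history). All field names here are hypothetical illustrations, not actual DICOM attribute names.

```python
def series_to_display(exam_series, clinical_history):
    """Select which series of a brain CTA to display, per the example rule:
    post-contrast series are routinely shown; the pre-contrast bone-algorithm
    series is shown only when the clinical history is trauma."""
    selected = []
    for s in exam_series:
        if s["contrast"]:                   # post-contrast: evaluate vessels
            selected.append(s)
        elif clinical_history == "trauma":  # bone algorithm: evaluate fractures
            selected.append(s)
    return selected

cta = [
    {"description": "0.6 mm bone algorithm pre-contrast", "contrast": False},
    {"description": "0.6 mm post-contrast", "contrast": True},
]
```

For a routine indication only the post-contrast series is selected; for a trauma indication both series are selected, even though the two are indistinguishable by slice count and thickness alone.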
  • transfer and display rules are discussed primarily with reference to display of images and transfers from a server, such as a PACS server 120 to a PACS workstation or other client machine, transfer and display rules may also apply to other types of image processing, such as transfer from a server to other machines, such as in response to viewing of an exam, or portions of the exam, on the PACS workstation. For example, when a viewer displays an exam, the transfer and display rules might cause the thin slices to be transferred to a rendering server, and might involve transfer to more than one place.
  • the rules for a particular viewer might be to transfer all image series to the viewer (e.g. to a remote viewing device at the viewer's home or office), including CTA source images that are categorized as thin slices, and to transfer the CTA source images to a 3D rendering server (because the remote viewing device may not be capable of 3D rendering).
  • the rules for a particular viewer might be to transfer all image series to the viewer, except the CTA source images that are categorized as thin slices (because the remote viewing device is on a slow network connection), and to transfer the CTA source images to a 3D rendering server for both rendering of the images and viewing the source images in client-server mode.
  • Because transfer and display rules may be based on such a large range of attributes, transfer and display rules for each user may differ. Similarly, transfer and display rules for combinations of a user with different sites may differ, and transfer and display rules for combinations of a user with different viewing devices may differ. Thus, in one embodiment, transfer and display rules comprise a set of rules that may vary from one combination of an image, user, site, viewing device, etc., to another combination of an image, user, site, viewing device, etc.
  • systems and methods described herein provide flexibility in configuring a particular device, one or more devices at a particular site, any device operated by a particular user or class of user, etc., for downloading, viewing, and/or otherwise managing or processing medical images, including thin slices.
  • the imaging center computing device(s) 130 may include one or more imaging devices of any type, such as imaging devices that are configured to generate MRI, x-ray, mammography, or CT images (or any other type of image, as described above).
  • image data for certain medical images is stored in DICOM format.
  • the complete DICOM specifications may be found on the National Electrical Manufacturers Association website at <medical.nema.org>.
  • NEMA PS 3, Digital Imaging and Communications in Medicine, 2004 ed., Global Engineering Documents, Englewood, Colo., 2004, provides an overview of the DICOM standard.
  • Each of the above-cited references is hereby incorporated by reference in its entirety.
  • the example PACS server 120 is configured to store images from multiple sources and in multiple formats, and may be configured to render certain medical images for display on one or more viewing devices via a thin network communication link, for example.
  • the PACS server 120 may be configured to receive medical images in the DICOM format from multiple sources, store these images, and selectively transmit medical images to requesting viewing devices.
  • the hospital computing device(s) 140 may include and/or be replaced with any other medical facility, such as clinic, doctor's office, or any other medical facility. These medical facilities may each include one or more imaging devices and may share medical images with the PACS server 120 or other authorized computing devices.
  • each of the PACS workstations 170 and the remote viewing device 180 are configured to download, store, display, and allow user interaction with, medical images.
  • each of these devices is part of a respective display environment, where the display environment for each device may also include attributes such as a network connection speed, display device characteristics, allotted and/or available storage space, processing speed, and/or other attributes that may affect how medical data is downloaded, stored, and/or viewed by the devices.
  • the term “viewing device” refers to a computing system that is used in a display environment, where the viewing devices could include any type of device, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a server, or any other computing device or combination of devices.
  • each of the PACS workstations 170 and remote viewing device 180 include one or more viewing devices having associated display environments.
  • Each viewing device includes, for example, one or more computing devices and/or servers that are IBM, Macintosh, Windows, Linux/Unix, or other operating system compatible.
  • viewing devices each include one or more central processing units (“CPU”), such as conventional or proprietary microprocessors, and one or more application modules that comprise one or more various applications that may be executed by the CPU.
  • the viewing devices may further include one or more computer readable storage mediums, such as random access memory (“RAM”) for temporary storage of information and read only memory (“ROM”) for permanent storage of information, and one or more mass storage devices, such as a hard drive, diskette, and/or optical media storage device. Further details regarding the viewing devices, and other computing devices of the present disclosure, are described below.
  • the viewing devices may include one or more of commonly available input/output (I/O) devices and interfaces, such as a keyboard, mouse, touchpad, and/or printer.
  • I/O devices and interfaces for a viewing device may include one or more display devices, such as a monitor, that allows the visual presentation of data, such as medical images and/or other medical data, to a user.
  • display devices provide for the presentation of graphical user interfaces (“GUIs”), application software data, and multimedia presentations, for example.
  • a GUI includes one or more display panes in which medical images may be displayed.
  • medical images may be stored on the viewing device or another device that is local or remote, displayed on a display device, and manipulated by one or more application modules.
  • Viewing devices may also include one or more multimedia devices, such as speakers, video cards, graphics accelerators, and microphones, for example.
  • the viewing devices may include one or more communication interfaces that allow communications with various external devices.
  • the viewing devices of FIG. 1A each may interface with a network 160 that includes one or more of a LAN, WAN, or the Internet, for example.
  • the network 160 may further interface with various computing devices and/or other electronic devices.
  • the network 160 is coupled to imaging center 130, a hospital 140, an EMR 150, and a PACS server 120, which may each include one or more computing devices.
  • the network 160 may communicate with other computing, imaging, and storage devices.
  • a server may include a processor with increased processing power and additional storage devices, but not all of the input/output devices of a typical viewing device.
  • the methods described and claimed herein may be performed by any suitable computing device, such as the viewing devices and/or computing devices of imaging centers, hospitals, EMR systems, PACS, and the like.
  • various imaging devices generate medical images having a wide range of attributes/characteristics, such as slice thicknesses and slice quantities included in respective image series.
  • medical images of various modalities such as CT, MRI, PET, digital mammography (e.g., breast tomosynthesis), among others, may be processed and/or transferred for display by one or more viewing devices.
  • the medical images may include image series with thin slices, as well as thicker slices, such as thicker axial, coronal, and/or sagittal images, for example.
  • the thin slices may be stored for occasional use in the near or long term.
  • One or more sets of rules that collectively comprise transfer and display rules may be defined by individual users and/or administrators of entire sites, for example, to define what medical images should be classified as “thin slices,” based on image slice thickness and/or the number of images in a series, for example.
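The thin-slice classification just described can be sketched as a simple predicate. The parameter names and default thresholds below are illustrative assumptions (the 1 mm and 200-image figures echo an example given later in this disclosure), not a schema defined by the system:

```python
# Hypothetical predicate for classifying an image series as "thin".
# Thresholds (1 mm, 200 images) are illustrative defaults that a user
# or site could override via transfer and display rules.
def is_thin_series(slice_thickness_mm, images_in_series,
                   max_thickness_mm=1.0, min_image_count=200):
    """A series is 'thin' if its slices are at or under the thickness
    threshold, or it contains at least min_image_count images."""
    return (slice_thickness_mm <= max_thickness_mm
            or images_in_series >= min_image_count)
```

Either criterion alone can trigger the classification, matching the "and/or" phrasing above.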
  • the transfer and display rules, which may include thin-slice rules and/or other types of rules, may be stored at any location(s), such as local to a remote viewing device, local to a PACS server, and/or at a network location that is accessible to both remote viewing devices and PACS servers, for example.
  • the transfer and display rules may be stored in any data store, such as in a database or XML file, for example, on any computer readable medium.
  • the transfer and display rules may allow a user to indicate a preference as to whether thin slices are automatically displayed by default, or if thin slices are only displayed when requested by the user. For example, if an exam includes several image series, including one or more series having thin slices, a site may establish transfer and display rules that control if and how these thin slices are downloaded and/or displayed. Additionally, the transfer and display rules may allow individual users to assign rules that override the site rules. Thus, a first user at a particular display environment may be automatically presented with thin slices based on the transfer and display rules, while a second user at the same display environment is only presented with the thin slices if a specific request to download/view the thin slices is made by the second user, in view of different user rules in the transfer and display rules.
  • the transfer and display rules may indicate whether thin slices are transmitted to remote sites, such as the remote viewing device 180 of FIG. 1, based on a variety of criteria, including preferences of the individual user, characteristics of the viewing environment, the user role, available bandwidth, the software client, or many others. For example, a rule might indicate that referring doctors are not presented with thin slices for display by default. Another rule might indicate that for certain users or sites, for example, when media such as a CD or DVD is created and/or when an exam is uploaded to a personal health record (“PHR”) or EMR system, the thin slices are not included. Another rule might indicate that when film is printed, the thin slices are not printed.
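The role- and destination-based examples above (referring doctors, CD/DVD or PHR export, film printing) might be encoded as a small rule table; the structure, field names, and values here are hypothetical illustrations, not the patent's schema:

```python
# Hypothetical rule table: each rule matches on context fields and
# states whether thin slices are included for that transfer target.
DEFAULT_RULES = [
    {"when": {"user_role": "referring_doctor"}, "include_thin": False},
    {"when": {"destination": "cd_dvd_export"}, "include_thin": False},
    {"when": {"destination": "film_print"}, "include_thin": False},
]

def include_thin_slices(context, rules=DEFAULT_RULES):
    """Return the first matching rule's decision; default to True."""
    for rule in rules:
        if all(context.get(k) == v for k, v in rule["when"].items()):
            return rule["include_thin"]
    return True
```

A first-match policy is only one option; later text describes other reconciliation schemes.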
  • Rules may also indicate whether thin slices are preloaded into RAM while reading and/or may indicate the priority of preloading. For example, thin slices and/or an entire image series having thin slices might be preloaded into RAM when an exam is displayed, but the thin slices might not actually be displayed, but instead maintained in RAM for immediate access if the user decides to display the thin slice. Other alternatives for transferring, loading into memory, and/or displaying thin slices may also be specified in rules.
  • rules may indicate that (i) thin slices are to be loaded into RAM and displayed after (or while) downloading the images, (ii) the thin slices are loaded into the local memory of the viewing computer and not automatically pre-loaded into RAM or displayed until the user specifies, and/or (iii) thin slices are assigned a lowest priority for use of RAM, such that the thin slices are loaded into RAM only after the other non-thin images of an exam are loaded.
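Option (iii) above, in which thin slices receive the lowest RAM-loading priority, might be sketched as an ordering over an exam's image series; the series representation is an assumption for illustration:

```python
# Hypothetical sketch of option (iii): thin series get the lowest
# RAM-loading priority, so the non-thin images of an exam load first.
def ram_load_order(series_list):
    """Stable sort: non-thin series first, thin series last."""
    return sorted(series_list, key=lambda s: s["thin"])  # False < True

EXAM = [
    {"name": "thin_axial", "thin": True},
    {"name": "thick_axial", "thin": False},
    {"name": "coronal", "thin": False},
]
```

Because the sort is stable, the relative order of the non-thin series is preserved.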
  • transfer and display rules may include rules that not only relate to movement of images between computing devices, but also within a device, such as whether images are pre-loaded into RAM, the priority of such loading, and whether they are displayed.
  • Rules may also indicate how and when images are to be displayed, including rules for rendering of images with particular characteristics (e.g., a thickness of images to be displayed).
  • FIG. 2 illustrates a transfer and display rules table including example categories of properties/attributes on which rules may be based for a user, site, or other group of viewing devices.
  • the first two listed properties are slice thickness and number of images in series.
  • medical images may be classified as thin slices based on one or both of these first two properties.
  • transfer and display rules may include only rules used to categorize images as thin slices.
  • many other attributes associated with medical images and the viewing environment may be included in the transfer and display rules, such that different viewing environments may not only define thin slices differently, but may interface with thin slices in different manners, based on other transfer and display rules.
  • the transfer and display rules may also be used as criteria for downloading, viewing, printing, storing, deleting, and/or otherwise managing medical data that match the indicated transfer and display rules.
  • separate sets of criteria similar to the transfer and display rules, may be established for downloading, viewing, and/or storing medical images, for example.
  • the properties include one or more client properties, such as a client ID that may identify a particular viewing device and/or viewing environment, for example.
  • the client properties may include individual characteristics of a client, such as hardware and/or software capabilities of a particular viewing device.
  • the connection properties may indicate various characteristics of a connection between a viewing device and the Internet, for example, such as a connection speed, type of Internet service, and/or Internet service provider.
  • the site properties may include a site identifier that identifies a particular site, class of sites, or group of sites.
  • the user properties may indicate a particular user, such as by a user ID, username, or the like, as well as a user role, user access rights, user preferences, etc.
  • site rules may indicate a default definition for thin slices, as well as rules for how thin slices are downloaded, viewed, and/or otherwise used by viewing devices associated with the particular site.
  • user rules may indicate a user specific, or user group specific, definition for thin slices, as well as user rules for how thin slices are downloaded, viewed, and/or otherwise used by viewing devices which the user is controlling.
  • the site rules and the user rules may differ and, accordingly, may be reconciled by overriding conflicting rules with the site rules or the user rules, by requesting further input from the user, or by applying some other rules for reconciling the conflicts.
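One plausible reconciliation scheme for the site/user conflicts noted above is to start from the site rules and let user rules override any rule the site has not locked; the dictionary-based encoding and the `site_locked` parameter are illustrative assumptions:

```python
# Hypothetical reconciliation: user rules override site rules except
# for keys the site administrator has marked as locked.
def reconcile(site_rules, user_rules, site_locked=()):
    merged = dict(site_rules)
    for key, value in user_rules.items():
        if key not in site_locked:
            merged[key] = value
    return merged
```

Swapping the argument order would instead make the site rules always win, which is the other reconciliation option the text mentions.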
  • the data structure of FIG. 2 also indicates that rules might be based on exam properties and/or temporal characteristics.
  • rules associated with one or more exam properties such as an exam type, imaging device, referring doctor, imaging center, etc., may be considered in determining whether images are transferred to certain viewing devices.
  • Other rules might indicate that only thin slices having a particular type of compression are to be transferred or viewed by a particular user, client, and/or site, for example.
  • Such rules may be used in a real-time manner by a PACS server, to select a compression type that matches a compression type indicated in a rule so that the corresponding thin slices may be transmitted to the associated viewing devices.
  • Rules may also be based on one or more temporal properties, such as when certain images should be downloaded, viewed, etc., by particular viewing devices, users, sites, etc.
  • a user may define a temporal rule indicating that thin slices, defined by the user and/or site rules, for example, should be downloaded to the user's viewing device only after 7:00 PM on weeknights or on weekend days.
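The temporal rule in this example (downloads permitted only after 7:00 PM on weeknights, or at any time on weekends) might be checked as follows; the function name is a hypothetical illustration:

```python
from datetime import datetime

# Hypothetical check for the temporal rule above: thin-slice downloads
# are allowed after 7:00 PM on weeknights, or any time on weekends.
def download_allowed(now):
    is_weekend = now.weekday() >= 5   # Saturday=5, Sunday=6
    is_evening = now.hour >= 19       # 7:00 PM or later
    return is_weekend or is_evening
```

A real scheduler would evaluate this against the viewing device's local time before queuing transfers.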
  • the data structure of FIG. 2 also indicates that rules may be based on various other exam, series, and image attributes.
  • images 912, image series 908, and exams 902 and 904 may have associated attributes.
  • attributes may propagate through the hierarchy.
  • an exam 904 may include multiple image series 908 , and each image series may include multiple images 912 .
  • an instance of a lower order may inherit one or more attributes associated with an instance of a higher order.
  • a user may choose to override the inherited attributes in each order having those attributes.
  • a user may tailor a rule based on these hierarchical orders/relations of various attributes.
  • a user may specify a rule requiring that the first ten image slices (an image attribute) of a patient's head in the sagittal plane (a series attribute), from exams taken within the last two years (an exam attribute), be reformatted to 2 mm thickness and transferred to a hospital computing device 140.
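The exam → series → image inheritance described above can be sketched as successive dictionary merges, with each lower order overriding what it inherits from the order above; the attribute names are illustrative assumptions:

```python
# Hypothetical sketch of attribute inheritance down the hierarchy:
# exam-level attributes flow to each series, series-level to each
# image, and a lower order may override what it inherits.
def effective_attributes(exam_attrs, series_attrs, image_attrs):
    merged = dict(exam_attrs)
    merged.update(series_attrs)  # series overrides exam
    merged.update(image_attrs)   # image overrides series
    return merged
```

Here a series-level slice thickness overrides the exam-level default, matching the override behavior described above.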
  • the data structure of FIG. 2 also indicates that rules may be based on prior exam display attributes.
  • a set of medical data having a certain modality may benefit from sharing/inheriting one or more attributes from a prior exam having the same modality.
  • slice thickness rules from a prior exam may be applied to a later-taken same modality exam and facilitate direct comparison between the two medical data sets. Additional examples of transfer and display rules and functionality of the system based on prior exam display attributes are described below.
  • rules associated with one or multiple of the attributes/properties indicated in FIGS. 2 and 9 may be considered in determining if, when, and/or how, medical images, such as thin slices, are transferred, presented for viewing, viewed, stored, printed, etc.
  • a first viewer may set user rules indicating a personal preference to have only thick sections of CT scans displayed as a routine while reading, but also to have the thin slices preloaded into RAM and displayed only if the viewer goes into MPR or volume rendering mode.
  • the same user may also set rules indicating that for certain exam types, e.g., CTAs, the thin slices are presented for viewing as a default, overriding the user's general preference to have only thick sections of CT scans displayed as a default.
  • the user may also establish rules associated with different sites, such that when the viewer is at home, the same CTA thin slices are not displayed.
  • Such rules may also be defined in terms of connection properties, such that if the home viewing device does not meet a particular minimum access speed, the thin slices are not downloaded to the viewing device or the thin slices are downloaded to the viewing device in the background, or the thin slices are downloaded only at certain days of the week and/or times, for example.
  • the user rules that comprise transfer and display rules may include various attributes and may be combined in various manners to produce robust and precisely customizable rules for transferring, viewing, and managing medical images.
  • FIGS. 3A, 3B, 3C illustrate example data structures having rules of varying types that may be included as part of transfer and display rules.
  • the transfer and display rules described in reference to FIGS. 3A, 3B, and 3C, and other rules as described below (e.g., in reference to FIGS. 6, 8A, and 8B) may be stored in one or more databases of the system (e.g., on the PACS server 120, the CAD system 105, the EMR system 150, any other systems shown in FIGS. 1A and 1B, or any combination of the above).
  • transfer and display rules may vary depending on many attributes of a medical image or image series, as well as the respective viewing device to which the medical image might be transferred.
  • transfer and display rules for a particular medical image might classify the medical image as a thin slice based on site rules of a first viewing device, while the same medical image is classified as a thick slice based on site rules of a second viewing device. Accordingly, the eventual transfer, viewing, and/or management of the medical image (or lack thereof) by the first and second viewing devices may be quite different. While the data structures of FIGS. 3A, 3B, and 3C illustrate certain rules that may be included in transfer and display rules, rules based on any other attributes may also be included in other systems. Likewise, transfer and display rules may include fewer rules and/or types of rules than is illustrated in the data structures of FIGS. 3A, 3B, and 3C.
  • the transfer and display rules may include more specific criteria for a particular type of rule than is illustrated in FIGS. 3A, 3B, and 3C.
  • the example of FIGS. 3A, 3B, and 3C is not intended as limiting the scope of rules or attributes on which transfer criteria may be based, but rather is provided as one example of such transfer and display rules.
  • FIGS. 3A and 3B illustrate a sample data structure including user-specific rules.
  • the user establishes different rules for each of two locations associated with the user (home and hospital).
  • section 310 of the data structure of FIG. 3A includes rules that are specific to a viewing device at the user's home (but could be applied to various other locations or other criteria).
  • a user may specify an orientational preference of the image (e.g., field-of-view and W/L ratio) and/or dimensions of each image (e.g., slice thickness and increment among the slices).
  • a user may specify, in a client-server type of setting, where image processing should occur and where the resulting rendered images should be stored.
  • rules may conflict with other rules when strictly followed.
  • a user may have specified a rule to display an axial view of an exam when a patient is diagnosed with breast cancer and another rule to display a sagittal view when a patient has a history of spinal injury.
  • the system may reconcile such conflicting rules. For example, some embodiments may evaluate the rules in a sequential order and allow, generally, a later evaluated rule to override a prior conflicting rule, or vice versa.
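The sequential-evaluation scheme, in which a later matching rule overrides an earlier one, might look like the following; the breast-cancer/spinal-injury conditions mirror the example above, while the encoding itself is hypothetical:

```python
# Hypothetical sequential evaluation: every rule is checked in order,
# and a later matching rule overrides an earlier match.
def select_view(patient, rules):
    view = None
    for condition, result in rules:
        if condition(patient):
            view = result  # later match wins
    return view

VIEW_RULES = [
    (lambda p: "breast cancer" in p["diagnoses"], "axial"),
    (lambda p: "spinal injury" in p["history"], "sagittal"),
]
```

For a patient matching both rules, the later spinal-injury rule determines the displayed view under this scheme; reversing the loop would give the opposite precedence.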
  • rule reconciliation may be based on a hierarchy of rights or user specified exceptions.
  • the rules associated with the user's home also include exceptions for a particular image type, specifically, a CT image type in this embodiment.
  • the user has established rules that define how images are classified as thin slices, and further has defined how the thin slices should be managed by the user's home viewing device, with a particular set of rules for only CT images.
  • the user may also establish rules in section 320 that are specific to a viewing device used by the user at a hospital location.
  • the viewing devices of the user at the home and hospital locations may receive different sets of the same medical images.
  • the user may specify any set of rules to configure the system to cater to the user's needs. For example, a user may specify a rule that runs computer-aided diagnosis (CAD) to detect an abnormality and, when an abnormality is detected, automatically transfers only the image set containing the abnormality (a transfer rule) to a second authorized user who can give a second opinion.
  • the second user may have his or her viewing device configured such that it runs one or more coloring CAPs to enhance the images and reformats the slice thickness (a display rule) automatically before the user opens the images.
  • FIG. 3C illustrates a sample data structure including site rules, such as those that are developed by an administrator of a particular medical imaging center, doctor's office, hospital, or other site that views and/or receives medical images.
  • site rules may be used as default rules for all viewing systems and/or users at or associated with the site.
  • the site rules indicate which rules can be overridden by user rules such as those in FIGS. 3A and 3B .
  • the site rules may be configured to always override conflicting user rules or the site rules may be configured to always be overridden by user rules.
  • other rules are used for reconciling conflicting site rules and user rules.
  • rules based on various other attributes may be established by users of the system, and similar reconciliation schemes or rules may be used to determine the proper transfer and display rules for a given image or image series.
  • FIG. 4 is a flowchart illustrating an embodiment of a method of the present disclosure, including, e.g., executing rules and CAP processes, rendering images/series, transmitting and displaying images, and various other functionality related to transfer and display rules such as reconciling default, such as site rules, with user rules.
  • various sets of rules such as site rules and user rules, that are associated with a particular medical image or series of medical images, are reconciled in order to determine a set of transfer and display rules associated with the particular medical image and viewing environment.
  • the transfer and display rules that are applied to a particular medical image may vary depending on rules associated with one or more of the viewing environment, the viewer, the site, the connection, and/or the client, for example.
  • the method of FIG. 4 is performed by a device that stores the medical images, such as the PACS server 120 or EMR system 150 of FIGS. 1A and 1B .
  • the method of FIG. 4 may include fewer or additional blocks, and blocks may be performed in a different order than is illustrated.
  • Similar methods may be performed in order to determine other functionality with respect to medical images, such as whether medical images should be displayed, stored, printed, etc., based on the determined transfer and display rules.
  • Software code configured for execution on a computing device in order to perform the method of FIG. 4 may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium. Such software code may be stored, partially or fully, on a memory device of the computing device in order to perform the method outlined in FIG. 4 by those respective devices.
  • default rules such as site rules and user rules are accessed.
  • site rules and user rules may differ in various ways, including in both defining what medical images should be classified as thin slices and in determining how thin slices and/or thick slices should be treated in view of multiple possible attributes.
  • transfer and/or display rules based on the default/user/site/etc. rules are determined.
  • the default and/or user rules may include rules defining required or excluded properties of many types, such as properties of the viewing environment, the viewer, the site, the connection, the exam, and/or the client, for example.
  • the default/user/site/etc. rules may include an optional block to execute one or more CAP processes. Examples of executing CAP processes are described below in reference to FIG. 7 and other figures.
  • the system may optionally render images based on determined display rules or the results of block 432 . For example, a user may have defined a rule in which a calcification detection algorithm is executed at block 432 and an associated rule that renders the images with thicker slices is executed at block 434 . Further examples of rendering images in response to CAP processes and various rules are described in reference to various figures below.
  • the system determines, based on one or more transfer and display rules, whether to transmit the images (e.g., rendered images) to one or more computing devices. If so, the system applies the transfer rules to the image series and/or images in the image series at block 440 .
  • the determined transfer rules are applied to the image and/or image series in order to determine how the images should be transferred, and/or stored, if at all, to one or more receiving devices (e.g., a device that is scheduled to receive the particular image series or a device that requests image series).
  • medical images such as thin and thick slices that meet the transfer rules, are transferred to the viewing device.
  • the system determines whether to display the rendered images, e.g., based on one or more display rules. If so, then at block 454 the system applies the display rules to the image and/or image series in order to determine how the images should be displayed by one or more viewing devices. Display of images according to display rules is described in further detail below.
  • one or both of user-side and server-side applications operate upon a similar dataset of information, such as a dataset representing a series of medical images encompassing a volume of tissue.
  • Such applications may be configured to render medical images from the dataset, for example, such as by using advanced processing of cross-sectional medical image series.
  • the server typically stores a large amount of medical data, including datasets associated with medical images, because the server cannot anticipate which medical data will require such advanced processing.
  • the costs for such server-side systems may be high.
  • various simultaneous users may compete for server availability and if the server goes down or becomes unavailable, all users may be adversely affected.
  • in a pure client-side approach, when a user needs to process a series of images, the images must first be loaded into the memory of their viewing device (client), which often involves transmission of the datasets to the client from a storing server, which may cause a delay of several seconds to several minutes or more as the dataset is made available to the client application.
  • a pure client-side approach requires that the application be loaded and maintained on each client.
  • certain viewing devices and/or image serving devices, such as a PACS server or EMR system, execute a smart switching software module.
  • a smart switching application may include a client-based software application capable of processing data (in this case, rendering of CT, MRI, PET, or Ultrasound, for example, as multiplanar or volume rendered images) and optionally a server-based software application with that capability residing on the same network (e.g., network 160 and/or 165 of FIGS. 1A, 1B).
  • the smart switching module on a viewing device is configured to detect the location of a desired dataset and, if the dataset is available on the rendering server, the processing operation(s) (e.g., rendering of the medical images) occurs there, and the results may be viewed on the viewing device via a thin-client viewer, for example.
  • if the dataset is not available on the rendering server, the rendering server is unavailable or busy, and/or there is no rendering server, the dataset may be processed on (possibly after transmission to) the client-based smart switching module on the viewing device.
  • smart switching may be performed in response to application of transfer and display rules that are based on any attributes, such as any one or more of the properties listed in FIGS.
  • transfer and display rules may include attributes related to a particular user (or group of users) or site (or group of sites), such that transfer and display might be performed differently based on the current user and/or site, for example.
  • rules controlling smart switching may be part of the transfer and display rules or may be a separate rule set.
  • a user and/or viewing device may want selected image series transferred to a rendering server as opposed to (or in addition to) transfer to a viewing device.
  • a viewer may want to do side-by-side volume rendering of a patient's abdominal aortic aneurysm by comparing the current exam to one from 2 years ago, but the thin slices from the old exam are not in the cache of the rendering server.
  • the transfer and display rules may indicate that the images from the 2-year-old exam are transferred from a PACS server (that still stores the images) to a rendering server.
  • transfer and display rules might specify that thin slices are not transferred to a viewing device, but instead might specify that the series be transferred to a central rendering server for 3D processing.
  • Any other set of transfer and display rules may be used to control transfer of images to a rendering server, one or more viewing devices, and/or other devices.
  • transfer and display rules might specify that the images be transferred to a processing server (in this case, for computer-aided diagnosis (CAD)) as well as to the viewing device for viewing.
  • the axial series may be defined by the individual user or site as “thin” because of a slice thickness criterion of under 1 mm, or because of an images-per-series rule of >200 images.
  • transfer and display rules may indicate a user's preference to not display the thin series when images are typically viewed for reading, or to not display them only under certain reading circumstances, such as when the client is running with a slow network connection.
  • the user is working on a client that has not downloaded the thin slices and decides to perform additional image processing while viewing the exam (e.g., perhaps the user wishes to create oblique slices or volume-rendered images).
  • the system recognizes that the thin slices are not present locally, determines that the thin slices are on a connected rendering server, and enables the user to access the advanced processing application on the server via a thin client (or other connection type), with the rendering operations performed on the server side.
  • if the smart switching application determines that there is no server-side rendering engine available, the smart switching module may determine that the thin slices are available but will need to be downloaded, and may initiate download (and possibly background downloading) of the thin slices.
  • the user is working on a client that has downloaded the thin slices into local memory, and decides to perform additional image processing on the thin slices.
  • the smart switching modules may be configured to determine if there is a networked rendering server with a copy of the thin slices, and if so, initiate the processing operations on the server side immediately, with display on the client via a thin client viewer and, if there is no rendering server, make the locally downloaded thin slices available to the locally resident processing application instead.
  • Use of one or more smart switching modules may advantageously load-balance activities of clients and servers in a way that can be user defined and based on real-time availability of one or more servers, for example. Such load balancing may allow the rendering server to store fewer exams, while allowing for decreased processing times to process data sets that might normally be processed by a client when the server is available for processing.
  • a smart switching system for advanced image processing may be combined with rules of varying types, including rules for defining thin slices.
  • a smart switching module supports the following workflow: (1) if a rule is set (e.g., a user, site, device, or other rule) to show no thin slices, the thin slices are not displayed under the indicated rule preferences (e.g., site rules may indicate that thin slices are not to be downloaded or viewed at a home location), and only the thick slice images are presented; (2) if a user wants to have an exam automatically loaded in advanced visualization mode (or otherwise processed) by rule or switch to that mode, and the thin slices are available on the rendering server, the thin slices can be rendered by the rendering server with minimal delay (as compared to the delay in rendering that might be experienced if rendered on the client); and/or (3) if a user wants to have an exam automatically loaded in advanced visualization mode by rule or switch to that mode, but the images are not available on any rendering server (e.g., the rendering server(s) are busy, down, not available, or absent entirely), the local client is used to render the thin slices, such as by making the appropriate dataset available to the client processing application.
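The three-branch workflow above reduces to a small decision function; the function name, flags, and return values are illustrative assumptions:

```python
# Hypothetical decision function for the three-branch smart-switching
# workflow: suppress thin slices, render server-side, or fall back to
# client-side rendering.
def choose_renderer(show_thin, server_available, server_has_dataset):
    if not show_thin:
        return "thick_only"        # (1) rules suppress thin slices
    if server_available and server_has_dataset:
        return "render_on_server"  # (2) server renders; thin-client view
    return "render_on_client"      # (3) fall back to local rendering
```

Branch (3) covers both a busy/down server and a server that lacks the dataset.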
  • a rendering server may comprise a cluster or grid of computing devices.
  • a rendering server may include many computing devices that could serve as rendering servers, dedicated rendering servers, and/or other computing devices that can also serve as a rendering server, such as a PACS workstation that renders images in a background process for other viewing devices.
  • images may or may not be stored at each rendering device (where there are multiple possible rendering devices). For example, if there are multiple rendering devices capable of serving as a rendering server, it may not be efficient for each of them to have a copy of all the cases that might be subjected to rendering.
  • Rendering may include any type of image generation or modification, such as 3D image rendering, 2D image rendering, adjusting a bit depth of images (e.g., adjusting 12-bit DICOM images into 8-bit JPEG images so they can be displayed in browsers and other light clients), and/or any other image manipulations.
  • 3D rendering is discussed herein; however, any discussion of 3D rendering should be construed to also describe any other type of rendering.
  • a viewing device queries the group of rendering devices, or a device that load-balances rendering requests, with a request that the “fastest” rendering server provide processing.
  • the “fastest” server might not be the fastest rendering device or least busy rendering device, but the one with the most rapid access to the data. For example, consider a series of imaging centers connected with a WAN, each with its own PACS server and 3D rendering device (could be the same machine).
  • a user at site A might need to perform interactive 3D rendering of an exam at remote site B. While equivalent 3D rendering devices might exist at both site A and site B, the system may utilize the remote rendering server at site B because it has the fastest connection to the data, as the data resides locally at site B. Similarly, if images are stored at one rendering device, but not at others, the system may select the rendering device that already has a copy of the images for rendering, rather than another rendering device that might otherwise be used for the rendering.
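The data-locality preference described above can be sketched as a small selection routine. This is an illustrative assumption, not the patent's actual implementation; all class, field, and function names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class RenderingServer:
    site: str
    available: bool = True
    local_exams: set = field(default_factory=set)  # exam IDs stored locally

def select_renderer(servers, exam_id):
    """Pick the 'fastest' server: one that already holds a local copy of the
    exam data; otherwise any available server; None signals client rendering."""
    candidates = [s for s in servers if s.available]
    for s in candidates:
        if exam_id in s.local_exams:
            return s  # most rapid access to the data wins
    return candidates[0] if candidates else None

# Site B stores the exam locally, so it is preferred over site A's server.
servers = [RenderingServer("A"), RenderingServer("B", local_exams={"exam-42"})]
chosen = select_renderer(servers, "exam-42")
```

In this sketch, returning `None` plays the role of the smart switching fallback: with no rendering server reachable, the locally downloaded data would be handed to the client-resident processing application.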
  • rendering of multiple exams requested by a particular viewer/viewing device might occur on different rendering devices. For example, a viewer may need to render a current exam and an old exam, such as to compare the images of the exams. If the old exam was performed at a hospital and the new exam at an imaging center, where it is also being read, the viewer at the imaging center needs to perform 3D volumetric rendering of the two exams, to assess for growth of an abdominal aortic aneurysm, for example.
  • rendering of the old exam might be performed by a first rendering device (e.g., at the hospital where the exam is stored), while the new exam is rendered by a second rendering device (e.g., a client machine at the imaging center).
  • FIG. 5 is a block diagram of three sites that are configured to perform one or more of the operations noted above.
  • site A 510 includes an image server and archive, a 3D rendering server, and a PACS workstation
  • each of Sites B 520 and C 530 also includes an image server and archive and a 3D rendering server.
  • the PACS workstation at site A 510 is operated by a viewer (e.g., radiologist or other doctor) in order to view medical images that are rendered by any of the site A, B, or C rendering servers.
  • the PACS workstation at site A may be replaced by any other viewing device, such as a desktop computer configured to access medical images via a browser or other software, a smart phone, a tablet computer, and/or a laptop computer.
  • Site A has requested rendered images that include thin slice images (e.g., either in response to rules that initiate automatic rendering/transfer of the images and/or a request for the rendered images from a user of the PACS workstation at site A 510 , for example).
  • the workstation A utilizes the rendering server at Site B 520 , which has fast local access to the thin slice image data.
  • processing power at Site A 510 may be used for other purposes while Site B 520 renders the images.
  • the 3D images may be rendered locally at Site A 510 .
  • the rules for defining which device should perform rendering may be based on any number of factors.
  • the rendering servers at Sites B 520 and Sites C 530 may be automatically utilized to render the images at those remote sites.
  • such remote rendering may be selected in response to rules, such as a rule that automatically initiates remote rendering of image data for images that are defined as thin slice images and/or based on the bandwidth between the remote rendering servers (e.g., at sites B 520 and C 530 ).
  • images may be rendered at multiple remote rendering devices (e.g., rendering servers at Sites B 520 and C 530 ) for viewing at a workstation (e.g., PACS workstation at Site A 510 ).
  • images may be rendered by each of the rendering servers at Sites B 520 and C 530 using the respective locally available image data so that Site A 510 may not need to do any image rendering in order to view rendered images of two exams that have been rendered by different remote rendering servers.
  • images may be rendered in locations determined based on user-defined rules, CAP results, attributes of a viewing device (e.g., screen resolution, location of the device), and/or various other factors, as described below.
  • FIG. 6 illustrates additional example display rules of the present disclosure.
  • some medical imaging modalities that previously showed only 2D images can now show 3D images or image data, and/or 2D images may be rendered from 3D image data.
  • many ultrasound systems can obtain 3D image data of a fetus, which may be rendered into one or more 2D images.
  • Different users may prefer rendering, transfer, and/or display of 2D slice images (e.g., from 3D medical imaging data) with specific slice thicknesses (and/or other image characteristics).
  • a user may prefer to view image slices with a 3 mm thickness instead of a 0.5-1 mm thickness. Such a preference may be associated with one or more image characteristics.
  • a rendering device may, in response to rules set by the user, reformat an image set having a different thickness and display it as if the image set originally had a 3 mm thickness.
  • a user may provide one or more display rules to a rendering device, be it a server, client, or in some cases switching between the two, that may retrieve said rules and reformat images based on the retrieved rules.
  • FIG. 6 provides example display rules having conditions (left column) and associated actions (right column).
  • rule 602 specifies that when a rendering device gets an image set having each slice less than 0.5 mm thick, it will reformat the image set to output a slab having thickness 3 mm. Rules can be flexible and can be defined to trigger on a wide variety of conditions.
  • triggering conditions include triggering when the input image set is associated with a specific plane (e.g., an oblique view condition 608 ), when an exam property/attribute shows that the patient is a high risk patient (e.g., a patient from a family with a history of cancer 610 ), and/or when an image set has too many slices (e.g., more than a thousand slices 612 , making it difficult to visually manage).
  • a rule may have a single condition or have a combined condition (e.g., multiple conditions) that comes into effect only when multiple conditions are concurrently satisfied.
  • Rule 604 is one such rule with a combined condition, in which a condition of exam type being tomography is combined with a condition of each image slice having a thickness less than 0.5 mm. Rule 604 will come into effect, rendering 2 mm slabs, only upon encountering an image series that satisfies both criteria.
  • FIG. 6 shows another example of a multiply-defined condition 606 where a patient age is less than 30 years and exam type is tomography. This condition will not trigger the reformat when a patient's age is 31.
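The single-condition and combined-condition rules described above can be sketched as condition lists that must all hold before a rule's reformat action applies. This is a hypothetical illustration; the rule structure and field names are assumptions, not the patent's implementation.

```python
# Each rule pairs a list of condition predicates with a reformat action.
def matches(rule, image_set):
    """A rule triggers only when every one of its conditions is satisfied."""
    return all(cond(image_set) for cond in rule["conditions"])

rule_602 = {  # single condition: slices < 0.5 mm -> output 3 mm slabs
    "conditions": [lambda s: s["slice_mm"] < 0.5],
    "action": {"slab_mm": 3.0},
}
rule_604 = {  # combined: tomography AND slices < 0.5 mm -> 2 mm slabs
    "conditions": [lambda s: s["exam_type"] == "tomography",
                   lambda s: s["slice_mm"] < 0.5],
    "action": {"slab_mm": 2.0},
}
rule_606 = {  # combined: patient age < 30 AND tomography
    "conditions": [lambda s: s["age"] < 30,
                   lambda s: s["exam_type"] == "tomography"],
    "action": {"slab_mm": 2.0},
}

# A 31-year-old tomography exam with 0.4 mm slices: rules 602 and 604
# trigger, but rule 606 does not, matching the behavior described above.
image_set = {"slice_mm": 0.4, "exam_type": "tomography", "age": 31}
```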
  • the system is not limited to reformatting slices from thin to thick and should not be construed as offering only a unidirectional process.
  • the system is equally capable of reformatting slices from thick to thin (e.g., by generating thin slices from source volumetric/3D data, or source thinner slices, etc.).
  • a default rule may suggest 3 mm thickness slabs but some user rules may prefer to reformat to 1 mm slices.
  • a display rule for a high risk patient 610 may be one such situation where reformatting to thin slices with small increments may allow the viewer to have a finer inspection of the image data.
  • a user may specify any thickness and increment with rules similar to those shown in FIG. 3A . More specifically, rules 318 and 320 show an embodiment specifying thickness and increment, respectively.
  • a user may specify a plurality of rules to be concurrently in effect.
  • the high risk patient rule 610 and number of slices rule 612 can both apply to a same image set.
  • a prioritizing scheme that overrides one rule with the other may be implemented. For example, where rules 610 and 612 conflict, the system may give higher priority to prior-defined rule 610 and ignore rule 612 , or vice versa.
  • all triggered rules will run and the last defined rule may override all prior rules' renditions of images. Any rule reconciliation scheme with a determinable outcome may be adequate.
  • Some rules may specify “overlap” of slices along with reformatting thickness, as indicated by example display rules 606 and 612 . Having overlapping slices may be helpful in providing a viewer with a sense of image continuity and/or allowing the user to perform some CAP to prepare the images for better analysis, such as filtering with a moving average filter to remove noise in the image sets.
  • Overlapping slices are a function defined in both slice thickness and increment between the slices. For example, where slice thickness is specified to be 3 mm and increment 4 mm, there will be a 1 mm gap between the slices. Where thickness and increment are specified to be 3 mm, sequential image slices will have neither gaps nor overlaps. Where slices are 3 mm and increment is 2 mm, each slice will have an overlap of 1 mm on each end with an adjacent slice.
  • a user may modify either parameter to reformat images and obtain desired overlaps. It should be apparent to a person skilled in the art that a slice overlap is not limited to one or two slices. For example, a user specifying 0.5 mm thickness and 0.2 mm increment (rules 318 and 320 ) may get a slice that overlaps with at most 4 other slices.
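The thickness/increment arithmetic above can be captured in two small helper functions. The function names are illustrative; the arithmetic follows directly from the examples given (3 mm/4 mm yields a 1 mm gap, 3 mm/2 mm yields a 1 mm overlap, 0.5 mm/0.2 mm yields overlap with 4 neighboring slices).

```python
import math

def overlap_mm(thickness, increment):
    # Positive -> adjacent slices overlap by this amount;
    # negative -> there is a gap of this size between slices;
    # zero -> sequential slices have neither gaps nor overlaps.
    return thickness - increment

def overlapping_neighbors(thickness, increment):
    # Number of other slices an interior slice overlaps with:
    # each slice spans ceil(thickness / increment) slice positions,
    # overlapping (that count - 1) neighbors on each side.
    per_side = math.ceil(thickness / increment) - 1
    return 2 * per_side
```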
  • the rules in FIG. 6 are by no means exclusive and rules bearing on other attributes or display environmental variables are also available.
  • An example of an attribute not specified in FIG. 6 is a “user type” ( FIG. 3A gives an example of Radiologist as a value to this attribute).
  • An example of a display environmental variable is network speed.
  • transfer and display rules may also take into account other database entries or information in a DICOM header file, such as the patient's age. For example, a user may put forth rules to render slices of a breast tomosynthesis exam progressively thinner based on a patient's age, such as a first rule requesting 1.0 mm for patients 30 years old and above, a second rule requesting 0.9 mm for patients 27 to 30 years old, and a third rule requesting 0.8 mm for everyone else.
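The age-tiered thickness rules just described can be sketched as an ordered rule list where the first matching rule wins (ordering also resolves the boundary at exactly 30 years). The structure and names are hypothetical illustrations, not the patent's implementation.

```python
# Ordered (condition, thickness) tiers; evaluated top to bottom.
AGE_RULES = [
    (lambda age: age >= 30, 1.0),  # first rule: 30 and above -> 1.0 mm
    (lambda age: age >= 27, 0.9),  # second rule: 27 to 30 -> 0.9 mm
    (lambda age: True, 0.8),       # third rule: everyone else -> 0.8 mm
]

def slice_thickness_for_age(age):
    """Return the rendered slice thickness (mm) for a patient's age."""
    for condition, thickness_mm in AGE_RULES:
        if condition(age):
            return thickness_mm
```

Because the first matching tier wins, the second tier can be written as a simple `age >= 27` check; the `>= 30` tier already shadows it for older patients.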
  • rules may incorporate other relevant information that pertains to a patient's risk factors, such as age of the patient or a family history of a particular cancer or other diseases.
  • a user may also create a rule based on other information in the DICOM header, such as view, and specify to render any oblique image data with 4 mm thickness.
  • these rules may be applied as a default rule to apply to image series. For example, if a rule applied as a default specifies presentation of 3 mm slabs where original images comprise 0.5 mm slices, the system may automatically reformat the 0.5 mm slices to 3 mm slabs.
  • the system may automatically reformat images following an exam data generation event or in response to user requests (e.g., user sends the system a command equivalent to “read image data”).
  • the system may render image data compliant with predetermined rules at block 434 .
  • the term “reformat” is not to be construed narrowly but broadly to include, in a nonlimiting way, a decrease in resolution as well as an increase in resolution, applying interpolation algorithms or the like, in addition to the already discussed adjustments in slice thickness. Conversion to lower resolution, if not so low as to affect later analyses, may alleviate storage and bandwidth concerns.
  • a user may manually select one or more display rules and request the rendering device to apply the rules.
  • An image set rendered at display block 454 may, in some instances, need to be adjusted by a user, and the system may allow the user to specify and manually apply temporary rules (that may later be saved and become a permanent rule) to reformat images for a more in-depth analysis. For example, a user may identify a different complication while viewing an image set with a known abnormality and may want to reformat the image set with a rule that can best show the complication.
  • FIG. 6 shows an example rule structure showing what forms of conditions and actions various rules may take.
  • the system may apply a display rule and/or a transfer rule at blocks 432 , 434 , 440 , 454 , and/or 460 .
  • a rule taking on network speed, data compression, and/or storage space as parameters at block 440 may be a transfer rule.
  • while a display rule may leave the original image data intact even though it displays a subset of the image data, a transfer rule may cause transmission of only a subset of images.
  • a display rule is unlikely to erase the original after display but a transfer rule may cause an original image to be erased after transfer.
  • rules may take on results of CAP as a condition parameter.
  • CAP actions are described in further detail below in reference to FIG. 7 .
  • the system executes blocks 702 , 704 , and/or 706 of the flowchart of FIG. 7 to obtain a result useful for evaluating the outcome of the rule's condition (unless a relevant CAP result is already available from some previous execution of FIG. 7 blocks). See the section on Computer-Aided Analysis and Rendering of Medical Images below for how an independent call to FIG. 7 blocks may be made.
  • rules 614 and 616 may be checked and if satisfied, blocks 708 , 710 , and 712 of FIG. 7 may be performed at block 434 of FIG. 4 .
  • the system may allow a user to set rules, based on the modality and the nature of the exam or other attributes, to initially display the image set in a helpful context. For example, a certain user may, through his/her experience with the system, find one configuration to work best and specify one or more rules catered to the user's specific purpose.
  • transfer and display rules may be based on one or more of exam attributes, series attributes, image attributes (e.g., image plane), prior exam display attributes, and/or various other attributes or criteria.
  • referring to FIG. 3A , example display rules may include specifying particular image planes, Field of View (FOV) 314 and 332 , and window and level (W/L) 316 . Additional examples are described below and shown in FIGS. 3B and 3C .
  • a user may, for example, set up a rule for a 3D ultrasound image of a baby to initially display a particular view or image plane, with a particular slice thickness, and with particular window and/or level settings (among other settings).
  • These rules may provide (by specifying display parameters such as image plane, color, thickness, W/L, FOV, and numerous others) a uniform initial presentation of different image sets (a presentation, here, includes the concept of orientations and is defined to have a broader meaning than an orientation). Where there are many different viewers sharing a few viewing devices and each viewing device is frequented by numerous viewers, there are multiple benefits to having user-specific presentation of images.
  • the system enables a user to apply display rules for specific orientation, zoom, or performing other transformations to best suit the image to the user's specific needs. It should be noted that these display rules may be applied to transfers and storage as well.
  • FIGS. 3B and 3C show additional example rules that specify initial presentations defaulting to a user default 326 or to a site default 330 .
  • Site rules and user rules may be reconciled at block 430 of FIG. 4 (as described herein), and any optional CAP processes are executed at block 432 of FIG. 4 (as described herein).
  • the system renders an initial image presentation compliant with display rules and optional CAP action (as described herein).
  • the system presents the user with the rendered images.
  • an exam image set may be identified as a subsequent image set related to (e.g., related to the same patient, medical diagnosis, image plane, modality, and/or any other attribute) an image set from one or more prior exams
  • a rendering device may choose display parameters that best mimic display parameters from the prior exams.
  • the display parameters may not be exact, but the rendering device may take a close approximation to present similar presentation between the related exams.
  • a rendering device may retrieve the viewing rules (e.g., slice thickness, plane, or other display parameters), and/or stored state (e.g., brightness or the contrast) of the prior exam's viewing session and apply the rules and the state to the current exam image set.
  • a user may conveniently switch between similarly presented images from different exams in order to identify any growing abnormalities.
  • the display parameters may be user-specific, such that only exams previously displayed to the user may be used as a basis for determining display parameters for a new exam displayed to the user.
  • the display rules may further indicate particular image rendering methods, slice thicknesses, slice increments, and/or the like, based on any image attribute.
  • transfer and display rules may take into account one or more CAP (e.g., computer aided diagnoses (CAD)) related to medical images.
  • a CAP action on a mammogram may automatically reveal information on breast density, calcification formation, and/or existence of implants.
  • one or more rules may indicate that a dense breast is to be rendered with thinner slices (or slices of a particular thickness) while images with detected calcification might be rendered with thicker slices (or slices with a particular thickness). Images of patients with implants might be rendered differently still.
  • FIG. 4 includes blocks related to CAP actions.
  • the system may optionally execute one or more CAP actions, and may optionally render images based on display and/or transfer rules related to results of the CAP.
  • transfer and/or display rules may indicate CAP-related criteria for display and/or transfer of images, as described herein.
  • FIG. 7 is a flowchart illustrating an embodiment of the present disclosure related to CAP actions and rendering of images more generally (as may be applicable to various transfer and display rules described above).
  • Blocks 432 , 434 , 440 , 454 , and 460 may request FIG. 7 CAP actions, and/or may operate based on results of CAP actions.
  • An example of the flow of FIG. 7 may be understood by reference to example CAP rule/action 810 of FIG. 8 .
  • the system retrieves the rule (e.g., rule 810 ) and determines that this rule is to only apply to medical data having modality of MRI and brain exams.
  • the retrieved rule specifies a condition that the exam data (or DICOM header) indicate an existence of a “dementia.”
  • the rule requests the system to run an MRI Brain Volumetric Analysis at block 704 .
  • the system may, in the process, isolate the brain from other structures such as the cranium, and color code the brain to highlight the result.
  • the system may then evaluate the CAP result and determine the brain to be significantly smaller than a healthy brain.
  • CAP actions may involve more than simple morphologically based CAD. Rather, CAP action may include, e.g., artificial intelligence and Bayesian analytics as a determining factor in the rendering decision. Such CAP action may be particularly useful in assessing lesion morphology, patient risk factors, and other clinical factors (e.g., in determining the relative suspicion of a breast cancer).
  • the system may include a rule that indicates that a lesion is to be rendered in a certain slab thickness, color, window/level, opacity, etc., based on a best depiction of the lesion characteristics that are most associated, e.g., with neoplasm risk (or some other clinical indication determined by a CAP action).
  • a lesion that is suspicious (as determined by one or more CAP actions as mentioned herein) may be rendered as a slab of a certain obliquity and thickness to best show, e.g., a branching calcification because the system determined (e.g., by a CAP action) that such a calcification is a likely sign of breast cancer.
  • a user or other system may include a rule that indicates, e.g., “display the images so that the features that most raise suspicion are optimally displayed” or “only adjust the rendering when a lesion is found that has a >2% probability of being neoplastic” or “pre-display a collection of such rendered images that optimally show the most suspicious lesions in an exam in an order based on rank of suspicion.”
  • blocks 702 , 704 , and 706 may execute any number of combinations of CAP actions (the description immediately above providing just one example).
  • the system may determine an image rendering location at block 708 .
  • the rendering of the images may occur on the client side, the server side, or any capable computing device shown in FIGS. 1A and 1B .
  • images are rendered on a server.
  • a higher resolution viewing device may require rendering at a higher resolution, so complete rendering on the server followed by a transfer of the large rendered images may be slower than rendering locally.
  • if a file is small, it may be advantageous to transfer the small file for rendering locally.
  • other image set properties such as number of images to be transferred, bit depth of the images, original file size, and/or rendered file size may be used as criteria in determining whether to transfer an image set or not.
  • any CAPs that consume substantial computing power or need extensive sets of prior exam data may provide a different rationale in the transfer decision-making process. For these reasons, a user may prefer to specify one or more rules to supplement the smart switching mechanism.
  • rules 322 , 324 , and 328 are examples of user specified location rules.
  • a user has defined that if the user is accessing medical images from home, images are to be rendered on the server.
  • the user has configured the server to store the re-rendered images.
  • Rule 328 is an example of how a user may specify render location based on different access location types. When a user is accessing the medical images at hospital, rule 328 will apply instead of rule 324 .
  • the rendering device will determine relevant rendering parameters from rules, data attributes, and/or environment variables (such as display resolution) at block 710 . When ready, the system will render images at block 712 .
  • per the tissue density example rule 614 , images are to be reformatted/re-rendered to a thickness of 1 mm if the results of a CAP action indicate that tissue density is greater than 5.
  • the system may automatically perform a CAP process to determine the tissue density by following the flowchart of FIG. 7 .
  • Blocks 702 , 704 , and 706 calculate tissue density and blocks 708 , 710 and 712 may even color-gradient the 3D image based on tissue density.
  • the system will reformat the images to 1 mm at block 454 .
  • example rule 614 indicates that the system is to reformat images to 1 mm slices. Such rules may be helpful in detecting abnormalities including breast cancer. For example, women with high breast density may be more likely to get breast cancer than women with low breast density.
  • CAP results (e.g., calculation of breast density) may be evaluated against a diagnostic rule (e.g., where tissue density is greater than 5).
  • the system may automatically vary image slice thickness (as rendered by the system) based on breast density (or other tissue density). For example, images of dense (or greater than average denseness) tissue (as detected by an automatic CAP action) may be rendered with thinner slice thickness (e.g., a default or base thickness), while images of less dense (or average or lower denseness) tissue (as detected by an automatic CAP action) may be rendered with thicker slices (e.g., the slice thickness may vary between a default value and increase to 10 mm). The opposite may also be true, in another implementation.
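A minimal sketch of this CAP-conditioned rendering decision, following rule 614, is below. The density threshold of 5 and the 1 mm thin-slice output come from the rule as described; the 3 mm default thickness is an assumption for illustration (the text notes thicker renderings may range up toward 10 mm).

```python
def thickness_from_density(density, dense_threshold=5, thin_mm=1.0, thick_mm=3.0):
    """Denser tissue -> thinner rendered slices for finer inspection;
    less dense tissue -> thicker slabs. Mirrors rule 614's condition:
    reformat to 1 mm when CAP-computed tissue density exceeds 5."""
    return thin_mm if density > dense_threshold else thick_mm
```

In an implementation swapping the preference (thicker slices for denser tissue, as the text notes is also possible), only the two return values would trade places.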
  • rule 614 only looked at one CAP result—calculated tissue density—and compared it to a scalar value of 5.
  • a user may specify more sophisticated rules that employ multiple CAPs.
  • rule 616 has a CAP searching for an abnormality.
  • the CAP in 616 may be any of the CAPs in FIG. 8A , including spine fracture detection, lung nodule detection, lung parenchyma analysis, etc.
  • a user may provide a list of CAPs to run in search for multiple types of abnormalities.
  • if the system finds an abnormality such as a calcified breast, it may then fuse together relevant slices of the image set and return a subset containing only the fused slices. The fused slices might more clearly show the suspected abnormality.
  • a user may choose to perform any further and optional image processing.
  • a user will simply select one or more predefined rules and activate the selected rules.
  • a user may create a new rule and activate the rule and/or save the rule in a storage device connected to the network.
  • a user may interactively re-orient, zoom, color, highlight, annotate, comment, or perform any semantically and/or visually enhancing operations. Some of these optional image processing or enhancing operations may utilize CAP.
  • All of the blocks where CAP can be called may trigger an alert and/or notification for the user.
  • the alert and/or notification is automatically transmitted to a device operated by the user associated with a corresponding trigger.
  • the alert and/or notification can be transmitted at the time that the alert and/or notification is generated or at some determined time after generation of the alert and/or notification.
  • the alert and/or notification can cause the device to display the alert and/or notification via the activation of an application on the device (e.g., a browser, a mobile application, etc.).
  • receipt of the alert and/or notification may automatically activate an application on the device, such as a messaging application (e.g., SMS or MMS messaging application), a standalone application (e.g., a health data monitoring application or collection management application used by a collection agency), or a browser, for example, and display information included in the alert and/or notification.
  • the application may be automatically activated when the device is online such that the alert and/or notification is displayed.
  • receipt of the alert and/or notification may cause a browser to open and be redirected to a login page generated by the system so that the user can log in to the system and view the alert and/or notification.
  • the alert and/or notification may include a URL of a webpage (or other online information) associated with the alert and/or notification, such that when the device (e.g., a mobile device) receives the alert, a browser (or other application) is automatically activated and the URL included in the alert and/or notification is accessed via the Internet.
  • FIG. 8A is a table illustrating additional example rules related to CAP actions that may be performed by the system.
  • various of the rules shown in FIG. 8A may be used to automatically determine one or more CAP to perform on an image, image series, and/or imaging exam.
  • the table (which may be any other data structure in other embodiments) indicates associations between particular modalities (column 802 ), exam types (column 804 ), and CAP (column 806 ) that may be valuable to examination of the exam images.
  • the table further includes a rules column 808 that includes rules for execution of the CAP indicated in column 806 .
  • the rules may indicate that certain CAP are performed automatically (for example, without any input from the doctor), automatically if certain conditions are met (for example, insurance coverage applies, the exam has certain characteristics, a previous CAP has certain results, and the like), or after confirmation from a radiologist, for example.
  • words in quotes indicate clinical indication or history, such as “trauma.”
  • the rules may include other criteria for executing one or more CAP, for example based on one or more of:
  • Which CAP systems are available
  • Exam characteristics, for example, MRI of spine vs. CT of brain
  • Clinical information, for example, brain MRI where the clinical question is dementia (one type of processing) vs. trauma (another type of processing)
  • User preference
  • Site preference
  • Insurance approval
  • Billable status
  • Referring doc's order
  • Presence of a comparison exam
  • Whether or not a certain type of CAP was already performed on the exam and/or on a prior exam; for example, a rule may indicate that a particular CAP should be run if another specific CAP had a certain result (for example, another CAP had a result that was abnormal, normal, demonstrated a particular finding, demonstrated a measurement in a particular range, and/or the like)
  • Status of another CAP
  • a rule may indicate that two CAP should be performed, but that a second CAP should not be performed until the first CAP is complete.
  • “Brain aneurysm detection CAD” may require that a “3D Vessel tracking” CAP be run first, as “Brain aneurysm detection CAD” may process the results of “3D Vessel tracking” CAP.
  • the last example rule listed in the example CAP Rules table of FIG. 8B illustrates another example in which three CAP are automatically run in a particular sequence in the event that two conditions are met.
  • certain results of a CAP may automatically trigger the scheduling of another CAP (for example, based on the rules in column 808 ).
  • the modality and exam in rule 810 is associated with Brain MRI exams (as indicated in columns 802 and 804 ), and the indicated CAP of “MRI brain volumetric analysis” is associated with a rule (column 808 ) indicating that the CAP is automatically performed when the clinical indication is “dementia.”
  • scheduling of a particular CAP may automatically cause one or more other CAP to be scheduled before or after that particular CAP.
  • exam rule 812 indicates that scheduling of “Brain aneurysm detection CAD” should result in the automatic scheduling of “3D Vessel tracking” CAP, and that “3D Vessel tracking” CAP should be run before “Brain aneurysm detection CAD”, for example because “Brain aneurysm detection CAD” involves processing the results of “3D Vessel tracking” CAP.
  • the modality and exam in rule 811 is associated with Brain MRI exams (as indicated in columns 802 and 804 ), and the indicated CAP of “MRI brain CSF analysis” is associated with a rule (column 808 ) indicating that the CAP is automatically performed when the clinical indication is “hydrocephalus” or if an abnormal brain volumetric analysis resulted from another CAP (e.g., a result of running rule 810 for “Dementia”).
  • the first CAP in rule 810 (“MRI Brain volumetric analysis”) may first be automatically performed on a brain MRI, such as in response to an indication of “dementia” in the MRI order from the referring doctor. Once the MRI brain volumetric analysis has been performed, the rules of FIG. 8A may again be applied to determine if one or more additional CAP should be performed. In this example, if the result of the MRI brain volumetric analysis is “abnormal” (or equivalent nomenclature), another CAP listed in rule 811 (MRI brain CSF analysis) is triggered for automated execution.
  • the rules may be configured to initiate execution of multiple CAP in response to results of previously performed CAP.
  • a rules data structure may be used to determine which CAP are compatible and/or available for a particular one or more image series, such as based on various characteristics associated with the one or more image series.
  • a rules data structure comprising modality, exam, and CAD/processing, such as columns 802 , 804 , and 806 in the example of FIG. 8A , may be used to determine which of the various CAD/processing are compatible with medical images in particular exam modalities and exams. In one embodiment, this information may be presented to users. In the example of rows 810 and 811 , “MRI brain volume analysis” and “MRI brain CSF analysis” are listed as compatible and/or available for MRI exams of the brain.
  • CAD is a type of CAP that returns, in general, a relevant Boolean or probabilistic result to an inquiry regarding an existence of a disease and/or abnormality.
  • the example CAPs 810 , 811 , and 812 are a few examples of CADs.
  • Some CAD rules may, in addition to the relevant Boolean or probabilistic result, also provide an image set (or any other relevant data) for a user to verify and/or further investigate the CAD result.
  • the example CAP 812 takes in the “3D Vessel Tracking” results, which may contain 3D vessel images of the brain that can be used as input to “Brain Aneurysm Detection CAD.”
  • FIG. 8B is a table illustrating additional example rules related to CAP actions that may be performed by the system.
  • the rules of FIG. 8B may be used by the system to automatically determine one or more CAP to perform on an image or image series.
  • rules related to CAP may be evaluated automatically, for example when:
  • An exam is completed on a scanner.
  • An exam is communicated, for example, from a scanner to a PACS System or from a PACS System to a PACS Workstation.
  • a CAP is performed, for example, such that the result of the CAP may automatically trigger performance of another CAP.
  • evaluation of rules related to CAP may be performed on one or more computing devices, such as scanners, PACS Systems, PACS Workstations, and the like. Based on the evaluation of rules related to CAP, one or more CAP may be automatically executed.
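The rule-driven CAP scheduling described above can be illustrated with a minimal sketch. All names, rule fields, and trigger conditions below are hypothetical stand-ins for the columns of the example tables (modality, exam, CAP, rule), not the claimed implementation:

```python
# Illustrative sketch only -- rule fields and CAP names are hypothetical,
# modeled loosely on the example rules 810, 811, and 812 described above.
CAP_RULES = [
    {
        "modality": "MRI",
        "exam": "Brain",
        "cap": "MRI brain volumetric analysis",
        "trigger": lambda ctx: ctx["clinical_indication"] == "dementia",
    },
    {
        "modality": "MRI",
        "exam": "Brain",
        "cap": "MRI brain CSF analysis",
        # Fires on a clinical indication OR on an abnormal result
        # from a previously executed CAP, as in example rule 811.
        "trigger": lambda ctx: (
            ctx["clinical_indication"] == "hydrocephalus"
            or ctx["results"].get("MRI brain volumetric analysis") == "abnormal"
        ),
    },
    {
        "modality": "MRI",
        "exam": "Brain",
        "cap": "Brain aneurysm detection CAD",
        # A prerequisite CAP must complete first, as in example rule 812.
        "requires": ["3D Vessel tracking"],
        "trigger": lambda ctx: ctx["clinical_indication"] == "aneurysm",
    },
]


def evaluate_cap_rules(exam, results):
    """Return the CAP that should be scheduled next for this exam.

    Called whenever an exam arrives or a CAP completes, so the result of
    one CAP can automatically trigger another.
    """
    ctx = {"clinical_indication": exam["clinical_indication"], "results": results}
    scheduled = []
    for rule in CAP_RULES:
        if rule["modality"] != exam["modality"] or rule["exam"] != exam["exam"]:
            continue
        if rule["cap"] in results:  # already performed on this exam
            continue
        if not rule["trigger"](ctx):
            continue
        # Schedule unmet prerequisites ahead of the dependent CAP.
        for prereq in rule.get("requires", []):
            if prereq not in results and prereq not in scheduled:
                scheduled.append(prereq)
        scheduled.append(rule["cap"])
    return scheduled


exam = {"modality": "MRI", "exam": "Brain", "clinical_indication": "dementia"}
# First pass schedules the volumetric analysis; once it completes with an
# "abnormal" result, re-evaluation triggers the CSF analysis.
print(evaluate_cap_rules(exam, {}))
print(evaluate_cap_rules(exam, {"MRI brain volumetric analysis": "abnormal"}))
```

Re-invoking `evaluate_cap_rules` after each CAP completes is one simple way to realize the chained behavior described above, where results of a completed CAP can satisfy the trigger condition of another rule.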
  • Various embodiments of the present disclosure may be a system, a method, and/or a computer program product at any possible technical detail level of integration.
  • the computer program product may include a computer readable storage medium (or mediums) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
  • the functionality described herein may be performed as software instructions are executed by, and/or in response to software instructions being executed by, one or more hardware processors and/or any other suitable computing devices.
  • the software instructions and/or other executable code may be read from a computer readable storage medium (or mediums).
  • the computer readable storage medium can be a tangible device that can retain and store data and/or instructions for use by an instruction execution device.
  • the computer readable storage medium may be, for example, but is not limited to, an electronic storage device (including any volatile and/or non-volatile electronic storage devices), a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a solid state drive, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions (as also referred to herein as, for example, “code,” “instructions,” “module,” “application,” “software application,” and/or the like) for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages.
  • Computer readable program instructions may be callable from other instructions or from themselves, and/or may be invoked in response to detected events or interrupts.
  • Computer readable program instructions configured for execution on computing devices may be provided on a computer readable storage medium, and/or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution) that may then be stored on a computer readable storage medium.
  • Such computer readable program instructions may be stored, partially or fully, on a memory device (e.g., a computer readable storage medium) of the executing computing device, for execution by the computing device.
  • the computer readable program instructions may execute entirely on a user's computer (e.g., the executing computing device), partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart(s) and/or block diagram(s) block or blocks.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer.
  • the remote computer may load the instructions and/or modules into its dynamic memory and send the instructions over a telephone, cable, or optical line using a modem.
  • a modem local to a server computing system may receive the data on the telephone/cable/optical line and use a converter device including the appropriate circuitry to place the data on a bus.
  • the bus may carry the data to a memory, from which a processor may retrieve and execute the instructions.
  • the instructions received by the memory may optionally be stored on a storage device (e.g., a solid state drive) either before or after execution by the computer processor.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the Figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • certain blocks may be omitted in some implementations.
  • the methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate.
  • any of the processes, methods, algorithms, elements, blocks, applications, or other functionality (or portions of functionality) described in the preceding sections may be embodied in, and/or fully or partially automated via, electronic hardware such as application-specific processors (e.g., application-specific integrated circuits (ASICs)), programmable processors (e.g., field programmable gate arrays (FPGAs)), application-specific circuitry, and/or the like (any of which may also combine custom hard-wired logic, logic circuits, ASICs, FPGAs, etc. with custom programming/execution of software instructions to accomplish the techniques).
  • any of the above-mentioned processors, and/or devices incorporating any of the above-mentioned processors may be referred to herein as, for example, “computers,” “computer devices,” “computing devices,” “hardware computing devices,” “hardware processors,” “processing units,” and/or the like.
  • Computing devices of the above-embodiments may generally (but not necessarily) be controlled and/or coordinated by operating system software, such as Mac OS, iOS, Android, Chrome OS, Windows OS (e.g., Windows XP, Windows Vista, Windows 7, Windows 8, Windows 10, Windows Server, etc.), Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other suitable operating systems.
  • the computing devices may be controlled by a proprietary operating system.
  • Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.
  • certain functionality may be accessible by a user through a web-based viewer (such as a web browser), or other suitable software program.
  • the user interface may be generated by a server computing system and transmitted to a web browser of the user (e.g., running on the user's computing system).
  • in some implementations, data (e.g., user interface data) necessary for generating the user interface may be provided by the server computing system to the browser, where the user interface may be generated (e.g., the user interface data may be executed by a browser accessing a web service and may be configured to render the user interfaces based on the user interface data).
  • the user may then interact with the user interface through the web-browser.
  • User interfaces of certain implementations may be accessible through one or more dedicated software applications.
  • one or more of the computing devices and/or systems of the disclosure may include mobile computing devices, and user interfaces may be accessible through such mobile computing devices (for example, smartphones and/or tablets).
  • Conditional language such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
  • a general purpose computer comprising one or more processors should not be interpreted as excluding other computer components, and may include components such as memory, input/output devices, and/or network interfaces, among others.

Abstract

Systems and methods allow transfer and display rules to be defined based on one or more of several attributes, such as a particular user, site, device, and/or image/series characteristic, as well as on whether individual images and/or image series are classified as thin slices, and/or based on other characteristics. The rules are applied to medical images to determine which images and/or image data are analyzed, downloaded, viewed, stored, rendered, processed, and/or subjected to any number of other actions that might be performed with respect to medical image data. The systems and methods may include image analysis, image rendering, image transformation, image enhancement, and/or other aspects to enable efficient and customized review of medical images.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation-in-part of U.S. patent application Ser. No. 15/292,023 filed on Oct. 12, 2016, titled “Selective Display of Medical Images,” which application is a continuation of U.S. patent application Ser. No. 15/163,600 filed on May 24, 2016, titled “Selective Display of Medical Images,” which application is a continuation of U.S. patent application Ser. No. 14/687,853 filed on Apr. 15, 2015, titled “Selective Processing of Medical Images,” which application is a continuation of U.S. patent application Ser. No. 14/179,328 filed on Feb. 12, 2014, titled “Rules-Based Approach to Rendering Medical Images,” which application is a continuation of U.S. patent application Ser. No. 12/891,543 filed on Sep. 27, 2010, titled “Rules-Based Approach to Transferring and/or Viewing Medical Images,” which application claims the benefit of priority from U.S. Provisional Patent Application No. 61/246,479 filed on Sep. 28, 2009, titled “Rules-Based Approach to Transferring and/or Viewing Medical Images.” The entire disclosure of each of the above items is hereby made part of this specification as if set forth fully herein and incorporated by reference for all purposes, for all that it contains.
  • This application is also related to U.S. Patent Application No. ___/______, filed on Mar. 24, 2017, with attorney docket number SVL920155370US7/M009P1, and titled “RULES-BASED RENDERING OF MEDICAL IMAGES,” and U.S. Patent Application No. ___/______, filed on Mar. 24, 2017, with attorney docket number SVL920155370US8/M009P2, and titled “RULES-BASED PROCESSING AND PRESENTATION OF MEDICAL IMAGES.” The entire disclosure of each of the above items is hereby made part of this specification as if set forth fully herein and incorporated by reference for all purposes, for all that it contains.
  • Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57 for all purposes and for all that they contain.
  • All publications and patent applications mentioned in this specification are hereby incorporated by reference in their entirety to the same extent as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference.
  • TECHNICAL FIELD
  • This disclosure relates to medical image analysis, medical image processing, medical image rendering, medical image transformation, medical image transferring, and/or medical image viewing.
  • BACKGROUND
  • Medical imaging is increasingly moving into the digital realm. This includes imaging techniques that were traditionally analog, such as mammography, x-ray imaging, angiography, endoscopy, and pathology, where information can now be acquired directly using digital sensors, or by digitizing information that was acquired in analog form. Many imaging modalities are inherently digital, such as computed radiography (CR), digital radiography (DR), MRI (magnetic resonance imaging), CT (computed tomography), PET (positron emission tomography), NM (nuclear medicine scanning), FFDM (full-field digital mammography), and US (ultrasound), often yielding hundreds or even thousands of images per examination. Increasingly these digital images are viewed, manipulated, and interpreted using computers and related computer equipment. Accordingly, improved systems and methods are needed for distributing, viewing, and prioritizing these digital images under various network and viewing conditions.
  • SUMMARY
  • The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be described briefly.
  • In one embodiment, a method of selecting medical images of an image series for transfer comprises, by a computing device executing software, performing operations comprising, accessing transfer rules associated with a first computing device, the transfer rules indicating thin slice criteria for determining whether respective medical images of an image series are classified as thin slices, wherein the thin slice criteria include at least one of a maximum thickness of a particular medical image for classification of the particular image as a thin slice image or a maximum quantity of medical images in the image series for classification of the medical images of the image series as thin slice images, selecting a first group of medical images of the image series that are thin slice images as indicated by the thin slice criteria, initiating transfer of the first group of medical images to the first computing device, wherein medical images of the image series that are not in the first group are not transferred to the first computing device.
  • In one embodiment, a method comprises, by a computing device comprising hardware, performing operations comprising accessing thin slice criteria for determining whether respective medical images of an image series are classified as thin slices, selecting a thin slice series of medical images that includes one or more medical images that are classified as thin slices based on the thin slice criteria, selecting a non-thin slice series of medical images that includes one or more medical images that are not thin slice images based on the thin slice criteria, and transferring or displaying the thin slice series in a different manner than the non-thin slice series, such that either the images of the thin slice series are selectively not transferred, selectively transferred with lower priority, selectively transferred but not displayed, or selectively transferred but otherwise displayed or processed in a selectively different manner.
  • In one embodiment, a tangible computer readable medium has instructions stored thereon, wherein the instructions are configured for reading by a computing device in order to cause the computing device to perform operations comprising accessing transfer rules associated with a first computing device and a first user, the transfer rules indicating thin slice criteria for determining whether respective medical images of an image series are classified as thin slice images, wherein the thin slice criteria include at least one of a maximum thickness of a particular medical image for classification of the particular image as a thin slice image or a maximum quantity of medical images in the image series for classification of the medical images of the image series as thin slice images, selecting a first group of medical images of an image series that are not thin slices as indicated by the thin slice criteria, and initiating transfer of the first group of medical images to the first viewing environment, wherein medical images of the image series that are not in the first group are not transferred to the first viewing environment.
  • In one embodiment, a method comprises by a computing device configured for displaying medical images, determining a storage location of a dataset, determining an availability of a processing server to process the dataset, in response to determining that the dataset is locally available to a processing server configured to process the dataset, requesting processing of the dataset by the processing server and accessing the processed dataset from the processing server, and in response to determining that the dataset is not locally available to the processing server and/or the processing server is not available to process the dataset, requesting transmission of at least some of the dataset to the computing device and processing at least some of the dataset at the computing device.
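The thin-slice transfer selection described in the embodiments above can be sketched as follows. This is an illustrative sketch under stated assumptions, not the claimed implementation: the attribute names, the example thresholds, and the exact classification logic (thickness at or below a maximum, or series size above a maximum image count) are hypothetical:

```python
# Illustrative sketch only: field names, thresholds, and the precise
# thin-slice classification logic are assumptions for demonstration.

def is_thin_slice(image, series_size, max_thickness_mm, max_image_count):
    """Classify a single image as a thin slice.

    Here an image counts as thin if its slice thickness is at or below
    the configured maximum, or if the series holds more images than the
    configured maximum (very large series typically indicate thin slices).
    """
    return (image["thickness_mm"] <= max_thickness_mm
            or series_size > max_image_count)


def select_for_transfer(series, rules, transfer_thin=False):
    """Apply per-device transfer rules and return the group of images
    to send; images outside the selected group are not transferred."""
    size = len(series)
    thin = [img for img in series
            if is_thin_slice(img, size, rules["max_thickness_mm"],
                             rules["max_image_count"])]
    non_thin = [img for img in series if img not in thin]
    return thin if transfer_thin else non_thin


rules = {"max_thickness_mm": 1.5, "max_image_count": 200}
series = [{"id": i, "thickness_mm": t}
          for i, t in enumerate([0.625, 0.625, 5.0, 5.0])]
# Transfer only the non-thin (e.g., 5 mm) images to a bandwidth-limited
# workstation; the thin 0.625 mm images stay at the server.
print([img["id"] for img in select_for_transfer(series, rules)])  # [2, 3]
```

The same split could equally drive the other behaviors described above, such as transferring thin slices at lower priority or transferring but not displaying them.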
  • According to an embodiment, a computer-implemented method of rendering medical images is disclosed comprising: by one or more processors executing program instructions: accessing a set of medical image data; analyzing the set of medical image data to determine a value of an attribute of the set of medical image data; accessing a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and applying the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • According to an aspect, the method further comprises: by the one or more processors executing program instructions: automatically initiating display of the series of medical images at a computing device.
  • According to another aspect, the set of medical image data comprises an original series of medical images, and wherein the medical images comprise two-dimensional medical images.
  • According to yet another aspect, applying the rule comprises: by the one or more processors executing program instructions: accessing a three-dimensional set of medical image data from which the original series was derived; and rendering the series of medical images from the three-dimensional set of medical image data.
  • According to another aspect, applying the rule further comprises: by the one or more processors executing program instructions: selecting a group of medical images of the original series of medical images, wherein each medical image of the group of medical images has an image slice thickness greater or less than the desired image slice thickness; and reformatting each of the medical images of the group of medical images to have the desired image slice thickness.
  • According to yet another aspect, applying the rule further comprises: by the one or more processors executing program instructions: determining that one or more medical images of the original series of medical images has an image slice thickness greater or less than the desired image slice thickness; reformatting the original series of medical images to include a plurality of reformatted medical images having an image slice thicknesses matching the desired image slice thickness, wherein the original series of medical images is reformatted from a volumetric data set associated with the original series of medical images; and initiating display of the original series of medical images including the plurality of reformatted medical images at a computing device.
  • According to another aspect, the set of medical image data comprises three-dimensional medical image data.
  • According to yet another aspect, the rule further indicates an association between the value of the attribute and a desired image slice increment, and wherein applying the rule comprises: by the one or more processors executing program instructions: rendering the series of medical images from the three-dimensional set of medical image data, wherein the medical images of the series match both the desired slice thickness and the desired slice increment.
  • According to another aspect, applying the rule comprises: by the one or more processors executing program instructions: generating each successive image slice of the series from the three-dimensional set of medical image data, where each successive image slice has the desired image thickness and is the desired image slice increment apart.
  • According to yet another aspect, the method further comprises: by the one or more processors executing program instructions: automatically initiating display of the series of medical images at a computing device.
  • According to another aspect, the value of the attribute is derived from a DICOM header file.
  • According to yet another aspect, the attribute comprises at least one of: an image series view, a patient's age, a medical history, a risk factor, a tissue characteristic, or a modality.
  • According to another aspect, the method further comprises: by the one or more processors executing program instructions: receiving a request from a user to view the series of medical images; and accessing the user-defined rule associated with the user.
  • According to yet another aspect, the method further comprises: by the one or more processors executing program instructions: displaying the series of medical images on a computing device; receiving a reformat request indicating a new desired image slice thickness different from the desired image slice thickness indicated by the rule; and in response to receiving the reformat request: re-rendering the set of medical image data to generate a new series of medical images, wherein medical images of the new series match the new desired slice thickness.
  • According to another embodiment, a system is disclosed comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the one or more processors to: access a set of medical image data; analyze the set of medical image data to determine a value of an attribute of the set of medical image data; access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • According to an aspect, the set of medical image data comprises three-dimensional medical image data, and wherein the one or more processors are configured to execute the program instructions to further cause the one or more processors to: render the series of medical images from the three-dimensional set of medical image data, wherein the medical images of the series match both the desired slice thickness and the desired slice increment.
  • According to another aspect, the one or more processors are configured to execute the program instructions to further cause the one or more processors to: receive a request from a user to view the series of medical images; access the user-defined rule associated with the user; display the series of medical images on a computing device; receive a reformat request indicating a new desired image slice thickness different from the desired image slice thickness indicated by the rule; and in response to receiving the reformat request: re-render the set of medical image data to generate a new series of medical images, wherein medical images of the new series match the new desired slice thickness.
  • According to yet another embodiment, a computer program product is disclosed comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to: access a set of medical image data; analyze the set of medical image data to determine a value of an attribute of the set of medical image data; access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • According to an aspect, the set of medical image data comprises three-dimensional medical image data, and the program instructions are executable by one or more processors to further cause the one or more processors to: render the series of medical images from the three-dimensional set of medical image data, wherein the medical images of the series match both the desired slice thickness and a desired slice increment.
  • According to another aspect, the program instructions are executable by one or more processors to further cause the one or more processors to: receive a request from a user to view the series of medical images; access the user-defined rule associated with the user; display the series of medical images on a computing device; receive a reformat request indicating a new desired image slice thickness different from the desired image slice thickness indicated by the rule; and in response to receiving the reformat request: re-render the set of medical image data to generate a new series of medical images, wherein medical images of the new series match the new desired slice thickness.
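The rule-driven rendering described in the embodiments above can be illustrated with a short sketch. All names here (`Rule`, `pick_slice_thickness`, the sample attribute values and thicknesses) are hypothetical illustrations, not part of the disclosed system:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    """Hypothetical user-defined rule: when a data set's attribute has the
    given value, render with the associated slice thickness (in mm)."""
    attribute: str
    value: str
    slice_thickness_mm: float

def pick_slice_thickness(rules, attributes, default_mm=3.0):
    """Return the desired slice thickness for a data set whose determined
    attribute values are given as a dict, falling back to a default."""
    for rule in rules:
        if attributes.get(rule.attribute) == rule.value:
            return rule.slice_thickness_mm
    return default_mm

# Example: a user prefers thinner slices for brain CT than for chest CT.
rules = [
    Rule("anatomical_region", "brain", 1.25),
    Rule("anatomical_region", "chest", 5.0),
]
print(pick_slice_thickness(rules, {"anatomical_region": "brain"}))  # 1.25
```

A reformat request, as recited above, would simply re-invoke the same rendering path with a new thickness in place of the rule-derived one.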
  • According to another embodiment, a system configured to analyze and display medical images is disclosed comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the one or more processors to: access a series of medical images; analyze the series of medical images to determine a value of an attribute of the series of medical images, wherein the attribute comprises an image plane; access a user-defined rule indicating an association between the value of the attribute and a desired display preference; and apply the rule to the series of medical images to cause display of the series of medical images according to the desired display preference.
  • According to an aspect, the desired display preference includes at least one of: an image rendering process, an image slice thickness, an image slice increment, a field of view, a window, a level, or a color setting.
  • According to another aspect, the value of the attribute is derived from a DICOM header file.
  • According to yet another aspect, the attribute further comprises at least one of: an image series view, a patient's age, a medical history, a risk factor, a tissue characteristic, or a modality.
  • According to another aspect, applying the rule to the series of medical images comprises: rendering the series of medical images according to a user-defined rule associated with the value of the attribute.
  • According to yet another aspect, the rule indicates a desired slice thickness.
  • According to another aspect, the one or more processors are configured to execute the program instructions to further cause the one or more processors to: identify a previously displayed series of medical images having the attribute with a second value that matches the value associated with the series of medical images; determine one or more display parameters associated with display of the previously displayed series of medical images; and designate the one or more display parameters as the desired display preference.
  • According to yet another aspect, identifying a previously displayed series of medical images comprises: by the one or more processors executing program instructions: determining that the previously displayed series of medical images was previously displayed to the same user to whom the series of medical images are to be displayed.
  • According to yet another embodiment, a computer-implemented method of analyzing and displaying medical images is disclosed comprising: by one or more processors executing program instructions: accessing a series of medical images; analyzing the series of medical images to determine a value of an attribute of the series of medical images, wherein the attribute comprises an image plane; accessing a user-defined rule indicating an association between the value of the attribute and a desired display preference; and applying the rule to the series of medical images to cause display of the series of medical images according to the desired display preference.
  • According to an aspect, the desired display preference includes at least one of: an image rendering process, an image slice thickness, an image slice increment, a field of view, a window, a level, or a color setting.
  • According to another aspect, the value of the attribute is derived from a DICOM header file.
  • According to yet another aspect, the attribute further comprises at least one of: an image series view, a patient's age, a medical history, a risk factor, a tissue characteristic, or a modality.
  • According to another aspect, applying the rule to the series of medical images comprises: by the one or more processors executing program instructions: rendering the series of medical images according to a user-defined rule associated with the value of the attribute.
  • According to yet another aspect, the rule indicates a desired slice thickness.
  • According to another embodiment, a computer-implemented method of analyzing and displaying medical images is disclosed comprising: by one or more processors executing program instructions: accessing a series of medical images; analyzing the series of medical images to determine a first value of an attribute of the series of medical images; identifying a previously displayed series of medical images having the attribute with a second value that matches the first value; determining one or more display parameters associated with display of the previously displayed series of medical images; and applying the one or more display parameters to the series of medical images to cause display of the series of medical images according to the one or more display parameters.
  • According to an aspect, identifying a previously displayed series of medical images comprises: by the one or more processors executing program instructions: determining that the previously displayed series of medical images was previously displayed to the same user to whom the series of medical images are to be displayed.
  • According to another aspect, the series of medical images are automatically initially displayed to the user according to the one or more display parameters.
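The display-parameter reuse recited in the embodiment above can be sketched as a simple history lookup. The data shapes (`history` entries, `attr_value`, `display_params`) are hypothetical stand-ins for whatever the system records about prior displays:

```python
def reuse_display_params(new_series, history, user):
    """Hypothetical lookup: find a previously displayed series, shown to the
    same user, whose attribute value matches the new series, and reuse its
    display parameters (e.g., window/level, slice thickness)."""
    for entry in reversed(history):  # prefer the most recent matching display
        if entry["user"] == user and entry["attr_value"] == new_series["attr_value"]:
            return entry["display_params"]
    return None  # no match: fall back to rules or defaults

history = [
    {"user": "dr_a", "attr_value": "axial", "display_params": {"window": 400}},
    {"user": "dr_a", "attr_value": "axial", "display_params": {"window": 80}},
]
print(reuse_display_params({"attr_value": "axial"}, history, "dr_a"))  # {'window': 80}
```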
  • According to yet another embodiment, a computer-implemented method of automated analysis of medical images is disclosed comprising: by one or more processors executing program instructions: accessing a set of medical image data; automatically analyzing the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data; accessing a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and applying the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • According to an aspect, the method further comprises: by one or more processors executing program instructions: accessing a plurality of user-defined CAP rules; identifying a CAP rule associated with the set of medical image data; and determining the CAP action indicated by the rule.
  • According to another aspect, the CAP rule is associated with the set of medical image data based on at least one of: a modality, an anatomical region, or a medical indicator.
  • According to yet another aspect, the attribute comprises at least one of: a tissue density, or a presence of an abnormality or a suspected abnormality.
  • According to another aspect, the method further comprises: by one or more processors executing program instructions: identifying a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and fusing a group of medical images from the series of medical images in a region associated with the possible abnormality.
  • According to yet another aspect, the method further comprises: by the one or more processors executing program instructions: automatically initiating display of the group of medical images of the series of medical images at a computing device.
  • According to another aspect, the method further comprises: by one or more processors executing program instructions: determining a rendering location for rendering the series of medical images based on a second user-defined rule.
  • According to yet another aspect, the second user-defined rule indicates at least one of: a user associated with the user-defined rule, a location at which the series of medical images are to be displayed, or a characteristic of a device upon which the series of medical images are to be displayed.
  • According to another embodiment, a system is disclosed comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the one or more processors to: access a set of medical image data; automatically analyze the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data; access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • According to an aspect, the one or more processors are configured to execute the program instructions to further cause the one or more processors to: access a plurality of user-defined CAP rules; identify a CAP rule associated with the set of medical image data; and determine the CAP action indicated by the rule.
  • According to another aspect, the CAP rule is associated with the set of medical image data based on at least one of: a modality, an anatomical region, or a medical indicator.
  • According to yet another aspect, the attribute comprises at least one of: a tissue density, or a presence of an abnormality or a suspected abnormality.
  • According to another aspect, the one or more processors are configured to execute the program instructions to further cause the one or more processors to: identify a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and fuse a group of medical images from the series of medical images in a region associated with the possible abnormality.
  • According to yet another aspect, the one or more processors are configured to execute the program instructions to further cause the one or more processors to: automatically initiate display of the group of medical images of the series of medical images at a computing device.
  • According to another aspect, the one or more processors are configured to execute the program instructions to further cause the one or more processors to: determine a rendering location for rendering the series of medical images based on a second user-defined rule.
  • According to yet another aspect, the second user-defined rule indicates at least one of: a user associated with the user-defined rule, a location at which the series of medical images are to be displayed, or a characteristic of a device upon which the series of medical images are to be displayed.
  • According to yet another embodiment, a computer program product is disclosed comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to: access a set of medical image data; automatically analyze the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data; access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
  • According to an aspect, the program instructions are executable by one or more processors to further cause the one or more processors to: access a plurality of user-defined CAP rules; identify a CAP rule associated with the set of medical image data; and determine the CAP action indicated by the rule.
  • According to another aspect, the program instructions are executable by one or more processors to further cause the one or more processors to: identify a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and fuse a group of medical images from the series of medical images in a region associated with the possible abnormality.
  • According to yet another aspect, the program instructions are executable by one or more processors to further cause the one or more processors to: automatically initiate display of the group of medical images of the series of medical images at a computing device.
  • Additional embodiments of the disclosure are described below in reference to the appended claims, which may serve as an additional summary of the disclosure.
  • In various embodiments, systems and/or computer systems are disclosed that comprise a computer readable storage medium having program instructions embodied therewith, and one or more processors configured to execute the program instructions to cause the one or more processors to perform operations comprising one or more aspects of the above- and/or below-described embodiments (including one or more aspects of the appended claims).
  • In various embodiments, computer-implemented methods are disclosed in which, by one or more processors executing program instructions, one or more aspects of the above- and/or below-described embodiments (including one or more aspects of the appended claims) are implemented and/or performed.
  • In various embodiments, computer program products comprising a computer readable storage medium are disclosed, wherein the computer readable storage medium has program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to perform operations comprising one or more aspects of the above- and/or below-described embodiments (including one or more aspects of the appended claims).
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The following drawings and the associated descriptions are provided to illustrate embodiments of the present disclosure and do not limit the scope of the claims. Aspects and many of the attendant advantages of this disclosure will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
  • FIG. 1A is a block diagram of an example medical imaging system including two PACS workstations in communication with various devices;
  • FIG. 1B is a block diagram of an example medical imaging system wherein PACS workstations are in communication with a PACS server via a separate network which may comprise a secured local area network, for example;
  • FIG. 2 illustrates a table of transfer and display rules, including example categories of properties/attributes on which transfer and display rules may be based for a user, site, or other group of viewing devices;
  • FIGS. 3A, 3B, and 3C illustrate example data structures having transfer and display rules of various types;
  • FIG. 4 is a flowchart illustrating an embodiment of a method of the present disclosure;
  • FIG. 5 is a block diagram of three sites that are configured to perform one or more of the operations discussed herein;
  • FIG. 6 illustrates additional example display rules of the present disclosure;
  • FIG. 7 is a flowchart illustrating an embodiment of another method of the present disclosure;
  • FIGS. 8A-8B show tables illustrating additional examples of rules of the present disclosure; and
  • FIG. 9 illustrates various example attributes that may be associated with exams, image series, and images, according to embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • Embodiments of the disclosure will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the disclosure. Furthermore, embodiments of the disclosure may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the embodiments of the disclosure herein described.
  • Terms
  • In order to facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed broadly to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.
  • “User” (also referred to herein as “reviewer” and/or “viewer”), as used herein, includes an individual (or group of individuals) that interfaces with a computing device to, for example, access or view medical images. Users may include, for example, physicians (including, for example, doctors, radiologists, etc.), hospital staff, and/or any other individuals (including persons not medically trained) involved in analysis, annotation, comparison, acquisition, storage, management, or other tasks related to medical images (or any other types of images) as described herein. Any discussion herein of user preferences and/or rules (e.g., display rules and transfer rules) associated with users should be construed to also, or alternatively, include user group preferences (or rules associated with groups of users), site preferences/rules, system preferences/rules, and/or default software preferences/rules.
  • “Medical data,” as used herein, is defined to include any data related to medical information, images, exams, image series, and/or other patient information. As non-limiting examples, medical data may include, but is not limited to, medical images, such as radiographs (e.g., x-ray images, CR, DR, etc.), computed tomography (CT), magnetic resonance imaging (MRI), ultrasound (US), mammograms, positron emission tomography (PET) scans, nuclear scans (NM), full-field digital mammography (FFDM) (and other types of digital mammography), tomosynthesis (e.g., breast tomosynthesis), and images related to gross and microscopic pathology, ophthalmology, endoscopy, or other medical images, as well as medical reports, such as text files containing reports, voice files with results summaries, and/or full digital dictation voice files for transcription. While this description is directed substantially to transmitting and viewing of medical images, the methods and systems described herein may also be used in conjunction with non-medical images, such as images of circuit boards, airplane wings, and satellite images, for example. Medical images may be reconstructed and/or rendered from 3D or volumetric image data using methods including multiplanar reformation/reconstruction (MPR), maximum intensity projection (MIP), and/or the like (including, e.g., any Computerized Advanced Processing (CAP), as described below). FIG. 9 illustrates an example of a medical image 912 and possible attributes that may be associated with a medical image. The term “medical image data” may be used herein to refer to medical data primarily associated with imaged or imaging data.
  • “Image series” (also referred to herein as a “series”), as used herein, includes any two or more images that are related. Images in a series typically share one or more common attributes, for example, a type of anatomic plane and/or an image orientation. For example, an image series may comprise two or more images of a particular patient that are acquired on a particular date, e.g., different x-ray projections of the chest. A series of contiguous 3 mm axial CT scans of the chest is another example of an image series. A brain MRI scan might include the following series: sagittal T1 weighted images, axial T1 weighted images, axial FLAIR images, axial T2 weighted images, as well as post contrast axial, sagittal and coronal T1 weighted series. An image series of an exam may be identified by its “type” (also referred to herein as a “series type” and/or a “view type”). For example, series may be acquired using different pulse sequences, acquired in different anatomic planes (also referred to herein as “imaging planes,” e.g., axial, coronal, sagittal, etc.), and/or acquired before or after administration of intravenous contrast material. An image series may be limited to images of a certain modality or may comprise images of multiple modalities. FIG. 9 illustrates an example of an image series 908, as well as example attributes that may be associated with an image series. As shown, the image series 908 includes multiple medical images, such as medical image 912.
  • “Medical imaging exam” (also referred to herein as a “medical exam” and/or an “exam”), as used herein, includes a collection of medical data related to an examination of a patient. An exam may be specific to a particular time or time period. Generally, an exam includes one or more medical images and/or image series, montages, reports, notes, graphs, measurements, annotations, videos, sounds or voice data, diagnoses, and/or other related information. An exam may include multiple image series of multiple modalities, volumetric imaging data, reconstructed images and/or rendered images. For example, an exam of a patient may be the brain MRI scan mentioned above, and may include each of the image series obtained on a particular date including: sagittal T1 weighted images, axial T1 weighted images, axial FLAIR images, axial T2 weighted images, as well as post contrast axial, sagittal and coronal T1 weighted series. Another example of an exam may be a dual-energy radiography exam, which may include image data including traditional x-ray images, bone subtracted (or “bone out”) x-ray images, and/or tissue subtracted (or “tissue out”) x-ray images. FIG. 9 illustrates two example medical exams 902 and 904. As shown, each medical exam 902 and 904 includes multiple image series, such as image series 908 which is a part of medical exam 904.
  • “Attributes” (also referred to herein as “properties” and “characteristics”), as used herein, include any characteristics associated with a data item (e.g., a data item such as a medical exam, an image series, a medical image, and/or the like). Attributes may be inherited in a hierarchical manner. For example, a medical image may inherit attributes of an image series of which it is a part, and an image series may inherit attributes of a medical exam of which it is a part. Attributes may be stored as part of an associated data item (e.g., as metadata, Digital Imaging and Communications in Medicine (“DICOM”) header data, etc.) and/or separately from an associated data item. FIG. 9 illustrates various example attributes that may be associated with exams (e.g., example attributes 906), image series (e.g., example attributes 910), and images (e.g., example attributes 914). Examples of image attributes include, without limitation, image angle (e.g., an angle of an image with reference to a standard one or more planes of human anatomy; also referred to herein as “scan plane” or “imaging plane”), anatomical position (and/or location) (e.g., a location, with reference to a standard one or more planes of human anatomy, of the patient represented in a particular image), image orientation (e.g., an orientation of the image with reference to a standard one or more planes of human anatomy), image rotation (e.g., a rotation of the image with reference to a standard one or more planes of human anatomy), image field of view, slice thickness, image window and/or level (e.g., a contrast of the image, a brightness of the image, and/or the like), image color map (e.g., that includes information for rendering different pixel intensities as different colors), other color characteristics, image opacity (and/or opacity map), image zoom level, image cropping information, image/series/exam description, whether or not contrast agent was administered, and/or the like. 
In some instances, one or more image attributes may be user defined and/or based on user preferences/rules. An image attribute may refer to the physical anatomy of a patient from which the image was obtained. For example, a medical image may be obtained to show a particular slice of a patient at a particular location such that a diagnosis of the patient may be made.
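The hierarchical inheritance of attributes described above (an image inherits from its series, a series from its exam, with more specific levels taking precedence) can be sketched as a dictionary merge. The attribute names below use DICOM-style keywords, but the helper and the sample values are purely illustrative:

```python
def effective_attributes(exam_attrs, series_attrs, image_attrs):
    """Hypothetical merge reflecting the hierarchical inheritance described
    above: an image inherits series attributes, and a series inherits exam
    attributes; more specific levels override more general ones."""
    merged = dict(exam_attrs)
    merged.update(series_attrs)
    merged.update(image_attrs)
    return merged

exam = {"Modality": "MR", "PatientAge": "054Y"}
series = {"ImagePlane": "axial", "SliceThickness": 3.0}
image = {"SliceThickness": 1.5}  # this image overrides the series default
print(effective_attributes(exam, series, image)["SliceThickness"])  # 1.5
```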
  • “Computerized Advanced Processing” (also referred to herein as “CAP”), as used herein, includes any computerized image analysis, image analysis technique, and/or image processing technique discussed herein, and/or any similar computerized processing technique that is currently or later available. CAP is described herein with regard to radiology images and other types of medical images, but CAP and the systems and methods described herein may be applied in other areas including, but not limited to, other types of medical images (for example, cardiology, dermatology, pathology and/or endoscopy, among others), computer generated images (for example, 3D images from virtual colonoscopy, 3D images of vessels from CTA, and the like), images from other fields (for example, surveillance imaging, satellite imaging, and the like), as well as non-imaging data including audio, text, and numeric data. In some embodiments, CAP may include, but is not limited to, volume rendering (including, for example, multiplanar reformation/reconstruction (MPR), minimum intensity projection (MinIP), maximum intensity projection (MIP), color slabs with various look-up tables (LUTs), full volumes, 3D volume rendering, and/or 3D surface rendering), graphical processing/reporting (e.g., automated identification and outlining of lesions, lumbar discs etc.), automated measurement of lesions or other anatomical features, other computer aided diagnosis (CAD), machine learning-based analysis, artificial intelligence-based analysis, Bayesian analytics, other image processing techniques, and/or the like. Examples of certain types of CAP rules are described below in reference to FIGS. 8A and 8B.
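Of the volume-rendering CAP operations listed above, MIP and MinIP are the simplest to illustrate: each output pixel of a slab is the maximum (or minimum) intensity along the slab's depth. The sketch below operates on a volume represented as nested Python lists; a real implementation would use an array library, and all names here are hypothetical:

```python
def project_slab(volume, start, count, mode="MIP"):
    """Hypothetical slab projection over `count` consecutive slices starting
    at index `start`. MIP keeps the maximum intensity along the slab depth;
    MinIP keeps the minimum, as described for CAP volume rendering above."""
    slab = volume[start:start + count]
    reduce_fn = max if mode == "MIP" else min
    rows, cols = len(slab[0]), len(slab[0][0])
    return [[reduce_fn(s[r][c] for s in slab) for c in range(cols)]
            for r in range(rows)]

# Two 2x2 slices; the MIP keeps the brighter value at each pixel position.
volume = [[[1, 2], [3, 4]],
          [[5, 0], [1, 9]]]
print(project_slab(volume, 0, 2, "MIP"))  # [[5, 2], [3, 9]]
```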
  • “Slice,” “slab,” and the like, as used herein, generally refer to medical images obtained, rendered, constructed, reconstructed, etc., from medical imaging data. The term “slab” generally refers to a relatively thick slice. Slices and slabs, and series of slices and slabs, may have any attributes of medical images and series as described herein. For example, a slice may be defined by its thickness (e.g., a thickness of the cross section of imaged tissue). Series of slices may be defined by their anatomical positions, by an increment between each successive slice, by a number of slices in the series, etc. For example, in a series of slices, the successive slices may overlap (e.g., contain overlapping medical image data) or not (e.g., in some cases, there may be gaps between the successive slices). In some instances, slices may be combined, or reformatted/re-rendered (e.g., to change the slices' thickness, etc.) by any CAP process and/or in response to a display or transfer rule, as described herein.
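The relationship between slice thickness and slice increment described above (overlapping slices when the increment is smaller than the thickness, gaps when it is larger) can be sketched numerically. The function and its parameters are illustrative only:

```python
def slab_positions(extent_mm, thickness_mm, increment_mm):
    """Hypothetical computation of (start, end) positions, in mm, for a
    series of slabs covering an anatomical extent. An increment smaller
    than the thickness yields overlapping slabs; a larger increment
    leaves gaps between successive slabs."""
    positions = []
    start = 0.0
    while start + thickness_mm <= extent_mm:
        positions.append((start, start + thickness_mm))
        start += increment_mm
    return positions

# Overlapping: 5 mm slabs every 2.5 mm over a 20 mm extent.
print(len(slab_positions(20.0, 5.0, 2.5)))  # 7
# Gapped: 1 mm slices every 2 mm leave 1 mm of unimaged tissue between slices.
```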
  • The term “thin slices,” as used herein, generally refers to any medical images, or related images (e.g., a series of images), having certain slice characteristics/attributes. For example, thin slices may be defined based on a number of slices, e.g., medical images, that are included in an image series. For example, “thin slices” might be defined to include images of any imaging series having more than 200 slices (or any other defined number of slices). Medical images may also, or alternatively, be defined based on a thickness of the cross section of the imaged tissue (“slice thickness”) represented in the medical image. For example, “thin slices” might be defined to include images having a slice thickness of less than 1 mm (or any other defined slice thickness). The definition of thin slices may be adjusted based on one or more of many rules or criteria, such as a particular user, site, imaging type, viewing environment (e.g., viewing device characteristic, bandwidth of viewing device) and any number of other related attributes. Thus, thin slices for a particular user at a first site may be defined differently than thin slices for another user at a second viewing site. Similarly, thin slices for a first modality or exam type may be defined differently than thin slices for a second modality or exam type. As described in further detail below, the definition of thin slices and how thin slices (as well as thick slices) are managed (e.g., transferred to various viewing devices) may vary in response to one or more of various attributes associated with the images.
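The two example definitions above (more than 200 images in a series, or a slice thickness under 1 mm) can be combined into a simple classifier. The thresholds are the examples from the text and, per the text, would be overridden per user, site, modality, or viewing environment; the function itself is a hypothetical sketch:

```python
def is_thin_slice_series(num_images, slice_thickness_mm,
                         min_count=200, max_thickness_mm=1.0):
    """Hypothetical "thin slices" classifier following the two example
    definitions above: a series counts as thin slices if it has more than
    `min_count` images or a slice thickness below `max_thickness_mm`.
    Both thresholds are configurable per user, site, or exam type."""
    return num_images > min_count or slice_thickness_mm < max_thickness_mm

print(is_thin_slice_series(300, 3.0))  # True (by image count)
print(is_thin_slice_series(100, 0.5))  # True (by slice thickness)
print(is_thin_slice_series(100, 3.0))  # False
```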
In many medical imaging systems, thin slices, whether defined by a slice thickness and/or slice quantity, are typically stored in order to ensure that the thin slices are later available for use, if necessary. However, in many image viewing applications, some or all of the thin slices may not be necessary to download/view in order to allow the viewer to accurately read an image series. Thus, a viewing system, such as a picture archiving and communication system (“PACS”) workstation, may download many thin slices, consuming valuable bandwidth and storage space, where the downloaded thin slices are never viewed by a viewer on the PACS workstation. On the other hand, certain thin slices, such as images associated with particular exam types or modalities, may need to be viewed in order to allow the viewer to accurately read an image series. For example, for certain exams thin slices may need to be reviewed in order to view details that cannot be appreciated in thick slices. Additionally, display environments, e.g., a combination of one or more of a display device, bandwidth, connection speed, availability of thin slices locally, etc., vary widely, such that some display environments might be better suited for viewing and/or downloading a greater quantity of thin slices, while other display environments might be better suited for having a server, such as the PACS server 120 of FIG. 1B, render thin slices for viewing via a thin client interface with the display environment, while still other display environments may be better suited to not view or download thin slices at all, even via a thin client interface. 
Accordingly, the description below describes systems and methods that allow rules to be defined based on one or more of several attributes, such as a particular user, site, or device, as well as whether individual images and/or image series are classified as thin slices, and applied to medical images in order to determine which images are downloaded, viewed, stored, and/or any number of other actions that might be performed with respect to particular images.
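The three outcomes sketched above (download thin slices fully, have the server render them through a thin-client interface, or skip them) can be illustrated as a transfer-rule evaluation. The decision logic and its thresholds are hypothetical examples of such a rule, not the specific rules of the disclosed system:

```python
def transfer_action(series_is_thin, environment):
    """Hypothetical transfer-rule evaluation for the display environments
    described above: thick-slice series are always downloaded; thin-slice
    series are downloaded, server-rendered via a thin client, or skipped,
    depending on the viewing environment's capabilities."""
    if not series_is_thin:
        return "download"
    bandwidth = environment.get("bandwidth_mbps", 0)
    if environment.get("thin_client") and bandwidth >= 1:
        return "server_render"
    if bandwidth >= 100:
        return "download"
    return "skip"

# A low-bandwidth thin-client workstation lets the server render thin slices.
print(transfer_action(True, {"thin_client": True, "bandwidth_mbps": 10}))
```

In the described system, the matching rule would itself be selected based on the user, site, device, modality, and other attributes, rather than hard-coded as here.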
  • “Data store,” as used herein, includes any computer readable storage medium and/or device (or collection of data storage mediums and/or devices). Examples of data stores include, but are not limited to, optical disks (e.g., CD-ROM, DVD-ROM, etc.), magnetic disks (e.g., hard disks, floppy disks, etc.), memory circuits (e.g., solid state drives, random-access memory (RAM), etc.), and/or the like. Another example of a data store is a hosted storage environment that includes a collection of physical data storage devices that may be remotely accessible and may be rapidly provisioned as needed (commonly referred to as “cloud” storage).
  • “Database,” as used herein, includes any data structure (and/or combinations of multiple data structures) for storing and/or organizing data, including, but not limited to, relational databases (e.g., Oracle databases, mySQL databases, etc.), non-relational databases (e.g., NoSQL databases, etc.), in-memory databases, spreadsheets, comma-separated values (CSV) files, eXtensible Markup Language (XML) files, TeXT (TXT) files, flat files, spreadsheet files, and/or any other widely used or proprietary format for data storage. Databases are typically stored in one or more data stores. Accordingly, each database referred to herein (e.g., in the description herein and/or the figures of the present application) is to be understood as being stored in one or more data stores.
  • Example Medical Imaging System
  • FIG. 1A is a block diagram of an example medical imaging system 100 including two PACS workstations 170A and 170B in communication with various devices. The PACS workstations 170A and 170B each include computing systems that are configured to view medical data, such as series of medical images. In the embodiment of FIG. 1A, the PACS workstations 170 and remote viewing device 180 receive medical data, including medical images, via a network 160, which may comprise one or more suitable networks, such as local area networks (LAN), wide area networks (WAN), personal area networks (PAN), and/or the Internet, for example. In the embodiment of FIG. 1B, the PACS workstations 170 (which include workstations 170A and 170B) are illustrated in communication with a PACS server 120 via a separate network 165, which may comprise a secured local area network, for example. In either embodiment, the viewing devices, such as the PACS workstations 170 and the remote viewing device 180, are configured to access, download, and/or view medical images. Depending on the embodiment, the viewing devices may each include a special-purpose computing device that is configured especially for viewing medical images or a general purpose computing device, such as a personal computer, that executes software for viewing medical images.
  • In the embodiment of FIGS. 1A and 1B, a number of entities/devices, including the imaging center computing device(s) 130, the hospital computing device(s) 140, the electronic medical record (“EMR”) system 150, and one or more Computer Aided Diagnosis (CAD) systems 105, may generate and/or store medical data, including medical images of varying types, formats, etc., from various imaging devices. The imaging center computing device(s) 130 and the hospital computing device(s) 140 may each include multiple computing devices that are configured to generate, store, and/or transmit medical data, including medical images, to other computing devices, such as the EMR system 150, the PACS server 120, the CAD system 105, and/or any number of processing or viewing devices, such as the PACS workstations 170 and/or remote viewing device 180. The CAD system 105 may generally perform various Computer Aided Processing (CAP) actions as described above and below (e.g., analyzing images, rendering images, re-rendering images, etc.). In an embodiment, the CAD system's functionality may reside within another computing system, e.g., the PACS server 120, the EMR system 150, or the like.
  • In accordance with the systems and methods further described below, each of the viewing devices 170 and 180 may be configured to access, download, process and/or view thin slices in a different manner. For example, a user of the PACS workstation 170A may configure his workstation to not download any thin slices to the PACS workstation 170A, where thin slices are defined as images representing a slice thickness of less than 1 mm, while the user of the PACS workstation 170B may also configure his workstation to not download any thin slices, but thin slices are defined as images of an image series having more than 200 images. In various embodiments, as described below, other criteria may be used to determine downloading/transfer/etc. of images. Thus, each of the PACS workstations 170 may download and provide for viewing to viewers different images, such as different images that are classified as thin slices. Similarly, the remote viewing device 180, which is representative of a device that is not dedicated to viewing medical data, such as a home computing device of a doctor (e.g., a radiologist), may define thin slices in a different manner, in consideration of bandwidth and hardware limitations, for example.
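The two per-workstation definitions above can be sketched in code. The following is a minimal, hypothetical illustration (function and rule-key names are assumptions, not part of any actual PACS API): one workstation defines thin slices by slice thickness, the other by image count, so the same series may be classified differently.

```python
# Hypothetical sketch of per-device thin-slice classification.
# Rule keys ("max_thickness_mm", "min_image_count") are illustrative.

def is_thin_series(slice_thickness_mm, image_count, rule):
    """Return True if a series qualifies as 'thin slices' under a rule.

    A rule may test slice thickness, image count, or both.
    """
    if "max_thickness_mm" in rule and slice_thickness_mm < rule["max_thickness_mm"]:
        return True
    if "min_image_count" in rule and image_count > rule["min_image_count"]:
        return True
    return False

# Workstation 170A: thin means slice thickness under 1 mm.
rule_170a = {"max_thickness_mm": 1.0}
# Workstation 170B: thin means more than 200 images in the series.
rule_170b = {"min_image_count": 200}

series = {"thickness_mm": 0.6, "count": 150}
# The same series is "thin" for 170A but not for 170B.
print(is_thin_series(series["thickness_mm"], series["count"], rule_170a))  # True
print(is_thin_series(series["thickness_mm"], series["count"], rule_170b))  # False
```

Because each workstation applies its own rule, a "do not download thin slices" preference yields different behavior on each device for the identical series.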
  • In addition to allowing downloading, viewing, processing, and management of medical images to be based on their classification as thin slices, the systems and methods described herein allow rules for downloading, viewing, processing, storage, and/or other management of thin slices, as well as other medical images and types of medical data, to be defined based on one or more of a plurality of attributes; such rules are referred to herein as “transfer rules” and/or “display rules.” Thus, the transfer and display rules may not only include rules regarding transfer of data (e.g., images, image series, exams, etc.), but may also include rules associated with viewing, processing, storing, printing, and/or otherwise manipulating or managing medical data, such as medical images. Transfer and display rules related to transfer of data may be implemented in a push and/or pull architecture. For example, transfer and display rules may be applied at a device that has local access to the image data for pushing of matching image data to a viewing device, or transfer and display rules may be applied at a viewing device for requesting (pulling) matching image data from the device that has local access to the image data.
  • Depending on the embodiment, transfer and display rules may include criteria based on any attributes of images, series of images, exam descriptions, clinical indications, DICOM information, any other attribute, and/or any combination of attributes. For example, transfer and display rules for transferring medical images to the PACS workstation 170A may be based not only on whether medical images qualify as thin slices, but also on one or more of client (or viewing device) properties, connection properties, site properties, user properties, exam properties, and/or temporal properties.
  • As noted above, transfer and display rules may be based on image attributes, as well as series attributes. For example, transfer and display rules may indicate that a first type of image should be transferred, but not a second type of image, even though the two image types might be within the same image series. As one example, in diffusion imaging of the brain, the user might want transfer of the isotropic diffusion images, but not the large number of anisotropic diffusion images that were used to create the isotropic diffusion images on the scanner. Thus, the user might want a rule indicating that anisotropic diffusion images of the brain are not transferred, burned on CD, or printed. Such a rule would not affect the transfer, burning, or printing of isotropic diffusion images, even where the isotropic diffusion images are included in the same image series as anisotropic diffusion images that are affected by the rule. Accordingly, a rule indicating that isotropic diffusion images are transferred to a particular workstation, but that anisotropic diffusion images in the same series are not, would prevent transfer of any anisotropic diffusion images of the image series to that workstation. The type of individual images may be determined in various manners, such as according to attributes in their respective DICOM headers.
  • Transfer and display rules may also include series-based rules, such as to transfer image series of a particular modality while not transferring image series of other modalities. As another example of series-based transfer rules, consider a CT spine exam that includes two series, each 1,000 images×0.5 mm, where the first series is reconstructed in bone algorithm for detection of fractures and the second series is reconstructed in soft tissue algorithm. The viewer might want to routinely transfer the image series in bone algorithm, but not the image series in soft tissue algorithm. The two series may not be distinguishable by number of slices or slice thicknesses, but other information in the DICOM headers of the images in the series (e.g., series description or reconstruction algorithm) may be accessed to determine how and/or when the image series are transferred, viewed, processed, etc.
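The image- and series-level matching described in the two paragraphs above can be sketched as a simple attribute filter. This is an illustrative sketch only; the attribute keys (`image_type`, `body_part`) mimic DICOM-header-derived values but are hypothetical, as is the function naming.

```python
# Hypothetical sketch: filter images against a "do not transfer" rule
# expressed as DICOM-header-like attribute criteria, as in the
# isotropic/anisotropic diffusion example above.

def matches(attrs, criteria):
    """True if every rule criterion equals the corresponding attribute."""
    return all(attrs.get(key) == value for key, value in criteria.items())

def select_for_transfer(images, block_criteria):
    """Keep only images that do NOT match the blocking rule."""
    return [img for img in images if not matches(img, block_criteria)]

series = [
    {"uid": 1, "image_type": "isotropic",   "body_part": "BRAIN"},
    {"uid": 2, "image_type": "anisotropic", "body_part": "BRAIN"},
    {"uid": 3, "image_type": "anisotropic", "body_part": "BRAIN"},
]
# Rule: anisotropic diffusion images of the brain are not transferred.
blocked = {"image_type": "anisotropic", "body_part": "BRAIN"}
print([img["uid"] for img in select_for_transfer(series, blocked)])  # [1]
```

The same `matches` helper could test series-level criteria such as series description or reconstruction algorithm, which is how the two otherwise identical CT spine series could be told apart.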
  • In another example of series- or image-based rules, images may be transferred, displayed, or otherwise processed, based on imaging plane. For example, transfer and display rules may indicate specific actions to be performed based on whether the images are axial, coronal, sagittal, or any other imaging plane.
  • In another specific example of transfer and display rules, images may be transferred or otherwise processed based on series attributes as well as series description. For example, a brain CTA might include two large series: a series of 0.6 mm images of the brain in bone algorithm before contrast and a series of 0.6 mm images after contrast. The user might want the second series transferred routinely to evaluate the vessels, but not the first series. However, if the clinical history is “trauma” the user might elect to view the first series (bone algorithm) to evaluate for fractures. The two large series have similar slice thickness and image numbers but differ by other criteria, e.g., series description and administration of contrast.
  • As those of skill in the art will recognize, one reason to transfer certain images is for processing, such as one or more CAP actions (e.g., 3D volumetric rendering or computer-aided diagnosis (CAD)), for example. While transfer and display rules, as used herein, are discussed primarily with reference to display of images and transfers from a server, such as a PACS server 120, to a PACS workstation or other client machine, transfer and display rules may also apply to other types of image processing, such as transfer from a server to other machines, for example in response to viewing of an exam, or portions of the exam, on the PACS workstation. For example, when a viewer displays an exam, the transfer and display rules might cause the thin slices to be transferred to a rendering server, and might involve transfer to more than one place. As a specific example, the rules for a particular viewer (or group of viewers) might be to transfer all image series to the viewer (e.g., to a remote viewing device at the viewer's home or office), including CTA source images that are categorized as thin slices, and to transfer the CTA source images to a 3D rendering server (because the remote viewing device may not be capable of 3D rendering). As another specific example, the rules for a particular viewer (or group of viewers) might be to transfer all image series to the viewer, except the CTA source images that are categorized as thin slices (because the remote viewing device is on a slow network connection), and to transfer the CTA source images to a 3D rendering server for both rendering of the images and viewing the source images in client-server mode.
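The multi-destination routing in the two CTA examples above can be sketched as follows. All rule keys and destination names here are illustrative assumptions, not actual system identifiers.

```python
# Hypothetical sketch of rule-driven routing to more than one destination,
# per the CTA examples above. Rule keys and destination names are invented.

def route_series(is_thin, viewer_rules):
    """Return the list of destinations a series should be sent to."""
    destinations = []
    # Send to the viewer unless the series is thin and thin slices are
    # excluded from the viewer (e.g., slow home connection).
    if viewer_rules.get("send_all_to_viewer") and not (
        is_thin and viewer_rules.get("skip_thin_on_viewer")
    ):
        destinations.append("viewer")
    # Thin slices may additionally (or instead) go to a rendering server.
    if is_thin and viewer_rules.get("thin_to_rendering_server"):
        destinations.append("3d_rendering_server")
    return destinations

# Viewer on a slow connection: thin CTA source images go only to the
# rendering server; all other series go to the viewer.
slow_home = {"send_all_to_viewer": True, "skip_thin_on_viewer": True,
             "thin_to_rendering_server": True}
print(route_series(True, slow_home))    # ['3d_rendering_server']
print(route_series(False, slow_home))   # ['viewer']
```

Clearing `skip_thin_on_viewer` models the first example, in which thin CTA source images are sent to both the viewer and the rendering server.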
  • Because transfer and display rules may be based on such a large range of attributes, transfer and display rules for each user may differ. Similarly, transfer and display rules for combinations of a user with different sites may differ and transfer and display rules for combinations of a user with different viewing devices may differ. Thus, in one embodiment transfer and display rules comprise a set of rules that may vary from one combination of an image, user, site, viewing device, etc., to another combination of an image, user, site, viewing device, etc. Thus, systems and methods described herein provide flexibility in configuring a particular device, one or more devices at a particular site, any device operated by a particular user or class of user, etc., for downloading, viewing, and/or otherwise managing or processing medical images, including thin slices.
  • The imaging center computing device(s) 130 may include one or more imaging devices of any type, such as imaging devices that are configured to generate MRI, x-ray, mammography, or CT images (or any other type of image, as described above). In one embodiment, image data for certain medical images is stored in DICOM format. The complete DICOM specifications may be found on the National Electrical Manufacturers Association website at <medical.nema.org>. Also, “NEMA PS 3—Digital Imaging and Communications in Medicine,” 2004 ed., Global Engineering Documents, Englewood, Colo., 2004, provides an overview of the DICOM standard. Each of the above-cited references is hereby incorporated by reference in its entirety.
  • The example PACS server 120 is configured to store images from multiple sources and in multiple formats, and may be configured to render certain medical images for display on one or more viewing devices via a thin network communication link. For example, the PACS server 120 may be configured to receive medical images in the DICOM format from multiple sources, store these images, and selectively transmit medical images to requesting viewing devices.
  • The hospital computing device(s) 140 may include and/or be replaced with computing device(s) of any other medical facility, such as a clinic or doctor's office. These medical facilities may each include one or more imaging devices and may share medical images with the PACS server 120 or other authorized computing devices.
  • Example Display Environments
  • In one embodiment, each of the PACS workstations 170 and the remote viewing device 180 are configured to download, store, display, and allow user interaction with, medical images. Thus, each of these devices is part of a respective display environment, where the display environment for each device may also include attributes such as a network connection speed, display device characteristics, allotted and/or available storage space, processing speed, and/or other attributes that may affect how medical data is downloaded, stored, and/or viewed by the devices. As used herein, the term “viewing device” refers to a computing system that is used in a display environment, where the viewing devices could include any type of device, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a server, or any other computing device or combination of devices. Thus, each of the PACS workstations 170 and remote viewing device 180 include one or more viewing devices having associated display environments.
  • Each viewing device includes, for example, one or more computing devices and/or servers that are IBM, Macintosh, Windows, Linux/Unix, or other operating system compatible. In one embodiment, viewing devices each include one or more central processing units (“CPU”), such as conventional or proprietary microprocessors, and one or more application modules that comprise one or more various applications that may be executed by the CPU.
  • The viewing devices may further include one or more computer readable storage mediums, such as random access memory (“RAM”) for temporary storage of information and read only memory (“ROM”) for permanent storage of information, and one or more mass storage devices, such as a hard drive, diskette, and/or optical media storage device. Further details regarding the viewing devices, and other computing devices of the present disclosure, are described below.
  • The viewing devices may include one or more of commonly available input/output (I/O) devices and interfaces, such as a keyboard, mouse, touchpad, and/or printer. In one embodiment, the I/O devices and interfaces for a viewing device may include one or more display devices, such as a monitor, that allows the visual presentation of data, such as medical images and/or other medical data, to a user. More particularly, display devices provide for the presentation of graphical user interfaces (“GUIs”), application software data, and multimedia presentations, for example. In one embodiment, a GUI includes one or more display panes in which medical images may be displayed. According to the systems and methods described below, medical images may be stored on the viewing device or another device that is local or remote, displayed on a display device, and manipulated by one or more application modules. Viewing devices may also include one or more multimedia devices, such as speakers, video cards, graphics accelerators, and microphones, for example.
  • The viewing devices may include one or more communication interfaces that allow communications with various external devices. For example, the viewing devices of FIG. 1A each may interface with a network 160 that includes one or more of a LAN, WAN, or the Internet, for example. The network 160 may further interface with various computing devices and/or other electronic devices. In the example embodiment of FIG. 1A, the network 160 is coupled to imaging center 130, a hospital 140, an EMR 150, and a PACS server 120, which may each include one or more computing devices. In addition to the devices that are illustrated in FIG. 1, the network 160 may communicate with other computing, imaging, and storage devices.
  • The computing devices described herein may include different combinations of the components described with reference to FIGS. 1A and 1B, as well as possibly additional components. For example, a server may include a processor with increased processing power and additional storage devices, but not all of the input/output devices.
  • The methods described and claimed herein may be performed by any suitable computing device, such as the viewing devices and/or computing devices of imaging centers, hospitals, EMR systems, PACS, and the like.
  • Rules-Based Approach To Image Processing, Display, Routing and/or Preloading
  • As noted above, various imaging devices generate medical images having a wide range of attributes/characteristics, such as slice thicknesses and slice quantities included in respective image series. Thus, medical images of various modalities, such as CT, MRI, PET, digital mammography (e.g., breast tomosynthesis), among others, may be processed and/or transferred for display by one or more viewing devices. The medical images may include image series with thin slices, as well as thicker slices, such as thicker axial, coronal, and/or sagittal images, for example. In one embodiment, the thin slices may be stored for occasional use in the near or long term. One or more sets of rules that collectively comprise transfer and display rules may be defined by individual users and/or administrators of entire sites, for example, to define what medical images should be classified as “thin slices,” based on image slice thickness and/or the number of images in a series, for example. Depending on the embodiment, the transfer and display rules, which may include thin rules and/or other types of rules, may be stored at any location(s), such as local to a remote viewing device, local to a PACS server, and/or at a network location that is accessible to both remote viewing devices and PACS servers, for example. The transfer and display rules may be stored in any data store, such as in a database or XML file, for example, on any computer readable medium.
  • The transfer and display rules may allow a user to indicate a preference as to whether thin slices are automatically displayed by default, or if thin slices are only displayed when requested by the user. For example, if an exam includes several image series, including one or more series having thin slices, a site may establish transfer and display rules that control if and how these thin slices are downloaded and/or displayed. Additionally, the transfer and display rules may allow individual users to assign rules that override the site rules. Thus, a first user at a particular display environment may be automatically presented with thin slices based on the transfer and display rules, while a second user at the same display environment is only presented with the thin slices if a specific request to download/view the thin slices is made by the second user, in view of different user rules in the transfer and display rules.
  • In one embodiment, the transfer and display rules may indicate whether thin slices are transmitted to remote sites, such as the remote viewing device 180 of FIG. 1, based on a variety of criteria, including preferences of the individual user, characteristics of the viewing environment, the user role, available bandwidth, the software client, or many others. For example, a rule might indicate that referring doctors are not presented with thin slices for display by default. Another rule might indicate that for certain users or sites, for example, when media such as a CD or DVD is created and/or when an exam is uploaded to a personal health record (“PHR”) or EMR system, for example, the thin slices are not included. Another rule might indicate that when film is printed, the thin slices are not printed.
  • Rules may also indicate whether thin slices are preloaded into RAM while reading and/or may indicate the priority of preloading. For example, thin slices and/or an entire image series having thin slices might be preloaded into RAM when an exam is displayed, but not actually displayed; instead, the thin slices are maintained in RAM for immediate access if the user decides to display them. Other alternatives for transferring, loading into memory, and/or displaying thin slices may also be specified in rules. For example, rules may indicate that (i) thin slices are to be loaded into RAM and displayed after (or while) downloading the images, (ii) the thin slices are loaded into the local memory of the viewing computer and not automatically pre-loaded into RAM or displayed until the user so requests, and/or (iii) thin slices are assigned a lowest priority for use of RAM, such that the thin slices are loaded into RAM only after the other non-thin images of an exam are loaded. Thus, transfer and display rules may include rules that not only relate to movement of images between computing devices, but also within a device, such as whether images are pre-loaded into RAM, the priority of such loading, and whether they are displayed.
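Option (iii) above, giving thin slices the lowest RAM-preload priority, can be sketched with a priority queue. The data shapes and names below are illustrative assumptions.

```python
# Hypothetical sketch of RAM-preload ordering: non-thin series load
# first; thin series load only after everything else, as in option (iii).
import heapq

def preload_order(series_list):
    """Yield series names in preload order: non-thin first, thin last."""
    heap = []
    for arrival, s in enumerate(series_list):
        priority = 1 if s["thin"] else 0     # thin slices get lowest priority
        # arrival index breaks ties so equal-priority series keep their order
        heapq.heappush(heap, (priority, arrival, s["name"]))
    while heap:
        yield heapq.heappop(heap)[2]

exam = [
    {"name": "thin_axials",  "thin": True},
    {"name": "thick_axials", "thin": False},
    {"name": "scout",        "thin": False},
]
print(list(preload_order(exam)))  # ['thick_axials', 'scout', 'thin_axials']
```

A real implementation would load series asynchronously and evict under memory pressure; the queue only illustrates the ordering the rule expresses.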
  • Rules may also indicate how and when images are to be displayed, including rules for rendering of images with particular characteristics (e.g., a thickness of images to be displayed).
  • FIG. 2 illustrates a transfer and display rules table including example categories of properties/attributes on which rules may be based for a user, site, or other group of viewing devices. In the embodiment of FIG. 2, the first two listed properties are slice thickness and number of images in series. As noted above, medical images may be classified as thin slices based on one or both of these first two properties. In some embodiments, transfer and display rules may include only rules used to categorize images as thin slices. However, as indicated in FIG. 2, for example, many other attributes associated with medical images and the viewing environment may be included in the transfer and display rules, such that different viewing environments may not only define thin slices differently, but may interface with thin slices in different manners, based on other transfer and display rules. In some embodiments, the transfer and display rules may also be used as criteria for downloading, viewing, printing, storing, deleting, and/or otherwise managing medical data that match the indicated transfer and display rules. Alternatively, separate sets of criteria, similar to the transfer and display rules, may be established for downloading, viewing, and/or storing medical images, for example.
  • In the embodiment of FIG. 2, the properties include one or more client properties, such as a client ID that may identify a particular viewing device and/or viewing environment, for example. The client properties may include individual characteristics of a client, such as hardware and/or software capabilities of a particular viewing device. The connection properties may indicate various characteristics of a connection between a viewing device and the Internet, for example, such as a connection speed, type of Internet service, and/or Internet service provider. The site properties may include a site identifier that identifies a particular site, class of sites, or group of sites. The user properties may indicate a particular user, such as by a user ID, username, or the like, as well as a user role, user access rights, user preferences, etc.
  • In one embodiment, site rules may indicate a default definition for thin slices, as well as rules for how thin slices are downloaded, viewed, and/or otherwise used by viewing devices associated with the particular site. Similarly, user rules may indicate a user-specific, or user-group-specific, definition for thin slices, as well as user rules for how thin slices are downloaded, viewed, and/or otherwise used by viewing devices which the user is controlling. Thus, in one embodiment the site rules and the user rules may differ and, accordingly, may be reconciled by overriding conflicting rules with either the site rules or the user rules, by requesting further input from the user, or by applying some other rules for reconciling the conflict.
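One of the reconciliation options above, letting user rules override conflicting site defaults, reduces to a simple dictionary merge. The rule keys below are hypothetical.

```python
# Hypothetical sketch of one reconciliation policy: user rules override
# site rules on conflict; site rules fill in everything else.

def effective_rules(site_rules, user_rules):
    """Merge rule sets, letting user rules win on conflicting keys."""
    merged = dict(site_rules)
    merged.update(user_rules)   # user-level entries override site defaults
    return merged

site = {"thin_max_thickness_mm": 1.0, "auto_display_thin": False}
user = {"auto_display_thin": True}
print(effective_rules(site, user))
# {'thin_max_thickness_mm': 1.0, 'auto_display_thin': True}
```

The opposite policy (site rules override user rules) is the same merge with the arguments swapped, and either could be gated behind a prompt for further user input.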
  • The data structure of FIG. 2 also indicates that rules might be based on exam properties and/or temporal characteristics. In one embodiment, rules associated with one or more exam properties, such as an exam type, imaging device, referring doctor, imaging center, etc., may be considered in determining whether images are transferred to certain viewing devices. Other rules might indicate that only thin slices having a particular type of compression are to be transferred or viewed by a particular user, client, and/or site, for example. Such rules may be used in a real-time manner by a PACS server, to select a compression type that matches a compression type indicated in a rule so that the corresponding thin slices may be transmitted to the associated viewing devices. Rules may also be based on one or more temporal properties, such as when certain images should be downloaded, viewed, etc., by particular viewing devices, users, sites, etc. For example, a user may define a temporal rule indicating that thin slices, defined by the user and/or site rules, for example, should be downloaded to the user's viewing device only after 7:00 PM on weeknights or on weekend days.
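The temporal rule in the example above (download thin slices only after 7:00 PM on weeknights, or any time on weekend days) can be sketched as a predicate over the current time. The function name is an assumption for illustration.

```python
# Hypothetical sketch of the temporal rule described above: thin-slice
# downloads allowed after 7:00 PM Mon-Fri, or all day Sat/Sun.
from datetime import datetime

def thin_download_allowed(now):
    """True on weekends, or at/after 7:00 PM on weeknights."""
    if now.weekday() >= 5:      # Python: Monday=0 ... Saturday=5, Sunday=6
        return True
    return now.hour >= 19       # 7:00 PM or later on a weeknight

print(thin_download_allowed(datetime(2017, 3, 25, 10, 0)))  # Saturday -> True
print(thin_download_allowed(datetime(2017, 3, 27, 10, 0)))  # Monday 10 AM -> False
print(thin_download_allowed(datetime(2017, 3, 27, 20, 0)))  # Monday 8 PM -> True
```

A transfer scheduler would evaluate such a predicate before queuing thin-slice downloads to the user's viewing device.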
  • The data structure of FIG. 2 also indicates that rules may be based on various other exam, series, and image attributes. For example, as described above in reference to FIG. 9, images 912, image series 908, and exams 902 and 904 may have associated attributes. Attributes may propagate through this hierarchy. For example, an exam 904 may include multiple image series 908, and each image series may include multiple images 912. As discussed, an instance of a lower order may inherit one or more attributes associated with an instance of a higher order. A user may choose to override the inherited attributes in each order having those attributes. In some embodiments, a user may tailor a rule based on these hierarchical orders/relations of various attributes. For example, a user may specify a rule requiring the first ten image slices (image attribute) of a patient's head having a sagittal plane (series attribute) from exams taken within the last two years (exam attribute) to be reformatted to 2 mm thickness and transferred to a hospital computing device 140. Additional examples of transfer and display rules and functionality of the system based on various exam, series, and image attributes are described below.
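The exam-to-series-to-image attribute inheritance described above, with lower-order overrides, maps naturally onto a layered lookup. The attribute names below are illustrative, not the reference numerals' actual attribute sets.

```python
# Hypothetical sketch of hierarchical attribute inheritance
# (exam -> series -> image), where a lower order inherits from a higher
# order unless it overrides the attribute itself.
from collections import ChainMap

exam_attrs = {"modality": "CT", "body_part": "HEAD", "window": "default"}
series_attrs = {"plane": "sagittal"}
image_attrs = {"window": "bone"}   # image-level override of the window

# Lookup consults the image first, then the series, then the exam.
effective = ChainMap(image_attrs, series_attrs, exam_attrs)
print(effective["modality"])   # 'CT'       (inherited from the exam)
print(effective["plane"])      # 'sagittal' (inherited from the series)
print(effective["window"])     # 'bone'     (overridden at the image level)
```

A rule such as the 2 mm reformat example above would then be matched against the `effective` mapping for each image, so exam-, series-, and image-level criteria are all visible in one place.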
  • The data structure of FIG. 2 also indicates that rules may be based on prior exam display attributes. A set of medical data having a certain modality may benefit from sharing/inheriting one or more attributes from a prior exam having the same modality. For example, slice thickness rules from a prior exam may be applied to a later-taken same modality exam and facilitate direct comparison between the two medical data sets. Additional examples of transfer and display rules and functionality of the system based on prior exam display attributes are described below.
  • As mentioned, rules associated with one or multiple of the attributes/properties indicated in FIGS. 2 and 9, as well as other properties or attributes, may be considered in determining if, when, and/or how medical images, such as thin slices, are transferred, presented for viewing, viewed, stored, printed, etc. As an example of a combination of rules based on multiple attributes, a first viewer may set user rules indicating a personal preference to have only thick sections of CT scans displayed routinely while reading, but to have the thin slices preloaded into RAM and displayed only if the viewer goes into MPR or volume rendering mode. The same user may also set rules indicating that for certain exam types, e.g., CTAs, the thin slices are presented for viewing as a default, overriding the user's general preference to have only thick sections of CT scans displayed as a default. The user may also establish rules associated with different sites, such that when the viewer is at home, the same CTA thin slices are not displayed. Such rules may also be defined in terms of connection properties, such that if the home viewing device does not meet a particular minimum access speed, the thin slices are not downloaded to the viewing device, the thin slices are downloaded to the viewing device in the background, or the thin slices are downloaded only on certain days of the week and/or at certain times, for example. Thus, the user rules that comprise transfer and display rules may include various attributes and may be combined in various manners to produce robust and precisely customizable rules for transferring, viewing, and managing medical images.
  • Example Rules
  • FIGS. 3A, 3B, 3C illustrate example data structures having rules of varying types that may be included as part of transfer and display rules. The transfer and display rules described in reference to FIGS. 3A, 3B, and 3C, and other rules as described below (e.g., in reference to FIGS. 6, 8A, and 8B) may be stored in one or more databases of the system (e.g., on the PACS server 120, the CAD system 105, the EMR system 150, any other systems shown in FIGS. 1A and 1B, or any combination of the above). As discussed above, transfer and display rules may vary depending on many attributes of a medical image or image series, as well as the respective viewing device to which the medical image might be transferred. For example, transfer and display rules for a particular medical image might classify the medical image as a thin slice based on site rules of a first viewing device, while the same medical image is classified as a thick slice based on site rules of a second viewing device. Accordingly, the eventual transfer, viewing, and/or management of the medical image (or lack thereof) by the first and second viewing devices may be quite different. While the data structures of FIGS. 3A, 3B, and 3C illustrate certain rules that may be included in transfer and display rules, rules based on any other attributes may also be included in other systems. Likewise, transfer and display rules may include fewer rules and/or types of rules than is illustrated in the data structures of FIGS. 3A, 3B, and 3C. Additionally, the transfer and display rules may include more specific criteria for a particular type of rule than is illustrated in FIGS. 3A, 3B, and 3C. The example of FIGS. 3A, 3B, and 3C is not intended as limiting the scope of rules or attributes on which transfer criteria may be based, but rather is provided as one example of such transfer and display rules.
  • FIGS. 3A and 3B illustrate a sample data structure including user-specific rules. In this embodiment, the user establishes different rules for each of two locations associated with the user (home and hospital). In particular, section 310 of the data structure of FIG. 3A includes rules that are specific to a viewing device at the user's home (but could be applied to various other locations or other criteria). For example, a user may specify an orientational preference of the image (e.g., field-of-view and W/L ratio) and/or dimensions of each image (e.g., slice thickness and increment among the slices). Further, a user may specify, in a client-server type of setting, where image processing should occur and where the resulting rendered images should be stored.
  • However, while a user may specify various aspects of the user's viewing experience with rules by adjusting a wide variety of parameters, some rules may conflict with other rules when strictly followed. For example, a user may have specified a rule to display an axial view of an exam when a patient is diagnosed with breast cancer and another rule to display a sagittal view when a patient has a history of spinal injury. When processing a patient who has both conditions, the system may reconcile such conflicting rules. For example, some embodiments may evaluate the rules in a sequential order and allow, generally, a later evaluated rule to override a prior conflicting rule or vice versa. In some embodiments, rule reconciliation may be based on a hierarchy of rights or user specified exceptions.
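  • The sequential reconciliation scheme just described, in which a later evaluated rule overrides a prior conflicting rule, might be sketched as follows. This is an illustrative sketch only; the rule encoding, the field names, and the `reconcile_sequential` function are assumptions, not part of the specification.

```python
def reconcile_sequential(rules, exam):
    """Evaluate rules in definition order; a later matching rule
    overrides any earlier rule that set the same display parameter."""
    effective = {}
    for rule in rules:
        if rule["condition"](exam):
            # Later rules silently override earlier conflicting settings.
            effective.update(rule["actions"])
    return effective

# Hypothetical encoding of the two conflicting rules from the text.
rules = [
    {"condition": lambda e: "breast cancer" in e["history"],
     "actions": {"view": "axial"}},
    {"condition": lambda e: "spinal injury" in e["history"],
     "actions": {"view": "sagittal"}},
]
exam = {"history": ["breast cancer", "spinal injury"]}
display = reconcile_sequential(rules, exam)
# Under this scheme, the later-defined spinal-injury rule wins.
```

A "vice versa" variant would simply iterate the rules in reverse order, letting the earliest defined rule win instead.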
  • The rules associated with the user's home also include exceptions for a particular image type, specifically, a CT image type in this embodiment. Thus, the user has established rules that define how images are classified as thin slices, and further has defined how the thin slices should be managed by the user's home viewing device, with a particular set of rules for only CT images.
  • As shown in FIG. 3B, the user may also establish rules in section 320 that are specific to a viewing device used by the user at a hospital location. Thus, the viewing devices of the user at the home and hospital locations may receive different sets of the same medical images. It should be apparent to a person skilled in the art that having “exceptions” providing for differentiation based on access location type gives the user great flexibility. The user may specify any set of rules to configure the system to cater to the user's needs. For example, a user may specify a rule that diagnoses an abnormality (CAD) and, when an abnormality is detected, automatically transfers only the image set containing the abnormality (transfer rule) to a second authorized user who can give a second opinion. The second user may have his or her viewing device configured such that it runs one or more coloring CAPs to enhance the images and reformats the slice thickness (display rule) automatically before the second user opens the images.
  • FIG. 3C illustrates a sample data structure including site rules, such as those that are developed by an administrator of a particular medical imaging center, doctor's office, hospital, or other site that views and/or receives medical images. In one embodiment, site rules may be used as default rules for all viewing systems and/or users at or associated with the site. In the embodiment of FIG. 3C, the site rules indicate which rules can be overridden by user rules such as those in FIGS. 3A and 3B. In other embodiments, the site rules may be configured to always override conflicting user rules or the site rules may be configured to always be overridden by user rules. In other embodiments, other rules are used for reconciling conflicting site rules and user rules. As also noted above, rules based on various other attributes may be established by users of the system, and similar reconciliation schemes or rules may be used to determine the proper transfer and display rules for a given image or image series.
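  • A site-rule scheme in which each site rule indicates whether user rules may override it, as in the FIG. 3C embodiment, might be modeled as in the following sketch. The parameter names and the per-rule "overridable" flag encoding are illustrative assumptions.

```python
def merge_site_and_user(site_rules, user_rules):
    """Start from site defaults; apply a user value only where the
    site rule is marked as overridable by user rules."""
    merged = {}
    for param, (value, user_may_override) in site_rules.items():
        if user_may_override and param in user_rules:
            merged[param] = user_rules[param]
        else:
            merged[param] = value
    return merged

# Hypothetical site configuration: each entry is (value, overridable).
site = {
    "thin_slice_max_mm": (1.0, False),    # locked by the site administrator
    "default_thickness_mm": (3.0, True),  # user preference allowed
}
user = {"thin_slice_max_mm": 0.5, "default_thickness_mm": 2.0}
effective = merge_site_and_user(site, user)
# The locked site threshold survives; the overridable default does not.
```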
  • Additional examples of transfer and display rules such as those of FIGS. 3A, 3B, and 3C, and associated functionality of the system, are described below.
  • Reconciliation of Rules
  • FIG. 4 is a flowchart illustrating an embodiment of a method of the present disclosure, including, e.g., executing rules and CAP processes, rendering images/series, transmitting and displaying images, and various other functionality related to transfer and display rules, such as reconciling default rules, such as site rules, with user rules.
  • In one embodiment, various sets of rules, such as site rules and user rules, that are associated with a particular medical image or series of medical images, are reconciled in order to determine a set of transfer and display rules associated with the particular medical image and viewing environment. Thus, the transfer and display rules that are applied to a particular medical image may vary depending on rules associated with one or more of the viewing environment, the viewer, the site, the connection, and/or the client, for example. In one embodiment, the method of FIG. 4 is performed by a device that stores the medical images, such as the PACS server 120 or EMR system 150 of FIGS. 1A and 1B. Depending on embodiment, the method of FIG. 4 may include fewer or additional blocks and blocks may be performed in a different order than is illustrated. In addition, similar methods may be performed in order to determine other functionality with respect to medical images, such as whether medical images should be displayed, stored, printed, etc., based on the determined transfer and display rules. Software code configured for execution on a computing device in order to perform the method of FIG. 4 may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, or any other tangible medium. Such software code may be stored, partially or fully, on a memory device of the computing device in order to perform the method outlined in FIG. 4 by those respective devices.
  • Beginning at blocks 410 and 420, default rules, such as site rules, and user rules are accessed. As indicated in FIGS. 3A, 3B, and 3C, site rules and user rules may differ in various ways, including in both defining what medical images should be classified as thin slices and in determining how thin slices and/or thick slices should be treated in view of multiple possible attributes. At block 430, transfer and/or display rules based on the default/user/site/etc. rules are determined. As noted above, the default and/or user rules may include rules defining required or excluded properties of many types, such as properties of the viewing environment, the viewer, the site, the connection, the exam, and/or the client, for example. The computing device that performs the method of FIG. 4 reconciles the various default/user/site/etc. rules in order to determine transfer and display rules for a particular image or image series (or other set of medical data). Following the rule reconciliation, the method may include an optional block 432 to execute one or more CAP processes. Examples of executing CAP processes are described below in reference to FIG. 7 and other figures. At block 434, the system may optionally render images based on determined display rules or the results of block 432. For example, a user may have defined a rule in which a calcification detection algorithm is executed at block 432 and an associated rule that renders the images with thicker slices is executed at block 434. Further examples of rendering images in response to CAP processes and various rules are described in reference to various figures below.
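  • The sequence from rule access through optional rendering (blocks 410-434) might be sketched as a simple pipeline. Everything below is an illustrative assumption: the trivial merge used for reconciliation, the placeholder CAP and rendering steps, and all names are hypothetical, not the specification's implementation.

```python
def process_series(series, site_rules, user_rules, run_cap, render):
    """Illustrative pipeline for blocks 410-434 of FIG. 4: access rules,
    reconcile them, optionally run CAP, optionally render."""
    # Block 430 (trivial reconciliation): user rules override site rules.
    rules = {**site_rules, **user_rules}
    # Optional block 432: execute a CAP process if the rules call for one.
    cap_result = run_cap(series) if rules.get("run_cap") else None
    # Optional block 434: render per the determined display rules.
    if rules.get("render_thickness_mm"):
        series = render(series, rules["render_thickness_mm"], cap_result)
    return series, cap_result

# Placeholder CAP and rendering steps, for illustration only.
def detect_calcification(series):
    return {"calcification": any(s.get("calcified") for s in series)}

def rethick(series, thickness_mm, cap_result):
    return [dict(s, thickness_mm=thickness_mm) for s in series]

series = [{"thickness_mm": 0.5, "calcified": True}]
out, cap = process_series(
    series,
    site_rules={"render_thickness_mm": None},
    user_rules={"run_cap": True, "render_thickness_mm": 3.0},
    run_cap=detect_calcification,
    render=rethick,
)
```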
  • Next, at block 436, the system determines, based on one or more transfer and display rules, whether to transmit the images (e.g., rendered images) to one or more computing devices. If so, at block 440, the determined transfer rules are applied to the image and/or image series in order to determine how the images should be transferred and/or stored, if at all, to one or more receiving devices (e.g., a device that is scheduled to receive the particular image series or a device that requests the image series). At block 450, medical images, such as thin and thick slices that meet the transfer rules, are transferred to the viewing device.
  • At block 452, the system determines whether to display the rendered images, e.g., based on one or more display rules. If so, then at block 454 the system applies the display rules to the image and/or image series in order to determine how the images should be displayed by one or more viewing devices. Display of images according to display rules is described in further detail below.
  • Finally, at block 460, additional image processing may optionally be performed by the system, as further described below.
  • Smart Switching
  • In certain medical imaging systems, one or both of user-side and server-side applications operate upon a similar dataset of information, such as a dataset representing a series of medical images encompassing a volume of tissue. Such applications may be configured to render medical images from the dataset, for example, such as by using advanced processing of cross-sectional medical image series. However, if only a server-side rendering approach is used, the server typically stores a large amount of medical data, including datasets associated with medical images, because the server cannot anticipate what medical data will require such advanced processing. Thus, the costs for such server-side systems may be high. In addition, various simultaneous users may compete for server availability, and if the server goes down or becomes unavailable, all users may be adversely affected. Alternatively, if a pure client-side approach is employed, when a user needs to process a series of images, the images must first be loaded into the memory of the user's viewing device (client), which often involves transmission of the datasets to the client from a storing server, which may cause a delay of several seconds to several minutes or more as the dataset is made available to the client application. In addition, a pure client-side approach requires that the application be loaded and maintained on each client.
  • In one embodiment, certain viewing devices and/or image serving devices, such as a PACS server or EMR system, execute a smart switching software module. A smart switching application may include a client-based software application capable of processing data (in this case, rendering of CT, MRI, PET, Ultrasound, for example, as multiplanar or volume rendered images) and optionally a server-based software application with that capability residing on the same network (e.g., network 160 and/or 165 of FIGS. 1A and 1B). In one embodiment, the smart switching module on a viewing device is configured to detect the location of a desired dataset and, if the dataset is available on the rendering server, the processing operation(s) (e.g., rendering of the medical images) occurs there, and the results may be viewed on the viewing device via a thin-client viewer, for example. In this embodiment, if the dataset is not available on the rendering server, the rendering server is not available, is busy, and/or there is no rendering server, the dataset may be processed on (possibly after transmission to) the client-based smart switching module on the viewing device. In one embodiment, smart switching may be performed in response to application of transfer and display rules that are based on any attributes, such as any one or more of the properties listed in FIGS. 3A, 3B, and 3C, for example. In one embodiment, transfer and display rules may include attributes related to a particular user (or group of users) or site (or group of sites), such that transfer and display might be performed differently based on the current user and/or site, for example. Depending on the embodiment, such smart switching rules may be part of the transfer and display rules or may be a separate rule set.
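  • The core smart switching decision described above, preferring the rendering server when it exists, is available, and already holds the dataset, and otherwise falling back to the client, might be sketched as follows. The function and field names are illustrative assumptions.

```python
def choose_render_location(dataset_id, server):
    """Prefer server-side rendering when a rendering server exists,
    is available, and already holds the dataset; otherwise fall back
    to client-side rendering (transferring the dataset if needed)."""
    if server is not None and server["available"] and dataset_id in server["datasets"]:
        return "server"
    return "client"

server = {"available": True, "datasets": {"exam-123"}}
on_server = choose_render_location("exam-123", server)  # dataset cached on server
on_client = choose_render_location("exam-999", server)  # server lacks the dataset
no_server = choose_render_location("exam-123", None)    # no rendering server at all
```

A fuller sketch would also treat a busy server as unavailable and trigger a background download of the dataset on the client fallback path.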
  • In some embodiments, a user and/or viewing device may want selected image series transferred to a rendering server as opposed to (or in addition to) transfer to a viewing device. For example, a viewer may want to do side-by-side volume rendering of a patient's abdominal aortic aneurysm by comparing the current exam to one from 2 years ago, but the thin slices from the old exam are not in the cache of the rendering server. Thus, the transfer and display rules may indicate that the images from the 2 year old exam are transferred from a PACS server (that still stores the images) to a rendering server. Similarly, transfer and display rules might specify that thin slices are not transferred to a viewing device, but instead might specify that the series be transferred to a central rendering server for 3D processing. Any other set of transfer and display rules may be used to control transfer of images to a rendering server, one or more viewing devices, and/or other devices. For example, in the case of computer aided diagnosis (CAD), e.g., chest CT for nodules or breast imaging, transfer and display rules might specify that the images be transferred to a processing server (in this case for CAD) as well as the viewing device for viewing.
  • As an example, consider a CT examination that contains 4 series: one axial series with two hundred 4 mm thick slices, one coronal series with one hundred 3 mm thick slices, one sagittal series with one hundred 3 mm thick sections, and one axial series with sixteen hundred 0.5 mm thick sections. Using an example set of rules for defining thin slices, the latter axial series may be defined by the individual user or site as “thin” because of a slice thickness criterion of under 1 mm, or because of an images-per-series rule of >200 images.
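  • Under those two example criteria (slice thickness under 1 mm, or more than 200 images per series), classifying the four series might be sketched as below; the predicate name and metadata fields are illustrative.

```python
def is_thin(series, max_thickness_mm=1.0, max_images=200):
    """A series is 'thin' if its slices are thinner than the threshold
    or it contains more images than the per-series limit."""
    return (series["thickness_mm"] < max_thickness_mm
            or series["num_images"] > max_images)

exam = [
    {"name": "axial",      "thickness_mm": 4.0, "num_images": 200},
    {"name": "coronal",    "thickness_mm": 3.0, "num_images": 100},
    {"name": "sagittal",   "thickness_mm": 3.0, "num_images": 100},
    {"name": "axial-thin", "thickness_mm": 0.5, "num_images": 1600},
]
thin = [s["name"] for s in exam if is_thin(s)]
# Only the 0.5 mm axial series satisfies either criterion; the first
# axial series has exactly 200 images, which does not exceed the limit.
```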
  • As noted above, transfer and display rules may indicate a user's preference to not display the thin series when images are typically viewed for reading, or not display them under only certain reading circumstances such as when the client is running with a slow network connection. Suppose the user is working on a client that has not downloaded the thin slices and decides to perform additional image processing while viewing the exam (e.g., perhaps the user wishes to create oblique slices or volume-rendered images). When the user accesses a tool for such advanced processing, e.g., included in the client smart switching module, the system recognizes that the thin slices are not present locally, determines that the thin slices are on a connected rendering server, and enables the user to access the advanced processing application on the server via a thin client (or other connection type), with the rendering operations performed on the server side. Alternatively, if the smart switching application determines that there is no server-side rendering engine available, the smart switching module determines that the thin slices are available but will need to be downloaded, and may initiate download (and possibly background downloading) of the thin slices. Suppose now that the user is working on a client that has downloaded the thin slices into local memory, and decides to perform additional image processing on the thin slices. The smart switching modules may be configured to determine if there is a networked rendering server with a copy of the thin slices, and if so, initiate the processing operations on the server side immediately, with display on the client via a thin client viewer and, if there is no rendering server, make the locally downloaded thin slices available to the locally resident processing application instead.
  • Use of one or more smart switching modules may advantageously load-balance activities of clients and servers in a way that can be user defined and based on real-time availability of one or more servers, for example. Such load balancing may allow the rendering server to store fewer exams, while allowing for decreased processing times to process data sets that might normally be processed by a client when the server is available for processing. As noted above, a smart switching system for advanced image processing may be combined with rules of varying types, including rules for defining thin slices. In one embodiment, a smart switching module supports the following workflow: (1) if a rule is set (e.g., a user, site, device, or other rule) to show no thin slices, the thin slices are not displayed under the indicated rule preferences (e.g., site rules may indicate that thin slices are not to be downloaded or viewed at a home location), and only the thick slice images are presented; (2) if a user wants to have an exam automatically loaded in advanced visualization mode (or otherwise processed) by rule or switch to that mode, and the thin slices are available on the rendering server, the thin slices can be rendered by the rendering server with minimal delay (as compared to the delay in rendering that might be experienced if rendered on the client); and/or (3) if a user wants to have an exam automatically loaded in advanced visualization mode by rule or switch to that mode, but the images are not available on any rendering server (e.g., the rendering server(s) are busy, down, not available, or absent entirely), the local client is used to render the thin slices, such as by making the appropriate dataset available to the client processing application in background so that other work can be performed in the meantime.
  • Rendering Server Cluster or Grid
  • In some embodiments, a rendering server may comprise a cluster or grid of computing devices. For example, a rendering server may include many computing devices that could serve as rendering servers, including dedicated rendering servers and/or other computing devices that can also serve as a rendering server, such as a PACS workstation that renders images in a background process for other viewing devices.
  • For server side rendering, images may or may not be stored at each rendering device (where there are multiple possible rendering devices). For example, if there are multiple rendering devices capable of serving as a rendering server, it may not be efficient for each of them to have a copy of all the cases that might be subjected to rendering. Rendering may include any type of image generation or modification, such as 3D image rendering, 2D image rendering, adjusting a bit depth of images (e.g., converting 12-bit DICOM images into 8-bit JPEG images so they can be displayed in browsers and other light clients), and/or any other image manipulations. For ease of explanation, 3D rendering is discussed herein; however, any discussion of 3D rendering should be construed to also describe any other type of rendering.
  • In addition to each rendering device not having a copy of all image data, users may desire to render old exams that are not stored by any rendering device, for example to compare to new exams. In both cases, the images to be rendered may be transferred to the rendering device at the time of rendering. In one embodiment, a viewing device queries the group of rendering devices, or a device that load-balances rendering requests, with a request that the “fastest” rendering server provide processing. The “fastest” server might not be the fastest rendering device or least busy rendering device, but the one with the most rapid access to the data. For example, consider a series of imaging centers connected with a WAN, each with its own PACS server and 3D rendering device (could be the same machine). A user at site A might need to perform interactive 3D rendering of an exam at remote site B. While equivalent 3D rendering devices might exist at both site A and site B, the system may utilize the remote rendering server at site B because it has the fastest connection to the data, as the data resides locally at site B. Similarly, if images are stored at one rendering device, but not at others, the system may select the rendering device that already has a copy of the images for rendering, rather than another rendering device that might otherwise be used for the rendering.
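  • Selecting the "fastest" rendering server by proximity to the data rather than by raw compute, as in the site A/site B example above, might be sketched as follows. The data model (server name mapped to site and link speed) is an illustrative assumption.

```python
def pick_rendering_server(data_site, servers):
    """Prefer a rendering server co-located with the image data; if none
    is local, fall back to the server with the fastest link to the data.
    'servers' maps server name -> (site, link_speed_to_data_mbps)."""
    local = [name for name, (site, _) in servers.items() if site == data_site]
    if local:
        return local[0]
    # No co-located server: pick the fastest connection to the data.
    return max(servers, key=lambda name: servers[name][1])

servers = {
    "render-A": ("site-A", 50),      # remote from the data, slow WAN link
    "render-B": ("site-B", 10_000),  # co-located with the data
}
choice = pick_rendering_server("site-B", servers)  # the exam resides at site B
```

Even if render-A were the less busy machine, the co-located render-B is chosen because it has the most rapid access to the data.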
  • In one embodiment, rendering of multiple exams requested by a particular viewer/viewing device might occur on different rendering devices. For example, a viewer may need to render a current exam and an old exam, such as to compare the images of the exams. Suppose the old exam was performed at a hospital and the new exam at an imaging center, where it is also being read, and the viewer at the imaging center needs to perform 3D volumetric rendering of the two exams, for example to assess for growth of an abdominal aortic aneurysm. Rather than requiring rendering of both the current and old exams on a particular rendering device (e.g., at the imaging center), according to the systems and methods described herein, rendering of the old exam might be performed by a first rendering device (e.g., at the hospital where the exam is stored), while the new exam is rendered by a second rendering device (e.g., a client machine at the imaging center).
  • FIG. 5 is a block diagram of three sites that are configured to perform one or more of the operations noted above. In particular, FIG. 5 illustrates that site A 510 includes an image server and archive, a 3D rendering server, and a PACS workstation, while each of Sites B 520 and C 530 also include an image server and archive and a 3D rendering server. In one embodiment, the PACS workstation at site A 510 is operated by a viewer (e.g., radiologist or other doctor) in order to view medical images that are rendered by any of the site A, B, or C rendering servers. In other embodiments, the PACS workstation at site A may be replaced by any other viewing device, such as a desktop computer configured to access medical images via a browser or other software, a smart phone, a tablet computer, and/or laptop computer.
  • In the embodiment of FIG. 5, Site A has requested rendered images that include thin slice images (e.g., either in response to rules that initiate automatic rendering/transfer of the images and/or a request for the rendered images from a user of the PACS workstation at Site A 510, for example). However, rather than transferring the large set of thin slice image data from the image server at Site B 520 for rendering by the local rendering server at Site A 510, the workstation at Site A 510 utilizes the rendering server at Site B 520, which has fast local access to the thin slice image data. Thus, processing power at Site A 510 may be used for other purposes while Site B 520 renders the images. Alternatively, if the thin slice image data is stored local to Site A 510, the 3D images may be rendered locally at Site A 510. As discussed above, the rules for defining which device should perform rendering may be based on any number of factors.
  • In another example, there might be exams that a user at Site A 510 wants to render that are at remote sites B and C, which are connected via slow networks. Thus, one option is to have the exams transferred from these remote sites to the local 3D rendering server at Site A 510. However, transfer of the image data might be slow. Alternatively, using the smart switching methods described above, the rendering servers at Site B 520 and Site C 530 may be automatically utilized to render the images at those remote sites. In one embodiment, such remote rendering may be selected in response to rules, such as a rule that automatically initiates remote rendering of image data for images that are defined as thin slice images and/or based on the bandwidth to the remote rendering servers (e.g., at Sites B 520 and C 530).
  • In one embodiment, images may be rendered at multiple remote rendering devices (e.g., rendering servers at Sites B 520 and C 530) for viewing at a workstation (e.g., PACS workstation at Site A 510). For example, if image data for a first exam of a patient is locally accessible to Site B 520 and image data for a second exam of the patient is locally accessible to Site C 530, and a user of the workstation at Site A 510 wishes to compare rendered images from the first and second exams, images may be rendered by each of the rendering servers at Sites B 520 and C 530 using the respective locally available image data so that Site A 510 may not need to do any image rendering in order to view rendered images of two exams that have been rendered by different remote rendering servers.
  • Advantageously, distribution of rendering tasks in the manners discussed above may be performed based on rules that initiate rendering by local and/or remote rendering devices, such that the user does not need to manually coordinate rendering of various image data. In various embodiments, images may be rendered in locations determined based on user-defined rules, CAP results, attributes of a viewing device (e.g., screen resolution, location of the device), and/or various other factors, as described below.
  • Examples of Rules-Based Rendering of Medical Images
  • FIG. 6 illustrates additional example display rules of the present disclosure. With technological advances, some medical imaging modalities that previously produced only 2D images can now produce 3D images or image data, and/or 2D images may be rendered from 3D image data. For example, many ultrasound systems can obtain 3D image data of a fetus, which may be rendered into one or more 2D images. Different users may prefer rendering, transfer, and/or display of 2D slice images (e.g., from 3D medical imaging data) with specific slice thicknesses (and/or other image characteristics). For example, a user may prefer to view image slices with a 3 mm thickness instead of a 0.5-1 mm thickness. Such a preference may be associated with one or more image characteristics. Thus, for example, where an original image set does not have 3 mm thickness, a rendering device may, in response to rules set by the user, reformat the image set from its different thickness and display it as if the image set originally had 3 mm thickness. A user may provide one or more display rules to a rendering device, be it a server, a client, or in some cases a system switching between the two, which may retrieve said rules and reformat images based on the retrieved rules.
  • FIG. 6 provides example display rules having conditions (left column) and associated actions (right column). For example, rule 602 specifies that when a rendering device receives an image set in which each slice is less than 0.5 mm thick, it will reformat the image set to output slabs having a thickness of 3 mm. Rules can be flexible and can be defined to trigger on a wide variety of conditions. Some examples of triggering conditions include triggering when the input image set is associated with a specific plane (e.g., an oblique view condition 608), when an exam property/attribute shows that the patient is a high risk patient (e.g., a patient from a family with a history of cancer 610), and/or when an image set has too many slices (e.g., more than a thousand slices 612, making it difficult to visually manage).
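  • The condition/action pairing of FIG. 6 might be represented as a simple rule table, as in the sketch below. The exact encodings of rules 602 and 612 (predicate fields and action keys) are illustrative assumptions.

```python
# Each rule pairs a predicate over series metadata with a reformat
# action, mirroring the condition/action columns of FIG. 6.
display_rules = [
    # Rule 602 (illustrative): every slice under 0.5 mm -> 3 mm slabs.
    (lambda s: s["thickness_mm"] < 0.5, {"reformat_thickness_mm": 3.0}),
    # Rule 612 (illustrative): more than a thousand slices ->
    # 3 mm slabs with a 2 mm increment (i.e., overlapping slabs).
    (lambda s: s["num_slices"] > 1000,
     {"reformat_thickness_mm": 3.0, "increment_mm": 2.0}),
]

def actions_for(series):
    """Return the action of every rule whose condition the series meets."""
    return [action for condition, action in display_rules if condition(series)]

# A 0.3 mm, 1600-slice series triggers both rules.
matched = actions_for({"thickness_mm": 0.3, "num_slices": 1600})
```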
  • A rule may have a single condition or a combined condition (e.g., multiple conditions) that comes into effect only when multiple conditions are concurrently satisfied. Rule 604 is one such rule with a combined condition, in which a condition of the exam type being tomography is combined with a condition of each image slice having a thickness less than 0.5 mm. The rule 604 will come into effect, rendering 2 mm slabs, only upon encountering an image series that satisfies both criteria. FIG. 6 shows another example of a multiply-defined condition 606, where a patient's age is less than 30 years and the exam type is tomography. This condition will not trigger the reformat when a patient's age is 31.
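  • A combined condition such as that of rule 604 (exam type is tomography AND slice thickness under 0.5 mm) can be expressed as a conjunction of predicates, as in this illustrative sketch (the helper and encoding are assumptions):

```python
def combined(*conditions):
    """Build a condition that fires only when all sub-conditions hold."""
    return lambda series: all(cond(series) for cond in conditions)

# Hypothetical encoding of rule 604's combined condition.
rule_604_condition = combined(
    lambda s: s["exam_type"] == "tomography",
    lambda s: s["thickness_mm"] < 0.5,
)

fires = rule_604_condition({"exam_type": "tomography", "thickness_mm": 0.3})
no_fire = rule_604_condition({"exam_type": "CT", "thickness_mm": 0.3})
# Only the series satisfying both criteria triggers the rule.
```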
  • The system is not limited to reformatting slices from thin to thick and should not be construed as offering only a unidirectional process. The system is equally capable of reformatting slices from thick to thin (e.g., by generating thin slices from source volumetric/3D data, or from source thinner slices, etc.). A default rule may suggest 3 mm thick slabs but some user rules may prefer to reformat to 1 mm slices. A display rule for a high risk patient 610 may be one such situation, where reformatting to thin slices with small increments may allow the viewer to have a finer inspection of the image data. A user may specify any thickness and increment with rules similar to those shown in FIG. 3A. More specifically, rules 318 and 320 show an embodiment specifying thickness and increment, respectively.
  • In some embodiments, a user may specify a plurality of rules to be concurrently in effect. For example, the high risk patient rule 610 and the number of slices rule 612 can both apply to a same image set. In such situations, where multiple rules are concurrently in effect and the actions of the rules conflict, a prioritizing scheme that overrides one rule with the other may be implemented. For example, where rules 610 and 612 conflict, the system may give higher priority to the prior-defined rule 610 and ignore rule 612, or vice versa. In some embodiments, all triggered rules will run and the last defined rule may override all prior rules' renditions of images. Any rule reconciliation scheme with a determinable outcome may be adequate.
  • Some rules may specify “overlap” of slices along with reformatting thickness, as indicated by example display rules 606 and 612. Having overlapping slices may be helpful in providing a viewer with a sense of image continuity and/or allowing the user to perform some CAP to prepare the images for better analysis, such as filtering with a moving average filter to remove noise in the image sets. Overlap between slices is a function of both the slice thickness and the increment between the slices. For example, where the slice thickness is specified to be 3 mm and the increment 4 mm, there will be a 1 mm gap between the slices. Where thickness and increment are both specified to be 3 mm, sequential image slices will have neither gaps nor overlaps. Where slices are 3 mm and the increment is 2 mm, each slice will have an overlap of 1 mm on each end with an adjacent slice. A user may modify either parameter to reformat images and obtain desired overlaps. It should be apparent to a person skilled in the art that a slice may overlap more than one or two other slices. For example, a user specifying 0.5 mm thickness and 0.2 mm increment (rules 318 and 320) may get a slice that overlaps with at most 4 other slices.
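  • The gap/overlap arithmetic above (a positive thickness-minus-increment difference is overlap, a negative one is a gap) can be checked with a short sketch; the function names are illustrative.

```python
import math

def slice_overlap_mm(thickness_mm, increment_mm):
    """Positive: overlap between adjacent slices; negative: gap;
    zero: contiguous slices with neither gap nor overlap."""
    return thickness_mm - increment_mm

def neighbors_overlapped(thickness_mm, increment_mm):
    """How many other slices a given slice overlaps (both directions):
    a neighbor k increments away overlaps while k * increment < thickness."""
    per_side = math.ceil(thickness_mm / increment_mm) - 1
    return 2 * per_side

gap_case     = slice_overlap_mm(3.0, 4.0)      # 1 mm gap between slices
contiguous   = slice_overlap_mm(3.0, 3.0)      # neither gap nor overlap
overlap_case = slice_overlap_mm(3.0, 2.0)      # 1 mm overlap on each end
many         = neighbors_overlapped(0.5, 0.2)  # the 0.5 mm / 0.2 mm example
```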
  • The rules in FIG. 6 are by no means exhaustive, and rules bearing on other attributes or display environment variables are also available. An example of an attribute not specified in FIG. 6 is a “user type” (FIG. 3A gives an example of Radiologist as a value of this attribute). An example of a display environment variable is network speed. Additionally, transfer and display rules may also take into account other database entries or information in a DICOM header file, such as the patient's age. For example, a user may put forth rules to render slices of a breast tomosynthesis exam progressively thinner based on a patient's age, such as a first rule requesting 1.0 mm for patients 30 years old and above, a second rule requesting 0.9 mm for patients 27 to 30 years old, and a third rule requesting 0.8 mm for everyone else.
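  • The age-graded thickness rules at the end of that example might be sketched as an ordered, first-match lookup (thresholds copied from the example; the encoding and resolution of the overlapping boundary at age 30 are illustrative assumptions):

```python
def tomo_thickness_for_age(age_years):
    """First-match, ordered rules from the example: 1.0 mm for 30 and
    above, 0.9 mm for 27 up to 30, 0.8 mm for everyone else."""
    if age_years >= 30:
        return 1.0
    if age_years >= 27:
        return 0.9
    return 0.8

samples = {age: tomo_thickness_for_age(age) for age in (25, 28, 35)}
```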
  • In some examples, rules may incorporate other relevant information that pertains to a patient's risk factors, such as the age of the patient or a family history of a particular cancer or other diseases. A user may also create a rule based on other information in the DICOM header, such as view, and specify to render any oblique image data with 4 mm thickness. In some embodiments, these rules may be applied as default rules to image series. For example, if a rule applied as a default specifies presentation of 3 mm slabs where original images comprise 0.5 mm slices, the system may automatically reformat the 0.5 mm slices to 3 mm slabs. The system may automatically reformat images following an exam data generation event or in response to user requests (e.g., the user sends the system a command equivalent to “read image data”). Referring again to FIG. 4, having reconciled conflicting rules at block 430, the system may render image data compliant with predetermined rules at block 434. Here and elsewhere, the term “reformat” is not to be construed narrowly but broadly, to include, in a nonlimiting way, a decrease in resolution as well as an increase in resolution, applying interpolation algorithms or the like, in addition to the already discussed adjustments in slice thickness. Conversion to lower resolution, if not so low as to affect later analyses, may alleviate storage and bandwidth concerns.
  • In some embodiments, at block 460 of FIG. 4, a user may manually select one or more display rules and request the rendering device to apply the rules. An image set rendered at display block 454 may, in some instances, need to be adjusted by a user, and the system may allow the user to specify and manually apply temporary rules (that may later be saved and become permanent rules) to reformat images for a more in-depth analysis. For example, a user may identify a different complication while viewing an image set with a known abnormality and may want to reformat the image set with a rule that best shows the complication.
  • It should be noted that the example rules of FIG. 6 are not limited to display but should be construed to also include transfer rules. FIG. 6 shows an example rule structure illustrating the forms that various rules' conditions and functions may take. Depending on the nature of the action and its conditions, the system may apply a display rule and/or a transfer rule at blocks 432, 434, 440, 454, and/or 460. For example, a rule that takes network speed, data compression, and/or storage space as parameters at block 440 may be a transfer rule. In some embodiments, while a display rule may leave the original image data intact even though it displays only a subset of the image data, a transfer rule may cause transmission of only a subset of images. A display rule is unlikely to erase the original after display, but a transfer rule may cause an original image to be erased after transfer.
  • In various embodiments, as illustrated with blocks 614 and 616, rules may take results of CAP as a condition parameter. CAP actions are described in further detail below in reference to FIG. 7. For example, when the system encounters a rule with a condition requiring a CAP result, it executes blocks 702, 704, and/or 706 of the flowchart of FIG. 7 to obtain a result useful for evaluating the rule's condition (unless a relevant CAP result is already available from a previous execution of the FIG. 7 blocks). See the section on Computer-Aided Analysis and Rendering of Medical Images below for how an independent call to the FIG. 7 blocks may be made. With the CAP outcome, rules 614 and 616 may be checked and, if satisfied, blocks 708, 710, and 712 of FIG. 7 may be performed at block 434 of FIG. 4.
  • Examples of Rules-Based Processing and/or Presentation of Medical Images
  • Advantageously, according to certain embodiments, various anomalies may be more easily identified in medical images when the images are displayed with particular views and thicknesses. Accordingly, the system may allow a user to set rules, based on the modality, the nature of the exam, or other attributes, to initially display the image set in a helpful context. For example, a certain user may, through his/her experience with the system, find one configuration to work best and specify one or more rules catering to the user's specific purpose. Referring to FIG. 2, and as discussed herein, transfer and display rules may be based on one or more of exam attributes, series attributes, image attributes (e.g., image plane), prior exam display attributes, and/or various other attributes or criteria. Referring to FIG. 3A, example display rules may include specifying particular image planes, Field of View (FOV) 314 and 332, and window and level (W/L) 316. Additional examples are described below and shown in FIGS. 3B and 3C. By defining these types of display rules, a user may, for example, set up a rule for a 3D ultrasound image of a baby to initially display a particular view or image plane, with a particular slice thickness, and with particular window and/or level settings (among other settings).
  • These rules may provide (by specifying display parameters such as image plane, color, thickness, W/L, FOV, and numerous others) a uniform initial presentation of different image sets (a presentation, here, includes the concept of orientation and is defined to have a broader meaning than an orientation). Where many different viewers share a few viewing devices and each viewing device is frequented by numerous viewers, there are multiple benefits to having user-specific presentation of images. Advantageously, according to certain embodiments, the system enables a user to apply display rules for a specific orientation, zoom, or other transformation to best suit the image to the user's specific needs. It should be noted that these display rules may apply to transfers and storage as well. In some embodiments, a previous viewer may have closed an image set in a different presentation after viewing, but the system may allow the next viewer to specify a preference for displaying the image set in a familiar presentation. FIGS. 3B and 3C show additional example rules that specify initial presentations defaulting to a user default 326 or to a site default 330. Site rules and user rules may be reconciled at block 430 of FIG. 4 (as described herein), and any optional CAP processes are executed at block 432 of FIG. 4 (as described herein). Then, at block 434 of FIG. 4, the system renders an initial image presentation compliant with the display rules and any optional CAP action (as described herein). Depending on the decision path taken at block 452 of FIG. 4, the system presents the user with the rendered images.
  • There are other advantages to having rules that configure a presentation. For example, when printing an image set, having a default initial presentation may save the trouble of trying to match the presentation with prior presentations/displays. The rules need not generate an image having a single presentation, but may generate many initial images with many pre-defined presentations. For example, when analyzing 3D ultrasound images of a baby, instead of having to re-orient a 3D ultrasound image set for one image of each view, a user may define a rule or set of rules that can quickly display, transfer, and/or store one top view, one bottom view, two side views, and one perspective view.
  • In some embodiments where an exam image set may be identified as a subsequent image set related to (e.g., related to the same patient, medical diagnosis, image plane, modality, and/or any other attribute) an image set from one or more prior exams, a rendering device may choose display parameters that best mimic the display parameters from the prior exams. The display parameters may not match exactly, but the rendering device may choose a close approximation to present a similar presentation between the related exams. A rendering device may retrieve the viewing rules (e.g., slice thickness, plane, or other display parameters) and/or stored state (e.g., brightness or contrast) of the prior exam's viewing session and apply the rules and the state to the current exam image set. In some embodiments, a user may conveniently switch between similarly presented images from different exams in order to identify any growing abnormalities. In some embodiments, the display parameters may be user-specific, such that only exams previously displayed to the user may be used as a basis for determining display parameters for a new exam displayed to the user.
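The reuse of a prior exam's display state described above can be sketched as a lookup that falls back to defaults when no related prior session exists. The fields used to relate exams (patient and modality) and the stored state fields are illustrative assumptions:

```python
# Hypothetical sketch of reusing a prior exam's display state for a
# related follow-up exam, as described above. Field names are
# illustrative; a real system might match on many more attributes.

def display_params_for(new_exam, prior_sessions, defaults):
    """Pick display parameters from the most recent related prior
    viewing session, falling back to defaults when none matches."""
    related = [s for s in prior_sessions
               if s["patient_id"] == new_exam["patient_id"]
               and s["modality"] == new_exam["modality"]]
    if not related:
        return dict(defaults)
    latest = max(related, key=lambda s: s["date"])
    # approximate the prior presentation; fill gaps from defaults
    return {k: latest.get(k, defaults[k]) for k in defaults}

defaults = {"thickness_mm": 3.0, "window": 400, "level": 40}
prior = [{"patient_id": "P1", "modality": "CT", "date": "2016-01-05",
          "thickness_mm": 1.0, "window": 350, "level": 50}]
new_exam = {"patient_id": "P1", "modality": "CT"}
print(display_params_for(new_exam, prior, defaults))
# -> {'thickness_mm': 1.0, 'window': 350, 'level': 50}
```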
  • In some embodiments the display rules may further indicate particular image rendering methods, slice thicknesses, slice increments, and/or the like, based on any image attribute.
  • Examples of Computer-Aided Analysis and Rendering of Medical Images
  • In some embodiments, transfer and display rules may take into account one or more CAP (e.g., computer aided diagnoses (CAD)) related to medical images. For example, a CAP action on a mammogram may automatically reveal information on breast density, calcification formation, and/or existence of implants. After the CAP action, for better visual analysis, one or more rules may indicate that a dense breast is to be rendered with thinner slices (or slices of a particular thickness), while images with detected calcification might be rendered with thicker slices (or slices of a particular thickness). Images of patients with implants might be rendered differently still.
  • FIG. 4, as described above, includes blocks related to CAP actions. For example, at blocks 432 and 434 the system may optionally execute one or more CAP actions, and may optionally render images based on display and/or transfer rules related to results of the CAP. Additionally, transfer and/or display rules may indicate CAP-related criteria for display and/or transfer of images, as described herein.
  • FIG. 7 is a flowchart illustrating an embodiment of the present disclosure related to CAP actions and rendering of images more generally (as may be applicable to various transfer and display rules described above). Blocks 432, 434, 440, 454, and 460 may request FIG. 7 CAP actions, and/or may operate based on results of CAP actions. An example of the flow of FIG. 7 may be understood by reference to example CAP rule/action 810 of FIG. 8. At block 702, the system retrieves the rule (e.g., rule 810) and determines that this rule is to only apply to medical data having modality of MRI and brain exams. The retrieved rule specifies a condition that the exam data (or DICOM header) indicate an existence of a “dementia.”
  • Assuming the condition is satisfied, the rule requests the system to run an MRI Brain Volumetric Analysis at block 704. The system may, in the process, isolate the brain from other structures, such as the cranium, and color code the brain to highlight the result. At block 706, the system may then evaluate the CAP result and determine the brain to be significantly smaller than a healthy brain.
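The flow of blocks 702 through 706 just described can be sketched as follows, using rule 810 as the example. The rule and exam field names, and the stand-in volumetric analysis, are assumptions for illustration only:

```python
# Hypothetical sketch of the rule-810 flow described above: the rule
# applies only to MRI brain exams, its condition checks the clinical
# indication, and a CAP action runs when the condition is satisfied.

def evaluate_cap_rule(rule, exam, run_cap):
    """Blocks 702-706 in miniature: match modality/exam, test the
    condition, then run the CAP and return its result (or None)."""
    if exam["modality"] != rule["modality"] or exam["exam"] != rule["exam"]:
        return None                          # rule does not apply
    if rule["indication"] not in exam["clinical_indications"]:
        return None                          # condition not satisfied
    return run_cap(exam)                     # e.g. volumetric analysis

rule_810 = {"modality": "MRI", "exam": "Brain", "indication": "dementia"}
exam = {"modality": "MRI", "exam": "Brain",
        "clinical_indications": ["dementia"]}

# stand-in for "MRI Brain Volumetric Analysis" at block 704
fake_volumetrics = lambda exam: {"brain_volume": "abnormal"}

print(evaluate_cap_rule(rule_810, exam, fake_volumetrics))
# -> {'brain_volume': 'abnormal'}
```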
  • In various implementations, as described herein, CAP actions may involve more than simple morphologically based CAD. Rather, a CAP action may include, e.g., artificial intelligence and Bayesian analytics as determining factors in the rendering decision. Such CAP actions may be particularly useful in assessing lesion morphology, patient risk factors, and other clinical factors (e.g., in determining the relative suspicion of a breast cancer). Thus, in one example the system may include a rule that indicates that a lesion is to be rendered in a certain slab thickness, color, window/level, opacity, etc., based on a best depiction of the lesion characteristics that are most associated, e.g., with neoplasm risk (or some other clinical indication determined by a CAP action). Thus, in this example, a lesion that is suspicious (as determined by one or more CAP actions as mentioned herein) may be rendered as a slab of a certain obliquity and thickness to best show, e.g., a branching calcification, because the system determined (e.g., by a CAP action) that such a calcification is a likely sign of breast cancer. Furthermore, a user or other system may include a rule that indicates, e.g., "display the images so that the features that most raise suspicion are optimally displayed" or "only adjust the rendering when a lesion is found that has a >2% probability of being neoplastic" or "pre-display a collection of such rendered images that optimally show the most suspicious lesions in an exam in an order based on rank of suspicion."
  • In various embodiments, blocks 702, 704, and 706 may execute any number or combination of CAP actions (the description immediately above providing just one example).
  • At block 708, the system may determine an image rendering location.
  • As discussed in reference to various figures above, including FIG. 5, the rendering of the images may occur on the client side, the server side, or any capable computing device shown in FIGS. 1A and 1B. Generally, images are rendered on a server. However, a higher-resolution viewing device may require rendering at a higher resolution, so complete rendering on the server followed by a transfer could slow down local display. For this reason, if a file is small, it may be advantageous to transfer the small file for rendering locally. Other image set properties, such as the number of images to be transferred, the bit depth of the images, the original file size, and/or the rendered file size, may also be used as criteria in determining whether to transfer an image set. Further, any CAP that consumes significant computing power or needs extensive sets of prior exam data may provide a different rationale in the transfer decision-making process. For these reasons, a user may prefer to specify one or more rules to supplement the Smart Switching mechanism.
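One way to sketch such a render-location decision is shown below. The thresholds, parameter names, and the notion that CAP-heavy work pins rendering to the server are illustrative assumptions, not the actual Smart Switching mechanism:

```python
# Hypothetical sketch of a render-location decision like the one
# described above: small files are transferred and rendered locally,
# while large files or CAP-heavy work stay on the server.

def choose_render_location(file_size_mb, network_mbps,
                           cap_needs_server=False,
                           size_threshold_mb=50):
    """Return 'client' or 'server' for rendering an image set."""
    if cap_needs_server:
        return "server"  # heavy CAP or prior-exam data lives server-side
    # a small file over a fast link is cheaper to ship and render locally
    if file_size_mb <= size_threshold_mb and network_mbps >= 10:
        return "client"
    return "server"

print(choose_render_location(20, 100))                         # -> client
print(choose_render_location(500, 100))                        # -> server
print(choose_render_location(20, 100, cap_needs_server=True))  # -> server
```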
  • Referring to FIGS. 3A and 3B, rules 322, 324, and 328 are examples of user-specified location rules. In rule 322, a user has defined that if the user is accessing medical images from home, images are rendered on the server. With rule 324, the user has configured the server to store the re-rendered images. Rule 328 is an example of how a user may specify a render location based on different access location types. When a user is accessing the medical images at a hospital, rule 328 applies instead of rule 324. Referring again to FIG. 7, after a rendering location is determined, the rendering device determines relevant rendering parameters from rules, data attributes, and/or environment variables (such as display resolution) at block 710. When ready, the system renders images at block 712.
  • Referring to FIG. 4, at blocks 440 and 454 some display or transfer rules may be conditioned on results from execution of CAPs. Example rules 614 and 616 of FIG. 6 are of this type. Whenever the necessary CAP result does not yet exist, the system may execute the flow of FIG. 7 from within blocks 440 or 454 of the flowchart of FIG. 4 in order to obtain the CAP result. In the tissue density example, rule 614 specifies that images are to be reformatted/re-rendered to a thickness of 1 mm if the results of a CAP action indicate that tissue density is greater than 5. Advantageously, while the underlying image set may not provide tissue density information by itself, the system may automatically perform a CAP process to determine the tissue density by following the flowchart of FIG. 7. Blocks 702, 704, and 706 calculate tissue density, and blocks 708, 710, and 712 may even color-gradient the 3D image based on tissue density. In display rule 614, if the CAP result satisfies the tissue density condition, the system reformats the images to 1 mm at block 454.
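Rule 614's behavior, including the lazy execution of the FIG. 7 flow when the needed CAP result does not yet exist, can be sketched as follows. All names, and the use of a cache for CAP results, are illustrative assumptions:

```python
# Hypothetical sketch of display rule 614: if a CAP-computed tissue
# density exceeds 5, the image set is re-rendered at 1 mm thickness.
# A missing CAP result is computed on demand, mirroring the case where
# the FIG. 7 flow is executed from within block 440 or 454.

def thickness_under_rule_614(exam, cap_cache, compute_density):
    """If the density CAP result is missing, compute it (blocks
    702-706 in miniature); then apply the rule-614 threshold."""
    if "tissue_density" not in cap_cache:
        cap_cache["tissue_density"] = compute_density(exam)
    if cap_cache["tissue_density"] > 5:
        return 1.0                    # reformat to 1 mm at block 454
    return exam["thickness_mm"]       # otherwise keep current thickness

exam = {"thickness_mm": 3.0}
cache = {}
print(thickness_under_rule_614(exam, cache, lambda e: 6.4))  # -> 1.0
print(cache)  # -> {'tissue_density': 6.4}
```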
  • Thus, a user's tasks of scrutinizing images, sorting out a relevant subset of images, and making a diagnosis may be greatly sped up and made more accurate, as images may be automatically rendered at a thickness most beneficial to accurate reading of an exam. As explained above, example rule 614 indicates that the system is to reformat images to 1 mm slices. Such rules may be helpful in detecting abnormalities including breast cancer. For example, women with high breast density may be more likely to get breast cancer than women with low breast density. CAP results (e.g., a calculation of breast density) paired with a diagnostic rule (e.g., where tissue density is greater than 5) are a powerful tool that may greatly reduce viewers' time and effort in sifting through irrelevant images. In an embodiment, the system (as directed by one or more rules) may automatically vary image slice thickness (as rendered by the system) based on breast density (or other tissue density). For example, images of dense (or greater than average density) tissue (as detected by an automatic CAP action) may be rendered with thinner slice thickness (e.g., a default or base thickness), while images of less dense (or average or lower density) tissue (as detected by an automatic CAP action) may be rendered with thicker slices (e.g., the slice thickness may vary from a default value up to 10 mm). The opposite may also be true in another implementation.
  • Rule 614 considered only one CAP result (calculated tissue density) and compared it to a scalar value of 5. However, a user may specify more sophisticated rules that employ multiple CAPs. For example, rule 616 has a CAP searching for an abnormality. The CAP in rule 616 may be any of the CAPs in FIG. 8A, including spine fracture detection, lung nodule detection, lung parenchyma analysis, etc. A user may provide a list of CAPs to run in search of multiple types of abnormalities. For rule 616, when the system finds an abnormality, such as breast calcification, it may fuse together relevant slices of the image set and return a subset containing only the fused slices. The fused slices may more clearly show the suspected abnormality.
  • Referring to the flowchart of FIG. 4, at block 460 a user may choose to perform any further and optional image processing. In some embodiments, a user will simply select one or more predefined rules and activate the selected rules. In some embodiments, a user may create a new rule and activate the rule and/or save the rule in a storage device connected to the network. In some embodiments, a user may interactively re-orient, zoom, color, highlight, annotate, comment, or perform any semantically and/or visually enhancing operations. Some of these optional image processing or enhancing operations may utilize CAP.
  • All of the blocks where CAP can be called may trigger an alert and/or notification for the user. In some embodiments, the alert and/or notification is automatically transmitted to a device operated by the user associated with a corresponding trigger. The alert and/or notification can be transmitted at the time that the alert and/or notification is generated or at some determined time after generation of the alert and/or notification. When received by the device, the alert and/or notification can cause the device to display the alert and/or notification via the activation of an application on the device (e.g., a browser, a mobile application, etc.). For example, receipt of the alert and/or notification may automatically activate an application on the device, such as a messaging application (e.g., SMS or MMS messaging application), a standalone application (e.g., a health data monitoring application or collection management application used by a collection agency), or a browser, for example, and display information included in the alert and/or notification. If the device is offline when the alert and/or notification is transmitted, the application may be automatically activated when the device is online such that the alert and/or notification is displayed. As another example, receipt of the alert and/or notification may cause a browser to open and be redirected to a login page generated by the system so that the user can log in to the system and view the alert and/or notification. Alternatively, the alert and/or notification may include a URL of a webpage (or other online information) associated with the alert and/or notification, such that when the device (e.g., a mobile device) receives the alert, a browser (or other application) is automatically activated and the URL included in the alert and/or notification is accessed via the Internet.
  • FIG. 8A is a table illustrating additional example rules related to CAP actions that may be performed by the system. For example, various rules shown in FIG. 8A may be used to automatically determine one or more CAP to perform on an image, image series, and/or imaging exam. In this example, the table (which may be any other data structure in other embodiments) indicates associations between particular modalities (column 802), exam types (column 804), and CAP (column 806) that may be valuable to examination of the exam images. The table further includes a rules column 808 that includes rules for execution of the CAP indicated in column 806. The rules may indicate that certain CAP are performed automatically (for example, without any input from the doctor), automatically if certain conditions are met (for example, insurance covers the CAP, the exam has certain characteristics, a previous CAP has certain results, and the like), or after confirmation from a radiologist, for example. In the example rules 808, words in quotes indicate a clinical indication or history, such as "trauma." The rules may include other criteria for executing one or more CAP, for example based on one or more of:
  • Which CAP systems are available
    Exam characteristics, for example, MRI of spine vs. CT of brain
    Clinical information, for example, brain MRI where clinical question is dementia (one type of processing) vs. trauma (another type of processing)
    User preference
    Site preference
    Insurance approval
    Billable status
    Referring doctor's order
    Presence of comparison exam
    Whether or not a certain type of CAP was already performed on the exam and/or on a prior exam, for example:
  • If prior exam used CAD, automatically compare to result.
  • If prior exam used Quantitative Analysis, automatically compare to result.
  • Results of another CAP. For example, a rule may indicate that a particular CAP should be run if another specific CAP had a certain result (for example, another CAP had a result that was abnormal, normal, demonstrated a particular finding, demonstrated a measurement in a particular range, and/or the like).
    Status of another CAP. For example, a rule may indicate that two CAP should be performed, but that a second CAP should not be performed until the first CAP is complete. By way of example, “Brain aneurysm detection CAD” may require that a “3D Vessel tracking” CAP be run first, as “Brain aneurysm detection CAD” may process the results of “3D Vessel tracking” CAP. The last example rule listed in the example CAP Rules table of FIG. 8B (described below) illustrates another example in which three CAP are automatically run in a particular sequence in the event that two conditions are met.
  • In some embodiments certain results of a CAP may automatically trigger the scheduling of another CAP (for example, based on the rules in column 808). For example, the modality and exam in rule 810 is associated with Brain MRI exams (as indicated in columns 802 and 804), and the indicated CAP of “MRI brain volumetric analysis” is associated with a rule (column 808) indicating that the CAP is automatically performed when the clinical indication is “dementia.”
  • In some embodiments, scheduling of a particular CAP, either automatically or manually, may automatically cause one or more other CAP to be scheduled before or after that particular CAP. For example, exam rule 812 indicates that scheduling of “Brain aneurysm detection CAD” should result in the automatic scheduling of “3D Vessel tracking” CAP, and that “3D Vessel tracking” CAP should be run before “Brain aneurysm detection CAD”, for example because “Brain aneurysm detection CAD” involves processing the results of “3D Vessel tracking” CAP.
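The prerequisite ordering of rule 812 can be sketched as a small recursive scheduler that inserts each CAP's prerequisites ahead of it. The prerequisite table and the CAP names used as keys are illustrative assumptions:

```python
# Hypothetical sketch of prerequisite-aware CAP scheduling like rule
# 812: scheduling "Brain aneurysm detection CAD" automatically inserts
# "3D Vessel tracking" before it, since the CAD processes its results.

PREREQS = {"Brain aneurysm detection CAD": ["3D Vessel tracking"]}

def schedule_cap(cap, queue=None):
    """Return an execution order with each CAP's prerequisites first."""
    if queue is None:
        queue = []
    for prereq in PREREQS.get(cap, []):
        schedule_cap(prereq, queue)          # prerequisites run first
    if cap not in queue:                     # avoid duplicate scheduling
        queue.append(cap)
    return queue

print(schedule_cap("Brain aneurysm detection CAD"))
# -> ['3D Vessel tracking', 'Brain aneurysm detection CAD']
```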
  • In another example, the modality and exam in rule 811 is associated with Brain MRI exams (as indicated in columns 802 and 804), and the indicated CAP of "MRI brain CSF analysis" is associated with a rule (column 808) indicating that the CAP is automatically performed when the clinical indication is "hydrocephalus" or if an abnormal brain volumetric analysis resulted from another CAP (e.g., a result of running rule 810 for "dementia").
  • Thus, in an embodiment, the first CAP in rule 810 (“MRI Brain volumetric analysis”) may first be automatically performed on a brain MRI, such as in response to an indication of “dementia” in the MRI order from the referring doctor. Once the MRI brain volumetric analysis has been performed, the rules of FIG. 8A may again be applied to determine if one or more additional CAP should be performed. In this example, if the result of the MRI brain volumetric analysis is “abnormal” (or equivalent nomenclature), another CAP listed in rule 811 (MRI brain CSF analysis) is triggered for automated execution. Thus, in various embodiments, the rules may be configured to initiate execution of multiple CAP in response to results of previously performed CAP.
  • In one embodiment, a rules data structure may be used to determine which CAP are compatible and/or available for a particular one or more image series, such as based on various characteristics associated with the one or more image series. For example, a rules data structure comprising modality, exam, and CAD/processing, such as columns 802, 804, and 806 in the example of FIG. 8A, may be used to determine which of the various CAD/processing are compatible with medical images in particular exam modalities and exams. In one embodiment, this information may be presented to users. In the example of rows 810 and 811, “MRI brain volume analysis” and “MRI brain CSF analysis” are listed as compatible and/or available for MRI exams of the brain.
  • In various embodiments, different rules may apply to different users and/or different user groups (for example, based on preferences of the users and/or user groups). In any of the blocks where CAP rules may be performed, a user may run a CAD. CAD is a type of CAP that returns, in general, a Boolean or probabilistic result to an inquiry regarding the existence of a disease and/or abnormality. The example CAPs 810, 811, and 812 are a few examples of CADs. Some CAD rules may, in addition to the Boolean or probabilistic result, also provide an image set (or any other relevant data) for a user to verify and/or further investigate the CAD result. For instance, the example CAP 812 takes in the "3D Vessel Tracking" results, which may contain 3D vessel images of the brain, as the input to "Brain Aneurysm Detection CAD."
  • FIG. 8B is a table illustrating additional example rules related to CAP actions that may be performed by the system. For example, the rules of FIG. 8B may be used by the system to automatically determine one or more CAP to perform on an image or image series.
  • In various embodiments, rules related to CAP may be evaluated automatically, for example when:
  • An exam is completed on a scanner.
    An exam is communicated, for example, from a scanner to a PACS System or from a PACS System to a PACS Workstation.
    A CAP is performed, for example, such that the result of the CAP may automatically trigger performance of another CAP.
  • In various embodiments, evaluation of rules related to CAP may be performed on one or more computing devices, such as scanners, PACS Systems, PACS Workstations, and the like. Based on the evaluation of rules related to CAP, one or more CAP may be automatically executed.
  • Additional Implementation Details and Embodiments
  • Various embodiments of the present disclosure may be a system, a method, and/or a computer program product at any possible technical detail level of integration. The computer program product may include a computer readable storage medium (or mediums) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
  • For example, the functionality described herein may be performed as software instructions are executed by, and/or in response to software instructions being executed by, one or more hardware processors and/or any other suitable computing devices. The software instructions and/or other executable code may be read from a computer readable storage medium (or mediums).
  • The computer readable storage medium can be a tangible device that can retain and store data and/or instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device (including any volatile and/or non-volatile electronic storage devices), a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a solid state drive, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions (as also referred to herein as, for example, “code,” “instructions,” “module,” “application,” “software application,” and/or the like) for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages. Computer readable program instructions may be callable from other instructions or from itself, and/or may be invoked in response to detected events or interrupts. Computer readable program instructions configured for execution on computing devices may be provided on a computer readable storage medium, and/or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution) that may then be stored on a computer readable storage medium. Such computer readable program instructions may be stored, partially or fully, on a memory device (e.g., a computer readable storage medium) of the executing computing device, for execution by the computing device. The computer readable program instructions may execute entirely on a user's computer (e.g., the executing computing device), partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. 
In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart(s) and/or block diagram(s) block or blocks.
  • The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer may load the instructions and/or modules into its dynamic memory and send the instructions over a telephone, cable, or optical line using a modem. A modem local to a server computing system may receive the data on the telephone/cable/optical line and use a converter device including the appropriate circuitry to place the data on a bus. The bus may carry the data to a memory, from which a processor may retrieve and execute the instructions. The instructions received by the memory may optionally be stored on a storage device (e.g., a solid state drive) either before or after execution by the computer processor.
  • The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In addition, certain blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate.
  • It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions. For example, any of the processes, methods, algorithms, elements, blocks, applications, or other functionality (or portions of functionality) described in the preceding sections may be embodied in, and/or fully or partially automated via, electronic hardware such as application-specific processors (e.g., application-specific integrated circuits (ASICs)), programmable processors (e.g., field programmable gate arrays (FPGAs)), application-specific circuitry, and/or the like (any of which may also combine custom hard-wired logic, logic circuits, ASICs, FPGAs, etc. with custom programming/execution of software instructions to accomplish the techniques).
  • Any of the above-mentioned processors, and/or devices incorporating any of the above-mentioned processors, may be referred to herein as, for example, “computers,” “computer devices,” “computing devices,” “hardware computing devices,” “hardware processors,” “processing units,” and/or the like. Computing devices of the above-embodiments may generally (but not necessarily) be controlled and/or coordinated by operating system software, such as Mac OS, iOS, Android, Chrome OS, Windows OS (e.g., Windows XP, Windows Vista, Windows 7, Windows 8, Windows 10, Windows Server, etc.), Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other suitable operating systems. In other embodiments, the computing devices may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.
  • As described above, in various embodiments certain functionality may be accessible by a user through a web-based viewer (such as a web browser), or other suitable software program. In such implementations, the user interface may be generated by a server computing system and transmitted to a web browser of the user (e.g., running on the user's computing system). Alternatively, data (e.g., user interface data) necessary for generating the user interface may be provided by the server computing system to the browser, where the user interface may be generated (e.g., the user interface data may be executed by a browser accessing a web service and may be configured to render the user interfaces based on the user interface data). The user may then interact with the user interface through the web browser. User interfaces of certain implementations may be accessible through one or more dedicated software applications. In certain embodiments, one or more of the computing devices and/or systems of the disclosure may include mobile computing devices, and user interfaces may be accessible through such mobile computing devices (for example, smartphones and/or tablets).
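The two user-interface delivery modes described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: all function names and payload fields (e.g., `render_server_side`, `"studyId"`, `"widgets"`) are assumptions, and the browser-side rendering step is written in Python rather than JavaScript for brevity.

```python
import json

def render_server_side(study_id: str) -> str:
    """Mode 1: the server computing system generates the complete user
    interface (here, an HTML string) and transmits it to the browser."""
    return f"<html><body><h1>Study {study_id}</h1></body></html>"

def ui_data_for_client(study_id: str) -> str:
    """Mode 2: the server provides only the user interface data; code
    running in the browser renders the interface from that data."""
    return json.dumps({"studyId": study_id, "widgets": ["viewer", "toolbar"]})

def render_client_side(ui_data: str) -> str:
    """Browser-side rendering step for mode 2 (sketched in Python)."""
    data = json.loads(ui_data)
    return f"<html><body><h1>Study {data['studyId']}</h1></body></html>"
```

Either mode yields the same rendered interface; the second trades a heavier client for lighter server-side rendering work.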
  • Many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the systems and methods can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the systems and methods should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the systems and methods with which that terminology is associated.
  • Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
  • Conjunctive language such as the phrase “at least one of X, Y, and Z,” or “at least one of X, Y, or Z,” unless specifically stated otherwise, is to be understood with the context as used in general to convey that an item, term, etc. may be either X, Y, or Z, or a combination thereof. For example, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.
  • The term “a” as used herein should be given an inclusive rather than exclusive interpretation. For example, unless specifically noted, the term “a” should not be understood to mean “exactly one” or “one and only one”; instead, the term “a” means “one or more” or “at least one,” whether used in the claims or elsewhere in the specification and regardless of uses of quantifiers such as “at least one,” “one or more,” or “a plurality” elsewhere in the claims or specification.
  • The term “comprising” as used herein should be given an inclusive rather than exclusive interpretation. For example, a general purpose computer comprising one or more processors should not be interpreted as excluding other computer components, and may possibly include such components as memory, input/output devices, and/or network interfaces, among others.
  • While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it may be understood that various omissions, substitutions, and changes in the form and details of the devices or processes illustrated may be made without departing from the spirit of the disclosure. As may be recognized, certain embodiments of the inventions described herein may be embodied within a form that does not provide all of the features and benefits set forth herein, as some features may be used or practiced separately from others. The scope of certain inventions disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (20)

What is claimed is:
1. A computer-implemented method of automated analysis of medical images, the method comprising:
by one or more processors executing program instructions:
accessing a set of medical image data;
automatically analyzing the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data;
accessing a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and
applying the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
2. The computer-implemented method of claim 1 further comprising:
by one or more processors executing program instructions:
accessing a plurality of user-defined CAP rules;
identifying a CAP rule associated with the set of medical imaging data; and
determining the CAP action indicated by the rule.
3. The computer-implemented method of claim 2, wherein the CAP rule is associated with the set of medical imaging data based on at least one of: a modality, an anatomical region, or a medical indicator.
4. The computer-implemented method of claim 2, wherein the attribute comprises at least one of: a tissue density, or a presence of an abnormality or a suspected abnormality.
5. The computer-implemented method of claim 2 further comprising:
by one or more processors executing program instructions:
identifying a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and
fusing a group of medical images from the series of medical images in a region associated with the possible abnormality.
6. The computer-implemented method of claim 5 further comprising:
by the one or more processors executing program instructions:
automatically initiating display of the group of medical images of the series of medical images at a computing device.
7. The computer-implemented method of claim 2 further comprising:
by one or more processors executing program instructions:
determining a rendering location for rendering the series of medical images based on a second user-defined rule.
8. The computer-implemented method of claim 7, wherein the second user-defined rule indicates at least one of: a user associated with the user-defined rule, a location at which the series of medical images are to be displayed, or a characteristic of a device upon which the series of medical images are to be displayed.
9. A system comprising:
a computer readable storage medium having program instructions embodied therewith; and
one or more processors configured to execute the program instructions to cause the one or more processors to:
access a set of medical image data;
automatically analyze the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data;
access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and
apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
10. The system of claim 9, wherein the one or more processors are configured to execute the program instructions to further cause the one or more processors to:
access a plurality of user-defined CAP rules;
identify a CAP rule associated with the set of medical imaging data; and
determine the CAP action indicated by the rule.
11. The system of claim 10, wherein the CAP rule is associated with the set of medical imaging data based on at least one of: a modality, an anatomical region, or a medical indicator.
12. The system of claim 10, wherein the attribute comprises at least one of: a tissue density, or a presence of an abnormality or a suspected abnormality.
13. The system of claim 10, wherein the one or more processors are configured to execute the program instructions to further cause the one or more processors to:
identify a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and
fuse a group of medical images from the series of medical images in a region associated with the possible abnormality.
14. The system of claim 13, wherein the one or more processors are configured to execute the program instructions to further cause the one or more processors to:
automatically initiate display of the group of medical images of the series of medical images at a computing device.
15. The system of claim 10, wherein the one or more processors are configured to execute the program instructions to further cause the one or more processors to:
determine a rendering location for rendering the series of medical images based on a second user-defined rule.
16. The system of claim 15, wherein the second user-defined rule indicates at least one of: a user associated with the user-defined rule, a location at which the series of medical images are to be displayed, or a characteristic of a device upon which the series of medical images are to be displayed.
17. A computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to:
access a set of medical image data;
automatically analyze the set of medical image data by a CAP action to determine a value of an attribute of the set of medical image data;
access a user-defined rule indicating an association between the value of the attribute and a desired image slice thickness; and
apply the rule to the set of medical image data to render a series of medical images, wherein medical images of the series match the desired slice thickness.
18. The computer program product of claim 17, wherein the program instructions are executable by one or more processors to further cause the one or more processors to:
access a plurality of user-defined CAP rules;
identify a CAP rule associated with the set of medical imaging data; and
determine the CAP action indicated by the rule.
19. The computer program product of claim 18, wherein the program instructions are executable by one or more processors to further cause the one or more processors to:
identify a possible abnormality based on the analyzing of the set of medical image data by the CAP action; and
fuse a group of medical images from the series of medical images in a region associated with the possible abnormality.
20. The computer program product of claim 19, wherein the program instructions are executable by one or more processors to further cause the one or more processors to:
automatically initiate display of the group of medical images of the series of medical images at a computing device.
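The rules-based method recited in claim 1 (analyze image data with a CAP action to determine an attribute value, look up a user-defined rule associating that value with a desired slice thickness, and render the series accordingly) can be sketched as follows. This is an illustrative reading only: `CapRule`, `analyze_tissue_density`, `select_slice_thickness`, the 0.5 density threshold, and the 3.0 mm default are all hypothetical stand-ins, not names or values from the specification.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CapRule:
    """A user-defined rule associating an attribute value with a desired
    image slice thickness, per claim 1."""
    attribute: str
    value: str
    slice_thickness_mm: float

def analyze_tissue_density(image_data: List[List[float]]) -> str:
    """Stand-in for the CAP action of claim 1: determine the value of an
    attribute of the image data. Here a crude mean-intensity estimate
    classifies tissue density."""
    total = sum(sum(row) for row in image_data)
    count = sum(len(row) for row in image_data)
    return "dense" if total / count > 0.5 else "fatty"

def select_slice_thickness(image_data: List[List[float]],
                           rules: List[CapRule]) -> float:
    """Apply the first matching user-defined rule so that the rendered
    series matches the desired slice thickness."""
    value = analyze_tissue_density(image_data)
    for rule in rules:
        if rule.attribute == "tissue_density" and rule.value == value:
            return rule.slice_thickness_mm
    return 3.0  # hypothetical default thickness when no rule matches
```

For example, a rule set `[CapRule("tissue_density", "dense", 1.0)]` would cause denser image data to be rendered at thinner (1.0 mm) slices, mirroring the claimed association between attribute value and desired slice thickness.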
US15/469,296 2009-09-28 2017-03-24 Computer-aided analysis and rendering of medical images using user-defined rules Active US9934568B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/469,296 US9934568B2 (en) 2009-09-28 2017-03-24 Computer-aided analysis and rendering of medical images using user-defined rules

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US24647909P 2009-09-28 2009-09-28
US12/891,543 US8712120B1 (en) 2009-09-28 2010-09-27 Rules-based approach to transferring and/or viewing medical images
US14/179,328 US9042617B1 (en) 2009-09-28 2014-02-12 Rules-based approach to rendering medical imaging data
US14/687,853 US9386084B1 (en) 2009-09-28 2015-04-15 Selective processing of medical images
US15/163,600 US9501617B1 (en) 2009-09-28 2016-05-24 Selective display of medical images
US15/292,023 US9684762B2 (en) 2009-09-28 2016-10-12 Rules-based approach to rendering medical imaging data
US15/469,296 US9934568B2 (en) 2009-09-28 2017-03-24 Computer-aided analysis and rendering of medical images using user-defined rules

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/292,023 Continuation-In-Part US9684762B2 (en) 2009-09-28 2016-10-12 Rules-based approach to rendering medical imaging data

Publications (2)

Publication Number Publication Date
US20170200270A1 true US20170200270A1 (en) 2017-07-13
US9934568B2 US9934568B2 (en) 2018-04-03

Family

ID=50514288

Family Applications (8)

Application Number Title Priority Date Filing Date
US12/891,543 Active 2032-06-02 US8712120B1 (en) 2009-09-28 2010-09-27 Rules-based approach to transferring and/or viewing medical images
US14/179,328 Active US9042617B1 (en) 2009-09-28 2014-02-12 Rules-based approach to rendering medical imaging data
US14/687,853 Active US9386084B1 (en) 2009-09-28 2015-04-15 Selective processing of medical images
US15/163,600 Active US9501617B1 (en) 2009-09-28 2016-05-24 Selective display of medical images
US15/292,023 Active US9684762B2 (en) 2009-09-28 2016-10-12 Rules-based approach to rendering medical imaging data
US15/469,281 Active US10607341B2 (en) 2009-09-28 2017-03-24 Rules-based processing and presentation of medical images based on image plane
US15/469,296 Active US9934568B2 (en) 2009-09-28 2017-03-24 Computer-aided analysis and rendering of medical images using user-defined rules
US15/469,342 Active US9892341B2 (en) 2009-09-28 2017-03-24 Rendering of medical images using user-defined rules

Family Applications Before (6)

Application Number Title Priority Date Filing Date
US12/891,543 Active 2032-06-02 US8712120B1 (en) 2009-09-28 2010-09-27 Rules-based approach to transferring and/or viewing medical images
US14/179,328 Active US9042617B1 (en) 2009-09-28 2014-02-12 Rules-based approach to rendering medical imaging data
US14/687,853 Active US9386084B1 (en) 2009-09-28 2015-04-15 Selective processing of medical images
US15/163,600 Active US9501617B1 (en) 2009-09-28 2016-05-24 Selective display of medical images
US15/292,023 Active US9684762B2 (en) 2009-09-28 2016-10-12 Rules-based approach to rendering medical imaging data
US15/469,281 Active US10607341B2 (en) 2009-09-28 2017-03-24 Rules-based processing and presentation of medical images based on image plane

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/469,342 Active US9892341B2 (en) 2009-09-28 2017-03-24 Rendering of medical images using user-defined rules

Country Status (1)

Country Link
US (8) US8712120B1 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170301090A1 (en) * 2004-11-04 2017-10-19 D.R. Systems, Inc. Systems and methods for interleaving series of medical images
US9892341B2 (en) 2009-09-28 2018-02-13 D.R. Systems, Inc. Rendering of medical images using user-defined rules
CN108968892A (en) * 2018-07-26 2018-12-11 武汉大学人民医院(湖北省人民医院) System and method for monitoring blind areas during colonoscopy
US20180374234A1 (en) * 2017-06-27 2018-12-27 International Business Machines Corporation Dynamic image and image marker tracking
US10269149B2 (en) * 2016-02-19 2019-04-23 Fujifilm Corporation Tomographic image generation device, radiography imaging system, tomographic image generation method and tomographic image generation program storage medium
US10346981B2 (en) * 2016-11-04 2019-07-09 Eric Kenneth Anderson System and method for non-invasive tissue characterization and classification
US10437444B2 2004-11-04 2019-10-08 Merge Healthcare Solutions Inc. Systems and methods for viewing medical images
US10540763B2 (en) 2004-11-04 2020-01-21 Merge Healthcare Solutions Inc. Systems and methods for matching, naming, and displaying medical images
US10607735B2 (en) * 2016-09-06 2020-03-31 International Business Machines Corporation Hybrid rendering system for medical imaging applications
US10614615B2 (en) 2004-11-04 2020-04-07 Merge Healthcare Solutions Inc. Systems and methods for viewing medical 3D imaging volumes
US10790057B2 (en) 2004-11-04 2020-09-29 Merge Healthcare Solutions Inc. Systems and methods for retrieval of medical data
US10896745B2 (en) 2006-11-22 2021-01-19 Merge Healthcare Solutions Inc. Smart placement rules
US10929508B2 (en) 2015-04-30 2021-02-23 Merge Healthcare Solutions Inc. Database systems and interactive user interfaces for dynamic interaction with, and indications of, digital medical image data
US11094416B2 (en) 2013-01-09 2021-08-17 International Business Machines Corporation Intelligent management of computerized advanced processing
DE102020111776A1 (en) 2020-04-30 2021-11-04 Schölly Fiberoptic GmbH Method for calculating an output image and / or a sequence of output images from raw image data, image recording device, image recording system and overall system
US20220022833A1 (en) * 2013-03-15 2022-01-27 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US11386588B2 (en) * 2016-12-27 2022-07-12 Sony Corporation Product design system and design image correction apparatus
US11429788B2 (en) * 2018-02-26 2022-08-30 Nippon Telegraph And Telephone Corporation Summary evaluation device, method, program and storage medium
US11468980B2 (en) * 2019-01-02 2022-10-11 Healthy.Io Ltd. Home testing data automatically changes insurance status
US11468979B2 (en) * 2020-02-06 2022-10-11 Ebm Technologies Incorporated Integrated system for picture archiving and communication system and computer aided diagnosis
US11587680B2 (en) * 2019-06-19 2023-02-21 Canon Medical Systems Corporation Medical data processing apparatus and medical data processing method
US20230076821A1 (en) * 2021-09-08 2023-03-09 Ai Metrics, Llc Systems and methods for facilitating image finding analysis

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8380533B2 (en) 2008-11-19 2013-02-19 DR Systems Inc. System and method of providing dynamic and customizable medical examination forms
US9378331B2 (en) 2010-11-19 2016-06-28 D.R. Systems, Inc. Annotation and assessment of images
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9836485B2 (en) 2011-02-25 2017-12-05 International Business Machines Corporation Auditing database access in a distributed medical computing environment
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9779376B2 (en) * 2011-07-13 2017-10-03 International Business Machines Corporation Dynamically allocating business workflows
US9092727B1 (en) 2011-08-11 2015-07-28 D.R. Systems, Inc. Exam type mapping
CA2809549A1 (en) * 2012-03-14 2013-09-14 Ali Asaria Systems and methods for transmitting and rendering 3d visualizations over a network
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
WO2013176762A1 (en) 2012-05-22 2013-11-28 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
CN104350522B (en) * 2012-06-11 2020-12-29 索尼公司 Information processing device, information processing system, information processing method, and program
US20140074913A1 (en) * 2012-09-10 2014-03-13 Calgary Scientific Inc. Client-side image rendering in a client-server image viewing architecture
US10540803B2 (en) * 2013-03-15 2020-01-21 PME IP Pty Ltd Method and system for rule-based display of sets of images
KR101633911B1 (en) * 2014-03-10 2016-06-27 (주)아이알엠 Method and apparatus for generating medical information of object
US10127662B1 (en) * 2014-08-11 2018-11-13 D.R. Systems, Inc. Systems and user interfaces for automated generation of matching 2D series of medical images and efficient annotation of matching 2D medical images
US9652846B1 (en) * 2015-10-22 2017-05-16 International Business Machines Corporation Viewpoint recognition in computer tomography images
JP2017099616A (en) * 2015-12-01 2017-06-08 ソニー株式会社 Surgical control device, surgical control method and program, and surgical system
US10296713B2 (en) * 2015-12-29 2019-05-21 Tomtec Imaging Systems Gmbh Method and system for reviewing medical study data
US10257174B2 (en) * 2016-01-20 2019-04-09 Medicom Technologies, Inc. Methods and systems for providing secure and auditable transfer of encrypted data between remote locations
US20180032675A1 (en) * 2016-07-29 2018-02-01 Lutz Dominick Automated application selection for medical devices
US10496789B2 (en) * 2016-09-02 2019-12-03 Siemens Healthcare Gmbh Display of preloaded predicted data
US10282918B2 (en) * 2016-09-20 2019-05-07 Siemens Healthcare Gmbh Two-dimensional cinematic medical imaging in color based on deep learning
US10663711B2 (en) 2017-01-04 2020-05-26 Corista, LLC Virtual slide stage (VSS) method for viewing whole slide images
CN108462817B (en) * 2017-02-22 2021-01-05 佳能株式会社 Communication apparatus, control method thereof, and storage medium
US10188361B2 (en) * 2017-03-27 2019-01-29 Siemens Healthcare Gmbh System for synthetic display of multi-modality data
US20190043441A1 (en) * 2017-08-07 2019-02-07 International Business Machines Corporation Automatically adjusting a display property of data to reduce impaired visual perception
US10886029B2 (en) * 2017-11-08 2021-01-05 International Business Machines Corporation 3D web-based annotation
US11191526B2 (en) * 2018-02-06 2021-12-07 Samsung Medison Co., Ltd. Ultrasound diagnosis apparatus and method of controlling the same
WO2019178617A1 (en) * 2018-03-15 2019-09-19 Digibrain4, Inc. Dento-craniofacial clinical cognitive diagnosis and treatment system and method
AU2019251343A1 (en) * 2018-04-10 2020-11-26 Blockchain Goose Inc Systems, apparatuses, and methods for assessing, managing, presenting and indicating health status through physical objects
JP7458328B2 (en) 2018-05-21 2024-03-29 コリスタ・エルエルシー Multi-sample whole-slide image processing via multi-resolution registration
US10643746B2 (en) * 2018-08-17 2020-05-05 Fujifilm Medical Systems U.S.A., Inc. Image viewer
US10910098B2 (en) 2018-12-11 2021-02-02 International Business Machines Corporation Automatic summarization of medical imaging studies
US11707276B2 (en) * 2020-09-08 2023-07-25 Covidien Lp Surgical buttress assemblies and techniques for surgical stapling
US11403820B1 (en) 2021-03-11 2022-08-02 International Business Machines Corporation Predictive rendering of an image
WO2023070157A1 (en) * 2021-10-27 2023-05-04 Annalise-Ai Pty Ltd Methods and systems for automated analysis of medical images with injection of clinical ranking

Family Cites Families (380)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60196856A (en) 1984-03-20 1985-10-05 Olympus Optical Co Ltd Picture retrieval registering system
US5179651A (en) 1988-11-08 1993-01-12 Massachusetts General Hospital Apparatus for retrieval and processing of selected archived images for display at workstation terminals
US5123056A (en) 1990-02-02 1992-06-16 Siemens Medical Systems, Inc. Whole-leg x-ray image processing and display techniques
US5172419A (en) 1991-03-05 1992-12-15 Lumisys, Inc. Medical image processing system
US5779634A (en) 1991-05-10 1998-07-14 Kabushiki Kaisha Toshiba Medical information processing system for supporting diagnosis
US5384862A (en) 1992-05-29 1995-01-24 Cimpiter Corporation Radiographic image evaluation apparatus and method
US5734915A (en) 1992-11-25 1998-03-31 Eastman Kodak Company Method and apparatus for composing digital medical imagery
US5452416A (en) 1992-12-30 1995-09-19 Dominator Radiology, Inc. Automated system and a method for organizing, presenting, and manipulating medical images
EP0616290B1 (en) 1993-03-01 2003-02-05 Kabushiki Kaisha Toshiba Medical information processing system for supporting diagnosis.
US5431161A (en) 1993-04-15 1995-07-11 Adac Laboratories Method and apparatus for information acquisition, processing, and display within a medical camera system
US5515375A (en) * 1993-07-30 1996-05-07 Motorola, Inc. Method and apparatus for multiplexing fixed length message data and variably coded speech
US5542003A (en) 1993-09-13 1996-07-30 Eastman Kodak Method for maximizing fidelity and dynamic range for a region of interest within digitized medical image display
JP3277677B2 (en) * 1994-04-01 2002-04-22 ソニー株式会社 Signal encoding method and apparatus, signal recording medium, signal transmission method, and signal decoding method and apparatus
JP3544557B2 (en) 1994-04-08 2004-07-21 オリンパス株式会社 Image file device
EP0813720A4 (en) 1995-03-03 1998-07-01 Arch Dev Corp Method and system for the detection of lesions in medical images
US5857030A (en) 1995-08-18 1999-01-05 Eastman Kodak Company Automated method and system for digital image processing of radiologic images utilizing artificial neural networks
DE19620371A1 (en) 1996-05-21 1997-12-04 Philips Patentverwaltung X-ray procedure
US6222939B1 (en) 1996-06-25 2001-04-24 Eyematic Interfaces, Inc. Labeled bunch graphs for image analysis
JP3363735B2 (en) 1996-06-26 2003-01-08 松下電器産業株式会社 X-ray imaging device
US6128002A (en) 1996-07-08 2000-10-03 Leiper; Thomas System for manipulation and display of medical images
US6820093B2 (en) 1996-07-30 2004-11-16 Hyperphrase Technologies, Llc Method for verifying record code prior to an action based on the code
US6272235B1 (en) 1997-03-03 2001-08-07 Bacus Research Laboratories, Inc. Method and apparatus for creating a virtual microscope slide
US5924074A (en) 1996-09-27 1999-07-13 Azron Incorporated Electronic medical records system
US5986662A (en) 1996-10-16 1999-11-16 Vital Images, Inc. Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging
US6115486A (en) 1996-11-06 2000-09-05 Quinton Instrument Company Teleradiology system for the storage and transmission of angiographic and related image sequences
JP3878259B2 (en) 1996-11-13 2007-02-07 東芝医用システムエンジニアリング株式会社 Medical image processing device
US5987345A (en) 1996-11-29 1999-11-16 Arch Development Corporation Method and system for displaying medical images
US6243095B1 (en) 1996-12-05 2001-06-05 Peter E. Shile Navigation and display system for digital radiographs
US6151581A (en) 1996-12-17 2000-11-21 Pulsegroup Inc. System for and method of collecting and populating a database with physician/patient data for processing to improve practice quality and healthcare delivery
US6358504B1 (en) 1997-02-07 2002-03-19 Emisphere Technologies, Inc. Compounds and compositions for delivering active agents
JPH10276461A (en) 1997-03-28 1998-10-13 Sharp Corp Receiver
US5926568A (en) 1997-06-30 1999-07-20 The University Of North Carolina At Chapel Hill Image object matching using core analysis and deformable shape loci
US5995644A (en) 1997-06-30 1999-11-30 Siemens Corporate Research, Inc. Robust and automatic adjustment of display window width and center for MR images
US6008813A (en) 1997-08-01 1999-12-28 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Real-time PC based volume rendering system
US5999639A (en) 1997-09-04 1999-12-07 Qualia Computing, Inc. Method and system for automated detection of clustered microcalcifications from digital mammograms
US6606171B1 (en) 1997-10-09 2003-08-12 Howtek, Inc. Digitizing scanner
US6630937B2 (en) 1997-10-30 2003-10-07 University Of South Florida Workstation interface for use in digital mammography and associated methods
US6130671A (en) 1997-11-26 2000-10-10 Vital Images, Inc. Volume rendering lighting using dot product methodology
US6175643B1 (en) 1997-12-18 2001-01-16 Siemens Corporate Research, Inc. Neural network based auto-windowing system for MR images
US6775670B2 (en) 1998-05-29 2004-08-10 Luc Bessette Method and apparatus for the management of data files
JPH11282937A (en) 1998-03-31 1999-10-15 Fuji Photo Film Co Ltd Medical network system
US6313841B1 (en) 1998-04-13 2001-11-06 Terarecon, Inc. Parallel volume rendering system with a resampling module for parallel and perspective projections
US6056691A (en) 1998-06-24 2000-05-02 Ecton, Inc. System for collecting ultrasound imaging data at an adjustable collection image frame rate
US6438533B1 (en) 1998-10-30 2002-08-20 College Of American Pathologists System for retrieval of information from data structure of medical records
US6427022B1 (en) 1998-11-10 2002-07-30 Western Research Company, Inc. Image comparator system and method for detecting changes in skin lesions
US6369816B1 (en) 1998-11-12 2002-04-09 Terarecon, Inc. Method for modulating volume samples using gradient magnitudes and complex functions over a range of values
US6404429B1 (en) 1998-11-12 2002-06-11 Terarecon, Inc. Method for modulating volume samples with gradient magnitude vectors and step functions
US6411296B1 (en) 1998-11-12 2002-06-25 Terarecon, Inc. Method and apparatus for applying modulated lighting to volume data in a rendering pipeline
US6266733B1 (en) 1998-11-12 2001-07-24 Terarecon, Inc. Two-level mini-block storage system for volume data sets
US6211884B1 (en) 1998-11-12 2001-04-03 Mitsubishi Electric Research Laboratories, Inc Incrementally calculated cut-plane region for viewing a portion of a volume data set in real-time
US6356265B1 (en) 1998-11-12 2002-03-12 Terarecon, Inc. Method and apparatus for modulating lighting with gradient magnitudes of volume data in a rendering pipeline
US6342885B1 (en) 1998-11-12 2002-01-29 Terarecon, Inc. Method and apparatus for illuminating volume data in a rendering pipeline
US6512517B1 (en) 1998-11-12 2003-01-28 Terarecon, Inc. Volume rendering integrated circuit
US6297799B1 (en) 1998-11-12 2001-10-02 James Knittel Three-dimensional cursor for a real-time volume rendering system
US6426749B1 (en) 1998-11-12 2002-07-30 Terarecon, Inc. Method and apparatus for mapping reflectance while illuminating volume data in a rendering pipeline
US6081267A (en) 1998-11-19 2000-06-27 Columbia Scientific Incorporated Computerized apparatus and method for displaying X-rays and the like for radiological analysis and manipulation and transmission of data
US6424996B1 (en) 1998-11-25 2002-07-23 Nexsys Electronics, Inc. Medical network system and method for transfer of information
US6603494B1 (en) 1998-11-25 2003-08-05 Ge Medical Systems Global Technology Company, Llc Multiple modality interface for imaging systems including remote services over a network
US6674449B1 (en) 1998-11-25 2004-01-06 Ge Medical Systems Global Technology Company, Llc Multiple modality interface for imaging systems
US6310620B1 (en) 1998-12-22 2001-10-30 Terarecon, Inc. Method and apparatus for volume rendering with multiple depth buffers
US6574629B1 (en) 1998-12-23 2003-06-03 Agfa Corporation Picture archiving and communication system
US6556695B1 (en) 1999-02-05 2003-04-29 Mayo Foundation For Medical Education And Research Method for producing high resolution real-time images, of structure and function during medical procedures
US6532311B1 (en) 1999-02-26 2003-03-11 Lockheed Martin Corporation Image browser
US6532299B1 (en) 2000-04-28 2003-03-11 Orametrix, Inc. System and method for mapping a surface
US6697506B1 (en) 1999-03-17 2004-02-24 Siemens Corporate Research, Inc. Mark-free computer-assisted diagnosis method and system for assisting diagnosis of abnormalities in digital medical images using diagnosis based image enhancement
US7107253B1 (en) 1999-04-05 2006-09-12 American Board Of Family Practice, Inc. Computer architecture and process of patient generation, evolution and simulation for computer based testing system using bayesian networks as a scripting language
US6351547B1 (en) 1999-04-28 2002-02-26 General Electric Company Method and apparatus for formatting digital images to conform to communications standard
US6388687B1 (en) 1999-04-28 2002-05-14 General Electric Company Operator-interactive display menu showing status of image transfer to remotely located devices
US6407737B1 (en) 1999-05-20 2002-06-18 Terarecon, Inc. Rendering a shear-warped partitioned volume data set
US6424346B1 (en) 1999-07-15 2002-07-23 Terarecon, Inc. Method and apparatus for mapping samples in a rendering pipeline
US6476810B1 (en) 1999-07-15 2002-11-05 Terarecon, Inc. Method and apparatus for generating a histogram of a volume data set
US6421057B1 (en) 1999-07-15 2002-07-16 Terarecon, Inc. Configurable volume rendering pipeline
US6785410B2 (en) 1999-08-09 2004-08-31 Wake Forest University Health Sciences Image reporting method and system
US6697067B1 (en) 1999-09-28 2004-02-24 Cedara Software Corp. Method and system for storing information regarding a selected view of a three dimensional image generated from a multi-frame object
US6654012B1 (en) 1999-10-01 2003-11-25 Terarecon, Inc. Early ray termination in a parallel pipelined volume rendering system
US20020190984A1 (en) 1999-10-01 2002-12-19 Larry D. Seiler Voxel and sample pruning in a parallel pipelined volume rendering system
US6909436B1 (en) 1999-10-27 2005-06-21 The Board Of Supervisors Of Louisiana State University And Agricultural & Mechanical College Radiologist workstation
US6621918B1 (en) 1999-11-05 2003-09-16 H Innovation, Inc. Teleradiology systems for rendering and visualizing remotely-located volume data sets
US6734880B2 (en) 1999-11-24 2004-05-11 Stentor, Inc. User interface for a medical informatics systems
US20020044696A1 (en) 1999-11-24 2002-04-18 Sirohey Saad A. Region of interest high resolution reconstruction for display purposes and a novel bookmarking capability
US6556724B1 (en) 1999-11-24 2003-04-29 Stentor Inc. Methods and apparatus for resolution independent image collaboration
US7263710B1 (en) 1999-12-31 2007-08-28 General Electric Company Medical diagnostic system with on-line real-time video training
US6383135B1 (en) 2000-02-16 2002-05-07 Oleg K. Chikovani System and method for providing self-screening of patient symptoms
JP2002165787A (en) 2000-02-22 2002-06-11 Nemoto Kyorindo:Kk Medical tomogram display device
US6988075B1 (en) 2000-03-15 2006-01-17 Hacker L Leonard Patient-controlled medical information system and method
US6665709B1 (en) 2000-03-27 2003-12-16 Securit-E-Doc, Inc. Method, apparatus, and system for secure data transport
US6778689B1 (en) 2000-03-29 2004-08-17 General Electric Company System and method of real-time multiple field-of-view imaging
FR2807543B1 (en) 2000-04-06 2004-11-05 Imstar S A IMAGING APPARATUS ASSOCIATED WITH AN IMAGE DATABASE
CA2304978C (en) 2000-04-13 2008-02-12 Giuseppe Milioto Limb extremity positioning device and measurement method
US6618060B1 (en) 2000-04-24 2003-09-09 Ge Medical Systems Global Technology Company, Llc Method and apparatus for formatting digital images in accordance with user-selected communications standard
US6711283B1 (en) 2000-05-03 2004-03-23 Aperio Technologies, Inc. Fully automatic rapid microscope slide scanner
JP2001346031A (en) 2000-06-05 2001-12-14 Fuji Photo Film Co Ltd Method and device for compositing image
US6304667B1 (en) 2000-06-21 2001-10-16 Carmen T. Reitano System and method for incorporating dyslexia detection in handwriting pattern recognition systems
US20020016718A1 (en) 2000-06-22 2002-02-07 Rothschild Peter A. Medical image management system and method
US8538770B2 (en) 2000-08-01 2013-09-17 Logical Images, Inc. System and method to aid diagnoses using cross-referenced knowledge and image databases
US20050027570A1 (en) 2000-08-11 2005-02-03 Maier Frith Ann Digital image collection and library system
DE10041165B4 (en) 2000-08-21 2005-07-07 Leica Microsystems Heidelberg Gmbh Method and arrangement for controlling the analysis and adjustment processes of a microscope
US20020091659A1 (en) 2000-09-12 2002-07-11 Beaulieu Christopher F. Portable viewing of medical images using handheld computers
US20030013951A1 (en) 2000-09-21 2003-01-16 Dan Stefanescu Database organization and searching
US6760755B1 (en) 2000-09-22 2004-07-06 Ge Medical Systems Global Technology Company, Llc Imaging system with user-selectable prestored files for configuring communication with remote devices
US7031504B1 (en) 2000-09-26 2006-04-18 Vital Images, Inc. Image data based retrospective temporal selection of medical images
JP4176299B2 (en) 2000-09-29 2008-11-05 富士フイルム株式会社 Medical image display system
JP2002111987A (en) 2000-09-29 2002-04-12 Fuji Photo Film Co Ltd Image managing system and method for managing image
US6680735B1 (en) 2000-10-04 2004-01-20 Terarecon, Inc. Method for correcting gradients of irregular spaced graphic data
US6614447B1 (en) 2000-10-04 2003-09-02 Terarecon, Inc. Method and apparatus for correcting opacity values in a rendering pipeline
US7106479B2 (en) 2000-10-10 2006-09-12 Stryker Corporation Systems and methods for enhancing the viewing of medical images
US7257832B2 (en) 2000-10-16 2007-08-14 Heartlab, Inc. Medical image capture system and method
US6678764B2 (en) 2000-10-20 2004-01-13 Sony Corporation Medical image processing system
US6925200B2 (en) 2000-11-22 2005-08-02 R2 Technology, Inc. Graphical user interface for display of anatomical information
US7103205B2 (en) * 2000-11-24 2006-09-05 U-Systems, Inc. Breast cancer screening with ultrasound image overlays
US7113186B2 (en) 2000-11-25 2006-09-26 Infinitt Co., Ltd. 3 Dimensional slab rendering system method and computer-readable medium
US7027633B2 (en) 2000-11-30 2006-04-11 Foran David J Collaborative diagnostic systems
FR2818781B1 (en) 2000-12-22 2003-03-21 Ge Med Sys Global Tech Co Llc METHOD OF SIMULTANEOUSLY DISPLAYING ORGAN IMAGES
US20020090119A1 (en) 2001-01-08 2002-07-11 Motoaki Saito Displaying multiple slice images
US20020103827A1 (en) 2001-01-26 2002-08-01 Robert Sesek System and method for filling out forms
US20020103673A1 (en) 2001-02-01 2002-08-01 Atwood Lindsay T. Methods and apparatus for facilitating the provision of services
TW514781B (en) 2001-02-15 2002-12-21 Way Tech Dev Inc Image management system and method providing real-time internet driving function
DE10112307A1 (en) 2001-03-14 2002-10-02 Siemens Ag Method and device for evaluating medical examination images
US6625310B2 (en) 2001-03-23 2003-09-23 Diamondback Vision, Inc. Video segmentation using statistical pixel modeling
US7050620B2 (en) 2001-03-30 2006-05-23 Heckman Carol A Method of assaying shape and structural features in cells
US6537219B2 (en) 2001-04-04 2003-03-25 Koninklijke Philips Electronics N.V. Static focus ultrasound apparatus and method
JP3715249B2 (en) 2001-04-27 2005-11-09 シャープ株式会社 Image processing circuit, image display device, and image processing method
WO2002088895A2 (en) 2001-05-01 2002-11-07 Amicas, Inc. System and method for repository storage of private data on a network for direct client access
JP3766608B2 (en) 2001-05-02 2006-04-12 テラリコン・インコーポレイテッド 3D image display device in network environment
US20020172409A1 (en) 2001-05-18 2002-11-21 Motoaki Saito Displaying three-dimensional medical images
US6826297B2 (en) 2001-05-18 2004-11-30 Terarecon, Inc. Displaying three-dimensional medical images
DE60230261D1 (en) 2001-05-23 2009-01-22 Vital Images Inc COVER MASKING FOR VOLUME PRESENTATION AN OBJECT ORDER
US7350231B2 (en) 2001-06-06 2008-03-25 Yahoo! Inc. System and method for controlling access to digital content, including streaming media
US6886133B2 (en) 2001-06-07 2005-04-26 Microsoft Corporation Interactive formatting interface
US7130457B2 (en) 2001-07-17 2006-10-31 Accuimage Diagnostics Corp. Systems and graphical user interface for analyzing body images
US7031846B2 (en) 2001-08-16 2006-04-18 Affymetrix, Inc. Method, system, and computer software for the presentation and storage of analysis results
US7747453B2 (en) 2001-08-06 2010-06-29 Ulrich Medical Concepts, Inc. System and method for managing patient encounters
US20030037054A1 (en) 2001-08-09 2003-02-20 International Business Machines Corporation Method for controlling access to medical information
DE10141186A1 (en) 2001-08-22 2003-03-20 Siemens Ag Device for processing images, in particular medical images
US7039723B2 (en) 2001-08-31 2006-05-02 Hinnovation, Inc. On-line image processing and communication system
JP3704492B2 (en) 2001-09-11 2005-10-12 テラリコン・インコーポレイテッド Reporting system in network environment
US20030065613A1 (en) 2001-09-28 2003-04-03 Smith Diane K. Software for financial institution monitoring and management and for assessing risk for a financial institution
DE10149634B4 (en) 2001-10-09 2006-09-07 Mevis Breastcare Gmbh & Co. Kg Method for displaying images
US20030083903A1 (en) 2001-10-30 2003-05-01 Myers Gene E. Method and apparatus for contemporaneous billing and documenting with rendered services
WO2003040963A1 (en) 2001-11-02 2003-05-15 Medical Research Consultants L.P. Knowledge management system
US20030115083A1 (en) 2001-11-07 2003-06-19 Masarie Fred E. HTML-based clinical content
US20030120516A1 (en) 2001-11-19 2003-06-26 Perednia Douglas A. Interactive record-keeping system and method
US7155043B2 (en) 2001-11-21 2006-12-26 Confirma, Incorporated User interface having analysis status indicators
US7054473B1 (en) 2001-11-21 2006-05-30 R2 Technology, Inc. Method and apparatus for an improved computer aided diagnosis system
AU2002230442A1 (en) 2001-11-21 2003-06-10 Software Engineering 2000, Inc. System process and logic element for providing and managing record keeping applications
US20030101291A1 (en) 2001-11-23 2003-05-29 Mussack Christopher Joseph Application programming interface for provision of DICOM services
US6833843B2 (en) 2001-12-03 2004-12-21 Tempest Microsystems Panoramic imaging and display system with canonical magnifier
US7647320B2 (en) 2002-01-18 2010-01-12 Peoplechart Corporation Patient directed system and method for managing medical information
US7016952B2 (en) 2002-01-24 2006-03-21 Ge Medical Technology Services, Inc. System and method for universal remote access and display of diagnostic images for service delivery
US20030204420A1 (en) 2002-04-30 2003-10-30 Wilkes Gordon J. Healthcare database management offline backup and synchronization system and method
WO2003071779A1 (en) 2002-02-19 2003-08-28 Siemens Corporate Research, Inc. System and method for generating movie loop display from medical image data
US7139416B2 (en) 2002-02-22 2006-11-21 Agfa-Gevaert N.V. Method for enhancing the contrast of an image
US6772943B2 (en) 2002-02-22 2004-08-10 Softmed Systems, Inc. System and method for document storage management
US7296239B2 (en) 2002-03-04 2007-11-13 Siemens Corporate Research, Inc. System GUI for identification and synchronized display of object-correspondence in CT volume image sets
JP3889650B2 (en) 2002-03-28 2007-03-07 三洋電機株式会社 Image processing method, image processing apparatus, computer program, and recording medium
US20030187689A1 (en) 2002-03-28 2003-10-02 Barnes Robert D. Method and apparatus for a single database engine driven, configurable RIS-PACS functionality
US7092572B2 (en) 2002-03-29 2006-08-15 Sun Microsystems, Inc. Method and apparatus for global image quantification verification
JP4070493B2 (en) 2002-04-03 2008-04-02 株式会社東芝 X-ray diagnostic apparatus and medical image analysis apparatus
JP3838141B2 (en) 2002-04-09 2006-10-25 オムロンヘルスケア株式会社 Blood pressure measuring device and exercise equipment
WO2003088152A1 (en) 2002-04-12 2003-10-23 Koninklijke Philips Electronics Nv Graphical apparatus and method for tracking image volume review
US7043474B2 (en) 2002-04-15 2006-05-09 International Business Machines Corporation System and method for measuring image similarity based on semantic meaning
US7636413B2 (en) 2002-04-16 2009-12-22 General Electric Company Method and apparatus of multi-energy imaging
US8050938B1 (en) 2002-04-19 2011-11-01 Greenway Medical Technologies, Inc. Integrated medical software system with enhanced portability
US7295691B2 (en) 2002-05-15 2007-11-13 Ge Medical Systems Global Technology Company, Llc Computer aided diagnosis of an image set
WO2004057439A2 (en) 2002-05-31 2004-07-08 University Of Utah Research Foundation System and method for visual annotation and knowledge representation
US6785674B2 (en) 2003-01-17 2004-08-31 Intelitrac, Inc. System and method for structuring data in a computer system
US7450747B2 (en) 2002-07-12 2008-11-11 Ge Medical Systems Global Technology Company, Llc System and method for efficiently customizing an imaging system
JP2004072398A (en) 2002-08-06 2004-03-04 Sony Corp Information processing system, information processing apparatus and method therefor, program storing medium, and program
US7260249B2 (en) 2002-09-27 2007-08-21 Confirma Incorporated Rules-based approach for processing medical images
US20040061889A1 (en) 2002-09-27 2004-04-01 Confirma, Inc. System and method for distributing centrally located pre-processed medical image data to remote terminals
US7058901B1 (en) 2002-10-29 2006-06-06 Koninklijke Philips Electronics N.V. Methods and apparatus for controlling the display of medical images
US20040086163A1 (en) 2002-10-31 2004-05-06 Konica Minolta Holdings, Inc. Medical image radiographing system, medical image management method and portable terminal
US20040088192A1 (en) 2002-11-04 2004-05-06 Schmidt Tim W. Medical office electronic management system
WO2004047007A1 (en) 2002-11-15 2004-06-03 Bioarray Solutions, Ltd. Analysis, secure access to, and transmission of array images
JP2004171361A (en) 2002-11-21 2004-06-17 Canon Inc Device and method for displaying image, program, and storage medium
US7123684B2 (en) 2002-11-27 2006-10-17 Hologic, Inc. Full field mammography with tissue exposure control, tomosynthesis, and dynamic field of view processing
US7616801B2 (en) 2002-11-27 2009-11-10 Hologic, Inc. Image handling and display in x-ray mammography and tomosynthesis
US7760924B2 (en) 2002-11-27 2010-07-20 Hologic, Inc. System and method for generating a 2D image from a tomosynthesis data set
US7583861B2 (en) * 2002-11-27 2009-09-01 Teramedica, Inc. Intelligent medical image management system
US7831296B2 (en) 2002-11-27 2010-11-09 Hologic, Inc. X-ray mammography with tomosynthesis
SE524847C2 (en) 2002-11-29 2004-10-12 Sectra Imtec Ab Method for interpreting images
US6891920B1 (en) 2002-11-29 2005-05-10 Fischer Imaging Corporation Automated background processing mammographic image data
US7406150B2 (en) 2002-11-29 2008-07-29 Hologic, Inc. Distributed architecture for mammographic image acquisition and processing
US7346199B2 (en) 2002-11-30 2008-03-18 Intuitive Software, Inc. Anatomic triangulation
US20040172306A1 (en) 2002-12-02 2004-09-02 Recare, Inc. Medical data entry interface
US20040122705A1 (en) 2002-12-18 2004-06-24 Sabol John M. Multilevel integrated medical knowledge base system and method
US20040122787A1 (en) 2002-12-18 2004-06-24 Avinash Gopal B. Enhanced computer-assisted medical data processing system and method
DE10300545B4 (en) 2003-01-09 2010-10-07 Siemens Ag Device, method, storage medium and data structure for the identification and storage of data
JP4744883B2 (en) 2003-01-13 2011-08-10 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Image alignment method and medical image data processing apparatus
US7613335B2 (en) 2003-02-12 2009-11-03 The University Of Iowa Research Foundation Methods and devices useful for analyzing color medical images
US7212661B2 (en) 2003-02-14 2007-05-01 Ge Medical Systems Information Technologies. Inc. Image data navigation method and apparatus
US7796839B2 (en) 2003-02-19 2010-09-14 Agfa Healthcare, N.V. Method of detecting the orientation of an object in an image
US20040165791A1 (en) 2003-02-21 2004-08-26 Ted Kaltanji Dental image storage and retrieval apparatus
US7218763B2 (en) 2003-02-27 2007-05-15 Eastman Kodak Company Method for automated window-level settings for magnetic resonance images
KR101038489B1 (en) 2003-03-05 2011-06-01 삼성테크윈 주식회사 Video Transmission Apparatus Using ???? Network And Method thereof
US8292811B2 (en) 2003-03-20 2012-10-23 Siemens Medical Solutions Usa, Inc. Advanced application framework system and method for use with a diagnostic medical ultrasound streaming application
US7944478B2 (en) 2003-03-28 2011-05-17 Konica Minolta Holdings, Inc. Medical image photographing system and medical image managing method
US7022073B2 (en) 2003-04-02 2006-04-04 Siemens Medical Solutions Usa, Inc. Border detection for medical imaging
JP2007503282A (en) 2003-05-16 2007-02-22 オドリバク、アンドリュー System and method for automatic processing of endoscopic images
US20040243435A1 (en) 2003-05-29 2004-12-02 Med-Sched, Inc. Medical information management system
US20050065424A1 (en) 2003-06-06 2005-03-24 Ge Medical Systems Information Technologies, Inc. Method and system for volumetric navigation supporting radiological reading in medical imaging systems
US6909795B2 (en) 2003-06-16 2005-06-21 R2 Technology, Inc. Communicating computer-aided detection results in a standards-based medical imaging environment
US6996205B2 (en) 2003-06-24 2006-02-07 Ge Medical Systems Global Technology Company, Llc Methods and apparatus to facilitate review of CT colonography exams
JP3910997B2 (en) 2003-07-02 2007-04-25 聰 山竹 Image database system
US20050010531A1 (en) * 2003-07-09 2005-01-13 Kushalnagar Nandakishore R. System and method for distributing digital rights management digital content in a controlled network ensuring digital rights
US20050027569A1 (en) 2003-07-31 2005-02-03 Sohrab Gollogly Systems and methods for documentation of encounters and communications regarding same
CN1836240A (en) 2003-08-13 2006-09-20 美国西门子医疗解决公司 CAD (computer-aided decision) support systems and methods
US20050043970A1 (en) 2003-08-21 2005-02-24 Kent Hsieh Electronic imaging dental record management system
US7366992B2 (en) 2003-09-19 2008-04-29 Siemens Medical Solutions Usa, Inc. Method and system for displaying and/or manipulating medical image data
US20050063575A1 (en) 2003-09-22 2005-03-24 Ge Medical Systems Global Technology, Llc System and method for enabling a software developer to introduce informational attributes for selective inclusion within image headers for medical imaging apparatus applications
US7174054B2 (en) 2003-09-23 2007-02-06 Amazon Technologies, Inc. Method and system for access to electronic images of text based on user ownership of corresponding physical text
GB0322877D0 (en) 2003-09-30 2003-10-29 British Telecomm Search system and method
US20050074150A1 (en) 2003-10-03 2005-04-07 Andrew Bruss Systems and methods for emulating an angiogram using three-dimensional image data
US20050108058A1 (en) 2003-10-10 2005-05-19 Eastman Kodak Company Management of dental records and digital images
US7840067B2 (en) 2003-10-24 2010-11-23 Arcsoft, Inc. Color matching and color correction for images forming a panoramic image
AU2004288358A1 (en) 2003-11-06 2005-05-19 Atsushi Matsunaga Medical information computerized system, program and medium
US7545965B2 (en) 2003-11-10 2009-06-09 The University Of Chicago Image modification and detection using massive training artificial neural networks (MTANN)
JP2005149107A (en) 2003-11-14 2005-06-09 Konica Minolta Medical & Graphic Inc Medical image management system
US20050114179A1 (en) 2003-11-26 2005-05-26 Brackett Charles C. Method and apparatus for constructing and viewing a multi-media patient summary
US7574030B2 (en) 2003-11-26 2009-08-11 Ge Medical Systems Information Technologies, Inc. Automated digitized film slicing and registration tool
US20050114178A1 (en) 2003-11-26 2005-05-26 Anand Krishnamurthy Method for processing a workflow for automated patient scheduling in a hospital information system
US20050110791A1 (en) 2003-11-26 2005-05-26 Prabhu Krishnamoorthy Systems and methods for segmenting and displaying tubular vessels in volumetric imaging data
US20050143654A1 (en) 2003-11-29 2005-06-30 Karel Zuiderveld Systems and methods for segmented volume rendering using a programmable graphics pipeline
JP2006014928A (en) 2004-07-01 2006-01-19 Fuji Photo Film Co Ltd Method, device and program for displaying image
US20050171818A1 (en) 2004-01-23 2005-08-04 Mclaughlin Barbara K. Patient communication device and method
US20050197860A1 (en) 2004-02-23 2005-09-08 Rademr, Inc. Data management system
US7672491B2 (en) 2004-03-23 2010-03-02 Siemens Medical Solutions Usa, Inc. Systems and methods providing automated decision support and medical imaging
US7492970B2 (en) 2004-05-12 2009-02-17 Terarecon, Inc. Reporting system in a networked environment
US7639879B2 (en) 2004-05-31 2009-12-29 Kabushiki Kaisha Toshiba Group information generating system and group information generating method
US20050273009A1 (en) 2004-06-02 2005-12-08 Harald Deischinger Method and apparatus for co-display of inverse mode ultrasound images and histogram information
EP1617344A1 (en) 2004-07-07 2006-01-18 Canon Kabushiki Kaisha Image management device and method for managing a plurality of images
US20060031097A1 (en) 2004-08-09 2006-02-09 Catalis, Inc. Practice management system
US7310651B2 (en) * 2004-08-18 2007-12-18 Ashok Dave Medical media file management system and method
US8046044B2 (en) 2004-08-25 2011-10-25 Washington University Method and apparatus for acquiring overlapped medical image slices
US7375745B2 (en) 2004-09-03 2008-05-20 Seiko Epson Corporation Method for digital image stitching and apparatus for performing the same
GB2418094B (en) 2004-09-10 2010-05-12 Medicsight Plc User interface for CT scan analysis
US7734119B2 (en) 2004-09-21 2010-06-08 General Electric Company Method and system for progressive multi-resolution three-dimensional image reconstruction using region of interest information
JP4160548B2 (en) 2004-09-29 2008-10-01 株式会社東芝 Document summary creation system, method, and program
US8048080B2 (en) * 2004-10-15 2011-11-01 Baxano, Inc. Flexible tissue rasp
US7660488B2 (en) 2004-11-04 2010-02-09 Dr Systems, Inc. Systems and methods for viewing medical images
US7970625B2 (en) 2004-11-04 2011-06-28 Dr Systems, Inc. Systems and methods for retrieval of medical data
US7787672B2 (en) 2004-11-04 2010-08-31 Dr Systems, Inc. Systems and methods for matching, naming, and displaying medical images
US7885440B2 (en) 2004-11-04 2011-02-08 Dr Systems, Inc. Systems and methods for interleaving series of medical images
US7920152B2 (en) 2004-11-04 2011-04-05 Dr Systems, Inc. Systems and methods for viewing medical 3D imaging volumes
US7656543B2 (en) 2004-11-12 2010-02-02 Hewlett-Packard Development Company, L.P. Albuming images
US20060171574A1 (en) 2004-11-12 2006-08-03 Delmonego Brian Graphical healthcare order processing system and method
US7412111B2 (en) 2004-11-19 2008-08-12 General Electric Company Enhanced image processing method for the presentation of digitally-combined medical images
US20060122482A1 (en) 2004-11-22 2006-06-08 Foresight Imaging Inc. Medical image acquisition system for receiving and transmitting medical images instantaneously and method of using the same
US7834891B2 (en) 2004-11-23 2010-11-16 General Electric Company System and method for perspective-based procedure analysis
US8000979B2 (en) 2004-11-24 2011-08-16 Blom Michael G Automated patient management system
US7516417B2 (en) 2004-11-29 2009-04-07 Canon U.S.A. Display parameter adjustment
US7525554B2 (en) 2005-01-03 2009-04-28 General Electric Company Content based hanging protocols facilitated by rules based system
US7698152B2 (en) 2005-01-07 2010-04-13 Siemens Medical Solutions Health Services Corporation Medical image viewing management and status system
US8145503B2 (en) 2005-02-25 2012-03-27 Virtual Radiologic Corporation Medical image metadata processing
US7634121B2 (en) 2005-03-01 2009-12-15 General Electric Company Method and system for rule-based comparison study matching to customize a hanging protocol
US7859549B2 (en) * 2005-03-08 2010-12-28 Agfa Inc. Comparative image review system and method
US7660413B2 (en) 2005-04-08 2010-02-09 Shahram Partovi Secure digital couriering system and method
KR100777503B1 (en) * 2005-04-26 2007-11-20 가부시끼가이샤 도시바 Medical picture filing apparatus and medical picture filing method
EP1878239A2 (en) 2005-04-28 2008-01-16 Bruce Reiner Method and apparatus for automated quality assurance in medical imaging
JP2006338630A (en) 2005-05-31 2006-12-14 Terarikon Inc Three-dimensional image display device for creating three-dimensional image while sequentially and partially decompressing compressed image data
US8249687B2 (en) 2005-06-02 2012-08-21 Vital Images, Inc. Systems and methods for virtual identification of polyps
US7613620B2 (en) 2005-06-07 2009-11-03 Angadbir Singh Salwan Physician to patient network system for real-time electronic communications and transfer of patient health information
EP1736907A3 (en) 2005-06-10 2016-07-06 Siemens Healthcare GmbH Improvement of data acquisition and image reconstruction for MR images
WO2007002685A2 (en) 2005-06-24 2007-01-04 Volcano Corporation Co-registration of graphical image data representing three-dimensional vascular features
US7236558B2 (en) 2005-07-07 2007-06-26 Terarecon, Inc. Three-dimensional image display device creating three-dimensional image directly from projection data
US20070021977A1 (en) 2005-07-19 2007-01-25 Witt Biomedical Corporation Automated system for capturing and archiving information to verify medical necessity of performing medical procedure
WO2007014681A1 (en) 2005-08-01 2007-02-08 Barco N.V. Method and device for improved display standard conformance
US20070192140A1 (en) 2005-08-17 2007-08-16 Medcommons, Inc. Systems and methods for extending an information standard through compatible online access
US20070050701A1 (en) 2005-08-31 2007-03-01 Khaled El Emam Method, system and computer program product for medical form creation
US20070055550A1 (en) 2005-09-06 2007-03-08 Patient Practitioners Llc Personal transportable healthcare data base
US20070064984A1 (en) 2005-09-19 2007-03-22 General Electric Company System and method for dynamic configuration of PACS workstation displays
US20070073556A1 (en) 2005-09-23 2007-03-29 General Electric Co. System and method for coordinating examination scheduling
US8019622B2 (en) 2005-10-24 2011-09-13 CellTrak Technologies, Inc. Home health point-of-care and administration system
US8117549B2 (en) 2005-10-26 2012-02-14 Bruce Reiner System and method for capturing user actions within electronic workflow templates
US8384729B2 (en) 2005-11-01 2013-02-26 Kabushiki Kaisha Toshiba Medical image display system, medical image display method, and medical image display program
US20070109299A1 (en) 2005-11-15 2007-05-17 Vital Images, Inc. Surface-based characteristic path generation
US7660481B2 (en) 2005-11-17 2010-02-09 Vital Images, Inc. Image enhancement using anisotropic noise filtering
US7574029B2 (en) 2005-11-23 2009-08-11 Vital Images, Inc. Characteristic path-based colon segmentation
US7590272B2 (en) 2005-11-23 2009-09-15 Vital Images, Inc. Colon characteristic path registration
US7991210B2 (en) 2005-11-23 2011-08-02 Vital Images, Inc. Automatic aortic detection and segmentation in three-dimensional image data
US20070165917A1 (en) 2005-11-26 2007-07-19 Zhujiang Cao Fully automatic vessel tree segmentation
US7574452B2 (en) 2005-11-28 2009-08-11 General Electric Company Transactional storage and workflow routing for medical image objects
US7725658B2 (en) * 2005-11-29 2010-05-25 Siemens Aktiengesellschaft Self-optimizing caching system and method for data records
WO2007065157A2 (en) 2005-12-01 2007-06-07 Future Health, Inc. Method of efficiently and effectively providing unique and/or multiple office management services in one system and a method and system for automatically selecting educational, marketing and other business-related items to be provided to a client
US20070140536A1 (en) 2005-12-19 2007-06-21 Eastman Kodak Company Medical image processing method and apparatus
US20070245308A1 (en) 2005-12-31 2007-10-18 Hill John E Flexible XML tagging
US7657566B2 (en) 2006-01-10 2010-02-02 Siemens Aktiengesellschaft Computer implemented method and system for hanging protocol configuration simulator displaying desired order of medical images data
US20070162433A1 (en) * 2006-01-11 2007-07-12 Peters James D System and method for a secure process to perform distributed transactions
US7899514B1 (en) 2006-01-26 2011-03-01 The United States Of America As Represented By The Secretary Of The Army Medical image processing methodology for detection and discrimination of objects in tissue
US20070192138A1 (en) 2006-02-16 2007-08-16 Motoaki Saito Medical record system in a wide-area network environment
JP2007275312A (en) 2006-04-06 2007-10-25 Terarikon Inc Three-dimensional image display device with preprocessor based on analysis protocol
US8015024B2 (en) 2006-04-07 2011-09-06 Depuy Products, Inc. System and method for managing patient-related data
WO2007131157A2 (en) 2006-05-04 2007-11-15 Xoran Technologies, Inc. Medical imaging exchange network
US10387612B2 (en) 2006-06-14 2019-08-20 Koninklijke Philips N.V. Multi-modality medical image layout editor
US8280483B2 (en) 2006-06-14 2012-10-02 Koninklijke Philips Electronics N.V. Multi-modality medical image viewing
US20080021877A1 (en) 2006-07-20 2008-01-24 Terarecon, Inc. Medical image processing system in a wide-area network environment
JP4573818B2 (en) 2006-08-30 2010-11-04 富士フイルム株式会社 Medical image management method, medical image management device, and medical network system
US7634733B2 (en) 2006-09-18 2009-12-15 Agfa Inc. Imaging history display system and method
EP1913868A1 (en) 2006-10-19 2008-04-23 Esaote S.p.A. System for determining diagnostic indications
US8223143B2 (en) 2006-10-27 2012-07-17 Carl Zeiss Meditec, Inc. User interface for efficiently displaying relevant OCT imaging data
US20080103828A1 (en) 2006-11-01 2008-05-01 Squilla John R Automated custom report generation system for medical information
DE102006053261B4 (en) 2006-11-11 2015-04-16 Visus Technology Transfer Gmbh System for the reproduction of medical images
US7787679B2 (en) 2006-11-22 2010-08-31 Agfa Healthcare Inc. Study navigation system and method
US7953614B1 (en) 2006-11-22 2011-05-31 Dr Systems, Inc. Smart placement rules
US7970188B2 (en) 2006-11-22 2011-06-28 General Electric Company Systems and methods for automatic routing and prioritization of exams based on image classification
US20080125846A1 (en) 2006-11-29 2008-05-29 General Electric Company Method and system for stent placement
US9665597B2 (en) 2006-12-05 2017-05-30 Qualcomm Incorporated Method and system for processing images using time and location filters
US7992100B2 (en) 2006-12-21 2011-08-02 Sectra Ab Dynamic slabbing to render views of medical image data
US7885828B2 (en) 2007-01-17 2011-02-08 Siemens Aktiengesellschaft Knowledge-based ordering system for radiological procedures
DE102007009183A1 (en) 2007-02-26 2008-08-28 Siemens Ag Imaging method for an object (e.g. a human) in which measurements for multiple imaging modalities are simultaneously planned from an overview image and then simultaneously executed
US20080300484A1 (en) 2007-04-13 2008-12-04 Jing Wang Delay insensitive SVD algorithm for perfusion analysis
US20080275913A1 (en) 2007-05-01 2008-11-06 Van Arragon Paul Dynamic assignment of statements for selected exams using clinical concepts
GB0708600D0 (en) 2007-05-03 2007-06-13 Ucl Business Plc Registration method
US9427201B2 (en) 2007-06-30 2016-08-30 Accuray Incorporated Non-invasive method for using 2D angiographic images for radiosurgical target definition
JP5001077B2 (en) 2007-07-05 2012-08-15 本田技研工業株式会社 Step holder mounting structure for motorcycles
US20090068170A1 (en) 2007-07-13 2009-03-12 President And Fellows Of Harvard College Droplet-based selection
US8335359B2 (en) 2007-07-20 2012-12-18 General Electric Company Systems, apparatus and processes for automated medical image segmentation
JP2009022626A (en) 2007-07-23 2009-02-05 Ge Medical Systems Global Technology Co Llc Ultrasonic imager and image diagnostic system
US8379051B2 (en) 2007-08-22 2013-02-19 The Boeing Company Data set conversion systems and methods
US7936910B2 (en) 2007-09-20 2011-05-03 James Hamilton Watt Method, system and software for displaying medical images
US20090091566A1 (en) 2007-10-05 2009-04-09 Turney Stephen G System and methods for thick specimen imaging using a microscope based tissue sectioning device
US8065166B2 (en) 2007-10-30 2011-11-22 Onemednet Corporation Methods, systems, and devices for managing medical images and records
US8520978B2 (en) 2007-10-31 2013-08-27 Mckesson Technologies Inc. Methods, computer program products, apparatuses, and systems for facilitating viewing and manipulation of an image on a client device
WO2009064891A2 (en) 2007-11-13 2009-05-22 Wisconsin Alumni Research Foundation A method for producing highly constrained ultrasound images
US10438694B2 (en) 2007-11-19 2019-10-08 Medicalis Corporation Management of medical workflow
US8150175B2 (en) * 2007-11-20 2012-04-03 General Electric Company Systems and methods for image handling and presentation
US20090150481A1 (en) * 2007-12-08 2009-06-11 David Garcia Organizing And Publishing Assets In UPnP Networks
US20090164247A1 (en) 2007-12-21 2009-06-25 Siemens Aktiengesellschaft Data and Display Protocols
US20090182577A1 (en) 2008-01-15 2009-07-16 Carestream Health, Inc. Automated information management process
DE102008005923B4 (en) 2008-01-24 2022-07-07 Siemens Healthcare Gmbh Method and device for automatic contrast agent phase classification of image data
US8160899B2 (en) 2008-01-31 2012-04-17 Paul Rhodes Knowledge based electronic clinical record for dentistry
US20090248442A1 (en) 2008-03-28 2009-10-01 Medicalis Corp Processing of clinical data for validation of selected clinical procedures
US8406491B2 (en) 2008-05-08 2013-03-26 Ut-Battelle, Llc Image registration method for medical image sequences
DE102008030244A1 (en) 2008-06-25 2009-12-31 Siemens Aktiengesellschaft Method for supporting percutaneous interventions
US8370293B2 (en) 2008-08-21 2013-02-05 Terarecon Inc. Workflow template management for medical image data processing
US8072504B2 (en) 2008-08-27 2011-12-06 Micron Technology, Inc. Method and system for aiding user alignment for capturing partially overlapping digital images
US7941462B2 (en) 2008-09-24 2011-05-10 Toshiba Medical Visualization Systems Europe, Limited Method and apparatus for classification of coronary artery image data
WO2010038848A1 (en) 2008-10-03 2010-04-08 株式会社 日立メディコ Ultrasonic diagnostic apparatus and image processing apparatus for ultrasonic diagnosis
US8270695B2 (en) 2008-10-07 2012-09-18 Carestream Health, Inc. Diagnostic image processing with automatic self image quality validation
US9357974B2 (en) 2008-10-27 2016-06-07 Carestream Health, Inc. Integrated portable digital X-ray imaging system
US8577696B2 (en) 2008-11-19 2013-11-05 Dr Systems, Inc. System and method for communication of medical information
US8380533B2 (en) 2008-11-19 2013-02-19 DR Systems Inc. System and method of providing dynamic and customizable medical examination forms
US8214756B2 (en) 2008-11-25 2012-07-03 Vital Images, Inc. User interface for iterative image modification
US8150708B2 (en) 2009-02-17 2012-04-03 Virtual Radiologic Corporation Organizing medical images for display
US8634677B2 (en) 2009-03-30 2014-01-21 The Regents Of The University Of California PACS optimization techniques
EP2441023A1 (en) 2009-06-09 2012-04-18 Koninklijke Philips Electronics N.V. Apparatus and method for ordering stored images
JP5258694B2 (en) 2009-07-27 2013-08-07 富士フイルム株式会社 Medical image processing apparatus and method, and program
US8712120B1 (en) 2009-09-28 2014-04-29 Dr Systems, Inc. Rules-based approach to transferring and/or viewing medical images
JP5670079B2 (en) 2009-09-30 2015-02-18 富士フイルム株式会社 Medical image display device and method, and program
WO2011044942A1 (en) 2009-10-15 2011-04-21 Esaote Europe B.V. Apparatus and method for performing diagnostic imaging examinations with tutorial means for the user, both in the preparatory step and in the operative step
US8452063B2 (en) 2009-11-03 2013-05-28 Mela Sciences, Inc. Showing skin lesion information
US8520920B2 (en) 2009-11-11 2013-08-27 Siemens Corporation System for dynamically improving medical image acquisition quality
US9037988B2 (en) 2009-11-25 2015-05-19 Vital Images, Inc. User interface for providing clinical applications and associated data sets based on image data
JP5846755B2 (en) 2010-05-14 2016-01-20 株式会社東芝 Image diagnostic apparatus and medical image display apparatus
US8526694B2 (en) 2010-05-25 2013-09-03 Siemens Medical Solutions Usa, Inc. Medical image processing and registration system
US8724908B2 (en) 2010-09-03 2014-05-13 Adobe Systems Incorporated System and method for labeling a collection of images
JP5725797B2 (en) 2010-10-29 2015-05-27 株式会社東芝 Medical image processing device
US20120130729A1 (en) 2010-11-24 2012-05-24 General Electric Company Systems and methods for evaluation of exam record updates and relevance
US8797350B2 (en) 2010-12-20 2014-08-05 Dr Systems, Inc. Dynamic customizable human-computer interaction behavior
US20120324400A1 (en) 2011-06-15 2012-12-20 Caliendo Jr Neal Robert Rotation Of Multi-Workspace Environment Containing Tiles
US9092727B1 (en) 2011-08-11 2015-07-28 D.R. Systems, Inc. Exam type mapping
US9495012B2 (en) 2011-09-27 2016-11-15 Z124 Secondary single screen mode activation through user interface activation
US9152760B2 (en) 2011-11-23 2015-10-06 General Electric Company Smart 3D PACS workflow by learning
DE102012201169A1 (en) 2012-01-27 2013-08-01 Siemens Aktiengesellschaft Automatic registration of image pairs of medical image data sets
US9552147B2 (en) 2012-02-01 2017-01-24 Facebook, Inc. Hierarchical user interface
US8682049B2 (en) 2012-02-14 2014-03-25 Terarecon, Inc. Cloud-based medical image processing system with access control
US9324188B1 (en) 2012-04-30 2016-04-26 Dr Systems, Inc. Manipulation of 3D medical objects
US20130297331A1 (en) 2012-05-01 2013-11-07 Siemens Aktiengesellschaft Medical Imaging Guideline Compliance System
JP6188288B2 (en) 2012-07-20 2017-08-30 キヤノン株式会社 Information processing apparatus and control method thereof
US9098183B2 (en) 2012-09-28 2015-08-04 Qualcomm Incorporated Drag and drop application launches of user interface objects
JP6143433B2 (en) 2012-10-31 2017-06-07 キヤノン株式会社 Medical image photographing apparatus and medical image display method
US9495604B1 (en) 2013-01-09 2016-11-15 D.R. Systems, Inc. Intelligent management of computerized advanced processing
US8976190B1 (en) 2013-03-15 2015-03-10 Pme Ip Australia Pty Ltd Method and system for rule based display of sets of images
US20140378810A1 (en) 2013-04-18 2014-12-25 Digimarc Corporation Physiologic data acquisition and analysis
US20150046349A1 (en) 2013-08-12 2015-02-12 Thomas J. Michael, JR. Mobile Platform for Legal Process
US9536106B2 (en) 2013-10-08 2017-01-03 D.R. Systems, Inc. System and method for the display of restricted information on private displays
US8954884B1 (en) 2013-11-18 2015-02-10 Maestro Devices, LLC Navigation system for viewing an image data-stack in less time with less effort and less repetitive motions
US9898156B2 (en) 2014-07-30 2018-02-20 Change Healthcare Llc Method and computing device for window width and window level adjustment utilizing a multitouch user interface
US9536045B1 (en) 2015-03-16 2017-01-03 D.R. Systems, Inc. Dynamic digital image compression based on digital image characteristics
US10929508B2 (en) 2015-04-30 2021-02-23 Merge Healthcare Solutions Inc. Database systems and interactive user interfaces for dynamic interaction with, and indications of, digital medical image data

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10782862B2 (en) 2004-11-04 2020-09-22 Merge Healthcare Solutions Inc. Systems and methods for viewing medical images
US10614615B2 (en) 2004-11-04 2020-04-07 Merge Healthcare Solutions Inc. Systems and methods for viewing medical 3D imaging volumes
US20170301090A1 (en) * 2004-11-04 2017-10-19 D.R. Systems, Inc. Systems and methods for interleaving series of medical images
US10790057B2 (en) 2004-11-04 2020-09-29 Merge Healthcare Solutions Inc. Systems and methods for retrieval of medical data
US10096111B2 (en) * 2004-11-04 2018-10-09 D.R. Systems, Inc. Systems and methods for interleaving series of medical images
US11177035B2 (en) 2004-11-04 2021-11-16 International Business Machines Corporation Systems and methods for matching, naming, and displaying medical images
US10540763B2 (en) 2004-11-04 2020-01-21 Merge Healthcare Solutions Inc. Systems and methods for matching, naming, and displaying medical images
US10437444B2 (en) 2004-11-04 2019-10-08 Merge Healthcare Solutions Inc. Systems and methods for viewing medical images
US10438352B2 (en) 2004-11-04 2019-10-08 Merge Healthcare Solutions Inc. Systems and methods for interleaving series of medical images
US10896745B2 (en) 2006-11-22 2021-01-19 Merge Healthcare Solutions Inc. Smart placement rules
US10607341B2 (en) 2009-09-28 2020-03-31 Merge Healthcare Solutions Inc. Rules-based processing and presentation of medical images based on image plane
US9892341B2 (en) 2009-09-28 2018-02-13 D.R. Systems, Inc. Rendering of medical images using user-defined rules
US11094416B2 (en) 2013-01-09 2021-08-17 International Business Machines Corporation Intelligent management of computerized advanced processing
US11666298B2 (en) * 2013-03-15 2023-06-06 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US20220022833A1 (en) * 2013-03-15 2022-01-27 PME IP Pty Ltd Apparatus and system for rule based visualization of digital breast tomosynthesis and other volumetric images
US10929508B2 (en) 2015-04-30 2021-02-23 Merge Healthcare Solutions Inc. Database systems and interactive user interfaces for dynamic interaction with, and indications of, digital medical image data
US10269149B2 (en) * 2016-02-19 2019-04-23 Fujifilm Corporation Tomographic image generation device, radiography imaging system, tomographic image generation method and tomographic image generation program storage medium
US10607735B2 (en) * 2016-09-06 2020-03-31 International Business Machines Corporation Hybrid rendering system for medical imaging applications
US10346981B2 (en) * 2016-11-04 2019-07-09 Eric Kenneth Anderson System and method for non-invasive tissue characterization and classification
US11386588B2 (en) * 2016-12-27 2022-07-12 Sony Corporation Product design system and design image correction apparatus
US10552978B2 (en) * 2017-06-27 2020-02-04 International Business Machines Corporation Dynamic image and image marker tracking
US20180374234A1 (en) * 2017-06-27 2018-12-27 International Business Machines Corporation Dynamic image and image marker tracking
US11429788B2 (en) * 2018-02-26 2022-08-30 Nippon Telegraph And Telephone Corporation Summary evaluation device, method, program and storage medium
CN108968892A (en) * 2018-07-26 2018-12-11 武汉大学人民医院(湖北省人民医院) The system and method that blind area monitors under a kind of colonoscopy
US11468980B2 (en) * 2019-01-02 2022-10-11 Healthy.Io Ltd. Home testing data automatically changes insurance status
US11587680B2 (en) * 2019-06-19 2023-02-21 Canon Medical Systems Corporation Medical data processing apparatus and medical data processing method
US11468979B2 (en) * 2020-02-06 2022-10-11 Ebm Technologies Incorporated Integrated system for picture archiving and communication system and computer aided diagnosis
DE102020111776A1 (en) 2020-04-30 2021-11-04 Schölly Fiberoptic GmbH Method for calculating an output image and/or a sequence of output images from raw image data, image recording device, image recording system and overall system
US11477370B2 (en) 2020-04-30 2022-10-18 Schölly Fiberoptic GmbH Method for reconstructing an output image and/or a sequence of output images from raw image data, image recording device, image recording system, and overall system
US11830607B2 (en) * 2021-09-08 2023-11-28 Ai Metrics, Llc Systems and methods for facilitating image finding analysis
US20230076821A1 (en) * 2021-09-08 2023-03-09 Ai Metrics, Llc Systems and methods for facilitating image finding analysis

Also Published As

Publication number Publication date
US9042617B1 (en) 2015-05-26
US9934568B2 (en) 2018-04-03
US20170200269A1 (en) 2017-07-13
US9684762B2 (en) 2017-06-20
US20170200064A1 (en) 2017-07-13
US9892341B2 (en) 2018-02-13
US9501617B1 (en) 2016-11-22
US9386084B1 (en) 2016-07-05
US20170046485A1 (en) 2017-02-16
US10607341B2 (en) 2020-03-31
US8712120B1 (en) 2014-04-29

Similar Documents

Publication Publication Date Title
US10607341B2 (en) Rules-based processing and presentation of medical images based on image plane
US10129553B2 (en) Dynamic digital image compression based on digital image characteristics
US20190051420A1 (en) Evolving contextual clinical data engine for medical data processing
US10229497B2 (en) Integration of medical software and advanced image processing
US10078727B2 (en) Cloud-based medical image processing system with tracking capability
US8553965B2 (en) Cloud-based medical image processing system with anonymous data upload and download
WO2020212762A2 (en) Methods and systems for syncing medical images across one or more networks and devices
US20150127379A1 (en) Evolving contextual clinical data engine for medical information
US8311847B2 (en) Displaying radiological images
JP2013154162A (en) Medical image processing apparatus, image diagnostic apparatus, computer system, medical image processing program, and medical image processing method
US10395762B1 (en) Customized presentation of data
ES2888523A2 (en) A platform for evaluating medical information and method for using the same
US8923582B2 (en) Systems and methods for computer aided detection using pixel intensity values
US10643746B2 (en) Image viewer
US20110222753A1 (en) Adjusting Radiological Images
US20220181005A1 (en) Utilizing multi-layer caching systems for storing and delivering images
US20230154594A1 (en) Systems and methods for protocol recommendations in medical imaging

Legal Events

Date Code Title Description
AS Assignment

Owner name: D.R. SYSTEMS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REICHER, MURRAY A.;TRAMBERT, MICHAEL;FRAM, EVAN K.;SIGNING DATES FROM 20170318 TO 20170323;REEL/FRAME:041735/0304

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: MERGE HEALTHCARE SOLUTIONS INC., WISCONSIN

Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:D.R. SYSTEMS, INC.;REEL/FRAME:048631/0123

Effective date: 20190218

AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MERGE HEALTHCARE SOLUTIONS INC.;REEL/FRAME:055617/0985

Effective date: 20210315

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: MERATIVE US L.P., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTERNATIONAL BUSINESS MACHINES CORPORATION;REEL/FRAME:061496/0752

Effective date: 20220630