EP2737435A1 - Systems and methods in digital pathology - Google Patents

Systems and methods in digital pathology

Info

Publication number
EP2737435A1
EP2737435A1 (application EP12818209.4A)
Authority
EP
European Patent Office
Prior art keywords
management system
user
data
information
case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP12818209.4A
Other languages
English (en)
French (fr)
Other versions
EP2737435A4 (de)
Inventor
Michael Meissner
Raghavan Venougopal
Ronald Stone
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omnyx LLC
Original Assignee
Omnyx LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Omnyx LLC filed Critical Omnyx LLC
Publication of EP2737435A1 publication Critical patent/EP2737435A1/de
Publication of EP2737435A4 publication Critical patent/EP2737435A4/de
Ceased legal-status Critical Current

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H70/00 ICT specially adapted for the handling or processing of medical references
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00 Medical imaging apparatus involving image processing or analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/43 Detecting, measuring or recording for evaluating the reproductive systems
    • A61B5/4375 Detecting, measuring or recording for evaluating the reproductive systems for evaluating the male reproductive system
    • A61B5/4381 Prostate evaluation or disorder diagnosis

Definitions

  • This invention generally relates to systems and methods for context and purpose driven interaction in digital pathology. More particularly, the invention relates to systems and methods for interfacing applications and/or systems, which provide for customized graphical user interfaces in the field of digital pathology.
  • An aspect of the present invention is a system and/or method that provides a customized user interface for interaction with digital pathology images.
  • the method includes acquiring data, wherein the data comprises at least one digital image, processing the data based on case information, for example, and providing a customized user graphical interface resulting from said processing.
  • FIG. 1 illustrates an exemplary system according to an embodiment of the present invention.
  • FIG. 2 illustrates a method according to another embodiment of the present invention.
  • FIG. 3 illustrates a method according to another embodiment of the present invention.
  • FIG. 4 illustrates a ReadflowTM for GU (Gleason score template) in accordance with an embodiment of the present invention.
  • FIG. 5 illustrates a ReadflowTM for GU (Prostate Hyaline, Corpora amylacea) in accordance with an embodiment of the present invention.
  • FIG. 5a demonstrates a user interface view of the ReadflowTM illustrated in FIG. 5.
  • FIG. 6 illustrates a ReadflowTM for GU (Prostate tumor area) in accordance with an embodiment of the present invention.
  • FIG. 6a demonstrates a user interface view of the ReadflowTM illustrated in FIG. 6.
  • FIG. 7 illustrates a ReadflowTM for Kidney (cancer near margins) in accordance with an embodiment of the present invention.
  • FIG. 7a demonstrates a user interface view of the ReadflowTM illustrated in FIG. 7.
  • FIG. 8 illustrates a ReadflowTM for Kidney-Fuhrman Grade in accordance with an embodiment of the present invention.
  • FIG. 8a demonstrates a user interface view of the ReadflowTM illustrated in FIG. 8.
  • FIG. 9 illustrates a ReadflowTM for Tissue Jumper in accordance with an embodiment of the present invention.
  • FIG. 9a demonstrates a user interface view of the ReadflowTM illustrated in FIG. 9.
  • FIG. 10 illustrates a ReadflowTM for Derm (H&E) Melanoma in accordance with an embodiment of the present invention.
  • FIG. 10a demonstrates a user interface view of the ReadflowTM illustrated in FIG. 10.
  • FIG. 11 illustrates a ReadflowTM for Derm (H&E, IHC) Epidermis alignment in accordance with an embodiment of the present invention.
  • FIG. 12 illustrates a ReadflowTM for Breast Biopsy + IHC slides in accordance with an embodiment of the present invention.
  • FIG. 13 illustrates a ReadflowTM for Breast Biopsy (Control Tissue) in accordance with an embodiment of the present invention.
  • FIG. 13a demonstrates a user interface view of the ReadflowTM illustrated in FIG. 13.
  • FIG. 14 illustrates a ReadflowTM for Breast Core Biopsy in accordance with an embodiment of the present invention.
  • FIG. 14a demonstrates a user interface view of the ReadflowTM illustrated in FIG. 14.
  • case information includes any information relating to the case, including, but not limited to, patient name, birth date, age, gender, type of case, type of procedure, number of images, bench and stain types, patient admission date, DICOM header fields and the like.
  • digital image includes, but is not limited to, photographed or scanned slides of biological samples, including whole slide images (WSI) and the like.
  • information management system may be a digital library or database, such as a repository or the like as appreciated by those skilled in the art.
  • the information management system may include one or more of a user management system, a case or context management system, an image management system, a rules management system, a laboratory information management system (LIS), an electronic medical record (EMR) and/or any other management system as appreciated by one skilled in the art.
  • the image management system may include at least digital images.
  • the case or context management system may include at least case information.
  • the user management system may include information about a user's preferences such as preferences for viewing order of images, orientation of the WSI, the magnification of the WSI, tool options that can be initiated, and the like, as appreciated by one skilled in the art.
  • the preferences may be specific to a lab technician or doctor.
  • the preferences may be specific to a health institution or laboratory facility.
  • GUI refers to a graphical user interface.
  • workstation includes any computer, display device, monitor, and the like as appreciated by one skilled in the art.
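  • By way of illustration only, the case information and user preference records described above might be modeled as simple data structures. The following Python sketch is not taken from the patent; every field name (patient_id, stain_types, default_magnification, and so on) is a hypothetical placeholder.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class CaseInformation:
    """Case-level metadata of the kind described above (patient, procedure, staining)."""
    patient_id: str
    patient_name: Optional[str] = None
    case_type: Optional[str] = None          # e.g. "prostate", "breast", "skin"
    procedure_type: Optional[str] = None     # e.g. "biopsy"
    stain_types: List[str] = field(default_factory=list)   # e.g. ["H&E", "ER", "PR"]
    dicom_fields: Dict[str, str] = field(default_factory=dict)

@dataclass
class UserPreferences:
    """Per-pathologist (or per-institution) viewing preferences."""
    user_id: str
    default_magnification: float = 5.0       # e.g. 5x
    slide_orientation: str = "horizontal"    # or "vertical"
    image_viewing_order: List[str] = field(default_factory=list)
    enabled_tools: List[str] = field(default_factory=list)
```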
  • the invention is directed to systems and methods for interfacing applications and/or systems which provide customized graphical user interface(s) in the field of digital pathology.
  • the systems and methods of the invention allow for context and purpose driven interaction in digital pathology.
  • the invention allows for the processing of data and images from an information management system in conjunction with the needs or preferences of a pathologist to produce a customized user graphical interface which, for example, optimizes appearance and/or behavior of the graphical user interface(s).
  • the customized graphical user interface (the "ReadflowTM") enables an efficient workflow for pathologists working through the case and/or the digital image.
  • the systems and methods of the invention utilize a processor that processes data relating to the type of case or panel being reviewed and the known needs or preferences of the user.
  • the method includes acquiring data including at least one digital pathology image, processing the data, and utilizing the processed data to provide a customized graphical user interface.
  • the customized graphical user interface may be customized based on the case information related to the at least one digital pathology image, for example, the type of case, such as breast, cancer, skin, or other.
  • the graphical user interface may also be customized based on an end user's preferences, such as, for example, magnification, orientation of a specific image, addition of tools, or other.
  • the customized graphical user interface provides an interactive environment in which the end user may interact with at least one digital image. The environment may include changes to appearance and/or behavior of the user graphical interface.
  • the acquired data includes at least one digital pathology image.
  • the acquired data may also include a set of digital images.
  • the set of digital images may be grouped according to, for example, a patient or a case procedure.
  • the acquired data may include at least one digital image and any other desired information such as associated case information.
  • the acquired data may include at least one digital image and user preferences.
  • the acquired data may include information from a medical pathology atlas to impact appearance and/or behavior of the graphical user interface.
  • the method includes acquiring data including a digital image, wherein the digital image has associated case information, processing the data based on the case information, and utilizing the processed data to provide a customized graphical user interface.
  • the method of the invention may be performed by an exemplary system as illustrated in FIG. 1.
  • the system 100 in FIG. 1 includes at least one information management system 110, a processor 120 for running the software that makes up the information management system, and a workstation 130 including a display device, such as a monitor or other screen, for displaying a graphical user interface to a user, a mouse or other means of identifying elements on the screen, and a keyboard or other means of entering information, for example, into the at least one information management system 110.
  • the at least one information management system 110, processor 120 and workstation 130 are in operable communication with each other in any manner as appreciated by one skilled in the art. Additionally, as is readily appreciated by those skilled in the art, the customized user interface according to the invention can be used in combination with any system having a processor and display.
  • the at least one information management system 110 includes at least a digital image and associated case information.
  • the case information and the corresponding digital image may be obtained from a single management system or multiple data management systems.
  • the case information may be associated with the digital image on the digital image itself, such as through a tag, DICOM header, etc.
  • the case information may be located in a case information management system and the digital images may be separately located in an image management system.
  • the case information management system and the image management system are in operable communication with each other as appreciated by one skilled in the art such that the case information may be correlated or associated with the corresponding digital image.
  • data fields such as patient identification can be used as a means for locating and correlating the case information with the corresponding digital image.
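  • A minimal sketch of such a correlation step, assuming both management systems can be queried by a shared patient or accession identifier (the dictionary-based stores and key names are illustrative, not the patent's interfaces):

```python
def correlate_case_with_images(case_store: dict, image_store: dict, patient_id: str):
    """Pair case information with its digital images using a shared identifier.

    case_store and image_store stand in for the case/context management system
    and the image management system; the patient identifier is the join field.
    """
    case_info = case_store.get(patient_id)
    if case_info is None:
        raise KeyError(f"No case information found for patient {patient_id}")
    images = image_store.get(patient_id, [])   # zero or more whole slide images
    return case_info, images
```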
  • the at least one information management system 110 then communicates at least the digital image and the case type or procedure type to the processor 120.
  • the processor 120 may be any machine that receives and processes information as appreciated by one skilled in the art.
  • the processor 120 may be a central processing unit (CPU), microprocessor, graphics processing unit (GPU) and the like.
  • the processor 120 may be programmed to implement a rules engine.
  • the rules engine may have programmed thereon predefined rules such that it evaluates incoming information based on one or more predefined rules.
  • the rules engine may include a predefined set of rules determined or set forth in guidelines of governing bodies.
  • the rules engine may include a set of predefined rules for each case type (e.g., breast, prostate, kidney, or skin).
  • the rules engine may include predefined rules based on an individual user's preferences or, alternatively, an institution's preferences.
  • multiple predefined sets of rules can be used concurrently.
  • a rule set used for a skin case may be used in conjunction with a predefined set of rules for a specific user's preference.
  • the rules engine may learn an end user's desired preferences and/or needs based on previous behaviors of the user, as appreciated by one skilled in the art.
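  • A hedged sketch of a rules engine of the kind described, assuming rules are simple condition/action pairs evaluated against the acquired case information and user preferences; the example rule sets and key names are illustrative only:

```python
from typing import Callable, Dict, List

class Rule:
    """A condition/action pair: if the condition matches the acquired data,
    the action adjusts the graphical user interface configuration."""
    def __init__(self, name: str,
                 condition: Callable[[Dict, Dict], bool],
                 action: Callable[[Dict], None]):
        self.name = name
        self.condition = condition   # evaluated against (case_info, user_prefs)
        self.action = action         # mutates the GUI configuration

class RulesEngine:
    def __init__(self, rule_sets: List[List[Rule]]):
        # Several predefined rule sets (per case type, per user, per institution)
        # may be supplied and are evaluated concurrently, as described above.
        self.rules = [rule for rule_set in rule_sets for rule in rule_set]

    def evaluate(self, case_info: Dict, user_prefs: Dict, gui_config: Dict) -> List[str]:
        """Apply every rule whose condition matches; return the names of fired rules."""
        fired = []
        for rule in self.rules:
            if rule.condition(case_info, user_prefs):
                rule.action(gui_config)
                fired.append(rule.name)
        return fired

# Illustrative rule sets only; real rules could come from guidelines,
# institutional settings, or learned user behavior.
skin_rules = [
    Rule("enable_melanoma_tool",
         lambda c, p: c.get("case_type") == "skin",
         lambda g: g.setdefault("tools", []).append("melanoma_detection")),
]
preference_rules = [
    Rule("horizontal_orientation",
         lambda c, p: p.get("slide_orientation") == "horizontal",
         lambda g: g.update(orientation="horizontal")),
]
engine = RulesEngine([skin_rules, preference_rules])
```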
  • the processor 120 executes the determined actions by the rules engine.
  • the processor may process the acquired data or information, for example, by running an algorithm on the received information (e.g., scanning the digitized images for areas of probable mitotic activity in order to present them to the pathologist in an automated way).
  • the rules engine may result in a determination that no algorithm needs to be performed on the data, in which case no further action is taken by the processor.
  • the processor 120 may process the data a number of times, such as running multiple algorithms on the same information sequentially.
  • the acquired data may be continually processed until a determination is made by the rules engine that no further processing remains to be performed on the acquired data.
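  • One way the "process until nothing remains" behavior could be realized is sketched below, assuming rules are (name, condition, action) triples; this is an illustration, not the patent's algorithm:

```python
def process_until_stable(rules, case_info, user_prefs, gui_config, max_passes=10):
    """Repeatedly apply matching rules until a pass fires nothing new.

    `rules` is a list of (name, condition, action) tuples: condition is a
    callable (case_info, user_prefs) -> bool and action mutates gui_config.
    """
    already_fired = set()
    for _ in range(max_passes):            # safety cap on repeated processing
        fired_this_pass = []
        for name, condition, action in rules:
            if name in already_fired:
                continue                    # apply each rule's action at most once
            if condition(case_info, user_prefs):
                action(gui_config)
                fired_this_pass.append(name)
        if not fired_this_pass:
            break                           # no further processing remains
        already_fired.update(fired_this_pass)
    return gui_config
```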
  • the processed data from the processor 120 is then utilized to provide a customized graphical user interface on workstation 130 for the pathologist to interface with.
  • the processed data may result in the modification of the appearance of the application and/or modification of the behavior of the application.
  • Appearance modification may include, for example, change in orientation, magnification of an image or a portion of an image, change in display area and the like.
  • Behavior modification may include, for example, tools that themselves change the appearance of the application, or that change existing behaviors as well as add new behaviors to the application.
  • the user may manually outline a region of interest on an image, which is then automatically identified by the system 100 on corresponding images.
  • the tools available to the pathologist for manipulation of a particular image may be selected based on user preferences to provide a customized graphical user interface for the pathologist to interact with the digital image(s).
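  • The assembly of a customized interface configuration from the processed data could look like the following sketch; the configuration keys, defaults and the prostate-specific tool are assumptions for illustration:

```python
def build_viewer_config(case_info: dict, user_prefs: dict, processed: dict) -> dict:
    """Combine case information, user preferences and processing results into a
    GUI configuration covering appearance (orientation, magnification, layout)
    and behavior (which tools are exposed)."""
    config = {
        "orientation": user_prefs.get("slide_orientation", "horizontal"),
        "magnification": user_prefs.get("default_magnification", 5.0),
        "layout": "2x2" if len(processed.get("slides", [])) > 1 else "single",
        "tools": list(user_prefs.get("enabled_tools", [])),
    }
    if case_info.get("case_type") == "prostate":
        config["tools"].append("tumor_area_calculation")
    return config
```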
  • the software for providing a customized graphical user interface may be launched when the pathologist interacts with the workstation 130 such as by moving the mouse, using the keypad, clicking on an icon or the like. Once launched, the user clicks on the desired case and upon doing so will be presented with an interface that has been customized based on, for example, the type of image and/or case information.
  • the systems may include more than one workstation, management information system and/or processors.
  • the system may include multiple servers, such as an individual server that maintains user preferences and/or images.
  • the workstation 130 may be capable of communicating with the at least one information management system 110 and/or the processor 120.
  • the workstation, information management system and processor can communicate electronically over a wired or wireless communication, for example.
  • the processor 120 may be located in any component of the system 100.
  • the processor 120 may be located within the workstation 130 or the at least one information management system 110 and is in operable communication with a workstation. This results in providing a customized user interface as appreciated by one skilled in the art.
  • the processor 120 may communicate with a separate server retaining the information management system that includes, for example, a digital library of whole slide images.
  • the servers, such as the at least one information management system 110 and/or the processor 120, may reside at the same location or at different locations.
  • the servers may reside onsite at the pathologist's office or, alternatively, remotely, such as at an off-site location.
  • the system may include more than one processor.
  • the system 100 may further include an imaging modality such as a scanner, or the like, to capture the image data in a digital format.
  • a user may scan the image and place the image automatically in the digital image library or, for example, image management system.
  • the image may be scanned and manually moved from a separate server or workstation to the digital archive.
  • the system may include a scanner, software, and medical imaging devices for electronic capture, viewing, analysis, storage and retrieval of digital images of pathology slides and specimens; software for clinical information and workflow management for pathologists, and/or image analysis software that uses algorithms.
  • FIG. 2 illustrates a method of the invention 200 where data is acquired from the at least one information management system 210, the data is then processed by the processor 220 and the processed data results in a customized user interface 230.
  • the method may include acquiring digital images from an image management system and data such as patient information and case type and/or procedure type from the context management system. The acquisition of data may occur sequentially in any order or may occur simultaneously.
  • the processor, which may be on a separate server, evaluates the data through a rules engine and may process the data, for example, by running one or more algorithms on the data.
  • the processed data results in a modification or change of appearance and/or behavior of the graphical user interface.
  • a digital image is then displayed on a workstation via a graphical user interface in a customized and interactive manner based on the resulting appearance and behavior of the application.
  • the method may be initiated by the end user interacting with the workstation 130.
  • the data is available in the at least one management system prior to the user's interaction with the workstation and may be simultaneously processed upon the user's interaction with the workstation.
  • the data may be pre-processed as illustrated in FIG. 3 through a processor such as a pre-processing engine.
  • the pre-processing engine may have additional and/or different rules than the rules engine of processor 120.
  • the pre-processing engine may have its own rules engine.
  • the input for the preprocessing engine may be a scan recently acquired from the scanner or from another source.
  • the pre-processing engine processes the recently acquired image based on its set of rules and, for example, may utilize different algorithms than those of processor 120.
  • the results of these algorithms may be stored anywhere on system 100, such as in the case management system (CMS).
  • the pre-processing of the data therefore allows data that may, for example, take a longer period of time to process to be handled prior to any end-user interaction with the workstation 130. For example, when the user subsequently views this slide or this case, the appropriate graphical user interface and/or data is readily available for display to the end user based on the pre-processing results, data and other information.
  • Case types or procedure types that would undergo pre-processing include those that require longer processing times than normal.
  • the preprocessing engine thus allows for processing of the data prior to the end user's interaction with the workstation/graphical user interface.
  • An example of a case type that may undergo pre-processing is mitosis counting/identification, which is processed by the respective algorithm of the applicable ReadflowTM.
  • Because mitotic figures may be present anywhere in the WSI, it is difficult to process the entire slide on the client side when the pathologist opens the case or slide.
  • the pre-processing engine will use the rules engine to determine whether a slide needs to be processed for mitosis.
  • the pre-processing may also be used for any ReadflowTM that needs co-registration, for example, to determine how to register "n" slides that are different slices of the same sample.
  • This information may be stored after the pre-processing step and may be used to set up the graphical user interface when the pathologist opens a particular ReadflowTM that uses co-registration.
  • Such ReadflowsTM include the Breast panel (H&E, IHC), breast biopsy and hotspots, as discussed below.
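  • A background worker of the kind a pre-processing engine implies might be organized as follows; the queue-based design, the rule format and the result store are assumptions, not the patent's implementation:

```python
import queue

def preprocess_worker(scan_queue: queue.Queue, rules, results_store: dict) -> None:
    """Consume newly scanned slides and run long-running analyses ahead of time.

    `rules` is a list of (needs_processing, analyse) pairs, where
    needs_processing inspects slide metadata and analyse is a long-running
    step such as mitosis detection or co-registration. Results are stored so
    the viewer can pick them up later when the case is opened.
    """
    while True:
        slide = scan_queue.get()
        if slide is None:                       # sentinel: shut the worker down
            scan_queue.task_done()
            break
        for needs_processing, analyse in rules:
            if needs_processing(slide["metadata"]):
                key = (slide["slide_id"], analyse.__name__)
                results_store[key] = analyse(slide["image"])
        scan_queue.task_done()
```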
  • Another aspect of the invention includes the Readflows for each case type and/or case panel as further described below.
  • the ReadflowsTM allow for the appearance, behavior, or both the appearance and behavior of the graphical user interface to change based on, for example, the case information and user preferences/needs.
  • Examples of ReadflowsTM discussed below that change the appearance of the original digital image include, but are not limited to the Gleason Score, GU (prostate tumor and hyaline), GU (kidney, cancer), skin (melanoma, epidermis), breast biopsy, H&E and IHC, and core, tissue jumper and hotspots.
  • Examples of ReadflowsTM discussed below that change appearance and behavior include, but are not limited to, the following: skin (melanoma, epidermis), and breast (H&E, IHC, biopsy and core).
  • Examples of ReadflowsTM that include data subject to pre-processing include, but are not limited to, breast (H&E, IHC) and biopsy and hotspots.
  • The ReadflowsTM described herein are not intended to be all-inclusive or limiting and may be combined with each other and/or built upon to produce additional ReadflowsTM.
  • the following examples are exemplary ReadflowsTM of the invention.
  • Gleason scores are needed for grading prostate cancer.
  • pathologists will access a reference source to review the grading patterns for determining Gleason scores.
  • When the rules engine determines from the case information or another source that the case is "prostate", it automatically displays the Gleason score template on the user interface without any prompting or action from the user.
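  • A minimal sketch of such a rule, assuming the case type is available as a plain string and the GUI configuration is a dictionary (both assumptions for illustration):

```python
def gleason_template_rule(case_info: dict, gui_config: dict) -> None:
    """If the case is identified as prostate, surface the Gleason score
    reference template automatically, without any user action."""
    if str(case_info.get("case_type", "")).lower() == "prostate":
        gui_config.setdefault("reference_panels", []).append("gleason_score_template")
```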
  • For the GU prostate tumor area ReadflowTM, when the slide is loaded and the tissue type is prostate and the stain type is H&E, a tumor area calculation tool may be enabled.
  • the pathologist can use this tool to mark a tumor region; the tool accumulates the tumor area across all the regions on the slide and case and displays the total to the user. More importantly, the tool automatically finds the respective entire tumor area and displays a percent tumor area for each slide as well as accumulated across all slides.
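  • The per-slide and per-case percentage calculation described above amounts to simple area accumulation, as in this sketch (the slide record structure and units are assumed for illustration):

```python
def tumor_area_summary(slides):
    """Accumulate marked tumor regions per slide and across the whole case.

    `slides` is a list of dicts, each with a total tissue area and a list of
    marked tumor region areas in the same units (e.g. mm^2).
    """
    case_tumor = case_tissue = 0.0
    per_slide_percent = {}
    for slide in slides:
        tumor = sum(slide["tumor_region_areas"])
        tissue = slide["tissue_area"]
        per_slide_percent[slide["slide_id"]] = 100.0 * tumor / tissue if tissue else 0.0
        case_tumor += tumor
        case_tissue += tissue
    case_percent = 100.0 * case_tumor / case_tissue if case_tissue else 0.0
    return per_slide_percent, case_percent
```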
  • For the GU kidney (cancer near margins) ReadflowTM, the ink analysis tool may be applied to identify the margins to enable localization of cancer near the margins.
  • For the GU kidney-Fuhrman ReadflowTM, when the tissue type is "kidney" and the stain type is "H&E", a Fuhrman grading tool may also be enabled. The pathologist may use this tool to mark out a region on the slide; the tool calculates the Fuhrman grade and presents the result to the user.
  • For the tissue jumper ReadflowTM, as demonstrated in FIGS. 9 and 9a, when the slide is loaded and the procedure type is biopsy, an algorithm may be executed to detect the different tissue sections on the slide. The algorithm orders the tissues based on user preferences. The user may automatically traverse between the different tissues using a keyboard shortcut or an input device. The order of traversal and the field-of-view position for each tissue may be based on user preferences.
  • the Tissue jumper ReadflowTM may not be restricted to just the currently open slide; the tissue jumper may be extended to jump to the same tissue section across different slides in the same case. This feature extension would be available as part of the user preferences settings.
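  • The traversal behavior could be driven by a small helper like the one below, assuming a (hypothetical) tissue-detection step has already produced a list of sections with stored fields of view:

```python
class TissueJumper:
    """Cycle the field of view between detected tissue sections on a slide.

    `sections` is a list of dicts, each with a "field_of_view" entry; the
    ordering key reflects user preferences, as described above.
    """
    def __init__(self, sections, order_key=None):
        self.sections = sorted(sections, key=order_key) if order_key else list(sections)
        self.index = -1

    def next_section(self):
        """Bound to a keyboard shortcut: jump to the next tissue section."""
        if not self.sections:
            return None
        self.index = (self.index + 1) % len(self.sections)
        return self.sections[self.index]["field_of_view"]
```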
  • For the Derm (H&E) melanoma ReadflowTM, a melanoma detection tool may be enabled. This tool locates a melanoma in the slide and places the Breslow and Clark measurements on the melanoma for the pathologist. Alternatively, the tool can be configured to permit the user to select a melanoma.
  • a hanging protocol associated with this type of tissue type can be applied such that, for example, the digital slides are automatically presented to the pathologist horizontally and at a 5x magnification, in addition to providing the melanoma detection tool in the user interface as discussed previously.
  • the rules engine may cause an epidermis detection algorithm to be launched.
  • the slide containing the epidermis is aligned based on the user preferences, for example, Epidermis on top/Dermis on bottom. Furthermore, the orientation of the slide containing epidermis may be altered based on user preferences, if any have been provided.
  • Hematoxylin and eosin (H&E) staining is used to study the morphology of tissue samples. Oncologists attempt to identify particular types of cancer by detecting variations from the patterns of normal tissue. H&E staining can also be used to determine the pathological grading/staging of cancer (e.g., Bloom-Richardson grading).
  • the breast (H&E biopsy and IHC) ReadflowTM eliminates the current problems with reading these samples.
  • When the tissue type is "Breast", the stain type is "H&E", and the case panel contains other IHC slides (ER, PR, Hercept), the slides are all co-registered and locked. The interface displays a multitude of slides (H&E, ER, PR, Hercept, Ki-67, negative control and positive control) in any combination in a 2x2 grid (based on user preferences).
  • the user may either apply annotations to the H&E slide, which then get applied across the other IHC slides, or run IHC image analysis on all the IHC slides with a single click instead of having to pick the appropriate algorithm in each view (this is done automatically by the ReadflowTM).
  • the co-registration may also happen using the pre-processing engine.
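  • Propagating an annotation drawn on the H&E slide onto the co-registered IHC slides reduces to applying each slide's registration transform, as in this sketch (the 3x3 homogeneous transforms are assumed to come from a separate co-registration step not shown here):

```python
def propagate_annotation(annotation_xy, transforms):
    """Map a point annotated on the H&E slide onto co-registered IHC slides.

    `transforms` maps a slide id to a 3x3 homogeneous transform (nested lists)
    produced by co-registration; returns the annotation position per slide.
    """
    x, y = annotation_xy
    propagated = {}
    for slide_id, t in transforms.items():
        px = t[0][0] * x + t[0][1] * y + t[0][2]
        py = t[1][0] * x + t[1][1] * y + t[1][2]
        w = t[2][0] * x + t[2][1] * y + t[2][2]
        propagated[slide_id] = (px / w, py / w)
    return propagated
```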
  • a control tissue detection algorithm may be enabled to find control tissue on the same slide, the slide in the panel that contains control tissue, or the daily or less frequent control tissue slide generated in the lab. If the control tissue detection algorithm detects control tissue, it selects a region of interest from within the control tissue and displays it on the interface. If the control tissue algorithm does not automatically detect control tissue, a manual control tissue selection tool may be enabled. This tool allows the user to select a region of interest from the control tissue to be displayed on the interface.
  • As demonstrated in FIGS. 14 and 14a, for the breast core biopsy (H&E/IHC) ReadflowTM, when the slide is loaded and the tissue type is "Breast" and the stain type is either "H&E" or one of the IHC stains, an algorithm is applied to detect the angle of inclination of the slide, and the entire slide is oriented vertically, horizontally, or according to whatever user preference is chosen.
  • the slide is also set to a lower magnification (e.g., 10x, 5x, based on the chosen user preference) and the initial field of view is the top-left position of the left-most tissue, or whatever was chosen in the user preferences.
  • the orientation angle calculation could be done using the pre-processing engine.
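  • The initial presentation described for this ReadflowTM (re-orientation, lower magnification, starting field of view) could be computed as in the sketch below; the detected angle is assumed to come from an inclination-detection algorithm that is not shown, and the preference keys are illustrative:

```python
def initial_core_biopsy_view(detected_angle_deg: float, prefs: dict) -> dict:
    """Choose rotation, magnification and starting field of view for a core biopsy slide."""
    target = 0.0 if prefs.get("slide_orientation", "horizontal") == "horizontal" else 90.0
    return {
        "rotation_deg": (target - detected_angle_deg) % 360.0,      # align slide with preference
        "magnification": prefs.get("default_magnification", 5.0),   # e.g. 5x or 10x
        "field_of_view": prefs.get("initial_field_of_view",
                                   "top_left_of_leftmost_tissue"),
    }
```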
  • the invention described herein has a number of advantages and benefits.
  • the systems and methods discussed herein allow for optimization and efficiency in the interaction between pathologists and digital pathology images.
EP12818209.4A 2011-07-27 2012-07-27 Systeme und verfahren in der digitalen pathologie Ceased EP2737435A4 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161512341P 2011-07-27 2011-07-27
PCT/US2012/048731 WO2013016715A1 (en) 2011-07-27 2012-07-27 Systems and methods in digital pathology

Publications (2)

Publication Number Publication Date
EP2737435A1 true EP2737435A1 (de) 2014-06-04
EP2737435A4 EP2737435A4 (de) 2015-04-08

Family

ID=47601577

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12818209.4A Ceased EP2737435A4 (de) 2011-07-27 2012-07-27 Systeme und verfahren in der digitalen pathologie

Country Status (4)

Country Link
US (1) US9760678B2 (de)
EP (1) EP2737435A4 (de)
CA (1) CA2843468A1 (de)
WO (1) WO2013016715A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10663711B2 (en) * 2017-01-04 2020-05-26 Corista, LLC Virtual slide stage (VSS) method for viewing whole slide images
JP7458328B2 (ja) 2018-05-21 2024-03-29 コリスタ・エルエルシー マルチ分解能登録を介したマルチサンプル全体スライド画像処理

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7089499B2 (en) * 2001-02-28 2006-08-08 International Business Machines Corporation Personalizing user interfaces across operating systems
US20020169735A1 (en) * 2001-03-07 2002-11-14 David Kil Automatic mapping from data to preprocessing algorithms
US7116440B2 (en) * 2003-02-28 2006-10-03 Aperio Technologies, Inc. Image processing and analysis framework
US20060195484A1 (en) 2005-02-25 2006-08-31 General Electric Company System and method for providing a dynamic user interface for workflow in hospitals
US20080120142A1 (en) * 2006-11-20 2008-05-22 Vivalog Llc Case management for image-based training, decision support, and consultation
US20090132916A1 (en) 2007-11-20 2009-05-21 Parascript, Llc User interface for adjusting thresholds and presenting mammography processing results
US8151188B2 (en) * 2008-07-23 2012-04-03 General Electric Company Intelligent user interface using on-screen force feedback and method of use
US20100131482A1 (en) 2008-11-26 2010-05-27 General Electric Company Adaptive user interface systems and methods for healthcare applications
US8150708B2 (en) 2009-02-17 2012-04-03 Virtual Radiologic Corporation Organizing medical images for display
US8199989B2 (en) 2009-05-15 2012-06-12 General Electric Company Automatic fly through review mechanism
JP5658451B2 (ja) 2009-11-30 2015-01-28 ソニー株式会社 情報処理装置、情報処理方法及びそのプログラム

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6411836B1 (en) * 1999-12-30 2002-06-25 General Electric Company Method and apparatus for user preferences configuring in an image handling system
WO2005076197A2 (en) * 2004-01-31 2005-08-18 Bioimagene, Inc. Method and system for morphology based mitosis identification and classification of digital images
US20060146071A1 (en) * 2005-01-03 2006-07-06 Morita Mark M Content based hanging protocols facilitated by rules based system
US20070140536A1 (en) * 2005-12-19 2007-06-21 Eastman Kodak Company Medical image processing method and apparatus
US20090138318A1 (en) * 2007-11-20 2009-05-28 General Electric Company Systems and methods for adaptive workflow and resource prioritization
US20090190812A1 (en) * 2008-01-25 2009-07-30 Maki Sano Pathological tissue image capturing system, pathological tissue image capturing method, and pathological tissue image capturing program
US20090276392A1 (en) * 2008-05-02 2009-11-05 John Yan Dynamic sequencing display protocols for medical imaging data
US20110060766A1 (en) * 2009-09-04 2011-03-10 Omnyx LLC Digital pathology system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2013016715A1 *

Also Published As

Publication number Publication date
US20140257857A1 (en) 2014-09-11
EP2737435A4 (de) 2015-04-08
CA2843468A1 (en) 2013-01-31
WO2013016715A1 (en) 2013-01-31
US9760678B2 (en) 2017-09-12

Similar Documents

Publication Publication Date Title
Pallua et al. The future of pathology is digital
EP3776458B1 (de) Mikroskop mit erweiterter realität für pathologie mit überlagerung von quantitativen biomarkerdaten
US9478022B2 (en) Method and system for integrated radiological and pathological information for diagnosis, therapy selection, and monitoring
EP1654626B1 (de) Verfahren und system zur intelligenten qualitativen und quantitativen analyse für die medizinische diagnose
US8355553B2 (en) Systems, apparatus and processes for automated medical image segmentation using a statistical model
US8184854B2 (en) Method and system for evaluation of the behavior of users of a digital image information system
US6901277B2 (en) Methods for generating a lung report
US7925653B2 (en) Method and system for accessing a group of objects in an electronic document
US8335359B2 (en) Systems, apparatus and processes for automated medical image segmentation
US20230393036A1 (en) Roche Molecular Systems, Inc.
CN109791693A (zh) 用于提供可视化全切片图像分析的数字病理学系统及相关工作流程
JP2011505225A (ja) 効率的な撮像システムおよび方法
US8566727B2 (en) Method and system for automating a user interface
JP2018512072A (ja) 自動化スライド全域解析の品質管理
JP2009527063A (ja) 仮想環境において見本及びデータを使用及び統合するシステム及びその方法
JP4179510B2 (ja) 検査支援装置および検査支援プログラム
CN111223556B (zh) 集成医学图像可视化和探索
US20140153795A1 (en) Parametric imaging for the evaluation of biological condition
US9760678B2 (en) Systems and methods in digital pathology
Corvò et al. PathoVA: A visual analytics tool for pathology diagnosis and reporting
US11830621B2 (en) System and method for rapid and accurate histologic analysis of tumor margins using machine learning
Corvò et al. Visual analytics in digital pathology: Challenges and opportunities
Kuckertz et al. Fully automated longitudinal tracking and in-depth analysis of the entire tumor burden: unlocking the complexity
CN116235223A (zh) 使用基于目光的跟踪的注释数据收集
Corvò et al. Visual analytics in histopathology diagnostics: a protocol-based approach

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140120

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20150310

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 19/00 20110101ALI20150304BHEP

Ipc: A61B 5/00 20060101ALI20150304BHEP

Ipc: G06Q 50/24 20120101ALI20150304BHEP

Ipc: G06K 9/00 20060101AFI20150304BHEP

17Q First examination report despatched

Effective date: 20160202

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20171011