CN115988987A - System for providing remote and fast access to scanned image data - Google Patents


Info

Publication number
CN115988987A
CN115988987A
Authority
CN
China
Prior art keywords
image
access
image data
component
viewing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180052127.6A
Other languages
Chinese (zh)
Inventor
P·M·海默
D·L·让德罗
D·千
J·E·维尔纳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Caliber Imaging and Diagnostics Inc
Original Assignee
Caliber Imaging and Diagnostics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Caliber Imaging and Diagnostics Inc filed Critical Caliber Imaging and Diagnostics Inc
Publication of CN115988987A
Legal status: Pending (Current)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0062 Arrangements for scanning
    • A61B5/0068 Confocal scanning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53 Querying
    • G06F16/535 Filtering based on additional data, e.g. user or group profiles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/239 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N21/2393 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests involving handling client requests
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223 Cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44227 Monitoring of local network, e.g. connection or bandwidth variations; Detecting new devices in the local network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/632 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing using a connection between clients on a wide area network, e.g. setting up a peer-to-peer communication via Internet for retrieving video segments from the hard-disk of other client devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8146 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H04N21/8153 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A61B2034/258 User interfaces for surgical systems providing specific settings for specific users

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Signal Processing (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Multimedia (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Urology & Nephrology (AREA)
  • General Business, Economics & Management (AREA)
  • Biophysics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Graphics (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The present invention provides a system, apparatus and method for providing remote and fast access to image data. The present invention may be employed during surgery to enable a health care professional, located remotely from the location where the surgery is performed, to view and evaluate tissue removed from a patient during the surgery. Such health care personnel may provide feedback to the surgeon to improve the outcome of the procedure.

Description

System for providing remote and fast access to scanned image data
CROSS-REFERENCE TO RELATED PATENT APPLICATION(S)
This document is a U.S. non-provisional utility patent application claiming priority to and the benefit of the co-pending U.S. provisional patent application having serial No. 63/045,019 (confirmation number 4651, docket number ML-0538P), filed June 26, 2020 and entitled "SYSTEM FOR PROVIDING REMOTE AND RAPID ACCESS TO SCANNED IMAGE DATA", which is incorporated herein by reference in its entirety.
Patent application(s) including related subject matter
This document includes subject matter generally related to the subject matter of U.S. patent No. 9,055,867, issued June 16, 2015 to Fox et al. and entitled "CONFOCAL SCANNING MICROSCOPE HAVING OPTICAL AND SCANNING SYSTEMS WHICH PROVIDE A HANDHELD IMAGING HEAD".
This document also includes subject matter generally related to the subject matter of U.S. patent No. 10,908,406, issued in February 2021 to Hadley et al. and entitled "RESONANT SCANNER INTERACTION WITH MOVABLE STAGE".
All of the above documents (including patents, patent publications, patent applications, and technical papers) are incorporated herein by reference in their entirety.
Background
Performing a medical procedure on a patient typically involves cutting and/or removing tissue from the patient's body. Tissue removed from a patient's body may be evaluated visually and/or via one or more devices that may reveal characteristics of the tissue that are not necessarily visible to the human eye. Such equipment employed for evaluation includes, for example, confocal laser scanning microscopes designed to produce a representation of one or more characteristics of the removed tissue. Such a representation may be encoded and stored as digitally encoded data.
The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
Disclosure of Invention
The present invention provides a system, apparatus and method for providing remote and rapid access to image data of ex vivo tissue excised from a patient during surgery. The present invention may be employed during the performance of a procedure to enable other remotely located healthcare personnel to view and quickly assess tissue removed from a patient during the procedure. Such health care personnel can provide quick and timely feedback to the surgeon during the performance of the procedure in order to expedite the procedure and improve its outcome.
This brief description of the invention is intended to provide a brief summary of the subject matter disclosed herein, in accordance with one or more illustrative embodiments, and is not intended to serve as a guide for interpreting the claims or to define or limit the scope of the invention, which is defined solely by the appended claims.
Drawings
So that the manner in which the features of the invention can be understood, a detailed description of the invention may be had by reference to certain embodiments, some of which are illustrated in the accompanying drawings. It is to be noted, however, that the appended drawings illustrate only certain embodiments of this invention and are therefore not to be considered limiting of its scope, for the scope of the invention may admit to other equally effective embodiments.
The drawings are not necessarily to scale. The drawings are generally focused on features illustrative of certain embodiments of the invention. In the drawings, like numerals are used to indicate like parts throughout the various views. Differences between similar parts may cause different numbers to be used to indicate those parts. Dissimilar parts are indicated by different numbers. For a further understanding of the invention, reference may therefore be made to the following detailed description, read in conjunction with the accompanying drawings, in which:
FIG. 1 illustrates a simplified view of an embodiment of a system for transmitting image data scanned from tissue excised from a patient during surgery.
Fig. 2A illustrates a simplified representation of an embodiment of an Image Scanning Station (ISS).
FIG. 2B illustrates a simplified representation of an embodiment of the image scanner of FIG. 2A.
Fig. 2C illustrates various resolutions of image data that may be employed to view the exposed surface of the resected tissue.
Fig. 2D illustrates a simplified representation of an embodiment of an Image Viewing Station (IVS).
FIG. 2E illustrates a simplified representation of an embodiment of a system including a tree-type hierarchy of a set of image access components.
FIG. 2F illustrates the operation of the set of image access components of FIG. 2E.
Fig. 3A illustrates a simplified representation of a user interface window of an Image Stream Viewing Component (ISVC) displaying image data.
Fig. 3B illustrates the transmission of a coordinate ("go to") instruction from a user of an image stream viewing station.
FIG. 3C illustrates an update of the viewing window in response to the coordinate "go to" instruction of FIG. 3B.
Fig. 3D illustrates a horizontally shifted view of the image data viewed inside the viewing window of FIG. 3C.
FIG. 3E illustrates the updated contents of the viewing window in response to the zoom-out (reduction) instruction of FIG. 3D.
FIG. 3F illustrates the updated contents of the viewing window in response to the zoom instruction of FIG. 3E.
FIG. 4 illustrates an expanded overview of the transfer of image data involving multiple healthcare facilities.
Detailed Description
Fig. 1 illustrates a simplified overview of an embodiment of a system for transmitting image data scanned from tissue 162 excised from a patient during surgery. The figure shows a top-down perspective view of the first health care facility 110, with the surgeon 116 performing an operation on the patient 112. The patient 112 is shown lying on an operating table 114. The surgeon 116 is a physician and acts as the first health care professional within the first health care facility.
During surgery, an ex vivo tissue sample 162 is excised (cut) from the body of the patient 112 by the surgeon 116. The ex vivo tissue sample 162 is transported to an Image Scanning Station (ISS) 120 located within the first healthcare facility 110, optically scanned via a confocal scanner 160, to generate a large set of electronically encoded and stored data, which in this case is digitally encoded image data and referred to herein as an image dataset 126.
A second health care professional, a licensed physician practicing as a pathologist, is designated to evaluate the excised ex vivo tissue sample 162 for the presence or absence of an unhealthy condition. The second health care professional is located in a second health care facility, located remotely and many miles from the first health care facility. Such an unhealthy condition may be indicated by the presence of disease within the resected tissue 162, such as, for example, by the presence of a cancerous lesion. The performance of the procedure is suspended until the first health care professional, the surgeon 116, receives a satisfactory assessment of the excised tissue sample 162 from the second health care professional, the pathologist.
After completion of the scan of the ex vivo tissue sample 162, the resulting scanned image data 126 undergoes a processing stage so that the image data 126 may be more efficiently viewed from a remote location by various persons (including, for example, pathologists) having authority to view such data. The ISS 120 includes an input image access component (input IAC) 122 configured to communicate the presence and availability of processed image data stored within an image dataset 126 for access or viewing by one or more persons via Image Viewing Stations (IVSs) 150a-150b. The processed image data is also referred to herein as image data or data.
When processed image data becomes available for access or viewing (an event also referred to herein as the processed image data becoming available), the processed image data may be accessed for applications other than viewing. For example, the processed image data may be stored in temporary and/or permanent storage located remotely from the surgical site, so as to make copies of such data quickly available for parallel access from locations remote from the surgical site, without requiring any further involvement of the Image Scanning Station (ISS) in providing such available and parallel access.
Alternatively, such processed image data may be further processed via hardware and/or software from a location remote from the surgical site. Such further processing may include, for example, employing artificial intelligence to assess the likelihood of the presence of disease or some other type of abnormality within the tissue from which the image data was scanned.
After processing of the image data is complete, input IAC122 communicates the presence and availability of the processed image data to persons having authority to view such data. The Image Viewing Stations (IVSs) 150a-150b are configured to provide access to and viewing of the image data 126 by each of such persons from a remote location via the internet or via another type of network.
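As a concrete illustration of this communication step, the following sketch shows how an input IAC might announce to the Image Access Portal that a processed image dataset has become available. The endpoint, payload fields, and portal address are hypothetical assumptions for illustration only, not the patent's actual protocol.

import requests

PORTAL_URL = "https://image-access-portal.example.com"  # hypothetical address of the IAP hosting output IAC 142

def announce_dataset_available(dataset_id: str, resolutions_um: list) -> None:
    # Notify the portal that the processed dataset can now be accessed and viewed.
    response = requests.post(
        f"{PORTAL_URL}/datasets",
        json={
            "dataset_id": dataset_id,
            "status": "available",
            "resolutions_um_per_pixel": resolutions_um,  # e.g. [0.24, 0.96, 4.8, 24.0]
        },
        timeout=10,
    )
    response.raise_for_status()

# Example: called by the input IAC 122 once the Image Processing Component has finished.
# announce_dataset_available("scan-001", [0.24, 0.96, 4.8, 24.0])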
As illustrated in fig. 2E, input IAC122 interoperates with other members of a set of Image Access Components (IACs) to communicate the presence and availability of image datasets 126 for viewing. The IAC set includes at least one input IAC122, one output IAC, and optionally one or more intermediary IACs. An intermediary IAC acts as an intermediary between two other IACs.
The IACs interoperate with each other to communicate image data within image dataset 126 to viewers of the image data via image viewing station 150 as needed. An Image Viewing Station (IVS) 150 is employed to enable a pathologist (viewer) to view image data from a remote location.
The IVS 150 interoperates with an output IAC142, which output IAC142 executes on (is hosted by) an internet-accessible website referred to herein as an Image Access Portal (IAP) 140. The output IAC142 locates and transmits the image data for remote and rapid review by a pathologist as needed and in response to viewing instructions transmitted to the IVS 150 from the pathologist who is the user of the IVS 150. The path between output IAC142 and the image dataset 126 to be accessed for viewing, including another Image Access Component (IAC) that facilitates such access between output IAC142 and the image dataset 126, is referred to herein as an image access communication path or access communication path.
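The following sketch illustrates, in simplified and hypothetical form, how such an image access communication path can relay a request for a portion of an image dataset from the output IAC, through an optional intermediary IAC, to the input IAC that holds the dataset on its directly attached storage. The class and field names are illustrative assumptions, not the patent's actual software.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TileRequest:
    dataset_id: str         # unique identifier from the dataset metadata
    resolution_level: int   # 0 = full scan resolution; higher values = down-sampled levels
    row: int                # frame (field of view) row within the matrix
    col: int                # frame (field of view) column within the matrix

class ImageAccessComponent:
    # One node in the access communication path: input, intermediary, or output IAC.
    def __init__(self, name, upstream=None, local_datasets=None):
        self.name = name
        self.upstream = upstream                    # next IAC toward the image dataset
        self.local_datasets = local_datasets or {}  # dataset_id -> local storage path

    def fetch_tile(self, request: TileRequest) -> bytes:
        # An input IAC serves the tile from directly attached storage;
        # any other IAC simply forwards the request along the access path.
        if request.dataset_id in self.local_datasets:
            return self._read_tile_from_storage(request)
        if self.upstream is None:
            raise LookupError(f"{self.name}: no access path to {request.dataset_id}")
        return self.upstream.fetch_tile(request)

    def _read_tile_from_storage(self, request: TileRequest) -> bytes:
        raise NotImplementedError  # placeholder for reading ~2 MB of pixel data per field of view

# Example access path: output IAC 142 at the portal -> intermediary IAC 172 -> input IAC 122 at the ISS.
input_iac = ImageAccessComponent("input IAC 122", local_datasets={"scan-001": "/data/scan-001"})
intermediary_iac = ImageAccessComponent("intermediary IAC 172", upstream=input_iac)
output_iac = ImageAccessComponent("output IAC 142", upstream=intermediary_iac)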
The system enables multiple Image Viewing Stations (IVSs) 150a-150b to view image datasets 126 from multiple remote locations; such viewing may occur simultaneously across multiple IVSs 150a-150b, or separately over time during a period of time. Further, one viewing station (IVS) is configured to view multiple image datasets at a time.
In the case where the surgeon 116 receives an unsatisfactory evaluation of the tissue 162 sample from the pathologist, the surgeon 116 may be motivated and/or required to excise a second tissue sample from the patient 112 for rescanning in order to generate a second image dataset of the second tissue sample to achieve a second evaluation from the pathologist for the second tissue sample. Once the evaluation of the excised tissue is satisfactory, the surgeon is generally free to continue and complete the procedure.
Fig. 2A illustrates a simplified representation of an embodiment of an Image Scanning Station (ISS) 120. The ISS 120 is designed to optically scan excised tissue during surgery. In this embodiment, the ISS 120 is preferably a desktop computer. Thus, the desktop computer includes at least one Central Processing Unit (CPU) 212, a system bus 220, physical memory 214, virtual memory 240, and input/output hardware 216, the input/output hardware 216 designed to provide electronic interface(s) to one or more peripheral devices, including an electronic interface to a network 290. The ISS 120, which is a desktop computer, also includes user interface hardware including a User Interface Display Screen (UIDS) 130, a keyboard 132, and a mouse (screen pointer) device 134. The user interface hardware provides a user interface to enable a user of the ISS 120 to interact with and operate the ISS 120.
The image scanning station 120 also includes peripheral devices, including an attached confocal (optical) scanner device 160 and a data storage device 124. In this embodiment, the scanner 160 includes optics that enable optical scanning of the ex vivo tissue 162 at a resolution of a quarter-micron pixel or less. The data storage device 124 is a terabyte-capacity solid-state disk having a read/write data rate equal to 400 megabytes per second. Optionally, and in other embodiments, the ISS 120 may also include other hardware and software not necessary to perform optical scanning of the resected tissue 162.
As to software, in this embodiment, the ISS 120 is installed with Microsoft Windows Operating System (OS) software 228a, and preferably Windows 10 operating system software 222. The Operating System (OS) includes a device driver 228b for interfacing with peripherals attached to the ISS 120. The ISS 120 is installed with software applications including an Image Scanning Component (ISC) 230, an Image Processing Component (IPC) 232, and an input image access component (input IAC) 122. Optionally, the ISS 120 includes at least one web browser program, such as, for example, the Google Chrome browser program.
FIG. 2B illustrates a simplified representation of an embodiment of the image scanner 160 of FIG. 2A. In this embodiment, the scanner 160 is a confocal image scanner 160. The operation of the image scanner 160 will be described in connection with a first usage scenario.
In a first use scenario, a sample of ex vivo tissue 162 is excised (cut) from the surgical patient 112 during surgery. The excised tissue 162 is also referred to herein as the tissue 162. Before the procedure is completed, the tissue 162 is transported to the scanner 160. In this scenario, the scanner 160 is a confocal (optical) scanner 160, and is also referred to herein as the scanner 160. The scanner 160 is typically located within walking distance of the location where the procedure is performed.
Typically, the surgeon 116 cuts tissue 162 from within the patient at a particular location to expose a surface of the tissue for examination. The examination searches for the presence of abnormalities within the tissue 162 that may indicate an unhealthy condition, such as the presence of a certain type of disease, including the presence of a certain type of cancer. The examination is typically performed by a health care professional, such as a pathologist and/or someone with special skills to identify this type of abnormality. This anomaly is evidence of the presence of a particular unhealthy condition visible along the exposed surface of the tissue.
In situations where an unhealthy condition is identified along the surface of the tissue 162, the surgeon 116 is typically required to continue performing the procedure and resect more tissue from the patient to search for tissue having a healthy surface.
When it is determined that the resected tissue has a healthy surface, the resected tissue is also referred to as having a healthy tissue margin, or as having a healthy margin. If the surgeon obtains such a healthy tissue margin, this typically indicates that a sufficient amount of the unhealthy tissue that is the focus of the procedure has been removed from the patient 112, allowing the surgeon 116 to proceed to complete the procedure.
Alternatively, in situations where the surface of the tissue appears healthy, the surgeon 116 is typically free to continue completing the procedure without being required to cut additional tissue samples from the patient.
In this embodiment, the scanner includes a substrate, also referred to herein as a platen (161), which is face up and transparent. Tissue 162 includes one or more surfaces exposed as a result of surgeon 116 cutting tissue from within the patient. One such exposed surface is selected for image scanning and is placed (face down, as shown) on the upper surface of the platen 161 and facing the main body (base) of the scanner 160.
During operation, the scanner 160 is designed to project light of a particular wavelength in an upward direction from the body of the scanner 160, through the platen 161, and toward the exposed surface of the tissue 162. The upwardly projected light causes reflection and/or fluorescence of light from the exposed surface of the tissue 162. The reflected and/or fluoresced light is typically directed back to the body of the scanner 160, where it is detected and stored as image data within the scanner 160.
The scanner 160 is designed to scan the tissue along a two-dimensional plane (also referred to herein as a scan plane). The position of such a scan plane can be adjusted. As shown, the tissue 162 may be scanned along a scan plane defined along an exposed surface of the tissue 162, or along a scan plane positioned a small distance into the body of the tissue 162. This small distance is also referred to herein as the scan depth. The scan depth may be set, for example, to a physical depth of 1 micron below the surface of the tissue 162, or to a depth below the surface equal to a multiple of 5-micron increments.
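As a small illustration of this parameter, the following hypothetical helper (names and logic assumed from the description, not taken from the actual scanner software) checks whether a requested scan depth is one of the values described above, i.e. 1 micron or a multiple of 5-micron increments.

def is_valid_scan_depth_microns(depth_um: float) -> bool:
    # Accept the 1-micron setting or any positive multiple of 5 microns.
    if depth_um == 1.0:
        return True
    return depth_um > 0 and depth_um % 5.0 == 0

assert is_valid_scan_depth_microns(1.0)      # 1 micron below the surface
assert is_valid_scan_depth_microns(15.0)     # 3 x 5-micron increments
assert not is_valid_scan_depth_microns(7.0)  # not an allowed depth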
Image data is stored in data units, which are referred to herein as image data pixels or data pixels. Each data pixel represents a small portion of the image. Each small portion of the image is detected by light reflected and/or fluoresced from the tissue 162. In this embodiment, each data pixel requires two bytes of storage per channel. These two bytes encode the intensity of the gray scale representing this small portion of the image.
Each data pixel may be represented in a number of different ways. These different ways are referred to herein as channels. By default, no dye is applied to the tissue 162, and optical scanning of the tissue 162 collects the projected light reflected (non-fluorescing) from the tissue 162.
Alternatively, one or more fluorescent dyes may be applied to the tissue 162 prior to scanning the tissue 162. In this case, as the tissue 162 is scanned, light of particular (discrete) wavelengths is directed toward the tissue 162 such that the tissue 162 fluoresces (emits light) in response to contact with the light of these discrete wavelengths. The emission is fluorescence within a certain wavelength (color) range. This emitted light provides information that can be employed to colorize the representation of the tissue.
Each perspective from which the image data represents the tissue 162 is referred to herein as a channel. If a portion of the tissue 162 is subjected to a first optical scan without dye, the image data obtained from this first type of optical scan is referred to as one channel of image data, or a first channel of image data. This first channel of image data is also referred to herein as a channel of reflective image data, or as a reflective channel of scanned image data for the tissue 162.
If, instead, the tissue is stained and fluoresced and is subjected to an optical scan while fluorescing, the image data collected from the alternate scan of the tissue is referred to as a fluorescence image data channel, or as one fluorescence channel of image data for the tissue.
If the same tissue is subjected to the first reflective optical scan and the second fluorescent optical scan, the combined image data for the tissue portion is referred to herein as having two channels of image data for that tissue portion. The first channel is a reflective channel and the second channel is a fluorescent channel of the image data. Preferably, the same two channels of image data are obtained when the first reflective optical scan and the second fluorescent scan are performed simultaneously. In the use scenario to be described, the exposed surface of the tissue is square in shape, with a width of 25 mm and a height of 25 mm, for a total area of 25 mm × 25 mm = 625 square mm.
In this usage scenario, the exposed surface of the tissue is optically scanned. For example, the exposed surface may be scanned in a matrix-type pattern, where each element of the matrix (the intersection of a row and a column within the matrix) represents a portion of the area of the exposed surface of the tissue. This type of scanning is also referred to herein as step mapping or step scanning, or as block (mosaic) mapping or block scanning. Each such portion (block) within the matrix is approximately square in shape and is referred to herein as a frame or scan field of view.
A frame is also referred to herein as a field of view (FOV) because it can later be displayed and viewed after being scanned. With this type of scanning, the exposed surface is optically scanned one such block portion (frame) at a time, until all frames along the exposed surface of the tissue have been scanned. Alternatively, another type of scanning, referred to herein as stripe mapping or stripe scanning, optically scans the exposed surface in portions larger than those scanned via the block scan; each such larger portion is typically longer in one dimension and typically has a larger total area than one scan block.
In this embodiment, the scanner 160 may be configured to scan over a range of scan resolutions. The highest scan resolution in this range is referred to herein as the full scan resolution. When block scanning is employed at the highest (full) scan resolution, the exposed surface of the tissue is divided into a matrix comprising at least 100 rows and at least 100 columns, which defines a matrix having at least 10,000 elements (scan fields of view). The language "at least" is employed to account for scan frame overlap, which requires that more frames be scanned than are displayed and viewed, as described further below. Within each frame (scan field of view) there is another matrix, a matrix of image data pixels, which includes at least 1024 rows and 1024 columns of image data pixels.
In this embodiment and usage scenario, the exposed surface of tissue 162 is optically scanned at full (highest) scan resolution while employing block scanning. At full scan resolution, each frame includes a matrix of data pixels. Each data pixel represents an approximately square-shaped tissue region measuring approximately 0.24 microns in width and 0.24 microns in height. These data pixels are nominally referred to herein as quarter-micron data pixels, an approximate linear measure of their size. Each frame represents an approximately square-shaped tissue region measuring approximately 240 microns in width and 240 microns in height.
At full scan resolution, a 25 mm square exposed surface area of tissue requires scan frames arranged into at least 100 rows and 100 columns, with each scan frame slightly overlapping its neighbors. This slight overlap of each scan frame (referred to as frame overlap) better enables software (referred to as image stitching software) to join adjacent scan frames together to form a unified (stitched) image data representation of the entire exposed surface of the tissue.
Thus, this overlap requires more than (100 rows × 100 columns = 10,000) separate scan frames in order to generate the 10,000 displayable and viewable frames that are stitched together, each such frame being referred to as a display field of view or a viewing field of view. Each scan frame (field of view) includes 1024 rows × 1024 columns of data pixels = 1,048,576 (more than one million) data pixels. With two bytes of storage required for each data pixel, each scan frame (field of view) requires slightly more than two megabytes of image data per scanned image data channel.
Thus, scanning this surface area, which is approximately 1 square inch (25 millimeters × 25 millimeters = 625 square millimeters) in size, requires more than 20 gigabytes of image data per channel.
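The figures above can be checked with a short back-of-the-envelope calculation, using the values stated in this embodiment (1024 × 1024 pixels per frame, 2 bytes per pixel per channel, and a 100 × 100 matrix of displayable frames); this is only an illustrative sketch.

PIXELS_PER_FRAME = 1024 * 1024       # 1,048,576 data pixels per field of view
BYTES_PER_PIXEL_PER_CHANNEL = 2      # 16-bit grayscale intensity per channel
DISPLAYED_FRAMES = 100 * 100         # 10,000 stitched, viewable frames

bytes_per_frame = PIXELS_PER_FRAME * BYTES_PER_PIXEL_PER_CHANNEL
bytes_per_channel = bytes_per_frame * DISPLAYED_FRAMES

print(bytes_per_frame / 1e6)    # ~2.1 megabytes per frame per channel
print(bytes_per_channel / 1e9)  # ~21 gigabytes per channel, i.e. "more than 20 gigabytes"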
In many cases, a potential viewer (user) of the image data, such as a health care professional who is expected to evaluate the excised tissue, may not be located within a convenient walking distance of the procedure. Instead, such an expert may be located on another floor of the same building, in another building on the campus of the healthcare facility, or in another healthcare facility, and/or miles from where the procedure is performed.
Software designed to input, process, and output image data for viewing by a viewer (user) of the image data typically requires high data rate access to the image data; otherwise, the software, and the user of the software, will spend too much time waiting for the software to load the large amount of image data (more than 20 gigabytes per channel) that the user desires to view.
Typically, having high data rate access to image data requires that the entire image data be accessible by a computer having a directly attached data storage device. When a storage device is physically attached to a computer and located near the computer without an intervening network, the storage device is referred to herein as being directly attached to the computer.
As shown in FIG. 2A, only the input/output hardware 216 is located between the computer's system bus 220 and the storage device 124; the input/output hardware 216 is local to the system bus 220 of the computer of the ISS 120 and is physically attached to the system bus 220.
Typically, a directly attached storage device provides access to data at or near the maximum speed of the storage device, and typically at a rate of 50 megabytes per second or more. However, if a network or some other type of communication channel is located between the computer's system bus 220 and the storage device 124, the network may span a significant distance from the system bus 220 of the ISS 120, and the data transfer rate to/from the storage device typically drops to a rate well below 50 megabytes per second; in that case, the storage device 124 is not considered herein to be directly attached to the computer of the ISS 120. For example, a local area network may span a distance of half a mile and deliver data at about 10 megabytes per second. Distances in excess of half a mile are referred to herein as wide area network distances.
Potentially, each viewer of the image data may focus their attention on viewing different portions of the image data, and/or view portions of the image data in different orders over time, depending on which characteristics are being searched for, and by whom, within such a large amount of image data. Ideally, the entire image data should be available for interactive and flexible access and viewing by a user (viewer) in a time-efficient manner.
One option to achieve the above objective would be to upload the entire image data to a wide area network, such as, for example, an internet-accessible website. The website would be provided with high data rate access to the image data. With high data rate access to the image data, the internet-accessible website could then be accessed for viewing by users from various geographic locations. However, this method has some problems.
A typical (standard internet) upload speed from a workstation such as an image scanning station to the internet is about 2-3 megabytes per second. At a data rate equal to 2.5 megabytes per second, if a single channel image scan is performed for the 625 square millimeter tissue sample, uploading the single channel image data will take about 8000 seconds, which will equal about 133 minutes, i.e., more than 2 hours, before the healthcare professional can view the image data. Alternatively, if a two-channel image scan is performed for the tissue sample, uploading the two-channel image data to the internet before viewing the image data would take more than 4 hours.
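The upload-time arithmetic above can be reproduced as follows; this is a sketch assuming a 2.5 megabyte-per-second upload rate and roughly 20 gigabytes per channel, as stated in the preceding paragraph.

UPLOAD_RATE_MB_PER_S = 2.5
SINGLE_CHANNEL_MB = 20_000   # "more than 20 gigabytes" per channel, expressed in megabytes

for channels in (1, 2):
    seconds = channels * SINGLE_CHANNEL_MB / UPLOAD_RATE_MB_PER_S
    print(channels, "channel(s):", round(seconds), "seconds =", round(seconds / 60), "minutes")
    # 1 channel: ~8,000 seconds (~133 minutes, over 2 hours); 2 channels: over 4 hours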
During surgery, it is impractical to pause the procedure and wait for a period of one or more hours for a health care professional (such as a pathologist) to examine and evaluate the exposed surface of the excised tissue. While waiting for an evaluation of the excised tissue, the surgical wound of the surgical patient remains open and the progress of the surgery is suspended (delayed) until the surgeon 116 receives an evaluation of the image data from the health care professional.
According to an embodiment, the present invention provides a method for accessing and viewing large amounts of image data from various viewing locations by each of one or more viewers (health care professionals), regardless of whether those viewing locations are close to or remote from the image scanning and surgical locations.
Referring to FIG. 2A, the Image Scanning Station (ISS) 120 includes an Image Scanning Component (ISC) 230, which is software that controls the scanner 160 to perform a scan of the surface of the tissue sample 162 and controls the storage of the resulting image data. The ISC 230 stores the image data into the image dataset 126.
Under the direction of a user (operator) of the Image Scanning Station (ISS), scanning of the exposed surface of the tissue 162 is performed at a scan resolution equal to or exceeding the resolution required by the health care professional to examine the tissue. In some cases, and in this usage scenario, the scan resolution is the full (highest) scan resolution at which this embodiment of the scanning station is designed to scan.
In this usage scenario, the ISC 230 scans at a resolution of one quarter micron per data pixel and generates more than 20 gigabytes of image data for one reflective channel. With currently available scanning techniques, a full scan at quarter-micron data pixel resolution takes about 25 minutes, at half-micron resolution about 6.5 minutes, and at one-micron resolution only about 1.6 minutes. As the tissue 162 is scanned, the scanned image data is stored on the data storage device 124. The image scanning station 120 is attached to a high data rate (50+ megabytes per second) data storage device 124.
In this case, the mass storage device 124 is a solid-state data storage device that is locally accessible and has a very high data rate of about 400 megabytes per second. The time required to scan the exposed surface of the tissue 162 depends on the per-data-pixel resolution at which it is scanned: for example, about 1.6 minutes at one-micron resolution, about 6.5 minutes at half-micron resolution, and about 25 minutes at quarter-micron resolution. With such high-rate data storage, storage of scanned imaging data of size 20 gigabytes requires only about 40 seconds of elapsed time, less than one minute, to complete.
The scanned image data is stored into an image dataset via an operating system 222 executing on the image scanning station 120. The image dataset file 126 (also referred to herein as the dataset file 126 or the dataset 126) resides in at least one file stored in a file system associated with the operating system 222. In general, for a half-micron pixel scan sufficient to display tissue at human cell resolution, scanning of the exposed surface of a 625 square millimeter tissue sample and storage of the resulting image data requires less than 8 minutes of elapsed time.
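The "less than 8 minutes" figure can be sanity-checked with the numbers used in this description; the sketch below assumes roughly 2,500 stitched frames of about 2 megabytes each for one half-micron channel, a 6.5-minute scan, and a directly attached solid-state store writing at about 400 megabytes per second (these inputs are taken or derived from the surrounding text).

scan_minutes = 6.5                  # half-micron scan time quoted above
data_mb = 2500 * 2                  # ~5,000 MB for one half-micron channel
store_minutes = data_mb / 400 / 60  # ~0.2 minutes to write to the local solid-state disk

print(round(scan_minutes + store_minutes, 1))  # ~6.7 minutes, comfortably under 8 minutes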
The data set 126 includes the scanned image data itself, as well as information associated with the image data (referred to herein as metadata). The metadata includes a unique identifier for the data set, a set of parameters associated with the scan, and a date and time at which the scan was performed. Parameters of the scan include, for example, the scan resolution and the physical dimensions of the exposed surface of the tissue.
Optionally, other associated information (such as a description of the surgical environment, the subject location of the tissue, identification of the scanning station and scanner operator, and/or identification of the patient, and/or identification of the surgeon 116) may also be stored as metadata within the data set.
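A hypothetical sketch of such a metadata record is shown below; the field names and values are illustrative assumptions, not the actual schema used by the dataset 126.

dataset_metadata = {
    "dataset_id": "c2f5e7a0-example",       # unique identifier for the dataset
    "scanned_at": "2021-03-01T14:05:00Z",   # date and time at which the scan was performed
    "scan_parameters": {
        "resolution_um_per_pixel": 0.24,    # full scan resolution
        "surface_width_mm": 25,
        "surface_height_mm": 25,
        "channels": ["reflectance"],        # optionally also "fluorescence"
    },
    # Optional associated information:
    "surgical_environment": "operating room 3",
    "tissue_site": "left forearm",
    "scanning_station": "ISS-120",
    "scanner_operator": "operator-01",
    "patient_id": "anonymized-0001",
    "surgeon_id": "surgeon-116",
}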
The 20 gigabytes of image scan data of the exposed surface of the tissue is far more information than can be displayed on one User Interface Display Screen (UIDS) at any one time. However, if the image data is processed to create supplemental image data having one or more resolutions that are each reduced from the original full scan resolution, such that the supplemental image data can represent a greater portion of the exposed surface image of the tissue on the UIDS 310 at one point in time, then the entire exposed surface of the tissue may be displayed within a User Interface Display Window (UIDW) of the UIDS 130 at one time.
Thus, the Image Scanning Station (ISS) 120 also includes an Image Processing Component (IPC) that provides supplemental image data to enhance the dataset in preparation for viewing. The IPC is configured to process the scanned image data to calculate and create supplemental image data that is divided into separate portions of image data having various resolutions. These various resolutions are each different from, and each calculated by reduction from, the full resolution of the original scanned image data generated from the original scan of the tissue 162.
Fig. 2C illustrates various resolutions of image data that may be employed to view an optically scanned image of the excised tissue 162. In the embodiment shown here, there are four different resolutions of image data.
The first (highest) image resolution 288 is the highest image resolution of the image data represented in this embodiment and in this figure. It is the same resolution as that captured by the original scan of the tissue 162, and is represented as a first horizontal line 288 drawn near the lower edge of the figure.
For this image data resolution, each data pixel measures 0.24 microns wide and 0.24 microns high and is referred to herein as a quarter-micron pixel. One million of these quarter micron data pixels are packed into each of 10,000+ fields of view (frames), which may be combined to represent an image of the entire exposed and scanned surface of tissue 162. Each display frame or field of view has a width equal to about 240 microns and a height equal to about 240 microns.
Because the scans of adjacent (frame) fields of view overlap, and because such overlapping frames are stitched together via software to generate a complete image dataset, about 11,100 fields of view (frames) are actually scanned and stitched together to produce about 10,000 viewable (frame) fields of view of stitched image data. With current confocal optical scanning techniques, it will take about 25 minutes to completely scan a 625 square millimeter exposed surface of tissue.
However, when scanning at half-micron data pixel resolution, about 2,800 fields of view (frames) are actually scanned and stitched together to produce about 2,500 fields of view (frames) of stitched image data. With current confocal optical scanning techniques, about 6.5 minutes would be required to completely scan a 625 square millimeter exposed surface of tissue.
And when scanning at one-micron data pixel resolution, about 700 fields of view (frames) are actually scanned and stitched together to produce about 625 fields of view (frames) of stitched image data. With current confocal optical scanning techniques, it will take about 1.6 minutes to completely scan a 625 square millimeter exposed surface of tissue.
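The relationship between scanned frames and displayed frames can be illustrated with a short calculation. The overlap fraction below is an assumption (the description does not state the exact value), chosen at about 5% because it roughly reproduces the frame counts quoted above.

import math

def frames_scanned(display_frames_per_side: int, overlap_fraction: float = 0.05) -> int:
    # With overlapping frames, the stage advances (1 - overlap) of a frame width per step,
    # so slightly more scan positions are needed along each side of the matrix.
    per_side = math.ceil(display_frames_per_side / (1.0 - overlap_fraction))
    return per_side * per_side

print(frames_scanned(100))  # ~11,236 scans for 10,000 displayed frames (quarter-micron)
print(frames_scanned(50))   # ~2,809 scans for 2,500 displayed frames (half-micron)
print(frames_scanned(25))   # ~729 scans for 625 displayed frames (one-micron)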
At quarter-micron image data resolution, about 10 billion quarter-micron data pixels (about 10,000 fields of view × one million data pixels per field of view) are required to represent the stitched-together image of the entire exposed and scanned surface of tissue 162, thus requiring more than 20 gigabytes of image data (2 bytes of image data per data pixel) to represent an image of the exposed and scanned surface of the tissue, which is 625 square millimeters (approximately one square inch) in area.
It is noted that user interface screens typically comprise 2-3 million display pixels, which is far fewer than the 10 billion quarter-micron data pixels required to view an entire image dataset at one time.
The second resolution 286, which is algorithmically calculated from the higher resolution 288 of the image data, is represented as a second horizontal line drawn above the first resolution 288. For this image data resolution, each data pixel measures about 0.96 microns wide and about 0.96 microns high, and is referred to herein as a one-micron pixel. One million of these one-micron data pixels are packed into each of (25 × 25 = 625) fields of view, each field of view having a width equal to about 960 microns (about 1 millimeter) and a height equal to about 960 microns (about 1 millimeter).
At this image data resolution, 625 million one-micron data pixels are required to represent the entire exposed surface of the tissue sample; thus, 1.25 gigabytes of image data (2 bytes per data pixel) are required to represent an image of the 625 square millimeter (approximately one square inch) exposed surface of the tissue.
A third resolution 284 of the image data is represented as a third horizontal line 284 drawn above the second horizontal line. Preferably, this third resolution 284 of the image data is algorithmically calculated from the next-higher resolution 286 of the image data. Alternatively, in other embodiments, this third resolution 284 of image data may be calculated from the image data of the higher resolution 288.
At this image data resolution, each data pixel represents a region of tissue measuring about 4.8 microns wide and about 4.8 microns high, and is referred to herein as a five (5) micron data pixel. One million of these five-micron data pixels are packed into each of 25 UIDS fields of view, that is, the fields of view within a matrix of five (5) rows by five (5) columns comprising (25) fields of view. Each field of view within the matrix represents a tissue region having a physical width equal to about 4800 micrometers (about 5 millimeters) and a height equal to about 4800 micrometers (about 5 millimeters).
At this image data resolution, 25 million five-micron data pixels are required to represent the entire exposed surface of the tissue sample; thus, 50 megabytes of image data (2 bytes of image data per data pixel) are required to represent an image of the 625 square millimeter (approximately one square inch) exposed surface of the tissue.
Preferably, the fourth resolution 282 of the image data is algorithmically calculated from the next-higher resolution 284 of the image data and is represented as a fourth horizontal line, which is shown plotted above the third horizontal line 284. Alternatively, in other embodiments, the fourth resolution 282 of image data may be calculated from image data of the higher resolution 286 or the higher resolution 288.
At this image data resolution, each data pixel measures about 24 microns wide and about 24 microns high, and is referred to herein as a twenty-five (25) micron data pixel. One million of these twenty-five-micron data pixels are packed into only one UIDS field of view, which is a viewing (viewable) field within a matrix having only one row and only one column. This UIDS viewing field represents the area of tissue that is the entire exposed surface of the excised tissue, i.e., tissue having a width equal to about 25,000 microns (25 millimeters) and a height equal to about 25,000 microns (25 millimeters).
At this image data resolution, 1 million twenty-five-micron data pixels are required to represent the entire exposed surface of the tissue sample; thus, 2 megabytes of image data (2 bytes of image data per data pixel) are required to represent an image of the 625 square millimeter (approximately one square inch) exposed surface of the tissue.
The algorithmic computation for each computed image resolution is performed via a down-resolution sampling process, also referred to herein as "down-sampling," in which information from multiple higher-resolution pixels is incorporated into and represented by one representative data pixel at the lower resolution.
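A minimal sketch of this down-sampling step is shown below. Block averaging is used here as one common way of merging multiple higher-resolution pixels into a single representative pixel; the description does not specify the exact reduction algorithm, so the method and the reduction factors are assumptions chosen to match the pixel sizes of FIG. 2C.

import numpy as np

def downsample(level: np.ndarray, factor: int) -> np.ndarray:
    # Reduce a 2-D grayscale image by an integer factor via block averaging.
    h, w = level.shape
    h, w = h - h % factor, w - w % factor               # trim to a multiple of the factor
    blocks = level[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3)).astype(level.dtype)

# Build the cascaded set of resolutions of FIG. 2C from a stand-in full-resolution tile:
# 0.24 um/pixel -> ~0.96 um -> ~4.8 um -> ~24 um per pixel.
full_res = np.random.randint(0, 65535, size=(4096, 4096), dtype=np.uint16)
pyramid = [full_res]
for factor in (4, 5, 5):
    pyramid.append(downsample(pyramid[-1], factor))
print([level.shape for level in pyramid])  # (4096, 4096), (1024, 1024), (204, 204), (40, 40)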
The cascaded set of resolutions provides additional options for the viewer to view the image data. These additional options enable a viewer of the image data to more flexibly select a viewing path through the image data to efficiently perform inspection and evaluation of the image data. This feature of the present invention is further described in fig. 3A-3F.
Fig. 2D illustrates a simplified representation of an embodiment of an Image Viewing Station (IVS) 150. In this embodiment, the Image Viewing Station (IVS) 150 includes many of the same types of hardware as the Image Scanning Station (ISS) 190. The IVS 150 has hardware including a keyboard, a mouse (screen pointer) device and a User Interface Display Screen (UIDS) 292. The UIDS 292 is configured to display at least one User Interface Display Window (UIDW).
As to software, the IVS 150 is installed with Microsoft Windows Operating System (OS) software, and preferably Windows 10 operating system software. The Operating System (OS) includes device drivers for interfacing with hardware peripherals attached to the IVS 150. The IVS 150 is installed with a software application, including an Image Viewing Component (IVC) 152, and includes at least one web browser program, preferably a Google Chrome browser program.
Unlike the image scanning station 190, the Image Viewing Station (IVS) 150 does not necessarily have an attached scanner device, and does not necessarily have an attached mass and/or high data rate mass storage device. However, the IVS 150 may have some or all of such additional hardware and, furthermore, may be designed to function as both an image scanning station 190 and an image viewing station 150.
In this embodiment, a web browser program executes in virtual memory of the IVS 150 and interoperates with an operating system installed on the IVS 150. The user of the IVS 150 interacts with a web browser to gain online access to an internet web server, referred to as an Image Access Portal (IAP) 140, which hosts an output image access component (output IAC) 142.
After the IVS 150 gains online access to the output IAC142, the output IAC142 lists one or more image datasets 126 that are currently available for viewing by the user of the IVS 150. These one or more image data sets 126 may each be located at separate locations that are not close to each other. When the user selects a data set for viewing, an Image Viewing Component (IVC) 152 is downloaded as software from the output IAC142 into a web browser program executing on the IVS 150. The IVC 152 is designed to provide an interactive user interface for viewing image data, wherein a user of the IVS 150 transmits viewing instructions to the IVC 152, and in response, the IVC 152 receives and responds to those viewing instructions, typically via communication with the output IAC 142. Further description of the IVC 152 is shown in fig. 3A-3F.
Preferably, and in this embodiment, the IVC 152 comprises a modified version of the publicly available OpenSeadragon (OSD) software. The OSD software is implemented as open source code and is written in JavaScript and HTML. The OSD software is not compiled into executable machine instructions prior to runtime, but is instead interpreted during runtime.
The publicly accessible and unmodified version of the OSD software is designed to access image data stored in a local file system associated with the operating system on which the OSD software executes. Such an arrangement typically results in higher data rate access to the image data than access to such image data over a local or wide area network. Having such high data rate access to image data enables the OSD software to respond more quickly to a user of the OSD software as the user interacts with the OSD software.
However, in accordance with the design of the present invention, and in accordance with the environment in which the present invention is designed to operate, there is no single image data set stored at a single, fixed location that is accessed and viewed over time. Instead, a plurality of individual image data sets 126 are each physically stored at a variety of different and remote locations relative to the location of the computer on which the OSD software is executed. Via embodiments of the present invention, each image dataset is typically created during a procedure and becomes available for access and viewing, typically for a limited period of time and typically for the duration of the procedure.
After the procedure is completed, the image dataset may be transferred to an alternative data storage device that is physically separate and remotely located from the data storage device 124, which data storage device 124 initially stores the image dataset and is directly accessible to the Image Scanning Station (ISS) 120. Such an alternative data storage device may serve as a long-term data storage for the post-operative image data set. Such alternative data storage devices would not necessarily need to be accessible to the Image Scanning Station (ISS) 120 that originally stored the image data set.
However, such long term data storage may be made accessible from an Image Transfer Station (ITS) 170 via a local area network or via more direct and higher speed data access. The ITS 170 would include hardware similar to that of an Image Scanning Station (ISS) 120, except that it would not need to be attached to image scanning hardware and would not need to execute image scanning or processing software 232. The ITS 170 would execute the intermediary image access component software 172 and would not necessarily require local and/or high speed access to a mass storage device. Alternatively, the ITS 170 may access a large amount of image data via a local area network. Optionally, the ITS 170 may possess input Image Access Component (IAC) functionality within the intermediary IAC 172 to detect access availability and unavailability of stored image data over time.
It is noted that an alternative image access communication path may be established between the output IAC and an alternative data storage device. In some cases, an intermediary IAC 172 executing on an Image Transfer Station (ITS) 170 is configured to directly access an alternate data storage device and is configured to operate within a set of IAC's configured to form the alternate image access communication path.
Due to data storage limitations of the Image Scanning Station (ISS) 120, each initially stored image data set is eventually removed from the data storage device 124 of the ISS 120. If the originally stored image data set is removed, the original image access communication path between the output IAC and the originally stored image data is terminated. Further, the system may be configured such that the original image access communication path is terminated with or without removal of the originally stored image data from the data storage device 124 of the ISS 120.
During normal operation, each image dataset suddenly becomes available for viewing at a first location and at a first time (typically during a surgical procedure), and later suddenly becomes unavailable for viewing, at least at the first location, or at any location. The events in which the image data set becomes available, then becomes unavailable, and may become available again when stored at a second location, typically occur in a non-scheduled manner. Thus, the location and/or timing of such events is not typically predetermined.
It is noted that the image access communication path is referred to herein as being activated when the image data set becomes available. When the image data set becomes unavailable, the image access communication path is referred to herein as terminated.
In a typical case, the output IAC 142 is located remotely from where the image data 126 is stored, and therefore the output IAC 142 does not have high data rate access to the stored image data. To access remotely located image data, wherever it may reside, the output IAC 142 communicates with other members of the image access component set to locate and access the image data 126. The set of image access components is linked into a network of interoperating image access components, where each image access component has a parent and/or child relationship with at least one other image access component within the set of image access components. How image data is relayed between IACs is described in more detail in connection with fig. 2E.
According to some embodiments of the present invention, the OpenSeadragon (OSD) software is modified to access image data that is not located within a file system that is local (directly accessible) to the computer on which the OSD software is executing and that is not local (directly accessible) to the computer from which the OSD software has been downloaded. Instead, the OSD software is modified to access image data by employing a set of Image Access Components (IACs) including the output IAC 142 and other types of IACs.
The OSD software employs the set of IACs via communication with the output IAC 142, which output IAC 142 is accessible to the OSD software via a wide area network, such as the internet. Preferably, the output IAC 142 is executed on a computer, referred to herein as an Image Access Portal (IAP) 140, which is designed to be located remotely from where the image data may be stored, and remotely from where the IVS may be located.
According to an embodiment of the invention, the OSD software is modified, downloaded from the output IAC 142 executing on the Image Access Portal (IAP) 140, and embedded in an Image Viewing Component (IVC) 152. The IVC 152 is a software collection comprising one or more processes. The OSD software is modified so that it does not access data from a file system local to the computer on which the OSD software is executing; instead, the OSD software is designed to issue an HTTP GET request function call to a URL of the output IAC 142 executing on the IAP 140 in order to access various sets (portions) of image data stored on one or more computers typically residing at locations remote from the IVC 152.
The HTTP GET request is implemented as a function call that enables the IVC 152, and the downloaded, modified OSD software embedded in the IVC 152, to access image data via the output IAC 142, while the output IAC 142 interoperates with the set of Image Access Components (IACs) to access various sets (portions) of image data from whatever location such image data may reside at that particular time. In other words, the image data to be accessed is subject to being moved and/or copied to a new location over time, particularly after the procedure that originally produced the image data is completed.
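By way of illustration only, and not as the actual interface of the disclosed system, the following TypeScript sketch shows a browser-side, OpenSeadragon-style custom tile source whose tile requests are HTTP GET calls to the output IAC. The base URL, query parameter names, and nominal dimensions are assumptions.

```typescript
// Sketch of a browser-side tile source that fetches image data over HTTP GET from
// the output image access component (output IAC). The URL pattern and parameters
// (dataset, level, x, y) are illustrative assumptions; a getTileUrl()-style source
// is shown because the embodiment described above modifies OpenSeadragon.
const OUTPUT_IAC_BASE = "https://iap.example.org/iac";  // hypothetical IAP address

interface TileSourceSketch {
  width: number;        // nominal full-resolution width in data pixels (~25 mm at ~0.25 micron)
  height: number;
  tileSize: number;
  minLevel: number;     // 0 = 25-micron "macro" resolution
  maxLevel: number;     // 3 = quarter-micron original scan resolution
  getTileUrl(level: number, x: number, y: number): string;
}

function makeTileSource(datasetId: string): TileSourceSketch {
  return {
    width: 100_000,
    height: 100_000,
    tileSize: 1000,
    minLevel: 0,
    maxLevel: 3,
    // Instead of reading from a local file system, every tile becomes an HTTP GET
    // against the output IAC, which relays the request down the IAC chain.
    getTileUrl(level: number, x: number, y: number): string {
      return `${OUTPUT_IAC_BASE}/tiles?dataset=${encodeURIComponent(datasetId)}` +
             `&level=${level}&x=${x}&y=${y}`;
    },
  };
}

// Example: fetch one macro-view tile directly, outside of any viewer library.
async function fetchTile(datasetId: string, level: number, x: number, y: number): Promise<ArrayBuffer> {
  const url = makeTileSource(datasetId).getTileUrl(level, x, y);
  const response = await fetch(url);    // HTTP GET request to the output IAC
  if (!response.ok) throw new Error(`tile request failed: ${response.status}`);
  return response.arrayBuffer();
}
```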
In some embodiments, the output IAC 142 generates a separate Web Tile Server (WTS) process. Like any other IAC software, the output IAC 142 software is implemented as one or more processes and/or collections of parts of processes that execute on a computer, and preferably within the operating system environment on that computer. The WTS is a process designed to communicate a request for an image data set (portion) to a first other IAC, which has previously registered with the output IAC 142 as a child IAC of the output IAC 142, and which has previously registered with the output IAC 142 as having access to the requested image data set (portion).
The WTS process software serves as part of the output IAC 142 software. The aforementioned "first other IAC" is referred to herein as a child IAC of the output IAC 142. The child IAC may be configured as an intermediary IAC 172 or an input IAC 122 within the chain of IACs. The IAC chain includes at least one output IAC 142 at a first end of the IAC chain, and includes an input IAC 122 at a second end of the IAC chain. This "first other IAC", which is a child IAC of the output IAC 142, is referred to herein as the "first child" IAC of the IAC chain.
If the first child IAC is not an input IAC 122 and it does not have direct access to the requested image data, it is designed to retransmit the request for the image data set (portion) to a second other IAC that has previously been registered as a child IAC of the first child IAC and that has previously been registered as having access to the requested image data set (portion). This second other IAC is referred to herein as the "second child" IAC.
A request for an image data set (portion) is communicated (relayed) along the IAC chain until the request is received by the input IAC 122, which input IAC 122 has previously registered as having access to the requested image data set (portion) and as a child IAC of another IAC in the IAC chain between the output IAC 142 and the input IAC 122. In other words, the particular IAC chain, which may form a subset of a larger network of IACs, has been pre-configured via prior registration to provide access to the particular image dataset requested.
Upon receiving a request for an image data set (portion), the input IAC 122 transmits the requested image data, in tiles of image data, back through the IAC chain to the output IAC 142, which output IAC 142 delivers the requested image data to the OSD software operating within the Image Viewing Component (IVC) 152.
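The relay behavior just described can be summarized with a short TypeScript sketch; the class and method names below are assumptions and are not part of the disclosed implementation.

```typescript
// Illustrative model of the IAC chain described above. Each IAC either has direct
// access to the requested image data or relays the request to the child IAC that
// previously registered access to that data set; the tile is returned back up the chain.
type Tile = Uint8Array;

interface TileRequest {
  datasetId: string;
  level: number;   // which of the pre-computed resolutions
  x: number;       // tile column
  y: number;       // tile row
}

abstract class ImageAccessComponent {
  // datasetId -> child IAC that registered access to that data set
  protected registeredChildren = new Map<string, ImageAccessComponent>();

  registerChild(datasetId: string, child: ImageAccessComponent): void {
    this.registeredChildren.set(datasetId, child);
  }

  // Relay the request toward the input IAC; the tile propagates back up the chain.
  async requestTile(req: TileRequest): Promise<Tile> {
    const child = this.registeredChildren.get(req.datasetId);
    if (!child) throw new Error(`no access path registered for ${req.datasetId}`);
    return child.requestTile(req);
  }
}

class OutputIAC extends ImageAccessComponent {}        // first end of the chain
class IntermediaryIAC extends ImageAccessComponent {}  // zero or more in the middle

class InputIAC extends ImageAccessComponent {
  constructor(private readStorage: (req: TileRequest) => Promise<Tile>) { super(); }
  // The input IAC has direct access: it reads from the ISS mass storage device.
  override async requestTile(req: TileRequest): Promise<Tile> {
    return this.readStorage(req);
  }
}

// Chain: output IAC -> intermediary IAC -> input IAC (direct access to storage).
// const input = new InputIAC(async () => new Uint8Array(2_000_000));
// const mid = new IntermediaryIAC(); mid.registerChild("ds-126a", input);
// const out = new OutputIAC();       out.registerChild("ds-126a", mid);
// const tile = await out.requestTile({ datasetId: "ds-126a", level: 0, x: 0, y: 0 });
```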
Referring back to the unmodified version of the OSD software, it is designed to be downloaded from the OSD web server to the web browser, and is further designed to access image data via a function call to access a file from within a file system linked to the computer (local to the computer) on which the OSD web server software executes. In other words, unlike embodiments of the present invention, the file system is linked to an executing instance of the operating system on which the unmodified version of the OSD web server software is executing.
In contrast, embodiments of the present invention employ a modified version of the OSD software that is designed to access files from a file system linked to the computer (local to that computer) on which a particular IAC within the chain of IACs executes. This particular IAC, unlike the other IACs within the IAC chain, has direct access to the requested image data.
An IAC that is capable of (configured to) accessing (reaching) requested image data without communicating through another IAC is referred to herein as an IAC that has direct access to the requested image data. An IAC that needs to communicate through another IAC to access requested image data is referred to herein as having indirect access to the image data. In some cases, the intermediary IAC 172, rather than the input IAC 122, is the particular IAC within the IAC chain that has direct access to the requested image data.
FIG. 2E illustrates a simplified representation of an embodiment of a system including a tree type hierarchy among image access components. The set of image access components are collectively configured to provide remote access to the image dataset from the Image Scanning Station (ISS) 120a to one or more Image Viewing Stations (IVSs) 150a-150 c.
As shown, a plurality of Image Scanning Stations (ISSs) 120a-120n are located within a second healthcare facility 210. Each ISS 120a-120n includes a locally accessible mass storage device (not shown here) on which scanned image data is stored. The image data obtained from each scan of excised tissue, along with associated supplemental (calculated) image data and metadata, is stored within an image dataset file 126a on the mass storage device 124a, which mass storage device 124a is directly attached to the image scanning station 120a.
The input Image Access Components (IACs) 122a-122n each reside and execute within an Image Scanning Station (ISS) 120a-120n, respectively, within the healthcare facility 210. Each input IAC 122a-122n is configured to periodically detect the presence of one or more data sets created and/or currently stored within the mass storage device 124 of its Image Scanning Station (ISS) 120a-120n. In essence, each input IAC 122a-122n reports the existence of any image dataset file that is accessible to that IAC 122a-122n and appears available for viewing. When available for viewing, these image dataset files include an amount of scanned image data, associated supplemental image data, and metadata information that is determined by the input IAC 122 to be complete and ready for viewing.
Each input IAC 122a-122n is configured to operate as a child-only Image Access Component (IAC) within the linked hierarchy of image access components. Each input IAC 122a-122n is configured to be linked to a parent IAC, which in this case is an intermediary IAC 172, the intermediary IAC 172 being shown as residing on and executing on a separate computer, also referred to herein as an Image Transfer Station (ITS) 170, within the same healthcare facility 210.
The intermediary IAC 172 is configured to operate as both a child IAC and a parent IAC. The intermediary IAC 172 is configured to operate as a parent for each of the plurality of input IACs 122a-122n within the healthcare facility 210, and is also configured to operate as a child IAC associated with an output IAC 142, which output IAC 142 resides on and executes on a computer, referred to herein as an Image Access Portal (IAP) 140, which Image Access Portal (IAP) 140 is internet network accessible and located outside of the healthcare facility 210.
Image Viewing Stations (IVS) 150a-150b, which are internet network accessible computers such as, for example, desktop computers, are located within the second healthcare facility 210. Likewise, one or more other image viewing stations (including, for example, the IVS 150x) are located outside the second healthcare facility 210, and instead are located in a third healthcare facility 280, the third healthcare facility 280 being located 10 miles from the second healthcare facility 210. All of these image viewing stations (IVS) 150a-150x can access image data from one or more image scanning stations via online internet access to the output IAC 142 of an Image Access Portal (IAP) 140.
In one usage scenario, a user of the system (such as, for example, a pathologist) receives a notification that image data is available for viewing, for example, via a communication such as a mobile phone text message and/or via an email communication. The communication includes the website address and the unique identifier of the image data set 126a that has just recently become available. This method of communication is referred to herein as a notification communication path. The image dataset 126a includes scanned and associated supplemental image data and includes metadata information associated with the image dataset, such as, for example, the name of the healthcare facility, the type of procedure, and the name/identifier of the surgeon 116 and the patient.
In response, the user accesses the internet website of the Image Access Portal (IAP) 140 associated with the output Image Access Component (IAC) 142 online, logs in to the website 140 by authenticating him or her via the transfer of at least a username and password.
In response, the output IAC 142 displays a list of accessible image data sets that have been previously communicated from other IACs that are linked as children of the output IAC 142. The output IAC 142 searches the list and displays a list of one or more image datasets, together with their unique identifiers, to the user.
In response to displaying the list of image data sets, the user selects for viewing the image data set 126a associated with the unique identifier of the image data set 126a, which image data set 126a is listed within the mobile phone text message and is also listed on the IAP 140 after a successful login to the IAP 140.
In response to a user selection, the output IAC 142 communicates Image Viewing Component (IVC) software to a browser program executing on the Image Viewing Station (IVS) 150x. In this embodiment, the IVC 152x is implemented in JavaScript downloaded from the IAP website.
Likewise, in response to the user's selection, the output IAC 142 transmits a request to receive a first portion of the content of the selected image data set 126a (referred to as the image data set of interest) to a child IAC, which in this case is the intermediary IAC 172, which intermediary IAC 172 has previously transmitted to the output IAC 142 the availability of access to the image data set of interest 126a.
The intermediary IAC 172 searches its list of accessible (available) image data sets, whose identifications have been previously communicated from other IACs that are linked as direct or indirect children of the intermediary IAC 172. Upon finding the image dataset of interest 126a selected by the user, the intermediary IAC 172 communicates a request for the first portion of the content of the image dataset of interest 126a to a child IAC, which in this case is the input IAC 122a, which input IAC 122a has previously reported the availability of access to the image dataset of interest 126a.
In response to receiving the request from the intermediary IAC 172, the input IAC 122a transmits the first portion of the contents of the image data set of interest 126a to the intermediary IAC 172. Next, the intermediary IAC 172 relays (transmits) the first portion of the content of the image dataset of interest 126a to the output IAC 142. Next, the output IAC 142 relays (transmits) the first portion of the content of the image dataset of interest 126a to the Image Viewing Component (IVC) 152x of the Image Viewing Station (IVS) 150x.
In this usage scenario, the input IAC 122a has had access to the image dataset of interest since its creation. Upon completion of the scanning, processing, and storing of the image dataset of interest 126a in its locally accessible mass storage device 124a, the input IAC 122a previously reported the presence and accessibility of the image dataset 126a to the intermediary IAC 172. In response to receiving a communication regarding the availability of the image dataset 126a, the intermediary IAC 172 relayed the communication regarding the image dataset of interest 126a to its parent IAC, i.e., the output IAC 142.
At some time in the future, the image dataset of interest will be removed from the mass storage device of the image scanning station on which it was created. Such an image dataset removal event will be reported by the input IAC 122a to the intermediary IAC 172, and will be relayed by the intermediary IAC 172 and reported to the output IAC 142. Upon receiving the report of the removal event, each of the intermediary IAC 172 and the output IAC 142 removes information about the removed (unavailable) image dataset 126a from its list of accessible (available) image datasets 126.
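By way of illustration only, the following TypeScript sketch models how availability and removal reports might be relayed up the IAC hierarchy of Fig. 2E; all names are assumptions and are not part of the disclosed implementation.

```typescript
// Sketch of availability/removal event propagation: a child reports a data set to its
// parent, each IAC keeps a list of accessible data sets, and a removal event prunes
// that list all the way to the output IAC. Names are illustrative assumptions.
interface DatasetListing {
  datasetId: string;       // unique identifier sent in the notification
  reportedBy: string;      // which child IAC (or storage device) can reach the data
}

class IacNode {
  private listings = new Map<string, DatasetListing>();
  constructor(public readonly name: string, private parent?: IacNode) {}

  // Called by the input IAC when a complete image data set is detected on the
  // locally accessible mass storage device, or by a child relaying such a report.
  reportAvailable(datasetId: string, reportedBy: string): void {
    this.listings.set(datasetId, { datasetId, reportedBy });
    this.parent?.reportAvailable(datasetId, this.name);  // relay toward the output IAC
  }

  // Called when the data set is removed from the ISS data storage device.
  reportRemoved(datasetId: string): void {
    this.listings.delete(datasetId);
    this.parent?.reportRemoved(datasetId);               // relay toward the output IAC
  }

  listAccessible(): string[] {
    return [...this.listings.keys()];                    // shown to the IVS user on login
  }
}

// outputIAC (142) <- intermediaryIAC (172) <- inputIAC (122a)
const outputIAC = new IacNode("output-142");
const intermediaryIAC = new IacNode("intermediary-172", outputIAC);
const inputIAC = new IacNode("input-122a", intermediaryIAC);

inputIAC.reportAvailable("dataset-126a", "storage-124a");  // scan completed and stored
// outputIAC.listAccessible() now includes "dataset-126a"
inputIAC.reportRemoved("dataset-126a");                    // later removed from ISS storage
// outputIAC.listAccessible() no longer includes it
```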
FIG. 2F further illustrates the operation of the set of image access components shown in FIG. 2E. Collectively, these access components are configured to provide remote and quick access to the image dataset from the Image Scanning Station (ISS) 120a to the Image Viewing Station (IVS) 150 x.
When a user of the image viewing component enters a viewing instruction, the IVC 152x optionally transmits, via communication 250, an image access request to the output IAC 142, the image access request 250 comprising a set of image access parameters. The image access parameters define the image data via location and resolution attributes of the image data within the entire repository of image data within a particular image dataset of interest. The image access parameters typically identify image data that is stored as a portion of the entire image data stored within the particular image data set of interest.
In response to receiving the image access request 250, the output IAC 142 checks whether it has cached some or all of the image data that satisfies the image access request 250. If the cached image data cannot satisfy the entire image access request 250, the output IAC 142 transmits an image access request 252 to a child IAC, which is the intermediary IAC 172, which intermediary IAC 172 has previously reported to the output IAC 142 the availability of access to the image data set of interest 126a.
The image access request 252 includes an image access parameter that defines the remainder, i.e., the difference between the image data defined by the image access request 250 and the image data currently cached within the output IAC 142.
Alternatively, if the output IAC 142 can fully satisfy the image access request, the output IAC 142 responds by transmitting an image transfer transaction 264 to the Image Viewing Component (IVC) 152x, the image transfer transaction 264 including image data that fully satisfies the image access request 250.
According to the present invention, when a child IAC reports to its parent IAC 142 the availability of access to a particular image data set of interest, such a reporting action also indicates to the parent IAC 142 that the child IAC 172 has access to, and can deliver image data from, the image data set of interest in response to receiving an access request from its parent IAC 142.
In response to receiving the image access request 252, the intermediary IAC 172 checks whether it is currently caching image data that satisfies some or all of the image access request 252.
If the currently cached image data cannot satisfy the entire image access request 252, the intermediary IAC 172 passes an image access request 254 to its child IAC, i.e., the input IAC 122a, which input IAC 122a has previously reported to the intermediary IAC 172 the availability of access to the image data set of interest 126a.
The image access request 254 includes an image access parameter that defines the remainder, i.e., the difference between the image data defined by the image access request 252 and the image data currently cached within the intermediary IAC 172.
Alternatively, if the intermediary IAC 172 can fully satisfy the image access request, the intermediary IAC 172 responds by transmitting an image transfer transaction 262 to the output IAC 142, the image transfer transaction 262 including image data that fully satisfies the image access request 252.
In response to receiving the image access request 254, the input IAC 122a checks whether it is currently caching image data that fully satisfies the image access request.
If the currently cached image data cannot satisfy the entire image access request 254, then the input IAC 122a retrieves, from the image dataset file 126a stored on the mass storage device 124a and accessible to the input IAC 122a, image data sufficient to fully satisfy the image access request 254. The input IAC 122a then responds by transmitting an image transfer transaction 260 to the intermediary IAC 172, the image transfer transaction 260 comprising image data that fully satisfies the image access request 254.
In response to receiving the image transfer transaction 260 from the input IAC 122a, the intermediary IAC 172 transmits to the output IAC 142 an image transfer transaction 262, which image transfer transaction 262 includes image data that fully satisfies the image access request 252.
In response to receiving the image transfer transaction 262, the output IAC 142 communicates an image transfer transaction 264 to the Image Viewing Component (IVC) 152x, the image transfer transaction 264 including image data that fully satisfies the image access request 250.
In response to receiving the image transfer transaction 264, the Image Viewing Component (IVC) 152x executes a viewing instruction that was received from the user and that initially caused the original image access request 250 to be communicated from the IVC 152x to the output IAC 142.
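By way of illustration only, the following TypeScript sketch models the cache-and-remainder behavior of Fig. 2F, in which each IAC satisfies what it can from its own cache and forwards only the remainder to its child IAC. Representing a request as a set of tile keys is a simplification, and all names are assumptions.

```typescript
// Sketch of the caching behaviour of Fig. 2F: each IAC serves cached tiles and
// forwards only the remainder of the request (requests 252/254) to its child IAC;
// the input IAC reads any remaining tiles directly from mass storage.
type TileKey = string;                       // e.g. "level:row:col"
type TileData = Uint8Array;

class CachingIac {
  private cache = new Map<TileKey, TileData>();
  constructor(private child?: CachingIac,
              private storage?: (key: TileKey) => TileData) {}

  async access(request: TileKey[]): Promise<Map<TileKey, TileData>> {
    const result = new Map<TileKey, TileData>();
    const remainder: TileKey[] = [];
    for (const key of request) {
      const cached = this.cache.get(key);
      if (cached) result.set(key, cached);   // satisfied locally from the cache
      else remainder.push(key);              // must come from the child IAC or storage
    }
    if (remainder.length > 0) {
      if (this.child) {
        // Forward only the remainder toward the input IAC
        const fetched = await this.child.access(remainder);
        fetched.forEach((data, key) => { this.cache.set(key, data); result.set(key, data); });
      } else if (this.storage) {
        // Input IAC: read directly from the mass storage device 124a
        for (const key of remainder) {
          const data = this.storage(key);
          this.cache.set(key, data);
          result.set(key, data);
        }
      }
    }
    return result;                           // image transfer transactions 260/262/264
  }
}

// const input = new CachingIac(undefined, () => new Uint8Array(2_000_000));
// const intermediary = new CachingIac(input);
// const output = new CachingIac(intermediary);
// await output.access(["0:0:0"]);   // first access fills caches along the chain
// await output.access(["0:0:0"]);   // second access is served from the output IAC cache
```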
It is noted that in the tree hierarchy of FIGS. 2E and 2F described above, the output IAC 142 communicates indirectly with the input IAC 122a via the intermediary IAC 172. Alternatively, if no intermediary IAC 172 is provided between the output IAC 142 and the input IAC 122a, the output IAC 142 communicates directly with the input IAC 122a.
In accordance with the present invention, the hierarchy of image access components need not include an intermediary IAC 172. Further, with respect to embodiments employing an intermediary IAC 172, the present invention does not limit such embodiments to only one intermediary IAC disposed between the output IAC 142 and any one of the input IACs. In some embodiments, a series of multiple (more than one) intermediary IACs may be provided between an output IAC and any one of the input IACs.
For example, with respect to an IAC tree hierarchy having 10 input IACs, each of the input IACs may be directly linked or indirectly linked to an output IAC 142, the output IAC 142 serving as the root node of the tree hierarchy. The series of IACs along the path between the output IAC and each input IAC is variable and may include zero, one, or multiple intermediary IACs 172. In other words, the number of intermediary IACs 172 residing between an output IAC 142 and any one particular input IAC 122 is not required to be identical to the number residing between that output IAC and any other particular input IAC.
It is also noted that if a set of Image Access Components (IACs) includes an output IAC and, optionally, one or more intermediary IACs, each with one and only one child IAC, then the arrangement is referred to herein as a chain-type hierarchy of IACs and will include one and only one input IAC 122. And if the set of IACs includes one and only one input IAC 122 and no intermediary IAC 172, the hierarchy is referred to as a chain-type hierarchy having only two nodes, which is the smallest and simplest type of hierarchy for a set of IACs.
Fig. 3A illustrates a simplified representation of image data displayed within a User Interface Display Window (UIDW) 310 of the Image Viewing Station (IVS) 150. As shown in this embodiment, the user interface display window 310 is square in shape and displays image data representing the exposed surface of an ex vivo tissue sample that was recently excised (cut) from the patient during the procedure and then optically scanned to create the image data shown here.
In this example use scenario, performance of the surgery on the surgical patient is currently suspended while the patient is waiting on the operating room table, and while the surgeon 116 waits for another health care professional to evaluate the optically scanned tissue. The IVS 150 is located approximately 20 miles from the current location where the procedure is performed.
In this example, the exposed surface of the ex vivo tissue sample is square in shape and measures 25 millimeters by 25 millimeters. The exposed surface, representing a cross-section of the tissue sample, is oriented substantially parallel to the drawing surface of the figure and substantially parallel to a two-dimensional plane defined by the X-axis 312a and the Y-axis 312b shown here. The Z-axis 312c is oriented perpendicular to the X-axis 312a and the Y-axis 312b, and is also oriented perpendicular to the planar drawing surface of the figure.
In this embodiment, the UIDW 310 has a physical size measuring 10 inches and 1000 display pixels along the X-axis 312a and 10 inches and 1000 display pixels along the Y-axis 312b. Thus, the User Interface Display Screen (UIDS) pixel density, also referred to herein as the linear pixel density of the UIDW, is 100 pixels per inch in a direction parallel to the X-axis 312a and 100 pixels per inch in a direction parallel to the Y-axis 312b.
The resolution 282 of the image data as shown is referred to herein as an initial view or "macro view" of the image data. The magnification/resolution of this view of the image data is the lowest of the four (4) preconfigured resolutions provided by this embodiment of the invention. The macroscopic view shows the entire exposed surface of the excised tissue within the UIDW 310. The image data viewed here comprises twenty-five (25) micron image data pixels, in accordance with the lowest image data resolution calculated from the image data obtained from the initial scan of the exposed surface of the tissue sample, as described in connection with fig. 2C.
The initial macroscopic view shown here is somewhat similar to what can be seen from the human eye with the aid of a magnifying glass. As shown here, this resolution includes UIDW display pixels, each UIDW display pixel representing a physical region of tissue that is square in shape and measures 25 microns in width and 25 microns in height. The macroscopic view shown here is shown in grayscale without color enhancement. However, depending on how the tissue sample was initially scanned, the initial macroscopic view may instead be shown in grayscale and/or optionally with some additional color enhancement.
A set of user interface controls 320 is disposed on the UIDS, slightly below the UIDW 310. These controls include a resolution reset button 322, a DOWN zoom button 324, an UP zoom button 326, and an image resolution status indicator 328. Pressing the zoom-down button 324 reduces the resolution of the image data view to the next lower pre-computed resolution. Pressing the reset resolution button 322 restores the view of the image data to the lowest pre-computed image resolution, i.e., the initial view or macro view.
Because the physical size of the UIDW 310 is 10 inches by 10 inches and the exposed surface of the excised tissue is 1 inch by 1 inch, the macroscopic view is effectively a 10 to 1 linear magnification of what can be seen with the unassisted human eye when examining the exposed surface of the excised tissue. The row of user interface controls 320 is located below the bottom edge of the UIDW 310. The functionality of these viewing instruction controls is explained in the text below.
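By way of illustration only, the following TypeScript lines restate the 10 to 1 figure as simple arithmetic; the variable names are assumptions.

```typescript
// The macro-view display magnification is the physical window size divided by the
// physical tissue size (the ~25 mm exposed surface is treated as ~1 inch).
const uidwInches = 10;                                   // 10 in x 10 in window at 100 px/in
const tissueInches = 1;                                  // ~25 mm x 25 mm exposed surface
const macroLinearMagnification = uidwInches / tissueInches;   // 10 (linear), i.e. 100x by area
```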
In this usage scenario, a portion of the displayed image data appears to be a lesion 330, also referred to herein as a lesion of interest. A user of the IVS 150, who is a pathologist and a viewer of the image data, desires to obtain a closer view of the portion 330 of the exposed surface of the tissue sample.
Fig. 3B illustrates the transmission of a first viewing instruction from a user of an Image Viewing Station (IVS). As shown, the location within UIDW 310 represented by UIDW location 332 is associated with the first view instruction, and is also indicated by the intersection of crosshairs 332a-332 b.
To transmit a view instruction, the user aims the mouse pointer of the mouse device of the IVS 150 at the first UIDW location 332 within the UIDW 310, i.e., the lesion of interest, and holds the left-hand button on the mouse device and drags the image in the northeast direction 334. If the IVS 150 includes a touch screen, a view instruction may be performed via touch screen input, such as via using a finger to pinch/drag the image in the northeast direction 334.
The image is dragged and repositioned such that the lesion of interest is repositioned to a second location 338, i.e., the center point location 338 within the UIDW 310. A first UIDW location 332 is located approximately 60 pixels east of the left hand side of the UIDW 310 and approximately 50 pixels north of the lower edge of the UIDW 310. This particular first UIDW location 332 is also represented as UIDW location coordinates (60, 50).
In response to the transmission of the viewing instruction, the Image Viewing Component (IVC) software executing within the Image Viewing Station (IVS) 150 inputs and processes the viewing instruction. In response, the IVC evaluates whether additional image data is needed to process the viewing instruction, requests and receives any needed additional image data via the output IAC, and modifies the image data content within the UIDW 310 in accordance with the viewing instruction, as shown in fig. 3C.
FIG. 3C illustrates processing of the first viewing instruction transmitted from the user. In response to processing the first viewing instruction, the image data content of the UIDW 310 is shifted (translated) toward the northeast direction such that the data pixel location 332 within the image data is now repositioned to the center display pixel location 338 within the UIDW 310, at the UIDW coordinate location (500, 500).
The display pixels of fig. 3C located more than 60 pixels west and/or more than 50 pixels south of the UIDW 310 display pixel location 338 now define a region (portion) 336 of the UIDW 310 that does not display image data. The pixels residing in this region 336 are displayed as one uniform color. In this embodiment, the uniform color is black. The image data shown in fig. 3C is a cropped subset of the image data shown in fig. 3A-3B. In response to the viewing instruction of fig. 3B, no additional image data other than the image data displayed in fig. 3A-3B is required to display the image data of fig. 3C.
In essence, the first viewing instruction modifies a location within the image data to be viewed at a central location of the UIDW 310. The view instruction is a location type of view instruction as opposed to a zoom-in type of view instruction, where the display of particular image data is shifted relative to a location within the UIDW 310.
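By way of illustration only, the following TypeScript sketch shows the coordinate arithmetic implied by Figs. 3B-3C; the function names and coordinate convention are assumptions.

```typescript
// Sketch of the location-type viewing instruction: the dragged image-data location
// (e.g. (60, 50) in Fig. 3B) is moved to the centre of the 1000 x 1000 UIDW, and
// display pixels with no backing image data are painted a uniform colour (black).
const UIDW_SIZE = 1000;                                   // display pixels per side
const CENTER = { x: UIDW_SIZE / 2, y: UIDW_SIZE / 2 };    // (500, 500)

interface Point { x: number; y: number; }

// How far the displayed image data must be translated so the dragged location
// lands at the centre of the UIDW.
function panToCenter(dragged: Point): Point {
  return { x: CENTER.x - dragged.x, y: CENTER.y - dragged.y };  // e.g. (440, 450)
}

// After the translation, does a given display pixel still map onto image data?
function hasImageData(displayPixel: Point, shift: Point): boolean {
  const srcX = displayPixel.x - shift.x;     // position within the original image data
  const srcY = displayPixel.y - shift.y;
  return srcX >= 0 && srcX < UIDW_SIZE && srcY >= 0 && srcY < UIDW_SIZE;
}

// const shift = panToCenter({ x: 60, y: 50 });
// hasImageData({ x: 10, y: 10 }, shift)   // false -> painted black (region 336)
// hasImageData({ x: 500, y: 500 }, shift) // true  -> lesion of interest at the centre
```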
Fig. 3D illustrates processing of a second viewing instruction transmitted from a user of an Image Viewing Station (IVS). In this use scenario, the user now desires a higher magnification (higher resolution) view of the image data within the UIDW 310 relative to the view of the image data displayed in fig. 3C. As shown in fig. 3C, the lesion 330 is repositioned to a central location 338 within the UIDW 310.
To increase the magnification of the image data viewed inside the UIDW 310, the user presses button 326 labeled "+" also referred to as the "zoom up" button. In response, the content of the UIDW 310 is modified to show a magnified portion of the image data being viewed, where the portion of the image data to be magnified is located around the center point location 338 of the UIDW 310. The first press of the zoom-up button 326 increases the linear magnification factor of the displayed image data from (1 to 1) to (5 to 1), which corresponds to the third highest resolution 284 of the image data of fig. 2C.
In other words, the pressing of the "up" zoom button 326 does not modify the position of the image data portion to be zoomed in on, and the center position of the image portion to be zoomed in on remains at the center position of the UIDW 310. In this embodiment, the amount of magnification of the image data to be viewed (expressed as a magnification factor) is configured into the system for each press of the zoom-up button 326. In some embodiments, the magnification factor is configured into the system as a configuration variable within the OpenSeadragon (OSD) JavaScript code.
A sequence of presses of the zoom-up button 326 transforms the magnification factor of the UIDW 310 (expressed as a linear magnification factor) from the minimum magnification factor (macroscopic view magnification) to the maximum (highest) magnification factor. Prior to any press of the zoom-up button 326, the magnification factor is indicated as equal to one-to-one (1:1), the lowest magnification factor and resolution provided by the system in this embodiment.
The first press of the zoom-up button 326 increases the magnification factor to equal (5 to 1), which corresponds to the second lowest (third highest) magnification factor 284 and resolution 284 among the (4) preconfigured magnification factors/resolutions 282-288 of image data provided in this embodiment of the system of the present invention.
A linear magnification factor equal to (5 to 1) means that a portion of the tissue cross-section is magnified 5 times along each of its width dimension and its height dimension while being displayed inside the UIDW 310.
This linear magnification factor equal to (5 to 1) is equivalent to an area magnification factor equal to (5 × 5= 25) twenty five, i.e., a magnification factor for a two-dimensional area of a tissue cross section displayed inside the UIDW 310.
In other words, the tissue region viewed at this magnification level through the entire UIDW 310, also referred to herein as the User Interface Display (UIDS) field of view or display field of view, represents the following region: this area is at most 1/25 of the entire size of the 25mm square tissue cross section being scanned.
At this magnification level, each pixel within the UIDW 310 represents a square tissue region approximately (4.8) "five" microns high by (4.8) "five" microns wide, rather than a region of (24.0) "twenty-five" microns by (24.0) "twenty-five" microns as shown in figs. 3A-3C. At this magnification level, the user receives a higher resolution view of the tissue surrounding the lesion 330, as well as the lesion itself, than the lower resolution views of figs. 3A-3C.
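By way of illustration only, the following TypeScript sketch relates the four preconfigured linear magnification factors (1:1, 5:1, 25:1, 100:1) to the nominal tissue size represented by each display pixel; the names are assumptions, and the nominal values of 25, 5, 1 and 0.25 microns correspond to the approximately 24, 4.8, 0.96 and 0.24 micron figures quoted in the example.

```typescript
// Sketch relating zoom-button presses to the pre-configured magnification factors and
// to the (nominal) microns represented by each display pixel, for a 25 mm x 25 mm
// sample viewed in a 1000 x 1000 pixel UIDW. Purely illustrative.
const TISSUE_WIDTH_MICRONS = 25000;                 // 25 mm square cross-section
const UIDW_PIXELS = 1000;

// Linear magnification factors reachable with the "+" (zoom up) button,
// corresponding to resolutions 282, 284, 286 and 288 of fig. 2C.
const MAGNIFICATIONS = [1, 5, 25, 100] as const;

function micronsPerDisplayPixel(zoomPresses: number): number {
  const level = Math.min(zoomPresses, MAGNIFICATIONS.length - 1);
  const magnification = MAGNIFICATIONS[level];
  // The field of view shrinks by the magnification factor in each dimension.
  const fieldOfViewMicrons = TISSUE_WIDTH_MICRONS / magnification;
  return fieldOfViewMicrons / UIDW_PIXELS;
}

// micronsPerDisplayPixel(0) === 25    (macro view, "twenty-five" micron pixels)
// micronsPerDisplayPixel(1) === 5     (5:1,  ~"five" micron pixels)
// micronsPerDisplayPixel(2) === 1     (25:1, ~one-micron "cell resolution")
// micronsPerDisplayPixel(3) === 0.25  (100:1, original scan resolution)
```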
It is noted that in some embodiments, the amount of image data that is transferred to the IVC 152 in response to receiving a viewing instruction from a user of the IVS 150 is limited to one full window of display pixels. In this case, one full window of display pixels is equal to a matrix of approximately 1000 x 1000 display pixels, which corresponds to about one million display pixels. In other embodiments, the IVC 152 may "look ahead" and request access to the entire window plus additional display pixels in anticipation of future viewing instructions that will be transmitted from the user of the IVS 150.
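By way of illustration only, the following TypeScript sketch computes the amount of image data implied by the one-window limit and by an optional look-ahead margin; the margin value is an assumption.

```typescript
// Per viewing instruction, at most one full window of display pixels is transferred;
// a "look ahead" variant adds a margin of extra pixels around the window.
const WINDOW_PIXELS = 1000;          // display pixels per side of the UIDW
const BYTES_PER_PIXEL = 2;           // 2 bytes of image data per data pixel

function bytesToTransfer(lookAheadMarginPixels = 0): number {
  const side = WINDOW_PIXELS + 2 * lookAheadMarginPixels;
  return side * side * BYTES_PER_PIXEL;
}

// bytesToTransfer()    === 2_000_000  (~2 MB: one full window, as in the macro view)
// bytesToTransfer(250) === 4_500_000  (~4.5 MB with a hypothetical 250-pixel margin)
```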
Fig. 3E illustrates the processing of a third viewing instruction transmitted from the user to the IVS 150. The third viewing instruction causes a modification to the view of the image data displayed within the UIDW 310 in response to processing the third viewing instruction. In this scenario, the user now desires a higher magnification (higher resolution) of image data in UIDW 310 than the image data view shown in fig. 3D.
To increase the magnification of the image data viewed inside the UIDW 310 again, the user presses button 326 labeled "+" also referred to as the "zoom up" button. In response, the content of the UIDW 310 is modified to show a magnified portion of the image data being viewed, where the portion of the image data to be magnified is located around the center point location 338 of the UIDW 310.
The second press of the zoom-up button 326 increases the linear magnification factor of the displayed image data from (5 to 1) to (25 to 1), which corresponds to the second highest resolution 286 of the image data of fig. 2C.
A linear amplification factor equal to (25 to 1) means: relative to an initial macroscopic view resolution equal to 1, a portion of the tissue cross-section is magnified 25 times along each of its width dimension and its height dimension while being displayed inside the UIDW 310.
This linear magnification factor equal to (25 to 1) is equivalent to an area magnification factor of (25 × 25= 625), i.e., a two-dimensional magnification factor for the amount of area of the tissue cross-section displayed inside the UIDW 310.
In other words, the tissue region viewed at this magnification level through the entire UIDW 310, also referred to herein as the display field of view, represents the following region: this area is at most 1/625 of the entire size of the 25mm square tissue cross-section being scanned. At this magnification level, each pixel within UIDW 310 represents a square tissue region approximately (0.96) "one micron" in size by (0.96) "one micron" in size.
At this magnification level, the user may receive a higher resolution view of the tissue within the lesion 330 itself. As shown here, almost the entire UIDW 310 is now viewing the tissue inside the lesion 330. However, the southeast corner of UIDW 310 crosses the edge of lesion 330 and shows some tissue outside lesion 330.
This resolution is referred to as the "cell resolution" or "cell magnification" of the view of the image tissue, as the resolution falls within the range of resolutions in which human cells can be viewed via the UIDW 310. The average size of human cells is about 100 microns in diameter. At this resolution, human cells of a size of 100 microns in diameter, which diameter would be represented by about 100 consecutive pixels in length, may occupy about one consecutive inch in a 10 inch by 10 inch UIDW 310.
Fig. 3F illustrates processing of a fourth viewing instruction transmitted from the user to the IVS 150. In response to processing the fourth viewing instruction, the view of the image data displayed within the UIDW 310 is modified. In this scenario, the user now desires a higher magnification (higher resolution) view of the image data in the UIDW 310 than the image data view shown in fig. 3E.
To increase the magnification of the image data viewed inside the UIDW 310 again, the user presses button 326 labeled "+" also referred to as "zoom up" button 326. In response, the content of UIDW 310 is modified to show a zoomed-in portion of the image data being viewed and having a center point corresponding to center point 338 of the image data currently displayed inside UIDW 310.
A third press of the zoom-up button 326 increases the linear magnification factor of the displayed image data from (25 to 1) to (100 to 1), which corresponds to the highest magnification factor and resolution 288 of the image data of fig. 2C. This first highest resolution of the image data is equal to the original scan resolution.
This linear magnification factor equal to (100 to 1) means that a portion of the tissue cross-section is magnified 100 times along each of its width dimension and its height dimension while being displayed inside the UIDW 310.
This linear magnification factor equal to (100 to 1) is equivalent to an area magnification factor of (100 × 100=10,000), i.e., a two-dimensional magnification factor for the amount of area of the tissue cross-section displayed inside the UIDW 310.
In other words, the tissue region viewed at this magnification level through the entire UIDW 310, also referred to herein as the display field of view, represents the following region: this area is at most 1/10,000 of the entire size of a 25mm square tissue cross-section being scanned. At this magnification level, each pixel within UIDW 310 represents a square tissue region having a size of approximately (0.24) "one-quarter micron" height by (0.24) "one-quarter micron" width.
At this magnification level, the user receives the highest resolution provided by the system, and the highest resolution is equal to the resolution of the original scan. The entire UIDW 310 is now showing the tissue inside the lesion 330 and showing the tissue at cellular resolution.
At this resolution, human cells of size 100 microns in diameter, which would be represented by about 400 pixels, occupy about 40% of the width and height of the UIDW 310 dimension.
Execution
In this example use scenario, surgical performance on the patient is suspended while the patient is waiting on the operating room table, and while the surgeon 116 waits for another health care professional to evaluate the optically scanned tissue. In this case, the IVS 150 is located approximately 20 miles from the current location where the procedure is performed.
A user of the system, such as, for example, a pathologist, receives a notification via a mobile device that image data is available for evaluation, such as, for example, a mobile phone text message that includes a web site (URL) address and unique identifier for the image dataset 126, and information associated with the image data, including information such as, for example, the name of the healthcare facility in which the image data was scanned, the procedure type, the patient name, and the name of the surgeon 116 associated with the procedure.
In some embodiments, the portion of the information displayed within the text message may be selected (clicked) as a URL link and viewed directly on the mobile device, which then operates as the IVS 150. In some embodiments, the notification may also be transmitted via email and viewed directly from the device receiving the email in the same manner as described above for the mobile device.
In response, the user accesses the website by employing the image viewing station 150, logs in to the website hosting the output image access component by authenticating him or her with a username and password, and selects for viewing the image dataset associated with the unique identifier listed on the website. The unique identifier was also provided in the mobile phone text message.
In response to selecting an image dataset 126 for viewing, an Image Viewing Station (IVS) 150 (e.g., a desktop computer) displays a macro image of the image dataset 126, such as the macro image shown in fig. 3A. The time elapsed between the time the user selects the image dataset 126 for viewing and the time the IVS displays the image data on the display screen depends on the amount of current internet activity, which varies over time. However, in the case of typical internet activity, this elapsed time typically falls within 8 seconds.
For example, unless the internet is exceptionally slow, the internet should be able to transmit image data at a rate of at least 2.5 megabytes per second. The image data for the initial display of the macro view amounts to about 2 megabytes plus less than 10 kilobytes in total of information associated with the image data set 126. Thus, at typical internet speeds, the transfer of initial macro image data should take about 1 second.
However, additional time is required for the round trip: the request to access the image data set 126 is first received by the Output Image Access Component (output IAC) executing on an internet website and then relayed to an input image access component (input IAC) executing on an Image Scanning Station (ISS); the input IAC retrieves the image data from a high data rate mass storage device typically local to the image scanning station; and the image data is then relayed from the input IAC over the internet, through the output IAC located some (20) miles away from the image scanning station, and on to the Image Viewing Station (IVS). This round trip may take 3-4 seconds.
The image data is then displayed within the UIDW 310 along with the associated user interface controls on the image viewing station, requiring an additional 1-2 seconds, as depicted in fig. 3A. Thus, the total elapsed time will amount to 5-7 seconds. However, if the internet is abnormally slow, the system may take several seconds more to transfer the image data to the IVS.
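By way of illustration only, the following TypeScript sketch restates the arithmetic behind the 5-7 second estimate; the 2.5 megabyte-per-second rate and the time breakdown are the example values given above, and the exact split is an assumption.

```typescript
// Rough arithmetic for the elapsed-time estimate above, at an assumed sustained
// internet transfer rate of 2.5 megabytes per second.
const macroViewBytes = 2_000_000 + 10_000;     // ~2 MB of image data + <10 kB of metadata
const transferRateBytesPerSec = 2_500_000;     // assumed "typical" internet rate

const transferSeconds = macroViewBytes / transferRateBytesPerSec;   // ~0.8 s, call it ~1 s
const requestRelaySeconds = [3, 4];            // round trip through output IAC -> input IAC
const renderSeconds = [1, 2];                  // drawing the UIDW and its controls

const totalSeconds = [
  transferSeconds + requestRelaySeconds[0] + renderSeconds[0],      // ~4.8 s
  transferSeconds + requestRelaySeconds[1] + renderSeconds[1],      // ~6.8 s
];
// totalSeconds ≈ [4.8, 6.8], consistent with the 5-7 second estimate above.
```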
According to the present invention, the system is designed to limit the user to waiting only a fraction of a minute, and typically no more than 8 seconds, to initially view the image data. Unlike other methods of delivering image data, this system is not designed to require a user to wait one or more hours or even minutes to view the image data.
Referring again to figs. 3B-3D, the IVS 150 enables the user to drag a location within the displayed image data to the center location within the UIDW 310 in about one second or less and to press the "up" zoom button 326.
Referring to fig. 3E, a press of the "zoom up" button 326 typically results in the viewed image data being displayed (zoomed) at a new and higher resolution to appear almost immediately within the UIDW 310. This is also typically true when the viewed image data is caused to be displayed (zoomed) to a new and lower resolution. At typical internet speeds, panning an image in an amount that requires an additional User Interface Display Window (UIDW) filled with new and uncached image data would typically require less than 2.5 megabytes of transfer of image data in this usage scenario, and typically require no more than about 5-7 seconds of elapsed time to access and display the panned image data.
Depending on the design of the system, the user may interact with the IVS 150 and the image access system and select a viewing path through a large amount of image data to evaluate the resected tissue from a distance of 1 yard or of more than 100 miles from the surgical site. The communication path between the IVC 152 and the output IAC 142, in combination with the particular image access communication path for the image data currently being viewed, is referred to herein as the viewing communication path between the IVC 152 and the image dataset 126 currently being viewed.
FIG. 4 illustrates an expanded overview of the transfer of image data involving multiple healthcare facilities 410a-410z. As shown, a plurality of healthcare facilities 410a-410z generate scanned image data from surgical procedures and optionally make at least some of the scanned image data available for remote viewing.
Each healthcare facility has its own specific configuration (hierarchy) of one or more image scanning stations and image access components. A small healthcare facility may have only one image scanning station, while a larger healthcare facility may have dozens or more image scanning stations, with one or more image transfer stations 170 each executing an intermediary image access component 172 along an image access communication path between the output IAC 142 and many of the input IACs 122.
Multiple image viewing components 152a-152z each execute within an image viewing station 150a-150z, respectively, and access scanned image data from various locations. The image viewing stations may be located inside or outside of the healthcare facilities 410a-410z. Such scanned image data may be viewed thousands of miles from the location at which it was scanned.
The subject matter described herein relating to image data may be applied to any type of data other than image data, including data that may be partially or fully encoded, provided that such data may be stored at one location and transferred to another location via a technique capable of storing and transferring large amounts of data (such as at least one million bytes of data) in one minute or less. Such techniques may include, for example, the use of electronic and/or optical techniques.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
Parts list
110. First health care facility
112. Patient
114. Operation table
116. Surgeon(s)
120. Image Scanning Station (ISS)
122. Input image Access component (input IAC)
124. Data storage device
126. Image data set
130. User Interface Display Screen (UIDS)
132. Keyboard device
134. Mouse (screen pointer) device
140. Image Access Portal (IAP)
142. Output image Access component (output IAC)
150. Image Viewing Station (IVS)
152. Image Viewing Component (IVC)
160. Tissue scanner
161. Table board
162. Ex vivo (excised) tissue
170. Image Transfer Station (ITS)
172. Intermediary image Access component (intermediary IAC)
210. Second health care facility
212. Central Processing Unit (CPU)
214. Physical memory
216. Input/output hardware
220. System bus
228a operating system
228b device driver
230. Image Scanning Component (ISC)
232. Image Processing Component (IPC)
240. Virtual memory
280. Third health care facility
282. Fourth (fourth highest) resolution of image data
284. Third (third highest) resolution of image data
286. Second (second highest) resolution of image data
288. First (first highest) resolution of image data
310. User Interface Display Window (UIDW)
312a-c X, Y and Z axes
320. User interface control
322. Resolution reset button
324. Down magnifying button
326. Upward magnifying button
328. Image resolution status indicator
330. Lesion (lesion of interest)
332a-332b. Crosshairs indicating the UIDW location 332
334. Northeast direction
336. UIDW regions not displaying image data
338. Position of center point

Claims (20)

1. A system for providing remote and rapid access to scanned image data, comprising:
a set of image access components including an output image access component and at least one input image access component; the set of image access components is configured to: in response to an event that the stored first image dataset becomes available at a first location and at a first point in time, establishing a first access communication path between the output image access component and the first image dataset;
the output image access component is configured for acquiring at least a portion of the first image dataset via the first access communication path and configured for communicating the image data to a first image viewing component; and wherein
The first image viewing component is configured to request access to the at least a portion of the first image dataset and receive the at least a portion of the first image dataset from the output image access component; and configured for displaying the at least a portion of the first image dataset to a user of the first image viewing component in response to receiving one or more viewing instructions from the user.
2. The system of claim 1, wherein a notification that the first image dataset becomes available is communicated to the user of the first image viewing component.
3. The system of claim 1, wherein the set of image access components is configured to form a hierarchical structure, and wherein the at least one input access component serves as a child with respect to one other image access component, and wherein the output image access component serves as a parent within the hierarchical structure to at least one other image access component.
4. The system of claim 1, wherein the set of image access components includes at least one intermediary image access component, and wherein the intermediary image access component acts as a child with respect to one other image access component and acts as a parent to at least one other image access component.
5. The system of claim 1, wherein the set of image access components is configured to form a tree type hierarchy, and wherein the output image access component is a root node of the tree type hierarchy.
6. The system of claim 1, wherein the first image dataset becomes unavailable for access at a second point in time and, in response, the first access communication path between the output image access component and the first image dataset is terminated.
7. The system of claim 1, wherein the first image dataset becomes available at a third point in time and at a second location, and in response, a second access communication path is established between the first image dataset and the output image access component.
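(Illustrative sketch, not part of the claims: one hypothetical way to model the path lifecycle of claims 6 and 7, in which the access communication path is terminated when the dataset becomes unavailable and re-established when it reappears at a second location. The AccessPathManager name and the location URIs are assumptions.)

from typing import Dict, Optional


class AccessPathManager:
    """Tracks which access communication path, if any, currently reaches each dataset."""

    def __init__(self) -> None:
        self._paths: Dict[str, str] = {}   # dataset_id -> current location

    def on_available(self, dataset_id: str, location: str) -> None:
        # Establish (or re-establish) an access communication path to the dataset.
        self._paths[dataset_id] = location

    def on_unavailable(self, dataset_id: str) -> None:
        # Terminate the path when the dataset can no longer be reached.
        self._paths.pop(dataset_id, None)

    def location_of(self, dataset_id: str) -> Optional[str]:
        return self._paths.get(dataset_id)


if __name__ == "__main__":
    mgr = AccessPathManager()
    mgr.on_available("scan-001", "scanner://facility-1/scan-001")    # first point in time
    mgr.on_unavailable("scan-001")                                   # second point in time
    mgr.on_available("scan-001", "archive://facility-1/scan-001")    # third point in time, second location
    print(mgr.location_of("scan-001"))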
8. The system of claim 1, wherein the first and second image viewing components each simultaneously access the first image dataset over a period of time.
9. The system of claim 1, wherein the first image viewing component accesses the first image dataset and a second image dataset simultaneously over a period of time.
10. The system of claim 1, wherein the event at which the first image dataset becomes available is not predetermined with respect to the time of the event.
11. The system of claim 1, wherein the event at which the first image dataset becomes available is not predetermined with respect to the location of the event.
12. A system for viewing image data, comprising:
a first image viewing component configured for: requesting access to, receiving, and displaying to a user, at a first point in time, at least a portion of a first image dataset in response to receiving one or more viewing instructions from the user, wherein the requesting and receiving are performed via communication with an output image access component, and wherein
The output image access component is configured to communicate directly or indirectly with a first input image access component via a first access communication path, and wherein
The first input image access component is configured to detect availability of the first image dataset and is configured to establish the first access communication path before the first image viewing component requests access to the at least a portion of the first image dataset; and wherein
The first image viewing component receives the at least a portion of the first image dataset from the output image access component via a first viewing communication path, and wherein
The combination of the first access communication path and the first viewing communication path is configured to extend over at least a wide area network distance.
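(Illustrative sketch, not part of the claims: the ordering recited in claim 12, in which an input image access component detects a newly stored dataset and registers it with the output component before any viewer requests it, so the first request over the wide-area viewing path finds the access path already established. The polling approach, the ".scan" file suffix, and all names are assumptions.)

import pathlib
import tempfile
from typing import Dict, Set


class OutputImageAccessComponent:
    """Stub that records established access paths keyed by dataset id."""

    def __init__(self) -> None:
        self.paths: Dict[str, str] = {}

    def on_dataset_available(self, dataset_id: str, location: str) -> None:
        self.paths[dataset_id] = location


class InputImageAccessComponent:
    """Watches storage near the scanner and registers new datasets ahead of any request."""

    def __init__(self, watch_dir: pathlib.Path, output_iac: OutputImageAccessComponent) -> None:
        self._watch_dir = watch_dir
        self._output_iac = output_iac
        self._seen: Set[pathlib.Path] = set()

    def poll_once(self) -> None:
        # Detect availability of newly stored image datasets and establish the
        # access communication path before any viewing component asks for them.
        for path in sorted(self._watch_dir.glob("*.scan")):
            if path not in self._seen:
                self._seen.add(path)
                self._output_iac.on_dataset_available(path.stem, str(path))


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        scans = pathlib.Path(tmp)
        (scans / "scan-001.scan").write_bytes(b"\x00" * 16)
        output_iac = OutputImageAccessComponent()
        InputImageAccessComponent(scans, output_iac).poll_once()
        print(output_iac.paths)      # {'scan-001': '<tmp>/scan-001.scan'}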
13. The system of claim 12, wherein the first input image access component is configured to cause a notification of the availability of the first image dataset to be communicated to the user of the first image viewing component.
14. The system of claim 12, wherein the at least a portion of the first image dataset has a size that is limited such that no more than one window is filled with image data per viewing instruction.
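(Illustrative sketch, not part of the claims: a hypothetical bound corresponding to claim 14, where each viewing instruction transfers at most the tiles needed to fill one display window regardless of the full dataset size. The 256-pixel tile size and 3 bytes per pixel are assumed values.)

import math


def tiles_for_window(window_w: int, window_h: int, tile: int = 256) -> int:
    """Number of fixed-size tiles needed to cover one display window."""
    return math.ceil(window_w / tile) * math.ceil(window_h / tile)


def bytes_per_viewing_instruction(window_w: int, window_h: int,
                                  tile: int = 256, bytes_per_pixel: int = 3) -> int:
    # Upper bound on the image data transferred for a single viewing instruction.
    return tiles_for_window(window_w, window_h, tile) * tile * tile * bytes_per_pixel


if __name__ == "__main__":
    # A 1920 x 1080 window needs at most 8 x 5 = 40 tiles (about 7.9 MB at 3 bytes
    # per pixel), no matter how large the stored dataset is at the chosen resolution.
    print(tiles_for_window(1920, 1080))
    print(bytes_per_viewing_instruction(1920, 1080))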
15. The system of claim 12, wherein a second input image access component is configured to detect availability of a second image dataset and is configured to establish a second access communication path before the first image viewing component requests access to the second image dataset.
16. The system of claim 12, wherein a second image viewing component requests access to the first image dataset concurrently with the first image viewing component.
17. The system of claim 12, wherein a second image viewing component requests access to the first image dataset concurrently with the first image viewing component.
18. A method for providing remote and rapid access to scanned image data, comprising the steps of:
providing a set of image access components comprising an output image access component and at least one input image access component; the set of image access components is configured to: establish a first access communication path between the output image access component and the first image dataset in response to an event in which the stored first image dataset becomes available at a first location and at a first point in time; and wherein
The output image access component is configured for acquiring at least a portion of the first image dataset via the first access communication path and configured for communicating the image data to a first image viewing component; and
providing a first image viewing component configured to request access to the at least a portion of the first image dataset and receive the at least a portion of the first image dataset from the output image access component; and configured for displaying the at least a portion of the first image dataset to a user of the first image viewing component in response to receiving one or more viewing instructions from the user.
19. The method of claim 18, wherein the image data represents human tissue scanned at cellular resolution during performance of a procedure.
20. The method of claim 18, wherein a notification that the first image dataset becomes available is communicated to a user of the first image viewing component.
CN202180052127.6A 2020-06-26 2021-06-25 System for providing remote and fast access to scanned image data Pending CN115988987A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063045019P 2020-06-26 2020-06-26
US63/045019 2020-06-26
PCT/US2021/039258 WO2021263207A1 (en) 2020-06-26 2021-06-25 System for providing remote and rapid access to scanned image data

Publications (1)

Publication Number Publication Date
CN115988987A (en) 2023-04-18

Family

ID=79032793

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180052127.6A Pending CN115988987A (en) 2020-06-26 2021-06-25 System for providing remote and fast access to scanned image data

Country Status (5)

Country Link
US (1) US20210401505A1 (en)
EP (1) EP4171370A1 (en)
CN (1) CN115988987A (en)
BR (1) BR112022026542A2 (en)
WO (1) WO2021263207A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4860112A (en) * 1984-06-07 1989-08-22 Raytel Systems Corporation Teleradiology system having multiple compressor/expanders
US6930709B1 (en) * 1997-12-04 2005-08-16 Pentax Of America, Inc. Integrated internet/intranet camera
US6077226A (en) * 1999-03-30 2000-06-20 General Electric Company Method and apparatus for positioning region of interest in image
US6678703B2 (en) 2000-06-22 2004-01-13 Radvault, Inc. Medical image management system and method
US8271530B2 2002-04-08 2012-09-18 Oracle International Corporation Method and mechanism for managing and accessing static and dynamic data
US9055867B2 (en) 2005-05-12 2015-06-16 Caliber Imaging & Diagnostics, Inc. Confocal scanning microscope having optical and scanning systems which provide a handheld imaging head
ATE547980T1 (en) 2008-09-25 2012-03-15 Nemodevices Ag DEVICE FOR DIAGNOSIS AND/OR THERAPY OF PHYSIOLOGICAL PROPERTIES OF A SELECTED PART OF THE BODY BY OPTICAL REFLECTION OR TRANSMISSION
EP3526635B1 (en) 2016-10-11 2022-06-08 Caliber Imaging & Diagnostics, Inc. Resonant scanner interoperation with movable stage

Also Published As

Publication number Publication date
BR112022026542A2 (en) 2023-04-18
WO2021263207A1 (en) 2021-12-30
US20210401505A1 (en) 2021-12-30
EP4171370A1 (en) 2023-05-03

Similar Documents

Publication Publication Date Title
US8306298B2 (en) Method and apparatus for internet, intranet, and local viewing of virtual microscope slides
Higgins Applications and challenges of digital pathology and whole slide imaging
US20120327211A1 (en) Diagnostic information distribution device and pathology diagnosis system
US8625920B2 (en) Method and apparatus for creating a virtual microscope slide
US8044974B2 (en) Image creating apparatus and image creating method
JP6313325B2 (en) Selection and display of biomarker expression
EP1011421B1 (en) System for facilitating pathological examination of a lesion in tissue
US20060109343A1 (en) Image displaying system, image providing apparatus, image displaying apparatus, and computer readable recording medium
US20120314049A1 (en) Virtual telemicroscope
JP2008535528A (en) System and method for forming variable quality images of slides
JP2016511845A (en) Biological sample split screen display and system and method for capturing the records
WO2012049741A1 (en) Medical image display device, medical information management server
US7079673B2 (en) Systems for analyzing microtissue arrays
JP5600584B2 (en) Computer-readable recording medium storing data structure
CN115988987A (en) System for providing remote and fast access to scanned image data
JP2019003230A (en) Information processing device, information processing method and program
Koh et al. Understanding digital pathology performance: an eye tracking study
Yagi et al. Digital pathology from the past to the future
KR20220005873A (en) Method for interlocking lesion detection and dentistry image processing device therefor
KR20210056697A (en) Diagnostic microscope for smart devices based on IOT with 800x higher rate of real-time image acquisition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination