AU2019262082A2 - Medical virtual reality and mixed reality collaboration platform - Google Patents

Medical virtual reality and mixed reality collaboration platform

Info

Publication number
AU2019262082A2
Authority
AU
Australia
Prior art keywords
image data
dimensional
data
representation
series
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
AU2019262082A
Other versions
AU2019262082A1 (en)
Inventor
Thomas William MORRELL
Arthur Chi Teong ONG
Jason Jit Sun TAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Singular Health Pte Ltd
Original Assignee
Singular Health Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2018901434A external-priority patent/AU2018901434A0/en
Application filed by Singular Health Pte Ltd filed Critical Singular Health Pte Ltd
Publication of AU2019262082A1 publication Critical patent/AU2019262082A1/en
Publication of AU2019262082A2 publication Critical patent/AU2019262082A2/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/003 Navigation within 3D models or images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest
    • A61B6/466 Displaying means of special interest adapted to display 3D data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/41 Medical
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/004 Annotating, labelling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/024 Multi-user, collaborative environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Provided is a collaboration system (10) for collaborating on medical image data. The system (10) includes an input interface (12) into which a series of two-dimensional image data files in Digital Imaging and Communications in Medicine (DICOM) format are receivable, the series of image data files representing successive two-dimensional scans of a body area. Also included is a data modelling processor (14) operable to receive the two-dimensional image data files from the input interface (12) and to compile the series of two-dimensional image data files into a three-dimensional (3D) representation of the data. Further included are a database (16), connected to the data modelling processor (14), on which any one, or both, of the series of two-dimensional (2D) image data files and three-dimensional (3D) representations of the data is stored, and an output interface (18) for presenting any one, or both, of the series of two-dimensional (2D) image data files and three-dimensional (3D) representations of the data securely to a remote user using display devices (20).

Description

MEDICAL VIRTUAL REALITY AND MIXED REALITY
COLLABORATION PLATFORM
TECHNICAL FIELD
[0001] This invention relates to a medical virtual reality and mixed reality collaboration platform. More specifically, this invention relates to a method of collaborating on medical imaging data and to an associated collaboration system.
BACKGROUND
[0002] The following discussion of the background art is intended to facilitate an understanding of the present invention only. The discussion is not an acknowledgement or admission that any of the material referred to is or was part of the common general knowledge as at the priority date of the application.
[0003] The inventors are aware of Virtual Reality (VR) viewers used in the medical field. However, such viewers are often restricted for viewing by a single user and such viewers often require specialized knowledge to use and to interpret images created thereby.
[0004] It is an object of the present invention to address some of the shortcomings of existing three dimensional (3D) Virtual Reality (VR) viewers of which the Applicant is aware.
SUMMARY OF THE INVENTION
WO 2019/210353
PCT/AU2019/050380
[0005] According to one aspect of the invention, there is provided a method of collaborating on medical image data, the method including the steps of:
receiving a series of two-dimensional image data files in Digital Imaging and Communications in Medicine (DICOM) format;
converting the series of two-dimensional image data files to a three dimensional (3D) representation of the medical image data;
storing the 3D representation on a database; and
permitting multiple users securely to access the 3D representation of the medical image data.
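The conversion step above can be sketched in code. This is a minimal illustration, not the patented implementation: it assumes each 2D slice has already been decoded to a pixel array with a known slice position (as a DICOM reader such as pydicom would supply from the ImagePositionPatient tag), and uses synthetic arrays in place of real scan data.

```python
import numpy as np

def compile_volume(slices):
    """Compile a series of 2D slices into a 3D volume.

    `slices` is a list of (slice_position_mm, 2d_pixel_array) pairs.
    Slices are sorted by position so the volume is anatomically
    ordered regardless of the order the files arrived in.
    """
    ordered = sorted(slices, key=lambda s: s[0])
    return np.stack([pixels for _, pixels in ordered], axis=0)

# Synthetic stand-in for a 3-slice, 4x4 scan series (out of order).
series = [(10.0, np.full((4, 4), 2)),
          (0.0,  np.full((4, 4), 0)),
          (5.0,  np.full((4, 4), 1))]
volume = compile_volume(series)
print(volume.shape)     # (3, 4, 4)
print(volume[0, 0, 0])  # 0 -- the slice at position 0.0 mm comes first
```

The result is a single array that a volume renderer or VR viewer can consume directly.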
[0006] The method may include permitting a user to annotate the data. Annotation allows users to selectively isolate areas of interest in the scan data and mark them for future reference using toolsets. The annotated areas of interest can be sent to the collaboration platform and shared with users either as images or as a video recording of the screen, over which the user's voice is recorded. The annotated areas of interest may be sent to the collaboration platform automatically.
[0007] The data may be passed through an analysis system which segments the image data and begins to separate the image data into its respective parts, combining with the existing toolsets to allow further refinement of the segmentation.
[0008] Permitting multiple users securely to access the 3D representation of the data may include permitting users to view the 3D data by means of Virtual Reality (VR) and Mixed Reality (MR) goggles.
[0009] Permitting multiple users securely to access the 3D representation of the data may include receiving authentication information from a user, such as a predefined username and password uniquely associated with the user, before allowing the user to access the 3D representation of the data. Advantageously, the step of receiving authentication information protects the privacy rights of the patients to whom the two-dimensional image data files relate.
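The username-and-password check described above can be sketched with standard library primitives. The user store, function names and credentials here are illustrative assumptions; the point is only that passwords are stored salted and hashed, and compared in constant time, never kept in plain text.

```python
import hashlib
import hmac
import os

# Hypothetical in-memory user store: username -> (salt, PBKDF2 hash).
_users = {}

def register(username, password):
    """Store a salted, iterated hash of the password, not the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    _users[username] = (salt, digest)

def authenticate(username, password):
    """Return True only if the username/password pair matches the store."""
    if username not in _users:
        return False
    salt, expected = _users[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, expected)

register("dr_smith", "correct horse battery staple")
print(authenticate("dr_smith", "correct horse battery staple"))  # True
print(authenticate("dr_smith", "wrong password"))                # False
```

A production system would persist the store and typically issue a session token after this check, but the verification step itself follows this shape.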
[0010] Permitting multiple users securely to access the 3D representation of the data may include permitting a user to view the data on a two dimensional display screen, such as a computer display, a handheld tablet or a mobile telephone.
[0011] The method may include hosting a conference call between users. Permitting multiple users securely to access the 3D representation of the data may then include permitting multiple users to access the data in real time, thereby permitting the users to collaborate on the data via a virtual conference call. Advantageously, a medical practitioner may then discuss the data with a patient or with other colleagues.
[0012] The method may further include the step of analysing the 3D representation with graphical tools.
[0013] Analysing the 3D representation with graphical tools may include options to:
translate the image, in which the 3D image is moved into position in a 3D VR environment;
rotate the image, in which the 3D image is rotated on a 2D screen or in a 3D VR environment;
intersect the image, in which the 3D image is intersected to show certain portions of the 3D image more clearly on a 2D screen or in a 3D VR environment;
measure a portion of the image, in which a measurement line or lines are drawn as a ruler, to display to a user the real-life dimensions of the line or lines and to display the ruler with the rest of the image on a 2D screen or in a 3D VR environment;
draw an overlaid image on a portion of the image, to display the image and overlaid image on a 2D screen or in a 3D VR environment;
produce an overlaid mark on a portion of the image, to display the mark on a 2D screen or in a 3D VR environment;
record a video of any of the above annotations together with manipulation of the image in 2D or 3D on the server;
take an image snapshot of the 2D representation of the 3D image and store the snapshot on the server;
adjust the contrast of the 2D or 3D image;
adjust the brightness of the 2D or 3D image;
adjust the opacity of certain individual elements in the 2D or 3D image; and/or
adjust the way a user sees particular structures of the 3D image by manipulating the CT numbers, to change the appearance of the picture to highlight particular structures.
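The measurement option above turns a line drawn in voxel coordinates into a real-life length. A minimal sketch, assuming the scan's voxel spacing is known from the DICOM PixelSpacing and SliceThickness tags (the spacing values below are typical CT figures chosen for illustration):

```python
import math

def measure_mm(p1, p2, spacing):
    """Real-life length of a measurement line between two voxel
    coordinates (z, y, x), given voxel spacing in mm as
    (slice thickness, row spacing, column spacing)."""
    return math.sqrt(sum(((a - b) * s) ** 2
                         for a, b, s in zip(p1, p2, spacing)))

# 0.5 mm in-plane pixels, 2 mm slices (illustrative values).
spacing = (2.0, 0.5, 0.5)
print(measure_mm((0, 0, 0), (0, 0, 100), spacing))  # 50.0 -> 100 columns span 50 mm
print(measure_mm((0, 0, 0), (10, 0, 0), spacing))   # 20.0 -> 10 slices span 20 mm
```

The same per-axis scaling generalises to diagonal lines, which is why the ruler can be displayed meaningfully in both the 2D and 3D views.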
[0014] Analysing the 3D representation with graphical tools may include adjusting the brightness of the image via a viewing window level.
[0015] Analysing the 3D representation with graphical tools may include adjusting the contrast via a viewing window width.
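The window level and window width adjustments in the two paragraphs above follow the usual DICOM grey-scale windowing convention, which can be sketched as a simple mapping from raw CT numbers to display values (the soft-tissue window values below are textbook figures, assumed for illustration):

```python
import numpy as np

def apply_window(hu, level, width):
    """Map raw CT numbers (Hounsfield units) to 0-255 display values.

    `level` (window centre) controls brightness; `width` controls
    contrast: values below level - width/2 render black, values above
    level + width/2 render white, and the range between is spread
    linearly over the display scale.
    """
    lo = level - width / 2.0
    hi = level + width / 2.0
    clipped = np.clip(hu, lo, hi)
    return ((clipped - lo) / (hi - lo) * 255.0).astype(np.uint8)

# Soft-tissue window (level 40, width 400) over sample HU values.
hu = np.array([-1000, -160, 40, 240, 1000])   # air .. window centre .. bone
print(apply_window(hu, level=40, width=400))  # [  0   0 127 255 255]
```

Narrowing the width increases contrast within the tissue range of interest, while raising or lowering the level brightens or darkens the whole image, exactly as described for the viewing window controls.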
[0016] The method may include implementing a single step conversion of the data, which refers to a single click interaction from the user in order to bring DICOM data into a virtual reality environment ready to be viewed and annotated and to enable use of the collaboration platform. It sorts the data into its individual series to be presented inside the VR space.
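The sorting of a flat study directory into individual series can be sketched as grouping by the DICOM SeriesInstanceUID tag. The header dictionaries below stand in for parsed DICOM headers; the file names and UIDs are illustrative assumptions.

```python
from collections import defaultdict

def split_into_series(headers):
    """Group DICOM file headers into their individual series,
    keyed by the SeriesInstanceUID tag, so each series can be
    converted and presented separately in the VR space."""
    series = defaultdict(list)
    for h in headers:
        series[h["SeriesInstanceUID"]].append(h["path"])
    return dict(series)

# Hypothetical parsed headers from one study directory.
headers = [
    {"path": "IM001", "SeriesInstanceUID": "1.2.3.1"},
    {"path": "IM002", "SeriesInstanceUID": "1.2.3.1"},
    {"path": "IM003", "SeriesInstanceUID": "1.2.3.2"},
]
groups = split_into_series(headers)
print(len(groups))        # 2
print(groups["1.2.3.1"])  # ['IM001', 'IM002']
```

In a real single-click flow this grouping runs first, and each group is then passed through the 2D-to-3D conversion on its own.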
[0017] The method of collaborating on medical image data may include implementing Machine Learning from medical imaging. Machine Learning from medical imaging may include displaying hotspots for the user to investigate from certain preset values. The Machine Learning may then include identifying more hotspots as more medical specialists use the medical imaging data.
[0018] Machine Learning may include presenting medical imaging with presets and settings aligned to the medical speciality; for example, for a liver surgeon the image will load with the correct windowing values, contrast and brightness.
[0019] According to another aspect of the invention there is provided a collaboration system, which includes:
an input interface into which a series of two-dimensional image data files in Digital Imaging and Communications in Medicine (DICOM) format are receivable, the series of image data files representing successive two dimensional scans of a body area;
a data modelling processor operable to receive the two-dimensional image data files from the input interface and to compile the series of two-dimensional image data files to a three dimensional (3D) representation of the data;
a database, connected to the data modelling processor, on which any one, or both of the series of two-dimensional (2D) image data files and three dimensional (3D) representations of the data is stored;
an output interface for presenting any one, or both of the series of two-dimensional (2D) image data files and three dimensional (3D) representations of the data securely to a remote user.
[0020] The collaboration system may include a display device in the form of any one of a mobile telephone, a tablet, a laptop computer, a desktop computer, a pair of virtual reality goggles, or the like, operable to display any one, or both of the series of two-dimensional (2D) image data files and three dimensional (3D) representations of the data. The display device may be connectable to the output interface via a private network, or a public network, such as the Internet.
[0021] The database may be connected to the data modelling processor via a private network or a public network, such as the Internet.
BRIEF DESCRIPTION OF THE DRAWINGS
The description will be made with reference to the accompanying drawings in which:
Figure 1 shows a functional block diagram of a collaboration system in accordance with one aspect of the invention;
Figure 2 shows a method of collaborating on medical image data;
Figure 3 shows certain aspects of the method of Figure 2 in more detail; and
Figures 4 and 5 show three dimensional (3D) representations of the data taken from an MRI scanner forming part of the collaboration system of Figure 1.
DETAILED DESCRIPTION OF EMBODIMENTS
[0022] Further features of the present invention are more fully described in the following description of several non-limiting embodiments thereof. This description is included solely for the purposes of exemplifying the present invention to the skilled addressee. It should not be understood as a restriction on the broad summary, disclosure or description of the invention as set out above. In the figures, incorporated to illustrate features of the example embodiment or embodiments, like reference numerals are used to identify like parts throughout.
[0023] With reference to the Figures, reference numeral 10 is used throughout this specification to indicate, generally, a collaboration system. The collaboration system includes an input interface 12 into which a series of two-dimensional image data files in Digital Imaging and Communications in Medicine (DICOM) format are receivable, the series of image data files representing successive two dimensional scans of a body area.
[0024] A data modelling processor 14 is connected to the input interface 12 and is operable to compile the series of two-dimensional image data files to a three dimensional (3D) representation of the data.
[0025] A database 16 is connected to the data modelling processor 14 and on the database 16 any one, or both of the series of two-dimensional (2D) image data files and three dimensional (3D) representations of the data is stored.
[0026] An output interface 18 is connected to the data modelling processor 14 for presenting any one, or both of the series of two-dimensional (2D) image data files and three dimensional (3D) representations of the data securely to a remote user (not shown) using display devices 20.
[0027] In this instance the collaboration system includes display devices in the form of a tablet 20.1, a desktop computer 20.2, a laptop computer 20.3, a mobile telephone 20.4 and two pairs of Virtual Reality (VR) goggles 20.5, 20.6. The tablet 20.1, the desktop computer 20.2, the laptop computer 20.3 and the mobile telephone 20.4 are operable to display two-dimensional (2D) image data files. The two pairs of Virtual Reality (VR) goggles 20.5, 20.6 are operable to display three dimensional (3D) representations of the data.
[0028] The display devices 20 are connectable to the output interface via the Internet 24. In this example, the collaboration system 10 includes a second database 22, connected to the data modelling processor 14 via the Internet 24.
[0029] Two dimensional data sources in the form of a Computer Tomography (CT) scanner 26 and a Magnetic Resonance Imaging (MRI) scanner 28 are connected to the input interface 12 via the Internet 24.
[0030] In use, the collaboration system 10 provides a method of collaborating on medical image data. The method includes receiving a series of two-dimensional image data files in Digital Imaging and Communications in Medicine (DICOM) format onto the data modelling processor 14. The series of data files are received from the data sources 26, 28 via the Internet 24 and the input interface 12.
[0031] The method then entails, on the data modelling processor 14, converting the series of two-dimensional image data files to a three dimensional (3D) representation of the data, such as the images shown in Figures 4 and 5.
[0032] The method may include implementing a single step conversion of the data, which refers to a single click interaction from the user in order to bring DICOM data into a virtual reality environment ready to be viewed and annotated and to enable use of the collaboration platform. It sorts the data into its individual series to be presented inside the VR space.
[0033] The two-dimensional (2D) image data files and the three dimensional (3D) representation of the data are stored on the database 16. As a backup for the data, or as an alternative to the database 16, the data is also stored on the remote database 22.
[0034] The method further includes permitting multiple users of the display devices 20 securely to access the 2D and 3D representations of the data. Typically one user can be a medical practitioner who was responsible for taking the images on the data sources 26, 28 and another user can be a patient viewing the 3D images on a set of VR goggles 20.5. A further user can be a second specialist medical practitioner, who views the 3D images on another display device, such as the laptop 20.3. The display devices 20 may include a voice interface by means of which the various users can communicate with each other.
[0035] The method of collaboration then entails that the users of the various display devices can view and consult on the 2D or 3D images in real time.
[0036] The method of collaboration may include the step of analysing the 3D representation by means of graphical tools. Analysing the 3D representation may include options to translate the image; to rotate the image, in which the 3D image is rotated on a 2D screen or in a 3D VR environment; to intersect the image, in which the 3D image is intersected to show certain portions of the 3D image more clearly on a 2D screen or in a 3D VR environment; to measure a portion of the image, in which a measurement line or lines are drawn as a ruler, to display to a user the real-life dimensions of the line or lines and to display the ruler with the rest of the image on a 2D screen or in a 3D VR environment; to draw an overlaid image on a portion of the image, to display the image and overlaid image on a 2D screen or in a 3D VR environment; to produce an overlaid mark on a portion of the image, to display the mark on a 2D screen or in a 3D VR environment; to record a video of any of the above annotations together with manipulation of the image in 2D or 3D on the server; to take an image snapshot of the 2D representation of the image and store the snapshot on the server; to adjust the contrast of the 2D or 3D image; to adjust the brightness of the 2D or 3D image; and to adjust the opacity of certain individual elements in the 2D or 3D image.
[0037] Figure 2 shows a flow diagram of the method of collaboration in accordance with one aspect of the invention. The flow diagram initiates at 52, where the 2D files originating from the data sources 26, 28 are scanned. If a valid directory which contains the relevant folders does not exist as tested at 54, execution directs to 52, alternatively execution directs to 56. At 56 a check is performed to check if the series of two-dimensional (2D) image data files (files) have previously been handled by the collaboration system, if it has, execution is directed to 58, if not, execution is directed to 60. At 60 the files are arranged for the conversion of the series of two-dimensional image data files to a three dimensional (3D) representation of the data to begin. At 62 the conversion of the files takes place. Execution then proceeds to 64.
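The "previously handled" check at 56 can be sketched as a conversion cache: fingerprint the series, and only run the expensive conversion at 60/62 when the fingerprint has not been seen before. The fingerprinting scheme and function names here are illustrative assumptions, not taken from the patent.

```python
import hashlib

_converted = {}  # cache: series fingerprint -> converted 3D volume

def fingerprint(file_paths):
    """Stable fingerprint of a series, so previously handled files
    can be detected without re-running the conversion."""
    h = hashlib.sha256()
    for p in sorted(file_paths):   # order-independent
        h.update(p.encode())
    return h.hexdigest()

def get_volume(file_paths, convert):
    """Return the cached conversion if the series was handled before,
    otherwise convert once and remember the result."""
    key = fingerprint(file_paths)
    if key not in _converted:
        _converted[key] = convert(file_paths)
    return _converted[key]

calls = []
def fake_convert(paths):
    calls.append(1)           # count how often conversion actually runs
    return "volume"

v1 = get_volume(["IM002", "IM001"], fake_convert)
v2 = get_volume(["IM001", "IM002"], fake_convert)  # same series, cache hit
print(len(calls))  # 1 -- conversion ran only once
```

A real system would fingerprint file contents rather than paths, but the control flow matches the diagram: cache hit routes to 58, cache miss routes to 60.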
[0038] If execution was directed to 58, then a test is performed to determine if patient information is attached in accordance with the Digital Imaging and Communications in Medicine (DICOM) format. If the patient data has been attached, execution directs to 66, alternatively execution directs to 64. At 64, header files are read and the information is extracted to be used in the conversion process, execution then proceeds to 68. At 68 the information is translated into machine learning algorithms to assist in data placement and in diagnosis. Execution is then directed back to 54.
[0039] At 66 a test is performed to confirm that the data has been anonymized, if it has, execution is directed to 70, if it has not been anonymized, execution is directed to 72. At 72 a check is performed to see if the data has previously been opened, if it has, the previously opened data settings are used at 74 to place data and to establish the evolving user interface (UI) system, if it has not, the normal user setting is used at 76 with the machine learning from other data groups with similar speciality characteristics.
[0040] At 70 the data is loaded into the data modelling processor 14. Execution is directed from 74 and 76 to 78 where a viewing platform is initialized to permit multiple users securely to access the 3D representation of the data. At 80 a user is presented with various operating functions of the data modelling processor 14.
[0041] At 82 the user interface presents a user with options to set up the user interface at 84, to use data manipulation tools at 86 and subsequently the tools are enabled based on the user's medical speciality at 88. At 90 the data isolation tools are presented and subsequently the windowing values are set up at 92 based on data used by other users of similar data groups.
[0042] Setting up the windowing values at 92 may include analysing the 3D representation with graphical tools which further includes adjusting the brightness of the image via the window level. Analysing the 3D representation with graphical tools may then further include adjusting the contrast via the window width.
[0043] From the operating functions presented at 80, a user can select the option to record a sequence, to capture a selection or to record a dictation at 94. Further from the operating functions presented at 80, a user can select the option to enable a virtual reality platform at 96 that provides
the options for users to collaborate at 98 and/or to present a VR multi-user experience at 100.
[0044] Following the option to collaborate at 98, a global specialist platform is enabled at 102, individual patient sessions are enabled at 104 and the collaboration is presented visually in 2D and 3D at 106.
[0045] Following the selection of a VR multi-user experience at 100, the interactions between medical practitioners are grouped into Doctor to Doctor, Doctor to patient and Doctor to team at 108, and at 110 the multi-user experience is extended to Educator to students, student to student and individual study by students.
[0046] Figure 3 illustrates certain aspects of the process in more detail and illustrates certain additional aspects in addition to the method of collaboration shown in Figure 2. Instances where the steps correspond with the steps in Figure 2 are not described again. However, the additional steps are described below.
[0047] Referring to Figure 3, as shown in broken line, once the files were arranged at 60 for the conversion of the series of two-dimensional image data files to a three dimensional (3D) representation of the data and once the conversion has taken place at 62, a test is done at 61 to determine if the data conversion was adequate. If the data conversion was not adequate, the system reverts to default settings and converts the data again at 63. If the data conversion was adequate, any conversions that did not meet a certain criterion or criteria are removed at 65 and the
required file structure is constructed and the files are validated at 67.
[0048] In addition to the method shown in Figure 2, as can be seen in Figure 3, after the check is performed at 72 to determine if the data has previously been opened, if it has, the previously opened data settings are used at 74 to place data and to establish the evolving user interface (UI) system, if it has not, the normal user setting is used at 76 with the machine learning from other data groups from similar speciality characteristics. Then at 75 a step is taken where machine learning is implemented to assist the data setup for display and the patient records are read into the platform. Execution then continues at 78.
[0049] Figures 4 and 5 show three dimensional (3D) representations of the data taken from an MRI scanner forming part of the collaboration system of Figure 1. From the images it is clear how the two dimensional data is represented in three dimensions. Certain features of the scan can then be better illustrated, highlighted and annotated.
[0050] The Applicant is of the opinion that the invention provides a useful method of collaborating on medical data and a useful collaboration system.
[0051] Optional embodiments of the present invention may also be said to broadly consist in the parts, elements and features referred to or indicated herein, individually or collectively, in any or all combinations of two or more of the parts, elements or features, and wherein specific integers are mentioned herein which have known equivalents in the art to which the invention relates, such known equivalents are deemed
to be incorporated herein as if individually set forth. In the example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail, as such will be readily understood by the skilled addressee.
[0052] The use of the terms a, an, said, the, and/or similar referents in the context of describing various embodiments (especially in the context of the claimed subject matter) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms comprising, having, including, and containing are to be construed as open-ended terms (i.e., meaning including, but not limited to,) unless otherwise noted. As used herein, the term and/or includes any and all combinations of one or more of the associated listed items. No language in the specification should be construed as indicating any non-claimed subject matter as essential to the practice of the claimed subject matter.
[0053] It is to be appreciated that reference to one example or an example of the invention, or similar exemplary language (e.g., such as) herein, is not made in an exclusive sense. Various substantially and specifically practical and useful exemplary embodiments of the claimed subject matter are described herein, textually and/or graphically, for carrying out the claimed subject matter.
[0054] Accordingly, one example may exemplify certain aspects of the invention, whilst other aspects are exemplified in a different example. These examples are intended to assist the skilled person in performing the invention and are not intended to limit the overall scope of the invention in any way unless the context clearly indicates otherwise. Variations (e.g. modifications and/or enhancements) of one or more embodiments described herein might become apparent to those of ordinary skill in the art upon reading this application. The inventor(s) expects skilled artisans to employ such variations as appropriate, and the inventor(s) intends for the claimed subject matter to be practiced other than as specifically described herein.
[0055] Any method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. It is also to be understood that additional or alternative steps may be employed.

Claims (5)

1. A method of collaborating on medical image data, the method including the steps of:
receiving a series of two-dimensional image data files in Digital Imaging and Communications in Medicine (DICOM) format;
converting the series of two-dimensional image data files to a three-dimensional (3D) representation of the medical image data;
storing the 3D representation on a database; and
permitting multiple users securely to access the 3D representation of the medical image data,
said method including implementing Machine Learning on the medical image data to display hotspots in the 3D representation for a user to investigate from certain preset values, the Machine Learning configured to:
a) present the 3D representation with presets and settings aligned to the relevant medical speciality; and
b) identify more hotspots as more users interact with the 3D representation.
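The conversion and hotspot steps of claim 1 can be sketched in a few lines: stack the successive 2D slices into a 3D array, then flag voxels above a preset intensity value. The function names, the thresholding rule, and the use of plain arrays in place of parsed DICOM pixel data are illustrative assumptions; the claim's Machine Learning component is not reproduced here.

```python
# Illustrative sketch of claim 1: series of 2D slices -> 3D volume,
# plus a simple preset-value "hotspot" flag. Not the patented ML method.
import numpy as np

def build_volume(slices):
    """Stack successive 2D scan slices into a 3D array (slice axis first)."""
    return np.stack(slices, axis=0)

def find_hotspots(volume, preset_value):
    """Return (z, y, x) voxel coordinates whose intensity exceeds the preset."""
    return list(zip(*np.where(volume > preset_value)))

# Three 4x4 arrays standing in for pixel data parsed from a DICOM series.
slices = [np.zeros((4, 4)) for _ in range(3)]
slices[1][2, 3] = 250.0  # one bright voxel on the middle slice
volume = build_volume(slices)
hotspots = find_hotspots(volume, preset_value=200.0)
```

In practice the slices would be sorted by their position in the series before stacking, since DICOM files in a directory are not guaranteed to arrive in scan order.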
2. The method of claim 1, which includes the step of permitting a user to annotate the 3D representation of the medical image data, wherein annotation allows users to selectively isolate areas of interest in the image data files and mark them for future reference, and wherein the annotated isolated areas of interest are automatically sent to a secure online collaboration platform and shared with users either by images or a video recording of a screen.
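Claim 2's annotation step can be sketched as isolating a sub-volume and recording it for sharing. The annotation schema, the `annotate` function, and the `shared` queue standing in for the online collaboration platform are illustrative assumptions, not the claimed implementation.

```python
# Illustrative sketch of claim 2: isolate an area of interest in the 3D
# representation, mark it for future reference, and queue it for sharing.
import numpy as np

def annotate(volume, region, label, shared=None):
    """Record a rectangular region of interest as an annotation.

    region is ((z0, z1), (y0, y1), (x0, x1)); the isolated sub-volume is
    copied into the record so it can be shared independently.
    """
    (z0, z1), (y0, y1), (x0, x1) = region
    record = {
        "label": label,
        "region": region,
        "snapshot": volume[z0:z1, y0:y1, x0:x1].copy(),  # isolated area
    }
    if shared is not None:
        shared.append(record)  # stand-in for sending to the platform
    return record

volume = np.arange(27.0).reshape(3, 3, 3)  # toy 3x3x3 reconstruction
queue = []  # hypothetical outbound share queue
rec = annotate(volume, ((1, 2), (0, 2), (0, 2)), "lesion?", shared=queue)
```

Copying the snapshot rather than referencing the volume keeps each shared annotation self-contained, which matches the claim's idea of sending isolated areas to the platform as images.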
3. The method of any one of the preceding claims, wherein the step of permitting multiple users securely to access the 3D representation of the data includes permitting users to view the 3D data by means of Virtual Reality (VR) and Mixed Reality (MR) goggles and/or view the data on a two-dimensional display screen, such as a computer display, a handheld tablet, or a mobile telephone.
4. A collaboration system for collaborating on medical image data, said system including:
an input interface into which a series of two-dimensional image data files in Digital Imaging and Communications in Medicine (DICOM) format are receivable, the series of image data files representing successive two-dimensional scans of a body area;
a data modelling processor operable to receive the two-dimensional image data files from the input interface and to compile the series of two-dimensional image data files to a three-dimensional (3D) representation of the data;
a database, connected to the data modelling processor, on which any one, or both of the series of two-dimensional (2D) image data files and three-dimensional (3D) representations of the data is stored;
an output interface for presenting any one, or both of the series of two-dimensional (2D) image data files and three-dimensional (3D) representations of the data securely to a remote user, said data modelling processor configured to implement Machine Learning on the medical image data to display hotspots in the 3D representations for a user to investigate from certain preset values, the Machine Learning configured to:
a) present the 3D representation with presets and settings aligned to the relevant medical speciality; and
b) identify more hotspots as more users interact with the 3D representations.
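Clause (a) of claims 1 and 4, presenting the 3D representation with presets aligned to the relevant medical speciality, amounts to a lookup from speciality to display settings. The speciality names, the window/level preset values, and the fallback default below are illustrative assumptions only.

```python
# Illustrative sketch of clause (a): select display presets aligned to a
# medical speciality. Values are placeholders, not clinically validated.
SPECIALITY_PRESETS = {
    "radiology":    {"window_center": 40,  "window_width": 400},   # soft tissue
    "orthopaedics": {"window_center": 300, "window_width": 1500},  # bone
}
DEFAULT_PRESET = {"window_center": 50, "window_width": 350}

def preset_for(speciality):
    """Return the display preset for a speciality, falling back to a default."""
    return SPECIALITY_PRESETS.get(speciality.lower(), DEFAULT_PRESET)

bone = preset_for("Orthopaedics")
```

Falling back to a default preset keeps the viewer usable for specialities the table does not yet cover, which suits a collaboration platform serving mixed audiences.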
5. The system of claim 4, which includes implementing a single-step conversion of the data, which refers to a single-click interaction from the user in order to bring DICOM data into a virtual reality environment ready to be viewed, annotated and enable use of the collaboration platform, and wherein the collaboration system includes a display device in the form of any one of a mobile telephone, a tablet, a laptop computer, a desktop computer, a pair of virtual reality goggles, such display device operable to display any one, or both of the series of two-dimensional (2D) image data files and three-dimensional (3D) representations of the data.
AU2019262082A 2018-04-30 2019-04-29 Medical virtual reality and mixed reality collaboration platform Pending AU2019262082A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AU2018901434 2018-04-30
AU2018901434A AU2018901434A0 (en) 2018-04-30 Medical virtual reality and mixed reality collaboration platform
PCT/AU2019/050380 WO2019210353A1 (en) 2018-04-30 2019-04-29 Medical virtual reality and mixed reality collaboration platform

Publications (2)

Publication Number Publication Date
AU2019262082A1 AU2019262082A1 (en) 2020-06-04
AU2019262082A2 true AU2019262082A2 (en) 2020-06-25

Family

ID=68386911

Family Applications (2)

Application Number Title Priority Date Filing Date
AU2019262082A Pending AU2019262082A1 (en) 2018-04-30 2019-04-29 Medical virtual reality and mixed reality collaboration platform
AU2019101736A Active AU2019101736A4 (en) 2018-04-30 2019-04-29 Medical virtual reality and mixed reality collaboration platform

Family Applications After (1)

Application Number Title Priority Date Filing Date
AU2019101736A Active AU2019101736A4 (en) 2018-04-30 2019-04-29 Medical virtual reality and mixed reality collaboration platform

Country Status (3)

Country Link
CN (1) CN112424870A (en)
AU (2) AU2019262082A1 (en)
WO (1) WO2019210353A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2536650A (en) 2015-03-24 2016-09-28 Augmedics Ltd Method and system for combining video-based and optic-based augmented reality in a near eye display
US11980507B2 (en) 2018-05-02 2024-05-14 Augmedics Ltd. Registration of a fiducial marker for an augmented reality system
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
US11980506B2 (en) 2019-07-29 2024-05-14 Augmedics Ltd. Fiducial marker
US11382712B2 (en) 2019-12-22 2022-07-12 Augmedics Ltd. Mirroring in image guided surgery
US11896445B2 (en) 2021-07-07 2024-02-13 Augmedics Ltd. Iliac pin and adapter

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013188850A1 (en) * 2012-06-14 2013-12-19 Trice Imaging, Inc. Systems and devices for encrypting, converting and interacting with medical images
US10734116B2 (en) * 2011-10-04 2020-08-04 Quantant Technology, Inc. Remote cloud based medical image sharing and rendering semi-automated or fully automated network and/or web-based, 3D and/or 4D imaging of anatomy for training, rehearsing and/or conducting medical procedures, using multiple standard X-ray and/or other imaging projections, without a need for special hardware and/or systems and/or pre-processing/analysis of a captured image data
US20140176661A1 (en) * 2012-12-21 2014-06-26 G. Anthony Reina System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom)
US9503681B1 (en) * 2015-05-29 2016-11-22 Purdue Research Foundation Simulated transparent display with augmented reality for remote collaboration
CA3016346A1 (en) * 2016-03-21 2017-09-28 Washington University Virtual reality or augmented reality visualization of 3d medical images
CN106296805B (en) * 2016-06-06 2019-02-26 厦门铭微科技有限公司 A kind of augmented reality human body positioning navigation method and device based on Real-time Feedback
GB201617507D0 (en) * 2016-10-14 2016-11-30 Axial3D Limited Axial3D UK

Also Published As

Publication number Publication date
CN112424870A (en) 2021-02-26
AU2019101736A4 (en) 2020-07-09
WO2019210353A1 (en) 2019-11-07
AU2019262082A1 (en) 2020-06-04

Similar Documents

Publication Publication Date Title
AU2019101736A4 (en) Medical virtual reality and mixed reality collaboration platform
Hanna et al. Augmented reality technology using Microsoft HoloLens in anatomic pathology
US10229497B2 (en) Integration of medical software and advanced image processing
US20090182577A1 (en) Automated information management process
CN101657819A (en) Hanging protocol display system and method
CN107615266A (en) Method for capturing layering screen content
CN103337047A (en) Conference preparation apparatus and conference preparation method
Huang Pacs-based multimedia imaging informatics: Basic principles and applications
JP2015176456A (en) Image processor and program
US20170018204A1 (en) Ultrasound case builder system and method
JP2017191461A (en) Medical report creation apparatus and control method thereof, medical image viewing apparatus and control method thereof, and program
RU2757711C2 (en) Certificates of medical visual display for mobile devices
Lebre et al. Collaborative framework for a whole-slide image viewer
Engelmann et al. Second generation teleradiology
US7961935B2 (en) Method for forming and distributing a composite file including a dental image and associated diagnosis
Silas et al. Telepathology in Nigeria for global health collaboration
Hazarika et al. DICOM-based medical image repository using DSpace
Giansanti et al. WhatsApp in mHealth: design and evaluation of an mHealth tool to share dynamic images in hemodynamics
Lebre et al. Pathobox: the collaborative tele-pathology platform with access management
JP6825606B2 (en) Information processing device and information processing method
Hanna et al. Telecytology for rapid on-site evaluation
JP4368160B2 (en) Image display quality data providing apparatus, program, and method
Nunes et al. Data and Sessions Management in a Telepathology Platform.
US11922668B2 (en) Asynchronous region-of-interest adjudication for medical images
Bohak et al. Remote interaction in web-based medical visual application

Legal Events

Date Code Title Description
DA3 Amendments made section 104

Free format text: THE NATURE OF THE AMENDMENT IS: APPLICATION IS TO PROCEED UNDER THE NUMBER 2019101736

Free format text: THE NATURE OF THE AMENDMENT IS AS SHOWN IN THE STATEMENT FILED 15 MAY 2020