US20140176661A1 - System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom)
- Publication number
- US20140176661A1 (U.S. application Ser. No. 14/138,045)
- Authority
- US
- United States
- Prior art keywords
- virtual
- mesh
- images
- server
- medical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- G06Q50/24—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T9/00—Image coding
- G06T9/001—Model-based coding, e.g. wire frame
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
- H04N7/155—Conference systems involving storage of or access to video conference sessions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/50—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/20—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
- H04N19/25—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding with scene description coding, e.g. binary format for scenes [BIFS] compression
Abstract
The invention relates generally to a medical apparatus and method of using the same for receiving and transmitting streaming medical imagery and audio signals in real time, and allowing remote operators to annotate and telestrate with same. The invention acquires streaming medical imagery and audio signals through a telestreamer input device, allowing users to electronically collaborate, generally by telestrating, annotating, and sketching image overlays on streaming medical imagery. Video images of streaming imagery data displayed on a monitor are superimposed onto a virtual mesh projected via computer graphics. The vertices of the mesh move according to equations of motion based on a computational physics engine. Virtual tools are projected above the mesh via computer graphics. These virtual tools interact with the virtual mesh according to the physics engine. The superposition of the video images onto the virtual mesh makes it appear that points within the video image are moving in a realistic manner and reacting to the virtual tools with a realistic response. The invention allows for recursive superposition of mesh layers, also known as ‘surgi-skins’, and creation of a multi-layered virtual mesh. Multi-layered surgi-skins synthesized from multi-modal streaming medical imagery and saved together with multi-dimensional [4-D] virtual mesh and multi-sensory annotation in single file format as DICOM files are also known as ‘DICOM mesh’ or ‘haptic holograms’.
Description
- This application claims the benefit of:
- U.S. Provisional Application No. 61/745,383 filed Dec. 21, 2012 entitled “SYSTEM AND METHOD FOR SURGICAL TELEMENTORING USING VIRTUALIZED TELESTRATION,” naming as inventors, G. Anthony Reina and James Omer L'Esperance, which is incorporated herein by reference in its entirety.
- This application may be related to the following commonly assigned and commonly filed U.S. patent applications, each of which is incorporated herein by reference in its entirety:
-
- 1. U.S. patent application Ser. No. US 2011/0282141 A1 entitled “METHOD AND SYSTEM OF SEE-THROUGH CONSOLE OVERLAY”, naming as inventors Itkowitz et al., filed on 17 Nov. 2011.
- 2. U.S. patent application Ser. No. US 2011/0282140 A1 entitled “METHOD AND SYSTEM OF HAND SEGMENTATION AND OVERLAY USING DEPTH DATA”, naming as inventors Itkowitz et al., filed on 17 Nov. 2011.
- 3. U.S. patent application Ser. No. US 2010/0164950 A1 entitled “EFFICIENT 3-D TELESTRATION FOR LOCAL ROBOTIC PROCTORING”, naming as inventors Zhao et al., filed on 1 Jul. 2010.
- 4. U.S. Pat. No. 8,169,468 B2 entitled “AUGMENTED STEREOSCOPIC VISUALIZATION FOR SURGICAL ROBOT”, naming as inventors Scott et al., issued on 1 May 2012.
- 5. U.S. patent application Ser. No. US 2009/0036902 A1 entitled “INTERACTIVE USER INTERFACE FOR ROBOTIC MINIMALLY INVASIVE SURGICAL SYSTEMS”, naming as inventors DiMaio et al., filed on 5 Feb. 2009.
- 6. U.S. patent application Ser. No. US 2011/0107238 A1 entitled “NETWORK-BASED COLLABORATED TELESTRATION ON VIDEO, IMAGES, OR OTHER SHARED VISUAL CONTENT”, naming as inventors Liu and Zhou, filed on 5 May 2011.
- 7. U.S. Pat. No. 7,492,363 B2 entitled “TELESTRATION SYSTEM”, naming as inventors Meier et al., issued on 17 Feb. 2009.
- 8. Canadian Patent No. CA 2545508 C entitled “CAMERA FOR COMMUNICATION OF STREAMING MEDIA TO A REMOTE CLIENT”, naming as inventors Kavanagh et al., filed on Oct. 7, 2003.
- 9. U.S. patent application Ser. No. US 20090210801 A1 entitled “N-way multimedia collaboration systems”, naming as inventors Bakir et al., filed on 19 Feb. 2008.
- 10. U.S. patent application Ser. No. US 20060122482 A1 entitled “Medical image acquisition system for receiving and transmitting medical images instantaneously and method of using the same”, naming as inventors Mariotti et al., filed on 22 Nov. 2004.
- 11. U.S. patent application Ser. No. US20110126127 A1 entitled “System and method for collaboratively communicating on images and saving those communications and images in a standard known format”, naming as inventors Mariotti et al., filed on 23 Nov. 2009.
- Not applicable.
- Not applicable.
- 1. Field of Invention
- Aspects of this invention are related to telestration for remote video collaborating with streaming medical imagery and are, more specifically, related to enhancing a remote telementor's ability to annotate and interact with the images in a more realistic, yet virtualized manner through simulating the movement and reaction of the displayed images according to a computational physics model.
- 2. Description of Related Art
- Industries that develop, manufacture, and maintain complex products often find an insufficient number of employees with extensive training and experience to meet demand. This is particularly relevant as businesses become more geographically diverse. It is inefficient (and sometimes physically impossible) to deploy an expert “into the field” on every occasion at a moment's notice. Rather, companies typically deploy technicians with relative degrees of experience who collaborate with the expert remotely. For example, a multi-national aerospace company might have local technicians in an Italian production plant conferring with senior designers in the United States regarding the fabrication concerns for a specialized airframe. Similarly, technicians on an ocean oil rig may consult with shore side experts to address problems with specialized drilling machinery. Traditionally, video monitoring, as described in previous art, has been instrumental in achieving this collaboration.
- Conventional tele-monitoring (aka teleconferencing) allows real-time audio and video tele-collaboration to improve education, training, and performance in many fields. Current collaboration methods include telestration, which can be performed either locally or remotely to identify regions of interest within the video images. For example, television personalities routinely annotate video of live or replayed video broadcasts to highlight their commentary. Similarly, flight engineers can remotely inspect possible damage to space vehicles using telestrated, high-definition images of the equipment while it is still in orbit. In short, expert know-how can be maintained at a centralized location while being mobilized anywhere at a moment's notice.
- Current telestration techniques, as defined in prior art, primarily display freehand and other two-dimensional drawings over a video image or series of images. However, true collaboration is better achieved if the remote expert can demonstrate information through movement and manipulation of the images. In this invention, a computer simulation of the objects within the video images is constructed so that they can be manipulated in a more realistic manner.
- The promotion of electronic medical records has spurred the expansion of healthcare information technology (HIT) infrastructure and led to the growth of medical information technologies, such as networked medical imaging and virtual reality (VR).
- Traditionally, a medical image is produced when an operator or technician conducts a scan of the patient with a medical imaging apparatus. Medical imaging modalities include X-ray, CT, MRI, and ultrasound scanners. The operator uses the imaging apparatus to save the image (in still or motion video format) onto a hard copy (e.g. film), into the memory, or into an image storage database or repository, such as a Picture Archiving and Communications System (PACS). PACS is a storage and management system for multiple medical imaging modalities. These images, such as X-rays, MRI and CAT scans, generally require a greater amount of storage than images in non-medical industries. An operator, or user, such as a surgeon, can use PACS to retrieve the saved images either locally or remotely and conceivably use them for navigational or interventional guidance during a surgical procedure.
- Digital Imaging and Communications in Medicine (DICOM) is a standard for managing medical data, including medical imaging. DICOM has many roles in healthcare information technology: It is a standard for exchanging digital information which ensures interoperability between medical imaging equipment (such as radiological imaging) and other systems. It is a protocol for medical device communication over a network, defining syntax and semantics for commands and associated information that can be exchanged. It is a file format and medical directory structure to facilitate access to images and related information stored on media that shares information. It is a printing and display standard to ensure that medical imagery is uniformly presented independent of the device.
- Virtual reality applications in the healthcare industry are associated with many areas of medical technology innovation including robot-assisted surgery, augmented reality (AR) surgery, computer-assisted surgery (CAS), image-guided surgery (IGS), surgical navigation, pre-operative surgical planning, virtual colonoscopy, virtual surgical simulation, and virtual reality exposure therapy (VRET). In addition to intraoperative surgical navigation and guidance, VR tools are often used for medical data visualization, including multi-modality image fusion and advanced 2D/3D/4D image reconstruction. Education and training applications include virtual surgical and procedural simulators. Patient use of VR tools finds application in rehabilitation and therapy, including immersive VR systems for pain management, behavioral therapy, psychological therapy, physical rehabilitation, and motor skills training. Clinical benefits of healthcare VR technology include improved patient outcomes, reduced medical errors, improved minimally-invasive surgical (MIS) technique, improved physician collaboration in diagnosis, and improved psychological and motor rehabilitation.
- The invention relates generally to a multimedia collaborative teleconferencing system and method of using the same for generating telestrations and annotations on streaming medical imagery and saving same for tele-consultation, tele-collaboration, tele-monitoring, tele-proctoring, and tele-mentoring with other users.
- The apparatus includes a medical image acquisition system adapted for receiving and transmitting medical images, constructed from a computer having communications capability adapted for acquisition and transmission of a plurality of medical imaging and video signals. The medical image and video signals are acquired at the medical device's native resolutions; the apparatus transmits the signals at their native resolutions and native frame rates to a receiving device, which receives the medical imaging video signals in analog or digital form, compresses and/or scales the signal if required, converts the signal to digital form for transmission, and transmits the digital signals to a display device.
- A computer is typically made of several components, such as a main circuit board assembly having a central processing unit, memory storage to store programs and files, other storage devices such as hard drives and portable memory storage, a power supply, a sound and video circuit board assembly, a display, and an input device such as a keyboard, mouse, stylus pen, and the like allowing control of the computer graphics user interface display, where any two or more of such components may be physically integrated or may be separate. A network server is a computer that manages network traffic; any user on the network can store files on the server.
- The medical image acquisition system is capable of acquiring signals from a plurality of medical imaging systems including but not limited to, ultrasound, computed tomography (CT) scan, fluoroscopy, endoscopy, magnetic resonance imaging, nuclear medicine, echocardiogram ultrasound and microscopy. The medical receiving device acquires the video image signal from a plurality of video sources, including but not limited to, S-video, composite color and monochrome, component red green blue video (RGB, three additive primary colors), Digital Visual Interface (DVI), any video transport protocol including digital and analog protocols, high definition multimedia interface (HDMI, compact audio video interface uncompressed digital data), serial digital interface (SDI), and DICOM video in their native, enhanced or reduced resolutions or their native, enhanced or reduced frame rates.
- The apparatus includes a storage device adapted for archiving the video signal in a predetermined digital format, including Digital Imaging and Communications in Medicine (DICOM). Data is transmitted using secure encryption protocols and video signal resolution is transmitted at the same resolution as the received signal. In one illustration, a remote location communicates with the networked computer, for the purpose of collaborating and conferencing.
- The present invention improves on existing telestration techniques via the addition of virtual telestration tools that can physically manipulate the video images in a natural way based on a physics model of the object(s) being displayed. Telestration techniques described in prior art rely on freehand drawing of lines or shapes which are then displayed as overlays onto the video images. In the current embodiment, the user controls virtual tools which are able to cut, push, pull, twist, and suture the video images as if they were actually manipulating human tissue.
- While the current embodiment is a natural fit for telestrating/telementoring over real-time or stored medical images, such as with surgical telemedicine, the method can be applicable to any telestration requiring one user to demonstrate the use of a tool to an operator who is actually using the tool at that time. Although this technique is naturally suited to such remote student-mentor scenarios, it can also be applied to single-user interfaces. Most notably, with the application of the computational physics model included in the current invention, the user can practice a technique in a virtualized manner on live video images prior to actually performing the maneuver.
- This flexibility makes the technique adaptable for the use in remote fieldwork. For example, a telecommunications technician working in a remote location can receive realtime guidance from an expert located elsewhere. Through virtual tool telestration, the expert can annotate which segments to push, pull, twist, and cut in a realistic, but still virtualized manner. The local technician can also use the same annotation tools to practice the task under the guidance of the expert before actually performing the task. By adjusting parameters of the virtual video mesh and computational physics model described below, these annotation techniques can be applied to approximate any objects displayed within the video.
- The present invention is accomplished using a combination of both hardware and software. The software used for the present invention is stored on one or more processor readable storage media including hard disk drives, RAM, ROM, optical drives, and other suitable storage devices. In alternative embodiments, some or all of the software may be replaced with dedicated hardware, including custom integrated circuits and electronic processors.
- The novel features of this invention are:
-
- (1) the ability to create and modify ‘synthetic’ DICOM information objects. These objects, referred to as ‘surgi-skins’, are multi-modal, multi-layer virtual meshes synthesized from streaming medical imagery.
- (2) the ability to encapsulate ‘surgi-skins’ with user metadata, synchronized audio annotations, and haptic annotation, plus save that data together in a single file format structure based on DICOM and referred to as the ‘DICOM Mesh’.
- The advantages and novelty of the present invention will appear more clearly from the following description and figures in which the preferred embodiment of the invention is described in detail.
- Within the figures, the following reference characters are used to refer to the following elements of the exemplary system illustrated in the drawings.
- 10 is an exemplary video stream.
- 12 is a 3D mesh object virtual tool exemplification.
- 14 is a tele-video mesh overlay.
- 16 is an exemplary mesh deformation.
- 18 is an exemplary mesh tear.
-
FIG. 1 is a detailed view of the virtual mesh telestration. In this example, a rectangular 12-column grid (14) of equilateral triangles (aka virtual mesh) is constructed via computer graphics. Each vertex (black circle) is connected to another via a computational physics model (spring) which calculates the vertex's three-dimensional position using pre-programmed parameters, including a spring constant, gravitational acceleration, and a damping factor. The border vertices (black squares) remain in fixed positions. The video image of an outstretched left arm (10) is superimposed onto the virtual mesh. A virtual scalpel (12) is superimposed over both 10 and 14.
FIG. 2 is a detailed view based on FIG. 1 after the virtual scalpel has been moved to the left, which simulates a cut to the virtual mesh (12′→12). The vertices of the virtual mesh (14) move according to the computational physics engine and create new sub-triangles within the mesh (16). This movement creates a void (18) in the mesh. The superimposed video image of the outstretched left arm (10) moves according to the displacement of the associated vertices of the virtual mesh and gives the appearance that the virtual scalpel (12) has in fact “cut” the arm in a realistic manner. Nevertheless, although the original video image is displayed in a distorted manner, the data (and the actual arm) remain unchanged.
FIG. 3 is a detailed view of the virtual mesh telestration using a forceps tool (12). As with FIG. 1, the virtual mesh is constructed with a 12-column rectangular arrangement of equilateral triangles (14) whose vertices move according to a computational physics model (spring). A video image of an outstretched left arm (10) is superimposed onto the virtual mesh.
FIG. 4 is a detailed view based on FIG. 3 after the virtual forceps have moved a vertex up and to the left (12′→12). With this tool movement, no vertices are created or destroyed; instead, they move according to the computational physics model (stretched and squeezed springs). The superimposed video image of the outstretched left arm (10) moves according to the displacement of the associated vertices of the virtual mesh and gives the appearance that the virtual forceps has pulled a section of the arm up and to the left. Nevertheless, although the original video image is displayed in a distorted manner, the data (and the arm) remain unchanged.
FIG. 5 is a workflow diagram of the application and method. The output of an imaging device (e.g. a video camera) (#4) is captured by a Surgicom Telenetwork server (#3) and sent to the Surgicom Telestreamer (#2), which digitizes its content and transmits it over telecommunication lines in realtime. A 3D virtual tool telestrator (#1) receives the video telestream and allows the client to annotate the video images as described in FIGS. 1 and 2 using virtual telestration tools. These annotations are streamed back to the Surgicom Telestreamer (#2), which updates the original video source device stream (#4) with the annotated version. Note that multiple 3D virtual tool telestrators (#1) may act as clients to the Surgicom Telestreamer (#2). All clients view the same video images and can annotate them independently. Also, note that the Surgicom Telenetwork server (#3) is able to save, store, and transmit data to the Surgicom Telestreamer (#2) from recorded sources.
FIG. 6 is a network diagram of the “Surgicom Telementoring Network” consisting of a Surgicom Telenetwork server, Surgicom Streamer Stack, Local (O.R.) clients, LAN, WAN, and web-based tele-proctors and clients.
FIG. 7 is a workflow diagram of the Surgicom P.A.C.S. and Holographic systems connected to the Surgicom Streamer Stack and the Surgicom Telenetwork Server.
FIG. 8 is an exemplar of the haptic holographic display using a virtual mesh and virtual 3D tools.
FIG. 9 is a workflow diagram of a Surgicom Surgical Telementoring Session.
FIG. 10 is an exemplar of the Surgicom Console session. FIG. 10A (“See One”) shows how pre-operative surgical planning, review, road map, and holographic ‘priors’ can be accessed from the system. FIG. 10B (“Do One”) shows how a surgeon can simulate alternative approaches with virtual ‘cuts’ and synthesize them into haptic holograms. FIG. 10C (“Teach One”) shows how to annotate and save haptic holograms as teaching files with multi-modal medical imagery and send them to PACS.
- In the following description, a preferred embodiment of the invention is described with regard to process and design elements. However, those skilled in the art will recognize, after reading this application, that alternate embodiments of the invention may be implemented with regard to hardware or software without requiring undue invention.
- There are 3 main components to this method:
- (1) the virtual mesh
- (2) the UV texture map
- (3) the virtual tools.
- The virtual mesh is a computer graphics representation of a video display where each vertex of the mesh corresponds to a position within the video image. In a static display, the virtual mesh is analogous to a pixel map of the video image. In this invention, however, the vertices of the virtual mesh are not necessarily aligned with the pixels of the video image. More importantly, the locations of the vertices are not fixed in space, but rather can move with respect to one another as if each vertex were a physical object (or a part of a physical object) in the real world.
- In the current instantiation, the virtual mesh is constructed using equilateral triangles arranged in a 12-column grid (FIG. 1). Equilateral triangles were chosen because they are computationally easier to sub-divide than other geometric shapes. Nevertheless, any shape (2D or 3D) can be used to create the mesh. In addition, multiple meshes of varying configurations can be produced to represent features and objects within the streamed imaging modality. Further, the overall mesh is rectangular in shape because video images are usually displayed in this manner; but the shape of the mesh can be changed to conform to the needs of the telestration.
- Machine vision techniques may be applied to sub-divide the mesh according to objects within the video image. For example, a mesh displaying a video of an automobile could be sub-divided into body, wheels, and background—with each sub-segment of the mesh being programmed to mimic the physical characteristics of the objects they represent. This would compensate for any relative movement among the camera, objects, or field of view.
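The triangular grid described above can be sketched in a few lines of code. This is only an illustrative construction (the function name, default row count, and row-staggering scheme are assumptions for demonstration, not taken from the specification); it builds vertex positions and a triangle index list for a staggered grid of equilateral triangles:

```python
import numpy as np

def make_triangle_mesh(cols=12, rows=8, side=1.0):
    """Build a rectangular grid of equilateral triangles (a virtual-mesh
    sketch; the 12-column figure in the text is one possible choice).
    Returns (vertices, triangles): an (N, 2) float array of positions and
    an (M, 3) int array of per-triangle vertex indices."""
    h = side * np.sqrt(3) / 2            # row spacing for equilateral triangles
    verts = []
    for r in range(rows + 1):
        offset = 0.5 * side if r % 2 else 0.0   # stagger alternate rows
        for c in range(cols + 1):
            verts.append((c * side + offset, r * h))
    verts = np.array(verts)
    tris = []
    w = cols + 1                          # vertices per row
    for r in range(rows):
        for c in range(cols):
            a, b = r * w + c, r * w + c + 1           # lower pair
            d, e = (r + 1) * w + c, (r + 1) * w + c + 1  # upper pair
            if r % 2 == 0:
                tris.append((a, b, d)); tris.append((b, e, d))
            else:
                tris.append((a, b, e)); tris.append((a, e, d))
    return verts, np.array(tris)

# (rows+1)*(cols+1) vertices and 2 triangles per grid cell
verts, tris = make_triangle_mesh()
```

Each vertex of this grid would then be handed to the physics model, with the border vertices pinned as in FIG. 1.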
- In the current embodiment, a surgeon could identify regions of interest within the image (e.g. major organs, nerves, or blood vessels) by encircling them with conventional freehand drawing telestration. An optical flow algorithm, such as the Lucas-Kanade method, could be used to track each region of interest within the realtime video. The virtual mesh would be continually updated to change the parameters of the sub-meshes based on the regions of interest. This would ensure, for example, that a cut in the mesh which was made to overlay the prostate would keep the same relative position and orientation with respect to the prostate regardless of movement.
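The Lucas-Kanade step mentioned above reduces to a least-squares solve over image gradients. The sketch below is a deliberately minimal, single-window form (no image pyramid, no per-feature windows; the function name and synthetic test image are assumptions for illustration, not a production tracker):

```python
import numpy as np

def lucas_kanade_flow(prev, curr):
    """Estimate a single (u, v) translation between two grayscale frames
    using the basic Lucas-Kanade least-squares step over the whole frame."""
    Ix = np.gradient(prev, axis=1)        # spatial gradients of the first frame
    Iy = np.gradient(prev, axis=0)
    It = curr - prev                      # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()                       # solve Ix*u + Iy*v = -It
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic check: a smooth blob shifted one pixel to the right.
y, x = np.mgrid[0:64, 0:64].astype(float)
blob = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / 60.0)
shifted = np.exp(-((x - 33.0) ** 2 + (y - 32.0) ** 2) / 60.0)
u, v = lucas_kanade_flow(blob, shifted)   # u should be close to 1, v close to 0
```

A real region-of-interest tracker would apply this solve within a small window around each tracked feature and iterate over an image pyramid to handle larger motions.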
- The vertices of the virtual mesh are interconnected in movement using a computational physics model of the object being represented. In the current instantiation, the physics model assumes that vertices are connected via springs which obey the physical constraints of Hooke's Law and gravitational acceleration. By changing the parameters, such as spring constant, gravitational acceleration, and damping factor, the behavior of the virtual mesh can be adjusted between various levels of fluidity. For example, the current embodiment can be made to approximate human skin, but different types of human tissue could also be represented in the same telestrated video. The properties for both the virtual mesh and physics model can be stored in a standard known format, such as the Collaborative Design Activity (COLLADA).
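A minimal sketch of such a spring model follows, assuming unit vertex masses and illustrative values for the spring constant, gravitational acceleration, and damping factor (all names and numeric values here are demonstration assumptions, not the patent's parameters):

```python
import numpy as np

def step_mesh(pos, vel, springs, rest, fixed, k=40.0, g=9.8,
              damping=0.98, dt=0.01):
    """One explicit-Euler update of a Hooke's-law spring mesh.
    pos/vel: (N, 2) arrays; springs: list of (i, j) vertex pairs;
    rest: rest length per spring; fixed: indices of pinned border vertices."""
    force = np.zeros_like(pos)
    force[:, 1] -= g                              # gravity pulls vertices down
    for (i, j), r in zip(springs, rest):
        d = pos[j] - pos[i]
        length = np.linalg.norm(d)
        if length > 1e-9:
            f = k * (length - r) * d / length     # Hooke's law along the spring
            force[i] += f
            force[j] -= f
    vel = (vel + force * dt) * damping            # integrate, then damp
    vel[fixed] = 0.0                              # pinned vertices do not move
    return pos + vel * dt, vel

# Demo: a single free vertex hanging from a fixed anchor by one spring.
pos = np.array([[0.0, 0.0], [0.0, -1.0]])
vel = np.zeros_like(pos)
springs, rest, fixed = [(0, 1)], [1.0], np.array([0])
for _ in range(3000):
    pos, vel = step_mesh(pos, vel, springs, rest, fixed)
# The spring stretches until it balances gravity: y settles near -(1 + g/k).
```

Varying `k`, `g`, and `damping` is what adjusts the mesh between the levels of fluidity described above (stiff skin versus looser tissue).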
- It should be noted that although the computational physics model is currently formulated to simulate movement in typical environments, it could be equally used to simulate movement of objects in exotic environments, such as in space or underwater by computationally changing the nature of the virtual mesh.
- UV mapping is a three-dimensional (3D) modeling process which maps a two-dimensional (2D) image onto a three-dimensional surface. Other patents and techniques sometimes refer to this technique as “texture mapping”. Every 3D object in computer graphics is made up of a series of connected polygons. UV mapping allows these polygons to be painted with a color from a 2D image (or texture). Although in its current instantiation the virtual mesh is a 2D object, it can be texture mapped with a 2D video image in the same manner. Further, using the UV mapping, the same technique can be applied to true 3D virtual meshes of any configuration.
- By superimposing the video image onto the virtual mesh using a UV map, the video image will be distorted whenever the virtual mesh is distorted. In effect, the process allows points and segments of the video image to move and react to the telestration. In fact, if polygons within the virtual mesh are deleted (e.g. cutting the mesh as in FIG. 2), the projected video image will not display the area which is mapped to those polygons. Similarly, if the polygon changes shape (e.g. pulling the mesh as in FIG. 4), the projected video image will display the area mapped to that polygon with precisely the same geometric distortion.
- Virtual tools are computer-generated objects which are programmed to interact with the virtual mesh according to a computational physics engine. In the current instantiation, the invention uses three virtual tools: a virtual scalpel, a virtual forceps, and a virtual suture. All three tools are programmed to push, pull, and twist the virtual mesh according to the physics engine using standard ray-casting techniques and colliders.
- The virtual scalpel separates the connections between the triangles that are in contact with the scalpel tip. This results in a void between those triangles and makes the video image appear to have been cut in the mapped area. Further, if an entire section of the virtual mesh is “cut” from the existing mesh, the UV mapped area of the video image will appear to be physically removed from the remainder of the video image. The edge of the cut mesh then acts as an edge of tissue; so the edge of the cut surface will deform when manipulated, independent of the other side of the cut mesh.
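A much-simplified sketch of the scalpel's effect on mesh connectivity: where the patent separates connections between individual triangles at the scalpel tip, this illustration simply drops whole triangles near the tip, which likewise leaves a void in the UV-mapped video image. The function name and the centroid-distance test are assumptions for illustration only.

```python
def cut_mesh(triangles, vertices, tip, radius=0.1):
    """Virtual-scalpel sketch: remove every triangle whose centroid lies
    within `radius` of the scalpel tip. The video area UV-mapped to the
    removed triangles is no longer rendered, so the image appears cut."""
    kept = []
    for tri in triangles:  # each triangle is a tuple of 3 vertex indices
        cx = sum(vertices[i][0] for i in tri) / 3.0
        cy = sum(vertices[i][1] for i in tri) / 3.0
        if (cx - tip[0]) ** 2 + (cy - tip[1]) ** 2 >= radius ** 2:
            kept.append(tri)
    return kept
```

Because the surviving triangles keep their own spring connections, each newly exposed edge deforms independently of the tissue on the other side of the cut, as described above.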
- The virtual forceps attaches to the triangle closest to the forceps tip when activated. It creates an external force on the attached triangles within the computational physics model of the virtual mesh. The forceps can be used to drag the attached triangles (FIG. 4) and gives the illusion that the video image is being grabbed by the forceps in a realistic manner. After the forceps is deactivated, the external force is removed from the computational physics model. The affected triangles will continue to react to internal (reaction) forces until they eventually return to a steady-state position.
- The virtual suture allows the telestrator to add connections between triangles. The suture is modeled by a spring. When activated, the suture tool adds a spring to the computational physics engine between any two points specified. This tool can be used to join previously cut sections of the virtual mesh.
- Although in its current instantiation the virtual tools are limited to these three, the flexibility of the computational physics engine allows the technique to be readily expanded to include the use of any tool or object which can be modeled, including drills, retractors, stents, and suction devices. It also allows for multisensory annotations, including haptic and audio feedback from tool use, to be realistically modeled and stored.
- In addition, the parameters for the virtual tools, mesh, and physics engine can be saved along with multi-sensory (e.g. haptic) data in standard known 3D file formats, such as the Collaborative Design Activity (COLLADA) and Immersion Force Resource (IVS/IFR) specification, and Haptic Multimedia Broadcasting formats, such as MPEG-4 Binary Format for Scene (BIFS) and the University of Iowa's 3D Holovideo format.
- In order to illustrate the method proposed in this invention, consider the field of surgery. Adequate surgical collaboration requires one practitioner demonstrating a technique to another practitioner. Current telestration techniques are unable to demonstrate surgical techniques, such as dissection, clamping, and suturing. It is not sufficient to know simply where or when to cut; the surgeon must be able to also demonstrate how to cut—how to hold the instrument, how hard to push, and how quickly to move. These limitations of conventional telestration as described in prior art are exacerbated in situations where the practitioners may be in different locations. Such telestration techniques are insufficient for true surgical telementoring, or for any video annotation requiring a procedure to be demonstrated, especially when complex techniques are being taught to new students.
- Virtual tool telestration, as described herein and which makes up at least a part of the present disclosure, may allow the mentoring surgeon to interact with a virtual video-overlay mesh of the operative field and mimic the technique needed to perform the operation. The surgeon mentor can demonstrate suturing and dissecting techniques while they are virtually overlaid on a video of the actual operative field. Notably, the mentoring surgeon can demonstrate the surgical technique effectively without actually changing the operative field.
- Current telestration methods have limited conventional telemedicine to non-surgical fields of medicine. However, with the system and method of the present disclosure, it may be possible that telemedicine/telementoring will become crucial to surgical practice and, indeed, any field where collaboration requires demonstrating rather than merely describing an idea.
- In fact, there is growing concern that the advance of minimally invasive surgery (MIS) is grossly outpacing the evolution of surgical training. This application will assist in bridging the learning curves for surgeons performing MIS procedures. In addition, as live video and other imaging modalities become more prevalent in clinical practice, the telestration described herein will become inherent to all forms of medicine. A virtual tool telestrator is the critical element to enable adequate surgical telestration. Such a telestrator may be adapted to work in a 2-D or a 3-D video environment with applications not just with visible-light images, but with other modalities, including (but not limited to) fluoroscopy, tomography, and magnetic resonance imagery.
- Additionally, telestration is currently used in a number of non-medicine fields. The most common application is in professional sports broadcasting, whereby sports commentators can “draw” on the televideo and emphasize certain elements of the video, such as the movement of the players. Adding 3D virtual telestration tools, as described herein, to these existing telestration devices and tools could be invaluable to such modalities. For example, bomb disposal experts could use virtual tools to interact with the remote video signal transmitted by ordnance disposal robots to signal the robot to push or pull certain areas of the field of view. Sculptors could use virtual hands to indicate to their students the proper finger position on a piece of unformed clay—and demonstrate how the clay should move without actually affecting the real-world object. Any real-world object that can be imaged can be transmitted and manipulated in a collaborative, yet virtualized manner. Such a method and device would be a natural fit for wearable computers or head-mounted displays, such as Google Glass and the Oculus Rift, to provide better augmented reality solutions.
- Virtual tool telestration may be equally effective in a 2-D or a 3-D environment or representation and differs from what currently exists in the field of telestration. It is typically constructed from three components (FIG. 5):
- 1. a 3D virtual tool telestrator
- 2. a Surgicom Telestreamer
- 3. a Surgicom Telenetwork server
- These elements (as demonstrated in the drawings) may be related to each other in the following exemplary and non-limiting fashion.
- The Surgicom Telestreamer (#2) may be a computer networking device which allows audio and video signals to be sent in real time to remote viewers. In one embodiment, the Surgicom Telestreamer captures streaming medical imagery and transmits it over the internet using the real-time streaming protocol (RTSP) in an H.264 video compression/decompression (codec) format at 1080p resolution and 60 frames per second.
- The 3D virtual tool telestrator (#1) may be a computer program which displays the Surgicom telestream (#2) as a 3D mesh object on a video monitor and allows remote users to overlay virtual 3D tools (e.g. forceps, scalpels) which can be moved by the remote user and which can interact with the video mesh. For example, the remote user may virtually grab a section of the video mesh with the forceps, and that part of the mesh will move in a manner similar to that of the actual object being displayed in the video (e.g. a section of the bladder neck during prostate removal).
- The 3D Virtual Tool Telestrator (#1) will transmit the virtualized surgical telestration of the remote user back to the source Surgicom Telestreamer (#2) for display. To conserve transmission bandwidth, the 3D Virtual Tool Telestrator (#1) only sends the position and orientation of the virtual tools and the virtual mesh to the Surgicom Telestreamer (#2) along with the timestamp of the current video frame. In this manner, bandwidth requirements and latency are minimized.
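The bandwidth-saving scheme above, transmitting only tool pose and a frame timestamp rather than rendered video, can be sketched as a compact binary record. The wire format below is entirely hypothetical, not the patent's actual protocol; it only illustrates how small such an update can be.

```python
import struct

# Hypothetical wire format: a header with the timestamp of the annotated
# video frame and a tool count, followed by one fixed-size record per tool
# carrying its id, position (x, y, z), and orientation quaternion.
HEADER_FMT = "<dI"       # float64 timestamp + uint32 tool count
TOOL_FMT = "<B3f4f"      # tool id + position + quaternion (29 bytes)

def pack_state(timestamp, tools):
    """Serialize (timestamp, [(tool_id, (x,y,z), (qx,qy,qz,qw)), ...])."""
    msg = struct.pack(HEADER_FMT, timestamp, len(tools))
    for tool_id, pos, quat in tools:
        msg += struct.pack(TOOL_FMT, tool_id, *pos, *quat)
    return msg

def unpack_state(msg):
    """Inverse of pack_state: recover the timestamp and tool poses."""
    ts, n = struct.unpack_from(HEADER_FMT, msg, 0)
    offset, size = struct.calcsize(HEADER_FMT), struct.calcsize(TOOL_FMT)
    tools = []
    for i in range(n):
        vals = struct.unpack_from(TOOL_FMT, msg, offset + i * size)
        tools.append((vals[0], vals[1:4], vals[4:8]))
    return ts, tools
```

A single-tool update in this sketch is 41 bytes, orders of magnitude smaller than re-transmitting a rendered 1080p frame, which is the point of sending only pose and timestamp.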
- The 3D virtual tool telestrator (#1) may comprise computer software written, by way of an exemplary and non-limiting example, with mostly open-source software development packages, using a programming environment such as, but not limited to, C++, C#, Mono, Silverlight, and Unity3D. The telestrator may include a 3D graphics rendering engine, such as but not limited to Unity3D, which may be used to display the 3D virtual tools and a virtual mesh with triangular vertices. The telestrator may also include a physics simulator, such as but not limited to PhysX, to handle the virtual simulation and interaction between the virtualized 3D tools and the video mesh. The telestrator may also include a multimedia player, such as but not limited to AVPro LiveCapture, which may be used to overlay a video input stream from #2 onto the virtual mesh to create a virtual operative field. The telestrator may use human input devices, such as the Razer Hydra joystick or the Geomagic Touch, to control movement of the virtual tools in a natural way.
- A similar computer program exists on the Surgicom Telestreamer (#2). However, unlike the 3D virtual tool telestrator (#1), this program renders the graphics without the computational physics engine. Instead, the position and orientation of the virtual tools and virtual mesh that were passed back from the virtual tool telestrator (#1) are used to create an exact rendering of the virtual tool telestration at that timestamp. In this way, the Surgicom Telestreamer (#2) can display an exact rendering of the 3D virtual tool telestration to all clients simultaneously.
- While the invention has been described with reference to preferred embodiments, it is to be understood that the invention is not intended to be limited to the specific embodiments set forth above. Thus, it is recognized that those skilled in the art will appreciate that certain substitutions, alterations, modifications, and omissions may be made without departing from the spirit or intent of the invention. Accordingly, the foregoing description is meant to be exemplary only, the invention is to be taken as including all reasonable equivalents to the subject matter of the invention, and should not limit the scope of the invention set forth in the following claims.
- The Surgicom Telenetwork server (#3) can save and store the medical images having the overlaid, drawn, annotated, and telestrated images in a PACS using the DICOM format, and can save the session information that includes the collaboration session ID, client information, image information including associated metadata, and the date and times of the session.
Claims (7)
1. A method to perform video annotation using an augmented reality telestrator.
2. The method of claim 1 , wherein the video images are projected onto a virtual mesh which moves based on a physically-realistic computational model of the actual object being displayed in the video images. In the current embodiment, the virtual mesh is constructed in computer graphics as a rectangle made from equilateral triangles whose vertices are interconnected. Movement between vertices of the virtual mesh is calculated via physics-based calculations, including but not limited to Hooke's spring law and Newton's laws of motion. A UV-map is constructed to superimpose the video images onto the virtual mesh. As the vertices of the virtual mesh move based on the physics-based calculations, the superimposed video images are transformed in the corresponding positions. Physical parameters of the mesh and tools, including multi-sensory (e.g. haptic) feedback may be saved using standardized file formats (e.g. COLLADA) to be used in conjunction with the DICOM medical imagery.
3. The method of claim 1 , wherein computer-generated, virtual tools are overlaid on the video images and can manipulate the images in a realistic manner based on the physically-realistic model mesh of claim 2 . In the current embodiment, these virtual tools are three-dimensional renderings of scissors, sutures, and forceps which can be used to cut, stitch, and push/pull/twist points within the video images. In effect, these video images appear to react in a realistic manner to the virtual tools based on the physics-based calculations of the virtual mesh of claim 2 .
4. A network system apparatus allowing users to capture, retrieve and view both real time and archived medical images for synchronous or asynchronous communication, collaboration and consultation by one or more users using illustrations over the medical images, comprising: a telenetwork server including at least one associated database having the capability to communicate with a local area network; and at least one telestreamer in network communication with a telenetwork server via a local area network wherein the telestreamers capture one or more medical images and provide the medical images via the network communication to the telenetwork server via the local area network as it receives medical images from at least one source.
5. A network system apparatus allowing users to capture, retrieve and view both real time and archived medical imagery streams for synchronous or asynchronous communication, collaboration and consultation by one or more users using illustrations over the medical images as in claim 4 wherein; one or more users retrieve and view the medical images, creating illustrations over the medical images such as, but not limited to drawing, annotating, telestrating; and storing the medical images with the annotations, and saving all the medical images together with annotations and metadata on the telenetwork server, on a picture archiving and communications system server, in a known digital imaging and communications in medicine format.
6. A method for allowing one or more users to capture, retrieve and view both real time and archived medical images for synchronous or asynchronous communication, collaboration and consultation by one or more users using illustrations over the medical images comprising; running a computer program, storing the program on each user's computer, displaying the graphical user interface output of that program on a computer display; linking each user's computer to a telenetwork server using a local area network, each user communicating with the telenetwork server, the telemedicine image management system server providing permission to each user wherein linking the user to a digital imaging and communications in medicine modality worklist utility, a medical image archive server, and telestreamers for capturing, retrieving and viewing medical images, users illustrating over the medical images, the telenetwork server managing all illustration file sharing wherein new user illustrations are appended periodically to the telenetwork server, maintaining the file on the user's computer, the telenetwork server linking to the internet and other users, wherein the images remain on the telenetwork server and the user illustrations are appended to that telenetwork server; streaming images into a local area network wherein the telenetwork server having associated database in communication with telestreamers connected directly to medical imaging modalities for acquiring one or more medical images, streaming those medical images to the local area network;
7. A method for allowing users to capture, retrieve and view both real time and archived medical imagery streams for synchronous or asynchronous communication, collaboration and consultation by one or more users using illustrations over the medical images as in claim 6 wherein; one or more users retrieve and view the medical images, creating illustrations over the medical images such as, but not limited to drawing, annotating, telestrating and storing the medical images with the illustrations, and viewing all of the user's illustrations as they happen and can save all the illustrations from all participant clients on their local respective computer storage devices, on the telenetwork server, on a picture archiving and communications system, in a known digital imaging and communications in medicine format.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/138,045 US20140176661A1 (en) | 2012-12-21 | 2013-12-21 | System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom) |
US14/875,346 US9560318B2 (en) | 2012-12-21 | 2015-10-05 | System and method for surgical telementoring |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261745383P | 2012-12-21 | 2012-12-21 | |
US14/138,045 US20140176661A1 (en) | 2012-12-21 | 2013-12-21 | System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom) |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/875,346 Continuation US9560318B2 (en) | 2012-12-21 | 2015-10-05 | System and method for surgical telementoring |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140176661A1 true US20140176661A1 (en) | 2014-06-26 |
Family
ID=50974170
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/138,045 Abandoned US20140176661A1 (en) | 2012-12-21 | 2013-12-21 | System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom) |
US14/875,346 Expired - Fee Related US9560318B2 (en) | 2012-12-21 | 2015-10-05 | System and method for surgical telementoring |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/875,346 Expired - Fee Related US9560318B2 (en) | 2012-12-21 | 2015-10-05 | System and method for surgical telementoring |
Country Status (1)
Country | Link |
---|---|
US (2) | US20140176661A1 (en) |
Cited By (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120107784A1 (en) * | 2010-10-28 | 2012-05-03 | Alexander Jorg Seifert | One touch button for operating room support |
US20120209123A1 (en) * | 2011-02-10 | 2012-08-16 | Timothy King | Surgeon's Aid for Medical Display |
US20150373369A1 (en) * | 2012-12-27 | 2015-12-24 | The Regents Of The University Of California | Anamorphic stretch image compression |
US20160028994A1 (en) * | 2012-12-21 | 2016-01-28 | Skysurgery Llc | System and method for surgical telementoring |
US20160234074A1 (en) * | 2015-02-05 | 2016-08-11 | Ciena Corporation | Methods and systems for creating and applying a template driven element adapter |
US9503681B1 (en) * | 2015-05-29 | 2016-11-22 | Purdue Research Foundation | Simulated transparent display with augmented reality for remote collaboration |
WO2017031385A1 (en) * | 2015-08-20 | 2017-02-23 | Microsoft Technology Licensing, Llc | Asynchronous 3d annotation of a video sequence |
CN106846496A (en) * | 2017-01-19 | 2017-06-13 | 杭州古珀医疗科技有限公司 | DICOM images based on mixed reality technology check system and operating method |
US20170280188A1 (en) * | 2016-03-24 | 2017-09-28 | Daqri, Llc | Recording Remote Expert Sessions |
US9928629B2 (en) | 2015-03-24 | 2018-03-27 | Augmedics Ltd. | Combining video-based and optic-based augmented reality in a near eye display |
WO2018093921A1 (en) * | 2016-11-16 | 2018-05-24 | Terarecon, Inc. | System and method for three-dimensional printing, holographic and virtual reality rendering from medical image processing |
WO2018175971A1 (en) * | 2017-03-24 | 2018-09-27 | Surgical Theater LLC | System and method for training and collaborating in a virtual environment |
US10169917B2 (en) | 2015-08-20 | 2019-01-01 | Microsoft Technology Licensing, Llc | Augmented reality |
US10215989B2 (en) | 2012-12-19 | 2019-02-26 | Lockheed Martin Corporation | System, method and computer program product for real-time alignment of an augmented reality device |
US10235808B2 (en) | 2015-08-20 | 2019-03-19 | Microsoft Technology Licensing, Llc | Communication system |
US10248441B2 (en) * | 2016-08-02 | 2019-04-02 | International Business Machines Corporation | Remote technology assistance through dynamic flows of visual and auditory instructions |
US10265138B2 (en) * | 2017-09-18 | 2019-04-23 | MediVis, Inc. | Methods and systems for generating and using 3D images in surgical settings |
US20190182454A1 (en) * | 2017-12-11 | 2019-06-13 | Foresight Imaging LLC | System and method of collaboratively communication on images via input illustrations and have those illustrations auto erase. |
WO2019190792A1 (en) * | 2018-03-26 | 2019-10-03 | Covidien Lp | Telementoring control assemblies for robotic surgical systems |
WO2019210353A1 (en) * | 2018-04-30 | 2019-11-07 | MedVR Pty Ltd | Medical virtual reality and mixed reality collaboration platform |
US10582190B2 (en) | 2015-11-23 | 2020-03-03 | Walmart Apollo, Llc | Virtual training system |
US10674968B2 (en) * | 2011-02-10 | 2020-06-09 | Karl Storz Imaging, Inc. | Adjustable overlay patterns for medical display |
US10706636B2 (en) * | 2017-06-26 | 2020-07-07 | v Personalize Inc. | System and method for creating editable configurations of 3D model |
CN111602105A (en) * | 2018-01-22 | 2020-08-28 | 苹果公司 | Method and apparatus for presenting synthetic reality companion content |
CN111868788A (en) * | 2018-10-17 | 2020-10-30 | 美的集团股份有限公司 | System and method for generating an acupoint and pressure point map |
US10861236B2 (en) | 2017-09-08 | 2020-12-08 | Surgical Theater, Inc. | Dual mode augmented reality surgical system and method |
US10886029B2 (en) * | 2017-11-08 | 2021-01-05 | International Business Machines Corporation | 3D web-based annotation |
US10928773B2 (en) | 2018-11-01 | 2021-02-23 | International Business Machines Corporation | Holographic image replication |
US10939977B2 (en) | 2018-11-26 | 2021-03-09 | Augmedics Ltd. | Positioning marker |
US20210068900A1 (en) * | 2019-09-11 | 2021-03-11 | Ardeshir Rastinehad | Method for providing clinical support for surgical guidance during robotic surgery |
US11062527B2 (en) | 2018-09-28 | 2021-07-13 | General Electric Company | Overlay and manipulation of medical images in a virtual environment |
CN113220121A (en) * | 2021-05-04 | 2021-08-06 | 西北工业大学 | AR fastener auxiliary assembly system and method based on projection display |
US11157842B2 (en) * | 2015-01-28 | 2021-10-26 | Iltec—Lubeck Tecnologia Ltda | System, equipment and method for performing and documenting in real-time a remotely assisted professional procedure |
US11197722B2 (en) | 2015-10-14 | 2021-12-14 | Surgical Theater, Inc. | Surgical navigation inside a body |
WO2022003729A1 (en) * | 2020-07-02 | 2022-01-06 | Cnr Medimentor Private Limited | System and method for telestrating the operative procedures |
US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
US11412998B2 (en) | 2011-02-10 | 2022-08-16 | Karl Storz Imaging, Inc. | Multi-source medical display |
US20220321925A1 (en) * | 2019-02-26 | 2022-10-06 | Surgtime, Inc. | System and method for teaching a surgical procedure |
US11547499B2 (en) | 2014-04-04 | 2023-01-10 | Surgical Theater, Inc. | Dynamic and interactive navigation in a surgical environment |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
US11801115B2 (en) | 2019-12-22 | 2023-10-31 | Augmedics Ltd. | Mirroring in image guided surgery |
US11817201B2 (en) | 2020-09-08 | 2023-11-14 | Medtronic, Inc. | Imaging discovery utility for augmenting clinical image management |
US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
US11974887B2 (en) | 2018-05-02 | 2024-05-07 | Augmedics Ltd. | Registration marker for an augmented reality system |
US11980507B2 (en) | 2019-04-30 | 2024-05-14 | Augmedics Ltd. | Registration of a fiducial marker for an augmented reality system |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014211044A1 (en) * | 2014-06-10 | 2015-12-17 | Siemens Aktiengesellschaft | Method for creating a simulation environment for a simulation installation of a medical imaging device, and server unit, simulation system, computer program and computer-readable storage medium |
US10484437B2 (en) * | 2015-01-21 | 2019-11-19 | Logmein, Inc. | Remote support service with two-way smart whiteboard |
AU2017236893A1 (en) | 2016-03-21 | 2018-09-06 | Washington University | Virtual reality or augmented reality visualization of 3D medical images |
AU2018230901B2 (en) | 2017-03-10 | 2020-12-17 | Biomet Manufacturing, Llc | Augmented reality supported knee surgery |
CN108614638B (en) * | 2018-04-23 | 2020-07-07 | 太平洋未来科技(深圳)有限公司 | AR imaging method and apparatus |
Citations (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6580426B1 (en) * | 1999-03-03 | 2003-06-17 | Canon Kabushiki Kaisha | Computer graphics apparatus for processing of data defining a three-dimensional computer model to partition the three-dimensional space into a plurality of sectors |
US6664960B2 (en) * | 2001-05-10 | 2003-12-16 | Ati Technologies Inc. | Apparatus for processing non-planar video graphics primitives and associated method of operation |
US6678764B2 (en) * | 2000-10-20 | 2004-01-13 | Sony Corporation | Medical image processing system |
US6763176B1 (en) * | 2000-09-01 | 2004-07-13 | Matrox Electronic Systems Ltd. | Method and apparatus for real-time video editing using a graphics processor |
US6795070B1 (en) * | 1998-10-02 | 2004-09-21 | France Telecom (Sa) | Method for compressing and encoding three-dimensional meshed network |
US20040254763A1 (en) * | 2003-03-05 | 2004-12-16 | Shuji Sakai | Medical system |
US6940503B2 (en) * | 2001-05-10 | 2005-09-06 | Ati International Srl | Method and apparatus for processing non-planar video graphics primitives |
US7117259B1 (en) * | 2000-03-03 | 2006-10-03 | International Business Machines Corporation | Server time window for multiple selectable servers in a graphical user interface |
US7372472B1 (en) * | 2001-04-09 | 2008-05-13 | Matrox Electronic Systems Ltd. | Method and apparatus for graphically defining a video particle explosion effect |
US7432936B2 (en) * | 2004-12-02 | 2008-10-07 | Avid Technology, Inc. | Texture data anti-aliasing method and apparatus |
US20080306818A1 (en) * | 2007-06-08 | 2008-12-11 | Qurio Holdings, Inc. | Multi-client streamer with late binding of ad content |
US20090012968A1 (en) * | 2006-03-07 | 2009-01-08 | Naoki Hayashi | Medical Image Management System |
US20100189323A1 (en) * | 2009-01-27 | 2010-07-29 | Canon Kabushiki Kaisha | Computer-aided diagnosis apparatus and method for controlling the same |
US7843456B2 (en) * | 2007-06-29 | 2010-11-30 | Microsoft Corporation | Gradient domain editing of animated meshes |
US8189888B2 (en) * | 2007-09-27 | 2012-05-29 | Fujifilm Corporation | Medical reporting system, apparatus and method |
US8306399B1 (en) * | 2000-09-01 | 2012-11-06 | Matrox Electronic Systems, Ltd. | Real-time video editing architecture |
US20120327186A1 (en) * | 2010-03-17 | 2012-12-27 | Fujifilm Corporation | Endoscopic observation supporting system, method, device and program |
US20130023730A1 (en) * | 2010-03-31 | 2013-01-24 | Fujifilm Corporation | Endoscopic observation support system, method, device and program |
US20130141462A1 (en) * | 2011-12-02 | 2013-06-06 | Kenichi Niwa | Medical image observation apparatus |
US8582844B2 (en) * | 2008-11-13 | 2013-11-12 | Hitachi Medical Corporation | Medical image processing device and method |
US8610709B2 (en) * | 2008-02-25 | 2013-12-17 | Markany Inc. | Method and apparatus for watermarking of 3D mesh model |
US8611988B2 (en) * | 2010-03-31 | 2013-12-17 | Fujifilm Corporation | Projection image generation apparatus and method, and computer readable recording medium on which is recorded program for the same |
US8749556B2 (en) * | 2008-10-14 | 2014-06-10 | Mixamo, Inc. | Data compression for real-time streaming of deformable 3D models for 3D animation |
US8768436B2 (en) * | 2009-02-26 | 2014-07-01 | Hitachi Medical Corporation | Coronary artery angiography image processing method to detect occlusion and degree effect of blood vessel occlusion to an organ |
US8908766B2 (en) * | 2005-03-31 | 2014-12-09 | Euclid Discoveries, Llc | Computer method and apparatus for processing image data |
US8924864B2 (en) * | 2009-11-23 | 2014-12-30 | Foresight Imaging LLC | System and method for collaboratively communicating on images and saving those communications and images in a standard known format |
Family Cites Families (72)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4979949A (en) | 1988-04-26 | 1990-12-25 | The Board Of Regents Of The University Of Washington | Robot-aided system for surgery |
EP0430860A3 (en) * | 1989-11-21 | 1993-01-13 | Toyo Ink Mfg. Co., Ltd. | Binarization processing method for multivalued image and method to form density pattern for reproducing binary gradations |
US5631973A (en) | 1994-05-05 | 1997-05-20 | Sri International | Method for telemanipulation with telepresence |
US5625576A (en) | 1993-10-01 | 1997-04-29 | Massachusetts Institute Of Technology | Force reflecting haptic interface |
US5890906A (en) | 1995-01-20 | 1999-04-06 | Vincent J. Macri | Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment |
US5855583A (en) | 1996-02-20 | 1999-01-05 | Computer Motion, Inc. | Method and apparatus for performing minimally invasive cardiac procedures |
US7699855B2 (en) | 1996-12-12 | 2010-04-20 | Intuitive Surgical Operations, Inc. | Sterile surgical adaptor |
US6393431B1 (en) * | 1997-04-04 | 2002-05-21 | Welch Allyn, Inc. | Compact imaging instrument system |
US6445964B1 (en) | 1997-08-04 | 2002-09-03 | Harris Corporation | Virtual reality simulation-based training of telekinegenesis system for training sequential kinematic behavior of automated kinematic machine |
US6016385A (en) | 1997-08-11 | 2000-01-18 | Fanu America Corp | Real time remotely controlled robot |
US6659939B2 (en) | 1998-11-20 | 2003-12-09 | Intuitive Surgical, Inc. | Cooperative minimally invasive telesurgical system |
US6951535B2 (en) | 2002-01-16 | 2005-10-04 | Intuitive Surgical, Inc. | Tele-medicine system that transmits an entire state of a subsystem |
US7125403B2 (en) | 1998-12-08 | 2006-10-24 | Intuitive Surgical | In vivo accessories for minimally invasive robotic surgery |
US6424885B1 (en) | 1999-04-07 | 2002-07-23 | Intuitive Surgical, Inc. | Camera referenced control in a minimally invasive surgical apparatus |
US7678048B1 (en) * | 1999-09-14 | 2010-03-16 | Siemens Medical Solutions Usa, Inc. | Medical diagnostic ultrasound system and method |
US7075556B1 (en) | 1999-10-21 | 2006-07-11 | Sportvision, Inc. | Telestrator system |
US7312796B1 (en) * | 2000-05-08 | 2007-12-25 | Jlb Ventures Llc | Perpendicular view three dimensional electronic programming guide |
US6741911B2 (en) | 2000-09-20 | 2004-05-25 | John Castle Simmons | Natural robot control |
US6774869B2 (en) * | 2000-12-22 | 2004-08-10 | Board Of Trustees Operating Michigan State University | Teleportal face-to-face system |
CA2363396A1 (en) | 2001-11-21 | 2003-05-21 | Handshake Interactive Technologies Inc | Hard real time control center |
US6793653B2 (en) | 2001-12-08 | 2004-09-21 | Computer Motion, Inc. | Multifunctional handle for a medical robotic system |
JP3791907B2 (en) | 2002-02-12 | 2006-06-28 | オリンパス株式会社 | Observation device |
US7206626B2 (en) | 2002-03-06 | 2007-04-17 | Z-Kat, Inc. | System and method for haptic sculpting of physical objects |
KR100486709B1 (en) * | 2002-04-17 | 2005-05-03 | 삼성전자주식회사 | System and method for providing object-based interactive video service |
AU2003243345A1 (en) * | 2002-05-31 | 2003-12-19 | The Texas A And M University System | Communicating medical information in a communication network |
US7240075B1 (en) | 2002-09-24 | 2007-07-03 | Exphand, Inc. | Interactive generating query related to telestrator data designating at least a portion of the still image frame and data identifying a user is generated from the user designating a selected region on the display screen, transmitting the query to the remote information system |
US7158860B2 (en) | 2003-02-24 | 2007-01-02 | Intouch Technologies, Inc. | Healthcare tele-robotic system which allows parallel remote station observation |
JP3783011B2 (en) | 2003-10-02 | 2006-06-07 | 株式会社日立製作所 | Operation input device, remote operation system, and remote operation method |
ES2741016T3 (en) | 2003-10-07 | 2020-02-07 | Librestream Tech Inc | Camera for communicating a continuous multimedia stream to a remote client |
FR2872660B1 (en) * | 2004-07-05 | 2006-12-22 | Eastman Kodak Co | Image capture apparatus and method for forming annotated images |
US7659906B2 (en) * | 2004-07-29 | 2010-02-09 | The United States Of America As Represented By The Secretary Of The Navy | Airborne real time image exploitation system (ARIES) |
US20060122482A1 (en) | 2004-11-22 | 2006-06-08 | Foresight Imaging Inc. | Medical image acquisition system for receiving and transmitting medical images instantaneously and method of using the same |
JP2006218233A (en) * | 2005-02-14 | 2006-08-24 | Olympus Corp | Endoscope apparatus |
US9295379B2 (en) | 2005-04-18 | 2016-03-29 | M.S.T. Medical Surgery Technologies Ltd. | Device and methods of improving laparoscopic surgery |
US8073528B2 (en) | 2007-09-30 | 2011-12-06 | Intuitive Surgical Operations, Inc. | Tool tracking systems, methods and computer products for image guided surgery |
US8398541B2 (en) | 2006-06-06 | 2013-03-19 | Intuitive Surgical Operations, Inc. | Interactive user interfaces for robotic minimally invasive surgical systems |
US20070008321A1 (en) * | 2005-07-11 | 2007-01-11 | Eastman Kodak Company | Identifying collection images with special events |
US7860614B1 (en) | 2005-09-13 | 2010-12-28 | The United States Of America As Represented By The Secretary Of The Army | Trainer for robotic vehicle |
US9266239B2 (en) | 2005-12-27 | 2016-02-23 | Intuitive Surgical Operations, Inc. | Constraint based control in a minimally invasive surgical apparatus |
US20070167702A1 (en) | 2005-12-30 | 2007-07-19 | Intuitive Surgical Inc. | Medical robotic system providing three-dimensional telestration |
US7907166B2 (en) | 2005-12-30 | 2011-03-15 | Intuitive Surgical Operations, Inc. | Stereo telestration for robotic surgery |
US20100292706A1 (en) | 2006-04-14 | 2010-11-18 | The Regents Of The University California | Novel enhanced haptic feedback processes and products for robotic surgical prosthetics |
ES2298051B2 (en) | 2006-07-28 | 2009-03-16 | Universidad De Malaga | Robotic system for assisting minimally invasive surgery, capable of positioning a surgical instrument in response to a surgeon's commands without fixation to the operating table or prior calibration of the insertion point |
KR100815245B1 (en) | 2006-09-28 | 2008-03-19 | 한국과학기술원 | A intelligent bed robot with a pressure sensor attached mattress and supporting robot arm having grippers |
US20100121655A1 (en) * | 2006-11-28 | 2010-05-13 | Koninklijke Philips Electronics N. V. | Patient data record and user interface |
US8224484B2 (en) | 2007-09-30 | 2012-07-17 | Intuitive Surgical Operations, Inc. | Methods of user interface with alternate tool mode for robotic surgical tools |
US7925980B2 (en) | 2008-02-19 | 2011-04-12 | Harris Corporation | N-way multimedia collaboration systems |
US8810631B2 (en) | 2008-04-26 | 2014-08-19 | Intuitive Surgical Operations, Inc. | Augmented stereoscopic visualization for a surgical robot using a captured visible image combined with a fluorescence image and a captured visible image |
US8150170B2 (en) * | 2008-05-30 | 2012-04-03 | Microsoft Corporation | Statistical approach to large-scale image annotation |
US7982609B2 (en) * | 2008-06-18 | 2011-07-19 | Microsoft Corporation | RFID-based enterprise intelligence |
KR20100041006A (en) * | 2008-10-13 | 2010-04-22 | 엘지전자 주식회사 | A user interface controlling method using three dimension multi-touch |
KR101026051B1 (en) * | 2008-12-15 | 2011-03-30 | 삼성전기주식회사 | Method for grouping pixels in 2d digital image |
US8830224B2 (en) | 2008-12-31 | 2014-09-09 | Intuitive Surgical Operations, Inc. | Efficient 3-D telestration for local robotic proctoring |
US8265342B2 (en) * | 2009-04-23 | 2012-09-11 | International Business Machines Corporation | Real-time annotation of images in a human assistive environment |
EP2280359A1 (en) * | 2009-07-31 | 2011-02-02 | EADS Construcciones Aeronauticas, S.A. | Training method and system using augmented reality |
JP5801812B2 (en) | 2009-09-11 | 2015-10-28 | ディズニー エンタープライズ,インコーポレイテッド | Virtual insert into 3D video |
US20110107238A1 (en) * | 2009-10-29 | 2011-05-05 | Dong Liu | Network-Based Collaborated Telestration on Video, Images or Other Shared Visual Content |
US8935003B2 (en) | 2010-09-21 | 2015-01-13 | Intuitive Surgical Operations | Method and system for hand presence detection in a minimally invasive surgical system |
US8108541B2 (en) | 2009-11-19 | 2012-01-31 | Alcatel Lucent | Method and apparatus for providing collaborative interactive video streaming |
US9858475B2 (en) | 2010-05-14 | 2018-01-02 | Intuitive Surgical Operations, Inc. | Method and system of hand segmentation and overlay using depth data |
US8520027B2 (en) | 2010-05-14 | 2013-08-27 | Intuitive Surgical Operations, Inc. | Method and system of see-through console overlay |
US20120082371A1 (en) * | 2010-10-01 | 2012-04-05 | Google Inc. | Label embedding trees for multi-class tasks |
JP5331828B2 (en) * | 2011-01-14 | 2013-10-30 | 株式会社日立ハイテクノロジーズ | Charged particle beam equipment |
CN102779356B (en) * | 2011-05-11 | 2016-01-06 | 鸿富锦精密工业(深圳)有限公司 | Curved surface meshing system and method |
CN103959314A (en) * | 2011-07-05 | 2014-07-30 | 迈克尔·斯图尔特·舒诺克 | System and method for annotating images |
US8682047B2 (en) * | 2011-12-05 | 2014-03-25 | Illinois Tool Works Inc. | Method and apparatus for machine vision counting and annotation |
US20130184584A1 (en) * | 2012-01-17 | 2013-07-18 | Richard E. Berkey | Systems and methods for computerized ultrasound image interpretation and labeling |
US20130211230A1 (en) * | 2012-02-08 | 2013-08-15 | Convergent Life Sciences, Inc. | System and method for using medical image fusion |
US20150169525A1 (en) * | 2012-09-14 | 2015-06-18 | Leon Gomes Palm | Augmented reality image annotation |
US20140176661A1 (en) * | 2012-12-21 | 2014-06-26 | G. Anthony Reina | System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom) |
US20150049163A1 (en) * | 2013-03-15 | 2015-02-19 | James Paul Smurro | Network system apparatus and method of use adapted for visual neural networking with multi-channel multiplexed streaming medical imagery and packetized clinical informatics |
US9826164B2 (en) * | 2014-05-30 | 2017-11-21 | Furuno Electric Co., Ltd. | Marine environment display device |
- 2013-12-21: US application US 14/138,045, published as US20140176661A1, not active (Abandoned)
- 2015-10-05: US application US 14/875,346, published as US9560318B2, not active (Expired - Fee Related)
Patent Citations (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6795070B1 (en) * | 1998-10-02 | 2004-09-21 | France Telecom (Sa) | Method for compressing and encoding three-dimensional meshed network |
US6580426B1 (en) * | 1999-03-03 | 2003-06-17 | Canon Kabushiki Kaisha | Computer graphics apparatus for processing of data defining a three-dimensional computer model to partition the three-dimensional space into a plurality of sectors |
US7117259B1 (en) * | 2000-03-03 | 2006-10-03 | International Business Machines Corporation | Server time window for multiple selectable servers in a graphical user interface |
US8306399B1 (en) * | 2000-09-01 | 2012-11-06 | Matrox Electronic Systems, Ltd. | Real-time video editing architecture |
US6763176B1 (en) * | 2000-09-01 | 2004-07-13 | Matrox Electronic Systems Ltd. | Method and apparatus for real-time video editing using a graphics processor |
US6678764B2 (en) * | 2000-10-20 | 2004-01-13 | Sony Corporation | Medical image processing system |
US7372472B1 (en) * | 2001-04-09 | 2008-05-13 | Matrox Electronic Systems Ltd. | Method and apparatus for graphically defining a video particle explosion effect |
US6940503B2 (en) * | 2001-05-10 | 2005-09-06 | Ati International Srl | Method and apparatus for processing non-planar video graphics primitives |
US6664960B2 (en) * | 2001-05-10 | 2003-12-16 | Ati Technologies Inc. | Apparatus for processing non-planar video graphics primitives and associated method of operation |
US20040254763A1 (en) * | 2003-03-05 | 2004-12-16 | Shuji Sakai | Medical system |
US7432936B2 (en) * | 2004-12-02 | 2008-10-07 | Avid Technology, Inc. | Texture data anti-aliasing method and apparatus |
US8908766B2 (en) * | 2005-03-31 | 2014-12-09 | Euclid Discoveries, Llc | Computer method and apparatus for processing image data |
US20090012968A1 (en) * | 2006-03-07 | 2009-01-08 | Naoki Hayashi | Medical Image Management System |
US20080306818A1 (en) * | 2007-06-08 | 2008-12-11 | Qurio Holdings, Inc. | Multi-client streamer with late binding of ad content |
US7843456B2 (en) * | 2007-06-29 | 2010-11-30 | Microsoft Corporation | Gradient domain editing of animated meshes |
US8189888B2 (en) * | 2007-09-27 | 2012-05-29 | Fujifilm Corporation | Medical reporting system, apparatus and method |
US8610709B2 (en) * | 2008-02-25 | 2013-12-17 | Markany Inc. | Method and apparatus for watermarking of 3D mesh model |
US8749556B2 (en) * | 2008-10-14 | 2014-06-10 | Mixamo, Inc. | Data compression for real-time streaming of deformable 3D models for 3D animation |
US8582844B2 (en) * | 2008-11-13 | 2013-11-12 | Hitachi Medical Corporation | Medical image processing device and method |
US20100189323A1 (en) * | 2009-01-27 | 2010-07-29 | Canon Kabushiki Kaisha | Computer-aided diagnosis apparatus and method for controlling the same |
US8768436B2 (en) * | 2009-02-26 | 2014-07-01 | Hitachi Medical Corporation | Coronary artery angiography image processing method to detect occlusion and degree effect of blood vessel occlusion to an organ |
US8924864B2 (en) * | 2009-11-23 | 2014-12-30 | Foresight Imaging LLC | System and method for collaboratively communicating on images and saving those communications and images in a standard known format |
US20120327186A1 (en) * | 2010-03-17 | 2012-12-27 | Fujifilm Corporation | Endoscopic observation supporting system, method, device and program |
US8611988B2 (en) * | 2010-03-31 | 2013-12-17 | Fujifilm Corporation | Projection image generation apparatus and method, and computer readable recording medium on which is recorded program for the same |
US20130023730A1 (en) * | 2010-03-31 | 2013-01-24 | Fujifilm Corporation | Endoscopic observation support system, method, device and program |
US20130141462A1 (en) * | 2011-12-02 | 2013-06-06 | Kenichi Niwa | Medical image observation apparatus |
Cited By (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120107784A1 (en) * | 2010-10-28 | 2012-05-03 | Alexander Jorg Seifert | One touch button for operating room support |
US20120209123A1 (en) * | 2011-02-10 | 2012-08-16 | Timothy King | Surgeon's Aid for Medical Display |
US11412998B2 (en) | 2011-02-10 | 2022-08-16 | Karl Storz Imaging, Inc. | Multi-source medical display |
US10674968B2 (en) * | 2011-02-10 | 2020-06-09 | Karl Storz Imaging, Inc. | Adjustable overlay patterns for medical display |
US10631712B2 (en) * | 2011-02-10 | 2020-04-28 | Karl Storz Imaging, Inc. | Surgeon's aid for medical display |
US10215989B2 (en) | 2012-12-19 | 2019-02-26 | Lockheed Martin Corporation | System, method and computer program product for real-time alignment of an augmented reality device |
US20160028994A1 (en) * | 2012-12-21 | 2016-01-28 | Skysurgery Llc | System and method for surgical telementoring |
US9560318B2 (en) * | 2012-12-21 | 2017-01-31 | Skysurgery Llc | System and method for surgical telementoring |
US20150373369A1 (en) * | 2012-12-27 | 2015-12-24 | The Regents Of The University Of California | Anamorphic stretch image compression |
US11547499B2 (en) | 2014-04-04 | 2023-01-10 | Surgical Theater, Inc. | Dynamic and interactive navigation in a surgical environment |
US11157842B2 (en) * | 2015-01-28 | 2021-10-26 | Iltec—Lubeck Tecnologia Ltda | System, equipment and method for performing and documenting in real-time a remotely assisted professional procedure |
US9864740B2 (en) * | 2015-02-05 | 2018-01-09 | Ciena Corporation | Methods and systems for creating and applying a template driven element adapter |
US20160234074A1 (en) * | 2015-02-05 | 2016-08-11 | Ciena Corporation | Methods and systems for creating and applying a template driven element adapter |
US11750794B2 (en) | 2015-03-24 | 2023-09-05 | Augmedics Ltd. | Combining video-based and optic-based augmented reality in a near eye display |
US9928629B2 (en) | 2015-03-24 | 2018-03-27 | Augmedics Ltd. | Combining video-based and optic-based augmented reality in a near eye display |
US9503681B1 (en) * | 2015-05-29 | 2016-11-22 | Purdue Research Foundation | Simulated transparent display with augmented reality for remote collaboration |
US10169917B2 (en) | 2015-08-20 | 2019-01-01 | Microsoft Technology Licensing, Llc | Augmented reality |
US10235808B2 (en) | 2015-08-20 | 2019-03-19 | Microsoft Technology Licensing, Llc | Communication system |
WO2017031385A1 (en) * | 2015-08-20 | 2017-02-23 | Microsoft Technology Licensing, Llc | Asynchronous 3d annotation of a video sequence |
CN107924575A (en) * | 2015-08-20 | 2018-04-17 | 微软技术许可有限责任公司 | The asynchronous 3D annotations of video sequence |
US11197722B2 (en) | 2015-10-14 | 2021-12-14 | Surgical Theater, Inc. | Surgical navigation inside a body |
US10582190B2 (en) | 2015-11-23 | 2020-03-03 | Walmart Apollo, Llc | Virtual training system |
US11678004B2 (en) | 2016-03-24 | 2023-06-13 | Rpx Corporation | Recording remote expert sessions |
US20190124391A1 (en) * | 2016-03-24 | 2019-04-25 | Daqri, Llc | Recording remote expert sessions |
US11032603B2 (en) * | 2016-03-24 | 2021-06-08 | Rpx Corporation | Recording remote expert sessions |
US11277655B2 (en) | 2016-03-24 | 2022-03-15 | Rpx Corporation | Recording remote expert sessions |
US20170280188A1 (en) * | 2016-03-24 | 2017-09-28 | Daqri, Llc | Recording Remote Expert Sessions |
US10187686B2 (en) * | 2016-03-24 | 2019-01-22 | Daqri, Llc | Recording remote expert sessions |
US10248441B2 (en) * | 2016-08-02 | 2019-04-02 | International Business Machines Corporation | Remote technology assistance through dynamic flows of visual and auditory instructions |
WO2018093921A1 (en) * | 2016-11-16 | 2018-05-24 | Terarecon, Inc. | System and method for three-dimensional printing, holographic and virtual reality rendering from medical image processing |
US10275927B2 (en) | 2016-11-16 | 2019-04-30 | Terarecon, Inc. | System and method for three-dimensional printing, holographic and virtual reality rendering from medical image processing |
CN106846496A (en) * | 2017-01-19 | 2017-06-13 | 杭州古珀医疗科技有限公司 | DICOM images based on mixed reality technology check system and operating method |
WO2018175971A1 (en) * | 2017-03-24 | 2018-09-27 | Surgical Theater LLC | System and method for training and collaborating in a virtual environment |
CN109643530A (en) * | 2017-03-24 | 2019-04-16 | 外科手术室公司 | System and method for being trained and cooperating in virtual environment |
US10706636B2 (en) * | 2017-06-26 | 2020-07-07 | v Personalize Inc. | System and method for creating editable configurations of 3D model |
US11532135B2 (en) | 2017-09-08 | 2022-12-20 | Surgical Theater, Inc. | Dual mode augmented reality surgical system and method |
US10861236B2 (en) | 2017-09-08 | 2020-12-08 | Surgical Theater, Inc. | Dual mode augmented reality surgical system and method |
US10265138B2 (en) * | 2017-09-18 | 2019-04-23 | MediVis, Inc. | Methods and systems for generating and using 3D images in surgical settings |
US10886029B2 (en) * | 2017-11-08 | 2021-01-05 | International Business Machines Corporation | 3D web-based annotation |
US10896762B2 (en) | 2017-11-08 | 2021-01-19 | International Business Machines Corporation | 3D web-based annotation |
US10638089B2 (en) * | 2017-12-11 | 2020-04-28 | Foresight Imaging | System and method of collaboratively communication on images via input illustrations and have those illustrations auto erase |
US20190182454A1 (en) * | 2017-12-11 | 2019-06-13 | Foresight Imaging LLC | System and method of collaboratively communication on images via input illustrations and have those illustrations auto erase. |
CN111602105A (en) * | 2018-01-22 | 2020-08-28 | 苹果公司 | Method and apparatus for presenting synthetic reality companion content |
WO2019190792A1 (en) * | 2018-03-26 | 2019-10-03 | Covidien Lp | Telementoring control assemblies for robotic surgical systems |
WO2019210353A1 (en) * | 2018-04-30 | 2019-11-07 | MedVR Pty Ltd | Medical virtual reality and mixed reality collaboration platform |
US11974887B2 (en) | 2018-05-02 | 2024-05-07 | Augmedics Ltd. | Registration marker for an augmented reality system |
US11062527B2 (en) | 2018-09-28 | 2021-07-13 | General Electric Company | Overlay and manipulation of medical images in a virtual environment |
CN111868788A (en) * | 2018-10-17 | 2020-10-30 | 美的集团股份有限公司 | System and method for generating an acupoint and pressure point map |
US10928773B2 (en) | 2018-11-01 | 2021-02-23 | International Business Machines Corporation | Holographic image replication |
US10939977B2 (en) | 2018-11-26 | 2021-03-09 | Augmedics Ltd. | Positioning marker |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
US11968408B2 (en) * | 2019-02-26 | 2024-04-23 | Surgtime, Inc. | System and method for teaching a surgical procedure |
US20220321925A1 (en) * | 2019-02-26 | 2022-10-06 | Surgtime, Inc. | System and method for teaching a surgical procedure |
US11980507B2 (en) | 2019-04-30 | 2024-05-14 | Augmedics Ltd. | Registration of a fiducial marker for an augmented reality system |
US11980506B2 (en) | 2019-07-29 | 2024-05-14 | Augmedics Ltd. | Fiducial marker |
US20210068900A1 (en) * | 2019-09-11 | 2021-03-11 | Ardeshir Rastinehad | Method for providing clinical support for surgical guidance during robotic surgery |
US11903650B2 (en) * | 2019-09-11 | 2024-02-20 | Ardeshir Rastinehad | Method for providing clinical support for surgical guidance during robotic surgery |
US11801115B2 (en) | 2019-12-22 | 2023-10-31 | Augmedics Ltd. | Mirroring in image guided surgery |
US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
WO2022003729A1 (en) * | 2020-07-02 | 2022-01-06 | Cnr Medimentor Private Limited | System and method for telestrating the operative procedures |
US11817201B2 (en) | 2020-09-08 | 2023-11-14 | Medtronic, Inc. | Imaging discovery utility for augmenting clinical image management |
CN113220121A (en) * | 2021-05-04 | 2021-08-06 | 西北工业大学 | AR fastener auxiliary assembly system and method based on projection display |
US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
US11980508B2 (en) | 2023-08-04 | 2024-05-14 | Augmedics Ltd. | Registration of a fiducial marker for an augmented reality system |
US11980429B2 (en) | 2023-09-20 | 2024-05-14 | Augmedics Ltd. | Tracking methods for image-guided surgery |
Also Published As
Publication number | Publication date |
---|---|
US20160028994A1 (en) | 2016-01-28 |
US9560318B2 (en) | 2017-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140176661A1 (en) | System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom) | |
Sauer et al. | Mixed reality in visceral surgery: development of a suitable workflow and evaluation of intraoperative use-cases | |
US11730545B2 (en) | System and method for multi-client deployment of augmented reality instrument tracking | |
Bernardo | Virtual reality and simulation in neurosurgical training | |
Kockro et al. | Planning and simulation of neurosurgery in a virtual reality environment | |
John | The impact of Web3D technologies on medical education and training | |
US20210015583A1 (en) | Augmented reality system and method for tele-proctoring a surgical procedure | |
Kockro et al. | A collaborative virtual reality environment for neurosurgical planning and training | |
JP2021512440A (en) | Patient Engagement Systems and Methods | |
Halic et al. | Mixed reality simulation of rasping procedure in artificial cervical disc replacement (ACDR) surgery | |
Zagoranski et al. | Use of augmented reality in education | |
Zhao et al. | Floating autostereoscopic 3D display with multidimensional images for telesurgical visualization | |
CN115315729A (en) | Method and system for facilitating remote presentation or interaction | |
Munawar et al. | Fully immersive virtual reality for skull-base surgery: surgical training and beyond | |
Xu et al. | ARLS: An asymmetrical remote learning system for sharing anatomy between an HMD and a light field display | |
Andersen et al. | Augmented visual instruction for surgical practice and training | |
Tuladhar et al. | A recent review and a taxonomy for hard and soft tissue visualization‐based mixed reality | |
Dumay | Triage simulation in a virtual environment | |
John | Basis and principles of virtual reality in medical imaging | |
Andersen | Effective User Guidance Through Augmented Reality Interfaces: Advances and Applications | |
Bailey et al. | Streaming Virtual Reality: An Innovative Approach to Distance Healthcare Simulation | |
Budrionis et al. | Camera movement during telementoring and laparoscopic surgery: Challenges and innovative solutions | |
US20210358218A1 (en) | 360 vr volumetric media editor | |
Singh | Augmented Reality on the da Vinci Surgical System | |
Halic et al. | Two-way semi-automatic registration in augmented reality system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: SKYSURGERY LLC, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: REINA, G. ANTHONY; L'ESPERANCE, JAMES OMER; Reel/Frame: 035975/0465; Effective date: 2015-06-10 |
| STCB | Information on status: application discontinuation | Free format text: EXPRESSLY ABANDONED -- DURING PUBLICATION PROCESS |