WO2017151904A1 - Methods and systems for registering anatomical images - Google Patents

Methods and systems for registering anatomical images

Info

Publication number
WO2017151904A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
medical
surgical device
surgical
location
Prior art date
Application number
PCT/US2017/020425
Other languages
English (en)
Inventor
Peter TRUSKEY
Brian H. Craig
Original Assignee
Covidien Lp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Covidien Lp filed Critical Covidien Lp
Publication of WO2017151904A1

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image

Definitions

  • the present disclosure generally relates to a method and system of providing enhanced user experience during a surgery. More particularly, the present disclosure relates to a method and system for enhancing visual experience during a surgery.
  • Laparoscopic surgery is a minimally invasive surgery that usually includes making small incisions in the body of the patient. The incisions are remote from the actual anatomical structures upon which a surgeon will operate. A surgeon usually inserts a laparoscope, which has a video camera connected to it, through the incisions. Images captured by the video camera are projected to a monitor within the operating room where the surgery is taking place.
  • the surgeon navigates the laparoscope through the body of the patient to the surgical site by looking at the video monitor.
  • looking away from the patient puts surgeons at a disadvantage.
  • the surgeon is not able to experience the surgery in an intuitive manner.
  • the video feed mirrors the moves made by the surgeon. For example, if the surgeon moves his hand down, the video would show the hand moving up.
  • the lack of an intuitive experience while performing the surgery makes learning to perform such surgeries more difficult for new surgeons. Surgeons performing robotic surgeries and open surgeries also experience similar difficulties.
  • a computer-implemented method includes capturing one or more images, identifying a surgical device within the one or more images, determining one or more geometrical parameters for the surgical device, determining, based on the one or more geometrical parameters, a location of a distal end of the identified surgical device, determining, based on the location of the distal end of the identified surgical device, a surgical site within the body of a patient, determining, based on the surgical site within the body of the patient, a corresponding surface site, and displaying, based on an image from a laparoscope at the surgical site within the body of the patient, a holographic image at the corresponding surface site.
  • the corresponding surface site is at a location on the surface of the body of the patient.
  • the holographic image includes the distal end of the identified surgical device.
  • the holographic image includes an anatomical structure located at the surgical site within the body of the patient.
  • identifying the surgical device within the one or more images comprises using an object recognition classifier trained on surgical devices.
  • determining the location of the distal end of the surgical device comprises determining an angle at which the surgical device enters the body of the patient.
  • determining the location of the distal end of the surgical device comprises determining size of a portion of the surgical device protruding from the body of the patient.
  • the method further comprises tracking the identified surgical device through the one or more captured images.
  • the one or more geometrical parameters for the surgical device includes length of the surgical device.
  • the one or more geometrical parameters for the surgical device includes width of the surgical device.
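The geometric determination recited above (entry angle plus the length of the protruding portion) can be sketched in a few lines. This is an illustrative reduction of the idea only, not the patented method: the function name, the 2-D coordinate frame, and the rigid-rod model are all assumptions.

```python
import math

def estimate_distal_end(entry_point, entry_angle_deg, device_length, protruding_length):
    """Estimate the in-body location of a surgical device's distal end.

    Hypothetical 2-D sketch: the device is modeled as a rigid rod entering
    the body surface at `entry_point` (x, y) at `entry_angle_deg` measured
    from the surface. The inserted length is the total device length minus
    the portion still protruding; the distal end lies that far along the
    entry direction.
    """
    inserted = device_length - protruding_length
    angle = math.radians(entry_angle_deg)
    dx = inserted * math.cos(angle)
    dy = -inserted * math.sin(angle)  # negative y: into the body
    return (entry_point[0] + dx, entry_point[1] + dy)
```

A device 30 cm long with 10 cm protruding at a 90 degree entry would place its distal end 20 cm straight below the entry point under this model.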
  • In another aspect of the present disclosure, a system includes one or more image capturing devices configured to capture one or more images, a networked computer system configured to identify a surgical device within the one or more images, and a head mounted display device configured to: determine one or more geometrical parameters for the surgical device, determine, based on the one or more geometrical parameters, a location of a distal end of the identified surgical device, determine, based on the location of the distal end of the identified surgical device, a surgical site within the body of a patient, determine, based on the surgical site within the body of the patient, a corresponding surface site, and display, based on an image from a laparoscope at the surgical site within the body of the patient, a holographic image at the corresponding surface site.
  • the corresponding surface site is at a location on the surface of the body of the patient.
  • the holographic image includes the distal end of the identified surgical device.
  • the holographic image includes an anatomical structure located at the surgical site within the body of the patient.
  • the networked computer system is further configured to identify the surgical device within the one or more images using an object recognition technique trained on surgical devices.
  • the head mounted display device is further configured to determine the location of the distal end of the surgical device by determining an angle at which the surgical device enters the body of the patient.
  • the head mounted display device is further configured to determine the location of the distal end of the surgical device by determining size of a portion of the surgical device protruding from the body of the patient.
  • the head mounted display device is further configured to track the identified surgical device through the one or more captured images.
  • the one or more geometrical parameters for the surgical device of the system includes length of the surgical device.
  • the one or more geometrical parameters for the surgical device of the system includes width of the surgical device.
  • FIG. 1 A illustrates an example arrangement of an integrated surgical system utilizing augmented reality, in accordance with an embodiment of the present disclosure
  • FIG. IB illustrates an example arrangement of head mounted display devices, a networked computer system, and a data storage unit communicatively coupled with each other, in accordance with an embodiment of the present disclosure
  • FIG. 2 illustrates a head mounted display device, in accordance with an embodiment of the present disclosure
  • FIG. 3 illustrates a holographic image of information related to vital signs of a patient being displayed to a user, in accordance with an embodiment of the present disclosure
  • FIG. 4 illustrates holographic images of patient information and locations of incisions to the user, in accordance with an embodiment of the present disclosure
  • FIG. 5 illustrates holographic images identifying medical devices to the user, in accordance with an embodiment of the present disclosure
  • FIG. 6 illustrates holographic image displaying distal ends of medical devices and anatomical structures at a surgical site within the body of a patient, in accordance with an embodiment of the present disclosure
  • FIG. 7A - FIG. 7F illustrate a close-up view of a holographic image displaying the distal end of a medical device, anatomical structures at the surgical site, and information guiding the user to perform the procedure with the greatest precision, in accordance with an embodiment of the present disclosure
  • FIG. 8A - FIG. 8B illustrate a holographic image displaying a list of steps, medical devices and a status of each step for a medical procedure, in accordance with an embodiment of the present disclosure
  • FIG. 9 illustrates a holographic image displaying an inventory of medical devices and supplies used in a medical procedure in accordance with an embodiment of the present disclosure
  • FIG. 10A - FIG. 10B illustrate an example arrangement of reviewing a video of a medical procedure that has already been performed, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
  • the present disclosure is directed to systems and methods of implementing virtual and augmented reality devices into the medical and surgical setting. Utilizing these technologies clinicians can be given the ability to see through tissue in a manner that makes laparoscopic surgery feel like open surgery from the visualization perspective.
  • Using identification techniques, a variety of data can be presented, including information about the surgical implements and medical devices, steps of a procedure, information about the patient, and vital signs, to name just a few. This display of data can be utilized both by surgeons and nurses to speed procedures, ensure the safety of the patient, and permit nurses and attending surgeons to anticipate the needs of the primary surgeon before being asked.
  • voice recognition technologies enable the clinician to update the images and data they are viewing as they proceed through the steps of a procedure, identify responses as a patient's condition changes, and re-review or remotely observe (either in real time or at a convenient time) for training purposes.
  • the term "proximal" refers to the portion of the device or component thereof that is farthest away from the patient, and the term "distal" refers to the portion of the device or component thereof that is closest to the patient.
  • FIG. 1 A depicts an exemplary integrated surgical system utilizing augmented reality (referred to herein as "system") 100 to display information to users in an augmented real world environment.
  • System 100 may be employed in accordance with various example embodiments herein.
  • System 100 is an exemplary system found in an operating room 99.
  • the specific number of components of the system 100 depicted in FIG. 1 A and the arrangement and configuration thereof are provided for illustrative purposes only, and should not be construed as limiting. For instance, various embodiments herein employ fewer or greater than all of the components shown in FIG. 1 A.
  • the system 100 depicted in FIG. 1 A is provided as an example context in which various example embodiments herein are applicable. However, the various example embodiments herein are also applicable in contexts other than integrated surgical systems, for example, in reviewing a video of a surgery for instructional purposes.
  • System 100 includes an operating table 160 upon which a patient 150 lies.
  • System 100 also includes a plurality of users wearing a head mounted display (HMD) device.
  • the plurality of users include surgeon 107a wearing an HMD device 107b, surgeon 109a wearing an HMD device 109b, nurse 108a wearing an HMD device 108b, and nurse 110a wearing an HMD device 110b.
  • An HMD device may comprise a display in front of one or both eyes of the user that enables the user to experience an augmented reality environment or a virtual reality environment.
  • In an augmented reality environment as described herein, the information presented to the user is aligned and overlaid upon an object or space in the real world.
  • the overlaid information may be presented using computer graphics such as holograms or other 3-dimensional renderings of the information.
  • For example, information in the form of augmented reality may be a hologram comprising biographical information and the face of patient 150, with the hologram aligned and overlaid on top of or on the surface of the body of patient 150.
  • An HMD device may be constructed such that it can be worn on the head of the user, and may comprise one or more processors that are coupled to one or more memory units.
  • An HMD device may comprise an operating system that enables the HMD device to receive and process instructions from users and/or other computing devices, and transmit data to users and/or other computing devices.
  • System 100 also includes one or more monitors, such as monitor 105, capable of displaying static and dynamic information and a sequence or stream of images, such as video.
  • Monitor 105 may display information concerning vital signs of patient 150, such as the body temperature, blood pressure, heart rate, respiratory rate of patient 150.
  • Monitor 105 may also display video of the operating room, video of the procedure being performed, and video of the surgical site within the body of patient 150.
  • a surgical site is the portion within the body of a patient that is being medically treated. For example, if a portion of a lung of a patient is being operated on, then the surgical site is the portion of the lung that is being operated upon.
  • System 100 also includes a plurality of image capturing devices 101, 102, 103, 104.
  • Image capturing devices 101, 102, 103, 104 may be video cameras.
  • Each of the image capturing devices 101, 102, 103, 104 may be associated with a particular identifier.
  • Each of the image capturing devices 101, 102, 103, 104 may also be associated with a location identifier.
  • the location identifier may indicate the location of the image capturing device relative to a point of reference.
  • Each of the HMD devices 107b, 108b, 109b, 110b may also be associated with a particular identifier and a location identifier that indicates the location of the HMD device relative to a point of reference.
  • a point of reference may be a fixed landmark such as patient 150, operating table 160, medical device 120, the orientation of the operating room, the door to the operating room, or other fixed landmarks.
  • If the location identifiers associated with image capturing devices 101, 102, 103, 104 use the orientation of operating room 99 as shown in FIG. 1A as the point of reference, then the location identifier of image capturing device 101 will indicate that image capturing device 101 is on the west side of operating room 99 as shown in FIG. 1A.
  • If the location identifiers associated with HMD devices use patient 150 as the point of reference, then the location identifier associated with HMD device 107b indicates that HMD device 107b is north of patient 150.
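A location identifier relative to a point of reference, as described above, could be reduced to a coarse cardinal direction. This helper is a sketch under assumed conventions (a 2-D floor-plan frame with +y as north and +x as east); the disclosure does not fix a coordinate frame.

```python
def cardinal_direction(device_xy, reference_xy):
    """Return a coarse cardinal direction of a device relative to a
    point of reference (e.g. patient 150 or operating table 160).
    Assumes +y is north and +x is east in a floor-plan coordinate frame.
    """
    dx = device_xy[0] - reference_xy[0]
    dy = device_xy[1] - reference_xy[1]
    # Pick the dominant axis to classify the direction.
    if abs(dx) >= abs(dy):
        return "east" if dx >= 0 else "west"
    return "north" if dy >= 0 else "south"
```

Under these assumptions, a camera five meters west of the room center would yield "west", matching the example location identifier for image capturing device 101.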
  • Image capturing devices 101, 102, 103, 104 and HMD devices 107b, 108b, 109b, 110b may be configured to transmit captured images or a sequence of images to networked computer system 140.
  • Networked computer system 140 includes one or more processors coupled to one or more memory units.
  • Networked computer system 140 may be configured to retrieve data from and store data in a data storage unit comprising data associated with users, such as users 107a, 108a, 109a, 110a, medical devices, such as medical device 120, patients, such as patient 150, and a plurality of medical procedures.
  • Networked computer system 140 may be capable of executing procedures, such as programs, routines, scripts, or other executable commands necessary for integrating and analyzing data from various components of system 100, such as HMD devices 107b, 108b, 109b, 110b, image capturing devices 101, 102, 103, 104, and medical device 120.
  • Memory units of networked computer system 140 may comprise one or more instructions that register the received images from image capturing devices 101, 102, 103, 104 and HMD devices 107b, 108b, 109b, 110b. Memory units of networked computer system 140 may also comprise one or more instructions that align and combine the received images into a single video. The networked computer system 140 may align the images based on the location identifiers associated with the images. The single video, resulting from aligning and combining the received images, produces a 360 degree view. In FIG. 1A, the single video displays a 360 degree view of the activities that occur within operating room 99, including the procedure performed on patient 150. A portion of this 360 degree view may be presented on each HMD consistent with the orientation of the HMD and the images it is collecting, such that the HMD displays a field of view to the wearer consistent with their head orientation. These details are described further below.
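Selecting the portion of a 360 degree view that matches a wearer's head orientation can be sketched as a yaw-to-column mapping over an equirectangular panorama. This is a simplification under stated assumptions (crop rather than warp, yaw only); the function name and parameters are hypothetical.

```python
def fov_slice(panorama_width, yaw_deg, fov_deg=90):
    """Map an HMD wearer's head yaw (degrees) to a pixel-column range of a
    360-degree panorama, wrapping around at the seam. Sketch only: a real
    system would also account for pitch and warp the projection rather
    than crop columns.
    """
    center = int((yaw_deg % 360) / 360 * panorama_width)
    half = int(fov_deg / 360 * panorama_width) // 2
    start = (center - half) % panorama_width  # wraps past the seam
    end = (center + half) % panorama_width
    return start, end
```

For a 3600-pixel-wide panorama and a 90 degree field of view, a yaw of 0 wraps across the seam, yielding columns 3150 through 450.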
  • Networked computer system 140 may be configured to identify one or more faces in the received images using facial recognition techniques trained to identify a set of known faces.
  • the set of known faces are stored in a data storage unit.
  • networked computer system 140 may comprise the data storage unit or be communicatively coupled with the data storage unit.
  • Networked computer system 140 may be configured to identify one or more medical devices using techniques trained to identify medical devices.
  • each medical device may be equipped with an RFID chip or other electronic identifier which identifies the medical device to the computer system 140.
  • Networked computer system 140 may be configured to use one or more object recognition techniques that are trained on medical devices to detect one or more medical devices in one or more images and determine whether the one or more detected medical devices match known medical devices.
  • networked computer system 140 may be configured to determine geometrical information, such as height, width and length, of the one or more detected medical devices and match the geometrical information of the one or more detected medical devices with the geometrical information of known medical devices stored in a data storage unit, such as data storage unit 141 described in FIG. IB.
  • networked computer system 140 may be configured to determine the geometrical information using 3-dimensional (3D) reconstruction techniques on the one or more detected medical devices to determine a point cloud for each of the detected medical devices.
  • a point cloud as described herein, is a set of data points in a coordinate system.
  • a point cloud in a 3-dimensional coordinate system of an object in one or more images may include points along the X, Y, and Z coordinates of the object in the one or more images.
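Matching a detected device's point cloud against stored geometric measurements, as described above, might look like the following. This is a minimal sketch assuming axis-aligned extents and a simple tolerance; the catalog structure and function names are illustrative, not from the disclosure.

```python
def bounding_dims(points):
    """Extents of a device's point cloud along X, Y, Z as an axis-aligned
    bounding box (sketch; ignores the device's orientation in the scene)."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

def match_device(points, known_devices, tol=0.05):
    """Compare detected dimensions against a catalog of known devices
    (e.g. measurements stored in data storage unit 141).
    `known_devices` maps device name -> (length, width, height).
    Dimensions are sorted so the comparison is orientation-agnostic.
    """
    dims = sorted(bounding_dims(points))
    for name, known in known_devices.items():
        if all(abs(a - b) <= tol * max(b, 1e-9)
               for a, b in zip(dims, sorted(known))):
            return name
    return None
```

A cloud spanning roughly 10 x 1 x 1 units would match a catalog entry with those measurements, such as a trocar cannula, within the 5 percent tolerance.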
  • Networked computer system 140 may also be configured to utilize various other computer vision techniques or technologies to detect and identify medical devices in the one or more images captured by the one or more image capturing apparatuses of system 100.
  • Networked computer system 140 may also be configured to assign or determine an identifier to each of the identified medical devices.
  • HMD devices may include or present the identifier assigned or determined for medical device 120, along with other information and attributes corresponding to the medical device 120, to a user whenever the user's field of view includes the medical device 120.
  • an identifier assigned or determined for an identified medical device may be the name of the device, and the identifier, i.e. the name of the device, may be displayed along with usages and the number of firings or other limits on usage that may be relevant to the user.
  • Networked computer system 140 may also be configured to output a series or sequence of images from one or more components of system 100, such as image capturing devices 101, 102, 103, 104, HMD devices 107b, 108b, 109b, 110b and medical device 120, to one or more monitors within system 100, such as monitor 105. Additionally, networked computer system 140 may also be configured to output information concerning patient 150 to monitor 105. These details are set forth below.
  • FIG. 1B illustrates an example of a network in which the system 100 might operate.
  • FIG. 1B depicts an arrangement of head mounted display devices 107b, 108b, 109b, 110b, a networked computer system 140, and a data storage unit 141 communicatively coupled with each other.
  • For the purpose of illustrating an example network, various components of FIG. 1A will be described in greater detail below.
  • networked computer system 140 may be configured to retrieve data from and store data in a data storage unit.
  • Data storage unit 141 may comprise one or more processors that are coupled with one or more memory units.
  • the memory units may store one or more sequence of instructions that execute a database management system (DBMS).
  • Data stored in data storage unit 141 may be stored in a plurality of data records that may be organized according to various schemas and in a collection of various tables, reports, views and other database objects. Data records may be queried from, updated, deleted or created in data storage unit 141 using the DBMS.
  • the DBMS may support one or more query languages to allow users and other computing nodes or devices to query, update, create or delete data records stored in data storage unit 141.
  • Data storage unit 141 may also be configured to communicate with users or other computing devices through a network, such as network 180a.
  • Network 180a may be any type of network, including the internet and intranets.
  • a user, such as surgeon 109a using an HMD device 109b, may transmit requests to data storage unit 141 through network 180a, and data storage unit 141 may transmit responses to the user, surgeon 109a.
  • networked computing system 140 may transmit requests to data storage unit 141 also through network 180a and responses to the requests from data storage unit 141 may be transmitted through network 180a.
  • HMD devices 107b, 108b, 109b, 110b and networked computer system 140 may also be configured to communicate with each other through a network, such as network 180b.
  • Network 180a and network 180b may facilitate wireless or wired communication.
  • Network 180a and network 180b may comprise a plurality of network nodes and each network node may support various routing protocols including routing protocols to transfer internet protocol (IP) packets, voice packets, video packets, data packets, and other information between the various network nodes and end nodes including data storage unit 141, networked computing system 140, HMD devices 107b, 108b, 109b, 110b.
  • Data storage unit 141, networked computing system 140 and HMD devices 107b, 108b, 109b, 110b may be configured to support a number of routing protocols and each may be configured to transfer data, voice, video, and other packets with each other using a routing protocol.
  • Data storage unit 141 may comprise information related to patients of a customer of system 100.
  • information related to a patient may include biographical information of the patient such as name of the patient, date of birth of the patient, age of the patient, and other identification information including birthmarks, social security number, passport number or other state issued identifications.
  • Information related to patient may also include physical information of the patient such as height of the patient, weight of the patient, color of skin, color of hair, color of eyes.
  • Patient related information may also include medical history of the patient such as medical conditions, allergies, co-morbidities, previous medical procedures, previous surgeries, state of each of the diseases afflicting the patient.
  • patient related information of a patient suffering from pancreatic cancer will indicate the stage of the pancreatic cancer.
  • Patient related information may also include the medical procedure that will be performed on the patient. Information related to a patient is not limited to the above recited examples.
  • Data storage unit 141 may also comprise data related to 3-dimensional geometric measurements of medical devices or portions of medical devices, such as height, width, and length of a medical device or a portion of the medical device. For example, if the medical device is a particular model of a trocar from a particular manufacturer, then 3-dimensional geometric measurements of the cannula portion of the trocar, such as the width or circumference and length of the cannula portion, may be stored as a data record or a portion of a data record associated with the particular trocar. Data storage unit 141 may also store a number of units for each medical device and information corresponding to the number of units that are available for each medical device. This data can be used for image based identification of devices within the surgery being performed in operating room 99, and to push information to a user having such a device within their field of view.
  • data storage unit 141 may also comprise instructions that inform a user of system 100 on how to use the medical device. Additionally, for each medical device, data storage unit 141 may comprise one or more audio or video recordings that allow a user of system 100 to be trained on how to use the medical device. In some embodiments, data storage unit 141 may only store pointers or links to the audio or video recordings, and, using the pointers or links, the audio or video recording may be automatically downloaded to a computing device, such as an HMD device of a user, from a secondary data storage unit (e.g., a location on the internet). In some embodiments, the audio or video recording may be downloaded to a computing device controlled by a user of system 100, such as an HMD device of the user, only upon an explicit instruction from the user to download the audio or video recording.
  • Data storage unit 141 may also comprise information related to various medical procedures.
  • the information related to the various medical procedures may comprise any information that may help describe the specifics of each of the medical procedures.
  • the related information may comprise a surgical approach of the surgical procedure.
  • Surgical approach as described herein, may include any type of approaches used in performing a surgery. Such approaches may include but are not limited to open surgery, minimally invasive surgery (MIS), robotic surgery, etc.
  • the related information of a medical procedure involving a laparoscopic surgery may indicate that the approach of the surgical procedure is a minimally invasive surgery.
  • the related information may also comprise a surgical technique of the surgical procedure.
  • Surgical technique of a surgical procedure may include any type of a technique for a surgery.
  • the related information may indicate that the surgical technique is a laparoscopic supracervical hysterectomy.
  • the related information may indicate that the surgical technique is a laparoscopic total hysterectomy.
  • the related information may also include a series of steps that are typically performed during the medical procedure.
  • the related information may also include an expected amount of time within which the step is expected to be completed.
  • the related information may also include one or more parallel steps that may be performed instead of the typically performed step during the medical procedure.
  • Related information may also include for each of the one or more parallel steps, one or more subsequent steps required to complete the medical procedure including one or more parallel subsequent steps, if any, for each of the subsequent steps.
  • the one or more subsequent steps required to complete the medical procedure may be the same as or different from the steps that follow the step that is typically performed during the medical procedure.
  • data storage unit 141 may also comprise information related to one or more medical devices utilized during the medical procedure.
  • Data storage unit 141 may also comprise an expected duration time for each medical procedure.
  • Data storage unit 141 may also comprise one or more data records associated with a medical procedure that is currently being performed and a location identifier indicating the location of where the medical procedure is being performed.
  • For a medical procedure that is currently being performed, data storage unit 141 will comprise a data record comprising the medical procedure or an identifier indicating the medical procedure, an identifier indicating a surgical approach for a surgery involved in the medical procedure, an identifier indicating a surgical technique for the surgery involved in the medical procedure, an identifier indicating the location of the operating room, a start time of the medical procedure, an identifier indicating a doctor performing the medical procedure, an identifier indicating support personnel assisting in the medical procedure, and an end time of the medical procedure.
  • the expected duration time of each medical procedure may be automatically updated based on an analysis of actual duration times of the medical procedure when the medical procedure was previously performed. Actual duration time may be calculated as the amount of time elapsed between the start time of the medical procedure and the end time of the medical procedure. In some embodiments, the number of actual duration times analyzed is based on a threshold number.
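The automatic update described above can be sketched as a bounded rolling average over the most recent actual durations. The averaging rule and function name are assumptions; the disclosure only says the analysis is limited by a threshold number of samples.

```python
def update_expected_duration(actual_durations_min, threshold=5):
    """Recompute a procedure's expected duration (minutes) from its most
    recent actual durations (end time minus start time). Only the latest
    `threshold` runs are analyzed, per the embodiment that bounds the
    number of duration times considered. Returns None if no data exists.
    """
    recent = actual_durations_min[-threshold:]
    if not recent:
        return None
    return sum(recent) / len(recent)
```

For example, given logged durations of 100, 90, 110, 95, 105, and 100 minutes with a threshold of 5, only the last five runs contribute, giving an expected duration of 100 minutes.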
  • data storage unit 141 may comprise information about each of the actual steps performed during the medical procedure by a user, such as a surgeon. For each of the actual steps performed, data storage unit 141 may also comprise the start time of the actual step, the end time of the actual step, and also the duration of the actual step. The duration of the actual step may be determined using the start time and the end time. For each medical procedure performed, data storage unit 141 may also associate the medical procedure with a user performing the medical procedure. For example, for a medical procedure performed by surgeon 107a, data storage unit 141 may associate a data record indicating an identifier of surgeon 107a as the user or surgeon that performed the medical procedure.
  • Networked computer system 140 may be configured to review or analyze the performance of a user using data related to the actual steps taken by the user. As described above, data related to the actual steps taken by the user may be stored in a data storage unit, such as data storage unit 141. Networked computer system 140 may compare the steps taken by the user with the steps that are typically performed for the medical procedure and determine the efficiency of each of the steps taken by the surgeon. Efficiency of each of the steps taken by the user may be determined based on the actual amount of time spent performing the actual step performed by the user and the expected amount of time for the step performed by the user.
  • efficiency of each of the steps actually performed may be determined by comparing the actual amount of time spent performing the actual step performed by the user with the expected amount of time that was to be spent on a corresponding step that is typically associated with the medical procedure.
  • a corresponding step that is typically associated with the medical procedure is a step that is typically performed instead of the step actually performed by the user, such as a surgeon.
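The per-step efficiency comparison described above can be sketched as a ratio of expected to actual time. This is a minimal sketch under stated assumptions: the disclosure does not fix a formula, and the function name and the use of minutes are hypothetical.

```python
# Hypothetical efficiency metric: expected time over actual time,
# so 1.0 means on-pace and values above 1.0 mean faster than expected.
def step_efficiency(actual_minutes, expected_minutes):
    """Ratio of expected to actual time spent on a step."""
    return expected_minutes / actual_minutes

# A surgeon spent 20 minutes on a step expected to take 15 minutes.
eff = step_efficiency(actual_minutes=20, expected_minutes=15)  # 0.75
```

Networked computer system 140 could compute such a value per step and store it alongside the step's start and end times in data storage unit 141.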
  • for each medical procedure, data storage unit 141 may comprise one or more series of steps, and for each step, data storage unit 141 may also comprise one or more parallel steps and one or more steps subsequent to the one or more parallel steps.
  • Networked computer system 140 may be configured to determine or predict a next step or a series of steps for a medical procedure utilizing various machine learning or predictive analytic techniques or technologies on data related to all the steps or sequences of steps for the medical procedure including the parallel steps for the medical procedure. In utilizing the various machine learning or predictive analytic techniques or technologies to predict the next step or the series of next steps, networked computer system 140 may also incorporate one or more other factors involved in the medical procedure, including patient vitals information at each step of the procedure, medical history of the patient including medical conditions, past surgeries, age of the patient and other information. Networked computer system 140 may also be configured to determine or assign a probability to each of the possible steps, including the parallel steps. The probability assigned to each of the possible steps of the medical procedure correlates at least in part to the likelihood that the outcome of the surgery will be successful once the step is performed.
  • a sequence of steps for medical procedure Y is steps A, B, C, D and step B has a parallel step E, which has a subsequent step F.
  • Step C has parallel steps F, G and H.
  • Networked computer system 140 may predict that the step after step A should be step E rather than step B based on the age of the patient and the heart rate of the patient at the time the subsequent step to step A is to be performed.
  • Networked computer system 140 may then determine which of the steps C, F, G and H may be the best possible step to follow step E, again, based on the information captured using machine learning or predictive analytic techniques and patient related information.
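The step-A/step-E example above can be sketched as selecting the candidate step with the highest assigned probability. The scoring rule below is a toy stand-in for the machine learning or predictive analytic techniques named in the disclosure; the step attributes and patient factors are assumptions.

```python
# Hypothetical scoring of parallel candidate steps using patient factors.
def assign_probability(step, patient):
    """Toy model: boost a low-stress step for an older patient."""
    score = step["base_probability"]
    if patient["age"] > 65 and step.get("low_stress"):
        score += 0.1
    return score

def predict_next_step(candidates, patient):
    """Return the candidate step with the highest assigned probability."""
    return max(candidates, key=lambda s: assign_probability(s, patient))

# Step B and its parallel step E are both possible after step A.
candidates = [
    {"name": "B", "base_probability": 0.6},
    {"name": "E", "base_probability": 0.55, "low_stress": True},
]
patient = {"age": 72, "heart_rate": 88}
next_step = predict_next_step(candidates, patient)  # step E wins here
```

In a real system the scores would come from a trained model over historical procedure data rather than hand-written rules.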
  • networked computer system 140 may also incorporate one or more preferences of a user, such as surgeon 107a, in determining or predicting the next step or series of steps.
  • the one or more preferences of a user may be a preference by the user to always follow step A with step C for medical procedure Y described in the example above.
  • Networked computer system 140 may identify such preferences of a user based on analyzing historical data associated with the actual steps performed by the user each time the user has performed medical procedure Y in the past. As described above, data related to the actual steps performed by a user is captured and stored in a data storage unit, such as data storage unit 141. Using such historical data, networked computer system 140 may also build a profile for each user comprising one or more preferences of the user for each medical procedure that the user has performed.
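Deriving such a preference from stored history can be sketched as a frequency count over past runs: if one successor step dominates, record it as the user's preference. The threshold and data layout below are assumptions, not the disclosed profiling method.

```python
# Hypothetical preference extraction from historical step sequences.
from collections import Counter

def preferred_successor(history, step, min_share=0.8):
    """Return the step that followed `step` in at least min_share of cases."""
    successors = Counter()
    for run in history:
        for i in range(len(run) - 1):
            if run[i] == step:
                successors[run[i + 1]] += 1
    if not successors:
        return None
    best, count = successors.most_common(1)[0]
    return best if count / sum(successors.values()) >= min_share else None

# Three past performances of medical procedure Y by the same surgeon.
history = [["A", "C", "D"], ["A", "C", "E"], ["A", "C", "D"]]
pref = preferred_successor(history, "A")  # "C" in every run
```

A per-user profile could then map each medical procedure to its extracted preferences.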
  • data storage unit 141 may be configured to communicate through a network with one or more other computing systems of a consumer that is implementing system 100, such as a healthcare provider or a hospital.
  • data storage unit 141 may be coupled with a medical device inventory system that provides information concerning the number of available units for each medical device and the medical device inventory system may query data storage unit 141 to retrieve the number of available units for each medical device.
  • data storage unit 141 is coupled with an operating room scheduling system of a hospital that allows the hospital to determine when an occupied operating room, or a doctor, or a nurse becomes available based on the expected duration of the medical procedure that is being performed in the operating room, or by the surgeon, or by the nurse.
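The scheduling query described above reduces to projecting availability from the stored start time and expected duration. The record fields below are assumptions about the data records in data storage unit 141.

```python
# Hypothetical availability projection for an occupied operating room.
def room_available_at(record):
    """Projected free time = start time plus expected duration (minutes)."""
    return record["start_time"] + record["expected_duration"]

# A procedure started at minute 540 (09:00) expected to run 120 minutes.
record = {"room": "OR-3", "start_time": 540, "expected_duration": 120}
free_at = room_available_at(record)  # minute 660, i.e. 11:00
```

The same projection applies per surgeon or per nurse by keying records on the person instead of the room.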
  • FIG. 2 depicts an exemplary embodiment of an HMD device.
  • various components of FIG. 1A and FIG. 1B will be used to describe operations performed by an HMD device.
  • an HMD device may be constructed such that it can be worn on the head of the user and comprise one or more processors that are coupled to one or more memory units.
  • An HMD device may comprise an operating system that enables the HMD device to receive and process instructions from users and/or other computing devices, and transmit data to users and/or other computing devices.
  • An HMD device may comprise one or more cameras, such as cameras 201a, 201b, 201c, 201d, which when used in combination resolve depth.
  • An HMD device may also comprise one or more see-through (transparent) holographic lenses 202a, 202b that are coupled to an optical projection system (not shown) capable of generating and displaying images and holograms to the user.
  • HMD device may comprise one or more microphones (not shown) to capture audio from users.
  • HMD device may also comprise a plurality of sensors (not shown) including one or more radio-frequency identification (RFID) and infrared (IR) sensors. HMD device may also perform optical tracking of objects using IR sensors. Additionally, HMD device may comprise image capturing apparatus 203. Using image capturing apparatus 203, HMD device may capture and store a series or stream of images. The series or stream of images may be recorded in a video format. In some embodiments, HMD device may begin capturing and storing images in response to receiving an instruction to capture images. In some embodiments, the instruction to capture images may be provided in the form of a voice command from a user.
  • HMD device may be configured to detect one or more medical devices that are within a certain distance from HMD device using sensors of HMD device, such as an RFID or IR sensor. For example, if medical device 120 is tagged with an RFID tag, then HMD device 107b may detect medical device 120 using the RFID sensor of HMD device 107b. In response to detecting medical device 120, HMD device 107b may register medical device 120 as part of the medical procedure being performed in operating room 99.
  • HMD devices may also be configured to detect one or more medical devices in an image. For example, surgeon 107a may instruct HMD device 107b to begin capturing images and in response to an image being captured by image capturing apparatus 203, HMD device 107b may initiate an object recognition process to detect a medical device within the captured image. Instructions for performing the object recognition techniques may be stored in one or more memory units of HMD device 107b.
  • HMD devices may also be configured to detect and identify one or more medical devices in one or more images using the same image processing and computer vision techniques and technologies as described above that networked computer system 140 may be configured to use.
  • HMD device may register the medical device as part of the medical procedure being performed.
  • HMD device registers the medical device by creating a data entry associating the medical procedure being performed and the medical device.
  • HMD device, in order to maintain a record of all medical devices and supplies actually used in the medical procedure currently being performed, may transmit the data entry to a data storage unit storing one or more data records associated with medical procedures currently being performed at a customer, such as a hospital, implementing the integrated surgical system described herein.
  • HMD device 107b registers medical device 120 by creating a data entry associating medical device 120 with the medical procedure and transmitting a request comprising the data entry to data storage unit 141.
  • data storage unit 141 comprises one or more data records associated with medical procedures currently being performed at a customer implementing system 100.
  • data storage unit 141 may identify one or more data records associated with the medical procedure indicated in the data entry and update the associated data records to indicate that medical device 120 is being used in the medical procedure. This information may be used to confirm that all medical devices 120 and other implements such as sponges and gauze are accounted for at the end of the procedure.
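The registration and end-of-procedure accounting described above can be sketched as a set of registered items checked against a closing count. The record layout and function names are assumptions, not the disclosed schema.

```python
# Hypothetical sketch: register detected devices/implements against a
# procedure, then confirm all are accounted for at the end.
def register_device(records, procedure_id, device_id):
    """Create a data entry associating the device with the procedure."""
    records.setdefault(procedure_id, set()).add(device_id)

def all_accounted_for(records, procedure_id, counted_at_close):
    """True when every registered item appears in the closing count."""
    return records.get(procedure_id, set()) <= set(counted_at_close)

records = {}
register_device(records, "proc-1", "device-120")
register_device(records, "proc-1", "sponge-7")
ok = all_accounted_for(records, "proc-1", ["device-120", "sponge-7", "gauze-2"])
```

A mismatch at closing would flag an implement, such as a sponge or gauze, still unaccounted for.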
  • An HMD device may also identify one or more faces in an image using facial recognition techniques trained to identify a set of faces.
  • facial recognition techniques may be trained using a set of faces stored in a data storage unit, such as data storage unit 141. These faces stored in the data storage unit may be used to generate an avatar for each user such that through the HMD it does not appear that each user is wearing an HMD.
  • networked computer system 140 may receive images from image capturing apparatus 101, 102, 103, 104, HMD devices 107b, 108b, 109b, 110b, and medical device 120 if the medical device comprises an image capturing apparatus, and align and combine the received images into a single video that produces a 360 degree view of the activities within the operating room 99.
  • Networked computer system 140 may be configured to track the one or more identified medical devices, the one or more identified faces, and body parts of each of the identified faces, and, for each of the identified faces, determine various efficiency measurements of the user associated with the identified face. For example, networked computer system 140 may track the hands of surgeon 107a in the single video and determine, as efficiency measurements, metrics based on the tracked hand movements.
  • Networked computer system 140 may store the single video along with the efficiency measurements of each of the users associated with the identified faces in a data storage unit, such as data storage unit 141.
  • An HMD device may require user authentication in order to operate the device.
  • HMD devices may comprise biometric sensors (not shown), such as a fingerprint reader, and may require the user to login to the HMD device using the biometric sensors.
  • Login and access permissions for users may be stored in a central repository, such as data storage unit 141.
  • An HMD device may be configured to verify credentials by determining whether the user-submitted credential matches any stored credentials at the central repository. If the HMD device determines that the submitted credential matches a stored credential at the central repository, then the HMD device may request the access permissions of the user.
  • HMD devices may be configured to lock the user out of certain functionalities based on the granted permissions of the user. Additionally, HMD devices may also be configured to not display certain information to a user based on the permissions granted to the user. For example, when nurse 108a authenticates herself and logs into HMD device 108b, HMD device 108b retrieves access permissions of nurse 108a from data storage unit 141, and if nurse 108a instructs HMD device 108b to display information related to directing a surgeon to make certain incision points on the body of patient 150, HMD device 108b, due to nurse 108a lacking permissions to view that information, provides an alert instead to nurse 108a indicating a lack of permission to view the requested information.
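The credential check and permission gate in the nurse 108a example can be sketched as below. The repository structure, credential format, and permission names are all assumptions made for illustration.

```python
# Hypothetical central repository of credentials and access permissions.
CREDENTIALS = {"nurse-108a": "fingerprint-hash-1"}
PERMISSIONS = {"nurse-108a": {"view_vitals"}}

def authenticate(user, credential):
    """Match the submitted credential against the central repository."""
    return CREDENTIALS.get(user) == credential

def display_or_alert(user, required_permission):
    """Display the information if permitted, otherwise return an alert."""
    if required_permission in PERMISSIONS.get(user, set()):
        return "display"
    return "alert: lacking permission to view the requested information"

logged_in = authenticate("nurse-108a", "fingerprint-hash-1")
result = display_or_alert("nurse-108a", "view_incision_points")  # alert
```

Locking out whole functionalities would follow the same pattern, checking the permission set before enabling each feature.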
  • FIG. 3 and FIG. 4 depict exemplary embodiments of hologram images displayed on a surface in a real world environment.
  • various components described in FIG. 1A, FIG. 1B and FIG. 2 will be used.
  • patient information of patient 150 is stored in data storage unit 141 including name, age, medical conditions, allergies, co-morbidities, previous medical procedures, previous surgeries, state of each of the diseases afflicting the patient and the medical procedure that will be performed on patient 150. Additionally, information related to vital signs of patient 150 may be measured by one or more sensors capable of measuring vital signs of a person and the information may be transmitted to networked computer system 140 or to HMD devices, such as HMD device 107b of surgeon 107a. The vital signs information of patient 150 may be presented to a user of system 100.
  • the vital signs information of patient 150 may be presented to the user in the form of a hologram image, such as hologram image 301.
  • a user such as surgeon 107a, may provide an instruction to HMD device 107b to present the information and HMD device 107b may be configured to project the vital signs information as a holographic image.
  • HMD devices, such as HMD device 107b, may be configured to use cameras 201a, 201b, 201c, 201d to determine the projection location of the holographic image 301 of the vital signs information utilizing their depth resolution capabilities.
  • HMD devices may be configured to continuously update the vital signs information in projected holographic image 301.
  • a new image comprising updated vital signs information of patient 150 may be generated, and holographic image 301 may be replaced with the new image.
  • Holographic image 301 and new images replacing holographic image 301 may be projected until the user, surgeon 107a, transmits an instruction to terminate the projection of holographic image 301 and the new images.
  • HMD device 107b may also project patient information as holographic images such as holographic image 401.
  • Holographic image 401 displays information of patient 150 including name, age, medical conditions, allergies and the medical procedure being performed on patient 150.
  • HMD device 107b may be configured to determine patient information for holographic image 401 by transmitting a request for the information of the patient to data storage unit 141.
  • HMD device 107b may also project holographic images that display information to direct a user through a medical procedure.
  • holographic image 402 displays to the user the incision points 403a, 403b, 403c, 403d for inserting laparoscopic ports in patient 150.
  • Holographic image 402 also displays information that identify the incision points 403a, 403b, 403c, 403d by displaying label 404.
  • Label 404 identifies the incision points 403a, 403b, 403c, 403d as port positions.
  • FIG. 5 depicts an exemplary embodiment of displaying information of medical devices being used in a medical procedure. For the purpose of illustrating a clear example, various components described in FIG. 1A, FIG. 1B and FIG. 2 will be used.
  • one or more medical devices may be identified and registered by HMD devices 107b, 108b, 109b, 110b and image capturing devices 101, 102, 103, 104.
  • Information identifying the one or more medical devices may be presented to the user.
  • medical devices 501a, 501b, 501c, 501d are identified and registered.
  • An HMD device, such as HMD device 107b, may display information identifying medical devices 501a, 501b, 501c, 501d as holographic images 502a, 502b, 502c, 502d.
  • Holographic images 502a, 502b, 502c, 502d display a geometric property and an attribute of medical devices 501a, 501b, 501c, 501d, respectively.
  • Holographic images 502a, 502b, 502c, 502d are projected at certain locations based on the locations of the medical devices.
  • the location of each of the medical devices is tracked and the location of each medical device is transmitted to HMD devices, such as HMD device 107b, and the projection locations of holographic images, such as 502a, 502b, 502c, 502d, are based on the received locations of the medical devices.
  • HMD devices, such as HMD device 107b, may be configured to determine spatial properties of medical devices 501a, 501b, 501c, 501d, such as the size of the portion of the medical device protruding from patient 150, the angle at which the medical device enters patient 150, and the orientation of the medical device with respect to patient 150.
  • HMD device 107b may be configured to determine spatial properties of medical devices 501a, 501b, 501c, 501d based on the 3-dimensional geometrical information stored in data storage unit 141.
  • HMD devices including HMD device 107b, may determine the location of distal ends of medical devices 501a, 501b, 501c, 501d.
  • the location of distal ends of medical devices 501a, 501b, 501c, 501d are located within the body of patient 150.
  • HMD device 107b may determine the surgical site within the body of patient 150 based on the location of distal ends of medical devices 501a, 501b, 501c, 501d.
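One simple way to derive a single surgical-site location from several distal-end positions is to take their centroid. This is an assumption for illustration; the disclosure does not specify how the site is computed from the distal ends.

```python
# Hypothetical sketch: surgical site approximated as the centroid of the
# distal-end coordinates (x, y, z) of the registered medical devices.
def surgical_site(distal_ends):
    """Centroid of the distal-end coordinates."""
    n = len(distal_ends)
    return tuple(sum(p[i] for p in distal_ends) / n for i in range(3))

# Four laparoscopic instruments converging on one region of the body.
distal_ends = [(10, 20, 5), (14, 20, 5), (10, 24, 5), (14, 24, 5)]
site = surgical_site(distal_ends)  # (12.0, 22.0, 5.0)
```

The resulting point could then anchor the projection location of a hologram on the body surface above the site.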
  • FIG. 6 depicts an exemplary embodiment of a holographic image depicting the surgical site of a medical procedure within the body of a patient.
  • various components described in FIG. 1A, FIG. 1B, FIG. 2, FIG. 5 will be used.
  • HMD devices such as HMD device 107b, may determine a surgical site of a medical procedure within the body of a patient.
  • the surgical site may be based on the location of the distal ends of medical devices identified and registered by HMD devices.
  • HMD device 107b determines the location of distal ends of medical devices 501a, 501b, 501c, 501d based on the spatial properties of medical devices 501a, 501b, 501c, 501d.
  • spatial properties of medical devices include the portion of medical device protruding from patient and the angle and orientation of the medical device protruding from patient.
  • the location of the distal ends of medical devices 501a, 501b, 501c, 501d are located within the body of patient 150.
  • HMD device 107b determines the surgical site within the body of patient 150 based on the location of distal ends of medical devices 501a, 501b, 501c, 501d.
  • An image of the surgical site within the body of a patient may be captured using a medical device configured to capture images, in this example a laparoscope.
  • the medical device may be configured to continuously capture images and transmit the captured images to HMD devices, such as HMD device 107b, or to a networked computer system configured to receive images from a medical device, such as networked computer system 140.
  • Networked computer system 140 in response to receiving images from a medical device, may be configured to transmit the images to HMD devices.
  • a user of the HMD devices may provide an instruction to the HMD device to display the image from the laparoscope to the user.
  • HMD devices may be configured to project the image from the medical device as a hologram at a location on the surface of the body of the patient.
  • the location on the surface of the body of the patient corresponds to the location of the surgical site such that the location on the surface of the body is anatomically at the same location as the surgical site within the body of the patient.
  • medical device 120 is a laparoscope and a user, such as surgeon 107a, navigates the laparoscope to the surgical site.
  • the laparoscope captures images at the surgical site and transmits the images to HMD device 107b.
  • the laparoscope may transmit the images to networked computer system 140 and in response to receiving images from a medical device, networked computer system 140, may be configured to transmit the images to HMD device 107b.
  • Surgeon 107a provides an instruction to HMD device 107b to display the image from medical device 120 to surgeon 107a.
  • the image from medical device 120 is displayed as the holographic image 601.
  • Holographic image 601 comprises anatomical structures present at the surgical site within the body of patient 150 and also comprises the distal ends 602a, 602b, 602c, 602d of medical devices 501a, 501b, 501c, 501d, respectively.
  • the surgical site, based on the distal ends 602a, 602b, 602c, 602d, is at the large intestine area of patient 150 and the holographic image 601 is projected on the surface of the body of patient 150 at the location that corresponds to the large intestine area, the surgical site.
  • FIGs. 7A, 7B, 7C, 7D, 7E, and 7F depict exemplary embodiments of a zoomed-in view of holographic images projected at a surgical site within the body of a patient.
  • FIG. 1A, FIG. 1B, FIG. 2, FIG. 5, FIG. 6 will be used.
  • Pre-procedure or pre-surgical imaging may be performed to determine location of organs within the patient.
  • a user may identify an optimum zone for cutting at an organ within the body of the patient.
  • the optimum zone may be associated with the patient and stored in a data storage unit comprising data associated with the medical procedure being performed on the patient.
  • the user may instruct the HMD device of the user to display the information related to the optimum cutting location.
  • HMD devices may be configured to determine the distance of the medical device from the optimum cutting location and present the distance to the user.
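The distance readout described above can be sketched as the straight-line distance between the device tip and the stored optimum cutting location. The coordinate frame, units, and point representation are assumptions for illustration.

```python
# Hypothetical distance computation behind a distance indicator such as
# indicator 702: Euclidean distance between two 3-D points.
def distance_to_target(tip, target):
    """Straight-line distance from the device tip to the target point."""
    return sum((a - b) ** 2 for a, b in zip(tip, target)) ** 0.5

optimum = (0.0, 0.0, 0.0)  # stored optimum cutting location
d_far = distance_to_target((3.0, 4.0, 0.0), optimum)   # 5.0 units away
d_near = distance_to_target((0.3, 0.4, 0.0), optimum)  # much closer
```

Recomputing this value as the tracked tip position updates would make the displayed distance grow or shrink as the device moves away from or toward the optimum location.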
  • pre-procedure and intra-procedure imaging and electromagnetic navigation systems for identifying, registering and navigating to a desired location within the body may be incorporated into the system 100 without departing from the scope of the present disclosure. Examples of such systems may be found in commonly assigned U.S. Patent Application No. 13/838,805 entitled Pathway Planning System and Method, filed March 15, 2013; U.S. Patent Application Serial No. 14/753,288 entitled System and Method of Navigation within the Lung, filed July 10, 2015; and U.S.
  • Holographic image 700a of FIG. 7A includes a surgical site within the body of patient 150, medical device 701, and a distance indicator 702 that comprises distance 702b and a label 702a describing the distance displayed in 702b.
  • Medical device 701 may comprise one or more sensors that are configured to detect various markers surrounding an organ, various attributes of anatomical structures of patient 150, such as the organ including thickness of the tissue of the organ or other surrounding tissue, various attributes of the medical device such as battery power remaining, number of firings remaining, etc. Medical device 701 may also be configured to transmit signals or information captured by the one or more sensors to networked computer system 140 or HMD devices, such as HMD device 107b, 108b, 109b, 110b.
  • medical device 701 is a stapler and distance 702b indicates the distance from a certain end of an organ where the optimum cutting location of the organ has been determined and at which the stapler 701 should cut organ 710.
  • distance 702b increases or decreases, based on whether medical device 701 is moved closer to or away from the optimum cutting location.
  • This optimum cutting location may be identified in pre-procedure images, or in real time using laparoscopic images and a calculated distance from some fixed point which might be visible in the images, for example detection of the location at which the small and large intestines connect.
  • the networked computer system 140 may store in memory a minimum distance from this identified location for a particular procedure or other information that can be used to provide this optimum distance to the HMD for display to the surgeon.
  • FIG. 7B depicts a holographic image 700b, where surgeon 107a moves medical device 701 away from the optimum cutting location and distance 702b displays the change in the distance information to the user, surgeon 107a.
  • FIG. 7C depicts holographic image 700c, where surgeon 107a moves medical device 701 towards the optimum cutting location and reaches the optimum cutting location. As surgeon 107a moves medical device 701 towards the optimum cutting location, distance 702b indicates that the distance to the optimum cutting location is decreasing and once medical device 701 reaches the optimum cutting location, label 704 is displayed indicating that medical device 701 has reached the optimum cutting location.
  • FIG. 7D depicts a holographic image 700d, where surgeon 107a clamps down with the medical device 701 on organ 710 at the optimum cutting location.
  • Medical device 701 may be configured to detect certain properties of the tissue of an organ or of the medical device 701 and transmit information related to the properties of the tissue of the organ or the medical device 701 to HMD devices, such as HMD device 107b. HMD devices may be configured to provide the information related to the properties of the tissue and the medical device to the user.
  • holographic image 700d presents information related to the thickness of the tissue of organ 710 and the operation of the medical device 701 as shown in view 705 to a user, such as surgeon 107a.
  • Surgeon 107a may begin cutting at the optimum cutting location by activating the cutting process of medical device 701.
  • Medical device 701 may transmit the activation of the cutting process to HMD devices, such as HMD device 107b, and HMD devices may be configured to display the activation and progression of the cutting process as part of the hologram being presented to the user, surgeon 107a.
  • FIG. 7E depicts holographic image 700e that includes graphic 706, which displays the activation and progression of the cutting process to the user, surgeon 107a.
  • FIG. 7F depicts holographic image 700f that includes updated information to graphic 706 showing the completion of the cutting process, the locations where stapling has occurred, as well as other information such as the identification of the type of staple cartridge or end effector.
  • Medical device 701 may transmit a completion signal once the cutting and stapling process is complete. Medical device 701 may transmit the completion signal to HMD devices, such as HMD device 107b. HMD devices may be configured to provide an alert to the user indicating medical device 701 has completed its process. The alert signal may be presented or displayed to the user as a graphic in a hologram image.
  • FIG. 8A and FIG. 8B depict an exemplary embodiment of a holographic image that allows a user to track steps of a medical procedure.
  • For the purpose of illustrating a clear example, various components described in FIG. 1, FIG. 2 will be used.
  • steps associated with a medical procedure may be stored in data storage unit 141.
  • networked computer system 140 may combine received images or video feed into a single video and track the various medical devices being used in the medical procedure.
  • Networked computer system 140 may be configured to download the list of steps associated with the medical procedure being performed along with the required medical device and supplies associated with each of the steps.
  • Networked computer system 140 may be configured to track the medical devices throughout the medical procedure and determine the time at which usage of a medical device that is being tracked has begun and the time at which usage of the medical device has stopped.
  • Networked computer system 140 may determine the completion or progression status of each of the steps based on the time at which the usage of the medical device has begun and/or the time at which the usage of a medical device has stopped, or based on the surgeon's voice commands, which may include requests for the next medical device in the expected series of needs or by other methods of detection.
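The usage-time-based status determination described above can be sketched as a small state function. The status labels and the treatment of missing times are assumptions; voice-command-based detection is omitted from this sketch.

```python
# Hypothetical inference of a step's status from tracked device usage
# times: pending before use begins, in progress during, complete after.
def step_status(usage_start, usage_stop, now):
    """Derive step status from when device usage began and stopped."""
    if usage_start is None or now < usage_start:
        return "pending"
    if usage_stop is None or now < usage_stop:
        return "in progress"
    return "complete"

s_before = step_status(usage_start=100, usage_stop=130, now=90)
s_during = step_status(usage_start=100, usage_stop=None, now=110)
s_after = step_status(usage_start=100, usage_stop=130, now=140)
```

Networked computer system 140 could evaluate this per tracked step and forward changed statuses to the HMD devices.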
  • networked computer system 140 may also be configured to determine or predict a step or sequence of steps for a medical procedure using various machine learning or predictive analytic techniques or technologies on data related to the one or more steps, including the one or more parallel steps, associated with the medical procedure. As described above, networked computer system 140 may also incorporate one or more factors of the patient and preferences of the users while applying machine learning or predictive analytic techniques. Networked computer system 140 may also be configured to transmit or push out each predicted step or series of steps as they are predicted or resolved or determined by networked computer system 140.
  • HMD devices, such as HMD device 107b, may be configured to transmit a request to networked computer system 140 for the list of steps associated with the medical procedure being performed, along with the medical devices required for each of the steps and the completion or progression status of each of the steps.
  • Networked computer system 140 in response to receiving the data request from an HMD device, such as HMD device 107b, may transmit the data to HMD device 107b.
  • networked computer system 140 may continue to transmit the data to HMD device 107b periodically or in a "push" form, where every time networked computer system 140 updates any data related to the list of steps, the medical devices, or the completion or progression status of each of the steps, the updated data is transmitted to HMD device 107b.
  • HMD device 107b presents the downloaded data to the user, surgeon 107a, in a holographic image, such as holographic image 800a.
  • Holographic image 800a displays a column of steps 801a associated with a medical procedure being performed by the user, surgeon 107a, a column of required medical devices 801b for each step in column 801a, and a column indicating status 801c of each step in column 801a.
  • HMD device 107b may be configured to indicate or highlight the current step of the procedure that is being performed.
  • Highlight box 802 depicts an embodiment of highlighting the current step of the medical procedure, by highlighting step 803a.
  • HMD device 107b may also indicate in the status column 801c that step 803a is the current step with an appropriate status indicator such as status indicator 803c.
  • HMD device 107b may be configured to update holographic image 800a to incorporate any changes to any of the information being displayed in hologram 800a. As described above, HMD device 107b may periodically receive data presented in hologram 800a from networked computer system 140. HMD device 107b may determine whether a change to the information currently being displayed in hologram 800a occurred by comparing the newly received data with the previously received data. If networked computer system 140 is configured to push data every time it updates data, then HMD device 107b may be configured to update holographic image 800a without determining, at HMD device 107b, whether a change to the information currently being displayed in hologram 800a occurred.
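For the periodic (non-push) case, the change check reduces to comparing the newly received data against what was received before. The data shape below is an assumption for illustration.

```python
# Hypothetical change detection before refreshing a hologram such as
# 800a: refresh only when the newly received data differs.
def needs_update(previous, latest):
    """True when the latest data differs from the previously received data."""
    return previous != latest

prev = {"step": "803a", "status": "current"}
new = {"step": "803a", "status": "complete"}
refresh = needs_update(prev, new)  # status changed, so refresh
```

In the push configuration, the HMD device would skip this comparison and refresh unconditionally on receipt.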
  • FIG. 8B depicts an update to hologram 800a, where step 803a is marked as complete as indicated by status indicator 803c, along with a timestamp indicating the time when step 803a was completed.
  • Highlight box 802 highlights step 804a, the next step after step 803a.
  • HMD device 107b also updates the status indicator 804c of step 804a to indicate that it is the current step of the procedure being performed.
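The step column, device column, and status column of hologram 800a, together with the FIG. 8B behavior of timestamping a completed step and advancing the highlight to the next step, can be sketched as follows. This is a minimal illustration only; the `ProcedureStep` and `StepTracker` names, the sample steps, and the status strings are assumptions for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ProcedureStep:
    name: str                      # column 801a: the step description
    devices: list[str]             # column 801b: required medical devices
    status: str = "pending"        # column 801c: pending / current / complete
    completed_at: Optional[datetime] = None

class StepTracker:
    def __init__(self, steps: list[ProcedureStep]):
        self.steps = steps
        if steps:
            steps[0].status = "current"   # highlight box 802 starts on the first step

    def current_step(self) -> Optional[ProcedureStep]:
        return next((s for s in self.steps if s.status == "current"), None)

    def complete_current(self) -> None:
        """Mark the current step complete with a timestamp (as in FIG. 8B)
        and advance the highlight to the next step."""
        for i, step in enumerate(self.steps):
            if step.status == "current":
                step.status = "complete"
                step.completed_at = datetime.now()
                if i + 1 < len(self.steps):
                    self.steps[i + 1].status = "current"
                return

tracker = StepTracker([
    ProcedureStep("Make incision", ["scalpel"]),
    ProcedureStep("Insert trocar", ["trocar"]),
])
tracker.complete_current()
print(tracker.current_step().name)   # the highlight has moved to "Insert trocar"
```

In this sketch, completing a step is the only event that moves the highlight, mirroring how status indicator 804c becomes "current" only after step 803a is marked complete.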
  • FIG. 9 depicts an exemplary embodiment of a hologram presenting inventory of medical equipment used during a medical procedure.
  • various components described in FIG. 1A, FIG. 1B, and FIG. 2 will be used.
  • networked computer system 140 may track each of the medical supplies and medical devices throughout the medical procedure in the single video generated by networked computer system 140.
  • networked computer system 140 may also determine various utilization statistics, including whether a unit of a medical supply or device has been opened, turned on, used, in-use, turned off, or disposed of. In some embodiments, networked computer system 140 may determine whether a medical supply has been opened based on whether the supply is in the hands of one of the users.
  • networked computer system 140 may determine whether a medical supply has been disposed of based on whether the medical supply has been tracked to a location known, by the tracking techniques utilized by networked computer system 140, to be a trash or garbage receptacle.
  • HMD devices may be configured to present utilization statistics of medical supplies and devices during the procedure to the users.
  • HMD device 108b may be configured to request and receive from networked computer system 140 the utilization statistics of medical supplies and devices during the procedure.
  • HMD device 108b may present the utilization statistics as hologram 900.
  • Hologram 900 presents to the user, nurse 108a, a list of medical supplies and devices 901a. For each medical supply or device, hologram 900 indicates whether the medical supply or device has been opened or turned-on 901b, in-use 901c, or disposed or turned-off 901d.
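The utilization-status inference described above (a supply treated as opened once tracked to a user's hands, and as disposed once tracked to a location known to be trash) and the columns of hologram 900 can be sketched as follows. The location tags, the `Supply` class, and the table layout are illustrative assumptions, not part of the disclosure.

```python
# Illustrative location tags; a real system would derive these from the
# tracking techniques of networked computer system 140.
TRASH_LOCATIONS = {"trash_bin_1"}
HAND_LOCATIONS = {"hands_107a", "hands_108a"}

class Supply:
    def __init__(self, name: str):
        self.name = name
        self.opened = False
        self.in_use = False
        self.disposed = False

    def track_to(self, location: str) -> None:
        # Tracked to a user's hands -> treated as opened and in use.
        if location in HAND_LOCATIONS:
            self.opened = True
            self.in_use = True
        # Tracked to a known trash location -> treated as disposed of.
        elif location in TRASH_LOCATIONS:
            self.in_use = False
            self.disposed = True

def utilization_table(supplies):
    """Rows for hologram 900: name (901a), opened/turned-on (901b),
    in-use (901c), disposed/turned-off (901d)."""
    return [(s.name, s.opened, s.in_use, s.disposed) for s in supplies]

gauze = Supply("gauze pack")
gauze.track_to("hands_108a")   # picked up by nurse 108a -> opened, in use
gauze.track_to("trash_bin_1")  # tracked to trash -> disposed of
print(utilization_table([gauze]))  # [('gauze pack', True, False, True)]
```

The table rows map one-to-one onto the four columns of hologram 900, so an HMD device could render the hologram directly from this structure.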
  • FIG. 10A and FIG. 10B depict exemplary embodiments of reviewing a previously performed medical procedure.
  • various components described in FIG. 1A, FIG. 1B, and FIG. 2 will be used.
  • networked computer system 140 may generate a single video file by aligning and combining one or more images or streams of images, such as video, from image capturing apparatuses 101, 102, 103, and 104, HMD devices 107b, 108b, 109b, and 110b, and medical device 120, which is equipped with an image capturing apparatus.
  • Networked computer system 140 may be configured to store the generated single video file in a data storage unit, such as data storage unit 141.
  • a user, such as surgeon 107a, may access and review the medical procedure on a computing system, such as desktop computer 1001 in FIG. 10A. Surgeon 107a may also issue an instruction to HMD device 107b to project the stored single video file as a hologram.
  • HMD device 107b may transmit a data request to data storage unit 141 for the single video file of the procedure and receive from data storage unit 141 the single video file.
  • HMD device 107b may present the single video file to the user 107a.
  • HMD device 107b may present the single video file as a holographic image such as hologram 1000 in FIG. 10B.
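The request-and-receive flow for the stored single video file can be sketched as below, assuming a simple in-memory key-value store standing in for data storage unit 141. The `VideoStore` and `HMDClient` names, and the procedure identifier, are illustrative assumptions.

```python
class VideoStore:
    """Stand-in for data storage unit 141."""
    def __init__(self):
        self._files: dict[str, bytes] = {}

    def save(self, procedure_id: str, video_bytes: bytes) -> None:
        # Store the generated single video file under a procedure identifier.
        self._files[procedure_id] = video_bytes

    def fetch(self, procedure_id: str) -> bytes:
        # Return the stored single video file for the requested procedure.
        return self._files[procedure_id]

class HMDClient:
    """Stand-in for HMD device 107b when reviewing a past procedure."""
    def __init__(self, store: VideoStore):
        self.store = store

    def review_procedure(self, procedure_id: str) -> bytes:
        # Transmit a data request for the single video file and receive it;
        # a real device would then render it, e.g., as hologram 1000.
        return self.store.fetch(procedure_id)

store = VideoStore()
store.save("proc-2017-0302", b"<combined-video-stream>")
hmd = HMDClient(store)
video = hmd.review_procedure("proc-2017-0302")
print(len(video) > 0)  # True
```

The same store could equally serve desktop computer 1001; only the client rendering differs between the FIG. 10A and FIG. 10B review paths.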
  • a phrase in the form “A or B” means “(A), (B), or (A and B).”
  • a phrase in the form “at least one of A, B, or C” means “(A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C) "
  • the term “clinician” may refer to a clinician or any medical professional, such as a doctor, nurse, technician, medical assistant, or the like, performing a medical procedure.
  • the systems described herein may also utilize one or more controllers to receive various information and transform the received information to generate an output.
  • the controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in a memory.
  • the controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, programmable logic device (PLD), field programmable gate array (FPGA), or the like.
  • the controller may also include a memory to store data and/or instructions that, when executed by the one or more processors, causes the one or more processors to perform one or more methods and/or algorithms.
  • any of the herein described methods, programs, algorithms or codes may be converted to, or expressed in, a programming language or computer program.
  • the terms "programming language" and "computer program," as used herein, each include any language used to specify instructions to a computer, and include (but are not limited to) the following languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, machine code, operating system command languages, Pascal, Perl, PL1, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, fifth, or further generation computer languages. Also included are database and other data schemas, and any other meta-languages.
  • any of the herein described methods, programs, algorithms or codes may be contained on one or more machine-readable media or memory.
  • the term "memory" may include a mechanism that provides (e.g., stores and/or transmits) information in a form readable by a machine such a processor, computer, or a digital processing device.
  • a memory may include a read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or any other volatile or non-volatile memory storage device.
  • Code or instructions contained thereon can be represented by carrier wave signals, infrared signals, digital signals, and by other like signals.
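The two update models described earlier for hologram 800a (periodic pull, where the HMD compares newly received data against the previous snapshot before re-rendering, versus server-side push, where the HMD re-renders unconditionally because the server only transmits on change) can be sketched as follows. The `HologramUpdater` name and the dictionary payloads are illustrative assumptions.

```python
class HologramUpdater:
    """Sketch of an HMD deciding whether to redraw its hologram."""
    def __init__(self, push_mode: bool):
        self.push_mode = push_mode
        self._last_data = None
        self.render_count = 0

    def receive(self, data) -> None:
        # In push mode the server only sends on change, so re-render
        # unconditionally; in pull mode, compare against the last snapshot.
        if self.push_mode or data != self._last_data:
            self._last_data = data
            self.render_count += 1   # stand-in for redrawing the hologram

pull = HologramUpdater(push_mode=False)
pull.receive({"step": 1})
pull.receive({"step": 1})   # unchanged -> no re-render in pull mode
pull.receive({"step": 2})
print(pull.render_count)    # 2
```

The comparison step is what the pull-mode HMD performs itself and what push mode allows it to skip, at the cost of trusting the server to transmit only on change.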

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Endoscopes (AREA)

Abstract

Methods and systems for registering anatomical images are disclosed. One or more images are captured. A surgical device is identified in the one or more images. One or more geometrical parameters of the surgical device are determined. Based on the one or more geometrical parameters, the location of a distal end of the identified surgical device is determined. Based on the location of the distal end of the identified surgical device, the location of surgical activity within the patient's body is determined. Based on the location of surgical activity within the patient's body, a corresponding surface location is determined. Based on an image from a laparoscope at the location of surgical activity within the patient's body, a holographic image is displayed at the corresponding surface location.
PCT/US2017/020425 2016-03-04 2017-03-02 Methods and systems for registering anatomical images WO2017151904A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662303895P 2016-03-04 2016-03-04
US62/303,895 2016-03-04

Publications (1)

Publication Number Publication Date
WO2017151904A1 (fr) 2017-09-08

Family

ID=59744392

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/020425 WO2017151904A1 (fr) 2016-03-04 2017-03-02 Methods and systems for registering anatomical images

Country Status (1)

Country Link
WO (1) WO2017151904A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3510963A1 (fr) * 2018-01-11 2019-07-17 Covidien LP Systems and methods for laparoscopic planning and navigation
CN111655184A (zh) * 2018-01-10 2020-09-11 Covidien LP Guidance for placement of surgical ports
CN113077662A (zh) * 2021-04-03 2021-07-06 刘铠瑞 Laparoscopic surgery and training system based on 5G network technology
IT202000015322A1 (it) * 2020-06-25 2021-12-25 Io Surgical Res S R L Apparatus for detecting and tracking the posture and/or deformation of a body organ
US11370113B2 (en) * 2016-09-06 2022-06-28 Verily Life Sciences Llc Systems and methods for prevention of surgical mistakes
WO2023049427A1 (fr) * 2021-09-27 2023-03-30 Becton, Dickinson And Company Vascular access management system and method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080130965A1 (en) * 2004-11-23 2008-06-05 Avinash Gopal B Method and apparatus for parameter assisted image-guided surgery (PAIGS)
US20130038707A1 (en) * 2011-08-09 2013-02-14 Tyco Healthcare Group Lp Apparatus and Method for Using Augmented Reality Vision System in Surgical Procedures
US20130267838A1 (en) * 2012-04-09 2013-10-10 Board Of Regents, The University Of Texas System Augmented Reality System for Use in Medical Procedures
WO2015164402A1 (fr) * 2014-04-22 2015-10-29 Surgerati, Llc Intraoperative medical image visualization system and method
US20150366628A1 (en) * 2014-06-18 2015-12-24 Covidien Lp Augmented surgical reality environment system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080130965A1 (en) * 2004-11-23 2008-06-05 Avinash Gopal B Method and apparatus for parameter assisted image-guided surgery (PAIGS)
US20130038707A1 (en) * 2011-08-09 2013-02-14 Tyco Healthcare Group Lp Apparatus and Method for Using Augmented Reality Vision System in Surgical Procedures
US20130267838A1 (en) * 2012-04-09 2013-10-10 Board Of Regents, The University Of Texas System Augmented Reality System for Use in Medical Procedures
WO2015164402A1 (fr) * 2014-04-22 2015-10-29 Surgerati, Llc Intraoperative medical image visualization system and method
US20150366628A1 (en) * 2014-06-18 2015-12-24 Covidien Lp Augmented surgical reality environment system

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11370113B2 (en) * 2016-09-06 2022-06-28 Verily Life Sciences Llc Systems and methods for prevention of surgical mistakes
CN111655184A (zh) * 2018-01-10 2020-09-11 Covidien LP Guidance for placement of surgical ports
JP2021510110A (ja) * 2018-01-10 2021-04-15 Covidien LP Guidance for placement of surgical ports
EP3737322A4 (fr) * 2018-01-10 2021-09-01 Covidien LP Guidance for placement of surgical ports
US11806085B2 (en) 2018-01-10 2023-11-07 Covidien Lp Guidance for placement of surgical ports
CN111655184B (zh) * 2018-01-10 2024-01-02 Covidien LP Guidance for placement of surgical ports
EP3510963A1 (fr) * 2018-01-11 2019-07-17 Covidien LP Systems and methods for laparoscopic planning and navigation
US11547481B2 (en) 2018-01-11 2023-01-10 Covidien Lp Systems and methods for laparoscopic planning and navigation
IT202000015322A1 (it) * 2020-06-25 2021-12-25 Io Surgical Res S R L Apparatus for detecting and tracking the posture and/or deformation of a body organ
WO2021259537A1 (fr) * 2020-06-25 2021-12-30 Io Surgical Research S.R.L. Apparatus for detecting and tracking the posture and/or deformation of a body organ
CN113077662A (zh) * 2021-04-03 2021-07-06 刘铠瑞 Laparoscopic surgery and training system based on 5G network technology
WO2023049427A1 (fr) * 2021-09-27 2023-03-30 Becton, Dickinson And Company Vascular access management system and method

Similar Documents

Publication Publication Date Title
US20220334787A1 (en) Customization of overlaid data and configuration
US20210157403A1 (en) Operating room and surgical site awareness
US11737841B2 (en) Configuring surgical system with surgical procedures atlas
WO2017151904A1 (fr) Methods and systems for registering anatomical images
CN109996508B (zh) 带有基于患者健康记录的器械控制的远程操作手术系统
EP3138526B1 (fr) Système d'environnement de réalité chirurgicale augmentée
US11883022B2 (en) Shared situational awareness of the device actuator activity to prioritize certain aspects of displayed information
US20220108788A1 (en) Reconfiguration of display sharing
US11877897B2 (en) Situational awareness of instruments location and individualization of users to control displays
US20220104694A1 (en) Control of a display outside the sterile field from a device within the sterile field
WO2022070066A1 (fr) Monitoring a user's eye movement to control the display system that displays the primary information
US20210205027A1 (en) Context-awareness systems and methods for a computer-assisted surgical system
KR20190080706A (ko) Surgical assistance image display method, program, and surgical assistance image display apparatus
CN112262437A (zh) Information processing system, information processing apparatus, and information processing method
US20230157757A1 (en) Extended Intelligence for Pulmonary Procedures
KR101864411B1 (ko) Surgical assistance image display method and program
US20210076942A1 (en) Infrared thermography for intraoperative functional mapping
JP2024514640A (ja) Mixing directly visualized with rendered elements to display blended elements and actions happening on-screen and off-screen
WO2022219501A1 (fr) System comprising a camera array deployable out of a channel of a tissue-penetrating surgical device
CN118160044A (zh) Customization of overlaid data and configuration

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17760805

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17760805

Country of ref document: EP

Kind code of ref document: A1