WO2023141800A1 - Mobile x-ray positioning system

Mobile x-ray positioning system

Info

Publication number
WO2023141800A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
coordinates
image
processor
interest
Prior art date
Application number
PCT/CN2022/073939
Other languages
French (fr)
Inventor
Pengfei Cai
Original Assignee
Warsaw Orthopedic, Inc.
Priority date
Filing date
Publication date
Application filed by Warsaw Orthopedic, Inc. filed Critical Warsaw Orthopedic, Inc.
Priority to PCT/CN2022/073939 priority Critical patent/WO2023141800A1/en
Publication of WO2023141800A1 publication Critical patent/WO2023141800A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/44: Constructional features of apparatus for radiation diagnosis
    • A61B 6/4405: The apparatus being movable or portable, e.g. handheld or mounted on a trolley
    • A61B 6/4429: Constructional features related to the mounting of source units and detector units
    • A61B 6/4435: The source unit and the detector unit being coupled by a rigid structure
    • A61B 6/4441: The rigid structure being a C-arm or U-arm
    • A61B 6/54: Control of apparatus or devices for radiation diagnosis
    • A61B 6/547: Control involving tracking of position of the device or parts of the device
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2051: Electromagnetic tracking systems
    • A61B 2034/2055: Optical tracking systems
    • A61B 34/25: User interfaces for surgical systems
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2090/376: Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy

Definitions

  • the present disclosure is generally directed to surgical systems and relates more particularly to imaging devices for the surgical systems.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure or may complete one or more surgical procedures autonomously.
  • Imaging may be used by a medical provider for diagnostic, operational, and/or therapeutic purposes.
  • Providing controllable linked articulating members allows a surgical robot to reach areas of a patient anatomy during various medical procedures (e.g., using the imaging) .
  • Example aspects of the present disclosure include:
  • a robotic surgical imaging system comprising: a first imaging device; a second imaging device; a processor coupled with the first imaging device and the second imaging device; and a memory coupled with and readable by the processor and storing therein instructions that, when executed by the processor, cause the processor to: capture a first image of a target environment using the first imaging device, wherein the first image comprises an object included in the target environment; select coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object; generate a set of real-world coordinates corresponding to the portion of the object based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates; position the second imaging device into a first location based at least in part on the set of real-world coordinates; and display a second image captured using the second imaging device, the second image comprising at least the portion of the object.
  • the instructions to select the coordinates of interest included in the first image cause the processor to: select a target line that passes through at least the portion of the object, wherein the target line comprises the coordinates of interest.
  • the instructions further cause the processor to: capture a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location; and verify the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof.
  • the instructions further cause the processor to: position the second imaging device into a second location based at least in part on determining the second imaging device is not located at the coordinates of interest from the verifying.
  • the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device into the second location.
  • the instructions to generate the set of real-world coordinates corresponding to the portion of the object cause the processor to: calculate a distance to move the second imaging device into the first location to capture the second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial location of the second imaging device.
  • the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device, wherein the guiding information comprises the calculated distance.
  • the guiding information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
  • the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
  • a system comprising: a processor; and a memory coupled with and readable by the processor and storing therein instructions that, when executed by the processor, cause the processor to: capture a first image of a target environment using a first imaging device, wherein the first image comprises an object included in the target environment; select coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object; generate a set of real-world coordinates corresponding to the portion of the object based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates; position a second imaging device into a first location based at least in part on the set of real-world coordinates; and display a second image captured using the second imaging device, the second image comprising at least the portion of the object.
  • the instructions to select the coordinates of interest included in the first image cause the processor to: select a target line that passes through at least the portion of the object, wherein the target line comprises the coordinates of interest.
  • the instructions further cause the processor to: capture a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location; and verify the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof.
  • the instructions further cause the processor to: position the second imaging device into a second location based at least in part on determining the second imaging device is not located at the coordinates of interest from the verifying.
  • the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device into the second location.
  • the instructions to generate the set of real-world coordinates corresponding to the portion of the object cause the processor to: calculate a distance to move the second imaging device into the first location to capture the second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial location of the second imaging device.
  • the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device, wherein the guiding information comprises the calculated distance.
  • the guiding information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
  • the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
  • a method comprising: capturing a first image of a target environment using a first imaging device, wherein the first image comprises an object included in the target environment; generating a set of real-world coordinates corresponding to coordinates of interest included in the first image based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates, wherein the coordinates of interest indicate at least a portion of the object; and displaying a second image captured using a second imaging device, the second image comprising at least the portion of the object, wherein the second imaging device is positioned into a first location based at least in part on the set of real-world coordinates.
  • the coordinates of interest comprise a target line that passes through at least the portion of the object.
  • each of the expressions “at least one of A, B and C” , “at least one of A, B, or C” , “one or more of A, B, and C” , “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo
  • the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2) as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo) .
  • Fig. 1 is a block diagram of a system according to at least one embodiment of the present disclosure.
  • Fig. 2 is an imaging system diagram according to at least one embodiment of the present disclosure.
  • Fig. 3 is an additional imaging system diagram according to at least one embodiment of the present disclosure.
  • Fig. 4 is a set of coordinate mapping diagrams according to at least one embodiment of the present disclosure.
  • Fig. 5 is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 6 is an additional flowchart according to at least one embodiment of the present disclosure.
  • Fig. 7 is an additional flowchart according to at least one embodiment of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions) .
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer) .
  • Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs) , general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or 10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors) , graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units) , application specific integrated circuits (ASICs) , field programmable logic arrays (FPGAs) , or other equivalent integrated circuits.
  • proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
  • X-ray systems may be used for ensuring a position of an X-ray machine is correct (e.g., prior to and/or during the surgical procedures) .
  • the X-ray systems may include an X-ray machine and a display screen for displaying X-ray images captured from the X-ray machine, where these X-ray images are used to ensure the position of the X-ray machine is correct.
  • these X-ray systems may rely on trial-and-error methods for determining if the position of the X-ray system is correct.
  • an operator of one of these X-ray systems may place an X-ray machine at an approximate location needed to capture a specific portion of a patient (e.g., for which the surgical procedure is being performed) . Subsequently, the operator may capture an X-ray image of the patient from the X-ray machine at this approximate location and may determine if the X-ray image accurately captures the specific portion of the patient. If the operator determines the X-ray image does not accurately capture the specific portion of the patient, the operator may readjust or move the X-ray machine and repeat capturing X-ray images of the patient until the X-ray machine is accurately positioned (e.g., for capturing X-ray images of specific areas of the patient, for the surgical procedure, etc. ) .
  • these X-ray systems may have a limited field of view (FOV) . That is, the X-ray machines may only be capable of capturing narrow areas of interest (e.g., to limit any possible X-ray radiation to the patient and/or operator) .
  • the limited FOV may cause the operator to have to reposition the X-ray machine multiple times and to capture multiple X-ray images of the patient when checking whether the X-ray machine is accurately located, in order to ensure subsequent X-ray scans contain all portions of interest of the patient.
  • operators of these X-ray systems may try many times to accurately position the X-ray machine and system when ensuring a scan range is suitable for capturing X-ray images of the portions of interest of the patient for the surgical procedure.
  • a safety risk is therefore introduced by possibly exposing the patient and/or the operator to unnecessary amounts of radiation.
  • a positioning system uses a camera (e.g., situated at the top of an operating room or in a different location of the operating room) to identify current positions of both an X-ray machine and a patient. From these positions, the system calculates and provides an operator (e.g., a surgeon) with the precise adjustments needed to reposition the X-ray machine in the operating room so as to obtain an accurate FOV containing the entire region of interest of the patient for imaging.
  • This positioning system may help operators to efficiently position an X-ray machine according to manual inputs from the operator indicating the regions of interest of the patient.
  • a mark on the top of the X-ray machine may be used to check the position of the X-ray machine in the operating room.
  • the operator may use an image captured from the camera to input a target location onto the image corresponding to an interested FOV or location on the patient.
  • the positioning system may then use the target location to calculate a precise move distance for moving the X-ray machine to that target location and feed back the calculated distance to the operator, and the operator can move the X-ray machine to the correct location according to the feedback.
  • This positioning system may help in saving operators from having to locate and relocate X-ray machines many times and may decrease uncertainties of whether the X-ray scans are correct or not, leading to fewer X-ray images taken and less exposure from associated radiation.
  • Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) determining an accurate location for placing an X-ray system or machine, (2) exposing a patient and/or operator of the X-ray system to unnecessary amounts of radiation, and (3) prolonging procedure times for surgeries.
  • the positioning system described herein enables an operator of an X-ray system that employs the positioning system to place the X-ray system in the correct location needed for capturing the areas of interest of the patient without having to take multiple X-rays of the patient.
  • With this expedited and accurate locating of the X-ray machine, fewer X-rays may be taken of the patient, otherwise limiting the amount of radiation to which the patient and/or the operator are exposed. Additionally, the amount of time needed for associated surgical procedures using the X-ray system may decrease as a result of using the described positioning system.
  • Fig. 1 shows a block diagram of a system 100 that may be used to position an imaging device (e.g., an X-ray system or machine) to capture areas of interest of a patient based on images captured from an additional imaging device (e.g., a camera) . Accordingly, a distance to move the imaging device may be calculated based on selecting the areas of interest of the patient on the images captured from the additional imaging device and calculating how far the imaging device needs to be moved to accurately capture those areas of interest.
  • the system 100 may be used to control, pose, and/or otherwise manipulate a surgical mount system, a surgical arm, surgical tools, and/or imaging devices attached thereto and/or carry out one or more other aspects of one or more of the methods disclosed herein.
  • the system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 134.
  • Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100.
  • the system 100 may not include the imaging device 112, the robot 114, the navigation system 118, one or more components of the computing device 102, the database 130, and/or the cloud 134.
  • the computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110.
  • Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
  • the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
  • the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
  • the memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
  • the memory 106 may store information or data useful for completing, for example, any step of the methods 500, 600, and/or 700 described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc. ) that can be processed by the processor 104 to carry out the various method and features described herein.
  • various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models.
  • the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
  • the computing device 102 may also comprise a communication interface 108.
  • the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100) , and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100) .
  • the communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth) .
  • the communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also comprise one or more user interfaces 110.
  • the user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
  • the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • the imaging device 112 may be operable to image anatomical feature (s) (e.g., a bone, veins, tissue, etc. ) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc. ) .
  • image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver) , an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine) , a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera) , a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae) , or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing or may comprise a transmitter/emitter and a receiver/detector that are physically separate (e.g., housed separately) .
  • the imaging device 112 may comprise more than one imaging device 112.
  • a first imaging device may provide first image data and/or a first image
  • a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robot 114 may be any surgical robot or surgical robotic system.
  • the robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system.
  • the robot 114 may be configured to position the imaging device 112 at one or more precise position (s) and orientation (s) , and/or to return the imaging device 112 to the same position (s) and orientation (s) at a later point in time.
  • the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may comprise one or more robotic arms 116.
  • the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112. In embodiments where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver) , one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
  • the robotic arm (s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm) .
  • reference markers may be placed on the robot 114 (including, e.g., on the robotic arm 116) , the imaging device 112, or any other object in the surgical space.
  • the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
  • the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example) .
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor (s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing) .
  • the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the system 100 can operate without the use of the navigation system 118.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system) .
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100) ; one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS) , a health information system (HIS) , and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the methods 500, 600, and/or 700 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • Fig. 2 shows an imaging system diagram 200 according to at least one embodiment of the present disclosure.
  • the imaging system diagram 200 may include a first imaging device 202 and a second imaging device 204.
  • the first imaging device 202 may be a camera or camera system
  • the second imaging device 204 may be an X-ray machine or X-ray system.
  • One or both imaging devices 202, 204 may be similar or identical to the imaging device 112 depicted and described in connection with Fig. 1.
  • images captured by the first imaging device 202 may be used to determine a location for placing the second imaging device 204 in a room (e.g., an operating room or other type of room used for different medical procedures) .
  • the first imaging device 202 may be used to capture images of its surroundings.
  • the first imaging device may capture a first image of a target environment 206 to identify locations of different objects within the target environment 206, such as a patient 208 (e.g., object) , an operator of the imaging system, various equipment in the target environment (e.g., including the second imaging device 204) , etc. While shown as being located on the ceiling, the first imaging device 202 may be placed at other locations in the room as long as the first imaging device 202 is capable of capturing images of the target environment 206 and at least the patient 208.
  • coordinates of interest may be selected from the first image, where the coordinates of interest include specific areas of interest of the patient 208 needed for imaging (e.g., to perform subsequent surgical operations and/or for assisting surgical operations that are in progress) .
  • the operator of the imaging system may manually input or select the coordinates of interest from the first image.
  • the imaging system may identify and select the coordinates of interest autonomously (e.g., based on previously received instructions or data) .
  • the operator and/or the imaging system may select a target line that passes through the specific areas of interest of the patient 208.
  • a processor may calculate the position of the coordinates of interest (e.g., target line) and of the second imaging device 204 within the room. Additionally, the processor may calculate a distance and determine a direction of the coordinates of interest with respect to the second imaging device 204. For example, the processor may generate a set of real-world coordinates corresponding to the specific areas of interest of the patient 208 based on the coordinates of interest and may calculate the distance needed to move the second imaging device 204 to those real-world coordinates from an initial position at which the second imaging device 204 is located.
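  • A minimal code sketch of this calculation is shown below. It assumes the overhead camera's pixel-to-world mapping is available as a planar homography and uses hypothetical helper names; it is a sketch of the idea rather than the patent's implementation.

        import numpy as np

        def pixel_to_world(pixel_xy, homography):
            """Map a pixel (u, v) in the overhead camera image to real-world
            floor-plane coordinates using an assumed 3x3 planar homography."""
            u, v = pixel_xy
            p = homography @ np.array([u, v, 1.0])
            return p[:2] / p[2]

        def move_vector(target_pixel, device_pixel, homography):
            """Return the distance and direction from the X-ray machine's current
            position to the selected coordinates of interest, in world units."""
            target_world = pixel_to_world(target_pixel, homography)
            device_world = pixel_to_world(device_pixel, homography)
            delta = target_world - device_world       # direction of the required move
            distance = float(np.linalg.norm(delta))   # magnitude of the required move
            direction = delta / distance if distance > 0 else delta
            return distance, direction

        # Example: the operator selects the target line at pixel (640, 380) and the mark
        # on the X-ray machine is detected at pixel (220, 410); H is the assumed homography.
        # distance_m, direction = move_vector((640, 380), (220, 410), H)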
  • the processor may generate the set of real-world coordinates based on a mapping between a set of pixel coordinates and the real-world coordinates, where the set of pixel coordinates is associated with the first image captured from the first imaging device 202.
  • the mapping between the set of pixel coordinates and the real-world coordinates will be described in greater detail with reference to Fig. 4.
  • the imaging system may position the second imaging device 204 into a first location using the calculated distance and determined direction.
  • the imaging system may output this calculated distance (e.g., guiding information) to a user interface (e.g., the user interface 110 as described with reference to Fig. 1) for the operator of the imaging system to move the second imaging device 204 according to the output.
  • the imaging system may autonomously move the second imaging device 204 to the first location based on the coordinates of interest and the calculated distance.
  • the imaging system may verify that the first location corresponds to the coordinates of interest using one or more additional images captured by the first imaging device 202. That is, the imaging system may check whether the second imaging device 204 is accurately positioned based on the coordinates of interest and a current location of the second imaging device 204 (e.g., the first location) . If the first location does not correspond to the coordinates of interest, the processor of the imaging system may again calculate a distance (and determine a direction) for which the second imaging device 204 needs to be moved. Accordingly, the imaging system may position the second imaging device 204 according to the newly calculated distance (e.g., autonomously or by the operator based on the newly calculated distance being output to the operator) . A brief sketch of such a verification loop is given below.
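  • The sketch below uses hypothetical helper callables and an assumed placement tolerance; it illustrates the verification loop described above and is not the patent's implementation.

        import numpy as np

        TOLERANCE_M = 0.02  # assumed acceptable placement error, in metres

        def verify_and_guide(capture_overhead_image, locate_device, target_world, max_tries=5):
            """Re-image the room after each move and emit guiding information until the
            second imaging device sits within the tolerance of the coordinates of interest."""
            for _ in range(max_tries):
                frame = capture_overhead_image()                     # additional camera image
                device_world = np.asarray(locate_device(frame), dtype=float)
                delta = np.asarray(target_world, dtype=float) - device_world
                distance = float(np.linalg.norm(delta))
                if distance <= TOLERANCE_M:
                    return True                                      # position verified
                # guiding information for the operator (or for an autonomous move)
                print(f"Move the device {distance:.2f} m in direction {delta / distance}")
            return False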
  • the second imaging device 204 may be used to capture and display a second image (e.g., X-ray image) of those areas of interest of the patient 208. Subsequently, any related surgical operations and/or other medical operations may occur with the second imaging device 204 now properly positioned.
  • Fig. 3 shows an imaging system diagram 300 according to at least one embodiment of the present disclosure.
  • the imaging system diagram 300 may include an imaging device 302, which may be an example of the second imaging device 204 as described with reference to Fig. 2 (e.g., an X-ray machine) .
  • the imaging device 302 may have a limited FOV, so it is important to ensure the imaging device 302 is properly positioned to accurately capture areas of interest of a patient 304 (e.g., or more generically, an “object” ) needed for imaging with the limited FOV.
  • an additional imaging device (e.g., the first imaging device 202 of Fig. 2, such as a camera) may be used to assist in placing the imaging device 302 at the correct location to capture an area or areas of interest of the patient 304.
  • the additional imaging device may capture a first image (e.g., digital image or video that is output to a user interface associated with the imaging system, such as the user interface 110 as described with reference to Fig. 1) of a target environment (e.g., operation room) , where the first image includes at least the imaging device 302 and the patient 304. Subsequently, coordinates of interest can be selected on the first image that correspond to the areas of interest of the patient 304. For example, an operator may draw a target line 306 on the first image showing the areas of interest of the patient 304 (e.g., a target location for the imaging device 302 to be placed for capturing subsequent images of the areas of interest, such as X-ray images) .
  • a computing device and/or processor associated with the imaging system may calculate or determine a position 308 of the target line 306 (e.g., a set of real-world coordinates) and a position 310 of the imaging device 302.
  • the computing device and/or processor associated with the imaging system may calculate or determine the position 308 and the position 310 based on a mapping between pixel coordinates associated with the first image and real-world coordinates of the target line 306 and the imaging device 302, respectively.
  • a distance 312 may be calculated between the position 308 of the target line 306 and the position 310 of the imaging device 302 (e.g., a distance of the target line 306 relative to a current or initial location of the imaging device 302) . Additionally, a direction for which the imaging device 302 needs to be moved to reach the target line 306 may be determined and/or calculated based on the positions 308 and 310. The distance 312 (and the determined direction) may then be used to position the imaging device 302 at a first location corresponding to the target line 306.
  • the computing device and/or processor may output the distance 312 (and the determined direction) to the operator for the operator to move the imaging device 302 into the first location according to the output. Additionally or alternatively, the computing device and/or processor may autonomously move the imaging device 302 to the first location according to the distance 312. In some examples, the distance 312 (and the direction for moving the imaging device 302) may be referred to as guiding information as described herein.
  • the imaging system may verify if the imaging device 302 is accurately positioned at the target line 306 after being moved to the first location using additional images (e.g., a second image, a third image, etc. ) captured by the additional imaging device (e.g., camera or camera system) . If the first location does not correspond to the target line 306, another distance may be calculated for adjusting the position of the imaging device 302, and the imaging device 302 may be moved according to this other distance (e.g., by the operator or autonomously) . These steps may be repeated until the imaging device 302 is accurately positioned with respect to the target line 306.
  • the imaging device 302 may be used to capture images (e.g., X-ray images) of the areas of interest of the patient 304 (e.g., for imaging and diagnostic procedures, for surgical procedures, etc. ) .
  • Fig. 4 shows a set of coordinate mapping diagrams 400 for an imaging system (e.g., a robotic surgical imaging system) that includes a first imaging device (e.g., a camera or camera system) and a second imaging device (e.g., an X-ray machine or X-ray system) , where images captured from the first imaging device and inputs on those captured images are used to accurately position the second imaging device so that it can then capture additional images of areas of interest of an object in a target environment (e.g., a patient in an operating room) .
  • the set of coordinate mapping diagrams 400 may be used to calculate real-world coordinates from different positions of the images captured from the first imaging device. For example, real-world coordinates of a target position for the second imaging device to be placed may be determined from the inputs on the captured images (e.g., coordinates of interest, a target line 306 as described with reference to Fig. 3, etc. ) , as well as real-world coordinates of the second imaging device (e.g., using a mark on the top of the second imaging device) . Subsequently, a distance between the real-world coordinates of the target position and the real-world coordinates of the second imaging device may be calculated to move the second imaging device according to the distance (e.g., autonomously or manually) .
  • the set of coordinate mapping diagrams 400 provided in the example of Fig. 4 may be used to map a set of pixel coordinates from the images captured by the first imaging device and corresponding to the target location and a current or initial location of the second imaging device to respective sets of real-world coordinates, and vice versa (e.g., from the sets of real-world coordinates to sets of pixel coordinates, for example, to display the distance for moving the second imaging device on a user interface) .
  • the set of coordinate mapping diagrams 400 may include a first rotation diagram 402, a second rotation diagram 404, and a third rotation diagram 406.
  • the first rotation diagram 402 may represent rotations about a first axis (e.g., x-axis) and may indicate how a second axis (e.g., y-axis) and a third axis (e.g., z-axis) are affected by the rotations about the first axis (e.g., by an angle, θ_x) .
  • This rotation about the first axis may be given as Equation (1) below:
  • the second rotation diagram 404 may represent rotations about the second axis (e.g., y-axis) and may indicate how the first axis (e.g., x-axis) and the third axis (e.g., z-axis) are affected by the rotations about the second axis (e.g., by an angle, θ_y) .
  • This rotation about the second axis may be given as Equation (2) below:
  • the third rotation diagram 406 may represent rotations about the third axis (e.g., z-axis) and may indicate how the first axis (e.g., x-axis) and the second axis (e.g., y-axis) are affected by the rotations about the third axis (e.g., by an angle, θ_z) .
  • This rotation about the third axis may be given as Equation (3) below:
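  • Equations (1) , (2) , and (3) are the conventional single-axis rotation matrices. A standard form, assuming right-handed axes and rotation angles θ_x, θ_y, and θ_z (the original angle symbols were not legible, so this is a sketch of the usual convention rather than the patent's exact notation), is:

        R_x(\theta_x) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_x & -\sin\theta_x \\ 0 & \sin\theta_x & \cos\theta_x \end{bmatrix}    (1)

        R_y(\theta_y) = \begin{bmatrix} \cos\theta_y & 0 & \sin\theta_y \\ 0 & 1 & 0 \\ -\sin\theta_y & 0 & \cos\theta_y \end{bmatrix}    (2)

        R_z(\theta_z) = \begin{bmatrix} \cos\theta_z & -\sin\theta_z & 0 \\ \sin\theta_z & \cos\theta_z & 0 \\ 0 & 0 & 1 \end{bmatrix}    (3)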
  • After using one or more of Equations (1) , (2) , and (3) for the rotations about the respective axes, a whole movement matrix can be formed, which is given below by Equation (4) :
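  • A conventional form of such a combined movement matrix, consistent with the ‘O’, ‘R3x3’, and ‘T3x1’ terms described in the following bullets (a sketch of the usual homogeneous-transform convention, not necessarily the patent's exact notation), is:

        O = \begin{bmatrix} R_{3\times 3} & T_{3\times 1} \\ 0_{1\times 3} & 1 \end{bmatrix} , \qquad \begin{bmatrix} X_2 \\ Y_2 \\ Z_2 \\ 1 \end{bmatrix} = O \begin{bmatrix} X_1 \\ Y_1 \\ Z_1 \\ 1 \end{bmatrix}    (4)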
  • a computing device and/or processor of the imaging system described herein can calculate an old position of the second imaging device (e.g., given by X1, Y1, and Z1) and/or the target location (e.g., target position, coordinates of interest, etc., given by X2, Y2, and Z2) .
  • ‘O’ may represent a ‘realized position’ of a given set of coordinates
  • the ‘R3x3’ matrix may represent real-time coordinates (e.g., XYZ coordinates determined from the rotation diagrams and corresponding equations)
  • the ‘T3x1’ matrix may represent movement to a target location (e.g., moving from the real-time coordinates to the target location) .
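  • As a brief illustration, the combined movement matrix can be assembled and applied in code as sketched below (assumed conventions, not the patent's implementation):

        import numpy as np

        def movement_matrix(R, T):
            """Assemble the 4x4 homogeneous transform [R | T; 0 0 0 1] from a 3x3
            rotation R (Equations (1)-(3)) and a 3x1 translation T."""
            M = np.eye(4)
            M[:3, :3] = R
            M[:3, 3] = np.ravel(T)
            return M

        def apply_move(M, point_xyz):
            """Apply the movement matrix to a 3D point in homogeneous coordinates."""
            p = np.append(np.asarray(point_xyz, dtype=float), 1.0)
            return (M @ p)[:3]

        # Example: a 90-degree rotation about the z-axis followed by a translation of 0.5
        # along x maps the old position (1, 0, 0) to the target position (0.5, 1, 0).
        theta = np.pi / 2
        Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0,            0.0,           1.0]])
        target = apply_move(movement_matrix(Rz, [0.5, 0.0, 0.0]), [1.0, 0.0, 0.0])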
  • the set of coordinate mapping diagrams 400 may also include a first coordinate mapping diagram 408 and a second coordinate mapping diagram 410 that can be used to map image pixel coordinates to/from real-world coordinates (e.g., using camera and/or image coordinates) .
  • Using the first coordinate mapping diagram 408, the following image pixel coordinate mapping relationships can be determined, as given below in Equations (5) , (6) , and (7) :
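  • These relationships conventionally follow the pinhole camera model. Assuming a focal length f, a point (X_c, Y_c, Z_c) in camera coordinates, its projection (x, y) in image coordinates, pixel sizes (dx, dy) , and principal point (u_0, v_0) (all assumed symbols that may differ from the patent's figures), one common form is:

        x = f \, X_c / Z_c , \qquad y = f \, Y_c / Z_c    (5)

        u = x / dx + u_0    (6)

        v = y / dy + v_0    (7)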
  • In this way, pixel positions of specific areas of images captured by the first imaging device can be mapped to real-world coordinates, where a difference (e.g., distance) between the positions can be determined accurately.
  • the pixel coordinates/positions may be output to the operator of the imaging system to indicate how far the operator needs to adjust the imaging system (e.g., how far to move the second imaging device) .
  • the transformation between real-world coordinates and the pixel coordinates may be given by Equation (9) below:
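  • A common form of this transformation composes the pinhole model above with the camera's pose. Assuming a rotation R and translation T from real-world coordinates (X_w, Y_w, Z_w) to camera coordinates (again assumed notation, offered as a sketch), it can be written as:

        Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}    (9)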
  • the computing device and/or processor of the imaging system may map between real-world coordinates, camera coordinates, image coordinates, and pixel coordinates (e.g., adjusting between real-world coordinates, image coordinates, and pixel coordinates) .
  • the imaging system described herein may provide the operator of the imaging system with additional guidance about how to move the second imaging device (e.g., X-ray system or X-ray machine) .
  • Fig. 5 depicts a method 500 that may be used, for example, to identify a current position of an imaging device with respect to areas of interest of an object and to calculate a distance for moving the imaging device to capture those areas of interest more accurately.
  • the method 500 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) .
  • a processor other than any processor described herein may also be used to execute the method 500.
  • the at least one processor may perform the method 500 by executing elements stored in a memory such as the memory 106.
  • the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 500.
  • One or more portions of the method 500 may be performed by the processor executing any of the contents of memory, such as image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • the method 500 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 502) .
  • the target environment may include an operation room, where the first image includes at least an image of a patient.
  • the first imaging device may include a camera or camera system.
  • the first imaging device may be situated at the top of the target environment (e.g., on the ceiling of the operating room) or may be located elsewhere in the target environment such that the first image still includes the object.
  • the method 500 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 504) .
  • a target line may be selected that passes through the portion of the object, where the coordinates of interest include the target line.
  • the method 500 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 506) .
  • the set of real-world coordinates may be generated as described with reference to Fig. 4.
  • the method 500 also comprises positioning a second imaging device into a first location based on the set of real-world coordinates (step 508) . That is, the second imaging device may be placed at a location corresponding to the generated real-world coordinates that should, in turn, correspond to the coordinates of interest.
  • the method 500 also comprises displaying a second image captured using the second imaging device, where the second image includes at least the portion of the object (step 510) .
  • the second imaging device may then be used to capture images (e.g., X-ray images) of the portion of the object (e.g., areas of interest of the patient) .
  • the present disclosure encompasses embodiments of the method 500 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • Fig. 6 depicts a method 600 that may be used, for example, to verify a location of the second imaging device as described herein with respect to a given set of coordinates of interest.
  • the method 600 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) .
  • a processor other than any processor described herein may also be used to execute the method 600.
  • the at least one processor may perform the method 600 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 600.
  • One or more portions of the method 600 may be performed by the processor executing any of the contents of memory, such as the image processing 120, the segmentation 122, the transformation 124, and/or the registration 128.
  • the method 600 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 602) .
  • the method 600 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 604) .
  • the method 600 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 606) .
  • the method 600 also comprises positioning the second imaging device into a first location based on the set of real-world coordinates (step 608) .
  • the method 600 also comprises capturing a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location (step 610) .
  • the method 600 also comprises verifying the second imaging device is at the coordinates of interest based on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof (step 612) .
  • the first imaging device may be used to capture an additional image of the moved second imaging device to verify whether the second imaging device has been accurately moved to the coordinates of interest. If the second imaging device is accurately positioned (e.g., its position has been verified to be correct) , the method 600 may continue to step 614.
  • if the verifying instead indicates that the second imaging device is not located at the coordinates of interest, the second imaging device may be positioned into a second location (e.g., autonomously or manually by an operator based on guiding information displayed for the operator) , as shown in the sketch following these remarks.
  • the method 600 also comprises displaying a second image captured using the second imaging device, where the second image includes at least the portion of the object (step 614) .
  • the second imaging device may be used to capture and display the second image after a position of the second imaging device has been verified to be accurate.
  • the present disclosure encompasses embodiments of the method 600 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • Fig. 7 depicts a method 700 that may be used, for example, to guide an operator of an imaging system described herein when the operator is manually moving the second imaging device to a target location.
  • the method 700 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) .
  • a processor other than any processor described herein may also be used to execute the method 700.
  • the at least one processor may perform the method 700 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 700.
  • One or more portions of the method 700 may be performed by the processor executing any of the contents of memory, such as the image processing 120, the segmentation 122, the transformation 124, and/or the registration 128.
  • the method 700 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 702) .
  • the method 700 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 704) .
  • the method 700 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 706) .
  • the method 700 also comprises calculating a distance to move the second imaging device into a first location to capture a second image that includes at least the portion of the object, where the distance is calculated based on the coordinates of interest, an initial location of the second imaging device, the real-world coordinates, or a combination thereof (step 708) .
  • guiding information may be displayed to assist the operator with positioning the second imaging device, where the guiding information includes the calculated distance. Additionally, the guiding information may be displayed to the operator based on the set of pixel coordinates associated with the first image (e.g., the calculated distance is converted back and forth between pixel coordinates, real-world coordinates, camera coordinates, and image coordinates as described with reference to Fig. 4) .
  • the method 700 also comprises positioning the second imaging device into the first location based on the set of real-world coordinates (e.g., and the calculated distance) (step 710) .
  • the method 700 also comprises displaying a second image captured using the second imaging device, the second image including at least the portion of the object (step 712) .
  • the present disclosure encompasses embodiments of the method 700 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • the present disclosure encompasses methods with fewer than all of the steps identified in Figs. 5, 6, and 7 (and the corresponding description of the methods 500, 600, and 700) , as well as methods that include additional steps beyond those identified in Figs. 5, 6, and 7 (and the corresponding description of the methods 500, 600, and 700) .
  • the present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.
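For illustration only, the following is a minimal sketch of how the capture-select-map-position-verify flow of the methods 500, 600, and 700 might be organized in software. The callables (capture_image, select_target_line, pixels_to_world, detect_device_mark, move_device, capture_xray), the tolerance, and the retry limit are hypothetical placeholders supplied by the caller and are not part of the disclosure.

```python
import numpy as np

def position_and_verify(capture_image, select_target_line, pixels_to_world,
                        detect_device_mark, move_device, capture_xray,
                        tolerance_mm=10.0, max_attempts=5):
    """Illustrative skeleton of the method 600 flow (steps 602-614).

    Every callable is supplied by the caller; none of these names comes from the
    disclosure itself."""
    first_image = capture_image()                                   # step 602: first image of the target environment
    target_px = select_target_line(first_image)                     # step 604: coordinates of interest (target line)
    target_world = np.asarray(pixels_to_world(target_px), float)    # step 606: pixel -> real-world coordinates

    for _ in range(max_attempts):
        move_device(target_world)                                   # step 608 (or repositioning to a second location)
        check_image = capture_image()                               # step 610: additional image after the move
        device_world = np.asarray(pixels_to_world(detect_device_mark(check_image)), float)
        if np.linalg.norm(device_world - target_world) <= tolerance_mm:  # step 612: verify the device position
            return capture_xray()                                   # step 614: capture/display the second image
    raise RuntimeError("X-ray device position could not be verified within tolerance")
```

In this sketch, a failed verification simply triggers another repositioning toward the same target, corresponding to moving the second imaging device into a second location before re-checking.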


Abstract

A robotic surgical imaging system (100,200,300) includes a first imaging device (202) and a second imaging device (204,302). The first imaging device (202) may be used to capture a first image of a target environment, where the first image includes an object in the target environment (502,602,702). Subsequently, coordinates of interest may be selected in the first image that are associated with at least a portion of the object (504). Real-world coordinates may then be generated that correspond to the coordinates of interest and the portion of the object (506,606), and the second imaging device (204) may be placed at a location based on the real-world coordinates (508,608). After verifying the location of the second imaging device (204) corresponds to the coordinates of interest (e.g., with any needed adjustments to the location made), the second imaging device (204) may be used to capture a second image of the portion of the object (510,610,710).

Description

MOBILE X-RAY POSITIONING SYSTEM BACKGROUND
The present disclosure is generally directed to surgical systems and relates more particularly to imaging devices for the surgical systems.
Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure or may complete one or more surgical procedures autonomously. Imaging may be used by a medical provider for diagnostic, operational, and/or therapeutic purposes. Providing controllable linked articulating members allows a surgical robot to reach areas of a patient anatomy during various medical procedures (e.g., using the imaging) .
BRIEF SUMMARY
Example aspects of the present disclosure include:
A robotic surgical imaging system, comprising: a first imaging device; a second imaging device; a processor coupled with the first imaging device and the second imaging device; and a memory coupled with and readable by the processor and storing therein instructions that, when executed by the processor, cause the processor to: capture a first image of a target environment using the first imaging device, wherein the first image comprises an object included in the target environment; select coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object; generate a set of real-world coordinates corresponding to the portion of the object based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates; position the second imaging device into a first location based at least in part on the set of real-world coordinates; and display a second image captured using the second imaging device, the second image comprising at least the portion of the object.
Any of the aspects herein, wherein the instructions to select the coordinates of interest included in the first image cause the processor to: select a target line that passes through at least the portion of the object, wherein the target line comprises the coordinates of interest.
Any of the aspects herein, wherein the instructions further cause the processor to: capture a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location; and verify the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof.
Any of the aspects herein, wherein the instructions further cause the processor to: position the second imaging device into a second location based at least in part on determining the second imaging device is not located at the coordinates of interest from the verifying.
Any of the aspects herein, wherein the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device into the second location.
Any of the aspects herein, wherein the instructions to generate the set of real-world coordinates corresponding to the portion of the object cause the processor to: calculate a distance to move the second imaging device into the first location to capture the second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial location of the second imaging device.
Any of the aspects herein, wherein the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device, wherein the guiding information comprises the calculated distance.
Any of the aspects herein, wherein the guiding information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
Any of the aspects herein, wherein the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
A system, comprising: a processor; and a memory coupled with and readable by the processor and storing therein instructions that, when executed by the processor, cause the processor to: capture a first image of a target environment using a first imaging device, wherein the first image comprises an object included in the target environment; select coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object; generate a set of real-world coordinates corresponding to the portion of the object based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates; position a second imaging device into a first location based at least in part on the set of real-world coordinates; and display a second image captured using the second imaging device, the second image comprising at least the portion of the object.
Any of the aspects herein, wherein the instructions to select the coordinates of interest included in the first image cause the processor to: select a target line that passes through at least the portion of the object, wherein the target line comprises the coordinates of interest.
Any of the aspects herein, wherein the instructions further cause the processor to: capture a third image of the target environment using the first imaging device after the second imaging device  has been positioned into the first location; and verify the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof.
Any of the aspects herein, wherein the instructions further cause the processor to: position the second imaging device into a second location based at least in part on determining the second imaging device is not located at the coordinates of interest from the verifying.
Any of the aspects herein, wherein the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device into the second location.
Any of the aspects herein, wherein the instructions to generate the set of real-world coordinates corresponding to the portion of the object cause the processor to: calculate a distance to move the second imaging device into the first location to capture the second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial location of the second imaging device.
Any of the aspects herein, wherein the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device, wherein the guiding information comprises the calculated distance.
Any of the aspects herein, wherein the guiding information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
Any of the aspects herein, wherein the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
A method, comprising: capturing a first image of a target environment using a first imaging device, wherein the first image comprises an object included in the target environment; generating a set of real-world coordinates corresponding to coordinates of interest included in the first image based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates, wherein the coordinates of interest indicate at least a portion of the object; and displaying a second image captured using a second imaging device, the second image comprising at least the portion of the object, wherein the second imaging device is positioned into a first location based at least in part on the set of real-world coordinates.
Any of the aspects herein, wherein the coordinates of interest comprise a target line that passes through at least the portion of the object.
Any aspect in combination with any one or more other aspects.
Any one or more of the features disclosed herein.
Any one or more of the features as substantially disclosed herein.
Any one or more of the features as substantially disclosed herein in combination with any one or more other features as substantially disclosed herein.
Any one of the aspects/features/embodiments in combination with any one or more other aspects/features/embodiments.
Use of any one or more of the aspects or features as disclosed herein.
It is to be appreciated that any feature described herein can be claimed in combination with any other feature (s) as described herein, regardless of whether the features come from the same described embodiment.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
The phrases “at least one” , “one or more” , and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C” , “at least one of A, B, or C” , “one or more of A, B, and C” , “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together. When each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo, the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2) as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo) .
The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an” ) , “one or more” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising” , “including” , and “having” can be used interchangeably.
The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure and its various aspects, embodiments, and configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and configurations of the disclosure are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
Numerous additional features and advantages of the present disclosure will become apparent to those skilled in the art upon consideration of the embodiment descriptions provided hereinbelow.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
The accompanying drawings are incorporated into and form a part of the specification to illustrate several examples of the present disclosure. These drawings, together with the description, explain the principles of the disclosure. The drawings simply illustrate preferred and alternative examples of how the disclosure can be made and used and are not to be construed as limiting the disclosure to only the illustrated and described examples. Further features and advantages will become apparent from the following, more detailed, description of the various aspects, embodiments, and configurations of the disclosure, as illustrated by the drawings referenced below.
Fig. 1 is a block diagram of a system according to at least one embodiment of the present disclosure;
Fig. 2 is an imaging system diagram according to at least one embodiment of the present disclosure;
Fig. 3 is an additional imaging system diagram according to at least one embodiment of the present disclosure;
Fig. 4 is a set of coordinate mapping diagrams according to at least one embodiment of the present disclosure;
Fig. 5 is a flowchart according to at least one embodiment of the present disclosure;
Fig. 6 is an additional flowchart according to at least one embodiment of the present disclosure; and
Fig. 7 is an additional flowchart according to at least one embodiment of the present disclosure.
DETAILED DESCRIPTION
It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example or embodiment, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, and/or may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the disclosed techniques according to different embodiments of the present disclosure) . In addition, while certain aspects of this disclosure are described as being  performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a computing device and/or a medical device.
In one or more examples, the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions) . Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer) .
Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs) , general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or 10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors) , graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units) , application specific integrated circuits (ASICs) , field programmable logic arrays (FPGAs) , or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
Before any embodiments of the disclosure are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including, ” “comprising, ” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Further, the present disclosure may use examples to illustrate one  or more aspects thereof. Unless explicitly stated otherwise, the use or listing of one or more examples (which may be denoted by “for example, ” “by way of example, ” “e.g., ” “such as, ” or similar language) is not intended to and does not limit the scope of the present disclosure.
The terms proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
In some surgical procedures (e.g., orthopedic surgeries) , X-ray systems may be used for ensuring a position of an X-ray machine is correct (e.g., prior to and/or during the surgical procedures) . For example, the X-ray systems may include an X-ray machine and a display screen for displaying X-ray images captured from the X-ray machine, where these X-ray images are used to ensure the position of the X-ray machine is correct. However, these X-ray systems may rely on trial-and-error methods for determining if the position of the X-ray system is correct. For example, an operator of one of these X-ray systems may place an X-ray machine at an approximate location needed to capture a specific portion of a patient (e.g., for which the surgical procedure is being performed) . Subsequently, the operator may capture an X-ray image of the patient from the X-ray machine at this approximate location and may determine if the X-ray image accurately captures the specific portion of the patient. If the operator determines the X-ray image does not accurately capture the specific portion of the patient, the operator may readjust or move the X-ray machine and repeat capturing X-ray images of the patient until the X-ray machine is accurately positioned (e.g., for capturing X-ray images of specific areas of the patient, for the surgical procedure, etc. ) .
In some examples, these X-ray systems may have a limited field of view (FOV) . That is, the X-ray machines may only be capable of capturing narrow areas of interest (e.g., to limit any possible X-ray radiation to the patient and/or operator) . The limited FOV may cause the operator to have to reposition the X-ray machine multiple times and to capture multiple X-ray images of the patient when checking if the X-ray machine is accurately located to ensure subsequent X-ray scans contain all portions of interest of the patient. For example, operators of these X-ray systems may try many times to accurately position the X-ray machine and system when ensuring a scan range is suitable for capturing X-ray images of the portions of interest of the patient for the surgical procedure. However, capturing multiple X-ray images to confirm the location of the X-ray machine introduces a safety risk of exposing the patient and/or the operator to unnecessary amounts of radiation.
As described herein, a positioning system is provided that uses a camera (e.g., situated at the top of an operating room or in a different location of the operating room) to identify current positions of both an X-ray machine and a patient. From these positions, the system calculates and provides an operator (e.g., a surgeon) with the precise adjustments needed for adjusting a position of the X-ray machine in the operating room in order to obtain an accurate FOV containing the entire region of interest of the patient for imaging. This positioning system may help operators to efficiently position an X-ray machine according to manual inputs from the operator indicating the regions of interest of the patient. For example, using the camera on the top of the operating room, a mark on the top of the X-ray machine may be used to check the position of the X-ray machine in the operating room. Additionally, the operator may use an image captured from the camera to input a target location onto the image corresponding to an FOV or location of interest on the patient. The positioning system may then use the target location to calculate a precise move distance for moving the X-ray machine to that target location and feed back the calculated distance to the operator, and the operator can move the X-ray machine to the correct location according to the feedback. This positioning system may save operators from having to locate and relocate X-ray machines many times and may decrease uncertainty about whether the X-ray scans are correct, leading to fewer X-ray images taken and less exposure from associated radiation.
Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) determining an accurate location for placing an X-ray system or machine, (2) exposing a patient and/or operator of the X-ray system to unnecessary amounts of radiation, and (3) prolonging procedure times for surgeries. The positioning system described herein enables an operator of an X-ray system that employs the positioning system to place the X-ray system in the accurate location needed for capturing the correct areas of interest of the patient without having to take multiple X-rays of the patient. With this expedited and accurate locating of the X-ray machine, fewer X-rays may be taken of the patient, limiting the amount of radiation to which the patient and/or the operator are exposed. Additionally, the amount of time needed for associated surgical procedures using the X-ray system may decrease as a result of using the described positioning system.
Turning first to Fig. 1, a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown. The system 100 may be used to position an imaging device (e.g., X-ray system or machine) to capture areas of interest of a patient based on images captured from an additional imaging device (e.g., camera) . Accordingly, a distance to move the imaging device may be calculated based on selecting the areas of interest of the patient on the  images captured from the additional imaging device and calculating how far the imaging device needs to be moved to accurately capture those areas of interest. In some examples, the system 100 may be used to control, pose, and/or otherwise manipulate a surgical mount system, a surgical arm, surgical tools, and/or imaging devices attached thereto and/or carry out one or more other aspects of one or more of the methods disclosed herein. The system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 134. Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100. For example, the system 100 may not include the imaging device 112, the robot 114, the navigation system 118, one or more components of the computing device 102, the database 130, and/or the cloud 134.
The computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110. Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
The processor 104 of the computing device 102 may be any processor described herein or any similar processor. The processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
The memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions. The memory 106 may store information or data useful for completing, for example, any step of the methods 500, 600, and/or 700 described herein, or of any other methods. The memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114. For instance, the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128. Such content, if provided as an instruction, may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines. Alternatively or additionally, the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc. ) that can be processed by the processor 104 to carry out the various methods and features described herein. Thus, although various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models. The data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
The computing device 102 may also comprise a communication interface 108. The communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100) , and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100) . The communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth) . In some embodiments, the communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
The computing device 102 may also comprise one or more user interfaces 110. The user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user. The user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100. In some embodiments, the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
Although the user interface 110 is shown as part of the computing device 102, in some embodiments, the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102. In some embodiments, the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
The imaging device 112 may be operable to image anatomical feature (s) (e.g., a bone, veins, tissue, etc. ) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc. ) . “Image data” as used herein refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form. In various examples, the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof. The image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure. In some embodiments, a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time. The imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data. The imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver) , an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine) , a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera) , a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae) , or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient. The imaging device 112 may be contained entirely within a single housing or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated.
In some embodiments, the imaging device 112 may comprise more than one imaging device 112. For example, a first imaging device may provide first image data and/or a first image, and a second imaging device may provide second image data and/or a second image. In still other embodiments, the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein. The imaging device 112 may be operable to generate a stream of image data. For example, the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images. For purposes of the present disclosure, unless specified otherwise, image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
The robot 114 may be any surgical robot or surgical robotic system. The robot 114 may be or comprise, for example, the Mazor X TM Stealth Edition robotic guidance system. The robot 114  may be configured to position the imaging device 112 at one or more precise position (s) and orientation (s) , and/or to return the imaging device 112 to the same position (s) and orientation (s) at a later point in time. The robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task. In some embodiments, the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure. The robot 114 may comprise one or more robotic arms 116. In some embodiments, the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112. In embodiments where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver) , one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
The robot 114, together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
The robotic arm (s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm) .
In some embodiments, reference markers (e.g., navigation markers) may be placed on the robot 114 (including, e.g., on the robotic arm 116) , the imaging device 112, or any other object in the surgical space. The reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof. In some embodiments, the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example) .
The navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation. The navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation TM S8 surgical navigation system or any successor thereof. The navigation system 118 may include one or more cameras or other sensor (s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located. The one or more cameras may be optical cameras, infrared cameras, or other cameras. In some embodiments, the navigation system 118 may comprise one or more electromagnetic sensors. In various embodiments, the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing) . The navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118. In some embodiments, the system 100 can operate without the use of the navigation system 118. The navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
The database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system) . The database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100) ; one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information. The database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134. In some embodiments, the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS) , a health information system (HIS) ,  and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
The cloud 134 may be or represent the Internet or any other wide area network. The computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both. In some embodiments, the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
The system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the methods 500, 600, and/or 700 described herein. The system 100 or similar systems may also be used for other purposes.
Fig. 2 shows an imaging system diagram 200 according to at least one embodiment of the present disclosure. As shown, the imaging system diagram 200 may include a first imaging device 202 and a second imaging device 204. In some examples, the first imaging device 202 may be a camera or camera system, and the second imaging device 204 may be an X-ray machine or X-ray system. One or both  imaging devices  202, 204 may be similar or identical to the imaging device 112 depicted and described in connection with Fig. 1. As described herein, images captured by the first imaging device 202 may be used to determine a location for placing the second imaging device 204 in a room (e.g., an operating room or other type of room used for different medical procedures) . In some examples, the first imaging device 202 may be used to capture images of its surroundings. For example, the first imaging device may capture a first image of a target environment 206 to identify locations of different objects within the target environment 206, such as a patient 208 (e.g., object) , an operator of the imaging system, various equipment in the target environment (e.g., including the second imaging device 204) , etc. While shown as being located on the ceiling, the first imaging device 202 may be placed at other locations in the room as long as the first imaging device 202 is capable of capturing images of the target environment 206 and at least the patient 208.
As described herein, after the first image of the target environment 206 including at least the patient 208 has been captured using the first imaging device 202, coordinates of interest may be selected from the first image, where the coordinates of interest include specific areas of interest of the patient 208 needed for imaging (e.g., to perform subsequent surgical operations and/or for assisting surgical operations that are in progress) . In some examples, the operator of the imaging system may manually input or select the coordinates of interest from the first image. Additionally or alternatively, the imaging system may identify and select the coordinates of interest autonomously (e.g., based on previously received instructions or data) . As will be discussed in greater detail with  reference to Fig. 3, when selecting the coordinates of interest, the operator and/or the imaging system may select a target line that passes through the specific areas of interest of the patient 208.
Subsequently, after the coordinates of interest have been selected on the first image, a processor (e.g., the processor 104 and/or the computing device 102 as described with reference to Fig. 1) may calculate the position of the coordinates of interest (e.g., target line) and of the second imaging device 204 within the room. Additionally, the processor may calculate a distance and determine a direction of the coordinates of interest with respect to the second imaging device 204. For example, the processor may generate a set of real-world coordinates corresponding to the specific areas of interest of the patient 208 based on the coordinates of interest and may calculate the distance needed to move the second imaging device 204 to those real-world coordinates from an initial position at which the second imaging device 204 is located. In some examples, the processor may generate the set of real-world coordinates based on a mapping between a set of pixel coordinates and the real-world coordinates, where the set of pixel coordinates are associated with the first image captured from the first imaging device 202 and the real-world coordinates. The mapping between the set of pixel coordinates and the real-world coordinates will be described in greater detail with reference to Fig. 4.
Upon calculating the distance and determining the direction of the coordinates of interest (e.g., the set of real-world coordinates) , the imaging system may position the second imaging device 204 into a first location using the calculated distance and determined direction. In some examples, the imaging system may output this calculated distance (e.g., guiding information) to a user interface (e.g., the user interface 110 as described with reference to Fig. 1) for the operator of the imaging system to move the second imaging device 204 according to the output. Additionally or alternatively, the imaging system may autonomously move the second imaging device 204 to the first location based on the coordinates of interest and the calculated distance.
After the second imaging device 204 has been positioned into the first location, the imaging system may verify that the first location corresponds to the coordinates of interest using one or more additional images captured by the first imaging device 202. That is, the imaging system may check whether the second imaging device 204 is accurately positioned based on the coordinates of interest and a current location of the second imaging device 204 (e.g., the first position) . If the first location does not correspond to the coordinates of interest, the processor of the imaging system may again calculate a distance (and determine a direction) for which the second imaging device 204 needs to be moved. Accordingly, the imaging system may position the second imaging device 204 according to the newly calculated distance (e.g., autonomously or by the operator based on the newly calculated distance being output to the operator) . Additionally or alternatively, after the location of the second imaging device 204 has been verified to correspond to the coordinates of interest (e.g., after one or more attempts at positioning the second imaging device 204) , the second imaging device 204 may be used to capture and display a second image (e.g., X-ray image) of those areas of interest of the patient 208. Subsequently, any related surgical operations and/or other medical operations may occur with the second imaging device 204 now properly positioned.
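As a hedged illustration of how the calculated distance might be presented as guiding information overlaid on the first imaging device's view, the sketch below projects real-world positions back into pixel coordinates using a calibrated pinhole model. The intrinsic matrix K and the extrinsics R and t are assumed to come from a prior camera calibration, and the function names are hypothetical rather than part of the disclosure.

```python
import numpy as np

def world_to_pixel(point_world, K, R, t):
    """Project a 3-D real-world point into pixel coordinates with a calibrated
    pinhole camera model (intrinsics K, world-to-camera rotation R and translation t)."""
    p_cam = R @ np.asarray(point_world, dtype=float) + t   # world -> camera coordinates
    uvw = K @ p_cam                                        # homogeneous pixel coordinates
    return uvw[:2] / uvw[2]                                # perspective divide -> (u, v)

def guiding_arrow(device_world, target_world, K, R, t):
    """Pixel-space start and end points of an on-screen arrow from the X-ray
    device's current position toward the selected target location."""
    return (world_to_pixel(device_world, K, R, t),
            world_to_pixel(target_world, K, R, t))
```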
Fig. 3 shows an imaging system diagram 300 according to at least one embodiment of the present disclosure. As shown, the imaging system diagram 300 may include an imaging device 302, which may be an example of the second imaging device 204 as described with reference to Fig. 2 (e.g., an X-ray machine) . In some examples, the imaging device 302 (e.g., of an X-ray system) may have a limited FOV, so it is important to ensure the imaging device 302 is properly positioned to accurately capture areas of interest of a patient 304 (e.g., or more generically, an “object” ) needed for imaging with the limited FOV. As described with reference to Fig. 2, although not shown in the example of Fig. 3, an additional imaging device (e.g., the first imaging device 202 of the Fig. 2, such as a camera) may be used to assist in placing the imaging device 302 at the correct location to capture an area or areas of interest of the patient 304.
In some examples, the additional imaging device may capture a first image (e.g., digital image or video that is output to a user interface associated with the imaging system, such as the user interface 110 as described with reference to Fig. 1) of a target environment (e.g., operation room) , where the first image includes at least the imaging device 302 and the patient 304. Subsequently, coordinates of interest can be selected on the first image that correspond to the areas of interest of the patient 304. For example, an operator may draw a target line 306 on the first image showing the areas of interest of the patient 304 (e.g., a target location for the imaging device 302 to be placed for capturing subsequent images of the areas of interest, such as X-ray images) . After the target line 306 and/or the coordinates of interest have been input or selected, a computing device and/or processor associated with the imaging system (e.g., the computing device 102 and/or the processor 104 as described with reference to Fig. 1) may calculate or determine a position 308 of the target line 306 (e.g., a set of real-world coordinates) and a position 310 of the imaging device 302. As will be described in greater detail with reference to Fig. 4, the computing device and/or processor associated with the imaging system may calculate or determine the position 308 and the position 310 based on a mapping between pixel coordinates associated with the first image and real-world coordinates of the target line 306 and the imaging device 302, respectively.
Once the position 308 and the position 310 have been calculated/determined, a distance 312 may be calculated between the position 308 of the target line 306 and the position 310 of the imaging device 302 (e.g., a distance of the target line 306 relative to a current or initial location of the imaging device 302) . Additionally, a direction for which the imaging device 302 needs to be moved to reach the target line 306 may be determined and/or calculated based on the  positions  308 and 310. The distance 312 (and the determined direction) may then be used to position the imaging device 302 at a first location corresponding to the target line 306. In some examples, the computing device and/or processor may output the distance 312 (and the determined direction) to the operator for the operator to move the imaging device 302 into the first location according to the output. Additionally or alternatively, the computing device and/or processor may autonomously move the imaging device 302 to the first location according to the distance 312. In some examples, the distance 312 (and the direction for moving the imaging device 302) may be referred to as guiding information as described herein.
As described previously with reference to Fig. 2, after the imaging device 302 has been moved to the first location, the imaging system may use additional images (e.g., a second image, a third image, etc.) captured by the additional imaging device (e.g., camera or camera system) to verify whether the imaging device 302 is accurately positioned at the target line 306. If the first location does not correspond to the target line 306, another distance may be calculated for adjusting the position of the imaging device 302, and the imaging device 302 may be moved according to this other distance (e.g., by the operator or autonomously). These steps may be repeated until the imaging device 302 is accurately positioned with respect to the target line 306. After the position of the imaging device 302 has been verified to be accurate with respect to the target line 306, the imaging device 302 may be used to capture images (e.g., X-ray images) of the areas of interest of the patient 304 (e.g., for imaging and diagnostic procedures, for surgical procedures, etc.).
Fig. 4 shows a set of coordinate mapping diagrams 400 according to at least one embodiment of the present disclosure. As described herein, an imaging system (e.g., robotic surgical imaging system) is provided that includes at least a first imaging device (e.g., camera or camera system) and a second imaging device (e.g., X-ray machine or X-ray system) , where images captured from the first imaging device and inputs on those captured images are used to accurately position the second imaging device for the second imaging device to then be able to capture additional images of areas of interest of an object in a target environment (e.g., a patient in an operating room) .
The set of coordinate mapping diagrams 400 may be used to calculate real-world coordinates for different positions in the images captured by the first imaging device. For example, real-world coordinates of a target position at which the second imaging device is to be placed may be determined from the inputs on the captured images (e.g., coordinates of interest, a target line 306 as described with reference to Fig. 3, etc.), as well as real-world coordinates of the second imaging device (e.g., using a mark on the top of the second imaging device). Subsequently, a distance between the real-world coordinates of the target position and the real-world coordinates of the second imaging device may be calculated so that the second imaging device can be moved according to the distance (e.g., autonomously or manually). Accordingly, the set of coordinate mapping diagrams 400 provided in the example of Fig. 4 may be used to map a set of pixel coordinates from the images captured by the first imaging device, corresponding to the target location and to a current or initial location of the second imaging device, to respective sets of real-world coordinates, and vice versa (e.g., from the sets of real-world coordinates to sets of pixel coordinates, for example, to display the distance for moving the second imaging device on a user interface).
The set of coordinate mapping diagrams 400 may include a first rotation diagram 402, a second rotation diagram 404, and a third rotation diagram 406. The first rotation diagram 402 may represent rotations about a first axis (e.g., x-axis) and may indicate how a second axis (e.g., y-axis) and a third axis (e.g., z-axis) are affected by the rotations about the first axis (e.g., by an angle, Φ) . This rotation about the first axis may be given as Equation (1) below:
R_x(\Phi) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\Phi & -\sin\Phi \\ 0 & \sin\Phi & \cos\Phi \end{bmatrix}    (1)
The second rotation diagram 404 may represent rotations about the second axis (e.g., y-axis) and may indicate how the first axis (e.g., x-axis) and the third axis (e.g., z-axis) are affected by the rotations about the second axis (e.g., by an angle, θ) . This rotation about the second axis may be given as Equation (2) below:
R_y(\theta) = \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix}    (2)
The third rotation diagram 406 may represent rotations about the third axis (e.g., z-axis) and may indicate how the first axis (e.g., x-axis) and the second axis (e.g., y-axis) are affected by the rotations about the third axis (e.g., by an angle, Ψ) . This rotation about the third axis may be given as Equation (3) below:
R_z(\Psi) = \begin{bmatrix} \cos\Psi & -\sin\Psi & 0 \\ \sin\Psi & \cos\Psi & 0 \\ 0 & 0 & 1 \end{bmatrix}    (3)
After using one or more of Equations (1) , (2) , and (3) for the rotations about the respective axes, a whole movement matrix can be formed, which is given below by Equation (4) :
O = \begin{bmatrix} R_{3\times 3} & T_{3\times 1} \\ 0_{1\times 3} & 1 \end{bmatrix}    (4)
Using the movement matrix and algorithm, a computing device and/or processor of the imaging system described herein can calculate an old position of the second imaging device (e.g., given by X_1, Y_1, and Z_1) and/or the target location (e.g., target position, coordinates of interest, etc., given by X_2, Y_2, and Z_2). In Equation (4), 'O' may represent a 'realized position' of a given set of coordinates; the 'R_{3×3}' matrix may represent real-time coordinates (e.g., XYZ coordinates determined from the rotation diagrams and corresponding equations); and the 'T_{3×1}' matrix may represent movement to a target location (e.g., moving from the real-time coordinates to the target location).
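As an illustrative sketch only (in Python, with hypothetical helper names, and assuming one particular composition order of the rotations, which is not specified above), the rotation matrices of Equations (1)-(3) and the movement matrix of Equation (4) might be assembled as follows:

```python
import numpy as np

def rot_x(phi):
    """Equation (1): rotation about the first (x) axis by an angle phi."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(theta):
    """Equation (2): rotation about the second (y) axis by an angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(psi):
    """Equation (3): rotation about the third (z) axis by an angle psi."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def movement_matrix(phi, theta, psi, t):
    """Equation (4): 4x4 homogeneous matrix combining R (3x3) and T (3x1).
    The z-y-x composition order is an assumption made for illustration."""
    R = rot_z(psi) @ rot_y(theta) @ rot_x(phi)
    O = np.eye(4)
    O[:3, :3] = R
    O[:3, 3] = t
    return O

# Map an old position (X_1, Y_1, Z_1) to a new position (X_2, Y_2, Z_2)
p1 = np.array([0.2, 0.4, 1.0, 1.0])                        # homogeneous coordinates
O = movement_matrix(0.0, 0.0, np.pi / 2, [0.1, 0.0, 0.0])
p2 = O @ p1
print(p2[:3])                                               # expected: [-0.3, 0.2, 1.0]
```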
The set of coordinate mapping diagrams 400 may also include a first coordinate mapping diagram 408 and a second coordinate mapping diagram 410 that can be used to map image pixel coordinates to/from real-world coordinates (e.g., using camera and/or image coordinates). For example, using the first coordinate mapping diagram 408, the following image pixel coordinate mapping relationships can be determined as given below in Equations (5), (6), and (7):
u = \frac{x}{dx} + u_0    (5)

v = \frac{y}{dy} + v_0    (6)

\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{dx} & 0 & u_0 \\ 0 & \frac{1}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}    (7)
Subsequently, using the second coordinate mapping diagram 410, a camera coordinate mapping can be determined as given below in Equation 8:
Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}    (8)
Accordingly, based on Equations (1)-(8) and the various diagrams shown in the example of Fig. 4, pixel positions of specific areas of images captured by the first imaging device (e.g., a target location input on the captured image(s), a current or initial location of the second imaging device, etc.) can be mapped to real-world coordinates, where a difference (e.g., distance) between the positions can be determined accurately. Additionally, the pixel coordinates/positions may be output to the operator of the imaging system to indicate how far the operator needs to adjust the imaging system (e.g., how far to move the second imaging device). In some examples, the transformation between real-world coordinates and the pixel coordinates may be given by Equation (9) below:
Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{dx} & 0 & u_0 \\ 0 & \frac{1}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R_{3\times 3} & T_{3\times 1} \\ 0_{1\times 3} & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}    (9)
Using Equation (9), the computing device and/or processor of the imaging system may map between real-world coordinates, camera coordinates, image coordinates, and pixel coordinates (e.g., converting among real-world coordinates, image coordinates, and pixel coordinates as needed). As shown by the transformation in Equation (9) above, the imaging system described herein may provide the operator of the imaging system with more detailed guidance on how to move the second imaging device (e.g., X-ray system or X-ray machine).
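For illustration, the following is a minimal sketch in Python (the function name world_to_pixel, its parameters, and the example values are assumptions rather than elements of the disclosed system) of the chain of Equations (5)-(9), mapping a real-world point through camera and image coordinates to pixel coordinates:

```python
import numpy as np

def world_to_pixel(p_world, R, T, f, dx, dy, u0, v0):
    """Map a real-world point to pixel coordinates through camera coordinates
    (extrinsics R, T) and image-plane coordinates (focal length f, pixel pitch
    dx/dy, principal point u0/v0), following the chain of Equations (5)-(9)."""
    Xc, Yc, Zc = R @ np.asarray(p_world, dtype=float) + np.asarray(T, dtype=float)
    x, y = f * Xc / Zc, f * Yc / Zc          # camera -> image plane (Eq. 8)
    u, v = x / dx + u0, y / dy + v0          # image plane -> pixels (Eqs. 5-7)
    return u, v

# Example: camera two metres above the point, identity rotation
u, v = world_to_pixel([0.3, -0.1, 0.0], np.eye(3), [0.0, 0.0, 2.0],
                      f=0.008, dx=5e-6, dy=5e-6, u0=960, v0=540)
print(u, v)                                  # roughly (1200, 460)
```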
Fig. 5 depicts a method 500 that may be used, for example, to identify a current position of an imaging device with respect to areas of interest of an object and to calculate a distance for moving the imaging device to capture those areas of interest more accurately.
The method 500 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above. The at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) . A processor other than any processor described herein may also be used to execute the method 500. The at least one processor may perform the method 500 by executing elements stored in a memory such as the memory 106. The elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as  shown in method 500. One or more portions of a method 500 may be performed by the processor executing any of the contents of memory, such as an image processing 120, a segmentation 122, a transformation 124, and/or a registration 128.
The method 500 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 502) . For example, the target environment may include an operation room, where the first image includes at least an image of a patient. Additionally, the first imaging device may include a camera or camera system. In some examples, the first imaging device may be situated at the top of the target environment (e.g., on the ceiling of the operating room) or may be located elsewhere in the target environment such that the first image still includes the object.
The method 500 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 504) . For example, as described with reference to Fig. 3, a target line may be selected that passes through the portion of the object, where the coordinates of interest include the target line.
The method 500 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 506) . For example, the set of real-world coordinates may be generated as described with reference to Fig. 4.
The method 500 also comprises positioning a second imaging device into a first location based on the set of real-world coordinates (step 508). That is, the second imaging device may be placed at a location corresponding to the generated real-world coordinates, which should, in turn, correspond to the coordinates of interest.
The method 500 also comprises displaying a second image captured using the second imaging device, where the second image includes at least the portion of the object (step 510) . For example, if the second imaging device is located at the coordinates of interest, the second imaging device may then be used to capture images (e.g., X-ray images) of the portion of the object (e.g., areas of interest of the patient) .
The present disclosure encompasses embodiments of the method 500 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
Fig. 6 depicts a method 600 that may be used, for example, to verify a location of the second imaging device as described herein with respect to a given set of coordinates of interest.
The method 600 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above. The at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) . A processor other than any processor described herein may also be used to execute the method 600. The at least one processor may perform the method 600 by executing elements stored in a memory such as the memory 106. The elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 600. One or more portions of a method 600 may be performed by the processor executing any of the contents of memory, such as an image processing 120, a segmentation 122, a transformation 124, and/or a registration 128.
The method 600 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 602) . The method 600 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 604) . The method 600 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 606) . The method 600 also comprises positioning the second imaging device into a first location based on the set of real-world coordinates (step 608) .
The method 600 also comprises capturing a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location (step 610). The method 600 also comprises verifying the second imaging device is at the coordinates of interest based on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof (step 612). For example, the first imaging device may be used to capture an additional image of the moved second imaging device to verify whether the second imaging device has been accurately moved to the coordinates of interest. If the second imaging device is accurately positioned (e.g., its position has been verified to be correct), the method 600 may continue to step 614. Alternatively, if the second imaging device is determined from the verifying to not be located at the coordinates of interest, the second imaging device may be positioned into a second location (e.g., autonomously or manually by an operator based on guiding information displayed for the operator).
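A minimal sketch of such a verify-and-reposition loop is shown below (in Python; position_with_verification, get_device_position, move_device, and the tolerance value are hypothetical names and parameters used only to illustrate the flow of steps 608-612):

```python
def position_with_verification(get_device_position, move_device, target,
                               tolerance=0.01, max_attempts=5):
    """Sketch of a verify-and-reposition loop: after each move, the device
    position is re-estimated (e.g., from a new image captured by the first
    imaging device) and another correction is issued until the device lies
    within the given tolerance of the coordinates of interest."""
    for _ in range(max_attempts):
        current = get_device_position()                    # e.g., from a third image
        offset = [t - c for t, c in zip(target, current)]
        distance = sum(d * d for d in offset) ** 0.5
        if distance <= tolerance:                          # position verified
            return True
        move_device(offset)                                # manual or autonomous move
    return False

# Toy usage with a simulated device that moves exactly as instructed
state = {"pos": [0.0, 0.0, 0.0]}
def read_position():
    return state["pos"]
def apply_move(offset):
    state["pos"] = [p + o for p, o in zip(state["pos"], offset)]
print(position_with_verification(read_position, apply_move, target=[0.4, -0.2, 0.0]))
```

In this sketch the loop simply stops after a fixed number of attempts; an actual system might instead alert the operator if the device cannot be brought within tolerance.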
The method 600 also comprises displaying a second image captured using the second imaging device, where the second image includes at least the portion of the object (step 614) . For  example, the second imaging device may be used to capture and display the second image after a position of the second imaging device has been verified to be accurate.
The present disclosure encompasses embodiments of the method 600 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
Fig. 7 depicts a method 700 that may be used, for example, to guide an operator of an imaging system described herein when the operator is manually moving the second imaging device to a target location.
The method 700 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above. The at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) . A processor other than any processor described herein may also be used to execute the method 700. The at least one processor may perform the method 700 by executing elements stored in a memory such as the memory 106. The elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 700. One or more portions of a method 700 may be performed by the processor executing any of the contents of memory, such as an image processing 120, a segmentation 122, a transformation 124, and/or a registration 128.
The method 700 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 702) . The method 700 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 704) . The method 700 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 706) .
The method 700 also comprises calculating a distance to move the second imaging device into a first location to capture a second image that includes at least the portion of the object, where the distance is calculated based on the coordinates of interest, an initial location of the second imaging device, the real-world coordinates, or a combination thereof (step 708) . In some examples, guiding information may be displayed to assist the operator with positioning the second imaging device, where the guiding information includes the calculated distance. Additionally, the guiding information may be displayed to the operator based on the set of pixel coordinates associated with  the first image (e.g., the calculated distance is converted back and forth between pixel coordinates, real-world coordinates, camera coordinates, and image coordinates as described with reference to Fig. 4) .
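As an illustration of converting a real-world correction into an on-screen (pixel-coordinate) displacement for display to the operator, the following sketch assumes a hypothetical 3x4 projection matrix combining the intrinsic and extrinsic parameters of Equation (9); the helper name pixel_displacement and the example matrix are illustrative only:

```python
import numpy as np

def pixel_displacement(current_world, target_world, P):
    """Project the device's current and target real-world positions with a
    3x4 projection matrix P (intrinsics x extrinsics, as in Equation (9)) and
    return the resulting on-screen displacement in pixel coordinates."""
    def project(point):
        ph = np.append(np.asarray(point, dtype=float), 1.0)   # homogeneous point
        u, v, w = P @ ph
        return np.array([u / w, v / w])
    return project(target_world) - project(current_world)

# Illustrative projection matrix for a camera mounted two metres above the table
P = np.array([[1600.0, 0.0, 960.0, 0.0],
              [0.0, 1600.0, 540.0, 0.0],
              [0.0, 0.0, 1.0, 2.0]])
print(pixel_displacement([0.5, 1.2, 0.0], [0.9, 1.0, 0.0], P))   # e.g. [320., -160.]
```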
The method 700 also comprises positioning the second imaging device into the first location based on the set of real-world coordinates (e.g., and the calculated distance) (step 710) . The method 700 also comprises displaying a second image captured using the second imaging device, the second image including at least the portion of the object (step 712) .
The present disclosure encompasses embodiments of the method 700 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
As noted above, the present disclosure encompasses methods with fewer than all of the steps identified in Figs. 5, 6, and 7 (and the corresponding description of the methods 500, 600, and 700) , as well as methods that include additional steps beyond those identified in Figs. 5, 6, and 7 (and the corresponding description of the methods 500, 600, and 700) . The present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.
The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Moreover, though the foregoing has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent  permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims (20)

  1. A robotic surgical imaging system, comprising:
    a first imaging device;
    a second imaging device;
    a processor coupled with the first imaging device and the second imaging device; and
    a memory coupled with and readable by the processor and storing therein instructions that, when executed by the processor, cause the processor to:
    capture a first image of a target environment using the first imaging device, wherein the first image comprises an object included in the target environment;
    select coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object;
    generate a set of real-world coordinates corresponding to the portion of the object based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates;
    position the second imaging device into a first location based at least in part on the set of real-world coordinates; and
    display a second image captured using the second imaging device, the second image comprising at least the portion of the object.
  2. The robotic surgical imaging system of claim 1, wherein the instructions to select the coordinates of interest included in the first image cause the processor to:
    select a target line that passes through at least the portion of the object, wherein the target line comprises the coordinates of interest.
  3. The robotic surgical imaging system of claim 1, wherein the instructions further cause the processor to:
    capture a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location; and
    verify the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof.
  4. The robotic surgical imaging system of claim 3, wherein the instructions further cause the processor to:
    position the second imaging device into a second location based at least in part on determining the second imaging device is not located at the coordinates of interest from the verifying.
  5. The robotic surgical imaging system of claim 4, wherein the instructions further cause the processor to:
    display guiding information to assist an operator with positioning the second imaging device into the second location.
  6. The robotic surgical imaging system of claim 1, wherein the instructions to generate the set of real-world coordinates corresponding to the portion of the object cause the processor to:
    calculate a distance to move the second imaging device into the first location to capture the second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial location of the second imaging device.
  7. The robotic surgical imaging system of claim 6, wherein the instructions further cause the processor to:
    display guiding information to assist an operator with positioning the second imaging device, wherein the guiding information comprises the calculated distance.
  8. The robotic surgical imaging system of claim 7, wherein the guiding information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
  9. The robotic surgical imaging system of claim 1, wherein the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
  10. A system, comprising:
    a processor; and
    a memory coupled with and readable by the processor and storing therein instructions that, when executed by the processor, cause the processor to:
    capture a first image of a target environment using a first imaging device, wherein the first image comprises an object included in the target environment;
    select coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object;
    generate a set of real-world coordinates corresponding to the portion of the object based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates;
    position a second imaging device into a first location based at least in part on the set of real-world coordinates; and
    display a second image captured using the second imaging device, the second image comprising at least the portion of the object.
  11. The system of claim 10, wherein the instructions to select the coordinates of interest included in the first image cause the processor to:
    select a target line that passes through at least the portion of the object, wherein the target line comprises the coordinates of interest.
  12. The system of claim 10, wherein the instructions further cause the processor to:
    capture a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location; and
    verify the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof.
  13. The system of claim 12, wherein the instructions further cause the processor to:
    position the second imaging device into a second location based at least in part on determining the second imaging device is not located at the coordinates of interest from the verifying.
  14. The system of claim 13, wherein the instructions further cause the processor to:
    display guiding information to assist an operator with positioning the second imaging device into the second location.
  15. The system of claim 10, wherein the instructions to generate the set of real-world coordinates corresponding to the portion of the object cause the processor to:
    calculate a distance to move the second imaging device into the first location to capture the second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial location of the second imaging device.
  16. The system of claim 15, wherein the instructions further cause the processor to:
    display guiding information to assist an operator with positioning the second imaging device, wherein the guiding information comprises the calculated distance.
  17. The system of claim 16, wherein the guiding information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
  18. The system of claim 10, wherein the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
  19. A method, comprising:
    capturing a first image of a target environment using a first imaging device, wherein the first image comprises an object included in the target environment;
    generating a set of real-world coordinates corresponding to coordinates of interest included in the first image based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates, wherein the coordinates of interest indicate at least a portion of the object; and
    displaying a second image captured using a second imaging device, the second image comprising at least the portion of the object, wherein the second imaging device is positioned into a first location based at least in part on the set of real-world coordinates.
  20. The method of claim 19, wherein the coordinates of interest comprise a target line that passes through at least the portion of the object.