CN116096322A - Systems and methods for assisting in placement of a surgical instrument into a subject - Google Patents


Info

Publication number
CN116096322A
CN116096322A (application CN202080103792.9A)
Authority
CN
China
Prior art keywords
surgical instrument
laser beam
trajectory information
planned
trajectory
Prior art date
Legal status
Pending
Application number
CN202080103792.9A
Other languages
Chinese (zh)
Inventor
赵小芬
曹建辉
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd
Publication of CN116096322A


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A61B90/13 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints guided by light, e.g. laser pointers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937 Visible markers

Abstract

The present disclosure may provide a system for assisting in the placement of a surgical instrument. The system may include an imaging component (410), a trajectory determination component (420), and a control component (430). The imaging component (410) and the control component (430) may correspond to the same coordinate system. The imaging component (410) may acquire at least one image of an object (440). The trajectory determination component (420) may determine trajectory information for placing the surgical instrument into the object (440) based on a reference entry point and a reference target point in the at least one image. The control component (430) may include a laser emitting unit configured to emit a laser beam toward the object (440) based on the trajectory information. The direction of the laser beam may coincide with a direction associated with the trajectory information. The laser beam may serve as a reference for placing the surgical instrument into the object (440).

Description

Systems and methods for assisting in placement of a surgical instrument into a subject
Technical Field
The present disclosure relates generally to medical systems, and in particular, to systems and methods for assisting in the placement of surgical instruments into a subject.
Background
During surgical treatment of a subject, a surgical instrument (e.g., a needle) should be accurately placed into the subject. Typically, trajectory information (e.g., a planned entry point, a planned target point, a planned entry angle) for placing a surgical instrument into a subject is manually determined based on an image of the subject acquired by an imaging device. The trajectory information may then be sent to a control device, which may emit a laser beam for assisting in the placement of the surgical instrument. However, because the control device is independent of the imaging device, the two do not share a coordinate system, and the trajectory information therefore cannot be used directly to assist in the placement of the surgical instrument. Moreover, manually determining the trajectory information is time-consuming. Accordingly, it is desirable to provide systems and methods for accurately and efficiently placing a surgical instrument into a subject.
Disclosure of Invention
According to one aspect of the present disclosure, a system for assisting in the placement of a surgical instrument may be provided. The system may include an imaging component, a trajectory determination component, and a control component. The imaging component and the control component may correspond to the same coordinate system. The imaging component may be configured to acquire at least one image of an object. The trajectory determination component may be configured to determine trajectory information for placing the surgical instrument into the object based on a reference entry point and a reference target point in the at least one image. The control component may include a laser emitting unit configured to emit a laser beam toward the object based on the trajectory information. The direction of the laser beam coincides with a direction associated with the trajectory information, and the laser beam serves as a reference for placing the surgical instrument into the object.
In some embodiments, the trajectory information may include a planned entry point for placing the surgical instrument on the surface of the object, a planned target point of the surgical instrument in the object, a planned trajectory between the planned entry point and the planned target point, a planned entry depth of the surgical instrument in the object, or a planned entry angle between the surgical instrument and a reference plane.
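As an illustration of how these quantities relate, the sketch below derives a planned trajectory direction, entry depth, and entry angle from a reference entry point and a reference target point expressed in the shared coordinate system. The function name, the millimeter units, and the choice of a horizontal reference plane are assumptions made for the example, not details taken from the disclosure.

```python
import math

def trajectory_info(entry, target, plane_normal=(0.0, 0.0, 1.0)):
    """Derive a planned trajectory, entry depth, and entry angle from a
    reference entry point and a reference target point given in the shared
    imaging/control coordinate system (units assumed to be millimeters)."""
    direction = [t - e for e, t in zip(entry, target)]
    depth = math.sqrt(sum(d * d for d in direction))   # planned entry depth
    unit = [d / depth for d in direction]              # planned trajectory direction
    # The angle between the trajectory and the reference plane is 90 degrees
    # minus the angle between the trajectory and the plane's normal.
    cos_to_normal = abs(sum(u * n for u, n in zip(unit, plane_normal)))
    entry_angle = 90.0 - math.degrees(math.acos(cos_to_normal))
    return {
        "entry_point": entry,
        "target_point": target,
        "direction": unit,
        "depth_mm": depth,
        "entry_angle_deg": entry_angle,
    }
```

For a vertical insertion (entry directly above the target) the entry angle evaluates to 90 degrees, matching the intuition that the instrument is perpendicular to the reference plane.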
In some embodiments, the control assembly may further include a movement unit configured to adjust the position information of the laser emitting unit based on the trajectory information, and a fastener configured to fasten the laser emitting unit.
In some embodiments, the movement unit may have six degrees of freedom.
In some embodiments, to adjust the position information of the laser emitting unit based on the trajectory information, the movement unit may be configured to cause the intersection of the laser beam and the object to coincide with the planned entry point in the trajectory information, and to make the angle between the laser beam and the reference plane equal to the planned entry angle in the trajectory information.
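The two alignment conditions above reduce to a simple geometric check: the beam must pass through the planned entry point, and its angle to the reference plane must equal the planned entry angle. The routine below is a hypothetical verification sketch, not part of the disclosed control component; all names and tolerances are assumptions.

```python
import math

def laser_aligned(laser_origin, laser_dir, entry_point, entry_angle_deg,
                  plane_normal=(0.0, 0.0, 1.0), tol=1e-6):
    """Check the two conditions for a correctly adjusted laser emitting unit:
    (1) the beam passes through the planned entry point, and (2) the angle
    between the beam and the reference plane equals the planned entry angle.
    All vectors are expressed in the shared coordinate system."""
    # Condition 1: the entry point lies on the beam line
    # (cross product of origin->entry with the beam direction is ~0).
    v = [p - o for p, o in zip(entry_point, laser_origin)]
    cross = (v[1] * laser_dir[2] - v[2] * laser_dir[1],
             v[2] * laser_dir[0] - v[0] * laser_dir[2],
             v[0] * laser_dir[1] - v[1] * laser_dir[0])
    on_beam = all(abs(c) < tol for c in cross)
    # Condition 2: angle between the beam and the reference plane.
    norm = math.sqrt(sum(d * d for d in laser_dir))
    cos_to_normal = abs(sum(d * n for d, n in zip(laser_dir, plane_normal))) / norm
    angle = 90.0 - math.degrees(math.acos(cos_to_normal))
    return on_beam and abs(angle - entry_angle_deg) < 1e-3
```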
In some embodiments, the control assembly may further include a guiding unit configured to guide placement of the surgical instrument into the subject based at least in part on the laser beam or trajectory information.
In some embodiments, the guide unit may include an aperture configured to receive the surgical instrument, one or more rails configured to adjust the position of the aperture such that the central axis of the aperture coincides with the central axis of the laser beam, and one or more fasteners configured to fasten the surgical instrument.
In some embodiments, the system may further include an automation component configured to automatically place the surgical instrument into the subject based at least in part on the laser beam or the trajectory information.
In some embodiments, to automatically place the surgical instrument into the object based at least in part on the laser beam or the trajectory information, the automation component may be configured to cause a starting point of the surgical instrument to coincide with the intersection of the laser beam and the object, align the direction of the surgical instrument with the direction of the laser beam, and place the surgical instrument into the object based on the planned entry depth in the trajectory information.
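A minimal sketch of those three steps follows, under the assumption that the automation component advances the instrument in small increments along the beam direction; the step size and function names are invented for illustration and are not taken from the disclosure.

```python
import math

def placement_plan(beam_entry, beam_dir, depth_mm, step_mm=1.0):
    """Return the sequence of instrument-tip positions for an automated
    insertion: start at the laser/skin intersection, advance along the
    beam direction, and stop at the planned entry depth."""
    norm = math.sqrt(sum(d * d for d in beam_dir))
    unit = [d / norm for d in beam_dir]                 # align with the laser beam
    positions, travelled = [], 0.0
    while travelled < depth_mm:
        travelled = min(travelled + step_mm, depth_mm)  # never overshoot the depth
        positions.append(tuple(e + u * travelled
                               for e, u in zip(beam_entry, unit)))
    return positions  # the final element is the planned target point
```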
In some embodiments, the surgical instrument may include at least one of a needle, a staple, a screw, or a drill bit.
In some embodiments, the trajectory information may be determined in real time during the surgical treatment.
According to another aspect of the present disclosure, a method for assisting in the placement of a surgical instrument may be provided. The method may include: acquiring, by an imaging component, at least one image of an object; determining, by a trajectory determination component, trajectory information for placing a surgical instrument into the object based on a reference entry point and a reference target point in the at least one image; and emitting, by a control component, a laser beam toward the object based on the trajectory information. The direction of the laser beam coincides with a direction associated with the trajectory information. The laser beam serves as a reference for placing the surgical instrument into the object. The imaging component and the control component correspond to the same coordinate system.
In some embodiments, the trajectory information may include a planned entry point for placing the surgical instrument at the surface of the object, a planned target point for placing the surgical instrument in the object, a planned trajectory between the planned entry point and the planned target point, a planned entry depth of the surgical instrument in the object, or a planned entry angle between the surgical instrument and the reference plane.
In some embodiments, the method may further comprise causing the intersection of the laser beam and the object to coincide with the planned entry point in the trajectory information, and making the angle between the laser beam and the reference plane equal to the planned entry angle.
In some embodiments, the method may further comprise automatically placing the surgical instrument into the subject based at least in part on the laser beam or the trajectory information.
In some embodiments, placing the surgical instrument into the object may include causing a starting point of the surgical instrument to coincide with the intersection of the laser beam and the object, aligning the direction of the surgical instrument with the direction of the laser beam, and placing the surgical instrument into the object based on the planned entry depth in the trajectory information.
In some embodiments, the reference entry point and the reference target point in the at least one image may be determined automatically.
In some embodiments, the surgical instrument may include at least one of a needle, a staple, a screw, or a drill bit.
In some embodiments, the method may further include determining trajectory information for placement of the surgical instrument in the subject in real-time during the surgical treatment.
According to another aspect of the present disclosure, a non-transitory computer-readable medium may be provided. The non-transitory computer-readable medium may include executable instructions that, when executed by at least one processor, direct the at least one processor to perform a method. The method may include: acquiring, by an imaging component, at least one image of an object; determining, by a trajectory determination component, trajectory information for placing a surgical instrument into the object based on a reference entry point and a reference target point in the at least one image; and emitting, by a control component, a laser beam toward the object based on the trajectory information. The direction of the laser beam coincides with a direction associated with the trajectory information. The laser beam serves as a reference for placing the surgical instrument into the object. The imaging component and the control component correspond to the same coordinate system.
Additional features of the disclosure will be set forth in part in the description which follows and in part will become apparent to those skilled in the art upon examination of the following description and the accompanying drawings or may be learned by production or operation of the embodiments. The features of the present disclosure may be implemented and realized by the practice or use of the methods, instrumentalities and combinations of various aspects of the specific embodiments described below.
Drawings
The present disclosure is further illustrated by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. The drawings are not drawn to scale. The embodiments are non-limiting, and like numerals represent like structures throughout the drawings, wherein:
FIG. 1 is a schematic diagram of an exemplary medical system shown according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram of exemplary hardware and/or software components of a computing device shown in accordance with some embodiments of the present disclosure;
FIG. 3 is a schematic diagram of exemplary hardware and/or software components of a mobile device shown in accordance with some embodiments of the present disclosure;
FIG. 4 is a schematic diagram of an exemplary medical device shown according to some embodiments of the present disclosure;
FIG. 5A is a schematic diagram of an exemplary control assembly shown according to some embodiments of the present disclosure;
FIG. 5B is an enlarged view of a portion of the control assembly of FIG. 5A, shown in accordance with some embodiments of the present disclosure;
FIG. 5C is a schematic diagram of a guidance unit of the control assembly of FIG. 5A, shown in accordance with some embodiments of the present disclosure;
FIG. 5D is a side view of the guide unit of FIG. 5C shown in accordance with some embodiments of the present disclosure;
FIG. 5E is a top view of the guide unit of FIG. 5C shown in accordance with some embodiments of the present disclosure;
FIG. 6 is a block diagram of an exemplary processing device shown in accordance with some embodiments of the present disclosure;
FIG. 7 is an exemplary flow chart for placing a surgical instrument into a subject, shown in accordance with some embodiments of the present disclosure; and
fig. 8 is a schematic diagram of an image of a subject acquired by performing intra-operative imaging of the subject, shown in accordance with some embodiments of the present disclosure.
Detailed Description
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it will be apparent to one skilled in the art that the present disclosure may be practiced without these details. In other instances, well-known methods, procedures, systems, components, and/or circuits have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the disclosure. Thus, the present disclosure is not to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the scope of the claims.
The terminology used in the present disclosure is for the purpose of describing particular example embodiments only and is not intended to limit the scope of the present disclosure. As used in this disclosure, the singular forms "a", "an" and "the" may include plural referents unless the context clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," as used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should be appreciated that the terms "system," "unit," "module," and/or "block" as used in this disclosure are one way to distinguish different components, elements, parts, or assemblies at different levels in ascending order. However, these terms may be replaced by other expressions that serve the same purpose.
The modules (or units, blocks) described in this disclosure may be implemented as software and/or hardware modules and may be stored in any type of non-transitory computer-readable medium or other storage device. In some embodiments, a software module may be compiled and linked into an executable program. It should be appreciated that software modules may be invoked by other modules or by themselves, and/or may be invoked in response to a detected event or interrupt. Software modules configured for execution on a computing device may be provided on a computer-readable medium, such as an optical disc, a digital video disc, a flash drive, a magnetic disk, or any other tangible medium, or as a digital download (and may initially be stored in a compressed or installable format that requires installation, decompression, or decryption prior to execution). Such software code may be stored, in part or in whole, on a memory device of the executing computing device for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules (e.g., circuits) may include connected or coupled logic units, such as gates and flip-flops, and/or may include programmable units, such as programmable gate arrays or processors. The modules or computing device functions described herein are preferably implemented as hardware modules, but may also be software modules. In general, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-units despite their physical organization or storage.
It will be understood that when an element, engine, module, or block is referred to as being "on," "connected," or "coupled" to another element, engine, module, or block, it can be directly on, connected, or coupled to or in communication with the other element, engine, module, or block, or intervening elements, engines, modules, or blocks may be present unless the context clearly dictates otherwise. In this disclosure, the term "and/or" includes one or more of the associated listed items or combinations.
These and other features, characteristics, and functions of the present disclosure, as well as the methods of operation and functions of the related structural elements, as well as the combination of parts and economies of manufacture, may become more apparent upon consideration of the following description of the drawings, all of which form a part of this disclosure. It is to be understood, however, that the drawings are designed solely for the purposes of illustration and description and are not intended to limit the scope of the present disclosure.
The flowcharts used in this disclosure illustrate operations performed by a system according to some embodiments of the present disclosure. It should be understood that the operations of the flowcharts need not be performed precisely in the order shown; the operations may instead be performed in reverse order or concurrently. Also, one or more other operations may be added to a flowchart, and one or more operations may be removed from it.
In the present disclosure, an object may include a biological object and/or a non-biological object. A biological object may be a human, an animal, a plant, or a specific part, organ, and/or tissue thereof, such as the head, neck, chest, heart, stomach, a blood vessel, soft tissue, a tumor, a nodule, or the like, or any combination thereof. In this disclosure, "object" and "subject" may be used interchangeably.
One aspect of the present disclosure relates to systems and methods for assisting in the placement of a surgical instrument (e.g., a needle) into a subject (e.g., a patient). The system may include an imaging component, a trajectory determination component, and a control component. The imaging component may be configured to acquire at least one image of the subject. A reference entry point and a reference target point may be determined, manually or automatically, in the at least one image. The trajectory determination component may be configured to determine trajectory information for placing the surgical instrument into the subject based on the reference entry point and the reference target point. The control component may be configured to control its laser emitting unit to emit a laser beam toward the subject based on the trajectory information. As used herein, the direction of the laser beam may coincide with a direction associated with the trajectory information. Thus, the surgical instrument may be placed into the subject based at least in part on the laser beam and/or the trajectory information.
According to the systems and methods of the present disclosure, the imaging component and the control component may correspond to the same coordinate system, and therefore the trajectory information may be determined directly based on the reference entry point and the reference target point in the at least one image, without a calibration (or registration) operation. This allows the trajectory information to be determined in a relatively short time (e.g., substantially in real time) and avoids errors that may be introduced by a calibration (or registration) operation. Accordingly, the surgical instrument can be accurately and efficiently placed into the subject based on the trajectory information.
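To see why the shared coordinate system removes a step, compare the two cases in the sketch below: with separate frames, every image-space point must pass through a calibrated rigid transform (a potential error source); with a shared frame, the point is used directly. The function and its parameters are illustrative assumptions, not part of the disclosure.

```python
def to_control_frame(point_img, rotation=None, translation=(0.0, 0.0, 0.0)):
    """Map an image-space point into the control component's frame.
    With a shared coordinate system (rotation is None) the point is used
    as-is; otherwise a calibrated rigid transform must be applied, and any
    calibration error propagates into the instrument placement."""
    if rotation is None:
        return point_img  # shared frame: no registration step, no registration error
    return tuple(
        sum(rotation[i][j] * point_img[j] for j in range(3)) + translation[i]
        for i in range(3)
    )
```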
Fig. 1 is a schematic diagram of an exemplary medical system shown according to some embodiments of the present disclosure. In some embodiments, the medical system 100 may assist in the placement of a surgical instrument (e.g., a needle) into a subject (e.g., a patient) during a surgical treatment. As shown, medical system 100 may include a medical device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150. The components of the medical system 100 may be connected in one or more of a variety of ways. For example only, as shown in fig. 1, medical device 110 may be connected to processing device 140 through network 120. As another example, medical device 110 may be directly connected to processing device 140, as indicated by the double-headed arrow in the dashed line connecting medical device 110 and processing device 140. As another example, the storage device 150 may be connected to the processing device 140 directly or through the network 120. As yet another example, terminal device 130 may be connected to processing device 140 directly (as indicated by the double-headed arrow in the dashed line connecting terminal device 130 and processing device 140) or through network 120.
The medical device 110 may be configured to acquire image data of a subject (e.g., a patient) and/or to assist in treating the subject based on the image data. In some embodiments, the medical device 110 may include an imaging component 112 and a control component 114. The imaging component 112 may be configured to acquire image data (e.g., at least one image) of the subject. The image data may reflect the anatomy of the subject and may be used to determine trajectory information (e.g., a planned entry point, a planned target point, a planned entry angle) for assisting in placing a surgical instrument (e.g., a needle) into the subject. The control component 114 may be configured to control its laser emitting unit to emit a laser beam toward the subject based on the trajectory information, wherein the direction of the laser beam may coincide with a direction associated with the trajectory information (e.g., the direction of the planned trajectory), so that the laser beam displays at least a portion of the trajectory information. Furthermore, a surgical instrument may be placed into the subject based at least in part on the laser beam and/or the trajectory information. In some embodiments, the trajectory information may be determined by the processing device 140 based on the image data. In some embodiments, the medical device 110 may include a trajectory determination component (not shown in FIG. 1) configured to determine the trajectory information based on the image data. More description of the medical device 110 may be found elsewhere in this disclosure (e.g., FIG. 4 to FIG. 5E and the descriptions thereof).
The network 120 may include any suitable network capable of facilitating the exchange of information and/or data for the medical system 100. In some embodiments, one or more components of the medical system 100 (e.g., the medical device 110, the terminal device 130, the processing device 140, or the storage device 150) may exchange information and/or data with one or more other components of the medical system 100 over the network 120. For example, the processing device 140 may send instructions for acquiring at least one image of the subject to the medical device 110 (e.g., the imaging component 112) over the network 120. As another example, the processing device 140 may obtain at least one image of the subject from the medical device 110 (e.g., the imaging component 112) via the network 120. As another example, the processing device 140 may send instructions for emitting a laser beam based on the trajectory information to the medical device 110 (e.g., the control component 114) over the network 120. In some embodiments, the network 120 may be any type of wired or wireless network, or a combination thereof. The network 120 may be and/or include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN), etc.), a wired network (e.g., Ethernet), a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, a router, a hub, a switch, a server computer, and/or any combination thereof.
By way of example only, the network 120 may include a cable network, a wireline network, a fiber optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, etc., or any combination thereof. In some embodiments, the network 120 may include one or more network access points. For example, the network 120 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points, through which one or more components of the medical system 100 may connect to the network 120 to exchange data and/or information.
The terminal device 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, etc., or any combination thereof. In some embodiments, the mobile device 131 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, etc., or any combination thereof. The smart home device may include a smart lighting device, a control device for a smart electrical appliance, a smart monitoring device, a smart television, a smart camera, an interphone, etc., or any combination thereof. The wearable device may include a smart bracelet, smart footwear, smart glasses, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, etc., or any combination thereof. The smart mobile device may include a smartphone, a personal digital assistant (PDA), a gaming device, a navigation device, a point of sale (POS) device, etc., or any combination thereof. The virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, a virtual reality patch, an augmented reality helmet, augmented reality glasses, an augmented reality patch, etc., or any combination thereof. For example, the virtual reality device and/or the augmented reality device may include Google™ Glass, an Oculus Rift, a HoloLens, a Gear VR, etc. In some embodiments, the terminal device 130 may remotely operate the medical device 110 and/or the processing device 140. In some embodiments, the terminal device 130 may operate the medical device 110 and/or the processing device 140 via a wireless connection. In some embodiments, the terminal device 130 may receive information and/or instructions entered by a user and send the received information and/or instructions to the medical device 110 or the processing device 140 over the network 120. In some embodiments, the terminal device 130 may receive data and/or information from the processing device 140.
In some embodiments, the terminal device 130 may be part of the processing device 140. In some embodiments, terminal device 130 may be omitted.
The processing device 140 may process data and/or information obtained from the medical device 110, the terminal device 130, and/or the storage device 150. For example, the processing device 140 may determine a reference entry point and a reference target point in at least one image of the object acquired by the medical device 110. As another example, the processing device 140 may determine trajectory information for assisting in placing the surgical instrument into the object based at least in part on the reference entry point and the reference target point in the at least one image. In some embodiments, the processing device 140 may be a single server or a group of servers. The server farm may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, processing device 140 may access information and/or data stored in or acquired by medical device 110, terminal device 130, and/or storage device 150 via network 120. As another example, processing device 140 may be directly connected to medical device 110 (as indicated by the dashed double-headed arrow connecting processing device 140 and medical device 110 in fig. 1), terminal device 130 (as indicated by the dashed double-headed arrow connecting processing device 140 and terminal device 130 in fig. 1), and/or storage device 150 to access stored or acquired information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
The storage device 150 may store data and/or instructions. In some embodiments, the storage device 150 may store data obtained from the medical device 110, the terminal device 130, and/or the processing device 140. For example, the processing device 140 may determine trajectory information for assisting in placing a surgical instrument into the subject based on at least one image of the subject acquired by the medical device 110, and then may store the trajectory information in the storage device 150 for further use or processing. In some embodiments, storage device 150 may store data obtained from terminal device 130 and/or processing device 140. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 140 may perform or be used to perform the exemplary methods described in this disclosure. For example, the storage device 150 may store instructions that the processing device 140 may execute or use to determine a reference entry point and a reference target point in at least one image of an object acquired by the medical device 110. As another example, the storage device 150 may store instructions that the processing device 140 may execute or use to determine trajectory information based at least in part on a reference entry point and a reference target point in at least one image. In some embodiments, the storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. Exemplary mass storage may include magnetic disks, optical disks, solid state drives, and the like. Exemplary removable memory may include flash drives, floppy disks, optical disks, memory cards, compact disks, tape, and the like. Exemplary volatile read-write memory can include Random Access Memory (RAM). 
Exemplary RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like. Exemplary ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM, and the like. In some embodiments, storage device 150 may be implemented on a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with one or more components of the medical system 100 (e.g., the medical device 110, the terminal device 130, the processing device 140). One or more components of medical system 100 may access data or instructions stored in storage device 150 through network 120. In some embodiments, the storage device 150 may be directly connected to or in communication with one or more components of the medical system 100 (e.g., the medical device 110, the processing device 140, the terminal device 130). In some embodiments, the storage device 150 may be part of the processing device 140.
In some embodiments, the medical system 100 may also include one or more power supplies (not shown in fig. 1) connected to one or more components of the medical system 100 (e.g., the medical device 110, the processing device 140, the terminal device 130, the storage device 150).
The description is intended to be illustrative, and not to limit the scope of the present disclosure. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the example embodiments described herein may be combined in various ways to obtain additional and/or alternative example embodiments. However, such changes and modifications do not depart from the scope of the present disclosure.
Fig. 2 is a schematic diagram of exemplary hardware and/or software components of a computing device according to some embodiments of the present disclosure. In some embodiments, processing device 140 may be implemented on computing device 200. As shown in fig. 2, computing device 200 may include a processor 210, memory 220, input/output (I/O) 230, and a communication port 240.
Processor 210 may execute computer instructions (program code) and perform the functions of processing device 140 in accordance with the techniques described herein. Computer instructions may include routines, programs, objects, components, signals, data structures, procedures, modules, and functions that perform particular functions described herein. In some embodiments, processor 210 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
For illustrative purposes only, only one processor is depicted in computing device 200. However, it should be noted that the computing device 200 in this disclosure may also include multiple processors, and thus operations of a method that are described in this disclosure as performed by one processor may also be performed jointly or separately by multiple processors. For example, if in the present disclosure the processor of computing device 200 performs operations A and B, it should be understood that operations A and B may also be performed jointly or separately by two different processors in computing device 200 (e.g., a first processor performing operation A and a second processor performing operation B, or the first and second processors performing operations A and B jointly).
Memory 220 may store data/information obtained from medical device 110, terminal device 130, storage device 150, or any other component of medical system 100. In some embodiments, memory 220 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. For example, mass storage may include magnetic disks, optical disks, solid state drives, and the like. Removable memory may include flash drives, floppy disks, optical disks, memory cards, compact discs, magnetic tape, and the like. Volatile read-write memory can include random access memory (RAM). The RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like. The ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM, and the like. In some embodiments, memory 220 may store one or more programs and/or instructions to perform the exemplary methods described in this disclosure. For example, the memory 220 may store a program for the processing device 140 for determining trajectory information for assisting in placing a surgical instrument into the object based on at least one image of the object.
I/O 230 may input or output signals, data, or information. In some embodiments, the I/O 230 may enable user interaction with the processing device 140. In some embodiments, I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, mouse, touch screen, microphone, trackball, etc., or a combination thereof. Exemplary output devices may include a display device, speakers, printer, projector, etc., or a combination thereof. Exemplary display devices may include a liquid crystal display (LCD), a light-emitting diode (LED) based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT), and the like, or a combination thereof.
For example only, a user (e.g., an operator) of the processing device 140 may input data related to an object (e.g., a patient) being/to be imaged/scanned via the I/O 230. The data related to the subject may include identity information (e.g., name, age, gender, medical history, contact information, physical examination results) and/or test information including the nature of the scan that must be performed. The user may also input parameters required for operation of the medical device 110, such as image contrast and/or ratio, region of interest (ROI), etc., or any combination thereof. The I/O 230 may also display images (or video) generated based on the imaging/scanning data.
Communication port 240 may be connected to a network (e.g., network 120) to facilitate data communication. The communication port 240 may establish a connection between the processing device 140 and the medical device 110, the terminal device 130, or the storage device 150. The connection may be a wired connection, a wireless connection, or a combination of both to enable data transmission and reception. The wired connection may include an electrical cable, optical cable, telephone line, etc., or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMAX, WLAN, ZigBee, a mobile network (e.g., 3G, 4G, 5G), etc., or a combination thereof. In some embodiments, the communication port 240 may be a standardized communication port, such as RS232, RS485, or the like. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
Fig. 3 is a schematic diagram of exemplary hardware and/or software components of a mobile device according to some embodiments of the present disclosure. In some embodiments, the terminal device 130 may be implemented on the mobile device 300. As shown in fig. 3, mobile device 300 may include a communication platform 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, I/O 350, memory 360, and storage 390. In some embodiments, any other suitable component, including but not limited to a system bus or controller (not shown), may also be included in mobile device 300. In some embodiments, a mobile operating system 370 (e.g., iOS, Android, Windows Phone) and one or more applications 380 may be loaded from storage 390 into memory 360 for execution by CPU 340. Application 380 may include a browser or any other suitable mobile application for receiving and presenting information related to surgical instrument placement or other information from processing device 140. User interaction with the information flow may be accomplished through I/O 350 and provided to processing device 140 and/or other components of medical system 100 through network 120.
To implement the various modules, units, and functions thereof described in this disclosure, a computer hardware platform may be used as a hardware platform for one or more of the elements described herein. The hardware elements, operating systems, and programming languages of such computers are conventional in nature, and it is assumed that those skilled in the art are sufficiently familiar with these techniques to adapt them to the surgical instrument placement described herein. A computer with user interface elements may be implemented as a personal computer (PC) or another type of workstation or terminal device, although a properly programmed computer may also act as a server. It is believed that those skilled in the art are familiar with the construction, programming, and general operation of such computer devices, and therefore the drawings should be self-explanatory.
Fig. 4 is a schematic diagram of an exemplary medical device according to some embodiments of the present disclosure. Medical device 400 may be an example of medical device 110 shown in fig. 1. As shown in fig. 4, medical device 400 may include an imaging component 410, a trajectory determination component 420, and a control component 430. Imaging assembly 410 and control assembly 430 may be examples of imaging assembly 112 and control assembly 114, respectively, shown in fig. 1.
The imaging component 410 may be configured to acquire at least one image of the subject 440. As used herein, the at least one image may reflect the anatomy of the subject 440. In some embodiments, the imaging assembly 410 may include a computed tomography (CT) device, an X-ray imaging device, a positron emission tomography (PET) device, a single-photon emission computed tomography (SPECT) device, a magnetic resonance (MR) imaging device, or the like, or any combination thereof. The at least one image may include a two-dimensional (2D) image, a three-dimensional (3D) image, a four-dimensional (4D) image, or the like, or any combination thereof.
In some embodiments, imaging assembly 410 may include a frame 412, a radiation source (e.g., an X-ray source) (not shown in fig. 4), a detection assembly (not shown in fig. 4), and a table 414. In some embodiments, the radiation source and detection assembly may be mounted on the frame 412; the table 414 may be used to hold or support the object 440. In some embodiments, the radiation source may emit a radiation beam (e.g., an X-ray beam) toward the object 440, and the radiation beam may attenuate as it passes through the object 440. The detection assembly may then receive the attenuated radiation beam and generate imaging data (which may also be referred to as an "image" for brevity) corresponding to the received radiation beam.
In some embodiments, imaging component 410 may preoperatively image subject 440 and acquire at least one image corresponding to the preoperative imaging. Thus, during a surgical treatment (e.g., biopsy, resection procedure), the at least one image may be used to assist in the placement of a surgical instrument into the subject 440. In some embodiments, the surgical instrument may include a needle, a staple, a screw, a drill, a catheter, a guidewire, a debrider, an aspirator, a handle, an introducer, an artificial disc, a shunt, a plate, a rod, or the like. In some embodiments, the surgical instrument may be made of a biocompatible material, such as a metallic material, a polymeric material, a ceramic material, a bone material, or the like, or any combination thereof.
The trajectory determination component 420 may be configured to determine trajectory information for placing the surgical instrument into the object 440 based on the reference entry point and the reference target point in the at least one image. As used herein, a reference entry point in at least one image may indicate a point at which a surgical instrument begins to enter the object 440; the reference target point in the at least one image may indicate a point at which the surgical instrument stops within the object 440 (e.g., a point in a tumor). In some embodiments, the reference entry point and/or the reference target point may be determined manually or automatically. For example, an operator (e.g., a physician) may manually mark a reference entry point and/or a reference target point in at least one image. As another example, an operator may use a virtual reality device (e.g., 3D glasses) to determine a reference entry point and/or a reference target point. As another example, the trajectory determination component 420 may automatically determine the reference entry point and/or the reference target point, for example, using a machine learning model. For example only, the machine learning model may include a convolutional neural network model, an adaptive boosting (AdaBoost) model, a gradient boosting decision tree (GBDT) model, or the like, or any combination thereof. In some embodiments, the trajectory determination component 420 may include an interface through which an operator can mark a reference entry point and/or a reference target point in at least one image.
In some embodiments, the trajectory information may include a planned entry point for placing the surgical instrument on the surface of the object 440, a planned target point for placing the surgical instrument in the object 440, a planned trajectory between the planned entry point and the planned target point, a planned entry depth of the surgical instrument in the object, a planned entry angle between the surgical instrument and a reference plane (e.g., the x-z plane shown in fig. 1) or a reference axis (e.g., the x-axis, y-axis, z-axis shown in fig. 1), etc., or any combination thereof. As used herein, a planned entry point refers to a point on the surface of the object 440 at which the surgical instrument begins to enter the object; the planned target point refers to the point within the object 440 at which the surgical instrument stopped; the planned entry depth refers to a distance (e.g., linear distance, horizontal distance, vertical distance) between the planned entry point and the planned target point or a length of the planned trajectory. In some embodiments, the planned entry point may be the same or substantially the same as the reference entry point (e.g., the distance between the two is less than a threshold). Similarly, the planning target point may be the same or substantially the same as the reference target point.
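As a concrete illustration of these quantities, the short Python sketch below computes a planned entry depth as the straight-line distance between an assumed planned entry point and planned target point, and a planned entry angle measured against the z-axis. The coordinates, units, and angle convention are hypothetical assumptions for illustration only, not part of the disclosed device.

```python
import math

def planned_depth_and_angle(entry, target):
    """Illustrative sketch: planned entry depth (straight-line distance)
    and planned entry angle relative to the z-axis, given a planned
    entry point and planned target point in an assumed common
    coordinate system (x, y, z)."""
    dx = target[0] - entry[0]
    dy = target[1] - entry[1]
    dz = target[2] - entry[2]
    # Straight-line distance between the planned entry and target points.
    depth = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Angle between the planned trajectory and the z-axis, in degrees.
    angle = math.degrees(math.acos(abs(dz) / depth))
    return depth, angle

depth, angle = planned_depth_and_angle((0.0, 0.0, 0.0), (30.0, 0.0, 40.0))
```

For an entry point at the origin and a target at (30, 0, 40) in arbitrary units, the sketch yields a depth of 50 and an entry angle of about 36.87 degrees.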
In some embodiments, the trajectory determination component 420 may be integrated into the processing device 140 or may be implemented by the processing device 140. In some embodiments, the trajectory determination component 420 may be a computing device connected to the processing device 140 or in communication with the processing device 140. In some embodiments, the trajectory determination component 420 may be a computing device independent of the processing device 140.
The control assembly 430 may be configured to provide guidance for placement of the surgical instrument into the subject 440 based on the trajectory information. In some embodiments, the control assembly 430 may include a laser emitting unit, a motion unit, a fastener, and/or a guiding unit.
The laser emitting unit may be configured to emit a laser beam toward the object 440 based on the trajectory information, which laser beam may be used as a reference for placing the surgical instrument into the object 440. In some embodiments, the direction of the laser beam coincides with a direction associated with the trajectory information. For example, the direction of the laser beam coincides with the direction of the planned trajectory. In some embodiments, the laser emitting unit may include a cross laser generator, a T laser generator, a spot laser generator, or the like, or a combination thereof.
The motion unit may be configured to adjust the position information of the laser emitting unit based on the trajectory information. For example, the motion unit may adjust the position information of the laser emitting unit so that the direction of the emitted laser beam coincides with a direction associated with the trajectory information. In some embodiments, the motion unit may adjust the position information of the laser emitting unit based on the planned entry point and/or the planned entry angle in the trajectory information. For example, the motion unit may make the intersection of the laser beam and the object 440 coincide with the planned entry point and make the angle between the laser beam and the reference plane or reference axis equal to the planned entry angle. In some embodiments, the initial position of the laser emitting unit may be determined by the trajectory determination component 420 based on the trajectory information. Further, the motion unit may adjust the initial position based on the trajectory information. In some embodiments, the motion unit may perform a rotational movement and/or a translational movement to adjust the position information of the laser emitting unit and/or the direction of the laser beam. For example, the motion unit may have six degrees of freedom, so that the position information of the laser emitting unit and/or the direction of the laser beam may be arbitrarily adjusted. In some embodiments, the motion unit may be an automated unit, e.g., a robotic arm, a robot, or the like. More description of the motion unit may be found elsewhere in this disclosure (e.g., figs. 5A-5E and the descriptions thereof).
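A minimal sketch of how such an alignment might be computed, assuming the planned trajectory is given by an entry point and a target point in a common coordinate system: the azimuth (rotation about the z-axis) and tilt (rotation away from the z-axis) below are illustrative conventions, not the disclosed motion unit's actual control interface.

```python
import math

def laser_alignment_angles(entry, target):
    """Illustrative sketch: angles a motion unit might apply so the
    laser beam direction coincides with the planned trajectory,
    under an assumed (azimuth, tilt) parameterization."""
    dx = target[0] - entry[0]
    dy = target[1] - entry[1]
    dz = target[2] - entry[2]
    # Rotation about the z-axis (azimuth), in degrees.
    azimuth = math.degrees(math.atan2(dy, dx))
    # Rotation away from the z-axis (tilt), in degrees.
    tilt = math.degrees(math.atan2(math.hypot(dx, dy), dz))
    return azimuth, tilt
```

A trajectory running diagonally at equal x and y offsets with a matching vertical drop, for instance, corresponds to a 45-degree azimuth and a 45-degree tilt under this convention.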
The fastener may be configured to fasten the laser emitting unit to the control assembly 430. In some embodiments, the fasteners may include snaps, screws, nuts, bolts, washers, sealants, sealing tape, and the like, or any combination thereof.
The guiding unit may be configured to guide the placement of the surgical instrument into the object 440 based at least in part on the laser beam and/or the trajectory information. In some embodiments, the guiding unit may include a hole, one or more rails, and one or more fasteners. The hole may be configured to receive the surgical instrument. In some embodiments, the diameter of the hole may be the same as or slightly larger than the diameter of the surgical instrument. The one or more rails may be configured to adjust the position of the hole such that the central axis of the hole coincides with the central axis of the laser beam. The surgical instrument may then be placed into the hole. The one or more fasteners may be configured to fasten the guiding unit and/or the surgical instrument. For example, the one or more fasteners may include snaps, screws, nuts, bolts, washers, sealants, sealing tape, and the like, or any combination thereof. Further description of the guiding unit may be found elsewhere in this disclosure (e.g., figs. 5A-5E and the descriptions thereof).
In some embodiments, the medical device 400 may further include an automated component (e.g., a robotic arm, a robot) (not shown in fig. 4) configured to automatically place the surgical instrument into the subject 440 based at least in part on the laser beam and/or the trajectory information. For example, the automated component may place the surgical instrument into the object 440 along the direction of the laser beam until the surgical instrument reaches the planned target point within the object 440. As another example, the automated component may align the starting point of the surgical instrument with the intersection of the laser beam and the object 440 and align the direction of the surgical instrument with the direction of the laser beam. Further, the automated component can place the surgical instrument into the subject 440 based on the planned entry depth in the trajectory information. In some embodiments, the surgical instrument may be manually placed into the subject 440 by an operator (e.g., a doctor).
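The advance-along-the-beam behavior can be sketched as simple vector arithmetic. The function below is a hypothetical illustration, not the disclosed automated component: it returns the instrument tip position after advancing a given length along the normalized beam direction from the entry point.

```python
import math

def instrument_tip_position(entry, direction, inserted_length):
    """Illustrative sketch: tip position after advancing
    `inserted_length` along the beam direction from the entry point.
    The direction vector is normalized first, so any nonzero vector
    along the beam may be passed in."""
    norm = math.sqrt(sum(c * c for c in direction))
    return tuple(e + inserted_length * c / norm
                 for e, c in zip(entry, direction))

# Advancing 5 units straight along the z-axis from the origin.
tip = instrument_tip_position((0.0, 0.0, 0.0), (0.0, 0.0, 2.0), 5.0)
```

Stopping the advance once `inserted_length` equals the planned entry depth corresponds to the instrument reaching the planned target point.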
In some cases, due to unexpected circumstances such as a system failure, the trajectory information determined by the trajectory determination component 420 may be inaccurate. To ensure that the surgical instrument can be accurately placed in the subject 440, the operator may check whether adjustment of the trajectory information is required. In response to determining that the trajectory information needs to be adjusted, the operator may instruct the trajectory determination component 420 to re-determine new trajectory information, or the operator may manually modify the trajectory information. Furthermore, a laser beam may be emitted based on the new or modified trajectory information, and the surgical instrument may be placed into the object 440 based at least in part on the laser beam and/or the new or modified trajectory information.
In some embodiments, imaging assembly 410 and control assembly 430 may correspond to the same coordinate system, and thus, without a calibration operation (or registration operation), trajectory information determined based on at least one image acquired by imaging assembly 410 may be used directly to assist in placement of a surgical instrument into subject 440. In other words, the trajectory information may be determined directly based on the reference entry point and the reference target point in the at least one image without a calibration operation (or registration operation). Further, in this way, the trajectory information can be determined in a relatively short time (e.g., substantially in real time), and errors that may occur in a calibration operation (or registration operation) can be avoided.
In some embodiments, in order for imaging assembly 410 and control assembly 430 to correspond to the same coordinate system, a pre-calibration operation (or pre-registration operation) may be performed when imaging assembly 410 and control assembly 430 are installed or assembled. As shown in fig. 4, the control assembly 430 may be mounted on the imaging assembly 410.
In some embodiments, imaging assembly 410 may intraoperatively image subject 440 during the surgical treatment and acquire at least one second image. The at least one second image may reflect anatomy of at least a portion (e.g., organ, tissue) of the subject 440 and a portion of a surgical instrument in the subject 440. The at least one second image may be used to monitor positional information of the surgical instrument in the subject 440 during the surgical treatment. In some embodiments, it may be determined whether the positional information of the surgical instrument deviates from the trajectory information. In response to determining that the positional information of the surgical instrument deviates from the trajectory information, the surgical instrument may be immediately adjusted. In some embodiments, it may be determined whether a deviation of the surgical instrument (e.g., an angular difference between a planned entry angle and an actual entry angle of the surgical instrument) is within a threshold. In response to determining that the deviation is within the threshold, positional information of the surgical instrument may be adjusted based on the deviation, the trajectory information, and/or the laser beam. For example, if the actual entry angle is offset to the left by one degree (within a threshold), the positional information of the surgical instrument may be adjusted by offsetting the surgical instrument to the right by one degree. In response to determining that the deviation exceeds the threshold, the surgical instrument may be pulled from the object 440 and repositioned into the object 440 based at least in part on the trajectory information and/or the laser beam.
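The threshold logic described above can be sketched as a small decision rule. The one-degree threshold and the angle values below are illustrative assumptions, not prescribed values of the disclosed system.

```python
def deviation_action(planned_angle, actual_angle, threshold=1.0):
    """Illustrative sketch of the deviation check: if the angular
    difference between the planned and actual entry angles is within
    the (assumed) threshold, return the corrective offset to apply in
    the opposite direction; otherwise signal that the instrument
    should be withdrawn and repositioned."""
    deviation = actual_angle - planned_angle
    if abs(deviation) <= threshold:
        # Correct by offsetting the instrument in the opposite direction.
        return ("adjust", -deviation)
    return ("reposition", None)

deviation_action(30.0, 31.0)  # ("adjust", -1.0): offset back by one degree
deviation_action(30.0, 33.0)  # ("reposition", None): withdraw and re-place
```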
In some cases, the position of at least a portion of subject 440 may change during surgical treatment due to movement of subject 440 or movement of internal organs or tissues of subject 440, such as heart movement, movement of other organs or tissues affected by heart movement, respiratory movement (e.g., movement of the lungs and/or diaphragm), movement of other organs or tissues affected by respiratory movement, blood flow movement, vascular pulsation, muscle contraction, muscle relaxation, pancreatic secretion activity, and the like, or any combination thereof. Thus, to prevent accidental injury to an organ or tissue of subject 440 during surgical treatment, trajectory information may be determined by trajectory determination component 420 in view of the motion information. For example, during a surgical treatment, motion information of subject 440 may be monitored based on at least one second image acquired by imaging assembly 410 and/or other sensed information detected by a sensing device (e.g., motion sensor, distance sensor), and then trajectory information may be adjusted in real-time or substantially real-time based on the motion information.
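As a simplified sketch of such a real-time adjustment, the function below shifts the planned entry and target points by a rigid translation obtained from a hypothetical motion-sensor reading. Real anatomical motion is generally non-rigid, so this is an illustration of the idea only.

```python
def shift_trajectory(entry, target, motion_offset):
    """Illustrative sketch: adjust a planned trajectory by a rigid
    translation (an assumed simplification of subject motion), keeping
    the trajectory direction and planned entry depth unchanged."""
    def shifted(point):
        return tuple(c + o for c, o in zip(point, motion_offset))
    return shifted(entry), shifted(target)

# A 2-unit shift along the y-axis moves both planned points together.
new_entry, new_target = shift_trajectory(
    (0.0, 0.0, 0.0), (30.0, 0.0, 40.0), (0.0, 2.0, 0.0))
```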
According to the system described in the present disclosure, since the imaging component 410 and the control component 430 correspond to the same coordinate system, a calibration operation (or registration operation) is not required, and trajectory information can be determined based on the reference entry point and the reference target point in a relatively short time (e.g., substantially in real time), which can improve processing efficiency. Furthermore, the laser beam may visualize at least a portion of the trajectory information, and thus the surgical instrument may be accurately placed into the object based on the trajectory information, and the difference between the planned entry angle in the trajectory information and the actual entry angle of the surgical instrument may be controlled to within one degree. Further, after determining the reference entry point and the reference target point, the medical device 400 may automatically determine trajectory information and emit a laser beam based on the trajectory information, which may effectively reduce manual operations.
In some embodiments, the components of medical device 400 may be connected to each other in any suitable manner. For example, imaging component 410 may be connected to trajectory determination component 420 directly or through a network (e.g., network 120 shown in fig. 1), trajectory determination component 420 may be connected to control component 430 directly or through a network, etc. Accordingly, the components of the medical device 400 may communicate with each other to facilitate control of surgical instrument placement.
It should be noted that the medical device 400 is for illustrative purposes only and is not intended to limit the scope of the present disclosure. Many variations or modifications may be made by one of ordinary skill in the art in light of the teachings of the disclosure. However, such changes and modifications do not depart from the scope of the present disclosure.
Fig. 5A is a schematic diagram of an exemplary control assembly according to some embodiments of the present disclosure. Fig. 5B is an enlarged view of a portion of the control assembly of fig. 5A according to some embodiments of the present disclosure. Fig. 5C is a schematic diagram of a guiding unit of the control assembly of fig. 5A according to some embodiments of the present disclosure. Fig. 5D is a side view of the guiding unit of fig. 5C according to some embodiments of the present disclosure. Fig. 5E is a top view of the guiding unit of fig. 5C according to some embodiments of the present disclosure. The control assembly 500 shown in figs. 5A-5E may be an example of the control assembly 430 shown in fig. 4 or the control assembly 114 shown in fig. 1. As shown in fig. 5A, the control assembly 500 may include a laser emitting unit 510, a fastener 520, a motion unit 530, and a guiding unit 540.
The laser emitting unit 510 may be configured to emit a laser beam 550. The fastener 520 may be configured to fasten the laser emitting unit 510 to the control assembly 500. The motion unit 530 may be configured to adjust the position information of the laser emitting unit 510 so that the direction of the laser beam 550 coincides with a direction (e.g., the direction of the planned trajectory) associated with the trajectory information for placing the surgical instrument 570 into the object 560. In some embodiments, motion unit 530 may be a robotic arm having six degrees of freedom, i.e., three translational degrees of freedom along three coordinate axes (i.e., the x-axis, the y-axis, and the z-axis) and three rotational degrees of freedom about the three coordinate axes.
The guiding unit 540 may be configured to guide the placement of the surgical instrument 570 into the object 560 based at least in part on the laser beam 550 and/or the trajectory information. As shown in fig. 5C, the guiding unit 540 may include a hole 541, one or more rails 542 (e.g., rail 5421, rail 5422, rail 5423), and one or more fasteners 543 (e.g., fastener 5431, fastener 5432, and fastener 5433). The hole 541 may be configured to receive the surgical instrument 570. The one or more rails 542 may be configured to adjust the position of the hole 541 such that the central axis of the hole 541 coincides with the central axis of the laser beam 550. For example, the position of the hole 541 may be adjusted by sliding the rails 5421 and 5422 along the rail 5423. The one or more fasteners 543 may be configured to fasten the guiding unit 540 and/or the surgical instrument 570. For example, the fasteners 5432 and 5433 may be configured to fasten the surgical instrument 570 in the hole 541; the fastener 5431 may be configured to fasten the guiding unit 540 to the object 560.
Fig. 6 is a block diagram of an exemplary processing device shown in accordance with some embodiments of the present disclosure. The processing device 140 may include an image acquisition module 610, a trajectory determination module 620, and a beam emission module 630.
The image acquisition module 610 may be configured to acquire at least one image of an object (e.g., the object 440 shown in fig. 4) from an imaging component (e.g., the imaging component 410 shown in fig. 4). In some embodiments, the image acquisition module 610 may send instructions to the imaging component for acquiring the at least one image, and the imaging component may acquire the at least one image accordingly. The at least one image may reflect an anatomical structure of the subject. More description about at least one image may be found elsewhere in this disclosure (e.g., fig. 4 and 8 and descriptions thereof).
The trajectory determination module 620 may be configured to determine trajectory information for placing the surgical instrument into the object based on the reference entry point and the reference target point in the at least one image. For example, the trajectory determination module 620 may send instructions for determining trajectory information to a trajectory determination component (e.g., the trajectory determination component 420 shown in fig. 4), and the trajectory determination component may determine trajectory information accordingly. As used herein, a reference entry point in at least one image may indicate a point at which a surgical instrument begins to enter a subject; the reference target point in the at least one image may indicate a point at which the surgical instrument stopped within the object (e.g., a point in a tumor). In some embodiments, the reference entry point and/or the reference target point may be determined by the trajectory determination module 620.
In some embodiments, the trajectory information may include a planned entry point for placing the surgical instrument on the surface of the object, a planned target point for placing the surgical instrument in the object, a planned trajectory between the planned entry point and the planned target point, a planned entry depth of the surgical instrument in the object, a planned entry angle between the surgical instrument and a reference plane (e.g., the x-z plane shown in fig. 1) or a reference axis (e.g., the x-axis, y-axis, z-axis shown in fig. 1), etc., or any combination thereof. More description about reference entry points, reference target points, and/or trajectory information may be found elsewhere in this disclosure (e.g., fig. 4 and 7 and descriptions thereof).
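To make the relationship among these quantities concrete, the sketch below derives a planned entry depth and a planned entry angle from a planned entry point and a planned target point. It is a minimal illustrative sketch, not the disclosed implementation; the function name, the coordinates, and the choice of the y-axis as the reference axis are assumptions.

```python
import math

def trajectory_info(entry_point, target_point):
    """Given a planned entry point and a planned target point (x, y, z),
    return the planned entry depth (the linear distance between them)
    and the planned entry angle relative to the y-axis, in degrees."""
    vx, vy, vz = (t - e for t, e in zip(target_point, entry_point))
    depth = math.sqrt(vx * vx + vy * vy + vz * vz)
    angle = math.degrees(math.acos(abs(vy) / depth))  # angle to the y-axis
    return depth, angle

depth, angle = trajectory_info((0.0, 0.0, 0.0), (0.0, 3.0, 4.0))
# depth = 5.0; angle = acos(3/5) ≈ 53.13 degrees
```

A planned trajectory would then be the line segment between the two points, with the depth as its length.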
The beam emission module 630 may be configured to cause a control assembly (e.g., the control assembly 430 depicted in fig. 4) to emit a laser beam toward the subject based on the trajectory information; the laser beam may be used as a reference for placing the surgical instrument into the subject. In some embodiments, the beam emission module 630 may send instructions to the control assembly for emitting the laser beam, and the control assembly may emit the laser beam accordingly.
In some embodiments, the direction of the laser beam may coincide with a direction associated with the trajectory information. For example, the direction of the laser beam may coincide with the direction of the planned trajectory. In some embodiments, the laser beam may be emitted based on the planned entry point and/or the planned entry angle in the trajectory information. For example, the intersection of the laser beam with the object may be made to coincide with the planned entry point, and the angle between the laser beam and the reference plane or reference axis may be made equal to the planned entry angle.
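One simple way to state the "coincides" condition above is that the unit vectors of the two directions agree. The sketch below is illustrative only; the function names and tolerance are assumptions.

```python
import math

def unit(v):
    """Normalize a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def directions_coincide(laser_dir, planned_dir, tol=1e-6):
    """Two directions coincide when their unit vectors agree, i.e. the
    dot product of the unit vectors is (numerically) 1."""
    u, w = unit(laser_dir), unit(planned_dir)
    dot = sum(a * b for a, b in zip(u, w))
    return abs(dot - 1.0) < tol

aligned = directions_coincide((0.0, 2.0, 0.0), (0.0, 5.0, 0.0))  # parallel, same sense
```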
The modules in the processing device 140 may be connected to or communicate with each other via a wired or wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof. The wireless connection may include a Local Area Network (LAN), a Wide Area Network (WAN), Bluetooth, ZigBee, Near Field Communication (NFC), or the like, or any combination thereof.
It should be noted that the above description is for illustrative purposes only and is not intended to limit the scope of the present disclosure. Many variations and modifications will be apparent to those of ordinary skill in the art in light of the teachings of this disclosure. However, such changes and modifications do not depart from the scope of the present disclosure. For example, the processing device 140 may include a placement module (not shown in fig. 6) configured to place the surgical instrument into the subject based at least in part on the laser beam and/or the trajectory information. As another example, processing device 140 may include a storage module (not shown) configured to store data generated by the modules described above. As another example, two or more of the above modules may be combined into a single module, or any of the above modules may be divided into two or more units.
Fig. 7 is a flowchart of an exemplary process for placing a surgical instrument into a subject according to some embodiments of the present disclosure. In some embodiments, the process 700 may be provided as a set of instructions stored in the memory 220. The processor 210 and/or the modules in fig. 6 may execute the instructions and, when executing the instructions, may be configured to perform the process 700. The operations of the process shown below are intended to be illustrative. In some embodiments, the process 700 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Further, the order of the operations of the process shown in fig. 7 and described below is not intended to be limiting.
At step 710, the processing device 140 (e.g., the image acquisition module 610) (e.g., the interface circuitry of the processor 320) may acquire at least one image of an object (e.g., the object 440 shown in fig. 4) from an imaging component (e.g., the imaging component 410 shown in fig. 4). In some embodiments, the processing device 140 may send instructions to the imaging component for acquiring the at least one image, and the imaging component may acquire the at least one image accordingly. The at least one image may reflect an anatomical structure of the subject. More description about at least one image may be found elsewhere in this disclosure (e.g., fig. 4 and its description).
In some embodiments, the subject should be registered with the medical system 100 prior to acquiring the at least one image. In some embodiments, the registration information may include the name, age, gender, medical history, contact information, physical examination results, etc. of the subject.
In step 720, the processing device 140 (e.g., the trajectory determination module 620) (e.g., the processing circuitry of the processor 320) may determine trajectory information for placement of the surgical instrument into the subject based on the reference entry point and the reference target point in the at least one image. For example, the processing device 140 may send instructions for determining the trajectory information to a trajectory determination component (e.g., the trajectory determination component 420 shown in fig. 4), and the trajectory determination component may determine the trajectory information accordingly.
As used herein, a reference entry point in at least one image may indicate a point at which a surgical instrument begins to enter a subject; the reference target point in the at least one image may indicate a point at which the surgical instrument stopped within the object (e.g., a point in a tumor). In some embodiments, the reference entry point and/or the reference target point may be determined manually or automatically.
In some embodiments, the trajectory information may include a planned entry point for placing the surgical instrument on the surface of the object, a planned target point for placing the surgical instrument in the object, a planned trajectory between the planned entry point and the planned target point, a planned entry depth of the surgical instrument in the object, a planned entry angle between the surgical instrument and a reference plane (e.g., the x-z plane shown in fig. 1) or a reference axis (e.g., the x-axis, y-axis, z-axis shown in fig. 1), etc., or any combination thereof. As used herein, a planned entry point refers to a point on the surface of the subject where the surgical instrument begins to enter the subject; the planned target point refers to a point at which the surgical instrument stops in the subject; the planned entry depth refers to a distance (e.g., linear distance, horizontal distance, vertical distance) between the planned entry point and the planned target point or a length of the planned trajectory. In some embodiments, the planned entry point may be the same or substantially the same as the reference entry point (e.g., the distance between the two is less than a threshold). Similarly, the planning target point may be the same or substantially the same as the reference target point. More description about reference entry points, reference target points, and/or trajectory information may be found elsewhere in this disclosure (e.g., fig. 4 and description thereof).
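The "substantially the same" test mentioned above can be sketched as a distance-to-threshold comparison. The threshold value and units below are assumed for illustration only.

```python
import math

def substantially_same(planned_point, reference_point, threshold=1.0):
    """A planned point is treated as substantially the same as the
    reference point when the distance between them is below a threshold
    (an assumed value, e.g., in millimeters)."""
    return math.dist(planned_point, reference_point) < threshold

same = substantially_same((10.0, 20.0, 5.0), (10.4, 20.0, 5.0))  # distance 0.4 < 1.0
```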
At step 730, the processing device 140 (e.g., the beam emission module 630) (e.g., the processing circuitry of the processor 320) may cause a control assembly (e.g., the control assembly 430 depicted in fig. 4) to emit a laser beam toward the subject based on the trajectory information; the laser beam may be used as a reference for placing the surgical instrument into the subject. In some embodiments, the processing device 140 may send instructions to the control assembly for emitting the laser beam, and the control assembly may emit the laser beam accordingly.
In some embodiments, the direction of the laser beam may coincide with a direction associated with the trajectory information. For example, the direction of the laser beam may coincide with the direction of the planned trajectory. In some embodiments, the control assembly may emit the laser beam based on the planned entry point and/or the planned entry angle in the trajectory information. For example, the control assembly may cause the intersection of the laser beam and the object to coincide with the planned entry point and cause the angle between the laser beam and the reference plane or reference axis to be equal to the planned entry angle.
In some embodiments, the surgical instrument may be placed in the subject manually or automatically based at least in part on the laser beam and/or trajectory information. The surgical instrument may be placed by coinciding the starting point of the surgical instrument with the intersection of the laser beam and the object and by coinciding the surgical instrument direction with the laser beam direction. Furthermore, the surgical instrument may be placed into the subject based on the planned entry depth in the trajectory information.
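The placement just described — start at the laser-object intersection, follow the beam direction, and stop at the planned entry depth — can be sketched as a vector advance. The names and coordinates below are illustrative assumptions.

```python
import math

def advance_along_beam(entry_point, beam_dir, planned_depth):
    """Compute the planned tip position by advancing from the entry
    point (the laser-object intersection) along the beam direction by
    the planned entry depth."""
    n = math.sqrt(sum(c * c for c in beam_dir))
    return tuple(p + planned_depth * c / n
                 for p, c in zip(entry_point, beam_dir))

tip = advance_along_beam((0.0, 0.0, 0.0), (0.0, 3.0, 4.0), 5.0)
# the beam direction (0, 3, 4) has length 5, so the tip lands at (0, 3, 4)
```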
In some cases, the trajectory information determined in step 720 may be inaccurate due to some unexpected circumstance, such as a system failure. To ensure that the surgical instrument can be accurately placed into the subject, the operator may check whether the trajectory information needs to be adjusted. In response to determining that the trajectory information needs to be adjusted, the operator may instruct the processing device 140 to re-determine new trajectory information, or the operator may modify the trajectory information. Further, in step 730, the laser beam may be emitted based on the new or modified trajectory information, and the surgical instrument may be placed based at least in part on the laser beam and/or the new or modified trajectory information.
In some embodiments, the processing device 140 may control the imaging assembly to image the subject intraoperatively during the surgical treatment and acquire at least one second image. The at least one second image may reflect the anatomy of at least a portion of the subject (e.g., an organ, tissue) and the portion of the surgical instrument in the subject. The at least one second image may be used to monitor positional information of the surgical instrument in the subject during the surgical treatment. In some embodiments, the processing device 140 may determine whether the positional information of the surgical instrument deviates from the trajectory information. In response to determining that the positional information of the surgical instrument deviates from the trajectory information, the surgical instrument may be adjusted immediately. In some embodiments, the processing device 140 may determine whether a deviation of the surgical instrument (e.g., an angular difference between the planned entry angle and the actual entry angle of the surgical instrument) is within a threshold. In response to determining that the deviation is within the threshold, the positional information of the surgical instrument may be adjusted based on the deviation, the trajectory information, and/or the laser beam. For example, if the actual entry angle is offset to the left by one degree (within the threshold), the positional information of the surgical instrument may be adjusted by offsetting the surgical instrument to the right by one degree. In response to determining that the deviation exceeds the threshold, the surgical instrument may be withdrawn from the subject and repositioned based at least in part on the trajectory information and/or the laser beam.
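The deviation-handling logic above (correct within the threshold, otherwise withdraw and reposition) can be sketched as follows. The sign convention and the two-degree threshold are assumptions for illustration, not values from the disclosure.

```python
def correct_or_reposition(planned_angle, actual_angle, threshold=2.0):
    """Decide how to handle a deviation between planned and actual entry
    angles (degrees). Within the threshold, return the corrective offset
    (the opposite of the deviation); otherwise signal withdrawal and
    repositioning. The 2-degree threshold is an assumed value."""
    deviation = actual_angle - planned_angle
    if abs(deviation) <= threshold:
        return ("adjust", -deviation)
    return ("reposition", None)

small = correct_or_reposition(30.0, 29.0)   # 1 degree off: corrective offset of +1
large = correct_or_reposition(30.0, 35.0)   # 5 degrees off: exceeds the threshold
```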
In some cases, the position of at least a portion of the subject may change during the surgical treatment due to movement of the subject or movement of an internal organ or tissue of the subject, such as movement of the heart, movement of other organs or tissues affected by movement of the heart, movement of breathing (e.g., movement of the lungs and/or diaphragm), movement of other organs or tissues affected by breathing, movement of blood flow, movement caused by vascular pulsations, muscle contraction, muscle relaxation, pancreatic secretion activity, and the like, or any combination thereof. Thus, to prevent accidental injury to an organ or tissue of a subject during surgical treatment, the processing device 140 may determine trajectory information in view of the motion information. For example, during a surgical treatment, motion information of the subject may be monitored based on at least one second image acquired by the imaging assembly and/or other sensed information detected by the sensing device (e.g., motion sensor, distance sensor), and then the trajectory information may be adjusted in real-time or substantially real-time based on the motion information. More description about laser beam and surgical instrument placement can be found elsewhere in this disclosure (e.g., fig. 4-5E and descriptions thereof).
It should be noted that the foregoing description is provided for the purpose of illustration only and is not intended to limit the scope of the present disclosure. Many variations or modifications may be made by one of ordinary skill in the art in light of the teachings of the disclosure. However, such changes and modifications do not depart from the scope of the present disclosure. For example, for purposes of illustration, the procedure for assisting in the placement of a surgical instrument into a subject is described above as being performed by the processing device 140 (i.e., the processing device 140 acts as a central control component); in actual practice, the procedure may be performed by separate components. For example only, the imaging component may acquire at least one image of the object, the trajectory determination component may determine trajectory information for placing the surgical instrument into the object based on the reference entry point and the reference target point in the at least one image, and the control component may emit the laser beam toward the object based on the trajectory information. Furthermore, the surgical instrument may be placed into the subject based on the laser beam and/or the trajectory information.
Fig. 8 is a schematic diagram of an image of a subject acquired by performing intra-operative imaging of the subject, shown in accordance with some embodiments of the present disclosure.
As described in connection with fig. 1 and 4-7, trajectory information for placement of a surgical instrument into an object may be determined based on at least one image of the object (which may be acquired by performing preoperative imaging of the object), a laser beam may be emitted to visualize at least a portion of the trajectory information, and then the surgical instrument may be placed into the object based at least in part on the laser beam and/or the trajectory information. At least one second image may be acquired by intra-operatively imaging the subject during the surgical treatment to monitor positional information of the surgical instrument in the subject.
As shown in fig. 8, a point 820 represents the planned entry point in the trajectory information, a point 830 represents the planned target point in the trajectory information, a line segment 840 connecting the point 820 and the point 830 represents the planned trajectory (whose length represents the planned entry depth) in the trajectory information, and an angle α between the line segment 840 and the y-axis represents the planned entry angle in the trajectory information. The surgical instrument may be placed into the subject based on the point 820, the point 830, the line segment 840, and the angle α. Specifically, a starting point of the surgical instrument (e.g., the end of the surgical instrument near the object) is placed at the point 820, the direction of the surgical instrument is made to coincide with the direction of the laser beam (i.e., the other end of the surgical instrument, away from the object, is placed along the laser beam), and then the surgical instrument enters the object.
Further, line segment 850 represents positional information of the surgical instrument in the subject during the surgical treatment. It can be seen that the positional information of the surgical instrument deviates from the trajectory information, e.g. the actual entry angle β of the surgical instrument deviates from the planned entry angle α. In order to accurately place the surgical instrument, the positional information of the surgical instrument may be immediately adjusted so that the actual entry angle is equal to the planned entry angle.
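The angles α and β in fig. 8 can be computed from the endpoints of the corresponding line segments in image coordinates. The 2-D sketch below measures the angle to the y-axis; the function name and coordinates are illustrative assumptions.

```python
import math

def entry_angle_to_y_axis(start, end):
    """Angle in degrees between a 2-D line segment and the y-axis,
    as with the angles alpha and beta in fig. 8."""
    vx, vy = end[0] - start[0], end[1] - start[1]
    return math.degrees(math.atan2(abs(vx), abs(vy)))

alpha = entry_angle_to_y_axis((0.0, 0.0), (3.0, 4.0))  # planned trajectory
beta = entry_angle_to_y_axis((0.0, 0.0), (4.0, 3.0))   # actual instrument segment
deviation = beta - alpha  # the correction needed so that beta equals alpha
```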
While the basic concepts have been described above, it will be apparent to those of ordinary skill in the art upon reading this disclosure that the foregoing is by way of example only and does not limit the present disclosure. Although not explicitly described herein, various modifications, improvements, and adaptations of the present disclosure may occur to one of ordinary skill in the art. Such modifications, improvements, and adaptations are suggested in this disclosure, and are therefore intended to be within the spirit and scope of the exemplary embodiments of this disclosure.
Meanwhile, the present disclosure uses specific words to describe embodiments of the present disclosure. For example, "one embodiment," "an embodiment," and/or "some embodiments" means a particular feature, structure, or characteristic associated with at least one embodiment of the present disclosure. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this disclosure are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present disclosure may be combined as suitable.
Furthermore, those of ordinary skill in the art will appreciate that aspects of the disclosure may be illustrated and described in terms of several patentable categories or circumstances, including any novel and useful process, machine, product, or composition of matter, or any novel and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or through a combination of software and hardware that may generally be referred to herein as a "unit," "module," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied therein.
The computer readable signal medium may comprise a propagated data signal with computer program code embodied therein, for example, on baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, etc., or any suitable combination. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer readable signal medium may be propagated through any suitable medium including radio, cable, fiber optic cable, RF, etc., or any combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages (such as Python, Ruby, and Groovy), or other programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, etc. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through a network provided by a network service provider), or the service may be provided in a cloud computing environment, for example, as Software as a Service (SaaS).
Furthermore, the order in which the elements and sequences are processed, the use of numbers, letters, or other designations in the disclosure is not intended to limit the order in which the processes and methods of the disclosure are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments; on the contrary, they are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments of the disclosure. For example, while the various components described above may be embodied in hardware devices, they may also be implemented as a software-only solution, e.g., an installation on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation of the disclosure and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, claimed subject matter may lie in fewer than all features of a single embodiment described above.

Claims (20)

1. A system for assisting in the placement of a surgical instrument, comprising an imaging assembly, a trajectory determination assembly, and a control assembly, the imaging assembly and the control assembly corresponding to the same coordinate system, wherein,
the imaging assembly is configured to acquire at least one image of a subject;
the trajectory determination component is configured to determine trajectory information for placing the surgical instrument into the object based on a reference entry point and a reference target point in the at least one image; and
the control assembly includes a laser emitting unit configured to emit a laser beam to the object based on the trajectory information, wherein a direction of the laser beam coincides with a direction associated with the trajectory information, and the laser beam is used as a reference for placing the surgical instrument into the object.
2. The system of claim 1, wherein the trajectory information comprises at least one of:
a planned entry point for placing the surgical instrument on a surface of the object,
a planned target point for placing the surgical instrument in the object,
a planned trajectory between the planned entry point and the planned target point,
a planned entry depth of the surgical instrument into the object, or
a planned entry angle between the surgical instrument and a reference plane.
3. The system of claim 1, wherein the control assembly further comprises:
a movement unit configured to adjust position information of the laser emitting unit based on the trajectory information; and
a fastener configured to fasten the laser emission unit.
4. The system of claim 3, wherein the movement unit has six degrees of freedom.
5. A system according to claim 3, wherein to adjust the position information of the laser emitting unit based on the trajectory information, the movement unit is configured to:
coinciding an intersection of the laser beam and the object with the planned entry point in the trajectory information; and
the angle between the laser beam and a reference plane is made equal to the planned entry angle in the trajectory information.
6. The system of claim 1, wherein the control assembly further comprises a guiding unit configured to guide placement of the surgical instrument into the subject based at least in part on the laser beam or the trajectory information.
7. The system of claim 6, wherein the guiding unit comprises:
an aperture configured to receive the surgical instrument;
one or more tracks configured to adjust the position of the aperture such that a central axis of the aperture coincides with a central axis of the laser beam; and
one or more fasteners configured to fasten the surgical instrument.
8. The system of claim 1, further comprising an automation component configured to automatically place the surgical instrument into the object based at least in part on the laser beam or the trajectory information.
9. The system of claim 8, wherein to automatically place the surgical instrument into the object based at least in part on the laser beam or the trajectory information, the automation component is configured to:
coinciding a starting point of the surgical instrument with an intersection of the laser beam and the object;
aligning the direction of the surgical instrument with the direction of the laser beam; and
the surgical instrument is placed into the object based on a planned depth of entry in the trajectory information.
10. The system of claim 1, wherein the surgical instrument comprises at least one of a needle, a staple, a screw, or a drill bit.
11. The system of claim 1, wherein the trajectory information is determined in real time during surgical treatment.
12. A method for assisting in the placement of a surgical instrument, comprising:
acquiring at least one image of the object by the imaging assembly;
determining, by a trajectory determination component, trajectory information for placing a surgical instrument into the object based on the reference entry point and the reference target point in the at least one image; and
emitting, by a control assembly, a laser beam to the object based on the trajectory information, wherein,
the direction of the laser beam coincides with a direction associated with the trajectory information;
the laser beam is used as a reference for placing the surgical instrument into the object; and
the imaging assembly and the control assembly correspond to the same coordinate system.
13. The method of claim 12, wherein the trajectory information comprises at least one of:
a planned entry point for placing the surgical instrument on a surface of the object,
a planned target point for placing the surgical instrument in the object,
a planned trajectory between the planned entry point and the planned target point,
a planned entry depth of the surgical instrument into the object, or
a planned entry angle between the surgical instrument and a reference plane.
14. The method as recited in claim 12, further comprising:
coinciding an intersection of the laser beam and the object with the planned entry point in the trajectory information; and
the angle between the laser beam and the reference plane is made equal to the planned entry angle.
15. The method as recited in claim 12, further comprising:
the surgical instrument is automatically placed into the object based at least in part on the laser beam or the trajectory information.
16. The method of claim 15, wherein placing the surgical instrument into the subject comprises:
coinciding a starting point of the surgical instrument with an intersection of the laser beam and the object;
aligning the direction of the surgical instrument with the direction of the laser beam; and
the surgical instrument is placed into the object based on a planned depth of entry in the trajectory information.
17. The method of claim 12, wherein the reference entry point and the reference target point in the at least one image are automatically determined.
18. The method of claim 12, wherein the surgical instrument comprises at least one of a needle, a staple, a screw, or a drill bit.
19. The method as recited in claim 12, further comprising:
the trajectory information for placing the surgical instrument into the object is determined in real time during a surgical treatment.
20. A non-transitory computer-readable medium comprising executable instructions that, when executed by at least one processor, instruct the at least one processor to perform a method comprising:
acquiring at least one image of the object by the imaging assembly;
determining, by a trajectory determination component, trajectory information for placing a surgical instrument into the object based on the reference entry point and the reference target point in the at least one image; and
emitting, by a control assembly, a laser beam to the object based on the trajectory information, wherein,
the direction of the laser beam coincides with a direction associated with the trajectory information;
The laser beam is used as a reference for placing the surgical instrument into the object; and
the imaging assembly and the control assembly correspond to the same coordinate system.
CN202080103792.9A 2020-09-04 2020-09-04 Systems and methods for assisting in placement of a surgical instrument into a subject Pending CN116096322A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/113413 WO2022047720A1 (en) 2020-09-04 2020-09-04 Systems and methods for assisting in placing surgical instrument into subject

Publications (1)

Publication Number Publication Date
CN116096322A true CN116096322A (en) 2023-05-09

Family

ID=80492393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080103792.9A Pending CN116096322A (en) 2020-09-04 2020-09-04 Systems and methods for assisting in placement of a surgical instrument into a subject

Country Status (2)

Country Link
CN (1) CN116096322A (en)
WO (1) WO2022047720A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022115822A1 (en) * 2022-06-24 2024-01-04 B. Braun New Ventures GmbH Laser guidance robot for visual projection of a guidance of an operation plan, projection method and laser guidance robot system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015135055A1 (en) * 2014-03-14 2015-09-17 Synaptive Medical (Barbados) Inc. System and method for projected tool trajectories for surgical navigation systems
US9877795B2 (en) * 2013-09-18 2018-01-30 Imirge Medical Inc Optical targeting and visualization of trajectories
EP3777749A3 (en) * 2015-12-31 2021-07-21 Stryker Corporation System and method for preparing surgery on a patient at a target site defined by a virtual object
CN111329552B (en) * 2016-03-12 2021-06-22 P·K·朗 Augmented reality visualization for guiding bone resection including a robot
US10667869B2 (en) * 2017-05-17 2020-06-02 General Electric Company Guidance system for needle procedures
CN109481018A (en) * 2018-12-29 2019-03-19 上海联影医疗科技有限公司 A kind of navigation equipment and method applied in medical care precess

Also Published As

Publication number Publication date
WO2022047720A1 (en) 2022-03-10

Similar Documents

Publication Publication Date Title
US11896849B2 (en) Subject positioning systems and methods
CN111789613B (en) Imaging system and method
US10315049B2 (en) System and method for monitoring structural movements throughout radiation therapy
US8831706B2 (en) Fiducial-less tracking of a volume of interest
JP6208535B2 (en) Radiotherapy apparatus and system and method
US20150011866A1 (en) Probe for Surgical Navigation
US20140303486A1 (en) Surgical Navigation Planning System and Associated Methods
CN110799164B (en) System and method for diagnosis and treatment
US10034712B2 (en) Control of a medical imaging device via a navigation system
US20210290335A1 (en) Light and Shadow Guided Needle Positioning System and Method
WO2009012576A1 (en) Methods and systems for guiding the acquisition of ultrasound images
CN109496327A (en) System and method for diagnosing and treating
CN112022191A (en) Positioning method and system
CN116096322A (en) Systems and methods for assisting in placement of a surgical instrument into a subject
CN113271862B (en) Radiotherapy couch and system and method using the same
CN116322902A (en) Image registration system and method
CN117045318A (en) Puncture operation guiding system, method and operation robot
US20240021299A1 (en) Medical systems and methods for movable medical devices
WO2019141262A1 (en) Bone fracture reduction device and system
Vijayan Advanced intraoperative image registration for planning and guidance of robot-assisted surgery
CN114225236A (en) Radiotherapy guiding device, radiotherapy guiding method, electronic equipment and storage medium
CN116402755A (en) Image processing method
JP2019507648A (en) Check patient positioning with MPRS and crosshairs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination