CN117045318A - Puncture surgery guiding system, method and surgical robot - Google Patents

Puncture surgery guiding system, method and surgical robot

Info

Publication number
CN117045318A
Authority
CN
China
Prior art keywords
image
puncture
information
target object
registration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210493274.3A
Other languages
Chinese (zh)
Inventor
何少文
张璟
Current Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Original Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority to CN202210493274.3A
Priority to PCT/CN2023/091895 (published as WO2023216947A1)
Publication of CN117045318A

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/34: Trocars; Puncturing needles
    • A61B 17/3403: Needle locating or guiding means
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2051: Electromagnetic tracking systems
    • A61B 2034/2065: Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Embodiments of the present disclosure provide a puncture surgery guiding system, method, and surgical robot, the system comprising: a control system comprising one or more processors and a memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising: acquiring a first image, a second image, and a third image of a target object at different times; registering the first image and the second image to obtain a fourth image, the fourth image containing registered puncture planning information; and mapping the fourth image to the third image to guide the puncture procedure.

Description

Puncture surgery guiding system, method and surgical robot
Technical Field
The present disclosure relates to the field of medical technology, and in particular, to a puncture surgery guiding system, method, and surgical robot.
Background
CT (Computed Tomography) guided percutaneous puncture surgery is currently the most commonly used method for the clinical diagnosis and treatment of cancers. Under real-time CT scanning, a doctor controls a robot to puncture in a master-slave mode, which greatly improves puncture efficiency and accuracy and reduces the radiation dose received by the patient. However, because of limitations on radiation dose, imaging time, and the like, the range of real-time CT scanning is small, which restricts the real-time puncture field of view. If the lesion is large, if the needle insertion point is far from the target, or if the user wishes to observe the state of the entire target organ during real-time puncturing, the scanning range must be enlarged. If the real-time CT scanning range is enlarged, however, the slice thickness of the CT image becomes too large to resolve detailed information within the target organ; in particular, when the focus is small, its details may not be visible in the real-time scanned image.
Therefore, there is a need for a puncture guiding method that can display the puncture state in real time while providing both a large field of view and detailed information.
Disclosure of Invention
One of the embodiments of the present specification provides a puncture guiding system, comprising: a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising: acquiring a first image, a second image, and a third image of a target object at different times; registering the first image and the second image to obtain a fourth image, the fourth image containing registered puncture planning information; and mapping the fourth image to the third image to guide the puncture procedure.
One of the embodiments of the present specification provides a puncture guiding system, comprising: a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising: acquiring a first image, a second image, and a third image of a target object at different times; performing a first registration of the first image and the second image to obtain first deformation information and a fourth image, the fourth image containing registered puncture planning information; performing a second registration of the second image and the third image to obtain second deformation information; applying the second deformation information to the fourth image to obtain a fifth image, the fifth image containing puncture planning information after the second registration; and mapping the fifth image to the third image to guide the puncture procedure.
One of the embodiments of the present specification provides a puncture guiding method, the method comprising: acquiring a first image, a second image, and a third image of a target object at different times; registering the first image and the second image to obtain a fourth image, the fourth image containing registered puncture planning information; and mapping the fourth image to the third image to guide the puncture procedure.
One of the embodiments of the present specification provides a puncture guiding method, the method comprising: acquiring a first image, a second image, and a third image of a target object at different times; performing a first registration of the first image and the second image to obtain first deformation information and a fourth image, the fourth image containing registered puncture planning information; performing a second registration of the second image and the third image to obtain second deformation information; applying the second deformation information to the fourth image to obtain a fifth image, the fifth image containing puncture planning information after the second registration; and mapping the fifth image to the third image to guide the puncture procedure.
One of the embodiments of the present specification provides a surgical robot, comprising: a mechanical arm for performing a puncture operation; and a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising: acquiring a first image, a second image, and a third image of a target object at different times; registering the first image and the second image to obtain a fourth image, the fourth image containing registered puncture planning information; and mapping the fourth image to the third image to guide the puncture procedure.
One of the embodiments of the present specification provides a surgical robot, comprising: a mechanical arm for performing a puncture operation; and a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising: acquiring a first image, a second image, and a third image of a target object at different times; performing a first registration of the first image and the second image to obtain first deformation information and a fourth image, the fourth image containing registered puncture planning information; performing a second registration of the second image and the third image to obtain second deformation information; applying the second deformation information to the fourth image to obtain a fifth image, the fifth image containing puncture planning information after the second registration; and mapping the fifth image to the third image to guide the puncture procedure.
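The two-stage scheme above applies second deformation information to the already-registered fourth image. The following is only an illustrative sketch (the embodiments do not specify an implementation): a displacement field applied to a 2-D image with a simple nearest-neighbor warp. The array names and the toy one-pixel shift field are hypothetical.

```python
import numpy as np

def warp_image(image, displacement):
    """Warp a 2-D image by a per-pixel displacement field using
    nearest-neighbor sampling. displacement has shape (2, H, W):
    row/column offsets giving, for each output pixel, where in the
    input image to sample."""
    h, w = image.shape
    rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_r = np.clip(np.round(rows + displacement[0]).astype(int), 0, h - 1)
    src_c = np.clip(np.round(cols + displacement[1]).astype(int), 0, w - 1)
    return image[src_r, src_c]

# Toy "fourth image" carrying a planned target mark, and a second
# deformation that shifts the content one pixel down.
fourth = np.zeros((4, 4))
fourth[1, 1] = 1.0                 # a planned target marked in the image
shift_down = np.zeros((2, 4, 4))
shift_down[0] = -1.0               # output pixel (r, c) samples input (r-1, c)
fifth = warp_image(fourth, shift_down)
```

A real system would use sub-voxel interpolation and a 3-D field, but the composition idea (second deformation acting on the fourth image to yield the fifth image) is the same.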
In the prior art, intra-operative navigation provides neither puncture path guidance nor display of the target spot (focus) during the operation. Moreover, prior-art real-time simulated planning is computationally complicated, time-consuming, and difficult to apply in clinical scenarios.
Based on the above, the embodiments of this specification provide an efficient and accurate puncture surgery guiding method and system that combine the planning result of a large-field-of-view thin-slice image scanned before the puncture operation with the intra-operative real-time image. That is, the real-time image displays the state of the patient's puncture site in real time, while the detailed pre-operative planning result and detailed information are introduced, so that high-risk areas are avoided and surgical risk is reduced. During the intra-operative real-time puncture process, with the needle tip kept at the center of the field of view, the puncture needle advances toward the focus along the planned puncture path; the CT moving bed or the moving detector updates the scanning range to obtain real-time scanned images, which guide the puncture process, improving surgical efficiency and reducing surgical risk.
Drawings
The present specification is further elucidated by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting; in the drawings, like numerals represent like structures.
FIG. 1 is a schematic illustration of an application scenario of an exemplary puncture surgical guidance system according to some embodiments of the present description;
FIG. 2 is a flow chart of an exemplary puncture procedure guiding method according to some embodiments of the present description;
FIG. 3 is a schematic illustration of an exemplary puncture procedure guiding method according to some embodiments of the present description;
FIG. 4 is another schematic illustration of an exemplary puncture procedure guiding method according to further embodiments of the present description;
FIG. 5 is a schematic diagram of an exemplary puncture procedure guidance user interface according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies at different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
As used in this specification and the claims, the terms "a," "an," and/or "the" are not limited to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
A flowchart is used in this specification to describe the operations performed by the system according to embodiments of the present specification. It should be appreciated that the preceding or following operations are not necessarily performed precisely in order. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Fig. 1 is a schematic illustration of an application scenario of an exemplary puncture guiding system according to some embodiments of the present description. In some embodiments, as shown in FIG. 1, the puncture surgical guidance system 100 may include at least a medical device 110, a processing device 120, a terminal device 130, a robotic arm 140, a storage device 150, and a network 160.
The medical device 110 may scan a target object within a detection region or a scan region to obtain scan data for the target object. In some embodiments, the target object may comprise a biological object and/or a non-biological object. For example, the target object may be an organic and/or inorganic substance, whether living or not.
In some embodiments, medical device 110 may include a single modality scanner and/or a multi-modality scanner. The single mode scanner may include, for example, an ultrasound scanner, an X-ray scanner, a Computed Tomography (CT) scanner, a Magnetic Resonance Imaging (MRI) scanner, an ultrasound inspection machine, a positron emission computed tomography (PET) scanner, an Optical Coherence Tomography (OCT) scanner, an Ultrasound (US) scanner, an intravascular ultrasound (IVUS) scanner, a near infrared spectroscopy (NIRS) scanner, a Far Infrared (FIR) scanner, or the like, or any combination thereof. The multi-modality scanner may include, for example, an X-ray imaging-magnetic resonance imaging (X-ray-MRI) scanner, a positron emission tomography-X-ray imaging (PET-X-ray) scanner, a single photon emission computed tomography-magnetic resonance imaging (SPECT-MRI) scanner, a positron emission tomography-computed tomography (PET-CT) scanner, a digital subtraction angiography-magnetic resonance imaging (DSA-MRI) scanner, or the like, or any combination thereof. The above description of the medical device is for illustrative purposes only and is not intended to limit the scope of the present description.
Processing device 120 may process data and/or information acquired from medical device 110, terminal device 130, robotic arm 140, storage device 150, and/or other components of puncture surgical guidance system 100. For example, the processing device 120 may acquire images (e.g., CT scan images, PET scan images, etc.) of the target object at different times (e.g., before a puncture operation is performed, during a puncture operation, etc.) from the medical device 110 and analyze the same. For another example, the processing device 120 may register images of the target object at different times to obtain registered penetration planning information to guide the penetration procedure.
In some embodiments, the processing device 120 may be a single server or a group of servers. The server farm may be centralized or distributed. In some embodiments, the processing device 120 may be local or remote. For example, the processing device 120 may access information and/or data from the medical device 110, the terminal device 130, the robotic arm 140, and/or the storage device 150 via the network 160. As another example, the processing device 120 may be directly connected to the medical device 110, the terminal device 130, the robotic arm 140, and/or the storage device 150 to access information and/or data. In some embodiments, the processing device 120 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
In some embodiments, the processing device 120 and the medical device 110 may be integrated. In some embodiments, the processing device 120 and the medical device 110 may be directly or indirectly connected, with the combined actions implementing the methods and/or functions described herein.
In some embodiments, the processing device 120 may include an input device and/or an output device, through which interaction with the user (e.g., displaying a planned puncture path) may be accomplished. In some embodiments, the input device and/or output device may include a display screen, a keyboard, a mouse, a microphone, etc., or any combination thereof.
The terminal device 130 may be in communication and/or connected with the medical device 110, the processing device 120, the robotic arm 140, and/or the storage device 150. In some embodiments, interaction with the user may be achieved through terminal device 130. In some embodiments, the terminal device 130 may include a mobile device 131, a tablet 132, a notebook 133, or the like, or any combination thereof. In some embodiments, the terminal device 130 (or all or part of its functionality) may be integrated in the processing device 120.
The robotic arm 140 may mimic the function of a human arm and implement automatic control. In some embodiments, the robotic arm 140 may include multi-joint structures coupled to each other and capable of movement in a planar or three-dimensional space. In some embodiments, the robotic arm 140 may include a controller, a mechanical body, a sensor, and the like. In some embodiments, the controller may set the motion parameters (e.g., trajectory, direction, angle, speed, force, etc.) of the mechanical body; the mechanical body can accurately execute the action parameters; and the sensor may detect or sense an external signal, a physical condition (e.g., light, heat, humidity), or a chemical composition (e.g., smoke), and communicate the detected information to the controller. In some embodiments, the robotic arm 140 may include a rigid robotic arm, a flexible robotic arm, a pneumatic-assisted robotic arm, a soft cable-assisted robotic arm, a linear robotic arm, a horizontal multi-joint robotic arm, an articulated multi-axis robotic arm, or the like, or any combination thereof. The above description of the robotic arm is for illustrative purposes only and is not intended to limit the scope of the present disclosure.
In some embodiments, a puncture instrument (e.g., a surgical needle, fiber optic needle, venous indwelling needle, injection needle, puncture needle, biopsy needle, etc.) may be mounted on the robotic arm 140, and the puncture surgical procedure may be performed by the robotic arm.
In some embodiments, the processing device 120 and the robotic arm 140 may be integrated. In some embodiments, the processing device 120 and the robotic arm 140 may be directly or indirectly connected, with the combined actions implementing the methods and/or functions described herein. In some embodiments, the medical device 110, the processing device 120, and the robotic arm 140 may be integrated. In some embodiments, the medical device 110, the processing device 120, and the robotic arm 140 may be directly or indirectly connected, with the combined actions implementing the methods and/or functions described herein.
Storage device 150 may store data, instructions, and/or any other information. In some embodiments, the storage device 150 may store data (e.g., images, puncture planning information, user instructions, etc.) acquired from the medical device 110, the processing device 120, the terminal device 130, and/or the robotic arm 140. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 120 uses to perform or use to accomplish the exemplary methods described in this specification.
In some embodiments, storage device 150 may include one or more storage components, each of which may be a separate device or part of another device. In some embodiments, the storage device 150 may include Random Access Memory (RAM), Read-Only Memory (ROM), mass storage, removable memory, volatile read-write memory, and the like, or any combination thereof. In some embodiments, storage device 150 may be implemented on a cloud platform. In some embodiments, the storage device 150 may be part of the medical device 110, the processing device 120, and/or the terminal device 130.
Network 160 may include any suitable network capable of facilitating the exchange of information and/or data. In some embodiments, at least one component of the puncture surgical guidance system 100 (e.g., the medical device 110, the processing device 120, the terminal device 130, the robotic arm 140, the storage device 150) may exchange information and/or data with at least one other component of the system 100 via the network 160. For example, the processing device 120 may obtain an image of the target object from the medical device 110 via the network 160.
It should be noted that the above description of the puncture surgical guidance system 100 is provided for illustrative purposes only and is not intended to limit the scope of the present description. Many modifications and variations will be apparent to those of ordinary skill in the art in light of the present description. For example, the puncture surgical guidance system 100 may perform similar or different functions on other devices. However, such changes and modifications do not depart from the scope of the present specification. It should be appreciated that the puncture surgical guidance system 100 may be implemented in a variety of ways, for example, by hardware, by software, or by a combination of software and hardware. The system of the present specification and its modules may be implemented not only with hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field programmable gate arrays and programmable logic devices, but also with software executed by various types of processors, or with a combination of the above hardware circuits and software (e.g., firmware).
Fig. 2 is a flow chart of an exemplary puncture procedure guiding method according to some embodiments of the present description. In some embodiments, the process 200 may be performed by the processing device 120. For example, the process 200 may be stored in a storage device (e.g., the storage device 150, a storage unit of the processing device 120) in the form of a program or instructions that, when executed by a processor, implement the process 200. In some embodiments, the process 200 may be accomplished with one or more additional operations not described below, and/or without one or more of the operations discussed below.
Step 210, acquiring a first image, a second image and a third image of a target object at different times. In some embodiments, step 210 may be performed by processing device 120.
The target object may comprise the whole or part of a biological object and/or a non-biological object involved in the scanning process. For example, the target object may be an organic and/or inorganic substance, living or not, such as the head, ear, nose, mouth, neck, chest, abdomen, liver, gallbladder, pancreas, spleen, kidney, or spine.
In some embodiments, the processing device 120 may acquire the first, second, and third images of the target object at different times via the medical device 110. In some embodiments, the processing device 120 may obtain the first, second, and third images of the target object from the medical device 110, the storage device 150, a storage unit of the processing device 120, or the like.
In some embodiments, the first, second, and third images are acquired by a Computed Tomography (CT) device.
The first image may be a pre-operative enhanced image or a pre-operative plain-scan image. In some embodiments, the first image may be acquired before the puncture operation, i.e., within a certain period before the operation, for example, one hour, two hours, five hours, one day, two days, or one week before. In some embodiments, the first image may be acquired at the first diagnosis of the target object, during a routine physical examination, or after the end of a previous puncture operation.
The second image may be an intra-operative real-time image. In some embodiments, the second image is acquired during the puncture operation but before the puncture is performed, i.e., during the preparation time before needle insertion. For example, the second image may be acquired during positioning, disinfection, or local anesthesia. As another example, the second image may be the first frame of the intra-operative real-time images.
The third image may be an intra-operative real-time image. In some embodiments, the third image is acquired during the puncture execution. The puncture execution process refers to the process in which the needle is inserted through the skin, advanced into the target area along the puncture path, and the operation and needle withdrawal are completed in the target area.
In some embodiments, the first image, the second image, and the third image may be acquired by different imaging devices. For example, the first image may be acquired by an imaging device in an imaging room, while the second and third images may be acquired by an imaging device in an operating room. In some embodiments, the image parameters (e.g., image range, precision, contrast, gray scale, gradient, etc.) of the first, second, and third images may be the same or different. For example, the scanning range of the first image may be larger than that of the second and third images, or the second and third images may have higher precision than the first image.
In some embodiments, the first image, the second image, and the third image are acquired at the same breath amplitude point or at similar breath amplitude points that do not affect the accuracy of the puncture. In some embodiments, the first image is acquired with the target subject at a first breath amplitude point, the second image is acquired with the target subject at a second breath amplitude point, the third image is acquired with the target subject at a third breath amplitude point, the deviation of the second breath amplitude point from the first breath amplitude point is less than a preset value, and the deviation of the third breath amplitude point from the first breath amplitude point and/or the second breath amplitude point is less than a preset value. In some embodiments, the preset value may be set according to requirements and/or experience, e.g., 1%, 5%, 7%, 10%, etc. As shown in fig. 3, the first image is acquired at a first breath amplitude point a, the second image is acquired at a second breath amplitude point a 'having a deviation from the first breath amplitude point a less than a preset value, and the third image is acquired at a third breath amplitude point a″ having a deviation from the first breath amplitude point a and/or the second breath amplitude point a' less than a preset value.
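The amplitude-matching rule above can be expressed as a small check. This is a sketch under assumed conventions: the amplitude units and the relative-deviation definition are illustrative, and the default tolerance merely echoes the example preset values in the embodiments.

```python
def amplitude_within_tolerance(reference, current, preset=0.05):
    """Return True when the relative deviation between the current
    breath amplitude point and the reference amplitude point is below
    the preset value (e.g. 1%, 5%, 7%, or 10%, per the embodiments)."""
    return abs(current - reference) <= preset * abs(reference)

# e.g. the second breath amplitude point a' checked against the first point a
a, a_prime = 1.00, 1.03                     # arbitrary amplitude units
ok = amplitude_within_tolerance(a, a_prime)   # 3% deviation: acceptable
too_far = amplitude_within_tolerance(a, 1.20) # 20% deviation: rejected
```

The same check would be applied to the third amplitude point a″ against a and/or a'.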
Breathing amplitude is a physical quantity that reflects the change in the amount of gas during breathing. A breath amplitude point refers to a point in time at a certain breath amplitude, e.g., end-inspiration, end-expiration, some intermediate state of inspiration, or some intermediate state of expiration. In some embodiments, the breath amplitude points at which the images (e.g., the first, second, and third images) are acquired may be determined according to needs, experience, and/or user habits. For example, when a lung puncture is performed, the inhalation state compresses the focus less, so an image may be acquired at the end of inhalation.
In some embodiments, the target object may adjust its breathing (by itself or under a technician's direction) to a breath amplitude point (e.g., end of inspiration), at which the medical device 110 may acquire the first, second, and third images, respectively, before the operation, before the puncture is performed, and during the puncture execution.
In some embodiments, the processing device 120 may use a respiratory gating device to acquire the first, second, and third images at the same or approximately the same respiratory amplitude point of the target object. For example, as shown in FIG. 3, when the first image is acquired, the respiratory gating device may record the respiratory amplitude point a of the target object. During the puncture operation and before the puncture is performed, the respiratory gating device may monitor the respiration of the target object and cause the medical device 110 to acquire the second image when the target object is at the respiratory amplitude point a'. During the puncture execution, the respiratory gating device may monitor the respiratory amplitude of the target object and, when the target object adjusts its respiration to the respiratory amplitude point a″, cause the medical device 110 to acquire the third image; while the target object holds its breath (that is, keeps the respiratory amplitude near point a″), the respiratory amplitude continues to be monitored, and a prompt is sent to the user when it deviates greatly from point a″.
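The gating behaviour described above (acquire when the amplitude reaches the reference point, then warn if it drifts during breath-hold) can be sketched as a simple monitor loop. The function and callback names and the sample stream are hypothetical; a real respiratory gating device would deliver amplitude samples continuously.

```python
def gated_acquire(amplitude_stream, reference, tol, acquire, warn):
    """Monitor breath-amplitude samples: trigger image acquisition once
    the amplitude is within tolerance of the reference point, then warn
    the user whenever it drifts away during breath-hold."""
    acquired = False
    for a in amplitude_stream:
        within = abs(a - reference) <= tol
        if not acquired and within:
            acquire(a)          # amplitude reached the reference point
            acquired = True
        elif acquired and not within:
            warn(a)             # breath-hold drifted: prompt the user
    return acquired

events = []
gated_acquire(
    amplitude_stream=[0.20, 0.49, 0.50, 0.80],  # simulated samples
    reference=0.50, tol=0.05,
    acquire=lambda a: events.append(("acquire", a)),
    warn=lambda a: events.append(("warn", a)),
)
```

In practice the acquire callback would trigger the medical device 110 and the warn callback would drive the user prompt described above.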
Acquiring the first, second, and third images at the same or approximately the same breathing amplitude point keeps respiration-induced movement of organs and tissues between the images small, which improves the accuracy of preoperative planning.
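The gating logic just described, triggering an acquisition only when the monitored amplitude returns to the recorded reference point, can be sketched as follows. This is an illustrative sketch, not taken from the patent; the function names, the normalized amplitude scale, and the tolerance value are assumptions.

```python
def within_amplitude_window(current, reference, tolerance=0.05):
    # True when the monitored breathing amplitude is close enough to the
    # recorded reference point (e.g., point A) to trigger acquisition.
    return abs(current - reference) <= tolerance

def gate_acquisition(samples, reference, tolerance=0.05):
    # Scan a stream of amplitude samples and return the index of the first
    # sample at which an image acquisition would be triggered, or None if
    # the reference amplitude point is never reached.
    for i, amplitude in enumerate(samples):
        if within_amplitude_window(amplitude, reference, tolerance):
            return i
    return None
```

The same window test, run continuously during breath-hold, would also serve as the deviation check that prompts the user when the amplitude drifts away from point A''.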
Step 220: register the first image and the second image to obtain a fourth image, where the fourth image contains the registered puncture planning information. In some embodiments, step 220 may be performed by the processing device 120.
Registration is the process of matching and overlaying images acquired at different times or under different conditions; through registration, the multi-dimensional information contained in the different images can be presented jointly.
In some embodiments, the processing device 120 may register the first image and the second image using a feature-based or intensity-based non-rigid registration algorithm, such as a Demons-based non-rigid registration algorithm. In some embodiments, the processing device 120 may instead use a deep-learning-based non-rigid registration algorithm to register the first image and the second image, so as to improve the real-time performance of the registration.
In some embodiments, the first image is acquired before the puncture procedure, when time for acquisition and image processing is relatively plentiful; it therefore covers a relatively large scan range with many slices, e.g., covering all relevant tissues and/or organs. Planning the puncture path on the first image, which carries the most comprehensive information, helps improve the accuracy of the subsequent puncture guidance.
In some embodiments, the second image is acquired during the puncture procedure, before puncture execution, when time for acquisition and image processing is relatively tight; it therefore covers a smaller scan range with fewer slices, e.g., only 4 to 10 slices around the needle tip. It can be understood that the fourth image, obtained by registering the first image and the second image, contains the registered puncture planning information.
In some embodiments, first, the processing device 120 may obtain a puncture planning information image based on the first image.
The puncture planning information image is an image containing puncture planning information. In some embodiments, the puncture planning information may include at least one of high-risk tissue information, a planned path of the puncture, and lesion location information. High-risk tissue refers to organs and/or tissues that, if penetrated, may adversely affect the target object and/or the procedure, e.g., large blood vessels or bones. In some embodiments, different high-risk tissues may be designated depending on the individual condition of each patient. For example, the liver of a patient with hepatic hypofunction may be designated a risk area, as may other lesions in the target object. The planned path of the puncture is the route planned for the puncture instrument. The planned path information may include an entry point, a target point, a puncture angle, a puncture depth, a path length, the tissues and/or organs the path traverses, and the like. The lesion location information may include the coordinates, depth, volume, edges, etc. of the lesion (or the lesion center) in a human-body coordinate system.
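The planning information enumerated above can be grouped into a simple record. The sketch below is illustrative only; the patent does not prescribe a data structure, and all names here are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in the human-body coordinate system

@dataclass
class PuncturePlan:
    # Container for the planning information annotated on the first image.
    entry_point: Point
    target_point: Point
    high_risk_tissues: List[str] = field(default_factory=list)
    lesion_center: Point = (0.0, 0.0, 0.0)

    @property
    def path_length(self) -> float:
        # Straight-line length of the planned path (entry point -> target point).
        return sum((a - b) ** 2
                   for a, b in zip(self.entry_point, self.target_point)) ** 0.5
```

For a straight planned path, the puncture depth equals this path length; a multi-segment path would store intermediate waypoints instead.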
In some embodiments, the processing device 120 or an associated person (e.g., a physician) may process the first image to obtain the puncture planning information. For example, various tissues or organs may be segmented, such as blood vessels, skin, bones, organs, and the tissue to be punctured. As another example, the segmented tissues or organs may be classified into lesion areas, puncturable areas, high-risk tissues, and the like. As yet another example, the planned path of the puncture may be determined based on the lesion areas, puncturable areas, high-risk tissues, etc. In some embodiments, the processing device 120 or an associated person (e.g., a physician) may annotate the puncture planning information on the first image to obtain the puncture planning information image.
Second, the processing device 120 may perform a first registration of the first image and the second image to obtain first deformation information. The first deformation information is the morphological change of each image element (e.g., pixel or voxel) in the second image relative to the corresponding image element in the first image, such as geometric change information or projection change information. The first deformation information may be represented by a first deformation matrix, which may include a deformation matrix in the x direction, a deformation matrix in the y direction, and a deformation matrix in the z direction. Each element of a deformation matrix corresponds to a unit area of the second image (for example, one pixel, a 1 mm × 1 mm image area, or one voxel), and its value is the displacement of that unit area along the x, y, or z axis. In some embodiments, the processing device 120 may perform the first registration of the first image and the second image using a Demons-based non-rigid registration algorithm, a geometric correction method, a deep-learning-based non-rigid registration algorithm, or the like, to obtain the first deformation information.
Finally, the processing device 120 may apply the first deformation information to the puncture planning information image to obtain the fourth image, where the puncture planning information in the fourth image is the puncture planning information after the first registration. For example, the processing device 120 may apply the first deformation matrix to the puncture planning information image, causing the image and the puncture planning information it contains (high-risk tissue information, the planned path of the puncture, lesion location information, etc.) to undergo the morphological change described by the first deformation information, thereby obtaining the fourth image.
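One way to realize "applying the first deformation matrix" to annotated planning points is to add, to each point, the per-axis displacement stored at its voxel. The sketch below is illustrative (the function name is hypothetical); it assumes the deformation is stored as three 3-D displacement arrays, one per axis and indexed by voxel, as the description above suggests.

```python
import numpy as np

def warp_points(points, dx, dy, dz):
    # points: (N, 3) integer voxel indices of annotated planning points
    #         (entry point, target point, lesion outline samples, ...).
    # dx, dy, dz: 3-D displacement arrays (one deformation matrix per axis);
    # the value at a voxel is that voxel's displacement along x, y, or z.
    points = np.asarray(points)
    warped = points.astype(float)
    idx = tuple(points.T)          # advanced indexing into the 3-D arrays
    warped[:, 0] += dx[idx]
    warped[:, 1] += dy[idx]
    warped[:, 2] += dz[idx]
    return warped
```

Warping the full planning image would interpolate the same field at every voxel; warping only the annotated points is the cheap special case shown here.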
Achieving high-precision registration generally takes a long time, on the order of several seconds to tens of seconds. By performing the high-precision registration of the first and second images in advance, during the puncture procedure but before puncture execution, the computational load after puncture execution starts can be avoided or reduced, so that the actual puncture can proceed immediately or shortly after the real-time image is acquired, shortening the duration of puncture execution.
Step 230: map the fourth image to the third image to guide the puncture procedure. In some embodiments, step 230 may be performed by the processing device 120.
In some embodiments, the processing device 120 may map the fourth image to the third image by homography transformation, affine transformation, alpha-channel transformation, or the like, to guide the puncture procedure. For example, guided by the mapped third image, the user may follow the mapped puncture path, avoid high-risk regions such as blood vessels, and gradually advance the needle toward the mapped lesion.
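The alpha-channel style of mapping mentioned above can be illustrated as blending the registered planning overlay into the real-time image only where planning content exists. A minimal sketch under assumed conventions (grayscale arrays, a boolean mask marking planning pixels; not the patent's implementation):

```python
import numpy as np

def overlay_planning(real_time, planning, mask, alpha=0.4):
    # Blend the registered planning image onto the real-time (third) image.
    # mask marks pixels carrying planning information (path, lesion outline,
    # high-risk outlines); everywhere else the real-time image is unchanged.
    real_time = real_time.astype(float)
    planning = planning.astype(float)
    out = real_time.copy()
    out[mask] = (1.0 - alpha) * real_time[mask] + alpha * planning[mask]
    return out
```

A homography or affine mapping would first resample the planning image into the third image's coordinate frame; the blend above is the final compositing step.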
In some embodiments, if the respiration of the target object is not monitored by a respiratory gating apparatus, the second image and the third image may be acquired while the target object is at different breathing amplitude points, so organs and/or tissues in the images may have moved; the processing device 120 may then perform a second registration of the second image and the third image. As shown in FIG. 4, in some embodiments, the processing device 120 may perform a second registration of the second image and the third image to obtain second deformation information.
The second deformation information is the morphological change of each image element in the third image relative to the corresponding image element in the second image, such as geometric change information or projection change information. The second deformation information may be represented by a second deformation matrix, which, as an example, may include a deformation matrix in the x direction, a deformation matrix in the y direction, and a deformation matrix in the z direction. Each element of a deformation matrix corresponds to a unit area of the third image (for example, one pixel, a 1 mm × 1 mm image area, or one voxel), and its value is the displacement of that unit area along the x, y, or z axis. In some embodiments, the processing device 120 may perform the second registration of the second image and the third image using a Demons-based non-rigid registration algorithm, a geometric correction method, a deep-learning-based non-rigid registration algorithm, or the like, to obtain the second deformation information.
In some embodiments, the processing device 120 may apply the second deformation information to the fourth image to obtain a fifth image, where the fifth image contains the puncture planning information after the second registration. For example, the processing device 120 may apply the second deformation matrix to the fourth image, causing the first-registration puncture planning information it contains (high-risk tissue information, the planned path of the puncture, lesion location information, etc.) to undergo the morphological change described by the second deformation information, thereby obtaining the fifth image.
In some embodiments, the processing device 120 may map the fifth image to the third image, e.g., by homography transformation, affine transformation, alpha-channel transformation, or the like.
Because the second image and the third image contain only a few slices, the fast registration is computationally light and can be completed within a short time after the third image is acquired during the operation, reducing surgical risk.
In some embodiments, the processing device 120 may display image information of the fourth image or the fifth image that lies outside the display range of the third image when the puncture is performed. For example, other tissues and/or organs in the fourth image or the fifth image may be displayed. As another example, as shown in FIG. 5, at time T1 the lesion is displayed outside the display range of the third image.
In some embodiments, the processing device 120 may display the corresponding planned path information of the puncture outside the display range of the third image. For example, as shown in FIG. 5, when the lesion is outside the scan range, the planned path from point C to the lesion is shown.
In some embodiments, the processing device 120 may display the image information inside and outside the display range of the third image differently. For example, different background colors may be set inside and outside the display range; the interior may be displayed as an RGB image while the exterior is displayed as a grayscale image; lines inside the display range (e.g., the planned path) may be drawn as solid lines while lines outside it are drawn as dashed lines (or solid lines of a different color); and so on.
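The solid-inside / dashed-outside convention described above reduces to a per-point test against the display range. A minimal sketch, with the display range simplified to a z-interval and all names hypothetical:

```python
def path_point_styles(path_points, z_min, z_max):
    # For each (x, y, z) point of the planned path, choose a line style:
    # solid inside the display range of the real-time image, dashed outside.
    return ["solid" if z_min <= z <= z_max else "dashed"
            for (_x, _y, z) in path_points]
```

A renderer would then draw consecutive path segments with the style of their endpoints, switching at the boundary of the display range.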
Because of radiation dose, imaging time, and other constraints, the third image covers a small scan range, which limits the real-time field of view during the puncture. By presenting information outside the display range, the puncture planning information mapped beyond the scan range of the third image is supplemented into the real-time image of the puncture process, enlarging the field of view of planning information the user sees and providing more useful intraoperative information. In particular, when the target lesion is initially outside the scan range of the real-time image (e.g., the third image), displaying the lesion beyond the scan range gives the physician a definite target for the puncture, making the operation more likely to succeed.
In some embodiments, the processing device 120 may identify the puncture needle tip position from the third image. The processing device 120 may extract the puncture needle in the third image using a semi-automatic threshold segmentation algorithm or a fully automatic deep learning algorithm, and then obtain the needle tip position. For example, as shown in FIG. 5, the processing device 120 may identify the position B1 of the needle tip at time T1.
Second, the processing device 120 may move the target object, a couch plate carrying the target object, or a detector used to acquire the third image, according to the needle tip position, so that the needle tip lies in the central region of the display range of the third image. For example, as shown in FIG. 5, at time T1 the needle tip advances toward the lesion along the planned path; the processing device 120 may predict the tip position B2 at time T2 and, during the interval from T1 to T2, move the couch according to the change in tip position so that the scan range of the third image shifts downward and the needle tip remains in the central region of the display range.
By moving the target object, the couch plate carrying the target object, or the detector that acquires the third image, the scan range is updated in real time and the needle tip is kept in the central region of the display range of the third image. This highlights the information around the needle tip and tracks its advance more accurately, which helps improve surgical efficiency and reduce surgical risk.
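The recentering step can be sketched as computing the translation of the scan range that brings its center onto the current (or predicted) tip position, and moving only when the tip has drifted beyond a tolerance. Illustrative only; the names and the tolerance value are assumptions, not the patent's.

```python
def scan_range_shift(tip, display_center):
    # Translation to apply to the scan/display range so that its center
    # coincides with the current (or predicted) needle-tip position.
    return tuple(t - c for t, c in zip(tip, display_center))

def needs_recenter(tip, display_center, tolerance_mm=10.0):
    # Move the couch/detector only when the tip has left the central region.
    return max(abs(s) for s in scan_range_shift(tip, display_center)) > tolerance_mm
```

In the FIG. 5 example, `tip` would be the predicted position B2, so the couch move completes during the interval from T1 to T2 rather than after the tip has already left the field of view.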
It should be noted that the above description of process 200 is for illustration only and is not intended to limit the scope of the present disclosure. Various modifications and changes to process 200 will be apparent to those skilled in the art in light of this description; such modifications and variations remain within the scope of this description.
Fig. 3 is a schematic diagram of an exemplary puncture procedure guiding method according to some embodiments of the present description.
In some embodiments, the processing device 120 monitors the respiration of the target object with a respiratory gating apparatus. As shown in FIG. 3, when the first image is acquired, the respiratory gating apparatus may record the breathing amplitude point A of the target object. During the puncture procedure, before puncture execution, the respiratory gating apparatus may monitor the respiration of the target object and cause the medical device 110 to acquire the second image when the target object is at breathing amplitude point A'. The processing device 120 obtains the puncture planning information image by processing the first image, and obtains the first deformation information through the first registration. The processing device 120 applies the first deformation information to the puncture planning information image to obtain the fourth image, which contains the puncture planning information after the first registration.
During puncture execution, the target object may self-regulate (e.g., by breath-holding) to the same or a similar breathing amplitude. Alternatively, the processing device 120 may monitor the breathing amplitude of the target object via the respiratory gating apparatus. When the target object adjusts its breathing to the third breathing amplitude point A'', the medical device 110 acquires the third image, and the processing device 120 maps the fourth image to the third image to guide the puncture procedure. If the respiratory gating apparatus detects a significant deviation in breathing amplitude, the processing device 120 may issue a prompt and/or interrupt the puncture; the puncture continues once the target object returns to the same or a similar breathing amplitude.
By using the respiratory gating apparatus to acquire the first, second, and third images at the same or approximately the same breathing amplitude point, respiration-induced movement of organ tissue between the images is kept small, improving the accuracy of preoperative planning.
Fig. 4 is another schematic illustration of an exemplary puncture procedure guiding method according to further embodiments of the present description.
In some embodiments, the respiration of the target object is not monitored by a respiratory gating apparatus. As shown in FIG. 4, the processing device 120 acquires a first image, a second image, and a third image of the target object at different times. The processing device 120 obtains the puncture planning information image by processing the first image, and obtains the first deformation information through the first registration. The processing device 120 applies the first deformation information to the puncture planning information image to obtain the fourth image, which contains the puncture planning information after the first registration.
The processing device 120 performs the second registration of the second image and the third image to obtain second deformation information, and applies the second deformation information to the fourth image to obtain a fifth image, which contains the puncture planning information after the second registration. The processing device 120 maps the fifth image to the third image to guide the puncture procedure.
Because the second image and the third image contain only a few slices, the second registration is computationally light and can be completed within a short time after the third image is acquired during the operation, reducing surgical risk.
The embodiments of the present specification also provide a surgical robot including: a mechanical arm for performing a puncture operation; and a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising: respectively acquiring a first image, a second image and a third image of a target object at different times; registering the first image and the second image to obtain a fourth image, wherein the fourth image contains registered puncture planning information; the fourth image is mapped to the third image to guide a puncture procedure.
The embodiments of the present specification also provide a surgical robot including: a mechanical arm for performing a puncture operation; and a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising: respectively acquiring a first image, a second image and a third image of a target object at different times; performing first registration on the first image and the second image to obtain first deformation information and a fourth image, wherein the fourth image contains puncture planning information after registration; registering the second image and the third image for the second time to obtain second deformation information; the second deformation information is acted on the fourth image to obtain a fifth image, and the fifth image contains puncture planning information after the second registration; the fifth image is mapped to the third image to guide a puncture procedure.
In some embodiments of the present disclosure: (1) the images are acquired at the same or similar breathing amplitude points, so respiration-induced movement of organ tissue between images is small, which helps improve the accuracy of preoperative planning; (2) high-precision registration is performed in advance, before puncture execution, avoiding or reducing the computational load after puncture execution starts and shortening the execution time; (3) overly long breath-holds by the patient are thereby avoided, improving patient experience; (4) a wide-field guiding image is displayed, with the real-time image and the planning image shown distinctly, guiding the puncture procedure more clearly.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not limiting. Although not explicitly stated here, various modifications, improvements, and adaptations of the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested by this specification and fall within the spirit and scope of its exemplary embodiments.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present description. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Furthermore, the order in which the elements and sequences are processed, the use of numerical letters, or other designations in the description are not intended to limit the order in which the processes and methods of the description are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, to simplify the presentation of this disclosure and thereby aid understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description. This method of disclosure, however, does not imply that the claimed subject matter requires more features than are recited in the claims. Indeed, claimed subject matter may lie in fewer than all features of a single embodiment disclosed above.
In some embodiments, numbers are used to describe quantities of components and attributes; it should be understood that such numbers, as used in describing embodiments, are in some examples qualified by the modifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, numerical parameters set forth in the specification and claims are approximations that may vary depending on the properties sought by individual embodiments. In some embodiments, numerical parameters should be interpreted in light of the specified significant digits, with ordinary rounding applied. Although the numerical ranges and parameters used to confirm the breadth of ranges in some embodiments are approximations, in particular embodiments such numerical values are set as precisely as practicable.
Each patent, patent application, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited in this specification is incorporated herein by reference in its entirety. Excluded are application history documents that are inconsistent with or conflict with the content of this specification, and any document (now or later appended to this specification) that limits the broadest scope of the claims of this specification. It is noted that if the description, definition, and/or use of a term in material appended to this specification is inconsistent with or conflicts with what is stated in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.

Claims (13)

1. A puncture guiding system, the system comprising:
A control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising:
respectively acquiring a first image, a second image and a third image of a target object at different times;
registering the first image and the second image to obtain a fourth image, wherein the fourth image contains registered puncture planning information;
the fourth image is mapped to the third image to guide a puncture procedure.
2. The system of claim 1, wherein:
the first image is acquired before the puncture operation, the second image is acquired during the puncture operation and before the puncture execution, the third image is acquired during the puncture execution, and the first image, the second image and the third image are acquired by a computer tomography device.
3. The system of claim 1, wherein:
the first image is acquired when the target object is at a first respiration amplitude point, the second image is acquired when the target object is at a second respiration amplitude point, the third image is acquired when the target object is at a third respiration amplitude point, the deviation between the second respiration amplitude point and the first respiration amplitude point is smaller than a preset value, and the deviation between the third respiration amplitude point and the first respiration amplitude point and/or the second respiration amplitude point is smaller than the preset value.
4. The system of claim 1, wherein the registering the first image and the second image to obtain a fourth image comprises:
obtaining a puncture planning information image based on the first image;
registering the first image and the second image for the first time to obtain first deformation information;
and acting the first deformation information on the puncture planning information image to obtain a fourth image, wherein the puncture planning information in the fourth image is puncture planning information after the first registration.
5. The system of claim 1, wherein the programming instructions further comprise at least one of:
displaying image information, which is located outside the display range of the third image, in the fourth image outside the display range of the third image;
displaying corresponding planning path information of puncture outside the display range of the third image;
and distinguishing and displaying the image information in and out of the display range of the third image.
6. A puncture guiding system, the system comprising:
a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising:
Respectively acquiring a first image, a second image and a third image of a target object at different times;
performing first registration on the first image and the second image to obtain first deformation information and a fourth image, wherein the fourth image contains puncture planning information after registration; registering the second image and the third image for the second time to obtain second deformation information;
the second deformation information is acted on the fourth image to obtain a fifth image, and the fifth image contains puncture planning information after the second registration;
mapping the fifth image to the third image.
7. The system of claim 6, wherein the programming instructions further comprise at least one of:
displaying image information, which is located outside the display range of the third image, in the fifth image outside the display range of the third image;
displaying corresponding planning path information of puncture outside the display range of the third image;
and distinguishing and displaying the image information in and out of the display range of the third image.
8. The system of claim 1 or 6, wherein the puncture planning information comprises at least one of high risk tissue information, planned path of puncture, lesion location information.
9. The system of claim 1 or 6, wherein the programming instructions further comprise:
identifying the position of the puncture needle point according to the third image;
and according to the needle point position, moving the target object, a bed board carrying the target object or a detector for acquiring the third image, so that the needle point position is positioned in the middle area of the display range of the third image.
10. A method of guiding a puncture procedure, the method comprising:
respectively acquiring a first image, a second image and a third image of a target object at different times;
registering the first image and the second image to obtain a fourth image, wherein the fourth image contains registered puncture planning information;
the fourth image is mapped to the third image to guide a puncture procedure.
11. A method of guiding a puncture procedure, the method comprising:
respectively acquiring a first image, a second image and a third image of a target object at different times;
performing first registration on the first image and the second image to obtain first deformation information and a fourth image, wherein the fourth image contains puncture planning information after registration;
Registering the second image and the third image for the second time to obtain second deformation information;
the second deformation information is acted on the fourth image to obtain a fifth image, and the fifth image contains puncture planning information after the second registration;
mapping the fifth image to the third image.
12. A surgical robot, comprising:
a mechanical arm for performing a puncture operation; and
a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising:
acquiring a first image, a second image, and a third image of a target object at different respective times;
registering the first image and the second image to obtain a fourth image, wherein the fourth image contains registered puncture planning information;
mapping the fourth image to the third image to guide a puncture procedure.
13. A surgical robot, comprising:
a mechanical arm for performing a puncture operation; and
a control system comprising one or more processors and memory, the memory comprising programming instructions adapted to cause the one or more processors to perform operations comprising:
acquiring a first image, a second image, and a third image of a target object at different respective times;
performing a first registration on the first image and the second image to obtain first deformation information and a fourth image, wherein the fourth image contains puncture planning information after the first registration;
performing a second registration on the second image and the third image to obtain second deformation information;
applying the second deformation information to the fourth image to obtain a fifth image, wherein the fifth image contains puncture planning information after the second registration;
mapping the fifth image to the third image to guide a puncture procedure.
CN202210493274.3A 2022-05-07 2022-05-07 Puncture operation guiding system, method and operation robot Pending CN117045318A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210493274.3A CN117045318A (en) 2022-05-07 2022-05-07 Puncture operation guiding system, method and operation robot
PCT/CN2023/091895 WO2023216947A1 (en) 2022-05-07 2023-04-28 Medical image processing system and method for interventional operation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210493274.3A CN117045318A (en) 2022-05-07 2022-05-07 Puncture operation guiding system, method and operation robot

Publications (1)

Publication Number Publication Date
CN117045318A true CN117045318A (en) 2023-11-14

Family

ID=88661374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210493274.3A Pending CN117045318A (en) 2022-05-07 2022-05-07 Puncture operation guiding system, method and operation robot

Country Status (1)

Country Link
CN (1) CN117045318A (en)

Similar Documents

Publication Publication Date Title
US8099155B2 (en) Method for assisting with percutaneous interventions
KR102014355B1 (en) Method and apparatus for calculating location information of surgical device
CN109419524B (en) Control of medical imaging system
US20220313190A1 (en) System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
JP4822634B2 (en) A method for obtaining coordinate transformation for guidance of an object
JP2022507622A (en) Use of optical cords in augmented reality displays
US20170215969A1 (en) Human organ movement monitoring method, surgical navigation system and computer readable medium
CN108324246A (en) Medical diagnosis auxiliary system and method
US20050027193A1 (en) Method for automatically merging a 2D fluoroscopic C-arm image with a preoperative 3D image with one-time use of navigation markers
CN105520716B (en) Real-time simulation of fluoroscopic images
JP2000308646A (en) Method and system for detecting the movement of patient' s organ or curative range
JP2010517632A (en) System for continuous guidance of endoscope
US20080167547A1 (en) Systems and Methods For Planning Medical Procedures and Designing Medical Devices Based on Anatomical Scan Deformations
US20070244369A1 (en) Medical Imaging System for Mapping a Structure in a Patient's Body
EP3673854B1 (en) Correcting medical scans
CN105992556B (en) Medical viewing system with observation angle and optimizing function
KR101862359B1 (en) Program and method for generating surgical simulation information
KR20190088419A (en) Program and method for generating surgical simulation information
JP6959612B2 (en) Diagnostic imaging system
CN117045318A (en) Puncture operation guiding system, method and operation robot
CN114283179A (en) Real-time fracture far-near end space pose acquisition and registration system based on ultrasonic images
CN116096322A (en) Systems and methods for assisting in placement of a surgical instrument into a subject
CN115089294B (en) Interventional operation navigation method
KR101940706B1 (en) Program and method for generating surgical simulation information
WO2023108625A1 (en) Puncture positioning system and control method therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination