CN114652443A - Ultrasonic operation navigation system and method, storage medium and device - Google Patents


Publication number
CN114652443A
Authority
CN
China
Legal status
Pending
Application number
CN202210228410.6A
Other languages
Chinese (zh)
Inventor
温铁祥
陈垦
刘新
郑海荣
Current Assignee
Shenzhen National Research Institute of High Performance Medical Devices Co Ltd
Original Assignee
Shenzhen National Research Institute of High Performance Medical Devices Co Ltd
Application filed by Shenzhen National Research Institute of High Performance Medical Devices Co Ltd
Priority to CN202210228410.6A
Publication of CN114652443A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/34 Trocars; Puncturing needles
    • A61B 17/3403 Needle locating or guiding means
    • A61B 2017/3413 Needle locating or guiding means guided by ultrasound
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2063 Acoustic tracking systems, e.g. using ultrasound

Abstract

An embodiment of the invention discloses an ultrasonic surgical navigation system and method, a storage medium, and a device. The ultrasonic surgical navigation system comprises an imaging system, a tracking and positioning system, a surgical instrument, and a first marker for contacting the body surface. The preoperative image data and the preoperative physical space position are used to obtain a registration result before the operation; the preoperative image data are used to obtain a target surgical path; and the target surgical path and the registration result are then used for intraoperative guidance. With this ultrasonic surgical navigation system, the real-time intraoperative position of the surgical instrument can be guided by the target surgical path obtained from the preoperative image data, so the spatial position of the instrument no longer has to be determined by repeated intraoperative X-ray fluoroscopy. This reduces radiation injury and time consumption and improves accuracy, and guidance performed in this way can determine the real-time relative position between the surgical instrument and the surrounding key anatomical structures.

Description

Ultrasonic operation navigation system and method, storage medium and device
Technical Field
The invention relates to the technical field of medical instruments, and in particular to an ultrasonic surgical navigation system and method, a storage medium, and a device.
Background
Existing needle-tracking techniques in interventional procedures generally fall into two broad categories. The first uses medical image processing: relevant features are extracted from X-ray fluoroscopic images, and the position of the puncture instrument is calculated from them. To keep this information current and effective, X-ray images usually must be acquired many times, so the radiation dose is high, the procedure is time-consuming, and accuracy is insufficient. One reason is that such systems have difficulty adapting to the complex nonlinear deformation of instruments such as puncture needles; another is that X-ray images are two-dimensional, so the extracted puncture needle lacks three-dimensional information and cannot accurately reflect the spatial relationship between the interventional instrument and surrounding important tissues and organs.
The second integrates a tracking locator on the puncture needle or ultrasonic probe to directly acquire the real-time intraoperative position of the interventional instrument. This method acquires only the instrument's real-time position; it does not provide the real-time relative position between the instrument and the surrounding vessels and other critical anatomical structures, and therefore cannot provide an accurate surgical reference.
Disclosure of Invention
The invention mainly aims to provide an ultrasonic surgical navigation system and method, a storage medium, and a device that can solve the technical problem of tracking a puncture needle in an interventional operation in the prior art.
To achieve the above object, a first aspect of the present invention provides an ultrasonic surgical navigation system comprising: an imaging system, a tracking and positioning system, a surgical instrument, and a first marker for contacting the body surface;
the image system is used for acquiring preoperative image data to display preoperative images and acquiring a preoperative image space point set corresponding to the first marker;
the tracking and positioning system is used for acquiring preoperative tracking data of the first marker so as to acquire a preoperative physical space position of the first marker in the preoperative tracking data;
the ultrasonic operation navigation system is used for registering the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position, and determining a registration result; planning a surgical path by using the preoperative image data, and determining a target surgical path; and performing surgical guidance of the surgical instrument according to the target surgical path, the registration result, the intraoperative image data and the intraoperative physical space position.
In one possible implementation, the imaging system includes an ultrasound probe having a second marker disposed thereon;
the ultrasonic probe is used for acquiring intraoperative image data;
the tracking and positioning system is also used for acquiring the intraoperative physical space position of the second marker;
the ultrasonic navigation system is further configured to perform image fusion display by using the registration result and the intraoperative physical spatial position of the second marker to determine a spatial posture of an intraoperative ultrasonic imaging plane corresponding to the ultrasonic probe in a preoperative image space, where the image fusion display includes fusion display of an intraoperative ultrasonic image and a preoperative image.
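As a rough sketch of this fusion step, the Python snippet below (illustrative only; the matrix names and toy values are assumptions, not from the patent) composes the registration transform with the tracked probe pose to place the intraoperative ultrasound imaging plane in preoperative image space:

```python
# Hypothetical sketch: locating the intraoperative ultrasound imaging plane
# in preoperative image space by composing two 4x4 homogeneous poses.
# T_reg (physical -> image) would come from the marker registration;
# T_probe (probe frame -> physical) from the tracked second marker on the
# ultrasound probe. All numeric values here are toy data.

def matmul4(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

T_reg = [   # physical space -> preoperative image space (toy: pure translation)
    [1, 0, 0, 5],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
]
T_probe = [  # probe frame -> physical space (toy: pure translation)
    [1, 0, 0, 0],
    [0, 1, 0, 3],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
]

# Pose of the ultrasound plane in preoperative image space.
T_plane_in_image = matmul4(T_reg, T_probe)
# Its translation column combines both offsets: (5, 3, 0).
```

With the plane's pose known in preoperative image space, the intraoperative ultrasound image can be resliced and overlaid on the preoperative volume for the fused display described above.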
In one possible implementation, a third marker is disposed on the surgical instrument;
the tracking and positioning system is also used for acquiring the intraoperative physical space position of the third marker;
the ultrasonic navigation system is further used for determining a preoperative image space position corresponding to the intraoperative physical space position by using the registration result and the intraoperative physical space position of the third marker so as to determine the spatial posture of the surgical instrument in the operation in the preoperative image space.
In one possible implementation, the ultrasound surgical navigation system determines a registration result, including:
registering the preoperative image space point set and the preoperative physical space position by using an iterative closest point algorithm, and determining a target transformation matrix corresponding to the registration of the preoperative image space and the preoperative physical space;
and taking the target conversion matrix as the registration result.
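As an illustration of how such a target transformation matrix is used once obtained, the hedged Python sketch below (toy values; function and variable names are assumptions, not from the patent) maps a tracked physical-space point into preoperative image space with a 4x4 homogeneous matrix:

```python
# Hypothetical sketch: applying a 4x4 homogeneous registration matrix
# (the target transformation matrix T, physical -> image) to a point
# reported by the tracking system. Matrix values are illustrative.

def apply_transform(T, point):
    """Map a 3D physical-space point into image space using a 4x4 matrix."""
    x, y, z = point
    # Homogeneous multiplication: [x', y', z', 1]^T = T @ [x, y, z, 1]^T
    out = [
        T[r][0] * x + T[r][1] * y + T[r][2] * z + T[r][3]
        for r in range(3)
    ]
    return tuple(out)

# Identity rotation with a translation of (5, -2, 10) mm as a toy example.
T_physical_to_image = [
    [1.0, 0.0, 0.0, 5.0],
    [0.0, 1.0, 0.0, -2.0],
    [0.0, 0.0, 1.0, 10.0],
    [0.0, 0.0, 0.0, 1.0],
]

marker_physical = (12.0, 30.0, -4.0)   # as reported by the tracking system
marker_image = apply_transform(T_physical_to_image, marker_physical)
# marker_image == (17.0, 28.0, 6.0)
```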
In one possible implementation, the marker comprises a magnetic marker or an optical marker.
In one possible implementation, the tracking and positioning system comprises a magnetic tracking system or an optical tracking system.
In one possible implementation, the preoperative image includes a tomographic image or a magnetic resonance image.
In order to achieve the above object, a second aspect of the present invention provides an ultrasonic surgical navigation method applied to the ultrasonic surgical navigation system according to the first aspect and any possible implementation manner, the method including:
acquiring a preoperative image space point set corresponding to a first marker;
acquiring a preoperative physical space position corresponding to the first marker;
registering the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position, and determining a registration result;
planning a surgical path by using the preoperative image data, and determining a target surgical path;
and performing surgical guidance of surgical instruments according to the target surgical path, the registration result, the intraoperative image data and the intraoperative physical space position.
To achieve the above object, a third aspect of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the method according to the second aspect.
To achieve the above object, a fourth aspect of the present invention provides a computer device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the method according to the second aspect.
The embodiment of the invention has the following beneficial effects:
the invention provides an ultrasonic operation navigation system, which comprises: the system comprises an imaging system, a tracking and positioning system, a surgical instrument and a first marker for contacting with the body surface; the image system is used for acquiring preoperative image data so as to display preoperative images and acquiring a preoperative image space point set corresponding to the first marker; the tracking and positioning system is used for acquiring preoperative tracking data of the first marker so as to acquire a preoperative physical space position of the first marker in the preoperative tracking data; the ultrasonic operation navigation system is used for registering the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position and determining a registration result; planning an operation path by using preoperative image data, and determining a target operation path; and performing surgical guidance of the surgical instrument according to the target surgical path, the registration result, the intraoperative image data and the intraoperative physical space position. 
With this ultrasonic surgical navigation system, the real-time intraoperative position of the surgical instrument can be guided by the target surgical path obtained from the preoperative image data, without performing repeated X-ray fluoroscopy during the operation to determine the instrument's spatial position. This greatly reduces radiation injury to patient and physician, avoids intermittent repeated imaging, shortens the procedure, and improves surgical accuracy. Because the guidance display uses the target surgical path, the registration result, the intraoperative image data, and the intraoperative physical space position together, the real-time relative position between the surgical instrument and the surrounding key anatomical structures can be determined.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Wherein:
FIG. 1 is a diagram of an environment for an ultrasound navigation system in an embodiment of the present invention;
FIG. 2 is a block diagram of an ultrasound navigation system according to an embodiment of the present invention;
FIG. 3 is a flow chart of a method of ultrasound surgical navigation in an embodiment of the present invention;
FIG. 4 is another block diagram of an ultrasound navigation system according to an embodiment of the present invention;
FIG. 5 is another flow chart of a method of ultrasound surgical navigation in an embodiment of the present invention;
FIG. 6 is a block diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is an application environment diagram of an ultrasonic surgical navigation system according to an embodiment of the present invention. The application environment shown in fig. 1 includes: a patient 10, an operating bed 11, a first marker 12, a tracking and positioning system 13, an imaging workstation 14, an ultrasonic probe 15, a second marker 16, a surgical instrument 17, and a third marker 18. The imaging workstation 14 and the ultrasonic probe 15 constitute the imaging system of this embodiment; the imaging workstation 14 may be a visual electronic device that performs real-time imaging display from image data. The ultrasonic probe is used to scan the body surface of the patient 10 for ultrasonic imaging. The first marker is used for contacting the patient's body surface; the second marker 16 is arranged on the ultrasonic probe 15 to reflect the probe's spatial position; the third marker is arranged on the surgical instrument to reflect the instrument's spatial position; and the tracking and positioning system 13 tracks each marker. It should be noted that the first, second, and third markers carry different identifying marks so that markers arranged at different positions can be distinguished. The surgical instrument may be an interventional instrument, such as a puncture needle.
Referring to fig. 2, fig. 2 is a block diagram of an ultrasound navigation system according to an embodiment of the present invention, and provides an ultrasound navigation system 20. The system can be applied to both the terminal and the server, and the embodiment is exemplified by being applied to the terminal. The ultrasonic surgical navigation system 20 specifically includes: an imaging system 201, a tracking and positioning system 202, a surgical instrument 203, and a first marker 204 for contacting with a body surface;
the image system 201 is configured to obtain preoperative image data to display preoperative images, and obtain a preoperative image space point set corresponding to the first marker; it should be noted that the preoperative image is a three-dimensional image, and preoperative image data is obtained by performing tomography or mri on a patient, or other imaging modalities suitable for application scene requirements. The preoperative influence data may be tomographic data or nuclear magnetic resonance data. The preoperative image may be a three-dimensional tomographic image obtained using tomographic data or a three-dimensional nmr scan image obtained using nmr data. And acquiring a preoperative image space point set corresponding to the first marker in the obtained preoperative image, namely a space position (coordinate) of the first marker in a preoperative image coordinate system, specifically performing network training on preoperative image data including the first marker by using UNet deep learning neural network for the acquired preoperative image data, so as to realize automatic extraction of the marker space position and obtain the preoperative image space point set corresponding to the first marker. The preoperative image space point set comprises real-time preoperative image space position coordinates of a plurality of first markers.
The tracking and positioning system 202 is configured to acquire preoperative tracking data of the first marker 204 in order to obtain the preoperative physical space position of the first marker, that is, its position (coordinates) in the coordinate system of the tracking and positioning system. Illustratively, the patient lies on the operating bed with the first markers 204 attached to the body surface in the same positions as during three-dimensional imaging; when the physician turns on the tracking and positioning system, the navigation software automatically records the physical space positions of the first markers 204 on the patient's body surface, giving the preoperative physical space position of the first marker: a coordinate set comprising the real-time preoperative physical space position coordinates of a plurality of first markers. At least three first markers 204 are provided for contact with the body surface.
Further, the ultrasonic operation navigation system is used for registering the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position, and determining a registration result; planning an operation path by using preoperative image data, and determining a target operation path; surgical guidance of the surgical instrument 203 is performed based on the target surgical path, the registration result, the intra-operative image data, and the intra-operative physical spatial location.
It should be noted that the ultrasonic surgical navigation system is interconnected with the imaging system and the tracking and positioning system, by either wireless or wired connection, and its functions may be implemented by an electronic device with processing capability; this is not limited here. Specifically, registering the preoperative image space and the preoperative physical space means aligning them, that is, converting the two coordinate-system spaces into the same coordinate-system space. A surgical path is planned using the preoperative image data to determine the target surgical path: the planning includes selecting a needle-insertion point and a puncture target point, from which the puncture intervention path (the target surgical path) is defined. Finally, surgical guidance of the surgical instrument 203 can be performed according to the target surgical path, the registration result, the intraoperative image data, and the intraoperative physical space position, providing the physician with a detailed intraoperative reference. The registration result reflects the mapping between the two coordinate systems and is used for their automatic alignment.
For example, to make the implementation of the present embodiment clearer, the following flow is used for illustration:
Firstly, a medical tomographic imaging device (imaging system), for example Computed Tomography (CT) or Magnetic Resonance Imaging (MRI), is used to obtain tomographic images of the patient's lesion. To obtain three-dimensional spatial positions in the preoperative image, first markers with magnetic sensors are attached to the patient's chest and abdomen. The organs surrounding the puncture target in the obtained preoperative tomographic images are segmented, a three-dimensional surface is reconstructed and rendered for visualization, and the preoperative surgical path (target surgical path) is planned in detail from the three-dimensional reconstruction. For the acquired diagnostic image data (preoperative images), a UNet deep-learning neural network is trained on images of the magnetic markers so that the image space position of the first marker is extracted automatically, giving the preoperative image space point set: X_Image = {x_i, i ∈ n}, where X_Image is the preoperative image space point set, x_i is the position coordinate of the first marker in preoperative image space at a given preoperative moment, and n indexes the time points.
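A minimal sketch of the extraction step is given below, assuming the network's output is already an integer-labelled segmentation volume (the UNet itself is out of scope here; the function name and toy data are illustrative assumptions, not from the patent):

```python
# Hypothetical sketch: turning a marker segmentation (e.g. the output of a
# UNet-style network, faked here as a 3D integer label volume) into the
# preoperative image space point set X_Image. Each nonzero label is one
# marker; its image-space coordinate is the centroid of its voxels.

from collections import defaultdict

def marker_centroids(label_volume):
    """Return {marker_id: (z, y, x) centroid} from a 3D integer label array."""
    sums = defaultdict(lambda: [0.0, 0.0, 0.0, 0])  # zsum, ysum, xsum, count
    for z, plane in enumerate(label_volume):
        for y, row in enumerate(plane):
            for x, label in enumerate(row):
                if label:
                    s = sums[label]
                    s[0] += z; s[1] += y; s[2] += x; s[3] += 1
    return {m: (s[0] / s[3], s[1] / s[3], s[2] / s[3]) for m, s in sums.items()}

# Toy 1x3x3 volume with two single-voxel markers (labels 1 and 2).
volume = [[
    [0, 0, 1],
    [0, 0, 0],
    [2, 0, 0],
]]
X_image = marker_centroids(volume)
# X_image == {1: (0.0, 0.0, 2.0), 2: (0.0, 2.0, 0.0)}
```

Repeating this over the marker images at each time point would yield the point set X_Image described above.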
Secondly, the patient lies on the operating bed of the operating room, keeping the positions of the magnetic markers (first markers) attached to the body surface unchanged. The physician turns on the magnetic tracking system (tracking and positioning system), and the navigation software automatically records the preoperative physical space positions of the magnetic markers on the patient's body surface: Y_Physical = {y_i, i ∈ n}, where Y_Physical is the preoperative physical space position set, y_i is the position coordinate of the first marker in preoperative physical space at a given preoperative moment, and n indexes the time points.
Then, registration is performed on the two point-set pairs of the magnetic markers, one in the image of the patient's body surface and one in physical space, to obtain the registration result; the registration of the two point-set pairs is completed automatically, enabling fused guidance of the preoperative and intraoperative images. Finally, surgical guidance of the surgical instrument 203 is performed using the target surgical path, the registration result, the intraoperative image data, and the intraoperative physical space position, guiding the surgeon to operate safely and accurately.
As shown in fig. 3, in an embodiment, an ultrasonic surgical navigation method is provided, and the ultrasonic surgical navigation method is applied to an ultrasonic surgical navigation system, which may be applied to a terminal or a server, and this embodiment is exemplified by being applied to a terminal. The method specifically comprises the following steps:
301. acquiring a preoperative image space point set corresponding to a first marker;
302. acquiring a preoperative physical space position corresponding to the first marker;
It should be noted that in steps 301 and 302 the preoperative image space point set and the preoperative physical space position corresponding to the first marker are obtained, that is, coordinate data in the two preoperative coordinate systems. These may be obtained in several ways: the imaging system and the tracking and positioning system may each process their own coordinate data, with the ultrasonic surgical navigation system reading the results directly; or the ultrasonic surgical navigation system may itself acquire the data from the imaging system and the tracking and positioning system and extract the preoperative image space point set and the preoperative physical space position corresponding to the first marker. This is not limited here. It will be understood that the imaging system and the tracking and positioning system each communicate with the ultrasonic surgical navigation system, so data sharing can be achieved.
The preoperative image space point set is the set of position coordinates, at each time, of the first marker in the preoperative three-dimensional image space. The preoperative image data may be tomographic data or nuclear magnetic resonance data, obtained by performing tomography or nuclear magnetic resonance scanning on the patient. Specifically, the preoperative image space point set corresponding to the first marker can be expressed as X_Image = {x_i, i ∈ n}, where X_Image is the preoperative image space point set, x_i is the position coordinate of the first marker in preoperative image space at a given preoperative moment, and n indexes the time points. A UNet deep-learning neural network can be trained on the acquired preoperative image data containing the first marker, so that the marker's spatial position is extracted automatically, yielding the preoperative image space point set corresponding to the first marker. The preoperative image space point set comprises real-time preoperative image space position coordinates of a plurality of first markers.
The preoperative physical space position is the position (coordinates) of the first marker in the coordinate system of the tracking and positioning system. Illustratively, the patient lies on the operating bed with the first markers 204 attached to the body surface in the same positions as during three-dimensional imaging; when the physician turns on the tracking and positioning system, the navigation software automatically records the physical space positions of the first markers 204 on the patient's body surface, giving the preoperative physical space position of the first marker: a coordinate set comprising the real-time preoperative physical space position coordinates of a plurality of first markers. At least three first markers 204 are provided for contact with the body surface.
303. Registering the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position, and determining a registration result;
Further, in step 303 the preoperative image space and the preoperative physical space are registered according to the preoperative image space point set and the preoperative physical space position, and a registration result is determined. The registration result is used to convert the preoperative image space and the preoperative physical space into the same coordinate system, so that data from the two coordinate systems can be fused and displayed on the same visual terminal interface. Registering the two spaces means aligning them, that is, converting the two coordinate-system spaces into the same coordinate-system space. The registration may proceed by taking the two point-set pairs, the preoperative image space point set and the preoperative physical space positions, and solving for the spatial transformation T_Physical→Image between them using the Iterative Closest Point (ICP) image registration algorithm. The ICP algorithm automatically completes the registration of the two point-set pairs and yields the transformation between the two coordinate systems; the spatial transformation T_Physical→Image is the registration result.
Illustratively, the spatial transformation relation T_Physical→Image can be obtained from the following formulas.

The correspondence relation:

$$\hat{y}_i^{(k)} = \operatorname*{arg\,min}_{y \in Y} \left\| T_k(x_i) - y \right\|$$

The transformation matrix:

$$T_{k+1} = \operatorname*{arg\,min}_{T} \sum_{i} \left\| T(x_i) - \hat{y}_i^{(k)} \right\|^2$$

where x_i is a point of the preoperative image space point set X, y_i is a preoperative physical space position in the set Y, and T_{k+1} is the spatial transformation relation.

It should be noted that one of the most significant features of the ICP algorithm is that it does not require the points in the point sets X and Y to be in exact one-to-one correspondence; conversely, if the transformation matrix T corresponding to the two point sets is known, ICP can be used to determine the one-to-one correspondence between them. Mathematically, the solution process of the ICP algorithm can be regarded as iteratively minimizing the two equations above. In the correspondence estimation step, the current transformation matrix T_k is used to transform each point x_i of the set X, and then the point closest to T_k(x_i) is found in the set Y; this point is labeled as the corresponding point ŷ_i^(k) at the k-th iteration. The result of this step is a set of corresponding point pairs (x_i, ŷ_i^(k)). In the transformation matrix estimation step, the transformation matrix that best describes or explains this set of correspondences is found, yielding the registration result. In this way, the two-dimensional image space and the three-dimensional image space can be fused and displayed.
304. Planning a surgical path by using the preoperative image data, and determining a target surgical path;
It should be noted that surgical planning may be performed based on the preoperative image data to determine a target surgical path; effective path planning can be achieved with the preoperative three-dimensional image data, and the target surgical path is used to guide the surgery. Specifically, surgical path planning may be performed on the preoperative images based on the preoperative image data, and includes selecting a body surface needle insertion point and a puncture target point, to obtain the target surgical path.
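As an illustration of what the planned output may contain, a target surgical path can be reduced to the selected entry point, the puncture target point, a unit insertion direction and an insertion depth. The coordinates and the function name below are hypothetical; this is a sketch, not the patent's planning module.

```python
import numpy as np

def plan_path(entry, target):
    """Derive the insertion direction (unit vector) and depth from a body
    surface needle insertion point and a puncture target point (both in
    preoperative image space coordinates, e.g. millimetres)."""
    entry, target = np.asarray(entry, float), np.asarray(target, float)
    v = target - entry
    depth = np.linalg.norm(v)
    if depth == 0:
        raise ValueError("entry point and target point coincide")
    return v / depth, depth

# hypothetical entry and target points (mm)
direction, depth = plan_path([10.0, 20.0, 0.0], [10.0, 20.0, 50.0])
```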
305. And performing surgical guidance of surgical instruments according to the target surgical path, the registration result, the intraoperative image data and the intraoperative physical space position.
Finally, the surgical instrument 203 can be guided to perform a safe and precise operation through the target surgical path, the registration result, the intra-operative image data and the intra-operative physical spatial position, and the target surgical path, the registration result, the intra-operative image data and the intra-operative physical spatial position are displayed on a visual device in a fusion manner, so as to provide visual reference information for the surgeon.
The invention provides an ultrasonic surgery navigation method, which is applied to an ultrasonic surgery navigation system and comprises the following steps: acquiring a preoperative image space point set corresponding to a first marker; acquiring a preoperative physical space position corresponding to the first marker; registering the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position, and determining a registration result; planning a surgical path by using the preoperative image data, and determining a target surgical path; and performing surgical guidance of the surgical instrument according to the target surgical path, the registration result, the intraoperative image data and the intraoperative physical space position. With this method, the real-time intraoperative position of the surgical instrument can be guided by the target surgical path derived from the preoperative image data, so the spatial position of the surgical instrument need not be determined by repeated intraoperative X-ray fluoroscopic imaging. This greatly reduces the radiation injury to the patient and the doctor, avoids intermittent repeated imaging, reduces time consumption and improves surgical accuracy. By displaying the target surgical path, the registration result, the intraoperative image data and the intraoperative physical space position for guidance, the real-time relative position relationship between the surgical instrument and the surrounding key anatomical structures can be determined.
As shown in fig. 4, fig. 4 is another structural block diagram of an ultrasonic surgical navigation system in an embodiment of the present invention. The ultrasonic surgical navigation system 40 may be applied to a terminal or to a server; this embodiment is illustrated as applied to a terminal. The ultrasonic surgical navigation system 40 specifically includes an imaging system 401, a tracking and positioning system 402 and a surgical instrument 403, wherein the imaging system 401 comprises an ultrasonic probe 4011, a second marker 405 is arranged on the ultrasonic probe 4011, and a third marker 406 is arranged on the surgical instrument 403. The imaging system 401 is configured to obtain preoperative image data to display preoperative images, and to obtain a preoperative image space point set corresponding to the first marker 404. The tracking and positioning system 402 is configured to acquire preoperative tracking data of the first marker 404 so as to acquire the preoperative physical space position of the first marker 404 in the preoperative tracking data. The ultrasonic surgical navigation system 40 is configured to register the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position, and determine a registration result; to plan a surgical path by using the preoperative image data and determine a target surgical path; and to perform surgical guidance of the surgical instrument 403 based on the target surgical path, the registration result, the intraoperative image data and the intraoperative physical space position.
It should be noted that, the contents of the imaging system 401, the tracking and positioning system 402, the surgical instrument 403, and the first marker 404 for contacting the body surface are similar to the contents of the imaging system 201, the tracking and positioning system 202, the surgical instrument 203, and the first marker 204 for contacting the body surface shown in fig. 2, and for avoiding repetition of the description herein, reference may be made to the contents of the imaging system 201, the tracking and positioning system 202, the surgical instrument 203, and the first marker 204 for contacting the body surface shown in fig. 2.
In one possible implementation, the marker comprises a magnetic marker or an optical marker, and the corresponding tracking and positioning system comprises a magnetic tracking system or an optical tracking system. For example, a magnetic marker corresponds to a magnetic tracking system, and an optical marker corresponds to an optical tracking system; the marker may be configured as needed and is not limited here.
In one possible implementation, the ultrasonic surgical navigation system 40 determines the registration result, which may include: registering the preoperative image space point set and the preoperative physical space position by using the iterative closest point algorithm, and determining a target transformation matrix corresponding to the registration of the preoperative image space and the preoperative physical space; and taking the target transformation matrix as the registration result. The target transformation matrix may be written as T_Physical→Image.
Illustratively, taking a thoracoabdominal puncture interventional operation as an example, surgical intervention guided by intraoperative two-dimensional real-time ultrasound images is currently the most widely used solution. In the traditional puncture interventional procedure, because the two-dimensional real-time ultrasound image lacks three-dimensional spatial information, the doctor needs to frequently perform X-ray imaging of the patient's lesion site in order to grasp the relative position relationship between interventional devices such as the puncture needle and the puncture target point, so as to ensure that the puncture interventional device correctly reaches the lesion and does not damage key anatomical structures such as blood vessels and nerves, thereby avoiding complications. However, X-ray imaging causes radiation damage to patients and doctors; to view the vascular anatomy, multiple successive applications of contrast medium to the patient are often required; and the interventional procedure is repeatedly interrupted by X-ray imaging, making the procedure time-consuming and heavily dependent on the doctor's experience. Ultrasound image guidance has the advantages of real-time performance, absence of radiation and portability, and is well suited to imaging guidance of the soft tissues of the chest and abdomen, but the ultrasound image lacks three-dimensional spatial positioning information and free-hand scanning depends on the doctor's experience.
Therefore, according to the characteristics of vascular interventional surgery, the markers are combined with the surgical instrument and the imaging system to locate the real-time position of the puncture interventional instrument, and the fusion display of the real-time ultrasound image with the preoperative three-dimensional surgical planning path reduces the frequency of X-ray angiographic imaging, reduces radiation damage and improves the efficiency of the puncture interventional operation, as described in detail below.
In one possible implementation, the ultrasound probe 4011 is configured to acquire intraoperative image data, wherein the intraoperative image data is two-dimensional image data, and can be used for ultrasound imaging. In this embodiment, intraoperative ultrasound imaging is used to reduce radiation damage and multiple applications of contrast agent are not required; further, the tracking and positioning system 402 is also used to acquire the intraoperative physical space position of the second marker 405; the ultrasound navigation system 40 is further configured to perform an image fusion display using the registration result and the intraoperative physical spatial position of the second marker to determine a spatial pose of an intraoperative ultrasound imaging plane corresponding to the ultrasound probe in the preoperative image space, where the image fusion display includes a fusion display of an intraoperative ultrasound image and a preoperative image.
In one possible implementation, the tracking and positioning system 402 is also used to acquire the intraoperative physical spatial location of the third marker 406; the ultrasound navigation system 40 is further configured to determine a preoperative image spatial position corresponding to the intraoperative physical spatial position using the registration result and the intraoperative physical spatial position of the third marker 406 to determine the spatial pose of the surgical instrument intraoperatively in the preoperative image space.
For the convenience of describing the present embodiment, the operation principle of the system shown in fig. 4 will be described in detail with reference to fig. 1 and fig. 5, and specifically refer to the following:
(1) the system setting and navigation operation process comprises the following steps:
1.1) preoperative preparation: referring to fig. 1, a patient 10 lies on an operating bed 11 and prepares for a liver and abdomen interventional operation such as anesthesia and disinfection, and a magnetic marker 12 is attached to an interventional site of the patient 10.
1.2) preoperative image data preparation: after preoperative preparation, a patient pushes in CT or MR imaging equipment to perform CT/MRI image scanning of an interventional part, and preoperative three-dimensional image scanning data are obtained.
1.3) Preoperative image space position extraction of the magnetic markers 12: a UNet deep learning network is adopted to automatically extract the image space positions of the magnetic markers in the image data, obtaining the preoperative image space point set X_Image = {x_i, i ∈ n}.
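The UNet network itself is not reproduced here. As a stand-in, assuming a binary marker segmentation (e.g. a thresholded network output), the point set X_Image of marker centroids can be extracted with connected-component labelling; this SciPy-based sketch and its function name are assumptions, not the patent's method.

```python
import numpy as np
from scipy import ndimage

def marker_points(mask):
    """Given a binary 3D segmentation of the markers (e.g. a thresholded
    UNet output volume), return one centroid per connected component as
    the image-space point set X_Image (array of shape (n, 3))."""
    labels, n = ndimage.label(mask)                       # connected components
    centroids = ndimage.center_of_mass(mask, labels, list(range(1, n + 1)))
    return np.array(centroids)
```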
1.4) Automatic registration of the preoperative image space and the patient physical space: the magnetic tracking system 13 is turned on, and the navigation system software automatically records the current preoperative physical space positions of the magnetic markers 12; registration of the image space and the patient physical space — that is, registration and calibration of the two coordinate systems — is then completed automatically through a feature registration algorithm, namely the Iterative Closest Point (ICP) algorithm.
1.5) intraoperative ultrasound image real-time guidance: an ultrasonic imaging system is used as an intraoperative image, a positioning mark point 16 is attached to the surface of an ultrasonic probe 15, so that the tracking and positioning system 13 can track and position the position of the ultrasonic probe 15 in a three-dimensional physical space in real time through the positioning mark point 16, and the real-time intraoperative image acquired by the ultrasonic probe 15 is transmitted to an image workstation 14 through a data line for display.
1.6) an interventional puncture guiding needle: in order to facilitate the space positioning and real-time tracking of the interventional surgical instrument 17, the system installs the puncture interventional instrument 17 with the magnetic marker 18, and can realize the three-dimensional real-time tracking and visualization of the puncture interventional surgical instrument 17 under a unified magnetic tracking coordinate system. The surgical instrument 17 of the present invention may be selected from different surgical instruments according to a specific surgical operation, and is not limited herein.
1.7) in the interventional operation, firstly, acquiring an ultrasonic real-time image and combining the spatial pose of a corresponding scanning plane to acquire the three-dimensional spatial pose of an ultrasonic image in preoperative diagnosis image data such as CT/MR and the like. The imaging workstation also carries out real-time fusion display on the ultrasonic images and the preoperative sectional images according to the preoperative positioned coordinate mapping relation and gives the position of the puncture interventional surgical instrument 17 in a patient image coordinate system in real time. The doctor mainly carries out the operation according to the three-dimensional real-time rendering result of the image workstation.
(2) The intraoperative navigation process is as shown in fig. 5 and fig. 1:
step 1, initializing and preparing a navigation system: the marker 12 of the wireless or wired magnetic positioning sensor is adhered to the surface of the patient, and the position of the marker 12 is kept fixed in the whole interventional operation. At the same time, the ultrasound probe 15 and the penetrating interventional instrument 17, with the markers 16 and 18 of the wireless or wired magnetic positioning sensor mounted, are ensured to be ready in place.
Step 2, preoperative three-dimensional image data acquisition: CT or MR tomography images of the interventional region of the patient and angiographic images of the lesion region of the patient are acquired and transmitted to the imaging workstation 14.
Step 3, interventional path planning: after a preoperative sectional image is obtained, a body surface needle inserting point and a puncture target point can be defined according to the preoperative three-dimensional sectional image; and defining a puncture intervention path according to the needle insertion point and the target point to obtain a target operation path.
Step 4, extracting the image positions of the magnetic markers 12: a UNet deep learning network is adopted to automatically extract the magnetic markers 12 in the three-dimensional image space, recorded as X_Image = {x_i, i ∈ n}. The UNet network can be trained on a plurality of image data of the magnetic markers acquired before the operation.
Step 5, automatic registration: after the patient 10 is transferred to the operating table 11, the real-time spatial position coordinates of the magnetic positioning sensors 12 on the surface of the patient 10 at the current moment are acquired, Y_Physical = {y_i, i ∈ n}, and transmitted to the imaging workstation 14. A least-squares solution is then computed from the two groups of spatial position data and feature point registration is performed, thereby realizing automatic registration of the patient image space and the physical position space.
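Since the markers give matched point pairs, the least-squares feature-point registration of step 5 can be sketched as a closed-form rigid fit (Kabsch/SVD) that yields a 4×4 homogeneous matrix; the function name and the assumption of noise-free matched markers are illustrative, not the patent's exact solver.

```python
import numpy as np

def marker_registration(Y_phys, X_img):
    """Closed-form least-squares rigid registration of matched marker
    positions (physical space -> image space), returning a 4x4 homogeneous
    matrix T with T @ [y; 1] ~= [x; 1]. Needs >= 3 non-collinear markers."""
    cy, cx = Y_phys.mean(axis=0), X_img.mean(axis=0)
    H = (Y_phys - cy).T @ (X_img - cx)    # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # correct an improper (reflection) solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = cx - R @ cy
    return T
```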
Step 6, three-dimensional real-time display of the intraoperative ultrasound image: the spatial position of the ultrasonic real-time imaging device in the operation can be transformed into a spatial coordinate system in which the CT/MR image data before the operation is positioned by utilizing the registered spatial transformation matrix T. Because the ultrasonic probe is provided with the magnetic marker 16, the spatial posture of the ultrasonic imaging plane can be determined in the preoperative image coordinate system after registration.
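Applying the registered spatial transformation matrix T to a tracked position, as in step 6, amounts to one homogeneous matrix-vector product; the sketch below assumes T is stored as a 4×4 homogeneous matrix, and the function names are illustrative.

```python
import numpy as np

def to_image_space(T_phys_to_img, p_phys):
    """Map a tracked physical-space point (e.g. a marker on the ultrasound
    probe or puncture needle) into the preoperative CT/MR image coordinate
    system via the homogeneous transform T."""
    ph = np.append(np.asarray(p_phys, dtype=float), 1.0)
    return (T_phys_to_img @ ph)[:3]

def direction_to_image_space(T_phys_to_img, d_phys):
    """Rotate a direction vector (e.g. the imaging-plane normal) into image
    space; directions use only the rotational part of T."""
    return T_phys_to_img[:3, :3] @ np.asarray(d_phys, dtype=float)
```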
Step 7, real-time tracking of the puncture needle 17 in the interventional operation in the preoperative three-dimensional operation planning: since the magnetic marker 18 is installed on the interventional puncture instrument 17, the spatial position of the interventional puncture instrument during operation can be transformed into the spatial coordinate system of the CT/MR image data before operation by using the registered spatial transformation matrix. The puncture needle model and the patient model are displayed in the same three-dimensional scene in real time, and the tracking process of the navigation operation is completed.
Step 8, according to the doctor's judgment of the interventional procedure, steps 6 and 7 are repeated when necessary to update the spatial position of the ultrasonic imaging plane and the spatial position of the puncture interventional instrument; that is, the doctor adjusts the detection position of the ultrasonic probe 15 and the interventional position of the surgical instrument 17 in real time according to the visual display, so as to realize surgical guidance that better conforms to the target surgical path.
The method adopted by the invention does not need to determine the space position of the puncture needle from the X-ray perspective image, so that the X-ray perspective imaging of the patient is not needed in the whole process of the puncture interventional operation, the radiation injury to the patient and a doctor can be greatly reduced, the precision and the stability are better, and the operation efficiency is improved. The method adopted by the invention not only can acquire the real-time position of the puncture needle, but also can display the puncture needle in an ultrasonic two-dimensional image, a preoperative high-resolution image and an operation planning path, and can acquire the relative position relation between the puncture needle and a blood vessel and a key anatomical structure in real time.
FIG. 6 is a diagram illustrating the internal structure of the computer device in one embodiment. The computer device may specifically be a terminal or a server. As shown in fig. 6, the computer device includes a processor, a memory and a network interface connected by a system bus, wherein the memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program which, when executed by the processor, causes the processor to carry out the above-mentioned method. The internal memory may also store a computer program which, when executed by the processor, causes the processor to perform the method described above. Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In an embodiment, a computer device is proposed, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the method as shown in fig. 3.
In an embodiment, a computer-readable storage medium is proposed, in which a computer program is stored which, when being executed by a processor, causes the processor to carry out the steps of the method as shown in fig. 3.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. An ultrasonic surgical navigation system, comprising: the system comprises an imaging system, a tracking and positioning system, a surgical instrument and a first marker for contacting with the body surface;
the image system is used for acquiring preoperative image data to display preoperative images and acquiring a preoperative image space point set corresponding to the first marker;
the tracking and positioning system is used for acquiring preoperative tracking data of the first marker so as to acquire a preoperative physical space position of the first marker in the preoperative tracking data;
the ultrasonic operation navigation system is used for registering the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position, and determining a registration result; planning a surgical path by using the preoperative image data, and determining a target surgical path; and performing surgical guidance of the surgical instrument according to the target surgical path, the registration result, the intra-operative image data and the intra-operative physical space position.
2. The ultrasonic surgical navigation system of claim 1, wherein the imaging system includes an ultrasonic probe having a second marker disposed thereon;
the ultrasonic probe is used for acquiring intraoperative image data;
the tracking and positioning system is also used for acquiring the intraoperative physical space position of the second marker;
the ultrasonic navigation system is further configured to perform image fusion display by using the registration result and the intraoperative physical spatial position of the second marker to determine a spatial posture of an intraoperative ultrasonic imaging plane corresponding to the ultrasonic probe in a preoperative image space, where the image fusion display includes fusion display of an intraoperative ultrasonic image and a preoperative image.
3. The ultrasonic surgical navigation system of claim 1, wherein a third marker is disposed on the surgical instrument;
the tracking and positioning system is also used for acquiring the intraoperative physical space position of the third marker;
the ultrasonic navigation system is further used for determining a preoperative image space position corresponding to the intraoperative physical space position by using the registration result and the intraoperative physical space position of the third marker so as to determine the spatial posture of the surgical instrument in the operation in the preoperative image space.
4. The ultrasonic surgical navigation system of claim 1, wherein the ultrasonic surgical navigation system determines a registration result, comprising:
registering the preoperative image space point set and the preoperative physical space position by using an iterative closest point algorithm, and determining a target conversion matrix corresponding to registration of the preoperative image space and the preoperative physical space;
and taking the target conversion matrix as the registration result.
5. The ultrasonic surgical navigation system of any one of claims 1-4, wherein the marker comprises a magnetic marker or an optical marker.
6. The ultrasonic surgical navigation system of claim 5, wherein the tracking and positioning system comprises a magnetic tracking system or an optical tracking system.
7. The ultrasonic surgical navigation system of claim 5, wherein the pre-operative image comprises a tomographic image or a magnetic resonance image.
8. An ultrasonic surgical navigation method applied to the ultrasonic surgical navigation system according to any one of claims 1 to 7, the method comprising:
acquiring a preoperative image space point set corresponding to a first marker;
acquiring a preoperative physical space position corresponding to the first marker;
registering the preoperative image space and the preoperative physical space according to the preoperative image space point set and the preoperative physical space position, and determining a registration result;
planning a surgical path by using the preoperative image data, and determining a target surgical path;
and performing surgical guidance of surgical instruments according to the target surgical path, the registration result, the intraoperative image data and the intraoperative physical space position.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, causes the processor to carry out the steps of the method as claimed in claim 8.
10. A computer arrangement comprising a memory and a processor, characterized in that the memory stores a computer program which, when executed by the processor, causes the processor to carry out the steps of the method as claimed in claim 8.
CN202210228410.6A 2022-03-08 2022-03-08 Ultrasonic operation navigation system and method, storage medium and device Pending CN114652443A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210228410.6A CN114652443A (en) 2022-03-08 2022-03-08 Ultrasonic operation navigation system and method, storage medium and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210228410.6A CN114652443A (en) 2022-03-08 2022-03-08 Ultrasonic operation navigation system and method, storage medium and device

Publications (1)

Publication Number Publication Date
CN114652443A true CN114652443A (en) 2022-06-24

Family

ID=82029813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210228410.6A Pending CN114652443A (en) 2022-03-08 2022-03-08 Ultrasonic operation navigation system and method, storage medium and device

Country Status (1)

Country Link
CN (1) CN114652443A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114842004A (en) * 2022-07-04 2022-08-02 真健康(北京)医疗科技有限公司 Puncture position verification method and device based on neural network model
CN114842004B (en) * 2022-07-04 2022-10-21 真健康(北京)医疗科技有限公司 Puncture position verification method and device based on neural network model
CN115500868A (en) * 2022-11-09 2022-12-23 中南大学 B-ultrasonic positioning system capable of interactively confirming position information with detected target

Similar Documents

Publication Publication Date Title
US8781186B2 (en) System and method for abdominal surface matching using pseudo-features
US10229496B2 (en) Method and a system for registering a 3D pre acquired image coordinates system with a medical positioning system coordinate system and with a 2D image coordinate system
US10582879B2 (en) Method and apparatus for registration, verification and referencing of internal organs
EP2081494B1 (en) System and method of compensating for organ deformation
US10166078B2 (en) System and method for mapping navigation space to patient space in a medical procedure
US9486295B2 (en) Universal image registration interface
JP2016193222A (en) Method and apparatus for analyzing images
JP2008126075A (en) System and method for visual verification of ct registration and feedback
US11191595B2 (en) Method for recovering patient registration
JP2002186603A (en) Method for transforming coordinates to guide an object
US20070118100A1 (en) System and method for improved ablation of tumors
JP2011524772A (en) Method and system for performing a biopsy
CN114652443A (en) Ultrasonic operation navigation system and method, storage medium and device
CN106901719B (en) Registration between coordinate systems for visualizing tools
Rasoulian et al. Ultrasound-guided spinal injections: a feasibility study of a guidance system
Oliveira-Santos et al. A navigation system for percutaneous needle interventions based on PET/CT images: design, workflow and error analysis of soft tissue and bone punctures
Linte et al. Image-guided procedures: tools, techniques, and clinical applications
CN115775611B (en) Puncture operation planning system
CN113940756B (en) Operation navigation system based on mobile DR image

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination