CN111973273A - Operation navigation system, method, device and medium based on AR technology - Google Patents
- Publication number
- CN111973273A (application CN202010897861.XA)
- Authority
- CN
- China
- Prior art keywords
- real
- image
- positioning
- dynamic
- patient
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B34/25—User interfaces for surgical systems
- A61B34/30—Surgical robots
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
- A61B2034/107—Visualisation of planned trajectories or target regions
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Robotics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
Abstract
The application provides a surgical navigation system, method, device and medium based on AR technology, wherein the system comprises: a scanning device, a processor, a dynamic positioning acquisition device and an AR real-time monitoring robot. The surgical navigation provided herein acquires a three-dimensional simulation image of the patient through the scanning device, superimposes real-time dynamic data of the patient and the surgical instruments onto the three-dimensional simulation image in real time by means of the dynamic positioning acquisition device, and further combines the real-time dynamic data of the AR real-time monitoring robot, so that the image presented in the AR real-time monitoring robot is accurately superimposed on the patient in the real environment while the spatial position of the surgical instruments relative to the patient's anatomy is presented more intuitively, better assisting the doctor in performing the operation.
Description
Technical Field
The invention relates to the technical field of AR (augmented reality) surgical navigation, in particular to a surgical navigation system, a surgical navigation method, surgical navigation equipment and a surgical navigation medium based on an AR technology.
Background
In the prior art, computer-aided navigation is generally adopted so that a doctor can clearly know the position of a surgical instrument relative to the patient's anatomy. Such navigation works as follows: signal-emitting devices are arranged near the patient's surgical site and on the surgical instrument, typically using infrared light as the emission source and a CCD (charge-coupled device) camera as the receiver; the emitted signals are used to track the patient's bone position as well as the position and motion trajectory of the surgical instrument, and this information is shown to the doctor on a display. During the operation, the surgical site is imaged under X-ray fluoroscopy, and the fluoroscopic image is combined with the acquired bone-position and instrument-position images to produce the navigation image the doctor operates from.
Since the navigation system displays the positioning and guidance information on its own screen, the doctor operates by watching the navigation screen, so the navigation information is separated from the surgical scene. During the operation, in order to see the position of the surgical instrument relative to the patient's anatomy, the doctor must switch his or her gaze back and forth between the surgical site and the navigation screen, which makes it hard to stay focused on the surgical site, interferes with the surgical workflow, and increases navigation positioning error.
On the other hand, the navigation information is not easy to interpret. The doctor localizes intraoperatively mainly from tomographic images, overlaid with surgical instrument information, shown on the navigation screen. Such images do not reflect the patient's current body position well, so the doctor cannot intuitively grasp the spatial relationship between the surgical instrument and the patient's actual anatomy in the real surgical scene, and the potential of the surgical navigation system is not fully realized.
Disclosure of Invention
In view of the above drawbacks of the prior art, an object of the present application is to provide a surgical navigation system, a method, a device and a medium based on AR technology, so as to overcome the defect in the prior art that the navigation information in the AR real-time monitoring robot cannot be accurately dynamically superimposed on the actual surgical scene.
To achieve the above and other related objects, the present application provides a surgical navigation system based on AR technology, the system comprising: a scanning device, a processor, a dynamic positioning acquisition device and an AR real-time monitoring robot; the scanning device is used for scanning a patient before the operation to obtain scanning image data and sending the scanning image data to the processor; the dynamic positioning acquisition device is used for acquiring, during the operation, a positioning image of the patient in a dynamic positioning coordinate system, first real-time dynamic data of the patient and second real-time dynamic data of the AR real-time monitoring robot in the dynamic positioning coordinate system, and sending the positioning image, the first real-time dynamic data and the second real-time dynamic data to the processor; the processor is used for generating a three-dimensional simulation image according to the scanning image data; statically registering the three-dimensional simulation image in the dynamic positioning coordinate system according to the positioning image, and dynamically registering the statically registered three-dimensional simulation image with the patient according to the first real-time dynamic data; acquiring coordinate data of the patient in the AR real-time monitoring robot coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulation image, generating a three-dimensional navigation image that coincides in real time with the patient in the actual environment; and the AR real-time monitoring robot is used for receiving and displaying the three-dimensional navigation image.
In an embodiment of the present application, the dynamic positioning acquisition apparatus includes: an optical dynamic tracking device for acquiring the first real-time dynamic data and the second real-time dynamic data, a positioning camera device for acquiring the positioning image, and a plurality of marking components; wherein each marking component includes: a positioning component and a dynamic tracking component.
In an embodiment of the present application, a coordinate of the positioning camera device in the dynamic positioning coordinate system is a preset coordinate, so that the processor generates a coordinate parameter according to the preset coordinate and the positioning image, and performs static registration according to the coordinate parameter.
In an embodiment of the present application, the positioning component and the dynamic tracking component are attached to a position on the body of the patient within a certain preset range from the surgical site, so that the scanning image data scanned by the scanning device includes the image of the positioning component and the image of the dynamic tracking component, and the positioning image acquired by the positioning camera device includes the image of the positioning component and the image of the dynamic tracking component.
In an embodiment of the present application, the positioning member is a film whose front pattern and back pattern are the same; the front pattern is a coating mark of barium sulfate, and the scanning image data obtained by scanning the patient before the operation contains this coating mark; when the positioning member is attached to the patient's body, the back pattern is printed onto the patient's skin to form a watermark pattern, and the positioning image acquired by the positioning camera device during the operation contains this watermark pattern.
In an embodiment of the present application, the dynamic tracking component is further disposed on the AR real-time monitoring robot, so that the optical tracking device identifies the dynamic tracking component during an operation to obtain the first real-time dynamic data and the second real-time dynamic data.
In an embodiment of the present application, the dynamic tracking component is further disposed on a surgical instrument, so that the positioning image obtained by the positioning camera device further includes a surgical instrument image; and for the optical tracking device to intra-operatively identify the dynamic tracking component to obtain third real-time dynamic data corresponding to the surgical instrument; the processor is further configured to dynamically register the statically registered three-dimensional simulated image with the patient based on the first real-time dynamic data and the third real-time dynamic data.
To achieve the above and other related objects, the present application provides a surgical navigation method based on AR technology, applied to a surgical navigation system based on AR technology as described above, the method including: acquiring scanning image data from a preoperative scan of the patient, and generating a three-dimensional simulation image according to the scanning image data; acquiring, during the operation, a positioning image of the patient in a dynamic positioning coordinate system, first real-time dynamic data of the patient and second real-time dynamic data of the AR real-time monitoring robot in the dynamic positioning coordinate system; statically registering the three-dimensional simulation image in the dynamic positioning coordinate system according to the positioning image, and dynamically registering the statically registered three-dimensional simulation image with the patient according to the first real-time dynamic data; and acquiring coordinate data of the patient in the AR real-time monitoring robot coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulation image, generating a three-dimensional navigation image that coincides in real time with the patient in the actual environment, for display by the AR real-time monitoring robot.
To achieve the above and other related objects, the present application provides a computer device, comprising: a memory, a processor, and a communicator; the memory is configured to store computer instructions; the processor executes the computer instructions to implement the method described above; and the communicator is configured to be communicatively connected to an external device.
To achieve the above and other related objects, the present application provides a computer readable storage medium storing computer instructions which, when executed, perform the method as described above.
In summary, the present application provides a surgical navigation system, method, device and medium based on AR technology, which acquires scanning image data from a preoperative scan of the patient and generates a three-dimensional simulation image from it; acquires, during the operation, a positioning image of the patient in a dynamic positioning coordinate system, first real-time dynamic data of the patient and second real-time dynamic data of the AR real-time monitoring robot in that coordinate system; statically registers the three-dimensional simulation image in the dynamic positioning coordinate system according to the positioning image, and dynamically registers the statically registered image with the patient according to the first real-time dynamic data; and acquires coordinate data of the patient in the AR real-time monitoring robot coordinate system from the second and first real-time dynamic data and, combined with the dynamically registered three-dimensional simulation image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment, for display by the AR real-time monitoring robot.
The application has the following beneficial effects:
The surgical navigation provided herein acquires a three-dimensional simulation image of the patient through the scanning device, superimposes real-time dynamic data of the patient and the surgical instruments onto the three-dimensional simulation image in real time by means of the dynamic positioning acquisition device, and further combines the real-time dynamic data of the AR real-time monitoring robot, so that the image presented in the AR real-time monitoring robot is accurately superimposed on the patient in the real environment while the spatial position of the surgical instruments relative to the patient's anatomy is presented more intuitively, better assisting the doctor in performing the operation.
Drawings
Fig. 1 is a schematic view of a scene of an AR technology-based surgical navigation system according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of an AR technology-based surgical navigation system according to an embodiment of the present application.
Fig. 3 is a flowchart illustrating an AR-technology-based surgical navigation method according to an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application is provided by way of specific examples, and other advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein. The present application is capable of other and different embodiments and its several details are capable of modifications and/or changes in various respects, all without departing from the spirit of the present application. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only schematic illustrations of the basic idea of the present application. The drawings show only the components related to the present application and are not drawn according to the number, shape and size of components in an actual implementation; in practice, the type, quantity and proportion of the components may vary, and the component layout may be more complex.
Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only a case of being "directly connected" but also a case of being "indirectly connected" with another element interposed therebetween. In addition, when a certain part is referred to as "including" a certain component, unless otherwise stated, other components are not excluded, but it means that other components may be included.
The terms first, second, third, etc. are used herein to describe various elements, components, regions, layers and/or sections, but are not limited thereto. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the scope of the present application.
Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, operations, elements, components, items, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when a combination of elements, functions or operations is inherently mutually exclusive in some way.
To overcome the prior-art defect that the navigation information in an AR real-time monitoring robot cannot be accurately and dynamically superimposed on the actual surgical scene, the present application provides a surgical navigation system, method, device and medium based on AR technology.
Fig. 1 is a schematic view of a scene of the AR technology-based surgical navigation system according to an embodiment of the present application. As shown, the system comprises: the system comprises a scanning device 1, a processor 2, a dynamic positioning acquisition device 3 and an AR real-time monitoring robot 4.
The scanning device 1 is configured to scan a patient 6 before an operation to obtain scan image data, and send the scan image data to the processor 2.
The dynamic positioning acquisition device 3 is used for acquiring a positioning image of the patient 6 in a dynamic positioning coordinate system, first real-time dynamic data of the patient 6 and second real-time dynamic data of the AR real-time monitoring robot 4 in the dynamic positioning coordinate system during the operation, and sending the positioning image, the first real-time dynamic data and the second real-time dynamic data to the processor 2. Specifically, the dynamic positioning acquisition device 3 includes: an optical dynamic tracking device for acquiring the first real-time dynamic data and the second real-time dynamic data, a positioning camera device for acquiring the positioning image, and a plurality of marking units 33, wherein each marking unit 33 includes: a positioning component and a dynamic tracking component.
The processor 2 is configured to generate a three-dimensional simulation image according to the scanned image data; statically register the three-dimensional simulation image in the dynamic positioning coordinate system according to the positioning image, and dynamically register the statically registered three-dimensional simulation image with the patient 6 according to the first real-time dynamic data; and acquire coordinate data of the patient 6 in the coordinate system of the AR real-time monitoring robot 4 according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulation image, generate a three-dimensional navigation image that coincides in real time with the patient 6 in the actual environment.
And the AR real-time monitoring robot 4 is used for receiving and displaying the three-dimensional navigation image.
The present application is further illustrated by the following examples, which are not intended to limit the scope of the invention. Where specific conditions are not stated in the following examples, the experimental methods follow conventional methods and conditions, or the manufacturers' instructions.
Fig. 2 is a schematic structural diagram of the AR technology-based surgical navigation system according to an embodiment of the present application. As shown, the system comprises: the system comprises a scanning device 1, a processor 2, a dynamic positioning acquisition device 3 and an AR real-time monitoring robot 4.
The scanning device 1 is used for scanning the patient 6 before operation to acquire scanning image data and sending the scanning image data to the processor 2, for example, the scanning device 1 in this application may be a CT apparatus, and after the patient 6 is scanned by the CT apparatus, a plurality of tomographic images, that is, the scanning image data, are generated.
The processor 2 is configured to receive the scanned image data and generate a three-dimensional simulated image of the patient 6 from the scanned image data, wherein the generated three-dimensional simulated image is simulated in proportion to the patient 6, and the bone, blood vessel, muscle tissue, etc. of the patient 6 can be observed through the three-dimensional simulated image.
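As a non-limiting illustration of this step, the following Python sketch stacks preoperative CT slices into a volume and extracts a bone surface mesh in proportion to the patient; the directory name, the +300 HU bone threshold, and the use of the pydicom and scikit-image libraries are illustrative assumptions rather than details of this application.

```python
# Minimal sketch: stack preoperative CT slices into a volume and extract
# a bone surface mesh as the three-dimensional simulation image.
import numpy as np
import pydicom
from pathlib import Path
from skimage import measure

def build_simulation_mesh(slice_dir: str, bone_hu: float = 300.0):
    # Read and sort the tomographic slices by table position.
    slices = [pydicom.dcmread(p) for p in Path(slice_dir).glob("*.dcm")]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array.astype(np.int16) for s in slices])
    # Convert raw pixel values to Hounsfield units.
    volume = volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)
    # Voxel spacing scales the mesh so the model is proportional to the patient.
    dz = abs(float(slices[1].ImagePositionPatient[2]) - float(slices[0].ImagePositionPatient[2]))
    dy, dx = map(float, slices[0].PixelSpacing)
    # Marching cubes yields a triangle mesh of the bone isosurface.
    verts, faces, _, _ = measure.marching_cubes(volume, level=bone_hu, spacing=(dz, dy, dx))
    return verts, faces
```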
In addition, when the patient 6 is located within the effective detection range of the dynamic positioning acquisition device 3, the dynamic positioning acquisition device 3 acquires an initial static positioning image and first real-time dynamic data of the patient 6 in the dynamic positioning coordinate system during the operation and sends this information to the processor 2. The processor 2 processes the positioning image with a machine vision algorithm, extracts the target image from the positioning image, and calculates the coordinate parameters of the target image in the coordinate system of the dynamic positioning acquisition device 3.
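As a non-limiting illustration of the machine vision step, the sketch below locates a marker pattern in a camera frame and computes its coordinate parameters as a pose in the camera's coordinate system; the circle-grid pattern, the 4x11 grid size, the camera matrix K and distortion coefficients D are illustrative assumptions, and OpenCV is assumed as the vision library.

```python
# Minimal sketch: detect the positioning pattern in a grayscale frame and
# recover the rigid transform marker -> camera via PnP.
import cv2
import numpy as np

def locate_marker(frame_gray, object_pts, K, D, grid=(4, 11)):
    # object_pts: Nx3 float32 array of the pattern's physical point layout.
    found, centers = cv2.findCirclesGrid(
        frame_gray, grid, flags=cv2.CALIB_CB_ASYMMETRIC_GRID)
    if not found:
        return None
    ok, rvec, tvec = cv2.solvePnP(object_pts, centers, K, D)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)          # rotation vector -> 3x3 matrix
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return T  # coordinate parameters of the target in the camera frame
```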
Meanwhile, the dynamic positioning acquisition device 3 is further configured to acquire second real-time dynamic data of the AR real-time monitoring robot 4 in the dynamic positioning coordinate system and send them to the processor 2. Specifically, a marking unit 33 trackable by the dynamic positioning acquisition device 3 may be provided on the AR real-time monitoring robot 4, and the dynamic positioning acquisition device 3 acquires the second real-time dynamic data of the AR real-time monitoring robot 4 in the dynamic positioning coordinate system by tracking this marking unit 33.
The above-mentioned processor 2 is further configured to statically register the three-dimensional simulation image in the dynamic positioning coordinate system according to the coordinate parameters of the patient 6, and to dynamically register the statically registered three-dimensional simulation image with the patient 6 according to the first real-time dynamic data.
It should be explained that static registration places the three-dimensional simulation image stored in the processor 2 into the coordinate system of the dynamic positioning acquisition device 3, so that the coordinates of the three-dimensional simulation image match those of the patient 6 in the actual environment. Dynamic registration means that the three-dimensional simulation image moves synchronously with the patient 6 in the real environment according to the acquired first real-time dynamic data.
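A minimal sketch of the two registrations in terms of 4x4 homogeneous transforms follows; the frame names (world = dynamic positioning coordinate system, cam = positioning camera, marker = patient-attached marker) and the function names are illustrative assumptions.

```python
import numpy as np

def static_register(T_world_cam, T_cam_marker, model_verts):
    """Place the simulation model at the patient's pose in the
    dynamic positioning (world) coordinate system."""
    T_world_marker = T_world_cam @ T_cam_marker
    v = np.c_[model_verts, np.ones(len(model_verts))]  # homogeneous coords
    return (T_world_marker @ v.T).T[:, :3], T_world_marker

def dynamic_register(T_world_marker_now, T_world_marker_ref, verts_ref):
    """Move the statically registered model with the patient using the
    tracker's first real-time dynamic data (current marker pose)."""
    delta = T_world_marker_now @ np.linalg.inv(T_world_marker_ref)
    v = np.c_[verts_ref, np.ones(len(verts_ref))]
    return (delta @ v.T).T[:, :3]
```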
Considering that the doctor wearing the AR real-time monitoring robot 4 inevitably needs to move during the operation, in order to facilitate the doctor's operation observation, the present embodiment may superimpose the virtual imaging in the AR real-time monitoring robot 4 on the patient 6 in the real environment in real time, as follows.
The processor 2 acquires coordinate data of the patient 6 in the AR coordinate system according to the second real-time dynamic data and the first real-time dynamic data, generates, combined with the dynamically registered three-dimensional simulation image, a three-dimensional navigation image that coincides in real time with the patient 6 in the actual environment, and sends the three-dimensional navigation image to the AR real-time monitoring robot 4, which receives and displays it.
That is, the AR real-time monitoring robot 4 serves as the viewpoint, and an AR coordinate system is constructed with the viewpoint as the coordinate origin; the coordinate data of the patient 6 in the AR coordinate system are calculated in real time, and the registered three-dimensional simulation image is moved to the position represented by those coordinates. In practice, both the coordinates of the patient 6 and the coordinates of the AR real-time monitoring robot 4 in the dynamic positioning coordinate system are available, so it is easy to see that the processor 2 can obtain the coordinates of the patient 6 in the AR coordinate system by subtracting the coordinates of the AR real-time monitoring robot 4 from the coordinates of the patient 6 in the dynamic positioning coordinate system. The processor 2 can therefore generate the three-dimensional navigation image by placing the registered three-dimensional simulation image at the corresponding coordinates in the AR coordinate system, so that the navigation image coincides with the patient 6 in the actual environment in real time.
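A minimal worked sketch of this overlay computation follows; for full six-degree-of-freedom poses the "subtraction" described above generalises to composing the inverse of the headset pose with the patient pose, and reduces to a plain coordinate subtraction when both poses are pure translations. The numbers are illustrative only.

```python
import numpy as np

def patient_in_ar_frame(T_world_ar, T_world_patient):
    # Express the patient's pose in the AR device's coordinate system.
    return np.linalg.inv(T_world_ar) @ T_world_patient

# Example with pure translations, where the composition reduces to subtraction:
T_world_ar = np.eye(4);      T_world_ar[:3, 3] = [0.0, 1.5, 2.0]
T_world_patient = np.eye(4); T_world_patient[:3, 3] = [0.4, 1.0, 0.5]
print(patient_in_ar_frame(T_world_ar, T_world_patient)[:3, 3])
# -> [ 0.4 -0.5 -1.5]: where to render the model in the AR view.
```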
During the operation, even if the doctor wearing the AR real-time monitoring robot 4 moves, the processor 2 can, in real time, indirectly calculate the coordinates of the patient 6 in the AR coordinate system from the coordinates of the AR real-time monitoring robot 4 and of the patient 6 in the dynamic positioning coordinate system, and place the registered three-dimensional simulation image into the AR coordinate system. The virtual image the doctor sees in the AR real-time monitoring robot 4 thus overlaps the patient 6 in the real environment in real time, so that the image presented in the AR real-time monitoring robot 4 is accurately superimposed on the patient 6 in the real environment, which allows the doctor to observe both the patient 6 and the virtual image and facilitates the operation.
In the present embodiment, the dynamic positioning acquisition apparatus 3 includes an optical dynamic tracking device 31 and a positioning camera device 32, for example an NDI optical tracking device and a binocular camera. The optical dynamic tracking device 31 is used to acquire the first real-time dynamic data and the second real-time dynamic data; the positioning camera device 32 is used to acquire the positioning image. The processor 2 is further configured to generate coordinate parameters from the preset coordinates and the positioning image, the coordinate parameters representing the position of the patient 6 in the dynamic positioning coordinate system, and to statically register the three-dimensional simulation image in the dynamic positioning coordinate system according to these coordinate parameters.
It should be noted that the coordinates of the positioning camera device 32 in the dynamic positioning coordinate system are preset coordinates; that is, the position of the positioning camera device 32 in the dynamic positioning coordinate system is known. This embodiment takes an NDI optical tracking device and a binocular camera as a concrete example.
The binocular camera is located within the effective tracking range of the NDI optical tracking device, so the NDI optical tracking device can acquire the coordinate position of the binocular camera. In a specific implementation, a positioning mark trackable by the NDI optical tracking device may be mounted on the binocular camera; the positioning mark may be an active mark or a passive mark, which is not specifically limited in this embodiment. The NDI optical tracking device tracks infrared light emitted or reflected by the positioning mark, thereby acquiring the coordinates of the binocular camera in the dynamic positioning coordinate system, and sends them to the processor 2.
It should be noted that the optical dynamic tracking device 31 also has positioning and ranging capability, but its precision is lower than that of the dedicated positioning camera device 32. To ensure the precision of the dynamic and static data acquired during the operation, this embodiment therefore uses both devices: the optical dynamic tracking device 31 acquires the first real-time dynamic data of the patient 6 and the second real-time dynamic data of the AR real-time monitoring robot 4, while the positioning camera device 32 acquires the positioning image of the patient 6. Since the coordinates of the positioning camera device 32 in the dynamic positioning coordinate system are preset, static registration can be completed by combining them with the positioning image.
In this embodiment, the dynamic positioning acquiring device 3 further includes a plurality of marking units 33, and the marking units 33 include a positioning unit 331 and a dynamic tracking unit 332; the positioning component 331 and the dynamic tracking component 332 are attached to the body within a certain preset range from the surgical site of the patient 6, and the dynamic tracking component 332 is further arranged on the AR real-time monitoring robot 4;
the scanning device 1 is used for acquiring scanning image data containing an image of the positioning component 331 before operation;
the positioning camera device 32 is used for acquiring, during the operation, a positioning image containing an image of the positioning component 331 and an image of the dynamic tracking component 332;
the optical tracking device is used to identify the dynamic tracking component 332 during surgery to acquire the first real-time dynamic data and the second real-time dynamic data.
In the present embodiment, by providing the positioning component 331 and the dynamic tracking component 332 on the patient 6 and on the AR real-time monitoring robot 4, the positioning component 331 can be recognized by the positioning camera device 32, and the dynamic tracking component 332 can be recognized by the optical tracking device.
In this embodiment, the front pattern and the back pattern of the positioning component 331 are the same; the front pattern is a coating mark of barium sulfate, and the scanning device 1 is used to scan before the operation to obtain scanning image data containing the coating mark. The back pattern is printed onto the skin of the patient 6 to form a watermark pattern when the positioning component 331 is attached, and the positioning camera device 32 is used to acquire a positioning image containing the watermark pattern during the operation.
It should be explained that, in practice, the patient 6 is not scanned at the same time as the operation takes place; there is often an interval of, say, three days, five days or a week between the two. During that interval the patient's daily activities, such as walking or changes of sleeping posture, may cause the positioning component 331 to shift, so that the positioning image containing the positioning component 331 acquired by the positioning camera device 32 during the operation would be inaccurate, which in turn would affect the positioning of the three-dimensional simulation image in the dynamic positioning coordinate system. Therefore, the back pattern is printed onto the skin of the patient 6 to form a watermark pattern when the positioning component 331 is attached; the watermark pattern remains on the patient's body surface for a period of time, avoiding such deviation.
In this embodiment, the positioning component 331 is a film whose front and back surfaces carry the same pattern; the pattern may be a grid, concentric circles, equidistant points, etc., and one or more special points are provided on the positioning component 331 as positioning references during the operation. The front surface of the positioning component 331 carries a barium sulfate coating mark, and since barium sulfate is radiopaque it can be recognized by the CT apparatus under X-ray imaging.
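As a non-limiting illustration, the barium sulfate marks can be recovered from the CT volume by intensity thresholding, since barium sulfate attenuates X-rays far more strongly than soft tissue; the +1500 HU cut-off, the minimum component size, and the use of SciPy are illustrative assumptions rather than values from this application.

```python
# Minimal sketch: recover the coating-mark pattern from a CT volume (in
# Hounsfield units) by thresholding and connected-component analysis.
import numpy as np
from scipy import ndimage

def extract_marker_voxels(volume_hu, cutoff=1500.0, min_voxels=20):
    mask = volume_hu > cutoff
    labels, n = ndimage.label(mask)           # connected components
    points = []
    for i in range(1, n + 1):
        idx = np.argwhere(labels == i)
        if len(idx) >= min_voxels:            # drop speckle noise
            points.append(idx.mean(axis=0))   # centroid of one mark
    return np.array(points)  # reference points for static registration
```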
Further, the dynamic tracking component 332 is also disposed on the surgical instrument 5;
the positioning camera device 32 is used for acquiring, during the operation, a positioning image containing an image of the positioning component 331, an image of the dynamic tracking component 332 and an image of the surgical instrument 5;
the optical tracking device is further configured to identify the dynamic tracking component 332 during the operation to obtain third real-time dynamic data of the surgical instrument 5;
the processor 2 is configured to dynamically register the statically registered three-dimensional simulated image with the patient 6 based on the first real-time dynamic data and the third real-time dynamic data.
In one or more embodiments of the present application, the system may further include an alarm device communicatively coupled to the processor 2. Reminder positions, such as the positions of important blood vessels or nerves, are marked in the three-dimensional navigation image, and the processor 2 calculates the minimum distance between the surgical instrument 5 and each reminder position in real time. When this minimum distance is less than or equal to a preset value, the processor 2 controls the alarm device to issue a reminder signal prompting the doctor to operate carefully; when it is less than or equal to a preset danger value, the processor 2 controls the alarm device to issue a warning signal indicating that the current operation risk is high. The alarm device thus improves the safety factor of the operation.
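A minimal sketch of this alarm logic follows; the two thresholds, the alarm callback, and the function names are illustrative assumptions rather than values given in this application.

```python
import numpy as np

REMIND_MM, DANGER_MM = 10.0, 3.0  # assumed thresholds, in millimetres

def check_proximity(tip_xyz, critical_points, alarm):
    """tip_xyz: instrument tip position (3,); critical_points: Nx3 array
    of vessel/nerve positions marked in the three-dimensional navigation
    image; alarm: callback driving the alarm device."""
    d = np.linalg.norm(critical_points - tip_xyz, axis=1).min()
    if d <= DANGER_MM:
        alarm("warning")   # high surgical risk
    elif d <= REMIND_MM:
        alarm("reminder")  # operate carefully
    return d
```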
In this embodiment, by providing the marking component 33 on the surgical instrument 5, the dynamic data of the surgical instrument 5 are acquired in real time during the operation and incorporated into the three-dimensional simulation image, so that the spatial position of the surgical instrument 5 relative to the anatomy of the patient 6 is presented to the doctor more intuitively, better assisting the doctor in performing the operation.
To sum up, the surgical navigation of the present application acquires a three-dimensional simulation image of the patient through the scanning device, superimposes real-time dynamic data of the patient and the surgical instruments onto the three-dimensional simulation image in real time by means of the dynamic positioning acquisition device, and further combines the real-time dynamic data of the AR real-time monitoring robot, so that the image presented in the AR real-time monitoring robot is accurately superimposed on the patient in the real environment while the spatial position of the surgical instruments relative to the patient's anatomy is presented more intuitively, better assisting the doctor in performing the operation.
Fig. 3 is a schematic flow chart of the surgical navigation method based on AR technology in an embodiment of the present application. As shown, the method comprises:
step S301: acquiring scanning image data from a preoperative scan of the patient, and generating a three-dimensional simulation image according to the scanning image data;
step S302: acquiring, during the operation, a positioning image of the patient in a dynamic positioning coordinate system, first real-time dynamic data of the patient and second real-time dynamic data of the AR real-time monitoring robot in the dynamic positioning coordinate system;
step S303: statically registering the three-dimensional simulation image in the dynamic positioning coordinate system according to the positioning image, and dynamically registering the statically registered three-dimensional simulation image with the patient according to the first real-time dynamic data;
step S304: acquiring coordinate data of the patient in the AR real-time monitoring robot coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulation image, generating a three-dimensional navigation image that coincides in real time with the patient in the actual environment, for display by the AR real-time monitoring robot.
Specifically, this method embodiment is similar to the embodiments of the AR technology-based surgical navigation system described with reference to fig. 1 and fig. 2, so a detailed description is omitted here. It should be understood that the method can be applied to the processor 2 in fig. 1 or fig. 2; that is, by executing the program of the method, the processor cooperates with the scanning device, the dynamic positioning acquisition device and the AR real-time monitoring robot of fig. 1 or fig. 2, which supply the relevant data, to complete the functions implemented by the system of fig. 1 or fig. 2.
Fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present invention. As shown, the computer device 400 includes: a memory 401, a processor 402, and a communicator 403; the memory 401 is used for storing computer instructions; the processor 402 executes the computer instructions to implement the method described in fig. 3; the communicator 403 is configured to be communicatively coupled to an external device.
The external device may be a scanning device, a dynamic positioning acquisition device, an AR real-time monitoring robot, or the like as described in fig. 1 or fig. 2.
In some embodiments, the number of memories 401 in the computer device 400 may be one or more, the number of processors 402 may be one or more, and the number of communicators 403 may be one or more; fig. 4 takes one of each as an example.
In an embodiment of the present application, the processor 402 in the computer device 400 loads one or more instructions corresponding to processes of an application program into the memory 401 according to the steps described in fig. 3, and the processor 402 executes the application program stored in the memory 401, thereby implementing the method described in fig. 3.
The Memory 401 may include a Random Access Memory (RAM), and may also include a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. The memory 401 stores an operating system and operating instructions, executable modules or data structures, or a subset thereof, or an expanded set thereof, wherein the operating instructions may include various operating instructions for implementing various operations. The operating system may include various system programs for implementing various basic services and for handling hardware-based tasks.
The processor 402 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The communicator 403 is used to implement the communication connection between the database access device and other devices (such as a client, a read-write library, and a read-only library). The communicator 403 may include one or more sets of modules for different communication modes, for example a CAN communication module communicatively connected to a CAN bus. The communication connection may be one or more wired/wireless communication modes and combinations thereof, including any one or more of the internet, CAN, intranet, Wide Area Network (WAN), Local Area Network (LAN), wireless network, Digital Subscriber Line (DSL) network, frame relay network, Asynchronous Transfer Mode (ATM) network, Virtual Private Network (VPN), and/or any other suitable communication network, for example any one or combination of WiFi, Bluetooth, NFC, GPRS, GSM and Ethernet.
In some specific applications, the various components of the computer device 400 are coupled together by a bus system, which may include a power bus, a control bus, a status signal bus, etc. in addition to a data bus; for clarity of illustration, however, the various buses are all shown in fig. 4 as a single bus system.
In an embodiment of the present application, a computer-readable storage medium is provided, on which a computer program is stored, which when executed by a processor implements the method described in fig. 3.
The present application may be embodied as systems, methods, and/or computer program products, in any combination of technical details. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present application.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable programs described herein may be downloaded from a computer-readable storage medium to a variety of computing/processing devices, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present application may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, integrated circuit configuration data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++ and procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), can execute computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, thereby implementing aspects of the present application.
In summary, the present application provides a surgical navigation system, method, device and medium based on AR technology, which acquires scanning image data from a preoperative scan of the patient and generates a three-dimensional simulation image from it; acquires, during the operation, a positioning image of the patient in a dynamic positioning coordinate system, first real-time dynamic data of the patient and second real-time dynamic data of the AR real-time monitoring robot in that coordinate system; statically registers the three-dimensional simulation image in the dynamic positioning coordinate system according to the positioning image, and dynamically registers the statically registered image with the patient according to the first real-time dynamic data; and acquires coordinate data of the patient in the AR real-time monitoring robot coordinate system from the second and first real-time dynamic data and, combined with the dynamically registered three-dimensional simulation image, generates a three-dimensional navigation image that coincides in real time with the patient in the actual environment, for display by the AR real-time monitoring robot.
The application effectively overcomes various defects in the prior art and has high industrial utilization value.
The above embodiments merely illustrate the principles and effects of the present application and are not intended to limit the invention. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present application. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed herein shall still be covered by the claims of the present application.
Claims (10)
1. An AR technology-based surgical navigation system, the system comprising: a scanning device, a processor, a dynamic positioning acquisition device and an AR real-time monitoring robot;
the scanning device is used for scanning a patient before operation to obtain scanning image data and sending the scanning image data to the processor;
the dynamic positioning acquisition device is used for acquiring, during the operation, a positioning image of the patient in a dynamic positioning coordinate system, first real-time dynamic data of the patient and second real-time dynamic data of the AR real-time monitoring robot in the dynamic positioning coordinate system, and sending the positioning image, the first real-time dynamic data and the second real-time dynamic data to the processor;
the processor is used for generating a three-dimensional simulation image according to the scanning image data; statically registering the three-dimensional simulation image in the dynamic positioning coordinate system according to the positioning image, and dynamically registering the statically registered three-dimensional simulation image with the patient according to the first real-time dynamic data; acquiring coordinate data of the patient in the AR real-time monitoring robot coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulation image, generating a three-dimensional navigation image that coincides in real time with the patient in the actual environment;
and the AR real-time monitoring robot is used for receiving and displaying the three-dimensional navigation image.
2. The system according to claim 1, wherein the dynamic positioning acquisition means comprises: an optical dynamic tracking device for acquiring the first real-time dynamic data and the second real-time dynamic data, a positioning camera device for acquiring the positioning image, and a plurality of marking components; wherein each marking component includes: a positioning component and a dynamic tracking component.
3. The system according to claim 2, wherein the coordinate of the positioning camera device in the dynamic positioning coordinate system is a preset coordinate, so that the processor generates a coordinate parameter according to the preset coordinate and the positioning image, and performs static registration according to the coordinate parameter.
4. The system according to claim 2, wherein the positioning component and the dynamic tracking component are attached to the patient within a certain preset range from the surgical site, so that the scanning image data scanned by the scanning device includes a positioning component image and a dynamic tracking component image, and the positioning image acquired by the positioning camera device includes a positioning component image and a dynamic tracking component image.
5. The system of claim 2 or 4, wherein the positioning member is film-shaped and has a front pattern and a back pattern that are the same;
the front pattern is a coating mark of barium sulfate; the scanning image data obtained by scanning the patient with the scanning device before the operation contains the coating mark;
when the positioning part is attached to the body of a patient, the reverse pattern is printed on the skin of the patient to form a watermark pattern; the positioning image acquired by the positioning camera device in the operation process contains the watermark pattern.
6. The system of claim 2, wherein the dynamic tracking component is further disposed on the AR real-time monitoring robot for the optical tracking device to identify the dynamic tracking component to acquire the first and second real-time dynamic data intraoperatively.
7. The system according to claim 2, wherein the dynamic tracking component is further disposed on a surgical instrument, so that the positioning image obtained by the positioning camera device further includes a surgical instrument image; and for the optical tracking device to intra-operatively identify the dynamic tracking component to obtain third real-time dynamic data corresponding to the surgical instrument;
the processor is further configured to dynamically register the statically registered three-dimensional simulated image with the patient based on the first real-time dynamic data and the third real-time dynamic data.
8. An AR technology-based surgical navigation method applied to the AR technology-based surgical navigation system according to any one of claims 1 to 7, the method comprising:
acquiring scanning image data from a preoperative scan of the patient, and generating a three-dimensional simulation image according to the scanning image data;
acquiring, during the operation, a positioning image of the patient in a dynamic positioning coordinate system, first real-time dynamic data of the patient and second real-time dynamic data of the AR real-time monitoring robot in the dynamic positioning coordinate system;
statically registering the three-dimensional simulation image in the dynamic positioning coordinate system according to the positioning image, and dynamically registering the statically registered three-dimensional simulation image with the patient according to the first real-time dynamic data;
and acquiring coordinate data of the patient in the AR real-time monitoring robot coordinate system according to the second real-time dynamic data and the first real-time dynamic data, and, combined with the dynamically registered three-dimensional simulation image, generating a three-dimensional navigation image that coincides in real time with the patient in the actual environment, for display by the AR real-time monitoring robot.
9. A computer device, the device comprising: a memory, a processor, and a communicator; the memory is configured to store computer instructions; the processor executes the computer instructions to implement the method of claim 8; and the communicator is configured to be communicatively connected to an external device.
10. A computer-readable storage medium having stored thereon computer instructions which, when executed, perform the method of claim 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010897861.XA CN111973273A (en) | 2020-08-31 | 2020-08-31 | Operation navigation system, method, device and medium based on AR technology |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111973273A (en) | 2020-11-24 |
Family
ID=73441490
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010897861.XA Withdrawn CN111973273A (en) | 2020-08-31 | 2020-08-31 | Operation navigation system, method, device and medium based on AR technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111973273A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107374729A (en) * | 2017-08-21 | 2017-11-24 | 上海霖晏医疗科技有限公司 | Operation guiding system and method based on AR technologies |
CN110051434A (en) * | 2019-04-25 | 2019-07-26 | 厦门强本科技有限公司 | AR operation piloting method and terminal in conjunction with endoscope |
CN110442232A (en) * | 2019-06-18 | 2019-11-12 | 中国人民解放军军事科学院国防科技创新研究院 | The wearable augmented reality robot control system of joint eye movement and brain-computer interface |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113081273A (en) * | 2021-03-24 | 2021-07-09 | 上海微创医疗机器人(集团)股份有限公司 | Punching auxiliary system and surgical robot system |
CN113081273B (en) * | 2021-03-24 | 2023-07-28 | 上海微创医疗机器人(集团)股份有限公司 | Punching auxiliary system and surgical robot system |
WO2022206434A1 (en) * | 2021-04-01 | 2022-10-06 | 上海复拓知达医疗科技有限公司 | Interactive alignment system and method for surgical navigation, electronic device, and readable storage medium |
CN113893034A (en) * | 2021-09-23 | 2022-01-07 | 上海交通大学医学院附属第九人民医院 | Integrated operation navigation method, system and storage medium based on augmented reality |
Similar Documents
Publication | Title
---|---
CN112155727A | Surgical navigation systems, methods, devices, and media based on three-dimensional models
CN107374729B | Operation navigation system and method based on AR technology
CN107016717B | System and method for perspective view of a patient
Andrews et al. | Registration techniques for clinical applications of three-dimensional augmented reality devices
EP2953569B1 | Tracking apparatus for tracking an object with respect to a body
CN103735312B | Multimode image navigation system for ultrasonic guidance operation
EP2637593B1 | Visualization of anatomical data by augmented reality
US11107270B2 | Medical scene model
Harders et al. | Calibration, registration, and synchronization for high precision augmented reality haptics
CN111494009B | Image registration method and device for surgical navigation and surgical navigation system
CN110709894B | Virtual shadow for enhanced depth perception
CN111973273A | Operation navigation system, method, device and medium based on AR technology
US20200352657A1 | Operating room remote monitoring
JP2019506919A | Motion box visualization for electromagnetic sensor tracking systems
Rodas et al. | See it with your own eyes: Markerless mobile augmented reality for radiation awareness in the hybrid room
Ma et al. | Moving-tolerant augmented reality surgical navigation system using autostereoscopic three-dimensional image overlay
AU2015238800A1 | Real-time simulation of fluoroscopic images
US10078906B2 | Device and method for image registration, and non-transitory recording medium
Beyl et al. | Time-of-flight-assisted Kinect camera-based people detection for intuitive human robot cooperation in the surgical operating room
CN113842227B | Medical auxiliary three-dimensional model positioning and matching method, system, equipment and medium
CN111658142A | MR-based focus holographic navigation method and system
CN109106448A | A kind of operation piloting method and device
Haliburton et al. | A visual odometry base-tracking system for intraoperative C-arm guidance
US20240054745A1 | Systems and methods for registering a 3d representation of a patient with a medical device for patient alignment
JP2003079616A | Detecting method of three-dimensional location of examination tool which is inserted in body region
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WW01 | Invention patent application withdrawn after publication | Application publication date: 20201124 |