CN112190331A - Method, device and system for determining surgical navigation information and electronic device - Google Patents

Method, device and system for determining surgical navigation information and electronic device

Info

Publication number
CN112190331A
CN112190331A
Authority
CN
China
Prior art keywords
augmented reality
information
navigation information
surgical instrument
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011106421.4A
Other languages
Chinese (zh)
Inventor
申一君
庞博
田梦泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing AK Medical Co Ltd
Original Assignee
Beijing AK Medical Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing AK Medical Co Ltd filed Critical Beijing AK Medical Co Ltd
Priority to CN202011106421.4A priority Critical patent/CN112190331A/en
Publication of CN112190331A publication Critical patent/CN112190331A/en
Pending legal-status Critical Current

Classifications

    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • G06T19/006 Mixed reality
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2034/2068 Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Augmented reality, i.e. correlating a live optical image with another image
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Gynecology & Obstetrics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Robotics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method, a device, a system, and an electronic device for determining surgical navigation information. The system for determining surgical navigation information includes: a surgical-end device, configured to acquire spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; and a remote device, connected to the surgical-end device, configured to determine a virtual image from the spatial position information and the movement trajectory information, perform augmented reality processing on the virtual image to obtain an augmented reality image, and determine navigation information for the surgical instrument based on the augmented reality image. The invention solves the technical problem of prior-art surgical navigation schemes in which the navigation information, being separated from the surgical scene, is difficult to understand.

Description

Method, device and system for determining surgical navigation information and electronic device
Technical Field
The invention relates to the technical field of medical treatment, and in particular to a method, a device, a system, and an electronic device for determining surgical navigation information.
Background
In the related art, a surgical navigation system typically installs active or passive marker devices near the patient's surgical site and on the surgical instrument, tracks the position of the patient's bones and the position and motion trajectory of the surgical instrument using the transmitted signals, and requires point-by-point registration of the patient's surgical site during the operation.
To let the surgeon clearly know the position of the surgical instrument relative to the patient's anatomy, the related art commonly adopts computer-aided navigation. This technology has the drawback that the navigation information is separated from the surgical scene, which makes the navigation information difficult to understand.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the present invention provide a method, a device, a system, and an electronic device for determining surgical navigation information, so as to at least solve the technical problem of prior-art surgical navigation schemes in which the navigation information, being separated from the surgical scene, is difficult to understand.
According to one aspect of the embodiments of the present invention, there is provided a system for determining surgical navigation information, including: a surgical-end device, configured to acquire spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; and a remote device, connected to the surgical-end device, configured to determine a virtual image from the spatial position information and the movement trajectory information, perform augmented reality processing on the virtual image to obtain an augmented reality image, and determine navigation information for the surgical instrument based on the augmented reality image.
Optionally, the system further includes a fifth-generation mobile communication (5G) module, configured to establish a communication connection between the surgical-end device and the remote device, where the surgical-end device sends the spatial position information and the movement trajectory information to the remote device through the 5G module.
Optionally, the surgical-end device includes: a first positioning module, arranged within the target range, for acquiring the spatial position information of the site to be operated on in real time; and a second positioning module, connected to the surgical instrument, for acquiring the movement trajectory information of the surgical instrument in real time.
Optionally, the system further includes a display auxiliary module, connected to the first positioning module and the second positioning module, for receiving the spatial position information and the movement trajectory information and sending them to the remote device.
Optionally, the remote device includes: an image processing module, connected to the display auxiliary module, for acquiring an X-ray fluoroscopic image of the site to be operated on, synthesizing the spatial position information, the movement trajectory information, and the X-ray fluoroscopic image to obtain the virtual image, performing augmented reality processing on the virtual image to obtain the augmented reality image, and selecting from the augmented reality image the navigation information matching the spatial position information; and a memory, connected to the image processing module, configured to store the virtual image, the augmented reality image, and the navigation information.
Optionally, the display auxiliary module is further configured to receive the augmented reality image and the navigation information returned by the image processing module; and the system further includes an AR display end, connected to the display auxiliary module, for displaying the augmented reality image and the navigation information.
According to another aspect of the embodiments of the present invention, there is also provided a method for determining surgical navigation information, including: acquiring spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; and sending the spatial position information and the movement trajectory information to a remote device, and receiving an augmented reality image and navigation information of the surgical instrument returned by the remote device, where the remote device is configured to determine a virtual image from the spatial position information and the movement trajectory information, perform augmented reality processing on the virtual image to obtain the augmented reality image, and determine the navigation information based on the augmented reality image.
According to another aspect of the embodiments of the present invention, there is also provided a method for determining surgical navigation information, including: receiving, from a surgical-end device, spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; determining a virtual image from the spatial position information and the movement trajectory information, and performing augmented reality processing on the virtual image to obtain an augmented reality image; determining navigation information for the surgical instrument based on the augmented reality image; and returning the augmented reality image and the navigation information to the surgical-end device.
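The remote-side flow above (receive tracking data, build a virtual image, apply AR processing, derive navigation information, return the results) can be sketched in Python as follows. This is a minimal illustration, not the patent's implementation: all function names and data structures are hypothetical, and the "images" are reduced to the bare coordinates needed to derive a navigation vector.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]

@dataclass
class AugmentedRealityImage:
    # Overlay of the virtual image on the live scene, represented here
    # only by the data needed to derive navigation information.
    site_position: Point
    instrument_trajectory: List[Point]

def build_virtual_image(site_position: Point,
                        trajectory: List[Point]) -> dict:
    """Combine spatial position and movement-trajectory information into
    a 'virtual image' (a dictionary standing in for the rendered scene)."""
    return {"site": site_position, "trajectory": trajectory}

def augment(virtual_image: dict) -> AugmentedRealityImage:
    """AR processing step; a real system would register the virtual
    image against the live camera view here."""
    return AugmentedRealityImage(virtual_image["site"],
                                 virtual_image["trajectory"])

def navigation_info(ar_image: AugmentedRealityImage) -> Point:
    """Derive navigation information: the displacement from the
    instrument's latest tracked position to the site to be operated on."""
    tip = ar_image.instrument_trajectory[-1]
    sx, sy, sz = ar_image.site_position
    return (sx - tip[0], sy - tip[1], sz - tip[2])
```

For example, with the site at (0, 0, 10) and the instrument tip last tracked at (0, 0, 4), `navigation_info(augment(build_virtual_image(...)))` yields the remaining displacement (0, 0, 6).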
According to another aspect of the embodiments of the present invention, there is also provided an apparatus for determining surgical navigation information, including: an acquisition module, configured to acquire spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; and a processing module, configured to send the spatial position information and the movement trajectory information to a remote device, and to receive an augmented reality image and navigation information of the surgical instrument returned by the remote device, where the remote device is configured to determine a virtual image from the spatial position information and the movement trajectory information, perform augmented reality processing on the virtual image to obtain the augmented reality image, and determine the navigation information based on the augmented reality image.
According to another aspect of the embodiments of the present invention, there is also provided an apparatus for determining surgical navigation information, including: a receiving unit, configured to receive, from a surgical-end device, spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; a processing unit, configured to determine a virtual image from the spatial position information and the movement trajectory information and to perform augmented reality processing on the virtual image to obtain an augmented reality image; a determination unit, configured to determine navigation information for the surgical instrument based on the augmented reality image; and a return unit, configured to return the augmented reality image and the navigation information to the surgical-end device.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor and to execute the method for determining surgical navigation information.
According to another aspect of the embodiments of the present invention, there is also provided a processor for executing a program, wherein the program is configured to execute the method for determining surgical navigation information.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method for determining surgical navigation information.
In the embodiments of the present invention, a surgical-end device acquires spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; and a remote device, connected to the surgical-end device, determines a virtual image from the spatial position information and the movement trajectory information, performs augmented reality processing on the virtual image to obtain an augmented reality image, and determines navigation information for the surgical instrument based on the augmented reality image. This fuses the navigation information with the real surgical scene, improves the practical value of the navigation information, and solves the technical problem of prior-art surgical navigation schemes in which the navigation information, being separated from the surgical scene, is difficult to understand.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a schematic structural diagram of a surgical navigation information determination system according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an alternative surgical navigation information determination system according to an embodiment of the present invention;
FIG. 3 is a flow chart of a method of determining surgical navigation information in accordance with an embodiment of the present invention;
FIG. 4 is a flow chart of another method of determining surgical navigational information, in accordance with an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a surgical navigation information determining apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of another surgical navigation information determining apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, in order to facilitate understanding of the embodiments of the present invention, some terms or nouns referred to in the present invention will be explained as follows:
Augmented Reality (AR) is a technology that computes the position and angle of camera imagery in real time and overlays corresponding images, video, and 3D models on it; its goal is to superimpose the virtual world on the real world on a screen and allow interaction between the two. The technique was first proposed in 1990. As the CPU computing power of portable electronic products improves, the applications of augmented reality are expected to become ever wider.
Example 1
According to an embodiment of the present invention, an embodiment of a system for determining surgical navigation information is provided. Fig. 1 is a schematic structural diagram of a system for determining surgical navigation information according to an embodiment of the present invention; as shown in fig. 1, the system includes a surgical-end device 100 connected to a remote device 120, where:
the surgical-end device 100 is configured to acquire spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; and the remote device 120, connected to the surgical-end device 100, is configured to determine a virtual image from the spatial position information and the movement trajectory information, perform augmented reality processing on the virtual image to obtain an augmented reality image, and determine navigation information for the surgical instrument based on the augmented reality image.
In the embodiments of the present invention, a surgical-end device acquires spatial position information of a site to be operated on and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range containing the site to be operated on; and a remote device, connected to the surgical-end device, determines a virtual image from the spatial position information and the movement trajectory information, performs augmented reality processing on the virtual image to obtain an augmented reality image, and determines navigation information for the surgical instrument based on the augmented reality image. This fuses the navigation information with the real surgical scene, improves the practical value of the navigation information, and solves the technical problem of prior-art surgical navigation schemes in which the navigation information, being separated from the surgical scene, is difficult to understand.
Optionally, the remote device may be a remote cloud server, and the surgical-end device may be an AR navigation system. The site to be operated on is a surgical site of a patient (human or animal), for example a position at which a prosthesis is to be implanted; the surgical instrument may be any surgical instrument, chosen to match the operation to be performed, and is not described in detail here.
In the embodiments of the present application, the virtual image is subjected to augmented reality processing to obtain the augmented reality image, and the navigation information of the surgical instrument is determined based on that image. By integrating the AR display function into the surgical-end device, the virtual image can be accurately superimposed on the real image, sparing the surgeon from switching back and forth between the surgical navigation display and the patient during the operation and allowing the surgeon to concentrate on the operation itself.
In an alternative embodiment, fig. 2 is a schematic structural diagram of an alternative system for determining surgical navigation information according to an embodiment of the present invention. As shown in fig. 2, the system further includes a fifth-generation mobile communication (5G) module 140, configured to establish a communication connection between the surgical-end device and the remote device, where the surgical-end device sends the spatial position information and the movement trajectory information to the remote device through the 5G module.
In this optional embodiment, the 5G module establishes the communication connection between the surgical-end device and the remote device, so that the computing power of the remote cloud server can be harnessed by the local AR navigation system, compensating for the local system's limitations and making the navigation information applied in the surgical scene more accurate.
In addition, establishing the communication connection between the surgical-end device and the remote device through the 5G module effectively reduces the response latency of the AR navigation system, so that real-time navigation information can be provided for the surgical instrument in the surgical scene while the operation is performed.
In an alternative embodiment, the surgical-end device includes: a first positioning module, arranged within the target range, for acquiring the spatial position information of the site to be operated on in real time; and a second positioning module, connected to the surgical instrument, for acquiring the movement trajectory information of the surgical instrument in real time.
In the above alternative embodiment, active or passive positioning modules may, but need not, be installed near the patient's surgical site and on the surgical instrument, using infrared light as the transmission source and a CCD (charge-coupled device) camera as the receiver; the emitted signals are used to obtain the spatial position information of the patient's bones and to track the position and motion of the surgical instrument, yielding the movement trajectory information.
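As a rough sketch of this tracking step, the code below reduces each CCD frame to the centroid of its detected infrared marker blobs and strings the centroids together into a trajectory. This is an illustrative assumption rather than the patent's method: a real system would detect markers in hardware and triangulate between calibrated cameras to recover 3-D positions, whereas here everything stays in 2-D image coordinates.

```python
from typing import List, Tuple

Point2D = Tuple[float, float]

def marker_centroid(detections: List[Point2D]) -> Point2D:
    """Centroid of the infrared marker blobs detected in one CCD frame."""
    xs = [p[0] for p in detections]
    ys = [p[1] for p in detections]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def trajectory(frames: List[List[Point2D]]) -> List[Point2D]:
    """One centroid per frame approximates the instrument's motion
    trajectory in image coordinates."""
    return [marker_centroid(frame) for frame in frames]
```

For two frames whose marker pairs straddle (1, 0) and (2, 1), `trajectory` returns those two centroids in order, which downstream code can treat as the movement trajectory information.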
In an alternative embodiment, as also shown in fig. 2, the system further includes a display auxiliary module 130, connected to the surgical-end device 100 (that is, connected to the first positioning module and the second positioning module, respectively), configured to receive the spatial position information and the movement trajectory information and to send them to the remote device 120 (that is, the display auxiliary module 130 sends the spatial position information and the movement trajectory information to the remote device 120 through the 5G module 140).
Optionally, the display auxiliary module mainly handles the spatial position information and the movement trajectory information: it sends the received spatial position information and movement trajectory information to the remote device, receives the augmented reality image and the navigation information returned by the remote device, and sends them to the AR display end in the surgical-end device for display.
In an alternative embodiment, as shown in fig. 2, the remote device 120 includes: an image processing module 122, connected to the display auxiliary module, for acquiring an X-ray fluoroscopic image of the site to be operated on, synthesizing the spatial position information, the movement trajectory information, and the X-ray fluoroscopic image to obtain the virtual image, performing augmented reality processing on the virtual image to obtain the augmented reality image, and selecting from the augmented reality image the navigation information matching the spatial position information; and a memory 124, connected to the image processing module 122, for storing the virtual image, the augmented reality image, and the navigation information.
Optionally, the image processing module is further configured to acquire an X-ray fluoroscopic image of the site to be operated on; to synthesize the spatial position information, the movement trajectory information, and the X-ray fluoroscopic image and render the result, obtaining the virtual image; to perform augmented reality processing on the virtual image, obtaining the augmented reality image (i.e., the AR image); and to determine, through a path-optimization algorithm, the navigation information matching the spatial position information based on the spatial position information of the site to be operated on and the movement trajectory information of the surgical instrument.
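The patent does not specify the path-optimization algorithm, so the sketch below substitutes a deliberately simple rule: among candidate instrument paths whose endpoint reaches the site's spatial position, select the shortest. All names are hypothetical, and a real planner would also account for anatomical constraints.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float, float]

def path_length(path: List[Point]) -> float:
    """Total Euclidean length of a polyline path."""
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def select_navigation_path(candidates: List[List[Point]],
                           site: Point,
                           tolerance: float = 1e-6) -> List[Point]:
    """Pick the shortest candidate path whose endpoint reaches the
    site position -- a stand-in for the unspecified path optimizer."""
    reaching = [p for p in candidates if math.dist(p[-1], site) <= tolerance]
    return min(reaching, key=path_length)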
In an optional embodiment, the display auxiliary module 130 is further configured to receive the augmented reality image and the navigation information returned by the image processing module; as also shown in fig. 2, the surgical-end device includes an AR display end 150, connected to the display auxiliary module 130, for displaying the augmented reality image and the navigation information.
In the embodiments of the present application, integrating an AR display end into the surgical-end device to provide the AR display function enables the virtual image and the real image to be accurately superimposed and displayed, sparing the surgeon from switching back and forth between the surgical navigation display and the patient during the operation and allowing the surgeon to concentrate on the operation itself.
As another optional embodiment, the image processing module may be further configured to render the virtual image to generate a virtual image file and transmit it through the 5G module to the display auxiliary module, for display on the AR display end in the surgical-end device.
Optionally, the image processing module includes a cloud rendering server and rendering node machines: the cloud rendering server segments the file to be rendered into one or more sub-files and distributes them to the rendering node machines; each rendering node machine processes its sub-task to generate a picture sub-file and returns it to the cloud rendering server; and the cloud rendering server merges the sub-files rendered by the rendering node machines.
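The split-distribute-merge cycle of the cloud rendering server can be sketched as follows. This is a toy model under stated assumptions: "rendering" is stood in for by upper-casing rows, and the sub-files are plain lists rather than real image data.

```python
from concurrent.futures import ThreadPoolExecutor
from typing import List

def split_job(frame_rows: List[str], n_nodes: int) -> List[List[str]]:
    """Cloud-rendering-server side: cut the file to be rendered into
    contiguous sub-files, one chunk per rendering node."""
    chunk = -(-len(frame_rows) // n_nodes)  # ceiling division
    return [frame_rows[i:i + chunk] for i in range(0, len(frame_rows), chunk)]

def render_subfile(subfile: List[str]) -> List[str]:
    """Rendering-node side: each node processes its sub-task and returns
    a picture sub-file (upper-casing stands in for actual rendering)."""
    return [row.upper() for row in subfile]

def render_distributed(frame_rows: List[str], n_nodes: int = 3) -> List[str]:
    """Distribute sub-files to the nodes in parallel, then merge the
    rendered sub-files back in their original order."""
    with ThreadPoolExecutor(max_workers=n_nodes) as pool:
        parts = list(pool.map(render_subfile, split_job(frame_rows, n_nodes)))
    return [row for part in parts for row in part]
```

Because `pool.map` preserves input order, the merge step is a simple ordered concatenation, mirroring the server combining the sub-files returned by the node machines.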
It should be noted that the specific structure of the system for determining surgical navigation information shown in fig. 1 is merely illustrative; in specific applications, the system of the present application may have more or fewer components than shown in fig. 1.
Example 2
In accordance with an embodiment of the present invention, there is provided an embodiment of a method for determining surgical navigation information, it being noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than presented herein.
Fig. 3 is a flowchart of a method for determining surgical navigation information according to an embodiment of the present invention. As shown in fig. 3, the method includes the following steps:
step S102, acquiring spatial position information of a part to be operated and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range including the part to be operated;
step S104, sending the spatial position information and the movement track information to a remote device, and receiving an augmented reality image and navigation information of the surgical instrument returned by the remote device, wherein the remote device is used for determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
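Steps S102 and S104 on the surgical end device can be sketched as a small client. This is an illustrative sketch only: the class name, the placeholder sensor values, and the `send_to_remote` callable (standing in for the 5G link to the remote device) are assumptions, not the patent's actual interface.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float, float]


@dataclass
class SurgicalEndClient:
    """Sketch of steps S102/S104 performed by the surgical end device."""
    # Stands in for the communication link (e.g. a 5G module): given
    # (spatial_position, trajectory), it returns
    # (augmented_reality_image, navigation_info) from the remote device.
    send_to_remote: Callable

    def acquire(self):
        # S102: spatial position of the part to be operated on, and the
        # movement trajectory of the surgical instrument within the target
        # range. Placeholder values; a real system would read its
        # positioning modules here.
        spatial_position = {"x": 0.0, "y": 0.0, "z": 0.0}
        trajectory: List[Point] = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.2)]
        return spatial_position, trajectory

    def navigate(self):
        # S104: send both data items to the remote device and receive the
        # augmented reality image and navigation information in return.
        spatial_position, trajectory = self.acquire()
        return self.send_to_remote(spatial_position, trajectory)
```

Injecting the link as a callable keeps the sketch testable: any stub that honors the `(position, trajectory) -> (ar_image, navigation)` shape can play the remote device.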
In the embodiment of the invention, the surgical end device acquires spatial position information of a part to be operated on and movement track information of a surgical instrument, the surgical instrument being located within a target range containing the part to be operated on. The remote device, connected to the surgical end device, determines a virtual image from the spatial position information and the movement track information, performs augmented reality processing on the virtual image to obtain an augmented reality image, and determines the navigation information of the surgical instrument based on the augmented reality image. The navigation information is thereby fused with the real surgical scene, achieving the technical effect of improving the practicability of the navigation information and solving the technical problem of prior-art surgical navigation schemes in which the navigation information is divorced from the surgical scene and therefore difficult to interpret.
Optionally, the remote device may be a remote cloud server, and the surgical end device may be an AR navigation system. The part to be operated on is a part of a patient (human or animal) awaiting surgery, for example a site awaiting prosthesis replacement; the surgical instrument may be any kind of surgical instrument, specifically the instrument selected for the operation to be performed. Details are not repeated here, the criterion being that the embodiment of the application can be realized.
In this embodiment of the application, the augmented reality image is obtained by performing augmented reality processing on the virtual image, and the navigation information of the surgical instrument is determined based on the augmented reality image. By integrating the AR display function into the surgical end device, the virtual image can be accurately superimposed on the real image, preventing the doctor from switching back and forth between the surgical navigation display and the patient during the operation and allowing the doctor to focus more fully on the operation itself.
According to an embodiment of the present invention, another embodiment of a method for determining surgical navigation information is provided, and fig. 4 is a flowchart of another method for determining surgical navigation information according to an embodiment of the present invention, as shown in fig. 4, the method includes the following steps:
step S202, receiving space position information of a to-be-operated part from an operation end device and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range containing the to-be-operated part;
step S204, determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain an augmented reality image;
step S206, determining navigation information of the surgical instrument based on the augmented reality image;
and step S208, returning the augmented reality image and the navigation information to the operation terminal equipment.
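Steps S202 through S208 on the remote device can be sketched as one pipeline function. This is illustrative only: the dictionary representation of images and the tip-offset metric used as "navigation information" are assumptions for demonstration, not the patent's actual processing.

```python
def remote_pipeline(spatial_position, trajectory):
    """Sketch of steps S202-S208 performed by the remote device."""
    # S204: determine a virtual image from the received position and
    # trajectory, then apply augmented reality processing to register it
    # against the real scene (here merely tagged, for illustration).
    virtual_image = {"position": spatial_position, "trajectory": trajectory}
    ar_image = dict(virtual_image, registered=True)

    # S206: derive navigation information from the AR image -- here the
    # offset of the instrument tip from the part to be operated on
    # (an assumed metric).
    tip = trajectory[-1]
    target = (spatial_position["x"], spatial_position["y"], spatial_position["z"])
    offset = tuple(t - c for t, c in zip(tip, target))
    navigation_info = {"tip_offset": offset}

    # S208: return both to the surgical end device.
    return ar_image, navigation_info
```

A stub like this pairs naturally with a surgical-end client: the client sends `(position, trajectory)` and receives `(ar_image, navigation_info)` back.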
In the embodiment of the invention, the surgical end device acquires spatial position information of a part to be operated on and movement track information of a surgical instrument, the surgical instrument being located within a target range containing the part to be operated on. The remote device, connected to the surgical end device, determines a virtual image from the spatial position information and the movement track information, performs augmented reality processing on the virtual image to obtain an augmented reality image, and determines the navigation information of the surgical instrument based on the augmented reality image. The navigation information is thereby fused with the real surgical scene, achieving the technical effect of improving the practicability of the navigation information and solving the technical problem of prior-art surgical navigation schemes in which the navigation information is divorced from the surgical scene and therefore difficult to interpret.
Optionally, the remote device may be a remote cloud server, and the surgical end device may be an AR navigation system. The part to be operated on is a part of a patient (human or animal) awaiting surgery, for example a site awaiting prosthesis replacement; the surgical instrument may be any kind of surgical instrument, specifically the instrument selected for the operation to be performed. Details are not repeated here, the criterion being that the embodiment of the application can be realized.
In this embodiment of the application, the augmented reality image is obtained by performing augmented reality processing on the virtual image, and the navigation information of the surgical instrument is determined based on the augmented reality image. By integrating the AR display function into the surgical end device, the virtual image can be accurately superimposed on the real image, preventing the doctor from switching back and forth between the surgical navigation display and the patient during the operation and allowing the doctor to focus more fully on the operation itself.
It should be noted that any optional or preferred method for determining surgical navigation information in this embodiment may be implemented in the system for determining surgical navigation information provided in embodiment 1.
In addition, it should be noted that, for alternative or preferred embodiments of the present embodiment, reference may be made to the relevant description in embodiment 1, and details are not described herein again.
Example 3
According to an embodiment of the present invention, there is further provided an apparatus embodiment for implementing the method for determining surgical navigation information, fig. 5 is a schematic structural diagram of an apparatus for determining surgical navigation information according to an embodiment of the present invention, and as shown in fig. 5, the apparatus for determining surgical navigation information includes: an acquisition module 400 and a processing module 420, wherein:
an obtaining module 400, configured to obtain spatial position information of a to-be-operated portion and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range including the to-be-operated portion; a processing module 420, configured to send the spatial position information and the movement track information to a remote device, and receive an augmented reality image and navigation information of the surgical instrument returned by the remote device, where the remote device is configured to determine a virtual image according to the spatial position information and the movement track information, and perform augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
It should be noted that the above modules may be implemented by software or hardware; in the latter case, for example, the modules may all be located in the same processor, or may be distributed across different processors in any combination.
It should be noted here that the obtaining module 400 and the processing module 420 correspond to steps S102 to S104 in embodiment 2; the modules implement the same examples and application scenarios as the corresponding steps, but are not limited to the disclosure of embodiment 2. Note that these modules may run in a computer terminal as part of the apparatus.
According to an embodiment of the present invention, there is provided another apparatus embodiment for implementing the method for determining surgical navigation information, fig. 6 is a schematic structural diagram of another apparatus for determining surgical navigation information according to an embodiment of the present invention, and as shown in fig. 6, the apparatus for determining surgical navigation information includes: a receiving unit 500, a processing unit 520, a determining unit 540 and a returning unit 560, wherein:
a receiving unit 500, configured to receive spatial position information of a to-be-operated portion from a surgical end device and movement trajectory information of a surgical instrument, where the surgical instrument is located within a target range including the to-be-operated portion; a processing unit 520, configured to determine a virtual image according to the spatial position information and the movement track information, and perform augmented reality processing on the virtual image to obtain an augmented reality image; a determining unit 540, configured to determine navigation information of the surgical instrument based on the augmented reality image; a returning unit 560, configured to return the augmented reality image and the navigation information to the operation end device.
It should be noted that the above modules may be implemented by software or hardware; in the latter case, for example, the modules may all be located in the same processor, or may be distributed across different processors in any combination.
It should be noted here that the receiving unit 500, the processing unit 520, the determining unit 540, and the returning unit 560 correspond to steps S202 to S208 in embodiment 2; the units implement the same examples and application scenarios as the corresponding steps, but are not limited to the disclosure of embodiment 2. Note that these units may run in a computer terminal as part of the apparatus.
It should be noted that, reference may be made to the relevant description in embodiments 1 and 2 for alternative or preferred embodiments of this embodiment, and details are not described here again.
The above-mentioned device for determining surgical navigation information may further include a processor and a memory, where the above-mentioned obtaining module 400, the processing module 420, the receiving unit 500, the processing unit 520, the determining unit 540, the returning unit 560, and the like are all stored in the memory as program units, and the processor executes the program units stored in the memory to implement corresponding functions.
The processor includes one or more kernels, and a kernel calls the corresponding program unit from the memory. The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
According to the embodiment of the application, the embodiment of the nonvolatile storage medium is also provided. Optionally, in this embodiment, the nonvolatile storage medium includes a stored program, and when the program runs, the apparatus in which the nonvolatile storage medium is located is controlled to execute the method for determining any one of the surgical navigation information.
Optionally, in this embodiment, the nonvolatile storage medium may be located in any one of a group of computer terminals in a computer network, or in any one of a group of mobile terminals, and the nonvolatile storage medium includes a stored program.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: acquiring spatial position information of a part to be operated and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range including the part to be operated; sending the spatial position information and the movement track information to a remote device, and receiving an augmented reality image and navigation information of the surgical instrument returned by the remote device, wherein the remote device is used for determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: receiving spatial position information of a to-be-operated part and movement track information of a surgical instrument from surgical end equipment, wherein the surgical instrument is positioned in a target range containing the to-be-operated part; determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain an augmented reality image; determining navigation information of the surgical instrument based on the augmented reality image; and returning the augmented reality image and the navigation information to the operation end equipment.
According to the embodiment of the application, the embodiment of the processor is also provided. Optionally, in this embodiment, the processor is configured to execute a program, where the program executes the method for determining any one of the surgical navigation information.
An embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above methods for determining surgical navigation information.
The present application further provides a computer program product which, when executed on a data processing device, is adapted to execute a program that initializes the steps of any of the above methods for determining surgical navigation information.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable nonvolatile storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a non-volatile storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned nonvolatile storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the present invention, and such modifications and refinements should also be regarded as falling within the protection scope of the present invention.

Claims (13)

1. A system for determining surgical navigation information, comprising:
the surgical end equipment is used for acquiring spatial position information of a to-be-operated part and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range containing the to-be-operated part;
and the remote equipment is connected with the operation end equipment and used for determining a virtual image according to the spatial position information and the movement track information, performing augmented reality processing on the virtual image to obtain an augmented reality image, and determining the navigation information of the operation instrument based on the augmented reality image.
2. The system of claim 1, further comprising:
and the fifth generation mobile communication 5G module is used for establishing communication connection between the operation end equipment and the far-end equipment, wherein the operation end equipment sends the spatial position information and the movement track information to the far-end equipment through the 5G module.
3. The system of claim 1, wherein the surgical end device comprises:
the first positioning module is arranged in the target range and used for acquiring the spatial position information of the part to be operated in real time;
and the second positioning module is connected with the surgical instrument and is used for acquiring the movement track information of the surgical instrument in real time.
4. The system of claim 3, further comprising:
and the display auxiliary module is connected with the first positioning module and the second positioning module and used for receiving the spatial position information and the movement track information and sending the spatial position information and the movement track information to the far-end equipment.
5. The system of claim 4, wherein the remote device comprises:
the image processing module is connected with the display auxiliary module and used for acquiring the X-ray perspective image of the part to be operated and synthesizing the spatial position information, the movement track information and the X-ray perspective image to obtain the virtual image; performing augmented reality processing on the virtual image to obtain an augmented reality image, and selecting the navigation information matched with the spatial position information from the augmented reality image;
and the memory is connected with the image processing module and is used for storing the virtual image, the augmented reality image and the navigation information.
6. The system of claim 5, wherein the display assistance module is further configured to receive the augmented reality image and the navigation information returned by the image processing module;
the system further comprises: and the AR display end is connected with the display auxiliary module and used for displaying the augmented reality image and the navigation information.
7. A method for determining surgical navigation information, comprising:
acquiring spatial position information of a part to be operated and movement track information of a surgical instrument, wherein the surgical instrument is positioned in a target range containing the part to be operated;
sending the spatial position information and the movement track information to a remote device, and receiving an augmented reality image and navigation information of the surgical instrument returned by the remote device, wherein the remote device is used for determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
8. A method for determining surgical navigation information, comprising:
receiving spatial position information of a to-be-operated part and movement track information of a surgical instrument from surgical end equipment, wherein the surgical instrument is located in a target range containing the to-be-operated part;
determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain an augmented reality image;
determining navigation information for the surgical instrument based on the augmented reality image;
and returning the augmented reality image and the navigation information to the operation end equipment.
9. An apparatus for determining surgical navigation information, comprising:
the surgical instrument comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring spatial position information of a to-be-operated part and movement track information of a surgical instrument, and the surgical instrument is positioned in a target range containing the to-be-operated part;
the processing module is configured to send the spatial position information and the movement track information to a remote device, and receive an augmented reality image and navigation information of the surgical instrument returned by the remote device, where the remote device is configured to determine a virtual image according to the spatial position information and the movement track information, and perform augmented reality processing on the virtual image to obtain the augmented reality image; and determining the navigation information based on the augmented reality image.
10. An apparatus for determining surgical navigation information, comprising:
the surgical instrument comprises a receiving unit, a processing unit and a control unit, wherein the receiving unit is used for receiving space position information of a to-be-operated part from surgical end equipment and movement track information of a surgical instrument, and the surgical instrument is positioned in a target range containing the to-be-operated part;
the processing unit is used for determining a virtual image according to the spatial position information and the movement track information, and performing augmented reality processing on the virtual image to obtain an augmented reality image;
a determination unit configured to determine navigation information of the surgical instrument based on the augmented reality image;
and the return unit is used for returning the augmented reality image and the navigation information to the operation end equipment.
11. A non-volatile storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the method of determining surgical navigation information of claim 7 or 8.
12. A processor, characterized in that the processor is configured to execute a program, wherein the program is configured to execute the method for determining surgical navigation information according to claim 7 or 8 when executed.
13. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the method of determining surgical navigation information of claim 7 or 8.
CN202011106421.4A 2020-10-15 2020-10-15 Method, device and system for determining surgical navigation information and electronic device Pending CN112190331A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011106421.4A CN112190331A (en) 2020-10-15 2020-10-15 Method, device and system for determining surgical navigation information and electronic device


Publications (1)

Publication Number Publication Date
CN112190331A true CN112190331A (en) 2021-01-08

Family

ID=74009202



Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113509265A (en) * 2021-04-01 2021-10-19 上海复拓知达医疗科技有限公司 Dynamic position identification prompting system and method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103211655A (en) * 2013-04-11 2013-07-24 深圳先进技术研究院 Navigation system and navigation method of orthopedic operation
CN107374729A (en) * 2017-08-21 2017-11-24 上海霖晏医疗科技有限公司 Operation guiding system and method based on AR technologies
US20180293802A1 (en) * 2017-04-07 2018-10-11 Unveil, LLC Systems and methods for mixed reality medical training
US20190053855A1 (en) * 2017-08-15 2019-02-21 Holo Surgical Inc. Graphical user interface for a surgical navigation system and method for providing an augmented reality image during operation
CN109730771A (en) * 2019-03-19 2019-05-10 安徽紫薇帝星数字科技有限公司 A kind of operation guiding system based on AR technology




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210108